CN115493513A - Visual system applied to space station mechanical arm - Google Patents

Info

Publication number
CN115493513A
CN115493513A (application CN202210977497.7A)
Authority
CN
China
Prior art keywords
camera
wrist
target
mechanical arm
elbow
Prior art date
Legal status: Pending
Application number
CN202210977497.7A
Other languages
Chinese (zh)
Inventor
陈磊
谭启蒙
杜晓东
李大明
王飞
***
张雪涛
胡成威
高升
王友渔
熊明华
张昕蕊
李中衡
丁健
王震
许哲
Current Assignee
Beijing Institute of Spacecraft System Engineering
Original Assignee
Beijing Institute of Spacecraft System Engineering
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Spacecraft System Engineering filed Critical Beijing Institute of Spacecraft System Engineering
Priority to CN202210977497.7A
Publication of CN115493513A

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/245: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02: Sensing devices
    • B25J19/021: Optical sensing devices
    • B25J19/023: Optical sensing devices including video camera means
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/08: Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • H04N7/0806: Systems for the simultaneous or sequential transmission of more than one television signal, the signals being two or more video signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A vision system applied to a space station mechanical arm comprises at least two wrist cameras, at least one elbow pan-tilt camera, at least one docking camera, a data communication device and a data processing device. At least one elbow pan-tilt camera is mounted at the elbow of the mechanical arm and is used for monitoring. Each wrist of the mechanical arm carries at least one wrist camera, which acquires the pose of a target and thereby guides the end of the mechanical arm toward the target. At least one docking camera is mounted on the docking section of a space station cabin, acquires the pose of a target and thereby guides the mechanical arm to complete cabin docking. All wrist cameras and elbow pan-tilt cameras are in data communication with the data communication device; the data communication device and all docking cameras are in data communication with the data processing device and communicate with the outside through the data processing device.

Description

Visual system applied to space station mechanical arm
Technical Field
The invention relates to a vision system for a space station mechanical arm that meets the monitoring and measurement requirements of the arm in on-orbit tasks.
Background
Large space manipulators are indispensable tools in the construction of a space station. The vision system is an essential component of a space manipulator and carries out its two main tasks, monitoring and measurement; apart from open-loop control, none of the manipulator's operating modes can be realized without the guidance and assistance of the vision system. During on-orbit operation the space manipulator not only moves over a large range but also has to complete a series of precise operations with high accuracy and strong stability. To meet the application requirements of large space manipulators, the vision system must therefore image the surrounding space environment and space targets clearly over a wide span of observation distances (from hundreds of millimetres to tens of metres), and at the same time complete detection, identification and high-precision three-dimensional measurement of space targets of widely varying size.
Most of the large space manipulators currently operating on orbit abroad or verified in flight are also equipped with vision systems, but these systems have one or more of the following problems:
(1) The tasks the manipulator has to carry out are simple, so the functional requirements on the vision system are limited and cannot be adapted to today's complex space station tasks;
(2) The cameras in the vision system are mostly configured as monocular cameras, so once a monocular camera product fails the corresponding viewing and monitoring functions are lost;
(3) Two monitoring cameras are mounted on the arm link near the elbow, each providing visual monitoring of the region around one of the two ends, which consumes considerable on-orbit resources;
(4) The information-processing resources of the vision system are centralized, so the efficiency of the control system still leaves room for improvement when the data transmission bandwidth is limited.
Disclosure of Invention
The technical problem solved by the invention is as follows: the defects of the prior art are overcome, and the monitoring and measurement requirements of the space station mechanical arm in on-orbit tasks are met.
The purpose of the invention is achieved by the following technical scheme:
a visual system applied to a mechanical arm of a space station comprises at least two wrist cameras, at least one elbow holder camera, at least one butt-joint camera, a data communication device and a data processing device;
at least one elbow pan-tilt camera is mounted at an elbow of the mechanical arm and used for monitoring;
each wrist of the mechanical arm is provided with at least one wrist camera for acquiring pose information of a target and further guiding the tail end of the mechanical arm to be close to the target;
at least one docking camera is arranged on the docking part of the cabin body of the space station and used for acquiring pose information of a target and further guiding a mechanical arm to complete the docking of the cabin body;
all the wrist cameras and the elbow holder cameras are in data communication with the data communication device;
the data communication device and all the docking cameras perform data communication with the data processing device and communicate with the outside through the data processing device.
Preferably, the elbow pan-tilt camera uses its pan-tilt to slew the camera so that the target is kept under monitoring at all times.
Preferably, the wrist camera is also used to guide the capture, grasping and locking of a target adapter by the end of the mechanical arm.
Preferably, the elbow pan-tilt camera is a monocular camera; the wrist cameras and the docking camera are binocular cameras.
Preferably, the wrist cameras and the docking camera both provide monocular and binocular measurement modes and can switch between them autonomously.
Preferably, the monocular measurement mode measures the three-dimensional pose of the target with a single camera, while the binocular measurement mode synchronously acquires images from the binocular camera and calculates the target pose through image feature matching.
Preferably, the wrist cameras and the docking camera independently perform real-time pose calculation of the target.
Preferably, the wrist cameras, the docking camera and the elbow pan-tilt camera are all provided with light sources for active illumination.
Preferably, the data communication device is an Ethernet switch.
Preferably, the vision system further comprises visual markers from which the wrist cameras and/or the docking camera acquire the pose of the target.
Compared with the prior art, the invention has the following beneficial effects:
(1) Addressing the fact that the space station mechanical arm performs many kinds of tasks with complex visual monitoring and measurement requirements, the invention provides a vision system for the space station mechanical arm that is compatible with task requirements such as crawling on the cabin surface, cabin module relocation and docking, extravehicular cargo transfer, inspection of the space station's exterior and capture of hovering spacecraft, and that provides both position and velocity measurement.
(2) Addressing the reliability problem of a monocular camera, the invention provides that both the wrist cameras and the docking camera support autonomous multi-mode switching between binocular and monocular vision measurement, with autonomous selection of the better eye when both left and right eyes are in monocular mode; in both measurement modes the cameras autonomously judge measurement validity, and they retain target detection, identification and three-dimensional pose measurement even when one point of the visual marker is missing or invisible.
(3) Addressing the need for visual monitoring of operations at both ends of the mechanical arm, a single two-axis pan-tilt is configured at the elbow joint, so that wide-range monitoring is achieved with a small number of cameras and the monitoring capability of the vision system is effectively enhanced.
(4) Addressing the large volume of image data transmitted by the arm's vision cameras, Ethernet is used as the communication channel for image transmission, realizing synchronous high-speed transmission of multiple video streams; at the same time, an image processing circuit and a pose calculation circuit are configured in each camera, so that pose calculation is performed at the camera end without occupying the computing resources of the arm controller, improving the efficiency of the control system; the measurement results output by the cameras carry measurement-state identification information and check information, improving the reliability with which the control system uses them.
Drawings
Fig. 1 is a block diagram of a vision system.
Fig. 2 shows the visual marker pattern and dimensions of the target adapter.
Fig. 3 shows the visual marker pattern and dimensions of the docking camera marker.
Fig. 4 is a vision system information flow diagram.
Detailed Description
To make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in further detail below with reference to the accompanying drawings.
Example 1:
At medium and long range, the vision system monitors the working area of the mechanical arm. At short range it outputs the high-precision three-dimensional pose of the target and guides the arm's end effector to adjust its own pose so that the relative pose between the end-effector coordinate system and the target-adapter coordinate system is gradually reduced, until the end effector enters the capture envelope of the target adapter and completes operations such as capturing and locking the adapter. The docking camera outputs the high-precision three-dimensional pose of the visual marker on the cabin module and guides the mechanical arm to complete module relocation control.
As shown in fig. 1 and 4, the vision system comprises wrist camera A, wrist camera B, an elbow pan-tilt camera, a docking camera, an Ethernet switch, wrist camera visual markers and a docking camera visual marker.
Wrist camera A is mounted at end A and wrist camera B at end B, observing the operation target areas at the respective ends and ensuring that the space station mechanical arm can perform the cabin-surface crawling task. Wrist camera A and wrist camera B are both binocular cameras.
The elbow pan-tilt camera is mounted at the elbow of the mechanical arm; the two-axis pan-tilt slews the elbow camera fixed on it, giving the monocular elbow camera a wide monitoring range.
The docking camera is mounted on the cabin and is used to guide the mechanical arm to complete secondary docking of the cabin. The docking camera is a binocular camera.
The Ethernet switch is mounted at the elbow of the mechanical arm and interconnects the arm's Ethernet network with the Ethernet network inside the space station cabin. It receives the image data output by the cameras, detects and controls the data flow in real time, and stores and forwards the image data to the in-cabin Ethernet switch.
The wrist cameras, elbow camera and docking camera can each autonomously control the switching and brightness of their light sources based on statistics of the pixel gray values in different regions of the image, and can automatically compute the exposure time subject to the frame-rate constraint; the wrist cameras and the docking camera can additionally perform local exposure control using the image region in which the visual marker has been detected. The wrist cameras, elbow camera and docking camera are all fitted with lens hoods, and the lens interiors use a stray-light-suppressing optical design and are coated with stray-light-absorbing paint.
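As an illustration of the region-based exposure and light-source control described above, the following Python sketch shows one possible implementation; the threshold values, the proportional update rule and the function interface are assumptions for illustration and are not taken from the patent.

```python
import numpy as np

# Minimal sketch of region-based exposure/illumination control (hypothetical
# thresholds and interface; the flight software's actual logic is not disclosed).
def exposure_and_light_control(image, frame_period_s, target_gray=120,
                               current_exposure_s=0.005, roi=None):
    """Return (new_exposure_s, light_on) from gray-level statistics.

    image          : 8-bit grayscale frame as a numpy array
    frame_period_s : 1 / frame rate; exposure must stay below this bound
    roi            : optional (x, y, w, h) of the detected visual marker,
                     enabling the local exposure control described above
    """
    if roi is not None:
        x, y, w, h = roi
        region = image[y:y + h, x:x + w]      # local statistics around the marker
    else:
        region = image                        # global statistics
    mean_gray = float(np.mean(region))

    # Turn the light source on when the scene is too dark to reach the target level.
    light_on = mean_gray < 0.3 * target_gray

    # Proportional exposure update, clamped by the frame-frequency constraint.
    scale = target_gray / max(mean_gray, 1.0)
    new_exposure_s = current_exposure_s * float(np.clip(scale, 0.5, 2.0))
    new_exposure_s = min(new_exposure_s, 0.9 * frame_period_s)
    return new_exposure_s, light_on
```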
The wrist cameras and the docking camera provide both monocular and binocular measurement modes. In the monocular vision measurement mode a single camera completes the three-dimensional pose measurement of the target. In the binocular vision measurement mode the binocular camera acquires images synchronously and the target pose is calculated through image feature matching. The binocular mode effectively improves measurement accuracy and adds pose measurement of non-cooperative targets, while the monocular mode serves as a backup: even if one of the two cameras fails, pose data can still be provided to complete control of the mechanical arm.
The wrist cameras and the docking camera are each equipped with an image processing circuit and a pose calculation circuit, can independently perform real-time pose calculation at the camera end, and send the pose results to the arm controller over a communication bus. The image processing and pose calculation circuits automatically judge the validity of the pose measurement by fusing information such as the number of visual marker points detected in a single frame, the processing time of a single frame, and the pose results of consecutive frames. When one of the left and right cameras is judged unable to work normally, the binocular vision measurement mode is autonomously switched to the monocular mode; when both are judged normal, the current measurement mode, the communication state between the two cameras, the time needed to resume normal operation and other information are fused, and the monocular mode is autonomously switched back to the binocular mode. The programs of the image processing circuit and the pose calculation circuit are stored in a single-event-upset-immune chip: three copies of the main program are kept in memory, the boot loader reads them at start-up and performs two-out-of-three voting, data made inconsistent by single-event effects are written back, and key data are protected by triple modular redundancy during execution, improving radiation tolerance. The circuits are also provided with a watchdog and can reset and recover autonomously when a single-event upset occurs.
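The autonomous switching between binocular and monocular measurement modes can be illustrated by the Python sketch below; the status fields and threshold values are assumptions, since the patent only states which kinds of information are fused in the decision.

```python
from dataclasses import dataclass

# Illustrative sketch of the measurement-mode switching logic described above.
@dataclass
class EyeStatus:
    healthy: bool            # camera answers on the bus and images are valid
    marker_points: int       # visual marker points detected in the last frame
    frame_time_ms: float     # single-frame processing time

def select_mode(left: EyeStatus, right: EyeStatus, current_mode: str,
                min_points: int = 4, max_frame_time_ms: float = 100.0) -> str:
    """Return 'binocular' or 'monocular' for the next measurement cycle."""
    def usable(eye: EyeStatus) -> bool:
        return (eye.healthy and eye.marker_points >= min_points
                and eye.frame_time_ms <= max_frame_time_ms)

    left_ok, right_ok = usable(left), usable(right)
    if left_ok and right_ok:
        return "binocular"        # both eyes normal: (re-)enter binocular mode
    if left_ok or right_ok:
        return "monocular"        # one eye degraded: fall back to monocular
    return current_mode           # neither usable: keep mode, flag pose invalid upstream
```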
The visual markers are of two types: the wrist camera visual marker and the docking camera visual marker, mounted on the target adapter and on the space station cabin module respectively, and used by the wrist cameras and the docking camera for target detection, identification and pose calculation at short range.
Wrist camera A, wrist camera B, the elbow pan-tilt camera and the docking camera are all provided with light sources that actively illuminate the target.
Wrist camera A, wrist camera B, the elbow pan-tilt camera, the docking camera, the Ethernet switch and the visual markers all have modular interfaces and can be replaced on orbit.
The visual marker patterns consist mainly of solid circles and rings. Only black and white are used as the pattern colours (no colours other than black and white appear).
The target adapter visual marker is shown in fig. 2 and contains 8 solid-circle/ring patterns: 4 ring patterns are distributed on the flat surface of the marker plate; of the remaining 4 solid circles, 3 are on the flat surface of the marker plate and 1 is on the top plane of the marker's cylindrical protrusion.
The docking camera visual marker is shown in fig. 3 and contains 8 solid-circle/ring patterns: 6 ring patterns are distributed on the flat surface of the marker plate; of the remaining 2 solid circles, 1 is on the flat surface of the marker plate and 1 is on the top plane of the cylindrical protrusion.
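Detection of the solid-circle and ring marker points can be sketched with a generic contour-based approach (Python with OpenCV) as below; this is an illustrative stand-in, not the cameras' actual detection algorithm, which the patent does not describe in detail.

```python
import cv2
import numpy as np

# Rough sketch of locating circular marker points in a black-and-white marker image,
# assuming dark marks on a lighter background.
def detect_marker_centers(gray):
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        area = cv2.contourArea(c)
        if area < 20:                                   # reject small noise blobs
            continue
        perimeter = cv2.arcLength(c, True)
        circularity = 4 * np.pi * area / (perimeter ** 2 + 1e-9)
        if circularity > 0.8:                           # keep near-circular blobs only
            m = cv2.moments(c)
            centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centers                                      # sub-pixel refinement would follow
```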
In the monocular vision measurement mode the three-dimensional pose of the target is calculated from

\[ s_i \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} = A \,[\, R \;\; t \,] \begin{bmatrix} X_i \\ Y_i \\ Z_i \\ 1 \end{bmatrix} \]

where s_i is a non-zero scale factor, (u_i, v_i) are the image pixel coordinates of the visual marker centre P_i, (X_i, Y_i, Z_i) are the three-dimensional coordinates of P_i in the target coordinate system, A is the camera intrinsic matrix, R is a 3×3 orthogonal unit rotation matrix, t is a three-dimensional translation vector, and R and t together represent the three-dimensional pose of the target. Solving this system of equations yields the target's three-dimensional pose.
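Since the equation above is the standard perspective projection (PnP) model, the monocular pose can be recovered with a generic PnP solver. A minimal Python/OpenCV sketch follows; the marker coordinates, pixel measurements and intrinsic matrix are invented placeholders, not the actual marker geometry of Fig. 2.

```python
import cv2
import numpy as np

# Planar marker points (X_i, Y_i, Z_i) in mm, expressed in the target frame (placeholders).
object_points = np.array([[0, 0, 0], [80, 0, 0], [80, 80, 0], [0, 80, 0]], dtype=np.float64)
# Corresponding detected image points (u_i, v_i) in pixels (placeholders).
image_points = np.array([[500, 400], [620, 398], [622, 520], [502, 522]], dtype=np.float64)
# Hypothetical intrinsic matrix A; distortion assumed already corrected.
A = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 512.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, A, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)   # R (3x3) and tvec give the target pose in the camera frame
```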
In the binocular vision measurement mode, the coordinates of the marker points in the binocular camera coordinate system are first solved from the camera model and the image feature matching result; the points are then put into one-to-one correspondence with the target's visual marker points, i.e. with their coordinates in the target coordinate system, and the position and attitude of the target are calculated from this correspondence.
Taking the left camera coordinate system as the reference frame, let the intrinsic matrices of the left and right cameras be A_l and A_r, and let the relative pose between the two cameras be R_c and t_c (R_c a 3×3 orthogonal unit rotation matrix, t_c a three-dimensional translation vector). The three-dimensional pose of the target in the binocular vision measurement mode is then calculated from

\[ s_{li} \begin{bmatrix} u_{li} \\ v_{li} \\ 1 \end{bmatrix} = A_l \begin{bmatrix} X_{ci} \\ Y_{ci} \\ Z_{ci} \end{bmatrix}, \qquad s_{ri} \begin{bmatrix} u_{ri} \\ v_{ri} \\ 1 \end{bmatrix} = A_r \left( R_c \begin{bmatrix} X_{ci} \\ Y_{ci} \\ Z_{ci} \end{bmatrix} + t_c \right) \]

where s_li and s_ri are non-zero scale factors, (X_ci, Y_ci, Z_ci) are the three-dimensional coordinates of the visual marker centre P_i in the left camera coordinate system, and (u_li, v_li) and (u_ri, v_ri) are the image pixel coordinates of P_i in the left and right cameras respectively.

The transformation between the left camera coordinate system O_c X_c Y_c Z_c and the target coordinate system O_o X_o Y_o Z_o can be expressed as

\[ \begin{bmatrix} X_{ci} \\ Y_{ci} \\ Z_{ci} \end{bmatrix} = R \begin{bmatrix} X_{oi} \\ Y_{oi} \\ Z_{oi} \end{bmatrix} + t \]

where (X_oi, Y_oi, Z_oi) are the three-dimensional coordinates of P_i in the target coordinate system, R is a 3×3 orthogonal unit rotation matrix, t is a three-dimensional translation vector, and R and t represent the three-dimensional pose of the target. Solving these equations simultaneously yields the target's three-dimensional pose.
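In the binocular mode the two projection equations can be solved by linear triangulation for each marker centre, after which R and t follow from a rigid alignment between the target-frame and camera-frame point sets. The Python sketch below illustrates this under the stated camera model; the function names and the choice of an SVD-based (Kabsch) alignment are illustrative, not necessarily the method used on the flight cameras.

```python
import numpy as np

def triangulate(A_l, A_r, R_c, t_c, uv_l, uv_r):
    """Linear triangulation of one marker centre into the left-camera frame."""
    P_l = A_l @ np.hstack([np.eye(3), np.zeros((3, 1))])   # left projection matrix
    P_r = A_r @ np.hstack([R_c, t_c.reshape(3, 1)])        # right projection matrix
    u_l, v_l = uv_l
    u_r, v_r = uv_r
    M = np.vstack([u_l * P_l[2] - P_l[0], v_l * P_l[2] - P_l[1],
                   u_r * P_r[2] - P_r[0], v_r * P_r[2] - P_r[1]])
    _, _, Vt = np.linalg.svd(M)
    X = Vt[-1]
    return X[:3] / X[3]                                    # (X_ci, Y_ci, Z_ci)

def rigid_transform(target_pts, camera_pts):
    """Find R, t with camera_pts ≈ R @ target_pts + t (Kabsch algorithm)."""
    c_t = target_pts.mean(axis=0)
    c_c = camera_pts.mean(axis=0)
    H = (target_pts - c_t).T @ (camera_pts - c_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                               # correct an improper rotation
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_c - R @ c_t
    return R, t
```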
The wrist cameras, the elbow camera and the Ethernet switch communicate with the arm controller over a 1553B bus, and the docking camera communicates in real time with the space station information system over a 1553B bus, providing bidirectional transfer of remote-control commands and telemetry data (including the computed three-dimensional target pose). The pose results output by the cameras carry measurement-state identification information such as the measurement mode, pose measurement validity and the number of marker points, together with check information computed as a sum check. The cameras and the Ethernet switch each receive remote-control commands over the 1553B bus to perform camera imaging control, light-source illumination control and switch data-forwarding control; the target pose computed in real time and the telemetry of the cameras, the Ethernet switch and other equipment are transmitted over the 1553B bus, providing the information needed for on-orbit operation of the mechanical arm. The RT addresses of the wrist cameras and the docking camera are configurable.
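The following Python sketch illustrates how a pose result with measurement-state identification and a sum check might be packed and verified; the field layout, sizes and flag encoding are assumptions for illustration, as the actual 1553B message format is not disclosed in the patent.

```python
import struct

def pack_pose_message(mode, pose_valid, n_points, pose6):
    """mode: 0 = monocular, 1 = binocular; pose6: (x, y, z, roll, pitch, yaw)."""
    payload = struct.pack("<BBB6f", mode, int(pose_valid), n_points, *pose6)
    checksum = sum(payload) & 0xFFFF                 # simple 16-bit sum check
    return payload + struct.pack("<H", checksum)

def verify_pose_message(msg):
    payload, (checksum,) = msg[:-2], struct.unpack("<H", msg[-2:])
    return (sum(payload) & 0xFFFF) == checksum
```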
The vision system communicates in real time with the space station information system through the Ethernet switch over a 100 Mbit/s Ethernet bus, realizing real-time high-speed transmission of the encoded image data. Instruments inside the space station decode and display the images, providing visual information on the arm's working area to the astronauts and ground operators. The Ethernet ports support full-duplex data communication, and the image data are transmitted using the RTP protocol.
Addressing the complex illumination conditions caused by the alternation of sunlit and eclipse portions of a low orbit, the relative motion between camera and target, and the varied optical characteristics of the target background, the invention provides that the wrist cameras, the elbow camera and the docking camera are all fitted with integrated light sources and have autonomous light-source control, multi-mode automatic exposure, manual exposure via remote-control commands, and stray-light suppression, effectively improving imaging quality and thereby enhancing the monitoring and measurement capability of the vision system.
For an on-orbit service life of 10 to 15 years, the cameras and the data communication device adopt a modular interface design and human-factors design so that astronauts can replace them on orbit during extravehicular activity; the cameras and the data communication device support on-orbit uploading of parameter and software modifications; and they jointly support image acquisition in multiple formats, such as colour compressed images and original grey-scale images, which allows on-orbit calibration of the cameras and pose measurement on the ground using camera images.
For adaptation to the radiation environment, including severe solar ultraviolet radiation, total ionizing dose and single-event effects, the materials and processes of the camera lens coatings and the visual marker paint layers meet the requirements of use in the space environment; the image processing and pose calculation circuits of the cameras can autonomously detect and recover from faults caused by single-event effects, ensuring reliable pose calculation.
Example 2:
a vision system applied to a large-scale space station mechanical arm can meet the monitoring and measuring requirements of the space station mechanical arm in an in-orbit task.
When the space station mechanical arm performs an on-orbit task it passes through different stages, including locating the target at long range, approaching the target at medium range and precisely operating on the target at short range, and the vision system is needed at every stage. The vision system consists of the wrist cameras, the elbow pan-tilt camera, the docking camera, the Ethernet switch, the visual markers and so on.
During on-orbit operation of the mechanical arm, the elbow camera faces away from the cabin and looks down on the region beneath the arm; its field of view is not easily occluded, so it provides a good on-orbit monitoring effect. The mechanical arm uses the elbow camera for the safety inspection before the arm is started, for monitoring the arm's working state during operation, and for monitoring cabin module relocation and secondary docking. While the arm is moving, the arm controller drives the elbow camera pan-tilt so that the elbow camera fixed on it follows the target and keeps it under monitoring at all times.
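The pan-tilt follow-up can be illustrated by the simple pointing computation below (Python); the frame convention (pan about the base Z axis, tilt measured from the XY plane) and the function name are assumptions for illustration, not the actual controller interface.

```python
import numpy as np

def pan_tilt_to_target(target_xyz):
    """Return (pan, tilt) in radians for a target given in the pan-tilt base frame."""
    x, y, z = target_xyz
    pan = np.arctan2(y, x)                 # azimuth about the base Z axis
    tilt = np.arctan2(z, np.hypot(x, y))   # elevation from the XY plane
    return pan, tilt

# Example: a target 5 m ahead, 2 m to the side and 1 m above the pan-tilt base.
pan, tilt = pan_tilt_to_target((5.0, 2.0, 1.0))
```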
When the effective observation area of the target adapter enters the field of view of a wrist camera, the mechanical arm uses the wrist camera to obtain high-precision three-dimensional pose information of the target adapter's visual marker, guiding the end effector to approach the target adapter further and complete the precise operations of capturing, grasping and locking the adapter with high accuracy and strong stability.
When the effective observation area of the docking mechanism enters the field of view of the docking camera, the mechanical arm uses the docking camera to obtain high-precision three-dimensional pose information of the docking camera's visual marker and is guided to complete secondary docking of the cabin.
The invention provides a vision system applied to a mechanical arm of a space station.
The wrist cameras and the docking camera can switch autonomously between binocular and monocular vision measurement; in both measurement modes they retain target detection, identification and three-dimensional pose measurement even when one point of the visual marker is missing or invisible.
The wrist cameras and the docking camera integrate an image processing circuit and a pose calculation circuit, so real-time pose calculation is performed independently at the camera end without occupying the computing resources of the arm controller, and the pose results are sent to the arm controller over a communication bus.
The elbow pan-tilt camera is mounted on the elbow joint; the two-axis pan-tilt slews the elbow camera fixed on it, giving the monocular elbow camera a wide monitoring range.
An Ethernet switch is configured to receive the image data output by the multiple cameras simultaneously and forward it into the space station cabin over a single Ethernet bus, reducing the number of cables passing through the cabin wall.
The camera pose-calculation software and the Ethernet switch operating-system and application software can all be modified and upgraded by on-orbit upload, and the imaging control parameters, image compression and encoding parameters, and pose measurement parameters support on-orbit modification.
Each device of the vision system adopts a modular interface design, enabling on-orbit maintenance and replacement.
A 100 Mbit/s Ethernet bus is used to realize real-time high-speed transmission of the image data.
The cameras of the invention provide two image output modes, uncompressed and compressed: uncompressed images support on-orbit calibration of the cameras, while compressed images reduce the code rate and allow multiple image streams to be transmitted simultaneously within the bandwidth constraint.
The invention includes redundancy: in monocular vision measurement mode the wrist camera operates as a dual-unit hot backup, the elbow camera pan-tilt motor uses dual-winding backup, and the Ethernet switch uses dual-unit cold backup.
The vision system forms its various marker patterns from two elements, solid circles and rings, to serve the measurement tasks of the different cameras. The black-and-white markers are produced by a painting process whose paint and workmanship meet the requirements of long-term on-orbit use.
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.
Although the present invention has been described with reference to the preferred embodiments, it is not intended to limit the present invention, and those skilled in the art can make variations and modifications of the present invention without departing from the spirit and scope of the present invention by using the methods and technical contents disclosed above.

Claims (10)

1. A vision system applied to a space station mechanical arm, characterized by comprising at least two wrist cameras, at least one elbow pan-tilt camera, at least one docking camera, a data communication device and a data processing device;
at least one elbow pan-tilt camera is mounted at the elbow of the mechanical arm and is used for monitoring;
each wrist of the mechanical arm carries at least one wrist camera, used for acquiring the pose of a target and thereby guiding the end of the mechanical arm toward the target;
at least one docking camera is mounted on the docking section of a space station cabin, used for acquiring the pose of a target and thereby guiding the mechanical arm to complete cabin docking;
all wrist cameras and elbow pan-tilt cameras are in data communication with the data communication device;
the data communication device and all docking cameras are in data communication with the data processing device and communicate with the outside through the data processing device.
2. The vision system of claim 1, wherein the elbow pan-tilt camera uses its pan-tilt to slew the camera so that the target is kept under monitoring at all times.
3. The vision system of claim 1, wherein the wrist camera is further used to guide the capture, grasping and locking of a target adapter by the end of the mechanical arm.
4. The vision system of claim 1, wherein the elbow pan-tilt camera is a monocular camera, and the wrist cameras and the docking camera are binocular cameras.
5. The vision system of claim 4, wherein the wrist cameras and the docking camera both provide monocular and binocular measurement modes and can switch between them autonomously.
6. The vision system of claim 5, wherein the monocular measurement mode is used to measure the three-dimensional pose of the target, and the binocular measurement mode synchronously acquires images from the binocular camera and calculates the target pose through image feature matching.
7. The vision system of any one of claims 1 to 6, wherein the wrist cameras and the docking camera independently perform real-time pose calculation of the target.
8. The vision system of any one of claims 1 to 6, wherein the wrist cameras, the docking camera and the elbow pan-tilt camera are all provided with light sources for active illumination.
9. The vision system of any one of claims 1 to 6, wherein the data communication device is an Ethernet switch.
10. The vision system of any one of claims 1 to 6, further comprising visual markers from which the wrist cameras and/or the docking camera acquire the pose of the target.
CN202210977497.7A 2022-08-15 2022-08-15 Visual system applied to space station mechanical arm Pending CN115493513A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210977497.7A CN115493513A (en) 2022-08-15 2022-08-15 Visual system applied to space station mechanical arm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210977497.7A CN115493513A (en) 2022-08-15 2022-08-15 Visual system applied to space station mechanical arm

Publications (1)

Publication Number Publication Date
CN115493513A (en) 2022-12-20

Family

ID=84466339

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210977497.7A Pending CN115493513A (en) 2022-08-15 2022-08-15 Visual system applied to space station mechanical arm

Country Status (1)

Country Link
CN (1) CN115493513A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116442233A (en) * 2023-04-28 2023-07-18 哈尔滨工业大学 Hand-eye calibration method for seven-degree-of-freedom space manipulator on-orbit operation

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016193781A1 (en) * 2015-05-29 2016-12-08 Benemérita Universidad Autónoma De Puebla Motion control system for a direct drive robot through visual servoing
CN108044651A (en) * 2017-10-19 2018-05-18 北京航空航天大学 A kind of space manipulator kinematics parameters on-orbit calibration method based on binocular vision
CN108908291A (en) * 2018-06-29 2018-11-30 北京空间飞行器总体设计部 A kind of multi-arm robot for space of maintainable technology on-orbit
CN109454633A (en) * 2018-09-12 2019-03-12 华中科技大学 A kind of multi-functional in-orbit maintaining robot system
CN112556658A (en) * 2020-09-24 2021-03-26 北京空间飞行器总体设计部 Butt joint ring capture point measuring method and system based on binocular stereo vision

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016193781A1 (en) * 2015-05-29 2016-12-08 Benemérita Universidad Autónoma De Puebla Motion control system for a direct drive robot through visual servoing
CN108044651A (en) * 2017-10-19 2018-05-18 北京航空航天大学 A kind of space manipulator kinematics parameters on-orbit calibration method based on binocular vision
CN108908291A (en) * 2018-06-29 2018-11-30 北京空间飞行器总体设计部 A kind of multi-arm robot for space of maintainable technology on-orbit
CN109454633A (en) * 2018-09-12 2019-03-12 华中科技大学 A kind of multi-functional in-orbit maintaining robot system
CN112556658A (en) * 2020-09-24 2021-03-26 北京空间飞行器总体设计部 Butt joint ring capture point measuring method and system based on binocular stereo vision

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
张凯锋; 周晖; 温庆平; 桑瑞鹏: "Research on the space station manipulator" (空间站机械臂研究), Chinese Journal of Space Science (空间科学学报), no. 06, 15 November 2010 (2010-11-15), pages 612-619 *
陈金宝; 聂宏; 李有光; 王小涛: "Design and key technologies of the latching end effector for the space station's large manipulator" (空间站大机械臂锁合末端效应器设计及关键技术研究), Journal of Astronautics (宇航学报), no. 12, 30 December 2013 (2013-12-30), pages 1529-1539 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116442233A (en) * 2023-04-28 2023-07-18 哈尔滨工业大学 Hand-eye calibration method for seven-degree-of-freedom space manipulator on-orbit operation
CN116442233B (en) * 2023-04-28 2024-04-12 哈尔滨工业大学 Hand-eye calibration method for seven-degree-of-freedom space manipulator on-orbit operation

Similar Documents

Publication Publication Date Title
CN112505065B (en) Method for detecting surface defects of large part by indoor unmanned aerial vehicle
CN110262507B (en) Camera array robot positioning method and device based on 5G communication
DE102010045752B4 (en) Visual perception system and method for a humanoid robot
KR101945019B1 (en) System for swarm flight of unmanned aerial vehicles for acquiring images of crop growing distribution and method thereof
CN102590882B (en) Foreign body monitoring system for airport road surface
CN111421539A (en) Industrial part intelligent identification and sorting system based on computer vision
CN109967292A (en) A kind of automatic spraying system and its method based on the reconstruct of workpiece profile information three-dimensional
CN110977964A (en) Intelligent inspection robot for detecting micro-leakage of power plant equipment operation and detection method
CN110113570A (en) A kind of autonomous cruising inspection system of power transmission line unmanned machine and its working method
CN105203046A (en) Multi-line array laser three-dimensional scanning system and method
CN102922521A (en) Mechanical arm system based on stereo visual serving and real-time calibrating method thereof
CN108985184B (en) Automatic mounting system and method for multipurpose aircraft plug-in
CN111624994A (en) Robot inspection method based on 5G communication
CN115493513A (en) Visual system applied to space station mechanical arm
CN107103624A (en) Stereoscopic vision handling system and its method for carrying
CN109731708A (en) Auto repair auto spray painting method based on image recognition
CN109459984A (en) A kind of positioning grasping system and its application method based on three-dimensional point cloud
CN113031462A (en) Port machine inspection route planning system and method for unmanned aerial vehicle
CN111774775B (en) Three-dimensional vision system for gantry type robot welding of large-scale structural part and control method
CN116774736B (en) Unmanned aerial vehicle autonomous inspection system and method free of preset route
CN110861073A (en) Visual detection system and method of robot based on overhead high-voltage transmission line
CN110202581A (en) Compensation method, device and the electronic equipment of end effector of robot operating error
CN113001142B (en) Automatic double-mechanical-arm assembling system for large-scale block optical assembly
CN114296477A (en) Unmanned mobile platform autonomous landing method for air-ground cooperative combat
CN109176464A (en) Cable auxiliary assembly system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination