CN115519546B - Intelligent-vision-based space-ground collaborative space science experiment robot - Google Patents

Intelligent-vision-based space-ground collaborative space science experiment robot

Info

Publication number
CN115519546B
CN115519546B (application CN202211309652.4A)
Authority
CN
China
Prior art keywords
sample box
module
space
science experiment
representing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211309652.4A
Other languages
Chinese (zh)
Other versions
CN115519546A (en)
Inventor
于强
鲁鹏飞
于泽华
刘晓珂
任俊竹
戴宏伟
霍晓智
李欣泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Space Science Center of CAS
Original Assignee
National Space Science Center of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Space Science Center of CAS
Priority to CN202211309652.4A
Publication of CN115519546A
Application granted
Publication of CN115519546B
Legal status: Active
Anticipated expiration

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J 9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J 9/1612 Programme controls characterised by the hand, wrist, grip control
    • B25J 9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1661 Programme controls characterised by task planning, object-oriented languages
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B25J 9/1689 Teleoperation
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Analysing Materials By The Use Of Radiation (AREA)

Abstract

The invention relates to an intelligent-vision-based space-ground collaborative space science experiment robot, which comprises a space material science experiment sample box visual positioning module, a space material science experiment sample box grabbing module, a space material science experiment operation instruction database module and a space-ground cooperative operation module. The visual positioning module acquires the pose of the door handle of the batch sample management module in the high-temperature cabinet, of the space material science experiment sample box, and of the sample box handle in the X-ray transmission imaging module, and sends the pose information to the grabbing module. The grabbing module plans a path from the pose information and executes the sample box replacement task. The operation instruction database module stores the related instruction data in a database. The space-ground cooperative operation module retrieves the data stored in the operation instruction database module and issues the instructions for the sample box replacement task to the grabbing module.

Description

Intelligent-vision-based space-ground collaborative space science experiment robot
Technical Field
The invention relates to the field of space science experiments conducted on China's space station, and in particular to a space science experiment robot based on intelligent vision.
Background
China's space station high-temperature material science experiment cabinet (the "high-temperature cabinet") is about to carry out space material science research using advanced technologies such as on-line sample measurement and X-ray transmission imaging. The high-temperature cabinet supports melt-growth and solidification experiments on a variety of materials.
The high-temperature cabinet loads 16 samples per batch. After all 16 samples have been tested, an astronaut must replace the whole batch.
The high-temperature cabinet therefore faces the following problems during on-orbit experiments:
(1) Replacing a whole batch of samples on orbit. Astronauts need long-term training on the ground and multiple operations in space to accomplish such tasks.
(2) Avoiding radiation from the X-ray experiment. While the X-ray source is powered on, astronauts cannot approach it, so many experimental operations cannot be performed on line in real time.
(3) Improving the overall automation and intelligence of the experimental equipment. There are 13 scientific experiment cabinets on China's space station, covering disciplines such as space materials, space fluids, space combustion, space physics and space life science. Astronauts carry many missions, are few in number, and the experimental workload is heavy; their workload therefore needs to be reduced so that more space science experiments can be carried out.
A robot that replaces astronauts in completing heavy space experiment tasks efficiently and safely keeps astronauts away from dangerous processes such as high temperature and radiation, improves the efficiency of space science experiments, and reduces their cost. The device can be used not only in the space station's high-temperature material science experiment cabinet but also in other experiment cabinets.
Disclosure of Invention
The invention aims to realize the replacement of sample boxes in the high-temperature cabinet when the space station high-temperature material science experiment cabinet (the "high-temperature cabinet") carries out space material science research, and provides an intelligent-vision-based space-ground collaborative space science experiment robot.
In order to achieve the above purpose, the present invention is realized by the following technical scheme.
The invention provides an intelligent-vision-based space science experiment robot for completing the on-orbit replacement of experiment material sample boxes in the space station high-temperature cabinet. The robot includes: a space material science experiment sample box visual positioning module, a space material science experiment sample box grabbing module, a space material science experiment operation instruction database module and a space-ground cooperative operation module; wherein:
the visual positioning module acquires, using a space science experiment image acquisition technique, the pose of the door handle of the batch sample management module in the high-temperature cabinet, of the space material science experiment sample box, and of the sample box handle in the X-ray transmission imaging module, and sends the pose information to the grabbing module;
the grabbing module plans a path according to the pose information and executes the sample box replacement task;
the operation instruction database module stores the instruction data related to the robot in a database;
the space-ground cooperative operation module retrieves the instruction data stored in the operation instruction database module and issues the instructions for the sample box replacement task to the grabbing module; it also monitors the working state of the robot's modules.
As one improvement of the above technical solution, the sample box replacement task comprises the following steps in sequence:
grasping the door handle of the high-temperature cabinet batch sample management module and opening the door;
extracting the space material science experiment sample box in the high-temperature cabinet or in the X-ray transmission imaging module;
taking out the material sample box whose space material science experiment has been completed;
installing the new material sample box;
inserting the sample box into the high-temperature cabinet;
grasping the door handle of the high-temperature cabinet batch sample management module and closing the door.
As one improvement of the above technical solution, the visual positioning module comprises: a space science experiment image sensor, a pose sensing component, a sample box data analysis component and a sample box pose data exchange component; wherein:
the image sensor obtains three-dimensional information of the door handle of the high-temperature cabinet batch sample management module, the space material science experiment sample box and the sample box handle in the X-ray transmission imaging module, and transmits it to the pose sensing component;
the pose sensing component converts this three-dimensional information into image information of the same targets and transmits the image information to the sample box data analysis component;
the sample box data analysis component analyses the transmitted image information to obtain the pose of the door handle, the sample box and the sample box handle;
the sample box pose data exchange component transmits this pose information to the space-ground cooperative operation module.
As one improvement of the above technical solution, obtaining the pose information in the visual positioning module comprises:
first calibrating the camera, then calibrating the projector:
$$Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=A\begin{bmatrix}R & T\end{bmatrix}\begin{bmatrix}X\\ Y\\ Z\\ 1\end{bmatrix}\tag{1}$$

where $u$ and $v$ are the horizontal and vertical coordinates in the pixel coordinate system, $Z_c$ is the object distance, $A$ is the camera intrinsic matrix, $R$ is the camera rotation matrix, $T$ is the camera translation matrix, and $X$, $Y$, $Z$ are the coordinates in the world coordinate system;
performing the two-dimensional Fourier transform:

$$F(u,v)=\sum_{x=0}^{M-1}\sum_{y=0}^{N-1}f(x,y)\,e^{-j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)}\tag{2}$$

where $F(u,v)$ is the frequency-domain image after the Fourier transform, $M$ and $N$ are the width and height of the image, and $f(x,y)$ is the spatial-domain image before the transform;
selecting the transfer function $H(u,v)$:

$$H(u,v)=\begin{cases}1, & D(u,v)\le D_0\\ 0, & D(u,v)> D_0\end{cases}\tag{3}$$

where $D(u,v)$ is the distance from the point $(u,v)$ to the centre of the Fourier spectrum and $D_0$ is the cut-off frequency;
performing the two-dimensional inverse Fourier transform:

$$f(x,y)=\frac{1}{MN}\sum_{u=0}^{M-1}\sum_{v=0}^{N-1}F(u,v)H(u,v)\,e^{j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)}\tag{4}$$
obtaining the Gray-code pixel coordinates using an edge detection algorithm; then
computing the world coordinates of the point $P$ to be measured as the intersection of the camera ray and the projector ray:

$$\frac{X-X_{woc}}{x_c}=\frac{Y-Y_{woc}}{y_c}=\frac{Z-Z_{woc}}{z_c},\qquad\frac{X-X_{wop}}{x_p}=\frac{Y-Y_{wop}}{y_p}=\frac{Z-Z_{wop}}{z_p}\tag{5}$$

where $O_c$ is the origin of the camera coordinate system, with world coordinates $(X_{woc}, Y_{woc}, Z_{woc})$; $R_c$ and $t_c$ are the rotation and translation matrices of the camera; $P_c$ is the point $P$ expressed in the camera coordinate system, with coordinates $(X_{wpc}, Y_{wpc}, Z_{wpc})$; $(x_c, y_c, z_c)$ are the normalized coordinates of the camera ray; $O_p$ is the origin of the projector coordinate system, with world coordinates $(X_{wop}, Y_{wop}, Z_{wop})$; $P_p$ is the point $P$ expressed in the projector coordinate system, with coordinates $(X_{wpp}, Y_{wpp}, Z_{wpp})$; and $(x_p, y_p, z_p)$ are the normalized coordinates of the projector ray;
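The ray-intersection step above can be sketched as a least-squares midpoint computation: with noise-free data the midpoint coincides with the exact intersection of the camera and projector rays. `triangulate` is an illustrative name, not from the patent:

```python
import numpy as np

def triangulate(o_c, d_c, o_p, d_p):
    """Closest point to two rays (camera and projector lines of sight).

    Least-squares midpoint of the camera ray o_c + s*d_c and the projector
    ray o_p + t*d_p; with noise-free data this is their exact intersection.
    """
    d_c = d_c / np.linalg.norm(d_c)
    d_p = d_p / np.linalg.norm(d_p)
    # Solve [d_c, -d_p] [s, t]^T ~= o_p - o_c in the least-squares sense.
    A = np.stack([d_c, -d_p], axis=1)
    s, t = np.linalg.lstsq(A, o_p - o_c, rcond=None)[0]
    return 0.5 * ((o_c + s * d_c) + (o_p + t * d_p))
```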
performing ICP point cloud registration, comprising:

given two point clouds $X$ and $P$,

$$X=(x_1,x_2,\dots,x_n)\tag{6}$$

$$P=(p_1,p_2,\dots,p_n)\tag{7}$$

where $n$ is the total number of points, $x_i$ a point in set $X$ and $p_i$ a point in set $P$;
solving for the $R$ and $t$ that minimize $E(R,t)$:

$$E(R,t)=\frac{1}{n}\sum_{i=1}^{n}\bigl\|x_i-(Rp_i+t)\bigr\|^{2}\tag{8}$$

where $E(R,t)$ is the error function, $R$ the rotation matrix, $t$ the translation matrix, $x_i$ a point in $X$ and $p_i$ a point in $P$;
computing the centroids of the two point clouds:

$$u_x=\frac{1}{n}\sum_{i=1}^{n}x_i\tag{9}$$

$$u_p=\frac{1}{n}\sum_{i=1}^{n}p_i\tag{10}$$
expressing the points of both clouds relative to their centroids:

$$X'=\{x'_i\},\quad x'_i=x_i-u_x\tag{11}$$

$$P'=\{p'_i\},\quad p'_i=p_i-u_p\tag{12}$$

where $x'_i$ and $p'_i$ are the centroid-centred coordinates of the points in $X$ and $P$;
computing $\omega$ and performing SVD on it:

$$\omega=\sum_{i=1}^{n}p'_i\,{x'_i}^{T}=U\begin{bmatrix}\delta_1&0&0\\0&\delta_2&0\\0&0&\delta_3\end{bmatrix}V^{T}\tag{13}$$

where $\omega$ is the matrix to be decomposed, $U$ and $V$ are orthogonal matrices, the superscript $T$ denotes transposition, and $\delta_1$, $\delta_2$, $\delta_3$ are the non-zero singular values of $\omega$;
the transformation between the two point clouds is then

$$R=VU^{T}\tag{14}$$

$$t=u_x-Ru_p\tag{15}$$

where $R$ is the rotation matrix and $t$ the translation matrix.
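The centroid/SVD alignment step above, the closed-form core of each ICP iteration, can be sketched in NumPy. The sketch assumes correspondences between the rows of the two clouds are already established, and omits the reflection check on det(R) that a robust implementation adds:

```python
import numpy as np

def svd_align(X, P):
    """Best rigid transform (R, t) mapping point cloud P onto X."""
    u_x = X.mean(axis=0)               # centroid of X
    u_p = P.mean(axis=0)               # centroid of P
    Xc, Pc = X - u_x, P - u_p          # centroid-centred clouds
    omega = Pc.T @ Xc                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(omega)    # omega = U diag(deltas) V^T
    R = Vt.T @ U.T                     # rotation: R = V U^T
    t = u_x - R @ u_p                  # translation: t = u_x - R u_p
    return R, t
```

With exact correspondences a single call recovers the true rigid transform; inside a full ICP loop the correspondences are re-estimated by nearest-neighbour search before each call.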
As one improvement of the above technical solution, the grabbing module comprises a mechanical arm and a gripper, which complete the sample box replacement task according to the acquired pose information and the planned path, specifically:
setting the initial position of the mechanical arm;
obtaining, through the visual positioning module, the positions of the door handle of the high-temperature cabinet batch sample management module, the space material science experiment sample box and the sample box handle in the X-ray transmission imaging module, and sending the position information to the mechanical arm;
planning the path of the mechanical arm;
controlling the mechanical arm and the gripper to open the door of the high-temperature cabinet batch sample management module, thereby completing the sample box replacement task in the high-temperature cabinet or in the X-ray transmission imaging module.
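The four steps above can be sketched as a control sequence, with stub classes standing in for the real arm and vision hardware. All class and method names here are illustrative assumptions, not from the patent:

```python
class StubVision:
    """Stands in for the sample box visual positioning module."""
    def locate(self, target):
        return {"target": target, "xyz": (0.1, 0.2, 0.3)}  # fake pose

class StubArm:
    """Stands in for the mechanical arm and gripper of the grabbing module."""
    def __init__(self):
        self.log = []
    def move_to_home(self):
        self.log.append("home")          # step 1: initial arm position
    def plan_path(self, pose):
        self.log.append("plan")          # step 3: path planning
        return [pose["xyz"]]
    def follow(self, path):
        self.log.append("move")
    def grip(self):
        self.log.append("grip")          # step 4: grasp the door handle

def open_cabinet_door(arm, vision):
    """Steps 1-4 of the grabbing sequence for the cabinet door."""
    arm.move_to_home()
    pose = vision.locate("door_handle")  # step 2: pose from the vision module
    path = arm.plan_path(pose)
    arm.follow(path)
    arm.grip()
    return arm.log
```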
As one of the improvements of the above technical solution, the path planning for the mechanical arm includes:
inserting intermediate points between the start point and the end point using quintic polynomial interpolation:

$$\begin{aligned}\theta(t)&=a_0+a_1t+a_2t^{2}+a_3t^{3}+a_4t^{4}+a_5t^{5}\\ \dot\theta(t)&=a_1+2a_2t+3a_3t^{2}+4a_4t^{3}+5a_5t^{4}\\ \ddot\theta(t)&=2a_2+6a_3t+12a_4t^{2}+20a_5t^{3}\end{aligned}\tag{16}$$

where $\theta(t)$, $\dot\theta(t)$ and $\ddot\theta(t)$ are the angular displacement, angular velocity and angular acceleration at time $t$, and $a_0,a_1,a_2,a_3,a_4,a_5$ are the coefficients to be solved;
constraining the position, velocity and acceleration at the start and end points:

$$\theta(t_0)=\theta_0,\quad\dot\theta(t_0)=\dot\theta_0,\quad\ddot\theta(t_0)=\ddot\theta_0,\qquad\theta(t_f)=\theta_f,\quad\dot\theta(t_f)=\dot\theta_f,\quad\ddot\theta(t_f)=\ddot\theta_f\tag{17}$$

where $t_0$ is the start time, $\theta_0$, $\dot\theta_0$ and $\ddot\theta_0$ are the start position, velocity and acceleration, $t_f$ is the end time, and $\theta_f$, $\dot\theta_f$ and $\ddot\theta_f$ are the end position, velocity and acceleration;
solving (taking $t_0=0$) gives:

$$\begin{aligned}a_0&=\theta_0\\ a_1&=\dot\theta_0\\ a_2&=\tfrac{1}{2}\ddot\theta_0\\ a_3&=\frac{20(\theta_f-\theta_0)-(8\dot\theta_f+12\dot\theta_0)t_f-(3\ddot\theta_0-\ddot\theta_f)t_f^{2}}{2t_f^{3}}\\ a_4&=\frac{30(\theta_0-\theta_f)+(14\dot\theta_f+16\dot\theta_0)t_f+(3\ddot\theta_0-2\ddot\theta_f)t_f^{2}}{2t_f^{4}}\\ a_5&=\frac{12(\theta_f-\theta_0)-6(\dot\theta_f+\dot\theta_0)t_f-(\ddot\theta_0-\ddot\theta_f)t_f^{2}}{2t_f^{5}}\end{aligned}\tag{18}$$

where $\theta_0$, $\dot\theta_0$ and $\ddot\theta_0$ are the start position, velocity and acceleration, and $\theta_f$, $\dot\theta_f$ and $\ddot\theta_f$ are the end position, velocity and acceleration.
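The closed-form quintic coefficients above can be computed directly, assuming the start time is taken as zero (the general case only shifts the time axis). `quintic_coeffs` and `theta` are illustrative names:

```python
def quintic_coeffs(th0, v0, ac0, thf, vf, acf, tf):
    """Coefficients a0..a5 of a quintic trajectory on [0, tf].

    Boundary conditions: position, velocity and acceleration at both the
    start (th0, v0, ac0) and the end (thf, vf, acf) of the motion.
    """
    a0 = th0
    a1 = v0
    a2 = ac0 / 2.0
    a3 = (20*(thf - th0) - (8*vf + 12*v0)*tf - (3*ac0 - acf)*tf**2) / (2*tf**3)
    a4 = (30*(th0 - thf) + (14*vf + 16*v0)*tf + (3*ac0 - 2*acf)*tf**2) / (2*tf**4)
    a5 = (12*(thf - th0) - 6*(vf + v0)*tf - (ac0 - acf)*tf**2) / (2*tf**5)
    return [a0, a1, a2, a3, a4, a5]

def theta(coeffs, t):
    """Evaluate the quintic polynomial at time t."""
    return sum(c * t**i for i, c in enumerate(coeffs))
```

A rest-to-rest move (zero boundary velocity and acceleration) is the common case for pick-and-place joints, and is a convenient way to check that the coefficients honour the boundary conditions.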
As one improvement of the above technical solution, the space-ground cooperative operation module comprises a robot operation module and a robot state monitoring module; wherein:
the robot operation module controls the motors of the mechanical arm and the gripper: upon receiving an operation instruction, it drives the corresponding arm motors and commands the gripper to complete the grasping action, thereby completing the sample box replacement task;
the state monitoring module monitors the motion state and key parameters of the robot on orbit to judge its working state.
As one improvement of the above technical solution, the motors of the mechanical arm comprise: a base motor, a shoulder motor, an elbow motor, and first, second and third wrist motors;
the base motor, at the bottom of the arm, rotates the whole arm in the horizontal plane;
the shoulder motor, above the base motor, rotates the arm in the vertical plane;
the elbow motor, in the middle of the arm, drives the three wrist motors back and forth;
the first, second and third wrist motors rotate the gripper back and forth and left and right, moving it to the designated position.
As one improvement of the above technical solution, the robot-related parameter tables include: an experiment parameter table, an alarm log table, a robot parameter table, an action flow table, a replacement flow instruction table and a picture/video storage table; wherein:
the database comprises a business application, a client driver and an openGauss server; the business application supports the functions of the space-ground cooperative operation module; the client driver receives access requests from the business application, returns execution results to it, communicates with the openGauss server, issues the SQL to be executed, and receives the execution results; the openGauss server stores the business data, executes the data query tasks and returns the execution results to the client driver;
the experiment parameter table stores, with timestamps, the state information of the door handle of the high-temperature cabinet batch sample management module, the space material science experiment sample box and the sample box handle in the X-ray transmission imaging module, and also allows the working state during the experiment to be monitored in real time; the state information includes: time, the open/closed state of the door handle, the sample replacement batch, the number of samples not yet replaced and the replacement state of the sample box; the working state information includes: the temperature inside the high-temperature cabinet, the shell temperature, the current and the voltage;
the alarm log table records, when a parameter exceeds its specified range during the experiment, the time of the anomaly, the abnormal data value, the expected data value and the physical quantity concerned;
the robot parameter table records state parameters including the current state of the arm motors and the state parameters of the robot; the motor states include the rotation angle, the movement distance and the motor power; the robot state parameters include the ambient temperature;
the action flow table records the execution time of an action, the ID of the action to be executed, the object on which the action acts, the parameters used during execution, the duration of the action and its execution state;
the replacement flow instruction table lists the action sequence that the robot executes in order to complete the sample replacement task;
the picture/video storage table stores the videos and images captured by the visual positioning module; the pictures can be viewed in real time during the experiment, and the experiment videos can be reviewed after it has finished.
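The action flow table and the ordered execution it supports can be sketched with SQLite standing in for the openGauss server described above. The column names and sample rows are assumptions for illustration, not taken from the patent:

```python
import sqlite3

def build_action_flow():
    """Create an action flow table and load a door-opening sequence."""
    con = sqlite3.connect(":memory:")
    con.execute("""
        CREATE TABLE action_flow (
            step       INTEGER PRIMARY KEY,  -- order within the flow
            action_id  INTEGER,              -- ID of the action to execute
            target     TEXT,                 -- object the action operates on
            params     TEXT,                 -- parameters used while executing
            duration_s REAL,                 -- planned duration of the action
            status     TEXT                  -- execution state
        )""")
    con.executemany(
        "INSERT INTO action_flow VALUES (?, ?, ?, ?, ?, ?)",
        [(1, 101, "door_handle", "grip_force=5N", 4.0, "pending"),
         (2, 102, "cabinet_door", "angle=90deg", 6.0, "pending")])
    return con

def next_pending(con):
    """Fetch targets of pending actions in flow order."""
    return [row[0] for row in con.execute(
        "SELECT target FROM action_flow WHERE status='pending' ORDER BY step")]
```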
Compared with the prior art, the invention has the following advantages:
1. The invention uses a three-dimensional visual perception algorithm to obtain the spatial pose of the high-temperature cabinet and the samples in the space station, feeds it into the motion controller, and drives the robot to complete the replacement of a whole batch of samples (one by one), replacing the astronauts' on-line experimental operations, improving the overall automation and intelligence of the experimental equipment, reducing the astronauts' workload and completing space science experiment tasks quickly and efficiently;
2. The invention guarantees astronaut safety by completely removing astronauts from dangerous processes such as high temperature and X-ray radiation, improves the efficiency of space science experiments and reduces their operating cost; the robot performs auxiliary experiments with short, stable operations, freeing the astronauts' hands and allowing them to devote more time and energy to other important tasks;
3. The robot is mainly applied to space material science experiments in China's space station high-temperature material science experiment cabinet, and can also be applied to experiments in other scientific experiment cabinets on the station;
4. By storing the various instructions, flows and parameters in the operation instruction database module, the invention allows an astronaut to conveniently obtain the instructions required for an operation from the database, realizing precise control of the robot and completing the sample box replacement task in the space station.
Drawings
FIG. 1 is a block diagram of the components of the present invention;
FIG. 2 is a block diagram of a visual positioning module of the spatial material science experiment sample box of the present invention;
FIG. 3 is a diagram of a robotic arm model of the present invention;
FIG. 4 is a diagram of a database architecture of the present invention.
Detailed Description
The invention discloses an intelligent-vision-based space science experiment robot (the "Aerospace Knight"), and relates to a method for automatically completing space station science experiment tasks (including replacing experiment samples, replacing experiment modules, performing experiments in the X-ray transmission imaging module, etc.) by means of artificial intelligence and space-ground cooperative operation. The robot includes: a space material science experiment sample box visual positioning module, a space material science experiment sample box grabbing module, a space-ground cooperative operation module and a space material science experiment operation instruction database module. The robot is mainly applied to space material science experiments in China's space station high-temperature material science experiment cabinet, and can also be applied to experiments in other scientific experiment cabinets on the station. A three-dimensional visual perception algorithm obtains the spatial pose of the high-temperature cabinet and the samples, feeds it into the motion controller, and drives the robot to complete the replacement of a whole batch of samples (one by one), replacing the astronauts' on-line experimental operations, improving the overall automation and intelligence of the experimental equipment, reducing the astronauts' workload and completing space science experiment tasks quickly and efficiently.
The invention designs a space science experiment assisting robot that helps astronauts complete heavy space science experiment tasks efficiently and safely, guarantees their safety by completely removing them from dangerous processes such as high temperature and X-ray radiation, improves the efficiency of space science experiments and reduces their operating cost. Auxiliary experiments performed with the robot take little time and run stably, while freeing the astronauts' hands, so that they can devote more time and energy to other important tasks.
The invention aims to provide a method for an experimental robot of space science based on the space-earth coordination of intelligent vision, which is an auxiliary robot capable of replacing astronauts to perform experiments in space stations:
in order to achieve the above purpose, the present invention adopts the following technical scheme: as shown in fig. 1, the space science experiment robot based on intelligent vision mainly comprises:
a space material science experiment sample box visual positioning module, a space material science experiment sample box grabbing module, an "Aerospace Knight" space-ground cooperative operation module, and a space material science experiment operation instruction database module; wherein:
the space material science experiment sample box visual positioning module is used for collecting pose information of the door handle of the batch sample management module in the high-temperature cabinet, of the space material science experiment sample box, and of the sample box handle in the X-ray transmission imaging module, and for sending the pose information to the space material science experiment sample box grabbing module to complete the sample box replacement task;
the space material science experiment sample box grabbing module is used for executing the sample box replacement task, which mainly comprises opening and closing the door handle of the high-temperature cabinet batch sample management module, extracting and inserting the space material science experiment sample box in the high-temperature cabinet or in the X-ray transmission imaging module, taking out a material sample box whose experiment has been completed, and installing a new material sample box;
the "Aerospace Knight" space-ground cooperative operation module is used for assisting ground operators in judging whether the current working state of the robot is good: it takes space-ground cooperative instructions from the database, sends them to the robot on the space station through the space-ground instruction uplink system, and cross-checks them against the received data of the 6-joint mechanical arm, providing ground operators with intuitive state information of the intelligent-vision-based, space-ground cooperative space science experiment robot;
the space material science experiment operation instruction database module is used for storing uplink operation instructions, space-ground cooperative operation parameters, robot alarm logs, and space high-temperature material science experiment flows, so that ground staff can conveniently retrieve and check them, and astronauts in the space station can monitor the space high-temperature material science experiment process and the robot's working state online in real time.
Further, the visual positioning module of the space material science experiment sample box comprises a pose sensing part, a space material science experiment material sample box data analysis part and a space material science experiment material sample box pose data exchange part;
Further, the pose sensing component acquires, by means of the space science experiment image sensor, image information of the door handle in the high-temperature cabinet batch sample management module, of the space material science experiment sample box, and of the sample box handle in the X-ray transmission imaging module, and transmits this visual information to the space material science experiment material sample box data analysis component;
further, the space material science experiment material sample box data analysis component analyzes the image information transmitted by the visual positioning module to obtain pose information of the door handle in the high-temperature cabinet batch sample management module, of the space material science experiment sample box, and of the sample box handle in the X-ray transmission imaging module;
further, the space material science experiment material sample box pose data exchange component transmits the pose information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box, and the sample box handle in the X-ray transmission imaging module to the controller;
Further, the space science experiment image acquisition technology projects a pre-designed structured pattern onto the surface of a three-dimensional object and uses the space science experiment image sensor to observe how the pattern is distorted on that surface, thereby obtaining three-dimensional information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box, and the sample box handle in the X-ray transmission imaging module; the flow is shown in Fig. 2;
furthermore, the robot's operation strategy is learned as follows: a mathematical model of the sample replacement task is extracted with an imitation-learning-based algorithm and preliminarily generalized to the working scene of the space station, reducing the search space of the control quantities; the action strategy is then optimized with a reinforcement learning algorithm. Imitation learning introduces the experience of actual astronaut operations, accelerating the learning process and improving the feasibility of the action strategy in the initial stage of reinforcement learning training, thereby effectively improving the reliability of task execution.
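The imitation-then-refinement idea above can be sketched in a toy form: behavior cloning fits a policy to a few demonstrations, and a reinforcement step refines it against a task reward. Everything below (the linear policy, the synthetic demonstrations, the quadratic reward) is an illustrative assumption, not the patent's actual learning pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: imitation learning (behavior cloning) on a handful of noisy
# synthetic "astronaut demonstrations" -- states paired with expert actions.
expert_w = np.array([0.5, -1.0, 0.3, 0.8])   # true (unknown) optimal linear policy
states = rng.normal(size=(20, 4))
actions = states @ expert_w + rng.normal(scale=0.05, size=20)
w_bc, *_ = np.linalg.lstsq(states, actions, rcond=None)  # cloned policy

# Stage 2: reinforcement refinement -- gradient ascent on a toy quadratic
# task reward whose optimum is the true expert policy.
def reward(w):
    return -float(np.sum((w - expert_w) ** 2))

w = w_bc.copy()
for _ in range(100):
    grad = -2.0 * (w - expert_w)   # analytic gradient of the toy reward
    w = w + 0.1 * grad

# Imitation supplies a feasible starting policy; refinement does not degrade it.
assert reward(w) >= reward(w_bc)
```

The point of the sketch is the ordering: the cloned policy `w_bc` is already close to feasible, so the reinforcement stage starts from a sensible region rather than from random initialization.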
Vision calibration calibrates the camera first and then the projector:
$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A\,\begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \tag{1}$$
The method adopts two-dimensional Fourier transform:
$$F(u,v)=\sum_{x=0}^{M-1}\sum_{y=0}^{N-1} f(x,y)\, e^{-j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)} \tag{2}$$
transfer function:
$$H(u,v)=\begin{cases}1, & D(u,v)\le D_0\\ 0, & D(u,v)> D_0\end{cases} \tag{3}$$
Inverse two-dimensional Fourier transform:
$$f(x,y)=\frac{1}{MN}\sum_{u=0}^{M-1}\sum_{v=0}^{N-1} F(u,v)\, e^{\,j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)} \tag{4}$$
The gray code pixel coordinates are then obtained using an edge detection algorithm.
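Equations (2)-(4) describe a standard frequency-domain filtering pipeline: forward 2-D FFT, multiplication by the transfer function H(u, v), and inverse 2-D FFT. A minimal NumPy sketch (the test image and cutoff are illustrative, and H is taken as the ideal low-pass filter of Eq. (3)):

```python
import numpy as np

def ideal_lowpass_filter(img, d0):
    """Filter an image via the pipeline of Eqs. (2)-(4): 2-D FFT, an ideal
    low-pass transfer function H(u,v) (1 inside radius d0 of the spectrum
    center, 0 outside), and the inverse 2-D FFT."""
    M, N = img.shape
    F = np.fft.fftshift(np.fft.fft2(img))            # Eq. (2), centered spectrum
    u = np.arange(M)[:, None] - M // 2
    v = np.arange(N)[None, :] - N // 2
    H = (np.sqrt(u**2 + v**2) <= d0).astype(float)   # Eq. (3)
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))  # Eq. (4)

img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0                  # toy stand-in for a fringe pattern
smooth = ideal_lowpass_filter(img, d0=10)
assert smooth.shape == img.shape
```

Because the filter keeps the DC component, the mean intensity of the image is preserved; only the sharp edges (high frequencies) are attenuated.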
World coordinates of the point to be measured P:
$$P = O_c + \lambda_c R_c^{T}\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = O_p + \lambda_p R_p^{T}\begin{bmatrix} x_p \\ y_p \\ z_p \end{bmatrix} \tag{5}$$
ICP point cloud registration is as follows:
given two point cloud sets
X = (x_1, x_2, …, x_n)  (6)
P = (p_1, p_2, …, p_n)  (7)
Solving for R and t to minimize the following formula:
$$E(R,t)=\frac{1}{n}\sum_{i=1}^{n}\left\| x_i-(R p_i+t)\right\|^2 \tag{8}$$
centroid of two sets of point clouds:
$$u_x=\frac{1}{n}\sum_{i=1}^{n}x_i \tag{9}$$

$$u_p=\frac{1}{n}\sum_{i=1}^{n}p_i \tag{10}$$
coordinates of points in the two sets of point clouds with centroid as origin:
x′_i = x_i − u_x  (11)

p′_i = p_i − u_p  (12)
Constructing ω and subjecting it to SVD decomposition:
$$\omega=\sum_{i=1}^{n} p'_i\,{x'_i}^{T}=U\begin{bmatrix}\delta_1&0&0\\0&\delta_2&0\\0&0&\delta_3\end{bmatrix}V^{T} \tag{13}$$
the transformation relationship between the two groups of point clouds is as follows:
R = V U^T  (14)

t = u_x − R u_p  (15)
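Steps (6)-(15) — centroiding, the cross-covariance matrix ω, its SVD, and the closed-form R = VUᵀ, t = u_x − R u_p — can be sketched in NumPy. The snippet below performs one SVD-based alignment of two synthetic point clouds with known correspondences; a full ICP would iterate this with nearest-neighbor matching, which is omitted here.

```python
import numpy as np

def align(X, P):
    """One SVD alignment step of Eqs. (6)-(15): find R, t minimizing
    sum ||x_i - (R p_i + t)||^2 for known correspondences."""
    u_x = X.mean(axis=0)                  # centroid of X, Eq. (9)
    u_p = P.mean(axis=0)                  # centroid of P, Eq. (10)
    Xc = X - u_x                          # centered points, Eq. (11)
    Pc = P - u_p                          # centered points, Eq. (12)
    omega = Pc.T @ Xc                     # cross-covariance, Eq. (13)
    U, _, Vt = np.linalg.svd(omega)
    R = Vt.T @ U.T                        # Eq. (14): R = V U^T
    if np.linalg.det(R) < 0:              # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = u_x - R @ u_p                     # Eq. (15)
    return R, t

rng = np.random.default_rng(1)
P = rng.normal(size=(100, 3))
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 0.3])
X = P @ R_true.T + t_true                 # transform P to build the target cloud
R, t = align(X, P)
assert np.allclose(R, R_true, atol=1e-8) and np.allclose(t, t_true, atol=1e-8)
```

The determinant check is the standard safeguard: with noisy or degenerate data the raw VUᵀ product can be a reflection rather than a rotation.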
Furthermore, the space material science experiment sample box grabbing module mainly comprises a mechanical arm and a clamping hand, and path planning is performed for the mechanical arm; as shown in Fig. 3, a 6-axis mechanical arm with its clamping hand is selected.
Further, the working process of the "Aerospace Knight" mechanical arm is as follows: the arm's key positions are set, including its initial position, the position of the door handle in the high-temperature cabinet batch sample management module, the position of the space material science experiment sample box, and the position of the sample box handle in the X-ray transmission imaging module; these positions are located by the space material science experiment sample box visual positioning module; after the visual positioning module obtains the positioning information, it sends the position information to the mechanical arm, and the arm and clamping hand are then controlled to open the door of the high-temperature cabinet batch sample management module and complete the task of replacing the space material science experiment sample box, or of replacing the sample box in the X-ray transmission imaging module.
Further, path planning is performed for the mechanical arm so that it moves smoothly along the trajectory: an intermediate point is inserted between the starting point and the ending point, and quintic polynomial interpolation is used to avoid non-smooth angular velocity changes and jumps in acceleration; the expression is:
$$\theta(t)=a_0+a_1 t+a_2 t^2+a_3 t^3+a_4 t^4+a_5 t^5 \tag{16}$$
Boundary constraints on the angle, angular velocity, and angular acceleration at the start and end points:
$$\theta(0)=\theta_0,\quad \theta(t_f)=\theta_f,\quad \dot{\theta}(0)=\dot{\theta}_0,\quad \dot{\theta}(t_f)=\dot{\theta}_f,\quad \ddot{\theta}(0)=\ddot{\theta}_0,\quad \ddot{\theta}(t_f)=\ddot{\theta}_f \tag{17}$$
Solving for the coefficients:
$$a_0=\theta_0,\quad a_1=\dot{\theta}_0,\quad a_2=\frac{\ddot{\theta}_0}{2},$$
$$a_3=\frac{20(\theta_f-\theta_0)-(8\dot{\theta}_f+12\dot{\theta}_0)t_f-(3\ddot{\theta}_0-\ddot{\theta}_f)t_f^2}{2t_f^3},$$
$$a_4=\frac{30(\theta_0-\theta_f)+(14\dot{\theta}_f+16\dot{\theta}_0)t_f+(3\ddot{\theta}_0-2\ddot{\theta}_f)t_f^2}{2t_f^4},$$
$$a_5=\frac{12(\theta_f-\theta_0)-6(\dot{\theta}_f+\dot{\theta}_0)t_f-(\ddot{\theta}_0-\ddot{\theta}_f)t_f^2}{2t_f^5} \tag{18}$$
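A numerical sketch of the quintic interpolation (16)-(18): rather than closed-form coefficients, the six boundary conditions of Eq. (17) can be solved as a linear system (start time taken as 0; the joint angles and duration below are illustrative).

```python
import numpy as np

def quintic_coeffs(tf, th0, thf, v0=0.0, vf=0.0, acc0=0.0, accf=0.0):
    """Coefficients a0..a5 of theta(t) = sum a_k t^k (Eq. 16) satisfying the
    six boundary conditions of Eq. (17) on angle, velocity and acceleration."""
    M = np.array([
        [1, 0,   0,       0,        0,         0],         # theta(0)
        [0, 1,   0,       0,        0,         0],         # theta'(0)
        [0, 0,   2,       0,        0,         0],         # theta''(0)
        [1, tf,  tf**2,   tf**3,    tf**4,     tf**5],     # theta(tf)
        [0, 1,   2*tf,    3*tf**2,  4*tf**3,   5*tf**4],   # theta'(tf)
        [0, 0,   2,       6*tf,     12*tf**2,  20*tf**3],  # theta''(tf)
    ])
    b = np.array([th0, v0, acc0, thf, vf, accf])
    return np.linalg.solve(M, b)

# Move a joint from 0 rad to 1 rad in 2 s, at rest at both ends.
a = quintic_coeffs(tf=2.0, th0=0.0, thf=1.0)
theta = lambda t: sum(c * t**k for k, c in enumerate(a))
assert abs(theta(0.0)) < 1e-12 and abs(theta(2.0) - 1.0) < 1e-12
```

Zero boundary velocity and acceleration are exactly what removes the angular-velocity discontinuities and acceleration jumps the text mentions.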
Further, the "Aerospace Knight" space-ground cooperative operation module comprises the mechanical arm and clamping hand operation module and the "Aerospace Knight" state monitoring module. After receiving an operation instruction, the mechanical arm and clamping hand operation module controls the six motors of the mechanical arm and the clamping hand: it drives the corresponding 6-axis motors to move and the clamping hand to complete clamping actions, so as to pull out and insert the space material science experiment sample box or the sample box in the X-ray transmission imaging module, take out a sample box whose experiment has been completed, and install a new one. The "Aerospace Knight" state monitoring module monitors the motion state and key parameters of the robot during its on-orbit period, so that astronauts and ground staff can judge its working state through the monitoring module.
Further, the mechanical arm and clamping hand operation module controls the movement of the mechanical arm and monitors its state. The mechanical arm mainly comprises six axis motors and a clamping hand motor; the six motors are the base motor, the shoulder motor, the elbow motor, and wrist motors 1, 2, and 3. The base motor is located at the bottom layer of the mechanical arm and controls the rotation of the whole arm in the horizontal plane; the shoulder motor is located above the base motor and controls rotation of the arm in a vertical plane; the elbow motor is located in the middle of the arm and drives the three wrist motors back and forth; wrist motors 1, 2, and 3 precisely control the clamping hand to rotate forwards, backwards, leftwards, and rightwards within a certain range and move it to the designated position. With the cooperation of the six motors, the mechanical arm can translate and rotate in three-dimensional space. After analyzing an experimental task, an experimenter determines the experimental actions and programs the "Aerospace Knight", generating operation instructions; the instructions are sent from the ground to the space station through the "Aerospace Knight" space-ground cooperative operation module, and the robot executes them to complete the corresponding experiment;
Further, astronauts and ground staff can check at any time the videos and pictures shot by the space material science experiment sample box visual positioning module, so as to judge the running state of the equipment, monitor the experimental operation process, and assist in checking experiments and investigating problems. The monitoring module also displays the running power, running temperature, and other parameters of the six arm motors and the clamping hand motor, so that astronauts and ground staff can judge whether the working state and environment of the mechanical arm are normal, discover abnormal conditions in time, handle them quickly, and avoid danger. The current position of the mechanical arm can also be seen on the monitoring page, both for judging the progress of the experiment and for avoiding collisions between the mechanical arm and other equipment;
further, the space material science experiment operation instruction database module uses a database to store the tables related to the "Aerospace Knight": an experimental parameter table, an alarm log table, a robot parameter table, an experiment flow table, a replacement flow instruction table, a picture and video storage table, and the like;
Further, the database mainly comprises three parts: the business application, the client driver, and the OpenGauss server; the business application implements the space-ground collaboration function. The client driver receives access requests from the application, returns execution results to it, communicates with the OpenGauss server, issues the SQL to be executed on the server, and receives the execution results. The OpenGauss server stores the business data, executes data query tasks, and returns execution results to the client driver; Storage is the server's local storage resource, where data is stored persistently. The detailed architecture is shown in Fig. 4;
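Since OpenGauss speaks standard SQL through a client driver, the application-to-server interaction can be illustrated with ordinary SQL. The sketch below uses Python's built-in sqlite3 purely as a stand-in server; the table follows the field names of the alarm log table (Table 2), and the inserted row is an invented example, not real telemetry.

```python
import sqlite3

# Stand-in for the client driver issuing SQL to the OpenGauss server.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Alarm log table with the fields listed in Table 2.
cur.execute("""
    CREATE TABLE alarm_log (
        Time        TEXT PRIMARY KEY,   -- alarm time
        Name        VARCHAR(20),        -- name of the physical quantity
        Value       VARCHAR(20),        -- alarm value
        Expectvalue VARCHAR(20)         -- expected value
    )
""")

# The business application records an out-of-range motor temperature
# (hypothetical values for illustration).
cur.execute("INSERT INTO alarm_log VALUES (?, ?, ?, ?)",
            ("2022-10-25 08:00:00", "elbow_motor_temp", "92.5", "<=85"))
conn.commit()

rows = cur.execute("SELECT Name, Value FROM alarm_log").fetchall()
assert rows == [("elbow_motor_temp", "92.5")]
```

In the real system the same SQL would be sent through the OpenGauss client driver instead of sqlite3; the schema and query pattern are what the sketch is meant to show.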
the experimental parameter table stores state information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box, and the sample box handle in the X-ray transmission imaging module, while working state information during the experiment is monitored in real time;
Table 1 Experimental parameter table

[table content rendered as an image in the original filing]
When certain parameters in the experimental process exceed the specified range, the alarm log table records the time of the anomaly, the abnormal data value, the abnormal physical quantity, and other information, for later retrieval and review by staff;
Table 2 Alarm log table

Field name    Field description           Type     Length   Primary key
Time          Alarm time                  Date     20       Yes
Name          Name of physical quantity   Varchar  20
Value         Alarm value                 Varchar  20
Expectvalue   Expected value              Varchar  20
The robot parameter table records the current state of the mechanical arm motors, such as rotation angle, moving distance, and motor power, together with state parameters of the robot such as the ambient temperature;
Table 3 Robot parameter table

[table content rendered as an image in the original filing]
The action flow table records, for each action, its execution time, action ID, the object executing it, the parameters used during execution, its duration, and its execution state;
Table 4 Action flow table

Field name    Field description         Type     Length   Primary key
Time          Action execution time     Date     20       Yes
ID            Action ID                 Int      10
Subject       Executing action object   Varchar  20
Para          Motion parameters         Decimal  20
Duration      Duration                  Int      20
State         Execution state           Int      10
The replacement flow instruction table stores the sample replacement flow of the experimental process: the actions stored in the action flow table are sorted by execution time, and the "Aerospace Knight" executes the related actions in sequence according to the action order in the replacement flow table, so as to complete the sample replacement task;
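The relationship between the action flow table and the replacement flow can be sketched as follows: rows of the action flow table are sorted by execution time into a replacement flow, then dispatched in order. The action IDs, times, and subject names are illustrative, not the patent's actual instruction set.

```python
from dataclasses import dataclass

@dataclass
class Action:
    time: str       # scheduled execution time (Time field)
    action_id: int  # unique action ID (ID field)
    subject: str    # object executing the action (Subject field)

# Unordered rows as they might sit in the action flow table.
action_flow = [
    Action("08:02:00", 3, "gripper_close"),
    Action("08:00:00", 1, "move_to_door_handle"),
    Action("08:01:00", 2, "open_cabinet_door"),
]

# Replacement flow: sort by execution time, then dispatch in sequence.
replacement_flow = sorted(action_flow, key=lambda a: a.time)
executed = [a.action_id for a in replacement_flow]
assert executed == [1, 2, 3]
```

In the described system the sorted sequence would be written to the replacement flow instruction table and each ID looked up in the action flow table at execution time.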
Table 5 Replacement flow instruction table

[table content rendered as an image in the original filing]
The picture and video storage table stores the videos and images shot by the space material science experiment sample box visual positioning module; the pictures can be viewed in real time during the experiment, and the experiment videos can be reviewed after the experiment is finished, which facilitates monitoring and analysis of the experimental process by the staff;
Table 6 Picture and video storage table

Field name   Field description          Type     Length   Primary key
Time         Picture acquisition time   Date     20       Yes
ID           Picture number             Int      10
Name         Picture name               Varchar  20
Pic_URL      Picture path               Varchar  50
Time         Video acquisition time     Date     20       Yes
ID           Video number               Int      10
Name         Video name                 Varchar  20
video_URL    Video path                 Varchar  50
The technical scheme provided by the invention is further described below in conjunction with an embodiment.
Examples
The embodiment of the invention provides an intelligent-vision-based, space-ground cooperative space science experiment robot, which uses the mechanical arm to complete sample replacement. The overall operation flow of the robot mainly comprises: obtaining an uplink instruction from the space material science experiment operation instruction database module; visual positioning by the space material science experiment sample box visual positioning module; path planning for the mechanical arm; and controlling the mechanical arm in the space material science experiment sample box grabbing module to take out the sample box, install the replacement, and so on;
obtaining an uplink instruction from the space material science experiment operation instruction database module: the experimental parameter table in the database can be read through the "Aerospace Knight" space-ground cooperative operation module, to judge from the state information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box, and the sample box handle in the X-ray transmission imaging module whether an experiment can be performed; the robot parameter table is read to judge whether all motor parameters (current value, voltage value, position coordinates, rotation angle, and the like) are normal; an action table of the mechanical arm's movements is prepared (for example, the initial position of the movement, the door handle position in the high-temperature cabinet batch sample management module, the space material science experiment sample box position, and the sample box handle position in the X-ray transmission imaging module), with a corresponding entry for each action of the arm and a unique action ID assigned to each action, so that the robot can complete the corresponding action by looking up the action ID in the experiment flow table; after the action table is prepared, execution times are matched to action IDs in the time order of the arm's movements, so that the arm executes each operation at the corresponding time and completes the sample replacement task; the uplink and downlink data can also include the videos and images shot by the visual positioning module and stored in the database, which ground staff and astronauts can review or watch in real time.
Visual positioning is performed by the space material science experiment sample box visual positioning module: the visual positioning system locates the initial position of the mechanical arm, the door handle position in the high-temperature cabinet batch sample management module, the space material science experiment sample box position, and the sample box handle position in the X-ray transmission imaging module; the coordinates of this position information are stored in the database and sent to the mechanical arm, which performs the related operations accordingly to complete the sample replacement.
After the coordinates of the initial position, the door handle position in the high-temperature cabinet batch sample management module, the space material science experiment sample box position, and the sample box handle position in the X-ray transmission imaging module are obtained, the mechanical arm needs to perform path planning, which mainly comprises the following steps:
(1) Task planning generally requires high-level task decision making, takes a planning problem determined by an initial state and a planning target state of the mechanical arm as input, and reasonably plans out a behavior decision and a motion sequence required by the mechanical arm to complete the task.
(2) Path planning specifies the end-effector path to the target point in a given pose or desired configuration, and a trajectory planning algorithm is required to generate the motion trajectory of the arm's end. The quality measure of trajectory planning is whether the arm runs smoothly; extreme positions and high-speed points must generally be eliminated, so as to avoid abnormal, dangerous motions of the arm such as overspeed and shaking.
(3) Track optimization generally refers to optimizing the position, speed and acceleration of a path point on the basis of path planning, so as to optimize indexes such as the motion performance of a space mechanical arm and the task execution efficiency, and find and reach an optimal path.
After the moving path is determined, the mechanical arm in the space material science experiment sample box grabbing module can be controlled, according to the position information of the target, to move to target positions such as the door handle position in the high-temperature cabinet batch sample management module, the space material science experiment sample box position, and the sample box handle position in the X-ray transmission imaging module, thereby completing the tasks of opening the sample management module door, taking out the space material science experiment sample box, and installing the replacement.
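The overall pipeline just described — visual positioning, path planning, then arm execution — can be sketched end to end. All functions, poses, and the linear-interpolation "planner" below are hypothetical stand-ins for the patent's modules, kept deliberately tiny.

```python
def locate_targets():
    """Stand-in for the visual positioning module (poses are illustrative)."""
    return {"door_handle": (0.40, 0.10, 0.25),
            "sample_box": (0.55, -0.05, 0.30)}

def plan_path(start, goal, steps=5):
    """Stand-in for the trajectory planner: linear interpolation between
    start and goal (a real planner would use e.g. quintic interpolation)."""
    return [tuple(s + (g - s) * k / steps for s, g in zip(start, goal))
            for k in range(steps + 1)]

def run_replacement():
    """Pipeline: locate targets -> plan path -> 'execute' by following it."""
    targets = locate_targets()
    home = (0.0, 0.0, 0.0)
    path = plan_path(home, targets["door_handle"])
    return path[-1]  # the arm ends at the door handle pose

assert run_replacement() == (0.40, 0.10, 0.25)
```

The value of the sketch is the data flow: poses come out of vision, a path is generated from them, and the grabbing module only ever consumes the planned waypoints.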
As can be seen from the above description, space station experimenters can use the intelligent-vision-based "Aerospace Knight" to assist space science experiments, completing scientific experiment tasks in the space station more efficiently and rapidly while improving the personal safety of astronauts. The robot is the first assisting experimental robot for a Chinese space station. Using space robot control technology, the remote space robot can be operated through a combination of ground operation and intelligent autonomous operation to complete designated tasks. Ground experimenters can check the experiment progress and conduct experiments in real time through the "Aerospace Knight" combined with space-ground cooperative operation, which greatly improves the efficiency of space science experiments and solves the problem of space-time separation between the operator and the operation target. The robot comprises: the space material science experiment sample box visual positioning module, the space material science experiment sample box grabbing module, the "Aerospace Knight" space-ground cooperative operation module, and the space material science experiment operation instruction database module.
Finally, it should be noted that the above embodiments are only for illustrating the technical solution of the present invention and are not limiting. Although the present invention has been described in detail with reference to the embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made thereto without departing from the spirit and scope of the present invention, which is intended to be covered by the appended claims.

Claims (4)

1. A space science experiment robot based on intelligent vision, used for completing on-orbit replacement of the experiment material sample box in the space station high-temperature cabinet; the robot is characterized by comprising: a space material science experiment sample box visual positioning module, a space material science experiment sample box grabbing module, a space material science experiment operation instruction database module, and a space-ground cooperative operation module; wherein:
the visual positioning module of the space material science experiment sample box is used for acquiring pose information of a door handle of the batch sample management module in the high-temperature cabinet, the space material science experiment sample box and a sample box handle in the X-ray transmission imaging module by using a space science experiment image acquisition technology, and sending the pose information to the space material experiment sample box grabbing module;
The space material science experiment sample box grabbing module is used for planning a path according to pose information and executing a sample box replacement task;
the space material science experiment operation instruction database module is used for storing instruction data related to the robot by using the database;
the space-ground cooperative operation module is used for calling the data stored in the space material science experiment operation instruction database module and issuing related instructions of the sample box replacement task to the space material science experiment sample box grabbing module; it is also used for monitoring the working state of the robot's modules;
the space material science experiment sample box visual positioning module includes: a space science experiment image sensor, a pose sensing part, a space material science experiment material sample box data analysis part, and a space material science experiment material sample box pose data exchange part; wherein:
the space science experiment image sensor is used for obtaining, with the space science experiment image acquisition technology, three-dimensional information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box, and the sample box handle in the X-ray transmission imaging module, and for transmitting the three-dimensional information to the pose sensing part;
the pose sensing component is used for converting this three-dimensional information into image information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box, and the sample box handle in the X-ray transmission imaging module, and for transmitting the image information to the space material science experiment material sample box data analysis component;
the space material science experiment material sample box data analysis component is used for analyzing the image information transmitted by the visual positioning module to obtain pose information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box, and the sample box handle in the X-ray transmission imaging module;
the space material science experiment material sample box pose data exchange component is used for transmitting this pose information to the space-ground cooperative operation module;
the visual positioning module of the space material science experiment sample box comprises the following steps of:
Firstly, calibrating a camera, and then calibrating a projector:
$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A\,\begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \tag{1}$$

where u denotes the horizontal-axis coordinate in the pixel coordinate system, v denotes the vertical-axis coordinate in the pixel coordinate system, Z_c represents the object distance, A represents the camera intrinsic matrix, R represents the camera rotation matrix, T represents the camera translation matrix, and X, Y, Z represent the x-, y-, and z-axis coordinates in the world coordinate system;
performing two-dimensional Fourier transform:
$$F(u,v)=\sum_{x=0}^{M-1}\sum_{y=0}^{N-1} f(x,y)\, e^{-j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)} \tag{2}$$

where F(u, v) represents the frequency-domain image after the Fourier transform, M represents the width of the image, N represents the height of the image, and f(x, y) represents the spatial-domain image before the Fourier transform;
selecting a transfer function H (u, v):
$$H(u,v)=\begin{cases}1, & D(u,v)\le D_0\\ 0, & D(u,v)> D_0\end{cases} \tag{3}$$

wherein D(u, v) represents the distance from the point (u, v) to the Fourier-transform center, and D_0 represents the cut-off frequency;
performing the inverse two-dimensional Fourier transform:

$$f(x,y)=\frac{1}{MN}\sum_{u=0}^{M-1}\sum_{v=0}^{N-1} F(u,v)\, e^{\,j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)} \tag{4}$$
using an edge detection algorithm to obtain gray code pixel coordinates; wherein,,
world coordinates of the point to be measured P:
$$P = O_c + \lambda_c R_c^{T}\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = O_p + \lambda_p R_p^{T}\begin{bmatrix} x_p \\ y_p \\ z_p \end{bmatrix} \tag{5}$$

wherein O_c = (X_woc, Y_woc, Z_woc)^T represents the origin of the camera coordinate system in world coordinates, determined by the camera rotation matrix R_c and translation matrix t_c; P_c = (X_wpc, Y_wpc, Z_wpc)^T represents the point P in the camera coordinate system, and (x_c, y_c, z_c) are the normalized coordinates of P in the camera coordinate system; similarly, O_p = (X_wop, Y_wop, Z_wop)^T represents the origin of the projector coordinate system, P_p represents the point P in the projector coordinate system, and (x_p, y_p, z_p) are the normalized coordinates of P in the projector coordinate system; λ_c and λ_p denote the depths along the camera ray and the projector ray, respectively;
performing ICP point cloud registration, comprising:

given two point clouds X and P,

X = (x 1 , x 2 , …, x n )  (6)

P = (p 1 , p 2 , …, p n )  (7)

wherein n represents the total number of points, x n represents a point in set X, and p n represents a point in set P;

solving for R and t that minimize E(R, t):

E(R, t) = (1/n) Σ_{i=1}^{n} ‖x i − (R·p i + t)‖²  (8)

wherein E(R, t) represents the error function, R represents the rotation matrix, t represents the translation matrix, x i represents a point in set X, and p i represents a point in set P;
acquiring the centroids of the two point clouds:

u x = (1/n) Σ_{i=1}^{n} x i  (9)

u p = (1/n) Σ_{i=1}^{n} p i  (10)

acquiring the coordinates of the points in the two point clouds with the centroid as origin:

x′ i = x i − u x  (11)

p′ i = p i − u p  (12)

wherein X′ = {x′ i } represents the set X re-centered on its centroid u x , and P′ = {p′ i } represents the set P re-centered on its centroid u p ;
obtaining ω and performing SVD decomposition on it:

ω = Σ_{i=1}^{n} p′ i x′ i ^T = U Σ V^T,  Σ = diag(δ 1 , δ 2 , δ 3 )  (13)

wherein ω represents the matrix to be decomposed, U and V represent orthogonal matrices, T represents the transpose, and δ 1 , δ 2 , δ 3 represent the non-zero singular values of ω;

the transformation relation between the two point clouds is then:

R = V U^T  (14)

t = u x − R·u p  (15)

wherein R represents the rotation matrix and t represents the translation matrix;
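Equations (9)–(15) are the closed-form SVD (Kabsch-style) alignment step of ICP. A minimal NumPy sketch, assuming one-to-one correspondences between the two clouds are already known; variable names follow the equations, and the reflection guard is a standard addition not stated in the claim:

```python
import numpy as np

def icp_svd_step(X, P):
    """One ICP alignment step: find R, t minimizing sum ||x_i - (R p_i + t)||^2.

    X, P: (n, 3) arrays of corresponding points.
    Follows eqs. (9)-(15): centroids, centered sets, SVD of the
    cross-covariance, R = V U^T, t = u_x - R u_p.
    """
    u_x = X.mean(axis=0)          # centroid of X, eq. (9)
    u_p = P.mean(axis=0)          # centroid of P, eq. (10)
    Xc = X - u_x                  # centered points, eq. (11)
    Pc = P - u_p                  # centered points, eq. (12)
    W = Pc.T @ Xc                 # cross-covariance = sum p'_i x'_i^T, eq. (13)
    U, _, Vt = np.linalg.svd(W)
    R = Vt.T @ U.T                # eq. (14)
    if np.linalg.det(R) < 0:      # guard against reflections (not in the claim)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = u_x - R @ u_p             # eq. (15)
    return R, t
```

With exact, noise-free correspondences this recovers the true rigid transform in a single step; in full ICP the step is iterated after re-estimating nearest-neighbor correspondences.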
the space material science experiment sample box grabbing module includes: a mechanical arm and a clamping hand; the mechanical arm and the clamping hand complete the sample box replacement task according to the acquired pose information and the planned path, specifically comprising:

setting the initial position of the mechanical arm;

acquiring, through the space material science experiment sample box visual positioning module, the positions of the door handle in the high-temperature cabinet batch sample management module, of the space material science experiment sample box, and of the sample box handle in the X-ray transmission imaging module, and sending the position information to the mechanical arm;

planning a path for the mechanical arm;

controlling the mechanical arm and the clamping hand to open the door of the high-temperature cabinet batch sample management module, thereby completing the task of replacing the space material science experiment sample box, or the task of replacing the sample box in the X-ray transmission imaging module;
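The four steps above can be sketched as a simple control sequence. Everything in this sketch — class and method names, the pose format, the recorded command log — is a hypothetical stand-in for the modules the claim names, not the patent's actual interfaces:

```python
class VisionModule:
    """Stands in for the sample box visual positioning module."""
    def __init__(self, poses):
        self.poses = poses  # name -> (x, y, z) position

    def get_pose(self, name):
        return self.poses[name]

class Arm:
    """Stands in for the mechanical arm; records the commands it receives."""
    def __init__(self):
        self.log = []

    def move_to_home(self):
        self.log.append("home")

    def plan_path(self, pose):
        # In the claim this step uses fifth-degree polynomial interpolation.
        return ["via", pose]

    def follow(self, path):
        self.log.append(("follow", tuple(path[1])))

class Gripper:
    """Stands in for the clamping hand."""
    def __init__(self, arm):
        self.arm = arm

    def grasp(self):
        self.arm.log.append("grasp")

    def release(self):
        self.arm.log.append("release")

def replace_sample_box(vision, arm, gripper):
    """Claim steps: initial position -> acquire poses -> plan path -> execute."""
    arm.move_to_home()
    handle = vision.get_pose("cabinet_door_handle")
    box = vision.get_pose("sample_box")
    arm.follow(arm.plan_path(handle))
    gripper.grasp()                      # open the cabinet door
    arm.follow(arm.plan_path(box))
    gripper.grasp()                      # extract the sample box
    gripper.release()
    return arm.log
```

The mock `log` makes the order of operations inspectable, which is the point of the sketch; a real controller would replace each method with motor commands.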
the path planning for the mechanical arm comprises the following steps:

inserting intermediate points between the start point and the end point using fifth-degree polynomial interpolation, with the expression:

θ(t) = a 0 + a 1 t + a 2 t² + a 3 t³ + a 4 t⁴ + a 5 t⁵  (16)

θ̇(t) = a 1 + 2a 2 t + 3a 3 t² + 4a 4 t³ + 5a 5 t⁴

θ̈(t) = 2a 2 + 6a 3 t + 12a 4 t² + 20a 5 t³

wherein θ(t) represents the angular displacement, θ̇(t) the angular velocity, and θ̈(t) the angular acceleration at time t, and a 0 , a 1 , a 2 , a 3 , a 4 , a 5 represent the coefficients to be solved;
constraining the position, velocity and acceleration at the start and end points, the constraints satisfying:

θ(t 0 ) = θ 0 ,  θ̇(t 0 ) = θ̇ 0 ,  θ̈(t 0 ) = θ̈ 0

θ(t f ) = θ f ,  θ̇(t f ) = θ̇ f ,  θ̈(t f ) = θ̈ f  (17)

wherein θ(t 0 ), θ̇(t 0 ) and θ̈(t 0 ) represent the start-point position, velocity and acceleration, t 0 represents the start time; θ(t f ), θ̇(t f ) and θ̈(t f ) represent the end-point position, velocity and acceleration, and t f represents the end time;
solving yields (taking t 0 = 0 and writing T = t f ):

a 0 = θ 0

a 1 = θ̇ 0

a 2 = θ̈ 0 / 2

a 3 = [20(θ f − θ 0 ) − (8θ̇ f + 12θ̇ 0 )T − (3θ̈ 0 − θ̈ f )T²] / (2T³)

a 4 = [30(θ 0 − θ f ) + (14θ̇ f + 16θ̇ 0 )T + (3θ̈ 0 − 2θ̈ f )T²] / (2T⁴)

a 5 = [12(θ f − θ 0 ) − 6(θ̇ f + θ̇ 0 )T − (θ̈ 0 − θ̈ f )T²] / (2T⁵)  (18)

wherein θ 0 , θ̇ 0 , θ̈ 0 represent the initial position, velocity and acceleration, and θ f , θ̇ f , θ̈ f represent the final position, velocity and acceleration;
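Rather than relying on the closed form, the six boundary constraints of equation (17) can also be solved directly as a 6×6 linear system. A NumPy sketch; the function name and argument order are assumptions:

```python
import numpy as np

def quintic_coeffs(t0, tf, th0, thf, v0=0.0, vf=0.0, ac0=0.0, acf=0.0):
    """Solve for a_0..a_5 of theta(t) = sum_k a_k t^k satisfying the six
    boundary constraints of eq. (17): position, velocity and acceleration
    at t0 and tf. Solved numerically instead of via the closed form."""
    def rows(t):
        return [
            [1, t, t**2,   t**3,    t**4,    t**5],   # theta(t)
            [0, 1, 2*t,  3*t**2,  4*t**3,  5*t**4],   # theta'(t)
            [0, 0, 2,    6*t,    12*t**2, 20*t**3],   # theta''(t)
        ]
    A = np.array(rows(t0) + rows(tf), dtype=float)
    b = np.array([th0, v0, ac0, thf, vf, acf], dtype=float)
    return np.linalg.solve(A, b)   # nonsingular whenever t0 != tf
```

With zero boundary velocities and accelerations this reproduces the classic rest-to-rest quintic profile used for smooth joint motion.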
the robot-related parameter tables include: an experiment parameter table, an alarm log table, a robot parameter table, an action flow table, a replacement flow instruction table and a picture/video storage table; wherein,
the database comprises: the business application, the client driver and the OpenGauss server; the business application supports the functions of the heaven-earth cooperative control module; the client driver receives access requests from the business application, returns execution results to the application, communicates with the OpenGauss server, issues the SQL to be executed on the OpenGauss server, and receives the execution results; the OpenGauss server stores the business data, executes data query tasks, and returns the execution results to the client driver;
the experiment parameter table is used for storing the time and the state information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box, and the sample box handle in the X-ray transmission imaging module, and also allows the working state information to be monitored in real time during the experiment; the state information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box and the sample box handle in the X-ray transmission imaging module comprises: the time, the open/closed state of the door handle in the high-temperature cabinet batch sample management module, the sample replacement batch, the number of samples not yet replaced, and the sample replacement state of the space material science experiment sample box; the working state information during the experiment comprises: the temperature inside the high-temperature cabinet, the shell temperature, the current and the voltage;
the alarm log table is used for recording, whenever a parameter exceeds its specified range during the experiment, the time of the anomaly, the abnormal data value, the expected data value, and the physical quantity that became abnormal;
the robot parameter table is used for recording state parameters, including the current state of the mechanical arm motors and the state parameters of the robot; the state of a mechanical arm motor includes: the angle of rotation, the distance of movement and the power of the motor; the state parameters of the robot include: the ambient temperature;
the action flow table is used for recording the execution time of an action, the ID number of the action to be executed, the object executing the action, the parameters used while the action is executed, the duration of the action, and the execution state of the action;
the robot executes the related actions in sequence according to the action order in the replacement flow table, so as to complete the sample replacement task;
the picture/video storage table is used for storing the videos and images shot by the space material science experiment sample box visual positioning module; the pictures shot by the visual positioning module can be viewed in real time during the experiment, and the experiment videos can be reviewed after the experiment ends.
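The action flow and replacement flow tables described above amount to an ordered dispatch of actions. A minimal Python sketch; the field names (`exec_time`, `action_id`, `target`, `params`) are assumptions based on the columns the claim lists, not the actual schema:

```python
def run_action_flow(rows, handlers):
    """Execute action-flow table rows in time order.

    rows: list of dicts, one per table row, each naming an action and its
    parameters. handlers: maps an action ID to a callable performing it.
    Returns (action_id, status) pairs in execution order.
    """
    results = []
    for row in sorted(rows, key=lambda r: r["exec_time"]):
        handler = handlers[row["action_id"]]
        status = handler(row["target"], **row["params"])
        results.append((row["action_id"], status))
    return results
```

Keeping the sequence in a table rather than in code is what lets ground operators update the replacement flow without modifying the on-orbit software.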
2. The intelligent-vision-based heaven-earth cooperative space science experiment robot of claim 1, wherein the sample box replacement task comprises, in order, the following steps:

grasping the door handle of the high-temperature cabinet batch sample management module and opening the door of the high-temperature cabinet batch sample management module;

extracting the space material science experiment sample box in the high-temperature cabinet or the space material science experiment sample box in the X-ray transmission imaging module;

taking out the material sample box that has completed its space material science experiment;

replacing it with the new material sample box;

inserting the space material science experiment sample box into the high-temperature cabinet;

grasping the door handle of the high-temperature cabinet batch sample management module and closing the door of the high-temperature cabinet batch sample management module.
3. The intelligent-vision-based heaven-earth cooperative space science experiment robot of claim 1, wherein the heaven-earth cooperative operation module comprises a robot operation module and a robot state monitoring module; wherein,
the robot operation module is used for controlling the motors of the mechanical arm and the clamping hand; after receiving an operation instruction, it drives the corresponding mechanical arm motor and controls the clamping hand to complete the clamping action, thereby completing the sample box replacement task;

the robot state monitoring module is used for monitoring the motion state and the key parameters of the robot while it is on orbit, so as to judge the working state of the robot.
4. The intelligent-vision-based heaven-earth cooperative space science experiment robot of claim 1, wherein the motors of the mechanical arm comprise: a base motor, a shoulder motor, an elbow motor, a first wrist motor, a second wrist motor and a third wrist motor;

the base motor is located at the bottom of the mechanical arm and controls the rotation of the whole mechanical arm in the horizontal plane;

the shoulder motor is located above the base motor and controls the rotation of the mechanical arm in the vertical plane;

the elbow motor is located in the middle of the mechanical arm and drives the first wrist motor, the second wrist motor and the third wrist motor back and forth;

the first wrist motor, the second wrist motor and the third wrist motor control the back-and-forth and left-right rotation of the clamping hand, thereby moving the clamping hand to the designated position.
CN202211309652.4A 2022-10-25 2022-10-25 Space science experiment robot is cooperated to world based on intelligent vision Active CN115519546B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211309652.4A CN115519546B (en) 2022-10-25 2022-10-25 Space science experiment robot is cooperated to world based on intelligent vision

Publications (2)

Publication Number Publication Date
CN115519546A CN115519546A (en) 2022-12-27
CN115519546B true CN115519546B (en) 2023-06-27

Family

ID=84704372

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211309652.4A Active CN115519546B (en) 2022-10-25 2022-10-25 Space science experiment robot is cooperated to world based on intelligent vision

Country Status (1)

Country Link
CN (1) CN115519546B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116130037B (en) * 2023-01-28 2023-10-10 钢研纳克检测技术股份有限公司 Material high-throughput preparation-statistics mapping characterization integrated research and development system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5499320A (en) * 1993-03-24 1996-03-12 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Extended task space control for robotic manipulators
JP3376029B2 (en) * 1993-07-15 2003-02-10 株式会社東芝 Robot remote control device
CN103302668B (en) * 2013-05-22 2016-03-16 东南大学 Based on control system and the method thereof of the Space teleoperation robot of Kinect
CN111496770B (en) * 2020-04-09 2023-04-07 上海电机学院 Intelligent carrying mechanical arm system based on 3D vision and deep learning and use method
CN114505869A (en) * 2022-02-17 2022-05-17 西安建筑科技大学 Chemical reagent intelligent distribution machine control system
CN114912287B (en) * 2022-05-26 2023-07-25 四川大学 Robot autonomous grabbing simulation system and method based on target 6D pose estimation


Similar Documents

Publication Publication Date Title
Malhan et al. Identifying feasible workpiece placement with respect to redundant manipulator for complex manufacturing tasks
Corke et al. Real-time vision, tracking and control
Sharma et al. Motion perceptibility and its application to active vision-based servo control
Fu et al. Active learning-based grasp for accurate industrial manipulation
CN115519546B (en) Space science experiment robot is cooperated to world based on intelligent vision
CN114912287A (en) Robot autonomous grabbing simulation system and method based on target 6D pose estimation
CN111325768A (en) Free floating target capture method based on 3D vision and simulation learning
Xu et al. Efficient object manipulation to an arbitrary goal pose: Learning-based anytime prioritized planning
Liu et al. A mixed perception-based human-robot collaborative maintenance approach driven by augmented reality and online deep reinforcement learning
CN110202581A (en) Compensation method, device and the electronic equipment of end effector of robot operating error
Allen et al. Optimal path planning for image based visual servoing
Su et al. A ROS based open source simulation environment for robotics beginners
Liu et al. An image based visual servo approach with deep learning for robotic manipulation
CN110434854B (en) Redundant manipulator visual servo control method and device based on data driving
Behera et al. A hybrid neural control scheme for visual-motor coordination
Liu et al. Visual servoing with deep learning and data augmentation for robotic manipulation
Van Molle et al. Learning to grasp from a single demonstration
Li et al. A novel semi-autonomous teleoperation method for the tiangong-2 manipulator system
Zhao Multifeature video modularized arm movement algorithm evaluation and simulation
Wan et al. DeepClaw: A robotic hardware benchmarking platform for learning object manipulation
Li A Design of Robot System for Rapidly Sorting Express Carton with Mechanical Arm Based on Computer Vision Technology
Senapati Design and control of an articulated robotic arm using visual inspection for replacement activities
Sicard et al. Learning the velocity kinematics of iCub for model-based control: XCSF versus LWPR
Li et al. Research on grasping strategy based on residual network
Pu et al. A general mobile manipulator automation framework for flexible tasks in controlled environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant