CN111923049B - Visual servo and multitasking control method for flying mechanical arm based on spherical model - Google Patents
- Publication number
- CN111923049B CN111923049B CN202010848414.5A CN202010848414A CN111923049B CN 111923049 B CN111923049 B CN 111923049B CN 202010848414 A CN202010848414 A CN 202010848414A CN 111923049 B CN111923049 B CN 111923049B
- Authority
- CN
- China
- Prior art keywords
- mechanical arm
- flying
- unmanned aerial
- aerial vehicle
- formula
- Prior art date
- Legal status: Active (an assumption, not a legal conclusion)
Links
- 238000000034 method Methods 0.000 title claims abstract description 43
- 230000000007 visual effect Effects 0.000 title claims abstract description 41
- 239000011159 matrix material Substances 0.000 claims description 41
- 230000005484 gravity Effects 0.000 claims description 14
- 239000012636 effector Substances 0.000 claims description 12
- 230000008569 process Effects 0.000 claims description 11
- 230000004069 differentiation Effects 0.000 claims description 9
- 238000005096 rolling process Methods 0.000 claims description 9
- 230000008859 change Effects 0.000 claims description 4
- 238000013459 approach Methods 0.000 claims description 3
- 238000004364 calculation method Methods 0.000 claims description 3
- 230000000694 effects Effects 0.000 claims description 3
- 238000010606 normalization Methods 0.000 claims description 3
- 238000010586 diagram Methods 0.000 description 4
- 238000005259 measurement Methods 0.000 description 3
- 238000009434 installation Methods 0.000 description 2
- 230000010399 physical interaction Effects 0.000 description 2
- 238000011084 recovery Methods 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000008878 coupling Effects 0.000 description 1
- 238000010168 coupling process Methods 0.000 description 1
- 238000005859 coupling reaction Methods 0.000 description 1
- 238000007689 inspection Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/1605—Simulation of manipulator lay-out, design, modelling of manipulator
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
Abstract
The invention provides a visual servo and multitasking control method of a flying mechanical arm based on a spherical model, wherein a servo structure comprises an unmanned aerial vehicle, a mechanical arm and a control module; the mechanical arm is arranged on the mechanical arm mounting surface of the unmanned aerial vehicle; the mechanical arm mounting surface is also provided with a depth camera, the shooting angle of the depth camera is set at an angle with the mechanical arm mounting surface, and the control module enables the depth camera and the mechanical arm to be combined into a robot hand-eye calibration system; an actuator is arranged at the tail end of the mechanical arm; the control module is a control module capable of controlling the flight of the unmanned aerial vehicle; when the flying mechanical arm needs to operate the target, the control module evaluates the distance of the target through the depth camera, if the target is located outside the operation range of the mechanical arm, the control module drives the unmanned aerial vehicle to fly towards the target, and if the target is located within the operation range of the mechanical arm, the control module drives the mechanical arm to operate the target; the invention can mount the mechanical arm with active manipulation capability on the unmanned aerial vehicle.
Description
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to a visual servo and multitasking control method of a flying mechanical arm based on a spherical model.
Background
The unmanned aerial vehicle system has the characteristics of strong movement maneuverability, high flexibility and the like, and can fly freely in the three-dimensional world, so that the unmanned aerial vehicle system is widely applied to industries in various fields in recent years, such as industrial factory inspection, battlefield reconnaissance, natural disaster reconnaissance, map creation, measurement and the like.
The flying mechanical arm system mounts a mechanical arm with a certain number of degrees of freedom on the unmanned aerial vehicle. Compared with a conventional unmanned aerial vehicle, such an operation-type unmanned aerial vehicle system has many advantages: it can rapidly capture an aerial or ground target during flight, and can rapidly reach complex environments inaccessible to ground robots to execute fine operation tasks such as installing or recovering measurement equipment. A conventional unmanned aerial vehicle system, however, has very limited capability for active manipulation, so developing a flying mechanical arm for physical interaction between the unmanned aerial vehicle and the environment is of great significance. The invention accordingly provides a visual servo and multitasking coordination control method for a flying mechanical arm based on a spherical model.
Disclosure of Invention
The invention provides a visual servo and multitasking control method for a flying mechanical arm based on a spherical model, which can mount a mechanical arm with active control capability on an unmanned aerial vehicle.
The invention adopts the following technical scheme.
The visual servo structure of the flying mechanical arm based on the spherical model comprises a six-degree-of-freedom unmanned aerial vehicle, a three-degree-of-freedom mechanical arm and a control module; the mechanical arm is arranged at a mechanical arm mounting surface of the base of the unmanned aerial vehicle; a depth camera is further arranged at the mounting surface of the mechanical arm; the shooting angle of the depth camera is set at an angle with the mounting surface of the mechanical arm, and the control module is connected with the depth camera and the mechanical arm and enables the depth camera and the mechanical arm to be combined into the robot hand-eye calibration system.
The mechanical arm comprises three rotating joints; the mechanical arm mounting surface is positioned right below the platform of the base; the depth camera is an Intel RealSense camera.
An actuator is arranged at the tail end of the mechanical arm; the control module is a control module capable of controlling the flight of the unmanned aerial vehicle; when the flying mechanical arm needs to operate the target, the control module evaluates the distance of the target through the depth camera, if the target is located outside the operation range of the mechanical arm, the control module drives the unmanned aerial vehicle to fly towards the target, and if the target is located within the operation range of the mechanical arm, the control module drives the mechanical arm to operate the target.
The control method uses the above flying mechanical arm visual servo structure based on the spherical model and comprises the following steps:
step S1: the control module acquires target object information through the depth camera, establishes a spherical coordinate system for servo flight control based on the spherical model, and calculates the difference of the real-time position of the target object relative to the expected position of the unmanned aerial vehicle in real time based on the spherical coordinate system in the flight process of the unmanned aerial vehicle so as to control the unmanned aerial vehicle to fly towards the target object by a visual servo control method based on the spherical coordinate system and obtain the servo speed of the unmanned aerial vehicle;
step S2: when the unmanned aerial vehicle is close to the target object, and the target object reaches a desired position relative to the unmanned aerial vehicle, the control module acquires target object information and information of an actuator at the tail end of the mechanical arm through the depth camera, and performs servo control on the mechanical arm based on a spherical coordinate system, and meanwhile obtains the angular speed of the mechanical arm in the spherical coordinate system;
step S3: the motion of the mechanical arm is controlled through a visual error equation of the flying mechanical arm so as to eliminate the visual error of the depth camera, so that the control module can carry out servo control on the mechanical arm to move the actuator, and the actuator at the tail end of the mechanical arm is moved and positioned to the target object;
step S4: the method comprises the steps of controlling a flying mechanical arm to position and grab a target object by adopting a coordination control method aiming at multiple tasks, wherein the multiple tasks comprise mechanical arm tail end position control, target object posture estimation, dynamic compensation system gravity center change and mechanical arm joint limit avoidance, and in the process of positioning and grabbing the target object, the coordination control method enables the task type executed by the flying mechanical arm to be matched with the current working condition of the flying mechanical arm.
The spherical model adopts latitude-longitude coordinates m = (θ_z, θ_x)^T. Let the coordinate of the position point of the mechanical arm actuator or of the target object in the camera coordinate system be M_i = (x_i y_i z_i)^T; the feature mapping is then given by formula one,
where R² = x² + y² + z².
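The latitude-longitude feature mapping above can be sketched in code. Since formula one itself is not reproduced in the text, the mapping below is a hedged assumption using a common spherical convention (colatitude from the optical axis, azimuth in the image plane); only R² = x² + y² + z² is taken from the patent.

```python
import math

def spherical_features(x, y, z):
    # Map a camera-frame point M_i = (x, y, z) to latitude-longitude
    # features m = (theta_z, theta_x). The exact mapping of formula one
    # is not reproduced in the patent text; this sketch assumes a common
    # convention: theta_z is the colatitude from the optical (z) axis,
    # theta_x the azimuth about it, with R^2 = x^2 + y^2 + z^2.
    R = math.sqrt(x * x + y * y + z * z)
    theta_z = math.acos(z / R)
    theta_x = math.atan2(y, x)
    return theta_z, theta_x
```

Under this convention a point on the optical axis, e.g. (0, 0, 1), maps to colatitude 0, so the feature error vanishes when the target is centred.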
Step S1 comprises the following steps:
step S11: the spherical coordinate system is a camera coordinate system with the depth camera at its origin; the depth camera moves with velocity v_o = (v^T, w^T)^T in the world coordinate system. Taking the observed position point of the object as a spherical feature point, the velocity of the spherical feature point relative to the camera coordinate system is given by formula two, namely:
step S12: deriving the formula I, and combining the formula II to obtain a spherical-based image jacobian matrix, wherein the spherical-based image jacobian matrix is as follows:
step S13: the spherical feature point motion equation is obtained as follows:
where v_o = (v^T, w^T)^T ∈ R^(6×1), and v = (v_x v_y v_z)^T ∈ R^(3×1), w = (w_x w_y w_z)^T ∈ R^(3×1) respectively represent the translational and rotational velocities of the drone.
Step S2 comprises the following steps:
step S21: let p c =(x c y c z c ) T ∈R 3×1 The position of the position point P of the mechanical arm end effector under the camera coordinate system is expressed, and the time differentiation of the formula I can be obtained:
wherein ,
step S22: the tail end point P of the mechanical arm is accelerated under a camera coordinate systemIn connection with equation two, it can be expressed as a function of the motion of the end effector:
wherein ,B v e =( B T e B W e ) T ∈R 6×1 representing the velocity vector of the end point of the arm. sk () represents the oblique symmetry matrix of the vector.
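The operator sk(·) above is the standard skew-symmetric (cross-product) matrix; a minimal sketch:

```python
import numpy as np

def sk(v):
    # Skew-symmetric matrix of a 3-vector v, defined so that
    # sk(a) @ b equals the cross product a x b.
    return np.array([[0.0,  -v[2],  v[1]],
                     [v[2],  0.0,  -v[0]],
                     [-v[1], v[0],  0.0]])
```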
Step S23: according to the robotics of the present invention, B v e can be expressed as a function of the angular velocity of the joints of the mechanical arm:
wherein L (q) ∈R 6×3 Is a mechanical arm jacobian matrix.
Step S24: the comprehensive formula five, the formula six and the formula seven can be obtained:
the visual error equation of the flying mechanical arm is as follows:
e = m_e − m_0 (formula nine)
Differentiating formula nine with respect to time yields formula ten,
where J_first = (L_s −J_ov −J_owz) ∈ R^(2×7) represents the first-task jacobian matrix of the system and the remaining term represents the generalized velocity input of the system.
Step S4 comprises the following steps:
step S41: the first task is to control an actuator at the tail end position of the mechanical arm, and the input control law of the first task of the system can be obtained through a formula ten, wherein the input control law is as follows:
wherein, if J first Is of full rank of lines, thenΛ 1 Is a positive constant;
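The full-row-rank condition on J_first guarantees that its Moore-Penrose pseudo-inverse exists in closed form, J⁺ = Jᵀ(JJᵀ)⁻¹. A sketch with a hypothetical 2×7 jacobian (random values, not the patent's matrix):

```python
import numpy as np

# Hypothetical 2x7 first-task jacobian; random Gaussian entries are
# full row rank with probability one (illustrative stand-in for J_first).
rng = np.random.default_rng(0)
J = rng.standard_normal((2, 7))

# Right pseudo-inverse for a full-row-rank matrix: J+ = J^T (J J^T)^-1.
J_pinv = J.T @ np.linalg.inv(J @ J.T)
```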
at this time, a system task coordination control weighting formula containing weights for separating the flight platform and the mechanical arm variables is required to be designed, and the differential result of the formula is brought into a formula nine, so that the final system can tend to be exponentially stable:
in order to avoid the robot arm being fully stretched to a singular point during the process of grasping the target object using visual guidance, the following two possible situations are considered. Firstly, when the end effector is far away from the target, the moving unmanned aerial vehicle is better than the moving mechanical arm to move the four-rotor platform; secondly, when the end effector approaches to the target, the target is in the operable range of the mechanical arm, and the servo effect is better achieved through the movement of the mechanical arm than the movement of the four-rotor platform; for a flying manipulator, the movements of the flying platform and the manipulator are quite different; thus, a system mission coordination control weighting formula is designed that contains weights to separate the flight deck and the robot variables:
where μ is a positive constant, and W_1, W_2 are the weight matrices separating the quadrotor platform variables and the mechanical arm joint variables, as follows:
n* is the expected value of the visual error norm of the flight platform motion, and n is its current value. D(·) is a threshold function, defined as follows:
and carrying out mathematical differentiation on the task coordination control weighting formula to obtain:
according to the mass center solving theory of the multi-body system, the mass center C of the system can be obtained G The calculation formula performs the task coordination control weighting formula ψThe partial derivative is obtained:
by comparing the formula thirteen with the formula seventeen, the weight-containing value can be obtainedNamely the following formula:
step S42: the second task: positioning the tail end of the mechanical arm; in a second task, solving a small number of equation sets according to an EPnP algorithm in a known corresponding relation between the coordinates of the 2D point and the 3D point to obtain a posture matrix T of the camera under the reference system of the mass center of the target object c ∈R 3×3 Inverting the matrix to obtain the posture of the target under the camera coordinate system, which is marked as T c -1 The position of the target object under the coordinate system C is p c ∈R 3×1 The positioning gesture matrix of the target object under the coordinate system C is as follows:
step S43: third task: system center of gravity shift control
The yaw angle only changes the heading of the operation-type unmanned aerial vehicle and has very little influence on the attitude, so attitude control of the operation-type unmanned aerial vehicle mainly targets pitch and roll stability. The expected yaw value ψ_d is therefore assumed fixed, and the ideal roll angle φ_d and pitch angle θ_d are output. On the basis of the integrated model of the flying mechanical arm, the position controller is designed as:
where u_1 is the lift input of the system, M_s is the total mass of the system, η = (η_1 η_2 η_3)^T, and F(C_G) is the system centre-of-gravity offset parameter. In the unmanned aerial vehicle body base coordinate system, according to multi-body system centroid theory, the system centroid C_G is:
where M_i represents the mass of the i-th joint of the mechanical arm, and C_Gi represents the position coordinates of the centroid of joint i in the unmanned aerial vehicle body base coordinate system;
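The multi-body centroid referenced above is the standard mass-weighted average C_G = Σ_i M_i·C_Gi / Σ_i M_i; a sketch (function name and example values are assumptions):

```python
import numpy as np

def system_centroid(masses, centroids):
    # Multi-body centroid in the UAV body base frame:
    # C_G = sum_i(M_i * C_Gi) / sum_i(M_i), the standard
    # mass-weighted average over the arm joints (and body).
    masses = np.asarray(masses, dtype=float)
    centroids = np.asarray(centroids, dtype=float)
    return (masses[:, None] * centroids).sum(axis=0) / masses.sum()
```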
the attitude controller mainly controls the rolling angle, the pitch angle and the yaw angle of the operation type unmanned aerial vehicle, and the actual inertial tensor deviation value of the system is estimatedAnd center of gravity offset parameter->In order to enable the attitude of the flying mechanical arm to reach the expected attitude, an attitude error function is defined as follows:
with inversion control, the following control laws are designed:
where u_2, u_3, u_4 are respectively the roll, pitch, and yaw moment inputs of the system; s_i (i = 1, 2, 3) are the state errors defined for the system, with their corresponding state variables; the parameters k_i, l_i are adjusted to set the upper bounds of the attitude error and state error; G(·) represents the Gaussian basis function output; μ_i, ξ_i (i = 1, 2, 3) are positive constants; p, q, r are the angular velocity components in coordinate system B; and the remaining term is a constant parameter matrix.
Step S44: fourth task: avoiding joint limitation
Defining a joint angle error:
e_q = q* − q (formula twenty-five)
where q* is the desired value of the joint angle, and q_L, q_H are the joint limit minimum and maximum values. The following task function is defined:
where k_t is the weight matrix that normalizes the joint angles, k_t = diag{(q_H1 − q_L1)^(−2), (q_H2 − q_L2)^(−2), (q_H3 − q_L3)^(−2)}. Given the desired value of the task function, the desired task is therefore as follows:
where the operator used is the pseudo-inverse of the matrix J_f = (I_(3×3) 0_(3×4)).
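The weight matrix k_t in step S44 normalizes each joint-angle error by the square of its admissible range; a sketch of its construction (the limit values used for illustration are assumptions):

```python
import numpy as np

def joint_limit_weights(q_low, q_high):
    # k_t = diag{(q_Hi - q_Li)^-2}: each joint-angle error is scaled
    # by the inverse square of that joint's admissible range, so joints
    # with narrow limits are weighted more heavily.
    span = np.asarray(q_high, dtype=float) - np.asarray(q_low, dtype=float)
    return np.diag(span ** -2.0)
```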
Compared with the prior art, the invention has the following beneficial effects: unlike classical visual servo modeling schemes, the control system of the present invention uses image features based on a spherical model to build the visual error equation. The visual servo scheme provided by the invention offers as large a field of view as possible, which is very suitable for unmanned aerial vehicles. It removes the need to keep features within the field of view, reducing the uncertainty of optical flow field motion. To address the under-actuated, highly coupled character of the flying mechanical arm, the invention performs the visual servo task with an image jacobian matrix in which linear velocity and angular velocity are decoupled.
Meanwhile, a task coordination function between the unmanned aerial vehicle and the manipulator is designed, a weighted jacobian matrix based on the visual error norm of the unmanned aerial vehicle is obtained, and a pseudo-inverse matrix of the jacobian matrix is deduced. The measures improve task coordination between the unmanned aerial vehicle and the manipulator, and avoid the limit of the joints of the manipulator.
Aiming at the problem of gravity center offset caused by movement of a manipulator, the invention designs a sliding mode position control system based on gravity center offset compensation. Meanwhile, an attitude controller based on inversion control is designed, and the stability of the flying mechanical arm system is improved. The unmanned aerial vehicle can stably hover over the target, and the manipulator can orderly and safely grasp the target.
Compared with a conventional unmanned aerial vehicle, the flying mechanical arm can quickly capture an aerial or ground target in the flying process, can quickly reach a complex environment which cannot be accessed by a ground robot to execute fine operation tasks such as installation or recovery of measurement equipment, and the like, and greatly improves the physical interaction capability between the unmanned aerial vehicle and the environment. Therefore, the method has great application value.
Drawings
The invention is described in further detail below with reference to the attached drawings and detailed description:
FIG. 1 is a schematic diagram of an apparatus according to the present invention;
FIG. 2 is a schematic flow chart of the present invention;
FIG. 3 is a schematic diagram of the coordinate variation during servo flight control in step S1;
FIG. 4 is a schematic diagram of the coordinate variation of the servo control of the mechanical arm in step S2;
FIG. 5 is a schematic diagram of the coordinate change when the robot arm is servo-controlled to move the actuator in step S3;
in the figure: 1-unmanned aerial vehicle; 2-depth camera; 3-mechanical arm.
Detailed Description
As shown in fig. 1, this embodiment adopts the visual servo structure described above: the six-degree-of-freedom unmanned aerial vehicle 1 carries the depth camera 2 and the three-degree-of-freedom mechanical arm 3 with its tail-end actuator on the mechanical arm mounting surface of its base, and the control module combines the depth camera and mechanical arm into the robot hand-eye calibration system.
As shown in fig. 2, the control method proceeds through steps S1 to S4, using the spherical model and latitude-longitude coordinates described above.
In this example, as shown in fig. 3, step S1 (steps S11 to S13) is carried out as described above.
In this example, as shown in fig. 4, step S2 (steps S21 to S24) is carried out as described above.
in this example, as shown in fig. 5, step S3 specifically includes:
the visual error equation of the flying mechanical arm is as follows:
e=m e -m 0 (equation nine)
The time differentiation of equation nine can be obtained:
wherein ,Jfirst =(L s -J ov -J owz )∈R 2×7 Representing a first task jacobian matrix of the system,representing the generalized speed input of the system, finally +.>
The step S4 comprises the following steps:
step S41: the first task is to control an actuator at the tail end position of the mechanical arm, and the input control law of the first task of the system can be obtained through a formula ten, wherein the input control law is as follows:
wherein, if J first Is of full rank of lines, thenΛ 1 Is a positive constant;
at this time, a system task coordination control weighting formula containing weights for separating the flight platform and the mechanical arm variables is required to be designed, and the differential result of the formula is brought into a formula nine, so that the final system can tend to be exponentially stable:
in order to avoid the robot arm being fully stretched to a singular point during the process of grasping the target object using visual guidance, the following two possible situations are considered. Firstly, when the end effector is far away from the target, the moving unmanned aerial vehicle is better than the moving mechanical arm to move the four-rotor platform; secondly, when the end effector approaches to the target, the target is in the operable range of the mechanical arm, and the servo effect is better achieved through the movement of the mechanical arm than the movement of the four-rotor platform; for a flying manipulator, the movements of the flying platform and the manipulator are quite different; thus, a system mission coordination control weighting formula is designed that contains weights to separate the flight deck and the robot variables:
wherein: mu is a normal number, W 1 ,W 2 The weight matrix for separating the four-rotor platform variable and the robot arm joint variable is as follows:
n * is the expected value of the visual error norm of the motion of the flying platform, and n is the current value of the visual error norm of the motion of the flying platform. D () is a defined threshold function, defined as follows:
carrying out mathematical differentiation on the task-coordination control weighting formula gives:
according to the multi-body-system centroid solving theory, the calculation formula of the system centroid C_G can be obtained; taking the partial derivative of the task-coordination control weighting formula ψ with it gives:
comparing formula thirteen with formula seventeen, the weighted value can be obtained, namely the following formula:
step S42: the second task: positioning the tail end of the mechanical arm. In the second task, with the correspondence between the 2D point coordinates and the 3D point coordinates known, a small number of equation sets are solved according to the EPnP algorithm to obtain the pose matrix T_c ∈ R^{3×3} of the camera in the target-object centroid reference frame; inverting this matrix gives the pose of the target in the camera coordinate system, denoted T_c^{-1}; the position of the target object in coordinate system C is p_c ∈ R^{3×1}, and the positioning pose matrix of the target object in coordinate system C is:
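The matrix inversion in the second task reduces to a transpose for a pure rotation. A minimal numpy sketch of that inversion step (the EPnP solver itself, e.g. OpenCV's SOLVEPNP_EPNP flag to `cv2.solvePnP`, is not reproduced here):

```python
import numpy as np

def invert_pose(T_c, p_c):
    # T_c: camera pose (rotation) in the target-centroid frame, p_c: position.
    # For a rotation matrix the inverse is its transpose, so the target pose
    # in the camera frame is T_c^{-1} = T_c.T, with position -T_c.T @ p_c.
    R_inv = T_c.T
    return R_inv, -R_inv @ p_c
```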
step S43: third task: system center of gravity shift control
The yaw angle only changes the heading of the operation-type unmanned aerial vehicle and has very little influence on the attitude, so the attitude control of the operation-type unmanned aerial vehicle mainly stabilizes pitch and roll. Therefore, assume the expected value ψ_d of the yaw angle is a fixed value, and output the ideal roll angle and pitch angle θ_d. On the basis of the integral modeling of the flying mechanical arm, the position controller is designed as:
wherein u_1 is the lift input of the system, M_s is the total mass of the system, η = (η_1 η_2 η_3)^T, and F(C_G) is the system center-of-gravity offset parameter. In the unmanned aerial vehicle body base coordinate system, according to the multi-body-system centroid solving theory, the system centroid C_G is:
wherein M_i represents the mass of the i-th joint of the mechanical arm, and C_Gi represents the position coordinates of the centroid of joint i in the base coordinate system of the unmanned aerial vehicle body;
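The centroid formula in step S43 is the standard mass-weighted mean from multi-body centroid theory; a minimal sketch, with hypothetical masses and joint centroids:

```python
import numpy as np

def system_centroid(masses, centroids):
    # C_G = sum_i(M_i * C_Gi) / sum_i(M_i), with every joint centroid C_Gi
    # expressed in the UAV body base coordinate system
    masses = np.asarray(masses, dtype=float)
    centroids = np.asarray(centroids, dtype=float)
    return (masses[:, None] * centroids).sum(axis=0) / masses.sum()
```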
the attitude controller mainly controls the roll angle, the pitch angle, and the yaw angle of the operation-type unmanned aerial vehicle, and estimates the actual inertia-tensor deviation value of the system and the center-of-gravity offset parameter. In order to make the attitude of the flying mechanical arm reach the expected attitude, an attitude error function is defined as follows:
using backstepping (inversion) control, the following control laws are designed:
wherein u_2, u_3, u_4 are respectively the roll-moment input, the pitch-moment input, and the yaw-moment input of the system; s_i (i = 1, 2, 3) are the state errors defined for the system, with the corresponding state variables; the corresponding parameters k_i, l_i are adjusted to tune the upper limit values of the attitude error and the state error; the function G(·) represents the Gaussian basis-function output; μ_i, ξ_i (i = 1, 2, 3) are positive constants; p, q, r is the angular-velocity vector in coordinate system B; and the remaining symbol is a constant parameter matrix.
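The Gaussian basis-function output G(·) used in the backstepping laws can be sketched as follows; the centers and width are assumptions, since the patent does not state them:

```python
import numpy as np

def gaussian_basis(s, centers, width=1.0):
    # G(s): one Gaussian output per center, used to approximate the
    # uncertain inertia / center-of-gravity terms in the attitude laws
    centers = np.asarray(centers, dtype=float)
    return np.exp(-((s - centers) ** 2) / (2.0 * width ** 2))
```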
Step S44: fourth task: avoiding joint limitation
Defining a joint angle error:
e_q = q^* - q (formula twenty-five)
wherein q^* is the expected value of the joint angle, and q_L, q_H are the joint-limit minimum and maximum values. The following task function is defined:
wherein k_t is the weight matrix controlling joint-angle normalization, k_t = diag{(q_H1 - q_L1)^{-2} (q_H2 - q_L2)^{-2} (q_H3 - q_L3)^{-2}}; the expected value of the task function is given, so the expected task is:
wherein the symbol denotes the pseudo-inverse of the matrix J_f = (I_{3×3} 0_{3×4}).
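The fourth task can be illustrated with the normalization weights of formula twenty-five and following; the quadratic task-function form and the midpoint choice for q^* are assumptions (the patent's task-function formula is an image), and the joint limits below are hypothetical:

```python
import numpy as np

q_L = np.array([-1.0, -1.5, -1.2])   # hypothetical joint-limit minima
q_H = np.array([ 1.0,  1.5,  1.2])   # hypothetical joint-limit maxima
q_star = 0.5 * (q_L + q_H)           # assumed expected value: limit midpoint
k_t = np.diag((q_H - q_L) ** -2.0)   # k_t = diag{(q_Hi - q_Li)^-2}

def joint_limit_task(q):
    e_q = q_star - q                  # joint angle error, formula twenty-five
    return 0.5 * e_q @ k_t @ e_q      # assumed quadratic task function
```

At the limit midpoint the task value is zero, and it grows symmetrically as any joint approaches its limit, which is the behavior the normalization weights are designed for.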
Claims (7)
1. A flying mechanical arm visual servo structure based on a spherical model, characterized in that: the servo structure comprises a six-degree-of-freedom unmanned aerial vehicle, a three-degree-of-freedom mechanical arm, and a control module; the mechanical arm is arranged at a mechanical-arm mounting surface of the base of the unmanned aerial vehicle; a depth camera is further arranged at the mounting surface of the mechanical arm; the shooting angle of the depth camera is set at an angle to the mounting surface of the mechanical arm, and the control module is connected with the depth camera and the mechanical arm so that the depth camera and the mechanical arm are combined into a robot hand-eye calibration system;
the mechanical arm comprises three rotating joints; the mechanical arm mounting surface is positioned right below the platform of the base;
an actuator is arranged at the tail end of the mechanical arm; the control module is a control module capable of controlling the flight of the unmanned aerial vehicle; when the flying mechanical arm needs to operate the target, the control module evaluates the distance of the target through the depth camera, if the target is located outside the operation range of the mechanical arm, the control module drives the unmanned aerial vehicle to fly towards the target, and if the target is located within the operation range of the mechanical arm, the control module drives the mechanical arm to operate the target;
the control method uses the flying mechanical arm visual servo structure based on the spherical model and comprises the following steps:
step S1: the control module acquires target object information through the depth camera, establishes a spherical coordinate system for servo flight control based on the spherical model, and calculates the difference of the real-time position of the target object relative to the expected position of the unmanned aerial vehicle in real time based on the spherical coordinate system in the flight process of the unmanned aerial vehicle so as to control the unmanned aerial vehicle to fly towards the target object by a visual servo control method based on the spherical coordinate system and obtain the servo speed of the unmanned aerial vehicle;
step S2: when the unmanned aerial vehicle is close to the target object, and the target object reaches a desired position relative to the unmanned aerial vehicle, the control module acquires target object information and information of an actuator at the tail end of the mechanical arm through the depth camera, and performs servo control on the mechanical arm based on a spherical coordinate system, and meanwhile obtains the angular speed of the mechanical arm in the spherical coordinate system;
step S3: the motion of the mechanical arm is controlled through a visual error equation of the flying mechanical arm so as to eliminate the visual error of the depth camera, so that the control module can carry out servo control on the mechanical arm to move the actuator, and the actuator at the tail end of the mechanical arm is moved and positioned to the target object;
step S4: the method comprises the steps of controlling a flying mechanical arm to position and grab a target object by adopting a coordination control method aiming at multiple tasks, wherein the multiple tasks comprise mechanical arm tail end position control, target object posture estimation, dynamic compensation system gravity center change and mechanical arm joint limit avoidance, and in the process of positioning and grabbing the target object, the coordination control method enables the task type executed by the flying mechanical arm to be matched with the current working condition of the flying mechanical arm.
2. The spherical model-based visual servo structure of a flying manipulator according to claim 1, wherein: the spherical model adopts latitude-longitude coordinates m = (θ_z, θ_x)^T; assuming the coordinate of the position point of the mechanical-arm actuator or the target object in the camera coordinate system is M_i = (x_i y_i z_i)^T, then:
wherein R^2 = x^2 + y^2 + z^2.
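A hedged sketch of the latitude-longitude projection: the exact formula for m = (θ_z, θ_x)^T is an image in the source, so the angle conventions below (θ_z measured from the optical axis, θ_x as azimuth) are assumptions; only R^2 = x^2 + y^2 + z^2 is taken from the text.

```python
import numpy as np

def spherical_coords(M):
    # m = (theta_z, theta_x)^T for a point M = (x, y, z) in the camera frame
    x, y, z = M
    R = np.sqrt(x * x + y * y + z * z)   # R^2 = x^2 + y^2 + z^2
    theta_z = np.arccos(z / R)           # assumed: angle from optical axis
    theta_x = np.arctan2(y, x)           # assumed: azimuth
    return np.array([theta_z, theta_x])
```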
3. The spherical model-based visual servo structure of a flying manipulator according to claim 1, wherein: the step S1 comprises the following steps of;
step S11: the spherical coordinate system is a camera coordinate system with the depth camera as the origin; the depth camera moves with velocity v_o = (v^T, w^T)^T in the world coordinate system; the position point of the target object observed by the camera is taken as a spherical feature point, and the velocity of the spherical feature point relative to the camera coordinate system is:
step S12: deriving the formula I, and combining the formula II to obtain a spherical-based image jacobian matrix, wherein the spherical-based image jacobian matrix is as follows:
step S13: the spherical feature point motion equation is obtained as follows:
wherein v_o = (v^T, w^T)^T ∈ R^{6×1}, and v = (v_x v_y v_z)^T ∈ R^{3×1}, w = (w_x w_y w_z)^T ∈ R^{3×1} respectively represent the translational and rotational velocities of the unmanned aerial vehicle.
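Formulas one and two rest on the standard relative-motion relation for a static point observed from a camera moving with translational velocity v and angular velocity w; a minimal sketch, including the skew-symmetric operator used later in the text:

```python
import numpy as np

def sk(v):
    # skew-symmetric matrix Sk(v), so that sk(v) @ u == np.cross(v, u)
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def point_velocity_in_camera(p, v, w):
    # velocity of a static world point p relative to the moving camera frame
    return -v - np.cross(w, p)
```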
4. The spherical model-based visual servoing and multitasking control method for a flying robot arm of claim 3, comprising: the step S2 comprises the following steps;
step S21: let p_c = (x_c y_c z_c)^T ∈ R^{3×1} denote the position of point P of the mechanical-arm end effector in the camera coordinate system; time differentiation of formula one gives:
wherein:
step S22: the velocity of the tail-end point P of the mechanical arm in the camera coordinate system, combined with formula two, can be expressed as a function of the motion of the end effector:
wherein ^Bv_e = (^BT_e ^BW_e)^T ∈ R^{6×1} is the velocity vector of the tail-end point of the mechanical arm; Sk(·) represents the skew-symmetric matrix of a vector;
step S23: according to robot kinematics, ^Bv_e can be expressed as a function of the angular velocities of the mechanical-arm joints:
wherein L(q) ∈ R^{6×3} is the mechanical-arm Jacobian matrix;
step S24: combining formula five, formula six, and formula seven gives:
5. the spherical model-based visual servoing and multitasking control method for a flying robot arm of claim 3, comprising: the visual error equation of the flying mechanical arm is as follows:
e = m_e - m_0 (formula nine)
Time differentiation of formula nine gives:
wherein J_first = (L_s -J_ov -J_owz) ∈ R^{2×7} represents the first-task Jacobian matrix of the system, and the remaining symbol represents the generalized velocity input of the system; finally:
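The differentiated error equation says the visual-error rate is linear in the seven-dimensional generalized velocity input; a toy check of that structure (the entries of J_first here are hypothetical, not the patent's values):

```python
import numpy as np

def error_rate(J_first, zeta):
    # e_dot = J_first @ zeta, with J_first in R^{2x7} stacking the spherical
    # image-Jacobian terms (L_s, -J_ov, -J_owz) and zeta the generalized
    # velocity input of the platform-plus-arm system
    return J_first @ zeta
```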
6. The spherical model-based visual servoing and multitasking control method for a flying robot arm of claim 4, comprising:
the step S4 comprises the following steps:
step S41: the first task is to control the actuator at the tail-end position of the mechanical arm; the input control law of the first task of the system can be obtained from formula ten as follows:
wherein J_first is assumed to be of full row rank so that its pseudo-inverse exists, and Λ_1 is a positive constant;
at this time, a system task-coordination control weighting formula containing weights that separate the flight-platform and mechanical-arm variables needs to be designed; substituting the differential result of this formula into formula nine makes the final system tend to exponential stability:
in order to avoid the mechanical arm being fully extended to a singular point while the flying mechanical arm grasps the target object under visual guidance, the following two possible situations are considered; first, when the end effector is far from the target, moving the quadrotor platform is better than moving the mechanical arm; second, when the end effector approaches the target, the target is within the operable range of the mechanical arm, and the servo effect is better achieved by moving the mechanical arm than by moving the quadrotor platform; for a flying mechanical arm, the motions of the flight platform and the mechanical arm differ greatly; therefore, a system task-coordination control weighting formula containing weights that separate the flight-platform and mechanical-arm variables is designed:
wherein μ is a positive constant, and W_1, W_2 are the weight matrices that separate the quadrotor-platform variables and the mechanical-arm joint variables, as follows:
n^* is the expected value of the visual-error norm for flight-platform motion, and n is the current value of the visual-error norm for flight-platform motion; D(·) is a defined threshold function, defined as follows:
carrying out mathematical differentiation on the task-coordination control weighting formula gives:
according to the multi-body-system centroid solving theory, the calculation formula of the system centroid C_G can be obtained; taking the partial derivative of the task-coordination control weighting formula ψ with it gives:
comparing formula thirteen with formula seventeen, the weighted value can be obtained, namely the following formula:
step S42: the second task: positioning the tail end of the mechanical arm; in the second task, with the correspondence between the 2D point coordinates and the 3D point coordinates known, a small number of equation sets are solved according to the EPnP algorithm to obtain the pose matrix T_c ∈ R^{3×3} of the camera in the target-object centroid reference frame; inverting this matrix gives the pose of the target in the camera coordinate system, denoted T_c^{-1}; the position of the target object in coordinate system C is p_c ∈ R^{3×1}, and the positioning pose matrix of the target object in coordinate system C is:
step S43: third task: system center of gravity shift control
The yaw angle only changes the heading of the operation-type unmanned aerial vehicle and has very little influence on the attitude, so the attitude control of the operation-type unmanned aerial vehicle mainly stabilizes pitch and roll; therefore, assume the expected value ψ_d of the yaw angle is a fixed value, and output the ideal roll angle and pitch angle θ_d; on the basis of the integral modeling of the flying mechanical arm, the position controller is designed as:
wherein u_1 is the lift input of the system, M_s is the total mass of the system, η = (η_1 η_2 η_3)^T, and F(C_G) is the system center-of-gravity offset parameter; in the unmanned aerial vehicle body base coordinate system, according to the multi-body-system centroid solving theory, the system centroid C_G is:
wherein M_i represents the mass of the i-th joint of the mechanical arm, and C_Gi represents the position coordinates of the centroid of joint i in the base coordinate system of the unmanned aerial vehicle body;
the attitude controller mainly controls the roll angle, the pitch angle, and the yaw angle of the operation-type unmanned aerial vehicle, and estimates the actual inertia-tensor deviation value of the system and the center-of-gravity offset parameter; in order to make the attitude of the flying mechanical arm reach the expected attitude, an attitude error function is defined as follows:
using backstepping (inversion) control, the following control laws are designed:
wherein u_2, u_3, u_4 are respectively the roll-moment input, the pitch-moment input, and the yaw-moment input of the system; s_i (i = 1, 2, 3) are the state errors defined for the system, with the corresponding state variables; the corresponding parameters k_i, l_i are adjusted to tune the upper limit values of the attitude error and the state error; the function G(·) represents the Gaussian basis-function output; μ_i, ξ_i (i = 1, 2, 3) are positive constants; p, q, r is the angular-velocity vector in coordinate system B; and the remaining symbol is a constant parameter matrix;
step S44: fourth task: avoiding joint limitation
Defining a joint angle error:
e_q = q^* - q (formula twenty-five)
wherein q^* is the expected value of the joint angle, and q_L, q_H are the joint-limit minimum and maximum values; the following task function is defined:
wherein k_t is the weight matrix controlling joint-angle normalization, k_t = diag{(q_H1 - q_L1)^{-2} (q_H2 - q_L2)^{-2} (q_H3 - q_L3)^{-2}}; the expected value of the task function is given, so the expected task is:
wherein the symbol denotes the pseudo-inverse of the matrix J_f = (I_{3×3} 0_{3×4}).
7. The spherical model-based visual servo structure of a flying manipulator according to claim 1, wherein: the depth camera is an Intel RealSense camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010848414.5A CN111923049B (en) | 2020-08-21 | 2020-08-21 | Visual servo and multitasking control method for flying mechanical arm based on spherical model |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111923049A CN111923049A (en) | 2020-11-13 |
CN111923049B true CN111923049B (en) | 2023-11-03 |
Family
ID=73305696
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104322048A (en) * | 2012-05-22 | 2015-01-28 | Otoy公司 | Portable mobile light stage |
CN107139178A (en) * | 2017-05-10 | 2017-09-08 | 哈尔滨工业大学深圳研究生院 | A kind of grasping means of unmanned plane and its view-based access control model |
CN108248845A (en) * | 2018-01-31 | 2018-07-06 | 湖南大学 | A kind of rotor flying mechanical arm system and algorithm based on dynamic center of gravity compensation |
CN108453738A (en) * | 2018-03-30 | 2018-08-28 | 东南大学 | A kind of quadrotor based on Opencv image procossings independently captures the control method of operation in the air |
CN109895099A (en) * | 2019-03-28 | 2019-06-18 | 哈尔滨工业大学(深圳) | A kind of flight mechanical arm visual servo grasping means based on physical feature |
CN110900581A (en) * | 2019-12-27 | 2020-03-24 | 福州大学 | Four-degree-of-freedom mechanical arm vision servo control method and device based on RealSense camera |
CN111015673A (en) * | 2020-01-02 | 2020-04-17 | 福州大学 | Four-degree-of-freedom mechanical arm teleoperation system and method for operation type flying robot |
CN212635747U (en) * | 2020-08-21 | 2021-03-02 | 福州大学 | Visual servo structure of flying mechanical arm based on spherical model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||