CN109895099B - Flying mechanical arm visual servo grabbing method based on natural features - Google Patents

Flying mechanical arm visual servo grabbing method based on natural features

Info

Publication number
CN109895099B
CN109895099B (application CN201910241810.9A)
Authority
CN
China
Prior art keywords
mechanical arm
unmanned aerial
aerial vehicle
vehicle body
speed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910241810.9A
Other languages
Chinese (zh)
Other versions
CN109895099A (en)
Inventor
陈浩耀
罗斌
刘云辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Graduate School Harbin Institute of Technology
Original Assignee
Shenzhen Graduate School Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Graduate School Harbin Institute of Technology filed Critical Shenzhen Graduate School Harbin Institute of Technology
Priority to CN201910241810.9A priority Critical patent/CN109895099B/en
Publication of CN109895099A publication Critical patent/CN109895099A/en
Application granted granted Critical
Publication of CN109895099B publication Critical patent/CN109895099B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides a visual servo grabbing method for a flying mechanical arm based on natural features, comprising front-end natural feature point extraction and back-end visual servo computation. By matching the real-time image with the expected feature points, the homography transformation between the two images is obtained, and the positions of the expected-image corner points in the real-time image are then computed through the affine transformation given by the homography matrix. The difference between the real-time corner positions obtained by the front end and the expected corner positions yields the servo velocity of the camera, which is finally used for velocity control of the unmanned aerial vehicle and the mechanical arm. The invention can be applied to long-distance transportation and carrying tasks, solves the problem of visual servoing without artificial markers, and achieves efficient and accurate grabbing with the mechanical arm, and therefore has great application value.

Description

Flying mechanical arm visual servo grabbing method based on natural features
Technical Field
The invention relates to the field of unmanned aerial vehicles, and in particular to a visual servo grabbing method for a flying mechanical arm based on natural features.
Background
With the rapid development of robotics, robots are widely applied in industrial manufacturing, military operations and civilian life. In both military and civilian applications, transportation by multi-rotor unmanned aerial vehicles plays an important role, since multi-rotor unmanned aerial vehicles can take off and land vertically and hover. Gradually, multi-rotor unmanned aerial vehicles are no longer limited to transportation; composite unmanned aerial vehicles will become a new generation of aerial operation robots and will replace humans in dangerous and complex aerial operation tasks.
In recent years, researchers have hoped that unmanned aerial vehicles could grasp objects in the environment with arms the way humans do, and the flying mechanical arm was therefore created. The flying mechanical arm is a complex under-actuated system, and the multi-degree-of-freedom mechanical arm it carries is controlled differently from an industrial mechanical arm; research on it worldwide is still at a preliminary stage. In most environments the motion and grasping of the flying mechanical arm can only proceed while the robot body remains static or stable, which greatly restricts the motion of the robot, and the motion of the mechanical arm in turn disturbs the unmanned aerial vehicle; visual servoing can suppress such disturbances well and achieve accurate grasping. At present there are only a few domestic publications on flying mechanical arms, and although a few related inventions on unmanned aerial vehicle mechanical arms have been patented, visual servo grasping with a flying mechanical arm is hardly addressed. Usually the target of visual servoing is an artificial marker that must be placed in advance, which prevents friendly interaction with the environment and does not match real scenes. Most researchers only consider grasping within the stroke range of the mechanical arm; in practice, when the distance between the unmanned aerial vehicle and the target object is larger than a threshold, the target cannot be grasped by the motion of the mechanical arm alone and the task cannot be completed. To meet these needs, the invention provides visual servoing based on natural features and a multi-task combined visual servo strategy.
An unmanned aerial vehicle and a vision-based grabbing method thereof (publication number: CN107139178A). That invention provides an unmanned aerial vehicle comprising an unmanned aerial vehicle body, a mechanical arm, a steering-engine-controlled joint and a camera: the mechanical arm is fixedly connected below the unmanned aerial vehicle body, the single-degree-of-freedom steering-engine-controlled joint is connected to the front upper part of the body, and the camera is connected to this joint to calibrate each joint of the mechanical arm. In that method the camera is mounted on the body, so its range of motion is small; no visual servoing is performed on the mechanical arm, the inverse kinematics of the arm is solved from the pose relation between the camera and the target object, and real-time performance cannot be achieved. Meanwhile, the mechanical arm has few degrees of freedom, can only move in one plane, and cannot reach an arbitrary pose.
A visual servo control method for positioning and tracking a maneuvering target by an unmanned aerial vehicle (publication number: CN105353772A). That invention establishes a geodetic coordinate system, a body coordinate system, a camera coordinate system, an image coordinate system and a body-geodetic transition coordinate system and, according to the relations among these coordinate systems and the imaging sequence of the target, computes the target position, the attitude-angle set value for target tracking and the attitude-angle set value for route tracking, thereby completing the visual servo control. Only a single fixed camera is used; the advantage is that no tracking gimbal or laser ranging equipment is needed, which effectively reduces the size and cost of the payload and improves the concealment of reconnaissance, but the camera cannot move freely, which limits its visible range. That invention only concerns flight control of the unmanned aerial vehicle and does not involve servo grasping with a flying mechanical arm.
A flying robot system and control method based on a force feedback device and VR sensing (publication number: CN109164829A). The system comprises an unmanned aerial vehicle part and a ground station part. The unmanned aerial vehicle part comprises an unmanned aerial vehicle body, a binocular camera, an onboard computer, a mechanical arm, a control system and a visual tag; the control system comprises a flight master controller and a mechanical arm master controller, and the onboard computer and the control system are installed on the unmanned aerial vehicle body. The ground station part comprises a VR head-mounted display, a force feedback device and a ground station host, and the onboard computer forwards the flight control commands and mechanical arm control commands received from the ground station host to the control system. Through VR sensing and display technology, the three-dimensional scene in front of the unmanned aerial vehicle is observed in real time from a first-person view, the target object is observed and located more accurately, and the difficulty of subsequent mechanical arm control is reduced; the force feedback device is used to control the motion of the mechanical arm, and the position of the arm is adjusted accurately in real time. However, feedback is provided manually through the VR equipment, so autonomous feedback for grasping the target is not possible, and the problem of visual servo grasping with a flying mechanical arm is not addressed.
A rotor flying mechanical arm system and algorithm based on dynamic center-of-gravity compensation (publication number: CN108248845A). The system comprises a rotor flight platform, an image sensor, a connecting frame, a mechanical arm system, a system controller and a ground station control device. The rotor flight platform comprises a rotorcraft and a flight controller; the image sensor is arranged at the front lower part of the rotor flight platform; the connecting frame is a mechanical plate used to fixedly connect the rotor flight platform, the image sensor and the mechanical arm system; the mechanical arm system is arranged directly below the rotor flight platform; the system controller is arranged directly above the rotor flight platform; and the ground station control device communicates with the system controller wirelessly. That invention designs a center-of-gravity compensation controller that mainly addresses the disturbance exerted by the mechanical arm on the unmanned aerial vehicle, and does not address how to grasp efficiently and accurately.
Disclosure of Invention
In order to solve the above problems, the invention provides a visual servo grabbing method for a flying mechanical arm based on natural features. Through front-end natural feature point extraction, the servo target does not need any artificial marker; only an expected image containing the target needs to be captured in advance. The back-end visual servo computation is based on image-based visual servoing, and through the velocity transformation relations of robotics the servo velocity of the camera can be transmitted to the unmanned aerial vehicle body and the mechanical arm for velocity control. In addition, the camera is mounted on the mechanical arm: at long range the visual servo task is performed by the unmanned aerial vehicle, and at short range it is performed by the mechanical arm, realizing multi-task combined visual servo control and enabling efficient and accurate grasping. The specific content of the invention is as follows:
A visual servo grabbing method for a flying mechanical arm based on natural features, characterized in that: the flying mechanical arm comprises an unmanned aerial vehicle body, a mechanical arm base arranged on the unmanned aerial vehicle body, and a mechanical arm mounted on the mechanical arm base; a plurality of rotary joints are arranged in the mechanical arm, and a steering engine is arranged in each rotary joint; the end of the mechanical arm is provided with a clamping jaw and a camera; the unmanned aerial vehicle body is provided with a control system which controls the motion of the unmanned aerial vehicle body, the mechanical arm and the clamping jaw, and the camera is electrically connected with the control system. The flying mechanical arm visual servo grabbing method comprises the following steps:
Step 1: capture an expected image containing the target object, manually frame any rectangular area on the target object, and record the pixel positions of the corner points of the framed area in the expected image, which are used to locate the framed area in the real-time image; the feature points detected in this area are marked as expected feature points;
Step 2: the camera captures a real-time image, and the control system detects feature points in the real-time image and matches them;
Step 3: by matching the real-time image with the expected feature points, the homography transformation between the two images is obtained, and the positions of the expected-image corner points in the real-time image are then solved using the affine transformation given by the homography matrix;
Step 4: the difference between the obtained real-time corner positions and the expected corner positions yields the servo velocity of the camera;
Step 5: through the velocity transformation relations of robotics, the servo velocity of the camera is transmitted to the unmanned aerial vehicle body and the mechanical arm for velocity control (a brief sketch of steps 2 to 4 is given below).
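The following is a minimal Python/OpenCV sketch of the front end described in steps 2 to 4, assuming stock ORB features, 8-bit grayscale images and RANSAC homography estimation; the function and variable names are illustrative, and the patent's improved ORB detector would replace the plain ORB used here.

```python
# Front-end sketch for steps 2-4: match features between the expected image and the
# live image, estimate the homography, project the framed corners into the live image,
# and form the pixel error used by the back-end visual servo.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def corner_error(expected_img, expected_corners, live_img):
    """Return (live corners, pixel error) of the manually framed rectangle.

    expected_img, live_img: 8-bit grayscale images
    expected_corners: 4x2 array of the framed corner pixels recorded in step 1
    """
    kp_e, des_e = orb.detectAndCompute(expected_img, None)
    kp_l, des_l = orb.detectAndCompute(live_img, None)
    matches = matcher.match(des_e, des_l)                        # step 2: feature matching
    src = np.float32([kp_e[m.queryIdx].pt for m in matches])
    dst = np.float32([kp_l[m.trainIdx].pt for m in matches])
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)         # step 3: homography
    live_corners = cv2.perspectiveTransform(
        np.float32(expected_corners).reshape(-1, 1, 2), H).reshape(-1, 2)
    error = (live_corners - np.float32(expected_corners)).reshape(-1)  # step 4: corner error
    return live_corners, error
```

In practice the expected descriptors would be restricted to the manually framed region, as step 1 specifies.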
Preferably, the central axis of the camera is parallel to the central axis of the clamping jaw. Therefore, the clamping of the clamping jaw is more accurate.
Preferably, the feature points are ORB feature points.
Preferably, the improved ORB feature point algorithm is as follows:
step 2.1: inputting an image;
step 2.2: a Gaussian image pyramid;
step 2.3: grid division of a quadtree;
step 2.4: extracting FAST angular points with directions;
step 2.5: removing redundant angular points;
step 2.6: a characteristic point descriptor.
Preferably, the mechanical arm can provide pitch-angle and yaw-angle servoing, so that the target always stays at the center of the field of view.
Preferably, the mechanical arm comprises a first joint, a second joint, a third joint, a fourth joint, a fifth joint and a sixth joint, the sixth joint being at the end of the mechanical arm. When the distance between the unmanned aerial vehicle body and the target object is greater than a threshold, only the linear velocities of the unmanned aerial vehicle body along the x, y and z axes, its angular velocity about the z axis and the velocity of the fifth joint of the mechanical arm are controlled; when the distance is smaller than the threshold, only the velocities of the mechanical arm are controlled. This multi-task servoing gives the mechanical arm a wider and more accurate grasping range.
Preferably, the computation by which the servo velocity of the camera is transmitted to the unmanned aerial vehicle body and the mechanical arm in step 5 is as follows:
For the flying mechanical arm, define an inertial coordinate system O, an unmanned aerial vehicle body coordinate system b, a mechanical arm base coordinate system a, a mechanical arm end-effector coordinate system e, a camera optical-center coordinate system c, a target object center coordinate system t and a desired grasping position coordinate system d. $O_b = [x\; y\; z]^T$ and $R_b$ denote the position and rotation of the unmanned aerial vehicle body coordinate system relative to the inertial coordinate system, $\Phi_b = [\phi\; \theta\; \psi]^T$ expresses the orientation of the unmanned aerial vehicle body in the form of Euler angles, and $q = [q_1\; q_2\; q_3\; q_4\; q_5\; q_6]^T$ denotes the joint angles of the mechanical arm. The differential kinematics of the whole flying mechanical arm is derived briefly as follows: the velocity of the end of the flying mechanical arm equals the sum of the velocity produced by the motion of the unmanned aerial vehicle body and the velocity produced by the motion of the mechanical arm:
$$v_e = v_{e\_a} + v_{e\_b} \qquad (1)$$
where $v_e$ denotes the final velocity of the end of the mechanical arm, $v_{e\_a}$ denotes the end velocity produced by the motion of the mechanical arm, and $v_{e\_b}$ denotes the end velocity produced by the motion of the unmanned aerial vehicle body.
From the differential kinematics of the mechanical arm, one can obtain:
$$v_{e\_a} = \begin{bmatrix} R & 0 \\ 0 & R \end{bmatrix} J(q)\,\dot q = J_2\,\dot q \qquad (2)$$

where $J(q)$ denotes the geometric Jacobian matrix of the mechanical arm, $R$ denotes the rotation component of the homogeneous matrix obtained from the forward kinematics of the mechanical arm, $\dot q$ denotes the angular velocities of the joints, and $J_2$ denotes the abbreviated matrix.
The velocity of the mechanical arm end-effector produced by the motion of the unmanned aerial vehicle body, expressed in the unmanned aerial vehicle body coordinate system, is:

$$v_{e\_b} = \begin{bmatrix} R_b^e & -R_b^e\,S(p_e^b) \\ 0 & R_b^e \end{bmatrix} v_b = J_1\,v_b \qquad (3)$$

where $R_b^e$ denotes the rotation of the unmanned aerial vehicle body relative to the end of the mechanical arm, $p_e^b$ denotes the translation of the end of the mechanical arm relative to the unmanned aerial vehicle body, $S(\cdot)$ maps a vector to its skew-symmetric matrix, $v_b$ is the motion velocity of the unmanned aerial vehicle body, and $J_1$ denotes the abbreviated matrix.
Combining formula (2) and formula (3) gives:

$$v_e = J_1\,v_b + J_2\,\dot q = \begin{bmatrix} J_1 & J_2 \end{bmatrix}\begin{bmatrix} v_b \\ \dot q \end{bmatrix} = J\,v \qquad (4)$$
The multi-rotor unmanned aerial vehicle is an under-actuated system: the controllable quantities are the three linear velocity components and the angular velocity about the z axis of the aircraft, while the angular velocities about the remaining two axes are regulated by the inertial measurement unit of the aircraft itself as an internal feedback loop for stable flight, so equation (4) can be written as:
$$v_e = J\,v = J_{co}\,v_{co} + J_{uc}\,v_{uc} \qquad (5)$$
where $v$ denotes the velocity vector of all states of the flying mechanical arm, $J_{uc}$ consists of the columns of the combined Jacobian matrix $J$ corresponding to the uncontrollable components $v_{uc}$, and $J_{co}$ is the combined Jacobian matrix with the uncontrollable part removed, $v_{co}$ being the vector of controllable components.
Because the multi-rotor unmanned aerial vehicle remains close to a stationary state throughout the actual flight, i.e. its pitch angle and roll angle are close to zero, the uncontrollable part hardly affects the system, and formula (5) can be approximately written as:
$$v_e = J\,v = J_{co}\,v_{co} + J_{uc}\,v_{uc} \approx J_{co}\,v_{co} \qquad (6)$$
Let the velocity distribution matrix be $W \in \mathbb{R}^{10 \times n}$, where $n$ depends on the number of control variables required, and let $v_u \in \mathbb{R}^{n \times 1}$ be the vector of control quantities extracted from $v_{co}$ according to the actual task. Extracting the corresponding $v_u$, equation (6) can be further written as:

$$v_e = J_{co}\,W\,v_u = J_u\,v_u \qquad (7)$$
When the distance from the unmanned aerial vehicle body to the target object is greater than the threshold, the velocity distribution matrix in equation (7) is the $10 \times 5$ selection matrix that extracts from $v_{co}$ the linear velocities of the unmanned aerial vehicle body along the x, y and z axes, its angular velocity about the z axis, and the velocity of the fifth joint of the mechanical arm. (8)
When the distance from the unmanned aerial vehicle body to the target object is less than the threshold, the velocity distribution matrix in equation (7) is the $10 \times 6$ selection matrix that extracts from $v_{co}$ only the six joint velocities of the mechanical arm. (9)
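As an illustration of how the distribution matrices of equations (8) and (9) can be built, the sketch below constructs them as selection matrices. The ordering of the controllable state vector $v_{co}$ assumed here, namely body linear velocities, body yaw rate, then the six joint velocities, is an assumption for illustration only; the patent specifies which quantities each phase controls but not the ordering of the state vector.

```python
# Sketch of the velocity distribution matrices of equations (8) and (9) as selection
# matrices over the 10 controllable states (assumed order: vx, vy, vz, wz, qdot1..qdot6).
import numpy as np

STATE = ["vx", "vy", "vz", "wz", "q1", "q2", "q3", "q4", "q5", "q6"]

def selection_matrix(controlled):
    """Build W in R^{10 x n} whose columns pick the controlled states out of v_co."""
    W = np.zeros((len(STATE), len(controlled)))
    for col, name in enumerate(controlled):
        W[STATE.index(name), col] = 1.0
    return W

W_far = selection_matrix(["vx", "vy", "vz", "wz", "q5"])          # eq. (8): UAV-dominant servo
W_near = selection_matrix(["q1", "q2", "q3", "q4", "q5", "q6"])   # eq. (9): arm-only servo
```

Either matrix right-multiplies $J_{co}$ to give the weighted controllable Jacobian $J_u$ of equation (7).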
the invention has the following beneficial effects:
1. Aiming at the problem that visual servoing cannot rely on artificial features in common scenes, the invention provides a visual servo grabbing method based on natural features. Traditional visual servoing requires attaching artificial markers, which in practical applications often cannot be attached in advance because of condition limitations; with the natural feature method, visual servo grabbing of a target object can be realized simply by framing the target object area in advance.
2. Aiming at the problem of the mechanical arm visual servo exceeding its stroke, the invention provides a multi-task combined visual servo control strategy. When the distance between the unmanned aerial vehicle body and the target object is greater than a threshold (which can be set according to actual requirements), the servo velocity is transmitted to the unmanned aerial vehicle body through the velocity distribution matrix for visual servoing, and, to ensure that the camera can always see the target object, the joints of the mechanical arm provide pitch-angle and yaw-angle servoing; when the unmanned aerial vehicle body enters the workspace of the mechanical arm, the visual servo operation is performed only on the mechanical arm, making grasping more efficient and accurate.
The invention can be applied to long-distance transportation and carrying tasks, especially at disaster rescue sites or in places with complex terrain that are difficult for people to reach; it solves the problem of visual servoing without artificial markers and therefore has great application value.
Drawings
FIG. 1 is a structural diagram of the flying mechanical arm of the present invention;
FIG. 2 is a block diagram of the natural-feature-based visual servo grabbing algorithm of the flying mechanical arm of the present invention;
FIG. 3 is a block diagram of the improved ORB feature point detection algorithm of the present invention;
FIG. 4 is a schematic diagram of the coordinate systems associated with the flying mechanical arm of the present invention.
The technical features indicated by the reference numerals in the drawings are as follows:
1. a propeller; 2. a propeller motor; 3. an unmanned aerial vehicle body; 4. a battery; 5. a mechanical arm base; 6. a first steering engine; 7. a second steering engine; 8. a third steering engine; 9. a mechanical arm connecting bracket; 10. a fourth steering engine; 11. a fifth steering engine; 12. a sixth steering engine; 13. a camera; 14. a clamping jaw steering engine; 15. a clamping jaw.
Detailed Description
The following further describes embodiments of the present invention with reference to the accompanying drawings:
As shown in FIG. 1, the flying mechanical arm comprises an unmanned aerial vehicle body 3 and a mechanical arm. A mechanical arm base 5, a plurality of uniformly distributed propellers 1 and propeller motors 2 controlling the rotation of the propellers are arranged on the unmanned aerial vehicle body 3. An accommodating space is formed between the mechanical arm base 5 and the unmanned aerial vehicle body 3, and the battery 4 is accommodated in this space. The mechanical arm is fixedly mounted on the lower surface of the mechanical arm base 5; a plurality of rotary joints are arranged in the mechanical arm, each rotary joint corresponding to one degree of freedom and each being provided with a steering engine. The end of the mechanical arm is provided with a clamping jaw 15 and a camera 13, and a mechanical arm connecting bracket 9 is further arranged on the mechanical arm. A control system is arranged on the unmanned aerial vehicle body and controls the motion of the unmanned aerial vehicle body 3, the mechanical arm and the clamping jaw 15; the camera 13 is electrically connected with the control system. The mechanical arm is adjusted through the rotary joints, and the clamping jaw 15 at the end of the mechanical arm clamps the target object. The central axis of the camera 13 is coaxial with the central axis of the clamping jaw 15, so that the target object can be grasped more accurately and efficiently. The central axis of the clamping jaw 15 is the axis of the clamping jaw 15 in the clamping direction. The camera is a color camera.
The mechanical arm has six degrees of freedom, each driven by a steering engine comprising a motor, a reduction gear set and an encoder. The reduction gear set is installed on the output shaft of the motor; a plurality of mutually meshed gears are arranged in the reduction gear set, and the output gear of the reduction gear set carries the rotation shaft used to connect the rudder disks of the steering engine. The steering engine further comprises a steering engine body, a steering engine base, an auxiliary rudder disk, a main rudder disk and a steering engine outer side wall arranged between the auxiliary rudder disk and the main rudder disk and distributed along the circumferential direction of the steering engine body; the steering engine body is arranged on the steering engine base, and the auxiliary rudder disk and the main rudder disk are respectively arranged on the opposite outer sides of the top wall and the bottom wall of the steering engine body. The main rudder disk and the auxiliary rudder disk are provided with mutually opposite mounting holes used for fixing a transmission connecting piece, so that the auxiliary rudder disk can move synchronously with the main rudder disk.
The steering engines include a first steering engine 6, a second steering engine 7, a third steering engine 8, a fourth steering engine 10, a fifth steering engine 11, a sixth steering engine 12 and a clamping jaw steering engine 14. The auxiliary rudder disk of the first steering engine 6 is connected with the mechanical arm base 5, and the main rudder disk of the first steering engine 6 is connected with the outer side wall of the second steering engine 7 so as to drive the second steering engine 7 to rotate. The main rudder disk of the second steering engine 7 is fixed to one end of an extension arm rod so as to drive the extension arm rod to rotate; the other end of the extension arm rod is connected with the steering engine base of the third steering engine 8, so that the extension arm rod drives the third steering engine 8 to rotate. The main rudder disk and the auxiliary rudder disk of the third steering engine 8 are respectively connected with one end of two mutually parallel connecting rods, and the other ends of the two connecting rods are respectively fixed on the outer side wall of the fourth steering engine 10, so that the main and auxiliary rudder disks of the third steering engine 8 drive the fourth steering engine 10 to rotate through the connecting rods. The main rudder disk of the fourth steering engine 10 is connected with the steering engine base of the fifth steering engine 11 so as to drive the fifth steering engine 11 to rotate; the main and auxiliary rudder disks of the fifth steering engine 11 are respectively connected with one end of two mutually parallel connecting rods whose other ends are connected with the outer side wall of the sixth steering engine 12, so that the main and auxiliary rudder disks of the fifth steering engine drive the sixth steering engine 12 to rotate through the connecting rods. Each steering engine corresponds to one joint; thus, the flying mechanical arm includes a first joint, a second joint, a third joint, a fourth joint, a fifth joint, a sixth joint and a clamping jaw joint.
The invention mainly comprises a visual servo algorithm based on natural features and a multi-task combined visual servo strategy, which are discussed in turn below:
As shown in FIG. 2, the flying mechanical arm visual servo grabbing method mainly consists of front-end natural feature point extraction and back-end visual servo computation, and comprises the following steps:
Step 1: capture an expected image containing the target object, manually frame any rectangular area on the target object, and record the pixel positions of the corner points of the framed area in the expected image, which are used to locate the framed area in the real-time image; the feature points detected in this area are marked as expected feature points;
Step 2: the camera captures a real-time image, and the control system detects feature points in the real-time image and matches them;
Step 3: by matching the real-time image with the expected feature points, the homography transformation between the two images is obtained, and the positions of the expected-image corner points in the real-time image are then solved using the affine transformation given by the homography matrix;
Step 4: the difference between the obtained real-time corner positions and the expected corner positions yields the servo velocity of the camera;
Step 5: through the velocity transformation relations of robotics, the servo velocity of the camera is transmitted to the unmanned aerial vehicle body and the mechanical arm for velocity control.
Front-end natural feature point extraction means that the servo target does not need any artificial marker: only an expected image containing the target needs to be captured in advance, any rectangular area on the target is framed manually, the pixel positions of the corner points of the framed area in the expected image are recorded for locating the framed area in the real-time image, and the feature points detected in this area are marked as the expected feature points. Images are captured by the color camera and feature points are detected and matched; by matching the real-time image with the expected feature points, the homography transformation between the images is obtained, and the positions of the expected-image corner points in the real-time image are then obtained using the affine transformation given by the homography matrix.
The back-end visual servo computation is based on image-based visual servoing: the difference between the real-time corner positions acquired by the front end and the expected corner positions serves as the input of the image-based visual servo, from which the servo velocity of the camera is finally obtained. Through the velocity transformation relations of robotics, the servo velocity of the camera can be transmitted to the unmanned aerial vehicle body 3 and the mechanical arm for velocity control.
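The patent does not spell out the exact image-based control law; the sketch below shows one classical formulation consistent with this description, in which the camera velocity is obtained from the corner error through the point-feature interaction matrix. The gain lam and the depth estimate Z are illustrative assumptions, and the corner coordinates are assumed to be normalized with the camera intrinsics.

```python
# Classical image-based visual servo law for the four corner points:
# v_c = -lam * pinv(L) * e, with the standard 2x6 point-feature interaction matrix.
import numpy as np

def interaction_matrix(points_xy, Z):
    """Stack the 2x6 interaction matrices of the normalized image points at depth Z."""
    rows = []
    for (x, y) in points_xy:
        rows.append([-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y])
        rows.append([0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x])
    return np.array(rows)

def camera_servo_velocity(current_xy, desired_xy, Z=1.0, lam=0.5):
    """Camera twist [vx, vy, vz, wx, wy, wz] that drives the current corners to the desired ones."""
    e = (np.asarray(current_xy) - np.asarray(desired_xy)).reshape(-1)
    L = interaction_matrix(current_xy, Z)
    return -lam * np.linalg.pinv(L) @ e
```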
The feature points are improved ORB feature points; the algorithm flow is shown in FIG. 3. A Gaussian image pyramid is introduced to give the feature points scale invariance. Because feature points tend to be unevenly distributed and cluster on foreground objects, which is unfavorable for feature matching, a quadtree structure is introduced to extract feature points on a grid, and by lowering the detection threshold it is ensured that feature points are detected in every small square region of the image.
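A simplified sketch of this gridded extraction is given below. A plain resize pyramid stands in for the Gaussian image pyramid, a fixed grid with a threshold fallback stands in for the full quadtree subdivision, and the cell size, pyramid depth and thresholds are illustrative assumptions; the sketch is meant only to show how pyramid levels and per-cell detection spread the feature points over the image.

```python
# Gridded FAST detection over a small image pyramid, with ORB descriptors computed
# on the keypoints mapped back to the original image (8-bit grayscale input assumed).
import cv2
import numpy as np

def gridded_keypoints(img, cell=64, thresholds=(20, 7)):
    """Detect at most one FAST corner per grid cell, retrying with a lower threshold."""
    kps = []
    h, w = img.shape[:2]
    for ty in range(0, h, cell):
        for tx in range(0, w, cell):
            patch = img[ty:ty + cell, tx:tx + cell]
            for t in thresholds:
                fast = cv2.FastFeatureDetector_create(threshold=t)
                found = fast.detect(patch, None)
                if found:
                    best = max(found, key=lambda k: k.response)   # keep the strongest corner
                    kps.append(cv2.KeyPoint(best.pt[0] + tx, best.pt[1] + ty, best.size))
                    break
    return kps

def pyramid_orb(img, levels=4, scale=1.2):
    """Collect grid-distributed corners over the pyramid and describe them with ORB."""
    orb = cv2.ORB_create()                                        # used only for descriptors
    all_kps = []
    for lvl in range(levels):
        s = scale ** lvl
        scaled = cv2.resize(img, (int(img.shape[1] / s), int(img.shape[0] / s)))
        for kp in gridded_keypoints(scaled):
            all_kps.append(cv2.KeyPoint(kp.pt[0] * s, kp.pt[1] * s, kp.size * s))
    return orb.compute(img, all_kps)                              # (keypoints, descriptors)
```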
The multitask joint visual servoing strategy is as follows:
As shown in FIG. 4, for the flying mechanical arm define an inertial coordinate system O, an unmanned aerial vehicle body coordinate system b, a mechanical arm base coordinate system a, an end-effector coordinate system e, a camera optical-center coordinate system c, a target center coordinate system t and a desired grasping position coordinate system d. $O_b = [x\; y\; z]^T$ and $R_b$ denote the position and rotation of the unmanned aerial vehicle body coordinate system relative to the inertial coordinate system, and $\Phi_b = [\phi\; \theta\; \psi]^T$ expresses the orientation of the unmanned aerial vehicle body in the form of Euler angles, where $\phi$ is the roll angle of the unmanned aerial vehicle body, $\theta$ is its pitch angle and $\psi$ is its yaw angle; $q = [q_1\; q_2\; q_3\; q_4\; q_5\; q_6]^T$ denotes the joint angles of the mechanical arm. The differential kinematics of the whole flying mechanical arm is derived briefly as follows: the velocity of the end of the flying mechanical arm equals the sum of the velocity produced by the motion of the unmanned aerial vehicle body and the velocity produced by the motion of the mechanical arm:
$$v_e = v_{e\_a} + v_{e\_b} \qquad (1)$$
where $v_e$ denotes the final velocity of the end of the mechanical arm, $v_{e\_a}$ denotes the end velocity produced by the motion of the mechanical arm, and $v_{e\_b}$ denotes the end velocity produced by the motion of the unmanned aerial vehicle body.
From the differential kinematics of the mechanical arm, one can obtain:
$$v_{e\_a} = \begin{bmatrix} R & 0 \\ 0 & R \end{bmatrix} J(q)\,\dot q = J_2\,\dot q \qquad (2)$$

where $J(q)$ denotes the geometric Jacobian matrix of the mechanical arm, $R$ denotes the rotation component of the homogeneous matrix obtained from the forward kinematics of the mechanical arm, $\dot q$ denotes the angular velocities of the joints, and $J_2$ denotes the abbreviated matrix.
The velocity of the mechanical arm end-effector produced by the motion of the unmanned aerial vehicle body, expressed in the unmanned aerial vehicle body coordinate system, is:

$$v_{e\_b} = \begin{bmatrix} R_b^e & -R_b^e\,S(p_e^b) \\ 0 & R_b^e \end{bmatrix} v_b = J_1\,v_b \qquad (3)$$

where $R_b^e$ denotes the rotation of the unmanned aerial vehicle body relative to the end of the mechanical arm, $p_e^b$ denotes the translation of the end of the mechanical arm relative to the unmanned aerial vehicle body, $S(\cdot)$ maps a vector to its skew-symmetric matrix, $v_b$ is the motion velocity of the unmanned aerial vehicle body, and $J_1$ denotes the abbreviated matrix.
Combining formula (2) and formula (3) gives:

$$v_e = J_1\,v_b + J_2\,\dot q = \begin{bmatrix} J_1 & J_2 \end{bmatrix}\begin{bmatrix} v_b \\ \dot q \end{bmatrix} = J\,v \qquad (4)$$
The multi-rotor unmanned aerial vehicle is an under-actuated system: the controllable quantities are the three linear velocity components and the angular velocity about the z axis of the aircraft, while the angular velocities about the remaining two axes are regulated by the inertial measurement unit of the aircraft itself as an internal feedback loop for stable flight, so equation (4) can be written as:
$$v_e = J\,v = J_{co}\,v_{co} + J_{uc}\,v_{uc} \qquad (5)$$
where $v$ denotes the velocity vector of all states of the flying mechanical arm, $J_{uc}$ consists of the columns of the combined Jacobian matrix $J$ corresponding to the uncontrollable components $v_{uc}$, and $J_{co}$ is the combined Jacobian matrix with the uncontrollable part removed, $v_{co}$ being the vector of controllable components.
Because the multi-rotor unmanned aerial vehicle remains close to a stationary state throughout the actual flight, i.e. its pitch angle and roll angle are close to zero, the uncontrollable part hardly affects the system, and formula (5) can be approximately written as:
$$v_e = J\,v = J_{co}\,v_{co} + J_{uc}\,v_{uc} \approx J_{co}\,v_{co} \qquad (6)$$
The mechanical arm is fixed under the unmanned aerial vehicle body, and the motions of the two exert interfering forces and moments on each other. If the unmanned aerial vehicle body hovers within the operable range of the mechanical arm and visual servo grasping is performed only by the arm, this mutual interference can be greatly reduced; but at long range the target lies outside the workspace and the mechanical arm cannot grasp it, and in a real task the aircraft will not simply land within grasping range. The visual servo motion must therefore also be transmitted to the unmanned aerial vehicle body, so that the visual servo task at long range is performed by the unmanned aerial vehicle. Since the camera is installed at the end of the mechanical arm, the target object would gradually leave the field of view of the camera as the unmanned aerial vehicle body servoes toward it; if the mechanical arm provides pitch-angle and yaw-angle servoing of the target in real time, the target object can be kept at the center of the field of view. To realize this, this patent introduces a velocity distribution matrix $W \in \mathbb{R}^{10 \times n}$, where $n$ depends on the number of control variables required, and $v_u \in \mathbb{R}^{n \times 1}$ is the vector of control quantities extracted from $v_{co}$ according to the actual task. Extracting the corresponding $v_u$, equation (6) can be further written as:
$$v_e = J_{co}\,W\,v_u = J_u\,v_u \qquad (7)$$
where $J_u = J_{co}\,W$ denotes the weighted controllable Jacobian matrix. The visual servo process of the flying mechanical arm is divided into a long-range phase and a short-range phase. At long range the mechanical arm and the unmanned aerial vehicle body could act simultaneously, but large movements of the mechanical arm shift the center of gravity and destabilize the flight of the unmanned aerial vehicle body; this patent therefore adopts a visual servo dominated by the unmanned aerial vehicle body, controlling the linear velocities of the unmanned aerial vehicle along the x, y and z axes, its angular velocity about the z axis, and the velocity of the fifth joint of the mechanical arm. The velocity distribution matrix in formula (7) can then be written as:
the $10 \times 5$ selection matrix that extracts these five quantities from $v_{co}$. (8)
At short range, when the unmanned aerial vehicle body has entered the feasible workspace of the mechanical arm, the unmanned aerial vehicle body hovers and only the mechanical arm moves; the velocity distribution matrix is then the $10 \times 6$ selection matrix that extracts from $v_{co}$ only the six joint velocities of the mechanical arm. (9)
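As an illustration of this long-/short-range switching, the sketch below selects a distribution matrix from the measured distance to the target and resolves the desired camera/end-effector velocity into the active control vector with a damped pseudo-inverse. The threshold value, the damping factor and the pseudo-inverse resolution are assumptions for illustration; the patent only specifies the switching of the distribution matrix.

```python
# Switch between the UAV-dominant and arm-only distribution matrices from the measured
# distance, then map the desired end-effector velocity v_e to the control vector v_u.
import numpy as np

def resolve_controls(v_e, J_co, W_far, W_near, distance, threshold=1.5, damping=1e-3):
    """Return (W, v_u): the active distribution matrix and the resolved control vector."""
    W = W_far if distance > threshold else W_near          # eq. (8) vs eq. (9)
    J_u = J_co @ W                                         # weighted controllable Jacobian, eq. (7)
    JJt = J_u @ J_u.T                                      # damped least-squares resolution
    v_u = J_u.T @ np.linalg.solve(JJt + damping * np.eye(JJt.shape[0]), v_e)
    return W, v_u
```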
the above provides a detailed description of an embodiment of a visual servo grabbing method for an aircraft mechanical arm based on natural features. The principles and embodiments of the present invention are explained herein using specific examples, which are presented only to assist in understanding the core concepts of the present invention. It should be noted that, for those skilled in the art, it is possible to make various improvements and modifications to the present invention without departing from the principle of the present invention, and those improvements and modifications also fall within the scope of the claims of the present invention.

Claims (4)

1. A visual servo grabbing method for a flying mechanical arm based on natural features, characterized in that: the flying mechanical arm comprises an unmanned aerial vehicle body, a mechanical arm base arranged on the unmanned aerial vehicle body, and a mechanical arm mounted on the mechanical arm base; a plurality of rotary joints are arranged in the mechanical arm, and a steering engine is arranged in each rotary joint; the end of the mechanical arm is provided with a clamping jaw and a camera; the unmanned aerial vehicle body is provided with a control system which controls the motion of the unmanned aerial vehicle body, the mechanical arm and the clamping jaw, and the camera is electrically connected with the control system; the flying mechanical arm visual servo grabbing method comprises the following steps:
Step 1: capturing an expected image containing the target object, manually framing any rectangular area on the target object, and recording the pixel positions of the corner points of the framed area in the expected image, which are used to locate the framed area in the real-time image, the feature points detected in this area being marked as expected feature points;
Step 2: the camera captures a real-time image, and the control system detects feature points in the real-time image and matches them;
Step 3: by matching the real-time image with the expected feature points, the homography transformation between the two images is obtained, and the positions of the expected-image corner points in the real-time image are then solved using the affine transformation given by the homography matrix;
Step 4: the difference between the obtained real-time corner positions and the expected corner positions yields the servo velocity of the camera;
Step 5: through the velocity transformation relations of robotics, the servo velocity of the camera is transmitted to the unmanned aerial vehicle body and the mechanical arm for velocity control;
the mechanical arm can provide a pitch angle and a yaw angle;
the mechanical arm comprises a first joint, a second joint, a third joint, a fourth joint, a fifth joint and a sixth joint, the end of the mechanical arm being the sixth joint; when the distance between the unmanned aerial vehicle body and the target object is greater than a threshold, only the linear velocities of the unmanned aerial vehicle body along the x, y and z axes, its angular velocity about the z axis and the velocity of the fifth joint of the mechanical arm are controlled; when the distance between the unmanned aerial vehicle body and the target object is smaller than the threshold, only the velocities of the mechanical arm are controlled;
in the step 5, the calculation process of transmitting the servo speed of the camera to the unmanned aerial vehicle body and the mechanical arm is as follows:
for the flying mechanical arm, an inertial coordinate system O, an unmanned aerial vehicle body coordinate system b, a mechanical arm base coordinate system a, a mechanical arm end-effector coordinate system e, a camera optical-center coordinate system c, a target object center coordinate system t and a desired grasping position coordinate system d are defined; $O_b = [x\; y\; z]^T$ and $R_b$ denote the position and rotation of the unmanned aerial vehicle body coordinate system relative to the inertial coordinate system, $\Phi_b = [\phi\; \theta\; \psi]^T$ expresses the orientation of the unmanned aerial vehicle body in the form of Euler angles, where $\phi$ is the roll angle of the unmanned aerial vehicle body, $\theta$ is the pitch angle of the unmanned aerial vehicle body and $\psi$ is the yaw angle of the unmanned aerial vehicle body, and $q = [q_1\; q_2\; q_3\; q_4\; q_5\; q_6]^T$ denotes the joint angles of the joints of the mechanical arm; the differential kinematics of the whole flying mechanical arm is derived briefly: the velocity of the end of the mechanical arm of the flying mechanical arm equals the sum of the velocity produced by the motion of the unmanned aerial vehicle body and the velocity produced by the motion of the mechanical arm:
$$v_e = v_{e\_a} + v_{e\_b} \qquad (1)$$
where $v_e$ denotes the final velocity of the end of the mechanical arm, $v_{e\_a}$ denotes the end velocity produced by the motion of the mechanical arm, and $v_{e\_b}$ denotes the end velocity produced by the motion of the unmanned aerial vehicle body,
from the differential kinematics of the mechanical arm, one can obtain:
$$v_{e\_a} = \begin{bmatrix} R & 0 \\ 0 & R \end{bmatrix} J(q)\,\dot q = J_2\,\dot q \qquad (2)$$

where $J(q)$ denotes the geometric Jacobian matrix of the mechanical arm, $R$ denotes the rotation component of the homogeneous matrix obtained from the forward kinematics of the mechanical arm, $\dot q$ denotes the angular velocities of the joints, and $J_2$ denotes the abbreviated matrix,
the speed of the mechanical arm end effector is expressed as follows when the speed is transferred to an unmanned plane body coordinate system:
Figure FDA0002621139380000024
wherein
Figure FDA0002621139380000025
Representing the rotation component of the base of the unmanned aerial vehicle body relative to the tail end of the mechanical arm,
Figure FDA0002621139380000026
representing the translation component of the tail end of the mechanical meter relative to the base of the unmanned aerial vehicle body, S (a) representing the vector to the anti-symmetric matrix, vbIs the speed of motion of the unmanned aerial vehicle body, J1The variables after the reduction are represented as,
combining formula (2) and formula (3) gives:

$$v_e = J_1\,v_b + J_2\,\dot q = \begin{bmatrix} J_1 & J_2 \end{bmatrix}\begin{bmatrix} v_b \\ \dot q \end{bmatrix} = J\,v \qquad (4)$$
the multi-rotor unmanned aerial vehicle is an under-actuated system: the controllable quantities are the three linear velocity components and the angular velocity about the z axis of the aircraft, while the angular velocities about the remaining two axes are regulated by the inertial measurement device of the aircraft itself as an internal feedback loop for stable flight, so formula (4) can be written as:
$$v_e = J\,v = J_{co}\,v_{co} + J_{uc}\,v_{uc} \qquad (5)$$
where $v$ denotes the velocity vector of all states of the flying mechanical arm, $J_{uc}$ consists of the columns of the combined Jacobian matrix $J$ corresponding to the uncontrollable components $v_{uc}$, and $J_{co}$ is the combined Jacobian matrix with the uncontrollable part removed, $v_{co}$ being the vector of controllable components,
because the multi-rotor unmanned aerial vehicle remains close to a stationary state throughout the actual flight, i.e. its pitch angle and roll angle are close to zero, the uncontrollable part hardly affects the system, and formula (5) can be approximately written as:
$$v_e = J\,v = J_{co}\,v_{co} + J_{uc}\,v_{uc} \approx J_{co}\,v_{co} \qquad (6)$$
setting the velocity distribution matrix to $W \in \mathbb{R}^{10 \times n}$, where $n$ depends on the number of control variables required, and $v_u \in \mathbb{R}^{n \times 1}$ is the vector of control quantities extracted from $v_{co}$ according to the actual task; extracting the corresponding $v_u$, equation (6) can be further written as:

$$v_e = J_{co}\,W\,v_u = J_u\,v_u \qquad (7)$$
wherein $J_u = J_{co}\,W$ denotes the weighted controllable Jacobian matrix; when the distance between the unmanned aerial vehicle body and the target object is greater than the threshold, the velocity distribution matrix in equation (7) is written as the $10 \times 5$ selection matrix that extracts from $v_{co}$ the linear velocities of the unmanned aerial vehicle body along the x, y and z axes, its angular velocity about the z axis, and the velocity of the fifth joint of the mechanical arm; (8)
when the distance between the unmanned aerial vehicle body and the target object is less than the threshold, the velocity distribution matrix in equation (7) is written as the $10 \times 6$ selection matrix that extracts from $v_{co}$ only the six joint velocities of the mechanical arm. (9)
2. The method according to claim 1, characterized in that: the central axis of the camera is parallel to the central axis of the clamping jaw.
3. The method according to claim 1, characterized in that: the feature points are ORB feature points.
4. The method according to claim 3, characterized in that: the improved ORB feature point algorithm is as follows:
step 2.1: inputting an image;
step 2.2: a Gaussian image pyramid;
step 2.3: grid division of a quadtree;
step 2.4: extracting FAST angular points with directions;
step 2.5: removing redundant angular points;
step 2.6: a characteristic point descriptor.
CN201910241810.9A 2019-03-28 2019-03-28 Flying mechanical arm visual servo grabbing method based on natural features Active CN109895099B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910241810.9A CN109895099B (en) 2019-03-28 2019-03-28 Flying mechanical arm visual servo grabbing method based on natural features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910241810.9A CN109895099B (en) 2019-03-28 2019-03-28 Flying mechanical arm visual servo grabbing method based on natural features

Publications (2)

Publication Number Publication Date
CN109895099A CN109895099A (en) 2019-06-18
CN109895099B true CN109895099B (en) 2020-10-02

Family

ID=66953908

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910241810.9A Active CN109895099B (en) 2019-03-28 2019-03-28 Flying mechanical arm visual servo grabbing method based on natural features

Country Status (1)

Country Link
CN (1) CN109895099B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110286688B (en) * 2019-06-19 2021-03-16 广东工业大学 Control method for underwater four-rotor unmanned aerial vehicle with mechanical arm
CN110427043B (en) * 2019-09-04 2021-09-28 福州大学 Pose controller design method based on gravity center offset of operation flying robot
CN111360840A (en) * 2020-04-28 2020-07-03 沈阳优诚自动化工程有限公司 Mechanical arm sorting method based on visual algorithm
CN111923049B (en) * 2020-08-21 2023-11-03 福州大学 Visual servo and multitasking control method for flying mechanical arm based on spherical model
CN112589787B (en) * 2020-12-02 2022-09-16 上海纽钛测控技术有限公司 Visual positioning and hand-eye calibration method for loading and unloading samples of mechanical arm of feeding turntable
CN114536327A (en) * 2022-01-24 2022-05-27 四川广目科技有限公司 Intelligent industrial mechanical arm driving system based on ROS system
CN114291264A (en) * 2022-02-14 2022-04-08 天津七六四通信导航技术有限公司 Unmanned aerial vehicle's first aid device
CN114842056A (en) * 2022-04-19 2022-08-02 深圳鳍源科技有限公司 Multi-machine-position first machine visual angle following method, system, device and equipment
CN114862848B (en) * 2022-07-05 2022-10-21 江苏顺联工程建设有限公司 Intelligent control method of hoisting equipment for municipal construction
CN115520376B (en) * 2022-09-27 2023-08-25 哈尔滨工业大学 Space-ground-based dual-purpose mobile operation platform and pose control system and control method thereof

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9758301B2 (en) * 2015-03-24 2017-09-12 Joseph Porat System and method for overhead warehousing
US10023311B2 (en) * 2016-03-10 2018-07-17 International Business Machines Corporation Automatic painting system with drone, user interface and computer vision
CN106570820B (en) * 2016-10-18 2019-12-03 浙江工业大学 A kind of monocular vision three-dimensional feature extracting method based on quadrotor drone
CN106985159B (en) * 2017-05-10 2023-07-07 哈尔滨工业大学深圳研究生院 Flight mechanical arm with flexible grabber
CN107139178B (en) * 2017-05-10 2024-02-23 哈尔滨工业大学深圳研究生院 Unmanned aerial vehicle and vision-based grabbing method thereof
CN107627303B (en) * 2017-09-22 2021-03-09 哈尔滨工程大学 PD-SMC control method of visual servo system based on eye-on-hand structure
CN108015764B (en) * 2017-11-20 2020-07-14 中国运载火箭技术研究院 Spatial zero prior target capturing method based on multi-source visual information fusion
CN107902089A (en) * 2017-12-14 2018-04-13 郑州启硕电子科技有限公司 A kind of unmanned plane of scalable crawl
CN108170160A (en) * 2017-12-21 2018-06-15 中山大学 It is a kind of to utilize monocular vision and the autonomous grasping means of airborne sensor rotor wing unmanned aerial vehicle
CN108453738B (en) * 2018-03-30 2021-04-16 东南大学 Control method for four-rotor aircraft aerial autonomous grabbing operation based on Opencv image processing
CN208181424U (en) * 2018-04-25 2018-12-04 浙江工业大学 The unmanned plane Full-automatic grasping device of view-based access control model identifying system
CN109398688B (en) * 2018-11-16 2020-06-30 湖南大学 Rotor flight double-mechanical-arm target positioning and grabbing system and method

Also Published As

Publication number Publication date
CN109895099A (en) 2019-06-18

Similar Documents

Publication Publication Date Title
CN109895099B (en) Flying mechanical arm visual servo grabbing method based on natural features
CN107139178B (en) Unmanned aerial vehicle and vision-based grabbing method thereof
CN107416195B (en) Eagle-like grabbing system of aerial operation multi-rotor aircraft
CN107309872B (en) Flying robot with mechanical arm and control method thereof
CN109397249B (en) Method for positioning and grabbing robot system by two-dimensional code based on visual identification
CN109079799B (en) Robot perception control system and control method based on bionics
CN205891228U (en) Flying robot
CN111687821B (en) Rotary parallel flying mechanical arm system and expected rotation angle calculating method
CN109164829B (en) Flying mechanical arm system based on force feedback device and VR sensing and control method
CN105014687A (en) Mechanical arm with multi-rotor-wing unmanned aerial vehicle
CN110842918B (en) Robot mobile processing autonomous locating method based on point cloud servo
CN108945536B (en) Rendezvous and docking experimental platform based on rotor craft
CN105923168A (en) Rotorcraft flight simulating platform applied to airborne cradle head testing
Meng et al. Design and implementation of rotor aerial manipulator system
CN114378827B (en) Dynamic target tracking and grabbing method based on overall control of mobile mechanical arm
CN206123654U (en) Vision -guided's omnidirectional movement double arm robot
Watanabe et al. Image-based visual PID control of a micro helicopter using a stationary camera
CN209649972U (en) A kind of land and air double-used operation type flying robot
CN108170160A (en) It is a kind of to utilize monocular vision and the autonomous grasping means of airborne sensor rotor wing unmanned aerial vehicle
CN205230375U (en) Unmanned aerial vehicle target tracker
CN116714780A (en) Rotor flying mechanical arm and planning and control method for rapid aerial grabbing
Wu et al. Aerial grasping based on VR perception and haptic control
CN206913156U (en) A kind of unmanned plane
CN207731158U (en) A kind of underwater robot
CN115657474A (en) Flexible interaction control method for aircraft mechanical arm aiming at man-machine cooperative transportation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant