CN113741496A - Autonomous accurate landing method and landing box for multi-platform unmanned aerial vehicle - Google Patents


Publication number
CN113741496A
CN113741496A
Authority
CN
China
Prior art keywords
dimensional code
unmanned aerial
aerial vehicle
positioning information
yaw
Prior art date
Legal status
Pending
Application number
CN202110980963.2A
Other languages
Chinese (zh)
Inventor
耿虎军
闫玉巧
仇梓峰
杨彬
李方用
胡炎
张泽勇
杨福琛
熊恒斌
Current Assignee
CETC 54 Research Institute
Original Assignee
CETC 54 Research Institute
Priority date
Filing date
Publication date
Application filed by CETC 54 Research Institute filed Critical CETC 54 Research Institute
Priority to CN202110980963.2A
Publication of CN113741496A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/08 — Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D 1/0808 — Control of attitude specially adapted for aircraft
    • G05D 1/10 — Simultaneous control of position or course in three dimensions
    • G05D 1/101 — Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an autonomous precise landing method and a landing box for a multi-platform unmanned aerial vehicle, belonging to the technical field of unmanned aerial vehicle landing and recovery. First, a target is set according to the size of the landing platform; the unmanned aerial vehicle collects two-dimensional code images on the target through a camera and obtains positioning information according to which images are collected. The unmanned aerial vehicle then adjusts its position in real time according to the positioning information until it lands precisely on the target. The invention is applicable to various unmanned aerial vehicle platforms: the precise landing positioning box is easily adapted to different platforms, and the unmanned aerial vehicle only needs to provide power output and a flight-control input.

Description

Autonomous accurate landing method and landing box for multi-platform unmanned aerial vehicle
Technical Field
The invention relates to the technical field of unmanned aerial vehicle landing and recovery, and in particular to an autonomous precise landing method and a landing box for a multi-platform unmanned aerial vehicle.
Background
At present, in the field of unmanned aerial vehicle landing and recovery, the real-time position of the unmanned aerial vehicle is generally determined by satellite positioning technologies such as GPS, BeiDou and RTK, which then drive the landing control. However, satellite positioning suffers from limited accuracy and susceptibility to electromagnetic interference, and its low update rate means an unmanned aerial vehicle approaching a vehicle-mounted landing platform may fail to land on the platform correctly. Satellite positioning alone therefore cannot meet the requirement of precise landing.
In recent years, image recognition and processing technology has been applied to unmanned aerial vehicle positioning and landing. Image-based landing offers high positioning accuracy, good real-time performance and immunity to electromagnetic interference, but is easily affected by illumination and occlusion. Tethered unmanned aerial vehicles are especially affected: the tether hangs below the vehicle and can occlude the image, causing the landing to fail.
In addition, image-based landing requires modifying the unmanned aerial vehicle to give it stable image acquisition and processing capability, and the required modification differs from one unmanned aerial vehicle to another, making it hard to generalize.
Disclosure of Invention
In view of the above, the invention provides an autonomous precise landing method and a landing box for a multi-platform unmanned aerial vehicle. The method and box are applicable to various unmanned aerial vehicle platforms: the precise landing positioning box is easily adapted to different platforms, and the unmanned aerial vehicle only needs to provide power output and a flight-control input.
To achieve this, the invention adopts the following technical solution:
an autonomous precise landing method for a multi-platform unmanned aerial vehicle, comprising the following steps:
(1) setting a target according to the size of the landing platform, the target carrying four two-dimensional codes representing four different ID numbers;
(2) the unmanned aerial vehicle collects the two-dimensional code images on the target through a camera, and positioning information is obtained according to which images are collected;
(3) the unmanned aerial vehicle adjusts its position in real time according to the positioning information;
(4) repeating steps (2) and (3) until the unmanned aerial vehicle lands precisely on the target.
Further, among the four two-dimensional codes, the first and second two-dimensional codes are the same size and are arranged edge-to-edge facing each other; the third and fourth two-dimensional codes are smaller than the first and second; the line connecting the center points of the third and fourth two-dimensional codes and the line connecting the center points of the first and second two-dimensional codes perpendicularly bisect each other; the first and second two-dimensional codes lie in the unmanned aerial vehicle's high-altitude field of view, and the third and fourth in its low-altitude field of view.
Further, the camera is a fixed-focus wide-angle camera; the coding scheme of the two-dimensional codes is AprilTag with an 8 × 8 array; the first and second two-dimensional codes measure 20 cm × 20 cm and the third and fourth 10 cm × 10 cm; the spacing between the first and second two-dimensional codes is 20 cm, and between the third and fourth 10 cm.
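The target layout just described can be captured as a small constant table. This is a sketch: which AprilTag ID is printed at each position is an assumption for illustration, and the center distances (±20 cm for the large tags, ±7.5 cm for the small ones) follow the offset terms used in the positioning formulas of step (2).

```python
# Target geometry, in cm.  Indices 0-3 stand for the first to fourth
# two-dimensional codes; the ID-to-position assignment is illustrative.
TAGS = {
    0: {"size": 20.0, "center": (-20.0, 0.0)},  # first (large), left of C
    1: {"size": 20.0, "center": (20.0, 0.0)},   # second (large), right of C
    2: {"size": 10.0, "center": (0.0, -7.5)},   # third (small), below C
    3: {"size": 10.0, "center": (0.0, 7.5)},    # fourth (small), above C
}
```

The two sizes serve the two flight phases: the large tags stay resolvable from high altitude, while the small tags still fit inside the camera frame at low altitude.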
Further, in step (2), the unmanned aerial vehicle first flies to a position directly above the target by GPS navigation, so that the two-dimensional code images on the target appear within the field of view of the fixed-focus wide-angle camera. The images collected by the unmanned aerial vehicle fall into 15 cases, and the positioning information obtained in each case is:
1) only the first two-dimensional code is recognized: the output positioning information is (X0+20cm, Y0, Z0, Yaw0), where X0, Y0 and Z0 are the X-, Y- and Z-direction deviations of the first two-dimensional code from the center of the unmanned aerial vehicle, and Yaw0 is the deviation angle between the positive direction of the first two-dimensional code and that of the unmanned aerial vehicle;
2) only the second two-dimensional code is recognized: the output positioning information is (X1-20cm, Y1, Z1, Yaw1), where X1, Y1, Z1 and Yaw1 are the corresponding deviations of the second two-dimensional code;
3) only the third two-dimensional code is recognized: the output positioning information is (X2, Y2+7.5cm, Z2, Yaw2), where X2, Y2, Z2 and Yaw2 are the corresponding deviations of the third two-dimensional code;
4) only the fourth two-dimensional code is recognized: the output positioning information is (X3, Y3-7.5cm, Z3, Yaw3), where X3, Y3, Z3 and Yaw3 are the corresponding deviations of the fourth two-dimensional code;
5) the first and second two-dimensional codes are recognized simultaneously: ((X0+X1)/2, (Y0+Y1)/2, (Z0+Z1)/2, Yaw0);
6) the first and third: ((X0+X2+20cm)/2, (Y0+Y2+7.5cm)/2, (Z0+Z2)/2, Yaw0);
7) the first and fourth: ((X0+X3+20cm)/2, (Y0+Y3-7.5cm)/2, (Z0+Z3)/2, Yaw0);
8) the second and third: ((X1+X2-20cm)/2, (Y1+Y2+7.5cm)/2, (Z1+Z2)/2, Yaw1);
9) the second and fourth: ((X1+X3-20cm)/2, (Y1+Y3-7.5cm)/2, (Z1+Z3)/2, Yaw1);
10) the third and fourth: ((X2+X3)/2, (Y2+Y3)/2, (Z2+Z3)/2, Yaw2);
11) the first, second and third: ((X0+X1+X2)/3, (Y0+Y1+Y2+7.5cm)/3, (Z0+Z1+Z2)/3, Yaw0);
12) the first, third and fourth: ((X0+X2+X3+20cm)/3, (Y0+Y2+Y3)/3, (Z0+Z2+Z3)/3, Yaw0);
13) the first, second and fourth: ((X0+X1+X3)/3, (Y0+Y1+Y3-7.5cm)/3, (Z0+Z1+Z3)/3, Yaw0);
14) the second, third and fourth: ((X1+X2+X3-20cm)/3, (Y1+Y2+Y3)/3, (Z1+Z2+Z3)/3, Yaw1);
15) all four two-dimensional codes are recognized: ((X0+X1+X2+X3)/4, (Y0+Y1+Y2+Y3)/4, (Z0+Z1+Z2+Z3)/4, Yaw0).
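All 15 cases reduce to one rule: correct each detected tag's measurement by that tag's known offset from the target center, then average the corrected estimates, taking yaw from the lowest-index tag. A minimal sketch — the function name and detection format are assumptions; the offsets are the 20 cm and 7.5 cm values from the cases above:

```python
# Per-tag offsets toward the target center, in cm (tags 0/1 are the large
# left/right codes, tags 2/3 the small lower/upper codes).
X_OFFSET = {0: 20.0, 1: -20.0, 2: 0.0, 3: 0.0}
Y_OFFSET = {0: 0.0, 1: 0.0, 2: 7.5, 3: -7.5}

def fuse(detections):
    """detections: {tag_index: (x, y, z, yaw)} for every tag seen this frame.

    Each detection is shifted by its tag's offset from the target center,
    then the shifted estimates are averaged.  Yaw comes from the
    lowest-index tag, which reproduces the Yaw term of all 15 cases."""
    if not detections:
        return None
    xs, ys, zs = [], [], []
    for tag, (x, y, z, _yaw) in detections.items():
        xs.append(x + X_OFFSET[tag])
        ys.append(y + Y_OFFSET[tag])
        zs.append(z)
    n = len(detections)
    yaw = detections[min(detections)][3]
    return (sum(xs) / n, sum(ys) / n, sum(zs) / n, yaw)
```

For example, seeing only the first code yields (X0+20cm, Y0, Z0, Yaw0), and seeing the first and second codes cancels the ±20 cm terms, matching cases 1 and 5.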
Further, step (3) is specifically as follows:
(301) the unmanned aerial vehicle adjusts its horizontal position and heading angle according to the positioning information, keeps vertically aligned with the target, and descends at a constant speed;
(302) during the descent, the unmanned aerial vehicle adjusts its attitude and position through a negative-feedback mechanism: the real-time positioning information is compared with a preset error range; if it is outside the error range, the position of the unmanned aerial vehicle is corrected by a PID control algorithm until it is within the error range; while it is within the error range, the unmanned aerial vehicle continues to reduce its altitude.
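Steps (301)–(302) amount to a simple negative-feedback loop. A sketch under stated assumptions: `read_pose`, `command`, the thresholds and the gain are all illustrative names and values, standing in for the positioning box output and the flight-control input.

```python
def landing_loop(read_pose, command, err_xy=5.0, descend_rate=0.3,
                 shutdown_height=15.0, kp=0.02):
    """Negative-feedback descent sketch.

    read_pose() -> (x, y, z, yaw) deviations (cm / deg), or None if the
    target is lost; command(vx, vy, vz) sends a velocity setpoint.
    Returns True once the shutdown height is reached."""
    while True:
        pose = read_pose()
        if pose is None:
            return False                      # target lost: abort
        x, y, z, yaw = pose
        if abs(x) > err_xy or abs(y) > err_xy:
            command(-kp * x, -kp * y, 0.0)    # outside band: correct, hold height
        elif z > shutdown_height:
            command(0.0, 0.0, -descend_rate)  # inside band: keep descending
        else:
            command(0.0, 0.0, 0.0)            # below threshold: stop motors
            return True
```

The key design point is that descent only proceeds while the horizontal error is inside the band; otherwise altitude is held and the position is corrected first.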
An autonomous precise landing box for a multi-platform unmanned aerial vehicle comprises a fixed-focus wide-angle camera, a control module and a control interface. The fixed-focus wide-angle camera collects the two-dimensional code images on the target, the control interface transmits control commands to the unmanned aerial vehicle, and the control module executes the following procedure:
(1) obtaining positioning information according to the two-dimensional code images collected by the fixed-focus wide-angle camera;
(2) adjusting the position in real time according to the positioning information;
(3) repeating steps (1) and (2) until the unmanned aerial vehicle lands precisely on the target.
Further, in step (1), the collected two-dimensional code images fall into 15 cases, and the positioning information obtained in each case is:
1) only the first two-dimensional code is recognized: the output positioning information is (X0+20cm, Y0, Z0, Yaw0), where X0, Y0 and Z0 are the X-, Y- and Z-direction deviations of the first two-dimensional code from the center of the unmanned aerial vehicle, and Yaw0 is the deviation angle between the positive direction of the first two-dimensional code and that of the unmanned aerial vehicle;
2) only the second two-dimensional code is recognized: the output positioning information is (X1-20cm, Y1, Z1, Yaw1), where X1, Y1, Z1 and Yaw1 are the corresponding deviations of the second two-dimensional code;
3) only the third two-dimensional code is recognized: the output positioning information is (X2, Y2+7.5cm, Z2, Yaw2), where X2, Y2, Z2 and Yaw2 are the corresponding deviations of the third two-dimensional code;
4) only the fourth two-dimensional code is recognized: the output positioning information is (X3, Y3-7.5cm, Z3, Yaw3), where X3, Y3, Z3 and Yaw3 are the corresponding deviations of the fourth two-dimensional code;
5) the first and second two-dimensional codes are recognized simultaneously: ((X0+X1)/2, (Y0+Y1)/2, (Z0+Z1)/2, Yaw0);
6) the first and third: ((X0+X2+20cm)/2, (Y0+Y2+7.5cm)/2, (Z0+Z2)/2, Yaw0);
7) the first and fourth: ((X0+X3+20cm)/2, (Y0+Y3-7.5cm)/2, (Z0+Z3)/2, Yaw0);
8) the second and third: ((X1+X2-20cm)/2, (Y1+Y2+7.5cm)/2, (Z1+Z2)/2, Yaw1);
9) the second and fourth: ((X1+X3-20cm)/2, (Y1+Y3-7.5cm)/2, (Z1+Z3)/2, Yaw1);
10) the third and fourth: ((X2+X3)/2, (Y2+Y3)/2, (Z2+Z3)/2, Yaw2);
11) the first, second and third: ((X0+X1+X2)/3, (Y0+Y1+Y2+7.5cm)/3, (Z0+Z1+Z2)/3, Yaw0);
12) the first, third and fourth: ((X0+X2+X3+20cm)/3, (Y0+Y2+Y3)/3, (Z0+Z2+Z3)/3, Yaw0);
13) the first, second and fourth: ((X0+X1+X3)/3, (Y0+Y1+Y3-7.5cm)/3, (Z0+Z1+Z3)/3, Yaw0);
14) the second, third and fourth: ((X1+X2+X3-20cm)/3, (Y1+Y2+Y3)/3, (Z1+Z2+Z3)/3, Yaw1);
15) all four two-dimensional codes are recognized: ((X0+X1+X2+X3)/4, (Y0+Y1+Y2+Y3)/4, (Z0+Z1+Z2+Z3)/4, Yaw0).
Further, step (2) is specifically as follows:
(201) the unmanned aerial vehicle adjusts its horizontal position and heading angle according to the positioning information, keeps vertically aligned with the target, and descends at a constant speed;
(202) during the descent, the unmanned aerial vehicle adjusts its attitude and position through a negative-feedback mechanism: the real-time positioning information is compared with a preset error range; if it is outside the error range, the position of the unmanned aerial vehicle is corrected by a PID control algorithm until it is within the error range; while it is within the error range, the unmanned aerial vehicle continues to reduce its altitude.
The invention has the beneficial effects that:
1. The invention is applicable to various unmanned aerial vehicle platforms: whether a rotor, tethered or compound-wing unmanned aerial vehicle, and whether on a vehicle-mounted mobile platform or a fixed take-off and landing platform, autonomous precise take-off and landing on the purpose-made target is possible.
2. The method is broadly portable: the precise landing positioning box can be adapted to various unmanned aerial vehicle platforms, and the unmanned aerial vehicle only needs to provide power output and a flight-control input.
3. The method is stable, real-time and accurate: it outputs positioning information at 30 Hz or above, and the landing accuracy reaches centimeter level.
Drawings
FIG. 1 is a schematic diagram of the purpose-made target's two-dimensional code images according to an embodiment of the present invention;
FIG. 2 is a schematic view of an autonomous precision drop box in an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating processing output of positioning information according to an embodiment of the present invention;
fig. 4 is a schematic view of a scene in which the unmanned aerial vehicle autonomously and accurately lands in the embodiment of the present invention.
Detailed Description
The technical solution of the present invention is further explained with reference to the accompanying drawings. The embodiments described below are only some, not all, of the embodiments of the invention; all other embodiments obtained by a person skilled in the art without inventive effort on the basis of these embodiments fall within the scope of protection of the invention.
As shown in fig. 2, an autonomous precise landing box for a multi-platform unmanned aerial vehicle comprises a fixed-focus wide-angle camera, a control module and a control interface. The fixed-focus wide-angle camera collects the two-dimensional code image on the target, the control interface transmits control commands to the unmanned aerial vehicle, and the control module executes the following procedure:
(1) obtaining positioning information according to the two-dimensional code images collected by the fixed-focus wide-angle camera;
(2) adjusting the position in real time according to the positioning information;
(3) repeating steps (1) and (2) until the unmanned aerial vehicle lands precisely on the target.
The autonomous precise landing method for a multi-platform unmanned aerial vehicle uses a purpose-made target and a precise landing positioning box. The purpose-made target consists of several two-dimensional code images of different sizes. The precise landing positioning box contains an embedded intelligent board and a fixed-focus wide-angle camera and runs a precise landing positioning algorithm. The camera collects the two-dimensional code images on the target and transmits them to the embedded board, where the algorithm processes and analyzes them to obtain positioning information. The unmanned aerial vehicle adjusts its position in real time according to this positioning information and finally lands precisely on the purpose-made target.
The method is realized in a specific way as follows:
the characteristics of multi-platform unmanned aerial vehicles such as a rotor unmanned aerial vehicle, a mooring unmanned aerial vehicle and a composite wing unmanned aerial vehicle are comprehensively considered, and a plurality of two-dimensional code images with different sizes are combined to form a purpose-made target;
the precise landing positioning box is designed by adopting a light aluminum alloy material, an embedded intelligent board card, a fixed-focus wide-angle camera and a precise landing positioning algorithm are contained, and various interfaces and a side wing structure convenient for the installation of the unmanned aerial vehicle are reserved;
as shown in fig. 4, the fixed-focus wide-angle camera acquires a two-dimensional code image on the special target and transmits the two-dimensional code image to the embedded intelligent board card, the precise landing positioning algorithm runs on the embedded intelligent board card, and the two-dimensional code image is processed and analyzed to obtain positioning information of the two-dimensional code image;
the unmanned aerial vehicle adjusts the position of the unmanned aerial vehicle in real time according to the positioning information of the two-dimensional code image, and finally, the unmanned aerial vehicle accurately lands on the purpose-made target.
The two-dimensional code images come in a large size of 20 cm × 20 cm and a small size of 10 cm × 10 cm. The two large images are placed to the left and right of the center point C of the purpose-made target; the two small images are placed above and below C, between the two large images. A schematic of the target's two-dimensional code images is shown in fig. 1: the center of each large image is 20 cm from C, and the center of each small image is 7.5 cm from C.
the accurate landing positioning box is designed by adopting a light aluminum alloy material, and is internally provided with an embedded intelligent board card and a fixed-focus wide-angle camera, and a side wing structure which is provided with various interfaces and is convenient for the installation of the unmanned aerial vehicle is reserved by operating an accurate landing positioning algorithm. Specifically, the method comprises the following steps:
the embedded intelligent board card is positioned in the accurate landing positioning box and has the functions of a CPU, a GPU and a memory, one HDMI, one gigabit network port and one DP interface are arranged outside, and two USB interfaces and two SDI interfaces are arranged outside;
the fixed-focus wide-angle camera is positioned in the accurate landing positioning box, the aperture coefficient is F1.6-F3.5, the field angle is 120 degrees, the resolution is 1920 x 1080, and the USB interface of the fixed-focus wide-angle camera is positioned on a USB1 position on the accurate landing positioning box;
flank structure convenient to unmanned aerial vehicle installation is located the bottom of accurate descending locating box, and accessible screw combination is fixed in the unmanned aerial vehicle below.
The fixed-focus wide-angle camera collects two-dimensional code images on the specially-made targets and transmits the two-dimensional code images to the embedded intelligent board card, and the accurate landing positioning algorithm runs on the embedded intelligent board card and processes and analyzes the two-dimensional code images.
When landing, the unmanned aerial vehicle first flies to a position directly above the purpose-made target by GPS navigation, so that a two-dimensional code image on the target appears within the field of view of the fixed-focus wide-angle camera;
the camera transmits the two-dimensional code image to the precise landing positioning algorithm through the USB interface.
Depending on the unmanned aerial vehicle platform, 15 cases may occur during the descent: 4 cases in which 1 two-dimensional code is recognized, 6 cases with 2, 4 cases with 3, and 1 case with all 4;
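The case count is simply the number of non-empty subsets of the four tags:

```python
from math import comb

# Number of ways to recognize exactly k of the 4 two-dimensional codes.
cases = {k: comb(4, k) for k in (1, 2, 3, 4)}  # {1: 4, 2: 6, 3: 4, 4: 1}
total = sum(cases.values())                    # 15 cases in all
```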
the precise landing positioning algorithm performs different processing modes on the 15 situations respectively, and outputs the positioning information of the two-dimensional code image, and a schematic diagram of the positioning information processing and outputting is shown in fig. 3.
After the positioning information is obtained, the unmanned aerial vehicle adjusts its horizontal position and heading angle accordingly, keeps vertically aligned with the purpose-made target, and descends at a constant speed.
During the descent, the unmanned aerial vehicle adjusts its attitude and position through a negative-feedback mechanism, keeping the positioning information within a preset error range until it lands on the purpose-made target.
Specifically, during the descent the unmanned aerial vehicle compares the real-time positioning information with the preset error range; if it is outside the range, a PID control algorithm corrects the position until it is within range. While the position is within range, the unmanned aerial vehicle keeps reducing its altitude; once the altitude falls below a threshold, the motors shut down and the landing is complete.
PID control algorithms are well known to those skilled in the art; they control the plant by tuning three parameters, P, I and D. Proportional part P: increasing the proportional gain speeds up the system response and reduces the steady-state error, but too large a gain degrades stability. Integral part I: the smaller the integral time constant, the stronger the integral action; integral action eliminates the system's steady-state error, but too much of it degrades stability. Derivative part D: the derivative action responds to the rate of change of the error signal — the faster the error changes, the stronger the action — which damps oscillation and improves stability.
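A textbook positional-form PID matching the P, I and D roles described above — a sketch only, since the patent does not specify gains or timestep:

```python
class PID:
    """Positional-form PID: u = kp*e + ki*integral(e) + kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        # P: faster response, smaller steady-state error (too high: unstable)
        p = self.kp * error
        # I: accumulates error to eliminate the steady-state offset
        self.integral += error * self.dt
        i = self.ki * self.integral
        # D: reacts to the error's rate of change, damping oscillation
        d = self.kd * (error - self.prev_error) / self.dt
        self.prev_error = error
        return p + i + d
```

One such controller per axis (X, Y and yaw) would feed the velocity setpoints during the descent.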
The invention is applicable to various unmanned aerial vehicle platforms: the precise landing positioning box is easily adapted to different platforms, and the unmanned aerial vehicle only needs to provide power output and a flight-control input.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. The autonomous accurate landing method for the multi-platform unmanned aerial vehicle is characterized by comprising the following steps of:
(1) setting a target according to the size of the landing platform, wherein four two-dimensional codes used for representing four different ID numbers are arranged on the target;
(2) the unmanned aerial vehicle acquires the two-dimensional code image on the target through a camera and obtains positioning information according to which two-dimensional codes appear in the acquired image;
(3) the unmanned aerial vehicle adjusts the position in real time according to the positioning information;
(4) repeating steps (2) and (3) until the unmanned aerial vehicle accurately lands on the target.
2. The autonomous precise landing method for the multi-platform unmanned aerial vehicle according to claim 1, wherein the four two-dimensional codes comprise a first two-dimensional code and a second two-dimensional code which are identical in size and arranged opposite each other, and a third two-dimensional code and a fourth two-dimensional code which are identical in size and arranged opposite each other; the third and fourth two-dimensional codes are smaller in size than the first and second two-dimensional codes, and the distance between the third and fourth two-dimensional codes is smaller than the distance between the first and second two-dimensional codes; the line connecting the center points of the third and fourth two-dimensional codes and the line connecting the center points of the first and second two-dimensional codes perpendicularly bisect each other; the first and second two-dimensional codes serve the high-altitude field of view of the unmanned aerial vehicle, and the third and fourth two-dimensional codes serve the low-altitude field of view of the unmanned aerial vehicle.
3. The autonomous precise landing method for the multi-platform unmanned aerial vehicle of claim 2, wherein the camera is a fixed-focus wide-angle camera, the two-dimensional code is AprilTag, the array size is 8 x 8, the sizes of the first two-dimensional code and the second two-dimensional code are 20cm x 20cm, the sizes of the third two-dimensional code and the fourth two-dimensional code are 10cm x 10cm, the distance between the first two-dimensional code and the second two-dimensional code is 20cm, and the distance between the third two-dimensional code and the fourth two-dimensional code is 10 cm.
4. The autonomous precise landing method for the multi-platform unmanned aerial vehicle according to claim 3, wherein in step (2) the unmanned aerial vehicle flies to a position directly above the target through GPS navigation, so that the two-dimensional code image on the target appears within the field of view of the fixed-focus wide-angle camera; there are 15 possible cases for the two-dimensional code image acquired by the unmanned aerial vehicle, and the positioning information obtained in each case is:
1) only the first two-dimensional code is identified: the output positioning information is (X0+20cm, Y0, Z0, Yaw0), wherein X0, Y0 and Z0 are the X-, Y- and Z-direction deviations of the first two-dimensional code from the center of the unmanned aerial vehicle, and Yaw0 is the deviation angle between the positive direction of the first two-dimensional code and the positive direction of the unmanned aerial vehicle;
2) only the second two-dimensional code is identified: the output positioning information is (X1-20cm, Y1, Z1, Yaw1), wherein X1, Y1 and Z1 are the X-, Y- and Z-direction deviations of the second two-dimensional code from the center of the unmanned aerial vehicle, and Yaw1 is the deviation angle between the positive direction of the second two-dimensional code and the positive direction of the unmanned aerial vehicle;
3) only the third two-dimensional code is identified: the output positioning information is (X2, Y2+7.5cm, Z2, Yaw2), wherein X2, Y2 and Z2 are the X-, Y- and Z-direction deviations of the third two-dimensional code from the center of the unmanned aerial vehicle, and Yaw2 is the deviation angle between the positive direction of the third two-dimensional code and the positive direction of the unmanned aerial vehicle;
4) only the fourth two-dimensional code is identified: the output positioning information is (X3, Y3-7.5cm, Z3, Yaw3), wherein X3, Y3 and Z3 are the X-, Y- and Z-direction deviations of the fourth two-dimensional code from the center of the unmanned aerial vehicle, and Yaw3 is the deviation angle between the positive direction of the fourth two-dimensional code and the positive direction of the unmanned aerial vehicle;
5) the first and the second two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X1)/2, (Y0+Y1)/2, (Z0+Z1)/2, Yaw0);
6) the first and the third two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X2+20cm)/2, (Y0+Y2+7.5cm)/2, (Z0+Z2)/2, Yaw0);
7) the first and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X3+20cm)/2, (Y0+Y3-7.5cm)/2, (Z0+Z3)/2, Yaw0);
8) the second and the third two-dimensional codes are identified simultaneously: the output positioning information is ((X1+X2-20cm)/2, (Y1+Y2+7.5cm)/2, (Z1+Z2)/2, Yaw1);
9) the second and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X1+X3-20cm)/2, (Y1+Y3-7.5cm)/2, (Z1+Z3)/2, Yaw1);
10) the third and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X2+X3)/2, (Y2+Y3)/2, (Z2+Z3)/2, Yaw2);
11) the first, the second and the third two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X1+X2)/3, (Y0+Y1+Y2+7.5cm)/3, (Z0+Z1+Z2)/3, Yaw0);
12) the first, the third and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X2+X3+20cm)/3, (Y0+Y2+Y3)/3, (Z0+Z2+Z3)/3, Yaw0);
13) the first, the second and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X1+X3)/3, (Y0+Y1+Y3-7.5cm)/3, (Z0+Z1+Z3)/3, Yaw0);
14) the second, the third and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X1+X2+X3-20cm)/3, (Y1+Y2+Y3)/3, (Z1+Z2+Z3)/3, Yaw1);
15) all four two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X1+X2+X3)/4, (Y0+Y1+Y2+Y3)/4, (Z0+Z1+Z2+Z3)/4, Yaw0).
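The 15 cases above all follow one rule: shift each detected code's measurement by that code's known offset from the target center, average the shifted estimates, and take the yaw of the lowest-numbered detected code. A minimal sketch of that rule, with the per-code offsets read off the single-code cases 1) to 4) (the function and variable names are illustrative, not from the claims):

```python
# Per-code center offsets (dx, dy) relative to the target center, in cm,
# taken from the single-code cases 1)-4) above.
TAG_OFFSETS = {
    1: (+20.0, 0.0),  # first two-dimensional code
    2: (-20.0, 0.0),  # second two-dimensional code
    3: (0.0, +7.5),   # third two-dimensional code
    4: (0.0, -7.5),   # fourth two-dimensional code
}

def fuse(detections):
    """detections: {code_id: (x, y, z, yaw)} -> fused (x, y, z, yaw).

    Each measurement is shifted by its code's offset, then averaged.
    Yaw is taken from the lowest-numbered detected code, matching the
    claim's preference for Yaw0 over Yaw1 over Yaw2.
    """
    if not detections:
        raise ValueError("no two-dimensional code detected")
    xs, ys, zs = [], [], []
    for code_id, (x, y, z, _yaw) in detections.items():
        dx, dy = TAG_OFFSETS[code_id]
        xs.append(x + dx)
        ys.append(y + dy)
        zs.append(z)  # Z carries no offset in any of the 15 cases
    n = len(detections)
    yaw = detections[min(detections)][3]
    return (sum(xs) / n, sum(ys) / n, sum(zs) / n, yaw)
```

With both large codes detected, the +20cm and -20cm offsets cancel, which is why case 5) reduces to the plain average ((X0+X1)/2, ...).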
5. The autonomous precise landing method for the multi-platform unmanned aerial vehicle according to claim 3, wherein step (3) is specifically performed as follows:
(301) the unmanned aerial vehicle adjusts the horizontal position and the course angle of the unmanned aerial vehicle according to the positioning information, keeps vertical alignment with the target and descends at a constant speed;
(302) during descent, the unmanned aerial vehicle adjusts its attitude and position according to a negative feedback mechanism: the real-time positioning information is compared with a preset error range; if it is not within the error range, the position of the unmanned aerial vehicle is corrected through a PID control algorithm until it falls within the error range; if the position of the unmanned aerial vehicle is within the error range, the unmanned aerial vehicle keeps descending.
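The negative-feedback descent of steps (301)-(302) can be sketched as one control step: descend while the measured offset stays inside the tolerance band, otherwise correct horizontally first. A proportional-only correction stands in for the full PID loop here, and the tolerance, gain and descent-rate values are assumptions, not taken from the claims:

```python
# One step of the descend-or-correct loop of claim 5. The threshold,
# gain and rate values are illustrative assumptions.
def descent_step(pos, tolerance=5.0, kp=0.5, descend_rate=2.0):
    """pos: (x, y, z) offset of the target from the drone, in cm.
    Returns the updated offset after one control step."""
    x, y, z = pos
    if abs(x) <= tolerance and abs(y) <= tolerance:
        # Within the error range: hold horizontal position, keep descending.
        return (x, y, max(z - descend_rate, 0.0))
    # Outside the error range: a proportional correction moves the drone
    # toward the target before the descent continues.
    return (x - kp * x, y - kp * y, z)
```

Iterating `descent_step` until the altitude reaches zero reproduces the behaviour of steps (2)-(4) of claim 1: correct, descend, repeat.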
6. An autonomous accurate landing box for a multi-platform unmanned aerial vehicle, characterized by comprising a fixed-focus wide-angle camera, a control module and a control interface, wherein the fixed-focus wide-angle camera is used for acquiring the two-dimensional code image on the target, the control interface is used for transmitting control commands to the unmanned aerial vehicle, and the control module is used for executing the following procedure:
(1) obtaining positioning information according to the two-dimensional code image condition acquired by the fixed-focus wide-angle camera;
(2) adjusting the position in real time according to the positioning information;
(3) repeating steps (1) and (2) until the unmanned aerial vehicle accurately lands on the target.
7. The autonomous precise landing box for the multi-platform unmanned aerial vehicle according to claim 6, wherein in step (1) there are 15 possible cases for the acquired two-dimensional code image, and the positioning information obtained in each case is:
1) only the first two-dimensional code is identified: the output positioning information is (X0+20cm, Y0, Z0, Yaw0), wherein X0, Y0 and Z0 are the X-, Y- and Z-direction deviations of the first two-dimensional code from the center of the unmanned aerial vehicle, and Yaw0 is the deviation angle between the positive direction of the first two-dimensional code and the positive direction of the unmanned aerial vehicle;
2) only the second two-dimensional code is identified: the output positioning information is (X1-20cm, Y1, Z1, Yaw1), wherein X1, Y1 and Z1 are the X-, Y- and Z-direction deviations of the second two-dimensional code from the center of the unmanned aerial vehicle, and Yaw1 is the deviation angle between the positive direction of the second two-dimensional code and the positive direction of the unmanned aerial vehicle;
3) only the third two-dimensional code is identified: the output positioning information is (X2, Y2+7.5cm, Z2, Yaw2), wherein X2, Y2 and Z2 are the X-, Y- and Z-direction deviations of the third two-dimensional code from the center of the unmanned aerial vehicle, and Yaw2 is the deviation angle between the positive direction of the third two-dimensional code and the positive direction of the unmanned aerial vehicle;
4) only the fourth two-dimensional code is identified: the output positioning information is (X3, Y3-7.5cm, Z3, Yaw3), wherein X3, Y3 and Z3 are the X-, Y- and Z-direction deviations of the fourth two-dimensional code from the center of the unmanned aerial vehicle, and Yaw3 is the deviation angle between the positive direction of the fourth two-dimensional code and the positive direction of the unmanned aerial vehicle;
5) the first and the second two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X1)/2, (Y0+Y1)/2, (Z0+Z1)/2, Yaw0);
6) the first and the third two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X2+20cm)/2, (Y0+Y2+7.5cm)/2, (Z0+Z2)/2, Yaw0);
7) the first and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X3+20cm)/2, (Y0+Y3-7.5cm)/2, (Z0+Z3)/2, Yaw0);
8) the second and the third two-dimensional codes are identified simultaneously: the output positioning information is ((X1+X2-20cm)/2, (Y1+Y2+7.5cm)/2, (Z1+Z2)/2, Yaw1);
9) the second and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X1+X3-20cm)/2, (Y1+Y3-7.5cm)/2, (Z1+Z3)/2, Yaw1);
10) the third and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X2+X3)/2, (Y2+Y3)/2, (Z2+Z3)/2, Yaw2);
11) the first, the second and the third two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X1+X2)/3, (Y0+Y1+Y2+7.5cm)/3, (Z0+Z1+Z2)/3, Yaw0);
12) the first, the third and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X2+X3+20cm)/3, (Y0+Y2+Y3)/3, (Z0+Z2+Z3)/3, Yaw0);
13) the first, the second and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X1+X3)/3, (Y0+Y1+Y3-7.5cm)/3, (Z0+Z1+Z3)/3, Yaw0);
14) the second, the third and the fourth two-dimensional codes are identified simultaneously: the output positioning information is ((X1+X2+X3-20cm)/3, (Y1+Y2+Y3)/3, (Z1+Z2+Z3)/3, Yaw1);
15) all four two-dimensional codes are identified simultaneously: the output positioning information is ((X0+X1+X2+X3)/4, (Y0+Y1+Y2+Y3)/4, (Z0+Z1+Z2+Z3)/4, Yaw0).
8. The autonomous precise landing box for the multi-platform unmanned aerial vehicle according to claim 6, wherein step (2) is specifically performed as follows:
(201) the unmanned aerial vehicle adjusts the horizontal position and the course angle of the unmanned aerial vehicle according to the positioning information, keeps vertical alignment with the target and descends at a constant speed;
(202) during descent, the unmanned aerial vehicle adjusts its attitude and position according to a negative feedback mechanism: the real-time positioning information is compared with a preset error range; if it is not within the error range, the position of the unmanned aerial vehicle is corrected through a PID control algorithm until it falls within the error range; if the position of the unmanned aerial vehicle is within the error range, the unmanned aerial vehicle keeps descending.
CN202110980963.2A 2021-08-25 2021-08-25 Autonomous accurate landing method and landing box for multi-platform unmanned aerial vehicle Pending CN113741496A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110980963.2A CN113741496A (en) 2021-08-25 2021-08-25 Autonomous accurate landing method and landing box for multi-platform unmanned aerial vehicle


Publications (1)

Publication Number Publication Date
CN113741496A true CN113741496A (en) 2021-12-03

Family

ID=78732901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110980963.2A Pending CN113741496A (en) 2021-08-25 2021-08-25 Autonomous accurate landing method and landing box for multi-platform unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN113741496A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108919830A (en) * 2018-07-20 2018-11-30 南京奇蛙智能科技有限公司 A kind of flight control method that unmanned plane precisely lands
CN110703807A (en) * 2019-11-18 2020-01-17 西安君晖航空科技有限公司 Landmark design method for large and small two-dimensional code mixed image and landmark identification method for unmanned aerial vehicle
CN110991207A (en) * 2019-11-19 2020-04-10 山东大学 Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition
CN110989661A (en) * 2019-11-19 2020-04-10 山东大学 Unmanned aerial vehicle accurate landing method and system based on multiple positioning two-dimensional codes


Similar Documents

Publication Publication Date Title
CN110991207B (en) Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition
CN109911231B (en) Unmanned aerial vehicle autonomous carrier landing method and system based on GPS and image recognition hybrid navigation
CN107544550B (en) Unmanned aerial vehicle automatic landing method based on visual guidance
CN109792951B (en) Unmanned aerial vehicle air route correction system for pollination of hybrid rice and correction method thereof
RU2615587C1 (en) Method of accurate landing of unmanned aircraft
CN104808685A (en) Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
CN110221625B (en) Autonomous landing guiding method for precise position of unmanned aerial vehicle
CN110618691B (en) Machine vision-based method for accurately landing concentric circle targets of unmanned aerial vehicle
CN109885084A (en) A kind of multi-rotor unmanned aerial vehicle Autonomous landing method based on monocular vision and fuzzy control
CN105867397A (en) Unmanned aerial vehicle accurate position landing method based on image processing and fuzzy control
CN109683629A (en) Unmanned plane electric stringing system based on integrated navigation and computer vision
CN106502257A (en) A kind of unmanned plane precisely lands jamproof control method
CN112947569B (en) Visual servo target tracking control method for quad-rotor unmanned aerial vehicle based on preset performance
CN111831010A (en) Unmanned aerial vehicle obstacle avoidance flight method based on digital space slice
CN113759943A (en) Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system
CN104965513A (en) Son hopping robot recovery system and recovery method
CN106155082A (en) A kind of unmanned plane bionic intelligence barrier-avoiding method based on light stream
CN114910918A (en) Positioning method and device, radar device, unmanned aerial vehicle system, controller and medium
CN112119428A (en) Method, device, unmanned aerial vehicle, system and storage medium for acquiring landing position
CN113741496A (en) Autonomous accurate landing method and landing box for multi-platform unmanned aerial vehicle
CN112558619A (en) Ultrasonic-assisted unmanned aerial vehicle autonomous stable landing system and method
CN116243725A (en) Substation unmanned aerial vehicle inspection method and system based on visual navigation
CN115755575A (en) ROS-based double-tripod-head unmanned aerial vehicle autonomous landing method
Priambodo et al. A Vision and GPS Based System for Autonomous Precision Vertical Landing of UAV Quadcopter
CN115755950A (en) Unmanned aerial vehicle fixed-point landing method based on laser radar and camera data fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211203