CN112789571A - Unmanned aerial vehicle landing method and device and unmanned aerial vehicle


Info

Publication number
CN112789571A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
reference image
flying height
scene
Prior art date
Legal status
Pending
Application number
CN201880094799.1A
Other languages
Chinese (zh)
Inventor
崔希鹏 (Cui Xipeng)
Current Assignee
Autel Robotics Co Ltd
Shenzhen Autel Intelligent Aviation Technology Co Ltd
Original Assignee
Shenzhen Autel Intelligent Aviation Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Autel Intelligent Aviation Technology Co Ltd filed Critical Shenzhen Autel Intelligent Aviation Technology Co Ltd
Publication of CN112789571A


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to an unmanned aerial vehicle landing method and device, and to an unmanned aerial vehicle. By collecting reference images containing the takeoff point, shot by the unmanned aerial vehicle during takeoff, and performing stage-by-stage template matching against them during landing, the sensor error of the unmanned aerial vehicle can be eliminated and its distance from the takeoff point obtained in real time, so that the unmanned aerial vehicle is controlled to approach the takeoff point steadily during descent and finally lands on it accurately. No other auxiliary equipment is needed at any point in the landing process, the landing result is good, and only modest precision is required of the sensors carried by the unmanned aerial vehicle.

Description

Unmanned aerial vehicle landing method and device and unmanned aerial vehicle

Technical Field
The invention relates to the technical field of aircraft, and in particular to an unmanned aerial vehicle landing method, an unmanned aerial vehicle landing device, an unmanned aerial vehicle and a computer-readable storage medium.
Background
With the development of vision algorithms and their application on Unmanned Aerial Vehicle (UAV) platforms, intelligent tracking now achieves good results. Completing the whole sequence from tracking through to landing is an important direction for making unmanned aerial vehicles more intelligent.
At present, unmanned aerial vehicles in the prior art can already land accurately at their takeoff point, but the technique requires auxiliary equipment placed at the takeoff point, such as a signal source that transmits a signal to the unmanned aerial vehicle to guide an accurate landing. Alternatively, a special marker can be set at the takeoff point, or, where the area near the takeoff point is rich in texture, the texture pattern itself can be combined with vision techniques to guide the unmanned aerial vehicle to an accurate landing.
However, for a small unmanned aerial vehicle, guiding the landing with auxiliary equipment has the drawback that the equipment is inconvenient to carry, and placing a special marker at the takeoff point is inconvenient to operate. The approach that combines the texture pattern near the takeoff point with vision techniques fails where the texture at the landing point is sparse, and when the sensor error carried by the unmanned aerial vehicle is large the landing result is poor and the landing point is inaccurate.
Disclosure of Invention
In view of the above, there is a need to provide an unmanned aerial vehicle landing method and device, an unmanned aerial vehicle and a computer-readable storage medium that can land the unmanned aerial vehicle accurately at its takeoff point without other auxiliary equipment.
A method of landing an unmanned aerial vehicle, the method comprising:
in the takeoff process of the unmanned aerial vehicle, acquiring images which are shot by the unmanned aerial vehicle at different flight heights and contain a takeoff point, and taking the acquired images containing the takeoff point as reference images;
in the landing process of the unmanned aerial vehicle, repeating the step of template matching until the unmanned aerial vehicle lands at the flying starting point; wherein the template matching comprises:
acquiring an image shot by the unmanned aerial vehicle at the current flying height;
acquiring a reference image matched with the current flying height of the unmanned aerial vehicle;
carrying out template matching on an image shot by the unmanned aerial vehicle at the current flying height and a reference image matched with the current flying height of the unmanned aerial vehicle to obtain the coordinates of the flying starting point or the distance between the unmanned aerial vehicle and the flying starting point;
and controlling the unmanned aerial vehicle to fly to the flying point according to the coordinates of the flying point or the distance between the unmanned aerial vehicle and the flying point.
In an embodiment of the present invention, in the takeoff process of the unmanned aerial vehicle, acquiring images including a takeoff point, which are shot by the unmanned aerial vehicle at different flight heights, includes:
and acquiring an image containing the flying point at preset intervals.
In an embodiment of the present invention, the predetermined distance is 1 meter.
In an embodiment of the present invention, the method further includes:
and in the takeoff process of the unmanned aerial vehicle, when the flying height of the unmanned aerial vehicle is greater than the preset height, stopping collecting the image containing the takeoff point.
In an embodiment of the present invention, the preset height is 15 meters.
In an embodiment of the present invention, before the acquiring the image of the drone taken at the current flying height, the method further includes:
and determining that the current flying height of the unmanned aerial vehicle is within a preset height range.
In an embodiment of the present invention, the reference image matched with the current flying height of the unmanned aerial vehicle refers to a reference image acquired at the same flying height as the current flying height during the takeoff process of the unmanned aerial vehicle.
In an embodiment of the present invention, the reference image matched with the current flying height of the unmanned aerial vehicle refers to a reference image acquired at a flying height closest to the current flying height during takeoff of the unmanned aerial vehicle.
In an embodiment of the present invention, the template matching further includes:
determining a category of the scene in the reference image matched with the current flying height of the unmanned aerial vehicle, wherein the category of the scene comprises a scene with rich texture or a scene with sparse texture;
extracting a feature map of the image shot by the unmanned aerial vehicle at the current flying height and a feature map of the reference image matched with the current flying height of the unmanned aerial vehicle according to the category of the scene;
then, the template matching is performed on the image shot by the unmanned aerial vehicle at the current flying height and the reference image matched with the current flying height of the unmanned aerial vehicle, and the template matching includes:
and carrying out template matching on the characteristic graph of the image shot by the unmanned aerial vehicle at the current flying height and the characteristic graph of the reference image.
In an embodiment of the present invention, the determining the category of the scene in the reference image matching the current flying height of the drone includes:
extracting a first-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle;
acquiring a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the first-order gradient map;
judging whether a value reflecting the texture richness degree of the scene in the gradient histogram is larger than a first preset value or not according to the gradient histogram;
and if so, judging that the scene in the reference image is a texture-rich scene.
In an embodiment of the invention, the feature map of the image taken by the drone at the current flying height and the feature map of the reference image matched with the current flying height of the drone comprise first order gradient maps.
In an embodiment of the invention, the feature map of the image shot by the unmanned aerial vehicle at the current flying height and the feature map of the reference image matched with the current flying height of the unmanned aerial vehicle further include a grayscale map.
In an embodiment of the present invention, if a value reflecting the texture richness of the scene in the gradient histogram is not greater than the first preset value, the method further includes:
acquiring a second-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle;
acquiring a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the second-order gradient map;
judging whether a value reflecting the texture richness degree of the scene in the gradient histogram is larger than a second preset value or not according to the gradient histogram;
and if so, judging that the scene in the reference image is a texture sparse scene.
In an embodiment of the invention, the feature map of the image captured by the drone at the current flying height and the feature map of the reference image include a second-order gradient map.
In an embodiment of the present invention, if a value reflecting the texture richness degree of the scene in the reference image in the gradient histogram is not greater than the second preset value, it is determined that the scene in the reference image is a non-texture scene.
In order to solve the technical problem, the invention also provides an unmanned aerial vehicle landing device, which comprises:
the unmanned aerial vehicle control system comprises an acquisition module, a control module and a control module, wherein the acquisition module is used for acquiring images containing flying points shot by the unmanned aerial vehicle at different flying heights in the take-off process of the unmanned aerial vehicle and taking the acquired images containing the flying points as reference images; and
the template matching module is used for repeating the step of template matching in the landing process of the unmanned aerial vehicle until the unmanned aerial vehicle lands at the flying starting point; wherein the template matching module comprises:
the acquisition module is used for acquiring an image shot by the unmanned aerial vehicle at the current flying height; and
acquiring a reference image matched with the current flying height of the unmanned aerial vehicle;
the matching module is used for performing template matching on an image shot by the unmanned aerial vehicle at the current flying height and a reference image matched with the current flying height of the unmanned aerial vehicle so as to obtain the coordinates of the flying point or the distance between the unmanned aerial vehicle and the flying point; and
and the control module is used for controlling the unmanned aerial vehicle to fly to the flying point according to the coordinate of the flying point or the distance between the unmanned aerial vehicle and the flying point.
In an embodiment of the present invention, the acquisition module is specifically configured to:
and acquiring an image containing the flying point at preset intervals.
In an embodiment of the present invention, the predetermined distance is 1 meter.
In an embodiment of the present invention, the acquisition module is further configured to:
in the takeoff process of the unmanned aerial vehicle, when the flying height of the unmanned aerial vehicle is larger than a preset height, images containing the takeoff points are not collected any more.
In an embodiment of the present invention, the preset height is 15 meters.
In an embodiment of the present invention, the template matching module further includes:
and the determining module is used for determining that the current flying height of the unmanned aerial vehicle is within a preset height range.
In an embodiment of the present invention, the reference image matched with the current flying height of the unmanned aerial vehicle refers to a reference image acquired at the same flying height as the current flying height during the takeoff process of the unmanned aerial vehicle.
In an embodiment of the present invention, the reference image matched with the current flying height of the unmanned aerial vehicle refers to a reference image acquired at a flying height closest to the current flying height during takeoff of the unmanned aerial vehicle.
In an embodiment of the present invention, the template matching module further includes:
a texture determination module for determining a category of a scene within the reference image that matches the current flying height of the drone, wherein the category of the scene includes a scene with rich texture or a scene with sparse texture; and
the extraction module is used for extracting a feature map of the image shot by the unmanned aerial vehicle at the current flying height and a feature map of the reference image matched with the current flying height of the unmanned aerial vehicle according to the category of the scene; then:
and the matching module carries out template matching on the feature map of the image shot by the unmanned aerial vehicle at the current flying height and the feature map of the reference image.
In an embodiment of the present invention, the texture determining module is specifically configured to:
extracting a first-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle;
acquiring a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the first-order gradient map;
judging whether a value reflecting the texture richness degree of the scene in the gradient histogram is larger than a first preset value or not according to the gradient histogram;
and if so, judging that the scene in the reference image is a texture-rich scene.
In an embodiment of the invention, the feature map of the image taken by the drone at the current flying height and the feature map of the reference image matched with the current flying height of the drone comprise first order gradient maps.
In an embodiment of the invention, the feature map of the image shot by the unmanned aerial vehicle at the current flying height and the feature map of the reference image matched with the current flying height of the unmanned aerial vehicle further include a grayscale map.
In an embodiment of the invention, the texture determining module is further configured to:
if the value reflecting the texture richness degree of the scene in the reference image in the gradient histogram is not larger than the first preset value;
then, acquiring a second-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle;
acquiring a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the second-order gradient map;
judging whether a value reflecting the texture richness degree of the scene in the gradient histogram is larger than a second preset value or not according to the gradient histogram;
and if so, judging that the scene in the reference image is a texture sparse scene.
In an embodiment of the invention, the feature map of the image captured by the drone at the current flying height and the feature map of the reference image include a second-order gradient map.
In an embodiment of the invention, the texture determining module is further configured to:
and if the value reflecting the texture richness degree of the scene in the reference image in the gradient histogram is not greater than the second preset value, judging that the scene in the reference image is a non-texture scene.
In order to solve the technical problem, the invention also provides an unmanned aerial vehicle, which comprises:
a fuselage;
an arm connected to the fuselage;
a power device disposed on the arm;
a processor disposed within the fuselage or the arm; and
a memory communicatively coupled to the processor, the memory disposed within the fuselage or the arm; wherein:
the memory stores instructions executable by the processor, and when the processor executes the instructions, the unmanned aerial vehicle landing method described above is implemented.
In order to solve the technical problem, the present invention further provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the processor is enabled to execute the unmanned aerial vehicle landing method.
By collecting reference images containing the takeoff point, shot by the unmanned aerial vehicle during takeoff, and performing stage-by-stage template matching during landing, the invention eliminates the sensor error of the unmanned aerial vehicle and obtains its distance from the takeoff point in real time, so that the unmanned aerial vehicle is controlled to approach the takeoff point steadily during descent and finally lands on it accurately. No other auxiliary equipment is needed at any point in the landing process, the landing result is good, and only modest precision is required of the sensors carried by the unmanned aerial vehicle.
Drawings
Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention.
FIG. 2 is a flowchart of a method for landing an UAV according to an embodiment of the present invention;
FIG. 3 is a flow chart illustrating one embodiment of template matching in the method of FIG. 2 according to the present invention;
FIG. 4 is a flowchart of one embodiment of step S114 in the flowchart of FIG. 3;
fig. 5 is a block diagram of an embodiment of a landing device of an unmanned aerial vehicle according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention provides a method and a device for controlling an unmanned aerial vehicle to land accurately on its takeoff point, and an unmanned aerial vehicle capable of doing so.
As shown in fig. 2, the unmanned aerial vehicle landing method of the present invention includes:
and S10, acquiring images containing the flying points shot by the unmanned aerial vehicle at different flying heights in the takeoff process of the unmanned aerial vehicle, and taking the acquired images containing the flying points as reference images.
The images containing the takeoff point can be acquired by the imaging device on the unmanned aerial vehicle and stored in the memory of the unmanned aerial vehicle. It should be noted that in the present invention the takeoff point may refer to the area in which the unmanned aerial vehicle takes off, or to the coordinate point of its takeoff position. During takeoff, each time the unmanned aerial vehicle reaches a certain height, the imaging device acquires an image containing the takeoff point. In an embodiment of the present invention, an image containing the takeoff point is acquired every preset distance. The preset distance may be determined as needed or from experience; in one embodiment of the present invention it is 1 meter, and in other possible embodiments it may also be 2 meters, 3 meters, etc. In addition, when the flying height of the unmanned aerial vehicle exceeds a preset height, acquisition of images containing the takeoff point stops. Similarly, the preset height may be determined as needed or from experience; in an embodiment of the present invention it is 15 meters, and in other possible embodiments it may also be set to 60 meters, 100 meters, etc. when the GPS error on the unmanned aerial vehicle is large.
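As an illustration of step S10, the following is a minimal sketch of the collection loop, assuming hypothetical camera.capture() and altitude() interfaces standing in for the imaging device and the height sensor (neither name comes from the patent):

```python
# Sketch of reference-image collection during takeoff (step S10).
# `camera` and `altitude` are hypothetical stand-ins for the drone's
# imaging device and height sensor.

CAPTURE_STEP_M = 1.0   # preset distance between captures (1 m in this embodiment)
MAX_CAPTURE_M = 15.0   # preset height above which acquisition stops (15 m)

def collect_reference_images(camera, altitude):
    references = {}              # flying height (m) -> image containing the takeoff point
    next_capture = CAPTURE_STEP_M
    h = altitude()
    while h <= MAX_CAPTURE_M:    # stop collecting above the preset height
        if h >= next_capture:
            references[round(h)] = camera.capture()
            next_capture += CAPTURE_STEP_M
        h = altitude()
    return references
```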
S11, in the landing process of the unmanned aerial vehicle, repeating the step of template matching until the unmanned aerial vehicle lands at the takeoff point.
During landing, repeating the template-matching operation allows the offset of the unmanned aerial vehicle from the takeoff point to be obtained in real time, overcoming the sensor error of the unmanned aerial vehicle so that it can be controlled to land accurately on the takeoff point.
As shown in fig. 3, in an embodiment of the present invention, the template matching further includes:
s111, in the unmanned aerial vehicle landing process, confirming that the current flying height of the unmanned aerial vehicle is within a preset height range.
The template-matching operation is started only when the flying height of the unmanned aerial vehicle satisfies a preset condition. The landing process of the unmanned aerial vehicle can therefore be divided into at least two height intervals according to its current flying height, each interval corresponding to one template-matching pass, which is why the method is also called staged template matching. In other possible embodiments, template matching may instead be performed continuously throughout the landing of the unmanned aerial vehicle rather than in stages.
For example, the landing process of the unmanned aerial vehicle can be divided into the following three stages:
First stage: current flying height H ≥ 25 m;
Second stage: 25 m > current flying height H ≥ 13 m;
Third stage: 13 m > current flying height H ≥ 4 m.
That is, whenever the flying height H of the unmanned aerial vehicle is detected to be within one of the three height intervals above, one template-matching operation is started; the whole landing process of the unmanned aerial vehicle therefore carries out three template-matching operations in total. A small sketch of this stage selection is given below.
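The following sketch maps the current flying height to a stage, using the three example intervals above (the interval bounds come from the example, not from the method itself):

```python
# Map the current flying height (metres) to a landing stage; one
# template-matching pass is started per stage. Bounds follow the
# three-stage example above.
STAGES = [(25.0, float("inf")), (13.0, 25.0), (4.0, 13.0)]  # [low, high)

def stage_for_height(h):
    for stage, (low, high) in enumerate(STAGES, start=1):
        if low <= h < high:
            return stage
    return None   # outside all intervals: no template matching
```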
S112, acquiring the image shot by the unmanned aerial vehicle at the current flying height.
S113, obtaining a reference image matched with the current flying height of the unmanned aerial vehicle.
In an embodiment of the invention, the reference image matched with the current flying height of the unmanned aerial vehicle is the reference image acquired, during takeoff, at the same flying height as the current flying height. In one embodiment of the present invention, in step S10 an image containing the takeoff point is acquired as a reference image every 1 meter; if the current flying height of the unmanned aerial vehicle is 15 meters, the matching reference image is the one acquired at a flying height of 15 meters during takeoff.
In other possible embodiments, the reference image matched with the current flying height of the unmanned aerial vehicle may also refer to the reference image acquired at the flying height closest to the current flying height during takeoff. For example, if the current flying height of the unmanned aerial vehicle is 25 meters, then, since reference images are no longer collected once the flying height exceeds 15 meters, the matching reference image is the last one collected during takeoff, i.e., the reference image acquired at a flying height of 15 meters.
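Assuming the reference images are stored keyed by their collection height, as in the collection sketch above, both matching rules reduce to a nearest-height lookup:

```python
# Hypothetical lookup for step S113: return the reference image collected
# at the height closest to the current flying height (an exact match is
# simply the closest one).
def matched_reference(references, current_h):
    closest_h = min(references, key=lambda h: abs(h - current_h))
    return references[closest_h]
```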
S114, determining the category of the scene in the reference image matched with the current flying height of the unmanned aerial vehicle, wherein the category of the scene comprises a scene with rich texture or a scene with sparse texture.
As shown in fig. 4, in an embodiment of the present invention, step S114 further includes:
s1141, extracting a first-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle.
In an embodiment of the present invention, a Sobel template may be used to extract the first-order gradient map of the reference image.
In the X direction, the following 3 × 3 template is used:
-1 0 1
-2 0 2
-1 0 1
in the Y direction, the following 3 × 3 template is used:
-1 -2 -1
0 0 0
1 2 1
Gradient maps in the X direction and the Y direction are thus obtained, and the superimposed first-order gradient map is then computed according to the formula pixel = (|pixel_x| + |pixel_y|) / 2.
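A sketch of this extraction with OpenCV, whose 3 × 3 Sobel kernels are the two templates above (the float depth is an implementation choice, not from the patent):

```python
import cv2
import numpy as np

# First-order gradient map for step S1141: Sobel in X and Y, then
# pixel = (|pixel_x| + |pixel_y|) / 2 as in the formula above.
def first_order_gradient(gray):
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)   # X-direction template
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)   # Y-direction template
    return (np.abs(gx) + np.abs(gy)) / 2.0
```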
S1142, obtaining a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the first-order gradient map.
The gradient histogram is a basic technique in image processing and describes the distribution of gradients well.
The specific method is as follows:
The value range of the first-order gradient image is 0-199. With a step size of 2 there are 100 bins in total, recorded in the array hist[100]. The whole gradient map is traversed; for each pixel, the pixel value divided by 2 (integer division) gives the bin index idx, and hist[idx] is incremented. The maximum value max_hist is recorded, and a normalization is performed: each bin is multiplied by 199.0/max_hist, so that the values in the final hist array again range over 0-199.
S1143, judging whether the value reflecting the texture abundance degree of the scene in the gradient histogram is larger than a first preset value according to the gradient histogram.
After the gradient histogram is obtained, its distribution is analysed further: among the 100 bins, the number of bins whose value is greater than 15 is counted, where 15 is an empirical value and can be chosen from experience. Denoting this count count_bin, the ratio of such bins to the total number of bins is hist_ratio = count_bin / 100. hist_ratio is the value reflecting the texture richness of the scene. The first preset value T is empirically taken to be 0.1, and it is judged whether hist_ratio is greater than T.
S1144, if the hist _ ratio is larger than T, judging that the scene in the reference image is a scene with rich texture.
S1145, if the hist _ ratio is not greater than T, acquiring a second-order gradient map of the reference image matched with the current flight altitude of the unmanned aerial vehicle.
In one embodiment of the present invention, the second-order gradient map of the reference image is likewise extracted using a Sobel template.
In the X direction, the following 3 × 3 template is used:
-1 0 1
-2 0 2
-1 0 1
in the Y direction, the following 3 × 3 template is used:
-1 -2 -1
0 0 0
1 2 1
Gradient maps in the X direction and the Y direction are thus obtained, and the superimposed second-order gradient map is then computed according to the formula pixel = (|pixel_x| + |pixel_y|) / 2.
S1146, obtaining a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the second-order gradient image.
This step is basically the same as step S1142, and is not described again here.
S1147, judging whether the value reflecting the texture abundance degree of the scene in the gradient histogram is larger than a second preset value according to the gradient histogram.
The distribution of the gradient histogram is again analysed: the number of bins whose value is greater than 15 is counted, 15 being an empirical value that can be chosen from experience. Denoting this count count_bin, the ratio of such bins to the total number of bins is hist_ratio = count_bin / 100. The second preset value T is empirically taken to be 0.13, and it is judged whether hist_ratio is greater than T. The second preset value may be the same as or different from the first preset value.
S1148, if the hist _ ratio is larger than T, judging that the scene in the reference image is a scene with sparse texture.
S1149, if hist_ratio is not greater than T, judging that the scene in the reference image is a textureless scene or a scene with very sparse texture, in which case the method is no longer applicable.
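Putting steps S1143-S1149 together, here is a sketch of the whole scene-classification cascade, reusing the helpers and the numpy import from the sketches above. Obtaining the second-order gradient by applying the Sobel templates to the first-order gradient map is an assumption; the text says only that a Sobel template is still used:

```python
# Scene-category decision (steps S1143-S1149). Thresholds are the
# empirical values from the text: bins above 15 are counted,
# T1 = 0.1 and T2 = 0.13.
def classify_scene(reference_gray):
    def hist_ratio(grad):
        hist = gradient_histogram(grad)
        return np.count_nonzero(hist > 15) / 100.0   # count_bin / 100

    g1 = first_order_gradient(reference_gray)
    if hist_ratio(g1) > 0.10:
        return "texture-rich"
    g2 = first_order_gradient(g1)    # assumed second-order extraction
    if hist_ratio(g2) > 0.13:
        return "texture-sparse"
    return "textureless"             # method no longer applicable
```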
S115, extracting the feature map of the image acquired by the unmanned aerial vehicle at the current flying height and the feature map of the reference image matched with the current flying height of the unmanned aerial vehicle, according to the category of the scene in the reference image.
The feature maps extracted in step S115 differ between texture-rich scenes and texture-sparse scenes.
For scenes with rich textures, the feature map of the image acquired by the unmanned aerial vehicle at the current flying height and the feature map of the reference image matched with the current flying height of the unmanned aerial vehicle comprise a first-order gradient map. Namely, a first order gradient map of the image acquired by the drone at the current altitude and a first order gradient map of the reference image matching the current altitude of the drone.
In other possible embodiments, the characteristic map may further include a grayscale map of the image acquired by the drone at the current flight altitude and a grayscale map of the reference map matched to the current flight altitude of the drone.
For scenes with sparse texture, the feature map of the image acquired by the unmanned aerial vehicle at the current flying height and the feature map of the reference image matched with the current flying height of the unmanned aerial vehicle comprise a second-order gradient map; namely, a second-order gradient map of the image acquired by the unmanned aerial vehicle at the current flying height and a second-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle.
According to the invention, different characteristic graphs are used for template matching aiming at different scenes, so that the accuracy and robustness of template matching can be improved.
S116, carrying out template matching on the characteristic diagram of the image shot by the unmanned aerial vehicle at the current flying height and the characteristic diagram of the reference image so as to obtain the coordinates of the flying point or the distance between the unmanned aerial vehicle and the flying point.
Template matching requires the scales of the current image and the reference image to be approximately equal. Therefore, before template matching, the two images need to be preprocessed:
taking the current flying height of the unmanned aerial vehicle as 25 meters and taking the image acquired by the unmanned aerial vehicle when the flying height is 13 meters as an example, the scales are approximately equal, that is, if the reference image acquired by the unmanned aerial vehicle when the flying height is 13 meters and the current image acquired by the unmanned aerial vehicle when the flying height is 25 meters both contain the same object, the image areas of the object are basically consistent in the two images. Because the object at a close distance is large and the object at a far distance is small in the camera model, the reference image acquired by the unmanned aerial vehicle at the flying height of 13 meters is down-sampled according to 13/25-0.52 to obtain a new image, and then the new image is basically in the same scale as the current image acquired by the unmanned aerial vehicle at the flying height of 25 meters.
Yaw-angle estimation: taking a current yaw angle of 0 degrees as an example, with the acquired current image as reference, the image is rotated counterclockwise from -18 degrees to 18 degrees, performing one template match every 3 degrees to obtain a response value. The yaw angle corresponding to the maximum response among the 13 template-matching results is found, and the yaw angle of the unmanned aerial vehicle is corrected to: the current yaw angle of the drone + that angle.
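A sketch of this sweep. It rotates the scale-matched reference template rather than the current image, which only flips the sign convention of the recovered angle, and it uses cv2.TM_CCOEFF_NORMED as the response measure, which the patent does not specify:

```python
import cv2
import numpy as np

# Yaw estimation: 13 template matches over angles -18..18 degrees in
# 3-degree steps; the angle with the largest response corrects the yaw.
def estimate_yaw(current_img, template):
    h, w = template.shape[:2]
    best_angle, best_resp = 0.0, -np.inf
    for angle in range(-18, 19, 3):
        M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, 1.0)
        rotated = cv2.warpAffine(template, M, (w, h))
        resp = cv2.matchTemplate(current_img, rotated, cv2.TM_CCOEFF_NORMED).max()
        if resp > best_resp:
            best_angle, best_resp = angle, resp
    return best_angle   # corrected yaw = current yaw + best_angle
```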
Height estimation: the height error is taken from [-1, 1] meters with a step of 0.5 meters, giving 5 template matches; the height error with the maximum response is delta_z, and the flying height of the unmanned aerial vehicle is corrected to: the current flying height of the drone + delta_z.
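In the same style, a sketch of the height sweep, reusing match_scale from above; the response measure is again an assumption:

```python
import cv2
import numpy as np

# Height correction: try height errors in [-1, 1] m with a 0.5 m step
# (5 template matches); the error with the largest response is delta_z.
def estimate_height_error(current_img, reference_img, ref_h, cur_h):
    best_dz, best_resp = 0.0, -np.inf
    for dz in (-1.0, -0.5, 0.0, 0.5, 1.0):
        scaled = match_scale(reference_img, ref_h, cur_h + dz)
        resp = cv2.matchTemplate(current_img, scaled, cv2.TM_CCOEFF_NORMED).max()
        if resp > best_resp:
            best_dz, best_resp = dz, resp
    return best_dz   # corrected height = cur_h + delta_z
```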
Horizontal-distance calculation: the reference image acquired at a flying height of 13 meters is preprocessed as above to obtain an image at the same scale as the current image acquired at 25 meters; the position corresponding to the maximum response is then found with the template-matching algorithm and converted into the world coordinate system according to the attitude of the unmanned aerial vehicle. Assume that the origin of pixel coordinates is the optical-center position, the pixel coordinate of the matching result is (Δu, Δv), the camera intrinsic matrix (containing only the focal length) is K, the aircraft rotation matrix is R, and the coordinates of the takeoff point in the world coordinate system to be calculated are (X, Y, Z); then:
(X, Y, h)^T = R^T K^(-1) s (Δu, Δv, 1)^T
The scale factor s is solved from the current flying height h of the unmanned aerial vehicle, which yields accurate X and Y, i.e., the horizontal distances between the unmanned aerial vehicle and the takeoff point in the X direction and the Y direction.
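A direct transcription of this back-projection, with K and R as defined above:

```python
import numpy as np

# Horizontal offset from (X, Y, h)^T = R^T K^(-1) s (du, dv, 1)^T:
# the scale s is fixed by requiring the third component to equal the
# known flying height h, which then yields X and Y.
def horizontal_offset(du, dv, h, K, R):
    ray = R.T @ np.linalg.inv(K) @ np.array([du, dv, 1.0])
    s = h / ray[2]
    return s * ray[0], s * ray[1]   # distances to the takeoff point in X and Y
```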
S117, controlling the unmanned aerial vehicle to fly to the takeoff point according to the coordinates of the takeoff point or the distance between the unmanned aerial vehicle and the takeoff point.
By collecting reference images containing the takeoff point, shot by the unmanned aerial vehicle during takeoff, and performing stage-by-stage template matching during landing, the invention eliminates the sensor error of the unmanned aerial vehicle and obtains its distance from the takeoff point in real time, so that the unmanned aerial vehicle is controlled to approach the takeoff point steadily during descent and finally lands on it accurately.
As shown in fig. 5, the present invention also provides a landing apparatus 20 for an unmanned aerial vehicle, the apparatus 20 comprising:
the collecting module 21 is configured to collect images including a flying point shot by the unmanned aerial vehicle at different flying heights in a takeoff process of the unmanned aerial vehicle, and use the collected images including the flying point as a reference image; and
the template matching module 22 is used for repeating the step of template matching in the landing process of the unmanned aerial vehicle until the unmanned aerial vehicle lands at the flying starting point; wherein the template matching module 22 comprises:
the acquiring module 221 is configured to acquire an image shot by the unmanned aerial vehicle at the current flying height; and
acquiring a reference image matched with the current flying height of the unmanned aerial vehicle;
a matching module 225, configured to perform template matching on an image captured at the current flying height by the unmanned aerial vehicle and a reference image matched with the current flying height of the unmanned aerial vehicle, so as to obtain coordinates of the flying point or a distance from the unmanned aerial vehicle to the flying point; and
and the control module 226 is configured to control the unmanned aerial vehicle to fly to the departure point according to the coordinates of the departure point or the distance from the unmanned aerial vehicle to the departure point.
Optionally, the acquisition module 21 is specifically configured to:
and acquiring an image containing the flying point at preset intervals.
Optionally, the preset distance is 1 meter.
Optionally, the acquisition module 21 is further configured to:
in the takeoff process of the unmanned aerial vehicle, when the flying height of the unmanned aerial vehicle is larger than a preset height, images containing the takeoff points are not collected any more.
Optionally, the preset height is 15 meters.
Optionally, the template matching module 22 further includes:
a determination module 222, configured to determine that the flying height of the drone is within a preset height range.
Optionally, the reference image matched with the current flying height of the unmanned aerial vehicle refers to a reference image acquired at the flying height same as the current flying height in the takeoff process of the unmanned aerial vehicle.
Optionally, the reference image matched with the current flying height of the unmanned aerial vehicle refers to a reference image acquired at a flying height closest to the current flying height in the takeoff process of the unmanned aerial vehicle.
Optionally, the template matching module 22 further includes:
a texture determining module 223, configured to determine a category of the scene in the reference image that matches the current flying height of the drone, where the category of the scene includes a scene with rich texture or a scene with sparse texture; and
an extracting module 224, configured to extract, according to the category of the scene, a feature map of the image captured by the drone at the current flying height and a feature map of the reference image matched to the current flying height of the drone; then:
the matching module 225 performs template matching on the feature map of the image shot by the unmanned aerial vehicle at the current flying height and the feature map of the reference image.
Optionally, the texture determining module 223 is specifically configured to:
extracting a first-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle;
acquiring a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the first-order gradient map;
judging whether a value reflecting the texture richness degree of the scene in the gradient histogram is larger than a first preset value or not according to the gradient histogram;
and if so, judging that the scene in the reference image is a texture-rich scene.
Optionally, the feature map of the image shot by the drone at the current flying height and the feature map of the reference image matched with the current flying height of the drone include a first-order gradient map.
Optionally, the feature map of the image shot by the unmanned aerial vehicle at the current flying height and the feature map of the reference image matched with the current flying height of the unmanned aerial vehicle further include a grayscale map.
Optionally, the texture determining module 223 is further configured to:
if the value reflecting the texture richness degree of the scene in the reference image in the gradient histogram is not larger than the first preset value;
then, acquiring a second-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle;
acquiring a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the second-order gradient map;
judging whether a value reflecting the texture richness degree of the scene in the gradient histogram is larger than a second preset value or not according to the gradient histogram;
and if so, judging that the scene in the reference image is a texture sparse scene.
Optionally, the feature map of the image shot by the unmanned aerial vehicle at the current flying height and the feature map of the reference image include a second-order gradient map.
Optionally, the texture determining module 223 is further configured to:
and if the value reflecting the texture richness degree of the scene in the reference image in the gradient histogram is not greater than the second preset value, judging that the scene in the reference image is a non-texture scene.
In the embodiment of the present invention, the acquisition module 21 may be an imaging device mounted on the unmanned aerial vehicle, such as a camera. The template matching module 22 may be a processor on the unmanned aerial vehicle or a Field-Programmable Gate Array (FPGA). The obtaining module 221, the matching module 225 and the control module 226 may be the flight-control chip of the unmanned aerial vehicle. The determination module 222 may be a height sensor of the unmanned aerial vehicle, and the texture determination module 223 may be a vision chip of the unmanned aerial vehicle.
In addition, the detailed functions of the modules in the device can refer to the description of the unmanned aerial vehicle landing method, and are not repeated herein.
The present invention further provides an unmanned aerial vehicle 30. As shown in fig. 1, the unmanned aerial vehicle 30 includes a fuselage 31, arms 32 connected to the fuselage 31, power devices 33 disposed at one end of each arm 32, a gimbal 35 connected to the fuselage 31, an imaging device 34 mounted on the gimbal 35, and a processor 36 and a memory 37 disposed within the fuselage 31.
In this embodiment the number of arms 32 is 4, i.e. the aircraft is a quadrotor; in other possible embodiments the number of arms 32 may also be 3, 6, 8, 10, etc. The drone 30 may also be another movable object, such as a manned aircraft, a model aircraft, an unmanned airship, a fixed-wing drone, an unmanned hot-air balloon, and the like.
Each power device 33 includes a motor 332 disposed at one end of an arm 32 and a propeller 331 connected to the shaft of the motor 332. The shaft of the motor 332 rotates to drive the propeller 331, providing lift to the drone 30.
The gimbal 35 serves to reduce or even eliminate the vibration transmitted from the power devices 33 to the imaging device 34, ensuring that the imaging device 34 can capture stable, clear images or video.
The imaging device 34 may be a binocular camera, a monocular camera, an infrared imaging device, an ultraviolet imaging device, a camcorder, or the like. It may be mounted directly on the drone 30 or, as in this embodiment, mounted via the gimbal 35, which allows the imaging device 34 to rotate about at least one axis relative to the drone 30.
The processor 36 may include a plurality of functional units, such as a flight-control unit for controlling the flight attitude of the aircraft, a target-recognition unit for recognizing targets, a tracking unit for tracking a specific target, a navigation unit (e.g., GPS (Global Positioning System) or BeiDou) for navigating the aircraft, and a data-processing unit for processing environmental information acquired by associated onboard devices (e.g., the imaging device 34).
The memory 37 stores a computer program which, when executed by the processor, causes the processor to perform the method described in the embodiments shown in figs. 2-4.
The invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the method described in the embodiments shown in figs. 2-4.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination contains no contradiction, it should be considered within the scope of this specification.
The embodiments above express only several implementations of the invention, and although their description is specific and detailed, it should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the inventive concept, and these fall within the protection scope of the invention. The protection scope of this patent shall therefore be subject to the appended claims.

Claims (32)

  1. An unmanned aerial vehicle landing method, the method comprising:
    in the takeoff process of the unmanned aerial vehicle, acquiring images which are shot by the unmanned aerial vehicle at different flight heights and contain a takeoff point, and taking the acquired images containing the takeoff point as reference images;
    in the landing process of the unmanned aerial vehicle, repeating the step of template matching until the unmanned aerial vehicle lands at the flying starting point; wherein the template matching comprises:
    acquiring an image shot by the unmanned aerial vehicle at the current flying height;
    acquiring a reference image matched with the current flying height of the unmanned aerial vehicle;
    carrying out template matching on an image shot by the unmanned aerial vehicle at the current flying height and a reference image matched with the current flying height of the unmanned aerial vehicle to obtain the coordinates of the flying starting point or the distance between the unmanned aerial vehicle and the flying starting point;
    and controlling the unmanned aerial vehicle to fly to the flying point according to the coordinates of the flying point or the distance between the unmanned aerial vehicle and the flying point.
  2. An unmanned aerial vehicle landing method according to claim 1, wherein during takeoff of the unmanned aerial vehicle, acquiring images including a takeoff point of the unmanned aerial vehicle, taken at different flight heights, comprises:
    and acquiring an image containing the flying point at preset intervals.
  3. An unmanned aerial vehicle landing method according to claim 2, wherein the predetermined distance is 1 meter.
  4. A method of landing an unmanned aerial vehicle, according to any of claims 1-3, further comprising:
    and in the takeoff process of the unmanned aerial vehicle, when the flying height of the unmanned aerial vehicle is greater than the preset height, stopping collecting the image containing the takeoff point.
  5. An unmanned aerial vehicle landing method according to claim 4, wherein the preset height is 15 metres.
  6. A method of landing a drone according to any one of claims 1 to 5, wherein prior to the acquiring of the image taken by the drone at the current flight altitude, the method further comprises:
    and determining that the current flying height of the unmanned aerial vehicle is within a preset height range.
  7. An unmanned aerial vehicle landing method according to any of claims 1-6, wherein the reference image that matches the current flying height of the unmanned aerial vehicle is a reference image that is acquired at the same flying height as the current flying height during takeoff of the unmanned aerial vehicle.
  8. An unmanned aerial vehicle landing method according to any of claims 1-6, wherein the reference image that matches the current flying height of the unmanned aerial vehicle is a reference image acquired at a flying height closest to the current flying height during takeoff of the unmanned aerial vehicle.
  9. An unmanned aerial vehicle landing method according to any of claims 1-8, wherein the template matching further comprises:
    determining a category of the scene in the reference image matched with the current flying height of the unmanned aerial vehicle, wherein the category of the scene comprises a scene with rich texture or a scene with sparse texture;
    extracting a feature map of the image shot by the unmanned aerial vehicle at the current flying height and a feature map of the reference image matched with the current flying height of the unmanned aerial vehicle according to the category of the scene;
    then, the template matching is performed on the image shot by the unmanned aerial vehicle at the current flying height and the reference image matched with the current flying height of the unmanned aerial vehicle, and the template matching includes:
    and carrying out template matching on the characteristic graph of the image shot by the unmanned aerial vehicle at the current flying height and the characteristic graph of the reference image.
  10. A method for unmanned aerial vehicle landing according to claim 9, wherein the determining the category of the scene within the reference image that matches the current altitude of flight of the unmanned aerial vehicle comprises:
    extracting a first-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle;
    acquiring a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the first-order gradient map;
    judging whether a value reflecting the texture richness degree of the scene in the gradient histogram is larger than a first preset value or not according to the gradient histogram;
    and if so, judging that the scene in the reference image is a texture-rich scene.
  11. A method for unmanned aerial vehicle descent according to claim 10, wherein the signature of the image taken by the unmanned aerial vehicle at the current flight altitude and the signature of the reference image matched to the current flight altitude of the unmanned aerial vehicle comprise first order gradient maps.
  12. An unmanned aerial vehicle landing method according to claim 11, wherein the feature map of the image taken by the unmanned aerial vehicle at the current flying height and the feature map of the reference image matched to the current flying height of the unmanned aerial vehicle further comprise a grey-scale map.
  13. An unmanned aerial vehicle landing method according to any of claims 10-12, wherein if the value in the histogram of gradients reflecting the richness of the texture of the scene is not greater than the first predetermined value, the method further comprises:
    acquiring a second-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle;
    acquiring a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the second-order gradient map;
    judging whether a value reflecting the texture richness degree of the scene in the gradient histogram is larger than a second preset value or not according to the gradient histogram;
    and if so, judging that the scene in the reference image is a texture sparse scene.
  14. An unmanned aerial vehicle landing method according to claim 13, wherein the feature map of the image taken by the unmanned aerial vehicle at the current flying height and the feature map of the reference image comprise second order gradient maps.
  15. An unmanned aerial vehicle landing method according to claim 13 or 14, wherein if the value reflecting the texture richness of the scene in the reference image in the gradient histogram is not greater than the second preset value, it is determined that the scene in the reference image is a non-texture scene.
  16. An unmanned aerial vehicle landing device, its characterized in that, the device includes:
    an acquisition module, used for acquiring images containing the flying point shot by the unmanned aerial vehicle at different flying heights in the take-off process of the unmanned aerial vehicle and taking the acquired images containing the flying point as reference images; and
    the template matching module is used for repeating the step of template matching in the landing process of the unmanned aerial vehicle until the unmanned aerial vehicle lands at the flying starting point; wherein the template matching module comprises:
    the acquisition module is used for acquiring an image shot by the unmanned aerial vehicle at the current flying height; and
    acquiring a reference image matched with the current flying height of the unmanned aerial vehicle;
    the matching module is used for performing template matching on an image shot by the unmanned aerial vehicle at the current flying height and a reference image matched with the current flying height of the unmanned aerial vehicle so as to obtain the coordinates of the flying point or the distance between the unmanned aerial vehicle and the flying point; and
    and the control module is used for controlling the unmanned aerial vehicle to fly to the flying point according to the coordinate of the flying point or the distance between the unmanned aerial vehicle and the flying point.
  17. An unmanned aerial vehicle landing device according to claim 16, wherein the acquisition module is specifically configured to:
    acquire an image containing the flying point at intervals of a preset distance.
  18. An unmanned aerial vehicle landing device according to claim 17, wherein the preset distance is 1 metre.
  19. An unmanned aerial vehicle landing device according to any of claims 16-18, wherein the acquisition module is further configured to:
    in the takeoff process of the unmanned aerial vehicle, stop acquiring images containing the flying point when the flying height of the unmanned aerial vehicle is greater than a preset height.
  20. An unmanned aerial vehicle landing device according to claim 19, wherein the preset height is 15 metres.
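As a rough illustration of claims 17-20, the sketch below collects one reference image per metre of climb and stops above the preset height; the drone methods are again hypothetical placeholders, and 1 m / 15 m are the example values from claims 18 and 20.

    def collect_reference_images(drone, preset_distance=1.0, preset_height=15.0):
        references = []
        next_height = preset_distance
        while drone.is_taking_off():
            height = drone.current_height()
            if height > preset_height:
                break                               # claim 19: stop above 15 m
            if height >= next_height:
                references.append((height, drone.capture_frame()))
                next_height += preset_distance      # claim 18: one image per metre
        return references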
  21. An unmanned aerial vehicle landing device according to any of claims 16-20, wherein the template matching module further comprises:
    the determining module is used for determining that the current flying height of the unmanned aerial vehicle is within a preset height range.
  22. An unmanned aerial vehicle landing device according to any of claims 16-21, wherein the reference image matched with the current flying height of the unmanned aerial vehicle is a reference image acquired at the same flying height as the current flying height during the takeoff process of the unmanned aerial vehicle.
  23. An unmanned aerial vehicle landing device according to any of claims 16-21, wherein the reference image matched with the current flying height of the unmanned aerial vehicle is a reference image acquired at the flying height closest to the current flying height during the takeoff process of the unmanned aerial vehicle.
  24. An unmanned aerial vehicle landing device according to any of claims 16-23, wherein the template matching module further comprises:
    the texture determination module is used for determining the category of the scene within the reference image matched with the current flying height of the unmanned aerial vehicle, wherein the category of the scene comprises a texture-rich scene or a texture-sparse scene; and
    the extraction module is used for extracting, according to the category of the scene, a feature map of the image shot by the unmanned aerial vehicle at the current flying height and a feature map of the reference image matched with the current flying height of the unmanned aerial vehicle; then:
    the matching module performs template matching on the feature map of the image shot by the unmanned aerial vehicle at the current flying height and the feature map of the reference image.
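One possible reading of the extraction module in claim 24, sketched in Python: texture-rich scenes are matched on first-order gradient maps plus a grey-scale map (claims 26-27), texture-sparse scenes on second-order gradient maps (claim 29); the feature used for non-texture scenes is not specified in the claims, so plain intensity is assumed here.

    import cv2
    import numpy as np

    def extract_feature_maps(gray, scene_category):
        if scene_category == "texture-rich":
            # First-order gradient magnitude plus the grey-scale map itself.
            gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
            gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
            return [cv2.magnitude(gx, gy), gray.astype(np.float32)]
        if scene_category == "texture-sparse":
            # Second-order gradient map (Laplacian).
            return [np.abs(cv2.Laplacian(gray, cv2.CV_32F))]
        return [gray.astype(np.float32)]            # assumed fallback: raw intensity

Template matching is then run feature map against feature map, rather than on the raw images.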
  25. An unmanned aerial vehicle landing device according to claim 24, wherein the texture determination module is specifically configured to:
    extract a first-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle;
    acquire a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the first-order gradient map;
    judge, according to the gradient histogram, whether a value reflecting the texture richness of the scene is greater than a first preset value; and
    if so, determine that the scene in the reference image is a texture-rich scene.
  26. An unmanned aerial vehicle landing device according to claim 25, wherein the feature map of the image taken by the unmanned aerial vehicle at the current flying height and the feature map of the reference image matched with the current flying height of the unmanned aerial vehicle comprise first-order gradient maps.
  27. An unmanned aerial vehicle landing device according to claim 26, wherein the feature map of the image taken by the unmanned aerial vehicle at the current flying height and the feature map of the reference image matched with the current flying height of the unmanned aerial vehicle further comprise a grey-scale map.
  28. An unmanned aerial vehicle landing device according to any of claims 25-27, wherein the texture determination module is further configured to:
    if the value reflecting the texture richness of the scene in the reference image in the gradient histogram is not greater than the first preset value, acquire a second-order gradient map of the reference image matched with the current flying height of the unmanned aerial vehicle;
    acquire a gradient histogram of the reference image matched with the current flying height of the unmanned aerial vehicle according to the second-order gradient map;
    judge, according to the gradient histogram, whether a value reflecting the texture richness of the scene is greater than a second preset value; and
    if so, determine that the scene in the reference image is a texture-sparse scene.
  29. An unmanned aerial vehicle landing device according to claim 28, wherein the feature map of the image taken by the unmanned aerial vehicle at the current flying height and the feature map of the reference image comprise second-order gradient maps.
  30. An unmanned aerial vehicle landing device according to claim 28 or 29, wherein the texture determination module is further configured to:
    if the value reflecting the texture richness of the scene in the reference image in the gradient histogram is not greater than the second preset value, determine that the scene in the reference image is a non-texture scene.
  31. An unmanned aerial vehicle, comprising:
    a fuselage;
    an arm connected with the fuselage;
    a power device arranged on the arm;
    a processor disposed within the fuselage or the arm; and
    a memory communicatively coupled to the processor, the memory disposed within the fuselage or the arm; wherein
    the memory stores instructions executable by the processor, and the instructions, when executed by the processor, implement the method of any one of claims 1-15.
  32. A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, causes the processor to carry out the method of any one of claims 1 to 15.
CN201880094799.1A 2018-07-05 2018-07-05 Unmanned aerial vehicle landing method and device and unmanned aerial vehicle Pending CN112789571A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/094664 WO2020006732A1 (en) 2018-07-05 2018-07-05 Unmanned aerial vehicle landing method and apparatus, and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN112789571A (en) 2021-05-11

Family

ID=69060717

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880094799.1A Pending CN112789571A (en) 2018-07-05 2018-07-05 Unmanned aerial vehicle landing method and device and unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN112789571A (en)
WO (1) WO2020006732A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110968107A (en) * 2019-10-25 2020-04-07 深圳市道通智能航空技术有限公司 Landing control method, aircraft and storage medium
CN113238574B (en) * 2021-05-08 2022-12-13 一飞(海南)科技有限公司 Cluster performance unmanned aerial vehicle landing detection control method, system, terminal and application

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
CN109643129A (en) * 2016-08-26 2019-04-16 深圳市大疆创新科技有限公司 The method and system of independent landing
WO2018053861A1 (en) * 2016-09-26 2018-03-29 SZ DJI Technology Co., Ltd. Methods and system for vision-based landing
CN106371447B (en) * 2016-10-25 2020-07-07 南京奇蛙智能科技有限公司 Control method for all-weather accurate landing of unmanned aerial vehicle
CN107065925B (en) * 2017-04-01 2020-04-07 成都通甲优博科技有限责任公司 Unmanned aerial vehicle return method and device
CN107943090A (en) * 2017-12-25 2018-04-20 广州亿航智能技术有限公司 The landing method and system of a kind of unmanned plane

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN105447853A (en) * 2015-11-13 2016-03-30 深圳市道通智能航空技术有限公司 Flight device, flight control system and flight control method
CN106054903A (en) * 2016-07-27 2016-10-26 中南大学 Multi-rotor unmanned aerial vehicle self-adaptive landing method and system
CN106774423A (en) * 2017-02-28 2017-05-31 亿航智能设备(广州)有限公司 The landing method and system of a kind of unmanned plane
CN107450590A (en) * 2017-08-07 2017-12-08 深圳市科卫泰实业发展有限公司 A kind of unmanned plane auxiliary landing method

Also Published As

Publication number Publication date
WO2020006732A1 (en) 2020-01-09

Similar Documents

Publication Publication Date Title
US11604479B2 (en) Methods and system for vision-based landing
US20200344464A1 (en) Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Defects
CN109765930B (en) Unmanned aerial vehicle vision navigation
CN109324337B (en) Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle
CN107544550B (en) Unmanned aerial vehicle automatic landing method based on visual guidance
WO2019040804A1 (en) Systems and methods for improving performance of a robotic vehicle by managing on-board camera obstructions
CN107240063A (en) A kind of autonomous landing method of rotor wing unmanned aerial vehicle towards mobile platform
CN108140245B (en) Distance measurement method and device and unmanned aerial vehicle
WO2020181719A1 (en) Unmanned aerial vehicle control method, unmanned aerial vehicle, and system
CN109460046B (en) Unmanned aerial vehicle natural landmark identification and autonomous landing method
CN110282135B (en) Accurate pesticide spraying system and method for plant protection unmanned aerial vehicle
CN106292126B (en) A kind of intelligence aerial survey flight exposal control method, unmanned aerial vehicle (UAV) control method and terminal
CN110914780B (en) Unmanned aerial vehicle operation plan creation system, method, and program
CN104808685A (en) Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
CN110570463B (en) Target state estimation method and device and unmanned aerial vehicle
US11922819B2 (en) System and method for autonomously landing a vertical take-off and landing (VTOL) aircraft
CN109341686B (en) Aircraft landing pose estimation method based on visual-inertial tight coupling
CN109857128B (en) Unmanned aerial vehicle vision fixed-point landing method, system, equipment and storage medium
CN112789571A (en) Unmanned aerial vehicle landing method and device and unmanned aerial vehicle
CN114564042A (en) Unmanned aerial vehicle landing method based on multi-sensor fusion
US20220139078A1 (en) Unmanned aerial vehicle, communication method, and program
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
CN113961021A (en) Power inspection unmanned aerial vehicle autonomous take-off and landing method based on two-dimensional code positioning
CN112119428A (en) Method, device, unmanned aerial vehicle, system and storage medium for acquiring landing position
CN113655803A (en) System and method for calibrating course of rotor unmanned aerial vehicle in tunnel environment based on vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210511