CN109857128B - Unmanned aerial vehicle vision fixed-point landing method, system, equipment and storage medium - Google Patents


Info

Publication number
CN109857128B
CN109857128B (application CN201811552919.6A)
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
point
image
landing
Prior art date
Legal status
Active
Application number
CN201811552919.6A
Other languages
Chinese (zh)
Other versions
CN109857128A (en)
Inventor
毛曙源 (Mao Shuyuan)
Current Assignee
Fengyi Technology Shenzhen Co ltd
Original Assignee
Fengyi Technology Shenzhen Co ltd
Priority date
Filing date
Publication date
Application filed by Fengyi Technology Shenzhen Co ltd filed Critical Fengyi Technology Shenzhen Co ltd
Priority to CN201811552919.6A priority Critical patent/CN109857128B/en
Publication of CN109857128A publication Critical patent/CN109857128A/en
Application granted granted Critical
Publication of CN109857128B publication Critical patent/CN109857128B/en

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a visual fixed-point landing method for an unmanned aerial vehicle, comprising the following steps: S1, collecting a ground image I_0 with the downward-looking camera of the unmanned aerial vehicle and selecting an image point u_0 = (u_x, u_y) on it as the landing point; S2, according to the position and attitude of the unmanned aerial vehicle, with the takeoff position of the unmanned aerial vehicle as the origin, calculating the position P = [x_w, y_w, z_w]^T of the landing point in the world coordinate system; S3, taking the position P = [x_w, y_w, z_w]^T as the target point, adjusting the position of the unmanned aerial vehicle and flying toward the target point; S4, matching the current frame image captured during flight against the ground image I_0 and estimating the position u_t of the landing point on the current image frame; S5, according to the position u_t, recalculating the position of the landing point in the world coordinate system and updating the target point, and repeating steps S2-S5 during flight until the unmanned aerial vehicle flies above the target point and lands. The method improves the precision of autonomous landing of the unmanned aerial vehicle.

Description

Unmanned aerial vehicle vision fixed-point landing method, system, equipment and storage medium
Technical Field
The invention relates to the technical field of unmanned aerial vehicle flight, in particular to a visual fixed-point landing method for an unmanned aerial vehicle.
Background
Fixed-point landing of a multi-rotor unmanned aerial vehicle means that, while the vehicle flies at high altitude, a ground image is collected through its downward-looking camera, a landing target point is selected manually on that image, and the vehicle then flies autonomously to above the target point and lands accurately at the selected landing point.
Unmanned aerial vehicle fixed-point landing can be applied to emergency rescue, automated logistics handover, forced landing in emergency situations, and the like.
The existing fixed-point landing scheme estimates the position of the landing point from the attitude of the unmanned aerial vehicle and its current height above ground, and uses that position as a target point to control the unmanned aerial vehicle to fly above it and land.
The existing method has the following limitations:
1. The landing-point position is estimated from the attitude and height above ground of the unmanned aerial vehicle; this estimate has poor precision, with the error coming mainly from errors in the attitude estimate and in the measured height above ground.
2. After the landing point is estimated, image information is no longer used: the position is estimated from a single initial observation of the target point, the target is never observed again, and the position is subsequently adjusted using inertial navigation alone.
Disclosure of Invention
To solve these problems, the invention provides a visual fixed-point landing method, system, device and storage medium for an unmanned aerial vehicle, which improve the accuracy of the landing-point position estimated from a single image frame and improve landing precision by tracking the landing-point position across consecutive image frames during descent.
The unmanned aerial vehicle visual fixed-point landing method of the invention comprises the following steps: S1, collecting a ground image I_0 with the downward-looking camera of the unmanned aerial vehicle and selecting an image point u_0 = (u_x, u_y) on it as the landing point; S2, according to the position and attitude of the unmanned aerial vehicle, with the takeoff position of the unmanned aerial vehicle as the origin, calculating the position P = [x_w, y_w, z_w]^T of the landing point in the world coordinate system; S3, taking the position P = [x_w, y_w, z_w]^T as the target point, adjusting the position of the unmanned aerial vehicle and flying toward the target point; S4, matching the current frame image captured during flight against the ground image I_0 and estimating the position u_t of the landing point on the current image frame; S5, according to the position u_t, recalculating the position of the landing point in the world coordinate system and updating the target point, and repeating steps S2-S5 during flight until the unmanned aerial vehicle flies above the target point and lands.
Preferably, the position P = [x_w, y_w, z_w]^T of the landing point in the world coordinate system, calculated from the position and attitude of the unmanned aerial vehicle with the takeoff position as the origin, is

P = t_c + (h / n_z) · N,

where the world coordinate system takes the takeoff position of the unmanned aerial vehicle as its origin, with the x axis pointing north, the y axis pointing east and the z axis pointing toward the center of the earth; the intermediate variable N = [n_x, n_y, n_z]^T is the unit direction vector, in the world coordinate system, of the ray from the camera center to the target point; h is the current height of the unmanned aerial vehicle above the ground; and t_c = [x_c, y_c, z_c]^T is the position of the unmanned aerial vehicle in the world coordinate system.
Preferably, the intermediate variable N = [n_x, n_y, n_z]^T is calculated as

N = R_c · v / ‖R_c · v‖, where v = [(u_x − c_x)/f_x, (u_y − c_y)/f_y, 1]^T,

in which R_c is the attitude of the unmanned aerial vehicle in the world coordinate system, a 3×3 rotation matrix; f_x, f_y are the focal lengths of the camera and (c_x, c_y) is the optical center.
Preferably, the method for calculating the current height h of the unmanned aerial vehicle above the ground comprises: obtaining the barometer height h_0 at the moment the unmanned aerial vehicle takes off; obtaining the barometer height h_t during flight and measuring the height h_tof of the unmanned aerial vehicle above the ground with a sensor; when the sensor is valid, taking the height above ground as h = h_tof and updating h_0 = h_t − h_tof from the current h_t; when the sensor is invalid, taking the height above ground as h = h_t − h_0.
Preferably, the sensor for measuring the ground height of the unmanned aerial vehicle is a downward-looking single-point sensor or an ultrasonic sensor.
Preferably, the control method that takes the position P = [x_w, y_w, z_w]^T as the target point, adjusts the position of the unmanned aerial vehicle and flies toward the target point comprises:

calculating the error e between the target position P = [x_w, y_w, z_w]^T and the current position t_b = [x_b, y_b, z_b]^T,

e = P − t_b,

and calculating the control quantity from the error e,

v = f(e),

where v is the commanded velocity of the unmanned aerial vehicle and f is an error mapping function.
Preferably, the error mapping function f employs a proportional control algorithm:
f(e)=k×e,
wherein k is a proportionality coefficient.
Preferably, matching the current frame image captured during flight against the ground image I_0 and estimating the position u_t of the landing point on the current frame image comprises: selecting an image block in the ground image I_0 centered on u_0 and extracting feature points from the image block; then extracting feature points from the current frame image and matching them against the feature points extracted from the image block.
Preferably, estimating the position u_t of the landing point on the current image frame comprises: from the matched feature-point pairs {(u_i^{t−1}, u_i^t)} between frame t−1 and the current frame t, computing the homography matrix H such that u_i^t ≃ H·u_i^{t−1} in homogeneous coordinates for every matched pair; then, from the landing-point position u_{t−1} in frame t−1, inferring the landing-point position in frame t as u_t = H·u_{t−1}.
The invention also provides an unmanned aerial vehicle visual fixed-point landing system, comprising: a target selection unit configured to select the landing point u_0 = (u_x, u_y) on a ground image I_0 acquired by the downward-looking camera of the unmanned aerial vehicle; a position calculation unit configured to calculate, according to the position and attitude of the unmanned aerial vehicle and with the takeoff position as the origin, the position P = [x_w, y_w, z_w]^T of the landing point in the world coordinate system; a position adjusting unit configured to take the position P = [x_w, y_w, z_w]^T as the target point, adjust the position of the unmanned aerial vehicle and fly toward the target point; an image matching unit configured to match the current frame image captured during flight against the ground image I_0 and estimate the position u_t of the landing point on the current image frame; and a position updating unit configured to recalculate, according to the position u_t, the position of the landing point in the world coordinate system and update the target point until the unmanned aerial vehicle lands.
The invention also provides a computer-readable storage medium on which a computer program is stored which, when executed by a processor, carries out the steps of the method described above.
The unmanned aerial vehicle visual fixed-point landing method provided by the invention calculates the position of the landing point in the world coordinate system, adjusts the unmanned aerial vehicle to fly toward the target point, and recalculates the landing-point position on consecutive image frames during flight to continuously update the target point until the unmanned aerial vehicle flies above the target point and lands. This improves the precision of the landing-position estimate and makes autonomous landing of the unmanned aerial vehicle more accurate and intelligent.
Drawings
Preferred embodiments of the present invention will now be described with reference to the accompanying drawings, which are provided to illustrate the preferred embodiments of the invention and not to limit it. In the accompanying drawings:
fig. 1 is a general flow chart of a visual fixed-point landing method of an unmanned aerial vehicle according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a landing point position estimation according to an embodiment of the present invention.
Detailed Description
The present invention is described in detail through the following embodiments, but is not limited to these embodiments.
Fig. 1 is a general flow chart of the visual fixed-point landing method of the unmanned aerial vehicle according to the embodiment of the present invention.
As shown in fig. 1, the visual fixed-point landing method for the unmanned aerial vehicle according to the embodiment of the present invention includes the following steps:
step S1, collecting ground image I through down-looking camera of unmanned aerial vehicle0On which an image point u is selected0=(ux,uy) As a drop point.
Unmanned aerial vehicle flies at high altitude, and ground image I is acquired through downward-looking camera0Selecting an image point u on the ground image0=(ux,uy) As landing target points. The image coordinate system is defined as a 2D coordinate system with the original upper left corner, the x axis towards the right and the y axis towards the bottom, and uxIs the coordinate position of an image point on the x-axis, uyIs the coordinate position of the image point on the y-axis.
Step S2: according to the position and attitude of the unmanned aerial vehicle, with the takeoff position of the unmanned aerial vehicle as the origin, calculate the position P = [x_w, y_w, z_w]^T of the landing point in the world coordinate system.
FIG. 2 is a schematic diagram of a landing point position estimation according to an embodiment of the present invention.
The position P = [x_w, y_w, z_w]^T of the landing point in the world coordinate system is calculated as

P = t_c + (h / n_z) · N,

where the world coordinate system takes the takeoff position of the unmanned aerial vehicle as its origin, with the x axis pointing north, the y axis pointing east and the z axis pointing toward the center of the earth; the intermediate variable N = [n_x, n_y, n_z]^T is the unit direction vector, in the world coordinate system, of the ray from the camera center to the target point; h is the current height of the unmanned aerial vehicle above the ground; and t_c = [x_c, y_c, z_c]^T is the position of the unmanned aerial vehicle in the world coordinate system. Since the z axis points downward and the landing point lies on the ground plane a height h below the unmanned aerial vehicle, the ray t_c + λ·N reaches the ground when λ·n_z = h, i.e. λ = h/n_z.
The intermediate variable N = [n_x, n_y, n_z]^T is calculated as

N = R_c · v / ‖R_c · v‖, where v = [(u_x − c_x)/f_x, (u_y − c_y)/f_y, 1]^T,

in which R_c is the attitude of the unmanned aerial vehicle in the world coordinate system, a 3×3 rotation matrix; f_x, f_y are the focal lengths of the drone's downward-looking camera and (c_x, c_y) is its optical center.
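As a minimal numerical sketch of step S2 under the formulas above, the following Python function back-projects the selected pixel to the world-frame landing point. The function name, the NumPy representation, and the packing of f_x, f_y, c_x, c_y into a 3×3 intrinsics matrix K are illustrative assumptions, not part of the patent.

import numpy as np

def landing_point_world(u0, K, R_c, t_c, h):
    # Back-project the chosen pixel u0 = (u_x, u_y) to the landing point
    # P = t_c + (h / n_z) * N in the world frame (x north, y east, z down).
    u_x, u_y = u0
    f_x, f_y = K[0][0], K[1][1]
    c_x, c_y = K[0][2], K[1][2]
    v = np.array([(u_x - c_x) / f_x, (u_y - c_y) / f_y, 1.0])
    N = R_c @ v
    N = N / np.linalg.norm(N)   # unit ray direction [n_x, n_y, n_z]
    lam = h / N[2]              # ray reaches the ground when lam * n_z = h
    return t_c + lam * N        # P = [x_w, y_w, z_w]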
The method for calculating the current height h of the unmanned aerial vehicle relative to the ground, namely a height estimation algorithm, comprises the following steps:
obtain barometer height h at unmanned aerial vehicle take-off moment0. Height h of barometer0Can be obtained by a barometer sensor mounted on the drone.
During flight, obtain the barometer height h_t and measure the height h_tof of the drone above the ground with a sensor. The sensor measuring the height above ground may be a downward-looking single-point sensor or an ultrasonic sensor. Because the measuring range of such a sensor is limited, it may fail when the drone is too high or too low above the ground.
When the sensor is valid, the height of the drone above the ground is h = h_tof, and h_0 is updated from the current barometer height h_t by the formula h_0 = h_t − h_tof.
When the sensor fails, the height of the drone above the ground is obtained from the barometer difference

h = h_t − h_0.
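A minimal sketch of this height-estimation logic, assuming for illustration that the range sensor reports None when out of range and that the barometer offset h_0 is kept in a small state dict initialized at takeoff (state = {"h_0": h_0}):

def height_above_ground(h_t, h_tof, state):
    # h_t: current barometer height; h_tof: range-sensor height above ground,
    # or None when the sensor is out of range; state["h_0"]: barometer height
    # of the ground, set at takeoff.
    if h_tof is not None:              # sensor valid: use its reading directly
        state["h_0"] = h_t - h_tof     # re-anchor the barometer offset
        return h_tof
    return h_t - state["h_0"]          # sensor invalid: barometer difference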
As shown in fig. 2, the current height h above the ground is computed by the height-estimation algorithm, and the position P of the landing point in the world coordinate system is computed from the position and attitude of the drone, giving the target point for the flight.
Step S3: take the position P = [x_w, y_w, z_w]^T as the target point, adjust the position of the unmanned aerial vehicle, and fly toward the target point.
The control method for flying toward the target point is as follows:

compute the error e between the target position P = [x_w, y_w, z_w]^T and the current position t_b = [x_b, y_b, z_b]^T,

e = P − t_b,

then compute the control quantity from the error e,

v = f(e),

where v is the commanded velocity of the unmanned aerial vehicle and f is an error mapping function.
Error control algorithms include proportional-integral and proportional-integral-derivative control; the error mapping function f in this embodiment uses a proportional control law:

f(e) = k·e,

where k is the proportional gain.
The drone is controlled to fly toward the target point P computed in step S2 by this proportional control law, with the control quantity computed from the error e, which improves the precision with which the drone flies to the target point.
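A sketch of this proportional position controller; the gain value and the speed saturation are illustrative assumptions, since the patent specifies only v = f(e) = k·e:

import numpy as np

def velocity_command(P, t_b, k=0.5, v_max=2.0):
    # Proportional control law v = k * e with position error e = P - t_b.
    e = P - t_b
    v = k * e
    return np.clip(v, -v_max, v_max)   # assumed speed limit, not in the patent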
Step S4: match the current frame image captured during flight against the ground image I_0 and estimate the position u_t of the landing point on the current image frame.

Matching the current frame image against the ground image I_0 comprises: selecting an image block in the ground image I_0 centered on u_0 and extracting feature points from it; then extracting feature points from the current frame image and matching them against the feature points extracted from the image block.
The selected image block may be rectangular, circular or another shape; visual features are extracted from the image block with extraction algorithms including, but not limited to, SIFT, SURF, FAST and ORB.
This embodiment uses the ORB feature extraction and description method. The running time of the ORB algorithm is far better than that of SIFT and SURF, so it can be used for real-time feature detection. ORB is a feature-point detection and description technique based on FAST corners; it has scale and rotation invariance, and is also robust to noise and perspective/affine change.
ORB feature detection is divided into two steps, feature extraction and feature description.

First, oriented FAST feature points are detected. FAST corner detection is a fast corner-feature detection algorithm based on machine learning; oriented FAST examines the 16 pixels on the circle around an interest point and determines whether the central pixel is a corner by whether a sufficient run of those pixels is consistently darker or brighter than it. FAST is implemented with an acceleration step that first orders the point set on the circle, which greatly optimizes the computation.
Secondly, BRIEF feature description;
the key point information of the features extracted from the image is usually only the position information (possibly including scale and direction information) of the features in the image, and the matching of the feature points cannot be well performed only by using the information, so that more detailed information is needed to distinguish the features, which is a feature descriptor. In addition, the change of the scale and the direction of the image caused by the change of the visual angle can be eliminated through the feature descriptor, and the matching between the images can be better realized.
The BRIEF descriptor mainly forms small interest areas by randomly selecting a plurality of points in the area around the interest points, binarizes the gray levels of the small interest areas and analyzes the small interest areas into binary code strings, takes string characteristics as descriptors of the characteristic points, selects the areas near the key points and compares the strength of each bit, and then judges whether the current key point code is 0 or 1 according to two binary points in an image block. Because all codes of the BRIEF descriptor are binary numbers, the storage space of the computer is saved.
Feature points are extracted, by the ORB method above, from an image block selected in the ground image I_0 centered on u_0.
Feature points are then extracted from the current frame image and matched against the feature points extracted from the image block. During flight, the current frame image is matched against the ground image I_0 acquired in step S1 by an image feature matching algorithm. Image feature matching algorithms include, but are not limited to, optical-flow tracking and SIFT feature matching. After matching is completed, the position of the landing point on the current image frame is estimated; estimation methods include, but are not limited to, the DLT (direct linear transform) algorithm.
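A sketch of this matching step using OpenCV's ORB implementation; the patch half-size and the brute-force Hamming matcher are illustrative choices within the algorithms the description names:

import cv2

def match_to_reference(I0, u0, frame, half=100):
    # ORB features from an image block of I0 centered on u0, matched against
    # ORB features of the current frame.
    x, y = int(u0[0]), int(u0[1])
    x0, y0 = max(0, x - half), max(0, y - half)
    patch = I0[y0:y + half, x0:x + half]

    orb = cv2.ORB_create()
    kp_ref, des_ref = orb.detectAndCompute(patch, None)
    kp_cur, des_cur = orb.detectAndCompute(frame, None)

    # Binary descriptors -> Hamming distance; cross-check keeps mutual matches
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_cur)

    # Matched pixel pairs, with patch coordinates shifted back into I0
    return [((kp_ref[m.queryIdx].pt[0] + x0, kp_ref[m.queryIdx].pt[1] + y0),
             kp_cur[m.trainIdx].pt) for m in matches]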
Estimating the position u_t of the landing point on the current image frame comprises:

from the matched feature-point pairs {(u_i^{t−1}, u_i^t)} between frame t−1 and the current frame t, computing the homography matrix H such that, in homogeneous coordinates,

u_i^t ≃ H·u_i^{t−1} for every matched pair i.

Thus, from the landing-point position u_{t−1} in frame t−1, the landing-point position in frame t is inferred as u_t = H·u_{t−1}.
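A sketch of the homography fit and landing-point transfer; using RANSAC inside cv2.findHomography is an assumed robustness choice, while the transfer u_t = H·u_{t−1} follows the formula above:

import cv2
import numpy as np

def track_landing_point(pairs, u_prev):
    # pairs: matched pixel pairs [(u_i^{t-1}, u_i^t), ...]; u_prev: u_{t-1}.
    src = np.float32([p for p, _ in pairs]).reshape(-1, 1, 2)
    dst = np.float32([q for _, q in pairs]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    # Apply H to u_{t-1} in homogeneous coordinates, then de-homogenize
    u = H @ np.array([u_prev[0], u_prev[1], 1.0])
    return u[:2] / u[2]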
Step S5: according to the position u_t, recalculate the position of the landing point in the world coordinate system and update the target point; repeat steps S2-S5 throughout the flight until the unmanned aerial vehicle flies above the target point and lands.

Based on the position u_t computed in step S4, the landing-point position is recalculated and the target point updated by the computation of step S2. Steps S2-S5 are repeated throughout the flight until the unmanned aerial vehicle flies above the target point and lands.
The invention also provides an unmanned aerial vehicle visual fixed-point landing system, comprising: a target selection unit configured to select the landing point u_0 = (u_x, u_y) on a ground image I_0 acquired by the downward-looking camera of the unmanned aerial vehicle; a position calculation unit configured to calculate, according to the position and attitude of the unmanned aerial vehicle and with the takeoff position as the origin, the position P = [x_w, y_w, z_w]^T of the landing point in the world coordinate system; a position adjusting unit configured to take the position P = [x_w, y_w, z_w]^T as the target point, adjust the position of the unmanned aerial vehicle and fly toward the target point; an image matching unit configured to match the current frame image captured during flight against the ground image I_0 and estimate the position u_t of the landing point on the current image frame; and a position updating unit configured to recalculate, according to the position u_t, the position of the landing point in the world coordinate system and update the target point until the unmanned aerial vehicle lands.
The invention also provides a computer-readable storage medium on which a computer program is stored which, when executed by a processor, carries out the steps of the method described above.
With the above method, system, device and storage medium, the position of the target point in the world coordinate system is continuously updated from the drone's current position during flight, so that the drone continuously adjusts its flight toward the target point until it arrives above the target point and lands, which improves the landing precision of the unmanned aerial vehicle.
The above embodiments are preferred embodiments of the present invention, and are not intended to limit the scope of the present invention, and all modifications and substitutions that are within the spirit and principle of the present invention are intended to be protected by the present invention.

Claims (8)

1. An unmanned aerial vehicle vision fixed point landing method is characterized by comprising the following steps:
S1, collecting a ground image I_0 with the downward-looking camera of the unmanned aerial vehicle and selecting an image point u_0 = (u_x, u_y) on it as the landing point;

S2, according to the position and attitude of the unmanned aerial vehicle, with the takeoff position of the unmanned aerial vehicle as the origin, calculating the position P = [x_w, y_w, z_w]^T of the landing point in the world coordinate system;

S3, taking the position P = [x_w, y_w, z_w]^T as the target point, adjusting the position of the unmanned aerial vehicle and flying toward the target point;

S4, matching the current frame image captured during flight against the ground image I_0 and estimating the position u_t of the landing point on the current image frame;

S5, according to the position u_t, recalculating the position of the landing point in the world coordinate system and updating the target point, and repeating steps S2-S5 during flight until the unmanned aerial vehicle flies above the target point and lands,

wherein matching the current frame image captured during flight against the ground image I_0 and estimating the position u_t of the landing point on the current image frame comprises:

selecting an image block in the ground image I_0 centered on u_0 and extracting feature points from the image block;

extracting feature points from the current frame image and matching them against the feature points extracted from the image block,

and estimating the position u_t of the landing point on the current image frame comprises:

from the matched feature-point pairs {(u_i^{t−1}, u_i^t)} between frame t−1 and the current frame t, computing the homography matrix H such that u_i^t ≃ H·u_i^{t−1} in homogeneous coordinates for every matched pair;

according to the landing-point position u_{t−1} in frame t−1, inferring the landing-point position in frame t as u_t = H·u_{t−1}.
2. An unmanned aerial vehicle visual fixed-point landing method according to claim 1,
wherein, according to the position and attitude of the unmanned aerial vehicle, with the takeoff position of the unmanned aerial vehicle as the origin, the position P = [x_w, y_w, z_w]^T of the landing point in the world coordinate system is calculated as

P = t_c + (h / n_z) · N,

wherein:

the world coordinate system takes the takeoff position of the unmanned aerial vehicle as its origin, the x axis points north, the y axis points east and the z axis points toward the center of the earth;

the intermediate variable N = [n_x, n_y, n_z]^T is the unit direction vector, in the world coordinate system, of the ray from the camera center to the target point;

h is the current height of the unmanned aerial vehicle above the ground;

t_c = [x_c, y_c, z_c]^T is the position of the unmanned aerial vehicle in the world coordinate system.
3. An unmanned aerial vehicle visual fixed-point landing method according to claim 2,
wherein the intermediate variable N = [n_x, n_y, n_z]^T is calculated as

N = R_c · v / ‖R_c · v‖, where v = [(u_x − c_x)/f_x, (u_y − c_y)/f_y, 1]^T,

wherein:

R_c is the attitude of the unmanned aerial vehicle in the world coordinate system, a 3×3 rotation matrix;

f_x, f_y are the focal lengths of the camera and (c_x, c_y) is the optical center.
4. An unmanned aerial vehicle visual fixed-point landing method according to claim 2,
wherein the method for calculating the current height h of the unmanned aerial vehicle above the ground comprises:

obtaining the barometer height h_0 at the moment the unmanned aerial vehicle takes off;

obtaining the barometer height h_t during flight and measuring the height h_tof of the unmanned aerial vehicle above the ground with a sensor;

when the sensor is valid, taking the height above ground as h = h_tof and updating h_0 = h_t − h_tof from the current h_t;

when the sensor is invalid, taking the height above ground as h = h_t − h_0.
5. An unmanned aerial vehicle visual fixed-point landing method according to claim 4,
the sensor for measuring the ground height of the unmanned aerial vehicle is a downward-looking single-point sensor or an ultrasonic sensor.
6. An unmanned aerial vehicle visual fixed-point landing method according to claim 3,
wherein the control method that takes the position P = [x_w, y_w, z_w]^T as the target point, adjusts the position of the unmanned aerial vehicle and flies toward the target point comprises:

calculating the error e between the target position P = [x_w, y_w, z_w]^T and the current position t_b = [x_b, y_b, z_b]^T,

e = P − t_b,

and calculating the control quantity from the error e,

v = f(e),

wherein v is the commanded velocity of the unmanned aerial vehicle and f is an error mapping function.
7. An unmanned aerial vehicle visual fixed-point landing system, characterized by comprising:

a target selection unit configured to select the landing point u_0 = (u_x, u_y) on a ground image I_0 acquired by the downward-looking camera of the unmanned aerial vehicle;

a position calculation unit configured to calculate, according to the position and attitude of the unmanned aerial vehicle and with the takeoff position of the unmanned aerial vehicle as the origin, the position P = [x_w, y_w, z_w]^T of the landing point in the world coordinate system;

a position adjusting unit configured to take the position P = [x_w, y_w, z_w]^T as the target point, adjust the position of the unmanned aerial vehicle, and fly toward the target point;

an image matching unit configured to match the current frame image captured during flight against the ground image I_0 and estimate the position u_t of the landing point on the current image frame;

a position updating unit configured to recalculate, according to the position u_t, the position of the landing point in the world coordinate system and update the target point until the unmanned aerial vehicle lands,

wherein matching the current frame image captured during flight against the ground image I_0 and estimating the position u_t of the landing point on the current image frame comprises:

selecting an image block in the ground image I_0 centered on u_0 and extracting feature points from the image block;

extracting feature points from the current frame image and matching them against the feature points extracted from the image block,

and estimating the position u_t of the landing point on the current image frame comprises:

from the matched feature-point pairs {(u_i^{t−1}, u_i^t)} between frame t−1 and the current frame t, computing the homography matrix H such that u_i^t ≃ H·u_i^{t−1} in homogeneous coordinates for every matched pair;

according to the landing-point position u_{t−1} in frame t−1, inferring the landing-point position in frame t as u_t = H·u_{t−1}.
8. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN201811552919.6A 2018-12-18 2018-12-18 Unmanned aerial vehicle vision fixed-point landing method, system, equipment and storage medium Active CN109857128B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811552919.6A CN109857128B (en) 2018-12-18 2018-12-18 Unmanned aerial vehicle vision fixed-point landing method, system, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN109857128A CN109857128A (en) 2019-06-07
CN109857128B true CN109857128B (en) 2022-07-15

Family

ID=66891391







Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210713

Address after: 518000 5th floor, block B, building 1, software industry base, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: Fengyi Technology (Shenzhen) Co.,Ltd.

Address before: 518000 Xuefu Road (south) and Baishi Road (east) intersection of Nanshan District, Shenzhen City, Guangdong Province, 6-13 floors, Block B, Shenzhen Software Industry Base

Applicant before: SF TECHNOLOGY Co.,Ltd.

GR01 Patent grant