CN114489140A - Unmanned aerial vehicle accurate autonomous take-off and landing method in non-identification environment - Google Patents

Unmanned aerial vehicle accurate autonomous take-off and landing method in non-identification environment

Info

Publication number
CN114489140A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
landing
take
ground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210142956.XA
Other languages
Chinese (zh)
Inventor
耿虎军
仇梓峰
高峰
王港
柴兴华
李晨阳
熊恒斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETC 54 Research Institute
Original Assignee
CETC 54 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CETC 54 Research Institute filed Critical CETC 54 Research Institute
Priority to CN202210142956.XA
Publication of CN114489140A

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a precise autonomous take-off and landing method for an unmanned aerial vehicle in an environment without markers, belonging to the technical field of unmanned aerial vehicle take-off and landing. First, during take-off, the unmanned aerial vehicle records a multi-level feature group and the height above ground at a frequency of 30 Hz, forming a take-off and landing spatio-temporal correspondence matrix. Then, during landing, the unmanned aerial vehicle captures images of the ground in real time through a vertically downward camera, performs re-identification and positioning against the matrix recorded at take-off, and outputs the position information of the unmanned aerial vehicle to the flight controller. Finally, the flight controller adjusts the attitude of the unmanned aerial vehicle in real time according to the position information until the unmanned aerial vehicle lands precisely on the original take-off point. The invention achieves take-off and autonomous landing with pixel-level precision using only a GPS and a camera, without setting special markers in the environment.

Description

Unmanned aerial vehicle accurate autonomous take-off and landing method in non-identification environment
Technical Field
The invention relates to the technical field of unmanned aerial vehicle take-off and landing, and in particular to a precise autonomous take-off and landing method for an unmanned aerial vehicle in an environment without markers.
Background
At present, in the field of unmanned aerial vehicle take-off and landing, the real-time position of the unmanned aerial vehicle is generally determined either by satellite positioning technologies such as GPS, BeiDou and RTK, or by recognizing a specific identification code, and landing control is performed on that basis. However, satellite positioning suffers from poor positioning accuracy and susceptibility to electromagnetic interference, while recognizing a specific identification code suffers from cumbersome field deployment and from the identification code being easily occluded or lost.
In recent years, visual positioning systems have been used for unmanned aerial vehicle navigation, generally by constructing an environment map with multi-view cameras in order to estimate the attitude change of the unmanned aerial vehicle. However, this approach requires high computational power, and constant changes in the environment cause the accumulated error to grow continuously, so it cannot be applied to the precise landing phase of the unmanned aerial vehicle.
Disclosure of Invention
In view of the above, the invention provides a precise autonomous take-off and landing method for an unmanned aerial vehicle in an environment without markers.
In order to achieve this purpose, the invention adopts the following technical scheme:
A precise autonomous take-off and landing method for an unmanned aerial vehicle in an environment without markers comprises the following steps:
(1) during take-off, the unmanned aerial vehicle captures ground images in real time through a vertically downward camera, records a multi-level feature group, the height above ground and the take-off point coordinate at a frequency of 30 Hz, and forms a take-off and landing spatio-temporal correspondence matrix;
(2) during landing, the unmanned aerial vehicle captures images of the ground in real time through the vertically downward camera, performs re-identification and positioning against the take-off and landing spatio-temporal correspondence matrix, and outputs the position information of the unmanned aerial vehicle to the flight controller;
(3) the flight controller adjusts the attitude of the unmanned aerial vehicle in real time according to the position information;
(4) steps (2) and (3) are repeated until the unmanned aerial vehicle lands precisely on the original take-off point.
Further, the multi-level feature group comprises basic point-line features and a deep global feature; the height above ground is output by a GPS module in the unmanned aerial vehicle and corresponds one-to-one with the multi-level feature groups; the take-off point coordinate consists of the longitude and latitude of the unmanned aerial vehicle; the take-off and landing spatio-temporal correspondence matrix comprises the take-off point coordinate and one or more 30-frame-per-second time sequences, each containing multi-level feature groups and heights above ground in one-to-one correspondence.
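By way of illustration, a minimal Python sketch of one possible in-memory representation of this matrix; the class and field names (MultiLevelFeatureGroup, Sequence, STCMatrix) are invented for this sketch, as the patent does not prescribe any data structure:

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class MultiLevelFeatureGroup:
    """One multi-level feature group: basic point-line features plus a deep global feature."""
    point_line: np.ndarray   # fused point/line descriptor vector
    deep_global: np.ndarray  # CNN global feature vector

@dataclass
class Sequence:
    """One 30-frame-per-second time sequence recorded during take-off."""
    ground_height: float                          # height above ground (from GPS) for this sequence
    feature_groups: List[MultiLevelFeatureGroup]  # 30 feature groups, one per frame

@dataclass
class STCMatrix:
    """Take-off and landing spatio-temporal correspondence matrix."""
    takeoff_lon: float   # take-off point longitude
    takeoff_lat: float   # take-off point latitude
    sequences: List[Sequence] = field(default_factory=list)

    def nearest_sequence(self, height: float) -> Sequence:
        """Look up the recorded sequence whose ground height is closest to `height`."""
        return min(self.sequences, key=lambda s: abs(s.ground_height - height))
```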
Further, the basic point-line features are obtained by processing the ground image with a point feature extraction algorithm and a line feature extraction algorithm and fusing the resulting features; the deep global feature is obtained by extracting global information from the ground image with a convolutional neural network.
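A sketch of how such features could be extracted, assuming ORB for point features, the FastLineDetector from opencv-contrib (cv2.ximgproc) for line features, and a torchvision ResNet-18 backbone for the deep global feature; the patent does not name specific algorithms, so all three are stand-ins:

```python
import cv2
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T

# Stand-in extractors: the patent only requires "a point feature extraction
# algorithm", "a line feature extraction algorithm" and "a convolutional
# neural network" without naming them.
orb = cv2.ORB_create(nfeatures=500)
fld = cv2.ximgproc.createFastLineDetector()  # requires opencv-contrib-python
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()            # keep the 512-d pooled feature
backbone.eval()
preprocess = T.Compose([T.ToTensor(), T.Resize((224, 224), antialias=True)])

def extract_feature_group(bgr_image: np.ndarray):
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    # Basic point features: ORB descriptors, mean-pooled into one vector.
    _, desc = orb.detectAndCompute(gray, None)
    point_vec = desc.mean(axis=0) if desc is not None else np.zeros(32)
    # Basic line features: endpoints of detected segments, mean-pooled.
    lines = fld.detect(gray)
    line_vec = lines.reshape(-1, 4).mean(axis=0) if lines is not None else np.zeros(4)
    # Series fusion (concatenation) of the basic point-line features.
    point_line = np.concatenate([point_vec, line_vec])
    # Deep global feature from the CNN backbone.
    rgb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        deep_global = backbone(preprocess(rgb).unsqueeze(0)).squeeze(0).numpy()
    return point_line, deep_global
```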
Further, step (2) proceeds as follows:
(201) the unmanned aerial vehicle flies to the take-off point coordinate by GPS navigation, so that the take-off point area appears within the field of view of the vertically downward camera;
(202) the unmanned aerial vehicle begins to descend at a constant speed; during the descent, the unmanned aerial vehicle acquires the current multi-level feature group and height above ground in real time and, using the height above ground as the index, compares them with the take-off and landing spatio-temporal correspondence matrix recorded during take-off;
(203) the distance between the multi-level feature groups of the two take-off and landing spatio-temporal correspondence matrices is calculated by a similarity measurement method; the calculated distance is combined with the current height above ground to form positioning information, which is output to the flight controller, as in the sketch below.
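A minimal sketch of this height-indexed re-identification, reusing the STCMatrix sketch above; Euclidean distance stands in for the unspecified similarity measure:

```python
import numpy as np

def reidentify(matrix: "STCMatrix", current_group, current_height: float):
    # The height above ground selects which recorded sequence to compare against.
    seq = matrix.nearest_sequence(current_height)
    # Distance from the current multi-level feature group to each recorded one.
    dists = [
        np.linalg.norm(current_group.point_line - g.point_line)
        + np.linalg.norm(current_group.deep_global - g.deep_global)
        for g in seq.feature_groups
    ]
    best = int(np.argmin(dists))
    # Positioning information: feature distance combined with the current height.
    return {"matched_frame": best, "distance": dists[best], "height": current_height}
```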
Further, step (3) proceeds as follows:
(301) the unmanned aerial vehicle adjusts its horizontal position and heading angle according to the positioning information, maintains pixel-level alignment with the take-off point, and descends at a constant speed;
(302) during the descent, the unmanned aerial vehicle adjusts its attitude and position through a negative feedback mechanism: the real-time positioning information is compared with a preset error range; if it falls outside the error range, the position of the unmanned aerial vehicle is corrected by a PID control algorithm until it falls within the error range; if the position of the unmanned aerial vehicle is within the error range, the unmanned aerial vehicle simply continues to descend, as in the sketch below.
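A sketch of this negative-feedback loop; the drone interface (height, position_error, set_velocity, shutdown), the numeric thresholds, and the PID class (sketched later in the description) are all hypothetical:

```python
ERROR_TOLERANCE = 0.05  # preset error range (assumed: meters)
SHUTDOWN_HEIGHT = 0.20  # threshold height below which the motors stop (assumed)
DESCENT_SPEED = 0.3     # constant descent speed (assumed: m/s)

def landing_loop(drone, pid_x, pid_y, dt=1.0 / 30.0):
    """Correct position while outside the error range; descend while inside it;
    shut down once below the threshold height."""
    while drone.height() > SHUTDOWN_HEIGHT:
        ex, ey = drone.position_error()  # real-time offset from the take-off point
        if max(abs(ex), abs(ey)) > ERROR_TOLERANCE:
            # Outside the error range: PID corrects the horizontal position.
            drone.set_velocity(pid_x.update(ex, dt), pid_y.update(ey, dt), 0.0)
        else:
            # Within the error range: keep descending at constant speed.
            drone.set_velocity(0.0, 0.0, -DESCENT_SPEED)
    drone.shutdown()  # motors off, landing complete
```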
The beneficial effects of the invention are as follows:
1. The method is suitable for various emergency situations and severe environments, and enables autonomous, precise take-off and landing of the unmanned aerial vehicle without placing a specific marker in advance.
2. The method is widely portable; the unmanned aerial vehicle only needs to provide GPS and flight control inputs.
3. The method is stable, real-time and accurate: it provides positioning information at an output frequency of 30 Hz or above, with landing precision at the pixel level.
Drawings
FIG. 1 is a schematic diagram illustrating fusion in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a take-off and landing spatio-temporal correspondence matrix in an embodiment of the present invention.
Detailed Description
The technical solution of the present invention is further explained below with reference to the accompanying drawings. It should be understood that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the following embodiments without inventive effort fall within the scope of protection of the present invention.
In the precise autonomous take-off and landing method for an unmanned aerial vehicle in an environment without markers, first, during take-off, the unmanned aerial vehicle records a multi-level feature group and the height above ground at a frequency of 30 Hz, forming a take-off and landing spatio-temporal correspondence matrix; then, during landing, the unmanned aerial vehicle captures images of the ground in real time through a vertically downward camera, performs re-identification and positioning against the matrix recorded at take-off, and outputs the position information of the unmanned aerial vehicle to the flight controller; finally, the flight controller adjusts the attitude of the unmanned aerial vehicle in real time according to the position information until the unmanned aerial vehicle lands precisely on the original take-off point. The method achieves take-off and autonomous landing with pixel-level precision using only a GPS and a camera, without setting special markers in the environment.
Further, the multi-level feature group comprises basic point-line features and a deep global feature; the height above ground is output by a GPS module in the unmanned aerial vehicle and corresponds one-to-one with the multi-level feature groups; the take-off point coordinate consists of the longitude and latitude of the unmanned aerial vehicle; the take-off and landing spatio-temporal correspondence matrix comprises the take-off point coordinate and several 30-frame-per-second time sequences, each containing multi-level feature groups and heights above ground in one-to-one correspondence; "several" means one or more.
Further, the basic point-line features are obtained by processing the image with a point feature extraction algorithm and a line feature extraction algorithm and fusing the resulting features; the deep global feature is obtained by extracting global information from the image with a convolutional neural network.
Further, in step (2), the unmanned aerial vehicle flies to the take-off point coordinate by GPS navigation, so that the take-off point area appears within the field of view of the vertically downward camera; the unmanned aerial vehicle then begins to descend at a constant speed; during the descent, the unmanned aerial vehicle acquires the current multi-level feature group and height above ground in real time and, using the height above ground as the index, compares them with the take-off and landing spatio-temporal correspondence matrix recorded during take-off; the distance between the multi-level feature groups of the two matrices is calculated by a similarity measurement method; the calculated distance is combined with the current height above ground to form positioning information, which is output to the flight controller.
Further, step (3) proceeds as follows:
(301) the unmanned aerial vehicle adjusts its horizontal position and heading angle according to the positioning information, maintains pixel-level alignment with the take-off point, and descends at a constant speed;
(302) during the descent, the unmanned aerial vehicle adjusts its attitude and position through a negative feedback mechanism: the real-time positioning information is compared with a preset error range; if it falls outside the error range, the position of the unmanned aerial vehicle is corrected by a PID control algorithm until it falls within the error range; if the position of the unmanned aerial vehicle is within the error range, the unmanned aerial vehicle simply continues to descend.
The method is implemented as follows:
(1) during take-off, the unmanned aerial vehicle records the multi-level feature group, the height above ground and the take-off point coordinate at a frequency of 30 Hz, forming the take-off and landing spatio-temporal correspondence matrix;
(2) during landing, the unmanned aerial vehicle captures images of the ground in real time through the vertically downward camera, performs re-identification and positioning against the matrix recorded at take-off, and outputs the position information of the unmanned aerial vehicle to the flight controller;
(3) the flight controller adjusts the attitude of the unmanned aerial vehicle in real time according to the position information;
(4) steps (2) and (3) are repeated until the unmanned aerial vehicle lands precisely on the original take-off point.
In step (1), the image collected by the vertically downward camera is processed by a point feature extraction algorithm, a line feature extraction algorithm and a convolutional neural network, and the resulting features are fused in a series-parallel manner to obtain the multi-level feature group, according to the following formula:
$$m(X_1, X_2, \ldots, X_k) = \Big(\bigoplus_{k} \omega_k \big(f(X_k) \oplus g(X_k)\big)\Big) \,\Vert\, j(X)$$

where $m(X_1, X_2, \ldots, X_k)$ denotes the fused feature obtained after the fusion operation, $\omega_k$ denotes the fusion weight of the features of the different regions, $\oplus$ denotes the series (concatenation) operation, $\Vert$ denotes the parallel operation, $X_k$ denotes the different regions, $X$ denotes the global region, $f$ denotes the point feature extraction function, $g$ denotes the line feature extraction function, and $j$ denotes the convolutional neural network function. The fusion scheme is shown in FIG. 1.
The height above ground and the take-off point coordinate are output by a GPS module in the unmanned aerial vehicle and correspond one-to-one with the multi-level feature groups, together forming the take-off and landing spatio-temporal correspondence matrix, as shown in FIG. 2. In the figure, the matrix consists of several sequences S_1 to S_n; taking the first sequence S_1 as an example, it consists of a ground height H_1, 30 multi-level feature groups MFG_{1-1} to MFG_{1-30}, and a take-off point coordinate P_1.
In step (2), the unmanned aerial vehicle flies to the take-off point coordinate P_n by GPS navigation, so that the take-off point area appears within the field of view of the vertically downward camera; the unmanned aerial vehicle then begins to descend at a constant speed; during the descent, the unmanned aerial vehicle acquires the current multi-level feature group MFG' and height above ground H' in real time and, using H' as the index, compares them with the take-off and landing spatio-temporal correspondence matrix recorded during take-off.
The distance between the multi-level feature groups MFG and MFG' of the two take-off and landing spatio-temporal correspondence matrices is calculated by a similarity measurement method; the calculated distance is combined with the current height above ground to form positioning information, which is output to the flight controller.
The similarity measurement method is common knowledge for those skilled in the art: it is a measure that comprehensively assesses how close two things are. The closer two things are, the larger their similarity measure; the further apart they are, the smaller it is. In the invention, the degree of similarity between two feature groups is comprehensively evaluated by a similarity measurement method to obtain the distance between them.
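For concreteness, a short sketch of two common similarity measures between feature vectors; the patent leaves the exact measure open, so both are merely examples:

```python
import numpy as np

def euclidean_distance(a: np.ndarray, b: np.ndarray) -> float:
    # Smaller distance means the two feature groups are closer.
    return float(np.linalg.norm(a - b))

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Larger value (closer to 1) means the two feature groups are closer.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```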
Step (3) proceeds as follows: the unmanned aerial vehicle adjusts its horizontal position and heading angle according to the positioning information, maintains pixel-level alignment with the take-off point, and descends at a constant speed. During the descent, the unmanned aerial vehicle adjusts its attitude and position through a negative feedback mechanism: the real-time positioning information is compared with a preset error range; if it falls outside the error range, the position of the unmanned aerial vehicle is corrected by a PID control algorithm until it falls within the error range; if the position of the unmanned aerial vehicle is within the error range, the unmanned aerial vehicle continues to descend until its height falls below a threshold, at which point the unmanned aerial vehicle shuts down and completes the landing.
PID control algorithms are well known to those skilled in the art; they control a controlled object by tuning three parameters, P, I and D. Proportional part P: increasing the proportional coefficient speeds up the response of the system and reduces the steady-state error, but too large a coefficient can compromise the stability of the system. Integral part I: the smaller the integral time constant, the stronger the integral action; integral control eliminates the steady-state error of the system, but too strong an integral action degrades its stability. Derivative part D: the derivative action reflects the rate of change of the error signal; the faster the error changes, the stronger the derivative action, which helps reduce oscillation and improves the stability of the system.
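A textbook discrete PID controller is sketched below for reference; the gains and the anti-windup clamp are illustrative choices, not values from the patent:

```python
class PID:
    def __init__(self, kp: float, ki: float, kd: float, i_limit: float = 1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i_limit = i_limit   # clamp on the integral term (anti-windup)
        self.integral = 0.0
        self.prev_error = None

    def update(self, error: float, dt: float) -> float:
        # Integral part I: accumulated error removes steady-state offset.
        self.integral += error * dt
        self.integral = max(-self.i_limit, min(self.i_limit, self.integral))
        # Derivative part D: reacts to the rate of change of the error.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # Proportional part P plus I and D form the control output.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: one controller per horizontal axis, usable with the landing loop above.
pid_x = PID(kp=0.8, ki=0.05, kd=0.2)
pid_y = PID(kp=0.8, ki=0.05, kd=0.2)
```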
The invention is suitable for various unmanned aerial vehicle platforms: by adopting the precise landing positioning box, it can be conveniently adapted to them, and the unmanned aerial vehicle only needs to provide a power output and a flight control input.
The above description covers only specific embodiments of the present invention, but the scope of protection of the present invention is not limited thereto. Any changes or substitutions that a person skilled in the art could readily conceive within the technical scope disclosed by the present invention shall be covered by the scope of protection of the present invention. Therefore, the scope of protection of the present invention shall be subject to the scope of protection of the claims.

Claims (5)

1. A precise autonomous take-off and landing method for an unmanned aerial vehicle in an environment without markers, characterized by comprising the following steps:
(1) during take-off, the unmanned aerial vehicle captures ground images in real time through a vertically downward camera, records a multi-level feature group, the height above ground and the take-off point coordinate at a frequency of 30 Hz, and forms a take-off and landing spatio-temporal correspondence matrix;
(2) during landing, the unmanned aerial vehicle captures images of the ground in real time through the vertically downward camera, performs re-identification and positioning against the take-off and landing spatio-temporal correspondence matrix, and outputs the position information of the unmanned aerial vehicle to the flight controller;
(3) the flight controller adjusts the attitude of the unmanned aerial vehicle in real time according to the position information;
(4) steps (2) and (3) are repeated until the unmanned aerial vehicle lands precisely on the original take-off point.
2. The precise autonomous take-off and landing method for an unmanned aerial vehicle in an environment without markers according to claim 1, characterized in that the multi-level feature group comprises basic point-line features and a deep global feature; the height above ground is output by a GPS module in the unmanned aerial vehicle and corresponds one-to-one with the multi-level feature groups; the take-off point coordinate consists of the longitude and latitude of the unmanned aerial vehicle; the take-off and landing spatio-temporal correspondence matrix comprises the take-off point coordinate and one or more 30-frame-per-second time sequences, each containing multi-level feature groups and heights above ground in one-to-one correspondence.
3. The precise autonomous take-off and landing method for an unmanned aerial vehicle in an environment without markers according to claim 2, characterized in that the basic point-line features are obtained by processing the ground image with a point feature extraction algorithm and a line feature extraction algorithm and fusing the resulting features; the deep global feature is obtained by extracting global information from the ground image with a convolutional neural network.
4. The precise autonomous take-off and landing method for an unmanned aerial vehicle in an environment without markers according to claim 1, characterized in that step (2) proceeds as follows:
(201) the unmanned aerial vehicle flies to the take-off point coordinate by GPS navigation, so that the take-off point area appears within the field of view of the vertically downward camera;
(202) the unmanned aerial vehicle begins to descend at a constant speed; during the descent, the unmanned aerial vehicle acquires the current multi-level feature group and height above ground in real time and, using the height above ground as the index, compares them with the take-off and landing spatio-temporal correspondence matrix recorded during take-off;
(203) the distance between the multi-level feature groups of the two take-off and landing spatio-temporal correspondence matrices is calculated by a similarity measurement method; the calculated distance is combined with the current height above ground to form positioning information, which is output to the flight controller.
5. The precise autonomous take-off and landing method for an unmanned aerial vehicle in an environment without markers according to claim 1, characterized in that step (3) proceeds as follows:
(301) the unmanned aerial vehicle adjusts its horizontal position and heading angle according to the positioning information, maintains pixel-level alignment with the take-off point, and descends at a constant speed;
(302) during the descent, the unmanned aerial vehicle adjusts its attitude and position through a negative feedback mechanism: the real-time positioning information is compared with a preset error range; if it falls outside the error range, the position of the unmanned aerial vehicle is corrected by a PID control algorithm until it falls within the error range; if the position of the unmanned aerial vehicle is within the error range, the unmanned aerial vehicle simply continues to descend.
CN202210142956.XA 2022-02-16 2022-02-16 Unmanned aerial vehicle accurate autonomous take-off and landing method in non-identification environment Pending CN114489140A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210142956.XA CN114489140A (en) 2022-02-16 2022-02-16 Unmanned aerial vehicle accurate autonomous take-off and landing method in non-identification environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210142956.XA CN114489140A (en) 2022-02-16 2022-02-16 Unmanned aerial vehicle accurate autonomous take-off and landing method in non-identification environment

Publications (1)

Publication Number Publication Date
CN114489140A 2022-05-13

Family

ID=81482057

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210142956.XA Pending CN114489140A (en) 2022-02-16 2022-02-16 Unmanned aerial vehicle accurate autonomous take-off and landing method in non-identification environment

Country Status (1)

Country Link
CN (1) CN114489140A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102417037A (en) * 2010-09-28 2012-04-18 株式会社拓普康 Automatic taking-off and landing system
CN109791414A (en) * 2016-09-26 2019-05-21 深圳市大疆创新科技有限公司 The method and system that view-based access control model lands
CN106774423A (en) * 2017-02-28 2017-05-31 亿航智能设备(广州)有限公司 The landing method and system of a kind of unmanned plane
CN106927059A (en) * 2017-04-01 2017-07-07 成都通甲优博科技有限责任公司 A kind of unmanned plane landing method and device based on monocular vision
CN109857128A (en) * 2018-12-18 2019-06-07 顺丰科技有限公司 Unmanned plane vision pinpoint landing method, system, equipment and storage medium
CN110001980A (en) * 2019-04-19 2019-07-12 深圳市道通智能航空技术有限公司 A kind of aircraft landing method and device
CN113608542A (en) * 2021-08-12 2021-11-05 山东信通电子股份有限公司 Control method and equipment for automatic landing of unmanned aerial vehicle

Similar Documents

Publication Publication Date Title
EP3326041B1 (en) Method and device for terrain simulation flying of unmanned aerial vehicle and unmanned aerial vehicle
CN109324337B (en) Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle
EP2771842B1 (en) Identification and analysis of aircraft landing sites
CN105182994A (en) Unmanned-aerial-vehicle fixed-point landing method
CN109885084A (en) A kind of multi-rotor unmanned aerial vehicle Autonomous landing method based on monocular vision and fuzzy control
CN110221625B (en) Autonomous landing guiding method for precise position of unmanned aerial vehicle
JP7492718B2 (en) System, method, program, and storage medium for storing the program for identifying a safe landing area
CN105867397A (en) Unmanned aerial vehicle accurate position landing method based on image processing and fuzzy control
CN110832494A (en) Semantic generation method, equipment, aircraft and storage medium
CN109739257A (en) Merge the patrol unmanned machine closing method and system of satellite navigation and visual perception
KR101792945B1 (en) Remote Radiation Surveillance Method and System using an Unmanned Aerial Vehicles with Laser Range Sensor
Bao et al. Vision-based horizon extraction for micro air vehicle flight control
CN110799983A (en) Map generation method, map generation equipment, aircraft and storage medium
CN102190081A (en) Vision-based fixed point robust control method for airship
EP3121676B1 (en) Air vehicle navigation system and method of flying an air vehicle
CN113093772A (en) Method for accurately landing hangar of unmanned aerial vehicle
CN110879617A (en) Infrared-guided unmanned aerial vehicle landing method and device
CN111413708A (en) Unmanned aerial vehicle autonomous landing site selection method based on laser radar
US11741702B2 (en) Automatic safe-landing-site selection for unmanned aerial systems
US11808578B2 (en) Global positioning denied navigation
CN109857128A (en) Unmanned plane vision pinpoint landing method, system, equipment and storage medium
CN110196601A (en) Unmanned aerial vehicle (UAV) control method, apparatus, system and computer readable storage medium
CN108766035A (en) A kind of unmanned plane terrain match flight control system under dot density guiding
CN114636405A (en) Aircraft sensor system synchronization
CN114489140A (en) Unmanned aerial vehicle accurate autonomous take-off and landing method in non-identification environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination