CN114527792A - Unmanned aerial vehicle landing guiding method, device, equipment and storage medium - Google Patents

Unmanned aerial vehicle landing guiding method, device, equipment and storage medium

Info

Publication number
CN114527792A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
offset
landed
landing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210085062.1A
Other languages
Chinese (zh)
Inventor
刘奇
王五丰
张泉
帅率
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Fl Intelligence Technology Co ltd
Original Assignee
Wuhan Fl Intelligence Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Fl Intelligence Technology Co ltd filed Critical Wuhan Fl Intelligence Technology Co ltd
Priority to CN202210085062.1A priority Critical patent/CN114527792A/en
Publication of CN114527792A publication Critical patent/CN114527792A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an unmanned aerial vehicle landing guidance method, device, equipment and storage medium, and belongs to the technical field of visual positioning. The invention detects whether an unmanned aerial vehicle to be landed has a landing requirement; when the unmanned aerial vehicle to be landed has a landing requirement, recognizes a plurality of positioning identifiers on the apron through a pan-tilt camera; acquires the offsets between the unmanned aerial vehicle to be landed and each positioning identifier; and guides the unmanned aerial vehicle to be landed to land according to the offsets. By calculating the offsets between the unmanned aerial vehicle and each positioning identifier on the apron and guiding the unmanned aerial vehicle to land according to these offsets, accurate landing can be ensured and the landing reliability of the unmanned aerial vehicle is improved.

Description

Unmanned aerial vehicle landing guiding method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of visual positioning, in particular to a landing guiding method, a landing guiding device, landing guiding equipment and a storage medium for an unmanned aerial vehicle.
Background
At present, most multi-rotor unmanned aerial vehicles return and land relying on GPS positioning, network RTK positioning, base-station RTK positioning or visual comparison positioning. GPS positioning: the precision is not high enough, generally +/-6-8 m, so a large take-off and landing area is required. Network RTK positioning: the precision can reach +/-0.1-0.3 m, but it depends on the operator's network signal; if the network signal is poor, this positioning mode cannot be used. Base-station RTK positioning: the precision can reach +/-0.1-0.3 m, but an additional RTK base station has to be erected. Visual comparison positioning: the picture taken during landing is compared with the picture taken during take-off; the precision is not high enough and the reliability is poor, and failure is likely when featureless textures are encountered, such as smooth and flat ground, uniform grass, uniform soil, or low light. In short, existing unmanned aerial vehicle landing schemes are not precise enough and are relatively unreliable.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide an unmanned aerial vehicle landing guidance method, device, equipment and storage medium, so as to solve the technical problems that unmanned aerial vehicle landing schemes in the prior art are not precise enough and have poor reliability.
In order to achieve the above aim, the invention provides an unmanned aerial vehicle landing guidance method, wherein a pan-tilt camera is arranged on the unmanned aerial vehicle;
the unmanned aerial vehicle landing guiding method comprises the following steps:
detecting whether the unmanned aerial vehicle to be landed has landing requirements;
when the unmanned aerial vehicle to be landed has landing requirements, recognizing a plurality of positioning marks on the apron through the pan-tilt camera;
acquiring the offset between the unmanned aerial vehicle to be landed and each positioning mark;
and guiding the unmanned aerial vehicle to be landed to land according to the offset.
Optionally, the detecting whether the unmanned aerial vehicle to be landed has a landing requirement includes:
acquiring the current height of the unmanned aerial vehicle to be landed above the apron;
and when it is detected that the unmanned aerial vehicle to be landed is maintained at the current height, determining that the unmanned aerial vehicle to be landed has a landing requirement.
Optionally, the obtaining offset between the unmanned aerial vehicle to be landed and each positioning identifier includes:
shooting an identifier image of each positioning identifier through the pan-tilt camera;
determining a visual field central point of the pan-tilt camera;
and determining the actual distance offset of each positioning identifier relative to the center point of the visual field and the angle offset of each positioning identifier according to the identifier image.
Optionally, the determining, from the identifier image, an actual distance offset and an angular offset of each location identifier with respect to the center point of the field of view includes:
acquiring the pixel side length of each positioning identifier and the pixel distance offset of each positioning identifier relative to the center point of the visual field from the identifier image;
acquiring the actual side length of each positioning identifier;
calculating the actual distance offset of each positioning identifier relative to the view center point according to the pixel side length, the pixel distance offset and the actual side length;
and determining, according to the identifier image, the included angle between each positioning identifier and the horizontal axis corresponding to the pan-tilt camera in the horizontal direction to obtain the angle offset.
Optionally, the guiding the unmanned aerial vehicle to be landed to land according to the offset includes:
calculating an average distance offset according to the actual distance offsets of the positioning identifiers relative to the center point of the field of view;
calculating an average angle offset according to the angle offset of each positioning identifier;
and guiding the unmanned aerial vehicle to be landed to land according to the average distance offset and the average angle offset.
Optionally, the guiding the unmanned aerial vehicle to be landed to land according to the average distance offset and the average angle offset includes:
calculating a distance adjustment amount according to the average distance offset;
calculating an angle adjustment according to the average angle offset;
and controlling the unmanned aerial vehicle to be landed to move and rotate according to the distance adjustment amount and the angle adjustment amount so as to complete landing.
Optionally, after guiding the unmanned aerial vehicle to be landed to land according to the offset, the method further includes:
descending the unmanned aerial vehicle to be landed from the current height to a new height according to the offset and a preset landing amplitude;
maintaining the unmanned aerial vehicle to be landed at a new height, and acquiring new offset between the unmanned aerial vehicle to be landed and each positioning identifier;
continuing to guide the unmanned aerial vehicle to be landed to land according to the new offset;
and continuing to execute the step of descending the unmanned aerial vehicle to be landed from the current height to a new height according to the offset and the preset landing amplitude, until the unmanned aerial vehicle to be landed lands on the apron.
In addition, in order to achieve the above purpose, the invention further provides an unmanned aerial vehicle landing guidance device, wherein a pan-tilt camera is arranged on the unmanned aerial vehicle;
the unmanned aerial vehicle landing guidance device includes:
the detection module is used for detecting whether the unmanned aerial vehicle to be landed has landing requirements;
the shooting module is used for identifying a plurality of positioning marks on the apron through the pan-tilt camera when the unmanned aerial vehicle to be landed has landing requirements;
the calculation module is used for acquiring the offset between the unmanned aerial vehicle to be landed and each positioning mark;
and the control module is used for guiding the unmanned aerial vehicle to be landed to land according to the offset.
In addition, in order to achieve the above object, the present invention further provides an unmanned aerial vehicle landing guidance device, including: a memory, a processor, and an unmanned aerial vehicle landing guidance program stored on the memory and executable on the processor, the unmanned aerial vehicle landing guidance program being configured to implement the unmanned aerial vehicle landing guidance method as described above.
In addition, to achieve the above object, the present invention further provides a storage medium, where an unmanned aerial vehicle landing guidance program is stored, and when executed by a processor, the unmanned aerial vehicle landing guidance program implements the unmanned aerial vehicle landing guidance method as described above.
The invention detects whether the unmanned aerial vehicle to be landed has a landing requirement; when the unmanned aerial vehicle to be landed has a landing requirement, recognizes a plurality of positioning identifiers on the apron through the pan-tilt camera; acquires the offsets between the unmanned aerial vehicle to be landed and each positioning identifier; and guides the unmanned aerial vehicle to be landed to land according to the offsets. By calculating the offsets between the unmanned aerial vehicle and each positioning identifier on the apron and guiding the unmanned aerial vehicle to land according to these offsets, accurate landing can be ensured and the reliability of the landing is improved.
Drawings
Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle landing guidance device in a hardware operating environment according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a first embodiment of the unmanned aerial vehicle landing guidance method of the present invention;
fig. 3 is a schematic diagram of a location identifier in an embodiment of the unmanned aerial vehicle landing guidance method of the present invention;
fig. 4 is a schematic flow chart of a method for guiding the landing of an unmanned aerial vehicle according to a second embodiment of the present invention;
fig. 5 is a schematic flow chart of a method for guiding the landing of an unmanned aerial vehicle according to a third embodiment of the present invention;
fig. 6 is a block diagram of the structure of the first embodiment of the unmanned aerial vehicle landing guiding device of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an unmanned aerial vehicle landing guidance device in a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 1, the unmanned aerial vehicle landing guidance apparatus may include: a processor 1001, such as a Central Processing Unit (CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. Wherein a communication bus 1002 is used to enable connective communication between these components. The user interface 1003 may include a Display screen (Display), an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a Wireless interface (e.g., a Wireless-Fidelity (Wi-Fi) interface). The Memory 1005 may be a Random Access Memory (RAM) Memory, or a Non-Volatile Memory (NVM), such as a disk Memory. The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in figure 1 does not constitute a limitation of the drone landing guide apparatus and may include more or fewer components than shown, or some components in combination, or a different arrangement of components.
As shown in fig. 1, the memory 1005, which is a storage medium, may include therein an operating system, a network communication module, a user interface module, and a drone landing guidance program.
In the drone landing guidance apparatus shown in fig. 1, the network interface 1004 is mainly used for data communication with a network server, and the user interface 1003 is mainly used for data interaction with a user. The unmanned aerial vehicle landing guidance device calls, through the processor 1001, the unmanned aerial vehicle landing guidance program stored in the memory 1005 and executes the unmanned aerial vehicle landing guidance method provided by the embodiments of the present invention.
An embodiment of the present invention provides an unmanned aerial vehicle landing guidance method, and with reference to fig. 2, fig. 2 is a schematic flow diagram of a first embodiment of the unmanned aerial vehicle landing guidance method of the present invention.
In this embodiment, the unmanned aerial vehicle landing guidance method includes the following steps:
step S10: whether the unmanned aerial vehicle to be landed has landing requirements is detected.
In this embodiment, the executing body of this embodiment may be an unmanned aerial vehicle landing guidance device, and the unmanned aerial vehicle landing guidance device may be an electronic device such as a personal computer or a server, and may also be other terminal devices that can achieve the same or similar functions.
It should be noted that existing unmanned aerial vehicle landing modes include GPS positioning, network RTK positioning, base-station RTK positioning, and visual comparison positioning, but their accuracy is not high and their reliability is relatively poor. GPS positioning: the precision is generally only +/-6-8 m, so a large take-off and landing area is required. Network RTK positioning: the precision can reach +/-0.1-0.3 m, but it depends on the operator's network signal; if the network signal is poor, this positioning mode cannot be used. Base-station RTK positioning: the precision can reach +/-0.1-0.3 m, but an additional RTK base station has to be erected. Visual comparison positioning: the picture taken during landing is compared with the picture taken during take-off; the precision is not high enough and the reliability is poor, and failure is likely when featureless textures are encountered, such as smooth and flat ground, uniform grass, uniform soil, or low light.
In order to solve the above problem, in this embodiment the unmanned aerial vehicle is guided to land by means of positioning identifiers arranged on the apron. The process is fully automatic and requires no manual intervention, the positioning is accurate, precise landing can be achieved, and both the position and the heading angle of the unmanned aerial vehicle can be controlled at the same time. Specifically, this can be achieved in the following manner.
It should be noted that, in this embodiment, landing can be triggered by a landing instruction input by a user: after receiving the landing instruction input by the user, the unmanned aerial vehicle is controlled to land. A preset time can also be set, and when the preset time is reached the unmanned aerial vehicle is controlled to land, so that the unmanned aerial vehicle lands on the apron automatically after flying for a period of time.
In a specific implementation, before the unmanned aerial vehicle is controlled to land, it is necessary to detect whether the unmanned aerial vehicle has a landing requirement. If it does, the unmanned aerial vehicle is then controlled to land; if not, the unmanned aerial vehicle continues to fly.
Further, to control the unmanned aerial vehicle to be landed to descend, it must first be kept at a certain height. In this embodiment, whether the unmanned aerial vehicle to be landed has a landing requirement can be determined from its height above the apron.
In a specific implementation, this embodiment detects in real time the current height of the unmanned aerial vehicle to be landed above the apron, and then detects whether this current height changes. If the current height keeps changing, the unmanned aerial vehicle is still flying; if the current height does not change, the unmanned aerial vehicle to be landed is hovering at a certain height above the apron, and in this case it is determined that the unmanned aerial vehicle to be landed has a landing requirement. Further, if the height of the unmanned aerial vehicle to be landed above the apron remains at the current height throughout a preset time period, the current height is considered not to have changed; the preset time period can be set according to the actual determination requirement, which is not limited in this embodiment.
It should be emphasized that a sensor is also provided on the unmanned aerial vehicle to be landed, and this sensor is used to detect the height of the unmanned aerial vehicle to be landed above the apron.
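For illustration only, the hover-based landing-demand check described above can be sketched in Python as follows; the height-sensor interface, hold time and tolerance are assumptions introduced for the example and are not specified by this embodiment.

    import time

    def has_landing_demand(read_height_m, hold_seconds=3.0, tolerance_m=0.2, sample_period_s=0.1):
        """Return True if the height above the apron stays (almost) constant for
        hold_seconds, i.e. the UAV is hovering and is judged to have a landing requirement.
        read_height_m is a hypothetical callable returning the current height above the
        apron, in metres, as reported by the height sensor on the UAV."""
        reference = read_height_m()
        start = time.monotonic()
        while time.monotonic() - start < hold_seconds:
            if abs(read_height_m() - reference) > tolerance_m:
                return False  # the height is still changing: the UAV is still flying
            time.sleep(sample_period_s)
        return True  # the height held steady for the whole preset time period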
Step S20: when the unmanned aerial vehicle to be landed has a landing requirement, a plurality of positioning marks on the apron are identified through the pan-tilt camera.
In a specific implementation, the unmanned aerial vehicle to be landed is further provided with a pan-tilt camera, through which a plurality of positioning identifiers on the apron can be recognized. In this embodiment, taking fig. 3 as an example, the positioning identifiers adopt AprilTag markers and are laid out on the apron as shown in fig. 3; the number, positions and pattern of the positioning identifiers can be adjusted according to actual requirements, which is not limited in this embodiment.
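As an illustrative sketch only, and assuming the positioning identifiers are AprilTag markers as in fig. 3, the recognition step can be implemented with an off-the-shelf detector such as the pupil-apriltags Python package; the tag family and the frame source below are assumptions rather than part of this disclosure.

    import cv2
    from pupil_apriltags import Detector  # third-party AprilTag detector

    detector = Detector(families="tag36h11")  # assumed tag family

    def detect_positioning_identifiers(frame_bgr):
        """Detect the AprilTag positioning identifiers in one pan-tilt camera frame and
        return, for each tag, its id, pixel center and pixel corners."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        return [
            {"tag_id": d.tag_id, "center_px": d.center, "corners_px": d.corners}
            for d in detector.detect(gray)
        ]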
Step S30: and acquiring the offset between the unmanned aerial vehicle to be landed and each positioning mark.
In a specific implementation, in order to ensure that the unmanned aerial vehicle to be landed lands on the apron without deviation, the deviation between the unmanned aerial vehicle to be landed and each positioning identifier needs to be determined. The offset in this embodiment therefore includes a distance offset and an angle offset: the distance offset is the distance deviation between the unmanned aerial vehicle to be landed and the positioning identifier. After the unmanned aerial vehicle lands on the apron, its nose must point in a specified direction; the nose direction shown in fig. 3 is the specified nose direction, and the angle offset is the horizontal rotation deviation between the current nose direction of the unmanned aerial vehicle to be landed and the specified nose direction.
Step S40: and guiding the unmanned aerial vehicle to be landed to land according to the offset.
In a specific implementation, after the offset is determined, this embodiment adjusts the attitude of the unmanned aerial vehicle to be landed according to the offset, for example its displacement in the horizontal direction and its rotation angle in the horizontal direction, so that the visual-field center point of the pan-tilt camera on the unmanned aerial vehicle to be landed coincides with the center point of the positioning identifiers shown in fig. 3.
It should be emphasized that, in this embodiment, the angle offset of the unmanned aerial vehicle to be landed is adjusted first; after the angle offset is within the adjustment-free range, the distance offset of the unmanned aerial vehicle to be landed is then adjusted; and after the distance offset is also within the adjustment-free range, the unmanned aerial vehicle to be landed is controlled to start descending.
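Purely as an illustrative sketch of this ordering, a decision routine could first check the angle offset and only then the distance offset; the tolerance thresholds below are assumed placeholders for the adjustment-free ranges and are not specified by this embodiment.

    def next_correction(angle_offset_deg, distance_offset_mm, angle_tol_deg=2.0, dist_tol_mm=50.0):
        """Decide the next correction: rotate first, then translate, and only allow the
        descent to start once both offsets are inside their adjustment-free ranges."""
        if abs(angle_offset_deg) > angle_tol_deg:
            return ("rotate", -angle_offset_deg)
        dx_mm, dy_mm = distance_offset_mm
        if max(abs(dx_mm), abs(dy_mm)) > dist_tol_mm:
            return ("translate", (-dx_mm, -dy_mm))
        return ("descend", None)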
This embodiment detects whether the unmanned aerial vehicle to be landed has a landing requirement; when the unmanned aerial vehicle to be landed has a landing requirement, recognizes a plurality of positioning identifiers on the apron through the pan-tilt camera; acquires the offsets between the unmanned aerial vehicle to be landed and each positioning identifier; and guides the unmanned aerial vehicle to be landed to land according to the offsets. By calculating the offsets between the unmanned aerial vehicle and each positioning identifier on the apron and guiding the unmanned aerial vehicle to land according to these offsets, accurate landing can be ensured and the reliability of the landing is improved.
Referring to fig. 4, fig. 4 is a schematic flowchart of a method for guiding a landing of an unmanned aerial vehicle according to a second embodiment of the present invention.
Based on the first embodiment, in the method for guiding the unmanned aerial vehicle to land, the step S30 specifically includes:
step S301: and shooting an identifier image of each positioning identifier through the holder camera.
It should be noted that common existing methods for recognizing such visual positioning markers require the lens focal length and the size of the photosensitive sensor to be known in advance. In practice, however, it is inconvenient to obtain these parameters. First, they are often unavailable, for example when the manufacturer does not publish them, and then they must be back-calculated through experiments, which is troublesome. Second, many lenses are zoom lenses, may be zoomed during use, or differ from one unit to another, so the focal length varies and an accurate value is hard to obtain.
In a specific implementation, in this embodiment, the identification of the positioning identifier is based on an identifier image captured by a pan-tilt camera, and then each identifier is identified by processing the identifier image, and an offset between the drone to be landed and each positioning identifier is calculated.
Step S302: determining the visual field central point of the pan-tilt camera.
Step S303: and determining the actual distance offset of each positioning identifier relative to the center point of the visual field and the angle offset of each positioning identifier according to the identifier image.
In a specific implementation, the pan-tilt camera has a corresponding visual-field center point when shooting, and this center point is used to guide the unmanned aerial vehicle to land accurately. Further, in this embodiment, the actual distance offset of each positioning identifier relative to the visual-field center point and the angle offset of each positioning identifier can be determined from the captured identifier image, where the actual distance offset represents the actual offset distance between the unmanned aerial vehicle to be landed and the positioning identifier. If the nose direction of the unmanned aerial vehicle to be landed is consistent with the specified nose direction, there is no horizontal angle deviation; but when the unmanned aerial vehicle to be landed has rotated horizontally, an angle deviation appears in the identifier image captured by the pan-tilt camera.
In a specific implementation, in this embodiment the pixel side length of each positioning identifier and the pixel distance offset of each positioning identifier relative to the visual-field center point can be determined from the identifier image; these parameters can be obtained with existing tag recognition methods.
In a specific implementation, the actual side length of each positioning identifier in the real world is known. In this embodiment, the ratio between actual distance and pixel distance can therefore be obtained from the known side length of the positioning code and its pixel side length in the image, where widthInMm is the actual side length of the positioning identifier, widthInPx is its pixel side length in the image, xShiftInPx is the pixel distance offset of the positioning identifier in the X-axis direction, xShiftInMm is the actual distance offset of the positioning identifier in the X-axis direction, and yShiftInMm is the actual distance offset of the positioning identifier in the Y-axis direction; that is, xShiftInMm = xShiftInPx * widthInMm / widthInPx, and yShiftInMm is obtained in the same way from the pixel distance offset in the Y-axis direction. Further, an included angle is calculated from the tangent of the angle between one edge of the positioning identifier and the x-axis of the image, and this included angle is the angle offset.
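As an illustrative sketch (the corner ordering, axis convention and helper names are assumptions, not taken from this embodiment), the pixel-to-millimetre conversion and the angle offset described above can be computed from one detected tag as follows:

    import math

    def tag_offsets(corners_px, center_px, view_center_px, width_in_mm):
        """Compute the actual distance offset (mm) of one positioning identifier relative to
        the visual-field center point, and its angle offset (degrees) to the image x-axis.
        corners_px: the four (x, y) pixel corners of the tag, in order around the square;
        center_px: the (x, y) pixel center of the tag;
        view_center_px: the (x, y) pixel center of the camera field of view;
        width_in_mm: the known real-world side length of the tag."""
        # Pixel side length, averaged over the four edges of the square tag.
        width_in_px = sum(math.dist(corners_px[i], corners_px[(i + 1) % 4]) for i in range(4)) / 4.0
        mm_per_px = width_in_mm / width_in_px
        # Pixel distance offset of the tag relative to the visual-field center point.
        x_shift_in_mm = (center_px[0] - view_center_px[0]) * mm_per_px
        y_shift_in_mm = (center_px[1] - view_center_px[1]) * mm_per_px
        # Angle offset: angle between one tag edge and the x-axis of the image.
        (x0, y0), (x1, y1) = corners_px[0], corners_px[1]
        angle_offset_deg = math.degrees(math.atan2(y1 - y0, x1 - x0))
        return (x_shift_in_mm, y_shift_in_mm), angle_offset_deg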
Further, because there are a plurality of positioning identifiers, in this embodiment the average distance offset and the average angle offset are calculated by averaging, and the unmanned aerial vehicle to be landed is then guided to land according to the average distance offset and the average angle offset.
In a specific implementation, after the offsets are obtained, in this embodiment the adjustment amounts can be derived from them: specifically, the distance adjustment amount is determined from the average distance offset, and the angle adjustment amount is determined from the average angle offset. For example, assuming the average distance offset is S(xShiftInMm, yShiftInMm), the distance adjustment amount can be determined as d(-xShiftInMm, -yShiftInMm), that is, the unmanned aerial vehicle is controlled to move by -xShiftInMm in the x direction and by -yShiftInMm in the y direction, thereby completing the landing.
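A minimal sketch of this averaging and adjustment step, assuming the per-tag offsets come from a routine such as the one above; nothing beyond plain averaging and sign inversion is taken from this embodiment.

    def adjustment_from_offsets(per_tag_offsets):
        """per_tag_offsets: list of ((x_shift_mm, y_shift_mm), angle_deg), one entry per tag.
        Returns (distance_adjustment_mm, angle_adjustment_deg): the negated averages, i.e.
        the displacement and rotation that cancel the measured offsets."""
        n = len(per_tag_offsets)
        avg_x = sum(o[0][0] for o in per_tag_offsets) / n
        avg_y = sum(o[0][1] for o in per_tag_offsets) / n
        avg_angle = sum(o[1] for o in per_tag_offsets) / n
        return (-avg_x, -avg_y), -avg_angle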
In this embodiment, the identifier images of all the positioning identifiers are captured through the pan-tilt camera; the visual-field center point of the pan-tilt camera is determined; the pixel side length of each positioning identifier and the pixel distance offset of each positioning identifier relative to the visual-field center point are acquired from the identifier image; the actual side length of each positioning identifier is acquired; the actual distance offset of each positioning identifier relative to the visual-field center point is calculated according to the pixel side length, the pixel distance offset and the actual side length; and the included angle between each positioning identifier and the horizontal axis corresponding to the pan-tilt camera in the horizontal direction is determined according to the identifier image to obtain the angle offset. In this way the distance offset and the angle offset between the unmanned aerial vehicle to be landed and the positioning identifiers are obtained accurately.
Referring to fig. 5, fig. 5 is a schematic flowchart of a method for guiding the landing of an unmanned aerial vehicle according to a third embodiment of the present invention.
Based on the first embodiment or the second embodiment, a third embodiment of the unmanned aerial vehicle landing guidance method of the present invention is provided.
Taking the first embodiment as an example, in this embodiment, after step S40, the method further includes:
step S50: and descending the unmanned aerial vehicle to be landed to a new height from the current height according to the offset according to a preset landing amplitude.
In a specific implementation, after the angle offset and the distance offset have been adjusted to within the adjustment-free range, the unmanned aerial vehicle to be landed is not lowered onto the apron in a single step; instead its height is reduced step by step according to a preset landing amplitude, for example descending through the heights HAdjust0, HAdjust1, HAdjust2 and HAdjust3 in turn, where HAdjust0 > HAdjust1 > HAdjust2 > HAdjust3. The preset landing amplitude can be set according to the actual landing requirement, which is not limited in this embodiment.
Step S60: and maintaining the unmanned aerial vehicle to be landed at a new height, and acquiring new offset between the unmanned aerial vehicle to be landed and each positioning identifier.
Step S70: and continuously guiding the unmanned aerial vehicle to be landed to land according to the new offset.
Step S80: continuing to execute the step of descending the unmanned aerial vehicle to be landed from the current height to a new height according to the offset and the preset landing amplitude, until the unmanned aerial vehicle to be landed lands on the apron.
In a specific implementation, after the unmanned aerial vehicle to be landed has descended, its height is now the new height, and in this situation its offsets will also have changed. In this embodiment, the new offsets between the unmanned aerial vehicle to be landed and each positioning identifier are therefore acquired again, the unmanned aerial vehicle to be landed is adjusted based on the new offsets in the manner described above, and after the adjustment is finished it continues to descend to another new height. The above steps are repeated in a loop until the unmanned aerial vehicle to be landed lands on the apron.
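For illustration only, the hover-adjust-descend loop of steps S50 to S80 might be organized as in the following sketch; the height schedule and the uav command interface are hypothetical, and next_correction refers to the earlier sketch.

    DESCENT_SCHEDULE_M = [8.0, 4.0, 2.0, 1.0]  # assumed heights, HAdjust0 > HAdjust1 > HAdjust2 > HAdjust3

    def guided_landing(uav, measure_offsets):
        """Hover at each height in the schedule, re-measure the offsets against the
        positioning identifiers, correct the angle and then the position, and only
        descend to the next height once both offsets are within the adjustment-free range."""
        for target_height_m in DESCENT_SCHEDULE_M:
            uav.hover_at(target_height_m)              # hypothetical command interface
            while True:
                (dx_mm, dy_mm), angle_deg = measure_offsets()
                action, value = next_correction(angle_deg, (dx_mm, dy_mm))
                if action == "rotate":
                    uav.rotate(value)
                elif action == "translate":
                    uav.translate(*value)
                else:
                    break                              # offsets small enough: go lower
        uav.land()                                     # final touchdown on the apron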
In this embodiment, the unmanned aerial vehicle to be landed is lowered from the current height to a new height according to the offset and the preset landing amplitude; the unmanned aerial vehicle to be landed is maintained at the new height, and the new offsets between it and each positioning identifier are acquired; the unmanned aerial vehicle to be landed continues to be guided to land according to the new offsets; and the step of lowering the unmanned aerial vehicle to be landed from its current height to a new height according to the offset and the preset landing amplitude continues to be executed until the unmanned aerial vehicle to be landed lands on the apron. By hovering the unmanned aerial vehicle at a certain height, detecting and adjusting, then lowering it to another height and detecting and adjusting again, and repeating these rounds of hovering, detection and adjustment until touchdown on the apron, the accuracy and reliability of the landing are improved.
In addition, an embodiment of the present invention further provides a storage medium, where an unmanned aerial vehicle landing guidance program is stored on the storage medium, and when executed by a processor, the unmanned aerial vehicle landing guidance program implements the steps of the unmanned aerial vehicle landing guidance method described above.
Since the storage medium adopts all technical solutions of all the embodiments, at least all the beneficial effects brought by the technical solutions of the embodiments are achieved, and no further description is given here.
Referring to fig. 6, fig. 6 is a block diagram of the structure of the first embodiment of the unmanned aerial vehicle landing guiding device of the present invention.
As shown in fig. 6, the unmanned aerial vehicle landing guidance device provided in the embodiment of the present invention includes:
and the detection module 10 is used for detecting whether the unmanned aerial vehicle to be landed has landing requirements.
And the shooting module 20 is used for identifying a plurality of positioning identifiers on the apron through the pan-tilt camera when the unmanned aerial vehicle to be landed has a landing requirement.
And the calculation module 30 is used for acquiring the offset between the unmanned aerial vehicle to be landed and each positioning identifier.
And the control module 40 is used for guiding the unmanned aerial vehicle to be landed to land according to the offset.
This embodiment detects whether the unmanned aerial vehicle to be landed has a landing requirement; when the unmanned aerial vehicle to be landed has a landing requirement, recognizes a plurality of positioning identifiers on the apron through the pan-tilt camera; acquires the offsets between the unmanned aerial vehicle to be landed and each positioning identifier; and guides the unmanned aerial vehicle to be landed to land according to the offsets. By calculating the offsets between the unmanned aerial vehicle and each positioning identifier on the apron and guiding the unmanned aerial vehicle to land according to these offsets, accurate landing can be ensured and the reliability of the landing is improved.
In one embodiment, the drone landing guide device further comprises: a determination module;
the determination module is used for acquiring the current height of the unmanned aerial vehicle to be landed above the apron, and for determining that the unmanned aerial vehicle to be landed has a landing requirement when it is detected that the unmanned aerial vehicle to be landed is maintained at the current height.
In an embodiment, the calculation module 30 is further configured to capture an identifier image of each positioning identifier through the pan-tilt camera; determine the visual-field center point of the pan-tilt camera; and determine, according to the identifier image, the actual distance offset of each positioning identifier relative to the visual-field center point and the angle offset of each positioning identifier.
In an embodiment, the calculation module 30 is further configured to obtain, from the identifier image, the pixel side length of each positioning identifier and the pixel distance offset of each positioning identifier relative to the visual-field center point; acquire the actual side length of each positioning identifier; calculate the actual distance offset of each positioning identifier relative to the visual-field center point according to the pixel side length, the pixel distance offset and the actual side length; and determine, according to the identifier image, the included angle between each positioning identifier and the horizontal axis corresponding to the pan-tilt camera in the horizontal direction to obtain the angle offset.
In one embodiment, the control module 40 is further configured to calculate an average distance offset from the actual distance offset of each location identifier relative to the center point of the field of view; calculating an average angle offset according to the angle offset of each positioning identifier; and guiding the unmanned aerial vehicle to be landed to land according to the average distance offset and the average angle offset.
In an embodiment, the control module 40 is further configured to calculate a distance adjustment amount according to the average distance offset; calculating an angle adjustment quantity according to the average angle offset; and controlling the unmanned aerial vehicle to be landed to move and rotate according to the distance adjustment amount and the angle adjustment amount so as to finish landing.
In an embodiment, the control module 40 is further configured to lower the unmanned aerial vehicle to be landed from the current height to a new height according to the offset and a preset landing amplitude; maintain the unmanned aerial vehicle to be landed at the new height and acquire new offsets between the unmanned aerial vehicle to be landed and each positioning identifier; continue to guide the unmanned aerial vehicle to be landed to land according to the new offsets; and continue to execute the step of lowering the unmanned aerial vehicle to be landed from its current height to a new height according to the offset and the preset landing amplitude until the unmanned aerial vehicle to be landed lands on the apron.
It should be understood that the above is only an example, and the technical solution of the present invention is not limited in any way, and in a specific application, a person skilled in the art may set the technical solution as needed, and the present invention is not limited thereto.
It should be noted that the above-described work flows are only exemplary, and do not limit the scope of the present invention, and in practical applications, a person skilled in the art may select some or all of them to achieve the purpose of the solution of the embodiment according to actual needs, and the present invention is not limited herein.
In addition, the technical details that are not described in detail in this embodiment can be referred to the unmanned aerial vehicle landing guidance method provided in any embodiment of the present invention, and are not described herein again.
Further, it is to be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are only for description, and do not represent the advantages and disadvantages of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention or portions thereof that contribute to the prior art may be embodied in the form of a software product, where the computer software product is stored in a storage medium (e.g. Read Only Memory (ROM)/RAM, magnetic disk, optical disk), and includes several instructions for enabling a terminal device (e.g. a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An unmanned aerial vehicle landing guiding method, characterized in that a pan-tilt camera is arranged on the unmanned aerial vehicle;
the unmanned aerial vehicle landing guiding method comprises the following steps:
detecting whether the unmanned aerial vehicle to be landed has landing requirements;
when the unmanned aerial vehicle to be landed has landing requirements, recognizing a plurality of positioning marks on the apron through the pan-tilt camera;
acquiring the offset between the unmanned aerial vehicle to be landed and each positioning mark;
and guiding the unmanned aerial vehicle to be landed to land according to the offset.
2. An unmanned aerial vehicle landing guidance method as claimed in claim 1, wherein the detecting whether the unmanned aerial vehicle to be landed has a landing requirement comprises:
acquiring the current height of the unmanned aerial vehicle to be landed above the apron;
and when it is detected that the unmanned aerial vehicle to be landed is maintained at the current height, determining that the unmanned aerial vehicle to be landed has a landing requirement.
3. An unmanned aerial vehicle landing guidance method as defined in claim 1, wherein the obtaining offsets between the unmanned aerial vehicle to be landed and each positioning identifier comprises:
shooting an identifier image of each positioning identifier through the pan-tilt camera;
determining a visual field central point of the pan-tilt camera;
and determining the actual distance offset of each positioning identifier relative to the center point of the visual field and the angle offset of each positioning identifier according to the identifier image.
4. A method for unmanned aerial vehicle landing guidance as defined in claim 3, wherein determining from the identifier image an actual distance offset and an angular offset for each location identifier relative to the center point of view comprises:
acquiring the pixel side length of each positioning identifier and the pixel distance offset of each positioning identifier relative to the center point of the visual field from the identifier image;
acquiring the actual side length of each positioning identifier;
calculating the actual distance offset of each positioning identifier relative to the view center point according to the pixel side length, the pixel distance offset and the actual side length;
and determining, according to the identifier image, the included angle between each positioning identifier and the horizontal axis corresponding to the pan-tilt camera in the horizontal direction to obtain the angle offset.
5. An unmanned aerial vehicle landing guidance method as defined in claim 3, wherein the guiding the unmanned aerial vehicle to be landed to land according to the offset comprises:
calculating an average distance offset according to the actual distance offsets of the positioning identifiers relative to the center point of the field of view;
calculating an average angle offset according to the angle offset of each positioning identifier;
and guiding the unmanned aerial vehicle to be landed to land according to the average distance offset and the average angle offset.
6. An unmanned aerial vehicle landing guidance method as defined in claim 5, wherein the guiding the unmanned aerial vehicle to be landed to land according to the average distance offset and the average angle offset comprises:
calculating a distance adjustment amount according to the average distance offset;
calculating an angle adjustment according to the average angle offset;
and controlling the unmanned aerial vehicle to be landed to move and rotate according to the distance adjustment amount and the angle adjustment amount so as to finish landing.
7. An unmanned aerial vehicle landing guidance method as defined in any of claims 1-6, wherein after guiding the unmanned aerial vehicle to be landed to land according to the offset, the method further comprises:
descending the unmanned aerial vehicle to be landed from the current height to a new height according to the offset and a preset landing amplitude;
maintaining the unmanned aerial vehicle to be landed at a new height, and acquiring new offset between the unmanned aerial vehicle to be landed and each positioning identifier;
continuing to guide the unmanned aerial vehicle to be landed to land according to the new offset;
and continuing to execute the step of descending the unmanned aerial vehicle to be landed from the current height to a new height according to the offset and the preset landing amplitude, until the unmanned aerial vehicle to be landed lands on the apron.
8. An unmanned aerial vehicle landing guide device, characterized in that a pan-tilt camera is arranged on the unmanned aerial vehicle;
the unmanned aerial vehicle landing guide device includes:
the detection module is used for detecting whether the unmanned aerial vehicle to be landed has landing requirements;
the shooting module is used for identifying a plurality of positioning marks on the apron through the pan-tilt camera when the unmanned aerial vehicle to be landed has landing requirements;
the calculation module is used for acquiring the offset between the unmanned aerial vehicle to be landed and each positioning mark;
and the control module is used for guiding the unmanned aerial vehicle to be landed to land according to the offset.
9. An unmanned aerial vehicle landing guidance apparatus, characterized in that the unmanned aerial vehicle landing guidance apparatus includes: a memory, a processor, and a drone landing guidance program stored on the memory and executable on the processor, the drone landing guidance program being configured to implement the drone landing guidance method of any one of claims 1-7.
10. A storage medium having stored thereon a drone landing guidance program that, when executed by a processor, implements a drone landing guidance method according to any one of claims 1 to 7.
CN202210085062.1A 2022-01-25 2022-01-25 Unmanned aerial vehicle landing guiding method, device, equipment and storage medium Pending CN114527792A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210085062.1A CN114527792A (en) 2022-01-25 2022-01-25 Unmanned aerial vehicle landing guiding method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210085062.1A CN114527792A (en) 2022-01-25 2022-01-25 Unmanned aerial vehicle landing guiding method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114527792A true CN114527792A (en) 2022-05-24

Family

ID=81620170

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210085062.1A Pending CN114527792A (en) 2022-01-25 2022-01-25 Unmanned aerial vehicle landing guiding method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114527792A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104049641A (en) * 2014-05-29 2014-09-17 深圳市大疆创新科技有限公司 Automatic landing method and device and air vehicle
CN107544550A (en) * 2016-06-24 2018-01-05 西安电子科技大学 A kind of Autonomous Landing of UAV method of view-based access control model guiding
CN106527481A (en) * 2016-12-06 2017-03-22 重庆零度智控智能科技有限公司 Unmanned aerial vehicle flight control method, device and unmanned aerial vehicle
CN108475070A (en) * 2017-04-28 2018-08-31 深圳市大疆创新科技有限公司 A kind of control method, control device and the unmanned plane of the landing of unmanned plane palm
CN108549397A (en) * 2018-04-19 2018-09-18 武汉大学 The unmanned plane Autonomous landing method and system assisted based on Quick Response Code and inertial navigation
CN110989661A (en) * 2019-11-19 2020-04-10 山东大学 Unmanned aerial vehicle accurate landing method and system based on multiple positioning two-dimensional codes
CN110991207A (en) * 2019-11-19 2020-04-10 山东大学 Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition
CN111142546A (en) * 2019-11-22 2020-05-12 航天时代飞鸿技术有限公司 Multi-rotor unmanned aerial vehicle accurate landing guiding system and method
CN113495569A (en) * 2021-06-17 2021-10-12 上海大风技术有限公司 Unmanned aerial vehicle accurate landing method based on autonomous identification
CN113946157A (en) * 2021-11-29 2022-01-18 无锡科若斯科技有限公司 Fixed-point unmanned aerial vehicle landing method and system based on multifunctional identification and positioning

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115291624A (en) * 2022-07-11 2022-11-04 广州中科云图智能科技有限公司 Unmanned aerial vehicle positioning landing method, storage medium and computer equipment
CN115291624B (en) * 2022-07-11 2023-11-28 广州中科云图智能科技有限公司 Unmanned aerial vehicle positioning landing method, storage medium and computer equipment
CN117809261A (en) * 2024-02-29 2024-04-02 西安猎隼航空科技有限公司 Unmanned aerial vehicle image processing method based on deep learning
CN117809261B (en) * 2024-02-29 2024-05-28 西安猎隼航空科技有限公司 Unmanned aerial vehicle image processing method based on deep learning

Similar Documents

Publication Publication Date Title
US20230350428A1 (en) Methods and system for autonomous landing
CN113038016B (en) Unmanned aerial vehicle image acquisition method and unmanned aerial vehicle
CN114527792A (en) Unmanned aerial vehicle landing guiding method, device, equipment and storage medium
US20200346753A1 (en) Uav control method, device and uav
WO2017075964A1 (en) Unmanned aerial vehicle photographing control method, unmanned aerial vehicle photographing method, mobile terminal and unmanned aerial vehicle
CN106124517A (en) Detect many rotor wing unmanned aerial vehicles detection platform system in structural member surface crack and for the method detecting structural member surface crack
CN110692027A (en) System and method for providing easy-to-use release and automatic positioning of drone applications
CN107167138B (en) A kind of library's intelligence Way guidance system and method
CN105857582A (en) Method and device for adjusting shooting angle, and unmanned air vehicle
CN110622091A (en) Cloud deck control method, device and system, computer storage medium and unmanned aerial vehicle
JPWO2016059877A1 (en) Control apparatus, control method, and air vehicle device
CN111123964B (en) Unmanned aerial vehicle landing method and device and computer readable medium
US20210341924A1 (en) Photography control method and mobile platform
US11924539B2 (en) Method, control apparatus and control system for remotely controlling an image capture operation of movable device
CN205920057U (en) Detect fissured many rotor unmanned aerial vehicle testing platform system in structure surface
KR102122755B1 (en) Gimbal control method using screen touch
CN106292126A (en) A kind of intelligence aerial survey flight exposal control method, unmanned aerial vehicle (UAV) control method and terminal
CN110879617A (en) Infrared-guided unmanned aerial vehicle landing method and device
CN117519125A (en) Control method of self-mobile device
CN113867373A (en) Unmanned aerial vehicle landing method and device, parking apron and electronic equipment
JP2020138681A (en) Control system for unmanned flight vehicle
WO2021056139A1 (en) Method and device for acquiring landing position, unmanned aerial vehicle, system, and storage medium
WO2020237422A1 (en) Aerial surveying method, aircraft and storage medium
CN111382971A (en) Unmanned aerial vehicle multipoint automatic distribution method and device
CN115046531A (en) Pole tower measuring method based on unmanned aerial vehicle, electronic platform and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination