CN113272625A - Aircraft positioning method and device, aircraft and storage medium

Info

Publication number: CN113272625A
Application number: CN202080007330.7A
Authority: CN (China)
Prior art keywords: data, visual perception, detection, height, aircraft
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 刘新俊 (Liu Xinjun), 高翔 (Gao Xiang), 王凯 (Wang Kai)
Current and original assignee: SZ DJI Technology Co Ltd
Application filed by SZ DJI Technology Co Ltd

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/005 Navigation; navigational instruments not provided for in groups G01C1/00 - G01C19/00, with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/165 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement


Abstract

A method, an apparatus, an aircraft (100), and a storage medium for positioning the aircraft (100). The method comprises: acquiring visual perception data of the aircraft (100) (S101); acquiring, by at least one sensor device (50), at least one piece of detection data corresponding to the visual perception data (S102); obtaining a detection value corresponding to the visual perception data according to the at least one piece of detection data (S103); and positioning the aircraft (100) based on the detection value (S104).

Description

Aircraft positioning method and device, aircraft and storage medium
Technical Field
The present application relates to the field of aircraft control technologies, and in particular, to an aircraft positioning method and apparatus, an aircraft, and a storage medium.
Background
Aircraft such as unmanned aerial vehicles (UAVs) are widely used in fields such as aerial photography, agricultural plant protection, power-line inspection, disaster relief, and cruising. In aircraft navigation and positioning, the aircraft is generally positioned based on visual perception data, such as the visually perceived height, which serves as core data. If the visual lens is blocked by a payload, has not been calibrated, or is calibrated incorrectly, the visual perception data may be abnormal, which directly leads to inaccurate positioning results and abnormal flight of the aircraft.
To avoid flight anomalies caused by incorrect positioning, internal detection methods are currently used, which suppress the output of visual perception data when an anomaly is detected. However, such methods cannot detect some calibration-type anomalies, so abnormal visual perception data may still be output and cause wrong positioning, and flight safety cannot be guaranteed.
Therefore, improving the positioning accuracy of the aircraft, and thereby improving flight safety, has become an urgent problem to be solved.
Disclosure of Invention
Based on this, the present application provides an aircraft positioning method and apparatus, an aircraft, and a storage medium, aiming to improve the accuracy of aircraft positioning and thereby improve flight safety.
In a first aspect, the present application provides an aircraft positioning method comprising:
acquiring visual perception data of an aircraft;
acquiring, by at least one sensor device, at least one piece of detection data corresponding to the visual perception data;
obtaining a detection value corresponding to the visual perception data according to the at least one piece of detection data;
and positioning the aircraft based on the detection value.
In a second aspect, the present application further provides an aircraft positioning device comprising a memory and a processor;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
acquiring visual perception data of an aircraft;
acquiring, by at least one sensor device, at least one piece of detection data corresponding to the visual perception data;
obtaining a detection value corresponding to the visual perception data according to the at least one piece of detection data;
and positioning the aircraft based on the detection value.
In a third aspect, the present application further provides an aircraft, where the aircraft includes a body, a power system disposed in the body, and an aircraft positioning device as described above, where the power system is used to provide power for the aircraft, and the aircraft positioning device is used to position the aircraft.
In a fourth aspect, the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to carry out the aircraft positioning method as described above.
The embodiments of the present application provide an aircraft positioning method and apparatus, an aircraft, and a storage medium. When the aircraft is positioned, the visual perception data of the aircraft is acquired, at least one piece of detection data corresponding to the visual perception data is collected by at least one sensor device, a detection value corresponding to the visual perception data is obtained according to the at least one piece of detection data, and the aircraft is positioned based on the detection value. This prevents abnormal visual perception data from being output and causing wrong positioning, thereby improving the positioning accuracy of the aircraft and, in turn, flight safety.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a block schematic diagram of a structure of an aircraft provided by an embodiment of the present application;
FIG. 2 is a flow chart illustrating steps of an aircraft positioning method according to an embodiment of the present application;
FIG. 3 is a flow diagram illustrating sub-steps of the aircraft location method of FIG. 2;
FIG. 4 is a schematic flow chart illustrating a process for determining whether visual perception data is abnormal according to an embodiment of the present application;
FIG. 5 is a flow diagram illustrating sub-steps of the aircraft location method of FIG. 2;
FIG. 6 is a schematic diagram of another process for determining whether visual perception data is abnormal according to an embodiment of the present application;
fig. 7 is a schematic block diagram of a structure of an aircraft positioning device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments. The described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the given embodiments without creative effort fall within the protection scope of the present application.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
The flow diagrams depicted in the figures are merely illustrative and do not necessarily include all of the elements and operations/steps, nor do they necessarily have to be performed in the order depicted. For example, some operations/steps may be decomposed, combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Aircraft such as unmanned aerial vehicles (UAVs) are widely used in fields such as aerial photography, agricultural plant protection, power-line inspection, disaster relief, and cruising. In aircraft navigation and positioning, the aircraft is generally positioned based on visual perception data, such as the visually perceived height, which serves as core data. If the visual lens is blocked by a payload, has not been calibrated, or is calibrated incorrectly, the visual perception data may be abnormal, which directly leads to inaccurate positioning results and abnormal flight of the aircraft.
To avoid flight anomalies caused by incorrect positioning, internal detection methods are currently used, which suppress the output of visual perception data when an anomaly is detected. However, such methods cannot detect some calibration-type anomalies, so abnormal visual perception data may still be output and cause wrong positioning, and flight safety cannot be guaranteed.
Based on the above problems, embodiments of the present application provide an aircraft positioning method, an aircraft positioning device, an aircraft, and a storage medium, so as to improve accuracy of aircraft positioning, and further improve flight safety.
Referring to fig. 1, fig. 1 is a schematic structural block diagram of an aircraft provided in an embodiment of the present application. As shown in fig. 1, the aircraft 100 includes an airframe 10, a power system 20 disposed in the airframe 10, and an aircraft positioning device 30, where the power system 20 is used to provide power for the aircraft 100 and the aircraft positioning device 30 is used to position the aircraft 100.
The aircraft 100 may be a rotary wing type drone, and of course, the aircraft may also be other types of drones or mobile devices, and the embodiments of the present application are not limited thereto.
Illustratively, the aircraft 100 also includes a visual perception system 40, including but not limited to visual lenses, visual sensors, and the like. During flight, the aircraft 100 acquires visual images through the visual perception system 40, and the aircraft positioning device 30 obtains corresponding visual perception data from the visual images to position the aircraft 100.
In some embodiments, to position the aircraft 100 accurately, the aircraft 100 further includes one or more sensor devices 50 configured to detect one or more pieces of detection data during flight. The sensor devices 50 include, but are not limited to, an IMU (Inertial Measurement Unit), a GNSS (Global Navigation Satellite System) receiver, an RTK (Real-Time Kinematic) module, a barometer, an ultrasonic sensor, a TOF (Time of Flight) sensor, a laser sensor, and the like. The detection data detected by the one or more sensor devices 50 includes, but is not limited to, absolute altitude, relative height, acceleration, vertical velocity, horizontal velocity, pitch angle, and the like. The aircraft positioning device 30 positions the aircraft 100 based on the visual perception data and the detection data detected by the one or more sensor devices 50.
Taking the aircraft 100 as a drone as an example, the drone may have one or more propulsion units that allow it to fly in the air. The one or more propulsion units may move the drone in one or more, two or more, three or more, four or more, five or more, or six or more degrees of freedom. In some cases, the drone may rotate about one, two, three, or more rotation axes. The rotation axes may be perpendicular to each other and may remain perpendicular to each other throughout the flight of the drone. The rotation axes may include a pitch axis, a roll axis, and/or a yaw axis. The drone may also be movable in one or more dimensions. For example, the drone can move upward due to the lift generated by one or more rotors. In some cases, the drone may move along a Z axis (which may be upward with respect to the drone's orientation), an X axis, and/or a Y axis (which may be lateral). The drone may move along one, two, or three axes that are perpendicular to each other.
The drone may be a rotorcraft. In some cases, the drone may be a multi-rotor aircraft that includes multiple rotors. The rotors may rotate to generate lift for the drone; they serve as propulsion units that allow the drone to move freely in the air. The rotors may rotate at the same rate and/or generate the same amount of lift or thrust, or they may rotate at different rates, generating different amounts of lift or thrust and/or allowing the drone to rotate. In some cases, one, two, three, four, five, six, seven, eight, nine, ten, or more rotors may be provided on the drone. The rotors may be arranged with their rotation axes parallel to each other. In some cases, the rotation axes of the rotors may be at any angle relative to each other, which may affect the motion of the drone.
The drone may have a plurality of rotors. The rotor may be connected to the body of the drone, which may contain a control unit, an Inertial Measurement Unit (IMU), a processor, a battery, a power source, and/or other sensors. The rotor may be connected to the body by one or more arms or extensions that branch off from a central portion of the body. For example, one or more arms may extend radially from the central body of the drone and may have rotors at or near the ends of the arms.
It will be appreciated that the above nomenclature for the various components of the aircraft is for identification purposes only, and does not limit the embodiments of the present application accordingly.
Hereinafter, the aircraft positioning method provided by the embodiment of the application will be described in detail based on an aircraft and an aircraft positioning device in the aircraft. It should be noted that the aircraft in fig. 1 is only used for explaining the aircraft positioning method provided in the embodiment of the present application, and does not constitute a limitation on an application scenario of the aircraft positioning method.
Referring to fig. 2, fig. 2 is a schematic flowchart of an aircraft positioning method according to an embodiment of the present application. The method can be used in the aircraft positioning device provided by the above embodiment to improve the accuracy of aircraft positioning and thereby flight safety.
As shown in fig. 2, the aircraft positioning method specifically includes steps S101 to S104.
S101, visual perception data of the aircraft are obtained.
The visual perception data includes, but is not limited to, the visually perceived height, horizontal velocity, vertical velocity, pitch angle, and the like. The visual perception data is core data for aircraft positioning; if it is abnormal, positioning errors occur. Therefore, when the aircraft is to be positioned, its visual perception data is acquired first.
In some embodiments, the visual perception data is obtained through the visual perception system of the aircraft, where the visual perception system includes, but is not limited to, visual lenses, visual sensors, and the like. Specifically, during flight, corresponding visual images are acquired through the visual perception system, and image features are extracted from these visual images to obtain the corresponding visual perception data. For example, the visual image is subjected to digital image processing and feature point extraction to obtain information such as image coordinates, from which the visual perception data of the aircraft is calculated.
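As a rough illustration of this step only (the patent does not specify a particular algorithm, so the OpenCV call, function name, and thresholds below are assumptions), a minimal Python sketch of feature point extraction from a visual image might look like:

```python
import cv2
import numpy as np

def extract_feature_points(gray_image: np.ndarray,
                           max_corners: int = 200) -> np.ndarray:
    """Detect trackable feature points in a grayscale visual image and
    return their image coordinates as an (N, 2) array, the kind of
    intermediate a visual perception pipeline could build on."""
    corners = cv2.goodFeaturesToTrack(
        gray_image,
        maxCorners=max_corners,
        qualityLevel=0.01,   # minimum accepted corner quality (assumed)
        minDistance=10,      # minimum pixel spacing between corners (assumed)
    )
    return np.empty((0, 2)) if corners is None else corners.reshape(-1, 2)
```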
S102, collecting at least one piece of detection data corresponding to the visual perception data through at least one sensor device.
In order to position the aircraft accurately, in addition to acquiring the visual perception data, at least one piece of detection data corresponding to the visual perception data is collected by at least one sensor device. Sensor devices include, but are not limited to, the IMU, GNSS, RTK, barometer, TOF sensor, and the like. The corresponding detection data, including but not limited to absolute altitude, relative height, acceleration, vertical velocity, horizontal velocity, pitch angle, and so on, is collected by the one or more sensor devices.
For example, detection data are obtained by the IMU measuring the three-axis attitude angles (or angular rates) and accelerations of the aircraft. The IMU contains three acceleration sensors that detect the aircraft's acceleration components relative to the local vertical, and three angular rate sensors (gyros) that detect the aircraft's angular information, such as the pitch angle, inclination angle, and sideslip angle.
The GNSS is a space-based radio navigation and positioning system that can provide users with all-weather three-dimensional coordinates, velocity, and time information anywhere on the earth's surface or in near-earth space. Detection data are obtained by measuring the three-dimensional coordinates, velocity, and so on of the aircraft through the GNSS.
RTK is a real-time kinematic positioning technique based on carrier-phase observations; it can provide the three-dimensional position of a station in a specified coordinate system in real time with centimetre-level accuracy. Three-dimensional coordinate data of the aircraft can be detected by RTK.
The physical quantity measured by the barometer is the atmospheric pressure, from which the corresponding absolute altitude can be calculated. During flight, the absolute altitude of the aircraft is acquired through the barometer.
TOF measures the distance to an object by continuously emitting light pulses toward it and receiving the returned light with a sensor, using the (round-trip) flight time of the pulses. During flight, the relative height of the aircraft above the ground is obtained through TOF.
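The barometer and TOF conversions described above are standard physics. A minimal sketch in Python, assuming the international standard atmosphere constants for the barometric formula (the constants and function names are illustrative, not from the patent):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def barometric_altitude(pressure_pa: float,
                        sea_level_pa: float = 101_325.0) -> float:
    """Absolute altitude (m) from static pressure via the standard
    barometric formula for the lower atmosphere."""
    return 44_330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))

def tof_relative_height(round_trip_s: float) -> float:
    """Relative height (m) from the round-trip flight time of a TOF
    light pulse: the pulse covers the distance twice."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0
```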
S103, obtaining a detection value corresponding to the visual perception data according to the at least one detection data.
After the corresponding at least one piece of detection data is collected by the one or more sensor devices, the acquired visual perception data is checked against the at least one piece of detection data, and a detection value corresponding to the visual perception data is obtained; this detection value represents the detection result of the visual perception data.
In some embodiments, specifically, whether the visual perception data is abnormal is detected through the at least one piece of detection data, and the detection value corresponding to the visual perception data is obtained from the normal-or-abnormal result, where the detection value obtained when the visual perception data is normal differs from the one obtained when it is abnormal.
Illustratively, if the visual perception data is determined to be normal, obtaining a first detection value corresponding to the visual perception data; and if the visual perception data are determined to be abnormal, obtaining a second detection value corresponding to the visual perception data. Wherein the first detection value is different from the second detection value. For example, it is preset that visual perception data corresponds to a detection value of 0 when normal, and corresponds to a detection value of 1 when abnormal. When the visual perception data are determined to be normal, obtaining a detection value 0 corresponding to the visual perception data; otherwise, when the visual perception data is determined to be abnormal, the detection value 1 corresponding to the visual perception data is obtained.
In some embodiments, when the visual perception data is detected, whether it is abnormal is determined by mutually checking the visual perception data and the at least one piece of detection data. That is, the visual perception data and the at least one piece of detection data are compared pairwise, and their consistency is used to determine whether the visual perception data is abnormal.
In some embodiments, as shown in fig. 3, the mutually examining the visual perception data and the at least one detection data to determine whether the visual perception data is abnormal specifically includes substeps S1031 to substep S1033.
S1031, acquiring a first variable quantity corresponding to the visual perception data in a preset time length, and a plurality of second variable quantities corresponding to the detection data in the preset time length respectively;
s1032, if there is an inconsistent variation in the first variation and the plurality of second variations, and the inconsistent variation is the first variation, determining that the visual perception data is abnormal;
s1033, if the first variation and the plurality of second variations are all the same, or there is an inconsistent variation in the first variation and the plurality of second variations and the inconsistent variation is the second variation, determining that the visual perception data is normal.
In order to detect the visual perception data accurately, in this embodiment it is not the raw values that are mutually checked, but their variations: whether the visual perception data is abnormal is determined by mutually checking the variation of the visual perception data against the variations of the at least one piece of detection data. Specifically, the variation of the visual perception data over a period of time and the variations of the plurality of detection data over the same period are obtained first, for example over a preset duration. It is understood that the preset duration can be set flexibly according to the actual situation and is not specifically limited here. For ease of description, the variation of the visual perception data within the preset duration is referred to as the first variation, and the variation of each piece of detection data within the preset duration is referred to as a second variation.
Then, the acquired first variation and the plurality of second variations are mutually checked for consistency. In one case, if there is an inconsistent variation among the first variation and the plurality of second variations, and the inconsistent variation is the first variation, meaning that the first variation is inconsistent with the plurality of second variations, the visual perception data is determined to be abnormal.
In the other case, if the first variation and the plurality of second variations are all consistent, all data (the visual perception data and the plurality of detection data) agree, and the visual perception data is determined to be normal. Alternatively, if inconsistent variations exist but they are second variations, the visual perception data is consistent with most of the detection data and only a few detection data disagree; in this case, too, the visual perception data is determined to be normal.
Illustratively, taking the visually perceived height as an example, the variation of the visually perceived height within the preset duration and the variations of the heights detected by the plurality of sensor devices within the same duration are obtained, the consistency between the visually perceived height variation and the sensor height variations is determined, and from this it is determined whether the visually perceived height is abnormal.
For example, as shown in fig. 4, the GNSS height detected by the GNSS, the TOF height detected by the TOF sensor, and the visually perceived height are obtained, and their respective variations within the preset duration are calculated. These three height variations are then mutually checked and compared pairwise to determine their consistency, and thus whether the visually perceived height is abnormal. Specifically, if the variations of the GNSS height, the TOF height, and the visually perceived height are all consistent, the visually perceived height is determined to be normal. If the variation of the GNSS height is consistent with that of the visually perceived height but not with that of the TOF height, the visually perceived height is determined to be normal and the TOF height abnormal. If the variation of the TOF height is consistent with that of the visually perceived height but not with that of the GNSS height, the visually perceived height is determined to be normal and the GNSS height abnormal. If the variation of the GNSS height is consistent with that of the TOF height but not with that of the visually perceived height, the visually perceived height is determined to be abnormal.
The above takes the visually perceived height as an example. It should be noted that determining whether other visual perception data, such as the vertical velocity and the horizontal velocity, are abnormal is similar and is therefore not repeated here.
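A minimal sketch of this mutual check, assuming each source reports its variation over the same preset duration and that "consistent" means agreement within a caller-supplied tolerance (the tolerance, sensor names, and majority rule are illustrative assumptions, not specified by the patent):

```python
def check_visual_variation(visual_delta: float,
                           sensor_deltas: dict[str, float],
                           tol: float = 0.5) -> bool:
    """Mutual check of the first variation (visual) against the
    second variations (sensors) over the same preset duration.

    Returns True (visual data normal) when all variations agree, or
    when only a minority of sensor variations disagree; returns False
    (visual data abnormal) when the visual variation is the odd one out.
    """
    agree = [name for name, delta in sensor_deltas.items()
             if abs(delta - visual_delta) <= tol]
    disagree = [name for name in sensor_deltas if name not in agree]

    if not disagree:                  # all data consistent -> normal
        return True
    if len(agree) >= len(disagree):   # only some sensors disagree:
        return True                   # those sensors are abnormal, not vision
    return False                      # visual variation is the inconsistent one

# Height example from the text: visual agrees with GNSS but not TOF,
# so the visually perceived height is judged normal (TOF abnormal).
print(check_visual_variation(2.0, {"gnss": 1.9, "tof": 4.0}))  # True
```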
In some embodiments, as shown in fig. 5, step S103 specifically includes sub-steps S1034 to S1036.
S1034, acquiring a first altitude and a first vertical speed corresponding to the historical positioning of the aircraft.
In this embodiment, the data corresponding to the current positioning of the aircraft, such as its current height and current velocity, are predicted from the data corresponding to its historical positioning. The predicted data are compared with the current visual perception data, and whether the visual perception data is abnormal is determined from the comparison result.
Specifically, the height and vertical velocity corresponding to the aircraft's historical positioning data are acquired, for example those corresponding to the positioning performed a time t before the current time. For ease of description, the height corresponding to the historical positioning data is hereinafter referred to as the first height, and the vertical velocity as the first vertical velocity. Illustratively, the first height is obtained by fusing the historical visually perceived height at the time of the historical positioning with a plurality of historical detection heights detected by the plurality of sensor devices, and the first vertical velocity is obtained by fusing the historical visually perceived velocity with a plurality of historical detection velocities.
And S1035, calculating a current second altitude of the aircraft according to the first altitude and the first vertical speed.
Based on the obtained first height and first vertical velocity, and the time to which they correspond, the current height of the aircraft is calculated; for ease of description, this calculated current height is hereinafter referred to as the second height. For example, if the first height is H0 and the first vertical velocity is v, the current second height H1 of the aircraft is calculated as H1 = H0 + v * t, where t is the time difference between the time corresponding to the first height and first vertical velocity and the current time.
S1036, comparing the second height with the visual perception height to obtain a detection value corresponding to the visual perception height.
The calculated second height is then compared with the visually perceived height, and a detection value corresponding to the visually perceived height is obtained from the consistency of the comparison. Optionally, height gross-error detection is performed between the calculated second height and the visually perceived height: if the two are consistent, the visually perceived height is determined to be normal; otherwise it is determined to be abnormal. For example, a preset range is set for the height difference. If the height difference between the second height and the visually perceived height does not exceed the preset range, the two are determined to be consistent and the visually perceived height normal; otherwise, the two are determined to be inconsistent and the visually perceived height abnormal.
For example, as shown in fig. 6, a fused height and a fused vertical velocity are output by fusion calculation based on the historical data corresponding to the aircraft's historical positioning, and the predicted current height of the aircraft is calculated according to the formula H1 = H0 + v * t. Then, height gross-error detection is performed between the predicted current height and the obtained visually perceived height: if they are consistent, the visually perceived height is determined to be normal; otherwise, it is determined to be abnormal.
A detection value corresponding to the visually perceived height is then obtained according to whether it was determined to be abnormal. For example, if the visually perceived height is determined to be normal, a first detection value corresponding to it is obtained; otherwise, a second detection value is obtained.
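A compact sketch of this prediction-and-compare check, using the detection value convention (0 normal, 1 abnormal) mentioned earlier; the tolerance is an assumed placeholder for the preset range:

```python
def height_detection_value(first_height: float, first_vspeed: float,
                           dt: float, visual_height: float,
                           max_diff: float = 1.0) -> int:
    """Predict the current second height as H1 = H0 + v * t, then
    gross-error check it against the visually perceived height.

    Returns 0 (first detection value: normal) when the height
    difference stays within the preset range, otherwise 1 (second
    detection value: abnormal).
    """
    predicted = first_height + first_vspeed * dt
    return 0 if abs(predicted - visual_height) <= max_diff else 1
```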
And S104, positioning the aircraft based on the detection value.
After the detection value corresponding to the visual perception data is obtained through the at least one piece of detection data, the aircraft is positioned based on the obtained detection value, where different detection values lead to different positioning processing.
For example, when the first detection value is obtained, that is, when the visual perception data is normal, the aircraft is positioned based on the visual perception data. Positioning the aircraft on normal visual perception data avoids positioning errors and improves flight safety.
When the second detection value is obtained, that is, when the visual perception data is abnormal, no positioning is performed based on the visual perception data. Optionally, when the second detection value is obtained, the process returns to step S101 to re-acquire visual perception data; or corresponding prompt information about the abnormal visual perception data is output to trigger the corresponding exception handling. The prompt information includes, but is not limited to, a text prompt and/or a voice prompt.
In some embodiments, to further improve the accuracy of the aircraft positioning, a confidence analysis is also performed on the acquired visual perception data. Optionally, the confidence of the visual perception data is determined according to output parameters corresponding to the visual images acquired by the visual perception system of the aircraft. The output parameters corresponding to the visual image comprise at least one of feature average depth, feature point number and image brightness corresponding to the visual image.
When the features in the scene are near the aircraft, the output corresponding to the visual image is reliable: the smaller the average feature depth corresponding to the visual image, the higher the confidence of the output visual perception data.
Under good illumination, the visual image is bright and of high quality, and its output is reliable: the higher the image brightness corresponding to the visual image, the higher the confidence of the output visual perception data.
When the visual image contains many feature points, its output is likewise reliable: the more feature points the visual image has, the higher the confidence of the output visual perception data.
In some embodiments, to reduce unnecessary computation, only low-confidence visual perception data is subjected to anomaly detection, while high-confidence data is not. Specifically, a confidence threshold is preset; after the confidence of the visual perception data is obtained, it is compared with this threshold. If the confidence is below the preset threshold, that is, the confidence of the visual perception data is low, whether the visual perception data is abnormal is determined from the at least one piece of detection data collected by the at least one sensor device, so as to obtain the corresponding detection value.
If the confidence of the visual perception data is greater than or equal to the preset confidence threshold value, that is, the confidence of the visual perception data is high, optionally, the aircraft is positioned directly based on the visual perception data.
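One possible way to score the three output parameters into a single confidence and gate the anomaly check on it; the scoring formulas, normalizations, and threshold are assumptions, since the patent only fixes the monotonic relationships:

```python
def visual_confidence(mean_feature_depth_m: float,
                      num_feature_points: int,
                      image_brightness: float) -> float:
    """Heuristic confidence in [0, 1]: higher for nearer features,
    more feature points, and brighter images, matching the monotonic
    relationships described in the text."""
    depth_score = 1.0 / (1.0 + mean_feature_depth_m / 10.0)  # nearer -> higher
    count_score = min(num_feature_points / 100.0, 1.0)       # more -> higher
    light_score = min(image_brightness / 128.0, 1.0)         # brighter -> higher
    return (depth_score + count_score + light_score) / 3.0

CONFIDENCE_THRESHOLD = 0.5  # preset confidence threshold (assumed value)

def needs_anomaly_check(confidence: float) -> bool:
    """Only low-confidence visual perception data goes through the
    anomaly-detection step; high-confidence data is used directly."""
    return confidence < CONFIDENCE_THRESHOLD
```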
In some embodiments, in the process of positioning the aircraft, if the detection value corresponding to the obtained visual perception data is the first detection value, that is, the visual perception data is normal, the visual perception data and the at least one piece of detection data are fused to obtain corresponding fused data. It should be noted that there are various methods for fusing the visual perception data with the at least one piece of detection data, for example the Extended Kalman Filter (EKF) method. The specific data fusion method is not specifically limited in this application.
Thereafter, the aircraft is positioned based on the fusion data, such as fusion altitude, fusion speed, and the like.
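The patent names EKF fusion but gives no concrete formulation; as an indicative stand-in, the scalar Kalman measurement update below shows how a predicted height could be fused with visual and sensor height readings (all numerical values are illustrative):

```python
def kalman_update(est: float, est_var: float,
                  meas: float, meas_var: float) -> tuple[float, float]:
    """One scalar Kalman measurement update: fuse the current estimate
    with one measurement, weighting by their variances."""
    gain = est_var / (est_var + meas_var)
    fused = est + gain * (meas - est)
    return fused, (1.0 - gain) * est_var

# Fuse a predicted height (10.0 m) with a visual reading and a GNSS
# reading; lower-variance sources pull the fused height harder.
height, var = 10.0, 4.0
for reading, reading_var in [(10.4, 1.0), (9.8, 2.0)]:
    height, var = kalman_update(height, var, reading, reading_var)
```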
For example, before data fusion is performed on the visual perception data and the at least one detection data to obtain corresponding fused data, it is further determined whether the at least one detection data is abnormal, and a detection value corresponding to the at least one detection data is obtained. Optionally, the visual perception data and the at least one detection data are mutually detected to obtain a detection value corresponding to the at least one detection data.
Specifically, the visual perception data and the at least one detection data are mutually detected, whether abnormal detection data exists in the at least one detection data is determined, that is, whether inconsistent detection data exists in the at least one detection data is determined, and the specific mutual detection operation may refer to an operation of determining whether the visual perception data is abnormal, which is not described herein again.
And then, according to the determined result, obtaining a detection value corresponding to at least one detection data. Illustratively, if it is determined that there is no abnormal detection data in the at least one detection data, a third detection value corresponding to the at least one detection data is obtained; on the contrary, if the abnormal detection data exists in the at least one detection data, the fourth detection value corresponding to the at least one detection data is obtained. Wherein the third detection value is different from the fourth detection value.
And when the third detection value is obtained, namely the abnormal detection data does not exist in the at least one piece of detection data, performing data fusion on the normal visual perception data and the at least one piece of detection data to obtain corresponding fusion data. And when the fourth detection value is obtained, namely abnormal detection data exists in the at least one piece of detection data, performing data fusion on the visual perception data and other detection data except the abnormal detection data in the at least one piece of detection data to obtain corresponding fusion data.
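Putting the mutual check and the fusion together, abnormal detection data can be excluded before fusing; a simple averaging sketch under the same assumed tolerance as above:

```python
def fuse_excluding_abnormal(visual: float, sensors: dict[str, float],
                            tol: float = 0.5) -> float:
    """Fuse the (normal) visual perception data with only those
    detection data that are consistent with it; an abnormal reading
    is excluded, mirroring the fourth-detection-value branch."""
    inliers = [value for value in sensors.values()
               if abs(value - visual) <= tol]
    return sum([visual] + inliers) / (1 + len(inliers))  # simple average
```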
In some embodiments, before the visual perception data and the at least one piece of detection data are fused into the corresponding fused data, the method further includes: obtaining the confidence of the visual perception data; and determining the weight corresponding to the visual perception data according to a preset correspondence between confidence and weight and the confidence of the visual perception data. The data fusion then includes: fusing the visual perception data and the at least one piece of detection data according to the weight corresponding to the visual perception data to obtain the fused data.
In order to further improve positioning accuracy, a correspondence between the confidence of the visual perception data and a weight is preset, where a higher confidence corresponds to a higher weight. After the visual perception data is obtained, its confidence is determined, for example from output parameters of the visual image such as image brightness, number of feature points, and average feature depth. The weight corresponding to that confidence is then determined from the preset correspondence between confidence and weight.
For example, assume that a confidence of the visual perception data in the interval [a1, a2) corresponds to the weight b1; a confidence in [a2, a3) corresponds to the weight b2; and a confidence in [a3, a4) corresponds to the weight b3. If the confidence a of the visual perception data is determined to lie in [a1, a2), the weight corresponding to the visual perception data is determined to be b1.
Data fusion is then performed on the visual perception data and the at least one piece of detection data based on the determined weight, yielding the corresponding fused data. Thus, the higher the confidence of the visual perception data, the greater its proportion in the data fusion, that is, the greater its influence on the value of the fused data; conversely, the lower the confidence, the smaller its proportion and influence. Taking the confidence of the visual perception data into account as a fusion factor therefore further improves positioning accuracy.
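A sketch of the interval-based weight lookup and the resulting confidence-weighted fusion; the interval boundaries and weights below stand in for the placeholders a1..a4 and b1..b3 in the example above:

```python
# Illustrative stand-ins for the intervals [a1, a2), [a2, a3), [a3, a4)
# and the weights b1, b2, b3 from the example above.
WEIGHT_TABLE = [(0.0, 0.3, 0.2),
                (0.3, 0.7, 0.5),
                (0.7, 1.01, 0.8)]

def visual_weight(confidence: float) -> float:
    """Look up the fusion weight for the confidence of the visual data."""
    for low, high, weight in WEIGHT_TABLE:
        if low <= confidence < high:
            return weight
    return 0.0

def confidence_weighted_fusion(visual: float, sensor_estimate: float,
                               confidence: float) -> float:
    """Weight the visual term by its confidence-derived weight, so that
    higher-confidence visual data influences the fused value more."""
    w = visual_weight(confidence)
    return w * visual + (1.0 - w) * sensor_estimate
```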
According to the aircraft positioning method provided by this embodiment, when the aircraft is positioned, its visual perception data is acquired, at least one piece of detection data corresponding to the visual perception data is collected by at least one sensor device, a detection value corresponding to the visual perception data is obtained from the at least one piece of detection data, and the aircraft is positioned based on that detection value. This prevents abnormal visual perception data from being output and causing wrong positioning, thereby improving the positioning accuracy of the aircraft and, in turn, flight safety.
Referring to fig. 7, fig. 7 is a schematic block diagram illustrating a structure of an aircraft positioning device according to an embodiment of the present application. The aircraft positioning device can be applied to aircrafts and control terminals of the aircrafts.
As shown in fig. 7, the aircraft positioning device 700 includes a processor 701 and a memory 702, connected by a bus 703, such as an I2C (Inter-Integrated Circuit) bus. The aircraft positioning device 700 may be applied to a control terminal of an aircraft, where the control terminal communicates with the aircraft to position it and further control its flight. Alternatively, the aircraft positioning device 700 is applied to the aircraft itself to position it.
Specifically, the Processor 701 may be a Micro-controller Unit (MCU), a Central Processing Unit (CPU), a Digital Signal Processor (DSP), or the like.
Specifically, the Memory 702 may be a Flash chip, a Read-Only Memory (ROM), a magnetic disk, an optical disk, a USB disk, a removable hard disk, or the like.
The processor 701 is configured to run a computer program stored in the memory 702, and when executing the computer program, implement the following steps:
acquiring visual perception data of an aircraft;
acquiring, by at least one sensor device, at least one piece of detection data corresponding to the visual perception data;
obtaining a detection value corresponding to the visual perception data according to the at least one piece of detection data;
and positioning the aircraft based on the detection value.
In some embodiments, when implementing the obtaining of the detection value corresponding to the visual perception data according to the at least one detection data, the processor specifically implements:
determining whether the visual perception data is abnormal according to the at least one detection data;
and obtaining a detection value corresponding to the visual perception data according to the determined result.
In some embodiments, when implementing the obtaining of the detection value corresponding to the visual perception data according to the determined result, the processor specifically implements:
and if the visual perception data is determined to be normal, obtaining a first detection value corresponding to the visual perception data.
In some embodiments, when implementing the obtaining of the detection value corresponding to the visual perception data according to the determined result, the processor specifically implements:
and if the visual perception data is determined to be abnormal, obtaining a second detection value corresponding to the visual perception data.
In some embodiments, the processor, when implementing the locating the aircraft based on the detection values, specifically implements:
when the first detection value is obtained, the aircraft is located based on the visual perception data.
In some embodiments, the processor, when implementing the determining whether the visual perception data is abnormal based on the at least one detection data, specifically implements:
and mutually detecting the visual perception data and the at least one detection data to determine whether the visual perception data is abnormal.
In some embodiments, when the processor performs the mutual detection on the visual perception data and the at least one detection data and determines whether the visual perception data is abnormal, the processor specifically performs:
acquiring a first variable quantity corresponding to the visual perception data in a preset time length and a second variable quantity corresponding to a plurality of detection data in the preset time length respectively;
and if the first variable quantity and the plurality of second variable quantities have inconsistent variable quantities, and the inconsistent variable quantities are the first variable quantities, determining that the visual perception data is abnormal.
In some embodiments, after the obtaining of the first variation corresponding to the visual perception data within a preset time period and the second variation corresponding to the plurality of detection data within the preset time period, the processor further implements:
if the first variable quantity and the plurality of second variable quantities are all consistent, or the first variable quantity and the plurality of second variable quantities have inconsistent variable quantities and the inconsistent variable quantities are second variable quantities, determining that the visual perception data are normal.
In some embodiments, the visual perception data comprises at least one of a visual perception altitude, a horizontal velocity, and a vertical velocity.
In some embodiments, the detection data includes absolute altitude and relative height.
In some embodiments, the visual perception data comprises a visual perception height, the processor, when executing the computer program, further implements:
acquiring a first altitude and a first vertical speed corresponding to the historical positioning of the aircraft;
calculating a current second altitude of the aircraft according to the first altitude and the first vertical speed;
and comparing the second height with the visual perception height to obtain a detection value corresponding to the visual perception height.
In some embodiments, when the processor compares the second height with the visual perception height to obtain a detection value corresponding to the visual perception height, the following is specifically implemented:
comparing the second height with the visual perception height to determine whether the visual perception height is abnormal;
and obtaining a detection value corresponding to the visual perception height according to the determined result.
In some embodiments, when implementing the obtaining of the detection value corresponding to the visually perceived height according to the determined result, the processor specifically implements:
and if the visual perception height is determined to be normal, obtaining a first detection value corresponding to the visual perception height.
In some embodiments, when implementing the obtaining of the detection value corresponding to the visually perceived height according to the determined result, the processor specifically implements:
and if the visual perception height is determined to be abnormal, obtaining a second detection value corresponding to the visual perception height.
In some embodiments, when the processor compares the second height with the visual perception height to determine whether the visual perception height is abnormal, the processor specifically implements:
and if the height difference between the second height and the visual perception height does not exceed a preset range, determining that the visual perception height is normal.
In some embodiments, when the processor compares the second height with the visual perception height to determine whether the visual perception height is abnormal, the processor specifically implements:
and if the height difference between the second height and the visual perception height exceeds a preset range, determining that the visual perception height is abnormal.
In some embodiments, the processor, when implementing the acquiring the visual perception data of the aircraft, implements:
acquiring a visual image by a visual perception system of the aircraft;
and extracting image features of the visual image to obtain the visual perception data.
In some embodiments, the processor, when executing the computer program, further implements:
and determining the confidence of the visual perception data according to the output parameters of the visual image.
In some embodiments, the output parameters include at least one of feature average depth, feature point number, and image brightness.
In some embodiments, the smaller the average depth of the features corresponding to the visual image, the higher the confidence in the visual perception data.
In some embodiments, the more feature points corresponding to the visual image, the higher the confidence of the visual perception data.
In some embodiments, the higher the image brightness corresponding to the visual image, the higher the confidence in the visual perception data.
In some embodiments, when implementing the obtaining of the detection value corresponding to the visual perception data according to the at least one detection data, the processor specifically implements:
and if the confidence of the visual perception data is smaller than a preset confidence threshold, obtaining a detection value corresponding to the visual perception data according to the at least one detection data.
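For illustration, a confidence score honoring the three monotonic relationships above, together with the threshold gate, can be sketched as follows; the scaling constants and the 0.6 threshold are arbitrary assumptions of this sketch.

CONFIDENCE_THRESHOLD = 0.6  # hypothetical preset confidence threshold

def visual_confidence(mean_feature_depth, num_feature_points, mean_brightness):
    # Each term follows the stated monotonicity: smaller average depth,
    # more feature points, and higher brightness each raise the confidence.
    depth_term = 1.0 / (1.0 + mean_feature_depth / 10.0)
    points_term = min(num_feature_points / 200.0, 1.0)
    brightness_term = min(mean_brightness / 255.0, 1.0)
    return (depth_term + points_term + brightness_term) / 3.0

confidence = visual_confidence(mean_feature_depth=4.0,
                               num_feature_points=150,
                               mean_brightness=180.0)
if confidence < CONFIDENCE_THRESHOLD:
    # Low confidence: check the visual perception data against the
    # detection data from the other sensors, as described above.
    pass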
In some embodiments, the processor, when implementing the locating the aircraft based on the detection values, specifically implements:
and when the detection value is a first detection value, performing data fusion on the visual perception data and the at least one detection data to obtain corresponding fusion data, and positioning the aircraft based on the fusion data.
In some embodiments, before performing the data fusion of the visual perception data and the at least one detection data to obtain corresponding fused data, the processor further performs:
and mutually detecting the visual perception data and the at least one detection data to obtain a detection value corresponding to the at least one detection data.
In some embodiments, when the processor performs the mutual detection on the visual perception data and the at least one detection data to obtain a detection value corresponding to the at least one detection data, the processor specifically performs:
mutually detecting the visual perception data and the at least one detection data, and determining whether abnormal detection data exist in the at least one detection data;
and obtaining a detection value corresponding to the at least one detection data according to the determined result.
In some embodiments, when the processor obtains the detection value corresponding to the at least one detection data according to the determined result, the processor specifically implements:
and if the at least one piece of detection data is determined to have no abnormal detection data, obtaining a third detection value corresponding to the at least one piece of detection data.
In some embodiments, when the processor obtains the detection value corresponding to the at least one detection data according to the determined result, the processor specifically implements:
and if the abnormal detection data exists in the at least one piece of detection data, obtaining a fourth detection value corresponding to the at least one piece of detection data.
In some embodiments, when the processor performs data fusion on the visual perception data and the at least one detection data to obtain corresponding fusion data, the following is specifically implemented:
and when the third detection value is obtained, performing data fusion on the visual perception data and the at least one detection data to obtain the fusion data.
In some embodiments, when the processor performs data fusion on the visual perception data and the at least one detection data to obtain corresponding fusion data, the following is specifically implemented:
and when the fourth detection value is obtained, performing data fusion on the visual perception data and other detection data except the abnormal detection data in the at least one detection data to obtain the fusion data.
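For illustration, the mutual detection and the exclusion of abnormal detection data might fit together as in the following sketch, which uses agreement of each source's change over the preset window (the idea of the variation-consistency check) as the test; the tolerance and the example values are assumptions of this sketch.

import statistics

AGREEMENT_TOLERANCE = 0.3  # hypothetical tolerance for consistent changes

def mutual_detection(changes):
    # changes maps each data source to its change over the preset window;
    # a source whose change disagrees with the median is flagged abnormal.
    median_change = statistics.median(changes.values())
    return {name: abs(change - median_change) > AGREEMENT_TOLERANCE
            for name, change in changes.items()}

flags = mutual_detection({"vision": 1.9, "barometer": 2.0,
                          "radar": 2.1, "gnss": 3.4})
sources_to_fuse = [name for name, abnormal in flags.items() if not abnormal]
# -> ["vision", "barometer", "radar"]; "gnss" is excluded from the fusion,
#    which corresponds to the fourth-detection-value case above.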
In some embodiments, before performing the data fusion of the visual perception data and the at least one detection data to obtain corresponding fused data, the processor further performs:
obtaining a confidence level of the visual perception data;
determining the weight corresponding to the visual perception data according to the corresponding relation between the preset confidence coefficient and the weight and the confidence coefficient of the visual perception data;
when the processor performs data fusion on the visual perception data and the at least one detection data to obtain corresponding fusion data, the following steps are specifically implemented:
and performing data fusion on the visual perception data and the at least one detection data according to the weight corresponding to the visual perception data to obtain the fusion data.
In some embodiments, the higher the confidence of the visual perception data, the greater the corresponding weight.
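For illustration, a confidence-weighted fusion consistent with the above might be sketched as follows; the linear confidence-to-weight mapping and the equal split of the remaining weight over the other sensors are assumptions of this sketch, since the embodiments only require that a higher confidence yield a larger weight.

def confidence_weighted_fusion(visual_value, confidence, sensor_values):
    # Higher confidence in the visual perception data -> larger weight.
    visual_weight = max(0.0, min(1.0, confidence))
    # Split the remaining weight evenly over the other detection data.
    sensor_weight = (1.0 - visual_weight) / len(sensor_values)
    return visual_weight * visual_value + sensor_weight * sum(sensor_values)

fused_height = confidence_weighted_fusion(10.2, 0.8, [10.4, 10.1])
# 0.8 * 10.2 + 0.1 * (10.4 + 10.1) = 10.21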
It should be noted that, as will be clearly understood by those skilled in the art, for convenience and brevity of description, for the specific working process of the aircraft positioning device described above, reference may be made to the corresponding process in the foregoing embodiment of the aircraft positioning method, which is not repeated here.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program. The computer program includes program instructions that, when executed by a processor, implement the steps of the aircraft positioning method provided in the foregoing embodiments.
The computer readable storage medium may be an internal storage unit of the aircraft or the aircraft positioning device according to any of the foregoing embodiments, for example, a hard disk or a memory of the aircraft or the aircraft positioning device. The computer readable storage medium may also be an external storage device of the aircraft or aircraft positioning device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the aircraft or aircraft positioning device.
It is to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
While the present application has been described with reference to specific embodiments, the scope of protection is not limited thereto; equivalent modifications or substitutions that those skilled in the art can readily conceive within the technical scope disclosed herein shall all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (66)

1. An aircraft positioning method, comprising:
acquiring visual perception data of an aircraft;
acquiring at least one detection data corresponding to the visual perception data through at least one sensor device;
obtaining a detection value corresponding to the visual perception data according to the at least one detection data;
and positioning the aircraft based on the detection value.
2. The method of claim 1, wherein obtaining the detection value corresponding to the visual perception data according to the at least one detection data comprises:
determining whether the visual perception data is abnormal according to the at least one detection data;
and obtaining a detection value corresponding to the visual perception data according to the determined result.
3. The method of claim 2, wherein obtaining the detection value corresponding to the visual perception data according to the determination result comprises:
and if the visual perception data is determined to be normal, obtaining a first detection value corresponding to the visual perception data.
4. The method of claim 2, wherein obtaining the detection value corresponding to the visual perception data according to the determination result comprises:
and if the visual perception data is determined to be abnormal, obtaining a second detection value corresponding to the visual perception data.
5. The method of claim 3, wherein the locating the aircraft based on the detection values comprises:
when the first detection value is obtained, positioning the aircraft based on the visual perception data.
6. The method of claim 2, wherein said determining whether the visual perception data is abnormal based on the at least one detection data comprises:
and mutually detecting the visual perception data and the at least one detection data to determine whether the visual perception data is abnormal.
7. The method of claim 6, wherein said cross-checking the visual perception data and the at least one detection data to determine whether the visual perception data is anomalous comprises:
acquiring a first variation corresponding to the visual perception data within a preset time period and second variations respectively corresponding to a plurality of pieces of detection data within the preset time period;
and if there is an inconsistent variation among the first variation and the plurality of second variations, and the inconsistent variation is the first variation, determining that the visual perception data is abnormal.
8. The method according to claim 7, wherein after acquiring the first variation corresponding to the visual perception data within the preset time period and the second variations respectively corresponding to the plurality of pieces of detection data within the preset time period, the method further comprises:
if the first variation and the plurality of second variations are all consistent, or if there is an inconsistent variation among them and the inconsistent variation is a second variation, determining that the visual perception data is normal.
9. The method of claim 1, wherein the visual perception data comprises at least one of a visual perception altitude, a horizontal velocity, and a vertical velocity.
10. The method of claim 1, wherein the detection data includes an absolute altitude and a relative distance altitude.
11. The method of claim 1, wherein the visual perception data comprises a visual perception height, the method further comprising:
acquiring a first altitude and a first vertical speed corresponding to the historical positioning of the aircraft;
calculating a current second altitude of the aircraft according to the first altitude and the first vertical speed;
and comparing the second height with the visual perception height to obtain a detection value corresponding to the visual perception height.
12. The method of claim 11, wherein comparing the second height with the visually perceived height to obtain a detection value corresponding to the visually perceived height comprises:
comparing the second height with the visual perception height to determine whether the visual perception height is abnormal;
and obtaining a detection value corresponding to the visual perception height according to the determined result.
13. The method of claim 12, wherein obtaining the detection value corresponding to the visually perceived height according to the determination result comprises:
and if the visual perception height is determined to be normal, obtaining a first detection value corresponding to the visual perception height.
14. The method of claim 12, wherein obtaining the detection value corresponding to the visually perceived height according to the determination result comprises:
and if the visual perception height is determined to be abnormal, obtaining a second detection value corresponding to the visual perception height.
15. The method of claim 12, wherein comparing the second height to the visually perceived height to determine whether the visually perceived height is abnormal comprises:
and if the height difference between the second height and the visual perception height does not exceed a preset range, determining that the visual perception height is normal.
16. The method of claim 12, wherein comparing the second height to the visually perceived height to determine whether the visually perceived height is abnormal comprises:
and if the height difference between the second height and the visual perception height exceeds a preset range, determining that the visual perception height is abnormal.
17. The method of claim 1, wherein the obtaining visual perception data of the aircraft comprises:
acquiring a visual image by a visual perception system of the aircraft;
and extracting image features of the visual image to obtain the visual perception data.
18. The method of claim 17, further comprising:
and determining the confidence of the visual perception data according to the output parameters of the visual image.
19. The method of claim 18, wherein the output parameters include at least one of feature average depth, feature point number, and image brightness.
20. The method of claim 19, wherein the visual perception data has a higher confidence level for smaller average depths of features corresponding to the visual image.
21. The method of claim 19, wherein the higher the number of feature points corresponding to the visual image, the higher the confidence in the visual perception data.
22. The method of claim 19, wherein the higher the image brightness corresponding to the visual image, the higher the confidence of the visual perception data.
23. The method of claim 18, wherein obtaining a detection value corresponding to the visual perception data according to the at least one detection data comprises:
and if the confidence of the visual perception data is smaller than a preset confidence threshold, obtaining a detection value corresponding to the visual perception data according to the at least one detection data.
24. The method of any one of claims 1 to 23, wherein said locating the aircraft based on the detection value comprises:
and when the detection value is a first detection value, performing data fusion on the visual perception data and the at least one detection data to obtain corresponding fusion data, and positioning the aircraft based on the fusion data.
25. The method of claim 24, wherein before said data fusing said visual perception data and said at least one detection data to obtain corresponding fused data, the method further comprises:
and mutually detecting the visual perception data and the at least one detection data to obtain a detection value corresponding to the at least one detection data.
26. The method of claim 25, wherein the cross-checking the visual perception data and the at least one detection data to obtain a detection value corresponding to the at least one detection data comprises:
mutually detecting the visual perception data and the at least one detection data, and determining whether abnormal detection data exist in the at least one detection data;
and obtaining a detection value corresponding to the at least one detection data according to the determined result.
27. The method of claim 26, wherein obtaining a detection value corresponding to the at least one detection datum according to the determination comprises:
and if the at least one piece of detection data is determined to have no abnormal detection data, obtaining a third detection value corresponding to the at least one piece of detection data.
28. The method of claim 26, wherein obtaining a detection value corresponding to the at least one detection datum according to the determination comprises:
and if the abnormal detection data exists in the at least one piece of detection data, obtaining a fourth detection value corresponding to the at least one piece of detection data.
29. The method according to claim 27, wherein said data fusing said visual perception data and said at least one detection data to obtain corresponding fused data comprises:
and when the third detection value is obtained, performing data fusion on the visual perception data and the at least one detection data to obtain the fusion data.
30. The method according to claim 28, wherein said data fusing said visual perception data and said at least one detection data to obtain corresponding fused data comprises:
and when the fourth detection value is obtained, performing data fusion on the visual perception data and other detection data except the abnormal detection data in the at least one detection data to obtain the fusion data.
31. The method of claim 24, wherein before said data fusing said visual perception data and said at least one detection data to obtain corresponding fused data, the method further comprises:
obtaining a confidence level of the visual perception data;
determining the weight corresponding to the visual perception data according to the corresponding relation between the preset confidence coefficient and the weight and the confidence coefficient of the visual perception data;
the performing data fusion on the visual perception data and the at least one detection data to obtain corresponding fusion data includes:
and performing data fusion on the visual perception data and the at least one detection data according to the weight corresponding to the visual perception data to obtain the fusion data.
32. The method of claim 31, wherein the higher the confidence in the visual perception data, the greater the corresponding weight.
33. An aircraft positioning device, comprising a memory and a processor;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
acquiring visual perception data of an aircraft;
acquiring at least one detection data corresponding to the visual perception data through at least one sensor device;
obtaining a detection value corresponding to the visual perception data according to the at least one detection data;
and positioning the aircraft based on the detection value.
34. The apparatus according to claim 33, wherein the processor, when implementing the obtaining of the detection value corresponding to the visual perception data according to the at least one detection data, implements:
determining whether the visual perception data is abnormal according to the at least one detection data;
and obtaining a detection value corresponding to the visual perception data according to the determined result.
35. The apparatus according to claim 34, wherein the processor, when implementing the obtaining of the detection value corresponding to the visual perception data according to the determined result, implements:
and if the visual perception data is determined to be normal, obtaining a first detection value corresponding to the visual perception data.
36. The apparatus according to claim 34, wherein the processor, when implementing the obtaining of the detection value corresponding to the visual perception data according to the determined result, implements:
and if the visual perception data is determined to be abnormal, obtaining a second detection value corresponding to the visual perception data.
37. The apparatus of claim 35, wherein the processor, in effecting the locating the aircraft based on the detection values, specifically effects:
when the first detection value is obtained, positioning the aircraft based on the visual perception data.
38. The apparatus as claimed in claim 34, wherein the processor, in carrying out the determining whether the visual perception data is abnormal based on the at least one detection data, is further configured to:
and mutually detecting the visual perception data and the at least one detection data to determine whether the visual perception data is abnormal.
39. The apparatus according to claim 38, wherein the processor, when performing the mutual detection of the visual perception data and the at least one detection data to determine whether the visual perception data is abnormal, specifically performs:
acquiring a first variation corresponding to the visual perception data within a preset time period and second variations respectively corresponding to a plurality of pieces of detection data within the preset time period;
and if there is an inconsistent variation among the first variation and the plurality of second variations, and the inconsistent variation is the first variation, determining that the visual perception data is abnormal.
40. The apparatus according to claim 39, wherein after acquiring the first variation corresponding to the visual perception data within the preset time period and the second variations respectively corresponding to the plurality of pieces of detection data within the preset time period, the processor further implements:
if the first variation and the plurality of second variations are all consistent, or if there is an inconsistent variation among them and the inconsistent variation is a second variation, determining that the visual perception data is normal.
41. The apparatus of claim 33, wherein the visual perception data comprises at least one of a visual perception altitude, a horizontal velocity, and a vertical velocity.
42. The apparatus of claim 33, wherein the detection data comprises an absolute altitude and a relative distance altitude.
43. The apparatus of claim 33, wherein the visual perception data comprises a visual perception height, and wherein the processor, when executing the computer program, further implements:
acquiring a first altitude and a first vertical speed corresponding to the historical positioning of the aircraft;
calculating a current second altitude of the aircraft according to the first altitude and the first vertical speed;
and comparing the second height with the visual perception height to obtain a detection value corresponding to the visual perception height.
44. The apparatus according to claim 43, wherein the processor, when implementing the comparing of the second height with the visually perceived height to obtain a detection value corresponding to the visually perceived height, implements:
comparing the second height with the visual perception height to determine whether the visual perception height is abnormal;
and obtaining a detection value corresponding to the visual perception height according to the determined result.
45. The apparatus according to claim 44, wherein the processor, when performing the obtaining of the detection value corresponding to the visually perceived height according to the determined result, is further configured to perform:
and if the visual perception height is determined to be normal, obtaining a first detection value corresponding to the visual perception height.
46. The apparatus according to claim 44, wherein the processor, when performing the obtaining of the detection value corresponding to the visually perceived height according to the determined result, is further configured to perform:
and if the visual perception height is determined to be abnormal, obtaining a second detection value corresponding to the visual perception height.
47. The apparatus of claim 44, wherein the processor, in implementing the comparing the second height to the visually perceived height to determine whether the visually perceived height is abnormal, implements:
and if the height difference between the second height and the visual perception height does not exceed a preset range, determining that the visual perception height is normal.
48. The apparatus of claim 44, wherein the processor, in implementing the comparing the second height to the visually perceived height to determine whether the visually perceived height is abnormal, implements:
and if the height difference between the second height and the visual perception height exceeds a preset range, determining that the visual perception height is abnormal.
49. The apparatus of claim 33, wherein the processor, in implementing the obtaining the visual perception data of the aircraft, implements:
acquiring a visual image by a visual perception system of the aircraft;
and extracting image features of the visual image to obtain the visual perception data.
50. The apparatus as claimed in claim 49, wherein the processor, when executing the computer program, further implements:
and determining the confidence of the visual perception data according to the output parameters of the visual image.
51. The apparatus of claim 50, wherein the output parameters comprise at least one of feature average depth, feature point number, and image brightness.
52. The apparatus according to claim 51, wherein the smaller the average depth of the features corresponding to the visual image, the higher the confidence of the visual perception data.
53. The apparatus according to claim 51, wherein the higher the number of feature points corresponding to the visual image, the higher the confidence of the visual perception data.
54. The apparatus according to claim 51, wherein the higher the image brightness corresponding to the visual image, the higher the confidence of the visual perception data.
55. The apparatus according to claim 50, wherein the processor, when implementing the obtaining of the detection value corresponding to the visual perception data according to the at least one detection data, implements:
and if the confidence of the visual perception data is smaller than a preset confidence threshold, obtaining a detection value corresponding to the visual perception data according to the at least one detection data.
56. The apparatus of any one of claims 33 to 55, wherein the processor, in effecting said locating the aircraft based on the detection values, is further configured to effect:
and when the detection value is a first detection value, performing data fusion on the visual perception data and the at least one detection data to obtain corresponding fusion data, and positioning the aircraft based on the fusion data.
57. The apparatus according to claim 56, wherein the processor further effects, prior to performing the data fusion of the visual perception data and the at least one detection data to obtain corresponding fused data:
and mutually detecting the visual perception data and the at least one detection data to obtain a detection value corresponding to the at least one detection data.
58. The apparatus according to claim 57, wherein the processor, when performing the mutual inspection of the visual perception data and the at least one detection data to obtain a detection value corresponding to the at least one detection data, specifically performs:
mutually detecting the visual perception data and the at least one detection data, and determining whether abnormal detection data exist in the at least one detection data;
and obtaining a detection value corresponding to the at least one detection data according to the determined result.
59. The apparatus as claimed in claim 58, wherein the processor, when implementing the obtaining of the detection value corresponding to the at least one detection data according to the determined result, implements:
and if the at least one piece of detection data is determined to have no abnormal detection data, obtaining a third detection value corresponding to the at least one piece of detection data.
60. The apparatus as claimed in claim 58, wherein the processor, when implementing the obtaining of the detection value corresponding to the at least one detection data according to the determined result, implements:
and if the abnormal detection data exists in the at least one piece of detection data, obtaining a fourth detection value corresponding to the at least one piece of detection data.
61. The apparatus according to claim 59, wherein the processor, when implementing the data fusion of the visual perception data and the at least one detection data to obtain corresponding fused data, implements:
and when the third detection value is obtained, performing data fusion on the visual perception data and the at least one detection data to obtain the fusion data.
62. The apparatus according to claim 60, wherein the processor, when implementing the data fusion of the visual perception data and the at least one detection data to obtain corresponding fused data, implements:
and when the fourth detection value is obtained, performing data fusion on the visual perception data and other detection data except the abnormal detection data in the at least one detection data to obtain the fusion data.
63. The apparatus according to claim 56, wherein the processor further effects, prior to performing the data fusion of the visual perception data and the at least one detection data to obtain corresponding fused data:
obtaining a confidence level of the visual perception data;
determining the weight corresponding to the visual perception data according to the corresponding relation between the preset confidence coefficient and the weight and the confidence coefficient of the visual perception data;
when the processor performs data fusion on the visual perception data and the at least one detection data to obtain corresponding fusion data, the following steps are specifically implemented:
and performing data fusion on the visual perception data and the at least one detection data according to the weight corresponding to the visual perception data to obtain the fusion data.
64. The apparatus according to claim 63, wherein the higher the confidence of the visual perception data, the higher the corresponding weight.
65. An aircraft comprising a fuselage, a power system provided in the fuselage for powering the aircraft, and an aircraft positioning device according to any one of claims 33 to 64 for positioning the aircraft.
66. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to carry out the aircraft positioning method according to any one of claims 1 to 32.
CN202080007330.7A 2020-05-06 2020-05-06 Aircraft positioning method and device, aircraft and storage medium Pending CN113272625A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/088841 WO2021223122A1 (en) 2020-05-06 2020-05-06 Aircraft positioning method and apparatus, aircraft, and storage medium

Publications (1)

Publication Number Publication Date
CN113272625A (en) 2021-08-17

Family

ID=77227972

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080007330.7A Pending CN113272625A (en) 2020-05-06 2020-05-06 Aircraft positioning method and device, aircraft and storage medium

Country Status (2)

Country Link
CN (1) CN113272625A (en)
WO (1) WO2021223122A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104932522A (en) * 2015-05-27 2015-09-23 深圳市大疆创新科技有限公司 Autonomous landing method and system for aircraft
CN105120230A (en) * 2015-09-15 2015-12-02 成都时代星光科技有限公司 Unmanned plane image monitoring and transmitting system
CN107543540A (en) * 2016-06-27 2018-01-05 杭州海康机器人技术有限公司 The data fusion and offline mode switching method and device of a kind of flight equipment
WO2018189529A1 (en) * 2017-04-10 2018-10-18 Blue Vision Labs UK Limited Co-localisation
CN109490931A (en) * 2018-09-03 2019-03-19 天津远度科技有限公司 Flight localization method, device and unmanned plane
CN109520497A (en) * 2018-10-19 2019-03-26 天津大学 The unmanned plane autonomic positioning method of view-based access control model and imu
CN110231028A (en) * 2018-03-05 2019-09-13 北京京东尚科信息技术有限公司 Aircraft navigation methods, devices and systems
CN110388917A (en) * 2018-04-23 2019-10-29 北京京东尚科信息技术有限公司 Aircraft monocular vision Scale Estimation Method and device, aircraft guidance system and aircraft
CN110501736A (en) * 2019-08-28 2019-11-26 武汉大学 Utilize vision imaging and GNSS distance measuring signal close coupling positioning system and method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113992845A (en) * 2021-10-18 2022-01-28 咪咕视讯科技有限公司 Image shooting control method and device and computing equipment
CN113992845B (en) * 2021-10-18 2023-11-10 咪咕视讯科技有限公司 Image shooting control method and device and computing equipment

Also Published As

Publication number Publication date
WO2021223122A1 (en) 2021-11-11

Similar Documents

Publication Publication Date Title
CN107783106B (en) Data fusion method between unmanned aerial vehicle and barrier
JP6506302B2 (en) Method and apparatus for operating a mobile platform
CN107783545B (en) Obstacle avoidance system of post-disaster rescue rotor unmanned aerial vehicle based on OODA (object oriented data acquisition) ring multi-sensor information fusion
WO2018005882A1 (en) Unmanned aerial vehicle wind turbine inspection systems and methods
CN110455285A (en) A kind of Navigation of Pilotless Aircraft method and navigation device in satellite navigation signals failure
CN110488865B (en) Unmanned aerial vehicle course determining method and device and unmanned aerial vehicle
CN109656265A (en) Program is used in data processing equipment, unmanned plane and its control device, method and processing
CN110377056B (en) Unmanned aerial vehicle course angle initial value selection method and unmanned aerial vehicle
CN111221347B (en) Acceleration compensation method and system in attitude estimation of vertical take-off and landing fixed wing unmanned aerial vehicle
CN111679680A (en) Unmanned aerial vehicle autonomous landing method and system
CN112298602B (en) Unmanned aerial vehicle fault detection method, device, equipment and storage medium
WO2019061083A1 (en) System and method for determining airspeed
US9453921B1 (en) Delayed-based geographic position data generation system, device, and method
CN109521785A (en) It is a kind of to clap Smart Rotor aerocraft system with oneself
CN113296532A (en) Flight control method and device of manned aircraft and manned aircraft
CN111615677A (en) Safe landing method and device for unmanned aerial vehicle, unmanned aerial vehicle and medium
CN109725649A (en) One kind determining high algorithm based on barometer/IMU/GPS Multi-sensor Fusion rotor wing unmanned aerial vehicle
CN112105961B (en) Positioning method based on multi-data fusion, movable platform and storage medium
CN110514208B (en) Course determining method, device and system for aircraft
CN113272625A (en) Aircraft positioning method and device, aircraft and storage medium
US20210229810A1 (en) Information processing device, flight control method, and flight control system
CN116700070B (en) Safety supervision method and system for flight state of unmanned aerial vehicle
CN107807375A (en) A kind of UAV Attitude method for tracing and system based on more GPSs
CN116756686A (en) Method and system for estimating strong disturbance rejection altitude state of aircraft
US20230051574A1 (en) Uav nevigation calibration method, non-transitory computer-readable storage medium and uav implementing the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination