CN115457129A - Aircraft positioning method and device, electronic equipment and storage medium - Google Patents



Publication number
CN115457129A
CN115457129A (application CN202211080176.3A)
Authority
CN
China
Prior art keywords
feature
information
aerial vehicle
unmanned aerial
determining
Prior art date
Legal status
Pending
Application number
CN202211080176.3A
Other languages
Chinese (zh)
Inventor
吴镝
Current Assignee
Shenzhen Baidi Technology Co ltd
Original Assignee
Shenzhen Baidi Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Baidi Technology Co ltd filed Critical Shenzhen Baidi Technology Co ltd
Priority to CN202211080176.3A
Publication of CN115457129A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06V10/464 Salient features, e.g. scale invariant feature transforms [SIFT] using a plurality of salient features, e.g. bag-of-words [BoW] representations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/772 Determining representative reference patterns, e.g. averaging or distorting patterns; Generating dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Navigation (AREA)

Abstract

An embodiment of the invention provides an aircraft positioning method and apparatus, an electronic device, and a storage medium. The method comprises: acquiring first feature information from target area map data, the first feature information comprising at least feature points, feature descriptors corresponding to the feature points, and a first feature dictionary corresponding to the feature points; performing route planning on the target area map data to obtain a plurality of waypoints; acquiring three-dimensional coordinate data of each waypoint; for each waypoint, collecting, centered on the waypoint's pixel coordinates, the feature descriptors and first feature dictionaries of the area covered by the field of view of the onboard camera, and generating a feature set; organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint to obtain track information; acquiring real-time image data captured by the onboard camera at preset intervals; and determining the current position information of the unmanned aerial vehicle based on the track information and the real-time image data. The invention can improve aircraft positioning accuracy.

Description

Aircraft positioning method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of aircraft positioning technologies, and in particular, to an aircraft positioning method and apparatus, an electronic device, and a storage medium.
Background
Aircraft are now in wide use and are often applied in industries such as plant protection, urban management, geology, meteorology, electric power, emergency and disaster relief, and video shooting.
In the prior art, when an aircraft needs to operate, a position reference can be provided for it based on GPS or the more accurate RTK (Real-Time Kinematic) GPS. The operation process therefore depends heavily on GPS; once there is no GPS signal or the signal is blocked, the aircraft cannot work normally, and its positioning accuracy is low.
Disclosure of Invention
The embodiment of the invention provides a method for positioning an aircraft, and aims to solve the problem of low positioning accuracy of the existing aircraft.
In a first aspect, an embodiment of the present invention provides a method for locating an aircraft, where the method includes:
acquiring first feature information from target area map data, wherein the first feature information at least comprises feature points, feature descriptors corresponding to the feature points and a first feature dictionary corresponding to the feature points;
carrying out route planning on the map data of the target area to obtain a plurality of waypoints;
respectively acquiring three-dimensional coordinate data of the waypoints;
for each waypoint, collecting a feature descriptor and a first feature dictionary of an area which can be covered by the field angle of the airborne camera by taking the pixel coordinates of the waypoint as the center, and generating a feature set;
organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint to obtain track information;
acquiring real-time image data acquired by the airborne camera according to a preset interval;
and determining the current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data.
Optionally, the step of determining the current position information of the unmanned aerial vehicle based on the track information and the real-time image data includes:
judging whether the unmanned aerial vehicle deviates from a route corresponding to the flight path information or not based on the flight path information and the real-time image data;
if the unmanned aerial vehicle does not deviate from the air route, acquiring temporary positioning information of the unmanned aerial vehicle;
and determining the current position information of the unmanned aerial vehicle based on the temporary positioning information.
Optionally, the step of determining whether the unmanned aerial vehicle deviates from the route corresponding to the track information based on the track information and the real-time image data includes:
extracting second feature information from the real-time image data, wherein the second feature information comprises a second feature dictionary;
calculating the matching degree of the second feature dictionary and each first feature dictionary in the feature set;
and if the first characteristic dictionary with the matching degree larger than the preset threshold value does not exist, judging that the unmanned aerial vehicle deviates from the route corresponding to the flight path information.
Optionally, the step of obtaining the temporary positioning information of the unmanned aerial vehicle includes:
if the first feature dictionary with the matching degree larger than the preset threshold exists, obtaining feature points corresponding to the first feature dictionary with the matching degree larger than the preset threshold, and taking the area determined by the feature points as temporary positioning information of the unmanned aerial vehicle.
Optionally, the feature set further includes a first feature descriptor corresponding to each feature point, the second feature information includes a second feature descriptor, and the step of determining the current position information of the unmanned aerial vehicle based on the temporary positioning information includes:
determining neighborhood waypoints based on the temporary positioning information, and acquiring a first feature descriptor associated with the neighborhood waypoints;
matching the second feature descriptor with the first feature descriptor to determine a matched feature point sequence;
and determining the current position information of the unmanned aerial vehicle based on the matched feature point sequence.
Optionally, the step of determining the current position information of the unmanned aerial vehicle based on the matched feature point sequence includes:
and calculating the matched characteristic point sequence by adopting a preset positioning algorithm, and determining the current position information of the unmanned aerial vehicle.
Optionally, the method further includes:
determining the position offset of the current position information and the track information;
and correcting the current position information based on the position offset.
In a second aspect, an embodiment of the present invention further provides a positioning device for an aircraft, where the positioning device for an aircraft includes:
the first obtaining module is used for obtaining first feature information from the target area map data, wherein the first feature information at least comprises feature points, feature descriptors corresponding to the feature points and a first feature dictionary corresponding to the feature points;
the planning module is used for planning a route of the map data of the target area to obtain a plurality of waypoints;
the second acquisition module is used for respectively acquiring the three-dimensional coordinate data of the waypoints;
the acquisition module is used for collecting, for each waypoint and with the waypoint's pixel coordinates as the center, a feature descriptor and a first feature dictionary of the area covered by the field of view of the onboard camera, and generating a feature set;
the composition module is used for organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint so as to obtain track information;
the third acquisition module is used for acquiring real-time image data acquired by the airborne camera according to preset intervals;
and the first determining module is used for determining the current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data.
In a third aspect, an embodiment of the present invention provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the aircraft positioning method provided by the embodiments of the present invention.
In a fourth aspect, the embodiments of the present invention provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps in the positioning method for an aircraft provided by the embodiments of the present invention.
In the embodiment of the invention, first feature information is acquired from target area map data, the first feature information comprising at least feature points, feature descriptors corresponding to the feature points, and a first feature dictionary corresponding to the feature points; route planning is performed on the target area map data to obtain a plurality of waypoints; three-dimensional coordinate data of the waypoints are acquired; for each waypoint, with the waypoint's pixel coordinates as the center, the feature descriptors and first feature dictionaries of the area covered by the field of view of the onboard camera are collected to generate a feature set; the three-dimensional coordinate data and the feature set are organized into attribute information of the waypoint to obtain track information; real-time image data captured by the onboard camera are acquired at preset intervals; and the current position information of the unmanned aerial vehicle is determined based on the track information and the real-time image data. The feature dictionaries in the track information thus provide a rough position for the unmanned aerial vehicle, which narrows the subsequent feature-descriptor matching and yields more accurate current position information; combined with the real-time image data, this enables positioning and navigation when the GPS signal is absent or blocked, reducing the vehicle's dependence on GPS and improving its positioning accuracy.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. The drawings described below are only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flow chart of a method for locating an aircraft according to an embodiment of the present invention;
FIG. 2 is a flow chart of one method provided at step 107 in the embodiment of FIG. 1;
FIG. 3 is a flow chart of one method provided in step 201 in the embodiment of FIG. 2;
fig. 4 is a schematic structural diagram of a positioning device of an aircraft according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a structure provided by the first determining module in the embodiment of FIG. 4;
FIG. 6 is a schematic diagram of a structure provided by the determining unit in the embodiment of FIG. 5;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
As shown in fig. 1, fig. 1 is a flowchart of a method for positioning an aircraft according to an embodiment of the present invention. The aircraft positioning method comprises the following steps:
step 101, obtaining first characteristic information from target area map data.
The target area map data is map data acquired in advance; the map data of the area to be operated on is extracted from it according to actual operation requirements. The pre-acquired data may include the target area map data, a set of feature points with their corresponding feature descriptors, and a feature dictionary corresponding to each feature point.
The first feature information at least comprises feature points, feature descriptors corresponding to the feature points and a first feature dictionary corresponding to the feature points.
Specifically, after the target area map data is obtained, feature extraction may be performed on the target area map data to obtain first feature information.
And 102, planning a route of the map data of the target area to obtain a plurality of waypoints.
Specifically, a planning scheme can be selected according to the operation requirements, and route planning can be performed directly on the target area map data. For example, if the operation requires the unmanned aerial vehicle to completely cover the target area, a serpentine ("snake") planning method can be adopted.
It should be noted that the embodiment of the present invention is not limited to the route planning method, and a series of discrete three-dimensional coordinate points (i.e., waypoints) and corresponding attribute information may be used to describe the planning result, i.e., the track information, regardless of the planning method.
After the route planning is carried out on the target area, a plurality of waypoints can be obtained, and the routes of the target area can be obtained by connecting the waypoints. The departure point of the flight path can be specified in the target area according to any requirement, and the determining mode of the departure point is not limited by the embodiment of the invention.
And 103, respectively acquiring three-dimensional coordinate data of the waypoints.
Each waypoint may be described as a three-dimensional coordinate datum denoted as fi. The three-dimensional coordinate data may include a plane coordinate and a height coordinate, among others.
Specifically, the geographic position coordinates of the pixel corresponding to a waypoint can be obtained as its plane coordinates; the elevation of that pixel is obtained, along with the planned above-ground height of the unmanned aerial vehicle; and the sum of the pixel elevation and the above-ground height is taken as the height coordinate.
In order to realize the functions of positioning and navigation, the distance between two waypoints cannot be too sparse in the track information, and in one implementation mode, the distance between the two waypoints can be determined by the overlapping degree of two images continuously acquired by an airborne camera.
In practice, the above-mentioned overlap may be set to be more than 30%.
In a specific implementation, the relationship between the overlapping degree and the attributes of the ground altitude and the onboard camera of the unmanned aerial vehicle can be expressed as follows:
J = H · w_p · n_p · (1 − p) / f
wherein J is the interval between two successive images captured by the onboard camera, H is the above-ground height of the unmanned aerial vehicle, w_p is the pixel size, n_p is the number of pixels, p is the degree of overlap, and f is the focal length of the onboard camera.
The pixel size and pixel count above refer to those along the sensor coordinate axis parallel to the flight direction.
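As a sketch (not part of the patent), the interval formula above can be evaluated directly. The numeric values below are illustrative assumptions, not figures from the patent:

```python
def waypoint_interval(h_ground, pixel_size, n_pixels, overlap, focal_length):
    """Interval J = H * w_p * n_p * (1 - p) / f between consecutive captures."""
    return h_ground * pixel_size * n_pixels * (1.0 - overlap) / focal_length

# Assumed example: 100 m altitude, 3.45 um pixels, 4000 px along flight
# direction, 30% overlap, 8 mm focal length -> interval of about 120.75 m.
j = waypoint_interval(100.0, 3.45e-6, 4000, 0.30, 8e-3)
```

Note that raising the overlap p shrinks the interval, which is why the patent requires the overlap to stay above a floor such as 30%.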
And step 104, for each waypoint, collecting a feature descriptor and a first feature dictionary of the area covered by the field of view of the onboard camera, with the pixel coordinates of the waypoint as the center, and generating a feature set.
Specifically, to meet the navigation requirements of the unmanned aerial vehicle, a feature set can be determined for each waypoint.
Specifically, for each waypoint, the feature descriptors and feature dictionaries of the area covered by the field of view (FOV) of the onboard camera may be collected with the pixel coordinates of the waypoint as the center. The set of collected feature descriptors and feature dictionaries is recorded as the feature set d_i = {x_ij, t_ij}, where x_ij is the feature descriptor of the j-th feature point corresponding to waypoint i, and t_ij is the feature dictionary of the j-th feature point corresponding to waypoint i.
And 105, organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint to obtain track information.
Specifically, after the three-dimensional coordinate data and feature set of each waypoint are obtained, they can be organized into the attribute information of the waypoint, that is, the description of waypoint i is h_i = {d_i, f_i}, yielding the corresponding track information. The track information can be described as H = {h_1, h_2, h_3, …, h_n}.
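As a minimal sketch of this data layout (the container names are illustrative assumptions, mirroring the patent's h_i = {d_i, f_i} notation), the track is simply an ordered list of waypoints, each carrying its 3-D coordinate and its feature set:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Waypoint:
    # f_i: plane coordinates plus height coordinate
    coord: Tuple[float, float, float]
    # d_i: list of (feature descriptor x_ij, feature-dictionary word t_ij)
    features: List[Tuple[list, int]] = field(default_factory=list)

def build_track(waypoints: List[Waypoint]) -> List[Waypoint]:
    """Track information H = {h_1, ..., h_n} is the ordered waypoint list."""
    return list(waypoints)
```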
And 106, acquiring real-time image data acquired by the airborne camera according to a preset interval.
Specifically, after the unmanned aerial vehicle is started, the onboard camera can be started, and the onboard camera is controlled to continuously acquire real-time image data according to preset intervals.
And step 107, determining the current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data.
Each time the unmanned aerial vehicle acquires real-time image data, the data can be compared with the track information, so that the real-time position information of the unmanned aerial vehicle is determined from the track information and the real-time image data.
In one implementation of the embodiment of the present invention, as shown in fig. 2, the step 107 further includes the following steps:
step 201, based on the flight path information and the real-time image data, judging whether the unmanned aerial vehicle deviates from the flight path corresponding to the flight path information.
First, the unmanned aerial vehicle can judge, from the track information and the currently acquired real-time image data, whether it is near the planned route, that is, whether it has deviated from the route corresponding to the track information.
In one implementation manner of the embodiment of the present invention, as shown in fig. 3, step 201 may further include the following steps:
step 301, extracting second feature information from the real-time image data.
As a preferred embodiment of the present invention, the second feature information may include a plurality of second feature points, a second feature dictionary corresponding to each second feature point, a second feature descriptor, and the like.
Specifically, step 301 includes:
the real-time image data is divided into a plurality of block data. For example, the real-time image data may be segmented into N × M block data by using an image segmentation method.
And respectively extracting a preset number of feature points from each block data. Specifically, after obtaining N × M block data, t feature points may be extracted from each block data, respectively, to ensure that the feature points are uniformly distributed in the real-time image data.
In practice, the above N, M, t may satisfy the following conditions: a predetermined number of pixels corresponds to one feature point, for example, 100 pixels corresponds to one feature point.
Of course, besides the segmentation, the feature points may also be directly extracted from the real-time image data according to a preset feature extraction method, which is not limited in this embodiment of the present invention.
A feature point may refer to a point whose location itself has a conventional attribute meaning, such as a corner (FAST corner, Harris corner, etc.) or an intersection.
In a specific implementation, a computer vision method may be adopted to extract feature points from each block of data. For FAST corners, for example, the extraction may proceed as follows: traverse each pixel in the block; with the current pixel as the center and 3 as the radius, select the 16 surrounding pixels on that circle and compare them in turn; mark a circle pixel if its gray-level difference from the center exceeds a preset threshold; and take the current pixel as a feature point if more than 12 pixels are marked.
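The test just described can be sketched directly. This is a simplified illustration of the patent's criterion (radius-3 Bresenham circle of 16 pixels, more than 12 marked), not a full FAST implementation; the function names and the grayscale-image-as-nested-lists representation are assumptions:

```python
# Offsets of the 16 pixels on a Bresenham circle of radius 3.
CIRCLE = [(3, 0), (3, 1), (2, 2), (1, 3), (0, 3), (-1, 3), (-2, 2), (-3, 1),
          (-3, 0), (-3, -1), (-2, -2), (-1, -3), (0, -3), (1, -3), (2, -2), (3, -1)]

def is_fast_corner(img, y, x, threshold):
    """More than 12 circle pixels differ from the center by > threshold."""
    center = img[y][x]
    marked = sum(1 for dy, dx in CIRCLE
                 if abs(img[y + dy][x + dx] - center) > threshold)
    return marked > 12

def fast_corners(img, threshold=20):
    """Scan all pixels far enough from the border to fit the circle."""
    h, w = len(img), len(img[0])
    return [(y, x) for y in range(3, h - 3) for x in range(3, w - 3)
            if is_fast_corner(img, y, x, threshold)]
```

A bright isolated pixel on a dark background trips all 16 comparisons and is therefore detected as a corner.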
And generating feature descriptors corresponding to the feature points. Specifically, after the feature points are obtained, a feature description is established for the feature points, and such a feature description may be referred to as a feature descriptor.
As an example, the feature descriptor may be a SURF (Speeded-Up Robust Features) descriptor, an ORB (Oriented FAST and Rotated BRIEF) descriptor, and the like.
In an implementation, the feature point may be described in combination with a feature descriptor, pixel coordinates of the feature point, and geographic location coordinates of the feature point, that is, the description of the feature point may be denoted as xi = { di, ci, fi }, where di is the feature descriptor of the ith feature point, which is an n-dimensional vector; ci is the pixel coordinate of the ith characteristic point; fi is the geographic position coordinate of the ith feature point, and can be represented by a three-dimensional vector, namely three-dimensional position data.
The set of all feature points and corresponding feature descriptors in the real-time image data may then be X = {x_1, x_2, x_3, …, x_n}.
And generating a feature dictionary corresponding to the feature points. Specifically, after the feature points and the corresponding feature descriptors are obtained, a feature dictionary corresponding to each feature point can be created, and the feature dictionary is used for rapidly detecting the current approximate position of the unmanned aerial vehicle, and further performing high-precision image matching and positioning of the unmanned aerial vehicle.
In a specific implementation, a loop-closure detection approach may be used to determine the feature dictionary corresponding to each feature point; one example is the bag-of-words model of DBoW2.
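The bag-of-words idea behind such dictionaries can be sketched as follows: each descriptor is quantized to the nearest "visual word" in a vocabulary, and an image is summarized by its word histogram. The tiny vocabulary and Euclidean quantization below are illustrative assumptions; DBoW2 itself uses a hierarchical vocabulary tree over binary descriptors:

```python
def nearest_word(desc, vocabulary):
    """Index of the vocabulary centroid closest to the descriptor."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(vocabulary)), key=lambda i: dist2(desc, vocabulary[i]))

def word_histogram(descriptors, vocabulary):
    """Summarize an image's descriptors as counts per visual word."""
    hist = [0] * len(vocabulary)
    for d in descriptors:
        hist[nearest_word(d, vocabulary)] += 1
    return hist
```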
Step 302, calculating the matching degree between the second feature dictionary and each first feature dictionary in the feature set.
Specifically, after second feature dictionaries corresponding to a plurality of second feature points in the real-time image data are obtained, each second feature dictionary may be respectively matched with each first feature dictionary in the feature set, so as to calculate a matching degree between each second feature dictionary and each first feature dictionary in each feature set.
In the embodiment of the present invention, the method for calculating the matching degree is not limited, and for example, the method for calculating the similarity may be used to calculate the matching degree between the first feature dictionary and the second feature dictionary.
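Since the patent leaves the similarity measure open, one plausible choice is cosine similarity between two bag-of-words histograms; the function below is an assumed illustration, not the patent's prescribed measure:

```python
import math

def matching_degree(hist_a, hist_b):
    """Cosine similarity between two word histograms, in [0, 1] for counts."""
    dot = sum(a * b for a, b in zip(hist_a, hist_b))
    na = math.sqrt(sum(a * a for a in hist_a))
    nb = math.sqrt(sum(b * b for b in hist_b))
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)
```

A matching degree near 1 means the two dictionaries describe very similar scenes; comparing it against the preset threshold yields the deviation decision of step 303.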
And 303, if the first feature dictionary with the matching degree larger than the preset threshold does not exist, judging that the unmanned aerial vehicle deviates from a route corresponding to the track information.
Specifically, after the matching degrees between all second feature dictionaries and each first feature dictionary in the feature sets have been calculated, if no matching degree exceeds the preset threshold, it can be judged that the unmanned aerial vehicle has deviated from the route corresponding to the track information, where deviating from the route means departing from at least one segment of the planned flight path.
Step 202, if the unmanned aerial vehicle does not deviate from the air route, acquiring temporary positioning information of the unmanned aerial vehicle.
In a preferred embodiment of the present invention, step 202 may further include the steps of:
if the first feature dictionary with the matching degree larger than the preset threshold exists, the feature points corresponding to the first feature dictionary with the matching degree larger than the preset threshold are obtained, and the region determined by the feature points is used as the temporary positioning information of the unmanned aerial vehicle.
If the matching degree is larger than the preset threshold value, the unmanned aerial vehicle can be judged to be on the route corresponding to the track information, namely the unmanned aerial vehicle is judged not to deviate from the route corresponding to the track information. At this time, feature points corresponding to the first feature dictionary with the matching degree greater than the preset threshold value may be determined as matching feature points, and the region determined by the matching feature points is used as temporary positioning information of the unmanned aerial vehicle, where the temporary positioning information is a rough flight position of the unmanned aerial vehicle.
It should be noted that if the unmanned aerial vehicle flies along the route waypoint by waypoint, the adjacent waypoints to be matched are already known, so its rough flight position can be determined from the waypoints already flown, without using the feature dictionary.
And step 203, determining the current position information of the unmanned aerial vehicle based on the temporary positioning information.
Specifically, after the temporary positioning information of the unmanned aerial vehicle is obtained, the accurate position information of the unmanned aerial vehicle can be obtained according to the rough positioning information.
In a preferred embodiment, step 203 may further include the following steps:
and determining neighborhood waypoints based on the temporary positioning information, and acquiring a first feature descriptor associated with the neighborhood waypoints.
Matching the second feature descriptors with the first feature descriptors to determine a matched feature point sequence. Specifically, after the temporary positioning information of the unmanned aerial vehicle is obtained, the adjacent waypoints (that is, the neighborhood waypoints) may be determined from the temporary positioning information, and the first feature descriptors associated with those neighborhood waypoints may be acquired.
Subsequently, the second feature descriptors extracted from the real-time image data can be matched against the first feature descriptors associated with the neighborhood waypoints; once matching is completed, a group of feature point sequences is obtained.
In one embodiment, the feature point sequence may include the two-dimensional coordinates of the matched feature points in the real-time image data and the three-dimensional coordinates of the corresponding matched feature points in the target area map data.
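As a hedged sketch of this matching step (the patent does not prescribe a particular matcher), binary descriptors such as ORB bit vectors can be matched brute-force under the Hamming distance with a ratio test, pairing each live 2-D feature with the 3-D map coordinate of its best match. The function name, the 0/1 bit-vector representation, and the ratio value are illustrative assumptions.

```python
import numpy as np

def match_descriptors(desc_live, desc_map, pts2d_live, pts3d_map, ratio=0.8):
    """Pair live-image features (2-D pixels) with map features (3-D points)
    by brute-force matching of binary descriptors stored as 0/1 bit vectors."""
    pairs = []
    for i, d in enumerate(desc_live):
        dist = np.count_nonzero(desc_map != d, axis=1)  # Hamming distance
        order = np.argsort(dist)
        best, second = order[0], order[1]
        # Lowe-style ratio test rejects ambiguous matches
        if dist[best] < ratio * dist[second]:
            pairs.append((pts2d_live[i], pts3d_map[best]))
    return pairs
```

The resulting pairs are exactly the (2-D, 3-D) feature point sequence described above, ready to be handed to the positioning algorithm of the next sub-step.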
Determining the current position information of the unmanned aerial vehicle based on the matched feature point sequence.
Specifically, determining the current position information of the unmanned aerial vehicle based on the matched feature point sequence comprises: calculating the matched feature point sequence with a preset positioning algorithm to determine the current position information of the unmanned aerial vehicle.
In one embodiment, the preset positioning algorithm may include, but is not limited to, a PnP (Perspective-n-Point, camera pose estimation) algorithm.
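To make the PnP step concrete, here is a minimal Direct Linear Transform (DLT) pose solver in NumPy — a sketch under stated assumptions, not the patent's prescribed algorithm (production code would typically call a RANSAC-wrapped solver such as OpenCV's `solvePnPRansac`). It recovers the camera rotation R and translation t from the 2-D/3-D pairs of the matched feature point sequence, given the onboard camera intrinsics K; the function name and interface are assumptions.

```python
import numpy as np

def pnp_dlt(points_3d, points_2d, K):
    """Estimate camera pose (R, t) from >= 6 noise-free 2-D/3-D
    correspondences via the Direct Linear Transform."""
    # Normalize pixel coordinates with the camera intrinsics
    ones = np.ones((len(points_2d), 1))
    norm = (np.linalg.inv(K) @ np.hstack([points_2d, ones]).T).T
    # Each correspondence contributes two linear equations in the
    # 12 entries of the projection matrix P = [R | t]
    A = []
    for (X, Y, Z), (u, v, _) in zip(points_3d, norm):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)               # null vector, defined up to scale
    s = np.cbrt(np.linalg.det(P[:, :3]))   # det(sR) = s^3 recovers the signed scale
    R, t = P[:, :3] / s, P[:, 3] / s
    U, _, Vt2 = np.linalg.svd(R)           # project onto the nearest rotation
    return U @ Vt2, t
```

With noisy real-world matches, one would minimize reprojection error (or use RANSAC) instead of this plain linear solve; the sketch shows only the geometric core of the pose estimation.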
In a preferred embodiment of the present invention, after obtaining the current position information of the unmanned aerial vehicle, the method may further include the following steps:
determining the position offset between the current position information and the track information, and correcting the current position information based on the position offset.
In a specific implementation, the position difference between the current position information and the scheduled waypoint position in the track information may be calculated as the position offset, and the position offset is input to the flight controller to correct the current deviation.
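A minimal sketch of this correction step, assuming a simple proportional scheme — the patent only specifies that the offset is fed to the flight controller, so the gain, the function name, and the velocity-command interpretation are illustrative assumptions:

```python
import numpy as np

def correct_position(current_pos, waypoint_pos, gain=1.0):
    """Compute the offset from the scheduled waypoint and a proportional
    correction command to hand to the flight controller."""
    offset = np.asarray(waypoint_pos, float) - np.asarray(current_pos, float)
    command = gain * offset  # e.g. interpreted as a velocity setpoint
    return offset, command
```

A zero offset means the estimated position coincides with the planned waypoint and no correction is issued.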
In the embodiment of the present invention, first feature information is acquired from the target area map data, the first feature information at least comprising feature points, feature descriptors corresponding to the feature points, and a first feature dictionary corresponding to the feature points; route planning is performed on the target area map data to obtain a plurality of waypoints; the three-dimensional coordinate data of each waypoint are acquired; for each waypoint, taking the pixel coordinates of the waypoint as the center, the feature descriptors and the first feature dictionary of the area that can be covered by the field angle of the airborne camera are collected to generate a feature set; the three-dimensional coordinate data and the feature set are organized into attribute information of the waypoint to obtain track information; real-time image data acquired by the airborne camera are obtained at preset intervals; and the current position information of the unmanned aerial vehicle is determined based on the track information and the real-time image data. In this way, the unmanned aerial vehicle is roughly positioned using the feature dictionaries in the track information, which narrows the subsequent feature descriptor matching and yields more accurate current position information; combined with the real-time image data, this realizes positioning and navigation of the unmanned aerial vehicle when there is no GPS signal or the GPS signal is blocked, reducing the dependence of the unmanned aerial vehicle on GPS and improving its positioning accuracy.
It should be noted that the aircraft positioning method provided by the embodiment of the present invention may be applied to devices capable of positioning an aircraft, such as smart phones, computers, and servers.
As shown in fig. 4, the positioning device 400 of the aircraft includes:
a first obtaining module 401, configured to obtain first feature information from target area map data, where the first feature information at least includes a feature point, a feature descriptor corresponding to the feature point, and a first feature dictionary corresponding to the feature point;
a planning module 402, configured to perform route planning on target area map data to obtain multiple waypoints;
a second obtaining module 403, configured to obtain three-dimensional coordinate data of waypoints respectively;
the acquisition module 404 is configured to acquire, for each waypoint, a feature descriptor and a first feature dictionary of an area that can be covered by a field angle of the airborne camera, with a pixel coordinate of the waypoint as a center, and generate a feature set;
a composition module 405, configured to organize the three-dimensional coordinate data and the feature set into attribute information of a waypoint to obtain track information;
a third obtaining module 406, configured to obtain real-time image data acquired by the airborne camera at preset intervals;
and a first determining module 407, configured to determine current position information of the unmanned aerial vehicle based on the track information and the real-time image data.
Optionally, as shown in fig. 5, the first determining module 407 includes:
a judging unit 4071, configured to judge whether the unmanned aerial vehicle deviates from a route corresponding to the track information based on the track information and the real-time image data;
an obtaining unit 4072, configured to obtain temporary positioning information of the unmanned aerial vehicle if the unmanned aerial vehicle does not deviate from the airline;
a determining unit 4073, configured to determine current position information of the unmanned aerial vehicle based on the temporary positioning information.
Optionally, as shown in fig. 6, the determining unit 4071 includes:
an extraction sub-unit 40711, configured to extract second feature information from the real-time image data, where the second feature information includes a second feature dictionary;
a calculating subunit 40712, configured to calculate a matching degree between the second feature dictionary and each first feature dictionary in the feature set;
and a judging subunit 40713, configured to judge that the unmanned aerial vehicle deviates from the route corresponding to the track information if there is no first feature dictionary whose matching degree is greater than a preset threshold.
Optionally, the obtaining unit is further configured to obtain, if there is a first feature dictionary whose matching degree is greater than a preset threshold, a feature point corresponding to the first feature dictionary whose matching degree is greater than the preset threshold, and use an area determined by the feature point as the temporary positioning information of the unmanned aerial vehicle.
Optionally, the feature set further includes a first feature descriptor corresponding to each feature point, the second feature information includes a second feature descriptor, and the determining unit includes:
the first determining subunit is used for determining neighborhood waypoints based on the temporary positioning information and acquiring a first feature descriptor associated with the neighborhood waypoints;
the matching subunit is used for matching the second feature descriptor with the first feature descriptor and determining a matched feature point sequence;
and the second determining subunit is used for determining the current position information of the unmanned aerial vehicle based on the matched characteristic point sequence.
Optionally, the second determining subunit is further configured to calculate the matched feature point sequence by using a preset positioning algorithm, and determine current position information of the unmanned aerial vehicle.
Optionally, the positioning device of the aircraft further includes:
the second determining module is used for determining the position offset of the current position information and the track information;
and the correction module is used for correcting the current position information based on the position offset.
It should be noted that the aircraft positioning device 400 provided in the embodiment of the present invention may be applied to devices capable of positioning an aircraft, such as smart phones, computers, and servers.
The aircraft positioning device 400 provided by the embodiment of the invention can realize each process realized by the aircraft positioning method in the method embodiment, and can achieve the same beneficial effects. To avoid repetition, further description is omitted here.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 7, the electronic device includes: a memory 502, a processor 501, and a computer program implementing a method for positioning an aircraft, stored on the memory 502 and executable on the processor 501, wherein:
the processor 501 is used for calling the computer program stored in the memory 502, and executing the following steps:
acquiring first feature information from target area map data, wherein the first feature information at least comprises feature points, feature descriptors corresponding to the feature points and a first feature dictionary corresponding to the feature points;
carrying out route planning on the map data of the target area to obtain a plurality of waypoints;
respectively acquiring three-dimensional coordinate data of waypoints;
for each waypoint, collecting a feature descriptor and a first feature dictionary of an area which can be covered by the field angle of the airborne camera by taking the pixel coordinates of the waypoint as the center, and generating a feature set;
organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint to obtain track information;
acquiring real-time image data acquired by an airborne camera according to preset intervals;
and determining the current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data.
Optionally, the step of determining the current position information of the unmanned aerial vehicle based on the track information and the real-time image data, which is executed by the processor 501, includes:
judging whether the unmanned aerial vehicle deviates from a route corresponding to the flight path information or not based on the flight path information and the real-time image data;
if the unmanned aerial vehicle does not deviate from the air route, acquiring temporary positioning information of the unmanned aerial vehicle;
and determining the current position information of the unmanned aerial vehicle based on the temporary positioning information.
Optionally, the step, executed by the processor 501, of determining whether the unmanned aerial vehicle deviates from the route corresponding to the track information based on the track information and the real-time image data includes:
extracting second feature information from the real-time image data, wherein the second feature information comprises a second feature dictionary;
calculating the matching degree of the second feature dictionary and each first feature dictionary in the feature set;
and if the first feature dictionary with the matching degree larger than the preset threshold value does not exist, judging that the unmanned aerial vehicle deviates from the route corresponding to the track information.
Optionally, the step of acquiring the temporary positioning information of the unmanned aerial vehicle executed by the processor 501 includes:
if a first feature dictionary whose matching degree is greater than the preset threshold exists, the feature points corresponding to that first feature dictionary are obtained, and the region determined by those feature points is used as the temporary positioning information of the unmanned aerial vehicle.
Optionally, the feature set further includes a first feature descriptor corresponding to each feature point, the second feature information includes a second feature descriptor, and the step, performed by the processor 501, of determining the current position information of the unmanned aerial vehicle based on the temporary positioning information includes:
determining neighborhood waypoints based on the temporary positioning information, and acquiring a first feature descriptor associated with the neighborhood waypoints;
matching the second feature descriptor with the first feature descriptor to determine a matched feature point sequence;
and determining the current position information of the unmanned aerial vehicle based on the matched characteristic point sequence.
Optionally, the step of determining the current position information of the unmanned aerial vehicle based on the matched feature point sequence executed by the processor 501 includes:
and calculating the matched characteristic point sequence by adopting a preset positioning algorithm to determine the current position information of the unmanned aerial vehicle.
Optionally, the processor 501 further performs the steps of:
determining the position offset of the current position information and the track information;
the current position information is corrected based on the position offset amount.
It should be noted that the electronic device 500 provided in the embodiment of the present invention may be applied to devices capable of executing a method for positioning an aircraft, such as smart phones, computers, and servers.
The electronic device 500 provided by the embodiment of the invention can realize each process realized by the positioning method of the aircraft in the above method embodiments, and can achieve the same beneficial effects. To avoid repetition, further description is omitted here.
The embodiments of the present invention further provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements each process of the aircraft positioning method provided in the embodiments of the present invention and achieves the same technical effects; to avoid repetition, details are not repeated here.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing relevant hardware; the program can be stored in a computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
While the invention has been described in connection with what is presently considered to be the most practical and preferred embodiment, it is to be understood that the invention is not to be limited to the disclosed embodiment, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (10)

1. A method of locating an aircraft, the aircraft comprising an onboard camera, the method comprising the steps of:
acquiring first feature information from target area map data, wherein the first feature information at least comprises feature points, feature descriptors corresponding to the feature points and a first feature dictionary corresponding to the feature points;
carrying out route planning on the map data of the target area to obtain a plurality of waypoints;
respectively acquiring three-dimensional coordinate data of the waypoints;
for each waypoint, collecting a feature descriptor and a first feature dictionary of an area which can be covered by the field angle of the airborne camera by taking the pixel coordinates of the waypoint as the center, and generating a feature set;
organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint to obtain track information;
acquiring real-time image data acquired by the airborne camera according to a preset interval;
and determining the current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data.
2. The method of claim 1, wherein the step of determining current position information of the UAV based on the track information and the real-time image data comprises:
judging whether the unmanned aerial vehicle deviates from a route corresponding to the flight path information or not based on the flight path information and the real-time image data;
if the unmanned aerial vehicle does not deviate from the air route, acquiring temporary positioning information of the unmanned aerial vehicle;
and determining the current position information of the unmanned aerial vehicle based on the temporary positioning information.
3. The method of claim 2, wherein the step of determining whether the UAV deviates from a route corresponding to the track information based on the track information and the real-time image data comprises:
extracting second feature information from the real-time image data, wherein the second feature information comprises a second feature dictionary;
calculating the matching degree of the second feature dictionary and each first feature dictionary in the feature set;
and if the first feature dictionary with the matching degree larger than a preset threshold value does not exist, judging that the unmanned aerial vehicle deviates from the route corresponding to the track information.
4. The method of claim 3, wherein the step of obtaining temporary positioning information for the UAV comprises:
if the first feature dictionary with the matching degree larger than the preset threshold exists, obtaining feature points corresponding to the first feature dictionary with the matching degree larger than the preset threshold, and taking the area determined by the feature points as temporary positioning information of the unmanned aerial vehicle.
5. The method according to claim 3 or 4, wherein the feature set further comprises a first feature descriptor corresponding to each feature point, the second feature information comprises a second feature descriptor, and the step of determining the current position information of the UAV based on the temporary positioning information comprises:
determining neighborhood waypoints based on the temporary positioning information, and acquiring a first feature descriptor associated with the neighborhood waypoints;
matching the second feature descriptor with the first feature descriptor to determine a matched feature point sequence;
and determining the current position information of the unmanned aerial vehicle based on the matched feature point sequence.
6. The method according to claim 5, wherein the step of determining the current position information of the UAV based on the matched sequence of feature points comprises:
and calculating the matched characteristic point sequence by adopting a preset positioning algorithm, and determining the current position information of the unmanned aerial vehicle.
7. The method according to any one of claims 1-6, further comprising:
determining the position offset of the current position information and the track information;
and correcting the current position information based on the position offset.
8. A positioning device for an aircraft, characterized in that it comprises:
the first obtaining module is used for obtaining first feature information from the map data of the target area, wherein the first feature information at least comprises feature points, feature descriptors corresponding to the feature points and a first feature dictionary corresponding to the feature points;
the planning module is used for planning the route of the map data of the target area to obtain a plurality of waypoints;
the second acquisition module is used for respectively acquiring the three-dimensional coordinate data of the waypoints;
the acquisition module is used for acquiring a feature descriptor and a first feature dictionary of an area which can be covered by the field angle of the airborne camera by taking the pixel coordinate of each waypoint as the center to generate a feature set;
the composition module is used for organizing the three-dimensional coordinate data and the feature set into attribute information of the waypoint so as to obtain track information;
the third acquisition module is used for acquiring real-time image data acquired by the airborne camera according to preset intervals;
and the first determining module is used for determining the current position information of the unmanned aerial vehicle based on the flight path information and the real-time image data.
9. An electronic device, comprising: memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps in the method of positioning an aircraft according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, carries out the steps in the method of positioning an aircraft according to any one of claims 1 to 7.
CN202211080176.3A 2022-09-05 2022-09-05 Aircraft positioning method and device, electronic equipment and storage medium Pending CN115457129A (en)

Publications (1)

Publication Number Publication Date
CN115457129A true CN115457129A (en) 2022-12-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination