CN115618749B — Error compensation method for real-time positioning of large unmanned aerial vehicle

Publication number: CN115618749B (granted 2023-04-07; application published as CN115618749A on 2023-01-17)
Application number: CN202211547468.3A (China)
Priority and filing date: 2022-12-05
Inventors: 罗登, 陈翔, 杨磊, 潘星
Applicant and assignee: Sichuan Tengdun Technology Co Ltd
Original language: Chinese (zh)
Legal status: Active

Classifications

    • G06F 30/27 — Computer-aided design; design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
    • G06N 3/086 — Neural networks; learning methods using evolutionary algorithms, e.g. genetic algorithms or genetic programming
    • G06N 3/126 — Computing arrangements based on genetic models; evolutionary algorithms, e.g. genetic algorithms or genetic programming
    • G06T 7/70 — Image analysis; determining position or orientation of objects or cameras
    • G06T 2207/10016 — Image acquisition modality: video; image sequence
    • Y02T 10/40 — Climate change mitigation technologies related to transportation: engine management systems


Abstract

The invention provides an error compensation method for real-time positioning of a large unmanned aerial vehicle, comprising the following steps: S1, acquiring an error data set for real-time positioning of the unmanned aerial vehicle; S2, constructing an error model based on the error data set; S3, applying the error model; and S4, updating the error model. The method fully accounts for the working mode and working state of each sensor and for data-transmission delay, so that it accurately describes the systematic error of real-time positioning of a large unmanned aerial vehicle. By collecting samples from, and training on, the specific unmanned aerial vehicle in a machine-learning manner, it avoids the mismatch between a conventional general-purpose model and the unmanned aerial vehicle system, and thereby markedly improves the real-time positioning accuracy of the unmanned aerial vehicle.

Description

Error compensation method for real-time positioning of large unmanned aerial vehicle
Technical Field
The invention relates to the technical field of unmanned aerial vehicle positioning, in particular to an error compensation method for real-time positioning of a large unmanned aerial vehicle.
Background
Real-time positioning of an unmanned aerial vehicle is usually performed by direct sensor orientation using the airborne POS (position and orientation system). In general, corrections must be applied to the eccentric (lever-arm) component of the on-board GNSS, the eccentric angle between the inertial navigation unit and the camera, and the interior orientation elements of the camera. Although these corrections greatly improve positioning accuracy, the measurement error of inertial navigation, the working state of the GNSS/IMU and the GNSS positioning error still limit the accuracy of direct sensor orientation. These systematic errors are particularly significant when the unmanned aerial vehicle performs positioning tasks at long range and small pitch angles.
Disclosure of Invention
The invention aims to provide an error compensation method for real-time positioning of a large unmanned aerial vehicle, so as to solve the problem that systematic errors strongly affect positioning tasks performed at long sight distance and small pitch angle.
The invention provides an error compensation method for real-time positioning of a large unmanned aerial vehicle, which comprises the following steps:
s1, acquiring an error data set for real-time positioning of an unmanned aerial vehicle;
s2, constructing an error model based on the error data set;
s3, applying the error model;
and S4, updating the error model.
Further, step S1 includes the following substeps:
s11, setting a calibration field range, and acquiring a high-resolution image based on a satellite or unmanned aerial vehicle platform in the calibration field range;
s12, planning a flight task of the unmanned aerial vehicle;
s13, enabling the attitude of the photoelectric pod to change periodically in the flight process of the unmanned aerial vehicle;
S14, extracting, at fixed intervals, video frames from the video shot by the camera of the photoelectric pod, together with the telemetry at each frame time and the telemetry 1 s before each frame time, to obtain a video frame set I;
S15, for each frame image in the video frame set I, calculating the geographic position of its pixels from the POS data and performing geographic correction to obtain a new image set J;
S16, performing feature matching between the high-resolution image and each image in the new image set J, and assigning accurate geographic coordinates to pixels of the video frame set I according to the matching result, to obtain a point set P of pixel coordinates and a point set Q of three-dimensional space coordinates;
S17, for the i-th image in the video frame set I, calculating its exterior orientation elements by single-image space resection (back intersection) from the pixel coordinates provided by point set P and the three-dimensional space coordinates provided by point set Q; the line elements are denoted S_i, and the angular elements, converted to a quaternion, are denoted q_i; repeating this process to calculate the exterior orientation elements of every image in the video frame set I;
S18, establishing a position error data set and an attitude error data set: for the i-th image in the video frame set I, calculating, from the telemetry at the shooting time and the telemetry 1 s earlier, the projection centres and quaternions corrected by the eccentric component and eccentric angle: S_t, S_{t-1}, q_t and q_{t-1}; taking S_t, the yaw angle, the GNSS working state, and S_{t-1}, the yaw angle and the GNSS working state resolved from the telemetry 1 s before the video frame as independent variables, and the position error ΔS as the dependent variable, to establish the position error data set; likewise, taking q_t, the attitude angles, the inertial navigation working state, and q_{t-1} and the attitude angles and inertial navigation working state resolved from the telemetry 1 s before the video frame as independent variables, and the attitude error Δq as the dependent variable, to establish the attitude error data set;
repeating this process until all images in the video frame set I have been processed, to obtain the error data set.
Further, in step S12, the unmanned aerial vehicle flies within the calibration field around the calibration field center O along a petal-shaped flight path.
Further, in step S13, making the attitude of the photoelectric pod change periodically during the flight of the unmanned aerial vehicle means that the azimuth angle α and the pitch angle β of the photoelectric pod are expressed as periodic functions of time t, as in equation (1):
α = α(mod(t, T)),  β = β(mod(t, T))    (1)
where T is the period of the attitude change of the photoelectric pod, the value of T being not greater than the estimated flight time of the planned route; mod(t, T) denotes the remainder of t divided by T; β_min and β_max denote the minimum and maximum values of the pitch angle of the photoelectric pod, respectively; and s is a positive integer denoting the number of times the pitch angle β varies within one period T.
Further, step S15 includes:
for the i-th image in the video frame set I, calculating, by direct sensor orientation from the POS data, the geographic coordinates of an image point set G, obtaining a point set G'; the image point set G is a set of pixel positions sampled over the image, where w is the width of the image and h is the height of the image;
establishing a quadratic polynomial model between point set G and point set G', expressed as:
X = a0 + a1·x + a2·y + a3·x² + a4·x·y + a5·y²
Y = b0 + b1·x + b2·y + b3·x² + b4·x·y + b5·y²    (2)
where x and y are the coordinates of point set G; X and Y are the coordinates of point set G'; and a0, a1, a2, a3, a4, a5, b0, b1, b2, b3, b4, b5 are undetermined coefficients obtained by the least squares method; calculating the geographic coordinates of every pixel of the i-th image in the video frame set I with the quadratic polynomial model, and then performing grey-level resampling to obtain a new image;
repeating the above for all images in the video frame set I to obtain the new image set J.
Further, step S16 includes:
for the i-th image in the new image set J, selecting the local area of the high-resolution image according to the geographic position of the i-th image; performing feature matching with the local high-resolution image and the i-th image of the new image set J as data sources, screening out the correct matches with the RANSAC algorithm, and denoting the matched feature point sets in the high-resolution image and in the image of the new image set J as F1 and F2, respectively;
converting the pixel coordinates of the points of F1 and F2 into geographic coordinates by equation (3), and denoting the results as point set D1 and point set D2:
[X, Y, 1]^T = A · [x, y, 1]^T    (3)
where A is the affine transformation matrix, in homogeneous form, of the high-resolution image or of the i-th image in the new image set J; x and y denote the pixel coordinates of F1 or F2, and X and Y denote the geographic coordinates;
substituting each point of point set D2 into the quadratic polynomial model to calculate its pixel coordinates in the i-th image of the video frame set I, the result being denoted point set P;
substituting the geographic coordinates of each point of point set D1 into a high-precision DEM to obtain the elevation; combining the geographic coordinates of each point of D1 with the obtained elevation to form three-dimensional space coordinates, the result being denoted point set Q;
repeating the above steps until all images in the new image set J have been processed.
Further, step S2 includes the following sub-steps:
S21, using a BP neural network whose initial weights and thresholds are optimized by a genetic algorithm as the error model;
S22, standardizing the error data set, shuffling its order, and dividing it into a training set and a test set in a certain proportion;
S23, searching for the initial weights and thresholds of the BP neural network with a genetic algorithm;
S24, initializing the BP neural network with the initial weights and thresholds of step S23, taking the sum of squared differences between the BP neural network output and the dependent variable as the loss function, and continuously updating the weights and thresholds of the BP neural network by feeding in the training set, to obtain the error model;
S25, assessing the prediction accuracy, robustness and convergence of the error model from its performance on the training set and the test set; if the error model is not good enough, returning to step S21 to adjust the hyperparameters of the BP neural network.
Further, step S3 includes the following sub-steps:
S31, calculating, from the telemetry at the current time and at the time 1 s earlier, the projection centres and quaternions corrected by the eccentric component and the eccentric angle: S_t, S_{t-1}, q_t and q_{t-1};
S32, inputting S_t, the yaw angle, the GNSS working state, and S_{t-1}, the yaw angle and the GNSS working state resolved from the telemetry 1 s before the video frame into the position error model to obtain the projection centre compensation value ΔS; likewise, inputting q_t, the attitude angles, the inertial navigation working state, and q_{t-1} and the attitude angles and inertial navigation working state resolved from the telemetry 1 s before the video frame into the attitude error model to obtain the quaternion compensation value Δq;
S33, calculating the compensated projection centre S and quaternion q:
S = S_t + ΔS,  q = q_t ⊕ Δq    (4)
where S_t is the projection centre before compensation, q_t is the quaternion before compensation, and ⊕ denotes quaternion addition;
S34, determining the photographic ray equation of any image point from the projection centre S and the quaternion q; the intersection of this ray with the ground surface is the geographic coordinate of the image point.
Further, step S4 includes the following sub-steps:
S41, in a calibration task, obtaining an error data set according to step S1; in daily flight, checking at a certain frequency whether the image principal point coordinates fall within the coverage of the high-resolution image, and if so, executing steps S14 to S18 to obtain a new error data set;
S42, after the new error data accumulate to a preset amount, randomly drawing from the original error data set an amount of data equal to the new error data, merging the drawn data and the new error data into a new training set, and updating the original error model by feeding the new training set into the original error model for training.
In summary, by adopting the above technical scheme, the invention has the following beneficial effects:
1. The invention solves the control-point problem with high-resolution imagery; only a small number of control points need to be laid out in the early stage, which saves workload. The positions of the image points used in the back intersection are obtained by feature matching, so the whole process is automated.
2. The error model fully accounts for the working mode and working state of each sensor and for data-transmission delay, and therefore accurately describes the systematic error of real-time positioning of a large unmanned aerial vehicle. By collecting samples from, and training on, the specific unmanned aerial vehicle in a machine-learning manner, it avoids the mismatch between a conventional general-purpose model and the unmanned aerial vehicle system, thereby markedly improving real-time positioning accuracy.
3. The positioning technique has great advantages in emergency search and rescue and battlefield reconnaissance; operation is flexible, the attitudes of the aircraft and of the payload during ground observation are not restricted, and real-time high-precision positioning can be achieved.
4. The invention is simple to upgrade and maintain. Besides calibration tasks, the aircraft can accumulate data during daily flight tasks and learn autonomously in real time to improve the accuracy of the model.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention, and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a flowchart of an error compensation method for real-time positioning of a large unmanned aerial vehicle in the embodiment of the present invention.
Fig. 2 is a schematic route view of a petal-shaped flight route for flying when planning a flight mission of an unmanned aerial vehicle according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
As shown in fig. 1, the present embodiment provides an error compensation method for real-time positioning of a large unmanned aerial vehicle, including the following steps:
s1, acquiring an error data set of real-time positioning of the unmanned aerial vehicle:
s11, setting a calibration field range, and acquiring a high-resolution image based on a satellite or unmanned aerial vehicle platform in the calibration field range;
S12, planning the flight mission of the unmanned aerial vehicle: taking, in turn, each element of a preset set of flight heights as the flight height H, and making the unmanned aerial vehicle fly within the calibration field around the calibration field center O along a petal-shaped flight path.
As shown in fig. 2, the unmanned aerial vehicle starts from the calibration field center O, flies to point A, then turns with the set turning radius, flies to point B, and returns to point O, completing the half-'8' flight OABO. It then flies out along the BO direction and completes the half-'8' flight OFGO, and so on, until the last half-'8' is completed and the aircraft flies back to point O. The complete flight path is OABOFGOCDOOEFOBCOGHODEO.
And S13, enabling the attitude of the photoelectric pod to change periodically in the flight process of the unmanned aerial vehicle.
Specifically, making the attitude of the photoelectric pod change periodically during the flight of the unmanned aerial vehicle means that the azimuth angle α and the pitch angle β of the photoelectric pod are expressed as periodic functions of time t, as in equation (1):
α = α(mod(t, T)),  β = β(mod(t, T))    (1)
where T is the period of the attitude change of the photoelectric pod, the value of T being not greater than the estimated flight time of the planned route; mod(t, T) denotes the remainder of t divided by T; β_min and β_max denote the minimum and maximum values of the pitch angle of the photoelectric pod, respectively; and s is a positive integer denoting the number of times the pitch angle β varies within one period T.
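As an illustration only, the following Python sketch generates one plausible periodic pod-attitude schedule consistent with the above description: the azimuth sweeps a full circle once per period T while the pitch oscillates s times between β_min and β_max. The triangle-wave pitch profile and all parameter choices are assumptions, not the formula of equation (1).

```python
import math

def pod_attitude(t, T, beta_min, beta_max, s):
    """One plausible periodic pod-attitude schedule (not the patented formula):
    azimuth sweeps 0..360 degrees once per period T, and the pitch angle
    oscillates s times between beta_min and beta_max within the same period."""
    r = math.fmod(t, T)                       # remainder of t divided by T
    alpha = 360.0 * r / T                     # azimuth angle (degrees)
    phase = math.fmod(s * r / T, 1.0)         # s oscillations per period
    tri = 2.0 * abs(phase - 0.5)              # triangle wave: 1 -> 0 -> 1
    beta = beta_min + (beta_max - beta_min) * (1.0 - tri)
    return alpha, beta
```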
S14, extracting, at fixed intervals, video frames from the video shot by the camera of the photoelectric pod, together with the telemetry at each frame time and the telemetry 1 s before each frame time, to obtain a video frame set I.
Specifically, the telemetry should include the aircraft position, the aircraft attitude angles, the photoelectric pod attitude angles, the focus information, and so on.
And S15, for each frame of image in the video frame set I, calculating the geographical position of a pixel in the image according to POS data and executing geographical correction to obtain a new image set J.
Specifically, for the i-th image in the video frame set I, the geographic coordinates of an image point set G are calculated by direct sensor orientation from the POS data, giving a point set G'. Here G is a set of pixel positions sampled over the image, where w is the width of the image and h is the height of the image.
A quadratic polynomial model is established between point set G and point set G':
X = a0 + a1·x + a2·y + a3·x² + a4·x·y + a5·y²
Y = b0 + b1·x + b2·y + b3·x² + b4·x·y + b5·y²    (2)
where x and y are the coordinates of point set G, X and Y are the coordinates of point set G', and a0, a1, a2, a3, a4, a5, b0, b1, b2, b3, b4, b5 are undetermined coefficients obtained by the least squares method.
The geographic coordinates of every pixel of the i-th image in the video frame set I are calculated with equation (2), and grey-level resampling is then performed to obtain a new image.
The above steps are repeated for all images in the video frame set I to obtain the new image set J.
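A minimal sketch of the least-squares fit of the twelve coefficients of equation (2) and of its use for computing per-pixel geographic coordinates is given below; the monomial ordering and the NumPy-based implementation are illustrative assumptions.

```python
import numpy as np

def fit_quadratic_polynomial(xy, XY):
    """Least-squares estimate of the coefficients a0..a5, b0..b5 of equation (2).
    xy: (n, 2) pixel coordinates of image point set G;
    XY: (n, 2) geographic coordinates of point set G' from direct sensor orientation."""
    x, y = xy[:, 0], xy[:, 1]
    M = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    a, *_ = np.linalg.lstsq(M, XY[:, 0], rcond=None)   # a0..a5
    b, *_ = np.linalg.lstsq(M, XY[:, 1], rcond=None)   # b0..b5
    return a, b

def pixel_to_geo(a, b, x, y):
    """Geographic coordinates (X, Y) of pixel (x, y) under the fitted model."""
    m = np.array([1.0, x, y, x * x, x * y, y * y])
    return m @ a, m @ b
```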
S16, performing feature matching between the high-resolution image and each image in the new image set J, and assigning accurate geographic coordinates to pixels of the video frame set I according to the matching result, to obtain the point set P and the point set Q.
Specifically, for the i-th image in the new image set J, the local area of the high-resolution image is selected according to the geographic position of the i-th image. Feature matching is performed with the local high-resolution image and the i-th image of the new image set J as data sources, the correct matches are screened out with the RANSAC (Random Sample Consensus) algorithm, and the matched feature point sets in the high-resolution image and in the image of the new image set J are denoted F1 and F2, respectively. The pixel coordinates of the points of F1 and F2 are converted into geographic coordinates by the following formula, and the results are denoted point set D1 and point set D2:
[X, Y, 1]^T = A · [x, y, 1]^T    (3)
where A is the affine transformation matrix, in homogeneous form, of the high-resolution image or of the i-th image in the new image set J; x and y denote the pixel coordinates of F1 or F2, and X and Y denote the geographic coordinates.
Each point of point set D2 is substituted into equation (2) to calculate its pixel coordinates in the i-th image of the video frame set I, and the result is denoted point set P.
The geographic coordinates of each point of point set D1 are substituted into a high-precision DEM (Digital Elevation Model) to obtain the elevation. The geographic coordinates of each point of D1 and the obtained elevations are combined to form three-dimensional space coordinates, and the result is denoted point set Q.
The above steps are repeated until all images in the new image set J have been processed.
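A minimal sketch of this matching step follows, assuming OpenCV SIFT features, a 0.75 ratio test and a RANSAC homography as the screening model (the embodiment specifies only "feature matching" and "RANSAC"); the affine matrices passed in play the role of A in equation (3).

```python
import cv2
import numpy as np

def match_frame_to_reference(ref_img, frame_img, A_ref, A_frame):
    """Match one geocorrected frame (set J) against the local high-resolution
    reference and return the matched points as geographic coordinates.
    A_ref / A_frame: 3x3 homogeneous affine matrices (pixel -> geographic),
    i.e. the matrix A of equation (3) for each image."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(ref_img, None)
    k2, d2 = sift.detectAndCompute(frame_img, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(d1, d2, k=2)
    # Lowe ratio test (0.75 is an assumed threshold)
    good = [m[0] for m in matches if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]
    pts_ref = np.float32([k1[m.queryIdx].pt for m in good])
    pts_frm = np.float32([k2[m.trainIdx].pt for m in good])
    # RANSAC screens out the wrong correspondences
    _, mask = cv2.findHomography(pts_frm, pts_ref, cv2.RANSAC, 3.0)
    inliers = mask.ravel().astype(bool)

    def to_geo(pts, A):
        """Equation (3): homogeneous pixel coordinates -> geographic coordinates."""
        h = np.column_stack([pts, np.ones(len(pts))])
        return (A @ h.T).T[:, :2]

    # F1 -> D1 (reference) and F2 -> D2 (frame), restricted to the RANSAC inliers
    return to_geo(pts_ref[inliers], A_ref), to_geo(pts_frm[inliers], A_frame)
```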
S17, for the i-th image in the video frame set I, the exterior orientation elements are calculated by single-image space resection (back intersection) from the pixel coordinates provided by point set P and the three-dimensional space coordinates provided by point set Q; the line elements are denoted S_i, and the angular elements, converted to a quaternion, are denoted q_i. This process is repeated to calculate the exterior orientation elements of every image in the video frame set I.
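A sketch of this resection step is shown below; OpenCV's PnP solver is used here as a stand-in for the classical collinearity-equation back intersection, and the camera matrix K (taken, for instance, from the pod focus telemetry) and zero lens distortion are assumptions.

```python
import cv2
import numpy as np
from scipy.spatial.transform import Rotation

def space_resection(pixel_pts, ground_pts, K):
    """Exterior orientation of one frame from 2D-3D correspondences.
    pixel_pts: (n, 2) image coordinates of point set P;
    ground_pts: (n, 3) three-dimensional coordinates of point set Q;
    K: assumed 3x3 camera matrix of the pod camera (no distortion modelled)."""
    ok, rvec, tvec = cv2.solvePnP(ground_pts.astype(np.float64),
                                  pixel_pts.astype(np.float64),
                                  K, None, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("space resection did not converge")
    R, _ = cv2.Rodrigues(rvec)
    S_i = (-R.T @ tvec).ravel()                 # line elements: projection centre
    q_i = Rotation.from_matrix(R.T).as_quat()   # angular elements as quaternion (x, y, z, w)
    return S_i, q_i
```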
And S18, establishing a position error data set and an attitude error data set.
Specifically, for the i-th image in the video frame set I, the projection centres and quaternions corrected by the eccentric component and the eccentric angle are calculated from the telemetry at the shooting time and from the telemetry 1 s earlier: S_t, S_{t-1}, q_t and q_{t-1}.
S_t, the yaw angle, the GNSS working state, and S_{t-1}, the yaw angle and the GNSS working state resolved from the telemetry 1 s before the video frame are taken as the independent variables, and the position error ΔS is taken as the dependent variable, to establish the position error data set. Likewise, q_t, the attitude angles, the inertial navigation working state, and q_{t-1} and the attitude angles and inertial navigation working state resolved from the telemetry 1 s before the video frame are taken as the independent variables, and the attitude error Δq is taken as the dependent variable, to establish the attitude error data set.
This process is repeated until all images in the video frame set I have been processed, giving the error data set.
S2, constructing an error model based on the error data set:
respectively constructing a position error model and an attitude error model by adopting the steps S21-S25 for the position error data set and the attitude error data set:
and S21, using a BP neural network for optimizing initial weight and threshold value based on a genetic algorithm as an error model.
The hyperparameters of the BP neural network are set as follows: there is one hidden layer; the initial number of hidden-layer nodes is determined by an empirical formula and then adjusted iteratively according to the results. The activation function is sigmoid, the optimizer is Adam, and the learning rate is 0.01. The number of input-layer nodes is determined by the error data set, and the number of output-layer nodes is 3 (position error model) or 4 (attitude error model).
S22, the error data set is standardized and shuffled, and divided into a training set and a test set in a certain proportion (such as 7:3).
And S23, searching the initial weight and the threshold value of the BP neural network by using a genetic algorithm.
S24, the BP neural network is initialized with the initial weights and thresholds of step S23; the sum of squared differences between the BP neural network output and the dependent variable is taken as the loss function, and the weights and thresholds of the BP neural network are continuously updated by feeding in the training set, yielding the error model.
S25, the prediction accuracy, robustness and convergence of the error model are assessed from its performance on the training set and the test set. If the error model is not good enough, the procedure returns to step S21 to adjust the hyperparameters of the BP neural network.
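A compact, NumPy-only sketch of steps S21 to S24 follows: a genetic algorithm searches for the initial weights and thresholds of a one-hidden-layer sigmoid BP network by minimizing the sum-of-squares loss. The population size, crossover and mutation operators and their parameters are illustrative choices not specified by the embodiment; the returned chromosome would then initialize ordinary gradient training (Adam, learning rate 0.01).

```python
import numpy as np

rng = np.random.default_rng(0)

def unpack(w, n_in, n_hid, n_out):
    """Split a flat chromosome into the weights/thresholds of a one-hidden-layer network."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def forward(w, n_in, n_hid, n_out, X):
    W1, b1, W2, b2 = unpack(w, n_in, n_hid, n_out)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))      # sigmoid hidden layer
    return h @ W2 + b2

def sse(w, n_in, n_hid, n_out, X, y):
    """Sum of squared differences between network output and dependent variable."""
    return float(np.sum((forward(w, n_in, n_hid, n_out, X) - y) ** 2))

def ga_initial_weights(n_in, n_hid, n_out, X, y, pop=40, gens=60, p_mut=0.05):
    """Genetic search for the initial weights and thresholds (step S23)."""
    dim = n_in * n_hid + n_hid + n_hid * n_out + n_out
    P = rng.normal(0.0, 0.5, (pop, dim))
    for _ in range(gens):
        fit = np.array([sse(p, n_in, n_hid, n_out, X, y) for p in P])
        elite = P[np.argsort(fit)[:pop // 2]]                # selection: keep best half
        children = []
        for _ in range(pop - len(elite)):
            pa, pb = elite[rng.integers(len(elite), size=2)]
            cut = rng.integers(1, dim)
            child = np.concatenate([pa[:cut], pb[cut:]])     # single-point crossover
            mask = rng.random(dim) < p_mut
            child = child + mask * rng.normal(0.0, 0.3, dim) # mutation
            children.append(child)
        P = np.vstack([elite] + children)
    fit = np.array([sse(p, n_in, n_hid, n_out, X, y) for p in P])
    return P[np.argmin(fit)]   # best chromosome initializes the BP network (step S24)
```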
S3, applying the error model:
S31, the projection centres and quaternions corrected by the eccentric component and the eccentric angle are calculated from the telemetry at the current time and at the time 1 s earlier: S_t, S_{t-1}, q_t and q_{t-1}.
S32, S_t, the yaw angle, the GNSS working state, and S_{t-1}, the yaw angle and the GNSS working state resolved from the telemetry 1 s before the video frame are input into the position error model to obtain the projection centre compensation value ΔS. Likewise, q_t, the attitude angles, the inertial navigation working state, and q_{t-1} and the attitude angles and inertial navigation working state resolved from the telemetry 1 s before the video frame are input into the attitude error model to obtain the quaternion compensation value Δq.
S33, the compensated projection centre S and quaternion q are calculated:
S = S_t + ΔS,  q = q_t ⊕ Δq    (4)
where S_t is the projection centre before compensation, q_t is the quaternion before compensation, and ⊕ denotes quaternion addition.
S34, the photographic ray equation of any image point is determined from the projection centre S and the quaternion q; the intersection of this ray with the ground surface gives the geographic coordinate of the image point.
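A sketch of applying the compensation of equation (4) is given below; since the "quaternion addition" operator is only named in the text, it is interpreted here as quaternion composition, which is an assumption (component-wise addition followed by renormalization would be the literal reading and can be swapped in).

```python
import numpy as np
from scipy.spatial.transform import Rotation

def compensate(S_t, q_t, dS, dq):
    """Equation (4): add the predicted position compensation to the telemetry
    projection centre, and combine the predicted quaternion compensation with
    the telemetry quaternion. Quaternions are in (x, y, z, w) order; the use of
    composition for the 'quaternion addition' is an interpretive assumption."""
    S = np.asarray(S_t, dtype=float) + np.asarray(dS, dtype=float)
    q = (Rotation.from_quat(dq) * Rotation.from_quat(q_t)).as_quat()
    return S, q
```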
S4, updating the error model:
S41, in a calibration task, an error data set is obtained according to step S1; in daily flight, it is checked at a certain frequency whether the image principal point coordinates fall within the coverage of the high-resolution image, and if so, steps S14 to S18 are executed to obtain a new error data set.
S42, after the new error data accumulate to a preset amount (for example, 1000 error data records), an amount of data equal to the new error data is randomly drawn from the original error data set, the drawn data and the new error data are merged into a new training set, and the original error model is updated by feeding the new training set into the original error model for training. The model training method is the same as in step S21, with the learning rate adjusted to 0.001.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (5)

1. An error compensation method for real-time positioning of a large unmanned aerial vehicle, characterized by comprising the following steps:
S1, acquiring an error data set for real-time positioning of the unmanned aerial vehicle;
S2, constructing an error model based on the error data set;
S3, applying the error model;
S4, updating the error model;
wherein step S1 includes the following substeps:
S11, setting a calibration field range, and acquiring a high-resolution image based on a satellite or unmanned aerial vehicle platform within the calibration field range;
S12, planning the flight mission of the unmanned aerial vehicle;
S13, making the attitude of the photoelectric pod change periodically during the flight of the unmanned aerial vehicle;
S14, extracting, at fixed intervals, video frames from the video shot by the camera of the photoelectric pod, together with the telemetry at each frame time and the telemetry 1 s before each frame time, to obtain a video frame set I;
S15, for each frame image in the video frame set I, calculating the geographic position of its pixels from the POS data and performing geographic correction to obtain a new image set J;
S16, performing feature matching between the high-resolution image and each image in the new image set J, and assigning accurate geographic coordinates to pixels of the video frame set I according to the matching result, to obtain a point set P and a point set Q;
S17, for the i-th image in the video frame set I, calculating its exterior orientation elements by single-image space resection from the pixel coordinates provided by point set P and the three-dimensional space coordinates provided by point set Q, the line elements being denoted S_i and the angular elements, converted to a quaternion, being denoted q_i; repeating this process to calculate the exterior orientation elements of every image in the video frame set I;
S18, establishing a position error data set and an attitude error data set: for the i-th image in the video frame set I, calculating, from the telemetry at the shooting time and the telemetry 1 s earlier, the projection centres and quaternions corrected by the eccentric component and eccentric angle: S_t, S_{t-1}, q_t and q_{t-1}; taking S_t, the yaw angle, the GNSS working state, and S_{t-1}, the yaw angle and the GNSS working state resolved from the telemetry 1 s before the video frame as independent variables and the position error ΔS as the dependent variable, to establish the position error data set; likewise, taking q_t, the attitude angles, the inertial navigation working state, and q_{t-1} and the attitude angles and inertial navigation working state resolved from the telemetry 1 s before the video frame as independent variables and the attitude error Δq as the dependent variable, to establish the attitude error data set;
repeating this process until all images in the video frame set I have been processed, to obtain the error data set;
step S15 includes:
for the i-th image in the video frame set I, calculating, by direct sensor orientation from the POS data, the geographic coordinates of an image point set G, obtaining a point set G', where G is a set of pixel positions sampled over the image, w is the width of the image and h is the height of the image;
establishing a quadratic polynomial model between point set G and point set G', expressed as:
X = a0 + a1·x + a2·y + a3·x² + a4·x·y + a5·y²
Y = b0 + b1·x + b2·y + b3·x² + b4·x·y + b5·y²    (2)
where x and y are the coordinates of point set G, X and Y are the coordinates of point set G', and a0, a1, a2, a3, a4, a5, b0, b1, b2, b3, b4, b5 are undetermined coefficients obtained by the least squares method; calculating the geographic coordinates of every pixel of the i-th image in the video frame set I with the quadratic polynomial model, and then performing grey-level resampling to obtain a new image;
repeating the above for all images in the video frame set I to obtain the new image set J;
step S16 includes:
for the i-th image in the new image set J, selecting the local area of the high-resolution image according to the geographic position of the i-th image; performing feature matching with the local high-resolution image and the i-th image of the new image set J as data sources, screening out the correct matches with the RANSAC algorithm, and denoting the matched feature point sets in the high-resolution image and in the image of the new image set J as F1 and F2, respectively;
converting the pixel coordinates of the points of F1 and F2 into geographic coordinates by equation (3), and denoting the results as point set D1 and point set D2:
[X, Y, 1]^T = A · [x, y, 1]^T    (3)
where A is the affine transformation matrix, in homogeneous form, of the high-resolution image or of the i-th image in the new image set J, x and y denote the pixel coordinates of F1 or F2, and X and Y denote the geographic coordinates;
substituting each point of point set D2 into the quadratic polynomial model to calculate its pixel coordinates in the i-th image of the video frame set I, the result being denoted point set P;
substituting the geographic coordinates of each point of point set D1 into a high-precision DEM to obtain the elevation, and combining the geographic coordinates of each point of D1 with the obtained elevation to form three-dimensional space coordinates, the result being denoted point set Q;
repeating the above steps until all images in the new image set J have been processed.
2. The error compensation method for real-time positioning of a large unmanned aerial vehicle according to claim 1, characterized in that, in step S12, the unmanned aerial vehicle flies within the calibration field around the calibration field center O along a petal-shaped flight path.
3. The error compensation method for real-time positioning of a large unmanned aerial vehicle according to claim 2, characterized in that, in step S13, making the attitude of the photoelectric pod change periodically during the flight of the unmanned aerial vehicle means that the azimuth angle α and the pitch angle β of the photoelectric pod are expressed as periodic functions of time t, as in equation (1):
α = α(mod(t, T)),  β = β(mod(t, T))    (1)
where T is the period of the attitude change of the photoelectric pod, the value of T being not greater than the estimated flight time of the planned route; mod(t, T) denotes the remainder of t divided by T; β_min and β_max denote the minimum and maximum values of the pitch angle of the photoelectric pod, respectively; and s is a positive integer denoting the number of times the pitch angle β varies within one period T.
4. The error compensation method for real-time positioning of a large unmanned aerial vehicle according to claim 3, characterized in that step S3 comprises the following substeps:
S31, calculating, from the telemetry at the current time and at the time 1 s earlier, the projection centres and quaternions corrected by the eccentric component and the eccentric angle: S_t, S_{t-1}, q_t and q_{t-1};
S32, inputting S_t, the yaw angle, the GNSS working state, and S_{t-1}, the yaw angle and the GNSS working state resolved from the telemetry 1 s before the video frame into the position error model to obtain the projection centre compensation value ΔS; likewise, inputting q_t, the attitude angles, the inertial navigation working state, and q_{t-1} and the attitude angles and inertial navigation working state resolved from the telemetry 1 s before the video frame into the attitude error model to obtain the quaternion compensation value Δq;
S33, calculating the compensated projection centre S and quaternion q:
S = S_t + ΔS,  q = q_t ⊕ Δq    (4)
where S_t is the projection centre before compensation, q_t is the quaternion before compensation, and ⊕ denotes quaternion addition;
S34, determining the photographic ray equation of any image point from the projection centre S and the quaternion q, the intersection of this ray with the ground surface being the geographic coordinate of the image point.
5. The error compensation method for the real-time positioning of the large unmanned aerial vehicle according to claim 4, wherein the step S4 comprises the following substeps:
S41, in a calibration task, obtaining an error data set according to step S1; in daily flight, checking at a certain frequency whether the image principal point coordinates fall within the coverage of the high-resolution image, and if so, executing steps S14 to S18 to obtain a new error data set;
S42, after the new error data accumulate to a preset amount, randomly drawing from the original error data set an amount of data equal to the new error data, merging the drawn data and the new error data into a new training set, and updating the original error model by feeding the new training set into the original error model for training.



