CN115511956A - Unmanned aerial vehicle imaging positioning method

Unmanned aerial vehicle imaging positioning method

Info

Publication number
CN115511956A
CN115511956A (application CN202211479183.0A)
Authority
CN
China
Prior art keywords
coordinate system
aerial vehicle
unmanned aerial
image
axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211479183.0A
Other languages
Chinese (zh)
Inventor
韩静
陈名洋
陈霄宇
瞿超
魏驰恒
张靖远
王泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CN202211479183.0A priority Critical patent/CN115511956A/en
Publication of CN115511956A publication Critical patent/CN115511956A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Mathematics (AREA)
  • Computing Systems (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The invention discloses an unmanned aerial vehicle imaging positioning method, which comprises the following steps: step one: determining the actual meanings of the yaw angle, pitch angle and roll angle of a gyroscope on the unmanned aerial vehicle; step two: establishing the coordinate systems required for image positioning; step three: establishing an image pixel coordinate system and an image physical coordinate system from the image, and calculating the three-dimensional coordinates of the actual position of the target point in the camera coordinate system; step four: converting from the camera coordinate system into the unmanned aerial vehicle geographic coordinate system; step five: converting from the unmanned aerial vehicle geographic coordinate system into the geodetic rectangular coordinate system; step six: converting from the geodetic rectangular coordinate system into the GPS coordinate system. The method has the advantages that, according to the geometric relationship among the target point, the image point and the measuring point in the target positioning process, the longitude, latitude and height of the target point are calculated with a Matlab algorithm, realizing target positioning and improving the precision of unmanned aerial vehicle target positioning imaging.

Description

Unmanned aerial vehicle imaging positioning method
Technical Field
The invention relates to the technical field of unmanned aerial vehicle imaging positioning, in particular to an unmanned aerial vehicle imaging positioning method.
Background
The ground target imaging technology of unmanned aerial vehicles is a modern advanced remote positioning technology, widely applied in both military and civilian fields; its key technical points include target detection and identification and target positioning. The precision of the target positioning technology largely determines the precision level of unmanned aerial vehicle target positioning imaging. With the rapid development of modern technology, the precision requirements on target positioning technology in every field are becoming higher and higher.
Because research and development of unmanned aerial vehicle technology in China, particularly target imaging positioning systems, started later than abroad, the field of unmanned aerial vehicle imaging positioning suffers from poor positioning accuracy and poor interference resistance. Rapid development of unmanned aerial vehicle imaging positioning technology is therefore important for research and development in China's reconnaissance field.
Disclosure of Invention
The invention aims to provide an unmanned aerial vehicle imaging positioning method to solve the problems in the background technology.
In order to achieve the purpose, the invention provides the following technical scheme: an unmanned aerial vehicle imaging positioning method comprises the following steps:
the method comprises the following steps: acquiring a yaw angle, a pitch angle and a roll angle when an image is shot by using a gyroscope on the unmanned aerial vehicle; step two: establishing a coordinate system required by image positioning, wherein the coordinate system comprises a camera coordinate system, an unmanned aerial vehicle geographic coordinate system, a GPS coordinate system and a geodetic rectangular coordinate system;
step three: establishing an image pixel coordinate system and an image physical coordinate system according to the image, and then calculating a three-dimensional coordinate of the actual position of a target point on the image in a camera coordinate system;
step four: substituting the three-dimensional coordinates of the target point in the camera coordinate system, which are obtained by calculation in the third step, into the unmanned aerial vehicle geographic coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the unmanned aerial vehicle geographic coordinate system; step five: substituting the three-dimensional coordinates of the target point obtained by calculation in the fourth step in the unmanned aerial vehicle geographic coordinate system into a geodetic rectangular coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the geodetic rectangular coordinate system;
step six: and substituting the three-dimensional coordinates of the target point calculated in the step five in the geodetic rectangular coordinate system into the GPS coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the GPS coordinate system.
Furthermore, in the camera coordinate system, the center point of the plane of the camera is taken as the origin, marked as $O_c$; the optical axis direction of the camera is taken as the Z-axis direction, and the X-axis and Y-axis directions are the same as the X-axis and Y-axis directions in the image.
Further, in the unmanned aerial vehicle geographic coordinate system, the actual position of the unmanned aerial vehicle is taken as the origin, the perpendicular between the unmanned aerial vehicle and the ground is taken as the Z axis, the due-south direction of the unmanned aerial vehicle is taken as the Y axis, and the due-east direction is taken as the X axis; during downward shooting, the yaw angle, pitch angle and roll angle of the gyroscope respectively represent the rotation angles between the three polar axes of the camera coordinate system and of the unmanned aerial vehicle coordinate system. The GPS coordinate system is the world geodetic coordinate system WGS-84: its coordinate origin is the earth's centroid, its X axis points from the origin to the intersection of the prime meridian and the equator, its Z axis points toward the earth's north pole, and its Y axis is perpendicular to the XOZ plane and forms a right-hand coordinate system with the X and Z axes; the coordinates of any point in this coordinate system can be represented as B, L and H, respectively the latitude, longitude and height of the point. The coordinate axes and coordinate origin of the geodetic rectangular coordinate system coincide completely with those of the GPS coordinate system.
Further, in step three, the image pixel coordinate system is a rectangular coordinate system with the upper left corner of the currently photographed image as the origin and pixels as the coordinate units, where $X_p$ and $Y_p$ respectively represent the row and column numbers of a pixel in the digital image; the image physical coordinate system is a rectangular coordinate system with the intersection of the camera optical axis and the image sensor inside the camera as the origin and actual physical size as the unit, where the $X_w$ and $Y_w$ axes are respectively parallel to the $X_p$ and $Y_p$ axes of the image pixel coordinate system.
Furthermore, according to the central perspective projection model, the image physical coordinates $(x, y)$ of a pixel point p and the coordinates $(X_c, Y_c, Z_c)$ of the target point P in the camera coordinate system are related as follows:

$$x = f\,\frac{X_c}{Z_c},\qquad y = f\,\frac{Y_c}{Z_c} \tag{5-1}$$

where $f$ is the focal length of the camera. The image pixel coordinates $(u, v)$ of the pixel point p are related to its image physical coordinates $(x, y)$ as follows:

$$u = \frac{x}{dx} + u_0,\qquad v = \frac{y}{dy} + v_0 \tag{5-2}$$

where $(u_0, v_0)$ are the image pixel coordinates of the image principal point, i.e. the intersection of the camera optical axis and the image sensor inside the camera, and dx and dy are the physical sizes of a single camera pixel in the $X_w$ and $Y_w$ directions, respectively. Combining relation (5-1) and relation (5-2), the image pixel coordinates $(u, v)$ of the target can be converted into the image physical coordinates $(x, y)$, and the coordinates of the actual position of the target in the image in the camera coordinate system are obtained as $(X_c, Y_c, Z_c)$.
Further, in step four, the steps for converting the coordinates of a point in the camera coordinate system into the unmanned aerial vehicle geographic coordinate system are as follows. S1: rotate the coordinate system around the X axis by the roll angle $\alpha$ in the right-hand screw direction, i.e. the right hand grips the rotation axis and the four fingers point in the rotation direction, with conversion matrix

$$R_x(\alpha)=\begin{pmatrix}1&0&0\\0&\cos\alpha&-\sin\alpha\\0&\sin\alpha&\cos\alpha\end{pmatrix}$$

S2: rotate the coordinate system around the Y axis by the pitch angle $\beta$ in the right-hand screw direction, with conversion matrix

$$R_y(\beta)=\begin{pmatrix}\cos\beta&0&\sin\beta\\0&1&0\\-\sin\beta&0&\cos\beta\end{pmatrix}$$

S3: rotate the coordinate system around the Z axis by the azimuth angle $\gamma$ in the right-hand screw direction, with conversion matrix

$$R_z(\gamma)=\begin{pmatrix}\cos\gamma&-\sin\gamma&0\\\sin\gamma&\cos\gamma&0\\0&0&1\end{pmatrix}$$

In the above steps, $\alpha$, $\beta$ and $\gamma$ are the roll angle, pitch angle and azimuth angle obtained, when the picture is taken, by summing the corresponding attitude angles of the camera and of the gimbal acquired by the unmanned aerial vehicle.

The target in the image is then represented in the unmanned aerial vehicle geographic coordinate system as

$$\begin{pmatrix}X_g\\Y_g\\Z_g\end{pmatrix}=R_z(\gamma)\,R_y(\beta)\,R_x(\alpha)\begin{pmatrix}X_c\\Y_c\\Z_c\end{pmatrix} \tag{6-1}$$

and the coordinates of the actual position of the target point in the image in the unmanned aerial vehicle geographic coordinate system are expressed as $(X_g, Y_g, Z_g)$.
Further, in step four, the angle $\theta$ between the line connecting the unmanned aerial vehicle and the target point and the perpendicular from the unmanned aerial vehicle to the ground is calculated, together with the angle $\varphi$ between the line connecting the unmanned aerial vehicle and the target point and the line connecting the unmanned aerial vehicle and the image center point; the expressions for $\theta$ and $\varphi$ follow from the geometric relationship shown in FIG. 2. Subsequently, using the GPS information of the unmanned aerial vehicle acquired when the photograph was taken (the longitude $L_0$, latitude $B_0$ and altitude $H$), the coordinates of the target in the unmanned aerial vehicle geographic coordinate system are calculated: the projection of the target on the z axis of the unmanned aerial vehicle geographic coordinate system, i.e. the z-axis coordinate of the target, is determined by the altitude $H$, and the x-axis and y-axis coordinates are then obtained from the ratio of the z-axis coordinate value to the camera focal length, giving the coordinates $(x_g, y_g, z_g)$ in the unmanned aerial vehicle geographic coordinate system, where $f$ is the focal length of the camera and $L_0$, $B_0$ and $H$ respectively denote the longitude, latitude and altitude at which the unmanned aerial vehicle took the picture.
Further, in step five, coordinate rotation is performed first: the unmanned aerial vehicle geographic coordinate system is rotated around the X axis in the right-hand screw direction (the right-hand screw principle is adopted: the right hand grips the rotation axis and the four fingers point in the rotation direction), then rotated around the Z axis pointing toward the ground in the right-hand screw direction, and finally rotated by -90° around the Z axis, yielding the conversion matrix, written here in the standard combined form for latitude $B_0$ and longitude $L_0$:

$$R=\begin{pmatrix}-\sin L_0&-\sin B_0\cos L_0&\cos B_0\cos L_0\\\cos L_0&-\sin B_0\sin L_0&\cos B_0\sin L_0\\0&\cos B_0&\sin B_0\end{pmatrix} \tag{8-1}$$

Coordinate translation is performed after rotation; the translation is as follows:

$$\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}x'\\y'\\z'\end{pmatrix}+\begin{pmatrix}(N+H)\cos B_0\cos L_0\\(N+H)\cos B_0\sin L_0\\\left(N(1-e^2)+H\right)\sin B_0\end{pmatrix} \tag{8-2}$$

In matrix (8-2), the $(x, y, z)$ on the left are the coordinates after rotation and translation, $(x', y', z')$ are the rotated coordinates, and the parameters on the right have the following meanings:

$$N=\frac{a}{\sqrt{1-e^2\sin^2 B_0}}$$

where $a$ is the semi-major radius of the ellipsoid, $N$ is the radius of curvature of the earth in the prime vertical, and $e$ is the first eccentricity of the earth. Combining matrix (8-1) and matrix (8-2), the coordinates of the target point can be converted from the unmanned aerial vehicle geographic coordinate system into the geodetic rectangular coordinate system, expressed as $(X_t, Y_t, Z_t)$, where $L_0$, $B_0$ and $H$ are the longitude, latitude and altitude of the unmanned aerial vehicle when the photograph was taken.
Further, in step six, the conversion from the geodetic rectangular coordinate system to the GPS coordinate system is as follows:

$$L=\arctan\frac{Y_t}{X_t},\qquad H=\frac{\sqrt{X_t^2+Y_t^2}}{\cos B}-N,\qquad N=\frac{a}{\sqrt{1-e^2\sin^2 B}}$$

where $L$, $B$ and $H$ are respectively the longitude, latitude and height of the target to be solved, and $(X_t, Y_t, Z_t)$ are the coordinates of the target in the geodetic rectangular coordinate system. An iterative algorithm is adopted for the latitude: first the initial value of $B$ is set as

$$B^{(0)}=\arctan\frac{Z_t}{\sqrt{X_t^2+Y_t^2}}$$

then the updated value of $B$ is calculated:

$$B^{(i+1)}=\arctan\frac{Z_t+N^{(i)}e^2\sin B^{(i)}}{\sqrt{X_t^2+Y_t^2}}$$

Finally the initial value and the updated value are compared; if their difference is within the error range, the iteration ends and the geodetic coordinates of the target are calculated; otherwise the updated value is taken as the new initial value and the iteration continues until the difference between $B^{(i+1)}$ and $B^{(i)}$ is within the error range, which is less than 0.00000001, measured in radians.

Here $a$ is the semi-major radius of the earth, $N$ is the radius of curvature of the earth in the prime vertical, $e$ is the first eccentricity of the ellipsoid, and $B^{(i)}$, $N^{(i)}$ and $H^{(i)}$ respectively denote the values of $B$, $N$ and $H$ at the i-th iteration. In the specific experiments, the altitude $H$ of the unmanned aerial vehicle when taking the picture can be obtained by radar to achieve a more accurate altitude value.
Compared with the prior art, the invention has the beneficial effects that: according to the geometric relationship among the target point, the image point and the measuring point in the target positioning process, the longitude and latitude and the height of the target point are calculated by utilizing a Matlab algorithm, so that the target positioning is realized, and the accuracy of the target positioning imaging of the unmanned aerial vehicle is improved by analyzing the yaw angle, the roll angle and the pitch angle.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic view of a central perspective projection model of the present invention;
FIG. 2 is a geometric relationship diagram for object localization in accordance with the present invention;
FIG. 3 is a schematic diagram of the transformation from the geographical coordinate system of the unmanned aerial vehicle to the rectangular coordinate system of the earth according to the present invention;
FIG. 4 is a line graph of the GPS error when hovering at 50 m according to the present invention;
FIG. 5 is a line graph of the GPS error when hovering at 10 m according to the present invention;
FIG. 6 is a line graph of the camera attitude angle horizontal error at 1.99 m according to the present invention;
FIG. 7 is a diagram of positioning error (horizontal projection) at different heights according to the present invention;
FIG. 8 is a graph of positioning error (range error) at different heights in accordance with the present invention;
FIG. 9 is a diagram of positioning error (horizontal projection) for different attitude angles according to the present invention;
FIG. 10 is a graph of the positioning error of point A under a variable attitude angle according to the present invention;
FIG. 11 is a diagram of the positioning error of point B under a variable attitude angle according to the present invention;
FIG. 12 is a variable attitude angle horizontal error line chart of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1-3, in an embodiment of the present invention, an unmanned aerial vehicle imaging positioning method includes the following steps:
the method comprises the following steps: acquiring a yaw angle, a pitch angle and a roll angle when an image is shot by using a gyroscope on the unmanned aerial vehicle; step two: establishing a coordinate system required by image positioning, wherein the coordinate system comprises a camera coordinate system, an unmanned aerial vehicle geographic coordinate system, a GPS coordinate system and a geodetic rectangular coordinate system;
step three: establishing an image pixel coordinate system and an image physical coordinate system according to the image, and then calculating a three-dimensional coordinate of the actual position of a target point on the image in a camera coordinate system;
step four: substituting the three-dimensional coordinates of the target point in the camera coordinate system, which are obtained by calculation in the third step, into the unmanned aerial vehicle geographic coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the unmanned aerial vehicle geographic coordinate system; step five: substituting the three-dimensional coordinates of the target point obtained by calculation in the fourth step in the unmanned aerial vehicle geographic coordinate system into a geodetic rectangular coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the geodetic rectangular coordinate system;
step six: and substituting the three-dimensional coordinates of the target point calculated in the step five in the geodetic rectangular coordinate system into the GPS coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the GPS coordinate system.
The first embodiment is as follows:
as shown in FIGS. 1-2, an image pixel rectangular coordinate system, X, is established with the upper left corner of the image as the origin and the pixels as coordinate units p And Y p Respectively representing the number of rows and the number of columns of the pixel in the digital image;
the physical coordinates of the imageIs a rectangular coordinate system with the intersection point of the camera optical axis and the image sensor inside the camera as the origin and the actual physical size as the unit, wherein X w Axis, Y w Axes respectively associated with X of the image pixel coordinate system p And Y p The axes are parallel;
as shown in fig. 1, the image physical coordinates of the pixel point p are based on the mapping relationship of the central perspective projection model
Figure 166931DEST_PATH_IMAGE063
Coordinates of the camera coordinate system with the object point P
Figure 578321DEST_PATH_IMAGE064
The relationship of (A) is as follows:
Figure 750676DEST_PATH_IMAGE065
(5-1) wherein
Figure 764768DEST_PATH_IMAGE066
The focal length of the camera, the image pixel coordinate of the pixel point p
Figure 971759DEST_PATH_IMAGE067
With its image physical coordinates
Figure 163837DEST_PATH_IMAGE063
The relationship of (c) is as follows:
Figure 823488DEST_PATH_IMAGE068
(5-2) wherein, in the above,
Figure 516638DEST_PATH_IMAGE069
the image pixel coordinate of the image principal point, namely the intersection point of the camera optical axis and the image sensor in the camera; dx and dy are respectively a single pixel of the camera at X w And Y w The physical size in the direction, in combination with relation (5-1) and relation (5-2) can coordinate the image pixel of the object
Figure 702769DEST_PATH_IMAGE063
Conversion to an imagePhysical coordinates
Figure 455961DEST_PATH_IMAGE063
The geometrical relationship diagram is shown in FIG. 2;
the coordinates of the target point in the image in the camera coordinate system are
Figure 602908DEST_PATH_IMAGE070
(ii) a Image physical coordinates of pixel point p
Figure 706606DEST_PATH_IMAGE063
Coordinates of the camera coordinate system with the object point P
Figure 622610DEST_PATH_IMAGE071
Figure 671337DEST_PATH_IMAGE072
The focal length of the camera, the image pixel coordinate of the pixel point p
Figure 305581DEST_PATH_IMAGE073
Figure 606112DEST_PATH_IMAGE074
The coordinate of the actual position of the target in the image in the camera coordinate system is obtained as the image principal point
Figure 986409DEST_PATH_IMAGE070
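As an illustration, inverting relation (5-2) and applying relation (5-1) can be sketched in a few lines of Python; this is a minimal sketch, and the focal length, pixel pitch and principal point below are assumed values rather than parameters from the patent:

```python
import numpy as np

def pixel_to_camera_ray(u, v, f, dx, dy, u0, v0):
    """Invert relation (5-2), then use relation (5-1): pixel (u, v) maps
    to image physical coordinates (x, y), and any point at depth Z_c that
    projects there lies on the ray (x, y, f) in the camera coordinate
    system (the depth itself is not recoverable from a single image)."""
    x = (u - u0) * dx          # inverse of u = x/dx + u0
    y = (v - v0) * dy          # inverse of v = y/dy + v0
    ray = np.array([x, y, f])
    return ray / np.linalg.norm(ray)

# Example with assumed intrinsics: 8 mm focal length, 3.45 um pixels,
# principal point at the centre of a 1920x1080 image
ray = pixel_to_camera_ray(u=1200, v=400, f=8e-3,
                          dx=3.45e-6, dy=3.45e-6, u0=960, v0=540)
print(ray)
```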
Converting the coordinates of the target point in the camera coordinate system into coordinates in the unmanned aerial vehicle geographic coordinate system:
the coordinate conversion steps from points in the camera coordinate system to the unmanned aerial vehicle coordinate system are as follows: s1: the coordinate system is rotated around the X axis along the spiral direction of the right hand, namely the right hand holds the rotating shaft, the pointing direction of the four fingers is the rotating direction, and the matrix is converted
Figure 346983DEST_PATH_IMAGE075
(ii) a S2: rotating the coordinate system around the Y axis in the right-handed helical direction
Figure 202944DEST_PATH_IMAGE076
Conversion matrix
Figure 166220DEST_PATH_IMAGE077
S3: rotating the coordinate system around the Z axis along the right-handed spiral direction
Figure 791237DEST_PATH_IMAGE078
Conversion matrix
Figure 322712DEST_PATH_IMAGE079
In the above step
Figure 275756DEST_PATH_IMAGE080
Figure 183669DEST_PATH_IMAGE076
Figure 928771DEST_PATH_IMAGE078
When the camera and the cradle head acquired by the unmanned aerial vehicle are used for shooting a picture, the camera and the cradle head correspond to the corresponding roll angle, pitch angle and azimuth angle which are obtained by summing up the three attitude angles.
The target in the image is represented as in the unmanned aerial vehicle geographic coordinate system
Figure 490203DEST_PATH_IMAGE081
(6-1) obtaining coordinates of the actual position of the target point in the image in the geographic coordinate system of the unmanned aerial vehicle, and expressing the coordinates as
Figure 55176DEST_PATH_IMAGE082
The angle $\theta$ between the line connecting the unmanned aerial vehicle and the target point and the perpendicular from the unmanned aerial vehicle to the ground is calculated, together with the angle $\varphi$ between the line connecting the unmanned aerial vehicle and the target point and the line connecting the unmanned aerial vehicle and the image center point; the expressions for $\theta$ and $\varphi$ follow from the geometric relationship shown in FIG. 2. Subsequently, using the GPS information of the unmanned aerial vehicle acquired when the photograph was taken (the longitude $L_0$, latitude $B_0$ and altitude $H$), the coordinates of the target in the unmanned aerial vehicle geographic coordinate system are calculated: the projection of the target on the z axis of the unmanned aerial vehicle geographic coordinate system, i.e. the z-axis coordinate of the target, is determined by the altitude $H$, and the x-axis and y-axis coordinates are then obtained from the ratio of the z-axis coordinate value to the camera focal length, giving the coordinates $(x_g, y_g, z_g)$ in the unmanned aerial vehicle geographic coordinate system, where $f$ is the focal length of the camera and $L_0$, $B_0$ and $H$ respectively denote the longitude, latitude and altitude at which the unmanned aerial vehicle took the picture.
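The rotation-and-scaling step S1-S3 can be sketched as follows. The three rotation matrices are the standard right-hand rotations; the composition order and the axis conventions of the geographic frame (X east, Y south, Z along the UAV-to-ground perpendicular, pointing down) are assumptions made for this sketch, since the patent publishes its matrices only as images:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(b):
    c, s = np.cos(b), np.sin(b)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(g):
    c, s = np.cos(g), np.sin(g)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def camera_to_geographic(ray_cam, roll, pitch, azimuth, H):
    """Rotate a camera-frame viewing ray into the UAV geographic frame
    (S1-S3) and scale it to the ground, which lies H metres below the
    UAV.  Assumes X east, Y south, Z pointing down toward the ground,
    and the composition order rot_z @ rot_y @ rot_x."""
    d = rot_z(azimuth) @ rot_y(pitch) @ rot_x(roll) @ ray_cam
    scale = H / d[2]       # ground plane sits at z = +H in this frame
    return scale * d       # target position relative to the UAV
```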
Converting coordinates in the unmanned aerial vehicle geographic coordinate system into coordinates in the geodetic rectangular coordinate system:
as shown in FIG. 3, coordinate rotation is performed first: the unmanned aerial vehicle geographic coordinate system is rotated around the X axis in the right-hand screw direction (the right-hand screw principle is adopted: the right hand grips the rotation axis and the four fingers point in the rotation direction), then rotated around the Z axis pointing toward the ground in the right-hand screw direction, and finally rotated by -90° around the Z axis, yielding the conversion matrix, written here in the standard combined form for latitude $B_0$ and longitude $L_0$:

$$R=\begin{pmatrix}-\sin L_0&-\sin B_0\cos L_0&\cos B_0\cos L_0\\\cos L_0&-\sin B_0\sin L_0&\cos B_0\sin L_0\\0&\cos B_0&\sin B_0\end{pmatrix} \tag{8-1}$$

Coordinate translation is performed after rotation; the translation is as follows:

$$\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}x'\\y'\\z'\end{pmatrix}+\begin{pmatrix}(N+H)\cos B_0\cos L_0\\(N+H)\cos B_0\sin L_0\\\left(N(1-e^2)+H\right)\sin B_0\end{pmatrix} \tag{8-2}$$

In matrix (8-2), the $(x, y, z)$ on the left are the coordinates after rotation and translation, $(x', y', z')$ are the rotated coordinates, and the parameters on the right have the following meanings:

$$N=\frac{a}{\sqrt{1-e^2\sin^2 B_0}}$$

where $a$ is the semi-major radius of the ellipsoid, $N$ is the radius of curvature of the earth in the prime vertical, and $e$ is the first eccentricity of the earth. Combining matrix (8-1) and matrix (8-2), the coordinates of the target point can be converted from the unmanned aerial vehicle geographic coordinate system into the geodetic rectangular coordinate system, expressed as $(X_t, Y_t, Z_t)$.
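A sketch of this conversion using the standard WGS-84 constants follows; the rotation is written in the common east-north-up to ECEF form, which is an assumption, because matrix (8-1) is published only as an image:

```python
import numpy as np

A = 6378137.0            # WGS-84 semi-major axis a, metres
E2 = 6.69437999014e-3    # WGS-84 first eccentricity squared e^2

def geodetic_to_ecef(B, L, H):
    """Right-hand side of (8-2): geodetic coordinates (radians, radians,
    metres) of the UAV -> its geodetic rectangular (ECEF) coordinates."""
    N = A / np.sqrt(1.0 - E2 * np.sin(B) ** 2)   # prime-vertical radius
    return np.array([(N + H) * np.cos(B) * np.cos(L),
                     (N + H) * np.cos(B) * np.sin(L),
                     (N * (1.0 - E2) + H) * np.sin(B)])

def enu_to_ecef_offset(enu, B, L):
    """Rotation (8-1) written as the standard east-north-up -> ECEF
    matrix (an assumption); the patent's geographic frame has Y south
    and Z down, so convert to east-north-up before calling this."""
    sB, cB, sL, cL = np.sin(B), np.cos(B), np.sin(L), np.cos(L)
    R = np.array([[-sL, -sB * cL, cB * cL],
                  [ cL, -sB * sL, cB * sL],
                  [0.0,       cB,      sB]])
    return R @ enu
```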
Converting the coordinates in the geodetic rectangular coordinate system into coordinates in the GPS coordinate system:
the conversion formula from the geodetic rectangular coordinate system to the GPS coordinate system is as follows:
$$L=\arctan\frac{Y_t}{X_t},\qquad H=\frac{\sqrt{X_t^2+Y_t^2}}{\cos B}-N$$

where $L$, $B$ and $H$ are respectively the longitude, latitude and height of the target to be solved, $(X_t, Y_t, Z_t)$ are the coordinates of the target in the geodetic rectangular coordinate system, and the expression for N is as follows:

$$N=\frac{a}{\sqrt{1-e^2\sin^2 B}}$$

An iterative algorithm is adopted in the calculation: first the initial value of $B$ is set as

$$B^{(0)}=\arctan\frac{Z_t}{\sqrt{X_t^2+Y_t^2}}$$

then the updated value of $B$ is calculated:

$$B^{(i+1)}=\arctan\frac{Z_t+N^{(i)}e^2\sin B^{(i)}}{\sqrt{X_t^2+Y_t^2}}$$

Finally the initial value and the updated value are compared; if their difference is within the error range, the iteration ends and the geodetic coordinates of the target are calculated; otherwise the updated value is taken as the new initial value and the iteration continues until the difference between $B^{(i+1)}$ and $B^{(i)}$ is within the error range, which is less than 0.00000001, measured in radians.

Here $a$ is the semi-major radius of the earth, $N$ is the radius of curvature of the earth in the prime vertical, $e$ is the first eccentricity of the ellipsoid, and $B^{(i)}$, $N^{(i)}$ and $H^{(i)}$ respectively denote the values of $B$, $N$ and $H$ at the i-th iteration.
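The iteration can be sketched directly from the update rule above; `tol` mirrors the 0.00000001 rad threshold:

```python
import numpy as np

A = 6378137.0            # WGS-84 semi-major axis a, metres
E2 = 6.69437999014e-3    # WGS-84 first eccentricity squared e^2

def ecef_to_geodetic(X, Y, Z, tol=1e-8):
    """Iterative conversion of step six: longitude is closed-form,
    latitude B is refined until successive iterates differ by less
    than `tol` radians."""
    L = np.arctan2(Y, X)
    p = np.hypot(X, Y)
    B = np.arctan2(Z, p)                      # initial value B^(0)
    while True:
        N = A / np.sqrt(1.0 - E2 * np.sin(B) ** 2)
        B_new = np.arctan2(Z + N * E2 * np.sin(B), p)
        if abs(B_new - B) < tol:
            B = B_new
            break
        B = B_new
    H = p / np.cos(B) - N
    return np.degrees(B), np.degrees(L), H    # latitude, longitude, height
```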
Hovering error experiment: as shown in FIGS. 4-6, the unmanned aerial vehicle hovered and took pictures outdoors at heights of 50 m and 10 m, and the landmarks in the pictures were calibrated for image GPS positioning, yielding the longitude and latitude errors at 50 m and at 10 m respectively. Taking the GPS latitude error at a height of 10 m as an example, 10 groups of GPS latitude measurements were obtained, their average was computed, and each measurement was subtracted from the average to obtain the mean latitude error at a height of 10 m. The longitude errors at different heights are calculated similarly.
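A minimal sketch of this mean-error computation, with made-up readings:

```python
import numpy as np

def mean_value_errors(readings):
    """Per-shot errors: each repeated GPS reading minus the mean of all
    readings, as in the hovering experiment above."""
    r = np.asarray(readings, dtype=float)
    return r - r.mean()

# Ten illustrative latitude readings at a 10 m hover (made-up values)
print(mean_value_errors([32.025010, 32.025013, 32.025008, 32.025011,
                         32.025009, 32.025014, 32.025007, 32.025012,
                         32.025010, 32.025011]))
```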
Single-point positioning accuracy experiment:
as shown in fig. 9, target positioning analysis is performed at an outdoor height of 2m to 30m, multiple pictures are taken at each height, the same point on the ground is calibrated on the pictures obtained at all heights, the coordinates of the unmanned aerial vehicle at the point are obtained, and the GPS coordinates of the point are calculated. And calculating the precision of the GPS at different heights by taking the GPS data mean value of the point as a calculated GPS value and making a difference with an actually measured GPS value, solving an error mean value of the same height as an error of positioning a single point at the height, and drawing a scatter diagram by taking the actual position of the target as the center, the horizontal axis as the east-west direction and the vertical axis as the north-south direction. And (4) taking two points A and B on the ground to perform the single-point positioning test.
Positioning experiments of different attitude angles in flight:
as shown in fig. 10 to 12, the constant high attitude angle test was performed at heights of 3, 4 and 5m outdoors. If the height is kept unchanged under 3m, changing the flight attitude angle of the unmanned aerial vehicle to shoot a plurality of images, respectively drawing a single-point positioning error scatter diagram under three heights of 3m, 4m and 5m, and drawing a scatter diagram by taking the actual position of the target as an origin, the horizontal axis as the east-west direction and the longitudinal axis as the north-south direction;
the positioning algorithm takes as input the target pixel points framed on the image, the exported longitude, latitude and altitude of the unmanned aerial vehicle at the time of photographing, and the attitude angle information at that moment, and computes the relative position of the target in the unmanned aerial vehicle coordinates. The computed result of the positioning algorithm is compared with the theoretical GPS coordinate value; the positioning accuracy test results are shown in Table 1. In the key target approach stage, the positioning algorithm can guide the unmanned aerial vehicle to approach the position of the target. An end-to-end illustration of this input-to-output flow is sketched below.
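Chaining the sketches from the previous sections gives the illustration; every numeric value below is illustrative, not a measured value, and each helper carries the assumptions noted where it was defined:

```python
import numpy as np

# Illustrative UAV GPS fix (latitude, longitude in radians; altitude in m)
B0, L0, H0 = np.radians(32.025), np.radians(118.853), 30.0

ray = pixel_to_camera_ray(u=1510, v=645, f=8e-3,
                          dx=3.45e-6, dy=3.45e-6, u0=960, v0=540)
p_geo = camera_to_geographic(ray, roll=0.02, pitch=-0.35,
                             azimuth=1.10, H=H0)    # x east, y south, z down
enu = np.array([p_geo[0], -p_geo[1], -p_geo[2]])    # east-south-down -> ENU
p_ecef = geodetic_to_ecef(B0, L0, H0) + enu_to_ecef_offset(enu, B0, L0)
print(ecef_to_geodetic(*p_ecef))   # target latitude, longitude, height
```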
TABLE 1. Positioning algorithm accuracy test (positive = east/north, negative = west/south)
[Table 1 is published as an image; its data are summarized in the following paragraphs.]
The experimental data show that the higher the flying height, the worse the accuracy, but the mean error in each orthogonal direction is no more than 1 m. The algorithm achieves a good mean distance error of 0.83 m in the positioning system during low-altitude approach, and a moderate mean distance error of 2.85 m in high-altitude flight. Specifically, at a flying height of 1 m, the mean east-west error of the unmanned aerial vehicle during the approach and single-point positioning process is 0.05 m, the mean north-south error is 0.08 m, and the mean distance error is 0.83 m. At a flying height of 3 m, the mean east-west error is -0.59 m, the mean north-south error is -0.44 m, and the mean distance error is 5.24 m. At higher altitudes (above 10 m), the mean east-west error is -0.43 m, the mean north-south error is -1.10 m, and the mean distance error is 2.85 m.
According to the experimental results, the ranging errors between two points at heights below 5 meters (taking a two-point separation of 2 meters as an example) are at an acceptable centimeter level, and the higher the height, the larger the error; ranging errors below 7 m are considered acceptable. The mean relative range errors are 0.02% (3 m), -0.02% (4 m) and -0.03% (5 m).
Analysis shows that the height of the unmanned aerial vehicle when the picture is taken has a large influence on positioning accuracy, so the relative height between the unmanned aerial vehicle and the target obtained by a ranging sensor is used for subsequent data processing throughout the guidance system. Limited by experimental equipment and conditions, actual flight was not possible, and static measurement was adopted as a substitute: the lidar was connected to a computer, which received the data sent by the rangefinder, so the experimental data are only of reference significance for actual flight. In the static experiment, the ranging accuracy of the equipment can reach 1%; the accuracy in actual flight needs to be determined through further experiments.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although the present specification describes embodiments, not every embodiment includes only a single embodiment, and such description is for clarity purposes only, and it is to be understood that all embodiments may be combined as appropriate by one of ordinary skill in the art to form other embodiments as will be apparent to those of skill in the art from the description herein.

Claims (9)

1. An unmanned aerial vehicle imaging positioning method, characterized in that the method comprises the following steps: step one: acquiring a yaw angle, a pitch angle and a roll angle when an image is shot by using a gyroscope on the unmanned aerial vehicle; step two: establishing a coordinate system required by image positioning, wherein the coordinate system comprises a camera coordinate system, an unmanned aerial vehicle geographic coordinate system, a GPS coordinate system and a geodetic rectangular coordinate system;
step three: establishing an image pixel coordinate system and an image physical coordinate system according to the image, and then calculating a three-dimensional coordinate of the actual position of a target point on the image in a camera coordinate system;
step four: substituting the three-dimensional coordinates of the target point in the camera coordinate system, which are obtained by calculation in the third step, into the unmanned aerial vehicle geographic coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the unmanned aerial vehicle geographic coordinate system; step five: substituting the three-dimensional coordinates of the target point obtained by calculation in the fourth step in the unmanned aerial vehicle geographic coordinate system into a geodetic rectangular coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the geodetic rectangular coordinate system;
step six: and substituting the three-dimensional coordinates of the target point calculated in the step five in the geodetic rectangular coordinate system into the GPS coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the GPS coordinate system.
2. The unmanned aerial vehicle imaging positioning method of claim 1, wherein: in the camera coordinate system, the center point of the plane of the camera is taken as the origin, marked as $O_c$; the optical axis direction of the camera is taken as the Z-axis direction, and the X-axis and Y-axis directions are the same as the X-axis and Y-axis directions in the image.
3. The unmanned aerial vehicle imaging positioning method of claim 1, wherein: in the unmanned aerial vehicle geographic coordinate system, the actual position of the unmanned aerial vehicle is taken as the origin, the perpendicular between the unmanned aerial vehicle and the ground is taken as the Z axis, the due-south direction of the unmanned aerial vehicle is taken as the Y axis, and the due-east direction is taken as the X axis; during downward shooting, the yaw angle, pitch angle and roll angle of the gyroscope respectively represent the rotation angles between the three polar axes of the camera coordinate system and of the unmanned aerial vehicle coordinate system; the GPS coordinate system is the world geodetic coordinate system WGS-84, whose coordinate origin is the earth's centroid, whose X axis points from the origin to the intersection of the prime meridian and the equator, whose Z axis points toward the earth's north pole, and whose Y axis is perpendicular to the XOZ plane and forms a right-hand coordinate system with the X and Z axes; the coordinates of any point in this coordinate system can be represented as B, L and H, respectively the latitude, longitude and height of the point; and the coordinate axes and coordinate origin of the geodetic rectangular coordinate system coincide completely with those of the GPS coordinate system.
4. The unmanned aerial vehicle imaging positioning method according to claim 1, wherein: in step three, the image pixel coordinate system is a rectangular coordinate system with the upper left corner of the currently shot image as the origin and pixels as coordinate units, where $X_p$ and $Y_p$ respectively represent the row and column numbers of a pixel in the digital image; the image physical coordinate system is a rectangular coordinate system with the intersection of the camera optical axis and the image sensor inside the camera as the origin and actual physical size as the unit, where the $X_w$ and $Y_w$ axes are respectively parallel to the $X_p$ and $Y_p$ axes of the image pixel coordinate system.
5. The unmanned aerial vehicle imaging positioning method of claim 4, wherein: a target point P and a pixel point p are selected in the image, where $(x, y)$ are the image physical coordinates of the pixel point p, $(u, v)$ are the image pixel coordinates of the pixel point p, and $(x, y)$ is related to the coordinates $(X_c, Y_c, Z_c)$ of the target point P in the camera coordinate system as follows:

$$x = f\,\frac{X_c}{Z_c},\qquad y = f\,\frac{Y_c}{Z_c} \tag{5-1}$$

the image pixel coordinates $(u, v)$ of the pixel point p are related to its image physical coordinates $(x, y)$ as follows:

$$u = \frac{x}{dx} + u_0,\qquad v = \frac{y}{dy} + v_0 \tag{5-2}$$

where $f$ is the focal length of the camera, $(u_0, v_0)$ are the image pixel coordinates of the image principal point, i.e. the intersection of the camera optical axis and the image sensor inside the camera, and dx and dy are the physical sizes of a single camera pixel in the $X_w$ and $Y_w$ directions, respectively; combining relation (5-1) and relation (5-2), the image pixel coordinates $(u, v)$ of the pixel point p can be converted into its image physical coordinates $(x, y)$, and the coordinates of the target point P in the image in the camera coordinate system are simultaneously obtained as $(X_c, Y_c, Z_c)$.
6. The unmanned aerial vehicle imaging positioning method of claim 1, wherein: in step four, the steps for converting the three-dimensional coordinates of a point in the camera coordinate system into the unmanned aerial vehicle geographic coordinate system are as follows: S1: rotate the coordinate system around the X axis by the roll angle $\alpha$ in the right-hand screw direction, that is, the right hand grips the rotation axis and the four fingers point in the rotation direction, with conversion matrix

$$R_x(\alpha)=\begin{pmatrix}1&0&0\\0&\cos\alpha&-\sin\alpha\\0&\sin\alpha&\cos\alpha\end{pmatrix}$$

S2: rotate the coordinate system around the Y axis by the pitch angle $\beta$ in the right-hand screw direction, with conversion matrix

$$R_y(\beta)=\begin{pmatrix}\cos\beta&0&\sin\beta\\0&1&0\\-\sin\beta&0&\cos\beta\end{pmatrix}$$

S3: rotate the coordinate system around the Z axis by the azimuth angle $\gamma$ in the right-hand screw direction, with conversion matrix

$$R_z(\gamma)=\begin{pmatrix}\cos\gamma&-\sin\gamma&0\\\sin\gamma&\cos\gamma&0\\0&0&1\end{pmatrix}$$

where $\alpha$, $\beta$ and $\gamma$ are the degrees of the roll angle, pitch angle and azimuth angle acquired by the gyroscope when the picture is taken; the target in the image is represented in the unmanned aerial vehicle geographic coordinate system as

$$\begin{pmatrix}X_g\\Y_g\\Z_g\end{pmatrix}=R_z(\gamma)\,R_y(\beta)\,R_x(\alpha)\begin{pmatrix}X_c\\Y_c\\Z_c\end{pmatrix} \tag{6-1}$$

and the coordinates of the actual position of the target point in the image in the unmanned aerial vehicle geographic coordinate system are expressed as $(X_g, Y_g, Z_g)$.
7. The unmanned aerial vehicle imaging positioning method of claim 6, wherein: in step four, the angle $\theta$ between the line connecting the unmanned aerial vehicle and the target point and the perpendicular from the unmanned aerial vehicle to the ground, and the angle $\varphi$ between the line connecting the unmanned aerial vehicle and the target point and the line connecting the unmanned aerial vehicle and the image center point, are calculated; subsequently, using the GPS information of the unmanned aerial vehicle acquired when the photograph was taken (the longitude $L_0$, latitude $B_0$ and altitude $H$), the coordinates of the target in the unmanned aerial vehicle geographic coordinate system are calculated: the projection of the target on the Z axis of the unmanned aerial vehicle geographic coordinate system, i.e. the Z-axis coordinate of the target point, is determined by the altitude $H$; the X-axis coordinate $x_g$ and the Y-axis coordinate $y_g$ are respectively calculated from the ratio of the Z-axis coordinate value to the camera focal length, yielding the coordinates $(x_g, y_g, z_g)$ in the unmanned aerial vehicle geographic coordinate system, where $f$ is the focal length of the camera and $L_0$, $B_0$ and $H$ respectively represent the longitude, latitude and altitude at which the unmanned aerial vehicle took the picture.
8. The unmanned aerial vehicle imaging positioning method of claim 1, wherein: in step five, coordinate rotation is performed first: the unmanned aerial vehicle geographic coordinate system is rotated around the X axis in the right-hand screw direction (the right-hand screw principle is adopted: the right hand grips the rotation axis and the four fingers point in the rotation direction), then rotated around the Z axis pointing toward the ground in the right-hand screw direction, and finally rotated by -90° around the Z axis, yielding the conversion matrix, written here in the standard combined form for latitude $B_0$ and longitude $L_0$:

$$R=\begin{pmatrix}-\sin L_0&-\sin B_0\cos L_0&\cos B_0\cos L_0\\\cos L_0&-\sin B_0\sin L_0&\cos B_0\sin L_0\\0&\cos B_0&\sin B_0\end{pmatrix} \tag{8-1}$$

the rotation is followed by a coordinate translation, as follows:

$$\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}x'\\y'\\z'\end{pmatrix}+\begin{pmatrix}(N+H)\cos B_0\cos L_0\\(N+H)\cos B_0\sin L_0\\\left(N(1-e^2)+H\right)\sin B_0\end{pmatrix} \tag{8-2}$$

in matrix (8-2), the $(x, y, z)$ on the left are the coordinates after rotation and translation, and the parameters on the right have the following meanings:

$$N=\frac{a}{\sqrt{1-e^2\sin^2 B_0}}$$

where $a$ is the semi-major radius of the ellipsoid, $N$ is the radius of curvature of the earth in the prime vertical, and $e$ is the first eccentricity of the earth; combining the matrix (8-1) and the matrix (8-2), the coordinates of the target point can be converted from the unmanned aerial vehicle geographic coordinate system into the geodetic rectangular coordinate system, expressed as $(X_t, Y_t, Z_t)$, where $L_0$, $B_0$ and $H$ represent the longitude, latitude and altitude of the unmanned aerial vehicle when taking the picture.
9. The unmanned aerial vehicle imaging positioning method of claim 1, wherein: in step six, the conversion from the geodetic rectangular coordinate system to the GPS coordinate system is as follows:

$$L=\arctan\frac{Y_t}{X_t},\qquad H=\frac{\sqrt{X_t^2+Y_t^2}}{\cos B}-N$$

where $L$, $B$ and $H$ are respectively the longitude, latitude and height of the target to be solved, $(X_t, Y_t, Z_t)$ are the coordinates of the target in the geodetic rectangular coordinate system, and the expression for N is as follows:

$$N=\frac{a}{\sqrt{1-e^2\sin^2 B}}$$

an iterative algorithm is adopted in the calculation: first the initial value of $B$ is set as

$$B^{(0)}=\arctan\frac{Z_t}{\sqrt{X_t^2+Y_t^2}}$$

then the updated value of $B$ is calculated:

$$B^{(i+1)}=\arctan\frac{Z_t+N^{(i)}e^2\sin B^{(i)}}{\sqrt{X_t^2+Y_t^2}}$$

finally the initial value and the updated value are compared; if their difference is within the error range, the iteration ends and the geodetic coordinates of the target are calculated; otherwise the updated value is taken as the new initial value and the iteration continues until the difference between $B^{(i+1)}$ and $B^{(i)}$ is within the error range, which is less than 0.00000001, measured in radians;

where $a$ is the semi-major radius of the earth, $N$ is the radius of curvature of the earth in the prime vertical, $e$ is the first eccentricity of the ellipsoid, and $B^{(i)}$, $N^{(i)}$ and $H^{(i)}$ respectively represent the corresponding values of $B$, $N$ and $H$ at the i-th iteration.
CN202211479183.0A 2022-11-24 2022-11-24 Unmanned aerial vehicle imaging positioning method Pending CN115511956A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211479183.0A CN115511956A (en) 2022-11-24 2022-11-24 Unmanned aerial vehicle imaging positioning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211479183.0A CN115511956A (en) 2022-11-24 2022-11-24 Unmanned aerial vehicle imaging positioning method

Publications (1)

Publication Number Publication Date
CN115511956A true CN115511956A (en) 2022-12-23

Family

ID=84514073

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211479183.0A Pending CN115511956A (en) 2022-11-24 2022-11-24 Unmanned aerial vehicle imaging positioning method

Country Status (1)

Country Link
CN (1) CN115511956A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116774142A (en) * 2023-06-13 2023-09-19 中国电子产业工程有限公司 Coordinate conversion method in non-equal-altitude double-machine cross positioning
CN116839595A (en) * 2023-09-01 2023-10-03 北京宝隆泓瑞科技有限公司 Method for creating unmanned aerial vehicle route
CN117291980A (en) * 2023-10-09 2023-12-26 宁波博登智能科技有限公司 Single unmanned aerial vehicle image pixel positioning method based on deep learning
CN117496114A (en) * 2023-11-29 2024-02-02 西安道达天际信息技术有限公司 Target positioning method, system and computer storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105353772A (en) * 2015-11-16 2016-02-24 中国航天时代电子公司 Visual servo control method for unmanned aerial vehicle maneuvering target locating and tracking
CN110542407A (en) * 2019-07-23 2019-12-06 中国科学院长春光学精密机械与物理研究所 Method for acquiring positioning information of any pixel point of aerial image
US11280608B1 (en) * 2019-12-11 2022-03-22 Sentera, Inc. UAV above ground level determination for precision agriculture

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105353772A (en) * 2015-11-16 2016-02-24 中国航天时代电子公司 Visual servo control method for unmanned aerial vehicle maneuvering target locating and tracking
CN110542407A (en) * 2019-07-23 2019-12-06 中国科学院长春光学精密机械与物理研究所 Method for acquiring positioning information of any pixel point of aerial image
US11280608B1 (en) * 2019-12-11 2022-03-22 Sentera, Inc. UAV above ground level determination for precision agriculture

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
徐诚 (Xu Cheng) et al.: "Multi-target positioning algorithm based on a photoelectric measurement platform", Journal of Central South University *
桑金 (Sang Jin): "Non-iterative method for computing geodetic coordinates from space geodetic rectangular coordinates", Bulletin of Surveying and Mapping *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116774142A (en) * 2023-06-13 2023-09-19 中国电子产业工程有限公司 Coordinate conversion method in non-equal-altitude double-machine cross positioning
CN116774142B (en) * 2023-06-13 2024-03-01 中国电子产业工程有限公司 Coordinate conversion method in non-equal-altitude double-machine cross positioning
CN116839595A (en) * 2023-09-01 2023-10-03 北京宝隆泓瑞科技有限公司 Method for creating unmanned aerial vehicle route
CN116839595B (en) * 2023-09-01 2023-11-28 北京宝隆泓瑞科技有限公司 Method for creating unmanned aerial vehicle route
CN117291980A (en) * 2023-10-09 2023-12-26 宁波博登智能科技有限公司 Single unmanned aerial vehicle image pixel positioning method based on deep learning
CN117291980B (en) * 2023-10-09 2024-03-15 宁波博登智能科技有限公司 Single unmanned aerial vehicle image pixel positioning method based on deep learning
CN117496114A (en) * 2023-11-29 2024-02-02 西安道达天际信息技术有限公司 Target positioning method, system and computer storage medium

Similar Documents

Publication Publication Date Title
CN112894832B (en) Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and storage medium
CN115511956A (en) Unmanned aerial vehicle imaging positioning method
CN103345737B (en) A kind of UAV high resolution image geometric correction method based on error compensation
Sanz‐Ablanedo et al. Reducing systematic dome errors in digital elevation models through better UAV flight design
CN113850126A (en) Target detection and three-dimensional positioning method and system based on unmanned aerial vehicle
CN107490364A (en) A kind of wide-angle tilt is imaged aerial camera object positioning method
CN112710311B (en) Automatic planning method for three-dimensional live-action reconstruction aerial camera points of terrain adaptive unmanned aerial vehicle
CN110987021B (en) Inertial vision relative attitude calibration method based on rotary table reference
CN104729482B (en) A kind of ground small objects detecting system and method based on dirigible
CN101114022A (en) Navigation multiple spectrum scanner geometric approximate correction method under non gesture information condition
CN113538595B (en) Method for improving geometric precision of remote sensing stereo image by using laser height measurement data in auxiliary manner
CN112184786B (en) Target positioning method based on synthetic vision
CN105444778B (en) A kind of star sensor based on imaging geometry inverting is in-orbit to determine appearance error acquisition methods
CN106023207B (en) It is a kind of to be enjoyed a double blessing the Municipal Component acquisition method of scape based on traverse measurement system
CN110986888A (en) Aerial photography integrated method
CN116309798A (en) Unmanned aerial vehicle imaging positioning method
CN116883604A (en) Three-dimensional modeling technical method based on space, air and ground images
CN107063191B (en) A kind of method of photogrammetric regional network entirety relative orientation
CN107705272A (en) A kind of high-precision geometric correction method of aerial image
CN112985398A (en) Target positioning method and system
CN110068312A (en) A kind of digital zenith instrument localization method based on spherical triangle
CN116124094A (en) Multi-target co-location method based on unmanned aerial vehicle reconnaissance image and combined navigation information
Zhang Photogrammetric processing of low altitude image sequences by unmanned airship
CN112489118B (en) Method for quickly calibrating external parameters of airborne sensor group of unmanned aerial vehicle
CN113483739B (en) Offshore target position measuring method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20221223