CN115511956A - Unmanned aerial vehicle imaging positioning method - Google Patents
- Publication number: CN115511956A
- Application number: CN202211479183.0A
- Authority: CN (China)
- Prior art keywords: coordinate system, aerial vehicle, unmanned aerial, image, axis
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
Abstract
The invention discloses an unmanned aerial vehicle imaging positioning method comprising the following steps. Step one: determine the actual meanings of the yaw angle, pitch angle and roll angle of a gyroscope on the unmanned aerial vehicle. Step two: establish the coordinate systems required for image positioning. Step three: establish an image pixel coordinate system and an image physical coordinate system from the image, and calculate the three-dimensional coordinates of the actual position of the target point in the camera coordinate system. Step four: convert from the camera coordinate system to the unmanned aerial vehicle geographic coordinate system. Step five: convert from the unmanned aerial vehicle geographic coordinate system to the geodetic rectangular coordinate system. Step six: convert from the geodetic rectangular coordinate system to the GPS coordinate system. The advantage of the invention is that, from the geometric relationship among the target point, the image point and the measuring point in the target positioning process, the longitude, latitude and height of the target point are calculated using a Matlab algorithm, realizing target positioning and improving the precision of unmanned aerial vehicle target positioning imaging.
Description
Technical Field
The invention relates to the technical field of unmanned aerial vehicle imaging positioning, in particular to an unmanned aerial vehicle imaging positioning method.
Background
Unmanned aerial vehicle ground target imaging is a modern, advanced remote positioning technology that is widely applied in both military and civilian fields; its main technical elements include target detection and identification and target positioning. The precision of the target positioning technology largely determines the precision of unmanned aerial vehicle target positioning imaging. With the rapid development of modern technology, the precision requirements placed on target positioning in every field are becoming ever higher.
Because research and development of unmanned aerial vehicle technology in China, and of target imaging positioning systems in particular, started later than abroad, the field of unmanned aerial vehicle imaging positioning suffers from poor positioning accuracy and poor interference resistance. Rapid development of unmanned aerial vehicle imaging positioning technology is therefore important for research and development in China's reconnaissance field.
Disclosure of Invention
The invention aims to provide an unmanned aerial vehicle imaging positioning method to solve the problems in the background technology.
In order to achieve the purpose, the invention provides the following technical scheme: an unmanned aerial vehicle imaging positioning method comprises the following steps:
step one: acquiring a yaw angle, a pitch angle and a roll angle at the moment an image is shot, using a gyroscope on the unmanned aerial vehicle; step two: establishing the coordinate systems required for image positioning, namely a camera coordinate system, an unmanned aerial vehicle geographic coordinate system, a GPS coordinate system and a geodetic rectangular coordinate system;
step three: establishing an image pixel coordinate system and an image physical coordinate system according to the image, and then calculating a three-dimensional coordinate of the actual position of a target point on the image in a camera coordinate system;
step four: substituting the three-dimensional coordinates of the target point in the camera coordinate system, which are obtained by calculation in the third step, into the unmanned aerial vehicle geographic coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the unmanned aerial vehicle geographic coordinate system; step five: substituting the three-dimensional coordinates of the target point obtained by calculation in the fourth step in the unmanned aerial vehicle geographic coordinate system into a geodetic rectangular coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the geodetic rectangular coordinate system;
step six: and substituting the three-dimensional coordinates of the target point calculated in the step five in the geodetic rectangular coordinate system into the GPS coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the GPS coordinate system.
Furthermore, in the camera coordinate system, the central point of the camera plane is taken as the origin, denoted $O_c$; the optical axis direction of the camera is the Z-axis direction, and the X-axis and Y-axis directions are the same as the X-axis and Y-axis directions in the image.
Further, in the unmanned aerial vehicle geographic coordinate system, the actual position of the unmanned aerial vehicle is taken as the origin, the perpendicular from the unmanned aerial vehicle to the ground as the Z axis, the due-south direction of the unmanned aerial vehicle as the Y axis and its due-east direction as the X axis; during overhead shooting, the yaw, pitch and roll angles of the gyroscope respectively represent the rotation angles between the three axes of the camera coordinate system and those of the unmanned aerial vehicle coordinate system. The GPS coordinate system is the world geodetic coordinate system WGS-84: the origin of coordinates is the earth's centroid, the X axis points from the origin to the intersection of the prime meridian and the equator, the Z axis points toward the earth's north pole, and the Y axis is perpendicular to the XOZ plane and forms a right-handed coordinate system with the X and Z axes; the coordinates of any point in this system can be expressed as B, L and H, respectively representing the latitude, longitude and height of the point. The coordinate axes and origin of the geodetic rectangular coordinate system coincide completely with those of the GPS coordinate system.
Further, in step three, the image pixel coordinate system is a rectangular coordinate system with the upper-left corner of the currently photographed image as the origin and pixels as the coordinate unit, where $X_p$ and $Y_p$ respectively represent the row number and column number of the pixel in the digital image; the image physical coordinate system is a rectangular coordinate system with the intersection of the camera optical axis and the image sensor inside the camera as the origin and actual physical size as the unit, where the $X_w$ and $Y_w$ axes are respectively parallel to the $X_p$ and $Y_p$ axes of the image pixel coordinate system.
Furthermore, the image physical coordinates $(x, y)$ of pixel point $p$ and the camera-coordinate-system coordinates $(X_c, Y_c, Z_c)$ of object point $P$ are related as follows:

$$x=f\frac{X_c}{Z_c},\qquad y=f\frac{Y_c}{Z_c}\tag{5-1}$$

where $f$ is the focal length of the camera. The image pixel coordinates $(u, v)$ of pixel point $p$ and its image physical coordinates $(x, y)$ are related as follows:

$$u=\frac{x}{dx}+u_0,\qquad v=\frac{y}{dy}+v_0\tag{5-2}$$

where $(u_0, v_0)$ are the image pixel coordinates of the image principal point, i.e. the intersection of the camera optical axis and the image sensor inside the camera, and $dx$ and $dy$ are respectively the physical sizes of a single camera pixel in the $X_w$ and $Y_w$ directions. Combining relation (5-1) and relation (5-2), the image pixel coordinates $(u, v)$ of the target can be converted into image physical coordinates $(x, y)$, and the coordinates of the actual position of the target in the image in the camera coordinate system are obtained as $\bigl((u-u_0)\,dx,\;(v-v_0)\,dy,\;f\bigr)$.
Further, in step four, the coordinates of a point in the camera coordinate system are converted to the unmanned aerial vehicle coordinate system as follows. S1: rotate the coordinate system around the X axis in the right-hand screw direction (the right hand grips the rotation axis; the four fingers point in the direction of rotation) by the roll angle $\phi$, with conversion matrix

$$R_x(\phi)=\begin{pmatrix}1&0&0\\0&\cos\phi&-\sin\phi\\0&\sin\phi&\cos\phi\end{pmatrix}$$

S2: rotate the coordinate system around the Y axis in the right-hand screw direction by the pitch angle $\theta$, with conversion matrix

$$R_y(\theta)=\begin{pmatrix}\cos\theta&0&\sin\theta\\0&1&0\\-\sin\theta&0&\cos\theta\end{pmatrix}$$

S3: rotate the coordinate system around the Z axis in the right-hand screw direction by the azimuth angle $\psi$, with conversion matrix

$$R_z(\psi)=\begin{pmatrix}\cos\psi&-\sin\psi&0\\\sin\psi&\cos\psi&0\\0&0&1\end{pmatrix}$$

In the steps above, $\phi$, $\theta$ and $\psi$ are the corresponding roll angle, pitch angle and azimuth angle obtained, when a picture is shot, by summing the attitude angles of the camera and of the gimbal acquired by the unmanned aerial vehicle.

The target in the image is represented in the unmanned aerial vehicle geographic coordinate system as

$$\begin{pmatrix}X_g\\Y_g\\Z_g\end{pmatrix}=R_z(\psi)\,R_y(\theta)\,R_x(\phi)\begin{pmatrix}X_c\\Y_c\\Z_c\end{pmatrix}\tag{6-1}$$

and the coordinates of the actual position of the target point in the image in the unmanned aerial vehicle geographic coordinate system are expressed as $(X_g, Y_g, Z_g)$.
Further, in the fourth step, the included angle $\alpha$ between the line connecting the unmanned aerial vehicle to the target point and the perpendicular from the unmanned aerial vehicle to the ground is calculated, together with the included angle $\beta$ between the line connecting the unmanned aerial vehicle to the target point and the line connecting the unmanned aerial vehicle to the image center point; on the image side $\beta=\arctan\bigl(\sqrt{x^2+y^2}/f\bigr)$, and $\alpha$ is obtained from $\beta$ together with the camera attitude angles. Then, using the unmanned aerial vehicle GPS information $L$, $B$ and $H$ acquired when the photograph is taken, the coordinates of the target in the unmanned aerial vehicle geographic coordinate system are calculated: the projection of the target on the z axis of the unmanned aerial vehicle geographic coordinate system, i.e. the target's z-axis coordinate in that system, is $-H$; calculating the ratio of this z-axis coordinate value to the focal length of the camera and scaling the x and y components accordingly yields the target coordinates $(x_g, y_g, -H)$ in the unmanned aerial vehicle geographic coordinate system, where $f$ is the focal length of the camera and $L$, $B$ and $H$ respectively represent the longitude, latitude and altitude at which the drone took the picture.
Further, in the fifth step, coordinate rotation is performed first: the unmanned aerial vehicle geographic coordinate system is rotated around the X axis in the right-hand screw direction (the right-hand screw rule is adopted here: the right hand grips the rotation axis and the four fingers point in the direction of rotation), then rotated in the right-hand screw direction around the Z axis pointing to the ground, and finally rotated by -90° around the Z axis, giving the following conversion matrix:

$$R=\begin{pmatrix}-\sin L&-\sin B\cos L&\cos B\cos L\\\cos L&-\sin B\sin L&\cos B\sin L\\0&\cos B&\sin B\end{pmatrix}\tag{8-1}$$

Coordinate translation is performed after the rotation; the translation matrix is as follows:

$$\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}(N+H)\cos B\cos L\\(N+H)\cos B\sin L\\\bigl(N(1-e^2)+H\bigr)\sin B\end{pmatrix}\tag{8-2}$$

In matrix (8-2), the $x$, $y$, $z$ on the left are the coordinate translation applied after rotation, and the parameters on the right have the following meanings: $a$ is the semi-major radius of the ellipsoid; $N=a/\sqrt{1-e^2\sin^2 B}$ is the radius of curvature of the earth's prime vertical; $e$ is the first eccentricity of the earth; and $L$, $B$ and $H$ are the longitude, latitude and altitude of the drone when the photograph is taken. Combining matrix (8-1) and matrix (8-2), the coordinates of the target point can be converted from the unmanned aerial vehicle geographic coordinate system to the geodetic rectangular coordinate system, expressed as $(X_t, Y_t, Z_t)$, with the conversion formula:

$$\begin{pmatrix}X_t\\Y_t\\Z_t\end{pmatrix}=R\begin{pmatrix}x_g\\y_g\\z_g\end{pmatrix}+\begin{pmatrix}x\\y\\z\end{pmatrix}$$
Further, in step six, the conversion formula from the geodetic rectangular coordinate system to the GPS coordinate system is as follows:

$$L=\arctan\frac{Y}{X},\qquad B=\arctan\frac{Z+Ne^2\sin B}{\sqrt{X^2+Y^2}},\qquad H=\frac{\sqrt{X^2+Y^2}}{\cos B}-N$$

where $B$, $L$ and $H$ are respectively the latitude, longitude and height of the target to be solved, $X$, $Y$ and $Z$ are the coordinates of the target in the geodetic rectangular coordinate system, and the expression for $N$ is:

$$N=\frac{a}{\sqrt{1-e^2\sin^2 B}}$$

Because $B$ appears on both sides of its own equation, an iterative algorithm is adopted in the calculation. During iteration, the initial value of $B$ is first set as

$$B_0=\arctan\frac{Z}{\sqrt{X^2+Y^2}}$$

and the updated value of $B$ is then computed:

$$B_{i+1}=\arctan\frac{Z+N_ie^2\sin B_i}{\sqrt{X^2+Y^2}}$$

Finally, the initial value $B_i$ and the updated value $B_{i+1}$ are compared; if their difference is within the error range, the iteration ends and the geodetic coordinates of the target are finally calculated; otherwise $B_{i+1}$ is taken as the new initial value and the iteration continues until the difference error between $B_i$ and $B_{i+1}$ is within the range, the error range being less than 0.00000001 measured in radians;

where $a$ is the semi-major radius of the earth ellipsoid; $N$ is the radius of curvature of the earth's prime vertical; $e$ is the first eccentricity of the ellipsoid; and $B_i$, $N_i$, $H_i$ respectively represent the values of $B$, $N$ and $H$ at the $i$-th iteration. In the specific experiment, the height $H$ at which the drone takes a picture can be obtained by radar to achieve a more accurate height value.
Compared with the prior art, the invention has the beneficial effects that: according to the geometric relationship among the target point, the image point and the measuring point in the target positioning process, the longitude and latitude and the height of the target point are calculated by utilizing a Matlab algorithm, so that the target positioning is realized, and the accuracy of the target positioning imaging of the unmanned aerial vehicle is improved by analyzing the yaw angle, the roll angle and the pitch angle.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic view of a central perspective projection model of the present invention;
FIG. 2 is a geometric relationship diagram for object localization in accordance with the present invention;
FIG. 3 is a schematic diagram of the transformation from the geographical coordinate system of the unmanned aerial vehicle to the rectangular coordinate system of the earth according to the present invention;
FIG. 4 is a line graph of a 50M hovering GPS error according to the present invention;
FIG. 5 is a line graph of a 10M hovering GPS error according to the present invention;
FIG. 6 is a line drawing of the horizontal error of the attitude angle of the 1.99m camera of the present invention;
FIG. 7 is a diagram of positioning error (horizontal projection) at different heights according to the present invention;
FIG. 8 is a graph of positioning error (range error) at different heights in accordance with the present invention;
FIG. 9 is a diagram of positioning error (horizontal projection) for different attitude angles according to the present invention;
FIG. 10 is a graph of the positioning error of point A under a variable attitude angle according to the present invention;
FIG. 11 is a diagram of the positioning error of point B under a variable attitude angle according to the present invention;
FIG. 12 is a variable attitude angle horizontal error line chart of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1-3, in an embodiment of the present invention, an unmanned aerial vehicle imaging positioning method includes the following steps:
step one: acquiring a yaw angle, a pitch angle and a roll angle at the moment an image is shot, using a gyroscope on the unmanned aerial vehicle; step two: establishing the coordinate systems required for image positioning, namely a camera coordinate system, an unmanned aerial vehicle geographic coordinate system, a GPS coordinate system and a geodetic rectangular coordinate system;
step three: establishing an image pixel coordinate system and an image physical coordinate system according to the image, and then calculating a three-dimensional coordinate of the actual position of a target point on the image in a camera coordinate system;
step four: substituting the three-dimensional coordinates of the target point in the camera coordinate system, which are obtained by calculation in the third step, into the unmanned aerial vehicle geographic coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the unmanned aerial vehicle geographic coordinate system; step five: substituting the three-dimensional coordinates of the target point obtained by calculation in the fourth step in the unmanned aerial vehicle geographic coordinate system into a geodetic rectangular coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the geodetic rectangular coordinate system;
step six: and substituting the three-dimensional coordinates of the target point calculated in the step five in the geodetic rectangular coordinate system into the GPS coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the GPS coordinate system.
The first embodiment is as follows:
as shown in FIGS. 1-2, an image pixel rectangular coordinate system is established with the upper-left corner of the image as the origin and pixels as coordinate units; $X_p$ and $Y_p$ respectively represent the row number and column number of the pixel in the digital image;
the physical coordinates of the imageIs a rectangular coordinate system with the intersection point of the camera optical axis and the image sensor inside the camera as the origin and the actual physical size as the unit, wherein X w Axis, Y w Axes respectively associated with X of the image pixel coordinate system p And Y p The axes are parallel;
as shown in fig. 1, based on the mapping relationship of the central perspective projection model, the image physical coordinates $(x, y)$ of pixel point $p$ and the camera-coordinate-system coordinates $(X_c, Y_c, Z_c)$ of object point $P$ are related as follows:

$$x=f\frac{X_c}{Z_c},\qquad y=f\frac{Y_c}{Z_c}\tag{5-1}$$

where $f$ is the focal length of the camera. The image pixel coordinates $(u, v)$ of pixel point $p$ and its image physical coordinates $(x, y)$ are related as follows:

$$u=\frac{x}{dx}+u_0,\qquad v=\frac{y}{dy}+v_0\tag{5-2}$$

where $(u_0, v_0)$ are the image pixel coordinates of the image principal point, i.e. the intersection of the camera optical axis and the image sensor inside the camera, and $dx$ and $dy$ are respectively the physical sizes of a single camera pixel in the $X_w$ and $Y_w$ directions. Combining relation (5-1) and relation (5-2), the image pixel coordinates $(u, v)$ of the target can be converted into image physical coordinates $(x, y)$; the geometrical relationship diagram is shown in FIG. 2.

The coordinates of the actual position of the target point in the image in the camera coordinate system are thus obtained as $\bigl((u-u_0)\,dx,\;(v-v_0)\,dy,\;f\bigr)$;
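The pixel-to-camera conversion above reduces to a few lines. The following is a minimal Python sketch; the function name and all numeric values are illustrative assumptions, not figures from the patent:

```python
# Sketch of the pixel -> camera-frame conversion implied by (5-1)/(5-2):
# the target's camera coordinates are ((u - u0)*dx, (v - v0)*dy, f).
def pixel_to_camera(u, v, u0, v0, dx, dy, f):
    """Return the target point's coordinates in the camera coordinate system."""
    x = (u - u0) * dx  # image physical x, inverting relation (5-2)
    y = (v - v0) * dy  # image physical y, inverting relation (5-2)
    return (x, y, f)   # point on the image plane, at focal distance f
```

For example, a pixel 100 columns to the right of a principal point at (2000, 1500), with a hypothetical 2 µm pixel pitch and 9 mm focal length, maps to a camera-frame point 0.2 mm off-axis.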
Converting the coordinates of the target point in the camera coordinate system into coordinates in the unmanned aerial vehicle geographic coordinate system:
the coordinates of a point in the camera coordinate system are converted to the unmanned aerial vehicle coordinate system as follows. S1: rotate the coordinate system around the X axis in the right-hand screw direction (the right hand grips the rotation axis; the four fingers point in the direction of rotation) by the roll angle $\phi$, with conversion matrix

$$R_x(\phi)=\begin{pmatrix}1&0&0\\0&\cos\phi&-\sin\phi\\0&\sin\phi&\cos\phi\end{pmatrix}$$

S2: rotate the coordinate system around the Y axis in the right-hand screw direction by the pitch angle $\theta$, with conversion matrix

$$R_y(\theta)=\begin{pmatrix}\cos\theta&0&\sin\theta\\0&1&0\\-\sin\theta&0&\cos\theta\end{pmatrix}$$

S3: rotate the coordinate system around the Z axis in the right-hand screw direction by the azimuth angle $\psi$, with conversion matrix

$$R_z(\psi)=\begin{pmatrix}\cos\psi&-\sin\psi&0\\\sin\psi&\cos\psi&0\\0&0&1\end{pmatrix}$$

In the steps above, $\phi$, $\theta$ and $\psi$ are the corresponding roll angle, pitch angle and azimuth angle obtained, when a picture is shot, by summing the attitude angles of the camera and of the gimbal acquired by the unmanned aerial vehicle.

The target in the image is represented in the unmanned aerial vehicle geographic coordinate system as

$$\begin{pmatrix}X_g\\Y_g\\Z_g\end{pmatrix}=R_z(\psi)\,R_y(\theta)\,R_x(\phi)\begin{pmatrix}X_c\\Y_c\\Z_c\end{pmatrix}\tag{6-1}$$

and the coordinates of the actual position of the target point in the image in the unmanned aerial vehicle geographic coordinate system are expressed as $(X_g, Y_g, Z_g)$;
The included angle $\alpha$ between the line connecting the unmanned aerial vehicle to the target point and the perpendicular from the unmanned aerial vehicle to the ground is calculated, together with the included angle $\beta$ between the line connecting the unmanned aerial vehicle to the target point and the line connecting the unmanned aerial vehicle to the image center point; on the image side $\beta=\arctan\bigl(\sqrt{x^2+y^2}/f\bigr)$, and $\alpha$ is obtained from $\beta$ together with the camera attitude angles. Then, using the unmanned aerial vehicle GPS information $L$, $B$ and $H$ acquired when the photograph is taken, the coordinates of the target in the unmanned aerial vehicle geographic coordinate system are calculated: the projection of the target on the z axis of the unmanned aerial vehicle geographic coordinate system, i.e. the target's z-axis coordinate in that system, is $-H$; calculating the ratio of this z-axis coordinate value to the focal length of the camera and scaling the x and y components accordingly yields the target coordinates $(x_g, y_g, -H)$ in the unmanned aerial vehicle geographic coordinate system, where $f$ is the focal length of the camera and $L$, $B$ and $H$ respectively represent the longitude, latitude and altitude at which the drone took the picture.
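The S1-S3 rotations and their composition can be sketched as follows. This is a dependency-free sketch; the assignment of roll to the X axis, pitch to the Y axis and azimuth to the Z axis is an assumption consistent with the text, not a definitive reading of the patent:

```python
import math

# Sketch of the camera -> geographic frame rotation: compose
# R_z(azimuth) @ R_y(pitch) @ R_x(roll) and apply it to a camera-frame point.
def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    # 3x3 matrix product.
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def camera_to_geographic(p, roll, pitch, azimuth):
    """Rotate a camera-frame point p into the drone geographic frame."""
    R = matmul(rot_z(azimuth), matmul(rot_y(pitch), rot_x(roll)))
    return tuple(sum(R[i][k] * p[k] for k in range(3)) for i in range(3))
```

With all three angles zero the transform is the identity; a 90° azimuth rotation carries the X axis onto the Y axis, a quick sanity check on the sign convention.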
Converting the coordinates of the unmanned aerial vehicle in a geographic coordinate system into the coordinates of the unmanned aerial vehicle in a geodetic rectangular coordinate system:
as shown in fig. 3, coordinate rotation is performed first: the unmanned aerial vehicle geographic coordinate system is rotated around the X axis in the right-hand screw direction (the right-hand screw rule is adopted here: the right hand grips the rotation axis and the four fingers point in the direction of rotation), then rotated in the right-hand screw direction around the Z axis pointing to the ground, and finally rotated by -90° around the Z axis, giving the following conversion matrix:

$$R=\begin{pmatrix}-\sin L&-\sin B\cos L&\cos B\cos L\\\cos L&-\sin B\sin L&\cos B\sin L\\0&\cos B&\sin B\end{pmatrix}\tag{8-1}$$

Coordinate translation is performed after the rotation; the translation matrix is as follows:

$$\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}(N+H)\cos B\cos L\\(N+H)\cos B\sin L\\\bigl(N(1-e^2)+H\bigr)\sin B\end{pmatrix}\tag{8-2}$$

In matrix (8-2), the $x$, $y$, $z$ on the left are the coordinate translation applied after rotation, and the parameters on the right have the following meanings: $a$ is the semi-major radius of the ellipsoid; $N=a/\sqrt{1-e^2\sin^2 B}$ is the radius of curvature of the earth's prime vertical; $e$ is the first eccentricity of the earth; and $L$, $B$ and $H$ are the longitude, latitude and altitude of the drone when the photograph is taken. Combining matrix (8-1) and matrix (8-2), the coordinates of the target point can be converted from the unmanned aerial vehicle geographic coordinate system to the geodetic rectangular coordinate system, expressed as $(X_t, Y_t, Z_t)$, with the conversion formula:

$$\begin{pmatrix}X_t\\Y_t\\Z_t\end{pmatrix}=R\begin{pmatrix}x_g\\y_g\\z_g\end{pmatrix}+\begin{pmatrix}x\\y\\z\end{pmatrix}$$
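The rotation-plus-translation of this step can be sketched as a standard local east-north-up to earth-centered conversion. Treating the drone-centered frame as ENU is an assumption (the patent's geographic frame uses a due-south Y axis), and the WGS-84 constants are standard published values, not figures from the text:

```python
import math

# Sketch of rotation (8-1) plus translation (8-2): a local east-north-up
# frame at the drone's position (B, L, H) mapped into earth-centered
# rectangular (WGS-84) coordinates.
A_WGS84 = 6378137.0            # ellipsoid semi-major axis a, metres
E2_WGS84 = 6.69437999014e-3    # first eccentricity squared e^2

def enu_to_ecef(x_e, y_n, z_u, B, L, H):
    """B (latitude) and L (longitude) in radians; returns (X, Y, Z) in metres."""
    N = A_WGS84 / math.sqrt(1 - E2_WGS84 * math.sin(B) ** 2)
    # Translation part: the drone's own earth-centered position.
    X0 = (N + H) * math.cos(B) * math.cos(L)
    Y0 = (N + H) * math.cos(B) * math.sin(L)
    Z0 = (N * (1 - E2_WGS84) + H) * math.sin(B)
    sB, cB, sL, cL = math.sin(B), math.cos(B), math.sin(L), math.cos(L)
    # Rotation part: local ENU axes expressed in the earth-centered frame.
    X = -sL * x_e - sB * cL * y_n + cB * cL * z_u + X0
    Y = cL * x_e - sB * sL * y_n + cB * sL * z_u + Y0
    Z = cB * y_n + sB * z_u + Z0
    return (X, Y, Z)
```

Note that `enu_to_ecef(0, 0, 0, B, L, H)` returns the drone's own earth-centered rectangular coordinates, i.e. just the translation part.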
and converting the coordinates under the rectangular earth coordinate system into the coordinates under the GPS coordinate system:
the conversion formula from the geodetic rectangular coordinate system to the GPS coordinate system is as follows:

$$L=\arctan\frac{Y}{X},\qquad B=\arctan\frac{Z+Ne^2\sin B}{\sqrt{X^2+Y^2}},\qquad H=\frac{\sqrt{X^2+Y^2}}{\cos B}-N$$

where $B$, $L$ and $H$ are respectively the latitude, longitude and height of the target to be solved, $X$, $Y$ and $Z$ are the coordinates of the target in the geodetic rectangular coordinate system, and the expression for $N$ is:

$$N=\frac{a}{\sqrt{1-e^2\sin^2 B}}$$

Because $B$ appears on both sides of its own equation, an iterative algorithm is adopted in the calculation. During iteration, the initial value of $B$ is first set as

$$B_0=\arctan\frac{Z}{\sqrt{X^2+Y^2}}$$

and the updated value of $B$ is then computed:

$$B_{i+1}=\arctan\frac{Z+N_ie^2\sin B_i}{\sqrt{X^2+Y^2}}$$

Finally, the initial value $B_i$ and the updated value $B_{i+1}$ are compared; if their difference is within the error range, the iteration ends and the geodetic coordinates of the target are finally calculated; otherwise $B_{i+1}$ is taken as the new initial value and the iteration continues until the difference error between $B_i$ and $B_{i+1}$ is within the range, the error range being less than 0.00000001 measured in radians;

where $a$ is the semi-major radius of the earth ellipsoid; $N$ is the radius of curvature of the earth's prime vertical; $e$ is the first eccentricity of the ellipsoid; and $B_i$, $N_i$, $H_i$ respectively represent the values of $B$, $N$ and $H$ at the $i$-th iteration.
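The iterative conversion just described can be sketched as follows, using standard WGS-84 constants (illustrative values, not quoted from the patent) and iterating the latitude until successive values agree within the tolerance:

```python
import math

# Sketch of the geodetic-rectangular -> GPS conversion: longitude directly,
# latitude B by fixed-point iteration, then height from the converged B.
A_WGS84 = 6378137.0            # ellipsoid semi-major axis a, metres
E2_WGS84 = 6.69437999014e-3    # first eccentricity squared e^2

def ecef_to_gps(X, Y, Z, tol=1e-8):
    """Return (latitude deg, longitude deg, height m) for an ECEF point."""
    L = math.atan2(Y, X)
    p = math.hypot(X, Y)
    B = math.atan2(Z, p)                      # initial value B_0
    while True:
        N = A_WGS84 / math.sqrt(1 - E2_WGS84 * math.sin(B) ** 2)
        B_new = math.atan2(Z + N * E2_WGS84 * math.sin(B), p)
        if abs(B_new - B) < tol:              # error range, in radians
            B = B_new
            break
        B = B_new
    N = A_WGS84 / math.sqrt(1 - E2_WGS84 * math.sin(B) ** 2)
    H = p / math.cos(B) - N
    return (math.degrees(B), math.degrees(L), H)
```

A convenient self-check is a round trip: computing X, Y, Z from a known B, L, H with the forward formulas of the previous step and converting back recovers the original values.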
Hovering error experiment: as shown in fig. 4-6, the unmanned aerial vehicle hovers and takes pictures at heights of 50 m and 10 m outdoors, and the landmarks in the pictures are calibrated so that image GPS positioning can be performed, giving the longitude and latitude errors at 50 m and at 10 m respectively. Taking the GPS latitude error at a height of 10 m as an example, 10 groups of GPS latitude measurements are obtained and averaged, and each measurement is then differenced with the average to obtain the mean latitude error at 10 m height. The longitude errors at the different heights are calculated similarly.
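The averaging described above amounts to a few lines; the latitude readings below are invented placeholders purely to show the computation:

```python
# Hover-error computation sketch: mean of repeated GPS latitude readings,
# then each reading's deviation from that mean. Values are hypothetical.
lats = [30.000001, 30.000003, 29.999998, 30.000002, 30.000000,
        29.999997, 30.000004, 30.000001, 29.999999, 30.000002]
mean_lat = sum(lats) / len(lats)
errors = [lat - mean_lat for lat in lats]   # latitude mean-value errors
```

The longitude errors at each height are computed in the same way.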
Single-point positioning accuracy experiment:
as shown in fig. 9, target positioning analysis is performed at outdoor heights of 2 m to 30 m, with multiple pictures taken at each height. The same ground point is calibrated on the pictures obtained at all heights, its coordinates relative to the unmanned aerial vehicle are obtained, and its GPS coordinates are calculated. The GPS accuracy at different heights is evaluated by taking the mean of the GPS data calculated for the point as the computed GPS value and differencing it with the actually measured GPS value; the mean error at a given height is taken as the single-point positioning error at that height, and a scatter diagram is drawn with the actual position of the target as the center, the horizontal axis as the east-west direction and the vertical axis as the north-south direction. Two ground points A and B are taken for this single-point positioning test.
Positioning experiments of different attitude angles in flight:
as shown in figs. 10-12, constant-height attitude-angle tests were performed outdoors at heights of 3, 4 and 5 m. For example, with the height held constant at 3 m, the flight attitude angle of the unmanned aerial vehicle is varied while several images are taken; single-point positioning error scatter diagrams are then drawn for the three heights of 3 m, 4 m and 5 m, with the actual position of the target as the origin, the horizontal axis as the east-west direction and the vertical axis as the north-south direction;
and the positioning algorithm takes as input the target pixel points framed on the image, the exported longitude, latitude and height of the unmanned aerial vehicle at the moment of photographing, and the attitude angles at that moment, and computes the relative position of the target in the unmanned aerial vehicle coordinates. The result of the positioning algorithm is compared with the theoretical GPS coordinate value. The positioning accuracy test results are shown in table 1. In the key target-approach stage, the positioning algorithm ensures that the unmanned aerial vehicle can approach the position of the target.
TABLE 1 Positioning algorithm accuracy test (positive = east/north, negative = west/south)
The experimental data show that the higher the flying height, the worse the accuracy, but the average error in each orthogonal direction is no more than 1 m overall. The algorithm achieves a good mean distance error of 0.83 m in the low-altitude approach phase of the positioning system and a moderate mean distance error of 2.85 m in high-altitude flight. Specifically, at a flying height of 1 m, the mean east-west error of the unmanned aerial vehicle during the approach and single-point positioning processes is 0.05 m, the mean north-south error is 0.08 m and the mean distance error is 0.83 m. At a flying height of 3 m, the mean east-west error is -0.59 m, the mean north-south error is -0.44 m and the mean distance error is 5.24 m. At higher altitude (above 10 m), the mean east-west error is -0.43 m, the mean north-south error is -1.10 m and the mean distance error is 2.85 m.
According to the experimental results, the two-point ranging errors at heights below 5 m (taking a distance of 2 m between the two points as an example) are acceptable at centimeter level, and the error grows with height. Ranging errors at heights below 7 m are considered acceptable. The mean relative ranging errors are 0.02% (3 m), -0.02% (4 m) and -0.03% (5 m).
Analysis shows that the height of the unmanned aerial vehicle at the moment the picture is taken has a large influence on positioning accuracy, so in the whole guidance system the relative height between the unmanned aerial vehicle and the target obtained by the ranging sensor is used for subsequent data processing. Limited by the experimental equipment and conditions, actual flight could not be performed, and static measurement was adopted instead: the laser radar is connected to a computer, and the computer receives the data sent by the rangefinder, so the experimental data is of reference value only with respect to actual flight. In the static experiment, the ranging accuracy of the equipment reaches 1%; the accuracy in actual flight needs to be established through further experiments.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although the present specification describes embodiments, not every embodiment includes only a single embodiment, and such description is for clarity purposes only, and it is to be understood that all embodiments may be combined as appropriate by one of ordinary skill in the art to form other embodiments as will be apparent to those of skill in the art from the description herein.
Claims (9)
1. An unmanned aerial vehicle imaging positioning method, characterized in that the method comprises the following steps: step one: acquiring the yaw angle, pitch angle and roll angle at the moment an image is shot, using a gyroscope on the unmanned aerial vehicle; step two: establishing the coordinate systems required for image positioning, comprising a camera coordinate system, an unmanned aerial vehicle geographic coordinate system, a GPS coordinate system and a geodetic rectangular coordinate system;
step three: establishing an image pixel coordinate system and an image physical coordinate system according to the image, and then calculating a three-dimensional coordinate of the actual position of a target point on the image in a camera coordinate system;
step four: substituting the three-dimensional coordinates of the target point in the camera coordinate system, which are obtained by calculation in the third step, into the unmanned aerial vehicle geographic coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the unmanned aerial vehicle geographic coordinate system; step five: substituting the three-dimensional coordinates of the target point obtained by calculation in the fourth step in the unmanned aerial vehicle geographic coordinate system into a geodetic rectangular coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the geodetic rectangular coordinate system;
step six: and substituting the three-dimensional coordinates of the target point calculated in the step five in the geodetic rectangular coordinate system into the GPS coordinate system to obtain the three-dimensional coordinates of the actual position of the target point on the image in the GPS coordinate system.
2. The unmanned aerial vehicle imaging positioning method of claim 1, wherein: in the camera coordinate system, the center point of the plane of the camera is taken as the origin, denoted $O_c$; the optical axis direction of the camera is the Z-axis direction, and the X-axis and Y-axis directions are the same as the X-axis and Y-axis directions in the image.
3. The unmanned aerial vehicle imaging positioning method of claim 1, wherein: in the unmanned aerial vehicle geographic coordinate system, the actual position of the unmanned aerial vehicle is taken as the origin, the perpendicular from the unmanned aerial vehicle to the ground as the Z axis, the due-south direction of the unmanned aerial vehicle as the Y axis and the due-east direction as the X axis; during downward-looking shooting, the yaw angle, pitch angle and roll angle of the gyroscope respectively represent the rotation angles between the three axes of the camera coordinate system and those of the unmanned aerial vehicle coordinate system; the GPS coordinate system is the world geodetic coordinate system WGS-84: the origin of coordinates is the earth's centroid, the X axis points from the origin to the intersection of the prime meridian and the equator, the Z axis points toward the earth's north pole, and the Y axis is perpendicular to the XOZ plane, forming a right-handed coordinate system with the X and Z axes; the coordinates of any point in this coordinate system can be represented as B, L and H, respectively the latitude, longitude and height of the point; and the coordinate axes and origin of the geodetic rectangular coordinate system coincide completely with those of the GPS coordinate system.
4. The unmanned aerial vehicle imaging positioning method according to claim 1, wherein: in step three, the image pixel coordinate system is a rectangular coordinate system with the upper-left corner of the currently shot image as the origin and pixels as coordinate units, $X_p$ and $Y_p$ respectively representing the row and column of the pixel in the digital image; the image physical coordinate system is a rectangular coordinate system with the intersection point of the camera optical axis and the image sensor inside the camera as the origin and actual physical dimensions as units, its $X_w$ axis and $Y_w$ axis being respectively parallel to the $X_p$ and $Y_p$ axes of the image pixel coordinate system.
5. The unmanned aerial vehicle imaging positioning method of claim 4, wherein: a target point P and a pixel point p are selected in the image, wherein $(x_w, y_w)$ are the image physical coordinates of pixel point p in the image physical coordinate system, $(X_p, Y_p)$ are the image pixel coordinates of pixel point p in the image pixel coordinate system, and the relationship with the coordinates $(X_c, Y_c, Z_c)$ of the target point P in the camera coordinate system is as follows:

$$x_w = f\,\frac{X_c}{Z_c},\qquad y_w = f\,\frac{Y_c}{Z_c}\qquad(5\text{-}1)$$
the image pixel coordinates $(X_p, Y_p)$ of pixel point p and its image physical coordinates $(x_w, y_w)$ are related as follows:

$$X_p = \frac{x_w}{dx} + u_0,\qquad Y_p = \frac{y_w}{dy} + v_0\qquad(5\text{-}2)$$

wherein $f$ is the focal length of the camera and $(u_0, v_0)$ are the image pixel coordinates of the image principal point, i.e. the intersection point of the camera optical axis and the image sensor inside the camera; $dx$ and $dy$ are respectively the physical dimensions of a single camera pixel in the $X_w$ and $Y_w$ directions. Combining relations (5-1) and (5-2), the image pixel coordinates $(X_p, Y_p)$ of pixel point p can be converted into its image physical coordinates $(x_w, y_w)$, and the coordinates of the target point P in the image are simultaneously obtained in the camera coordinate system as $(x_w, y_w, f)$.
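Inverting relation (5-2) and placing the point on the image plane at depth $f$ can be sketched as follows; the intrinsic parameters ($f$, $dx$, $dy$, $u_0$, $v_0$) below are illustrative values, not taken from the patent:

```python
# Sketch of the pixel -> camera-frame conversion of claims 4-5.
# All intrinsics are hypothetical defaults for illustration only.
def pixel_to_camera(xp, yp, f=0.004, dx=2e-6, dy=2e-6, u0=960.0, v0=540.0):
    """Invert (5-2) to recover the image physical coordinates, then place the
    point on the image plane at depth f in the camera coordinate system."""
    x_w = (xp - u0) * dx          # from (5-2): x_w = (X_p - u0) * dx
    y_w = (yp - v0) * dy          # from (5-2): y_w = (Y_p - v0) * dy
    return (x_w, y_w, f)          # point p in camera coordinates (x_w, y_w, f)
```

A pixel at the principal point maps to $(0, 0, f)$, i.e. the optical axis itself.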
6. The unmanned aerial vehicle imaging positioning method of claim 1, wherein: in step four, the conversion of the three-dimensional coordinates of a point from the camera coordinate system to the unmanned aerial vehicle geographic coordinate system proceeds as follows: S1: the coordinate system is rotated about the X axis in the right-hand screw direction (that is, with the right hand holding the rotation axis, the direction of the four fingers is the rotation direction), with conversion matrix

$$R_x(\gamma)=\begin{pmatrix}1&0&0\\0&\cos\gamma&\sin\gamma\\0&-\sin\gamma&\cos\gamma\end{pmatrix};$$

S2: the coordinate system is rotated about the Y axis in the right-hand screw direction, with conversion matrix

$$R_y(\theta)=\begin{pmatrix}\cos\theta&0&-\sin\theta\\0&1&0\\\sin\theta&0&\cos\theta\end{pmatrix};$$
S3: rotating the coordinate system around the Z axis along the right-hand spiral direction to convert the matrix;
In the above steps, $\gamma$, $\theta$ and $\psi$ are respectively the roll angle, pitch angle and azimuth angle, in degrees, acquired by the gyroscope when the picture is taken;
and the target in the image is represented in the unmanned aerial vehicle geographic coordinate system as

$$\begin{pmatrix}X_n\\Y_n\\Z_n\end{pmatrix}=R_z(\psi)\,R_y(\theta)\,R_x(\gamma)\begin{pmatrix}X_c\\Y_c\\Z_c\end{pmatrix}\qquad(6\text{-}1)$$

so that the coordinates of the actual position of the target point in the image are obtained in the unmanned aerial vehicle geographic coordinate system as $(X_n, Y_n, Z_n)$.
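The three-rotation composition of claim 6 can be sketched as below. The composition order ($R_z R_y R_x$) and the passive, right-hand-screw rotation convention are assumptions consistent with the surrounding text, not guaranteed by the patent:

```python
import math

# Sketch of the camera -> UAV-geographic rotation of claim 6.
def rot_x(g):
    return [[1, 0, 0],
            [0, math.cos(g), math.sin(g)],
            [0, -math.sin(g), math.cos(g)]]

def rot_y(t):
    return [[math.cos(t), 0, -math.sin(t)],
            [0, 1, 0],
            [math.sin(t), 0, math.cos(t)]]

def rot_z(p):
    return [[math.cos(p), math.sin(p), 0],
            [-math.sin(p), math.cos(p), 0],
            [0, 0, 1]]

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def camera_to_uav(vec, roll, pitch, yaw):
    """Apply the S1-S3 rotations in sequence (angles in radians here)."""
    v = mat_vec(rot_x(roll), vec)
    v = mat_vec(rot_y(pitch), v)
    return mat_vec(rot_z(yaw), v)
```

With all three angles zero the transform is the identity, which is a quick sanity check on the matrix entries.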
7. The unmanned aerial vehicle imaging positioning method of claim 6, wherein: in step four, the angle $\alpha$ between the line connecting the unmanned aerial vehicle and the target point and the perpendicular from the unmanned aerial vehicle to the ground is calculated, together with the angle $\beta$ between the line connecting the unmanned aerial vehicle and the target point and the line connecting the unmanned aerial vehicle and the image center point; the GPS information of the unmanned aerial vehicle acquired when the photo is taken is then used together with $\alpha$ and $\beta$ to calculate the coordinates of the target in the unmanned aerial vehicle geographic coordinate system: the projection of the target on the Z axis of the unmanned aerial vehicle geographic coordinate system, i.e. the Z-axis coordinate $Z_t$ of the target point, is given by the relative height $H$; the X-axis coordinate $X_t$ and the Y-axis coordinate $Y_t$ are then calculated from the ratio of the Z-axis coordinate value to the camera focal length, yielding the coordinates $(X_t, Y_t, Z_t)$ in the unmanned aerial vehicle geographic coordinate system, wherein $f$ is the focal length of the camera and $Lng$, $Lat$ and $H$ respectively represent the longitude, latitude and altitude at which the unmanned aerial vehicle took the picture.
8. The unmanned aerial vehicle imaging positioning method of claim 1, wherein: in step five, the coordinates are first rotated: the unmanned aerial vehicle geographic coordinate system is rotated about the X axis in the right-hand screw direction by $(B-90°)$ (the right-hand screw rule is adopted here: the right hand holds the rotation axis, and the direction of the four fingers is the rotation direction), then rotated about the Z axis pointing to the ground in the right-hand screw direction by $-L$, and finally rotated by $-90°$ about the Z axis, giving the conversion matrix:

$$R = R_z(-90^\circ)\,R_z(-L)\,R_x(B-90^\circ)\qquad(8\text{-}1)$$
the rotation is followed by a coordinate translation, the translation matrix being as follows:

$$\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}X_0\\Y_0\\Z_0\end{pmatrix}=\begin{pmatrix}(N+H)\cos B\cos L\\(N+H)\cos B\sin L\\\left[N(1-e^{2})+H\right]\sin B\end{pmatrix}\qquad(8\text{-}2)$$

in matrix (8-2), the $x$, $y$, $z$ on the left side are the coordinate translation applied after the rotation, and the meaning of the parameters on the right side is as follows: $a$ is the semi-major axis of the ellipsoid; $N$ is the radius of curvature of the earth's prime vertical; $e$ is the first eccentricity of the earth. Combining matrix (8-1) and matrix (8-2), the coordinates of the target point are converted from the unmanned aerial vehicle geographic coordinate system $(X_t, Y_t, Z_t)$ to the geodetic rectangular coordinate system, with conversion formula:

$$\begin{pmatrix}X\\Y\\Z\end{pmatrix}=R\begin{pmatrix}X_t\\Y_t\\Z_t\end{pmatrix}+\begin{pmatrix}X_0\\Y_0\\Z_0\end{pmatrix};$$
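The translation vector of (8-2) is just the geodetic-to-ECEF conversion of the unmanned aerial vehicle's own position, which can be sketched as below, again assuming WGS-84 constants (not listed in the patent):

```python
import math

# Sketch of the (8-2) translation: geodetic (B, L, H) of the UAV to its
# position (X0, Y0, Z0) in the geodetic rectangular (ECEF) frame.
A = 6378137.0                  # assumed WGS-84 semi-major axis a (m)
E2 = 6.69437999014e-3          # assumed WGS-84 first eccentricity squared e^2

def geodetic_to_ecef(lat, lon, h):
    """lat = B, lon = L in radians; h = ellipsoidal height H in metres."""
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)   # prime-vertical radius N
    x0 = (n + h) * math.cos(lat) * math.cos(lon)
    y0 = (n + h) * math.cos(lat) * math.sin(lon)
    z0 = (n * (1.0 - E2) + h) * math.sin(lat)
    return (x0, y0, z0)
```

At latitude and longitude zero the result is $(a, 0, 0)$, i.e. a point on the equator under the prime meridian, which matches the formula term by term.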
9. The unmanned aerial vehicle imaging positioning method of claim 1, wherein: in step six, the conversion formula from the geodetic rectangular coordinate system to the GPS coordinate system is as follows:

$$L = \arctan\frac{Y}{X},\qquad B = \arctan\frac{Z + N e^{2}\sin B}{\sqrt{X^{2}+Y^{2}}},\qquad H = \frac{\sqrt{X^{2}+Y^{2}}}{\cos B} - N$$

wherein $L$, $B$ and $H$ are respectively the longitude, latitude and height of the target to be solved, $X$, $Y$ and $Z$ are the coordinates of the target in the geodetic rectangular coordinate system, and $N$ is expressed as:

$$N = \frac{a}{\sqrt{1 - e^{2}\sin^{2}B}};$$

an iterative algorithm is adopted during calculation: first an initial value $B_{0}=\arctan\left(Z/\sqrt{X^{2}+Y^{2}}\right)$ of $B$ is set and $N_i$ is calculated from $B_i$; the updated value of $B$ is then calculated as:

$$B_{i+1} = \arctan\frac{Z + N_{i} e^{2}\sin B_{i}}{\sqrt{X^{2}+Y^{2}}};$$

finally, the initial value $B_i$ and the updated value $B_{i+1}$ are compared; if their difference is within the error range, the iteration ends and the geodetic coordinates of the target are calculated, otherwise the iteration continues with $B_{i+1}$ as the new initial value until the difference between $B_i$ and $B_{i+1}$ is within the range; the error range is less than 0.00000001, measured in radians.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211479183.0A CN115511956A (en) | 2022-11-24 | 2022-11-24 | Unmanned aerial vehicle imaging positioning method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211479183.0A CN115511956A (en) | 2022-11-24 | 2022-11-24 | Unmanned aerial vehicle imaging positioning method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115511956A true CN115511956A (en) | 2022-12-23 |
Family
ID=84514073
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211479183.0A Pending CN115511956A (en) | 2022-11-24 | 2022-11-24 | Unmanned aerial vehicle imaging positioning method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115511956A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116774142A (en) * | 2023-06-13 | 2023-09-19 | 中国电子产业工程有限公司 | Coordinate conversion method in non-equal-altitude double-machine cross positioning |
CN116839595A (en) * | 2023-09-01 | 2023-10-03 | 北京宝隆泓瑞科技有限公司 | Method for creating unmanned aerial vehicle route |
CN117291980A (en) * | 2023-10-09 | 2023-12-26 | 宁波博登智能科技有限公司 | Single unmanned aerial vehicle image pixel positioning method based on deep learning |
CN117496114A (en) * | 2023-11-29 | 2024-02-02 | 西安道达天际信息技术有限公司 | Target positioning method, system and computer storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105353772A (en) * | 2015-11-16 | 2016-02-24 | 中国航天时代电子公司 | Visual servo control method for unmanned aerial vehicle maneuvering target locating and tracking |
CN110542407A (en) * | 2019-07-23 | 2019-12-06 | 中国科学院长春光学精密机械与物理研究所 | Method for acquiring positioning information of any pixel point of aerial image |
US11280608B1 (en) * | 2019-12-11 | 2022-03-22 | Sentera, Inc. | UAV above ground level determination for precision agriculture |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105353772A (en) * | 2015-11-16 | 2016-02-24 | 中国航天时代电子公司 | Visual servo control method for unmanned aerial vehicle maneuvering target locating and tracking |
CN110542407A (en) * | 2019-07-23 | 2019-12-06 | 中国科学院长春光学精密机械与物理研究所 | Method for acquiring positioning information of any pixel point of aerial image |
US11280608B1 (en) * | 2019-12-11 | 2022-03-22 | Sentera, Inc. | UAV above ground level determination for precision agriculture |
Non-Patent Citations (2)
Title |
---|
徐诚 et al.: "Multi-target positioning algorithm based on an electro-optical measurement platform", 《中南大学学报》 (Journal of Central South University) *
桑金: "Non-iterative method for the inverse computation between space geodetic rectangular coordinates and geodetic coordinates", 《测绘通报》 (Bulletin of Surveying and Mapping) *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116774142A (en) * | 2023-06-13 | 2023-09-19 | 中国电子产业工程有限公司 | Coordinate conversion method in non-equal-altitude double-machine cross positioning |
CN116774142B (en) * | 2023-06-13 | 2024-03-01 | 中国电子产业工程有限公司 | Coordinate conversion method in non-equal-altitude double-machine cross positioning |
CN116839595A (en) * | 2023-09-01 | 2023-10-03 | 北京宝隆泓瑞科技有限公司 | Method for creating unmanned aerial vehicle route |
CN116839595B (en) * | 2023-09-01 | 2023-11-28 | 北京宝隆泓瑞科技有限公司 | Method for creating unmanned aerial vehicle route |
CN117291980A (en) * | 2023-10-09 | 2023-12-26 | 宁波博登智能科技有限公司 | Single unmanned aerial vehicle image pixel positioning method based on deep learning |
CN117291980B (en) * | 2023-10-09 | 2024-03-15 | 宁波博登智能科技有限公司 | Single unmanned aerial vehicle image pixel positioning method based on deep learning |
CN117496114A (en) * | 2023-11-29 | 2024-02-02 | 西安道达天际信息技术有限公司 | Target positioning method, system and computer storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112894832B (en) | Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and storage medium | |
CN115511956A (en) | Unmanned aerial vehicle imaging positioning method | |
CN103345737B (en) | A kind of UAV high resolution image geometric correction method based on error compensation | |
Sanz‐Ablanedo et al. | Reducing systematic dome errors in digital elevation models through better UAV flight design | |
CN113850126A (en) | Target detection and three-dimensional positioning method and system based on unmanned aerial vehicle | |
CN107490364A (en) | A kind of wide-angle tilt is imaged aerial camera object positioning method | |
CN112710311B (en) | Automatic planning method for three-dimensional live-action reconstruction aerial camera points of terrain adaptive unmanned aerial vehicle | |
CN110987021B (en) | Inertial vision relative attitude calibration method based on rotary table reference | |
CN104729482B (en) | A kind of ground small objects detecting system and method based on dirigible | |
CN101114022A (en) | Navigation multiple spectrum scanner geometric approximate correction method under non gesture information condition | |
CN113538595B (en) | Method for improving geometric precision of remote sensing stereo image by using laser height measurement data in auxiliary manner | |
CN112184786B (en) | Target positioning method based on synthetic vision | |
CN105444778B (en) | A kind of star sensor based on imaging geometry inverting is in-orbit to determine appearance error acquisition methods | |
CN106023207B (en) | It is a kind of to be enjoyed a double blessing the Municipal Component acquisition method of scape based on traverse measurement system | |
CN110986888A (en) | Aerial photography integrated method | |
CN116309798A (en) | Unmanned aerial vehicle imaging positioning method | |
CN116883604A (en) | Three-dimensional modeling technical method based on space, air and ground images | |
CN107063191B (en) | A kind of method of photogrammetric regional network entirety relative orientation | |
CN107705272A (en) | A kind of high-precision geometric correction method of aerial image | |
CN112985398A (en) | Target positioning method and system | |
CN110068312A (en) | A kind of digital zenith instrument localization method based on spherical triangle | |
CN116124094A (en) | Multi-target co-location method based on unmanned aerial vehicle reconnaissance image and combined navigation information | |
Zhang | Photogrammetric processing of low altitude image sequences by unmanned airship | |
CN112489118B (en) | Method for quickly calibrating external parameters of airborne sensor group of unmanned aerial vehicle | |
CN113483739B (en) | Offshore target position measuring method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20221223 |
|