CN116124094A - Multi-target co-location method based on unmanned aerial vehicle reconnaissance image and combined navigation information - Google Patents
- Publication number: CN116124094A
- Application number: CN202211604618.XA
- Authority: CN (China)
- Prior art keywords: target, coordinate system, unmanned aerial vehicle, coordinates
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G01C11/025 — Picture taking arrangements specially adapted for photogrammetry or photographic surveying, by scanning the object
- G01C11/36 — Videogrammetry, i.e. electronic processing of video signals from a single source or from different sources to give parallax or range information
- B64C39/028 — Micro-sized aircraft
- G01S19/47 — Determining position by combining GNSS measurements with a supplementary inertial measurement, e.g. tightly coupled inertial
- G01S5/16 — Position-fixing by co-ordinating two or more direction or position line determinations using electromagnetic waves other than radio waves
- Y02T10/40 — Engine management systems
Abstract
The invention relates to a multi-target cooperative positioning method based on unmanned aerial vehicle reconnaissance images and combined navigation information, and belongs to the technical field of passive positioning. Within an AOA algorithm framework, the photogrammetric collinearity equation is transformed, and the combined navigation information of the unmanned aerial vehicles is used as the reference coordinates, so that cooperative target positioning by multiple unmanned aerial vehicles is accomplished. The invention does not depend on elevation or ranging information, can accurately position multiple targets simultaneously and in real time using infrared and visible-light image information, offers good engineering application benefits, and has broad application prospects in the technical field of passive positioning.
Description
Technical Field
The invention relates to the technical field of passive positioning, and in particular to a method for simultaneously positioning multiple targets through multi-UAV cooperation, based on unmanned aerial vehicle reconnaissance images and combined navigation information.
Background
Accurate positioning of ground targets is a core capability of reconnaissance unmanned aerial vehicles, and positioning methods can be divided into two main categories: active positioning and passive positioning. In passive positioning, the device only receives signals and does not actively emit electromagnetic waves, laser, etc.; compared with active positioning it therefore has good concealment and improves the survivability of the unmanned aerial vehicle. Solutions for passive positioning of ground targets by unmanned aerial vehicles mainly include the multi-target positioning method based on the photogrammetric collinearity equation (Collinearity Equation), Doppler rate-of-change positioning (Doppler Rate of Change, DRC), phase-difference rate-of-change positioning (Phase Difference Rate of Change, PDRC), angle-of-arrival cross positioning (Angle of Arrival, AOA), time-difference-of-arrival positioning (Time Difference of Arrival, TDOA), frequency-difference-of-arrival positioning (Frequency Difference of Arrival, FDOA), and various joint positioning methods.
The above mainstream solutions have the following drawbacks: (1) the received information in positioning methods such as DRC/PDRC/AOA/TDOA/FDOA is generally a radio-wave signal and does not involve information such as images and video; (2) the collinearity-equation method can position multiple targets from a single image, but its theoretical basis, the photogrammetric collinearity equation, presupposes a flat target area, so it cannot be applied when no elevation information is available. The invention fuses the two algorithms, thereby overcoming their respective usage limitations and achieving complementary advantages.
Since ranging information is lacking and the target information to be solved contains at least 3 unknowns, multi-UAV co-location is required. On this basis, multi-UAV cooperative target positioning takes the position information of each unmanned aerial vehicle as base-station coordinates, and the position accuracy of the unmanned aerial vehicles is further improved by INS+GNSS combined navigation, thereby reducing the target positioning error.
Disclosure of Invention
Technical problem to be solved
Firstly, one of the application conditions of the collinearity-equation positioning method is known elevation information, which is difficult to guarantee in practical applications. Secondly, the reconnaissance device should not have to track targets, but should instead complete wide-area reconnaissance and positioning of multiple targets in a non-staring mode, which is more efficient; wide-area image information therefore fits the actual application scenario better than the radio signal of a specific target. Thirdly, a passive positioning mode is needed to guarantee the survivability of the unmanned aerial vehicle. In view of these three points, how to position multiple targets simultaneously and obtain higher-accuracy target information without depending on elevation or ranging information is the technical problem to be solved.
In order to avoid the defects of the prior art, the invention provides a multi-target cooperative positioning method based on an unmanned aerial vehicle reconnaissance image and combined navigation information.
Technical proposal
A multi-target cooperative positioning method based on unmanned aerial vehicle reconnaissance images and combined navigation information, wherein the set of unmanned aerial vehicles participating in cooperative positioning is S = {S_j | j = 1, 2, ..., N}, N being the total number of unmanned aerial vehicles, and the set of targets that can be reconnoitered and positioned is T = {T_i | i = 1, 2, ..., K}, K being the total number of locatable targets; the method is characterized by comprising the following steps:
step 1: calculating the inner orientation vector of the single unmanned aerial vehicle camera, i.e. the inner orientation vector Θ_Tij of the j-th unmanned aerial vehicle for positioning the i-th target:
In the above, f_(j) is the focal length, (u_i, v_i) are the coordinates of image point a(i) in the pixel coordinate system, (u_0, v_0) are the coordinates of the principal point in the pixel coordinate system, and d_u(j), d_v(j) are the physical dimensions of a pixel in the transverse and longitudinal directions;
step 2: calculating the direction-finding angles of the target, i.e. the elevation direction-finding angle θ_ij and the yaw direction-finding angle ψ_ij when the j-th unmanned aerial vehicle positions the i-th target:
For a ground target: sign(Y_nA(i) − Y_ns(i)) = −1;
For an airborne target: sign(Y_nA(i) − Y_ns(i)) = +1;
wherein the coordinates of unmanned aerial vehicle S_j's imaging center in the navigation coordinate system are (X_ns(j), Y_ns(j), Z_ns(j)), the coordinates of image point a(i) in the camera coordinate system are (X_sa(i), Y_sa(i), Z_sa(i)), the coordinates of the target point A(i) corresponding to image point a(i) in the navigation coordinate system are (X_nA(i), Y_nA(i), Z_nA(i)), and the matrix in question is unmanned aerial vehicle S_j's conversion matrix from the camera coordinate system to the navigation coordinate system;
step 3: establishing the multi-UAV co-location equation set,
wherein:
U_Ti = [x_Ti, y_Ti, z_Ti]^T, (i = 1, 2, ..., K);
S_nj = [x_nj, y_nj, z_nj]^T, (j = 2, 3, ..., N) are the navigation coordinate system positions of the other unmanned aerial vehicles, calculated with S_n1 as the reference point;
Since rank(Φ_ij) = 2 while the position coordinates of the target contain 3 unknowns, when N ≥ 2 the number of equations in the over-determined system is sufficient to solve for the target position; the position coordinates of the target in the navigation coordinate system are obtained by the least-squares method: U_Ti = [x_Ti, y_Ti, z_Ti]^T, (i = 1, 2, ..., K), from which the longitude, latitude and altitude of each target can be obtained;
step 4: calculating the coordinates of the target in the geodetic coordinate system
First, using the coordinates of unmanned aerial vehicle S_1 in the Earth rectangular coordinate system as the reference point, the coordinates of each target in the Earth rectangular coordinate system are calculated,
wherein the matrix used is the conversion matrix from the navigation coordinate system to the Earth rectangular coordinate system;
Second, based on the target's Earth rectangular coordinates, the longitude, latitude and altitude (λ_Ti, φ_Ti, h_Ti) of each target U_Ti = [x_Ti, y_Ti, z_Ti]^T, (i = 1, 2, ..., K) in the geodetic coordinate system are obtained by iterative calculation.
A computer system, comprising: one or more processors, a computer-readable storage medium storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the methods described above.
A computer readable storage medium, characterized by storing computer executable instructions that when executed are configured to implement the method described above.
Advantageous effects
The invention provides a multi-target cooperative positioning method based on unmanned aerial vehicle reconnaissance images and combined navigation information: within an AOA algorithm framework, the photogrammetric collinearity equation is transformed, and the combined navigation information of the unmanned aerial vehicles is used as the reference coordinates, so that cooperative target positioning by multiple unmanned aerial vehicles is accomplished. The invention does not depend on elevation or ranging information, can accurately position multiple targets simultaneously and in real time using infrared and visible-light image information, offers good engineering application benefits, and has broad application prospects in the technical field of passive positioning.
Drawings
The drawings are only for purposes of illustrating particular embodiments and are not to be construed as limiting the invention, like reference numerals being used to refer to like parts throughout the several views.
FIG. 1 is a schematic diagram of the principle of multi-machine co-location of the present invention.
Fig. 2 is a schematic diagram of the image-based multi-target imaging principle according to the present invention.
Fig. 3 is a 1# unmanned aerial vehicle target pixel coordinate distribution for simulation verification in an embodiment of the present invention.
Fig. 4 is a 2# unmanned aerial vehicle target pixel coordinate distribution for simulation verification in an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. In addition, technical features of the embodiments of the present invention described below may be combined with each other as long as they do not collide with each other.
The set of unmanned aerial vehicles participating in co-location is S = {S_j | j = 1, 2, ..., N}, N being the total number of drones; the set of ground targets that can be reconnoitered and positioned is T = {T_i | i = 1, 2, ..., K}, K being the total number of locatable targets. The method comprises the following steps:
step 1: calculating internal azimuth vector of single-frame unmanned aerial vehicle camera
The coordinate system involved in the calculation process is defined as follows:
Pixel coordinate system O_px-uv (abbreviated as the px frame). Specifically defined as: the origin O_px of the pixel coordinate system is taken at the upper-left corner of the image; the O_px u-axis is the horizontal axis of the image, representing the column number of a pixel, positive to the right; the O_px v-axis is the vertical axis of the image, representing the row number of a pixel, positive downward. The coordinate system is measured in pixels (denoted pix). The multiple target points in the pixel coordinate system are denoted (u_i, v_i), (i = 1, 2, ..., K).
Camera coordinate system O_c X_c Y_c Z_c (abbreviated as the c frame). Specifically defined as: the origin O_c is taken at the optical center of the electro-optical platform camera; the O_cX_c axis coincides with the optical axis of the camera, positive pointing out of the camera; the O_cY_c axis is parallel to the O_px v-axis, positive upward; the O_cZ_c axis is parallel to the O_px u-axis, positive to the right; and the O_cY_cZ_c plane is parallel to the imaging plane.
In the image information, the coordinates of image point a(i) in the camera coordinate system are (X_sa(i), Y_sa(i), Z_sa(i)); the mapping between the camera coordinate system and the pixel coordinate system for image point a(i) is:
In the above, f_(j) is the focal length, (u_i, v_i) are the coordinates of image point a(i) in the pixel coordinate system, (u_0, v_0) are the coordinates of the principal point in the pixel coordinate system, and d_u(j), d_v(j) are the physical dimensions of a pixel in the transverse and longitudinal directions.
To facilitate calculation of the direction-finding angles of the target, the left side of the above equation is defined as the inner orientation vector Θ_Tij of the j-th unmanned aerial vehicle for the i-th target location:
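The formula that defines Θ_Tij does not survive this text extraction; under the camera-frame convention defined above (O_cX_c along the optical axis, O_cY_c parallel to the v-axis but positive upward, O_cZ_c parallel to the u-axis), a plausible reconstruction can be sketched in Python as follows — the sign handling and the unit normalization are assumptions, not the patent's own formula:

```python
import numpy as np

def inner_orientation_vector(u_i, v_i, u0, v0, du, dv, f):
    """Sketch of the inner orientation vector Theta_Tij for one image point.

    Assumed reconstruction (the patent's formula image is not preserved):
    image point a(i) lies at depth f along X_c; the pixel v-axis grows
    downward while Y_c points up, hence the sign flip; Z_c follows the
    pixel u-axis.
    """
    x_c = f                      # depth along the optical axis
    y_c = (v0 - v_i) * dv        # v-axis down, Y_c up -> sign flip
    z_c = (u_i - u0) * du        # u-axis and Z_c both point right
    theta = np.array([x_c, y_c, z_c])
    return theta / np.linalg.norm(theta)   # unit line-of-sight direction

# hypothetical numbers: 1280 x 1024 sensor, principal point at the centre,
# 10 um pixels, 50 mm focal length
los = inner_orientation_vector(700, 500, 640, 512, 1e-5, 1e-5, 0.05)
```

Normalizing makes the vector a pure direction, which is all the later direction-finding step needs.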
step 2: calculating the direction-finding angles θ_ij, ψ_ij of the target
The coordinate system and angle involved in calculating the direction finding angle of the target are defined as follows:
a. coordinate system definition
Navigation coordinate system O_n X_n Y_n Z_n (abbreviated as the n frame). Specifically defined as: a North-Up-East geographic coordinate system, i.e. a local-level coordinate system. The origin O_n is taken at the center of the onboard inertial navigation device; the O_nY_n axis coincides with the ellipsoid normal at O_n, positive toward the zenith; the O_nX_n axis is perpendicular to the O_nY_n axis and points north along the meridian, positive northward; the O_nZ_n axis is perpendicular to the O_nX_nY_n plane, its direction determined by the right-handed rectangular coordinate system (pointing east), positive.
Body coordinate system O_b X_b Y_b Z_b (abbreviated as the b frame). Specifically defined as: the origin O_b is taken at the center of mass of the unmanned aerial vehicle; the O_bX_b axis coincides with the longitudinal axis of the body, positive toward the nose; the O_bY_b axis lies in the longitudinal symmetry plane of the body, perpendicular to the O_bX_b axis, positive upward; the O_bZ_b axis is perpendicular to the O_bX_bY_b plane, its direction determined by the right-handed rectangular coordinate system.
b. Angular definition
Pitch angle: the angle between the body O_bX_b axis and the horizontal plane (the navigation O_nX_nZ_n plane); if the O_bX_b axis points above the horizontal plane, the pitch angle is positive.
Heading angle: the angle between the projection of the body O_bX_b axis in the horizontal plane and the O_nX_n axis. Looking down along the O_nY_n axis, the heading angle is positive when rotating from the O_nX_n axis to the projection of the O_bX_b axis (i.e., when the projection of the O_bX_b axis in the horizontal plane points north-by-west the heading angle is positive, and in the opposite direction it is negative).
Roll angle γ_b: the angle between the body O_bY_b axis and the vertical plane containing the longitudinal axis O_bX_b. Viewed from the tail of the body forward along the O_bX_b axis, the roll angle γ_b is positive when the O_bY_b axis lies to the right of the vertical plane.
Pitch frame angle: the angle between the camera optical axis O_cX_c and the body O_bX_bZ_b plane; if the O_cX_c axis points above the O_bX_bZ_b plane, the pitch frame angle is positive.
Yaw frame angle: the angle between the projection of the camera optical axis O_cX_c in the body O_bX_bZ_b plane and the O_bX_b axis. If the rotation from the O_bX_b axis to the projection of the O_cX_c axis is counterclockwise, the yaw frame angle is positive.
The coordinates of unmanned aerial vehicle S_j's imaging center in the navigation coordinate system are (X_ns(j), Y_ns(j), Z_ns(j)), the coordinates of image point a(i) in the camera coordinate system are (X_sa(i), Y_sa(i), Z_sa(i)), and the coordinates of the target point A(i) corresponding to image point a(i) in the navigation coordinate system are (X_nA(i), Y_nA(i), Z_nA(i)); the collinearity equation is then:
wherein the matrix is unmanned aerial vehicle S_j's conversion matrix from the camera coordinate system to the navigation coordinate system, whose elements are all functions of the pitch angle, heading angle, roll angle, pitch frame angle and yaw frame angle; for unmanned aerial vehicle S_j these angles are identified by the subscript j.
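The elements of the camera-to-navigation conversion matrix are stated to be functions of the five angles above, but the matrix itself is lost in extraction. The sketch below composes elementary rotations under one assumed order (heading → pitch → roll for the body attitude, then yaw-frame → pitch-frame for the gimbal) in a North-Up-East navigation frame; the actual rotation order and signs in the patent may differ:

```python
import numpy as np

def rot(axis, ang):
    """Right-handed elementary rotation matrix about axis 0, 1 or 2."""
    c, s = np.cos(ang), np.sin(ang)
    i, j = [(1, 2), (2, 0), (0, 1)][axis]
    R = np.eye(3)
    R[i, i] = R[j, j] = c
    R[i, j], R[j, i] = -s, s
    return R

def cam_to_nav(heading, pitch, roll, yaw_frame, pitch_frame):
    """Sketch of the camera-to-navigation transformation for one UAV.

    Assumed conventions (not recoverable from the patent text): the
    navigation frame is North-Up-East (axes 0, 1, 2); the body attitude
    is heading about the up axis, pitch about the east axis, roll about
    the longitudinal axis; the gimbal then applies the yaw-frame and
    pitch-frame rotations of the camera relative to the body.
    """
    C_b_n = rot(1, heading) @ rot(2, pitch) @ rot(0, roll)  # body -> navigation
    C_c_b = rot(1, yaw_frame) @ rot(2, pitch_frame)         # camera -> body
    return C_b_n @ C_c_b                                    # camera -> navigation
```

With all five angles zero the matrix reduces to the identity, and for any angles it stays orthonormal with determinant 1 — a quick sanity check that holds whatever the patent's exact convention is.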
From the positional relationship between unmanned aerial vehicle S_j and the target in the navigation frame, the following can be obtained, denoting the rows of the conversion matrix as:
ω_nj = [ω_n1(j), ω_n2(j), ω_n3(j)]
κ_nj = [κ_n1(j), κ_n2(j), κ_n3(j)]
ρ_nj = [ρ_n1(j), ρ_n2(j), ρ_n3(j)]
Rearranging yields:
wherein θ_ij and ψ_ij denote, respectively, the elevation and yaw direction-finding angles when the j-th unmanned aerial vehicle positions the i-th target.
Since the reconnaissance target is a ground target, sign(Y_nA(i) − Y_ns(i)) = −1 (for an airborne target, sign(Y_nA(i) − Y_ns(i)) = +1).
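With a line-of-sight direction expressed in the navigation frame, the two direction-finding angles can be sketched as below. The decomposition (elevation from the vertical component against the horizontal magnitude, yaw from the east and north components) and the use of the sign convention above are assumptions, since the rearranged formulas are only partially preserved in this text:

```python
import numpy as np

def direction_finding_angles(los_nav, ground_target=True):
    """Sketch: elevation angle theta_ij and yaw angle psi_ij of a
    line-of-sight vector expressed in a North-Up-East navigation frame.

    For a ground target sign(Y_nA - Y_ns) = -1: the line of sight points
    below the horizon, so the vertical component is forced negative
    before the angles are taken (assumed interpretation).
    """
    n, u, e = los_nav
    sign = -1.0 if ground_target else 1.0
    u = sign * abs(u)                        # enforce the stated sign convention
    theta = np.arctan2(u, np.hypot(n, e))    # elevation direction-finding angle
    psi = np.arctan2(e, n)                   # yaw direction-finding angle
    return theta, psi
```

A line of sight pointing north and 45° down, for example, should give θ = −45° and ψ = 0 under this convention.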
Step 3: establishing a multi-machine co-location equation set
Unmanned aerial vehicle S_1 has coordinates S_n1 = [x_n1, y_n1, z_n1]^T in the navigation coordinate system and is set as the reference point for multi-UAV co-location; the other unmanned aerial vehicles calculate their respective navigation coordinate system positions with S_n1 as the reference point, denoted S_nj = [x_nj, y_nj, z_nj]^T (j = 2, 3, ..., N). From the geometric relationship, with R_ij the distance between unmanned aerial vehicle S_j and target U_Ti, eliminating R_ij from the above yields the equation system:
Φ_ij × U_Ti = Φ_ij × S_nj, (i = 1, 2, ..., K; j = 1, 2, ..., N),
wherein the matrix Φ_ij is singular (not invertible).
For N unmanned aerial vehicles, stacking gives:
wherein:
U_Ti = [x_Ti, y_Ti, z_Ti]^T, (i = 1, 2, ..., K)
Since rank(Φ_ij) = 2 while the position coordinates of the target contain 3 unknowns, when N ≥ 2 the number of equations in the over-determined system is sufficient to solve for the target position. Finally, the position coordinates of the target in the navigation coordinate system are obtained by the least-squares method: U_Ti = [x_Ti, y_Ti, z_Ti]^T (i = 1, 2, ..., K), from which the longitude, latitude and altitude of each target can be obtained.
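The statements that rank(Φ_ij) = 2 and that Φ_ij eliminates the range R_ij are consistent with Φ_ij being the cross-product (skew-symmetric) matrix of the unit line-of-sight direction, since [d]× (U_Ti − S_nj) = 0 removes the unknown range along d. Under that assumption — it is a reconstruction, not the patent's stated form — the stacked least-squares solve can be sketched as:

```python
import numpy as np

def skew(d):
    """Skew-symmetric cross-product matrix [d]x (rank 2, singular)."""
    return np.array([[0.0, -d[2], d[1]],
                     [d[2], 0.0, -d[0]],
                     [-d[1], d[0], 0.0]])

def colocate(uav_positions, los_dirs):
    """Sketch of the multi-UAV least-squares target fix.

    Each UAV contributes the rank-2 block skew(d) U = skew(d) S_nj;
    stacking the blocks for N >= 2 UAVs gives an over-determined linear
    system in the 3 unknown target coordinates, solved by least squares.
    """
    A = np.vstack([skew(d) for d in los_dirs])
    b = np.concatenate([skew(d) @ s for d, s in zip(los_dirs, uav_positions)])
    U, *_ = np.linalg.lstsq(A, b, rcond=None)
    return U

# hypothetical check: two UAVs looking at a known point in the navigation frame
target = np.array([100.0, -50.0, 30.0])
uavs = [np.array([0.0, 0.0, 0.0]), np.array([400.0, 0.0, 0.0])]
dirs = [(target - s) / np.linalg.norm(target - s) for s in uavs]
est = colocate(uavs, dirs)
```

Because each UAV contributes a rank-2 block, two UAVs with non-parallel lines of sight already make the 3 unknowns observable, matching the N ≥ 2 condition in the text.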
Step 4: calculating the coordinates of the target in the geodetic coordinate system (CGCS2000)
First, using the coordinates of unmanned aerial vehicle S_1 in the Earth rectangular coordinate system as the reference point, the coordinates of each target in the Earth rectangular coordinate system are calculated.
The Earth rectangular coordinate system O_e X_e Y_e Z_e (the e frame) is specifically defined as: the Z_e axis points to the Conventional Terrestrial Pole (CTP) defined by BIH 1984.0; the X_e axis points to the intersection of the IERS reference meridian plane with the equatorial plane normal to the Z_e axis through the same origin; the Y_e axis completes a right-handed geocentric rectangular coordinate system with the X_e and Z_e axes. The matrix used is the conversion matrix from the navigation coordinate system to the Earth rectangular coordinate system.
Second, based on the target's Earth rectangular coordinates, the longitude, latitude and altitude (λ_Ti, φ_Ti, h_Ti) of target U_Ti in the geodetic coordinate system are obtained by iterative calculation, (i = 1, 2, ..., K).
In order that those skilled in the art will better understand the present invention, the following detailed description of the present invention will be provided with reference to specific examples.
Example 1:
the multi-target cooperative positioning method based on the unmanned aerial vehicle reconnaissance image and the combined navigation information provided by the embodiment of the invention comprises the following steps:
step one, calculating the inner orientation vector of the single unmanned aerial vehicle camera
As shown in fig. 2, the longitudinal and transverse resolutions of unmanned aerial vehicle S_j's optical detection component are known in advance as PxV_max(j) × PxU_max(j), the longitudinal and transverse half field-of-view angles are α_(j)1/2 and β_(j)1/2 respectively, and the position of the principal point in the pixel coordinate system is (u_0(j), v_0(j)); 2 inner-orientation-element constants are defined: F_v(j), F_u(j).
With the coordinates of a target on a single image being (u_i, v_i) (i = 1, 2, ..., K), the inner orientation vector of the single camera is derived as Θ_Tij:
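The defining formula for F_v(j) and F_u(j) is not preserved in this text; one plausible (and purely assumed) definition is the focal length expressed in pixel units, taking each half field-of-view angle to subtend half the corresponding sensor dimension, evaluated here with the Table 1 values:

```python
import math

def inner_orientation_constants(pxv_max, pxu_max, alpha_half_deg, beta_half_deg):
    """Sketch: inner-orientation constants F_v, F_u as the focal length
    in pixel units.  Assumption (the patent's defining formula is lost):
    the half field-of-view angle subtends half the sensor in each
    direction, so f_pix = (pixels / 2) / tan(half angle).
    """
    F_v = (pxv_max / 2) / math.tan(math.radians(alpha_half_deg))
    F_u = (pxu_max / 2) / math.tan(math.radians(beta_half_deg))
    return F_v, F_u

# Table 1 values: alpha_1/2 = 7.2 deg, beta_1/2 = 9 deg, 1024 x 1280 pixels
F_v, F_u = inner_orientation_constants(1024, 1280, 7.2, 9.0)
```

Under this assumption both constants come out close to 4000 pixels, i.e. a narrow-field camera, which is plausible for a reconnaissance payload.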
step two: calculating the direction-finding angles θ_ij, ψ_ij of the target
The relative position between unmanned aerial vehicle S_j and target U_Ti is shown in fig. 1; combining the vector Θ_Tij calculated in step one, the direction-finding angles from unmanned aerial vehicle S_j to ground target U_Ti are obtained as:
step three, establishing the equation set for multi-UAV co-location
The position coordinates of unmanned aerial vehicle S_j in the CGCS2000 coordinate system are given, and its position coordinates in the navigation coordinate system are S_nj = [x_nj, y_nj, z_nj]^T (j = 1, 2, ..., N). Using the conversion relation between geodetic coordinates and Earth rectangular coordinates, the unmanned aerial vehicle's current coordinates in the Earth rectangular coordinate system are obtained.
The coordinates of unmanned aerial vehicle S_1 in the Earth rectangular coordinate system are taken as the reference point, from which the navigation-frame coordinates (x_nj, y_nj, z_nj) of every unmanned aerial vehicle S_j are obtained.
By integrating the formulas, the equation set of the multi-machine co-location can be obtained as follows:
wherein :
S n1 =[0,0,0] T ,S nj =[x nj ,y nj ,z nj ] T ,(j=2,3,...,N)
U Ti =(x Ti ,y Ti ,z Ti ) T ,(i=1,2,...,K)
step four: calculating the coordinates of the target in the geodetic coordinate system (CGCS2000)
The coordinates of the target point in the Earth rectangular coordinate system are:
Solving for the longitude λ_Ti: because the range of the arctangent function arctan is (−90°, 90°) while the range of the longitude λ_Ti is (−180°, 180°), quadrant judgment is required to obtain the actual longitude value λ_Ti.
Solving for the latitude φ_Ti: the geodetic latitude φ_Ti is obtained by iterative calculation; at the solution accuracy required of φ_Ti and N_Ti, 4 iterations yield the final latitude value φ_Ti.
Solving for the altitude h_Ti: the λ_Ti, φ_Ti and h_Ti obtained above, with the angles converted to degrees (°), are the CGCS2000 coordinates of target U_Ti.
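The longitude quadrant judgment and the 4-round latitude iteration match the standard conversion from Earth rectangular to geodetic coordinates; a sketch on the CGCS2000 ellipsoid (using atan2, which resolves the longitude quadrant in one step) is:

```python
import math

# CGCS2000 ellipsoid constants
A_E = 6378137.0              # semi-major axis (m)
F_E = 1 / 298.257222101      # flattening
E2 = F_E * (2 - F_E)         # first eccentricity squared

def ecef_to_geodetic(x, y, z, iterations=4):
    """Sketch: Earth rectangular coordinates -> (longitude, latitude, height).

    atan2 resolves the longitude quadrant directly (the patent performs
    an explicit quadrant judgment on arctan instead); the latitude and
    the prime-vertical radius N are refined by the fixed 4-iteration
    loop the patent prescribes.
    """
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1 - E2))                    # initial guess
    for _ in range(iterations):
        N = A_E / math.sqrt(1 - E2 * math.sin(lat) ** 2)
        lat = math.atan2(z + N * E2 * math.sin(lat), p)  # refine latitude
    N = A_E / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    h = p / math.cos(lat) - N                            # geodetic height
    return math.degrees(lon), math.degrees(lat), h
```

Converting a known geodetic point to Earth rectangular coordinates and back reproduces it to sub-centimetre height accuracy, consistent with the 4-iteration accuracy claim in the text.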
Step five: algorithm simulation verification
2 unmanned aerial vehicles and 17 targets are set for algorithm verification. Unmanned aerial vehicle numbers are 1# and 2# respectively, that is S= { S j |j=1, 2}; the target numbers are targets 1 to 17, i.e. T= { T i I=1, 2,., 17}; the unmanned aerial vehicles 1# and 2# have the same internal azimuth elements, and corresponding target pixel coordinate distribution is shown in fig. 3 and 4 respectively.
Table 1. Inner orientation elements, identical for the unmanned aerial vehicle 1# and 2# reconnaissance cameras
α 1/2 | β 1/2 | PxV max | PxU max | u 0 | v 0 |
7.2 | 9 | 1024 | 1280 | 640 | 512 |
Table 2. Unmanned aerial vehicle 1# shooting-time parameters
Table 3. Unmanned aerial vehicle 1# target pixel coordinates
Table 4. Unmanned aerial vehicle 2# shooting parameters
Table 5. Unmanned aerial vehicle 2# target pixel coordinates
u | -177 | 1324 | -2852 | -928 | 981 | 824 | -1900 | -778 | 758 | 424 | -1184 | -606 | 498 | 96 | -625 | -408 | 191 |
v | 3144 | 4904 | 5178 | 2263 | 2265 | 4318 | 4455 | 2439 | 2434 | 3849 | 3910 | 2641 | 2632 | 3465 | 3485 | 2873 | 2865 |
Table 6. Target direction-finding angle calculation results
Table 7. Multi-target co-location results
Table 8. True target positions
Target name | Longitude (°) | Latitude (°) | Height (m)
Target 1 | 113.2839583 | 34.83596488 | 110.0264829 |
Target 2 | 113.292389 | 34.83827819 | 110.0755245 |
Target 3 | 113.284902 | 34.82665696 | 110.2036142 |
Target 4 | 113.2785748 | 34.83448736 | 110.0493136 |
Target 5 | 113.2833883 | 34.841585 | 110.0013196 |
Target 6 | 113.2898635 | 34.83758529 | 110.0499758 |
Target 7 | 113.2846124 | 34.82951313 | 110.1312968 |
Target 8 | 113.2797296 | 34.83480432 | 110.0408605 |
Target 9 | 113.2835085 | 34.8403996 | 110.0015023 |
Target 10 | 113.2876511 | 34.83697823 | 110.0352257 |
Target 11 | 113.2843637 | 34.83196679 | 110.0818718 |
Target 12 | 113.2809992 | 34.83515277 | 110.0338087 |
Target 13 | 113.283642 | 34.83908388 | 110.0049131 |
Target 14 | 113.2856969 | 34.836442 | 110.0281255 |
Target 15 | 113.2841476 | 34.83409742 | 110.0484755 |
Target 16 | 113.2824014 | 34.83553761 | 110.0287477 |
Target 17 | 113.2837909 | 34.83761509 | 110.0127078 |
Table 9. Deviations of the target position calculation results
Conclusion: unmanned aerial vehicle 1# is 400.86 m from unmanned aerial vehicle 2#, at a flight altitude of 1054.08 m; tables 1-4 give the flight and shooting states. Table 7 gives the position calculation results of the 17 targets. From table 9, the maximum longitude deviation is 1.87E-06°, the maximum latitude deviation is 2.19E-06°, and the maximum altitude deviation is 0.65 m. In summary, the method has clear calculation steps, low complexity and high positioning accuracy.
While the invention has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and substitutions of equivalents may be made without departing from the spirit and scope of the invention.
Claims (3)
1. Multi-target co-location method based on unmanned aerial vehicle reconnaissance images and combined navigation information, wherein the set of unmanned aerial vehicles participating in co-location is S = {S_j | j = 1, 2, ..., N}, and the set of targets is T = {T_i | i = 1, 2, ..., K}, where K is the total number of locatable targets; the method is characterized by comprising the following steps:
step 1: calculating an internal azimuth vector of the single unmanned aerial vehicle camera, namely an internal azimuth vector Θ of the jth unmanned aerial vehicle for positioning the ith target Tij :
In the above, f (j) Is focal length, v i 、u i Is the coordinates of the image point a (i) in the pixel coordinate system, v 0 、u o Is the coordinate of the principal point in the pixel coordinate system, d v(j) 、d u(j) The physical dimensions are the longitudinal and transverse dimensions of the pixel;
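The quantities defined above can be sketched as follows; the claim's exact expression is given only as an equation image, so the standard pinhole-camera form of the interior orientation vector is assumed here, and the function name and sample values are illustrative:

```python
import numpy as np

def interior_orientation_vector(u_i, v_i, u0, v0, du, dv, f):
    """Interior orientation vector of image point a(i), assuming the
    standard pinhole form [(u - u0)*du, (v - v0)*dv, f]^T.

    (u_i, v_i): pixel coordinates of the image point a(i)
    (u0, v0):   pixel coordinates of the principal point
    du, dv:     transverse/longitudinal physical pixel sizes (m)
    f:          focal length (m)
    """
    return np.array([(u_i - u0) * du, (v_i - v0) * dv, f])

# Illustrative values: a 5120x4096 sensor, 3.45 um pixels, 50 mm lens.
theta = interior_orientation_vector(3144, 2263, 2560, 2048, 3.45e-6, 3.45e-6, 0.05)
```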
step 2: calculating the direction-finding angle of the target, namely the high-low direction-finding angle theta when the jth unmanned plane locates the ith target ij And yaw direction finding angle psi ij :
For ground targets: sign (Y) nA(i) -Y ns(i) )=-1;
For an airborne target: sign (Y) nA(i) -Y ns(i) )=1;
Wherein unmanned aerial vehicle S j The coordinates of the imaging center in the navigation coordinate system are (X ns(j) ,Y ns(j) ,Z ns(j) ) The coordinates of the image point a (i) in the imaging coordinate system are (X) sa(i) ,Y sa(i) ,Z sa(i) ) The coordinates of the target point A (i) corresponding to the image point a (i) in the navigation coordinate system areIs unmanned plane S j A conversion matrix from the imaging coordinate system to the navigation coordinate system;
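A minimal sketch of extracting the two direction-finding angles from a line-of-sight vector in the navigation frame; the patent's angle formulas are given only as equation images, so an axis convention (X north, Y up, Z east) is assumed here, and all names are illustrative:

```python
import numpy as np

def direction_finding_angles(los_n):
    """Elevation and yaw direction-finding angles from a line-of-sight
    vector in the navigation frame (assumed X north, Y up, Z east)."""
    x, y, z = los_n
    psi = np.arctan2(z, x)                  # yaw direction-finding angle
    theta = np.arctan2(y, np.hypot(x, z))   # elevation direction-finding angle
    return theta, psi

# A ground target lies below the UAV, so Y_nA - Y_ns < 0 (sign = -1)
# and the elevation direction-finding angle is negative.
theta, psi = direction_finding_angles(np.array([100.0, -50.0, 100.0]))
```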
step 3: establishing a multi-machine co-location equation set
wherein :
U_Ti = [x_Ti, y_Ti, z_Ti]^T, (i = 1, 2, ..., K)
S_nj = [x_nj, y_nj, z_nj]^T, j = 2, 3, ..., N, are the positions of the other unmanned aerial vehicles in the navigation coordinate system, calculated with S_n1 as the reference point;
since rank(Φ_ij) = 2, each unmanned aerial vehicle contributes two independent equations, while the position coordinates of the target are 3 unknowns; therefore,
when N ≥ 2, the resulting overdetermined system of equations suffices to solve for the target position; solving by least squares gives the position coordinates of each target in the navigation coordinate system: U_Ti = [x_Ti, y_Ti, z_Ti]^T, i = 1, 2, ..., K, from which the longitude, latitude and altitude of each target can be obtained;
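The least-squares step can be illustrated with a standard bearing-line triangulation sketch. The patent's Φ_ij matrices are given only as equation images, so the equivalent orthogonal-projector formulation is used here (each UAV contributes rank-2 constraints, matching rank(Φ_ij) = 2, so N ≥ 2 overdetermines the 3 unknowns); all names are illustrative:

```python
import numpy as np

def triangulate(positions, bearings):
    """Least-squares intersection of bearing lines from N UAVs.

    positions: (N, 3) UAV positions in the navigation frame
    bearings:  (N, 3) line-of-sight direction vectors toward the target

    Minimizes the sum of squared distances from the target point to
    each bearing line: sum_j P_j U = sum_j P_j S_j, P_j = I - d_j d_j^T.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for s, d in zip(positions, bearings):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the bearing
        A += P                           # rank-2 contribution per UAV
        b += P @ s
    return np.linalg.solve(A, b)

# Two UAVs observing a target at (10, 0, 5) with exact bearings:
S = np.array([[0.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
target = np.array([10.0, 0.0, 5.0])
U = triangulate(S, target - S)   # recovers the target position
```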
step 4: calculating coordinates of a target in a geodetic coordinate system
first, using the coordinates of unmanned aerial vehicle S_1 in the earth coordinate system as the reference point, calculate the coordinates of each target in the earth coordinate system;
wherein the matrix used is the conversion matrix from the navigation coordinate system to the earth rectangular coordinate system;
2. A computer system, comprising: one or more processors, a computer-readable storage medium storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of claim 1.
3. A computer readable storage medium, characterized by storing computer executable instructions that, when executed, are adapted to implement the method of claim 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211604618.XA CN116124094A (en) | 2022-12-13 | 2022-12-13 | Multi-target co-location method based on unmanned aerial vehicle reconnaissance image and combined navigation information |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116124094A true CN116124094A (en) | 2023-05-16 |
Family
ID=86305409
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211604618.XA Pending CN116124094A (en) | 2022-12-13 | 2022-12-13 | Multi-target co-location method based on unmanned aerial vehicle reconnaissance image and combined navigation information |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116124094A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116774142A (en) * | 2023-06-13 | 2023-09-19 | 中国电子产业工程有限公司 | Coordinate conversion method in non-equal-altitude double-machine cross positioning |
CN116774142B (en) * | 2023-06-13 | 2024-03-01 | 中国电子产业工程有限公司 | Coordinate conversion method in non-equal-altitude double-machine cross positioning |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220207824A1 (en) | System and method for determining geo-location(s) in images | |
Zhang et al. | Photogrammetric processing of low‐altitude images acquired by unpiloted aerial vehicles | |
CN107808362A (en) | A kind of image split-joint method combined based on unmanned plane POS information with image SURF features | |
CN104835115A (en) | Imaging method for aerial camera, and system thereof | |
CN103900539B (en) | A kind of aerial cube panoramic imagery object localization method | |
CN106373159A (en) | Simplified unmanned aerial vehicle multi-target location method | |
CN107490364A (en) | A kind of wide-angle tilt is imaged aerial camera object positioning method | |
CN112184786B (en) | Target positioning method based on synthetic vision | |
CN112710311A (en) | Automatic planning method for three-dimensional live-action reconstruction aerial camera points of terrain adaptive unmanned aerial vehicle | |
CN110160503B (en) | Unmanned aerial vehicle landscape matching positioning method considering elevation | |
CN115511956A (en) | Unmanned aerial vehicle imaging positioning method | |
CN116124094A (en) | Multi-target co-location method based on unmanned aerial vehicle reconnaissance image and combined navigation information | |
Liu et al. | A new approach to fast mosaic UAV images | |
Guo et al. | Accurate Calibration of a Self‐Developed Vehicle‐Borne LiDAR Scanning System | |
CN113340272B (en) | Ground target real-time positioning method based on micro-group of unmanned aerial vehicle | |
CN111508028A (en) | Autonomous in-orbit geometric calibration method and system for optical stereo mapping satellite camera | |
CN112750075A (en) | Low-altitude remote sensing image splicing method and device | |
Zhou et al. | Automatic orthorectification and mosaicking of oblique images from a zoom lens aerial camera | |
Lee et al. | Georegistration of airborne hyperspectral image data | |
CN110967021B (en) | Active/passive ranging independent target geographic positioning method for airborne photoelectric system | |
CN107705272A (en) | A kind of high-precision geometric correction method of aerial image | |
CN113781567B (en) | Aerial image target geographic positioning method based on three-dimensional map generation | |
CN115932823A (en) | Aircraft ground target positioning method based on heterogeneous region feature matching | |
Zhang | Photogrammetric processing of low altitude image sequences by unmanned airship | |
Guntel et al. | Accuracy analysis of control point distribution for different terrain types on photogrammetric block |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||