CN107300377B - Target localization method for a rotary-wing UAV on a circling trajectory - Google Patents

Target localization method for a rotary-wing UAV on a circling trajectory

Info

Publication number
CN107300377B
CN107300377B · Application CN201610943473.4A
Authority
CN
China
Prior art keywords
aerial vehicle
image
marker
matrix
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610943473.4A
Other languages
Chinese (zh)
Other versions
CN107300377A (en)
Inventor
邓方
张乐乐
陈杰
邱煌斌
陈文颉
彭志红
白永强
李佳洪
桂鹏
樊欣宇
顾晓丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201610943473.4A priority Critical patent/CN107300377B/en
Publication of CN107300377A publication Critical patent/CN107300377A/en
Application granted granted Critical
Publication of CN107300377B publication Critical patent/CN107300377B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04: Interpretation of pictures

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Medicines Containing Antibodies Or Antigens For Use As Internal Diagnostic Agents (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a target localization method for a rotary-wing UAV flying on a circling trajectory. A single camera mounted on the UAV captures images of the target area and transmits them back to the ground station. A marker with distinctive features is selected and identified visually. The rotary-wing UAV then circles around the marker and performs multi-point image measurements, and the height of the UAV above the terrain of the target and the heading deviation are computed by iterating between a binocular vision model and a linear regression model. The operator can then select any static or moving target within the camera's field of view and obtain its accurate three-dimensional position. The whole procedure is completed within a single flight mission: the first part of the flight computes the heading deviation and the relative height, and the latter part performs the precise three-dimensional localization. The method solves the problem that conventional triangulation cannot determine the relative height on a circling trajectory, and thereby achieves three-dimensional localization of the target.

Description

Target localization method for a rotary-wing UAV on a circling trajectory
Technical field
The invention belongs to the field of vision measurement, and in particular relates to a target localization method for a rotary-wing UAV flying on a circling trajectory.
Background art
Rotary-wing UAVs are low-cost and capable of vertical take-off and landing (VTOL) and hovering, and are therefore widely used in reconnaissance, agricultural insurance, environmental protection, post-disaster rescue and other fields.
Vision-based target localization for rotary-wing UAVs is currently an active research topic. To localize a target in three dimensions with a vision method, the relative height between the UAV and the target must first be determined by triangulation; only then can the target be positioned. However, the low-precision AHRS (attitude and heading reference system) carried by a rotary-wing UAV introduces a large heading deviation, so when vision measurements are made from images shot by the UAV, the rays projected from the images are offset. With conventional triangulation, the two bundles of rays projected from the left and right views are both offset, and the solved relative height contains a large error; the relative height between the UAV and the object therefore cannot be computed accurately, and the target cannot be localized effectively in three dimensions.
Summary of the invention
In view of this, the present invention provides a target localization method for a rotary-wing UAV on a circling trajectory. The method estimates the heading deviation and thereby reduces the error in the computed relative height, improving the UAV's ability to localize targets in three dimensions.
Beneficial effects:
(1) The method is designed for rotary-wing UAVs equipped with a low-precision AHRS. It accurately estimates the heading deviation of the AHRS and then computes, on the circling trajectory, the height of the UAV above the terrain of the target, so that the UAV can perform three-dimensional visual localization of the target.
Brief description of the drawings
Fig. 1 is the structure diagram of the rotary-wing UAV target three-dimensional localization system of the invention;
Fig. 2 is the flow chart of the method provided by the invention;
Fig. 3 is a schematic diagram of the rotated-view binocular vision model used in the invention;
Fig. 4 is a schematic diagram of the monocular camera ranging model used in the invention;
Fig. 5 is the flow chart of the iterative process of the method;
Fig. 6 is the fitted curve of the relative height against the heading deviation in the method;
Fig. 7 is the fitted curve of the heading deviation against the relative height in the method;
Fig. 8 shows the localization results of the method.
Detailed description of the embodiments
The present invention will now be described in detail with reference to the accompanying drawings and examples.
To verify the effectiveness of the invention, the following experimental platform was built: a T650 quadrotor UAV and a notebook computer serving as the ground station, with real-time communication between the UAV and the ground station. The system structure is shown in Fig. 1.
The UAV carries a GPS positioning system, an AHRS, an altimeter, a wireless image transmission module and a wireless data transceiver module. The APM flight controller from 3D Robotics is operated in stabilize mode to keep the UAV flying steadily. A camera is mounted at the nose of the UAV with a depression angle β of 45°. Images are returned to the ground station through the wireless image transmission module, while the position, attitude and altitude of the UAV, obtained from the GPS positioning system, the AHRS and the altimeter respectively, are transmitted to the ground station through the wireless data transceiver module.
The ground station runs the UAV vision-localization and related algorithms on the computer and connects to its wireless data transceiver module through a USB interface, enabling two-way communication between the UAV and the ground station.
Based on this experimental platform, the target localization method for a rotary-wing UAV on a circling trajectory, shown in Fig. 2, comprises the following steps:
Step 1: after the system starts, capture images with the camera mounted on the UAV and transmit them back to the ground station;
Step 2: from the returned images, select a stationary object with a clear contour as the marker, and perform visual identification of the marker;
The visual identification of the marker in step 2 proceeds as follows:
the marker is identified with the SIFT algorithm, yielding m feature points P_1, P_2, ..., P_{m-1}, P_m, which are stored as a template; m is an integer.
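As an illustration of this template-building step, a minimal Python sketch using OpenCV's SIFT implementation is given below; the patent does not prescribe a particular SIFT library, and the file name "marker.png" is a hypothetical example.

```python
# Illustrative sketch of step 2: build a SIFT feature template of the marker.
# Assumes opencv-python >= 4.4 so that cv2.SIFT_create() is available.
import cv2

def build_marker_template(marker_image_path):
    """Detect SIFT keypoints/descriptors of the marker and keep them as the template."""
    img = cv2.imread(marker_image_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(img, None)  # the m feature points P_1..P_m
    return keypoints, descriptors

template_kp, template_desc = build_marker_template("marker.png")  # hypothetical file name
```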
Step 3: the rotary-wing UAV circles around the marker and, using the identification result, performs multi-point image measurements of the marker; the height of the UAV above the terrain of the target and the heading deviation are computed by iterating between the binocular vision model and the linear regression model;
The flow chart of step 3 is shown in Fig. 5; the detailed procedure is as follows:
Step 3.1: on the circling trajectory, the UAV measures N images in chronological order using visual identification. For the current i-th image (1 ≤ i ≤ N), features are extracted with the SIFT algorithm and matched against the feature points of the template, giving w matched points P_1, P_2, ..., P_{w-1}, P_w (w ≤ m). The geometric center P_f (f ≤ w) of the matched points is taken as the pixel position of the marker in the image. The measurements at the i-th image are then recorded, comprising the position of the UAV shooting point O_i in the inertial frame {I} and its attitude (ψ_i, θ_i, φ_i), where ψ_i, θ_i and φ_i are the heading, pitch and roll angles respectively.
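A companion sketch of how the marker's pixel position could be obtained for each image during the circling, under the same OpenCV assumption; the Lowe ratio-test threshold of 0.75 is an illustrative choice, not a value given in the patent.

```python
# Illustrative sketch of step 3.1: match the stored template against the current image
# and take the geometric center of the matched points as the marker's pixel position.
import numpy as np
import cv2

def marker_pixel_position(frame_gray, template_desc):
    sift = cv2.SIFT_create()
    kp, desc = sift.detectAndCompute(frame_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(template_desc, desc, k=2)
    # Lowe ratio test (0.75 is a common, illustrative threshold).
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
    if not good:
        return None                                    # marker not found in this frame
    pts = np.array([kp[m.trainIdx].pt for m in good])  # the w matched points
    return pts.mean(axis=0)                            # geometric center P_f = (x_f, y_f)
```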
Step 3.2: select any two of the N images, giving n image pairs in total. In each pair, the image measured earlier serves as the left view L and the image measured later as the right view R, forming the rotated-view binocular vision model shown in Fig. 3.
The relative height h_j (1 ≤ j ≤ n) of the UAV with respect to the marker is computed for each pair according to formula (1),
where P_l and P_r are the pixel positions of the marker in the left and right views, and R_l, T_l are the rotation matrix and translation matrix of the left-view shooting point O_l relative to the inertial coordinate system.
Here ψ_l, θ_l, φ_l are the heading, pitch and roll angles at the left-view shooting point O_l, and ψ_r, θ_r, φ_r are the heading, pitch and roll angles at the right-view shooting point O_r; δψ is the heading deviation, with ψ_l = ψ_i - δψ(k), where k is the iteration index, θ_l = θ_i, φ_l = φ_i (1 ≤ i < N), and the initial value is δψ(0) = 0.
R_r, T_r are the rotation matrix and translation matrix of the right-view shooting point O_r relative to the inertial coordinate system,
with ψ_r = ψ_m - δψ(k), θ_r = θ_m, φ_r = φ_m (i < m ≤ N).
The coordinates of the left-view and right-view shooting points in the inertial coordinate system are O_l and O_r respectively. R and T are the rotation matrix and translation matrix of the right-view camera coordinate system relative to the left-view camera coordinate system: R = R_r R_l^T, T = T_l - R^T T_r = R_l(O_r - O_l), and M = [P_l  -R^T P_r  P_l × R^T P_r]^{-1} T.
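Formula (1) itself appears in the patent only as an image. The sketch below shows one plausible numpy implementation of the rotated-view triangulation using the quantities R = R_r R_l^T, T = R_l(O_r - O_l) and M = [P_l  -R^T P_r  P_l × R^T P_r]^{-1} T defined above; the Z-Y-X Euler convention, the handling of the fixed camera mounting and the midpoint construction are assumptions, since the exact conventions of the patent are not reproduced here.

```python
# Illustrative sketch of step 3.2: relative height of the UAV above the marker from one pair.
import numpy as np

def euler_to_Cbn(yaw, pitch, roll):
    """Body-to-inertial rotation for a Z-Y-X (yaw-pitch-roll) Euler sequence (assumed convention)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def relative_height(P_l, P_r, euler_l, euler_r, O_l, O_r, d_psi):
    """P_l, P_r: ray direction vectors of the marker in the left/right body frames
       (the fixed 45-degree camera mounting is assumed already folded into these rays);
       euler_*: (psi, theta, phi) from the AHRS; d_psi: current heading-deviation estimate
       delta_psi(k); O_l, O_r: shooting-point positions in the inertial frame {I}."""
    P_l, P_r = np.asarray(P_l, float), np.asarray(P_r, float)
    R_l = euler_to_Cbn(euler_l[0] - d_psi, euler_l[1], euler_l[2]).T  # inertial -> left body
    R_r = euler_to_Cbn(euler_r[0] - d_psi, euler_r[1], euler_r[2]).T  # inertial -> right body
    R = R_r @ R_l.T                      # left body -> right body
    T = R_l @ (np.asarray(O_r, float) - np.asarray(O_l, float))      # baseline in the left frame
    ray_r_in_l = R.T @ P_r               # right-view ray expressed in the left frame
    cross = np.cross(P_l, ray_r_in_l)
    a, b, c = np.linalg.solve(np.column_stack([P_l, -ray_r_in_l, cross]), T)  # M = [...]^-1 T
    X_left = a * P_l + 0.5 * c * cross   # midpoint of the common perpendicular (left frame)
    X_inertial = R_l.T @ X_left          # marker position relative to O_l in the inertial frame
    return abs(X_inertial[2])            # vertical component = relative height h_j
```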
Step 3.3: for the n computed relative heights h_j, reject gross errors with the 3σ criterion and then take the mean over the remaining values as the relative height for the current iteration.
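A short numpy sketch of the 3σ gross-error rejection and averaging of step 3.3, for illustration:

```python
# Illustrative sketch of step 3.3: reject gross errors with the 3-sigma criterion, then average.
import numpy as np

def mean_relative_height(heights):
    h = np.asarray(heights, dtype=float)            # the n pairwise relative heights h_j
    mu, sigma = h.mean(), h.std()
    kept = h[np.abs(h - mu) <= 3.0 * sigma] if sigma > 0 else h
    return kept.mean()                              # mean relative height for this iteration
```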
Step 3.4: with the mean relative height obtained, compute the heading deviation δψ(k) from the N per-image measurements recorded during the circling, using the linear regression model below;
In general, [x y z]^T and [x_p y_p z_p]^T denote the coordinates of the UAV and of the object in the inertial coordinate system {I}, (x_f', y_f') is the pixel position of the object in the image, and f is the focal length of the camera. The ranging model of the camera is given by formula (4),
in which the attitude matrix of the UAV appears,
where h' is the relative height between the UAV and the object and (ψ', θ', φ') are the heading, pitch and roll angles of the UAV at a given measurement point. The pitch angle θ' and roll angle φ' are measured with high accuracy and their errors can be neglected, whereas the measurement of the heading angle ψ' contains a large heading deviation.
In this embodiment, to compute the heading deviation, the N measurements of the marker taken by the UAV at different positions are used and the deviation is solved by a linear regression method. The computation is as follows: [x_G y_G z_G]^T denotes the coordinates of the marker in the inertial coordinate system {I}; let [x_p y_p z_p]^T = [x_G y_G z_G]^T and let h' equal the mean relative height between the UAV and the marker; substituting into formula (4) gives formula (5).
Set the parameter θ = [θ_a, θ_b]^T with θ_a = [x_G, y_G]^T and θ_b = δψ(k). The measurement equations for position and attitude are formulas (6) and (7) respectively:
z_1(i) = y_1(i) + v_1,  v_1 ~ N(0, R_1)   (6)
z_2(i) = y_2(i) + v_2,  v_2 ~ N(0, R_2)   (7)
where v_1, v_2 are measurement noises and R_1, R_2 are real symmetric positive-definite matrices. Formula (5) is then rewritten as formula (8),
in which the attitude error term appears; applying a first-order Taylor expansion, formula (8) becomes formula (9).
Combining formula (8) and formula (9) gives formula (10).
Define the matrix A_i, in which a_{1,3} to a_{2,5} denote the corresponding elements, and the matrix B_i, in which b_{1,1} to b_{2,3} denote the corresponding elements. In this embodiment, N visual measurements of the same marker are made, so the corresponding matrices are A_1, ..., A_N and B_1, ..., B_N, and the following linear regression model, formula (11), is obtained from these measurements,
where I_2 is the 2 × 2 identity matrix, the noise is V ~ N(0, R), and R is the corresponding covariance matrix.
The estimate of the parameter θ is given by formula (12), from which the heading deviation δψ(k) is obtained.
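Formulas (11) and (12) are likewise only available as images. As an illustration of the final solve, the sketch below computes a generalized least-squares estimate under the assumption that the stacked regression has the standard form Z = Hθ + V with V ~ N(0, R); how H, Z and R are assembled from A_1, ..., A_N and B_1, ..., B_N is defined by formula (11) and is not reproduced here.

```python
# Illustrative sketch of step 3.4 (final solve): weighted least-squares estimate of
# theta = [x_G, y_G, delta_psi], assuming the stacked model Z = H theta + V, V ~ N(0, R).
import numpy as np

def solve_regression(H, Z, R):
    """Generalized least squares: theta_hat = (H^T R^-1 H)^-1 H^T R^-1 Z."""
    R_inv = np.linalg.inv(R)
    theta_hat = np.linalg.solve(H.T @ R_inv @ H, H.T @ R_inv @ Z)
    x_G, y_G, d_psi = theta_hat            # d_psi is the heading deviation delta_psi(k)
    return x_G, y_G, d_psi
```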
Step 3.5: let e be a constant threshold. If |δψ(k) - δψ(k-1)| < e, take the current values as the final estimates of the relative height and of the heading deviation and execute step 4; otherwise, return to step 3.2, substitute the current δψ(k) into the expressions for the left- and right-view heading angles, recompute the relative height, and continue the iteration.
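The alternating loop of steps 3.2 to 3.5 can then be sketched as follows; compute_mean_height and compute_heading_deviation are placeholders for the computations outlined above, and max_iter is a safety cap that the patent does not specify.

```python
# Illustrative sketch of the outer iteration (steps 3.2-3.5): alternate between the
# binocular-vision relative height and the regression-based heading deviation.
def iterate_height_and_heading(measurements, compute_mean_height, compute_heading_deviation,
                               e=0.02, max_iter=50):
    """measurements: per-image data from step 3.1 (pixel position, position, attitude);
       compute_mean_height(measurements, d_psi): steps 3.2-3.3 (pairwise heights, 3-sigma, mean);
       compute_heading_deviation(measurements, h_bar): step 3.4 (linear regression);
       e: convergence threshold of step 3.5; max_iter is a safety cap not in the patent."""
    d_psi_prev = 0.0                                            # delta_psi(0) = 0
    h_bar, d_psi = None, d_psi_prev
    for _ in range(max_iter):
        h_bar = compute_mean_height(measurements, d_psi_prev)   # steps 3.2-3.3
        d_psi = compute_heading_deviation(measurements, h_bar)  # step 3.4
        if abs(d_psi - d_psi_prev) < e:                         # step 3.5 convergence test
            break
        d_psi_prev = d_psi
    return h_bar, d_psi      # final estimates of relative height and heading deviation
```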
Step 4: with valid estimates of the relative height and the heading deviation, select any target within the camera's field of view and obtain its measurement; compute the true heading of the UAV using the estimated heading deviation, and then use the true heading and the height estimate to localize the target accurately in three dimensions.
Specifically, the selected target is assumed to lie in the same plane as the marker, so the estimated relative height can be regarded as the relative height between the UAV and the target. Let [x_t y_t z_t]^T denote the coordinates of the target in the inertial coordinate system {I}; set [x_p y_p z_p]^T = [x_t y_t z_t]^T and let h' equal the estimated relative height; substituting the target's measurement and the UAV's true heading into formula (4) yields the coordinates of the target, thereby achieving its three-dimensional localization.
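Formula (4) is also only available as an image; the sketch below illustrates the general idea of step 4 under a common flat-ground pinhole assumption: the pixel ray of the target is rotated into the inertial frame with the corrected heading and scaled so that its vertical drop equals the estimated relative height. This is a sketch of a standard approach, not necessarily the exact form of formula (4); the axis conventions, the principal point and the cam_to_body mounting rotation are assumptions.

```python
# Illustrative sketch of step 4: locate a ground target from one image, the corrected
# heading and the estimated relative height (flat ground: target and marker coplanar).
import numpy as np

def locate_target(pixel, f, uav_pos, yaw, pitch, roll, d_psi_hat, h_hat,
                  principal_point=(0.0, 0.0), cam_to_body=np.eye(3)):
    """pixel: (x_f', y_f') of the target; f: focal length in pixels; uav_pos: UAV position in {I};
       d_psi_hat: estimated heading deviation; h_hat: estimated relative height;
       cam_to_body: fixed camera-mounting rotation (e.g. the 45-degree depression), assumed known."""
    ray_cam = np.array([pixel[0] - principal_point[0],
                        pixel[1] - principal_point[1],
                        f], dtype=float)                       # pinhole ray in the camera frame
    # Body-to-inertial rotation built with the corrected (true) heading psi - delta_psi_hat.
    cy, sy = np.cos(yaw - d_psi_hat), np.sin(yaw - d_psi_hat)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    ray_inertial = Rz @ Ry @ Rx @ cam_to_body @ ray_cam
    scale = -h_hat / ray_inertial[2]       # scale so the ray descends by h_hat (z axis up)
    return np.asarray(uav_pos, dtype=float) + scale * ray_inertial  # [x_t, y_t, z_t] in {I}
```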
The validity of the iterative process is illustrated below, taking a circular circling trajectory as an example, with radius R = 73 m and arc rad = 1.5π. For δψ = 0, 1, ..., 59, 60 deg, the corresponding 61 groups of mean relative height are obtained from formula (1). The data are then fitted with the mean relative height as the dependent variable and δψ as the independent variable, giving the relationship shown in Fig. 6 and its fitted expression.
In the same manner, letting h' equal the mean relative height and solving with formula (10) gives 36 groups of δψ. The data are then fitted with δψ as the dependent variable and the mean relative height as the independent variable, giving the relationship shown in Fig. 7 and its fitted expression.
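The text does not detail how the fitted expressions behind Fig. 6 and Fig. 7 were obtained; a simple least-squares polynomial fit such as the following could reproduce the procedure, the cubic degree being an illustrative choice.

```python
# Illustrative sketch of the data fitting behind Fig. 6 / Fig. 7: fit the mean relative
# height as a function of delta_psi (and vice versa) from sampled pairs.
import numpy as np

def fit_relation(x, y, degree=3):
    """Least-squares polynomial fit y ~ p(x); returns a callable polynomial."""
    coeffs = np.polyfit(np.asarray(x, float), np.asarray(y, float), degree)
    return np.poly1d(coeffs)

# Example usage with hypothetical sample arrays:
# h_of_dpsi = fit_relation(d_psi_samples, h_bar_samples)   # Fig. 6 relation
# dpsi_of_h = fit_relation(h_bar_samples, d_psi_samples)   # Fig. 7 relation
```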
Let e_h and e_δψ = δψ - δψ_t denote the errors in the relative height and the heading deviation, where h_t and δψ_t are the corresponding true values. From the fitted relationship and formula (9), formula (15) is obtained;
from formula (10), formula (16) is obtained,
where k_1 and k_2 are the associated parameters.
Substituting the relative height computed by the binocular vision model into the linear regression equation yields an effective estimate of the heading deviation; substituting that estimate back into the binocular vision model then yields an accurate relative height. In general, the heading deviation of an AHRS does not exceed 30 deg, so |k_2| > k_1 > 0, and since k_2 < 0 it follows from formulas (15) and (16) that, after finitely many iterations, the relative-height estimate and the heading-deviation estimate converge to their true values.
A flight test was carried out with the camera-equipped UAV, which performed image measurements of the marker on the circling trajectory. The trajectory radius was R = 73 m with arc rad = 1.5π, the true height of the UAV relative to the marker was h_t = 45 m, the flight speed was V = 3.44 m/s, f_GPS = 4 Hz, the true heading deviation was δψ_t = 30 deg, and the threshold was set to e = 0.02 deg. The results of the method are listed in Table 1 and Table 2 and shown in Fig. 8; the errors e_h, e_δψ, e_xy and e_z listed in the tables are all root-mean-square errors.
Table 1. Iterative process
Table 2. Target localization results
Index                                        Three-dimensional localization of the invention
Relative-height estimation error e_h / m     0.93
Heading estimation error e_δψ / deg          1.89
Localization error e_xy / m                  10.89
Localization error e_z / m                   0.43
In conclusion the above is merely preferred embodiments of the present invention, being not intended to limit protection model of the invention It encloses.All within the spirits and principles of the present invention, any modification, equivalent replacement, improvement and so on should be included in this hair Within bright protection scope.

Claims (4)

1. A target localization method for a rotary-wing UAV on a circling trajectory, characterized by:
Step 1: extracting a stationary object from the images captured by the rotary-wing UAV as the marker;
Step 2: circling the rotary-wing UAV around the marker, photographing the marker from N angles during the circling, and obtaining the measurements of each captured image, N being a positive integer;
Step 3: grouping the N captured images in pairs, computing for each group the relative height of the rotary-wing UAV with respect to the marker using the rotated-view binocular vision model, and then averaging over the groups to obtain the relative height for the current iteration round k;
wherein, when computing the relative height, the heading deviation δψ(k) required by the rotated-view binocular vision model uses the heading deviation δψ(k-1) obtained in the previous iteration round k-1, and the heading angle ψ(k) required by the binocular vision model is updated as ψ(k) = ψ_i - δψ(k), where ψ_i is the heading angle of the UAV when capturing the i-th image; the initial value δψ(0) is 0;
Step 4: computing the heading deviation δψ(k) of the current iteration round k using the relative height and the measurements of the N captured images;
Step 5: judging whether the difference between δψ(k) and δψ(k-1) is smaller than a set threshold; if so, taking the result of the last iteration as the relative-height estimate and the heading-deviation estimate and executing step 6; otherwise, incrementing the iteration round k by 1 and returning to step 3;
Step 6: for an arbitrary target within the camera's field of view of the rotary-wing UAV, computing the true heading of the UAV using the heading-deviation estimate, and then achieving three-dimensional localization of the target according to the true heading and the height estimate;
wherein, in step 4, the heading deviation δψ(k) is computed as follows:
[x_G y_G z_G]^T denotes the coordinates of the marker in the inertial coordinate system {I}, which gives the following relation;
the ranging model of the camera is:
where f is the focal length of the camera and the attitude matrix is that of the UAV when capturing the i-th image,
1 ≤ i ≤ N;
where the position of the UAV shooting point O_i in the inertial frame {I} and its attitude (ψ_i, θ_i, φ_i) are those recorded when capturing the i-th image, ψ_i, θ_i and φ_i being the heading, pitch and roll angles respectively, and the pixel position of the marker in the i-th image is used;
setting the parameter θ = [θ_a, θ_b]^T with θ_a = [x_G, y_G]^T and θ_b = δψ(k), the measurement equation group is
where v_1, v_2 are measurement noises and R_1, R_2 are real symmetric positive-definite matrices; formula (3) is then rewritten as
where the attitude error term appears; applying a Taylor expansion, formula (4) becomes
combining formula (4) and formula (5) gives
the matrix A_i is defined, in which a_{1,3} to a_{2,5} denote the corresponding elements;
the matrix B_i is defined, in which b_{1,1} to b_{2,3} denote the corresponding elements; the following linear regression model is obtained,
where I_2 is the 2 × 2 identity matrix and the noise is V ~ N(0, R);
the covariance matrix is
the estimate of the parameter θ is
and the heading deviation δψ(k) is solved from formula (8).
2. The rotary-wing UAV target localization method of claim 1, characterized in that the relative height h_j (1 ≤ j ≤ n) of the rotary-wing UAV with respect to the marker is computed as:
where T = T_l - R^T T_r = R_l(O_r - O_l), M = [P_l  -R^T P_r  P_l × R^T P_r]^{-1} T, and R = R_r R_l^T; P_l and P_r are the pixel positions of the marker in the left and right views respectively; R and T are the rotation matrix and translation matrix of the right-view camera coordinate system relative to the left-view camera coordinate system; R_l, T_l are the rotation matrix and translation matrix of the left-view shooting point O_l relative to the inertial reference coordinate system; and R_r, T_r are the rotation matrix and translation matrix of the right-view shooting point O_r relative to the inertial reference coordinate system.
3. The rotary-wing UAV target localization method of claim 2, characterized in that the pixel position of the marker is measured as follows:
the marker image obtained in step 1 is identified to obtain a number of feature points; each image captured during the circling is likewise identified to obtain a number of feature points; the feature points of each image are matched against those of the marker image, and the geometric center of the matched points is taken as the pixel position of the marker in that image.
4. The rotary-wing UAV target localization method of claim 1, characterized in that, in step 3, gross errors in the relative heights are first rejected using the 3σ criterion before the averaging is performed.
CN201610943473.4A 2016-11-01 2016-11-01 Target localization method for a rotary-wing UAV on a circling trajectory Active CN107300377B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610943473.4A CN107300377B (en) 2016-11-01 2016-11-01 Target localization method for a rotary-wing UAV on a circling trajectory

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610943473.4A CN107300377B (en) 2016-11-01 2016-11-01 Target localization method for a rotary-wing UAV on a circling trajectory

Publications (2)

Publication Number Publication Date
CN107300377A CN107300377A (en) 2017-10-27
CN107300377B true CN107300377B (en) 2019-06-14

Family

ID=60138055

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610943473.4A Active CN107300377B (en) 2016-11-01 2016-11-01 Target localization method for a rotary-wing UAV on a circling trajectory

Country Status (1)

Country Link
CN (1) CN107300377B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109708622A (en) * 2017-12-15 2019-05-03 福建工程学院 The method that three-dimensional modeling is carried out to building using unmanned plane based on Pixhawk
CN110621962A (en) * 2018-02-28 2019-12-27 深圳市大疆创新科技有限公司 Positioning method of movable platform and related device and system
CN110799921A (en) * 2018-07-18 2020-02-14 深圳市大疆创新科技有限公司 Shooting method and device and unmanned aerial vehicle
CN112567201B (en) * 2018-08-21 2024-04-16 深圳市大疆创新科技有限公司 Distance measuring method and device
CN110632941B (en) * 2019-09-25 2020-12-15 北京理工大学 Trajectory generation method for target tracking of unmanned aerial vehicle in complex environment
CN110675453B (en) * 2019-10-16 2021-04-13 北京天睿空间科技股份有限公司 Self-positioning method for moving target in known scene
CN110824295B (en) * 2019-10-22 2021-08-31 广东电网有限责任公司 Infrared thermal image fault positioning method based on three-dimensional graph
CN113469139B (en) * 2021-07-30 2022-04-05 广州中科智云科技有限公司 Data security transmission method and system for unmanned aerial vehicle edge side embedded AI chip
CN115272892B (en) * 2022-07-29 2023-07-11 同济大学 Unmanned aerial vehicle positioning deviation monitoring management and control system based on data analysis
CN117452831B (en) * 2023-12-26 2024-03-19 南京信息工程大学 Four-rotor unmanned aerial vehicle control method, device, system and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10221072A (en) * 1997-02-03 1998-08-21 Asahi Optical Co Ltd System and method for photogrammetry
JP2003083744A (en) * 2001-09-12 2003-03-19 Starlabo Corp Imaging apparatus mounted to aircraft, and aircraft imaging data processing apparatus
CN102519434A (en) * 2011-12-08 2012-06-27 北京控制工程研究所 Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data
CN105424006A (en) * 2015-11-02 2016-03-23 国网山东省电力公司电力科学研究院 Unmanned aerial vehicle hovering precision measurement method based on binocular vision

Also Published As

Publication number Publication date
CN107300377A (en) 2017-10-27

Similar Documents

Publication Publication Date Title
CN107300377B (en) Target localization method for a rotary-wing UAV on a circling trajectory
CN106153008B (en) A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model
EP3454008A1 (en) Survey data processing device, survey data processing method, and survey data processing program
CN109945856A (en) Based on inertia/radar unmanned plane autonomous positioning and build drawing method
CN106155081B (en) A kind of a wide range of target monitoring of rotor wing unmanned aerial vehicle and accurate positioning method
CN107741234A (en) The offline map structuring and localization method of a kind of view-based access control model
CN107146256B (en) Camera marking method under outfield large viewing field condition based on differential global positioning system
RU2550811C1 (en) Method and device for object coordinates determination
CN105698762A (en) Rapid target positioning method based on observation points at different time on single airplane flight path
CN102778224B (en) Method for aerophotogrammetric bundle adjustment based on parameterization of polar coordinates
CN109425348A (en) A kind of while positioning and the method and apparatus for building figure
CN110220491A (en) A kind of optics gondola fix error angle evaluation method of unmanned plane
CN110889873A (en) Target positioning method and device, electronic equipment and storage medium
CN108426576A (en) Aircraft paths planning method and system based on identification point vision guided navigation and SINS
CN111208526B (en) Multi-unmanned aerial vehicle cooperative positioning method based on laser radar and positioning vector matching
US9816786B2 (en) Method for automatically generating a three-dimensional reference model as terrain information for an imaging device
CN109839945A (en) Unmanned plane landing method, unmanned plane landing-gear and computer readable storage medium
CN105389819B (en) A kind of lower visible image method for correcting polar line of half calibration and system of robust
Han et al. Multiple targets geolocation using SIFT and stereo vision on airborne video sequences
CN109764864A (en) A kind of indoor UAV position and orientation acquisition methods and system based on color identification
CN114821372A (en) Monocular vision-based method for measuring relative pose of individuals in unmanned aerial vehicle formation
CN112489118B (en) Method for quickly calibrating external parameters of airborne sensor group of unmanned aerial vehicle
CN115144879A (en) Multi-machine multi-target dynamic positioning system and method
Kupervasser et al. Robust positioning of drones for land use monitoring in strong terrain relief using vision-based navigation
Cheng et al. High precision passive target localization based on airborne electro-optical payload

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant