CN107300377A - Target localization method for a rotor UAV on a circling trajectory - Google Patents

Target localization method for a rotor UAV on a circling trajectory

Info

Publication number
CN107300377A
CN107300377A (application CN201610943473.4A)
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610943473.4A
Other languages
Chinese (zh)
Other versions
CN107300377B (en)
Inventor
邓方
张乐乐
陈杰
邱煌斌
陈文颉
彭志红
白永强
李佳洪
桂鹏
樊欣宇
顾晓丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201610943473.4A priority Critical patent/CN107300377B/en
Publication of CN107300377A publication Critical patent/CN107300377A/en
Application granted granted Critical
Publication of CN107300377B publication Critical patent/CN107300377B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention discloses a target localization method for a rotor UAV on a circling trajectory. A single camera mounted on the UAV captures images of the scene and transmits them back to the ground station. A landmark with distinctive features is selected and identified visually. The rotor UAV then circles around this landmark and performs multi-point image measurements; a method based on the mutual iteration of a binocular vision model and a linear regression model computes the height of the UAV relative to the terrain where the target lies and the heading deviation of its attitude sensor. The operator can then select any stationary or moving target in the camera's field of view and obtain an accurate three-dimensional position for it. The method requires only a single flight: the first part of the flight computes the heading deviation and the relative height, and the second part performs accurate three-dimensional localization. The invention solves the problem that, on a circling trajectory, the traditional triangulation method cannot compute the relative height, and thereby achieves three-dimensional localization of the target.

Description

Target localization method for a rotor UAV on a circling trajectory
Technical field
The invention belongs to the field of vision measurement, and in particular relates to a target localization method for a rotor UAV on a circling trajectory.
Background technology
Thanks to their low cost and their ability to take off and land vertically and to hover, rotor UAVs are widely applied in reconnaissance, agricultural insurance, environmental protection, post-disaster rescue, and other fields.
Vision-based target localization for rotor UAVs is one of the current research hotspots. To localize a target in three dimensions by visual methods, the relative height between the UAV and the target must first be determined by triangulation; only then can the target be positioned. However, the low-precision AHRS (attitude and heading reference system) carried by a rotor UAV introduces a large heading deviation, so when vision measurements are made from the images shot by the UAV, the rays projected from the images are offset by a certain amount. With the traditional triangulation method, the two sets of rays projected from the left and right views are both offset, so the computed relative height contains a large error. The relative height between the UAV and the object therefore cannot be computed accurately, and effective three-dimensional localization of the target is impossible.
Summary of the invention
In view of this, the present invention provides a target localization method for a rotor UAV on a circling trajectory that can compute the heading deviation and reduce the error in the computed relative height, thereby improving the rotor UAV's ability to localize targets in three dimensions.
Beneficial effect:
(1) The method provided by the present invention targets rotor UAVs equipped with a low-precision AHRS. It can accurately compute the heading deviation of the AHRS and, from it, the height between the rotor UAV on its circling trajectory and the terrain where the target lies, thereby enabling three-dimensional visual localization of the target by the rotor UAV.
Brief description of the drawings
Fig. 1 is the structure diagram of the rotor UAV three-dimensional target localization system of the present invention;
Fig. 2 is the flow chart of the method provided by the present invention;
Fig. 3 is a schematic diagram of the rotated-view binocular vision model used in the present invention;
Fig. 4 is a schematic diagram of the monocular camera ranging model used in the present invention;
Fig. 5 is the flow chart of the iterative process in the method provided by the present invention;
Fig. 6 is the fitted curve of the h̄-δψ data in the method provided by the present invention;
Fig. 7 is the fitted curve of the δψ-h̄ data in the method provided by the present invention;
Fig. 8 shows the localization results of the method provided by the present invention.
Embodiment
The present invention will now be described in detail with reference to the accompanying drawings and examples.
The following experimental platform is built to verify the effectiveness of the invention: one T650 quadrotor UAV and one notebook computer serving as the ground station, with real-time communication between the UAV and the ground station. The system architecture is shown in Fig. 1.
The UAV carries a GPS positioning system, an AHRS, an altimeter, a wireless image transmission module, and a wireless data transceiver module. The APM flight controller from 3D Robotics operates in stabilize mode to guarantee stable flight. A camera is installed at the nose of the UAV with a depression angle β of 45°; it returns images to the ground station through the wireless image transmission module, and the position, attitude, and height of the UAV, obtained from the GPS positioning system, the AHRS, and the altimeter respectively, are transferred to the ground station through the wireless data transceiver module.
The ground station is a computer running the UAV vision localization algorithms; it connects to the wireless data transceiver module over a USB interface, so that the UAV and the ground station can communicate with each other.
Based on this experimental platform, and as shown in Fig. 2, the target localization method for a rotor UAV on a circling trajectory comprises the following steps:
Step 1: after the system starts, capture images with the camera mounted on the UAV and transmit the images back to the ground station;
Step 2: select a stationary object with a clear outline from the returned images as the landmark, and perform visual identification on the landmark;
The detailed process of the visual identification of the landmark in step 2 is as follows:
the landmark is identified with the SIFT algorithm, yielding m feature points P_1, P_2, ..., P_{m-1}, P_m, which are stored as the template; m is an integer;
Step 3: the rotor UAV circles around the landmark and performs multi-point image measurements of the landmark using the identification result; a method based on the mutual iteration of the binocular vision model and the linear regression model computes the height of the UAV relative to the terrain where the target lies and the heading deviation;
The flow chart of step 3 is shown in Fig. 5; the detailed process is as follows:
Step 3.1: while circling, the rotor UAV measures the N images in chronological order using visual identification. For the current image i (1 ≤ i ≤ N), features are extracted with the SIFT algorithm and matched against the feature points of the template, yielding w groups of matched points P_1, P_2, ..., P_{w-1}, P_w (w ≤ m). The geometric center P_f (f ≤ w) of these matched points represents the pixel position of the landmark in the image, denoted (x_f^i, y_f^i). The measured values recorded for the i-th image are: the position (x_o^i, y_o^i, z_o^i) of the UAV shooting point O_i in the inertial reference frame {I} and its attitude (ψ_i, θ_i, φ_i), where ψ_i, θ_i, φ_i are the azimuth (heading) angle, the pitch angle, and the roll angle respectively.
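For illustration, the sketch below implements this template-and-match step with OpenCV's SIFT; cv2.SIFT_create and BFMatcher are standard OpenCV API, but the ratio-test threshold and the helper names are assumptions rather than details from the patent.

```python
import cv2
import numpy as np

def build_template(landmark_img):
    """Extract the SIFT feature points of the landmark once; they form the template."""
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(landmark_img, None)
    return keypoints, descriptors

def landmark_pixel_position(frame, template_descriptors, ratio=0.75):
    """Match the current frame against the template and return the geometric
    center of the matched feature points as the landmark pixel position."""
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(frame, None)
    matcher = cv2.BFMatcher()
    pairs = matcher.knnMatch(template_descriptors, descriptors, k=2)
    # Lowe's ratio test keeps only distinctive matches (the w matched pairs).
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    if not good:
        return None
    points = np.float32([keypoints[m.trainIdx].pt for m in good])
    return points.mean(axis=0)  # geometric center P_f = (x_f, y_f)
```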
Step 3.2: choose any two of the N images; there are n = C_N^2 = N(N-1)/2 such pairs. In each pair, the image measured first serves as the left view L and the image measured later serves as the right view R, together constituting the rotated-view binocular vision model shown in Fig. 3.
Compute the relative height h_j (1 ≤ j ≤ n) of the UAV with respect to the landmark:

$$h_j = \frac{1}{2}\begin{bmatrix} 0 & 0 & 1 \end{bmatrix} R_l^T \left( \left(\begin{bmatrix} 1 & 0 & 0 \end{bmatrix} M\right) P_l + \left(\begin{bmatrix} 0 & 1 & 0 \end{bmatrix} M\right) R^T P_r + T \right) \tag{1}$$
where P_l and P_r are the pixel positions of the landmark in the left and right views respectively, and R_l and T_l are the rotation matrix and translation matrix of the UAV shooting point O_l corresponding to the left view relative to the inertial coordinate system.
Here ψ_l, θ_l, φ_l are the heading angle, pitch angle, and roll angle of the left-view UAV shooting point O_l, and ψ_r, θ_r, φ_r are those of the right-view UAV shooting point O_r; δψ is the heading deviation, with ψ_l = ψ_i - δψ(k), where k is the iteration index, and θ_l = θ_i, φ_l = φ_i (1 ≤ i < N); the initial value is δψ(0) = 0.
R_r and T_r are the rotation matrix and translation matrix of the UAV shooting point O_r corresponding to the right view relative to the inertial coordinate system, where ψ_r = ψ_m - δψ(k), θ_r = θ_m, φ_r = φ_m (i < m ≤ N).
The coordinates of the UAV shooting points corresponding to the left and right views in the inertial coordinate system are O_l and O_r respectively. R and T are the rotation matrix and translation matrix of the camera coordinate system corresponding to the right view relative to the camera coordinate system corresponding to the left view: R = R_r R_l^T, T = T_l - R^T T_r = R_l(O_r - O_l); M = [P_l  -R^T P_r  P_l × R^T P_r]^{-1} T.
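For concreteness, a minimal numpy sketch of equation (1) follows. The Euler-angle convention (taken from the attitude matrix C_b^n given in the claims) and the use of camera-frame rays for P_l, P_r are assumptions; the patent fixes only the algebra of equation (1).

```python
import numpy as np

def attitude_matrix(psi, theta, phi):
    """Direction-cosine matrix C_b^n built from heading psi, pitch theta,
    and roll phi, matching the C_b^n given in the claims."""
    cp, sp = np.cos(psi), np.sin(psi)
    ct, st = np.cos(theta), np.sin(theta)
    cf, sf = np.cos(phi), np.sin(phi)
    return np.array([
        [cp * ct, cp * st * sf - sp * cf, sp * sf + cp * st * cf],
        [sp * ct, cp * cf + sp * st * sf, sp * st * cf - cp * sf],
        [-st,     sf * ct,                cf * ct],
    ])

def relative_height(P_l, P_r, R_l, R_r, O_l, O_r):
    """Equation (1): relative height from one rotated-view image pair.
    P_l, P_r are the landmark rays in the left/right camera frames."""
    R = R_r @ R_l.T                              # right view w.r.t. left view
    T = R_l @ (O_r - O_l)                        # translation between the views
    M = np.linalg.inv(
        np.column_stack([P_l, -R.T @ P_r, np.cross(P_l, R.T @ P_r)])) @ T
    midpoint = M[0] * P_l + M[1] * (R.T @ P_r) + T
    return 0.5 * (R_l.T @ midpoint)[2]           # z-component in frame {I}
```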
Step 3.3: from the n computed relative heights h_j, reject gross errors with the 3σ criterion, then take the mean h̄(k) of the remaining values.
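A small sketch of this step, assuming a single pass of the 3σ rule (the patent does not state whether the rejection is repeated until no sample is discarded):

```python
import numpy as np

def mean_after_3sigma(heights):
    """Step 3.3: reject gross errors outside mean +/- 3*std, then average."""
    h = np.asarray(heights, dtype=float)
    mu, sigma = h.mean(), h.std()
    kept = h[np.abs(h - mu) <= 3.0 * sigma]
    return kept.mean() if kept.size else mu
```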
Step 3.4: having obtained the relative height h̄(k), compute the heading deviation δψ(k) from the N point measurements of step 3.1 using the linear regression model;
In general, let [x y z]^T and [x_p y_p z_p]^T denote the coordinates of the UAV and the object in the inertial coordinate system {I}, let (x_f′, y_f′) be the pixel position of the object in the image, and let f be the focal length of the camera. The ranging model of the camera is

$$\begin{pmatrix} x_p \\ y_p \end{pmatrix} = \begin{pmatrix} x \\ y \end{pmatrix} + h' \frac{1}{(0,0,1)\, C_b^n \begin{pmatrix} x_f' \\ y_f' \\ f \end{pmatrix}} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} C_b^n \begin{pmatrix} x_f' \\ y_f' \\ f \end{pmatrix} \tag{4}$$

The attitude matrix C_b^n is

$$C_b^n = \begin{bmatrix} \cos\psi'\cos\theta' & \cos\psi'\sin\theta'\sin\phi' - \sin\psi'\cos\phi' & \sin\psi'\sin\phi' + \cos\psi'\sin\theta'\cos\phi' \\ \sin\psi'\cos\theta' & \cos\psi'\cos\phi' + \sin\psi'\sin\theta'\sin\phi' & \sin\psi'\sin\theta'\cos\phi' - \cos\psi'\sin\phi' \\ -\sin\theta' & \sin\phi'\cos\theta' & \cos\phi'\cos\theta' \end{bmatrix}$$

where h′ is the relative height between the UAV and the object and (ψ′, θ′, φ′) are the heading angle, pitch angle, and roll angle of the UAV at the measurement point. The pitch angle θ′ and roll angle φ′ are measured with high precision, so their errors are negligible, whereas the measurement of the heading angle ψ′ carries a large heading deviation.
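To make the ranging model of formula (4) concrete, here is a minimal numpy sketch; treating (x_f′, y_f′, f) as an unnormalized camera ray that C_b^n maps into the inertial frame follows the formula above, while the function and argument names are invented for illustration.

```python
import numpy as np

def ground_intersection(uav_xy, h_rel, C_bn, pixel, f):
    """Formula (4): project pixel (x_f', y_f') to ground coordinates, given
    the UAV horizontal position, relative height h', and attitude C_b^n."""
    ray = C_bn @ np.array([pixel[0], pixel[1], f], dtype=float)
    scale = h_rel / ray[2]              # stretch the ray to the ground plane
    return np.asarray(uav_xy, dtype=float) + scale * ray[:2]
```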
In this embodiment, the heading deviation is computed from the N point measurements of the landmark shot by the UAV at different positions and solved by the linear regression method. The computation is as follows: let [x_G y_G z_G]^T denote the coordinates of the landmark in the inertial coordinate system {I}, set [x_p y_p z_p]^T = [x_G y_G z_G]^T, let h̄(k) be the mean relative height between the UAV and the landmark, and set h′ = h̄(k). Substituting into formula (4) gives

$$\begin{pmatrix} x_G \\ y_G \end{pmatrix} = \begin{pmatrix} x_o^i \\ y_o^i \end{pmatrix} + \bar h(k)\, \frac{1}{(0,0,1)\, C_b^n \begin{pmatrix} x_f^i \\ y_f^i \\ f \end{pmatrix}} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} C_b^n \begin{pmatrix} x_f^i \\ y_f^i \\ f \end{pmatrix} \tag{5}$$
Define the parameter θ = [θ_a, θ_b]^T with θ_a = [x_G, y_G]^T and θ_b = δψ(k). The measurement equations for position and attitude are formulas (6) and (7) respectively:

$$z_1(i) = y_1(i) + v_1, \quad v_1 \sim N(0, R_1) \tag{6}$$

$$C_{bc}^n(i) \approx C_b^n(i) + \delta C_b^n(i)\,(\theta_b + v_2), \quad v_2 \sim N(0, R_2) \tag{7}$$
where v_1 and v_2 are measurement noises and R_1 and R_2 are real symmetric positive-definite matrices. Formula (5) is then rearranged into

$$\theta_a = f\left(z_1(i) - v_1,\; C_{bc}^n(i) - \delta C_b^n(i)(\theta_b + v_2)\right) \tag{8}$$
where δC_b^n(i) is the attitude error matrix. Expanding formula (8) in a Taylor series turns it into

$$\begin{aligned} f\big(z_1(i)-v_1,\; C_{bc}^n(i)-\delta C_b^n(i)(\theta_b+v_2)\big) \approx{}& f\big(z_1(i),\, C_{bc}^n(i)\big) - \left.\frac{\partial f}{\partial y_1}\right|_{z_1(i),\,C_{bc}^n(i)} v_1 \\ &- \left.\frac{\partial f}{\partial \theta_b}\right|_{z_1(i),\,C_{bc}^n(i)} v_2 - \left.\frac{\partial f}{\partial \theta_b}\right|_{z_1(i),\,C_{bc}^n(i)} \theta_b \end{aligned} \tag{9}$$
Combining formulas (8) and (9) gives

$$f\big(z_1(i),\, C_{bc}^n(i)\big) \approx \theta_a + \left.\frac{\partial f}{\partial \theta_b}\right|_{z_1(i),\,C_{bc}^n(i)} \theta_b + \left.\frac{\partial f}{\partial y_1}\right|_{z_1(i),\,C_{bc}^n(i)} v_1 + \left.\frac{\partial f}{\partial \theta_b}\right|_{z_1(i),\,C_{bc}^n(i)} v_2 \tag{10}$$
Define the matrix A_i, whose elements a_{1,3}-a_{2,5} denote the corresponding entries of A_i, and the matrix B_i, whose elements b_{1,1}-b_{2,3} denote the corresponding entries of B_i. In this embodiment, N point vision measurements are made of the same landmark, so the measurements f(z_1(i), C_{bc}^n(i)), i = 1, ..., N, have the corresponding matrices A_1, ..., A_N and B_1, ..., B_N. These measurements yield the following linear regression model:

$$f\big(z_1(i),\, C_{bc}^n(i)\big) = [I_2,\; B_i]\,\theta + V \tag{11}$$

where I_2 is the 2 × 2 identity matrix, the noise is V ~ N(0, R), and the covariance matrix is

$$R = \mathrm{diag}\left(\left\{A_i R_1 A_i^T + B_i R_2 B_i^T\right\}_{i=1}^{N}\right)$$

The estimate of the parameter θ is

$$\hat\theta = \begin{bmatrix} x_G & y_G & \delta\psi(k) \end{bmatrix}^T = \left[\sum_{i=1}^{N} \begin{bmatrix} I_2 \\ B_i^T \end{bmatrix} \big(A_i R_1 A_i^T + B_i R_2 B_i^T\big)^{-1} [I_2,\; B_i]\right]^{-1} \sum_{i=1}^{N} \begin{bmatrix} I_2 \\ B_i^T \end{bmatrix} \big(A_i R_1 A_i^T + B_i R_2 B_i^T\big)^{-1} f\big(z_1(i),\, C_{bc}^n(i)\big) \tag{12}$$

The heading deviation δψ(k) is obtained by solving formula (12).
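Formula (12) is a weighted least-squares estimate pooling the N measurements. Below is a minimal numpy sketch, assuming the Jacobian blocks A_i and B_i (B_i of shape 2×1) and the measurement vectors f_i = f(z_1(i), C_bc^n(i)) have already been formed; their exact construction is not fully recoverable from the patent text.

```python
import numpy as np

def estimate_theta(f_list, A_list, B_list, R1, R2):
    """Formula (12): weighted least squares over the N measurements,
    returning theta = [x_G, y_G, delta_psi(k)]."""
    I2 = np.eye(2)
    H_sum = np.zeros((3, 3))
    g_sum = np.zeros(3)
    for f_i, A_i, B_i in zip(f_list, A_list, B_list):
        Ri = A_i @ R1 @ A_i.T + B_i @ R2 @ B_i.T   # per-point covariance
        W = np.linalg.inv(Ri)
        H_i = np.hstack([I2, B_i])                 # [I_2, B_i], shape (2, 3)
        H_sum += H_i.T @ W @ H_i
        g_sum += H_i.T @ W @ f_i
    return np.linalg.solve(H_sum, g_sum)
```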
Step 3.5: let e be a constant threshold. If |δψ(k) - δψ(k-1)| < e, take the final estimates of the relative height and the heading deviation, ĥ = h̄(k) and δψ̂ = δψ(k), and perform step 4; otherwise return to step 3.2, substitute the current δψ(k) into the formulas for the left- and right-view heading angles, i.e. ψ_l = ψ_i - δψ(k) and ψ_r = ψ_m - δψ(k), and iterate.
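Steps 3.2-3.5 together form the alternating iteration shown in Fig. 5. The sketch below assumes two hypothetical helpers, mean_height (steps 3.2-3.3, built on relative_height and mean_after_3sigma above) and heading_deviation (step 3.4, built on estimate_theta); max_iter is an added safety cap not present in the patent.

```python
def estimate_height_and_heading(measurements, e=0.02, max_iter=50):
    """Steps 3.2-3.5: alternate between the binocular height estimate and the
    regression estimate of the heading deviation until the update is below e."""
    d_psi_prev = 0.0                                     # delta_psi(0) = 0
    for _ in range(max_iter):
        h_bar = mean_height(measurements, d_psi_prev)    # steps 3.2-3.3
        d_psi = heading_deviation(measurements, h_bar)   # step 3.4, formula (12)
        if abs(d_psi - d_psi_prev) < e:                  # step 3.5 stop test
            break
        d_psi_prev = d_psi
    return h_bar, d_psi
```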
Step 4: once the relative height and the heading deviation have been effectively estimated, select any target in the camera's field of view and record its measured values; use the computed heading deviation to obtain the UAV's true heading, and then, from the true heading and the height estimate ĥ, localize the target accurately in three dimensions.
Specifically, the selected target is assumed to lie in the same plane as the landmark, so the relative height ĥ estimated with the landmark can be regarded as the relative height between the UAV and the target. Let [x_t y_t z_t]^T denote the target's coordinates in the inertial coordinate system {I}; set [x_p y_p z_p]^T = [x_t y_t z_t]^T and h′ = ĥ, substitute the target's measured values and the UAV's true heading into formula (4), and compute the target's coordinates, thereby achieving three-dimensional localization of the target.
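Step 4 then reduces to one application of the ranging model with the corrected heading. A sketch reusing attitude_matrix and ground_intersection from the earlier snippets; the sign convention for the vertical coordinate is an assumption.

```python
import numpy as np

def localize_target(uav_xy, psi_meas, theta, phi, d_psi_hat, h_hat, pixel, f):
    """Step 4: correct the measured heading by the estimated deviation and
    project the selected target pixel to the ground at relative height h_hat."""
    psi_true = psi_meas - d_psi_hat               # true heading of the UAV
    C_bn = attitude_matrix(psi_true, theta, phi)
    x_t, y_t = ground_intersection(uav_xy, h_hat, C_bn, pixel, f)
    z_t = -h_hat   # vertical coordinate sign is a frame-convention assumption
    return np.array([x_t, y_t, z_t])
```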
The effectiveness of the iterative process is now illustrated with a circling trajectory that is a circular arc of radius R = 73 m and arc rad = 1.5π. For δψ = 0, 1, ..., 59, 60 deg, formula (1) yields the 61 corresponding values of h̄. Fitting these data with h̄ as the dependent variable and δψ as the independent variable, as shown in Fig. 6, gives the mathematical relationship between h̄ and δψ (formula (13)).
In the same manner, solving formula (10) yields 36 groups of δψ; fitting these data with δψ as the dependent variable and h̄ as the independent variable, as shown in Fig. 7, gives the mathematical relationship between δψ and h̄ (formula (14)).
Let e_h̄ = h̄ - h_t and e_δψ = δψ - δψ_t, where h_t and δψ_t are the true values of the relative height and the heading deviation. From formula (13) one obtains formula (15), and from formula (14) one obtains formula (16), where k_1 and k_2 are the relevant fitted parameters.
Substituting the relative height computed by the binocular vision model into the regression equation allows the heading deviation to be computed effectively; substituting the heading-deviation estimate back into the binocular vision model then allows the relative height to be computed accurately. In general, the heading deviation of an AHRS does not exceed 30 deg, so |k_2| > k_1 > 0, and since k_2 < 0 it follows from formulas (15) and (16) that, after finitely many iterations, the estimate ĥ of the relative height and the estimate δψ̂ of the heading deviation converge to the true values.
A flight test was carried out with the UAV carrying the camera; the UAV measured the landmark on a circling trajectory with radius R = 73 m and arc rad = 1.5π. The true height of the UAV relative to the landmark was h_t = 45 m, the flight speed V = 3.44 m/s, f_GPS = 4 Hz, and the true heading deviation δψ_t = 30 deg; the threshold was set to e = 0.02 deg. The results of the method provided by the present invention are shown in Table 1, Table 2, and Fig. 8, where the errors e_h, e_δψ, e_xy, and e_z all denote root-mean-square errors.
Table 1. The iterative process
Table 2. Target localization results

Index                                         3D localization of the present invention
Relative height estimation error e_h / m      0.93
Heading estimation error e_δψ / deg           1.89
Localization error e_xy / m                   10.89
Localization error e_z / m                    0.43
In summary, the above are merely preferred embodiments of the present invention and are not intended to limit its scope of protection. Any modification, equivalent substitution, improvement, or the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (5)

1. A target localization method for a rotor UAV on a circling trajectory, characterized in that:
Step 1: a stationary object is extracted from the images shot by the rotor UAV to serve as the landmark;
Step 2: the rotor UAV circles around the landmark, shooting the landmark from N angles while circling and obtaining the measured values of every shot image, N being a positive integer;
Step 3: the N shot images are grouped in pairs; for every pair, the height of the rotor UAV relative to the landmark is computed with the rotated-view binocular vision model, and the group results are then averaged to give the relative height h̄(k) of the current iteration round k;
wherein, when the relative height is computed, the heading deviation δψ(k) required by the rotated-view binocular vision model uses the heading deviation δψ(k-1) computed in the previous iteration round k-1, and the heading angle required by the binocular vision model is updated as ψ(k) = ψ_i - δψ(k), where ψ_i is the heading angle of the UAV when shooting the i-th image; the initial value δψ(0) is taken as 0;
Step 4: using the relative height h̄(k) and the measured values of the N shot images, the heading deviation δψ(k) of the current iteration round k is computed;
Step 5: it is judged whether the deviation between δψ(k) and δψ(k-1) is less than a set threshold; if so, the results of the last iteration are taken as the relative height estimate ĥ and the heading deviation estimate δψ̂, and step 6 is performed; otherwise the iteration round k is increased by 1 and the method returns to step 3;
Step 6: for an arbitrary target in the rotor UAV camera's field of view, the true heading of the rotor UAV is computed using the heading deviation estimate, and three-dimensional localization of the target is then achieved from the true heading and the height estimate ĥ.
2. The rotor UAV target localization method according to claim 1, characterized in that the relative height h_j (1 ≤ j ≤ n) of the rotor UAV relative to the landmark is computed as:

$$h_j = \frac{1}{2}\begin{bmatrix} 0 & 0 & 1 \end{bmatrix} R_l^T \left( \left(\begin{bmatrix} 1 & 0 & 0 \end{bmatrix} M\right) P_l + \left(\begin{bmatrix} 0 & 1 & 0 \end{bmatrix} M\right) R^T P_r + T \right) \tag{1}$$

wherein T = T_l - R^T T_r = R_l(O_r - O_l), M = [P_l  -R^T P_r  P_l × R^T P_r]^{-1} T, and R = R_r R_l^T; P_l and P_r are the pixel positions of the landmark in the left and right views respectively; R and T are the rotation matrix and translation matrix of the camera coordinate system corresponding to the right view relative to the camera coordinate system corresponding to the left view; R_l and T_l are the rotation matrix and translation matrix of the UAV shooting point O_l corresponding to the left view relative to the inertial reference frame; and R_r and T_r are the rotation matrix and translation matrix of the UAV shooting point O_r corresponding to the right view relative to the inertial reference frame.
3. The rotor UAV target localization method according to claim 2, characterized in that the pixel position of the landmark is measured as follows:
the landmark image obtained in step 1 is identified, yielding a number of feature points; each image taken while circling is likewise identified to obtain its feature points; the feature points of each image are matched against those of the landmark image, and the geometric center of the matched points is taken as the pixel position of the landmark in the image.
4. The rotor UAV target localization method according to claim 1, characterized in that, in step 4, the heading deviation δψ(k) is computed as follows:
[x_G y_G z_G]^T denotes the coordinates of the landmark in the inertial coordinate system {I}, and h̄(k) is the mean relative height between the UAV and the landmark, which gives the ranging model of the camera:

$$\begin{pmatrix} x_G \\ y_G \end{pmatrix} = \begin{pmatrix} x_o^i \\ y_o^i \end{pmatrix} + \bar h(k)\, \frac{1}{(0,0,1)\, C_b^n \begin{pmatrix} x_f^i \\ y_f^i \\ f \end{pmatrix}} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} C_b^n \begin{pmatrix} x_f^i \\ y_f^i \\ f \end{pmatrix} \tag{2}$$

where C_b^n(i) is the attitude matrix of the UAV when shooting the i-th image, 1 ≤ i ≤ N,

$$C_b^n(i) = \begin{bmatrix} \cos\psi_i\cos\theta_i & \cos\psi_i\sin\theta_i\sin\phi_i - \sin\psi_i\cos\phi_i & \sin\psi_i\sin\phi_i + \cos\psi_i\sin\theta_i\cos\phi_i \\ \sin\psi_i\cos\theta_i & \cos\psi_i\cos\phi_i + \sin\psi_i\sin\theta_i\sin\phi_i & \sin\psi_i\sin\theta_i\cos\phi_i - \cos\psi_i\sin\phi_i \\ -\sin\theta_i & \sin\phi_i\cos\theta_i & \cos\phi_i\cos\theta_i \end{bmatrix}$$

where (x_o^i, y_o^i, z_o^i) and (ψ_i, θ_i, φ_i) are the position and attitude of the UAV shooting point O_i in the inertial reference frame {I} when shooting the i-th image, ψ_i, θ_i, φ_i being the azimuth angle, pitch angle, and roll angle respectively, and (x_f^i, y_f^i) is the pixel position of the landmark in the i-th image.
The parameter θ = [θ_a, θ_b]^T is set, with θ_a = [x_G, y_G]^T and θ_b = δψ(k); the measurement equation group is

$$\begin{cases} z_1(i) = y_1(i) + v_1, & v_1 \sim N(0, R_1) \\ C_{bc}^n(i) \approx C_b^n(i) + \delta C_b^n(i)\,(\theta_b + v_2), & v_2 \sim N(0, R_2) \end{cases} \tag{3}$$

where v_1 and v_2 are measurement noises and R_1 and R_2 are real symmetric positive-definite matrices; formula (3) is then rearranged into

$$\theta_a = f\left(z_1(i) - v_1,\; C_{bc}^n(i) - \delta C_b^n(i)(\theta_b + v_2)\right) \tag{4}$$

where δC_b^n(i) is the attitude error matrix; with a Taylor expansion, formula (4) becomes

$$\begin{aligned} f\big(z_1(i)-v_1,\; C_{bc}^n(i)-\delta C_b^n(i)(\theta_b+v_2)\big) \approx{}& f\big(z_1(i),\, C_{bc}^n(i)\big) - \left.\frac{\partial f}{\partial y_1}\right|_{z_1(i),\,C_{bc}^n(i)} v_1 \\ &- \left.\frac{\partial f}{\partial \theta_b}\right|_{z_1(i),\,C_{bc}^n(i)} v_2 - \left.\frac{\partial f}{\partial \theta_b}\right|_{z_1(i),\,C_{bc}^n(i)} \theta_b \end{aligned} \tag{5}$$

Combining formula (4) and formula (5) gives

$$f\big(z_1(i),\, C_{bc}^n(i)\big) \approx \theta_a + \left.\frac{\partial f}{\partial \theta_b}\right|_{z_1(i),\,C_{bc}^n(i)} \theta_b + \left.\frac{\partial f}{\partial y_1}\right|_{z_1(i),\,C_{bc}^n(i)} v_1 + \left.\frac{\partial f}{\partial \theta_b}\right|_{z_1(i),\,C_{bc}^n(i)} v_2 \tag{6}$$
The matrix A_i is defined, with a_{1,3}-a_{2,5} denoting the corresponding elements of A_i, and the matrix B_i, with b_{1,1}-b_{2,3} denoting the corresponding elements of B_i, which gives the following linear regression model:

$$f\big(z_1(i),\, C_{bc}^n(i)\big) = [I_2,\; B_i]\,\theta + V \tag{7}$$

where I_2 is the 2 × 2 identity matrix, the noise is V ~ N(0, R), and the covariance matrix is

$$R = \mathrm{diag}\left(\left\{A_i R_1 A_i^T + B_i R_2 B_i^T\right\}_{i=1}^{N}\right)$$

The estimate of the parameter θ is

$$\hat\theta = \begin{bmatrix} x_G & y_G & \delta\psi(k) \end{bmatrix}^T = \left[\sum_{i=1}^{N} \begin{bmatrix} I_2 \\ B_i^T \end{bmatrix} \big(A_i R_1 A_i^T + B_i R_2 B_i^T\big)^{-1} [I_2,\; B_i]\right]^{-1} \sum_{i=1}^{N} \begin{bmatrix} I_2 \\ B_i^T \end{bmatrix} \big(A_i R_1 A_i^T + B_i R_2 B_i^T\big)^{-1} f\big(z_1(i),\, C_{bc}^n(i)\big) \tag{8}$$

The heading deviation δψ(k) is solved from formula (8).
5. The rotor UAV target localization method according to claim 1, characterized in that, before the averaging over the groups in step 3, gross errors in the relative heights are first rejected using the 3σ criterion.
CN201610943473.4A 2016-11-01 2016-11-01 Target localization method for a rotor UAV on a circling trajectory Active CN107300377B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610943473.4A CN107300377B (en) 2016-11-01 2016-11-01 Target localization method for a rotor UAV on a circling trajectory

Publications (2)

Publication Number Publication Date
CN107300377A 2017-10-27
CN107300377B CN107300377B (en) 2019-06-14

Family

ID=60138055

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610943473.4A Active CN107300377B (en) Target localization method for a rotor UAV on a circling trajectory

Country Status (1)

Country Link
CN (1) CN107300377B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10221072A (en) * 1997-02-03 1998-08-21 Asahi Optical Co Ltd System and method for photogrammetry
JP2003083744A (en) * 2001-09-12 2003-03-19 Starlabo Corp Imaging apparatus mounted to aircraft, and aircraft imaging data processing apparatus
CN102519434A (en) * 2011-12-08 2012-06-27 北京控制工程研究所 Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data
CN105424006A (en) * 2015-11-02 2016-03-23 国网山东省电力公司电力科学研究院 Unmanned aerial vehicle hovering precision measurement method based on binocular vision

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109708622A (en) * 2017-12-15 2019-05-03 福建工程学院 Method for three-dimensional modeling of buildings by a Pixhawk-based unmanned aerial vehicle
WO2019165612A1 (en) * 2018-02-28 2019-09-06 深圳市大疆创新科技有限公司 Method for positioning a movable platform, and related device and system
CN110799921A (en) * 2018-07-18 2020-02-14 深圳市大疆创新科技有限公司 Shooting method and device and unmanned aerial vehicle
CN112567201A (en) * 2018-08-21 2021-03-26 深圳市大疆创新科技有限公司 Distance measuring method and apparatus
CN112567201B (en) * 2018-08-21 2024-04-16 深圳市大疆创新科技有限公司 Distance measuring method and device
CN110632941A (en) * 2019-09-25 2019-12-31 北京理工大学 Trajectory generation method for target tracking of unmanned aerial vehicle in complex environment
CN110675453A (en) * 2019-10-16 2020-01-10 北京天睿空间科技股份有限公司 Self-positioning method for moving target in known scene
CN110675453B (en) * 2019-10-16 2021-04-13 北京天睿空间科技股份有限公司 Self-positioning method for moving target in known scene
CN110824295B (en) * 2019-10-22 2021-08-31 广东电网有限责任公司 Infrared thermal image fault positioning method based on three-dimensional graph
CN110824295A (en) * 2019-10-22 2020-02-21 广东电网有限责任公司 Infrared thermal image fault positioning method based on three-dimensional graph
CN113469139A (en) * 2021-07-30 2021-10-01 广州中科智云科技有限公司 Data security transmission method and system for unmanned aerial vehicle edge side embedded AI chip
CN113469139B (en) * 2021-07-30 2022-04-05 广州中科智云科技有限公司 Data security transmission method and system for unmanned aerial vehicle edge side embedded AI chip
CN115272892A (en) * 2022-07-29 2022-11-01 同济大学 Unmanned aerial vehicle positioning deviation monitoring management and control system based on data analysis
CN117452831A (en) * 2023-12-26 2024-01-26 南京信息工程大学 Four-rotor unmanned aerial vehicle control method, device, system and storage medium
CN117452831B (en) * 2023-12-26 2024-03-19 南京信息工程大学 Four-rotor unmanned aerial vehicle control method, device, system and storage medium

Also Published As

Publication number Publication date
CN107300377B (en) 2019-06-14

Similar Documents

Publication Publication Date Title
CN107300377B (en) Target localization method for a rotor UAV on a circling trajectory
CN106153008B (en) Vision-based target localization method for a rotor UAV
CN106155081B (en) Wide-area target monitoring and accurate positioning method for a rotor UAV
CN103994765B (en) Positioning method of inertial sensor
CN107314771A (en) Unmanned aerial vehicle positioning and attitude angle measurement method based on coded targets
CN107833249A (en) Vision-guided attitude prediction method for carrier-based aircraft landing
CN104748750A (en) Model constraint-based on-orbit 3D space target attitude estimation method and system
CN102778224B (en) Method for aerophotogrammetric bundle adjustment based on polar coordinate parameterization
CN107490364A (en) Target positioning method for a wide-angle oblique-imaging aerial camera
CN104655135B (en) Aircraft visual navigation method based on landmark recognition
CN109425348A (en) Method and apparatus for simultaneous localization and mapping
CN110849331B (en) Monocular vision measurement and ground test method based on three-dimensional point cloud database model
CN105913417A (en) Method for geometrically constraining pose based on perspective projection line
US9816786B2 (en) Method for automatically generating a three-dimensional reference model as terrain information for an imaging device
CN106885573A (en) Real-time attitude determination method for a quadrotor using a motion capture system
Fang et al. UKF for integrated vision and inertial sensors based on three-view geometry
CN105004321A (en) GPS-supported bundle adjustment method for unmanned aerial vehicles considering non-synchronous exposure
CN105389819B (en) Robust semi-calibrated epipolar rectification method and system for downward-looking visible images
Kehoe et al. State estimation using optical flow from parallax-weighted feature tracking
CN106323271A (en) Spacecraft relative attitude measurement vector selection method based on feature singular values
Han et al. Multiple targets geolocation using SIFT and stereo vision on airborne video sequences
CN102620745A (en) Airborne inertial measurement unit (IMU) collimation axis error calibration method
Rabinovitch et al. Full-scale supersonic parachute shape reconstruction using three-dimensional stereo imagery
Kupervasser et al. Robust positioning of drones for land use monitoring in strong terrain relief using vision-based navigation
Werner Precision relative positioning for automated aerial refueling from a stereo imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant