CN104111071B - High-precision position posture calculating method based on laser ranging and camera visual fusion - Google Patents

High-precision position posture calculating method based on laser ranging and camera visual fusion

Info

Publication number
CN104111071B
CN104111071B · CN201410328295.5A · CN104111071A
Authority
CN
China
Prior art keywords
theta
sin
cos
chi
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410328295.5A
Other languages
Chinese (zh)
Other versions
CN104111071A (en)
Inventor
黄建明
魏祥泉
陈凤
刘玉
刘鲁江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Aerospace System Engineering Institute
Original Assignee
Shanghai Aerospace System Engineering Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Aerospace System Engineering Institute filed Critical Shanghai Aerospace System Engineering Institute
Priority to CN201410328295.5A priority Critical patent/CN104111071B/en
Publication of CN104111071A publication Critical patent/CN104111071A/en
Application granted granted Critical
Publication of CN104111071B publication Critical patent/CN104111071B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a high-precision position and attitude calculation method based on the fusion of laser ranging and camera vision. The method comprises the steps of: (S1) performing linear modeling of the camera vision position and attitude measurement based on the non-linear least squares method (NLSM) applied to cooperative targets; (S2) performing linear modeling based on a multi-point laser rangefinder; and (S3) performing fusion linear modeling based on the laser ranging and the camera. By comprehensively utilizing the high-precision ranging information of the laser rangefinder and the high-precision angular resolution of the camera, the position and attitude measurement accuracy of the target is effectively improved, and the problem of fusing laser ranging with a cooperative-target-based camera vision position and attitude measurement algorithm is solved.

Description

High-precision position and attitude calculation method based on the fusion of laser ranging and camera vision
Technical field
The present invention relates to the field of measurement and testing technology, and in particular to a high-precision position and attitude calculation method based on the fusion of laser ranging and camera vision.
Background art
For a SAR (synthetic aperture radar) imaging satellite, deformation of the antenna has a great impact on the imaging resolution and image quality of the satellite. While in orbit, a spaceborne SAR antenna is subjected to various space loads, mainly thermal loads, gravity-gradient torque, atmospheric drag, and disturbances from satellite motion. These loads cause the antenna panels to vibrate, reducing the attitude stability and pointing accuracy of the satellite and seriously degrading the performance of the antenna, which leads to blurred imaging and reduced resolution; more seriously, the vibration may destroy the antenna structure.
Therefore, after the large SAR antenna is deployed in orbit, a measurement system must perform real-time, high-precision, on-orbit measurement to obtain the shape change of the antenna, so that the antenna can be controlled when the deformation becomes large and the usage requirements can be met. At present, there is no scheme that can effectively determine the relative position and attitude of the target antenna in real time and with high precision.
Pose calculation is usually applied to cooperative targets. A so-called cooperative target consists of n control points (generally at least 3) with known positions on the measured target, together with a camera at a known position on the tracking object. Since the positions of the cooperative target and the camera are known, measuring the relative position and attitude between the cooperative target and the camera directly yields the relative position and attitude between the measured target and the tracking object. However, this method involves a complicated solution process and suffers from the multiple-solution phenomenon; for a long time, research has focused mainly on the purely mathematical questions of how to solve the equations and how many solutions exist under given conditions. Research oriented toward practical engineering application therefore has great significance.
Summary of the invention
In view of the above deficiencies of the prior art, the present invention provides a high-precision position and attitude calculation method based on the fusion of laser ranging and camera vision. The present invention is achieved through the following technical solution:

A high-precision position and attitude calculation method based on the fusion of laser ranging and camera vision comprises the following steps:

S1: Linear modeling of the camera vision position and attitude measurement based on the non-linear least squares method (NLSM) for cooperative targets. The imaging model of the camera is:

$$u_i = f\,\frac{z_{ci}}{x_{ci}}, \qquad v_i = f\,\frac{y_{ci}}{x_{ci}}, \qquad i = 1, 2, 3$$

where (u_i, v_i) are the coordinates of the i-th cooperative target in the camera image-plane coordinate system (o_f–u_f v_f), (x_ci, y_ci, z_ci) are the coordinates of the corresponding cooperative target in the image-space coordinate system (o_c–x_c y_c z_c), and f is the camera focal length. The relationship among the image-plane coordinate system, the image-space coordinate system, and the target coordinate system is shown in Fig. 2. The coordinate transformation between the image-space coordinate system and the target coordinate system is:

$$\begin{bmatrix} x_{ci} \\ y_{ci} \\ z_{ci} \end{bmatrix} = M_{ct} \begin{bmatrix} x_{ti} \\ y_{ti} \\ z_{ti} \end{bmatrix} + \begin{bmatrix} t_{xc} \\ t_{yc} \\ t_{zc} \end{bmatrix}, \qquad i = 1, 2, 3$$

where (x_ti, y_ti, z_ti) are the coordinates of the cooperative target in the target coordinate system, (t_xc, t_yc, t_zc) are the coordinates of the origin of the target coordinate system in the image-space coordinate system, M_ct is the transformation matrix from the target coordinate system to the image-space coordinate system, and (θ_1, θ_2, θ_3) are the corresponding rotation angles. M_ct is expressed as:

$$M_{ct} = \begin{bmatrix} \cos\theta_2\cos\theta_3 - \sin\theta_1\sin\theta_2\sin\theta_3 & \cos\theta_2\sin\theta_3 + \sin\theta_1\sin\theta_2\cos\theta_3 & -\cos\theta_1\sin\theta_2 \\ -\cos\theta_1\sin\theta_3 & \cos\theta_1\cos\theta_3 & \sin\theta_1 \\ \sin\theta_2\cos\theta_3 + \cos\theta_2\sin\theta_1\sin\theta_3 & \sin\theta_2\sin\theta_3 - \sin\theta_1\cos\theta_2\cos\theta_3 & \cos\theta_1\cos\theta_2 \end{bmatrix}$$

Let

$$f_{2i-1}(\chi) = f\,\frac{z_{ci}}{x_{ci}}, \qquad f_{2i}(\chi) = f\,\frac{y_{ci}}{x_{ci}}, \qquad i = 1, 2, 3$$

where χ = (t_xc, t_yc, t_zc, θ_1, θ_2, θ_3). According to the NLSM principle, we obtain:

$$\min\,(B\varepsilon - l)^{T}(B\varepsilon - l)$$

where χ_0 is the initial estimate of χ, ε is the correction to χ_0, the superscript T denotes matrix transposition, and the matrix B is the first-order partial derivative of the function f(χ), expressed as:

$$B = \left.\frac{\partial f}{\partial \chi}\right|_{\chi_0}$$

The elements l_i of the vector l are expressed as:

$$l_{2i-1} = u_i - f_{2i-1}(\chi_0), \qquad l_{2i} = v_i - f_{2i}(\chi_0), \qquad i = 1, 2, 3$$

S2: Linear modeling based on the multi-point laser rangefinder.

Let the coordinates of the laser rangefinder mounting point p in the image-space coordinate system be (x_cp, y_cp, z_cp), and let the laser-ranging distance between p and the i-th cooperative target be d_i, i = 1, 2, 3. Define:

$$d_i(\chi) = (x_{cp} - x_{ci})^2 + (y_{cp} - y_{ci})^2 + (z_{cp} - z_{ci})^2 - d_i^{2}, \qquad i = 1, 2, 3$$

Linearizing this expression gives:

$$d_i(\chi) = d_i(\chi_0) + \frac{\partial d_i}{\partial \chi}\,\varepsilon, \qquad i = 1, 2, 3$$

Rewritten in matrix form:

$$d(\chi) = d(\chi_0) + C\varepsilon$$

where C is the first-order partial derivative of the condition equation d(χ):

$$C = \left.\frac{\partial d}{\partial \chi}\right|_{\chi_0}$$

S3: Fusion linear modeling based on laser ranging and the camera.

Taking the laser ranging as a hard constraint on the camera vision measurement algorithm, we have:

$$d(\chi_0) + C\varepsilon = 0$$

The corresponding objective function is:

$$\min\left[(B_{6\times 6}\varepsilon_{6\times 1} - l_{6\times 1})^{T}(B_{6\times 6}\varepsilon_{6\times 1} - l_{6\times 1}) + 2(d_{3\times 1} + C_{3\times 6}\varepsilon_{6\times 1})^{T}\lambda_{3\times 1}\right]$$

where λ is the Lagrange multiplier vector of the constraint equation, solved as:

$$\lambda = \left[C(B^{T}B)^{-1}C^{T}\right]^{-1}\left[d + C(B^{T}B)^{-1}(B^{T}l)\right]$$

The corresponding ε is then solved as:

$$\varepsilon = (B^{T}B)^{-1}(B^{T}l - C^{T}\lambda)$$
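These two formulas are the standard solution of an equality-constrained least-squares problem. As a brief check (this derivation is not spelled out in the original text), setting the gradient of the objective with respect to ε to zero gives

$$B^{T}(B\varepsilon - l) + C^{T}\lambda = 0 \;\Rightarrow\; \varepsilon = (B^{T}B)^{-1}(B^{T}l - C^{T}\lambda),$$

and substituting this ε into the constraint $d + C\varepsilon = 0$ and solving for λ reproduces the expression for λ above.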
The position and attitude parameter χ is updated as:

$$\chi^{(k+1)} = \chi^{(k)} + \varepsilon$$

where χ^(k) is the result of the k-th iteration.

The invention comprehensively utilizes the high-precision ranging information of the laser rangefinder and the high-precision angular resolution of the camera, effectively improving the position and attitude measurement accuracy of the target and solving the problem of fusing laser ranging with a cooperative-target-based camera vision position and attitude measurement algorithm.
Brief description of the drawings
Fig. 1 shows the flow chart of the present invention.
Fig. 2 shows the relationship among the camera image-plane coordinate system, the image-space coordinate system, and the target coordinate system.
Specific embodiment
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the embodiments described here are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
To facilitate understanding of the embodiments of the present invention, a further explanation is given below with reference to the accompanying drawings, taking specific embodiments as examples; the embodiments do not constitute a limitation on the present invention.
As shown in Fig. 1, the high-precision position and attitude calculation method based on the fusion of laser ranging and camera vision disclosed by the present invention comprises the following steps:
S1: Linear modeling of the camera vision position and attitude measurement based on the non-linear least squares method (NLSM) for cooperative targets. The imaging model of the camera is:

$$u_i = f\,\frac{z_{ci}}{x_{ci}}, \qquad v_i = f\,\frac{y_{ci}}{x_{ci}}, \qquad i = 1, 2, 3$$

where (u_i, v_i) are the coordinates of the i-th cooperative target in the camera image-plane coordinate system (o_f–u_f v_f), (x_ci, y_ci, z_ci) are the coordinates of the corresponding cooperative target in the image-space coordinate system (o_c–x_c y_c z_c), and f is the camera focal length. The relationship among the image-plane coordinate system, the image-space coordinate system, and the target coordinate system is shown in Fig. 2. The coordinate transformation between the image-space coordinate system and the target coordinate system is:

$$\begin{bmatrix} x_{ci} \\ y_{ci} \\ z_{ci} \end{bmatrix} = M_{ct} \begin{bmatrix} x_{ti} \\ y_{ti} \\ z_{ti} \end{bmatrix} + \begin{bmatrix} t_{xc} \\ t_{yc} \\ t_{zc} \end{bmatrix}, \qquad i = 1, 2, 3$$

where (x_ti, y_ti, z_ti) are the coordinates of the cooperative target in the target coordinate system, (t_xc, t_yc, t_zc) are the coordinates of the origin of the target coordinate system in the image-space coordinate system, M_ct is the transformation matrix from the target coordinate system to the image-space coordinate system, and (θ_1, θ_2, θ_3) are the corresponding rotation angles. M_ct is expressed as:

$$M_{ct} = \begin{bmatrix} \cos\theta_2\cos\theta_3 - \sin\theta_1\sin\theta_2\sin\theta_3 & \cos\theta_2\sin\theta_3 + \sin\theta_1\sin\theta_2\cos\theta_3 & -\cos\theta_1\sin\theta_2 \\ -\cos\theta_1\sin\theta_3 & \cos\theta_1\cos\theta_3 & \sin\theta_1 \\ \sin\theta_2\cos\theta_3 + \cos\theta_2\sin\theta_1\sin\theta_3 & \sin\theta_2\sin\theta_3 - \sin\theta_1\cos\theta_2\cos\theta_3 & \cos\theta_1\cos\theta_2 \end{bmatrix}$$

Let

$$f_{2i-1}(\chi) = f\,\frac{z_{ci}}{x_{ci}}, \qquad f_{2i}(\chi) = f\,\frac{y_{ci}}{x_{ci}}, \qquad i = 1, 2, 3$$

where χ = (t_xc, t_yc, t_zc, θ_1, θ_2, θ_3). According to the NLSM principle, we obtain:

$$\min\,(B\varepsilon - l)^{T}(B\varepsilon - l)$$

where χ_0 is the initial estimate of χ, ε is the correction to χ_0, the superscript T denotes matrix transposition, and the matrix B is the first-order partial derivative of the function f(χ), expressed as:

$$B = \left.\frac{\partial f}{\partial \chi}\right|_{\chi_0}$$

The elements l_i of the vector l are expressed as:

$$l_{2i-1} = u_i - f_{2i-1}(\chi_0), \qquad l_{2i} = v_i - f_{2i}(\chi_0), \qquad i = 1, 2, 3$$
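By way of illustration, step S1 may be sketched in Python as follows. This is a minimal sketch, assuming three cooperative target points with known target-frame coordinates; the helper names (rotation_matrix, f_model, numerical_B_and_l) and the central-difference Jacobian are illustrative choices, whereas the patent itself uses the analytic first-order partial derivative:

```python
import numpy as np

def rotation_matrix(theta1, theta2, theta3):
    """M_ct from the target frame to the image-space frame,
    laid out exactly as in the matrix above."""
    s1, c1 = np.sin(theta1), np.cos(theta1)
    s2, c2 = np.sin(theta2), np.cos(theta2)
    s3, c3 = np.sin(theta3), np.cos(theta3)
    return np.array([
        [c2 * c3 - s1 * s2 * s3,  c2 * s3 + s1 * s2 * c3, -c1 * s2],
        [-c1 * s3,                c1 * c3,                 s1],
        [s2 * c3 + c2 * s1 * s3,  s2 * s3 - s1 * c2 * c3,  c1 * c2],
    ])

def f_model(chi, targets_t, f):
    """Stacked imaging model: f_{2i-1} = f*z_ci/x_ci, f_{2i} = f*y_ci/x_ci."""
    t, angles = chi[:3], chi[3:]
    out = []
    for p_t in targets_t:                       # p_t = (x_ti, y_ti, z_ti)
        x, y, z = rotation_matrix(*angles) @ p_t + t
        out.extend([f * z / x, f * y / x])      # (u_i, v_i)
    return np.array(out)

def numerical_B_and_l(chi0, uv_meas, targets_t, f, h=1e-7):
    """B = df/dchi at chi0 (central differences); l = measurements minus model."""
    l = uv_meas - f_model(chi0, targets_t, f)
    B = np.empty((l.size, 6))
    for j in range(6):
        dp = np.zeros(6)
        dp[j] = h
        B[:, j] = (f_model(chi0 + dp, targets_t, f)
                   - f_model(chi0 - dp, targets_t, f)) / (2 * h)
    return B, l
```

With three targets, B has the 6 × 6 shape and l the 6 × 1 shape used in the objective function of step S3.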
S2: Linear modeling based on the multi-point laser rangefinder.

Let the coordinates of the laser rangefinder mounting point p in the image-space coordinate system be (x_cp, y_cp, z_cp), and let the laser-ranging distance between p and the i-th cooperative target be d_i, i = 1, 2, 3. Define:

$$d_i(\chi) = (x_{cp} - x_{ci})^2 + (y_{cp} - y_{ci})^2 + (z_{cp} - z_{ci})^2 - d_i^{2}, \qquad i = 1, 2, 3$$

Linearizing this expression gives:

$$d_i(\chi) = d_i(\chi_0) + \frac{\partial d_i}{\partial \chi}\,\varepsilon, \qquad i = 1, 2, 3$$

Rewritten in matrix form:

$$d(\chi) = d(\chi_0) + C\varepsilon$$

where C is the first-order partial derivative of the condition equation d(χ):

$$C = \left.\frac{\partial d}{\partial \chi}\right|_{\chi_0}$$
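Continuing the sketch above (and reusing rotation_matrix and the same χ convention), the ranging condition d_i(χ) of step S2 and its Jacobian C can be formed the same way; again, the finite-difference Jacobian is an illustrative stand-in for the analytic derivative:

```python
def d_model(chi, targets_t, p_c, d_meas):
    """Condition d_i(chi) = |p - p_ci|^2 - d_i^2; zero when pose and ranges agree."""
    t, angles = chi[:3], chi[3:]
    out = []
    for p_t, d_i in zip(targets_t, d_meas):
        p_ci = rotation_matrix(*angles) @ p_t + t   # target point in image-space frame
        out.append(np.sum((p_c - p_ci) ** 2) - d_i ** 2)
    return np.array(out)

def numerical_C(chi0, targets_t, p_c, d_meas, h=1e-7):
    """C = dd/dchi at chi0, by central differences."""
    C = np.empty((len(d_meas), 6))
    for j in range(6):
        dp = np.zeros(6)
        dp[j] = h
        C[:, j] = (d_model(chi0 + dp, targets_t, p_c, d_meas)
                   - d_model(chi0 - dp, targets_t, p_c, d_meas)) / (2 * h)
    return C
```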
S3: Fusion linear modeling based on laser ranging and the camera.

Taking the laser ranging as a hard constraint on the camera vision measurement algorithm, we have:

$$d(\chi_0) + C\varepsilon = 0$$

The corresponding objective function is:

$$\min\left[(B_{6\times 6}\varepsilon_{6\times 1} - l_{6\times 1})^{T}(B_{6\times 6}\varepsilon_{6\times 1} - l_{6\times 1}) + 2(d_{3\times 1} + C_{3\times 6}\varepsilon_{6\times 1})^{T}\lambda_{3\times 1}\right]$$

The corresponding λ is solved as:

$$\lambda = \left[C(B^{T}B)^{-1}C^{T}\right]^{-1}\left[d + C(B^{T}B)^{-1}(B^{T}l)\right]$$

The corresponding ε is solved as:

$$\varepsilon = (B^{T}B)^{-1}(B^{T}l - C^{T}\lambda)$$

The position and attitude parameter χ is updated as:

$$\chi^{(k+1)} = \chi^{(k)} + \varepsilon$$

where χ^(k) is the result of the k-th iteration.
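Putting the pieces together, a minimal sketch of the S3 iteration follows, using the helpers defined above. The convergence tolerance and iteration cap are illustrative assumptions, not values from the patent:

```python
def fuse_pose(chi0, uv_meas, d_meas, targets_t, p_c, f, iters=50, tol=1e-10):
    """Constrained Gauss-Newton iteration of step S3:
    lambda = [C (B^T B)^-1 C^T]^-1 [d + C (B^T B)^-1 (B^T l)]
    eps    = (B^T B)^-1 (B^T l - C^T lambda)
    chi^(k+1) = chi^(k) + eps
    """
    chi = np.asarray(chi0, dtype=float)
    for _ in range(iters):
        B, l = numerical_B_and_l(chi, uv_meas, targets_t, f)
        d = d_model(chi, targets_t, p_c, d_meas)
        C = numerical_C(chi, targets_t, p_c, d_meas)
        BtB_inv = np.linalg.inv(B.T @ B)
        lam = np.linalg.solve(C @ BtB_inv @ C.T,
                              d + C @ BtB_inv @ (B.T @ l))
        eps = BtB_inv @ (B.T @ l - C.T @ lam)
        chi = chi + eps                        # chi^(k+1) = chi^(k) + eps
        if np.linalg.norm(eps) < tol:          # converged
            break
    return chi
```

For example, with three simulated target points, the camera focal length of 0.105 m from the parameters below, and a rough initial pose estimate, a call such as fuse_pose(chi_init, uv_meas, d_meas, targets_t, p_c, 0.105) would return the refined pose; chi_init, uv_meas, d_meas, targets_t, and p_c are hypothetical inputs named here only for illustration.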
The high-precision position and attitude calculation method of this embodiment, which combines laser ranging with camera vision based on three cooperative target points, is verified below by numerical simulation.
The relevant parameters are as follows:
1. Camera parameters:
Focal length: 0.105 m
Detector array: 4400 × 6600 pixels
Pixel size: 5.5e-6 m
Pixel extraction accuracy: 0.1 pixel
2. Laser rangefinder parameters:
Mounting position: [0.1264, 0.2445, 0.4174] m
Ranging accuracy: better than 0.1 mm
3. Experimental data
Table 1 gives the true positions of the marker points on each panel, Table 2 gives the corresponding measurement errors, and Table 3 gives the error statistics.
Table 1
Table 2
                 x (mm)   y (mm)   z (mm)   Panel angle (°)
Maximum error    0.189    0.287    1.140    0.035
Mean error       0.078    0.124    0.356    0.0127

Table 3
The maximum error and mean error statistics in Table 3 are computed by the formulas:

$$x_{\max} = \max_{i=1,2,\ldots,n}\left(\left|x_{ri} - x_{ci}\right|\right), \qquad \bar{x} = \frac{\sum_{i=1}^{n}\left|x_{ri} - x_{ci}\right|}{n}$$

where x_ri is the true value and x_ci is the measured value.
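These statistics render directly in the same Python sketch; x_true and x_meas below are illustrative arrays, not the patent's raw data:

```python
def error_stats(x_true, x_meas):
    """x_max = max_i |x_ri - x_ci|; x_bar = mean_i |x_ri - x_ci|."""
    abs_err = np.abs(np.asarray(x_true) - np.asarray(x_meas))
    return abs_err.max(), abs_err.mean()

# e.g. error_stats([0.0, 1.0, 2.0], [0.1, 0.9, 2.05]) -> (0.1, 0.0833...)
```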
As shown by the data in Tables 2 and 3, at a measurement distance of about 50 meters, the maximum position measurement error based on three cooperative targets is about 1.14 mm, and the mean attitude angle measurement error is about 0.0127°.
The above is only a preferred specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can readily be conceived by a person familiar with the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the scope of the claims.

Claims (1)

1. A high-precision position and attitude calculation method based on the fusion of laser ranging and camera vision, characterized by comprising the following steps:

S1: Linear modeling of the camera vision position and attitude measurement based on the non-linear least squares method (NLSM) for cooperative targets. The imaging model of the camera is:

$$u_i = f\,\frac{z_{ci}}{x_{ci}}, \qquad v_i = f\,\frac{y_{ci}}{x_{ci}}, \qquad i = 1, 2, 3$$

where (u_i, v_i) are the coordinates of the i-th cooperative target in the camera image-plane coordinate system and (x_ci, y_ci, z_ci) are the coordinates of the corresponding cooperative target in the image-space coordinate system; the coordinate transformation between the image-space coordinate system and the target coordinate system is:

$$\begin{bmatrix} x_{ci} \\ y_{ci} \\ z_{ci} \end{bmatrix} = M_{ct} \begin{bmatrix} x_{ti} \\ y_{ti} \\ z_{ti} \end{bmatrix} + \begin{bmatrix} t_{xc} \\ t_{yc} \\ t_{zc} \end{bmatrix}, \qquad i = 1, 2, 3$$

where (x_ti, y_ti, z_ti) are the coordinates of the cooperative target in the target coordinate system, (t_xc, t_yc, t_zc) are the coordinates of the origin of the target coordinate system in the image-space coordinate system, M_ct is the transformation matrix from the target coordinate system to the image-space coordinate system, and (θ_1, θ_2, θ_3) are the corresponding rotation angles; M_ct is expressed as:

$$M_{ct} = \begin{bmatrix} \cos\theta_2\cos\theta_3 - \sin\theta_1\sin\theta_2\sin\theta_3 & \cos\theta_2\sin\theta_3 + \sin\theta_1\sin\theta_2\cos\theta_3 & -\cos\theta_1\sin\theta_2 \\ -\cos\theta_1\sin\theta_3 & \cos\theta_1\cos\theta_3 & \sin\theta_1 \\ \sin\theta_2\cos\theta_3 + \cos\theta_2\sin\theta_1\sin\theta_3 & \sin\theta_2\sin\theta_3 - \sin\theta_1\cos\theta_2\cos\theta_3 & \cos\theta_1\cos\theta_2 \end{bmatrix}$$

Let

$$f_{2i-1}(\chi) = f\,\frac{z_{ci}}{x_{ci}}, \qquad f_{2i}(\chi) = f\,\frac{y_{ci}}{x_{ci}}, \qquad i = 1, 2, 3$$

where χ = (t_xc, t_yc, t_zc, θ_1, θ_2, θ_3); according to the NLSM principle, we obtain:

$$\min v^{T}v = \min\,(B\varepsilon - l)^{T}(B\varepsilon - l)$$

where χ_0 is the initial estimate of χ, ε is the correction to χ_0, the superscript T denotes matrix transposition, and the matrix B is the first-order partial derivative of the function f(χ), expressed as:

$$B = \left.\frac{\partial f}{\partial \chi}\right|_{\chi_0}$$

and the elements l_i of the vector l are expressed as:

$$l_{2i-1} = u_i - f_{2i-1}(\chi_0), \qquad l_{2i} = v_i - f_{2i}(\chi_0), \qquad i = 1, 2, 3$$

S2: Linear modeling based on the multi-point laser rangefinder.

The coordinates of the laser rangefinder mounting point p in the image-space coordinate system are (x_cp, y_cp, z_cp), and the laser-ranging distance between p and the i-th cooperative target is d_i, i = 1, 2, 3;

$$d_i(\chi) = (x_{cp} - x_{ci})^2 + (y_{cp} - y_{ci})^2 + (z_{cp} - z_{ci})^2 - d_i^{2}, \qquad i = 1, 2, 3$$

Linearizing this expression gives:

$$d_i(\chi) = d_i(\chi_0) + \frac{\partial d_i}{\partial \chi}\,\varepsilon, \qquad i = 1, 2, 3$$

Rewritten in matrix form:

$$d(\chi) = d(\chi_0) + C\varepsilon$$

where C is the first-order partial derivative of the condition equation d(χ):

$$C = \left.\frac{\partial d}{\partial \chi}\right|_{\chi_0}$$

S3: Fusion linear modeling based on laser ranging and the camera.

Taking the laser ranging as a hard constraint on the camera vision measurement algorithm:

$$d(\chi_0) + C\varepsilon = 0$$

The corresponding objective function is:

$$\min\left[(B_{6\times 6}\varepsilon_{6\times 1} - l_{6\times 1})^{T}(B_{6\times 6}\varepsilon_{6\times 1} - l_{6\times 1}) + 2(d_{3\times 1} + C_{3\times 6}\varepsilon_{6\times 1})^{T}\lambda_{3\times 1}\right]$$

The corresponding λ is solved as:

$$\lambda = \left[C(B^{T}B)^{-1}C^{T}\right]^{-1}\left[d + C(B^{T}B)^{-1}(B^{T}l)\right]$$

The corresponding ε is solved as:

$$\varepsilon = (B^{T}B)^{-1}(B^{T}l - C^{T}\lambda)$$

The position and attitude parameter χ is updated as:

$$\chi^{(k+1)} = \chi^{(k)} + \varepsilon$$

where χ^(k) is the result of the k-th iteration.
CN201410328295.5A 2014-07-10 2014-07-10 High-precision position posture calculating method based on laser ranging and camera visual fusion Active CN104111071B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410328295.5A CN104111071B (en) 2014-07-10 2014-07-10 High-precision position posture calculating method based on laser ranging and camera visual fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410328295.5A CN104111071B (en) 2014-07-10 2014-07-10 High-precision position posture calculating method based on laser ranging and camera visual fusion

Publications (2)

Publication Number Publication Date
CN104111071A CN104111071A (en) 2014-10-22
CN104111071B true CN104111071B (en) 2017-01-18

Family

ID=51707958

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410328295.5A Active CN104111071B (en) 2014-07-10 2014-07-10 High-precision position posture calculating method based on laser ranging and camera visual fusion

Country Status (1)

Country Link
CN (1) CN104111071B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105783880B (en) * 2016-03-22 2018-05-22 天津大学 A kind of monocular laser assisted bay section docking calculation
CN106643735A (en) * 2017-01-06 2017-05-10 中国人民解放军信息工程大学 Indoor positioning method and device and mobile terminal
CN106931879B (en) * 2017-01-23 2020-01-21 成都通甲优博科技有限责任公司 Binocular error measurement method, device and system
CN107300380B (en) * 2017-07-12 2019-12-10 重庆长安汽车股份有限公司 PLC-based robot tail end servo electrode holder pose automatic monitoring equipment and method
CN110455277B (en) * 2019-08-19 2023-04-07 哈尔滨工业大学 High-precision attitude measurement device and method based on data fusion of Internet of things
CN113324538B (en) * 2021-05-08 2022-10-21 中国科学院光电技术研究所 Cooperative target remote high-precision six-degree-of-freedom pose measurement method
CN116628786B (en) * 2023-07-26 2023-10-10 中南大学 Manufacturing method of special-shaped three-dimensional marking ball

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101122800A (en) * 2007-08-24 2008-02-13 北京航空航天大学 Combined type vision navigation method and device
CN101750012A (en) * 2008-12-19 2010-06-23 中国科学院沈阳自动化研究所 Device for measuring six-dimensional position poses of object
US7764384B1 (en) * 2006-11-16 2010-07-27 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Swept frequency laser metrology system
CN103727930A (en) * 2013-12-30 2014-04-16 浙江大学 Edge-matching-based relative pose calibration method of laser range finder and camera

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003156330A (en) * 2001-11-22 2003-05-30 Nec Corp Airborne topography-measuring apparatus and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7764384B1 (en) * 2006-11-16 2010-07-27 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Swept frequency laser metrology system
CN101122800A (en) * 2007-08-24 2008-02-13 北京航空航天大学 Combined type vision navigation method and device
CN101750012A (en) * 2008-12-19 2010-06-23 中国科学院沈阳自动化研究所 Device for measuring six-dimensional position poses of object
CN103727930A (en) * 2013-12-30 2014-04-16 浙江大学 Edge-matching-based relative pose calibration method of laser range finder and camera

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Monocular camera–laser ranging sensor pose measurement ***" [单目摄像机-激光测距传感器位姿测量***]; 晁志超 et al.; Acta Optica Sinica (光学学报); 2011-03-31; Vol. 31, No. 3; pp. 0312001-1 to 0312001-7 *
"Research on mobile robot SLAM based on fusion of laser ranging and binocular vision information" [基于激光测距与双目视觉信息融合的移动机器人SLAM研究]; 杜钊君 et al.; Computer Measurement & Control (计算机测量与控制); 2013-01-31; Vol. 21, No. 1; pp. 180-183 *
"Research on ultra-close-range pose measurement technology for space non-cooperative targets" [空间非合作目标超近距离位姿测量技术研究]; 曾占魁 et al.; Aerospace Shanghai (上海航天); 2013-12-31; Vol. 30, No. 6; pp. 10-16, 72 *

Also Published As

Publication number Publication date
CN104111071A (en) 2014-10-22

Similar Documents

Publication Publication Date Title
CN104111071B (en) High-precision position posture calculating method based on laser ranging and camera visual fusion
CN105573318B (en) environment construction method based on probability analysis
CN103822615B (en) A kind of multi-control point extracts and the unmanned aerial vehicle target real-time location method be polymerized automatically
CN102607526B (en) Target posture measuring method based on binocular vision under double mediums
CN100429476C (en) Double-sensor laser visual measuring system calibrating method
CN100349018C (en) Internal and external element correcting method of star sensor
CN106441311B (en) A kind of non-cooperative Spacecraft relative pose measurement method based on laser imaging radar
CN103353388B (en) A kind of binocular body formula micro imaging system scaling method of tool camera function and device
CN103954953B (en) The blind source error compensation method of a kind of airborne laser radar based on data-driven
CN102645209B (en) Joint positioning method for spatial points by means of onboard LiDAR point cloud and high resolution images
CN100432628C (en) Converting method and device for measuring daturm of sun sensor
CN102901519B (en) optical push-broom satellite in-orbit stepwise geometric calibration method based on probe element direction angle
CN106803270A (en) Unmanned aerial vehicle platform is based on many key frames collaboration ground target localization method of monocular SLAM
CN103472450B (en) Based on the nonuniform space configuration distributed SAR moving target three-D imaging method of compressed sensing
CN103175485A (en) Method for visually calibrating aircraft turbine engine blade repair robot
CN102359780B (en) Ground target positioning method applied into video monitoring system
CN103871075B (en) A kind of large oval remote sensing satellite earth background relative motion method of estimation
CN102506867B (en) SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris comer matching and combined navigation system
CN103673976A (en) Method and system for converting and unifying composite type precision measuring coordinate system
CN103697883B (en) A kind of aircraft horizontal attitude defining method based on skyline imaging
CN102842117A (en) Method for correcting kinematic errors in microscopic vision system
CN106405581A (en) Evaluation method for coupling direction precision, caused by satellite structure deformation, of multiple types of loads
CN104754323A (en) Calibration method of camera optical axis detection apparatus
CN105334739A (en) FAST whole network control method based on P type learning law of iterative learning
CN111696156A (en) Control point-free remote sensing image coordinate conversion method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant