CN103049912B - Random trihedron-based radar-camera system external parameter calibration method - Google Patents

Random trihedron-based radar-camera system external parameter calibration method

Publication number: CN103049912B (application CN201210563695.5A)
Authority: CN (China)
Legal status: Expired - Fee Related
Application number: CN201210563695.5A
Other versions: CN103049912A (en)
Inventors: 龚小谨, 林颖, 刘济林
Assignee (current and original): Zhejiang University (ZJU)
Application filed by Zhejiang University ZJU; publication of CN103049912A; application granted; publication of CN103049912B

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a radar-camera system extrinsic parameter calibration method based on an arbitrary trihedron. Using a trihedral scene found in the natural environment, the method can solve the extrinsic parameters of the system from only two frames of data. The method comprises the following steps: defining a world coordinate system from the trihedron; fitting planes to the trihedron observed by the radar to obtain the parameters of each plane, and solving both the transformation between the world coordinate system and the radar coordinate system and the relative motion between the two frames in the radar coordinate system; in the camera system, solving the essential matrix from matched feature points extracted in the two frames, then recovering the relative motion in the camera coordinate system, initializing the plane parameters in the camera coordinate system from those in the radar coordinate system, and finally solving the radar-camera extrinsic parameters, which are refined using the coplanarity of points on corresponding planes in the two coordinate systems. The scene required by the method is simple; the method features strong interference immunity, simple experimental equipment, and high flexibility.

Description

A radar-camera system extrinsic parameter calibration method based on an arbitrary trihedron
Technical field
The present invention relates to methods for calibrating the extrinsic parameters of a radar-camera system, and specifically to a radar-camera system extrinsic calibration method based on an arbitrary trihedron.
Background technology
Real-time reconstruction of the terrain environment is a fundamental problem in mobile robotics. To this end, a binocular camera system or a radar sensor is usually adopted to provide three-dimensional information. A binocular camera system, however, must perform dense stereo matching for three-dimensional reconstruction; this is time-consuming and easily affected by the external environment, so it cannot meet the requirements of real-time operation and accuracy. A radar sensor provides real-time and highly precise three-dimensional information, but lacks the color information of the environment. Therefore, more and more researchers combine a radar with a monocular camera into a radar-camera system that provides environment maps with both color and three-dimensional information in real time.
In recent years, many researchers have proposed methods for radar-camera extrinsic calibration. Document 1 (Zhang, Q.; Pless, R., "Extrinsic calibration of a camera and laser range finder (improves camera calibration)", IEEE International Conference on Intelligent Robots and Systems, IROS, 2004, pp. 2301-2306) and document 2 (Unnikrishnan, R.; Hebert, M., "Fast extrinsic calibration of a laser rangefinder to a camera", Technical report, Carnegie Mellon University, Robotics Institute, 2005) use a checkerboard calibration board to compute the system extrinsics; the checkerboard corners must be extracted manually on the camera images, multiple images must be collected, the methods depend to a great extent on the accuracy of corner extraction, and they place certain requirements on illumination. Document 3 (Rodriguez F., S.; Fremont, V.; Bonnifait, P., "Extrinsic calibration between a multi-layer lidar and a camera", IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, 2008, pp. 214-219) and document 4 (Li, G.; Liu, Y.; Dong, L.; Cai, X.; Zhou, D., "An algorithm for extrinsic parameters calibration of a camera and a laser range finder using line features", IROS, 2007, pp. 3854-3859) reduce the corner-extraction error by using calibration objects of special shape and extracting the corresponding corner locations of the object in both the radar data and the image data; at the same time, however, the sparseness of the radar data itself limits the accuracy these methods can reach. Recently, document 5 (Pandey, G.; McBride, J.R.; Savarese, S.; Eustice, R.M., "Automatic targetless extrinsic calibration of a 3d lidar and camera by maximizing mutual information", Proceedings of the AAAI National Conference on Artificial Intelligence, 2012) does not rely on any particular scene, and optimizes the extrinsic parameters of the radar-camera system using only the mutual information between the three-dimensional point reflectivity provided by the radar and the camera image; however, not every radar can provide object reflectivity, and the reflectivity of objects of different materials is not identical.
Summary of the invention
In order to combine radar and camera data effectively, the object of the present invention is to provide a radar-camera system extrinsic parameter calibration method based on an arbitrary trihedron, i.e. to solve the positional transformation between the radar system and the camera system. The method uses an arbitrary trihedron, common in natural scenes, does not depend on the accuracy of manually entered information, and needs only two frames of data to solve the extrinsic parameters quickly.
The technical solution adopted by the present invention comprises the following steps:
(1) Using the radar-camera system, collect two frames of data from a calibration scene that contains an arbitrary trihedron, and define the world coordinate system O_w-X_wY_wZ_w from the trihedron. There is relative motion between the two collected frames, and every frame contains a three-dimensional point cloud collected by the radar together with the corresponding color image collected by the camera.
(2) Solve the radar system parameters: mark the points on the three faces of the trihedron in the three-dimensional point cloud, expressed as

    P_l = { P_l^{ij,k} : i = 1, 2; j = 1, 2, 3; k = 1, ..., n_l^{ij} }

where the subscript l indicates that the points are in the radar coordinate system O_l-X_lY_lZ_l, i denotes the i-th frame, j the j-th face, and k the k-th point; n_l^{ij} is the number of points on each face. Points on the same plane satisfy the plane parameters (N_l^{ij}, d_l^{ij}):

    N_l^{ij,T} P_l^{ij,k} - d_l^{ij} = 0

where N_l^{ij} is the unit normal vector of each face and d_l^{ij} its distance to the coordinate origin.
Then, according to the coplanarity of the trihedron points in the two radar frames, a nonlinear optimization is performed over all six faces of the two frames.
From {N_l^{ij}, d_l^{ij}} the rotation matrices and translation vectors R_wl^i, T_wl^i (i = 1, 2) from the world coordinate system to the radar coordinate system are obtained, and from them the motion of the radar-camera system between the two frames in the radar coordinate system, namely the rotation matrix R_{l1l2} and translation vector T_{l1l2} of the first frame relative to the second frame.
(3) Solve the camera system parameters: using the SIFT algorithm on the regions corresponding to the three trihedron faces in the two images collected by the camera, extract feature points matched between the two frames, expressed as

    P_c = { P_c^{ij,k} : i = 1, 2; j = 1, 2, 3; k = 1, ..., n_c^j }

where the subscript c indicates the camera coordinate system O_c-X_cY_cZ_c, i denotes the i-th frame, j the j-th face, and k the k-th feature point; the numbers n_c^j of matched features on corresponding faces of the two frames are equal.
From P_c compute the essential matrix E of the two-frame motion, and obtain from it the motion parameters R_{c1c2}, t_{c1c2} of the radar-camera system between the two frames in the camera coordinate system, where R_{c1c2} is the rotation of the first frame relative to the second frame and t_{c1c2} is the normalized translation vector:

    t_{c1c2} = T_{c1c2} / ‖T_{c1c2}‖,   ‖t_{c1c2}‖ = 1

Initialize the parameters {N_c^{ij}, d_c^{ij}} of each trihedron plane in the camera coordinate system from {N_l^{ij}, d_l^{ij}}; according to the projection relations of the extracted feature points, optimize {N_c^{ij}, d_c^{ij}} and R_{c1c2}, T_{c1c2}.
From {N_c^{ij}, d_c^{ij}} obtain the transformation from the world coordinate system to the camera coordinate system in the two frames, i.e. the rotation matrices R_wc^i and translation vectors T_wc^i.
(4) Solve the extrinsic parameters: use R_wc, T_wc, R_wl, T_wl to obtain the transformation from the radar coordinate system to the camera coordinate system, i.e. the rotation matrix R_lc and translation vector T_lc; then use the coplanarity of the trihedron points in the radar and camera coordinate systems to nonlinearly optimize R_lc, T_lc, yielding the optimized transformation R_lc, T_lc.
The concrete method of step (2) for solving the radar system parameters is:
Fit a plane to the points on the j-th face of the i-th frame with the linear least-squares objective F_1, obtaining the fitted parameters N_l^{ij}, d_l^{ij}:

    F_1(N_l^{ij}, d_l^{ij}) = arg min_{N_l^{ij}, d_l^{ij}} Σ_{k=1..n_l^{ij}} ‖ N_l^{ij,T} P_l^{ij,k} - d_l^{ij} ‖²

From {N_l^{ij}, d_l^{ij}} obtain the transformation R_wl^i, T_wl^i from the world to the radar coordinate system (the frame index i is omitted below):

    r_wl1 = (N_l^1 × N_l^3) / ‖N_l^1 × N_l^3‖,   r_wl3 = N_l^3,   r_wl2 = r_wl3 × r_wl1
    T_wl = [N_l^1  N_l^2  N_l^3]^{-T} [d_l^1  d_l^2  d_l^3]^T

Here r_wl1, r_wl2, r_wl3 are the column vectors of R_wl.
From R_wl^i, T_wl^i (i = 1, 2) obtain the motion of the first frame relative to the second frame in the radar coordinate system:

    R_{l1l2} = R_wl^2 (R_wl^1)^{-1},   T_{l1l2} = T_wl^2 - R_{l1l2} T_wl^1

According to the coplanarity of points on corresponding faces of the two frames, nonlinearly optimize all {N_l^{ij}, d_l^{ij}} with the objective

    F_2({N_l^{ij}, d_l^{ij}}) = arg min Σ_{j=1..3} Σ_{k=1..n_l^{2j}} ‖ N_l^{1j,T} (R_{l2l1} P_l^{2j,k} + T_{l2l1}) - d_l^{1j} ‖² + Σ_{j=1..3} Σ_{k=1..n_l^{1j}} ‖ N_l^{2j,T} (R_{l1l2} P_l^{1j,k} + T_{l1l2}) - d_l^{2j} ‖²

where R_{l2l1}, T_{l2l1} are the rotation and translation of the second frame relative to the first frame; after the optimization, the optimized {N_l^{ij}, d_l^{ij}} are used to update R_wl^i, T_wl^i, R_{l1l2}, T_{l1l2}.
The concrete method of step (3) for solving the camera system parameters is:
The relation between a two-dimensional image point p_c and the corresponding scene point P_{s,c} in the camera coordinate system is

    p_c = G(P_{s,c})

where G(·) is the projection function, determined by the known camera type and intrinsic parameters.
The epipolar constraint on corresponding points of the two frames gives

    G^{-1}(p_{c1})^T E G^{-1}(p_{c2}) = 0

where E is the essential matrix. E is solved by existing mature methods and decomposed into the motion R_{c1c2}, t_{c1c2} between the two frames in the camera coordinate system, where t_{c1c2} is the normalized translation vector:

    t_{c1c2} = T_{c1c2} / ‖T_{c1c2}‖,   ‖t_{c1c2}‖ = 1

Because the radar and the camera in the radar-camera system are mounted close together and the rotation between their coordinate systems is small, the unknown camera-frame parameters are initialized from the radar-frame parameters:

    {N_c^{ij}, d_c^{ij}} = {N_l^{ij}, d_l^{ij}},   ‖T_{c1c2}‖ = ‖T_{l1l2}‖

Then each matched feature point is back-projected onto its plane to recover its three-dimensional coordinates, reprojected onto the other image, and the reprojection error is minimized to optimize N_c^{ij}, d_c^{ij}, R_{c1c2}, T_{c1c2}:

    F_3(N_c^{1j}, d_c^{1j}, N_c^{2j}, d_c^{2j}, R_{c1c2}, T_{c1c2}) = arg min Σ_{j=1..3} Σ_{k=1..n_c^j} [ | p_{c2}^{j,k} - G( R_{c1c2} (d_c^{1j} / (N_c^{1j,T} G^{-1}(p_{c1}^{j,k}))) G^{-1}(p_{c1}^{j,k}) + T_{c1c2} ) |² + | p_{c1}^{j,k} - G( R_{c2c1} (d_c^{2j} / (N_c^{2j,T} G^{-1}(p_{c2}^{j,k}))) G^{-1}(p_{c2}^{j,k}) + T_{c2c1} ) |² ]

where R_{c2c1}, T_{c2c1} are the rotation and translation of the second frame relative to the first frame.
From {N_c^{ij}, d_c^{ij}} obtain the transformation R_wc^i, T_wc^i from the world to the camera coordinate system (the frame index i is omitted below):

    r_wc1 = (N_c^1 × N_c^3) / ‖N_c^1 × N_c^3‖,   r_wc3 = N_c^3,   r_wc2 = r_wc3 × r_wc1
    T_wc = [N_c^1  N_c^2  N_c^3]^{-T} [d_c^1  d_c^2  d_c^3]^T

Here r_wc1, r_wc2, r_wc3 are the column vectors of R_wc.
The concrete method of step (4) for solving the extrinsic parameters is:
From R_wc^i, T_wc^i and R_wl^i, T_wl^i of either frame, solve the transformation R_lc, T_lc between the radar and camera coordinate systems:

    R_lc = R_wc^i (R_wl^i)^{-1},   T_lc = T_wc^i - R_wc^i (R_wl^i)^{-1} T_wl^i,   i = 1, 2

Further, using the coplanarity of points on corresponding trihedron faces in the radar and camera coordinate systems, nonlinearly optimize R_lc, T_lc according to the objective F_4, yielding the optimized R_lc, T_lc:

    F_4(R_lc, T_lc) = arg min_{R_lc, T_lc} Σ_{i=1,2} Σ_{j=1..3} Σ_{k=1..n_l^{ij}} ‖ N_c^{ij,T} (R_lc P_l^{ij,k} + T_lc) - d_c^{ij} ‖²
The beneficial effects of the present invention are:
The invention combines good noise immunity with low operational complexity; it requires no special arrangement of the environment and is suitable for arbitrary illumination, weather, and other external conditions. The scene required by the invention is simple and common, and the method offers strong interference immunity, simple experimental equipment, and high flexibility.
Brief description of the drawings
Fig. 1 is the overall flowchart of the present invention.
Fig. 2 is a schematic diagram of the definition of, and relations between, the three coordinate systems in the radar-camera calibration process.
Embodiments
The present invention is further described below with reference to the drawings and specific embodiments.
Fig. 1 shows the technical flow of the radar-camera extrinsic calibration method. The calibration comprises four parts: 1. collect two frames of radar-camera data; 2. solve the radar system parameters: using the coordinates of the points on the trihedron faces under the radar system, solve the parameters of each plane in the two radar frames, the relative motion parameters, and the transformation to the world coordinate system; 3. solve the camera system parameters: with the known camera intrinsics and the parameters from the radar system, solve the parameters of each plane in the two camera frames, the relative motion parameters, and the transformation to the world coordinate system; 4. solve the extrinsic parameters: from the relations of the radar and camera coordinate systems to the world coordinate system, solve the radar-to-camera transformation, i.e. the extrinsic parameters of the radar-camera system, and refine it using the coplanarity of points on the same plane observed in the different sensor systems.
1. Collect two frames of radar-camera system data:
As shown in Fig. 2, the world coordinate system O_w-X_wY_wZ_w is defined from the trihedron in the scene: the intersection point of the three faces is the origin O_w, the intersection line of plane 1 and plane 3 is the X_w axis, the normal direction of plane 3 is the Z_w axis, and the Y_w direction then follows from the right-hand rule.
With the radar-camera system, two frames of data are collected from the calibration scene, with relative motion between the two frames; both sensors must be able to observe every face of the trihedron simultaneously.
2. Solve the radar system parameters:
In the radar data, mark the points on the three faces of the trihedron, expressed as

    P_l = { P_l^{ij,k} : i = 1, 2; j = 1, 2, 3; k = 1, ..., n_l^{ij} }    (1)

where the subscript l indicates that the points are in the radar coordinate system O_l-X_lY_lZ_l, i denotes the i-th frame, j the j-th face, and k the k-th point; n_l^{ij} is the number of points on each face. Points on the same plane can be represented by the plane parameters (N_l^{ij}, d_l^{ij}):

    N_l^{ij,T} P_l^{ij,k} - d_l^{ij} = 0    (2)

where N_l^{ij} is the unit normal vector of each face and d_l^{ij} its distance to the coordinate origin.
Fit a plane to the points on the j-th face of the i-th frame with the linear least-squares objective F_1, obtaining the parameters N_l^{ij}, d_l^{ij}:

    F_1(N_l^{ij}, d_l^{ij}) = arg min_{N_l^{ij}, d_l^{ij}} Σ_{k=1..n_l^{ij}} ‖ N_l^{ij,T} P_l^{ij,k} - d_l^{ij} ‖²    (3)
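Objective F_1 has a standard closed-form solution via the SVD of the centered points. A minimal numpy sketch (our illustration, not code from the patent; the function name is ours):

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of N^T P - d = 0 to one trihedron face (objective F1).

    points: (n, 3) array of radar points on the face.
    With ||N|| = 1, the optimum has d = N^T centroid and N equal to the
    right-singular vector of the centered points with the smallest
    singular value.
    """
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - centroid)
    N = Vt[-1]                      # normal: direction of least variance
    d = N @ centroid
    if d < 0:                       # orient so d is the distance to the origin
        N, d = -N, -d
    return N, d
```

In practice the normals of the three faces would also be oriented consistently (e.g. toward the sensor) before the world frame is constructed from them.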
From {N_l^{ij}, d_l^{ij}} the transformation R_wl^i, T_wl^i (i = 1, 2) from the world coordinate system to the radar coordinate system can be obtained (the frame index i is omitted below):

    r_wl1 = (N_l^1 × N_l^3) / ‖N_l^1 × N_l^3‖,   r_wl3 = N_l^3,   r_wl2 = r_wl3 × r_wl1
    T_wl = [N_l^1  N_l^2  N_l^3]^{-T} [d_l^1  d_l^2  d_l^3]^T    (4)

Here r_wl1, r_wl2, r_wl3 are the column vectors of R_wl. From R_wl^i, T_wl^i (i = 1, 2) the motion of the first frame relative to the second frame in the radar coordinate system can be obtained:

    R_{l1l2} = R_wl^2 (R_wl^1)^{-1},   T_{l1l2} = T_wl^2 - R_{l1l2} T_wl^1    (5)
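Equations (4) and (5) can be sketched as follows, assuming the world-to-sensor convention X_l = R_wl X_w + T_wl implied by the column construction above (function names are illustrative, not from the patent):

```python
import numpy as np

def world_to_radar(planes):
    """Eq. (4): build (R_wl, T_wl) from the three fitted trihedron planes.

    planes: [(N1, d1), (N2, d2), (N3, d3)] for faces 1..3, unit normals.
    Columns of R_wl are the world axes expressed in radar coordinates;
    T_wl is the trihedron vertex, the common point of the three planes.
    """
    (N1, d1), (N2, d2), (N3, d3) = planes
    r1 = np.cross(N1, N3)
    r1 /= np.linalg.norm(r1)              # X_w: intersection of planes 1 and 3
    r3 = N3 / np.linalg.norm(N3)          # Z_w: normal of plane 3
    r2 = np.cross(r3, r1)                 # Y_w: right-hand rule
    R_wl = np.column_stack([r1, r2, r3])
    # The vertex solves N_j^T X = d_j for j = 1, 2, 3
    T_wl = np.linalg.solve(np.vstack([N1, N2, N3]), np.array([d1, d2, d3]))
    return R_wl, T_wl

def relative_motion(R_wl1, T_wl1, R_wl2, T_wl2):
    """Eq. (5): motion of frame 1 relative to frame 2, X_l2 = R X_l1 + T."""
    R = R_wl2 @ R_wl1.T                   # inverse of a rotation = transpose
    T = T_wl2 - R @ T_wl1
    return R, T
```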
According to the coplanarity of points on corresponding faces of the two frames, all {N_l^{ij}, d_l^{ij}} are nonlinearly optimized, and the optimized {N_l^{ij}, d_l^{ij}} are then used to update R_wl^i, T_wl^i, R_{l1l2}, T_{l1l2}.
The objective of the optimization is:

    F_2({N_l^{ij}, d_l^{ij}}) = arg min Σ_{j=1..3} Σ_{k=1..n_l^{2j}} ‖ N_l^{1j,T} (R_{l2l1} P_l^{2j,k} + T_{l2l1}) - d_l^{1j} ‖² + Σ_{j=1..3} Σ_{k=1..n_l^{1j}} ‖ N_l^{2j,T} (R_{l1l2} P_l^{1j,k} + T_{l1l2}) - d_l^{2j} ‖²    (6)

where R_{l2l1}, T_{l2l1} are the rotation and translation of the second frame relative to the first frame, related to R_{l1l2}, T_{l1l2} by:

    R_{l2l1} = R_{l1l2}^{-1},   T_{l2l1} = -R_{l1l2}^{-1} T_{l1l2}    (7)
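The residuals inside objective F_2 (points of one frame, moved into the other frame, must lie on that frame's planes) can be sketched as follows; a numpy-only illustration with our own names, where a Gauss-Newton or Levenberg-Marquardt solver would minimize the squared norm of this vector over the plane parameters:

```python
import numpy as np

def f2_residuals(planes1, planes2, pts1, pts2, R12, T12):
    """Cross-frame coplanarity residuals of objective F2 (Eq. 6), as a sketch.

    planes1/planes2: per-frame lists of three (N, d) pairs.
    pts1/pts2: per-frame lists of three (n_j, 3) point arrays.
    R12, T12: motion of frame 1 relative to frame 2 (X_2 = R12 X_1 + T12).
    Returns one flat residual vector.
    """
    # Eq. (7): inverse motion, frame 2 relative to frame 1
    R21 = R12.T
    T21 = -R12.T @ T12
    res = []
    for (N1, d1), (N2, d2), P1, P2 in zip(planes1, planes2, pts1, pts2):
        # frame-2 points moved into frame 1 must lie on frame 1's plane...
        res.append((P2 @ R21.T + T21) @ N1 - d1)
        # ...and frame-1 points moved into frame 2 on frame 2's plane
        res.append((P1 @ R12.T + T12) @ N2 - d2)
    return np.concatenate(res)
```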
3. Solve the parameters in the camera coordinate system:
Feature points are extracted with the SIFT algorithm (for the implementation see document 6: Lowe, David G. (1999), "Object recognition from local scale-invariant features", Proceedings of the International Conference on Computer Vision, 2, pp. 1150-1157) and matched between corresponding faces of the two images, expressed as

    P_c = { P_c^{ij,k} : i = 1, 2; j = 1, 2, 3; k = 1, ..., n_c^j }    (8)

where the subscript c indicates the camera coordinate system O_c-X_cY_cZ_c, i denotes the i-th frame, j the j-th face, and k the k-th feature point; the numbers n_c^j of matched features on corresponding faces of the two frames are equal.
The relation between a two-dimensional image point p_c and the corresponding scene point P_{s,c} in the camera coordinate system is

    p_c = G(P_{s,c})    (9)

where G(·) is the projection function, determined by the known camera type and intrinsic parameters.
The epipolar constraint on corresponding points of the two frames gives

    G^{-1}(p_{c1})^T E G^{-1}(p_{c2}) = 0    (10)

where E is the essential matrix. E can be solved with the mature eight-point method of document 7 (R.I. Hartley and A. Zisserman, "Multiple View Geometry in Computer Vision", Cambridge University Press, ISBN: 0521540518, 2004, 2nd ed.) and decomposed into the motion R_{c1c2}, t_{c1c2} between the two frames in the camera coordinate system, where t_{c1c2} is the normalized translation vector:

    t_{c1c2} = T_{c1c2} / ‖T_{c1c2}‖,   ‖t_{c1c2}‖ = 1    (11)
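The SVD-based decomposition of the essential matrix into R_{c1c2} and t_{c1c2}, as in Hartley and Zisserman, can be sketched as follows (an illustrative recipe, not the patent's exact procedure; the physically valid candidate among the four combinations is selected by checking that matched points triangulate in front of both cameras):

```python
import numpy as np

def decompose_essential(E):
    """Decompose an essential matrix into candidate motions (standard SVD
    recipe, a sketch). Returns ((R1, R2), t) with ||t|| = 1 as in Eq. (11);
    the valid motion is one of (R1, +-t), (R2, +-t).
    """
    U, _, Vt = np.linalg.svd(E)
    # Enforce proper rotations in the factors
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    R1 = U @ W @ Vt
    R2 = U @ W.T @ Vt
    t = U[:, 2]                      # normalized translation direction
    return (R1, R2), t
```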
Because the radar and the camera in the radar-camera system are mounted close together and the rotation between their coordinate systems is small, the unknown camera-frame parameters are initialized from the radar-frame parameters:

    {N_c^{ij}, d_c^{ij}} = {N_l^{ij}, d_l^{ij}},   ‖T_{c1c2}‖ = ‖T_{l1l2}‖    (12)

Then each matched feature point on one image is back-projected onto its plane to recover its three-dimensional coordinates, reprojected onto the other image, and the reprojection error is minimized to optimize N_c^{ij}, d_c^{ij}, R_{c1c2}, T_{c1c2}:

    F_3(N_c^{1j}, d_c^{1j}, N_c^{2j}, d_c^{2j}, R_{c1c2}, T_{c1c2}) = arg min Σ_{j=1..3} Σ_{k=1..n_c^j} [ | p_{c2}^{j,k} - G( R_{c1c2} (d_c^{1j} / (N_c^{1j,T} G^{-1}(p_{c1}^{j,k}))) G^{-1}(p_{c1}^{j,k}) + T_{c1c2} ) |² + | p_{c1}^{j,k} - G( R_{c2c1} (d_c^{2j} / (N_c^{2j,T} G^{-1}(p_{c2}^{j,k}))) G^{-1}(p_{c2}^{j,k}) + T_{c2c1} ) |² ]

where R_{c2c1}, T_{c2c1} are the rotation and translation of the second frame relative to the first frame, related to R_{c1c2}, T_{c1c2} by:

    R_{c2c1} = R_{c1c2}^{-1},   T_{c2c1} = -R_{c1c2}^{-1} T_{c1c2}    (13)
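The point transfer inside objective F_3 (back-project a pixel onto its plane, move the 3D point to the other frame, and reproject) can be sketched as follows, assuming for illustration a pinhole camera with intrinsic matrix K, i.e. G(X) = dehomogenize(K X); K and the function name are assumptions, not from the patent:

```python
import numpy as np

def transfer_via_plane(p1, K, N1, d1, R12, T12):
    """Transfer a pixel from image 1 to image 2 through a scene plane,
    the point-transfer step used inside objective F3 (a sketch).

    p1: (2,) pixel in image 1 lying on the plane N1^T X = d1 (frame-1 coords).
    R12, T12: motion of frame 1 relative to frame 2 (X_2 = R12 X_1 + T12).
    Returns the predicted (2,) pixel in image 2.
    """
    ray = np.linalg.solve(K, np.array([p1[0], p1[1], 1.0]))  # G^{-1}(p1)
    X1 = (d1 / (N1 @ ray)) * ray      # scale the ray onto the plane
    X2 = R12 @ X1 + T12               # move the 3D point into frame 2
    q = K @ X2                        # project: G(X2)
    return q[:2] / q[2]
```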
From {N_c^{ij}, d_c^{ij}} the transformation R_wc^i, T_wc^i (i = 1, 2) from the world coordinate system to the camera coordinate system can be obtained (the frame index i is omitted below):

    r_wc1 = (N_c^1 × N_c^3) / ‖N_c^1 × N_c^3‖,   r_wc3 = N_c^3,   r_wc2 = r_wc3 × r_wc1
    T_wc = [N_c^1  N_c^2  N_c^3]^{-T} [d_c^1  d_c^2  d_c^3]^T    (14)

Here r_wc1, r_wc2, r_wc3 are the column vectors of R_wc.
4. Solve the extrinsic parameters:
From R_wc^i, T_wc^i and R_wl^i, T_wl^i of either frame, the transformation R_lc, T_lc between the radar and camera coordinate systems can be solved:

    R_lc = R_wc^i (R_wl^i)^{-1},   T_lc = T_wc^i - R_wc^i (R_wl^i)^{-1} T_wl^i,   i = 1, 2    (15)

Further, using the coplanarity of the points on corresponding trihedron faces in the radar and camera coordinate systems, R_lc, T_lc are nonlinearly optimized according to the objective F_4, yielding the optimized R_lc, T_lc:

    F_4(R_lc, T_lc) = arg min_{R_lc, T_lc} Σ_{i=1,2} Σ_{j=1..3} Σ_{k=1..n_l^{ij}} ‖ N_c^{ij,T} (R_lc P_l^{ij,k} + T_lc) - d_c^{ij} ‖²    (16)
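The closed-form extrinsics of Eq. (15) and the coplanarity cost of objective F_4 (Eq. 16) can be sketched as follows (illustrative names; the actual refinement would run a nonlinear solver over a rotation parameterization):

```python
import numpy as np

def radar_to_camera(R_wl, T_wl, R_wc, T_wc):
    """Eq. (15): radar-to-camera extrinsics from one frame's world-to-radar
    and world-to-camera transforms (X_c = R_lc X_l + T_lc)."""
    R_lc = R_wc @ R_wl.T
    T_lc = T_wc - R_lc @ T_wl
    return R_lc, T_lc

def f4_cost(R_lc, T_lc, radar_pts, cam_planes):
    """Coplanarity cost of objective F4 (Eq. 16), as a sketch: radar points
    mapped into the camera frame should lie on the camera-frame planes.
    radar_pts and cam_planes are per-frame, per-face lists."""
    cost = 0.0
    for pts_i, planes_i in zip(radar_pts, cam_planes):
        for P, (N, d) in zip(pts_i, planes_i):
            r = (P @ R_lc.T + T_lc) @ N - d
            cost += float(r @ r)
    return cost
```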
This completes the extrinsic parameter calibration of the radar-camera system.

Claims (1)

1. A radar-camera system extrinsic parameter calibration method based on an arbitrary trihedron, characterized in that the method comprises the following steps:
(1) using the radar-camera system, collecting two frames of data from a calibration scene containing an arbitrary trihedron, and defining the world coordinate system O_w-X_wY_wZ_w from the trihedron, wherein there is relative motion between the two collected frames and every frame contains a three-dimensional point cloud collected by the radar together with the corresponding color image collected by the camera;
(2) solving the radar system parameters: marking the points on the three faces of the trihedron in the point cloud, expressed as

    P_l = { P_l^{ij,k} : i = 1, 2; j = 1, 2, 3; k = 1, ..., n_l^{ij} }

wherein the subscript l denotes the radar coordinate system O_l-X_lY_lZ_l, i denotes the i-th frame, j the j-th face, and k the k-th point, n_l^{ij} being the number of points on each face; points on the same plane satisfy the plane parameters (N_l^{ij}, d_l^{ij}):

    N_l^{ij,T} P_l^{ij,k} - d_l^{ij} = 0

wherein N_l^{ij} is the unit normal vector of each face and d_l^{ij} its distance to the coordinate origin; then, according to the coplanarity of the trihedron points in the two radar frames, performing a nonlinear optimization over all six faces of the two frames; obtaining from {N_l^{ij}, d_l^{ij}} the rotation matrices and translation vectors R_wl^i, T_wl^i (i = 1, 2) from the world coordinate system to the radar coordinate system, and from them the motion of the radar-camera system between the two frames in the radar coordinate system, namely the rotation matrix R_{l1l2} and translation vector T_{l1l2} of the first frame relative to the second frame;
(3) solving the camera system parameters: extracting, with the SIFT algorithm, feature points matched between the two frames from the regions of the three trihedron faces in the two images collected by the camera, expressed as

    P_c = { P_c^{ij,k} : i = 1, 2; j = 1, 2, 3; k = 1, ..., n_c^j }

wherein the subscript c denotes the camera coordinate system O_c-X_cY_cZ_c, i denotes the i-th frame, j the j-th face, and k the k-th feature point, the numbers n_c^j of matched features on corresponding faces of the two frames being equal; computing from P_c the essential matrix E of the two-frame motion, and obtaining from it the motion parameters R_{c1c2}, t_{c1c2} of the radar-camera system between the two frames in the camera coordinate system, wherein R_{c1c2} is the rotation of the first frame relative to the second frame and t_{c1c2} is the normalized translation vector:

    t_{c1c2} = T_{c1c2} / ‖T_{c1c2}‖,   ‖t_{c1c2}‖ = 1

initializing the parameters {N_c^{ij}, d_c^{ij}} of each trihedron plane in the camera coordinate system from {N_l^{ij}, d_l^{ij}}, and optimizing {N_c^{ij}, d_c^{ij}} and R_{c1c2}, T_{c1c2} according to the projection relations of the extracted feature points; obtaining from {N_c^{ij}, d_c^{ij}} the transformation from the world coordinate system to the camera coordinate system in the two frames, namely the rotation matrices R_wc^i and translation vectors T_wc^i (i = 1, 2);
(4) solving the extrinsic parameters: obtaining from R_wc, T_wc, R_wl, T_wl the transformation from the radar coordinate system to the camera coordinate system, namely the rotation matrix R_lc and translation vector T_lc, and nonlinearly optimizing R_lc, T_lc using the coplanarity of the trihedron points in the radar and camera coordinate systems, yielding the optimized transformation R_lc, T_lc;
the concrete method of step (2) for solving the radar system parameters being:
fitting a plane to the points on the j-th face of the i-th frame with the linear least-squares objective F_1, obtaining the parameters N_l^{ij}, d_l^{ij}:

    F_1(N_l^{ij}, d_l^{ij}) = arg min_{N_l^{ij}, d_l^{ij}} Σ_{k=1..n_l^{ij}} ‖ N_l^{ij,T} P_l^{ij,k} - d_l^{ij} ‖²

obtaining from {N_l^{ij}, d_l^{ij}} the transformation R_wl^i, T_wl^i (i = 1, 2) from the world to the radar coordinate system (the frame index i being omitted below):

    r_wl1 = (N_l^1 × N_l^3) / ‖N_l^1 × N_l^3‖,   r_wl3 = N_l^3,   r_wl2 = r_wl3 × r_wl1
    T_wl = [N_l^1  N_l^2  N_l^3]^{-T} [d_l^1  d_l^2  d_l^3]^T

wherein r_wl1, r_wl2, r_wl3 are the column vectors of R_wl; obtaining from R_wl^i, T_wl^i (i = 1, 2) the motion of the first frame relative to the second frame in the radar coordinate system:

    R_{l1l2} = R_wl^2 (R_wl^1)^{-1},   T_{l1l2} = T_wl^2 - R_{l1l2} T_wl^1

nonlinearly optimizing all {N_l^{ij}, d_l^{ij}} according to the coplanarity of points on corresponding faces of the two frames, with the objective

    F_2({N_l^{ij}, d_l^{ij}}) = arg min Σ_{j=1..3} Σ_{k=1..n_l^{2j}} ‖ N_l^{1j,T} (R_{l2l1} P_l^{2j,k} + T_{l2l1}) - d_l^{1j} ‖² + Σ_{j=1..3} Σ_{k=1..n_l^{1j}} ‖ N_l^{2j,T} (R_{l1l2} P_l^{1j,k} + T_{l1l2}) - d_l^{2j} ‖²

wherein R_{l2l1}, T_{l2l1} are the rotation and translation of the second frame relative to the first frame, and the optimized {N_l^{ij}, d_l^{ij}} are used to update R_wl^i, T_wl^i, R_{l1l2}, T_{l1l2};
the concrete method of step (3) for solving the camera system parameters being:
the relation between a two-dimensional image point p_c and the corresponding scene point P_{s,c} in the camera coordinate system being

    p_c = G(P_{s,c})

wherein G(·) is the projection function, determined by the known camera type and intrinsic parameters; the epipolar constraint on corresponding points of the two frames giving

    G^{-1}(p_{c1})^T E G^{-1}(p_{c2}) = 0

wherein E is the essential matrix, solved by existing mature methods and decomposed into the motion R_{c1c2}, t_{c1c2} between the two frames in the camera coordinate system, t_{c1c2} being the normalized translation vector:

    t_{c1c2} = T_{c1c2} / ‖T_{c1c2}‖,   ‖t_{c1c2}‖ = 1

because the radar and the camera in the radar-camera system are mounted close together and the rotation between their coordinate systems is small, initializing the unknown camera-frame parameters from the radar-frame parameters:

    {N_c^{ij}, d_c^{ij}} = {N_l^{ij}, d_l^{ij}},   ‖T_{c1c2}‖ = ‖T_{l1l2}‖

then back-projecting each matched feature point onto its plane to recover its three-dimensional coordinates, reprojecting it onto the other image, and minimizing the reprojection error to optimize N_c^{ij}, d_c^{ij}, R_{c1c2}, T_{c1c2}:

    F_3(N_c^{1j}, d_c^{1j}, N_c^{2j}, d_c^{2j}, R_{c1c2}, T_{c1c2}) = arg min Σ_{j=1..3} Σ_{k=1..n_c^j} [ | p_{c2}^{j,k} - G( R_{c1c2} (d_c^{1j} / (N_c^{1j,T} G^{-1}(p_{c1}^{j,k}))) G^{-1}(p_{c1}^{j,k}) + T_{c1c2} ) |² + | p_{c1}^{j,k} - G( R_{c2c1} (d_c^{2j} / (N_c^{2j,T} G^{-1}(p_{c2}^{j,k}))) G^{-1}(p_{c2}^{j,k}) + T_{c2c1} ) |² ]

wherein R_{c2c1}, T_{c2c1} are the rotation and translation of the second frame relative to the first frame; obtaining from {N_c^{ij}, d_c^{ij}} the transformation R_wc^i, T_wc^i (i = 1, 2) from the world to the camera coordinate system (the frame index i being omitted below):

    r_wc1 = (N_c^1 × N_c^3) / ‖N_c^1 × N_c^3‖,   r_wc3 = N_c^3,   r_wc2 = r_wc3 × r_wc1
    T_wc = [N_c^1  N_c^2  N_c^3]^{-T} [d_c^1  d_c^2  d_c^3]^T

wherein r_wc1, r_wc2, r_wc3 are the column vectors of R_wc;
the concrete method of step (4) for solving the extrinsic parameters being:
solving the transformation R_lc, T_lc between the radar and camera coordinate systems from R_wc^i, T_wc^i and R_wl^i, T_wl^i of either frame:

    R_lc = R_wc^i (R_wl^i)^{-1},   T_lc = T_wc^i - R_wc^i (R_wl^i)^{-1} T_wl^i,   i = 1, 2

and further, using the coplanarity of points on corresponding trihedron faces in the radar and camera coordinate systems, nonlinearly optimizing R_lc, T_lc according to the objective F_4, yielding the optimized R_lc, T_lc:

    F_4(R_lc, T_lc) = arg min_{R_lc, T_lc} Σ_{i=1,2} Σ_{j=1..3} Σ_{k=1..n_l^{ij}} ‖ N_c^{ij,T} (R_lc P_l^{ij,k} + T_lc) - d_c^{ij} ‖².
CN201210563695.5A 2012-12-21 2012-12-21 Random trihedron-based radar-camera system external parameter calibration method Expired - Fee Related CN103049912B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210563695.5A CN103049912B (en) 2012-12-21 2012-12-21 Random trihedron-based radar-camera system external parameter calibration method


Publications (2)

Publication Number Publication Date
CN103049912A (en) 2013-04-17
CN103049912B (en) 2015-03-11

Family

ID=48062541


Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105445721B (en) * 2015-12-15 2018-06-12 中国北方车辆研究所 Based on laser radar and video camera combined calibrating method with feature protrusion V-type calibration object
CN108053450B (en) * 2018-01-22 2020-06-30 浙江大学 High-precision binocular camera calibration method based on multiple constraints
CN108399643A (en) * 2018-03-15 2018-08-14 南京大学 A kind of outer ginseng calibration system between laser radar and camera and method
CN109270534B (en) * 2018-05-07 2020-10-27 西安交通大学 Intelligent vehicle laser sensor and camera online calibration method
CN109448060B (en) * 2018-09-04 2020-11-06 惠州市德赛西威智能交通技术研究院有限公司 Camera calibration parameter optimization method based on bat algorithm
CN109242913B (en) 2018-09-07 2020-11-10 百度在线网络技术(北京)有限公司 Method, device, equipment and medium for calibrating relative parameters of collector
CN109100698B (en) * 2018-09-17 2019-08-30 中国电子科技集团公司第二十八研究所 A kind of radar target spherical projection method for maritime formation
CN110969663B (en) * 2018-09-30 2023-10-03 北京魔门塔科技有限公司 Static calibration method for external parameters of camera
CN109483516B (en) * 2018-10-16 2020-06-05 浙江大学 Mechanical arm hand-eye calibration method based on space distance and polar line constraint
CN111360810A (en) * 2018-12-25 2020-07-03 深圳市优必选科技有限公司 External parameter calibration method and device for robot sensor, robot and storage medium
CN109740487B (en) * 2018-12-27 2021-06-15 广州文远知行科技有限公司 Point cloud labeling method and device, computer equipment and storage medium
CN111383279B (en) * 2018-12-29 2023-06-20 阿里巴巴集团控股有限公司 External parameter calibration method and device and electronic equipment
CN111609827B (en) * 2019-02-26 2022-01-11 上汽通用汽车有限公司 Construction method of theoretical precise datum plane of engine cylinder block and engine cylinder block
CN110162089B (en) * 2019-05-30 2020-11-03 北京三快在线科技有限公司 Unmanned driving simulation method and device
CN110599541B (en) * 2019-08-28 2022-03-11 贝壳技术有限公司 Method and device for calibrating multiple sensors and storage medium
CN110675456B (en) * 2019-09-18 2020-06-16 深圳普罗米修斯视觉技术有限公司 Method and device for calibrating external parameters of multi-depth camera and storage medium
CN111260735B (en) * 2020-01-13 2022-07-01 福州大学 External parameter calibration method for single-shot LIDAR and panoramic camera
CN111325801B (en) * 2020-01-23 2022-03-15 天津大学 Combined calibration method for laser radar and camera
TWI755765B (en) 2020-06-22 2022-02-21 中強光電股份有限公司 System for calibrating visual coordinate system and depth coordinate system, calibration method and calibration device
CN112485773B (en) * 2020-11-09 2023-06-06 中国人民解放军军事科学院国防科技创新研究院 External parameter information calibration method for laser radar and inclination angle sensor

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882313A (en) * 2010-07-14 2010-11-10 中国人民解放军国防科学技术大学 Calibration method of correlation between single line laser radar and CCD (Charge Coupled Device) camera
DE102011112243A1 (en) * 2011-09-01 2012-05-24 Daimler Ag Method for calibration of sensor device used for detecting surroundings of vehicle, involves comparing generated reference data with sensor data of sensor device, to calibrate sensor device

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
An Algorithm for Extrinsic Parameters Calibration of a Camera and a Laser Range Finder Using Line Features;Ganhua Li et al.;《Proceedings of the 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems》;20071029;第3854-3859页 *
Extrinsic calibration between a multi-layer lidar and a camera;Sergio Alberto Rodriguez Florez et al.;《IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems》;20080820;第214-219页 *
Extrinsic Self Calibration of a Camera and a 3D Laser Range Finder from Natural Scenes;D. Scaramuzza et al.;《IEEE/RSJ International Conference on Intelligent Robots and Systems》;20071029;第4164-4169页 *
Calibration of the external position relationship between an imaging lidar and a camera; Hu Feng et al.; 《光学精密工程》 (Optics and Precision Engineering); 20110430; Vol. 19, No. 4; pp. 938-943 *

Similar Documents

Publication Publication Date Title
CN103049912B (en) Random trihedron-based radar-camera system external parameter calibration method
CN110706248B (en) Visual perception mapping method based on SLAM and mobile robot
CN106826833B (en) Autonomous navigation robot system based on 3D (three-dimensional) stereoscopic perception technology
CN112634451B (en) Outdoor large-scene three-dimensional mapping method integrating multiple sensors
CA2950791C (en) Binocular visual navigation system and method based on power robot
CN104626206B (en) The posture information measuring method of robot manipulating task under a kind of non-structure environment
CN205905026U (en) Robot system based on two mesh stereovisions
CN101852609B (en) Ground obstacle detection method based on binocular stereo vision of robot
CN101794349B (en) Experimental system and method for augmented reality of teleoperation of robot
CN110097553A (en) The semanteme for building figure and three-dimensional semantic segmentation based on instant positioning builds drawing system
CN109048926A (en) A kind of intelligent robot obstacle avoidance system and method based on stereoscopic vision
CN109828658B (en) Man-machine co-fusion remote situation intelligent sensing system
CN110782524A (en) Indoor three-dimensional reconstruction method based on panoramic image
CN104376552A (en) Virtual-real registering algorithm of 3D model and two-dimensional image
CN103105851B (en) Kinesthesis teaching control method based on vision sense for remote control of robot
CN111998862B (en) BNN-based dense binocular SLAM method
Jia et al. A Survey of simultaneous localization and mapping for robot
CN104794737A (en) Depth-information-aided particle filter tracking method
CN103716399A (en) Remote interaction fruit picking cooperative asynchronous control system and method based on wireless network
CN111914615A (en) Fire-fighting area passability analysis system based on stereoscopic vision
CN109407115A (en) A kind of road surface extraction system and its extracting method based on laser radar
CN113031597A (en) Autonomous obstacle avoidance method based on deep learning and stereoscopic vision
CN109508673A (en) It is a kind of based on the traffic scene obstacle detection of rodlike pixel and recognition methods
CN115354708A (en) Excavator bucket autonomous excavation recognition control system and method based on machine vision
Zheng et al. Research on obstacle detection and path planning based on visual navigation for mobile robot

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150311

Termination date: 20181221