CN103049912B - Random trihedron-based radar-camera system external parameter calibration method - Google Patents
- Publication number
- CN103049912B CN103049912B CN201210563695.5A CN201210563695A CN103049912B CN 103049912 B CN103049912 B CN 103049912B CN 201210563695 A CN201210563695 A CN 201210563695A CN 103049912 B CN103049912 B CN 103049912B
- Authority
- CN
- China
- Prior art keywords
- radar
- camera
- frame
- under
- parameter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Radar Systems Or Details Thereof (AREA)
Abstract
The invention discloses a radar-camera system external parameter calibration method based on an arbitrary trihedron. The method can solve the external parameters of the system from only two frames of data, using a trihedron scene found in a natural environment. The method comprises the following steps: define a world coordinate system from the trihedron; fit planes to the trihedron observed by the radar to obtain the parameters of each plane, and solve the transformation between the world and radar coordinate systems and the relative motion between the two frames of data under the radar coordinate system; in the camera system, solve the essential matrix from matched feature points extracted in the front and rear frames, and then the relative motion under the camera coordinate system; solve the plane parameters under the camera coordinate system from the parameters under the radar coordinate system; finally, solve the radar-camera external parameters and perform a final optimization using the coplanarity of points on corresponding planes under the two coordinate systems. The scene required by the method is simple, and the method offers strong interference resistance, simple experimental equipment, and high flexibility.
Description
Technical field
The present invention relates to methods for calibrating the external parameters of a radar-camera system, and specifically to a radar-camera system external parameter calibration method based on an arbitrary trihedron.
Background technology
Real-time reconstruction of the terrain environment is a fundamental problem in mobile robotics. To this end, a binocular camera system or a radar sensor is usually adopted to provide three-dimensional information. However, a binocular camera system must perform dense stereo matching for three-dimensional reconstruction, which is time-consuming and easily affected by the external environment, so it cannot meet the requirements of real-time operation and accuracy; a radar sensor can provide real-time, high-precision three-dimensional information but lacks the color information of the environment. Therefore, more and more researchers combine a radar and a monocular camera into a radar-camera system that provides environmental maps with both color and three-dimensional information in real time.
In recent years, many researchers have proposed methods for radar-camera system external parameter calibration. Document 1 (Zhang, Q.; Pless, R., "Extrinsic calibration of a camera and laser range finder (improves camera calibration)", IEEE International Conference on Intelligent Robots and Systems, IROS, 2004, pp. 2301-2306.) and Document 2 (Unnikrishnan, R.; Hebert, M., "Fast extrinsic calibration of a laser rangefinder to a camera", Technical report, Carnegie Mellon University, Robotics Institute, 2005.) use a checkerboard calibration board to compute the system external parameters; the checkerboard corners must be extracted manually from the camera images, multiple images must be collected, and the method depends heavily on the accuracy of corner extraction and places certain requirements on illumination. Document 3 (Rodriguez F, S.; Fremont, V.; Bonnifait, P., "Extrinsic calibration between a multi-layer lidar and a camera", IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, 2008, pp. 214-219.) and Document 4 (Li, G.; Liu, Y.; Dong, L.; Cai, X.; Zhou, D., "An algorithm for extrinsic parameters calibration of a camera and a laser range finder using line features", IROS, 2007, pp. 3854-3859.) reduce the corner extraction error by using calibration objects of special shape and extracting the corresponding corner locations of the object from the radar and image data, but the sparsity of the radar data itself limits the accuracy achievable by these methods. Recently, Document 5 (Pandey, G., McBride, J.R., Savarese, S., Eustice, R.M., "Automatic targetless extrinsic calibration of a 3d lidar and camera by maximizing mutual information", Proceedings of the AAAI National Conference on Artificial Intelligence, 2012.) does not rely on any particular scene and optimizes the external parameters of the radar-camera system using only the mutual information between the three-dimensional point reflection intensity provided by the radar and the camera image; however, not all radars provide object reflection intensity, and the reflection intensity of objects of different materials differs.
Summary of the invention
In order to combine radar and camera data effectively, the object of the present invention is to provide a radar-camera system external parameter calibration method based on an arbitrary trihedron, i.e., to solve the positional transformation between the radar system and the camera system. The method uses arbitrary trihedra common in natural scenes, does not depend on the accuracy of manual input, and needs only two frames of data to solve the external parameters quickly.
The steps of the technical solution adopted by the present invention are as follows:
(1) Using the radar-camera system, acquire two frames of data of a calibration scene containing an arbitrary trihedron, and define the world coordinate system O_w-X_wY_wZ_w from the trihedron; there is relative motion between the two acquired frames, and each frame contains a three-dimensional point cloud acquired by the radar and a color image acquired by the camera;
(2) Solve the radar-system parameters: mark the points on the three trihedron faces in the three-dimensional point cloud, denoted P^l_{i,j,k}, where the subscript l indicates the radar coordinate system O_l-X_lY_lZ_l, i indexes the frame, j the face, and k the point, with a fixed number of points on each face. Points on the same plane are represented by the plane parameters (n, d), where n is the unit normal vector of the face and d is the distance from the plane to the coordinate origin. Then, according to the coplanarity of the trihedron points in the two radar frames, carry out nonlinear optimization over the six faces of the two frames. From the optimized plane parameters, obtain the rotation matrix and translation vector (R_wl_i, T_wl_i) from the world coordinate system to the radar coordinate system in each frame, and thereby the motion of the radar-camera system between the two frames under the radar coordinate system, i.e., the rotation matrix and translation vector (R_l12, T_l12) of the first frame relative to the second;
(3) Solve the camera-system parameters: use the SIFT algorithm on the regions corresponding to the three trihedron faces in the two images acquired by the camera to extract matched feature points between the front and rear frames, denoted P^c_{i,j,k}, where the subscript c indicates the camera coordinate system O_c-X_cY_cZ_c, i indexes the frame, j the face, and k the feature point; the numbers of matched features on corresponding faces in the two frames are equal. From P_c, compute the essential matrix E of the two-frame motion, and further obtain the motion parameters (R_c12, t_c12) of the radar-camera system between the two frames under the camera coordinate system, where R_c12 is the rotation matrix of the first frame relative to the second under the camera coordinate system and t_c12 is the normalized translation vector;
Use the plane parameters under the radar coordinate system to initialize the plane parameters (n_c, d_c) of each trihedron face under the camera coordinate system, and optimize them according to the projection relations of the extracted corresponding feature points. Then obtain the transformation from the world coordinate system to the camera coordinate system in each frame, i.e., the rotation matrix and translation vector (R_wc_i, T_wc_i);
(4) Solve the external parameters: use R_wc, T_wc, R_wl, T_wl to obtain the transformation from the radar coordinate system to the camera coordinate system, i.e., the rotation matrix R_lc and translation vector T_lc. Then use the coplanarity of the trihedron points under the radar and camera coordinate systems to carry out nonlinear optimization of R_lc, T_lc, obtaining the optimized transformation R_lc, T_lc.
The concrete method by which step (2) solves the radar-system parameters is:
Minimize the linear least-squares objective function F_1 to fit a plane to the points on the j-th face of the i-th frame, obtaining the fitting parameters (n, d). From the fitted plane parameters, obtain the transformation (R_wl_i, T_wl_i) from the world coordinate system to the radar coordinate system, where r_wl1, r_wl2, r_wl3 are the column vectors of R_wl. From the two frame transformations, obtain the motion (R_l12, T_l12) of the first frame relative to the second under the radar coordinate system. According to the coplanarity of points on corresponding faces in the two frames, carry out nonlinear optimization over all plane parameters; in the objective function, (R_l21, T_l21) are the rotation matrix and translation vector of the second frame relative to the first. After optimization, update (R_wl_i, T_wl_i) with the optimized plane parameters.
The concrete method by which step (3) solves the camera-system parameters is:
The relationship between a two-dimensional image point P_c and the corresponding surface point P_s,c under the camera coordinate system is
P_c = G(P_s,c)
where G(.) is the projection function determined by the known camera type and intrinsic parameters.
From the epipolar geometry constraint on corresponding points in the front and rear frames,
G^-1(P_s,c1)^T E G^-1(P_s,c2) = 0
where E is the essential matrix. Solve the essential matrix with an existing mature method, and decompose from it the motion (R_c12, t_c12) between the two frames under the camera coordinate system, where t_c12 is the normalized translation vector.
Because the radar and camera are positioned close together in the radar-camera system and the rotation between their coordinate systems is small, initialize the parameters to be solved under the camera coordinate system from the parameters under the radar coordinate system. Then use the matched feature points on the images to optimize the re-projection error obtained after recovering the three-dimensional coordinates and projecting onto the other frame image; in the objective function, (R_c21, t_c21) are the rotation matrix and translation vector of the second frame relative to the first.
Finally, obtain the transformation (R_wc_i, T_wc_i) from the world coordinate system to the camera coordinate system, where r_wc1, r_wc2, r_wc3 are the column vectors of R_wc.
The concrete method by which step (4) solves the external parameters is:
From either frame of data, solve the transformation R_lc, T_lc between the radar and camera coordinate systems. Further, using the coplanarity of points on corresponding trihedron faces under the radar and camera coordinate systems, carry out nonlinear optimization of R_lc, T_lc according to the objective function F_4, obtaining the optimized R_lc, T_lc.
The beneficial effects of the present invention are:
The present invention combines good noise immunity with low operational complexity; it needs no special setup of the environment and is suitable for arbitrary external conditions of illumination and weather. The scene required by the present invention is simple and common, and the method offers strong interference resistance, simple experimental equipment, and high flexibility.
Description of the drawings
Fig. 1 is overview flow chart of the present invention.
Fig. 2 is three coordinate system definition and relation schematic diagram in radar-camera system calibration process.
Detailed description
The present invention is further described below in conjunction with the drawings and a specific embodiment.
Fig. 1 shows the technical flow of the radar-camera system external parameter calibration method.
The external parameter calibration of the radar-camera system comprises the following four parts: 1. acquire two frames of radar-camera system data; 2. solve the radar-system parameters: using the coordinates of points on the trihedron faces under the radar system, solve the parameters of each plane in the two radar frames, the relative motion parameters, and the transformation to the world coordinate system; 3. solve the camera-system parameters: according to the known camera parameters and using the parameters under the radar system, solve the parameters of each plane in the two camera frames, the relative motion parameters, and the transformation to the world coordinate system; 4. solve the external parameters: from the relationship of the radar and camera coordinate systems to the world coordinate system, solve the radar-to-camera transformation, i.e., the external parameters of the radar-camera system, and perform a final optimization using the coplanarity of points on the same plane under the different sensor systems.
1. Acquire two frames of radar-camera system data:
As shown in Fig. 2, define the world coordinate system O_w-X_wY_wZ_w from the trihedron in the scene: take the intersection point of the three trihedron faces as the origin O_w, the intersection line of plane 1 and plane 3 as the X_w axis, and the normal vector direction of plane 3 as the Z_w axis; the Y_w direction is then determined by the right-hand rule.
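The frame construction above can be sketched from three fitted plane parameters of the form n . p = d with unit normals. This is an illustrative implementation, not the patent's exact formulation; the function name and argument layout are assumptions.

```python
import numpy as np

def world_frame_from_planes(normals, dists):
    """Build the world frame O_w-X_wY_wZ_w from the three trihedron
    planes n_j . p = d_j expressed in sensor coordinates (sketch).
    Returns (R_wl, T_wl) such that P_l = R_wl @ P_w + T_wl; the
    columns of R_wl are the world axes in sensor coordinates."""
    n1, n2, n3 = (n / np.linalg.norm(n) for n in normals)
    # Origin O_w: the common intersection point of the three planes.
    N = np.vstack([n1, n2, n3])
    origin = np.linalg.solve(N, np.asarray(dists, float))
    # X_w: direction of the intersection line of plane 1 and plane 3.
    x = np.cross(n1, n3)
    x /= np.linalg.norm(x)
    # Z_w: normal of plane 3, re-orthogonalized against x for noisy fits.
    z = n3 - np.dot(n3, x) * x
    z /= np.linalg.norm(z)
    # Y_w from the right-hand rule (x cross y = z).
    y = np.cross(z, x)
    R_wl = np.column_stack([x, y, z])  # columns r_wl1, r_wl2, r_wl3
    return R_wl, origin
```

With three mutually orthogonal planes the returned rotation is exactly the matrix whose columns are the trihedron's edge and normal directions.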
Using the radar-camera system, acquire two frames of data of the calibration scene, with relative motion between the two frames; both sensors must be able to observe every face of the trihedron simultaneously.
2. Solve the radar-system parameters:
In the radar data, mark out the points on the three trihedron faces, denoted P^l_{i,j,k}, where the subscript l indicates the radar coordinate system O_l-X_lY_lZ_l, i indexes the frame, j the face, and k the point, with a fixed number of points on each face. Points on the same plane can be represented by the parameters (n, d), where n is the unit normal vector of the face and d is the distance from the plane to the coordinate origin.
Minimize the linear least-squares objective function F_1 to fit a plane to the points on the j-th face of the i-th frame, obtaining the parameters (n, d).
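A minimal least-squares plane fit of this kind can be written via the SVD of the centered points; the normal is the direction of least variance. This is a standard sketch, not necessarily the exact form of F_1 in the patent.

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of a plane n . p = d to an (N, 3) array of
    radar points on one trihedron face (sketch of the F_1 step)."""
    P = np.asarray(points, float)
    centroid = P.mean(axis=0)
    # The normal is the right singular vector of least singular value
    # of the centered point matrix (direction of least variance).
    _, _, Vt = np.linalg.svd(P - centroid)
    n = Vt[-1]
    d = float(n @ centroid)
    if d < 0:  # fix the sign so d is the distance to the origin
        n, d = -n, -d
    return n, d
```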
From the fitted plane parameters, the transformation (R_wl_i, T_wl_i) from the world coordinate system to the radar coordinate system can be obtained, where r_wl1, r_wl2, r_wl3 are the column vectors of R_wl. From the two frame transformations, the motion (R_l12, T_l12) of the first frame relative to the second under the radar coordinate system can be obtained. According to the coplanarity of points on corresponding faces in the two frames, all plane parameters are nonlinearly optimized, and (R_wl_i, T_wl_i) are then updated with the optimized values.
In the optimization objective function, (R_l21, T_l21) are the rotation matrix and translation vector of the second frame relative to the first; they are related to (R_l12, T_l12) by R_l21 = R_l12^T and T_l21 = -R_l12^T T_l12.
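The composition of the two world-to-radar transforms into the inter-frame motion, and the inverse relation between (R_l12, T_l12) and (R_l21, T_l21), can be sketched as follows (function names are illustrative):

```python
import numpy as np

def relative_motion(R_wl1, T_wl1, R_wl2, T_wl2):
    """Motion of frame 1 relative to frame 2 under the radar frame,
    from the two world-to-radar transforms P_li = R_wli P_w + T_wli.
    The result maps a point P_l2 in frame 2 to its coordinates P_l1:
    P_l1 = R_12 @ P_l2 + T_12."""
    R_12 = R_wl1 @ R_wl2.T
    T_12 = T_wl1 - R_12 @ T_wl2
    return R_12, T_12

def invert_motion(R_12, T_12):
    """Motion of the second frame relative to the first:
    R_21 = R_12^T, T_21 = -R_12^T T_12."""
    return R_12.T, -R_12.T @ T_12
```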
3. Solve the parameters under the camera coordinate system:
Using the SIFT algorithm (for the specific implementation see Document 6 (Lowe, David G. (1999). "Object recognition from local scale-invariant features". Proceedings of the International Conference on Computer Vision. 2. pp. 1150-1157.)), extract and match feature points on corresponding faces of the front and rear frame images, denoted P^c_{i,j,k}, where the subscript c indicates the camera coordinate system O_c-X_cY_cZ_c, i indexes the frame, j the face, and k the feature point; the numbers of matched features on corresponding faces in the two frames are equal.
The relationship between a two-dimensional image point P_c and the corresponding surface point P_s,c under the camera coordinate system is
P_c = G(P_s,c) (9)
where G(.) is the projection function determined by the known camera type and intrinsic parameters.
According to the epipolar geometry constraint on corresponding points in the front and rear frames,
G^-1(P_s,c1)^T E G^-1(P_s,c2) = 0 (10)
where E is the essential matrix. The essential matrix can be solved with the mature eight-point method in Document 7 (R.I. Hartley and A. Zisserman, "Multiple View Geometry in Computer Vision", Cambridge University Press, ISBN: 0521540518, 2004, 2nd ed.), and the motion (R_c12, t_c12) between the two frames under the camera coordinate system decomposed from it, where t_c12 is the normalized translation vector.
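The decomposition of an essential matrix into motion candidates follows the standard SVD construction from Hartley and Zisserman; a sketch is given below. The physical candidate among the four is selected by the cheirality test (reconstructed points must lie in front of both cameras), which is omitted here.

```python
import numpy as np

def decompose_essential(E):
    """Decompose an essential matrix E into the four candidate
    motions (R, t) between two frames (standard SVD construction).
    t is a unit vector: the translation scale is unobservable from
    E alone, matching the normalized t_c12 of the text."""
    U, _, Vt = np.linalg.svd(E)
    # Enforce proper rotations (determinant +1).
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0., -1, 0], [1, 0, 0], [0, 0, 1]])
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    t = U[:, 2]  # null direction of E^T, up to sign
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]
```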
Because the radar and camera are positioned close together in the radar-camera system and the rotation between their coordinate systems is small, the parameters to be solved under the camera coordinate system are initialized from the parameters under the radar coordinate system. Then the matched feature points on the images are used to optimize the re-projection error obtained after recovering the three-dimensional coordinates and projecting onto the other frame image; in the objective function, (R_c21, t_c21) are the rotation matrix and translation vector of the second frame relative to the first, related to (R_c12, t_c12) by R_c21 = R_c12^T and t_c21 = -R_c12^T t_c12.
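One per-feature re-projection residual of this kind can be sketched as follows: intersect the viewing ray of the feature in frame 1 with the candidate plane, transform the recovered 3-D point into frame 2, and compare its projection with the matched feature. Normalized (intrinsics-removed) image coordinates and the function layout are assumptions of this sketch.

```python
import numpy as np

def reproject_residual(x1, x2, n, d, R_21, t_21):
    """Re-projection residual for one matched feature (sketch).
    x1, x2: normalized image coordinates (u, v) of the match in
    frames 1 and 2; (n, d): plane parameters of the trihedron face
    under the camera frame of image 1; (R_21, t_21): motion of the
    second frame relative to the first (P_2 = R_21 P_1 + t_21)."""
    ray = np.array([x1[0], x1[1], 1.0])
    depth = d / (n @ ray)        # ray-plane intersection n . P = d
    P1 = depth * ray             # recovered 3-D point in frame 1
    P2 = R_21 @ P1 + t_21        # same point in camera frame 2
    proj = P2[:2] / P2[2]        # pinhole projection into image 2
    return proj - np.asarray(x2, float)  # residual minimized over (n, d)
```

Summing the squared residuals over all matched features on all faces gives an objective of the kind minimized here; a nonlinear solver would iterate it over the plane parameters.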
From the optimized plane parameters, the transformation (R_wc_i, T_wc_i) from the world coordinate system to the camera coordinate system can be obtained, where r_wc1, r_wc2, r_wc3 are the column vectors of R_wc.
4. Solve the external parameters:
From either frame of data, the transformation R_lc, T_lc between the radar and camera coordinate systems can be solved. Further, using the coplanarity of points on corresponding trihedron faces under the radar and camera coordinate systems, R_lc, T_lc are nonlinearly optimized according to the objective function F_4, obtaining the optimized R_lc, T_lc.
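Chaining the per-frame world-to-radar and world-to-camera transforms into the radar-to-camera extrinsics can be sketched as follows (the function name is illustrative):

```python
import numpy as np

def radar_to_camera_extrinsics(R_wl, T_wl, R_wc, T_wc):
    """Radar-to-camera extrinsics from one frame's world-to-radar
    and world-to-camera transforms (sketch): since
    P_c = R_wc P_w + T_wc and P_l = R_wl P_w + T_wl, eliminating
    P_w gives P_c = R_lc @ P_l + T_lc with
    R_lc = R_wc R_wl^T and T_lc = T_wc - R_lc T_wl."""
    R_lc = R_wc @ R_wl.T
    T_lc = T_wc - R_lc @ T_wl
    return R_lc, T_lc
```

Either frame yields an estimate; the coplanarity-based nonlinear optimization then refines it using both.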
At this point, the external parameter calibration of the radar-camera system is complete.
Claims (1)
1. A radar-camera system external parameter calibration method based on an arbitrary trihedron, characterized in that the method comprises the following steps:
(1) Using the radar-camera system, acquire two frames of data of a calibration scene containing an arbitrary trihedron, and define the world coordinate system O_w-X_wY_wZ_w from the trihedron; there is relative motion between the two acquired frames, and each frame contains a three-dimensional point cloud acquired by the radar and a color image acquired by the camera;
(2) Solve the radar-system parameters: mark the points on the three trihedron faces in the three-dimensional point cloud, denoted P^l_{i,j,k}, where the subscript l indicates the radar coordinate system O_l-X_lY_lZ_l, i indexes the frame, j the face, and k the point, with a fixed number of points on each face; represent points on the same plane by the parameters (n, d), where n is the unit normal vector of the face and d is the distance from the plane to the coordinate origin; then, according to the coplanarity of the trihedron points in the two radar frames, carry out nonlinear optimization over the six faces of the two frames; obtain the rotation matrices and translation vectors (R_wl_i, T_wl_i), i = 1, 2, from the world coordinate system to the radar coordinate system in the two frames, and thereby the motion of the radar-camera system between the two frames under the radar coordinate system, i.e., the rotation matrix and translation vector (R_l12, T_l12) of the first frame relative to the second;
(3) Solve the camera-system parameters: use the SIFT algorithm on the regions corresponding to the three trihedron faces in the two images acquired by the camera to extract matched feature points between the front and rear frames, denoted P^c_{i,j,k}, where the subscript c indicates the camera coordinate system O_c-X_cY_cZ_c, i indexes the frame, j the face, and k the feature point; the numbers of matched features on corresponding faces in the two frames are equal; from P_c, compute the essential matrix E of the two-frame motion, and further obtain the motion parameters (R_c12, t_c12) of the radar-camera system between the two frames under the camera coordinate system, where R_c12 is the rotation matrix of the first frame relative to the second under the camera coordinate system and t_c12 is the normalized translation vector; use the plane parameters under the radar coordinate system to initialize the plane parameters (n_c, d_c) of each trihedron face under the camera coordinate system, and optimize them according to the projection relations of the extracted corresponding feature points; obtain the transformations from the world coordinate system to the camera coordinate system in the two frames, i.e., the rotation matrices and translation vectors (R_wc_i, T_wc_i), i = 1, 2;
(4) Solve the external parameters: use R_wc, T_wc, R_wl, T_wl to obtain the transformation from the radar coordinate system to the camera coordinate system, i.e., the rotation matrix R_lc and translation vector T_lc; use the coplanarity of the trihedron points under the radar and camera coordinate systems to carry out nonlinear optimization of R_lc, T_lc, obtaining the optimized transformation R_lc, T_lc;
The concrete method by which step (2) solves the radar-system parameters is:
Minimize the linear least-squares objective function F_1 to fit a plane to the points on the j-th face of the i-th frame, obtaining the parameters (n, d); from the fitted plane parameters, obtain the transformations (R_wl_i, T_wl_i), i = 1, 2, from the world coordinate system to the radar coordinate system, where r_wl1, r_wl2, r_wl3 are the column vectors of R_wl; from the two frame transformations, obtain the motion (R_l12, T_l12) of the first frame relative to the second under the radar coordinate system; according to the coplanarity of points on corresponding faces in the two frames, carry out nonlinear optimization over all plane parameters, where in the objective function (R_l21, T_l21) are the rotation matrix and translation vector of the second frame relative to the first; after optimization, update (R_wl_i, T_wl_i) with the optimized plane parameters;
The concrete method by which step (3) solves the camera-system parameters is:
The relationship between a two-dimensional image point P_c and the corresponding surface point P_s,c under the camera coordinate system is
P_c = G(P_s,c)
where G(.) is the projection function determined by the known camera type and intrinsic parameters;
according to the epipolar geometry constraint on corresponding points in the front and rear frames,
G^-1(P_s,c1)^T E G^-1(P_s,c2) = 0
where E is the essential matrix; solve the essential matrix with an existing mature method, and decompose from it the motion (R_c12, t_c12) between the two frames under the camera coordinate system, where t_c12 is the normalized translation vector; because the radar and camera are positioned close together in the radar-camera system and the rotation between their coordinate systems is small, initialize the parameters to be solved under the camera coordinate system from the parameters under the radar coordinate system; then use the matched feature points on the images to optimize the re-projection error obtained after recovering the three-dimensional coordinates and projecting onto the other frame image, where in the objective function (R_c21, t_c21) are the rotation matrix and translation vector of the second frame relative to the first; obtain the transformations (R_wc_i, T_wc_i), i = 1, 2, from the world coordinate system to the camera coordinate system, where r_wc1, r_wc2, r_wc3 are the column vectors of R_wc;
The concrete method by which step (4) solves the external parameters is:
From either of the two frames of data, solve the transformation R_lc, T_lc between the radar and camera coordinate systems; further, using the coplanarity of points on corresponding trihedron faces under the radar and camera coordinate systems, carry out nonlinear optimization of R_lc, T_lc according to the objective function F_4, obtaining the optimized R_lc, T_lc.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210563695.5A CN103049912B (en) | 2012-12-21 | 2012-12-21 | Random trihedron-based radar-camera system external parameter calibration method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210563695.5A CN103049912B (en) | 2012-12-21 | 2012-12-21 | Random trihedron-based radar-camera system external parameter calibration method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103049912A CN103049912A (en) | 2013-04-17 |
CN103049912B true CN103049912B (en) | 2015-03-11 |
Family
ID=48062541
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210563695.5A Expired - Fee Related CN103049912B (en) | 2012-12-21 | 2012-12-21 | Random trihedron-based radar-camera system external parameter calibration method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103049912B (en) |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105445721B (en) * | 2015-12-15 | 2018-06-12 | 中国北方车辆研究所 | Based on laser radar and video camera combined calibrating method with feature protrusion V-type calibration object |
CN108053450B (en) * | 2018-01-22 | 2020-06-30 | 浙江大学 | High-precision binocular camera calibration method based on multiple constraints |
CN108399643A (en) * | 2018-03-15 | 2018-08-14 | 南京大学 | A kind of outer ginseng calibration system between laser radar and camera and method |
CN109270534B (en) * | 2018-05-07 | 2020-10-27 | 西安交通大学 | Intelligent vehicle laser sensor and camera online calibration method |
CN109448060B (en) * | 2018-09-04 | 2020-11-06 | 惠州市德赛西威智能交通技术研究院有限公司 | Camera calibration parameter optimization method based on bat algorithm |
CN109242913B (en) | 2018-09-07 | 2020-11-10 | 百度在线网络技术(北京)有限公司 | Method, device, equipment and medium for calibrating relative parameters of collector |
CN109100698B (en) * | 2018-09-17 | 2019-08-30 | 中国电子科技集团公司第二十八研究所 | A kind of radar target spherical projection method for maritime formation |
CN110969663B (en) * | 2018-09-30 | 2023-10-03 | 北京魔门塔科技有限公司 | Static calibration method for external parameters of camera |
CN109483516B (en) * | 2018-10-16 | 2020-06-05 | 浙江大学 | Mechanical arm hand-eye calibration method based on space distance and polar line constraint |
CN111360810A (en) * | 2018-12-25 | 2020-07-03 | 深圳市优必选科技有限公司 | External parameter calibration method and device for robot sensor, robot and storage medium |
CN109740487B (en) * | 2018-12-27 | 2021-06-15 | 广州文远知行科技有限公司 | Point cloud labeling method and device, computer equipment and storage medium |
CN111383279B (en) * | 2018-12-29 | 2023-06-20 | 阿里巴巴集团控股有限公司 | External parameter calibration method and device and electronic equipment |
CN111609827B (en) * | 2019-02-26 | 2022-01-11 | 上汽通用汽车有限公司 | Construction method of theoretical precise datum plane of engine cylinder block and engine cylinder block |
CN110162089B (en) * | 2019-05-30 | 2020-11-03 | 北京三快在线科技有限公司 | Unmanned driving simulation method and device |
CN110599541B (en) * | 2019-08-28 | 2022-03-11 | 贝壳技术有限公司 | Method and device for calibrating multiple sensors and storage medium |
CN110675456B (en) * | 2019-09-18 | 2020-06-16 | 深圳普罗米修斯视觉技术有限公司 | Method and device for calibrating external parameters of multi-depth camera and storage medium |
CN111260735B (en) * | 2020-01-13 | 2022-07-01 | 福州大学 | External parameter calibration method for single-shot LIDAR and panoramic camera |
CN111325801B (en) * | 2020-01-23 | 2022-03-15 | 天津大学 | Combined calibration method for laser radar and camera |
TWI755765B (en) | 2020-06-22 | 2022-02-21 | 中強光電股份有限公司 | System for calibrating visual coordinate system and depth coordinate system, calibration method and calibration device |
CN112485773B (en) * | 2020-11-09 | 2023-06-06 | 中国人民解放军军事科学院国防科技创新研究院 | External parameter information calibration method for laser radar and inclination angle sensor |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101882313A (en) * | 2010-07-14 | 2010-11-10 | 中国人民解放军国防科学技术大学 | Calibration method of correlation between single line laser radar and CCD (Charge Coupled Device) camera |
DE102011112243A1 (en) * | 2011-09-01 | 2012-05-24 | Daimler Ag | Method for calibration of sensor device used for detecting surroundings of vehicle, involves comparing generated reference data with sensor data of sensor device, to calibrate sensor device |
- 2012
- 2012-12-21 CN CN201210563695.5A patent/CN103049912B/en not_active Expired - Fee Related
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101882313A (en) * | 2010-07-14 | 2010-11-10 | 中国人民解放军国防科学技术大学 | Calibration method of correlation between single line laser radar and CCD (Charge Coupled Device) camera |
DE102011112243A1 (en) * | 2011-09-01 | 2012-05-24 | Daimler Ag | Method for calibration of sensor device used for detecting surroundings of vehicle, involves comparing generated reference data with sensor data of sensor device, to calibrate sensor device |
Non-Patent Citations (4)
Title |
---|
An Algorithm for Extrinsic Parameters Calibration of a Camera and a Laser Range Finder Using Line Features; Ganhua Li et al.; Proceedings of the 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems; 20071029; pp. 3854-3859 *
Extrinsic calibration between a multi-layer lidar and a camera; Sergio Alberto Rodriguez Florez et al.; IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems; 20080820; pp. 214-219 *
Extrinsic Self Calibration of a Camera and a 3D Laser Range Finder from Natural Scenes; D. Scaramuzza et al.; IEEE/RSJ International Conference on Intelligent Robots and Systems; 20071029; pp. 4164-4169 *
Calibration of the exterior position relationship between an imaging lidar and a camera; Hu Feng et al.; Optics and Precision Engineering (光学精密工程); 20110430; Vol. 19, No. 4; pp. 938-943 *
Also Published As
Publication number | Publication date |
---|---|
CN103049912A (en) | 2013-04-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103049912B (en) | Random trihedron-based radar-camera system external parameter calibration method | |
CN110706248B (en) | Visual perception mapping method based on SLAM and mobile robot | |
CN106826833B (en) | Autonomous navigation robot system based on 3D (three-dimensional) stereoscopic perception technology | |
CN112634451B (en) | Outdoor large-scene three-dimensional mapping method integrating multiple sensors | |
CA2950791C (en) | Binocular visual navigation system and method based on power robot | |
CN104626206B (en) | The posture information measuring method of robot manipulating task under a kind of non-structure environment | |
CN205905026U (en) | Robot system based on two mesh stereovisions | |
CN101852609B (en) | Ground obstacle detection method based on binocular stereo vision of robot | |
CN101794349B (en) | Experimental system and method for augmented reality of teleoperation of robot | |
CN110097553A (en) | The semanteme for building figure and three-dimensional semantic segmentation based on instant positioning builds drawing system | |
CN109048926A (en) | A kind of intelligent robot obstacle avoidance system and method based on stereoscopic vision | |
CN109828658B (en) | Man-machine co-fusion remote situation intelligent sensing system | |
CN110782524A (en) | Indoor three-dimensional reconstruction method based on panoramic image | |
CN104376552A (en) | Virtual-real registering algorithm of 3D model and two-dimensional image | |
CN103105851B (en) | Kinesthesis teaching control method based on vision sense for remote control of robot | |
CN111998862B (en) | BNN-based dense binocular SLAM method | |
Jia et al. | A Survey of simultaneous localization and mapping for robot | |
CN104794737A (en) | Depth-information-aided particle filter tracking method | |
CN103716399A (en) | Remote interaction fruit picking cooperative asynchronous control system and method based on wireless network | |
CN111914615A (en) | Fire-fighting area passability analysis system based on stereoscopic vision | |
CN109407115A (en) | A kind of road surface extraction system and its extracting method based on laser radar | |
CN113031597A (en) | Autonomous obstacle avoidance method based on deep learning and stereoscopic vision | |
CN109508673A (en) | It is a kind of based on the traffic scene obstacle detection of rodlike pixel and recognition methods | |
CN115354708A (en) | Excavator bucket autonomous excavation recognition control system and method based on machine vision | |
Zheng et al. | Research on obstacle detection and path planning based on visual navigation for mobile robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20150311 Termination date: 20181221 |
|
CF01 | Termination of patent right due to non-payment of annual fee |