CN112305557B - Panoramic camera and multi-line laser radar external parameter calibration system - Google Patents


Info

Publication number
CN112305557B
CN112305557B (application CN202011128611.6A)
Authority
CN
China
Prior art keywords
point
calibration
center
laser radar
encoder
Prior art date
Legal status
Active
Application number
CN202011128611.6A
Other languages
Chinese (zh)
Other versions
CN112305557A (en
Inventor
陈诺
Current Assignee
Shenzhen Nuoda Communication Technology Co ltd
Original Assignee
Shenzhen Nuoda Communication Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Nuoda Communication Technology Co ltd filed Critical Shenzhen Nuoda Communication Technology Co ltd
Priority to CN202011128611.6A priority Critical patent/CN112305557B/en
Publication of CN112305557A publication Critical patent/CN112305557A/en
Application granted granted Critical
Publication of CN112305557B publication Critical patent/CN112305557B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses an external parameter calibration system for a panoramic camera and a multi-line laser radar, comprising a marked calibration room, a rotating device with an encoder, an algorithm that accumulates multi-line laser radar data to generate a laser intensity map, an algorithm that automatically matches features between the depth map and the panorama, and an algorithm that obtains the external parameters by optimizing the constraint relations. Two calibration plates for acquiring feature information are posted on each of the two sides, the front and the back of the marked room, and a reflecting plate is arranged at the exact center of each calibration plate for the multi-line laser radar to acquire feature information. The invention establishes the constraint relations through the marked calibration room and the specific device, and then solves for the external parameters between the panoramic camera and the laser radar by least-squares optimization, yielding a brand-new calibration system that supports automatic calibration, multi-line radars, scanning devices and the like.

Description

Panoramic camera and multi-line laser radar external parameter calibration system
Technical Field
The invention relates to the field of external parameter calibration systems, in particular to an external parameter calibration system for a panoramic camera and a multi-line laser radar.
Background
Creating high-end technical products based on the fusion of laser radar and cameras has become a domestic trend, so the related fusion technology is developing rapidly. The most basic technology underlying this fusion is the calibration of the external parameters between the two sensors, which is the basis for ensuring that the environmental information they perceive is consistent. A number of effective technical schemes have appeared to achieve this calibration, but they mainly address cameras with small distortion, and are rarely applicable to the external parameter calibration of a panoramic camera and a multi-line laser radar.
The Chinese patent with publication number CN105678783B discloses a data fusion calibration method for a catadioptric panoramic camera and a laser radar. The data fusion calibration structure comprises a laser radar and a single-viewpoint catadioptric panoramic camera arranged on an environment sensing system body. The combined calibration method comprises the following steps: 1. calibrate the internal reference K of the camera; 2. solve the refraction point parameters XmYmZm of the refracting mirror surface; 3. solve the world coordinate point parameters XwYwZw of the panoramic camera; 4. measure the laser radar world coordinate point parameters; 5. jointly calibrate the panoramic camera and the laser radar. That invention is reasonably designed, fuses the data of the laser radar and the panoramic camera, can effectively calibrate the internal parameters of the panoramic camera, and provides a reasonable, quick and effective scheme for the ranging and positioning problem in an environment sensing system.
However, the method disclosed above has three limitations. Firstly, although the panoramic camera and the laser radar can be effectively calibrated, the correspondence between camera pixels and radar points cannot be found automatically, so manual intervention is required to find corresponding points. Secondly, the method uses a single-line laser radar, so the multiple beams of a multi-line laser radar cannot be completely calibrated. Thirdly, the method is not suitable for calibrating products such as a three-dimensional scanner, which requires the panoramic camera to remain stationary while the multi-line laser radar continuously rotates through 360 degrees.
Against this technical background, there is currently no external parameter calibration method for a multi-line laser radar and a panoramic camera; similar calibration methods exist only for a single-line laser radar and a panoramic camera, and they suffer from the problems that corresponding points cannot be found automatically, calibration of all beams of a multi-line laser radar cannot be guaranteed, and the methods are not applicable to equipment such as scanners. Therefore, it is necessary to invent a panoramic camera and multi-line laser radar external parameter calibration system to solve the above problems.
Disclosure of Invention
The invention aims to provide a panoramic camera and multi-line laser radar external parameter calibration system so as to solve the problems in the background technology.
In order to achieve the above purpose, the present invention provides the following technical solutions: a panoramic camera and multi-line laser radar external parameter calibration system comprises a calibration room with a mark, a rotating device with an encoder, an accumulated multi-line laser radar generated laser intensity map algorithm, a depth map and panoramic map characteristic automatic matching algorithm and an optimization constraint relation acquisition external parameter algorithm;
two calibration plates for acquiring image feature information are posted on each of the two sides, the front and the back of the marked room, and a reflecting plate for the multi-line laser radar to acquire feature information is arranged at the exact center of each calibration plate;
the rotating device comprises a fixed base, a motor with an encoder and a placing table arranged at the top end of the fixed base, wherein the motor with the encoder is arranged in the fixed base, and the output end of the motor with the encoder extends to the top end of the fixed base and is connected with the placing table;
the intensity map generation includes the steps of:
s1, placing a panoramic camera to be calibrated and a multi-line laser radar on a placement table, then rotating the placement table through a motor with an encoder, controlling the motor with the encoder to drive the placement table to rotate 360 degrees, and recording encoder data information, multi-line laser radar data information and a panoramic image img captured by the panoramic camera at the position of the motor with the encoder;
s2, accumulating point cloud information:
converting each frame's point cloud information into the world coordinate system according to the following formulas:
X_wij = X_ij × cos(θ_i) − Y_ij × sin(θ_i);
Y_wij = X_ij × sin(θ_i) + Y_ij × cos(θ_i);
Z_wij = Z_ij;
K_wij = K_ij;
wherein {X_wij, Y_wij, Z_wij, K_wij} is the information of a radar laser point in the world coordinate system, {X_ij, Y_ij, Z_ij, K_ij} is the information of the j-th laser point in the point cloud acquired at the i-th moment, and θ_i is the rotation angle obtained from the encoder at the i-th moment;
s3, projecting the point cloud onto the panorama expansion chart according to longitude and latitude, wherein a specific projection formula is as follows:
longitude_ij = arctan(y_ij, x_ij);
v_ij = latitude_ij / 360 × img_rows;
u_ij = longitude_ij / 180 × img_cols;
wherein {img_rows, img_cols} are the numbers of pixel rows and columns of the expanded panorama image, and {v_ij, u_ij} is the pixel position in the panorama of the j-th point at the i-th moment;
s4, finally mapping the intensity of the laser points onto the panorama according to the pixel position in the step S3;
the depth map and panorama feature automatic matching algorithm comprises panorama feature extraction, laser point cloud intensity map feature extraction and feature automatic matching.
Preferably, the encoder data information in step S1 is {θ_0, θ_1, θ_2, … θ_n}, and the data information of the multi-line laser radar is {C_0, C_1, C_2, … C_n}, where C_i = {P_0, P_1, … P_m} and P_j = {x, y, z, k};
wherein θ_i is the rotation angle read from the encoder for each frame, C_i is the single-frame point cloud acquired at the i-th moment, and {x, y, z, k} are the position information and intensity information of each point in the point cloud.
Preferably, the panorama feature extraction captures the texture information of the whole calibration room, from which the position of each calibration plate can be seen intuitively; the center of each calibration plate is then determined by recognizing the plate and is taken as a feature point of the panorama. The center feature point recognition flow is as follows:
s5, firstly performing panoramic binarization, setting a threshold T, and traversing each pixel of the panoramic image;
s6, when the pixel value is larger than T, setting the pixel value to be black, otherwise setting the pixel value to be white;
s7, identifying a rectangular frame, calculating fixed point numbers along the edges of each black area of the panoramic image, and identifying the rectangular frame when the number of vertexes is equal to 4, namely, the position of the calibration plate;
s8, finally calculating the position of the center point of the calibration plate through the following formula:
u = (u_1 + u_2 + u_3 + u_4)/4;
v = (v_1 + v_2 + v_3 + v_4)/4;
wherein {u, v} is the position of the center point of the calibration plate, and {u_i, v_i}, i ∈ {1, 2, 3, 4}, are the pixel positions of the four vertices of the calibration plate.
Preferably, the extracting of the laser point cloud intensity map features includes clustering of high reflection intensity pixels and determination of feature center points, which is specifically as follows:
Firstly, clustering is performed. Because each reflecting plate has a certain area, the laser points striking it land at different positions scattered over a certain range; therefore, to determine the center position of a reflecting plate, the laser reflection points of the same plate must be clustered together. The clustering method is a threshold method: the distance between pixel points is calculated by
d_ji = sqrt((u_i − u_j)² + (v_i − v_j)²);
wherein d_ji is the distance between the i-th pixel point and the j-th pixel point, and {u_i, v_i}, {u_j, v_j} are their respective pixel positions;
then a threshold T′ is set, and points whose distance is smaller than the threshold are gathered into one class, so that the high-reflection laser points returned by the same reflecting plate are grouped together in the intensity map;
then, the position of the reflector center in the intensity map is calculated by the following formulas:
u′ = (u′_1 + u′_2 + … + u′_n)/n;
v′ = (v′_1 + v′_2 + … + v′_n)/n;
wherein {u′, v′} is the position of the reflector center in the intensity map, and {u′_i, v′_i}, i ∈ {1, 2 … n}, are the positions of the n pixel points in one cluster.
Preferably, the feature automatic matching uses the nearest neighbour method to automatically match the calibration plate centers and the reflector centers. Specifically, the distance between any two center points is calculated by
d′_ji = sqrt((u_i − u′_j)² + (v_i − v′_j)²);
wherein d′_ji is the distance between the i-th calibration plate center and the j-th reflector center; then, for each calibration plate center, the nearest reflector center is selected as its matching point, generating the matching point set {{(u_i, v_i), (u′_i, v′_i)} | i ∈ (1, 2 … n)}, and the matching of feature points is completed automatically.
Preferably, the algorithm that obtains the external parameters by optimizing the constraint relations must account for both rotation and translation between the camera and the radar. The external parameters are set as rotation {roll, pitch, yaw} and translation {tx, ty, tz}; then, for each pair of matching points in the point set established by the depth map and panorama feature automatic matching algorithm, the following equation set F is established:
x′ = cp·cy·x + (sr·sp·cy − cr·sy)·y + (cr·sp·cy + sr·sy)·z + tx;
y′ = cp·sy·x + (sr·sp·sy + cr·cy)·y + (cr·sp·sy − sr·cy)·z + ty;
z′ = −sp·x + sr·cp·y + cr·cp·z + tz;
u′ = arctan(y′, x′)/180 × img_cols;
u = u′;
v = v′;
where cp = cos(pitch), sp = sin(pitch), cr = cos(roll), sr = sin(roll), cy = cos(yaw), sy = sin(yaw);
thus the constraint relation equations for one pair of matching points are established. With n pairs of matching points in total, n equation sets {F_1, F_2 … F_n} are established, and solving them yields the external parameters {roll, pitch, yaw}, {tx, ty, tz}, thereby realizing the calibration of the external parameters of the panoramic camera and the multi-line laser radar.
The technical effects and advantages of the invention: a calibration room with calibration plates and reflecting plates is designed, with a rotating device carrying an encoder at its center. This device rotates together with the combined panoramic camera and multi-line laser radar equipment; the first frame of the panoramic camera is snapped, and the accumulated multi-line laser radar information is used to generate a laser intensity image; the panorama and the depth image are then automatically matched to establish the constraint relations; finally, the external parameters between the panoramic camera and the laser radar are solved by least-squares optimization. The result is a brand-new external parameter calibration system for a multi-line laser radar and a panoramic camera with automatic calibration, multi-line radar support, scanning-device calibration support and the like.
Drawings
FIG. 1 is a schematic view of a calibration room structure of the present invention.
Fig. 2 is a schematic view of a rotary device according to the present invention.
Fig. 3 is a schematic diagram of the device to be calibrated (panoramic camera and lidar) of the present invention.
Fig. 4 is a flow chart of laser intensity map generation according to the present invention.
FIG. 5 is a schematic view of a laser point cloud of a calibration room of the present invention.
Fig. 6 is a graph of laser point cloud intensity according to the present invention.
Fig. 7 is a flow chart of the feature automatic matching of the present invention.
Fig. 8 is the panorama of the present invention.
Fig. 9 is a flowchart for determining the position of the feature point of the panorama of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention provides a panoramic camera and multi-line laser radar external parameter calibration system as shown in figures 1-9, which comprises a calibration room with marks, a rotating device with an encoder, an algorithm for accumulating laser intensity maps generated by the multi-line laser radar, an automatic matching algorithm for depth maps and panoramic map features and an external parameter acquisition algorithm for optimizing constraint relations;
as shown in fig. 1, two calibration plates for acquiring characteristic information of an image are respectively posted on two sides, the front and the back of the marking room, and a reflecting plate for acquiring the characteristic information of the multi-line laser radar is arranged at the right center of each calibration plate;
as shown in fig. 2, the rotating device comprises a fixed base, a motor with an encoder and a placing table arranged at the top end of the fixed base, wherein the motor with the encoder is arranged in the fixed base, and the output end of the motor with the encoder extends to the top end of the fixed base to be connected with the placing table; specifically, the panoramic camera and the multi-line laser radar device to be calibrated are placed on a placing table during calibration, and then the motor with the encoder is controlled to drive the placing table to rotate in situ by more than 180 degrees.
And (3) accumulating a multi-line laser radar to generate a laser intensity graph algorithm: as shown in fig. 3, the panoramic camera and the laser radar are simultaneously arranged on the rotary table to rotate for calibration, and the structure of the panoramic camera and the laser radar is the same as that of the scanner device, so the panoramic camera and the laser radar can be suitable for calibrating the scanner device. An intensity map is then generated by the flowchart as in fig. 4.
The intensity map generation includes the steps of:
s1, placing a panoramic camera to be calibrated and a multi-line laser radar on a placing table, then rotating the placing table through a motor with an encoder, controlling the motor with the encoder to drive the placing table to rotate 360 degrees, and recording encoder data information { theta } in the whole process 0 ,θ 1 ,θ 2 ,…θ n Data information { C of } multi-line laser radar 0 ,C 1 ,C 2 ,…C n And panoramic camera grabbed panoramic image img with coded motor position, where { C } i ={P 0 ,P 1 ,…P m }|P j ={x,y,z,k}};
Wherein: θ i Refers to the rotation angle read by each frame encoder;
C i the method comprises the steps of referring to a single-frame point cloud acquired at the ith moment;
{ x, y, z, k } refers to the position information and intensity information of each point in the point cloud, respectively;
s2, accumulating point cloud information:
the non-frame point cloud information is converted into a world coordinate system (the world coordinate system is the exact center of the turntable, and the orientation is the zero position of the turntable rotation) according to the following formula:
X wij =X ij ×cos(θ i )-Y ij ×sin(θ i );
Y wij =X ij ×sin(θ i )-Y ij ×cos(θ i );
Z wij =Z ij
K wij =K ij
wherein: { X wij ,Y wij ,Z wij ,K wij Information of radar laser point in world coordinate system, { X } refers to ij ,Y ij ,Z ij ,K ij The j laser point information, theta, in the point cloud acquired at the i moment i Refer to the ith timeThe rotated angle information obtained by the encoder; all laser point information can be converted into a world coordinate system through the method, accumulation of point clouds is achieved, and a complete calibration room point cloud chart shown in fig. 5 is generated. In the figure, the black dot represents a strong reflection point, and the white dot represents a low reflection point. The reason why the strong reflection point occurs is that the reflection plate in the calibration room has strong reflectivity, so that the reflection intensity of the laser spot is large at the position where the reflection plate exists, and the reflection intensity of the laser spot is small at other positions. .
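The accumulation in step S2 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name and array layout (one (m, 4) array of x, y, z, intensity per frame) are assumptions, and the Y-axis formula uses the standard planar rotation sign convention.

```python
import numpy as np

def accumulate_point_cloud(frames, angles):
    """Rotate each single-frame point cloud into the world frame.

    frames: list of (m, 4) arrays with columns x, y, z, intensity
    angles: encoder angles theta_i in radians, one per frame
    """
    world_points = []
    for pts, theta in zip(frames, angles):
        c, s = np.cos(theta), np.sin(theta)
        # Planar rotation by the turntable angle; z and intensity pass through.
        xw = pts[:, 0] * c - pts[:, 1] * s
        yw = pts[:, 0] * s + pts[:, 1] * c
        world_points.append(np.column_stack([xw, yw, pts[:, 2], pts[:, 3]]))
    return np.vstack(world_points)
```

For example, a point at (1, 0) in a frame taken at θ = 90° lands at (0, 1) in the world frame.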
S3, projecting the point cloud onto the panorama expansion chart according to longitude and latitude, wherein a projection formula is as follows:
longitude_ij = arctan(y_ij, x_ij);
v_ij = latitude_ij / 360 × img_rows;
u_ij = longitude_ij / 180 × img_cols;
wherein {img_rows, img_cols} are the numbers of pixel rows and columns of the expanded panorama image, and {v_ij, u_ij} is the pixel position in the panorama of the j-th point at the i-th moment;
s4, finally mapping the intensity of the laser points onto the panorama according to the pixel position in the step S3 to form an intensity diagram shown in FIG. 6;
the depth map and panorama feature automatic matching algorithm comprises panorama feature extraction, laser point cloud intensity map feature extraction and feature automatic matching, and the algorithm flow is shown in a block diagram 7.
Preferably, the panorama feature extraction captures the texture information of the whole calibration room, as shown in fig. 8, from which the position of each calibration plate can be seen intuitively; the center of each calibration plate is then determined by recognizing the plate and is taken as a feature point of the panorama. The center feature point recognition flow is as follows (as shown in fig. 9):
s5, firstly performing panoramic binarization, setting a threshold T, and traversing each pixel of the panoramic image;
s6, when the pixel value is larger than T, setting the pixel value to be black, otherwise setting the pixel value to be white;
s7, identifying a rectangular frame, calculating fixed point numbers along the edges of each black area of the panoramic image, and identifying the rectangular frame when the number of vertexes is equal to 4, namely, the position of the calibration plate;
s8, finally calculating the position of the center point of the calibration plate through the following formula:
u = (u_1 + u_2 + u_3 + u_4)/4;
v = (v_1 + v_2 + v_3 + v_4)/4;
wherein {u, v} is the position of the center point of the calibration plate, and {u_i, v_i}, i ∈ {1, 2, 3, 4}, are the pixel positions of the four vertices of the calibration plate.
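Steps S5–S8 reduce to thresholding plus a vertex average, which can be sketched as follows; the rectangle detection itself (edge following and vertex counting) is assumed done upstream, and both function names are illustrative.

```python
import numpy as np

def binarize(img, T):
    """S5-S6: pixels brighter than threshold T become black (0),
    all others become white (255)."""
    return np.where(img > T, 0, 255).astype(np.uint8)

def plate_center(vertices):
    """S8: center of a calibration plate as the mean of its four
    detected rectangle vertices (u_i, v_i)."""
    v = np.asarray(vertices, dtype=float)  # shape (4, 2)
    return v.mean(axis=0)
```

For a plate whose detected corners are (0, 0), (2, 0), (2, 2) and (0, 2), the center evaluates to (1, 1).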
Preferably, the extracting of the laser point cloud intensity map features includes clustering of high reflection intensity pixels and determination of feature center points, which is specifically as follows:
Firstly, clustering is performed. Because each reflecting plate has a certain area, the laser points striking it land at different positions scattered over a certain range; therefore, to determine the center position of a reflecting plate, the laser reflection points of the same plate must be clustered together. The clustering method is a threshold method: the distance between pixel points is calculated by
d_ji = sqrt((u_i − u_j)² + (v_i − v_j)²);
wherein d_ji is the distance between the i-th pixel point and the j-th pixel point, and {u_i, v_i}, {u_j, v_j} are their respective pixel positions;
then a threshold T′ is set, and points whose distance is smaller than the threshold are gathered into one class, so that the high-reflection laser points returned by the same reflecting plate are grouped together in the intensity map;
then, the position of the reflector center in the intensity map is calculated by the following formulas:
u′ = (u′_1 + u′_2 + … + u′_n)/n;
v′ = (v′_1 + v′_2 + … + v′_n)/n;
wherein {u′, v′} is the position of the reflector center in the intensity map, and {u′_i, v′_i}, i ∈ {1, 2 … n}, are the positions of the n pixel points in one cluster.
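The threshold clustering and center computation can be sketched as below. The patent only specifies a distance threshold, so the grouping strategy shown here (greedy single-linkage: a pixel joins a cluster if it is within the threshold of any member) is an assumption.

```python
import numpy as np

def cluster_reflector_points(pixels, T):
    """Group high-intensity pixels into reflector clusters and return
    each cluster's center (mean pixel position)."""
    clusters = []
    for p in map(np.asarray, pixels):
        home = None
        for c in clusters:
            # Join the first cluster containing a pixel closer than T.
            if any(np.hypot(*(p - q)) < T for q in c):
                home = c
                break
        if home is None:
            clusters.append([p])
        else:
            home.append(p)
    return [np.mean(c, axis=0) for c in clusters]
```

With threshold 3, the pixels (0, 0), (1, 0), (10, 10), (11, 10) form two clusters with centers (0.5, 0) and (10.5, 10).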
Since the intensity map and the panorama have the same size, and each reflecting plate is attached to the exact center of a calibration plate when the calibration room is designed, the calibration plate center and the reflector center obtained by the above calculations would coincide if the external parameters between the multi-line radar and the panoramic camera had no error. However, since the initial set of external parameters has a small error, there is a small deviation between the positions of the calibration plate center and the reflector center. Based on this analysis, each calibration plate center can be automatically matched to a reflector center using the nearest neighbour method. Specifically, the distance between any two center points is calculated by
d′_ji = sqrt((u_i − u′_j)² + (v_i − v′_j)²);
wherein d′_ji is the distance between the i-th calibration plate center and the j-th reflector center; then, for each calibration plate center, the nearest reflector center is selected as its matching point, generating the matching point set {{(u_i, v_i), (u′_i, v′_i)} | i ∈ (1, 2 … n)}, and the matching of feature points is completed automatically.
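The nearest-neighbour pairing can be sketched as follows; the function name and the pair-list output format are illustrative assumptions.

```python
import numpy as np

def match_centers(plate_centers, reflector_centers):
    """For each calibration-plate center, pick the nearest reflector
    center in the intensity map as its match."""
    pairs = []
    for p in map(np.asarray, plate_centers):
        dists = [np.linalg.norm(p - np.asarray(r)) for r in reflector_centers]
        nearest = reflector_centers[int(np.argmin(dists))]
        pairs.append((tuple(p), tuple(nearest)))
    return pairs
```

For example, plate centers (0, 0) and (10, 10) against reflector centers (9, 9) and (1, 1) pair up as (0, 0)↔(1, 1) and (10, 10)↔(9, 9).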
Preferably, the algorithm that obtains the external parameters by optimizing the constraint relations must account for both rotation and translation between the camera and the radar. The external parameters are set as rotation {roll, pitch, yaw} and translation {tx, ty, tz}; then, for each pair of matching points in the point set established by the depth map and panorama feature automatic matching algorithm, the following equation set F is established:
x′ = cp·cy·x + (sr·sp·cy − cr·sy)·y + (cr·sp·cy + sr·sy)·z + tx;
y′ = cp·sy·x + (sr·sp·sy + cr·cy)·y + (cr·sp·sy − sr·cy)·z + ty;
z′ = −sp·x + sr·cp·y + cr·cp·z + tz;
u′ = arctan(y′, x′)/180 × img_cols;
u = u′;
v = v′;
where cp = cos(pitch), sp = sin(pitch), cr = cos(roll), sr = sin(roll), cy = cos(yaw), sy = sin(yaw);
thus the constraint relation equations for one pair of matching points are established. With n pairs of matching points in total, n equation sets {F_1, F_2 … F_n} are established, and solving them yields the external parameters {roll, pitch, yaw}, {tx, ty, tz}, thereby realizing the calibration of the external parameters of the panoramic camera and the multi-line laser radar.
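The least-squares solve can be sketched as below. Assumptions to note: it uses SciPy's generic `least_squares` solver rather than whatever solver the patent envisions, the rotation matrix is the conventional ZYX (yaw-pitch-roll) form, and the pixel projection follows the conventional equirectangular mapping since the patent's printed projection formulas are partly garbled.

```python
import numpy as np
from scipy.optimize import least_squares

def solve_extrinsics(pts3d, pix, rows, cols):
    """Refine {roll, pitch, yaw, tx, ty, tz} by minimizing the pixel
    residual between projected reflector centers (pts3d, radar frame)
    and matched calibration-plate centers (pix, panorama pixels)."""
    def residuals(params):
        roll, pitch, yaw, tx, ty, tz = params
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        R = np.array([
            [cp * cy, sr * sp * cy - cr * sy, cr * sp * cy + sr * sy],
            [cp * sy, sr * sp * sy + cr * cy, cr * sp * sy - sr * cy],
            [-sp,     sr * cp,                cr * cp],
        ])
        p = pts3d @ R.T + np.array([tx, ty, tz])
        lon = np.degrees(np.arctan2(p[:, 1], p[:, 0]))
        lat = np.degrees(np.arctan2(p[:, 2], np.hypot(p[:, 0], p[:, 1])))
        u = (lon + 180.0) / 360.0 * cols
        v = (90.0 - lat) / 180.0 * rows
        return np.concatenate([u - pix[:, 0], v - pix[:, 1]])

    return least_squares(residuals, np.zeros(6)).x
```

Each match contributes two residuals (u and v), so the six extrinsic parameters need at least three non-degenerate matched pairs.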
Finally, it should be noted that the foregoing description covers only preferred embodiments of the present invention. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the described embodiments or substitute equivalents for some of their elements, and any modification, equivalent substitution, improvement or change made without departing from the spirit and principles of the present invention is intended to fall within the scope of the invention.

Claims (6)

1. A panoramic camera and multi-line laser radar external parameter calibration system is characterized in that: the method comprises a calibration room with a mark, a rotating device with an encoder, an accumulated multi-line laser radar generated laser intensity map algorithm, a depth map and panorama characteristic automatic matching algorithm and an optimization constraint relation acquisition external parameter algorithm;
two calibration plates for acquiring image feature information are posted on each of the two sides, the front and the back of the marked room, and a reflecting plate for the multi-line laser radar to acquire feature information is arranged at the exact center of each calibration plate;
the rotating device comprises a fixed base, a motor with an encoder and a placing table arranged at the top end of the fixed base, wherein the motor with the encoder is arranged in the fixed base, and the output end of the motor with the encoder extends to the top end of the fixed base and is connected with the placing table;
the intensity map generation includes the steps of:
s1, placing a panoramic camera to be calibrated and a multi-line laser radar on a placement table, then rotating the placement table through a motor with an encoder, controlling the motor with the encoder to drive the placement table to rotate 360 degrees, and recording encoder data information, multi-line laser radar data information and a panoramic image img captured by the panoramic camera at the position of the motor with the encoder;
s2, accumulating point cloud information:
converting each frame of point cloud information into the world coordinate system according to the following formulas:
X_wij = X_ij × cos(θ_i) − Y_ij × sin(θ_i);
Y_wij = X_ij × sin(θ_i) + Y_ij × cos(θ_i);
Z_wij = Z_ij;
K_wij = K_ij;
wherein {X_wij, Y_wij, Z_wij, K_wij} is the information of the radar laser point in the world coordinate system, {X_ij, Y_ij, Z_ij, K_ij} is the information of the j-th laser point in the point cloud acquired at the i-th moment, and θ_i is the rotation angle read from the encoder at the i-th moment;
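The accumulation step S2 can be sketched in a few lines (a minimal illustration assuming a counter-clockwise rotation about the vertical axis; the function name and array layout are hypothetical, not part of the patent):

```python
import numpy as np

def accumulate_frames(frames, angles_deg):
    """Rotate each lidar frame by its encoder angle and stack the result.

    frames     : list of (m, 4) arrays with columns x, y, z, intensity
    angles_deg : encoder angles theta_i, one per frame, in degrees
    """
    world = []
    for pts, theta in zip(frames, angles_deg):
        t = np.radians(theta)
        c, s = np.cos(t), np.sin(t)
        out = pts.copy()
        # planar rotation about the Z axis; z and intensity pass through
        out[:, 0] = pts[:, 0] * c - pts[:, 1] * s
        out[:, 1] = pts[:, 0] * s + pts[:, 1] * c
        world.append(out)
    return np.vstack(world)
```

For example, rotating the point (1, 0, 0) by 90° moves it to (0, 1, 0), with z and intensity unchanged.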
s3, projecting the point cloud onto the panorama expansion chart according to longitude and latitude, wherein a specific projection formula is as follows:
longitude_ij = arctan(y_ij, x_ij);
v_ij = latitude_ij / 360 × img_rows;
u_ij = longitude_ij / 180 × img_cols;
wherein {img_rows, img_cols} are the numbers of pixel rows and columns of the expanded panorama image, and {v_ij, u_ij} is the pixel position in the panorama of the j-th point at the i-th moment;
s4, finally mapping the intensity of the laser points onto the panorama according to the pixel position in the step S3;
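Steps S3–S4 can be illustrated as follows. This is a sketch under stated assumptions: the patent only spells out the longitude term, so latitude is taken here as the elevation angle arctan(z, √(x²+y²)), and the angle-to-pixel normalization uses the conventional equirectangular ranges (360° across columns, 180° across rows); function and parameter names are hypothetical:

```python
import numpy as np

def intensity_panorama(points, img_rows, img_cols):
    """Project accumulated points onto a panorama grid and paint their
    intensities (steps S3-S4).  points: (n, 4) array of x, y, z, intensity."""
    x, y, z, k = points.T
    lon = np.degrees(np.arctan2(y, x))               # longitude, -180..180
    lat = np.degrees(np.arctan2(z, np.hypot(x, y)))  # latitude, -90..90 (assumed)
    # map angles to integer pixel coordinates
    v = ((lat + 90.0) / 180.0 * (img_rows - 1)).astype(int)
    u = ((lon + 180.0) / 360.0 * (img_cols - 1)).astype(int)
    img = np.zeros((img_rows, img_cols))
    img[v, u] = k        # step S4: paint intensity at the pixel position
    return img
```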
the depth map and panorama feature automatic matching algorithm comprises panorama feature extraction, laser point cloud intensity map feature extraction and feature automatic matching.
2. The panoramic camera and multi-line lidar external parameter calibration system of claim 1, wherein: the encoder data information in step S1 is {θ_0, θ_1, θ_2, ... θ_n}, and the data information of the multi-line laser radar is {C_0, C_1, C_2, ... C_n | C_i = {P_0, P_1, ... P_m} | P_j = {x, y, z, k}};
wherein: θ_i is the rotation angle read by the encoder for each frame;
C_i is the single-frame point cloud acquired at the i-th moment;
{x, y, z, k} are the position information and intensity information of each point in the point cloud.
3. The panoramic camera and multi-line lidar external parameter calibration system of claim 1, wherein: the panorama feature extraction captures the texture information of the whole calibration room, from which the position of each calibration plate can be seen directly; each calibration plate is identified, its center is determined, and that center is taken as a feature point of the panorama. The center feature point identification flow is as follows:
s5, first binarizing the panorama: setting a threshold T and traversing every pixel of the panoramic image;
s6, when a pixel value is greater than T, setting the pixel to black, otherwise setting it to white;
s7, identifying rectangular frames: counting the vertices along the edge of each black region of the panoramic image; when the number of vertices equals 4, a rectangular frame, i.e. the position of a calibration plate, is identified;
s8, finally calculating the position of the center point of the calibration plate through the following formula:
u = (u_1 + u_2 + u_3 + u_4)/4;
v = (v_1 + v_2 + v_3 + v_4)/4;
wherein {u, v} is the position of the center point of the calibration plate, and {u_i, v_i}, i ∈ {1, 2, 3, 4}, are the pixel positions of the four vertices of the calibration plate.
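A minimal sketch of steps S5–S8 follows. The rectangle detection of step S7 (counting vertices along each region's edge) is typically done with a contour-approximation routine such as OpenCV's cv2.approxPolyDP and is omitted here; the function names are illustrative only:

```python
import numpy as np

def binarize(img, T):
    """Steps S5-S6: threshold every pixel against T (a boolean mask is
    used here in place of the claim's black/white convention)."""
    return (img > T).astype(np.uint8)

def plate_center(corners):
    """Step S8: the calibration-plate centre {u, v} is the mean of the
    four detected rectangle vertices {u_i, v_i}."""
    return np.asarray(corners, dtype=float).mean(axis=0)
```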
4. The panoramic camera and multi-line lidar external parameter calibration system of claim 1, wherein: the laser point cloud intensity map feature extraction comprises high reflection intensity pixel point clustering and feature center point determination, and the specific steps are as follows:
firstly, clustering is carried out. Clustering is needed because each reflector has a certain area, so the laser points striking the same reflector are scattered over a certain range; to determine the center position of a reflector, the laser reflection points belonging to the same reflector must therefore be clustered together. The clustering method is a threshold method: the distance between pixel points is calculated by the following formula:
d_ji = sqrt((u_i − u_j)² + (v_i − v_j)²);
wherein d_ji is the distance between the i-th pixel point and the j-th pixel point, and {u_i, v_i}, {u_j, v_j} are the positions of the i-th and j-th pixel points respectively;
then setting a threshold T' and gathering points whose distance is smaller than the threshold into one class, so that the high-reflection laser points returned by the same reflector are grouped together in the intensity map;
then, the position in the intensity diagram of the center of the reflector is calculated by the following formula:
u' = (u'_1 + u'_2 + … + u'_n)/n;
v' = (v'_1 + v'_2 + … + v'_n)/n;
wherein {u', v'} is the position of the reflector center in the intensity map, and {u'_i, v'_i}, i ∈ {1, 2, … n}, are the positions of the n pixel points in one cluster.
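The threshold clustering of claim 4 can be sketched as a greedy single-pass grouping (a simplification: each point is merged into the first cluster containing any neighbour within T'; names are hypothetical):

```python
import numpy as np

def cluster_centers(pixels, T):
    """Group high-intensity pixels whose mutual distance is below T and
    return each cluster's mean position (the reflector centres)."""
    clusters = []
    for p in (np.asarray(q, dtype=float) for q in pixels):
        for c in clusters:
            # join the first cluster with a member closer than T
            if any(np.hypot(*(p - q)) < T for q in c):
                c.append(p)
                break
        else:                      # no cluster close enough: start a new one
            clusters.append([p])
    return [np.mean(c, axis=0) for c in clusters]
```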
5. The panoramic camera and multi-line lidar external parameter calibration system of claim 1, wherein: the automatic feature matching uses the nearest-neighbor method to automatically match the calibration plate centers with the reflector centers. Specifically, the distance between any two center points is calculated as:
d'_ji = sqrt((u_i − u'_j)² + (v_i − v'_j)²);
wherein d'_ji is the distance between the center point of the i-th calibration plate and the center point of the j-th reflector; then, for each calibration plate center point, the nearest reflector center point is selected as its matching point, generating the matching point set {{(u_i, v_i), (u'_i, v'_i)} | i ∈ {1, 2, … n}}, so that the matching of feature points is completed automatically.
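The nearest-neighbour matching of claim 5 can be sketched as follows (illustrative names; for large point sets a k-d tree such as scipy.spatial.cKDTree would serve the same purpose):

```python
import numpy as np

def match_features(plate_centers, reflector_centers):
    """For each calibration-plate centre pick the nearest reflector
    centre, producing the matched point set of claim 5."""
    refs = np.asarray(reflector_centers, dtype=float)
    matches = []
    for p in np.asarray(plate_centers, dtype=float):
        j = int(np.argmin(np.linalg.norm(refs - p, axis=1)))
        matches.append((tuple(p), tuple(refs[j])))
    return matches
```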
6. The panoramic camera and multi-line lidar external parameter calibration system of claim 1, wherein: when the external parameter algorithm is established, the external parameters between the camera and the radar are expressed as a rotation {roll, pitch, yaw} and a translation {tx, ty, tz}; for each pair of matching points in the matching point set produced by the depth map and panorama feature automatic matching algorithm, the following equation F is established:
x' = cp·cy·x + (sr·sp·cy − cr·sy)·y + (cr·sp·cy + sr·sy)·z + tx;
y' = cp·sy·x + (sr·sp·sy + cr·cy)·y + (cr·sp·sy − sr·cy)·z + ty;
z' = −sp·x + sr·cp·y + cr·cp·z + tz;
u' = arctan(y', x')/180 × img_cols;
with v' obtained analogously from the latitude of the transformed point, as in step S3;
u = u';
v = v';
where cp=cos (pitch), sp=sin (pitch), cr=cos (roll), sr=sin (roll), cy=cos (yaw), sy=sin (yaw);
thus, one constraint relation equation is constructed from each pair of matching points; with n pairs of matching points in total, a system of n equations {F_1, F_2 … F_n} is established, and solving this system yields the external parameters {roll, pitch, yaw} and {tx, ty, tz}, thereby realizing the calibration of the panoramic camera and multi-line laser radar external parameters.
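The rotation of claim 6 and the per-match residual can be sketched as follows. This is a sketch, not the patented solver: in practice the six parameters would be handed to a nonlinear least-squares routine such as scipy.optimize.least_squares, and the residual here uses only the longitude pixel u', mirroring the claim's single projection line:

```python
import numpy as np

def transform(point, roll, pitch, yaw, t):
    """Apply the yaw-pitch-roll rotation and translation of claim 6
    (angles in radians, t = (tx, ty, tz))."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    R = np.array([
        [cp * cy, sr * sp * cy - cr * sy, cr * sp * cy + sr * sy],
        [cp * sy, sr * sp * sy + cr * cy, cr * sp * sy - sr * cy],
        [-sp,     sr * cp,                cr * cp],
    ])
    return R @ np.asarray(point, dtype=float) + np.asarray(t, dtype=float)

def residuals(params, matches, img_cols):
    """One residual F per match: reprojected longitude pixel u' minus
    the observed panorama pixel u.  matches: [((x, y, z), (u, v)), ...]"""
    roll, pitch, yaw, tx, ty, tz = params
    res = []
    for (x, y, z), (u_obs, _v_obs) in matches:
        xp, yp, _zp = transform((x, y, z), roll, pitch, yaw, (tx, ty, tz))
        u_proj = np.degrees(np.arctan2(yp, xp)) / 180.0 * img_cols
        res.append(u_proj - u_obs)
    return res
```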
CN202011128611.6A 2020-10-20 2020-10-20 Panoramic camera and multi-line laser radar external parameter calibration system Active CN112305557B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011128611.6A CN112305557B (en) 2020-10-20 2020-10-20 Panoramic camera and multi-line laser radar external parameter calibration system


Publications (2)

Publication Number Publication Date
CN112305557A CN112305557A (en) 2021-02-02
CN112305557B true CN112305557B (en) 2023-10-20

Family

ID=74328237


Country Status (1)

Country Link
CN (1) CN112305557B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113777593B (en) * 2021-11-11 2022-03-04 中国科学院自动化研究所 Multi-laser radar external parameter calibration method and device based on servo motor auxiliary motion
CN114137553B (en) * 2022-01-30 2022-04-12 探维科技(北京)有限公司 Radar dimming method and system based on image fusion laser
CN114488099A (en) * 2022-01-30 2022-05-13 中国第一汽车股份有限公司 Laser radar coefficient calibration method and device, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2990828A1 (en) * 2014-08-26 2016-03-02 Kabushiki Kaisha Topcon Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method, and program therefor
CN109300162A (en) * 2018-08-17 2019-02-01 浙江工业大学 A kind of multi-line laser radar and camera combined calibrating method based on fining radar scanning marginal point
CN111179358A (en) * 2019-12-30 2020-05-19 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109118542B (en) * 2017-06-22 2021-11-23 阿波罗智能技术(北京)有限公司 Calibration method, device, equipment and storage medium between laser radar and camera


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on joint calibration method of lidar and camera based on point cloud center; Kang Guohua; Zhang Qi; Zhang Han; Xu Weizheng; Zhang Wenhao; Chinese Journal of Scientific Instrument (Issue 12); full text *

Also Published As

Publication number Publication date
CN112305557A (en) 2021-02-02

Similar Documents

Publication Publication Date Title
CN112305557B (en) Panoramic camera and multi-line laser radar external parameter calibration system
CN111473739B (en) Video monitoring-based surrounding rock deformation real-time monitoring method for tunnel collapse area
WO2022142759A1 (en) Lidar and camera joint calibration method
EP0701225B1 (en) System for transcribing images on a board using a camera based board scanner
CN111369630A (en) Method for calibrating multi-line laser radar and camera
CN108389233B (en) Laser scanner and camera calibration method based on boundary constraint and mean value approximation
CN106774296A (en) A kind of disorder detection method based on laser radar and ccd video camera information fusion
CN111191625A (en) Object identification and positioning method based on laser-monocular vision fusion
WO2005024720A2 (en) Color edge based system and method for determination of 3d surface topology
Lyu et al. An interactive LiDAR to camera calibration
CN115937288A (en) Three-dimensional scene model construction method for transformer substation
CN104976968A (en) Three-dimensional geometrical measurement method and three-dimensional geometrical measurement system based on LED tag tracking
CN116704048B (en) Double-light registration method
CN113345084B (en) Three-dimensional modeling system and three-dimensional modeling method
CN114137564A (en) Automatic indoor object identification and positioning method and device
CN111380503B (en) Monocular camera ranging method adopting laser-assisted calibration
CN112182967B (en) Automatic photovoltaic module modeling method based on thermal imaging instrument
EP4071713B1 (en) Parameter calibration method and apapratus
CN114792343B (en) Calibration method of image acquisition equipment, method and device for acquiring image data
CN117392237A (en) Robust laser radar-camera self-calibration method
CN111598956A (en) Calibration method, device and system
CN110956668A (en) Focusing stack imaging system preset position calibration method based on focusing measure
CN116402713A (en) Electric three-dimensional point cloud completion method based on two-dimensional image and geometric shape
CN112288824B (en) Device and method for calibrating tele camera based on real scene
CN114782556A (en) Camera and laser radar registration method, system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230906

Address after: 518000 Room 601, building r2-b, Gaoxin industrial village, No. 020, Gaoxin South seventh Road, Gaoxin community, Yuehai street, Nanshan District, Shenzhen, Guangdong

Applicant after: Shenzhen Nuoda Communication Technology Co.,Ltd.

Address before: 518000 a501, 5th floor, Shanshui building, Nanshan cloud Valley Innovation Industrial Park, 4093 Liuxian Avenue, Taoyuan Street, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN WUJING INTELLIGENT ROBOT Co.,Ltd.

GR01 Patent grant