CN114371472A - Automatic combined calibration device and method for laser radar and camera - Google Patents

Automatic combined calibration device and method for laser radar and camera

Info

Publication number: CN114371472A (application CN202111539056.0A); granted as CN114371472B
Authority: CN (China)
Inventors: 龚方徽, 乔宝华, 程坤, 詹兴样, 王方瑞, 董帅
Assignee (original and current): CETHIK Group Ltd
Legal status: Active (granted)
Prior art keywords: calibration, pca, calibration plate, point cloud, frame data
Other languages: Chinese (zh)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135 Feature extraction based on approximation criteria, e.g. principal component analysis
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration


Abstract

The invention discloses an automatic combined calibration device and method for a laser radar and a camera. The method comprises the steps of: controlling a movable track to move so that a rotatable support is located at a preset point position, controlling the rotatable support to change its rotating posture so that a moving piece changes orientation, and acquiring image frame data and point cloud frame data at each orientation of the moving piece until a preset number of image frames and point cloud frames have been acquired; calibrating the camera internal parameters based on the acquired image frame data; calculating the pixel coordinates of the four vertexes at the outermost periphery of the calibration plate based on the image frame data; extracting point cloud calibration points to obtain the final point cloud coordinates of the four outermost vertexes of the calibration plate; and solving the transformation relation between the pixel coordinates and the final point cloud coordinates of the four outermost vertexes to obtain a rotation matrix and a translation matrix, thereby completing the joint calibration of the laser radar and the camera. The invention realizes convenient, fast and high-precision linkage calibration of the laser radar and the camera.

Description

Automatic combined calibration device and method for laser radar and camera
Technical Field
The application belongs to the technical field of equipment combined calibration, and particularly relates to an automatic combined calibration device and method for a laser radar and a camera.
Background
In fields such as mobile robotics, automatic driving, assisted driving and environment sensing, a single sensor can hardly meet the sensing requirements of a complex environment, so multi-sensor fusion algorithms have become mainstream: information collected by multiple sensors is fused so that their advantages complement one another. The premise for improving the accuracy of a multi-sensor fusion algorithm is solving the time synchronization and space synchronization between the sensors, where space synchronization is the joint calibration between the sensors. Joint calibration is divided into two parts: internal reference (intrinsic) calibration and external reference (extrinsic) calibration. Intrinsic calibration determines the mapping relation inside a sensor, while extrinsic calibration determines the coordinate transformation relation between sensors.
To calibrate different types of sensors, and in particular to jointly calibrate a camera and a laser radar, the traditional method is to manually match 3D feature points in the laser point cloud with 2D feature points in the camera image based on a calibration board (a checkerboard, an L-shaped calibration board or a three-dimensional calibration box), and then solve the extrinsic parameter matrix. During such calibration the board must be moved manually and the feature points selected manually; the operation is complex, the degree of manual participation is high, the process is time-consuming and inefficient, and the precision is difficult to guarantee. To improve flexibility, calibration methods without a calibration board are also widely used: based on observation data, the correlation of intensity or edge features between the observed point cloud and the image data is used to find the extrinsic parameters. Such methods can calibrate in a natural scene, but the scene must be selected carefully and must contain structures such as trees, telegraph poles and street lamps.
In the prior art, for example, patent document CN111127563A discloses a joint calibration method that requires no calibration board, only a target object with corner points. A target object usable for calibration is arranged in the target acquisition area, the coordinates of each corner point are calculated in both the image data and the point cloud data, and the calibration parameters are computed by coordinate matching. However, this method requires selecting a calibration scene that contains a detection object satisfying the feature conditions. As another example, patent document CN111735479B proposes a device and method that use a mechanical arm to assist calibration and thus realize intelligent calibration: a sensing fusion frame carrying a camera and a laser radar is mounted on the mechanical arm, the center position of each of four calibration plates is obtained through data processing, and these center positions are used as feature points to match and compute the extrinsic parameter matrix. In this method the mechanical arm moves the calibration equipment to pick up points; since the positions of the calibration plates do not change, the calibration range is limited, and the mechanical arm is expensive.
In summary, the traditional calibration methods in the prior art use a calibration plate with a high degree of manual participation: the plate must be moved by hand and the feature points detected manually or by an algorithm, so the calibration process is cumbersome, the workload is large, and the degree of automation is low. Using a mechanical arm to change the positions of the laser radar and camera while collecting calibration samples improves automation, but the arm's limited range of motion restricts the calibration range and its high price increases the calibration cost. Calibration methods that fuse an inertial navigation system must be carried out under motion and are easily affected by lighting and scene conditions.
Disclosure of Invention
The application aims to provide an automatic combined calibration device and method for a laser radar and a camera, so that the linkage calibration of the laser radar and the camera is convenient, fast and high in precision.
In order to achieve the purpose, the technical scheme adopted by the application is as follows:
an automatic combined calibration device for a laser radar and a camera is used for realizing the combined calibration of the laser radar and the camera, the laser radar and the camera being installed on equipment to be calibrated. The automatic combined calibration device for the laser radar and the camera comprises: a movable track, a rotatable support, a calibration plate and an industrial personal computer, wherein the calibration plate is a checkerboard calibration plate, and one of the equipment to be calibrated and the calibration plate is a moving piece while the other is a stationary piece, wherein:
the moving piece is fixed on the rotatable support, and the stationary piece is arranged opposite the moving piece and does not move relative to the ground;
the rotatable bracket is provided with a plurality of preset rotating postures, the rotatable bracket changes the rotating postures to drive the moving piece to change the orientation, and the rotatable bracket is installed on the movable track;
the movable track is provided with a plurality of preset motion paths, and the movable track moves based on different motion paths to drive the rotatable support to change the point position;
the industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support and used for issuing a calibration instruction to complete the combined calibration of the laser radar and the camera, and the industrial personal computer specifically executes the following operations:
s1, controlling the movable track to move to enable the rotatable support to be located at a preset point position, controlling the rotatable support to change the rotating posture to enable the moving piece to change the direction, acquiring image frame data and point cloud frame data through a laser radar and a camera in each direction of the moving piece, and repeatedly executing the step S1 until a preset number of image frame data and point cloud frame data are acquired;
s2, calibrating camera internal parameters based on the collected image frame data;
s3, image calibration point extraction: calculating the pixel coordinates of the four vertexes at the outermost periphery of the calibration plate based on the image frame data;
s4, point cloud calibration point extraction:
s4.1, point cloud frame data preprocessing;
s4.2, fitting the maximum plane in the point cloud frame data by using a RANSAC algorithm to serve as a calibration plate plane, and extracting the point cloud frame data in the calibration plate plane;
s4.3, calculating the minimum circumscribed frame of the point cloud frame data in the plane of the calibration plate, and taking the coordinates of the four vertexes of the minimum circumscribed frame as the point cloud estimated coordinates of the four vertexes at the outermost periphery of the calibration plate;
s4.4, correcting the point cloud estimated coordinates to obtain final point cloud coordinates;
and S5, solving the transformation relation between the pixel coordinates of the four vertexes at the outermost periphery of the calibration plate and the point cloud final coordinates to obtain a rotation matrix and a translation matrix, and completing the joint calibration of the laser radar and the camera.
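Step S4.2 above relies on RANSAC plane fitting to isolate the calibration plate from the rest of the point cloud frame. As an illustration only (not the patented implementation), the dominant-plane extraction can be sketched with numpy; the distance threshold and iteration count below are assumptions:

```python
import numpy as np

def ransac_plane(points, n_iters=200, dist_thresh=0.02, rng=None):
    """Fit the dominant plane in an (N, 3) point cloud with RANSAC (S4.2 sketch).

    Returns the unit normal n and offset d of the plane n.x + d = 0,
    plus the boolean inlier mask.
    """
    rng = np.random.default_rng(rng)
    best_mask, best_model = None, None
    for _ in range(n_iters):
        # sample 3 distinct points and form a candidate plane
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (near-collinear) sample
            continue
        n = n / norm
        d = -n.dot(p0)
        mask = np.abs(points @ n + d) < dist_thresh
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_model = mask, (n, d)
    return best_model[0], best_model[1], best_mask
```

The points flagged by the inlier mask would then be passed on as the "point cloud frame data in the calibration plate plane" used by steps S4.3 and S4.4.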
Several preferred options are provided below. They are not additional limitations of the above general solution but further additions or preferences; provided there is no technical or logical contradiction, each option may be combined with the general solution individually or together with other options.
Preferably, the movable track has two preset motion paths, and the two motion paths form a cross shape; the rotatable support is provided with three preset rotating postures which are forward rotation, left rotation and right rotation, wherein the rotating angle of the left rotation and the right rotation is smaller than 30 degrees.
Preferably, the calculating the coordinates of the pixels of the four vertices at the outermost periphery of the calibration board based on the image frame data includes:
and extracting angular point position information based on the image frame data by using a growth algorithm, calculating pixel coordinates of four vertexes at the outermost periphery of the calibration plate according to the calibration plate customization parameters and the angular point position information, and storing the pixel coordinates of the four vertexes from the upper left corner of the calibration plate in a clockwise sequence.
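As a hedged illustration of how the outermost four vertexes can be computed from the detected inner-corner positions and the board customization parameters, the sketch below extrapolates the inner-corner grid outward in pixel space; the `border` parameter (outer margin of the board measured in chessboard squares) is an assumption, and the growth-algorithm corner detector itself is not reproduced here:

```python
import numpy as np

def outer_vertices(inner, border=1.0):
    """Estimate the four outermost board vertices from the inner-corner grid.

    inner  : (rows, cols, 2) array of chessboard inner-corner pixel coords
    border : outer margin of the board, in units of one chessboard square
             (a board-customization parameter; 1.0 is an assumed default)
    Returns the four vertices starting from the top-left, in clockwise order.
    """
    def extrap(corner, along, down):
        # step one border width outward along both local grid directions
        return corner + border * along + border * down

    tl = extrap(inner[0, 0],   inner[0, 0] - inner[0, 1],     inner[0, 0] - inner[1, 0])
    tr = extrap(inner[0, -1],  inner[0, -1] - inner[0, -2],   inner[0, -1] - inner[1, -1])
    br = extrap(inner[-1, -1], inner[-1, -1] - inner[-1, -2], inner[-1, -1] - inner[-2, -1])
    bl = extrap(inner[-1, 0],  inner[-1, 0] - inner[-1, 1],   inner[-1, 0] - inner[-2, 0])
    return np.array([tl, tr, br, bl])   # clockwise from the upper-left corner
```

Using the local corner spacing as the step vector keeps the extrapolation approximately correct under perspective, since neighbouring squares project to nearly equal pixel offsets.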
Preferably, the correcting the point cloud estimated coordinates to obtain point cloud final coordinates includes:
s4.4.1, taking the point cloud frame data in the plane of the calibration plate as input, and obtaining the three principal component directions of the point cloud by the PCA method, denoted y_pca, z_pca and x_pca;
s4.4.2, finely adjusting the obtained principal component directions y_pca and z_pca so that they approach the real length direction and the real width direction of the calibration plate;
s4.4.3, taking the adjusted principal component direction y_pca as the length direction of the calibration plate, the adjusted principal component direction z_pca as the width direction of the calibration plate, and the principal component direction x_pca as the normal direction of the calibration plate, calculating the coordinates of the four vertexes of the adjusted minimum circumscribed frame as the final point cloud coordinates of the four vertexes at the outermost periphery of the calibration plate, and storing the final point cloud coordinates of the four vertexes starting from the upper left corner of the calibration plate in clockwise order.
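The PCA of step S4.4.1 can be sketched as an eigen-decomposition of the 3x3 covariance matrix of the plane points; this is a generic illustration of principal component analysis, not the patent's exact implementation:

```python
import numpy as np

def pca_directions(points):
    """Three principal component directions of an (N, 3) cloud (S4.4.1 sketch).

    Returns (y_pca, z_pca, x_pca): the directions of largest, second-largest
    and smallest variance. For a planar calibration board these approximate
    the board's length, width and normal directions respectively.
    """
    centered = points - points.mean(axis=0)
    # eigen-decomposition of the covariance; eigh returns ascending eigenvalues
    vals, vecs = np.linalg.eigh(np.cov(centered.T))
    x_pca = vecs[:, 0]   # smallest variance  -> board normal
    z_pca = vecs[:, 1]   # middle variance    -> width direction
    y_pca = vecs[:, 2]   # largest variance   -> length direction
    return y_pca, z_pca, x_pca
```

Because PCA only recovers each axis up to sign and up to an in-plane rotation error when the cloud is noisy or clipped, the fine adjustment of step S4.4.2 is still needed afterwards.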
Preferably, the finely adjusting the obtained principal component directions y_pca and z_pca so that they approach the real length direction and the real width direction of the calibration plate comprises:
s4.4.2.1, taking the point cloud estimated coordinates of the four vertexes at the outermost periphery of the calibration plate, denoted V1, V2, V3 and V4;
s4.4.2.2, sequentially moving the origin of the coordinate axes formed by the three principal component directions y_pca, z_pca and x_pca to the point cloud estimated coordinates of the four vertexes, obtaining coordinate systems C1, C2, C3 and C4;
s4.4.2.3, performing the following operations in each of the four coordinate systems C1, C2, C3 and C4:
(1) rotating the principal component directions y_pca and z_pca about the principal component direction x_pca, based on the origin of the coordinate system, over the span [-30°, 30°] with a step length of 2°;
(2) acquiring the new y_pca and z_pca after each rotation, denoted y_pca^i and z_pca^i, calculating the plane S^i formed by y_pca^i and z_pca^i, and recording the rotation angle θ_i, where i is the serial number of the rotation, i = 1, 2, 3, …, 30;
(3) projecting the point cloud frame data in the plane of the calibration plate onto the plane S^i, then calculating the extents of the projection in the y_pca^i and z_pca^i directions as the length L^i and the width W^i of the calibration plate, and at the same time counting the number Num_i of point cloud points inside the rectangular area enclosed by the length L^i and the width W^i;
(4) sorting the values Num_i, and taking the y_pca^i and z_pca^i corresponding to the i at which Num_i reaches its maximum;
s4.4.2.4, taking the four sets of data (y_pca^i, z_pca^i) obtained by performing the above operations in the four coordinate systems C1, C2, C3 and C4 respectively, and, among the four sets of data, taking the y_pca^i and z_pca^i whose corresponding length and width are closest to the true length and width of the calibration plate as the adjusted principal component directions y_pca and z_pca.
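The fine-adjustment sweep of steps (1)-(4) can be sketched as follows. For brevity this illustration performs the sweep once about the cloud centroid rather than in the four vertex-anchored coordinate systems C1..C4, and the true board dimensions are passed in as known parameters; both simplifications are assumptions:

```python
import numpy as np

def rotate_about(v, axis, theta):
    """Rodrigues rotation of vector v about a unit axis by angle theta (rad)."""
    return (v * np.cos(theta) + np.cross(axis, v) * np.sin(theta)
            + axis * axis.dot(v) * (1 - np.cos(theta)))

def fine_tune(points, y_pca, z_pca, x_pca, board_len, board_wid):
    """Sweep [-30°, 30°] in 2° steps about the board normal (S4.4.2 sketch).

    For each candidate rotation, project the plane points onto the rotated
    (y, z) axes and count how many fall inside the board_len x board_wid
    rectangle; keep the axis pair with the highest count.
    """
    centered = points - points.mean(axis=0)
    best = (-1, y_pca, z_pca)
    for theta in np.deg2rad(np.arange(-30, 31, 2)):
        y_i = rotate_about(y_pca, x_pca, theta)
        z_i = rotate_about(z_pca, x_pca, theta)
        u, v = centered @ y_i, centered @ z_i
        num = np.sum((np.abs(u) <= board_len / 2) & (np.abs(v) <= board_wid / 2))
        if num > best[0]:
            best = (num, y_i, z_i)
    return best[1], best[2]
```

The count Num_i peaks when the candidate axes align with the true length and width directions, because any residual in-plane rotation pushes the board's corner points outside the length-by-width rectangle.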
The application also provides an automatic combined calibration method for a laser radar and a camera, used for realizing the combined calibration of the laser radar and the camera, the laser radar and the camera being installed on equipment to be calibrated. The automatic combined calibration method is realized based on a movable track, a rotatable support, a calibration plate and an industrial personal computer, wherein the calibration plate is a checkerboard calibration plate, and one of the equipment to be calibrated and the calibration plate is a moving piece while the other is a stationary piece, wherein:
the moving piece is fixed on the rotatable support, and the stationary piece is arranged opposite the moving piece and does not move relative to the ground;
the rotatable bracket is provided with a plurality of preset rotating postures, the rotatable bracket changes the rotating postures to drive the moving piece to change the orientation, and the rotatable bracket is installed on the movable track;
the movable track is provided with a plurality of preset motion paths, and the movable track moves based on different motion paths to drive the rotatable support to change the point position;
the industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support and is used for issuing a calibration instruction to complete the combined calibration of the laser radar and the camera, and the automatic combined calibration method of the laser radar and the camera comprises the following steps:
s1, controlling the movable track to move to enable the rotatable support to be located at a preset point position, controlling the rotatable support to change the rotating posture to enable the moving piece to change the direction, acquiring image frame data and point cloud frame data through a laser radar and a camera in each direction of the moving piece, and repeatedly executing the step S1 until a preset number of image frame data and point cloud frame data are acquired;
s2, calibrating camera internal parameters based on the collected image frame data;
s3, image calibration point extraction: calculating the pixel coordinates of the four vertexes at the outermost periphery of the calibration plate based on the image frame data;
s4, point cloud calibration point extraction:
s4.1, point cloud frame data preprocessing;
s4.2, fitting the maximum plane in the point cloud frame data by using a RANSAC algorithm to serve as a calibration plate plane, and extracting the point cloud frame data in the calibration plate plane;
s4.3, calculating the minimum circumscribed frame of the point cloud frame data in the plane of the calibration plate, and taking the coordinates of the four vertexes of the minimum circumscribed frame as the point cloud estimated coordinates of the four vertexes at the outermost periphery of the calibration plate;
s4.4, correcting the point cloud estimated coordinates to obtain final point cloud coordinates;
and S5, solving the transformation relation between the pixel coordinates of the four vertexes at the outermost periphery of the calibration plate and the point cloud final coordinates to obtain a rotation matrix and a translation matrix, and completing the joint calibration of the laser radar and the camera.
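As an illustration of the minimum circumscribed frame used in steps S4.3 and S4.4.3, the sketch below projects the plane points onto given length and width directions, takes the min/max extents, and maps the four rectangle corners back to 3D. It is a simplified stand-in for the patent's procedure, with the axis directions supplied externally (e.g. from PCA):

```python
import numpy as np

def circumscribed_vertices(points, y_dir, z_dir):
    """Four vertices of the minimum circumscribed frame of a planar cloud,
    given the board's length direction y_dir and width direction z_dir
    (both unit vectors). Vertices are returned from the top-left, clockwise.
    """
    center = points.mean(axis=0)
    u = (points - center) @ y_dir          # coordinates along the length axis
    v = (points - center) @ z_dir          # coordinates along the width axis
    u0, u1, v0, v1 = u.min(), u.max(), v.min(), v.max()
    corners = [(u0, v1), (u1, v1), (u1, v0), (u0, v0)]   # TL, TR, BR, BL
    return np.array([center + a * y_dir + b * z_dir for a, b in corners])
```

Because the extents are taken along the supplied axes, the frame is only truly minimal once the axes have been fine-adjusted to the board's real length and width directions in step S4.4.2.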
Preferably, the movable track has two preset motion paths, and the two motion paths form a cross shape; the rotatable support is provided with three preset rotating postures which are forward rotation, left rotation and right rotation, wherein the rotating angle of the left rotation and the right rotation is smaller than 30 degrees.
Preferably, the calculating the coordinates of the pixels of the four vertices at the outermost periphery of the calibration board based on the image frame data includes:
and extracting angular point position information based on the image frame data by using a growth algorithm, calculating pixel coordinates of four vertexes at the outermost periphery of the calibration plate according to the calibration plate customization parameters and the angular point position information, and storing the pixel coordinates of the four vertexes from the upper left corner of the calibration plate in a clockwise sequence.
Preferably, the correcting the point cloud estimated coordinates to obtain point cloud final coordinates includes:
s4.4.1, taking the point cloud frame data in the plane of the calibration plate as input, and obtaining the three principal component directions of the point cloud by the PCA method, denoted y_pca, z_pca and x_pca;
s4.4.2, finely adjusting the obtained principal component directions y_pca and z_pca so that they approach the real length direction and the real width direction of the calibration plate;
s4.4.3, taking the adjusted principal component direction y_pca as the length direction of the calibration plate, the adjusted principal component direction z_pca as the width direction of the calibration plate, and the principal component direction x_pca as the normal direction of the calibration plate, calculating the coordinates of the four vertexes of the adjusted minimum circumscribed frame as the final point cloud coordinates of the four vertexes at the outermost periphery of the calibration plate, and storing the final point cloud coordinates of the four vertexes starting from the upper left corner of the calibration plate in clockwise order.
Preferably, the finely adjusting the obtained principal component directions y_pca and z_pca so that they approach the real length direction and the real width direction of the calibration plate comprises:
s4.4.2.1, taking the point cloud estimated coordinates of the four vertexes at the outermost periphery of the calibration plate, denoted V1, V2, V3 and V4;
s4.4.2.2, sequentially moving the origin of the coordinate axes formed by the three principal component directions y_pca, z_pca and x_pca to the point cloud estimated coordinates of the four vertexes, obtaining coordinate systems C1, C2, C3 and C4;
s4.4.2.3, performing the following operations in each of the four coordinate systems C1, C2, C3 and C4:
(1) rotating the principal component directions y_pca and z_pca about the principal component direction x_pca, based on the origin of the coordinate system, over the span [-30°, 30°] with a step length of 2°;
(2) acquiring the new y_pca and z_pca after each rotation, denoted y_pca^i and z_pca^i, calculating the plane S^i formed by y_pca^i and z_pca^i, and recording the rotation angle θ_i, where i is the serial number of the rotation, i = 1, 2, 3, …, 30;
(3) projecting the point cloud frame data in the plane of the calibration plate onto the plane S^i, then calculating the extents of the projection in the y_pca^i and z_pca^i directions as the length L^i and the width W^i of the calibration plate, and at the same time counting the number Num_i of point cloud points inside the rectangular area enclosed by the length L^i and the width W^i;
(4) sorting the values Num_i, and taking the y_pca^i and z_pca^i corresponding to the i at which Num_i reaches its maximum;
s4.4.2.4, taking the four sets of data (y_pca^i, z_pca^i) obtained by performing the above operations in the four coordinate systems C1, C2, C3 and C4 respectively, and, among the four sets of data, taking the y_pca^i and z_pca^i whose corresponding length and width are closest to the true length and width of the calibration plate as the adjusted principal component directions y_pca and z_pca.
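Step S5 solves a rotation matrix and translation vector from the four coplanar 2D-3D vertex correspondences. One standard way to do this (an illustrative choice, not necessarily the patent's solver) is to estimate the board-to-image homography by direct linear transform and decompose it with the camera intrinsic matrix K obtained in step S2; the board vertexes are expressed as (X, Y) in the board plane:

```python
import numpy as np

def homography_dlt(board_pts, pix_pts):
    """Homography mapping board-plane points (X, Y) to pixels (u, v) via DLT."""
    rows = []
    for (X, Y), (u, v) in zip(board_pts, pix_pts):
        rows.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        rows.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    return vt[-1].reshape(3, 3)          # null-space vector of the 8x9 system

def extrinsics_from_plane(board_pts, pix_pts, K):
    """Recover (R, t) from four coplanar correspondences (step S5 sketch).

    Since the vertexes are coplanar, K^-1 H is proportional to [r1 r2 t];
    r3 = r1 x r2 completes the rotation matrix.
    """
    H = homography_dlt(board_pts, pix_pts)
    B = np.linalg.inv(K) @ H
    if B[2, 2] < 0:                      # fix overall sign: board is in front
        B = -B
    lam = 1.0 / np.linalg.norm(B[:, 0])
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    # re-orthonormalise: with noisy points R is only approximately a rotation
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t
```

With only four points the solution is exact but sensitive to noise; in practice the correspondences gathered over all poses would be used together and refined by minimising reprojection error.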
Compared with the prior art, the automatic combined calibration device and method for the laser radar and the camera have the following beneficial effects:
1. the operation process is full-automatic, convenient and fast, the calibration process is not influenced by environment and human operation factors, and intelligent, automatic and batch calibration can be realized.
2. Compared with a mechanical arm, the track-based calibration device is inexpensive.
3. Compared with moving the equipment to be calibrated, automatically moving the calibration plate gives a larger calibration range and allows information at relatively distant positions to be acquired, which improves the calibration precision.
4. The vertex extraction of the laser point cloud calibration plate is finely adjusted, so that the vertex coordinates are more accurate and the calibration precision is improved.
Drawings
FIG. 1 is a schematic structural diagram of an automated combined calibration apparatus for a laser radar and a camera according to the present application;
FIG. 2 is a schematic view of the rotation of the rotatable mount of the present application;
FIG. 3 is a schematic view of a calibration plate of the present application;
fig. 4 is a flowchart of an automated joint calibration method for a laser radar and a camera according to the present application.
In the drawings: 1. an industrial personal computer; 2. equipment to be calibrated; 3. a laser radar; 4. a camera; 5. a movable rail; 6. a rotatable support; 7. and (5) calibrating the board.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It is noted that when an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present; when an element is referred to as being "secured" to another element, it can be directly secured to the other element or intervening elements may also be present.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
In one embodiment, an automatic combined calibration device for a laser radar and a camera is provided for realizing the combined calibration of the laser radar and the camera, relating to multi-sensor fusion calibration technology. It solves the problems that the existing multi-sensor joint calibration process is cumbersome, the workload is large and the degree of automation is low, and conveniently and quickly realizes fully automatic joint calibration of the laser radar and the camera; the device is low in cost and suitable for mass-production use.
In this embodiment the laser radar 3 and the camera 4 are installed on the equipment to be calibrated 2, ensuring that the fields of view of the camera 4 and the laser radar 3 overlap. The automatic combined calibration device for the laser radar 3 and the camera 4 comprises: a movable track 5, a rotatable support 6, a calibration plate 7 and an industrial personal computer 1, wherein the calibration plate 7 is a checkerboard calibration plate, and one of the equipment to be calibrated 2 and the calibration plate 7 is a moving piece while the other is a stationary piece.
In this embodiment the moving piece is fixed on the rotatable support, and the stationary piece is arranged opposite the moving piece and does not move relative to the ground. It is easy to understand that the opposed arrangement of the stationary piece and the moving piece is not limited to a face-to-face arrangement; provided that data can be collected effectively, there may be a certain angle between the stationary piece and the moving piece.
As shown in fig. 1, the purpose of providing a moving piece and a stationary piece in this embodiment is to change the relative attitude of the equipment to be calibrated and the calibration plate. On this basis, changing the attitude of the calibration plate with the movable track is one embodiment, and changing the attitude of the equipment to be calibrated with the movable track (without changing the rigid-body relation between the camera and the laser radar in the equipment to be calibrated) is another embodiment of the present application. For convenience of describing the technical solution of the present application, this embodiment takes the calibration plate as the moving piece and the equipment to be calibrated as the stationary piece by way of example.
In this embodiment, the rotatable bracket 6 has a plurality of preset rotation postures, the rotatable bracket 6 changes the rotation postures to drive the moving member to change the orientation, and the rotatable bracket 6 is installed on the movable rail 5.
In this embodiment, the movable track 5 has a plurality of preset movement paths, and the movable track moves based on different movement paths to drive the rotatable support to change the point location.
In this embodiment, a plurality of point locations are provided by setting a plurality of movement paths, and a plurality of shooting orientations are provided by setting a plurality of rotation postures; combining the two yields many different shooting poses, so that a sufficient amount of data can be collected.
In this embodiment, the industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support, and is used for issuing calibration instructions to complete the combined calibration of the laser radar and the camera.
The industrial personal computer specifically executes the following operations:
S1, controlling the movable track to move so that the rotatable support is located at a preset point location; controlling the rotatable support to change its rotation posture so that the moving part changes orientation; at each orientation of the moving part, acquiring image frame data and point cloud frame data through the laser radar and the camera; and repeating step S1 until a preset number of image frames and point cloud frames have been acquired.
The movable track of this embodiment has two preset motion paths that together form a cross, so the track supports movement in the forward, backward, left and right directions. As shown in fig. 2, the rotatable support has three preset rotation postures: facing forward, rotated left and rotated right. Since too large a rotation angle may take the calibration plate out of the field of view (no complete calibration plate in the image, or no calibration plate data in the point cloud frame), the left and right rotation angles are kept below 30°.
One way to acquire data in this embodiment is as follows: the track drives the support, and thus the calibration plate, through five point locations (the track origin and one point forward, backward, left and right of it), with a spacing of 0.5 to 1.5 m between points; at each point location the support stops at the three postures of facing forward, rotated left and rotated right, where the rotation angle may be fixed, for example at 20°. In the initial condition, the calibration plate 7 faces forward at the origin, and the camera 4 and the laser radar 3 look directly at the calibration plate 7. Each time the calibration plate 7 changes posture, the laser radar 3 and the camera 4 acquire a point cloud and an image; the whole acquisition process in this example therefore covers 15 postures.
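The 5 × 3 pose schedule above can be enumerated with a short sketch (the names and the fixed 20° rotation are taken from this embodiment's example; the code itself is only illustrative):

```python
from itertools import product

# Five track point locations and three support postures from the example;
# 0 / -20 / +20 degrees stand for facing forward / rotated left / rotated right.
point_locations = ["origin", "forward", "backward", "left", "right"]
rotations_deg = [0, -20, 20]

poses = list(product(point_locations, rotations_deg))
print(len(poses))  # 15 calibration plate postures in total
```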
The subsequent calibration computation runs automatically once the data has been collected. This automatic combined calibration device for the laser radar and the camera thus achieves a fully automated calibration without manual intervention. The shape of the track, the positions, order and number of the movable point locations, and the rotation angle of the support are all flexible; the calibration process is not affected by the environment or by human operation, and intelligent, automated, batch calibration can be achieved.
And S2, calibrating camera internal parameters based on the acquired image frame data.
The camera internal parameters are calibrated using the image frame data collected at each posture. Each image frame contains the planar calibration plate; as shown in fig. 3, the calibration plate consists of alternating black and white checkerboard squares. The corner points at the black-white intersections are extracted, and the camera internal parameters are calibrated with Zhang's calibration method.
S3, image calibration point extraction: calculating the pixel coordinates of the four outermost vertexes of the calibration plate based on the image frame data, comprising:
Corner position information is extracted from the image frame data using a growth algorithm; the pixel coordinates of the four outermost vertexes of the calibration plate are then calculated from the calibration plate customization parameters and the corner position information, and the four vertex pixel coordinates are stored clockwise starting from the upper-left corner of the calibration plate.
It is easy to understand that, to improve the accuracy of the pixel coordinate calculation, the image frame data may be undistorted before processing: the image frames of each posture are undistorted using the camera internal parameters, and the four vertex coordinates of the calibration plate are then calculated from the undistorted image frames.
In this embodiment, the calibration plate customization parameters used when calculating the vertex pixel coordinates include: the number of cells along the length and width of the plate (e.g., 7 × 5), the size of each cell (e.g., square cells with a 10 cm side length), and the distance from the edge cells to the edge of the plate. On this basis, the pixel coordinates of the four outermost vertexes of the calibration plate can be calculated from the extracted corner positions of the black-white intersections.
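As an illustrative sketch of this vertex computation (the function name and the assumption that each extreme inner corner sits exactly one cell plus the margin away from the physical board edge are mine, not from the patent), the outer vertices can be obtained by linearly extrapolating the detected inner-corner grid by the ratio (cell + margin) / cell:

```python
import numpy as np

def board_vertices(corners, cell_size, margin):
    """Estimate the four outermost board vertices from the inner-corner grid
    `corners` of shape (rows, cols, 2) in pixel coordinates. Each extreme
    inner corner is assumed to lie one cell plus `margin` from the board edge,
    so we step outward by k = (cell + margin) / cell grid steps."""
    k = (cell_size + margin) / cell_size

    def extrapolate(c, row_nb, col_nb):
        # Continue the local grid step away from the neighbouring corners.
        return c + k * (c - row_nb) + k * (c - col_nb)

    tl = extrapolate(corners[0, 0], corners[1, 0], corners[0, 1])
    tr = extrapolate(corners[0, -1], corners[1, -1], corners[0, -2])
    br = extrapolate(corners[-1, -1], corners[-2, -1], corners[-1, -2])
    bl = extrapolate(corners[-1, 0], corners[-2, 0], corners[-1, 1])
    return np.array([tl, tr, br, bl])  # clockwise from the top-left vertex
```

With a fronto-parallel view this linear extrapolation is exact; under strong perspective, extrapolating through a fitted homography would be more accurate.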
S4, point cloud calibration point extraction:
s4.1, point cloud frame data preprocessing; in the embodiment, algorithms such as point cloud filtering are adopted during preprocessing, and outliers in the point cloud data are removed.
S4.2, the preprocessed point cloud frame data of each posture is analyzed to obtain the vertex positions of the calibration plate as calibration feature points. In this embodiment, the RANSAC algorithm is used to fit the largest plane in the point cloud frame data as the calibration plate plane; the point cloud frame data lying in this plane is extracted, and points far from the plane are deleted using a point-to-plane distance threshold (for example, 2 cm).
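Step S4.2 can be sketched in plain NumPy (the patent presumably uses a library implementation; this minimal RANSAC with the 2 cm threshold from the embodiment is only illustrative):

```python
import numpy as np

def ransac_plane(points, dist_thresh=0.02, iters=200, rng=None):
    """Fit the dominant plane with RANSAC. Returns (n, d, inlier_mask) such
    that n . p + d ~ 0 for inlier points p, with |n| = 1."""
    rng = np.random.default_rng(0) if rng is None else rng
    best_mask = np.zeros(len(points), dtype=bool)
    best_model = (np.array([0.0, 0.0, 1.0]), 0.0)
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (nearly collinear) sample
            continue
        n = n / norm
        d = -n @ sample[0]
        mask = np.abs(points @ n + d) < dist_thresh
        if mask.sum() > best_mask.sum():
            best_mask, best_model = mask, (n, d)
    return best_model[0], best_model[1], best_mask
```

The inlier mask corresponds to the extracted calibration plate plane points; everything outside the threshold is the "far from the plane" data that the embodiment deletes.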
S4.3, rough calculation of the calibration plate vertexes. The extracted calibration plate plane point cloud is processed to obtain the vertex positions of the calibration plate as calibration feature points. Specifically, the minimum bounding box of the point cloud frame data in the calibration plate plane is calculated (that is, the coordinates of its four vertexes are obtained), and these four vertex coordinates are taken as the estimated point cloud coordinates of the four outermost vertexes of the calibration plate.
S4.4, correcting the estimated point cloud coordinates to obtain the final point cloud coordinates, comprising:
S4.4.1, taking the point cloud frame data in the calibration plate plane as input, the three principal component directions of the point cloud are obtained by the PCA method and recorded as y_pca, z_pca and x_pca; they represent the principal axes of the point cloud, which are relatively close to the length, width and normal directions of the calibration plate, respectively.
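A minimal sketch of this PCA step (the function name is mine; the convention that the largest-variance axis is the board length follows the description above):

```python
import numpy as np

def pca_axes(points):
    """Principal directions of a point cloud, sorted by decreasing variance:
    y_pca ~ board length, z_pca ~ board width, x_pca ~ board normal."""
    centered = points - points.mean(axis=0)
    # Eigenvectors of the 3x3 covariance matrix are the principal directions.
    _, vecs = np.linalg.eigh(np.cov(centered.T))
    vecs = vecs[:, ::-1]  # eigh sorts eigenvalues ascending; reverse to descending
    return vecs[:, 0], vecs[:, 1], vecs[:, 2]  # y_pca, z_pca, x_pca
```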
S4.4.2, due to the non-uniformity of the point cloud, the y_pca and z_pca obtained in the previous step do not exactly match the length and width directions of the calibration plate and need fine adjustment. The adjustment strategy is as follows: the obtained principal directions y_pca and z_pca are fine-tuned so that they approach the true length and width directions of the calibration plate, comprising the following steps:
S4.4.2.1, calculating the point cloud coordinates of the four outermost vertexes of the calibration plate, recorded as V1, V2, V3 and V4;
S4.4.2.2, moving the origin of the coordinate axes formed by the three principal directions y_pca, z_pca and x_pca to the estimated point cloud coordinates of the four vertexes in turn, obtaining coordinate systems C1, C2, C3 and C4;
S4.4.2.3, performing the following operations in each of the four coordinate systems C1, C2, C3 and C4:
(1) rotating the principal directions y_pca and z_pca about the principal direction x_pca, based on the origin of the coordinate system, over a span of [-30°, 30°] with a step of 2°; the span and step of this embodiment are preferred values and should not be taken as limiting the scope of the invention; the step may, for example, also be 3° or 4°;
(2) recording the new y_pca and z_pca obtained after each rotation as y_pca^i and z_pca^i, calculating the plane P_i formed by y_pca^i and z_pca^i, and recording the rotation angle θ_i, where i is the rotation index and i = 1, 2, 3, ..., 30;
(3) projecting the point cloud frame data in the calibration plate plane onto the plane P_i, then calculating the extent of the projection along y_pca^i as the candidate calibration plate length L_i and along z_pca^i as the candidate width W_i, and counting the number Num_i of points inside the rectangular area enclosed by L_i and W_i;
(4) sorting the Num_i values and taking the y_pca^i and z_pca^i corresponding to the i with the largest Num_i.
S4.4.2.4, from the four sets of data (y_pca^i, z_pca^i) obtained by performing the above operations in the four coordinate systems C1, C2, C3 and C4, taking the y_pca^i and z_pca^i whose corresponding length L_i and width W_i are closest to the true length and width of the calibration plate as the adjusted principal directions y_pca and z_pca.
S4.4.3, taking the adjusted y_pca as the length direction of the calibration plate, the adjusted z_pca as the width direction, and x_pca as the normal direction, the coordinates of the four vertexes of the adjusted minimum bounding box are calculated as the final point cloud coordinates of the four outermost vertexes of the calibration plate, and are stored clockwise starting from the upper-left corner of the calibration plate.
In summary, this embodiment fits the calibration plate plane with the RANSAC algorithm, computes the minimum bounding box of the point cloud, and uses its four vertex coordinates as initial estimates of the four calibration plate vertexes. The three principal directions of the calibration plate point cloud are then computed by PCA, and the vertex coordinates are corrected to obtain accurate values.
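One reading of the S4.4.2 rotation search is sketched below. The use of the known board dimensions as the counting rectangle is my interpretation of selecting the candidate "closest to the true length and width"; the function name and parameters are illustrative, not from the patent:

```python
import numpy as np

def refine_axes(points, y_pca, z_pca, x_pca, board_l, board_w,
                span_deg=30, step_deg=2):
    """Sweep a rotation of (y_pca, z_pca) about the board normal x_pca over
    [-span, span] degrees and keep the angle whose board-sized rectangle,
    centred on the cloud, contains the most projected points."""
    centered = points - points.mean(axis=0)
    best_num, best_y, best_z = -1, y_pca, z_pca
    for theta in np.deg2rad(np.arange(-span_deg, span_deg + 1e-9, step_deg)):
        c, s = np.cos(theta), np.sin(theta)
        # Rodrigues rotation about the unit normal x_pca (y_pca, z_pca are
        # perpendicular to x_pca, so the formula reduces to two terms).
        y_i = c * y_pca + s * np.cross(x_pca, y_pca)
        z_i = c * z_pca + s * np.cross(x_pca, z_pca)
        u, v = centered @ y_i, centered @ z_i  # in-plane projections
        num = np.sum((np.abs(u) <= board_l / 2) & (np.abs(v) <= board_w / 2))
        if num > best_num:
            best_num, best_y, best_z = num, y_i, z_i
    return best_y, best_z
```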
And S5, solving the transformation relation between the pixel coordinates of the four vertexes at the outermost periphery of the calibration plate and the point cloud final coordinates to obtain a rotation matrix and a translation matrix, and completing the joint calibration of the laser radar and the camera.
The calibration external parameters of this embodiment comprise a rotation matrix R and a translation matrix T. The transformation between the laser radar coordinate system and the camera coordinate system, i.e., the rotation and translation matrices, is solved with a point-pair-based spatial matching method, for example the solvePnP algorithm provided in OpenCV. The calibration parameters are then stored.
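The patent points to OpenCV's solvePnP for this 2D-3D point-pair solve; as a self-contained illustration of what it computes, here is a hedged NumPy Direct Linear Transform sketch (the function name is mine; it needs at least 6 non-coplanar correspondences, which the board vertices accumulated over the many acquisition poses provide):

```python
import numpy as np

def solve_pnp_dlt(obj_pts, img_pts, K):
    """Estimate R, t with cam = R @ lidar + t from 2D-3D point pairs via the
    Direct Linear Transform (an illustrative stand-in for cv2.solvePnP)."""
    # Normalise pixel coordinates with the intrinsics calibrated in step S2.
    uv1 = np.column_stack([img_pts, np.ones(len(img_pts))])
    xn = (np.linalg.inv(K) @ uv1.T).T
    rows = []
    for (X, Y, Z), (x, y, _) in zip(obj_pts, xn):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z, -x])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z, -y])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    P = Vt[-1].reshape(3, 4)             # [R|t] up to scale and sign
    if np.linalg.det(P[:, :3]) < 0:      # fix the sign so det(R) > 0
        P = -P
    P /= np.cbrt(np.linalg.det(P[:, :3]))  # remove the remaining scale
    U, _, Vt2 = np.linalg.svd(P[:, :3])    # project onto the rotation group
    return U @ Vt2, P[:, 3]
```

With exact correspondences this recovers the pose directly; in practice solvePnP adds robust initialisation and iterative refinement on top of the same point-pair geometry.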
In another embodiment, as shown in fig. 4, an automatic combined calibration method for a laser radar and a camera is provided, used to realize the joint calibration of the laser radar and the camera, the laser radar and the camera being mounted on a device to be calibrated. The method is implemented based on a movable track, a rotatable support, a calibration plate and an industrial personal computer, where the calibration plate is a checkerboard calibration plate, one of the device to be calibrated and the calibration plate is a moving part, and the other is a stationary part, wherein:
the moving part is fixed on the rotatable support, and the stationary part is arranged opposite the moving part and does not move relative to the ground;
the rotatable support has a plurality of preset rotation postures; changing the rotation posture drives the moving part to change its orientation, and the rotatable support is mounted on the movable track;
the movable track has a plurality of preset motion paths, and moving along different paths drives the rotatable support to change its point location;
the industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support and is used for issuing a calibration instruction to complete the combined calibration of the laser radar and the camera, and the automatic combined calibration method of the laser radar and the camera comprises the following steps:
S1, controlling the movable track to move so that the rotatable support is located at a preset point location; controlling the rotatable support to change its rotation posture so that the moving part changes orientation; at each orientation of the moving part, acquiring image frame data and point cloud frame data through the laser radar and the camera; and repeating step S1 until a preset number of image frames and point cloud frames have been acquired. The movable track drives the calibration plate through each point location; on reaching each point location, the support drives the calibration plate to rotate and stop at three poses, and the image and point cloud data of each pose are collected.
S2, calibrating camera internal parameters based on the collected image frame data;
S3, image calibration point extraction: calculating the pixel coordinates of the four outermost vertexes of the calibration plate based on the image frame data; to improve the pixel coordinate calculation accuracy, image undistortion may be performed before the calibration plate vertex extraction.
S4, point cloud calibration point extraction:
s4.1, point cloud frame data preprocessing;
s4.2, calibration plate point cloud extraction: fitting the maximum plane in the point cloud frame data by using an RANSAC algorithm to be used as a calibration plate plane, and extracting the point cloud frame data in the calibration plate plane;
S4.3, rough calculation of calibration plate vertexes: calculating the minimum bounding box of the point cloud frame data in the calibration plate plane, and taking the coordinates of its four vertexes as the estimated point cloud coordinates of the four outermost vertexes of the calibration plate;
s4.4, calibrating plate vertex correction: correcting the point cloud estimated coordinates to obtain final point cloud coordinates;
and S5, solving the transformation relation between the pixel coordinates of the four vertexes at the outermost periphery of the calibration plate and the point cloud final coordinates to obtain a rotation matrix and a translation matrix, and completing the joint calibration of the laser radar and the camera.
In another embodiment, the movable track has two preset motion paths, and the two motion paths form a cross shape; the rotatable support is provided with three preset rotating postures which are forward rotation, left rotation and right rotation, wherein the rotating angle of the left rotation and the right rotation is smaller than 30 degrees.
In another embodiment, the calculating of the pixel coordinates of the four vertices of the outermost periphery of the calibration plate based on the image frame data includes:
extracting corner position information from the image frame data using a growth algorithm, calculating the pixel coordinates of the four outermost vertexes of the calibration plate from the calibration plate customization parameters and the corner position information, and storing the four vertex pixel coordinates clockwise starting from the upper-left corner of the calibration plate.
In another embodiment, correcting the estimated point cloud coordinates to obtain the final point cloud coordinates includes:
S4.4.1, taking the point cloud frame data in the calibration plate plane as input, obtaining the three principal component directions of the point cloud by the PCA method, recorded as y_pca, z_pca and x_pca;
S4.4.2, fine-tuning the obtained principal directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate;
S4.4.3, taking the adjusted y_pca as the length direction of the calibration plate, the adjusted z_pca as the width direction, and x_pca as the normal direction, calculating the coordinates of the four vertexes of the adjusted minimum bounding box as the final point cloud coordinates of the four outermost vertexes of the calibration plate, and storing them clockwise starting from the upper-left corner of the calibration plate.
In another embodiment, fine-tuning the obtained principal directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate comprises the following steps:
S4.4.2.1, calculating the point cloud coordinates of the four outermost vertexes of the calibration plate, recorded as V1, V2, V3 and V4;
S4.4.2.2, moving the origin of the coordinate axes formed by the three principal directions y_pca, z_pca and x_pca to the estimated point cloud coordinates of the four vertexes in turn, obtaining coordinate systems C1, C2, C3 and C4;
S4.4.2.3, performing the following operations in each of the four coordinate systems C1, C2, C3 and C4:
(1) rotating the principal directions y_pca and z_pca about the principal direction x_pca, based on the origin of the coordinate system, over a span of [-30°, 30°] with a step of 2°;
(2) recording the new y_pca and z_pca obtained after each rotation as y_pca^i and z_pca^i, calculating the plane P_i formed by y_pca^i and z_pca^i, and recording the rotation angle θ_i, where i is the rotation index and i = 1, 2, 3, ..., 30;
(3) projecting the point cloud frame data in the calibration plate plane onto the plane P_i, then calculating the extent of the projection along y_pca^i as the candidate calibration plate length L_i and along z_pca^i as the candidate width W_i, and counting the number Num_i of points inside the rectangular area enclosed by L_i and W_i;
(4) sorting the Num_i values and taking the y_pca^i and z_pca^i corresponding to the i with the largest Num_i.
S4.4.2.4, from the four sets of data (y_pca^i, z_pca^i) obtained by performing the above operations in the four coordinate systems C1, C2, C3 and C4, taking the y_pca^i and z_pca^i whose corresponding length L_i and width W_i are closest to the true length and width of the calibration plate as the adjusted principal directions y_pca and z_pca.
For specific limitations of the automatic combined calibration method for the laser radar and the camera, reference may be made to the above limitations of the automatic combined calibration device for the laser radar and the camera, and details thereof are not repeated here.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations are described, but any combination of these technical features should be considered within the scope of this specification as long as no contradiction arises.
The above embodiments express only several implementations of the present application and are described in relative detail, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An automatic combined calibration device for a laser radar and a camera, used to realize the joint calibration of the laser radar and the camera, the laser radar and the camera being mounted on a device to be calibrated, characterized in that the automatic combined calibration device for the laser radar and the camera comprises: a movable track, a rotatable support, a calibration plate and an industrial personal computer, the calibration plate being a checkerboard calibration plate, and one of the device to be calibrated and the calibration plate being a moving part while the other is a stationary part, wherein:
the moving part is fixed on the rotatable support, and the stationary part is arranged opposite the moving part and does not move relative to the ground;
the rotatable support has a plurality of preset rotation postures; changing the rotation posture drives the moving part to change its orientation, and the rotatable support is mounted on the movable track;
the movable track has a plurality of preset motion paths, and moving along different paths drives the rotatable support to change its point location;
the industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support and used for issuing a calibration instruction to complete the combined calibration of the laser radar and the camera, and the industrial personal computer specifically executes the following operations:
S1, controlling the movable track to move so that the rotatable support is located at a preset point location; controlling the rotatable support to change its rotation posture so that the moving part changes orientation; at each orientation of the moving part, acquiring image frame data and point cloud frame data through the laser radar and the camera; and repeating step S1 until a preset number of image frames and point cloud frames have been acquired;
s2, calibrating camera internal parameters based on the collected image frame data;
S3, image calibration point extraction: calculating the pixel coordinates of the four outermost vertexes of the calibration plate based on the image frame data;
s4, point cloud calibration point extraction:
s4.1, point cloud frame data preprocessing;
s4.2, fitting the maximum plane in the point cloud frame data by using a RANSAC algorithm to serve as a calibration plate plane, and extracting the point cloud frame data in the calibration plate plane;
s4.3, calculating a minimum external frame of point cloud frame data in the plane of the calibration plate, and taking the coordinates of four vertexes of the minimum external frame as point cloud estimation coordinates of four vertexes at the outermost periphery of the calibration plate;
s4.4, correcting the point cloud estimated coordinates to obtain final point cloud coordinates;
and S5, solving the transformation relation between the pixel coordinates of the four vertexes at the outermost periphery of the calibration plate and the point cloud final coordinates to obtain a rotation matrix and a translation matrix, and completing the joint calibration of the laser radar and the camera.
2. The automatic joint calibration device for lidar and a camera according to claim 1, wherein the movable track has two preset motion paths, and the two motion paths form a cross shape; the rotatable support is provided with three preset rotating postures which are forward rotation, left rotation and right rotation, wherein the rotating angle of the left rotation and the right rotation is smaller than 30 degrees.
3. The automated lidar and camera combined calibration apparatus of claim 1, wherein the calculating pixel coordinates of the four vertices of the outermost periphery of the calibration plate based on the image frame data comprises:
extracting corner position information from the image frame data using a growth algorithm, calculating the pixel coordinates of the four outermost vertexes of the calibration plate from the calibration plate customization parameters and the corner position information, and storing the four vertex pixel coordinates clockwise starting from the upper-left corner of the calibration plate.
4. The automatic combined calibration device for a laser radar and a camera according to claim 1, wherein correcting the estimated point cloud coordinates to obtain the final point cloud coordinates comprises:
S4.4.1, taking the point cloud frame data in the calibration plate plane as input, obtaining the three principal component directions of the point cloud by the PCA method, recorded as y_pca, z_pca and x_pca;
S4.4.2, fine-tuning the obtained principal directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate;
S4.4.3, taking the adjusted y_pca as the length direction of the calibration plate, the adjusted z_pca as the width direction, and x_pca as the normal direction, calculating the coordinates of the four vertexes of the adjusted minimum bounding box as the final point cloud coordinates of the four outermost vertexes of the calibration plate, and storing them clockwise starting from the upper-left corner of the calibration plate.
5. The automatic combined calibration device for a laser radar and a camera according to claim 4, wherein fine-tuning the obtained principal directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate comprises the following steps:
S4.4.2.1, calculating the point cloud coordinates of the four outermost vertexes of the calibration plate, recorded as V1, V2, V3 and V4;
S4.4.2.2, moving the origin of the coordinate axes formed by the three principal directions y_pca, z_pca and x_pca to the estimated point cloud coordinates of the four vertexes in turn, obtaining coordinate systems C1, C2, C3 and C4;
S4.4.2.3, performing the following operations in each of the four coordinate systems C1, C2, C3 and C4:
(1) rotating the principal directions y_pca and z_pca about the principal direction x_pca, based on the origin of the coordinate system, over a span of [-30°, 30°] with a step of 2°;
(2) recording the new y_pca and z_pca obtained after each rotation as y_pca^i and z_pca^i, calculating the plane P_i formed by y_pca^i and z_pca^i, and recording the rotation angle θ_i, where i is the rotation index and i = 1, 2, 3, ..., 30;
(3) projecting the point cloud frame data in the calibration plate plane onto the plane P_i, then calculating the extent of the projection along y_pca^i as the candidate calibration plate length L_i and along z_pca^i as the candidate width W_i, and counting the number Num_i of points inside the rectangular area enclosed by L_i and W_i;
(4) sorting the Num_i values and taking the y_pca^i and z_pca^i corresponding to the i with the largest Num_i.
S4.4.2.4, from the four sets of data (y_pca^i, z_pca^i) obtained by performing the above operations in the four coordinate systems C1, C2, C3 and C4, taking the y_pca^i and z_pca^i whose corresponding length L_i and width W_i are closest to the true length and width of the calibration plate as the adjusted principal directions y_pca and z_pca.
6. An automatic combined calibration method for a laser radar and a camera, used to realize the joint calibration of the laser radar and the camera, the laser radar and the camera being mounted on a device to be calibrated, characterized in that the method is implemented based on a movable track, a rotatable support, a calibration plate and an industrial personal computer, the calibration plate being a checkerboard calibration plate, and one of the device to be calibrated and the calibration plate being a moving part while the other is a stationary part, wherein:
the moving part is fixed on the rotatable support, and the stationary part is arranged opposite the moving part and does not move relative to the ground;
the rotatable support has a plurality of preset rotation postures; changing the rotation posture drives the moving part to change its orientation, and the rotatable support is mounted on the movable track;
the movable track has a plurality of preset motion paths, and moving along different paths drives the rotatable support to change its point location;
the industrial personal computer is electrically connected with the laser radar, the camera, the movable track and the rotatable support and is used for issuing a calibration instruction to complete the combined calibration of the laser radar and the camera, and the automatic combined calibration method of the laser radar and the camera comprises the following steps:
S1, controlling the movable track to move so that the rotatable support is located at a preset position, and controlling the rotatable support to change its rotation posture so that the moving part changes orientation; at each orientation of the moving part, acquiring image frame data and point cloud frame data through the laser radar and the camera; repeating step S1 until a preset number of image frames and point cloud frames have been acquired;
S2, calibrating the camera intrinsic parameters based on the collected image frame data;
S3, image calibration point extraction: calculating the pixel coordinates of the four outermost vertices of the calibration plate based on the image frame data;
S4, point cloud calibration point extraction:
S4.1, preprocessing the point cloud frame data;
S4.2, fitting the largest plane in the point cloud frame data using the RANSAC algorithm as the calibration plate plane, and extracting the point cloud frame data lying in that plane;
S4.3, calculating the minimum bounding rectangle of the point cloud frame data in the calibration plate plane, and taking the coordinates of its four vertices as the estimated point cloud coordinates of the four outermost vertices of the calibration plate;
S4.4, correcting the estimated point cloud coordinates to obtain the final point cloud coordinates;
S5, solving the transformation between the pixel coordinates of the four outermost vertices of the calibration plate and the final point cloud coordinates to obtain the rotation matrix and translation matrix, completing the joint calibration of the laser radar and the camera.
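The plane fit named in step S4.2 is a standard RANSAC plane segmentation. A minimal NumPy sketch of such a fit follows; this is a generic formulation, not the patent's implementation, and the distance threshold and iteration count are illustrative assumptions:

```python
import numpy as np

def ransac_plane(points, n_iters=200, dist_thresh=0.01, rng=None):
    """Fit the largest plane in an (N, 3) point cloud with RANSAC.

    Returns (normal, d, inlier_mask) for the plane n . p + d = 0.
    """
    rng = np.random.default_rng(rng)
    best_mask, best_count = None, -1
    for _ in range(n_iters):
        # Sample 3 distinct points and form a candidate plane.
        idx = rng.choice(len(points), size=3, replace=False)
        p0, p1, p2 = points[idx]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:            # degenerate (collinear) sample
            continue
        n /= norm
        d = -np.dot(n, p0)
        # Count points within dist_thresh of the candidate plane.
        mask = np.abs(points @ n + d) < dist_thresh
        if mask.sum() > best_count:
            best_count, best_mask = mask.sum(), mask
    # Refine with a least-squares fit over the inliers (SVD).
    inliers = points[best_mask]
    centroid = inliers.mean(axis=0)
    _, _, vt = np.linalg.svd(inliers - centroid)
    n = vt[-1]                      # direction of least variance = normal
    return n, -np.dot(n, centroid), best_mask
```

For real scans, off-the-shelf implementations such as Open3D's `segment_plane` perform the same fit.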
7. The automatic joint calibration method for a lidar and a camera according to claim 6, wherein the movable track has two preset motion paths, the two paths forming a cross shape; the rotatable support has three preset rotation postures, namely facing forward, rotated left and rotated right, the rotation angle of the left and right rotations being less than 30 degrees.
8. The automatic joint calibration method for a lidar and a camera according to claim 6, wherein calculating the pixel coordinates of the four outermost vertices of the calibration plate based on the image frame data comprises:
extracting corner position information from the image frame data using a growing algorithm, calculating the pixel coordinates of the four outermost vertices of the calibration plate from the calibration plate customization parameters and the corner position information, and storing the pixel coordinates of the four vertices in clockwise order starting from the upper-left corner of the calibration plate.
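The outer-vertex computation in claim 8 can be sketched by extrapolating the detected inner-corner grid outward by the plate border. The NumPy sketch below assumes a row-major inner-corner grid and a border expressed in checkerboard squares; both are illustrative assumptions, since the patent's "customization parameters" are not specified here:

```python
import numpy as np

def outer_vertices(corners, rows, cols, border_squares=1.0):
    """Estimate the four outermost plate vertices from the inner-corner grid.

    corners: (rows*cols, 2) pixel coordinates, row-major from the top-left.
    border_squares: plate margin in checkerboard squares (assumed).
    Returns a (4, 2) array ordered clockwise from the top-left vertex.
    """
    g = np.asarray(corners, float).reshape(rows, cols, 2)
    b = border_squares
    # Extrapolate each extreme corner outward using the local grid step
    # (one square, in pixels) along the row and column directions.
    tl = g[0, 0] - b * (g[0, 1] - g[0, 0]) - b * (g[1, 0] - g[0, 0])
    tr = g[0, -1] + b * (g[0, -1] - g[0, -2]) - b * (g[1, -1] - g[0, -1])
    br = g[-1, -1] + b * (g[-1, -1] - g[-1, -2]) + b * (g[-1, -1] - g[-2, -1])
    bl = g[-1, 0] - b * (g[-1, 1] - g[-1, 0]) + b * (g[-1, 0] - g[-2, 0])
    return np.array([tl, tr, br, bl])   # clockwise from top-left
```

In practice the inner-corner grid itself would come from a checkerboard detector such as OpenCV's `findChessboardCorners`.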
9. The automatic joint calibration method for a lidar and a camera according to claim 6, wherein correcting the estimated point cloud coordinates to obtain the final point cloud coordinates comprises:
S4.4.1, taking the point cloud frame data in the calibration plate plane as input, obtaining the three principal component directions of the point cloud by the PCA method, denoted y_pca, z_pca and x_pca;
S4.4.2, fine-tuning the obtained principal component directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate;
S4.4.3, taking the adjusted principal component direction y_pca as the length direction of the calibration plate, the adjusted principal component direction z_pca as its width direction, and the principal component direction x_pca as its normal direction, calculating the coordinates of the four vertices of the adjusted minimum bounding rectangle as the final point cloud coordinates of the four outermost vertices of the calibration plate, and storing them in clockwise order starting from the upper-left corner of the calibration plate.
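The PCA of step S4.4.1 reduces to an SVD of the centered point cloud. A minimal NumPy sketch, assuming (as the claim implies) that variance decreases from the plate's length direction to its width direction to its normal:

```python
import numpy as np

def pca_directions(points):
    """Principal component directions of an (N, 3) point cloud (step S4.4.1).

    Returns (y_pca, z_pca, x_pca): unit vectors sorted by decreasing
    variance, so y_pca tracks the plate's length, z_pca its width,
    and x_pca (least variance) its normal.
    """
    centered = points - points.mean(axis=0)
    # Rows of vt are the principal directions, ordered by singular value.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    y_pca, z_pca, x_pca = vt
    return y_pca, z_pca, x_pca
```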
10. The automatic joint calibration method for a lidar and a camera according to claim 9, wherein fine-tuning the obtained principal component directions y_pca and z_pca so that they approach the true length and width directions of the calibration plate comprises:
S4.4.2.1, calculating the estimated point cloud coordinates of the four outermost vertices of the calibration plate, denoted V1, V2, V3 and V4;
S4.4.2.2, moving the origin of the coordinate axes formed by the three principal component directions y_pca, z_pca and x_pca to the estimated point cloud coordinates of the four vertices in turn, obtaining coordinate systems C1, C2, C3 and C4;
S4.4.2.3, performing the following operations in each of the four coordinate systems C1, C2, C3 and C4:
(1) rotating the principal component directions y_pca and z_pca about the axis x_pca through the origin of the coordinate system, over the range [-30°, 30°] with a step of 2°;
(2) after each rotation of y_pca and z_pca, recording the new directions, denoted y_pca^i and z_pca^i, calculating the plane P_i formed by y_pca^i and z_pca^i, and recording the rotation angle θ_i, where i is the rotation index and i = 1, 2, 3, ..., 30;
(3) projecting the point cloud frame data in the calibration plate plane onto the plane P_i, then calculating the extents of the projection along the y_pca^i and z_pca^i directions as the calibration plate length l_i and width w_i, and simultaneously counting the number Num_i of point cloud points inside the rectangular area enclosed by l_i and w_i;
(4) sorting the values Num_i and taking the y_pca^i and z_pca^i corresponding to the index i at which Num_i is maximal;
S4.4.2.4, from the four groups of data (y_pca^i, z_pca^i) obtained by performing the above operations in the four coordinate systems C1, C2, C3 and C4, selecting the group whose corresponding length and width are closest to the true length and width of the calibration plate as the adjusted principal component directions y_pca and z_pca.
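The sweep in claim 10 (rotate the in-plane axes about the normal, measure the projected extents, keep the best orientation) can be sketched as follows. This is a simplified, single-origin version of the four-coordinate-system search, and it selects the orientation with the smallest enclosing-rectangle area, a common stand-in for the patent's maximum-point-count criterion (both peak when the axes align with the plate edges); the angle range and step match the claimed values:

```python
import numpy as np

def rotate_about_axis(v, axis, theta):
    """Rodrigues rotation of vector v about a unit axis by angle theta."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(theta)
            + np.cross(axis, v) * np.sin(theta)
            + axis * np.dot(axis, v) * (1 - np.cos(theta)))

def refine_axes(points, y_pca, z_pca, x_pca, span_deg=30, step_deg=2):
    """Sweep y_pca/z_pca about the normal x_pca over [-span, span]."""
    centered = points - points.mean(axis=0)
    best = None
    for deg in range(-span_deg, span_deg + 1, step_deg):
        theta = np.radians(deg)
        y_i = rotate_about_axis(y_pca, x_pca, theta)
        z_i = rotate_about_axis(z_pca, x_pca, theta)
        u = centered @ y_i              # projection on candidate length axis
        v = centered @ z_i              # projection on candidate width axis
        l_i = u.max() - u.min()         # candidate plate length l_i
        w_i = v.max() - v.min()         # candidate plate width w_i
        # Smallest bounding-rectangle area <=> axes aligned with edges.
        if best is None or l_i * w_i < best[0]:
            best = (l_i * w_i, deg, y_i, z_i, l_i, w_i)
    _, deg, y_best, z_best, l_best, w_best = best
    return deg, y_best, z_best, l_best, w_best
```

On a synthetic plate rotated 10° in its own plane, the sweep recovers that angle and the true plate dimensions.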
CN202111539056.0A 2021-12-15 2021-12-15 Automatic combined calibration device and method for laser radar and camera Active CN114371472B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111539056.0A CN114371472B (en) 2021-12-15 2021-12-15 Automatic combined calibration device and method for laser radar and camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111539056.0A CN114371472B (en) 2021-12-15 2021-12-15 Automatic combined calibration device and method for laser radar and camera

Publications (2)

Publication Number Publication Date
CN114371472A true CN114371472A (en) 2022-04-19
CN114371472B CN114371472B (en) 2024-07-12

Family

ID=81140398

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111539056.0A Active CN114371472B (en) 2021-12-15 2021-12-15 Automatic combined calibration device and method for laser radar and camera

Country Status (1)

Country Link
CN (1) CN114371472B (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020233443A1 (en) * 2019-05-21 2020-11-26 菜鸟智能物流控股有限公司 Method and device for performing calibration between lidar and camera
CN112230204A (en) * 2020-10-27 2021-01-15 深兰人工智能(深圳)有限公司 Combined calibration method and device for laser radar and camera
CN112819903A (en) * 2021-03-02 2021-05-18 福州视驰科技有限公司 Camera and laser radar combined calibration method based on L-shaped calibration plate


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘今越; 唐旭; 贾晓辉; 杨冬; 李铁军: "Efficient calibration method for the extrinsic parameters between a 3D lidar and a camera", Chinese Journal of Scientific Instrument, no. 11, 15 November 2019 (2019-11-15) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114782556A (en) * 2022-06-20 2022-07-22 季华实验室 Camera and laser radar registration method, system and storage medium
CN114782556B (en) * 2022-06-20 2022-09-09 季华实验室 Camera and laser radar registration method and system and storage medium
WO2024109403A1 (en) * 2022-11-24 2024-05-30 梅卡曼德(北京)机器人科技有限公司 3d camera calibration method, point cloud image acquisition method, and camera calibration system
CN116563391A (en) * 2023-05-16 2023-08-08 深圳市高素科技有限公司 Automatic laser structure calibration method based on machine vision
CN116563391B (en) * 2023-05-16 2024-02-02 深圳市高素科技有限公司 Automatic laser structure calibration method based on machine vision

Also Published As

Publication number Publication date
CN114371472B (en) 2024-07-12

Similar Documents

Publication Publication Date Title
CN114371472A (en) Automatic combined calibration device and method for laser radar and camera
CN109872372B (en) Global visual positioning method and system for small quadruped robot
CN110103217B (en) Industrial robot hand-eye calibration method
CN108436909A (en) A kind of hand and eye calibrating method of camera and robot based on ROS
CN110728715A (en) Camera angle self-adaptive adjusting method of intelligent inspection robot
CN112837383B (en) Camera and laser radar recalibration method and device and computer readable storage medium
CN108830906B (en) Automatic calibration method for camera parameters based on virtual binocular vision principle
CN108038886B (en) Binocular camera system calibration method and device and automobile
CN110450163A (en) The general hand and eye calibrating method based on 3D vision without scaling board
CN112396664A (en) Monocular camera and three-dimensional laser radar combined calibration and online optimization method
CN107610185A (en) A kind of fisheye camera fast calibration device and scaling method
CN111083381B (en) Image fusion method and device, double-optical camera and unmanned aerial vehicle
CN111815710B (en) Automatic calibration method for fish-eye camera
CN107607090B (en) Building projection correction method and device
CN111612794A (en) Multi-2D vision-based high-precision three-dimensional pose estimation method and system for parts
CN110312111A (en) The devices, systems, and methods calibrated automatically for image device
CN114310901B (en) Coordinate system calibration method, device, system and medium for robot
CN113793270A (en) Aerial image geometric correction method based on unmanned aerial vehicle attitude information
CN111047631A (en) Multi-view three-dimensional point cloud registration method based on single Kinect and round box
CN114241269B (en) A collection card vision fuses positioning system for bank bridge automatic control
CN111083376A (en) Method, system and device for determining installation position of target object and electronic equipment
CN114708326A (en) Full-automatic camera calibration system and method for adaptively adjusting brightness and ambiguity
CN110163823B (en) Multi-view image correction method and system based on capsule robot and oriented to drain pipe
CN107123135B (en) A kind of undistorted imaging method of unordered three-dimensional point cloud
CN113658221A (en) Monocular camera-based AGV pedestrian following method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant