CN114972532B - External parameter calibration method, device, equipment and storage medium between laser radars - Google Patents

External parameter calibration method, device, equipment and storage medium between laser radars

Info

Publication number
CN114972532B
CN114972532B (application CN202210536165.5A)
Authority
CN
China
Prior art keywords
point cloud
cloud data
data set
initial
ground
Prior art date
Legal status
Active
Application number
CN202210536165.5A
Other languages
Chinese (zh)
Other versions
CN114972532A (en)
Inventor
李怡康
闫国行
魏鹏锦
Current Assignee
Shanghai AI Innovation Center
Original Assignee
Shanghai AI Innovation Center
Priority date
Filing date
Publication date
Application filed by Shanghai AI Innovation Center filed Critical Shanghai AI Innovation Center
Priority to CN202210536165.5A
Publication of CN114972532A
Application granted
Publication of CN114972532B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application belongs to the technical field of laser radars, and particularly relates to an external parameter calibration method, device and equipment between laser radars and a storage medium. The external parameter calibration method is applied to a multi-laser radar system, wherein the multi-laser radar system comprises a first laser radar and a second laser radar, and the method comprises the following steps: acquiring a first point cloud data set acquired by a first laser radar and a second point cloud data set acquired by a second laser radar; determining a first ground plane according to the first point cloud data set, and determining a second ground plane according to the second point cloud data set; determining a first external parameter according to the first ground plane and the second ground plane; converting the first point cloud data set by using the first external parameter to obtain a third point cloud data set; registering the second point cloud data set and the third point cloud data set to determine a second external parameter; and determining a target external parameter between the first laser radar and the second laser radar according to the first external parameter and the second external parameter. The method provided by the application can improve the accuracy of external parameter calibration between the laser radars.

Description

External parameter calibration method, device, equipment and storage medium between laser radars
Technical Field
The application belongs to the technical field of laser radars, and particularly relates to a method, a device, equipment and a storage medium for calibrating external parameters between laser radars.
Background
The laser radar equipment can assist the vehicle to sense surrounding environment information so as to realize functions of automatic driving, blind area detection, pedestrian collision early warning and the like. In order to further improve the sensing capability of the vehicle to the surrounding environment, a plurality of lidars are usually arranged on the vehicle to enlarge the detection range. Because the positions of the laser radars are different, the environmental data acquired by the laser radar devices can deviate to a certain extent, and therefore, external parameters such as a rotation matrix, a translation vector and the like of the laser radar devices are generally required to be calibrated, so that the data acquired by the laser radar devices are registered.
There are two calibration methods commonly used in the prior art. One is to determine the relative attitude of the lidars from trajectories estimated by inertial sensors, visual sensors and other sensors; however, when the accuracy of the estimated trajectories is low, the lidar calibration has large errors. The other estimates the external parameters of the lidars using signals reflected by auxiliary objects with special shapes and/or materials placed in the environment; however, this method requires manual measurement of the actual size and position of the auxiliary objects, and the accuracy of these measurements affects the accuracy of the calibration.
Disclosure of Invention
In view of the above, the embodiments of the present application provide a method, an apparatus, a device, and a storage medium for calibrating external parameters between lidars, which can improve the accuracy of external parameter calibration between lidars.
In a first aspect, an embodiment of the present application provides a method for calibrating external parameters between lidars, which is applied to a multi-lidar system, where the multi-lidar system includes a first lidar and a second lidar, and the method includes:
acquiring a first point cloud data set acquired by a first laser radar and a second point cloud data set acquired by a second laser radar;
determining a first ground plane according to the first point cloud data set, and determining a second ground plane according to the second point cloud data set;
Determining a first external parameter according to the first ground plane and the second ground plane;
converting the first point cloud data set by using the first external parameter to obtain a third point cloud data set;
registering the second point cloud data set and the third point cloud data set to determine a second external parameter;
And determining a target external parameter between the first laser radar and the second laser radar according to the first external parameter and the second external parameter.
In one possible implementation manner of the first aspect, determining the first ground plane according to the first point cloud data set and determining the second ground plane according to the second point cloud data set includes:
performing plane fitting on point clouds in the first point cloud data set to determine a first ground plane;
And performing plane fitting on the point clouds in the second point cloud data set to determine a second ground plane.
In one possible implementation manner of the first aspect, the first extrinsic parameters include an initial rotation matrix and an initial translation vector;
Determining a first external parameter from the first ground plane and the second ground plane, comprising:
Determining a first rotation matrix and a z-axis initial offset in an initial translation vector according to a first normal vector of the first ground plane and a second normal vector of the second ground plane, wherein the first rotation matrix indicates an initial pitch angle and an initial roll angle of the first laser radar relative to the second laser radar;
Converting the first non-ground point set in the first point cloud data set by using the first rotation matrix and the z-axis initial offset to obtain a third non-ground point set;
Determining an initial yaw angle of the first lidar relative to the second lidar, an initial offset of an x-axis in an initial translation vector, and an initial offset of a y-axis in the initial translation vector according to a second non-ground point set and a third non-ground point set in the second point cloud data set;
determining a second rotation matrix from the initial yaw angle;
An initial rotation matrix is determined from the first rotation matrix and the second rotation matrix.
In one possible implementation manner of the first aspect, the method for determining the first rotation matrix and the initial offset of the z-axis includes:
determining an axis of rotation and a rotation angle of the first ground plane relative to the second ground plane according to the first normal vector and the second normal vector;
Determining a first rotation matrix from the rotation axis and the rotation angle;
Rotating the first ground plane according to the first rotation matrix to obtain a third ground plane;
the distance between the second ground plane and the third ground plane is determined to be the z-axis initial offset.
In one possible implementation manner of the first aspect, determining an initial yaw angle of the first lidar relative to the second lidar, an x-axis initial offset in an initial translation vector, and a y-axis initial offset in the initial translation vector according to the second non-ground point set and the third non-ground point set in the second point cloud data set includes:
the relative yaw angle, the x-axis offset, and the y-axis offset that minimize the distance between the second set of non-ground points in the second point cloud data set and the third set of non-ground points are determined to be the initial yaw angle, the x-axis initial offset, and the y-axis initial offset, respectively.
In a possible implementation manner of the first aspect, registering the second point cloud data set and the third point cloud data set, determining the second external parameter includes:
Determining a third external parameter which minimizes the distance between the point cloud in the second point cloud data set and the point cloud in the third point cloud data set as a first optimized external parameter by an iterative nearest neighbor point matching method with a normal;
converting the fourth non-ground point set in the third point cloud data set according to the first optimization external parameters to obtain a fifth non-ground point set;
registering the fifth non-ground point set and a second non-ground point set in the second point cloud data set through an octree algorithm, and determining a second optimization external parameter;
and determining a second external parameter according to the first optimized external parameter and the second optimized external parameter.
In a possible implementation manner of the first aspect, registering the fifth non-ground point set and the second non-ground point set in the second point cloud data set by an octree algorithm, and determining the second optimized external parameter, includes:
and determining a fourth external parameter which enables the volume of a first space formed by the fifth non-ground point set and a second non-ground point set in the second point cloud data set to be minimum as the second optimized external parameter through an octree algorithm.
In a second aspect, an embodiment of the present application provides an external parameter calibration device between lidars, where the device includes:
the data acquisition unit is used for acquiring a first point cloud data set acquired by the first laser radar and a second point cloud data set acquired by the second laser radar;
A plane determining unit for determining a first ground plane from the first point cloud data set and a second ground plane from the second point cloud data set;
The first determining unit is used for determining a first external parameter according to the first ground plane and the second ground plane;
the data conversion unit is used for converting the first point cloud data set by utilizing the first external parameter to obtain a third point cloud data set;
the second determining unit is used for registering the second point cloud data set and the third point cloud data set and determining a second external parameter;
And the target determining unit is used for determining the target external parameters between the first laser radar and the second laser radar according to the first external parameters and the second external parameters.
In a third aspect, an embodiment of the present application provides a terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to any one of the first aspects when the computer program is executed.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method according to any of the first aspects.
In a fifth aspect, embodiments of the present application provide a computer program product which, when run on a processor, causes the processor to perform the steps of the method of any of the first aspects described above.
Compared with the prior art, the embodiments of the present application have the following beneficial effects. The application provides an external parameter calibration method, device and equipment between laser radars and a readable storage medium. According to the application, the ground plane is determined from the large amount of ground point cloud data contained in the point cloud data sets acquired by the laser radars, and the external parameters between two laser radars can be determined by aligning the different ground planes determined from different laser radars. In this way, the external parameters between the two laser radars can be calibrated using the ground plane information present in everyday road scenes, and the accuracy of the external parameters between the laser radars can be improved to a certain extent without the help of other types of sensors or auxiliary objects.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a multi-lidar system according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a method for calibrating external parameters between lidars according to an embodiment of the present application;
FIG. 3 is a schematic view of a three-dimensional space according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an external parameter calibration device between lidars according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a terminal device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The laser radar equipment can assist the vehicle to sense surrounding environment information so as to realize functions of automatic driving, blind area detection, pedestrian collision early warning and the like. In order to further improve the sensing capability of the vehicle to the surrounding environment, a plurality of lidars are usually arranged on the vehicle to enlarge the detection range. Because the positions of the laser radars are different, the environmental data acquired by the laser radar devices can deviate to a certain extent, and therefore, external parameters such as a rotation matrix, a translation vector and the like among the laser radar devices are generally required to be calibrated, so that the data acquired by the laser radar devices are registered.
There are two commonly used calibration methods in the prior art. One is a motion-based calibration method, also known as hand-eye calibration in Simultaneous Localization and Mapping (SLAM), which is used to map sensor-centric measurements into the robot workspace so that the robot can move the sensors accurately. The classical formulation of hand-eye calibration is AX = XB, where A and B represent the motions of the two sensors, respectively, and X represents the positional relationship between them; for such methods the focus is therefore on accurately estimating the trajectory of each sensor, and when the accuracy of the estimated trajectories is low, the calibration of the lidars has large errors. The other is a target-based calibration method, which requires overlapping fields of view of the sensors and corresponding feature points, so that calibration is performed using the information shared between the sensors. For example, an auxiliary object with a special shape is made of a special material so that it strongly reflects the lidar signal, making its position easy to determine in the radar data. To further improve precision, there are prior-art methods that calibrate with a plurality of auxiliary objects: the size and placement position of each auxiliary object are measured in advance, the auxiliary objects are then extracted from the signal one by one, and calibration is performed against these pre-measured targets, so the accuracy of the measured values directly affects the accuracy of the external parameter calibration.
In order to solve the above technical problems, an embodiment of the present application provides a method, a device, equipment and a storage medium for calibrating external parameters between laser radars. A first point cloud data set and a second point cloud data set are acquired by a first laser radar and a second laser radar, respectively; both data sets contain a large amount of ground plane information, so a first ground plane can be determined from the first point cloud data set and a second ground plane from the second point cloud data set. Coarse calibration is performed on the first ground plane and the second ground plane to determine a first external parameter, and the first point cloud data set is converted into a third point cloud data set using the first external parameter. Fine calibration is then performed on the second point cloud data set and the third point cloud data set to determine a second external parameter, and the target external parameter between the first laser radar and the second laser radar is determined from the first external parameter and the second external parameter. In this way, external parameter calibration between two laser radars can be achieved using the ground plane information present in everyday road scenes, and the calibration accuracy can be improved to a certain extent without other types of sensors or auxiliary objects that need to be measured manually.
The technical scheme of the application is described in detail below by specific examples. The following embodiments may be combined with each other, and some embodiments may not be repeated for the same or similar concepts or processes.
The embodiment of the application provides a multi-laser radar system. The multiple lidar system includes at least two lidars disposed at different locations, each of which has a different detection angle, but each of which can detect information on a ground plane.
As an example, a multiple lidar system provided on a vehicle as shown in fig. 1 includes five lidars provided on the roof, front side, rear side, left side and right side of the vehicle, respectively, each of which has a different detection range: the detection range of the lidar provided on the roof of the vehicle is a circular area centered on the vehicle, the detection range of the lidar provided on the front side of the vehicle is a sector area in front of the vehicle, the detection range of the lidar provided on the rear side of the vehicle is a sector area behind the vehicle, the detection range of the lidar provided on the left side of the vehicle is a sector area to the left of the vehicle, and the detection range of the lidar provided on the right side of the vehicle is a sector area to the right of the vehicle. The detection ranges of the four lidars provided on the front, rear, left and right sides of the vehicle all have regions that overlap the detection range of the lidar provided on the roof of the vehicle, and all of them can acquire information on the ground plane.
For a laser radar system with a plurality of different observation ranges, external parameters between any two laser radars need to be calibrated, so that the relative position relation between the two laser radars is accurately found, and different data acquired by different laser radars are matched or spliced, so that the accuracy of detection data is ensured. Based on the multi-laser radar system shown in fig. 1, external parameter data between four laser radars arranged on the front side, the rear side, the left side and the right side of the vehicle and the laser radars arranged on the top of the vehicle are respectively determined by an external parameter calibration method, so that different data acquired by different laser radars are matched.
In a possible implementation manner, a flowchart of the method for calibrating the external parameters between the lidars according to the embodiment of the present application is shown in fig. 2, and the method may be applied to the above-mentioned multiple lidar system, and for any two lidars of the multiple lidar system, namely, the first lidar and the second lidar, the external parameters between the first lidar and the second lidar may be determined by using the method for calibrating the external parameters provided by the embodiment of the present application. The external parameter calibration method between the laser radars provided by the embodiment of the application comprises the following steps S201 to S206.
S201, acquiring a first point cloud data set acquired by a first laser radar and a second point cloud data set acquired by a second laser radar.
It should be noted that the first point cloud data set and the second point cloud data set each include three-dimensional position coordinates corresponding to a plurality of point clouds, and the three-dimensional position coordinates of any one point cloud may be expressed as $(x_i, y_i, z_i)$. The detection ranges of the first lidar and the second lidar have a partially overlapping area, and both the first lidar and the second lidar can acquire a large amount of point cloud information on the ground plane. The first lidar and the second lidar may be disposed at different positions of the same device, for example the intelligent driving car shown in fig. 1: the first lidar may be any one of the four lidars disposed on the front, rear, left and right sides of the vehicle in the multiple lidar system shown in fig. 1, and the second lidar may be the lidar disposed on the roof of the vehicle in the multiple lidar system shown in fig. 1. The first lidar and the second lidar may also be provided on different devices.
S202, determining a first ground plane according to the first point cloud data set, and determining a second ground plane according to the second point cloud data set.
It should be noted that, in general, a lidar may collect a large amount of point cloud information on a ground plane on a road, and thus, a maximum plane containing the maximum point cloud may be regarded as the ground plane. In the embodiment of the application, the plane fitting can be performed on the point clouds in the first point cloud data set through a plane extraction algorithm, the first ground plane is determined, the plane fitting can be performed on the point clouds in the second point cloud data set, and the second ground plane is determined.
Illustratively, taking the first point cloud data set as an example, the first ground plane may be determined by a RANSAC plane extraction algorithm, and the specific procedure includes the following steps:
randomly selecting L point clouds from the point clouds in the first point cloud data set, wherein L is a positive integer greater than or equal to 3;
Step two, obtaining a fitted plane by solving the following least squares problem, i.e. solving the four parameters a, b, c, d of the plane equation $ax + by + cz + d = 0$ such that the sum of squared distances from the L point clouds to the plane is minimal; the first objective function to be solved can be expressed as:

$$\min_{a,b,c,d}\;\sum_{i=1}^{L}\frac{\left(a x_i + b y_i + c z_i + d\right)^2}{a^2 + b^2 + c^2}$$
Step three, from the remaining point clouds in the first point cloud data set other than the L sampled point clouds, selecting as ground points those whose distance to the fitted plane is smaller than a preset threshold, and recording the number of ground points that meet this condition.
Step four, when the number of ground points whose distance to the fitted plane is smaller than the preset threshold is larger than a first preset number, taking the fitted plane as the first ground plane; otherwise, repeating steps one to four. A minimal sketch of this procedure is given below.
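As an illustration only, the RANSAC procedure above can be sketched as follows; the function and parameter names (fit_ground_plane_ransac, distance_threshold, min_inliers) are assumptions of this sketch, and L = 3 sample points are used, for which the least-squares fit reduces to the exact plane through the sample.

```python
import numpy as np

def fit_ground_plane_ransac(points, num_iterations=100, distance_threshold=0.05,
                            min_inliers=1000):
    """Fit the dominant (ground) plane a*x + b*y + c*z + d = 0 to an (N, 3) cloud."""
    best_plane, best_count = None, 0
    n = points.shape[0]
    for _ in range(num_iterations):
        # Step one: randomly sample L = 3 points (three points define a plane).
        sample = points[np.random.choice(n, 3, replace=False)]
        # Step two: plane through the sample; the normal is the cross product.
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-8:                      # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        # Step three: count points whose distance to the plane is below the threshold.
        distances = np.abs(points @ normal + d)
        count = int((distances < distance_threshold).sum())
        if count > best_count:
            best_plane, best_count = (normal[0], normal[1], normal[2], d), count
    # Step four: accept the plane only if it gathers enough ground points.
    return best_plane if best_count > min_inliers else None
```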
S203, determining a first external parameter according to the first ground plane and the second ground plane.
In one embodiment, the first extrinsic parameters include an initial rotation matrix and an initial translation vector. The method for determining the first external parameters comprises the following steps: and determining a first rotation matrix and a z-axis initial offset in the initial translation vector according to a first normal vector of the first ground plane and a second normal vector of the second ground plane, wherein the first rotation matrix indicates an initial pitch angle and an initial roll angle of the first laser radar relative to the second laser radar. And converting the first non-ground point set in the first point cloud data set by using the first rotation matrix and the z-axis initial offset to obtain a third non-ground point set. And determining an initial yaw angle of the first laser radar relative to the second laser radar, an initial x-axis offset in an initial translation vector and an initial y-axis offset in the initial translation vector according to the second non-ground point set and the third non-ground point set in the second point cloud data set. A second rotation matrix is determined based on the initial yaw angle, and an initial rotation matrix is determined based on the first rotation matrix and the second rotation matrix.
In one example, assume that the first ground plane is denoted as $a_1 x + b_1 y + c_1 z + d_1 = 0$ and the second ground plane is denoted as $a_2 x + b_2 y + c_2 z + d_2 = 0$, so that the normal vector of the first ground plane is $\vec{n}_1 = (a_1, b_1, c_1)$ and the normal vector of the second ground plane is $\vec{n}_2 = (a_2, b_2, c_2)$. From the first normal vector and the second normal vector, a rotation axis and a rotation angle of the first ground plane relative to the second ground plane can be determined, where the rotation angle is $\theta = \arccos\frac{\vec{n}_1 \cdot \vec{n}_2}{\lVert\vec{n}_1\rVert\,\lVert\vec{n}_2\rVert}$ and the rotation axis is $\vec{k} = \frac{\vec{n}_1 \times \vec{n}_2}{\lVert\vec{n}_1 \times \vec{n}_2\rVert}$. The first rotation matrix can then be determined from the rotation axis and the rotation angle based on Rodrigues' rotation formula: $R_1 = \cos\theta\, I + (1 - \cos\theta)\,\vec{k}\vec{k}^{T} + \sin\theta\,[\vec{k}]_{\times}$, where $I$ denotes the identity matrix, $\vec{k}^{T}$ denotes the transpose of the rotation axis $\vec{k}$, and $[\vec{k}]_{\times}$ denotes the antisymmetric (skew-symmetric) matrix of the rotation axis $\vec{k}$.
In this example, the first rotation matrix indicates an initial pitch angle and an initial roll angle of the first lidar relative to the second lidar. According to the first rotation matrix, the first ground plane can be rotated to obtain a third ground plane, which is substantially aligned with the second ground plane, i.e. the second ground plane and the third ground plane are parallel but not coincident. The distance between the second ground plane and the third ground plane is the initial z-axis offset $Z_1$ of the first lidar relative to the second lidar; since the two planes are parallel, this distance equals the distance from any point on the third ground plane to the second ground plane.
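A minimal sketch of the coarse roll/pitch alignment just described, assuming the plane coefficients come from the ground-plane fitting step above; the NumPy helper and its sign convention for the z offset are assumptions of this sketch rather than the embodiment's reference implementation.

```python
import numpy as np

def coarse_roll_pitch_alignment(plane1, plane2):
    """Estimate R1 (Rodrigues' formula) and the initial z offset Z1 from two
    fitted ground planes given as (a, b, c, d) with a*x + b*y + c*z + d = 0."""
    n1 = np.asarray(plane1[:3], dtype=float)
    n2 = np.asarray(plane2[:3], dtype=float)
    n1 /= np.linalg.norm(n1)
    n2 /= np.linalg.norm(n2)
    # Rotation angle and axis bringing the first normal onto the second.
    theta = np.arccos(np.clip(n1 @ n2, -1.0, 1.0))
    axis = np.cross(n1, n2)
    if np.linalg.norm(axis) < 1e-8:
        # Planes already (anti-)parallel; the flipped case is handled by the
        # plus/minus pi check described later in the text.
        R1 = np.eye(3)
    else:
        axis /= np.linalg.norm(axis)
        K = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])   # skew-symmetric matrix [k]x
        # Rodrigues' rotation formula.
        R1 = (np.cos(theta) * np.eye(3)
              + (1.0 - np.cos(theta)) * np.outer(axis, axis)
              + np.sin(theta) * K)
    # After applying R1 the two ground planes are parallel; with unit normals
    # their signed separation serves as the initial z offset Z1
    # (the sign convention here is an assumption).
    z_offset = plane2[3] / np.linalg.norm(plane2[:3]) - plane1[3] / np.linalg.norm(plane1[:3])
    return R1, z_offset
```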
It is to be appreciated that after the first ground plane is determined from the first point cloud data set, a plurality of ground point clouds in the first point cloud data set that are located on the first ground plane may be determined, and remaining point clouds in the first point cloud data set other than the plurality of ground point clouds are non-ground point clouds. The first set of non-ground points in the first set of point cloud data includes a plurality of non-ground point clouds.
By the method provided in the above example, the initial pitch angle, the initial roll angle, and the z-axis initial offset of the first lidar with respect to the second lidar can be roughly estimated. The next step is to determine an initial yaw angle, an initial x-axis offset, and an initial y-axis offset of the first lidar relative to the second lidar.
Assuming that any one of the non-ground point clouds in the first non-ground point set is denoted as $p_i$, then $p_i$ may be transformed according to the first rotation matrix and the z-axis initial offset, i.e. the transformed non-ground point cloud is denoted as $p_i \cdot R_1 + Z_1$. All non-ground point clouds in the first non-ground point set can be converted using the first rotation matrix and the z-axis initial offset to obtain a third non-ground point set, and the relative yaw angle, x-axis offset and y-axis offset that minimize the distance between the second non-ground point set in the second point cloud data set and the third non-ground point set are taken as the initial yaw angle, the x-axis initial offset and the y-axis initial offset, respectively.
Illustratively, the initial yaw angle, the x-axis initial offset and the y-axis initial offset may be obtained by solving a least squares problem, i.e. the second objective function to be solved can be expressed as:

$$\min_{\gamma_1,\,X_1,\,Y_1}\;\sum_{i}\left\lVert \begin{pmatrix}\cos\gamma_1 & -\sin\gamma_1\\ \sin\gamma_1 & \cos\gamma_1\end{pmatrix}\begin{pmatrix}x_{3\_i}\\ y_{3\_i}\end{pmatrix} + \begin{pmatrix}X_1\\ Y_1\end{pmatrix} - \begin{pmatrix}x_{2\_i}\\ y_{2\_i}\end{pmatrix}\right\rVert^2$$
In the above expression, $\gamma_1$ denotes the initial yaw angle, $X_1$ denotes the x-axis initial offset, $Y_1$ denotes the y-axis initial offset, $(x_{3\_i}, y_{3\_i})$ denotes the two-dimensional coordinates of the i-th non-ground point in the third non-ground point set, and $(x_{2\_i}, y_{2\_i})$ denotes the two-dimensional coordinates of the non-ground point in the second non-ground point set that corresponds to the i-th non-ground point in the third non-ground point set. The above second objective function may be solved by a binary search method.
The second rotation matrix $R_2$ may be determined from the initial yaw angle, and multiplying the first rotation matrix $R_1$ by the second rotation matrix $R_2$ determines the initial rotation matrix $R_3$ between the first lidar and the second lidar. The second rotation matrix $R_2$ can be expressed as:

$$R_2 = \begin{pmatrix}\cos\gamma_1 & -\sin\gamma_1 & 0\\ \sin\gamma_1 & \cos\gamma_1 & 0\\ 0 & 0 & 1\end{pmatrix}$$
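The patent solves the second objective function by a binary search. As an illustrative stand-in only, the following sketch scans candidate yaw angles and, for each, estimates the x/y offset from nearest-neighbour correspondences; the use of scipy's cKDTree and the 1-degree yaw grid are assumptions of this sketch, not part of the embodiment.

```python
import numpy as np
from scipy.spatial import cKDTree

def coarse_yaw_xy_search(src_xy, tgt_xy, yaw_candidates=None):
    """Scan candidate yaw angles; for each, estimate the x/y offset from
    nearest-neighbour correspondences and keep the lowest-residual solution.

    src_xy: (N, 2) x/y coordinates of the third non-ground point set.
    tgt_xy: (M, 2) x/y coordinates of the second non-ground point set.
    """
    if yaw_candidates is None:
        yaw_candidates = np.linspace(-np.pi, np.pi, 361)   # 1-degree grid (assumption)
    tree = cKDTree(tgt_xy)
    best = (None, None, None, np.inf)
    for yaw in yaw_candidates:
        c, s = np.cos(yaw), np.sin(yaw)
        rotated = src_xy @ np.array([[c, s], [-s, c]])     # row-vector rotation by yaw
        # For a fixed yaw, take the mean residual to the nearest neighbours as the offset.
        _, idx = tree.query(rotated)
        offset = (tgt_xy[idx] - rotated).mean(axis=0)
        residual = np.sum((rotated + offset - tgt_xy[idx]) ** 2)
        if residual < best[3]:
            best = (yaw, offset[0], offset[1], residual)
    yaw1, x1, y1, _ = best
    return yaw1, x1, y1
```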
Optionally, in extreme cases the initial pitch angle and initial roll angle of the first lidar relative to the second lidar may differ from the actual pitch angle and roll angle by plus or minus pi, causing part of the non-ground point clouds in the third non-ground point set (converted based on the initial pitch angle and initial roll angle) to lie below the actual ground plane. Therefore, before determining the initial yaw angle, the x-axis initial offset and the y-axis initial offset based on the second non-ground point set in the second point cloud data set and the third non-ground point set, it is necessary to detect whether all non-ground point clouds in the third non-ground point set are above the ground plane. If a non-ground point cloud lies below the ground plane, its position coordinates need to be corrected, i.e. the non-ground point cloud is rotated 180 degrees around the rotation axis.
Illustratively, the method of determining whether any one of the non-ground points in the third non-ground point set is above the ground plane comprises: converting the first ground plane according to the first rotation matrix and the z-axis initial offset to obtain the third ground plane; then, for any non-ground point cloud in the third non-ground point set, computing the difference between the non-ground point cloud and any one ground point cloud on the third ground plane to obtain a difference vector. If the inner product of the normal vector of the third ground plane and the difference vector is positive, the non-ground point cloud is above the ground plane; if the inner product is negative, the non-ground point cloud is below the ground plane. A small sketch of this test follows.
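A small sketch of the inner-product test described above; the helper name and the assumption that the plane normal points upward, away from the ground, are illustrative only.

```python
import numpy as np

def is_above_ground(point, ground_point, ground_normal):
    """Check whether a non-ground point lies above the (third) ground plane.

    point: the query non-ground point, (3,).
    ground_point: any ground point cloud lying on the third ground plane, (3,).
    ground_normal: normal vector of the third ground plane (assumed to point upward).
    """
    # Vector from a point on the plane to the query point; its inner product
    # with the plane normal is positive above the plane and negative below it.
    diff = np.asarray(point, dtype=float) - np.asarray(ground_point, dtype=float)
    return float(np.dot(np.asarray(ground_normal, dtype=float), diff)) > 0.0
```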
S204, converting the first point cloud data set by using the first external parameter to obtain a third point cloud data set.
In the embodiment of the application, the first external parameter comprises the initial rotation matrix $R_3$ and the initial translation vector, where the initial translation vector consists of the x-axis initial offset, the y-axis initial offset and the z-axis initial offset, i.e. $T_1 = (X_1, Y_1, Z_1)^{T}$. All point clouds (including non-ground point clouds and ground point clouds) in the first point cloud data set can be converted according to the first external parameter to obtain the third point cloud data set, i.e. the three-dimensional position coordinates of each point cloud in the first point cloud data set are multiplied by the initial rotation matrix and then added to the initial translation vector.
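In code, applying the first external parameter to a point cloud amounts to a matrix multiplication followed by a translation; the following one-line helper is a sketch under the assumed convention that points are stored as rows of an (N, 3) array.

```python
import numpy as np

def apply_extrinsic(points, R, t):
    """Transform an (N, 3) point cloud: each point is rotated by R, then translated by t.

    Illustratively used to map the first point cloud data set into the third point
    cloud data set with the first external parameter (R3, T1).
    """
    return points @ R.T + t
```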
And S205, registering the second point cloud data set and the third point cloud data set, and determining a second external parameter.
Based on the above steps, the external parameter calibration result between the first lidar and the second lidar can be roughly determined, but the point cloud data acquired by the first lidar and the point cloud data acquired by the second lidar are not yet fully registered, so the external parameter is further optimized using an iterative nearest neighbor point matching algorithm with normals (Iterative Closest Point algorithm with Normals, ICPN) and an octree algorithm.
In one embodiment, the third external parameter that minimizes the distance between the point clouds in the second point cloud data set and the point clouds in the third point cloud data set may be determined as the first optimized external parameter by the iterative nearest neighbor point matching method with normals. The specific implementation process includes: finding the closest point pairs between the second point cloud data set and the third point cloud data set, calculating the error after the points in the third point cloud data set are transformed according to an estimated third external parameter (comprising an estimated rotation matrix and an estimated translation vector), and iterating continuously; when the error is smaller than a certain threshold or the number of iterations is reached, the estimated third external parameter is determined as the first optimized external parameter. ICPN enriches the feature information of each point cloud by including the normal of each point cloud, thereby expanding the receptive field of each point cloud, where the normal of any one point cloud is the normal vector of the plane fitted from the 40 other point clouds located around it; therefore, ICPN has higher accuracy than the existing iterative nearest neighbor method (Iterative Closest Point algorithm, ICP).
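As an illustrative sketch only, Open3D's point-to-plane ICP can serve as a stand-in for the ICPN refinement described here; the choice of Open3D, the correspondence distance and the function names are assumptions of this sketch, while the 40-neighbour normal estimation mirrors the description above.

```python
import numpy as np
import open3d as o3d

def refine_with_icpn(source_points, target_points, init_T=np.eye(4),
                     max_corr_dist=0.5):
    """Normal-assisted ICP refinement (an approximation of the ICPN step above).

    source_points, target_points: (N, 3) NumPy arrays of point coordinates.
    init_T: 4x4 homogeneous matrix built from the first external parameter.
    Returns the 4x4 transformation used here as the first optimized external parameter.
    """
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(source_points))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(target_points))
    # Estimate per-point normals from the 40 nearest neighbours, as described above.
    for pc in (src, tgt):
        pc.estimate_normals(o3d.geometry.KDTreeSearchParamKNN(knn=40))
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_corr_dist, init_T,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return result.transformation
```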
The first optimized external parameter comprises a first optimized rotation matrix and a first optimized translation vector. The fourth non-ground point set in the third point cloud data set is converted according to the first optimized external parameter to obtain a fifth non-ground point set; specifically, for any non-ground point in the fourth non-ground point set of the third point cloud data set, the three-dimensional position coordinates of the non-ground point are multiplied by the first optimized rotation matrix and then added to the first optimized translation vector to obtain its converted three-dimensional position coordinates in the fifth non-ground point set.
Furthermore, the embodiment of the application further optimizes the external parameter calibration through an octree algorithm. In one embodiment, the second optimized external parameter may be determined by registering the fifth non-ground point set with the second non-ground point set in the second point cloud data set via an octree algorithm.
Specifically, a three-dimensional space is constructed based on all non-ground point clouds in the fifth non-ground point set and all non-ground point clouds in the second non-ground point set; as shown in fig. 3, this three-dimensional space may be a cube. The octree-based method then uniformly divides the three-dimensional space into eight smaller cubes, where a small cube containing non-ground point clouds is called a first space and a small cube containing no non-ground point clouds is called a second space, and the volume of the three-dimensional space can be expressed as:

$$V_{oc} = \sum_{i=1}^{N} v_i^{(1)} + \sum_{j=1}^{M} v_j^{(2)}$$
wherein $N$ denotes the total number of first spaces in the three-dimensional space, $M$ denotes the total number of second spaces, $V_{oc}$ denotes the volume of the three-dimensional space, $v_i^{(1)}$ denotes the volume of the i-th first space among the $N$ first spaces, with $i$ a positive integer greater than or equal to 1 and less than or equal to $N$, and $v_j^{(2)}$ denotes the volume of the j-th second space among the $M$ second spaces, with $j$ a positive integer greater than or equal to 1 and less than or equal to $M$.
When the fifth non-ground point set is completely aligned with the second non-ground point set in the second point cloud data set, the distance between the two sets is minimal, and the sum of the volumes of all first spaces formed by the fifth non-ground point set and the second non-ground point set is then minimal. Therefore, in the octree algorithm provided by the application, the value of the fourth external parameter is estimated by scanning the independent-variable domain; each estimated fourth external parameter comprises an estimated rotation matrix and an estimated translation vector, which are used to evaluate the volume of the first spaces formed by the fifth non-ground point set and the second non-ground point set, and the volumes of the small cubes in the three-dimensional space are continuously updated and subdivided iteratively (as shown in fig. 3). The fourth external parameter estimated when the sum of the volumes of all first spaces is smaller than a preset threshold, or when the number of iterations is reached, is determined as the second optimized external parameter, which comprises a second optimized rotation matrix $R_4$ and a second optimized translation vector $T_2$. The optimization process can be expressed as:

$$(R_4, T_2) = \arg\min_{R,\,T}\;\sum_{i=1}^{N} v_i^{(1)}(R, T)$$
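A simplified sketch of the volume criterion: instead of a full recursive octree, the following approximates the occupied ("first space") volume with a uniform voxel grid and scans a set of candidate extrinsics; the voxel size and the scanning strategy are assumptions of this sketch, not the embodiment's implementation.

```python
import numpy as np

def occupied_volume(points_a, points_b, voxel_size=0.2):
    """Approximate the 'first space' volume: merge the two non-ground point sets,
    bin them into a uniform voxel grid, and return (non-empty voxels) * voxel volume."""
    merged = np.vstack([points_a, points_b])
    voxels = np.unique(np.floor(merged / voxel_size).astype(np.int64), axis=0)
    return voxels.shape[0] * voxel_size ** 3

def refine_by_volume(src_points, tgt_points, candidate_extrinsics, voxel_size=0.2):
    """Pick the candidate (R, t) minimising the occupied volume, mirroring the
    argument-domain scan described above. candidate_extrinsics: iterable of (R, t)."""
    best, best_vol = None, np.inf
    for R, t in candidate_extrinsics:
        transformed = src_points @ R.T + t
        vol = occupied_volume(transformed, tgt_points, voxel_size)
        if vol < best_vol:
            best, best_vol = (R, t), vol
    return best
```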
Finally, a second external parameter may be determined from the first optimized external parameter and the second optimized external parameter. The second external parameters comprise an optimized rotation matrix and an optimized translation vector, the optimized rotation matrix is the dot product of a first optimized rotation matrix in the first optimized parameters and a second optimized rotation matrix in the second optimized parameters, and the optimized translation vector is the sum of the first optimized translation vector in the first optimized parameters and the second optimized translation vector in the second optimized parameters.
S206, determining target external parameters between the first laser radar and the second laser radar according to the first external parameters and the second external parameters.
Specifically, the target external parameter between the first laser radar and the second laser radar comprises a target rotation matrix and a target translation vector. The target rotation matrix is the dot product of the initial rotation matrix in the first external parameter and the optimized rotation matrix in the second external parameter, and the target translation vector is the sum of the initial translation vector in the first external parameter and the optimized translation vector in the second external parameter.
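Restating the composition described above in equation form, with $R_3$ and $T_1$ denoting the first external parameter and $R_{\mathrm{opt}}$ and $T_{\mathrm{opt}}$ the second:

$$R_{\mathrm{target}} = R_3 \cdot R_{\mathrm{opt}}, \qquad T_{\mathrm{target}} = T_1 + T_{\mathrm{opt}}$$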
Based on the above external parameter calibration method between laser radars, the first ground plane is determined from the first point cloud data set acquired by the first laser radar, and the second ground plane is determined from the second point cloud data set acquired by the second laser radar. Coarse calibration is performed on the first ground plane and the second ground plane to determine the first external parameter, and the coarse calibration result is then optimized in turn using the iterative nearest neighbor point matching method with normals and the octree algorithm to determine the second external parameter, so that the target external parameter between the first laser radar and the second laser radar can be determined from the first external parameter and the second external parameter. The application can thus achieve external parameter calibration between two laser radars using the ground plane information present in everyday road scenes, and can improve the accuracy of external parameter calibration between laser radars to a certain extent without using other types of sensors or auxiliary objects.
Based on the same inventive concept, as an implementation of the external parameter calibration method between the laser radars, the embodiment of the application provides an external parameter calibration device between the laser radars, and the embodiment of the device corresponds to the embodiment of the method. Fig. 4 is a schematic diagram of an external parameter calibration device between lidars according to an embodiment of the present application.
As shown in fig. 4, the external parameter calibration device 4 between lidars provided in this embodiment includes: the data acquisition unit 401 is configured to acquire a first point cloud data set acquired by the first lidar and a second point cloud data set acquired by the second lidar. A plane determination unit 402 for determining a first ground plane from the first point cloud data set and a second ground plane from the second point cloud data set. A first determining unit 403 for determining a first external parameter from the first ground plane and the second ground plane. The data conversion unit 404 is configured to convert the first point cloud data set by using the first external parameter, so as to obtain a third point cloud data set. A second determining unit 405 is configured to register the second point cloud data set and the third point cloud data set, and determine a second external parameter. The target determining unit 406 is configured to determine a target external parameter between the first laser radar and the second laser radar according to the first external parameter and the second external parameter.
The external parameter calibration device between the laser radars provided in this embodiment can realize all the contents in the above method embodiments, and its implementation principle and technical effects are similar, and will not be repeated here.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Based on the same inventive concept, an embodiment of the application also provides a terminal device. As shown in fig. 5, the terminal device 5 of this embodiment includes: a processor 500, a memory 501 and a computer program 502 stored in the memory 501 and executable on the processor 500. The steps of the above embodiments of the external parameter calibration method between laser radars are implemented when the processor 500 executes the computer program 502; alternatively, when executing the computer program 502, the processor 500 performs the functions of the modules/units in the above-described device embodiments, e.g. the functions of the units 401 to 406 shown in fig. 4.
By way of example, the computer program 502 may be partitioned into one or more modules/units, which are stored in the memory 501 and executed by the processor 500 to accomplish the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing a specific function for describing the execution of the computer program 502 in the terminal device 5.
It will be appreciated by those skilled in the art that fig. 5 is merely an example of the terminal device 5 and does not constitute a limitation of the terminal device 5, and may include more or less components than illustrated, or may combine certain components, or different components, e.g., the terminal device 5 may also include input-output devices, network access devices, buses, etc.
The processor 500 may be a central processing unit (Central Processing Unit, CPU), but may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 501 may be an internal storage unit of the terminal device 5, such as a hard disk or a memory of the terminal device 5. The memory 501 may also be an external storage device of the terminal device 5, such as a plug-in hard disk provided on the terminal device 5, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, a flash card, or the like. Further, the memory 501 may also include both an internal storage unit and an external storage device of the terminal device 5. The memory 501 is used to store the computer program and other programs and data required by the terminal device 5. The memory 501 may also be used to temporarily store data that has been output or is to be output.
The terminal device provided by the embodiment of the present application may execute the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
The embodiment of the application also provides a computer readable storage medium, on which a computer program is stored, which when being executed by a processor, implements the method described in the above method embodiment.
The embodiment of the application also provides a computer program product which, when run on a terminal device, causes the terminal device to execute the method described in the embodiment of the method.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program; the computer program may be stored in a computer readable storage medium, and when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file or some intermediate form, etc. The computer readable storage medium may include at least: any entity or device capable of carrying the computer program code to a photographing device/terminal apparatus, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal and a software distribution medium, such as a U-disk, a removable hard disk, a magnetic disk or an optical disk.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and in part, not described or illustrated in any particular embodiment, reference is made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other manners. For example, the apparatus/device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted as "when..once" or "in response to a determination" or "in response to detection" depending on the context. Similarly, the phrase "if a determination" or "if a [ described condition or event ] is detected" may be interpreted in the context of meaning "upon determination" or "in response to determination" or "upon detection of a [ described condition or event ]" or "in response to detection of a [ described condition or event ]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.

Claims (9)

1. A method for calibrating external parameters between lidars, applied to a multi-lidar system, wherein the multi-lidar system comprises a first lidar and a second lidar, the method comprising:
acquiring a first point cloud data set collected by the first lidar and a second point cloud data set collected by the second lidar;
determining a first ground plane according to the first point cloud data set, and determining a second ground plane according to the second point cloud data set;
determining a first rotation matrix and a z-axis initial offset in an initial translation vector according to a first normal vector of the first ground plane and a second normal vector of the second ground plane, wherein the first rotation matrix indicates an initial pitch angle and an initial roll angle of the first lidar relative to the second lidar;
converting a first non-ground point set in the first point cloud data set by using the first rotation matrix and the z-axis initial offset to obtain a third non-ground point set;
determining an initial yaw angle of the first lidar relative to the second lidar, an x-axis initial offset in the initial translation vector, and a y-axis initial offset in the initial translation vector according to a second non-ground point set in the second point cloud data set and the third non-ground point set;
determining a second rotation matrix according to the initial yaw angle;
determining an initial rotation matrix according to the first rotation matrix and the second rotation matrix;
converting the first point cloud data set by using the initial translation vector and the initial rotation matrix to obtain a third point cloud data set;
registering the second point cloud data set and the third point cloud data set to determine a second external parameter; and
determining a target external parameter between the first lidar and the second lidar according to the initial translation vector, the initial rotation matrix, and the second external parameter.
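As an illustration of the final composition step recited in claim 1, the target external parameter can be assembled from the initial rotation matrix, the initial translation vector, and the second external parameter using 4x4 homogeneous transforms. The sketch below is only an illustrative reading of that step, not the patented implementation; the function name and the assumption that the second external parameter is given as a 4x4 transform are placeholders.

```python
import numpy as np

def compose_extrinsics(R_init, t_init, T_second):
    """Combine the initial extrinsic (R_init, t_init) with the refinement
    T_second found by registration, yielding the target extrinsic that maps
    points from the first lidar frame into the second lidar frame.

    R_init   : (3, 3) initial rotation matrix (product of first and second rotations).
    t_init   : (3,)   initial translation vector (x-, y- and z-axis initial offsets).
    T_second : (4, 4) homogeneous transform returned by the registration step.
    """
    T_init = np.eye(4)
    T_init[:3, :3] = R_init
    T_init[:3, 3] = t_init
    # The refinement is applied on top of the initially aligned cloud,
    # so the overall extrinsic is the product of the two transforms.
    return T_second @ T_init
```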
2. The method of claim 1, wherein the determining a first ground plane according to the first point cloud data set and a second ground plane according to the second point cloud data set comprises:
performing plane fitting on the point clouds in the first point cloud data set to determine the first ground plane; and
performing plane fitting on the point clouds in the second point cloud data set to determine the second ground plane.
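The plane fitting recited in claim 2 can be realised, for example, with a RANSAC fit; the sketch below is a generic, minimal version in which the distance threshold and iteration count are arbitrary placeholder values rather than values disclosed in the embodiments.

```python
import numpy as np

def ransac_ground_plane(points, dist_thresh=0.05, iters=200, seed=0):
    """Fit a ground plane n.p + d = 0 to an (N, 3) point cloud with RANSAC.

    Returns (normal, d) with the normal oriented so its z component is positive.
    """
    rng = np.random.default_rng(seed)
    best_inliers = 0
    best_plane = None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        # Normal of the plane through the three sampled points.
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:           # degenerate (collinear) sample
            continue
        n = n / norm
        d = -np.dot(n, sample[0])
        inliers = np.sum(np.abs(points @ n + d) < dist_thresh)
        if inliers > best_inliers:
            best_inliers = inliers
            best_plane = (n, d)
    n, d = best_plane
    if n[2] < 0:                  # keep the normal pointing "up"
        n, d = -n, -d
    return n, d
```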
3. The method of claim 1, wherein the determining the first rotation matrix and the z-axis initial offset comprises:
determining a rotation axis and a rotation angle of the first ground plane relative to the second ground plane according to the first normal vector and the second normal vector;
determining the first rotation matrix according to the rotation axis and the rotation angle;
rotating the first ground plane according to the first rotation matrix to obtain a third ground plane; and
determining a distance between the second ground plane and the third ground plane as the z-axis initial offset.
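Claim 3 builds the first rotation matrix from the rotation axis and angle between the two ground-plane normals; this corresponds to the standard axis-angle (Rodrigues) construction, and the z-axis initial offset follows from the distance between the resulting parallel planes. A minimal numpy sketch under those assumptions (helper names are hypothetical):

```python
import numpy as np

def rotation_between_normals(n1, n2):
    """Rotation matrix that rotates unit normal n1 onto unit normal n2
    (axis = n1 x n2, angle = arccos(n1 . n2)), via the Rodrigues formula."""
    n1 = n1 / np.linalg.norm(n1)
    n2 = n2 / np.linalg.norm(n2)
    axis = np.cross(n1, n2)
    s = np.linalg.norm(axis)                 # sin(angle)
    c = np.clip(np.dot(n1, n2), -1.0, 1.0)   # cos(angle)
    if s < 1e-9:
        # Normals already aligned (the anti-parallel case is ignored here,
        # since both ground normals point roughly upward for level mounts).
        return np.eye(3)
    axis = axis / s
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + s * K + (1 - c) * (K @ K)

def z_initial_offset(d1, d2):
    """For two parallel planes n.p + d1 = 0 and n.p + d2 = 0 sharing a unit
    normal n (the rotated first ground plane and the second ground plane),
    the translation along n that maps the first onto the second is d1 - d2;
    for a roughly level mounting this is used as the z-axis initial offset."""
    return d1 - d2
```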
4. The method of claim 1, wherein the determining an initial yaw angle of the first lidar relative to the second lidar, an x-axis initial offset in the initial translation vector, and a y-axis initial offset in the initial translation vector according to the second non-ground point set in the second point cloud data set and the third non-ground point set comprises:
determining a relative yaw angle, an x-axis offset, and a y-axis offset that minimize a distance between the second non-ground point set in the second point cloud data set and the third non-ground point set as the initial yaw angle, the x-axis initial offset, and the y-axis initial offset, respectively.
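One straightforward way to obtain the quantities in claim 4 is a coarse grid search over candidate yaw angles and x/y offsets, scoring each candidate by the mean nearest-neighbour distance between the transformed third non-ground point set and the second non-ground point set. The search ranges, step sizes, and the use of scipy's KD-tree below are assumptions for illustration only.

```python
import numpy as np
from scipy.spatial import cKDTree

def coarse_yaw_xy_search(src, dst,
                         yaws=np.deg2rad(np.arange(-180.0, 180.0, 2.0)),
                         xs=np.arange(-2.0, 2.0, 0.2),
                         ys=np.arange(-2.0, 2.0, 0.2)):
    """Return (yaw, x, y) minimising the mean nearest-neighbour distance from
    the transformed source non-ground points `src` to the target non-ground
    points `dst` (both (N, 3) arrays, already aligned in roll, pitch and z)."""
    tree = cKDTree(dst)
    best = (np.inf, 0.0, 0.0, 0.0)
    for yaw in yaws:
        c, s = np.cos(yaw), np.sin(yaw)
        Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        rotated = src @ Rz.T
        for x in xs:
            for y in ys:
                moved = rotated + np.array([x, y, 0.0])
                cost = tree.query(moved, k=1)[0].mean()
                if cost < best[0]:
                    best = (cost, yaw, x, y)
    return best[1], best[2], best[3]
```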
5. The method of any one of claims 1 to 4, wherein the registering the second point cloud data set and the third point cloud data set to determine a second external parameter comprises:
determining, by an iterative closest point matching method with normals, a third external parameter that minimizes the distance between the point clouds in the second point cloud data set and the point clouds in the third point cloud data set as a first optimized external parameter;
converting a fourth non-ground point set in the third point cloud data set according to the first optimized external parameter to obtain a fifth non-ground point set;
registering the fifth non-ground point set and the second non-ground point set in the second point cloud data set through an octree algorithm to determine a second optimized external parameter; and
determining the second external parameter according to the first optimized external parameter and the second optimized external parameter.
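The first stage of claim 5, iterative closest point matching with normals, is commonly implemented as point-to-plane ICP. Assuming the Open3D library is available, that stage could be sketched as follows; the correspondence distance, normal-estimation radius, and the choice of Open3D itself are illustrative assumptions, not details taken from the patent.

```python
import numpy as np
import open3d as o3d  # assumed available; any point-to-plane ICP implementation would do

def refine_with_point_to_plane_icp(src_pts, dst_pts, max_corr_dist=0.5):
    """Refine the alignment of two (N, 3) numpy point sets with point-to-plane
    ICP (ICP with normals). Returns a 4x4 transform mapping src onto dst."""
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(src_pts))
    dst = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(dst_pts))
    # Point-to-plane ICP needs normals on the target cloud.
    dst.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=1.0, max_nn=30))
    result = o3d.pipelines.registration.registration_icp(
        src, dst, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return result.transformation
```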
6. The method of claim 5, wherein the registering the fifth non-ground point set and the second non-ground point set in the second point cloud data set through an octree algorithm to determine a second optimized external parameter comprises:
determining, through an octree algorithm, a fourth external parameter that minimizes the volume of a first space formed by the fifth non-ground point set and the second non-ground point set in the second point cloud data set as the second optimized external parameter.
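The octree criterion of claim 6 rewards overlap: the better the fifth non-ground point set and the second non-ground point set coincide, the more cells they share and the smaller the occupied volume. A minimal stand-in using a fixed-resolution voxel grid instead of a true octree (leaf size and names are assumptions) is sketched below.

```python
import numpy as np

def occupied_volume(points_a, points_b, leaf=0.2):
    """Volume of the space jointly occupied by two (N, 3) point sets,
    approximated by counting unique occupied cells of edge length `leaf`.
    A well-aligned pair of clouds shares cells, so the volume is smaller."""
    merged = np.vstack([points_a, points_b])
    cells = np.unique(np.floor(merged / leaf).astype(np.int64), axis=0)
    return len(cells) * leaf ** 3

def apply_extrinsic(points, T):
    """Apply a 4x4 homogeneous transform T to an (N, 3) point set."""
    return points @ T[:3, :3].T + T[:3, 3]

# Candidate external parameters (e.g. small perturbations around the ICP
# result) can be ranked by
#   occupied_volume(apply_extrinsic(fifth_set, T), second_set)
# and the minimiser taken as the second optimized external parameter.
```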
7. An external parameter calibration device between lidars, the device comprising:
a data acquisition unit, configured to acquire a first point cloud data set collected by a first lidar and a second point cloud data set collected by a second lidar;
a plane determining unit, configured to determine a first ground plane according to the first point cloud data set, and determine a second ground plane according to the second point cloud data set;
a first determining unit, configured to determine a first rotation matrix and a z-axis initial offset in an initial translation vector according to a first normal vector of the first ground plane and a second normal vector of the second ground plane, wherein the first rotation matrix indicates an initial pitch angle and an initial roll angle of the first lidar relative to the second lidar; convert a first non-ground point set in the first point cloud data set by using the first rotation matrix and the z-axis initial offset to obtain a third non-ground point set; determine an initial yaw angle of the first lidar relative to the second lidar, an x-axis initial offset in the initial translation vector, and a y-axis initial offset in the initial translation vector according to a second non-ground point set in the second point cloud data set and the third non-ground point set; determine a second rotation matrix according to the initial yaw angle; and determine an initial rotation matrix according to the first rotation matrix and the second rotation matrix;
a data conversion unit, configured to convert the first point cloud data set by using the initial translation vector and the initial rotation matrix to obtain a third point cloud data set;
a second determining unit, configured to register the second point cloud data set and the third point cloud data set to determine a second external parameter; and
a target determining unit, configured to determine a target external parameter between the first lidar and the second lidar according to the initial translation vector, the initial rotation matrix, and the second external parameter.
8. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 6 when executing the computer program.
9. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the method according to any one of claims 1 to 6.
CN202210536165.5A 2022-05-17 2022-05-17 External parameter calibration method, device, equipment and storage medium between laser radars Active CN114972532B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210536165.5A CN114972532B (en) 2022-05-17 2022-05-17 External parameter calibration method, device, equipment and storage medium between laser radars

Publications (2)

Publication Number Publication Date
CN114972532A CN114972532A (en) 2022-08-30
CN114972532B (en) 2024-06-07

Family

ID=82983800

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210536165.5A Active CN114972532B (en) 2022-05-17 2022-05-17 External parameter calibration method, device, equipment and storage medium between laser radars

Country Status (1)

Country Link
CN (1) CN114972532B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115542301B (en) * 2022-11-24 2023-04-07 北京理工大学深圳汽车研究院(电动车辆国家工程实验室深圳研究院) Method, device and equipment for calibrating external parameters of laser radar and storage medium
CN116184339B (en) * 2023-04-26 2023-08-11 山东港口渤海湾港集团有限公司 Radar calibration method, electronic equipment, storage medium and calibration auxiliary
CN117876504B (en) * 2024-03-12 2024-06-04 苏州艾吉威机器人有限公司 Laser radar external parameter calibration method and device applied to AGV

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022022694A1 (en) * 2020-07-31 2022-02-03 北京智行者科技有限公司 Method and system for sensing automated driving environment
CN113888649A (en) * 2021-10-18 2022-01-04 上海振华重工(集团)股份有限公司 Multi-laser-radar external parameter calibration method, device, equipment and storage medium
CN114488097A (en) * 2022-01-26 2022-05-13 广州小鹏自动驾驶科技有限公司 External parameter calibration method of laser radar, computer equipment and computer storage medium

Similar Documents

Publication Publication Date Title
CN114972532B (en) External parameter calibration method, device, equipment and storage medium between laser radars
CN112462350B (en) Radar calibration method and device, electronic equipment and storage medium
CN110766758B (en) Calibration method, device, system and storage device
CN112034431B (en) External parameter calibration method and device for radar and RTK
CN111123242B (en) Combined calibration method based on laser radar and camera and computer readable storage medium
CN112464812B (en) Vehicle-based concave obstacle detection method
CN112379352B (en) Laser radar calibration method, device, equipment and storage medium
CN109828250B (en) Radar calibration method, calibration device and terminal equipment
EP2054835B1 (en) Target orientation
WO2022179094A1 (en) Vehicle-mounted lidar external parameter joint calibration method and system, medium and device
CN113111513B (en) Sensor configuration scheme determining method and device, computer equipment and storage medium
CN110579754A (en) Method for determining external parameters of a lidar and other sensors of a vehicle
CN115272616A (en) Indoor scene three-dimensional reconstruction method, system, device and storage medium
CN113569958A (en) Laser point cloud data clustering method, device, equipment and medium
EP1939585B1 (en) Object recognizing device
CN113296120A (en) Obstacle detection method and terminal
CN107765257A (en) A kind of laser acquisition and measuring method based on the calibration of reflected intensity accessory external
JP4618506B2 (en) Object recognition device
CN114241057A (en) External reference calibration method and system for camera and laser radar and readable storage medium
CN115100287A (en) External reference calibration method and robot
CN111986248A (en) Multi-view visual perception method and device and automatic driving automobile
CN111596309A (en) Vehicle queuing measurement method based on laser radar
CN112162252B (en) Data calibration method for millimeter wave radar and visible light sensor
Endo et al. Self-localization error evaluation from map structure based on autocorrelation values
CN117124334B (en) Robot drift correction method, robot, storage medium, and terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant