CN113256729B - External parameter calibration method, device and equipment for laser radar and camera and storage medium - Google Patents


Info

Publication number: CN113256729B
Authority: CN (China)
Prior art keywords: key, point cloud, determining, laser radar, camera
Legal status: Active (granted)
Application number: CN202110286400.3A
Other languages: Chinese (zh)
Other versions: CN113256729A (en)
Inventors: 李晓欢, 覃兴胜, 唐欣, 陈倩
Current and original assignee: Guangxi Comprehensive Transportation Big Data Research Institute
Application filed by Guangxi Comprehensive Transportation Big Data Research Institute; priority to CN202110286400.3A
Publication of CN113256729A; application granted; publication of CN113256729B

Classifications

    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration (G: Physics; G06: Computing; G06T 7/00: Image analysis)
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders (G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems)
    • G01S 7/497: Means for monitoring or calibrating (G01S 7/48: Details of systems according to group G01S 17/00)
    • G06T 2207/10028: Range image; depth image; 3D point clouds (G06T 2207/10: Image acquisition modality)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiment of the application provides a method, a device, equipment and a storage medium for calibrating the external parameters of a laser radar and a camera, in the field of image processing. The method comprises the following steps: acquiring a three-dimensional point cloud captured by the laser radar and a pixel image captured by the camera at the same moment; determining first key points in the pixel image and, using the Hough transform, second key points in the three-dimensional point cloud; and determining the external parameters of the laser radar and the camera based on the coordinates of the first key points in the pixel image and the coordinates of the second key points in the three-dimensional point cloud. According to the embodiment of the application, the adjacent edges of the calibration plate are lengthened so that more points fall on them, increasing their identifiability, making the adjacent edges easy to extract by the Hough transform, and improving the accuracy of key point extraction. Optimization conditions are then set according to geometric constraints: the key points under several calibration plate poses are collected, different external parameter values are obtained, and the external parameter value with the smallest error under the constraint conditions is selected, improving the accuracy of the external parameter calibration.

Description

External parameter calibration method, device and equipment for laser radar and camera and storage medium
Technical Field
The application relates to the technical field of image processing, in particular to an external parameter calibration method, device and equipment for a laser radar and a camera and a storage medium.
Background
In recent years, the MEMS (Micro-Electro-Mechanical System) laser radar, with its low cost, dense point cloud and good three-dimensional imaging, has often been fused with a camera to improve the robustness and accuracy of system applications, and is widely used in fields such as intelligent driving and industrial robots.
The external parameters of the laser radar and the camera are the spatial correspondence of the point cloud coordinate system to the image coordinate system, that is, the rotation and translation between them, and are the key to information fusion. However, the point cloud ranging accuracy of a MEMS laser radar is lower than that of a traditional mechanical laser radar and its noise points are more numerous, so the external parameter values obtained by the traditional calibration methods deviate greatly, which affects the information fusion effect. External parameter calibration requires accurately extracting corresponding points between the point cloud and the image pixels, and the external parameter values are then solved from several pairs of corresponding points. These corresponding points serve as the key points; the ranging and noise characteristics of the MEMS laser radar hinder their extraction, so the traditional way of solving the external parameters against the image pixels yields large errors.
Therefore, in the prior art, the ranging and noise characteristics of the MEMS laser radar hinder the extraction of key points, and the traditional method of solving the external parameters against the image pixel points yields large errors, a technical problem that urgently needs improvement.
Disclosure of Invention
The application aims to remedy at least one of the above technical defects, in particular the problem that, in the prior art, the ranging and noise characteristics of the MEMS laser radar hinder the extraction of key points and the conventional method of solving the external parameters against image pixel points yields large errors.
According to one aspect of the present application, there is provided an external parameter calibration method for a laser radar and a camera, the method comprising:
Acquiring a three-dimensional point cloud acquired by a laser radar at the same moment and a pixel image acquired by a camera, wherein the three-dimensional point cloud and the pixel image are acquired aiming at the same calibration plate;
Determining a first key point in the pixel image, and determining a second key point in the three-dimensional point cloud by adopting Hough transformation; the first key points and the second key points are multiple, and the points on the calibration plate corresponding to the first key points are the same as the points on the calibration plate corresponding to the second key points;
and determining external parameters of the laser radar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud.
As an optional embodiment of the present application, the acquiring a three-dimensional point cloud acquired by a laser radar at the same time includes:
And, for the same calibration plate in the same pose, acquiring multiple frames of intermediate three-dimensional point clouds and fusing them to obtain the three-dimensional point cloud acquired by the laser radar.
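The fusion step above can be sketched in a few lines. This is an illustrative fragment, not code from the patent; the superposition is the simplest possible fusion, a concatenation of frames captured with the board static in one pose:

```python
def fuse_frames(frames):
    """Superimpose several lidar frames of the same static calibration-board
    pose into one denser point cloud by simple concatenation.

    frames: list of frames, each a list of (x, y, z) tuples.
    """
    fused = []
    for frame in frames:
        fused.extend(frame)  # every frame contributes all of its points
    return fused
```

In practice each frame would first be roughly segmented so that only points on the calibration plate are fused.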
As an optional embodiment of the present application, the determining the second key point in the three-dimensional point cloud by using Hough transformation includes:
Fitting the three-dimensional point cloud by adopting a preset RANSAC algorithm to obtain an optimal plane;
and determining a second key point in the optimal plane by adopting Hough transformation.
As an optional embodiment of the present application, the determining the second key point in the optimal plane by using Hough transform includes:
Fitting an edge straight line of the calibration plate in the optimal plane by using Hough transformation;
and determining the second key point according to the intersection point of the edge straight line and the actual length of the calibration plate, wherein the length of the edge straight line is larger than the actual length of the calibration plate.
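As a minimal sketch of this step (function and parameter names are illustrative, not from the patent): once the two adjacent edge lines are fitted and their intersection, the top vertex, is known, the left and right vertices lie along the edge directions at the board's actual edge lengths:

```python
import math

def edge_vertices(top, dir_left, dir_right, len_left, len_right):
    """Return the left and right board vertices given the fitted top vertex,
    the in-plane direction vectors of its two adjacent edges, and the actual
    edge lengths. Direction vectors are normalised before use."""
    def along(p, d, dist):
        n = math.hypot(d[0], d[1])  # length of the direction vector
        return (p[0] + d[0] / n * dist, p[1] + d[1] / n * dist)
    return along(top, dir_left, len_left), along(top, dir_right, len_right)
```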
As an optional embodiment of the present application, the determining the external parameters of the lidar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud includes:
converting the pixel coordinates of the first key point into camera coordinates by adopting a preset pixel coordinate-camera coordinate formula;
the outlier is determined based on a relationship of camera coordinates of the first keypoint and coordinates of the second keypoint in the three-dimensional point cloud.
As an alternative embodiment of the present application, the method further comprises:
For a plurality of different poses of the same calibration plate, respectively acquiring multiple groups of three-dimensional point clouds acquired by the laser radar and pixel images acquired by the camera, and solving a plurality of external parameter values;
and determining the optimal external parameters from the plurality of external parameters by adopting a preset algorithm.
According to another aspect of the present application, there is provided an external parameter calibration device for a laser radar and a camera, the device comprising:
the image and point cloud acquisition module is used for acquiring three-dimensional point cloud acquired by the laser radar at the same moment and pixel images acquired by the camera, wherein the three-dimensional point cloud and the pixel images are acquired aiming at the same calibration plate;
The key point determining module is used for determining a first key point in the pixel image and determining a second key point in the three-dimensional point cloud by adopting Hough transformation; the first key points and the second key points are multiple, and the points on the calibration plate corresponding to the first key points are the same as the points on the calibration plate corresponding to the second key points;
and the external parameter determining module is used for determining external parameters of the laser radar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud.
As an optional embodiment of the present application, the keypoint determining module includes:
the straight line fitting unit is used for fitting the edge straight line of the calibration plate in the optimal plane by utilizing Hough transformation;
and the key point determining unit is used for determining the second key point according to the intersection point of the edge straight line and the actual length of the calibration plate, wherein the length of the edge straight line is larger than the actual length of the calibration plate.
According to another aspect of the present application, there is provided an electronic device including:
One or more processors;
A memory;
One or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications configured to: and executing the external parameter calibration method of the laser radar and the camera.
According to one aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the methods provided in the various alternative implementations described above.
The embodiment of the application extracts the point cloud key points using the Hough transform. The adjacent edges of the calibration plate are lengthened so that more points fall on them, increasing their identifiability and making them easy to extract by the Hough transform; the point cloud key points are then calculated from the intersection of the adjacent edges and their actual lengths. This improves the accuracy of key point extraction. After the key points are accurately extracted, optimization conditions are set according to geometric constraints: the key points under several calibration plate poses are collected, the external parameter values obtained from the different key point pairs are computed, and the external parameter value with the smallest error under the constraint conditions is selected, improving the accuracy of the external parameter calibration.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings that are required to be used in the description of the embodiments of the present application will be briefly described below.
FIG. 1 is a schematic flow chart of an external parameter calibration method for a laser radar and a camera according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a key point of a pixel image according to an embodiment of the present application;
FIG. 3 is a schematic diagram of three-dimensional point cloud key points according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a plane fitting method according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of a method for determining key points according to an embodiment of the present application;
FIG. 6 is a flowchart of a method for obtaining an optimal external parameter according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an external parameter calibration device for a laser radar and a camera according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of a key point determining module according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
The above and other features, advantages and aspects of embodiments of the present application will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise, as understood by those skilled in the art. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein includes all or any combination of one or more of the associated listed items.
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
The external parameters of the laser radar and the camera are the spatial correspondence of the point cloud coordinate system to the image coordinate system, that is, the rotation and translation between them, and are the key to information fusion. However, the point cloud ranging accuracy of a MEMS laser radar is lower than that of a traditional mechanical laser radar and its noise points are more numerous, so the external parameter values obtained by the traditional calibration methods deviate greatly, which affects the information fusion effect. External parameter calibration requires accurately extracting corresponding points between the point cloud and the image pixels, and the external parameter values are then solved from several pairs of corresponding points. These corresponding points serve as the key points; the ranging and noise characteristics of the MEMS laser radar hinder their extraction, so the traditional way of solving the external parameters against the image pixels yields large errors.
The application provides a method, a device, equipment and a storage medium for calibrating external parameters of a laser radar and a camera, which aim to solve the technical problems in the prior art.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
The embodiment of the application provides an external parameter calibration method of a laser radar and a camera, as shown in fig. 1, comprising the following steps:
step S101, acquiring three-dimensional point clouds acquired by a laser radar at the same moment and pixel images acquired by a camera, wherein the three-dimensional point clouds and the pixel images are acquired aiming at the same calibration plate;
Step S102, determining a first key point in the pixel image, and determining a second key point in the three-dimensional point cloud by adopting Hough transformation; the first key points and the second key points are multiple, and the points on the calibration plate corresponding to the first key points are the same as the points on the calibration plate corresponding to the second key points;
step S103, determining external parameters of the lidar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud.
In the embodiment of the application, when images are acquired, a calibration plate is placed against the same background and captured by the laser radar and the camera respectively. The three-dimensional point cloud acquired by the laser radar and the pixel image acquired by the camera at the same moment are images of the same calibration plate in the same pose at the same time. After both are acquired, the first key points in the pixel image and the second key points in the three-dimensional point cloud are determined; the first and second key points are the three vertices of the calibration plate in the pixel image and in the three-dimensional point cloud respectively. As shown in fig. 2, taking the pixel image as an example, the image contains the calibration plate 201, whose three vertices o1, o2 and o3 are the first key points. Fig. 3 is the corresponding schematic diagram of the key points in the three-dimensional point cloud, where the second key points are o'1, o'2 and o'3. Points o1 and o'1 correspond to the upper vertex of the calibration plate, points o2 and o'2 to the left vertex, and points o3 and o'3 to the right vertex. The key points in the pixel image can be identified using the prior art; the key points of the three-dimensional point cloud are determined using the Hough transform, with the specific solution described later. After the first and second key points are determined, the external parameters of the laser radar and the camera are determined based on the coordinates of the first key points in the pixel image and the coordinates of the second key points in the three-dimensional point cloud.
For the embodiment of the application, for convenience of explanation, take one embodiment as an example. Solving the external parameters means extracting corresponding point cloud and pixel coordinates from frames captured at the same moment by the two types of sensors, and then solving the external parameters from the corresponding coordinate values. Before calibrating the external parameters of the laser radar and the camera, the intrinsic parameters of the camera (its internal parameters) are acquired, and the external parameter solution is converted into a PnP (Perspective-n-Point) problem. The relationship between the pixel coordinates of the camera image and the camera coordinate system is shown in formula (1),

Zc[u, v, 1]^T = K[Xc, Yc, Zc]^T (1)

where (u, v) are the coordinates of the first key point in the pixel coordinate system, (Xc, Yc, Zc) are the coordinates of the first key point in the camera coordinate system, and K is the 3x3 camera intrinsic matrix. The relationship between the camera coordinate system and the point cloud coordinate system is shown in formula (2),
[Xc, Yc, Zc]^T = Rt[Xl, Yl, Zl]^T + tt (2)
where (Xl, Yl, Zl) are the coordinates of the second key point in the point cloud coordinate system, Rt is the 3x3 rotation matrix from the point cloud in the laser radar coordinate system to the image pixels in the camera coordinate system, tt is the three-dimensional translation vector, and (Rt, tt) are the external parameters.
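Formulas (1) and (2) compose into a lidar-point-to-pixel projection. The following is a small numeric sketch (the intrinsics and pose values in the usage are made up; the patent itself solves the inverse PnP problem from such correspondences):

```python
def project_lidar_point(p_l, R, t, K):
    """Project one lidar point into pixel coordinates: first apply the
    extrinsic transform of formula (2), Xc = R*Xl + t, then the pinhole
    intrinsics of formula (1), u = fx*Xc/Zc + cx, v = fy*Yc/Zc + cy."""
    # formula (2): rotate and translate the point into the camera frame
    p_c = [sum(R[i][j] * p_l[j] for j in range(3)) + t[i] for i in range(3)]
    # formula (1): perspective division with K = [[fx,0,cx],[0,fy,cy],[0,0,1]]
    u = K[0][0] * p_c[0] / p_c[2] + K[0][2]
    v = K[1][1] * p_c[1] / p_c[2] + K[1][2]
    return u, v
```

Given enough key-point pairs, (Rt, tt) can be recovered by minimising the reprojection error of this mapping, e.g. with a standard PnP solver.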
The embodiment of the application extracts the point cloud key points using the Hough transform. The adjacent edges of the calibration plate are lengthened so that more points fall on them, increasing their identifiability and making them easy to extract by the Hough transform; the point cloud key points are then calculated from the intersection of the adjacent edges and their actual lengths. This improves the accuracy of key point extraction. After the key points are accurately extracted, optimization conditions are set according to geometric constraints: the key points under several calibration plate poses are collected, the external parameter values obtained from the different key point pairs are computed, and the external parameter value with the smallest error under the constraint conditions is selected, improving the accuracy of the external parameter calibration.
The embodiment of the application provides a possible implementation manner, in the implementation manner, the method for acquiring the three-dimensional point cloud acquired by the laser radar at the same moment comprises the following steps:
And aiming at the same calibration plate, acquiring multi-frame intermediate three-dimensional point clouds of the same pose, and fusing the multi-frame three-dimensional point clouds to obtain the three-dimensional point clouds acquired by the laser radar.
In the embodiment of the application, although a single frame of MEMS laser radar point cloud is dense, the vertex corners are blurred and the discrete points are numerous, so the point cloud is preprocessed; enough points on the calibration plate are needed to obtain accurate point cloud corner features and thus extract the key points of the calibration plate. First, the points on the calibration plate are extracted by a rough segmentation method; then a temporal multi-frame fusion algorithm is used to increase the point cloud density, and the point cloud set is obtained by superimposing the multiple frames, where the multi-frame three-dimensional point clouds to be fused are all acquired with the same calibration plate in the same pose.
According to the embodiment of the application, the multi-frame three-dimensional point clouds acquired when the same calibration plate is in the same pose are fused, so that the density of the points in the three-dimensional point clouds is ensured to be large enough, and the extraction of key points is facilitated.
The embodiment of the present application provides a possible implementation manner, in this implementation manner, as shown in fig. 4, the determining, by using Hough transformation, a second key point in the three-dimensional point cloud includes:
step S401, fitting the three-dimensional point cloud by adopting a preset RANSAC algorithm to obtain an optimal plane;
in step S402, a second key point in the optimal plane is determined using Hough transform.
In the embodiment of the application, after the point cloud set P is obtained, it contains abnormal points far from the plane of the calibration plate. To eliminate them, the RANSAC algorithm is first used to fit the optimal plane P1, with the fitting formula shown in formula (3),

Ax1 + By1 + Cz1 + D = 0 (3)

where A, B, C and D are the plane fitting parameters and (x1, y1, z1) denotes a point on the optimal plane P1. Points far from the fitted plane are removed from the point cloud P, and the remaining points are projected onto the plane P1 to obtain the point cloud P'.
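A bare-bones version of this fitting step might look as follows (the thresholds and iteration counts are illustrative, not taken from the patent):

```python
import random

def ransac_plane(points, iters=200, thresh=0.02, seed=0):
    """Fit the dominant plane Ax + By + Cz + D = 0 (formula (3)) by RANSAC:
    repeatedly sample 3 points, build the plane through them, count points
    within `thresh` of it, and keep the plane with the most inliers.
    Returns ((A, B, C, D), inliers) with (A, B, C) a unit normal."""
    rng = random.Random(seed)
    best_plane, best_inliers = None, []
    for _ in range(iters):
        p1, p2, p3 = rng.sample(points, 3)
        u = [p2[i] - p1[i] for i in range(3)]
        v = [p3[i] - p1[i] for i in range(3)]
        # plane normal = cross product of two in-plane vectors
        n = [u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0]]
        norm = sum(c * c for c in n) ** 0.5
        if norm < 1e-12:
            continue  # the three sampled points were collinear
        a, b, c = n[0] / norm, n[1] / norm, n[2] / norm
        d = -(a * p1[0] + b * p1[1] + c * p1[2])
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c * p[2] + d) < thresh]
        if len(inliers) > len(best_inliers):
            best_plane, best_inliers = (a, b, c, d), inliers
    return best_plane, best_inliers
```

The surviving inliers would then be projected onto the fitted plane to form the point cloud P'.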
After this preprocessing, all points of the set P' lie on the plane P1, but the discrete points on the edges of the calibration plate have not been completely removed, and discrete points with large error must be avoided during calibration. The Hough transform is therefore used to fit the two edges adjacent to the top vertex on the calibration plate point cloud; the intersection of these lines is the top vertex of the plate, the left and right vertices are obtained from the actual edge lengths of the plate, and the coordinate values of the key points of the calibration plate are thereby determined.
According to the embodiment of the application, abnormal points in the point cloud set are removed through an algorithm, and the influence of the abnormal points on key point extraction is eliminated.
The embodiment of the present application provides a possible implementation manner, in this implementation manner, as shown in fig. 5, the determining, by using Hough transformation, the second key point in the optimal plane includes:
Step S501, fitting an edge straight line of the calibration plate in the optimal plane by using Hough transformation;
And step S502, determining the second key point according to the intersection point of the edge straight line and the actual length of the calibration plate, wherein the length of the edge straight line is larger than the actual length of the calibration plate.
In the embodiment of the application, the principle of the Hough transform is to use point-line duality to map a straight line in the image to a point in the parameter domain, and to determine lines in the image domain by counting the votes accumulated in the parameter domain. The edge lines of the calibration plate are therefore lengthened to increase the number of points on them, raising their identifiability and making them easy to fit by the Hough transform; the key points are then calculated from the intersection of the fitted edge lines and the actual edge lengths of the calibration plate.
In the embodiment of the application, a straight line is represented in polar coordinates: the line y = kx + b in x-y space is converted into the polar form ρ = x cos θ + y sin θ, so that the line corresponds to a single parameter-space point (ρ, θ), where ρ represents the distance from the origin to the line and θ represents the angle between the x axis and the segment from the origin perpendicular to the line. Determining the key points by Hough transformation comprises: fitting the point cloud plane; resampling the point cloud so that it is uniformly distributed; converting the three-dimensional point cloud into a two-dimensional point cloud; determining the value range of the parameter space and discretizing it into θ_i (i = 1, 2, 3 … m) and ρ_j (j = 1, 2, 3 … m); and initializing an accumulator array A(ρ, θ) for counting parameter-space votes, a line-parameter vote threshold N_ρ, merging thresholds σ_1 and σ_2, a line-segment length threshold N_L, and an empty array Line. Then, traversing θ_i (i = 1, 2, 3 … m), the ρ value of each point is computed and compared with the discretized ρ_j of the parameter space, with σ_ρ as the collinearity tolerance: if |ρ − ρ_j| < σ_ρ, the array entry A(ρ_t, θ_t) is incremented by 1. When the value of A(ρ_j, θ_i) exceeds the threshold N_ρ, (ρ_t, θ_t) is judged to be the parameter pair of a straight line; the x-y space points are then substituted into ρ_k = x_i cos θ_i + y_i sin θ_i, and the points satisfying |ρ_k − ρ_t| < σ_ρ are recorded and stored in the array Line. Nearby straight lines are merged: if two line parameter pairs (ρ_1, θ_1) and (ρ_2, θ_2) satisfy |ρ_1 − ρ_2| < σ_1 and |θ_1 − θ_2| < σ_2, they are merged into one straight line. With σ_L as the error threshold of the detected point cloud line segment, the collinear points of each straight line are sorted by coordinate, the maximum and minimum values being the two endpoints of the segment on the fitted line. If |L_t − L_length| < σ_L, the segment is accepted: when the slope k > 0 it is judged to be the left adjacent edge, and when k < 0 it is judged to be the right adjacent edge. Let the number of collinear points be n; the collinear points on each line are fitted by least squares to obtain the slope k and intercept b of the line, as shown in formulas (4) and (5):

k = (n Σ x_i y_i − Σ x_i Σ y_i) / (n Σ x_i² − (Σ x_i)²) (4)

b = (Σ y_i − k Σ x_i) / n (5)

The intersection of the left adjacent edge and the right adjacent edge is the upper vertex of the calibration plate. Given the actual side lengths L_1 and L_2 of the left and right adjacent edges, the left and right vertices of the calibration plate can then be calculated, which determines the three key points of the three-dimensional point cloud.
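The voting and vertex-recovery procedure above can be sketched in a few lines of Python (a minimal illustration only, not the patent's implementation; the function names and the parameters `rho_tol` and `votes_min` are assumptions standing in for σ_ρ and N_ρ):

```python
import numpy as np

def hough_lines(points, n_theta=180, rho_tol=0.05, votes_min=30):
    """Vote 2-D points into a discretized (rho, theta) accumulator and
    return (rho, theta, votes) for lines whose vote count is high enough."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    # rho = x*cos(theta) + y*sin(theta) for every point/angle pair -> (N, n_theta)
    rhos = points @ np.stack([np.cos(thetas), np.sin(thetas)])
    lines = []
    for j, theta in enumerate(thetas):
        # quantize rho with tolerance rho_tol and count collinear points
        bins, counts = np.unique(np.round(rhos[:, j] / rho_tol).astype(int),
                                 return_counts=True)
        for b, c in zip(bins, counts):
            if c >= votes_min:
                lines.append((b * rho_tol, theta, int(c)))
    return lines

def line_intersection(line_a, line_b):
    """Intersection of two lines given as (rho, theta) pairs."""
    (r1, t1), (r2, t2) = line_a, line_b
    A = np.array([[np.cos(t1), np.sin(t1)], [np.cos(t2), np.sin(t2)]])
    return np.linalg.solve(A, np.array([r1, r2]))
```

In this sketch, the upper vertex of the calibration plate is `line_intersection` applied to the merged left and right edge lines; stepping the known side lengths L_1 and L_2 along each edge direction from that vertex then yields the left and right vertices.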
The embodiment of the application extracts the point cloud key points by Hough transformation. The adjacent edges of the calibration plate are lengthened, so that more point cloud points fall on the adjacent edges and their identifiability increases, which makes the adjacent edges easy to extract by Hough transformation; the point cloud key points are then calculated from the intersection of the adjacent edges and their actual lengths. This improves the accuracy of key point extraction.
The embodiment of the present application provides a possible implementation manner, in which the determining the external parameters of the lidar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud includes:
converting the pixel coordinates of the first key point into camera coordinates by adopting a preset pixel coordinate-camera coordinate formula;
the external parameters are determined based on the relationship between the camera coordinates of the first key point and the coordinates of the second key point in the three-dimensional point cloud.
For the embodiment of the application, taking one embodiment as an example for convenience of explanation: external parameter solving extracts corresponding point cloud and pixel coordinates from the same time frame of the two types of sensors, and then solves the external parameters from the corresponding coordinate values. Before calibrating the external parameters of the laser radar and the camera, the intrinsic parameters of the camera (the camera internal parameters) are acquired, and the external parameter solution is converted into a PnP (Perspective-n-Point) problem. The relationship between the pixel coordinates of the camera image and the camera coordinate system is shown in formula (1):

Z_c [u, v, 1]^T = K [X_c, Y_c, Z_c]^T (1)

where K is the 3×3 camera intrinsic matrix.
Wherein (u, v) represents the coordinates of the first key point in the pixel coordinate system and (X_c, Y_c, Z_c) denotes the coordinates of the first key point in the camera coordinate system. The relationship between the camera coordinate system and the point cloud coordinate system is shown in formula (2),
[X_c, Y_c, Z_c]^T = [R_t, t_t] [X_l, Y_l, Z_l, 1]^T (2)
Wherein (X_l, Y_l, Z_l) represents the coordinates of the second key point in the point cloud coordinate system, R_t represents the 3×3 rotation matrix from the point cloud in the laser radar coordinate system to the image pixels in the camera coordinate system, t_t represents the three-dimensional translation vector, and (R_t, t_t) are the external parameters.
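Equations (1) and (2) chain into a projection from the lidar frame to the pixel plane, which can be sketched numerically as follows (the intrinsic values in `K` are illustrative assumptions; solving the inverse problem for (R_t, t_t) from such correspondences is the PnP problem mentioned above, for which, e.g., OpenCV's `cv2.solvePnP` is commonly used):

```python
import numpy as np

def project_lidar_point(p_l, K, R_t, t_t):
    """Eq. (2): lidar frame -> camera frame, then eq. (1): camera frame -> pixel."""
    p_c = R_t @ p_l + t_t          # (X_c, Y_c, Z_c)
    uvw = K @ p_c                  # homogeneous pixel coordinates, scaled by Z_c
    return uvw[:2] / uvw[2]        # (u, v)

# Illustrative intrinsics: focal lengths 500 px, principal point (320, 240)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
```

With identity extrinsics, a point 2 m straight ahead, `project_lidar_point(np.array([0.0, 0.0, 2.0]), K, np.eye(3), np.zeros(3))`, lands on the principal point (320, 240).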
The embodiment of the application extracts the point cloud key points by Hough transformation. The adjacent edges of the calibration plate are lengthened, so that more point cloud points fall on the adjacent edges and their identifiability increases, which makes the adjacent edges easy to extract by Hough transformation; the point cloud key points are then calculated from the intersection of the adjacent edges and their actual lengths, improving the accuracy of key point extraction. After the key points are accurately extracted, an optimization condition is set according to geometric constraints, the key points under a plurality of calibration plate poses are counted, the external parameter values obtained from different key point pairs are computed, and the external parameter value with the minimum error is then selected according to the constraint condition, improving the accuracy of the external parameter calibration.
An embodiment of the present application provides a possible implementation manner, in which, as shown in fig. 6, the method further includes:
step S601, for a plurality of different poses of the same calibration plate, respectively acquiring a plurality of groups of three-dimensional point clouds acquired by the laser radar and pixel images acquired by the camera, and solving a plurality of external parameters;
step S602, determining the optimal external parameters from the plurality of external parameters by adopting a preset algorithm.
In the embodiment of the application, solving the parameters in the external parameters (R_t, t_t) with one calibration plate requires at least 3 different poses, from which a plurality of groups of pixels and point clouds corresponding to the key points of the calibration plate are obtained; the external parameters are then solved according to formulas (1) and (2). Therefore, the calibration plate is moved within the common field of view of the sensors to obtain N (N > 3) different calibration plate poses, yielding the geometric features of N corresponding groups of point clouds and images, and a plurality of groups of external parameters (R_t, t_t) are calculated from the correspondence between the key points and the normal vectors of the calibration plate. According to formula (3), the normal vector n^(l) of the plane p_1 is transformed by the rotation matrix R_t to obtain n^(l,c):

n^(l,c) = R_t n^(l) (3)

Ideally the dot product of n^(l,c) and the image-plane vectors is 0, but a system measurement error exists; the average dot-product error is denoted e_d, and the calibration error between n^(l,c) and the image-plane normal vector n^(c) is denoted e_r, as expressed in formula (7), from which the optimal rotation matrix can be obtained.
Wherein N represents the total number of calibration plate poses, i ∈ [1, N] denotes the i-th pose, and the key points on the image are obtained through the OpenCV library; e_d is the average dot-product error, and e_r is the calibration error between n^(l,c) and the image-plane normal vector n^(c). Then, according to formula (8), the optimal rotation matrix R_t* is obtained:

R_t* = argmin over R_t of (e_d + e_r) (8)
Wherein R_t represents a rotation matrix obtained from a plurality of pixel coordinates and point cloud coordinates (see formulas (1) and (2)); different coordinate values yield different R_t, so R_t can be understood as a variable. In formula (7), n^(c) is a constant, and n^(l) in n^(l,c) = R_t n^(l) is also a constant; among the plurality of candidate R_t, the value that minimizes e_d + e_r is taken as the optimal rotation matrix R_t*.
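The selection of R_t* among candidates can be sketched as follows (a simplified stand-in for minimizing e_d + e_r: each candidate is scored by how well it aligns the rotated lidar-frame plane normals with the camera-frame normals; the function and variable names are assumptions):

```python
import numpy as np

def pick_rotation(candidates, n_l, n_c):
    """Return the candidate rotation minimizing the average misalignment
    between rotated lidar plane normals n^(l) and camera plane normals n^(c).
    n_l, n_c: (N, 3) arrays of unit normals, one row per calibration pose."""
    def score(R):
        n_lc = n_l @ R.T                                  # n^(l,c) = R_t n^(l), row-wise
        return np.mean(1.0 - np.sum(n_lc * n_c, axis=1))  # mean of 1 - cos(angle)
    return min(candidates, key=score)
```

The score is zero when every rotated lidar normal coincides with its camera counterpart, mirroring the ideal case where the dot-product error vanishes.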
According to the correspondence between the first key points of the calibration plate in the pixel image and the second key points in the three-dimensional point cloud, the optimal translation vector t_t* is obtained.
After all the point cloud key points are projected onto the image, the average Euclidean distance between the projected point cloud key points and the image key points is calculated; the average error is computed as shown in formula (9):

e_avg = (1/(3N)) Σ_{i=1}^{N} Σ_{j=1}^{3} ||p_ij^(c) − p̂_ij^(c)|| (9)
In formula (9), N is the total number of poses, p_ij^(c) is the j-th image key point in the i-th pose, and p̂_ij^(c) is the projection of the j-th point cloud key point p_ij^(l) in the i-th pose. The variance of the Euclidean distance between p_ij^(c) and p̂_ij^(c) is given by formula (10):

v_t = (1/(3N)) Σ_{i=1}^{N} Σ_{j=1}^{3} (||p_ij^(c) − p̂_ij^(c)|| − e_avg)² (10)
The candidate external parameters are traversed and the external parameter with the smallest variance v_t is taken out; the average Euclidean distance between the projections of the point cloud key points and the image key points is then recalculated with this external parameter, and key points whose Euclidean distance is larger than the average are removed. With the remaining point cloud key point set denoted O_l and the image key point set denoted O_c, the optimal translation vector t_t* is solved as shown in formula (11), where the mean(·) function denotes averaging by rows:

t_t* = mean(O_c − R_t* O_l) (11)
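The filtering and row-wise averaging steps can be sketched as follows (a minimal illustration; reading the translation as the row-wise mean of the residual O_c − R·O_l is an assumed interpretation of formula (11), and the function names are illustrative):

```python
import numpy as np

def filter_by_reprojection(img_kp, proj_kp):
    """Keep keypoint pairs whose reprojection distance does not exceed the mean."""
    d = np.linalg.norm(img_kp - proj_kp, axis=1)
    return d <= d.mean()

def optimal_translation(O_c, O_l, R):
    """Row-wise mean of the residual between camera-frame keypoints and
    rotated lidar keypoints, in the spirit of formula (11)."""
    return np.mean(O_c - O_l @ R.T, axis=0)
```

With a perfect rotation, the residual O_c − R·O_l is the same for every keypoint pair, and averaging simply recovers the translation while suppressing per-point measurement noise.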
Wherein (R_t*, t_t*) are the optimal external parameters.
According to the embodiment of the application, after the key points are accurately extracted, the optimization condition is set according to the geometric constraint, the key points under the pose of the plurality of calibration plates are counted, the external parameter values obtained by different key point pairs are obtained, then the external parameter value with the minimum error is selected according to the constraint condition, and the precision of external parameter calibration is improved.
An embodiment of the present application provides an external parameter calibration device for a laser radar and a camera, as shown in fig. 7, an external parameter calibration device 70 for a laser radar and a camera may include: an image and point cloud acquisition module 701, a keypoint determination module 702, and an outlier determination module 703, wherein,
An image and point cloud acquisition module 701, configured to acquire a three-dimensional point cloud acquired by a laser radar at the same moment and a pixel image acquired by a camera, where the three-dimensional point cloud and the pixel image are acquired for the same calibration board;
a keypoint determining module 702, configured to determine a first keypoint in the pixel image, and determine a second keypoint in the three-dimensional point cloud by using Hough transformation; the first key points and the second key points are multiple, and the points on the calibration plate corresponding to the first key points are the same as the points on the calibration plate corresponding to the second key points;
An external parameter determining module 703 is configured to determine external parameters of the lidar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud.
Further, as shown in fig. 8, the keypoint determining module 702 includes:
a straight line fitting unit 801, configured to fit an edge straight line of the calibration plate in the optimal plane by using Hough transformation;
And a key point determining unit 802, configured to determine the second key point according to the intersection point of the edge straight line and the actual length of the calibration plate, where the length of the edge straight line is greater than the actual length of the calibration plate.
The external parameter calibration device for the laser radar and the camera in the embodiment of the application can execute the external parameter calibration method for the laser radar and the camera shown in the above embodiment of the application, and the implementation principle is similar, and the description is omitted here.
The embodiment of the application extracts the point cloud key points by Hough transformation. The adjacent edges of the calibration plate are lengthened, so that more point cloud points fall on the adjacent edges and their identifiability increases, which makes the adjacent edges easy to extract by Hough transformation; the point cloud key points are then calculated from the intersection of the adjacent edges and their actual lengths, improving the accuracy of key point extraction. After the key points are accurately extracted, an optimization condition is set according to geometric constraints, the key points under a plurality of calibration plate poses are counted, the external parameter values obtained from different key point pairs are computed, and the external parameter value with the minimum error is then selected according to the constraint condition, improving the accuracy of the external parameter calibration.
The embodiment of the application provides an electronic device, comprising: a memory and a processor; at least one program is stored in the memory and, when executed by the processor, extracts the point cloud key points by Hough transformation. Compared with the prior art, the adjacent edges of the calibration plate are lengthened, so that more point cloud points fall on the adjacent edges and their identifiability increases, which makes the adjacent edges easy to extract by Hough transformation; the point cloud key points are then calculated from the intersection of the adjacent edges and their actual lengths, improving the accuracy of key point extraction. After the key points are accurately extracted, an optimization condition is set according to geometric constraints, the key points under a plurality of calibration plate poses are counted, the external parameter values obtained from different key point pairs are computed, and the external parameter value with the minimum error is then selected according to the constraint condition, improving the accuracy of the external parameter calibration.
In an alternative embodiment, there is provided an electronic device, as shown in fig. 9, the electronic device 4000 shown in fig. 9 includes: a processor 4001 and a memory 4003. Wherein the processor 4001 is coupled to the memory 4003, such as via a bus 4002. Optionally, the electronic device 4000 may further comprise a transceiver 4004, the transceiver 4004 may be used for data interaction between the electronic device and other electronic devices, such as transmission of data and/or reception of data, etc. It should be noted that, in practical applications, the transceiver 4004 is not limited to one, and the structure of the electronic device 4000 is not limited to the embodiment of the present application.
The processor 4001 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application-Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. The processor 4001 may also be a combination that implements computing functionality, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
Bus 4002 may include a path for transferring information between the aforementioned components. Bus 4002 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. Bus 4002 can be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 9, but this does not mean that there is only one bus or one type of bus.
Memory 4003 may be, but is not limited to, a ROM (Read-Only Memory) or other type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a CD-ROM (Compact Disc Read-Only Memory) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
The memory 4003 is used for storing application program codes (computer programs) for executing the present application and is controlled to be executed by the processor 4001. The processor 4001 is configured to execute application program codes stored in the memory 4003 to realize what is shown in the foregoing method embodiment.
Embodiments of the present application provide a computer-readable storage medium having a computer program stored thereon, which when run on a computer, causes the computer to perform the corresponding method embodiments described above.
It should be understood that, although the steps in the flowcharts of the figures are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited in order and may be performed in other orders, unless explicitly stated herein. Moreover, at least some of the steps in the flowcharts of the figures may include a plurality of sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, the order of their execution not necessarily being sequential, but may be performed in turn or alternately with other steps or at least a portion of the other steps or stages.
The foregoing describes only some embodiments of the present application. It should be noted that those skilled in the art can make modifications and adaptations without departing from the principles of the present application, and such modifications and adaptations are intended to fall within the scope of the present application.

Claims (7)

1. The external parameter calibration method for the laser radar and the camera is characterized by comprising the following steps of:
Acquiring a three-dimensional point cloud acquired by a laser radar at the same moment and a pixel image acquired by a camera, wherein the three-dimensional point cloud and the pixel image are acquired aiming at the same calibration plate;
Determining a first key point in the pixel image, and determining a second key point in the three-dimensional point cloud by adopting Hough transformation; the first key points and the second key points are multiple, and the points on the calibration plate corresponding to the first key points are the same as the points on the calibration plate corresponding to the second key points;
Determining external parameters of the laser radar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud;
The step of acquiring the three-dimensional point cloud acquired by the laser radar at the same moment comprises the following steps:
aiming at the same calibration plate, acquiring multi-frame intermediate three-dimensional point clouds of the same pose, and fusing the multi-frame intermediate three-dimensional point clouds to obtain a three-dimensional point cloud acquired by a laser radar;
the determining the second key point in the three-dimensional point cloud by adopting Hough transformation comprises the following steps:
Fitting the three-dimensional point cloud by adopting a preset RANSAC algorithm to obtain an optimal plane;
and determining a second key point in the optimal plane by adopting Hough transformation.
2. The method for calibrating a laser radar and camera according to claim 1, wherein determining the second key point in the optimal plane by using Hough transform comprises:
Fitting an edge straight line of the calibration plate in the optimal plane by using Hough transformation;
and determining the second key point according to the intersection point of the edge straight line and the actual length of the calibration plate, wherein the length of the edge straight line is larger than the actual length of the calibration plate.
3. The method for calibrating the external parameters of the laser radar and the camera according to claim 1, wherein the determining the external parameters of the laser radar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud comprises:
converting the pixel coordinates of the first key point into camera coordinates by adopting a preset pixel coordinate-camera coordinate formula;
the external parameters are determined based on the relationship between the camera coordinates of the first key point and the coordinates of the second key point in the three-dimensional point cloud.
4. The method for calibrating external parameters of a laser radar and camera according to claim 2, further comprising:
Aiming at a plurality of different poses of the same calibration plate, respectively acquiring three-dimensional point clouds acquired by a plurality of groups of laser radars and pixel images acquired by cameras, and solving a plurality of external parameters;
and determining the optimal external parameters from the plurality of external parameters by adopting a preset algorithm.
5. An external parameter calibration device for a laser radar and a camera, which is characterized by comprising:
the image and point cloud acquisition module is used for acquiring three-dimensional point cloud acquired by the laser radar at the same moment and pixel images acquired by the camera, wherein the three-dimensional point cloud and the pixel images are acquired aiming at the same calibration plate;
The key point determining module is used for determining a first key point in the pixel image and determining a second key point in the three-dimensional point cloud by adopting Hough transformation; the first key points and the second key points are multiple, and the points on the calibration plate corresponding to the first key points are the same as the points on the calibration plate corresponding to the second key points;
The external parameter determining module is used for determining external parameters of the laser radar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud;
The step of acquiring the three-dimensional point cloud acquired by the laser radar at the same moment comprises the following steps:
aiming at the same calibration plate, acquiring multi-frame intermediate three-dimensional point clouds of the same pose, and fusing the multi-frame intermediate three-dimensional point clouds to obtain a three-dimensional point cloud acquired by a laser radar;
the determining the second key point in the three-dimensional point cloud by adopting Hough transformation comprises the following steps:
Fitting the three-dimensional point cloud by adopting a preset RANSAC algorithm to obtain an optimal plane;
Determining a second key point in the optimal plane by adopting Hough transformation;
the determining the second key point in the optimal plane by adopting Hough transformation comprises the following steps:
The straight line fitting unit is used for fitting the edge straight line of the calibration plate in the optimal plane by utilizing Hough transformation;
and the key point determining unit is used for determining the second key point according to the intersection point of the edge straight line and the actual length of the calibration plate, wherein the length of the edge straight line is larger than the actual length of the calibration plate.
6. An electronic device, the electronic device comprising:
One or more processors;
A memory;
One or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications configured to: performing the external parameter calibration method of the laser radar and the camera according to any one of claims 1 to 4.
7. A computer readable storage medium storing at least one instruction, at least one program, code set, or instruction set, loaded and executed by a processor to implement the method of calibrating a lidar and camera according to any of claims 1 to 4.
CN202110286400.3A 2021-03-17 2021-03-17 External parameter calibration method, device and equipment for laser radar and camera and storage medium Active CN113256729B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110286400.3A CN113256729B (en) 2021-03-17 2021-03-17 External parameter calibration method, device and equipment for laser radar and camera and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110286400.3A CN113256729B (en) 2021-03-17 2021-03-17 External parameter calibration method, device and equipment for laser radar and camera and storage medium

Publications (2)

Publication Number Publication Date
CN113256729A CN113256729A (en) 2021-08-13
CN113256729B true CN113256729B (en) 2024-06-18

Family

ID=77181467

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110286400.3A Active CN113256729B (en) 2021-03-17 2021-03-17 External parameter calibration method, device and equipment for laser radar and camera and storage medium

Country Status (1)

Country Link
CN (1) CN113256729B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113838141B (en) * 2021-09-02 2023-07-25 中南大学 External parameter calibration method and system for single-line laser radar and visible light camera
CN114387347B (en) * 2021-10-26 2023-09-19 浙江视觉智能创新中心有限公司 Method, device, electronic equipment and medium for determining external parameter calibration
CN114758005B (en) * 2022-03-23 2023-03-28 中国科学院自动化研究所 Laser radar and camera external parameter calibration method and device
CN115187715A (en) * 2022-06-30 2022-10-14 先临三维科技股份有限公司 Mapping method, device, equipment and storage medium
CN115994955B (en) * 2023-03-23 2023-07-04 深圳佑驾创新科技有限公司 Camera external parameter calibration method and device and vehicle

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108932736B (en) * 2018-05-30 2022-10-11 南昌大学 Two-dimensional laser radar point cloud data processing method and dynamic robot pose calibration method
CN111862224B (en) * 2019-04-17 2023-09-19 杭州海康威视数字技术股份有限公司 Method and device for determining external parameters between camera and laser radar
CN111179358B (en) * 2019-12-30 2024-01-05 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium
CN111192331B (en) * 2020-04-09 2020-09-25 浙江欣奕华智能科技有限公司 External parameter calibration method and device for laser radar and camera
CN111627072B (en) * 2020-04-30 2023-10-24 贝壳技术有限公司 Method, device and storage medium for calibrating multiple sensors
CN111965624B (en) * 2020-08-06 2024-04-09 阿波罗智联(北京)科技有限公司 Laser radar and camera calibration method, device, equipment and readable storage medium
CN112270713B (en) * 2020-10-14 2024-06-14 北京航空航天大学杭州创新研究院 Calibration method and device, storage medium and electronic device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Automatic extrinsic calibration between a camera and a 3D Lidar using 3D point and plane correspondences; Surabhi Verma et al.; 2019 IEEE Intelligent Transportation Systems Conference (ITSC); full text *
Research on straight-line feature extraction from point cloud data based on the Hough transform; Zhang Yunpeng; Mine Surveying (《矿山测量》); Vol. 47, No. 5; full text *

Also Published As

Publication number Publication date
CN113256729A (en) 2021-08-13

Similar Documents

Publication Publication Date Title
CN113256729B (en) External parameter calibration method, device and equipment for laser radar and camera and storage medium
CN109035320B (en) Monocular vision-based depth extraction method
CN109146980B (en) Monocular vision based optimized depth extraction and passive distance measurement method
CN108805934B (en) External parameter calibration method and device for vehicle-mounted camera
CN109961468B (en) Volume measurement method and device based on binocular vision and storage medium
US10339390B2 (en) Methods and apparatus for an imaging system
Micusik et al. Structure from motion with wide circular field of view cameras
US9928595B2 (en) Devices, systems, and methods for high-resolution multi-view camera calibration
CN111123242B (en) Combined calibration method based on laser radar and camera and computer readable storage medium
CN111627075B (en) Camera external parameter calibration method, system, terminal and medium based on aruco code
CN110766758B (en) Calibration method, device, system and storage device
EP3460715B1 (en) Template creation apparatus, object recognition processing apparatus, template creation method, and program
CN103473771A (en) Method for calibrating camera
CN108182708B (en) Calibration method and calibration device of binocular camera and terminal equipment
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
CN114241062A (en) Camera external parameter determination method and device for automatic driving and computer readable storage medium
CN113643380A (en) Mechanical arm guiding method based on monocular camera vision target positioning
CN116012428A (en) Method, device and storage medium for combining and positioning thunder and vision
CN110458951B (en) Modeling data acquisition method and related device for power grid pole tower
CN111656404B (en) Image processing method, system and movable platform
CN111598956A (en) Calibration method, device and system
CN111336938A (en) Robot and object distance detection method and device thereof
CN115713564A (en) Camera calibration method and device
CN115018922A (en) Distortion parameter calibration method, electronic device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant