CN112270713B - Calibration method and device, storage medium and electronic device - Google Patents

Calibration method and device, storage medium and electronic device

Info

Publication number
CN112270713B
CN112270713B (application CN202011094343.0A)
Authority
CN
China
Prior art keywords
calibration plate
point cloud
calibration
sensor
plane
Prior art date
Legal status
Active
Application number
CN202011094343.0A
Other languages
Chinese (zh)
Other versions
CN112270713A (en)
Inventor
欧阳真超
崔家赫
何云翔
朱进文
牛建伟
Current Assignee
Hangzhou Innovation Research Institute of Beihang University
Original Assignee
Hangzhou Innovation Research Institute of Beihang University
Priority date
Filing date
Publication date
Application filed by Hangzhou Innovation Research Institute of Beihang University
Priority to CN202011094343.0A
Publication of CN112270713A
Application granted
Publication of CN112270713B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G06T2207/30208 Marker matrix

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Embodiments of the present application provide a calibration method and device, a storage medium and an electronic device. The method includes the following steps: collecting point cloud data from a solid-state laser radar and obtaining the region where a calibration plate is located; extracting the calibration plate point cloud from that region; fitting the first corner position coordinates of the calibration plate through nonlinear optimization, where the first corner position coordinates are the three-dimensional inner corner coordinates of the point cloud obtained by the solid-state laser radar in the sensor measuring the calibration plate; obtaining the extrinsic transformation matrix of the sensor from the second corner position coordinates of the calibration plate and the first corner position coordinates, where the second corner position coordinates are the two-dimensional inner corner coordinates of the image obtained by the camera in the sensor photographing the calibration plate; and calibrating the sensor through the extrinsic transformation matrix. The scheme of the application improves calibration precision and reduces manual intervention.

Description

Calibration method and device, storage medium and electronic device
Technical Field
The application relates to automatic driving technology, and in particular to a calibration method and device, a storage medium and an electronic device.
Background
By means of sensors mounted at different positions on the vehicle body, information about roads, pedestrians, vehicles and other objects in an open environment can be acquired and used in an automatic driving system.
In the related art, an autonomous-driving perception scheme can simultaneously obtain real-world visual information and three-dimensional spatial distance information. However, the camera's two-dimensional plane imaging results must be registered with the sparse three-dimensional laser radar point cloud data, and the accuracy of this calibration is affected by many factors.
For the problem in the related art that calibration schemes for perception tasks in automatic driving systems perform poorly, no effective solution currently exists.
Disclosure of Invention
Embodiments of the present application provide a calibration method and device for a solid-state laser radar-camera multi-sensor system, a storage medium and an electronic device, so as to at least solve the problem in the related art that calibration schemes for perception tasks in automatic driving systems perform poorly.
According to a first aspect of the embodiments of the present application, a calibration method is provided, including: collecting point cloud data from a solid-state laser radar and obtaining the region where a calibration plate is located; extracting the calibration plate from the region; solving for the fitted coordinates of the calibration plate plane according to the spatial distribution of the point cloud; determining the first corner position coordinates of the calibration plate from the coordinates of the point cloud spatial distribution on the calibration plate plane, where the first corner position coordinates are the three-dimensional corner coordinates of the point cloud obtained by the solid-state laser radar in the sensor measuring the calibration plate; obtaining the extrinsic transformation matrix of the sensor from the second corner position coordinates of the calibration plate and the first corner position coordinates, where the second corner position coordinates are the two-dimensional inner corner coordinates of the image obtained by the camera in the sensor photographing the calibration plate; and calibrating the sensor through the extrinsic transformation matrix.
Optionally, after the sensor is calibrated through the extrinsic transformation matrix, the method further includes: projecting the radar point cloud data in the sensor onto the imaging plane of the camera in the sensor; and/or mapping the visible-light image data in the sensor into three-dimensional space.
Optionally, extracting the calibration plate from the region includes: calculating the probability density distribution of the point cloud height along the Z axis within the region; converting the discrete probability densities of the distribution into a histogram; calculating the gradient between each pair of adjacent bins in the histogram; and segmenting and extracting the calibration plate according to the gradient calculation results.
Optionally, solving for the fitted coordinates of the calibration plate plane according to the spatial distribution of the point cloud includes: performing plane fitting on the calibration plate point cloud data with a random-sample-consensus plane segmentation method to obtain the fitted plane of the calibration plate; and determining the coordinates of the calibration plate plane according to the spatial distribution of the point cloud on the fitted plane.
Optionally, determining the first corner position coordinates of the calibration plate from the coordinates of the point cloud spatial distribution on the calibration plate plane includes: generating a preset calibration plate in a two-dimensional plane according to the actual width, height and grid size of the calibration plate; and solving for the three-dimensional pose difference between the actual point cloud data and the preset calibration plate so that the reflectivity of the actual point cloud data matches the spatial distribution of the preset calibration plate.
Optionally, collecting the point cloud data of the sensor and determining the region where the calibration plate is located includes: filtering the collected point cloud data with a statistical outlier filter and spatially superimposing the point cloud data over a continuous time window; and marking the region of the initial position of the calibration plate.
Optionally, the sensor includes the camera and the solid-state laser radar, and obtaining the extrinsic transformation matrix of the sensor from the second corner position coordinates and the first corner position coordinates of the calibration plate includes: obtaining the second corners and first corners of the calibration plate corresponding to the images and point cloud data in the calibration data; and solving a PnP problem with a random-sample-consensus method using the camera intrinsics, the second corners and the first corners to obtain the calibration extrinsics between the camera and the solid-state laser radar.
According to a second aspect of the embodiments of the present application, a calibration device is provided, including: an acquisition module, used to collect the point cloud data of the sensor and obtain the region where the calibration plate is located; a segmentation-extraction module, used to extract the calibration plate from the region; a fitting module, used to solve for the fitted coordinates of the calibration plate plane according to the spatial distribution of the point cloud; a corner position determining module, used to determine the first corner position coordinates of the calibration plate from the coordinates of the point cloud spatial distribution on the calibration plate plane, where the first corner position coordinates are the three-dimensional inner corner coordinates of the point cloud obtained by the solid-state laser radar in the sensor measuring the calibration plate; and an extrinsic matrix solving module, used to obtain the extrinsic transformation matrix of the sensor from the second corner position coordinates of the calibration plate and the first corner position coordinates, where the second corner position coordinates are the two-dimensional inner corner coordinates of the image obtained by the camera in the sensor photographing the calibration plate. The extrinsic matrix solving module is further used to calibrate the sensor with the solved extrinsic transformation matrix.
According to a third aspect of the embodiments of the present application, a storage medium is provided, in which a computer program is stored, where the computer program is arranged to perform the above method when run.
According to a fourth aspect of the embodiments of the present application, an electronic device is provided, including a memory in which a computer program is stored and a processor arranged to run the computer program to perform the above method.
The embodiments of the present application provide a calibration method, a calibration device, a storage medium and an electronic device. The method first densifies the local static point cloud by integrating point clouds in the time domain to obtain rich target point cloud data, and obtains the three-dimensional coordinates of the calibration plate corner points based on region detection, plane fitting and quasi-Newton optimization. The two-dimensional coordinates of the calibration plate corners in the image are then acquired by combining the camera intrinsic calibration with corner detection in the sensor. Finally, a random sample consensus algorithm optimizes the three-dimensional point cloud to two-dimensional image projection matrix, which yields the extrinsic projection between the different sensors. The technical effects of increasing the calibration precision and reducing manual intervention are achieved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a schematic flow chart of a calibration method according to an embodiment of the application;
FIG. 2 is a schematic diagram of a calibration device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a flow for detecting corner points of a checkerboard calibration plate in a 3D point cloud in an embodiment of the application;
FIG. 4 is a schematic diagram of the flow of corner points of a checkerboard calibration plate in a detected picture in an embodiment of the application;
FIG. 5 is a flow chart of calculating a radar-camera extrinsic matrix in an embodiment of the application;
FIG. 6 is a schematic diagram of a top view of a radar point cloud in an embodiment of the present application;
FIG. 7 is a schematic view of the area of a checkerboard calibration plate in a point cloud in an embodiment of the application;
FIG. 8 is a schematic diagram of a calibration plate divided from a region of checkerboard point cloud in an embodiment of the present application;
Fig. 9 is a schematic diagram of detecting 2D corner points in an image corresponding to a point cloud in an embodiment of the present application;
FIG. 10 is a schematic diagram of radar-camera calibration using an extrinsic matrix in an embodiment of the application.
Detailed Description
In the process of realizing the application, the inventors found that if a monocular vision sensor scheme is adopted in an automatic driving system, the cost is low and rich image information can be provided to the autonomous vehicle, but reliable and accurate three-dimensional distance information cannot be provided; if a binocular camera is adopted, short-range three-dimensional distance information can be obtained through parallax calibration, but beyond a limited distance the data precision degrades, so the method is not suitable for outdoor scenes.
Further, although its cost is high, a laser radar can provide high-precision three-dimensional distance information within a range of 200 meters. Achieving all-round vehicle-body environment sensing through the cooperation of multiple sensors is the current mainstream scheme. The solid-state laser radar is low in cost and produces dense point clouds, which favors wide adoption in unmanned platforms, but its drawbacks, such as non-repetitive scanning, sensitivity to target color and large measurement noise, require targeted optimization in actual use.
In an automatic driving sensing system, multi-sensor fusion not only expands the field of view of the autonomous vehicle but also compensates for the weaknesses of the individual sensing modalities. However, current fusion strategies require the calibration of the different sensors to be completed in advance so that the raw sensing data of the different sensors can be projected into a unified coordinate system, which places high demands on calibration and timestamp synchronization across sensor types.
Because the autonomous-driving perception scheme fuses information through a camera-laser combination, real-world visual information and three-dimensional spatial distance information can be obtained at the same time, but the two-dimensional imaging results of the camera plane must be registered with the three-dimensional data of the sparse laser radar point cloud. In addition, the solid-state laser radar suffers from wavelength-dependent absorption of the laser beam by target colors, so the laser pulses form jitter noise on targets of different colors, which affects the calibration precision. Therefore, in order to complete the perception tasks in an automatic driving system, it is necessary to develop a multi-sensor fusion method, and in particular to optimize the scheme against the instability of the solid-state laser radar point cloud.
In view of the above problems, an embodiment of the present application provides a calibration method, including: acquiring point cloud data of a sensor and obtaining the region where a calibration plate is located; extracting the calibration plate from the region; solving for the fitted coordinates of the calibration plate plane according to the spatial distribution of the point cloud; determining the first corner position coordinates of the calibration plate from the coordinates of the point cloud spatial distribution on the calibration plate plane; and obtaining the extrinsic transformation matrix of the sensor from the second corner position coordinates of the calibration plate and the first corner position coordinates. The first corner coordinates are the three-dimensional inner corner coordinates measured by the solid-state laser radar in the sensor; the second corner position coordinates are the two-dimensional inner corner coordinates from the camera in the sensor. The sensor is then calibrated through the extrinsic transformation matrix.
In order to make the technical solutions and advantages of the embodiments of the present application clearer, exemplary embodiments of the present application are described in detail below in conjunction with the accompanying drawings. It is evident that the described embodiments are only some, not all, of the embodiments of the present application. It should be noted that, where no conflict arises, the embodiments of the present application and the features in the embodiments may be combined with each other.
In this embodiment, a calibration method is provided. As shown in FIG. 1, the process includes the following steps:
Step S101: acquiring point cloud data of the sensor and obtaining the region where the calibration plate is located;
Step S102: extracting the calibration plate from the region;
Step S103: solving for the fitted coordinates of the calibration plate plane according to the spatial distribution of the point cloud;
Step S104: determining the first corner coordinates of the calibration plate from the coordinates of the point cloud spatial distribution on the calibration plate plane, where the first corner coordinates are the three-dimensional inner corner coordinates measured by the solid-state laser radar in the sensor;
Step S105: obtaining the extrinsic transformation matrix of the sensor from the second corner position coordinates of the calibration plate and the first corner coordinates, where the second corner position coordinates are the two-dimensional inner corner coordinates from the camera in the sensor;
Step S106: calibrating the sensor through the extrinsic transformation matrix.
Through the above steps, the following technical effects are achieved:
The method first densifies the local static point cloud by integrating point clouds in the time domain to obtain rich target point cloud data, and obtains the three-dimensional coordinates of the calibration plate corner points based on region detection, plane fitting and quasi-Newton optimization. The two-dimensional coordinates of the calibration plate corners in the image are then acquired by combining the camera intrinsic calibration with corner detection in the sensor. Finally, a random sample consensus algorithm optimizes the three-dimensional point cloud to two-dimensional image projection matrix, which yields the extrinsic projection between the different sensors. The technical effects of increasing the calibration precision and reducing manual intervention are thus achieved.
In step S101, the raw point cloud acquisition and preprocessing module applies statistical outlier removal (Statistical Outlier Removal) noise filtering to each frame of solid-state laser radar point cloud data and integrates the point clouds over a continuous window of T seconds; the approximate position of the checkerboard calibration plate in the bird's-eye view is then marked manually, and the region of interest (Region of Interest, ROI) where the calibration plate is located is obtained.
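The preprocessing above can be sketched as follows. This is a minimal NumPy illustration, not the patent's implementation; the neighbour count `k` and `std_ratio` are assumed values (PCL's StatisticalOutlierRemoval filter exposes the same two knobs):

```python
import numpy as np

def statistical_outlier_removal(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours is more
    than std_ratio standard deviations above the global mean."""
    # Brute-force pairwise distances; fine for small, ROI-cropped clouds.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)   # column 0 is the self-distance
    keep = mean_knn <= mean_knn.mean() + std_ratio * mean_knn.std()
    return points[keep]

# Temporal densification: filter each frame of a static scene, then stack.
frames = [np.random.default_rng(i).normal(size=(200, 3)) for i in range(5)]
dense = np.vstack([statistical_outlier_removal(f) for f in frames])
```

In practice a k-d tree (or the PCL/Open3D built-in filter) would replace the quadratic distance matrix for full-frame clouds.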
In step S102, the calibration plate is segmented. A calibration box of the calibration plate is obtained from the annotation, the point cloud is cropped with the ROI, and the subsequent detection modules only consider points inside the region. The calibration plate is then cut out of the region: a height threshold for segmenting the plate is determined from the probability density distribution of the point cloud along the Z axis within the region, and the plate is segmented accordingly.
In step S103, the planar features of the calibration plate are optimized. Plane fitting is performed on the calibration plate, and the true position of the calibration plate plane is obtained through multiple iterations. After the plane parameters are obtained, a ray-casting model computes the ray from the solid-state laser radar origin to each point in the original calibration plate point cloud, and the original points are projected onto the ideal calibration plate plane; each projected point is the intersection of its ray with the plane. Because the solid-state laser radar scans back and forth along the Y axis, the point density is uneven at the edges of the scan lines, so the fitted calibration plate plane point cloud is randomly resampled on a grid to ensure that the point density in each cell is evenly distributed after resampling.
In step S104, three-dimensional corner detection is performed. Using the quasi-Newton L-BFGS-B optimization method and the reflectivity distribution characteristics of the point cloud, the pose difference matrix T between the ideal checkerboard model and the actual calibration plate plane point cloud is solved; the corner coordinates of the ideal checkerboard are then transformed back by T into the coordinate system of the original point cloud, yielding the detected three-dimensional inner corner coordinates of the calibration plate.
In step S105, steps S101-S104 are repeated to obtain the corresponding two-dimensional and three-dimensional inner corner coordinates for all image-point cloud pairs in the calibration data; the PnP (Perspective-n-Point) problem is then solved with the RANSAC (Random Sample Consensus) method in combination with the camera intrinsics, finally yielding the calibration extrinsics E between the camera and the solid-state laser radar.
In step S106, the sensor is calibrated through the extrinsic transformation matrix, giving accurate three-dimensional corner coordinates of the calibration plate.
Preferably, in an embodiment of the present application, after the sensor is calibrated through the extrinsic transformation matrix, the method further includes: projecting the radar point cloud data in the sensor onto the imaging plane of the camera in the sensor; and/or mapping the visible-light image data in the sensor into three-dimensional space.
Based on the above steps, the solid-state laser radar and the camera are combined, and the checkerboard calibration plate is used as the calibration reference target. The two-dimensional and three-dimensional positions of the calibration plate corners in the fields of view of the camera and the solid-state laser radar are computed in real time during calibration, so the whole process requires only a small amount of manual intervention and the acquisition of several groups of image-point cloud data pairs. The algorithm computes an accurate camera-to-solid-state-laser-radar extrinsic transformation matrix, after which the radar point cloud can be projected onto the imaging plane of the camera, or the visible-light image mapped into three-dimensional space, realizing the fusion of multi-modal sensor data; the method can also be used to calibrate multiple radars to expand the field of view.
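The projection step can be sketched as below, assuming a pinhole camera model with a 3x3 intrinsic matrix K and the 3x4 extrinsic matrix [R|t] produced by the calibration (names and shapes here are illustrative, not taken from the patent):

```python
import numpy as np

def project_to_image(pts_lidar, K, E):
    """Project N x 3 LiDAR points into pixel coordinates: x = K [R|t] X.

    K: 3x3 camera intrinsic matrix; E: 3x4 extrinsic matrix [R|t]."""
    pts_h = np.hstack([pts_lidar, np.ones((len(pts_lidar), 1))])
    cam = E @ pts_h.T                  # 3 x N points in the camera frame
    uv = K @ cam
    return (uv[:2] / uv[2]).T          # perspective divide -> N x 2 pixels
```

Points behind the camera (non-positive depth) would be culled before the divide in a real pipeline.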
As a preferred embodiment of the present application, extracting the calibration plate from the region includes: calculating the probability density distribution of the point cloud height along the Z axis within the region; converting the discrete probability densities of the distribution into a histogram; calculating the gradient between each pair of adjacent bins in the histogram; and segmenting and extracting the calibration plate according to the gradient calculation results.
In the specific implementation, the calibration plate is segmented from the annotated region of the checkerboard calibration plate. The probability density distribution of the point cloud height along the Z axis within the region is calculated, and the discrete probabilities are binned into a histogram; the gradient between each pair of adjacent bins is then calculated. A bin here plays the same role as a bin in a color histogram, where a value range is divided into many small intervals and each bin counts the samples falling into it; the more bins, the finer the resolution of the histogram. Since no occluding object is above the calibration plate, the top height Z_max of the calibration plate point cloud is the maximum height within the region. The K bins with the highest gradient values are then selected and their average heights calculated; the height difference between each candidate and the top of the plate is computed, and the height whose difference is closest to the diagonal length of the calibration plate is selected as the lower segmentation threshold Z_min. The calibration plate can then be segmented with the height interval (Z_min, Z_max).
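The height-histogram segmentation can be sketched as follows; the bin count, `k`, and the exact candidate-selection rule are assumptions filling in details the text does not pin down:

```python
import numpy as np

def board_height_threshold(z, board_diag, bins=50, k=3):
    """Choose the lower segmentation height Z_min for the board.

    Z_max is simply the highest point in the ROI (nothing hangs above the
    board); Z_min is taken from the k highest-gradient histogram bins as the
    height whose gap to Z_max is closest to the board's diagonal length."""
    hist, edges = np.histogram(z, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    grad = np.abs(np.diff(hist))                 # gradient between adjacent bins
    candidates = centers[1:][np.argsort(grad)[-k:]]
    z_max = z.max()
    z_min = candidates[np.argmin(np.abs((z_max - candidates) - board_diag))]
    return z_min, z_max
```

The diagonal-length tie-breaker works because the board stands roughly upright, so its point cloud can span at most its diagonal in height.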
Preferably, in the embodiment of the present application, solving for the fitted coordinates of the calibration plate plane according to the spatial distribution of the point cloud includes: performing plane fitting on the calibration plate point cloud data with a random-sample-consensus plane segmentation method to obtain the fitted plane of the calibration plate; and determining the coordinates of the calibration plate plane according to the spatial distribution of the point cloud on the fitted plane.
In the specific implementation, a plane segmentation method based on random sample consensus is applied to fit a plane to the calibration plate point cloud. After the plane parameters S are obtained, the distance from each calibration plate point to the fitted plane S is calculated and points whose distance exceeds 2θ are filtered out; the plane is then iteratively refitted with θ halved each time until the distances from all remaining points to the new fitted plane are smaller than θ. The final remaining point cloud set constitutes the fitted plane of the calibration plate.
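A minimal sketch of the fit-trim-halve loop follows. It uses a least-squares SVD plane fit in place of the patent's RANSAC-based segmentation, and the initial θ is an assumed value:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane: returns (unit normal, centroid)."""
    centroid = points.mean(axis=0)
    # The right-singular vector for the smallest singular value of the
    # centred cloud is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    return vt[-1], centroid

def iterative_plane_fit(points, theta=0.05, min_theta=1e-4):
    """Trim points farther than 2*theta from the fit, halve theta, and refit
    until every remaining point lies within theta of the plane."""
    pts = points
    while True:
        normal, centroid = fit_plane(pts)
        dist = np.abs((pts - centroid) @ normal)
        pts = pts[dist <= 2 * theta]
        if dist.max() <= theta or theta <= min_theta:
            return normal, centroid, pts
        theta /= 2
```

The `min_theta` floor guards against infinite halving on noisy data, a detail not specified in the text.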
Further, P'proj is resampled: P'proj is first divided into grid patches and the density of each patch is calculated; each patch whose density falls below D_th is then resampled by randomly selecting points within the current patch.
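One possible reading of the grid resampling is sketched below. The patent's per-patch sample-count formula is not given in the text, so this variant simply caps the number of points kept per cell; the cell size and cap are assumptions:

```python
import numpy as np

def grid_resample(plane_pts, cell=0.05, cap=10, seed=0):
    """Even out point density on the fitted plane: bin the 2-D plane
    coordinates into cells and keep at most `cap` random points per cell, so
    scan-line edge pile-ups no longer dominate the cloud."""
    rng = np.random.default_rng(seed)
    keys = np.floor(plane_pts / cell).astype(int)
    kept = []
    for key in np.unique(keys, axis=0):
        idx = np.where((keys == key).all(axis=1))[0]
        if len(idx) > cap:
            idx = rng.choice(idx, size=cap, replace=False)
        kept.append(plane_pts[idx])
    return np.vstack(kept)
```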
In some alternative embodiments, because the distance variance measured by the solid-state laser radar is larger in the axial direction, the calibration plate obtained after temporal superposition has a larger error along the X axis, so the actual position of the calibration plate plane needs to be estimated.
As a preferred embodiment of the present application, determining the first corner position coordinates of the calibration plate from the coordinates of the point cloud spatial distribution on the calibration plate plane includes: generating a preset calibration plate in a two-dimensional plane according to the actual width, height and grid size of the calibration plate; and solving for the three-dimensional pose difference between the actual point cloud data and the preset calibration plate so that the reflectivity of the actual point cloud data matches the spatial distribution of the preset calibration plate.
In the specific implementation, the L-BFGS-B optimization method is applied, and the corner coordinates of the fitted calibration plate plane point cloud are obtained by solving with the reflectivity information. First, an ideal checkerboard is generated in a two-dimensional plane according to the actual width (w), height (h) and grid size (grid_size) of the calibration plate; an optimization algorithm then solves for the three-dimensional pose difference T = [θ, x, y] between the actual point cloud and the ideal checkerboard so that the reflectivity of the actual point cloud matches the spatial distribution of the ideal checkerboard.
Further, the specific steps of obtaining the point cloud corner coordinates of the fitted calibration plate plane by solving over the reflectivity information with the L-BFGS-B optimization method are as follows:
Step S1, randomly generating an initial three-dimensional pose T0 = [θ0, x0, y0], and setting the current pose variable T = T0;
Step S2, transforming P'proj to a new pose using T, and calculating the difference cost between P'proj and the ideal checkerboard spatial position distribution using the reflectivity information:
step S21, traversing the point cloud to obtain the plane coordinates {p_x, p_y} of each point p;
step S22, if p falls outside the ideal checkerboard, then cost_p = min_{g∈G} dist({p_x, p_y}, {g_x, g_y}), wherein G is the set of ideal checkerboard corner points, and the current computation ends;
step S23, if p falls inside the ideal checkerboard, the coordinates at which p falls on the ideal checkerboard are {G_x, G_y};
Step S24, finding the color CLR_estimate corresponding to (G_x, G_y) according to the color pattern of the actual checkerboard;
Step S25, obtaining the corresponding color CLR_gt after binarizing the actual reflectivity of the current point, and comparing it with the ideal color CLR_estimate; cost_p = 0 if the colors are the same, otherwise cost_p = min_{g∈G} dist({p_x, p_y}, {g_x, g_y});
Step S26, the cost calculation of the current point is ended.
Step S3, optimizing T with the L-BFGS-B algorithm according to the cost value to obtain the pose parameter T' at the next moment, setting T = T', and returning to step S2 to repeat the optimization until the cost converges.
As an embodiment of the present application, preferably, the collecting the point cloud data of the sensor and determining the area where the calibration plate is located includes: filtering the collected point cloud data by statistical outlier removal, and superimposing the point cloud data of continuous time in space; and marking the area of the initial position of the calibration plate.
In specific implementation, the collected point cloud data is filtered by statistical outlier removal, the point clouds of T consecutive seconds are spatially superimposed, and an operator then marks the area of the approximate position of the checkerboard calibration plate in the top view of the point cloud.
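A minimal sketch of the two preprocessing operations, assuming a k-nearest-neighbour formulation of statistical outlier removal (points whose mean neighbour distance exceeds the global mean by `std_ratio` standard deviations are dropped) and simple stacking for the temporal superposition of a static scene:

```python
import numpy as np
from scipy.spatial import cKDTree

def statistical_outlier_removal(points, k=16, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours is
    more than std_ratio standard deviations above the global mean."""
    tree = cKDTree(points)
    # query returns the point itself as the first neighbour, so ask for k+1
    dists, _ = tree.query(points, k=k + 1)
    mean_d = dists[:, 1:].mean(axis=1)
    keep = mean_d < mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]

def superimpose(frames):
    """Spatially superimpose consecutive frames of a static scene."""
    return np.vstack(frames)
```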
As a preferred embodiment of the present application, the obtaining the extrinsic conversion matrix of the sensor according to the second angular point position coordinates and the first angular point position coordinates of the calibration board includes: the sensor comprising a camera and a solid-state laser radar; acquiring the second angular points and the first angular points corresponding to the images and the point cloud data in the calibration data of the calibration plate; and solving a PNP problem by utilizing a random sampling consistency method according to the camera internal parameters, the second angular points and the first angular points, obtaining the calibration external parameters between the camera and the solid-state laser radar.
In specific implementation, the corresponding two-dimensional inner corner and three-dimensional inner corner coordinates in all image-point cloud pairs in the calibration data are obtained, and a PNP problem is solved by a random sample consensus (RANSAC) method in combination with the camera internal parameters, finally obtaining the calibration external parameters E between the camera and the solid-state laser radar.
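The patent solves the PnP problem with RANSAC in combination with the camera intrinsics. As a self-contained illustration of recovering a projection from 2D-3D corner correspondences, the sketch below uses a plain direct linear transform (DLT) without the RANSAC loop; this is a substitute technique for illustration only, not the patent's method.

```python
import numpy as np

def dlt_projection(X, x):
    """Estimate the 3x4 projection matrix P (up to scale) from n >= 6
    non-coplanar 3D points X and their 2D pixel projections x via DLT."""
    n = X.shape[0]
    Xh = np.hstack([X, np.ones((n, 1))])
    A = np.zeros((2 * n, 12))
    A[0::2, 0:4] = Xh
    A[0::2, 8:12] = -x[:, 0:1] * Xh
    A[1::2, 4:8] = Xh
    A[1::2, 8:12] = -x[:, 1:2] * Xh
    # The null vector of A (smallest singular value) holds P row-wise.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)

def reproject(P, X):
    """Project 3D points with P and dehomogenize to pixel coordinates."""
    Xh = np.hstack([X, np.ones((X.shape[0], 1))])
    xh = Xh @ P.T
    return xh[:, :2] / xh[:, 2:3]
```

Given the camera intrinsics K, the extrinsic follows from P up to scale; a RANSAC loop would repeat this fit on random minimal subsets and keep the consensus set.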
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, but of course also by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present invention.
In this embodiment, a calibration device is further provided, and the device is used to implement the foregoing embodiments and preferred embodiments, which are not described in detail. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
FIG. 2 is a block diagram of a calibration device according to an embodiment of the present invention, as shown in FIG. 2, the device includes:
The acquisition module 21 is used for acquiring point cloud data of the sensor and acquiring an area where the calibration plate is located;
A segmentation extraction module 22, configured to extract a calibration plate according to the region;
the fitting module 23 is used for solving according to the calibration plate to obtain the fitted coordinates of the plane of the calibration plate according to the space distribution of the point cloud;
The angular point position determining module 24 is configured to determine a first angular point position coordinate of the calibration plate according to coordinates of a point cloud spatial distribution of a plane of the calibration plate, where the first angular point position coordinate is a three-dimensional internal angular point coordinate of the solid-state laser radar in the sensor;
the external parameter matrix solving module 25 is configured to obtain an external parameter conversion matrix of the sensor according to a second angular point position coordinate of the calibration plate and the first angular point position coordinate, where the second angular point position coordinate is a two-dimensional internal angular point coordinate of a camera in the sensor;
The external parameter matrix solving module is further used for calibrating the sensor through the external parameter conversion matrix.
The acquisition module 21 performs statistical outlier filtering and noise reduction on the solid-state laser radar point cloud data of each frame, and spatially superimposes the point clouds within T consecutive seconds; the approximate position of the checkerboard calibration plate in the bird's-eye view is then manually marked, and the region of interest (Region of Interest, ROI) where the calibration plate is located is obtained.
The segmentation extraction module 22 obtains the bounding box of the calibration plate from the label and crops the point cloud in the region where it is located; the subsequent detection module only considers points inside the region. The calibration plate is segmented from the region: the height threshold of the calibration plate is obtained using the probability density of the Z-axis point cloud distribution in the region, the width of the calibration plate is determined, and the calibration plate is finally obtained.
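The height-threshold computation (detailed further in claim 1: Z-axis probability density, histogram, gradients between adjacent bins, threshold selection by the board diagonal) can be sketched as follows; the bin count, the value of K, and the per-bin reading of the "average heights" step are assumptions.

```python
import numpy as np

def board_height_threshold(z, board_diag, n_bins=50, k=8):
    """Pick a segmentation threshold below the board top.

    The top of the board is taken as the maximum height in the region
    (nothing hangs above the board).  Among the K largest density
    gradients between adjacent histogram bins, the bin height whose gap
    to the top is closest to the board diagonal is the threshold.
    """
    hist, edges = np.histogram(z, bins=n_bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    grad = np.abs(np.diff(hist))
    top_k = np.argsort(grad)[-k:]            # indices of the K largest gradients
    heights = centers[top_k]
    z_top = z.max()
    return heights[np.argmin(np.abs((z_top - heights) - board_diag))]
```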
The fitting module 23 obtains the real position of the calibration plate plane through multiple iterations. Specifically, a linear projection model is adopted: the ray parameters from the solid-state laser radar origin to each point near the calibration plate plane are calculated, the intersection point space P of the rays and the calibration plate plane is obtained, and the point cloud near the plane is thereby projected onto the ideal plane. Principal component analysis (PCA) is then performed on the intersection point space P, and the fitted coordinates of the calibration plate plane according to the point cloud spatial distribution are solved.
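The linear projection model and the PCA step can be sketched as below, assuming the lidar origin is the coordinate origin and that a plane point and unit normal from the fitting stage are given:

```python
import numpy as np

def project_along_rays(points, plane_point, plane_normal):
    """Intersect the ray from the lidar origin through each point with the
    fitted plane: q = s * p with s = (c . n) / (p . n)."""
    s = (plane_point @ plane_normal) / (points @ plane_normal)
    return points * s[:, None]

def pca_plane_coords(points):
    """PCA transform: express the near-planar points in their two
    principal in-plane axes."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return (points - centroid) @ vt[:2].T    # 2D in-plane coordinates
```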
The angular point position determining module 24 applies the quasi-Newton L-BFGS-B optimization method and solves the corner coordinate positions of the point cloud in the calibration plate plane by utilizing the reflectivity of the point cloud; the obtained corner coordinates of the point cloud calibration plate plane are then transformed into the original point cloud space to obtain the corresponding three-dimensional internal corner coordinates of the detected calibration plate.
Since the solid-state laser radar scans in a reciprocating manner in the Y-axis direction, the density of points at the edges of a scan line is uneven, and the calibration plate plane therefore needs to be resampled.
The external reference matrix module 25 obtains the radar-camera external reference matrix based on a random sample consensus method and a Perspective-n-Point (PnP) solution: the corresponding two-dimensional inner corner and three-dimensional inner corner coordinates in all image-point cloud pairs in the calibration data are obtained, and the PnP problem is solved by the random sample consensus (RANSAC) method in combination with the camera internal parameters, finally obtaining the calibration external parameters E between the camera and the solid-state laser radar.
In order to better understand the calibration procedure, the technical solution described above is explained below in connection with the preferred embodiment, but the technical solution of the embodiment of the present invention is not limited.
Aiming at the inherent precision defects of solid-state laser radar, the present application designs a point cloud optimization and noise reduction algorithm for a multi-radar-camera sensing system, and on this basis designs a corresponding sensor calibration flow that improves calibration precision and reduces manual intervention. The whole method mainly comprises three main steps: camera two-dimensional corner detection, solid-state laser radar point cloud three-dimensional corner detection, and solving based on two-dimensional-three-dimensional projection transformation. Firstly, local static point clouds are densified by integrating point clouds in the time domain to obtain rich target point cloud data, and the three-dimensional coordinates of the calibration plate corners are obtained based on region detection, plane fitting and quasi-Newton optimization. Then, camera intrinsic calibration and corner detection are combined to obtain the two-dimensional coordinates of the calibration plate corners in the image. Finally, the projection matrix between the three-dimensional point cloud and the two-dimensional image is optimized by a random sample consensus algorithm, yielding the external parameter projection results between the different sensors. The calibration method is mainly used for combining a camera with a solid-state laser radar sensor to simultaneously obtain visual information of the real world and three-dimensional distance information in space; the front and side views of the noise distribution of different radar point clouds on the calibration plate are considered, and in particular, solid-state laser radars with denser point clouds are used as the distance sensors in the experiments.
The framework divides the radar-camera calibration task into two modules. Firstly, the three-dimensional corner information of the checkerboard calibration plate is acquired in the laser radar point cloud view; then, the two-dimensional corner information of the checkerboard calibration plate is acquired in the camera picture; finally, the three-dimensional and two-dimensional corner information is combined to calculate an accurate radar-camera conversion matrix. Preferably, the relative positions of the sensors and the calibration plate and the corresponding coordinate systems determine the sensor positions and the checkerboard calibration plate positions. The flow diagrams of the two main modules of the solid-state laser radar-camera calibration method based on time-domain integration are shown in figs. 3 and 4, respectively. Finally, the three-dimensional and two-dimensional checkerboard corner information is obtained through the modules shown in figs. 3 and 4, and an accurate radar-camera conversion matrix is obtained, as shown in fig. 5.
Step 1, firstly, mounting a solid-state laser radar and a camera at a rigid body fixed position (such as a vehicle-mounted platform bracket), ensuring that the fields of view of the radar and the camera are not blocked, and accessing the solid-state laser radar and the camera into a computing device through a data line such as an Ethernet port or USB. Meanwhile, the checkerboard calibration plate is prepared and supported in a certain range in front of the radar and the camera, the checkerboard calibration plate is ensured to be always in the fields of view of the radar and the camera at the same time in the whole experimental process, the calibration plate is placed at a plurality of positions as much as possible for sampling in the experimental process, and the angle of the calibration plate is properly adjusted.
Step 2, a calibration procedure is initiated and the camera will transmit the captured photograph to the computing device while the computer will pop up a window of the point cloud top view as shown in fig. 6. At this time, noise reduction processing has been performed on the solid-state lidar point cloud data of each frame, and point clouds within consecutive T seconds are spatially superimposed. The rough position of the checkerboard calibration plate in the point cloud top view needs to be marked manually by an operator, and the area where the calibration plate is located can be obtained through automatic processing of an algorithm, as shown in fig. 7.
Step 3, the algorithm obtains the bounding box of the calibration plate from the label and crops the calibration plate point cloud in combination with the region where it is located; the subsequent detection module only considers points inside the region, namely: the calibration plate is segmented from the region, the height threshold of the calibration plate is obtained using the probability density of the Z-axis point cloud distribution in the region, the width of the calibration plate is determined, and the calibration plate is segmented, as shown in fig. 8.
And 4, performing plane fitting on the calibration plate, and obtaining the real position of the plane of the calibration plate through multiple iterations. The specific operation adopts a linear projection model, the ray parameters from the origin of the solid-state laser radar to each point in the plane of the calibration plate are calculated, the intersection point space P of the ray and the plane of the calibration plate is calculated, and the point cloud near the plane is projected onto an ideal plane. And carrying out PCA transformation on the intersection point space P, and solving to obtain the fitted coordinate of the calibration plate plane according to the spatial distribution of the point cloud.
Step 5, resampling the calibration plate plane, and solving the corner coordinate positions of the point cloud in the calibration plate plane using the point cloud reflectivity with the quasi-Newton L-BFGS-B optimization method; the obtained corner coordinates of the point cloud calibration plate plane are transformed into the original point cloud space to obtain the three-dimensional inner corner coordinates corresponding to the detected calibration plate;
and 6, the computer can acquire the two-dimensional angular point information of the calibration plate from the picture at the same time, and the detected angular point is shown in fig. 9.
Step 7, placing the calibration plates at different positions, and repeating the steps 2-6 to obtain a plurality of groups of three-dimensional-two-dimensional checkerboard corner information; the camera internal reference matrix is combined, and an accurate radar-camera external reference conversion matrix can be obtained through calculation. Fig. 10 is an effect diagram of calibrating a radar-camera under the same coordinate system by using the obtained external parameter transformation matrix.
Specifically, a solid-state laser radar reprojection error histogram may be used for quantization: the checkerboard corners detected in the point cloud view are reprojected onto the camera plane using the radar-camera external parameter conversion matrix obtained in the above steps, and the histogram of pixel errors between the reprojected point cloud corners and the image corners is then calculated.
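A sketch of this quantization step, assuming the extrinsic E is a 3x4 [R|t] matrix and K the camera intrinsic matrix; pixel errors between reprojected point cloud corners and detected image corners are binned with `numpy.histogram`:

```python
import numpy as np

def reprojection_error_histogram(pts3d, pts2d, K, E, bins=10):
    """Reproject 3D corners with the extrinsic E = [R|t] (3x4) through the
    intrinsics K, and histogram the pixel errors against the image corners."""
    Xh = np.hstack([pts3d, np.ones((pts3d.shape[0], 1))])
    proj = Xh @ E.T @ K.T                    # homogeneous pixel coordinates
    uv = proj[:, :2] / proj[:, 2:3]
    err = np.linalg.norm(uv - pts2d, axis=1)
    hist, edges = np.histogram(err, bins=bins)
    return err, hist, edges
```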
An embodiment of the invention also provides a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described storage medium may be configured to store a computer program for performing the steps of:
s1, acquiring point cloud data of a sensor, and acquiring an area where a calibration plate is located;
s2, extracting a calibration plate from the point cloud of the area;
s3, solving according to the calibration plate to obtain the fitted coordinates of the plane of the calibration plate according to the space distribution of the point cloud;
S4, determining a first angular point position coordinate of the calibration plate through nonlinear optimization according to the coordinate of the calibration plate plane distributed according to the point cloud space, wherein the first angular point position coordinate is a three-dimensional internal angular point coordinate of the point cloud obtained by measuring the calibration plate by the solid-state laser radar in the sensor;
S5, obtaining an external parameter conversion matrix between the sensors according to the second angular point position coordinates of the calibration plate and the first angular point position coordinates, wherein the second angular point position coordinates are two-dimensional internal angular point coordinates of a camera in the sensors;
s6, calibrating the sensor through the external parameter conversion matrix.
Optionally, the storage medium is further arranged to store a computer program for performing the steps of:
s31, projecting radar point cloud data in the sensor to an imaging plane of a camera in the sensor.
Optionally, the storage medium is further arranged to store a computer program for performing the steps of:
and S32, mapping the visible light image data in the sensor to a three-dimensional space.
Alternatively, in the present embodiment, the storage medium may include, but is not limited to: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other various media capable of storing a computer program.
An embodiment of the invention also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
S1, acquiring point cloud data of a sensor, and acquiring an area where a calibration plate is located;
S2, solving according to the calibration plate to obtain the fitted coordinates of the plane of the calibration plate according to the space distribution of the point cloud;
S3, determining a first angular point position coordinate of the calibration plate according to the coordinates of the point cloud space distribution on the plane of the calibration plate, wherein the first angular point position coordinate is a three-dimensional internal angular point coordinate of the point cloud obtained by measuring the calibration plate by the solid-state laser radar in the sensor;
S4, obtaining an external parameter conversion matrix of the sensor according to a second angular point position coordinate and the first angular point position coordinate of the calibration plate, wherein the second angular point position coordinate is a two-dimensional internal angular point coordinate of an image obtained by shooting the calibration plate by a camera in the sensor;
s5, calibrating the sensor through the external parameter conversion matrix.
Alternatively, specific examples in this embodiment may refer to examples described in the foregoing embodiments and optional implementations, and this embodiment is not described herein.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (9)

1. The calibration method is characterized by comprising the following steps:
collecting point cloud data of a solid-state laser radar, and obtaining an area where a calibration plate is located;
extracting the point cloud of the area to obtain a calibration plate point cloud;
solving according to the calibration plate point cloud to obtain the fitted coordinates of the plane of the calibration plate according to the spatial distribution of the point cloud;
Determining a first angular point position coordinate of the calibration plate through nonlinear optimization according to the coordinates of the spatial distribution of the point cloud on the plane of the calibration plate, wherein the first angular point position coordinate is a three-dimensional internal angular point coordinate of the point cloud obtained by measuring the calibration plate by a solid-state laser radar in a sensor;
According to the second angular point position coordinates of the calibration plate and the first angular point position coordinates, solving to obtain an external parameter conversion matrix of the sensor, wherein the second angular point position coordinates are two-dimensional internal angular point coordinates of an image obtained by shooting the calibration plate by a camera in the sensor;
calibrating in the sensor through the external parameter conversion matrix;
The method for extracting the calibration plate from the region comprises the following steps:
Calculating probability density distribution of the Z-axis height of the point cloud in the region;
Converting the discrete probability density of the probability density distribution into a histogram;
Calculating gradients between two adjacent bins in the histogram;
dividing and extracting according to the gradient calculation result to obtain the calibration plate;
segmenting the calibration plate from the vicinity of the marked region of the checkerboard calibration plate: calculating the probability density distribution of the Z-axis height of the point cloud in the region, and converting the discrete probability density into a histogram;
then calculating the gradient between two adjacent bins in the histogram, wherein a bin in the histogram refers to a bin of the color histogram, in which the color space is divided into a plurality of small color intervals; the color histogram is obtained by counting the pixels of the colors in each interval, and the more bins there are, the stronger the color resolution of the histogram;
and then selecting the K bins with the highest gradient values to calculate average heights, calculating the height difference between the average heights and the top of the calibration plate, and selecting the height whose difference is closest to the diagonal length of the calibration plate as the segmentation threshold Z_max.
2. The method of claim 1, further comprising verifying calibration effects after calibrating the sensor by the extrinsic transformation matrix by:
projecting radar point cloud data in the sensor to an imaging plane of a camera in the sensor;
and/or mapping visible light image data in the sensor to a three-dimensional space.
3. The method according to claim 1, wherein the solving according to the calibration plate point cloud to obtain the fitted coordinates of the plane of the calibration plate according to the spatial distribution of the point cloud comprises:
performing plane fitting on the point cloud data of the calibration plate based on a plane segmentation method with random sampling consistency to obtain a fitting plane of the calibration plate;
And determining coordinates of the plane of the calibration plate according to the spatial distribution of the point cloud in the fitting plane.
4. The method of claim 1, wherein determining the first angular point location coordinates of the calibration plate based on coordinates of the plane of the calibration plate spatially distributed according to the point cloud comprises:
Generating a preset calibration plate in a two-dimensional plane according to the actual dimension parameter width and height of the calibration plate and the size of the calibration plate;
The reflectivity of the actual point cloud data is consistent with the space distribution condition in the preset calibration plate by solving the three-dimensional pose difference value of the actual point cloud data and the preset calibration plate;
and according to the three-dimensional pose difference value, reversely transforming the angular point of the preset calibration plate to a coordinate system corresponding to the actual point cloud to obtain the first angular point position coordinate.
5. The method of claim 1, wherein determining the area where the calibration plate is located based on the point cloud data of the acquisition sensor comprises:
Filtering the collected point cloud data by using a statistical outlier filtering method, and overlapping the point cloud data in continuous time in space;
Marking the area of the initial position of the calibration plate.
6. The method of claim 1, wherein said solving the extrinsic transformation matrix of the sensor based on the second corner position coordinates and the first corner position coordinates of the calibration plate comprises:
The sensor comprises at least: a camera, a solid-state lidar,
Acquiring the position coordinates of the second angular point and the position coordinates of the first angular point, which correspond to the point cloud data of the image in the calibration data in the calibration plate;
And solving a PNP problem by utilizing a random sampling consistency method according to the camera internal parameters, the second angular points and the first angular points, and obtaining an external parameter matrix between the camera and the solid-state laser radar.
7. A calibration device, comprising:
The data acquisition module is used for acquiring point cloud data of the sensor and acquiring an area where the calibration plate is positioned;
the segmentation extraction module is used for extracting and obtaining a calibration plate according to the region;
The fitting module is used for solving and obtaining coordinates of the plane of the calibrated plate which is fitted according to the space distribution of the point cloud;
The angular point position determining module is used for determining first angular point position coordinates of the calibration plate according to coordinates of the point cloud space distribution of the plane of the calibration plate, wherein the first angular point position coordinates are three-dimensional internal angular point coordinates of the point cloud obtained by measuring the calibration plate by the solid-state laser radar in the sensor;
The external parameter matrix solving module is used for obtaining an external parameter conversion matrix of the sensor according to the second angular point position coordinate and the first angular point position coordinate of the calibration plate, wherein the second angular point position coordinate is a two-dimensional internal angular point coordinate of an image obtained by shooting the calibration plate by a camera in the sensor;
the external parameter matrix solving module is further used for calibrating the external parameter conversion matrix obtained through solving in the sensor;
wherein extracting the calibration plate from the region comprises:
calculating the probability density distribution of the Z-axis heights of the point cloud in the region;
converting the discrete probability densities of the distribution into a histogram;
calculating the gradient between each pair of adjacent bins in the histogram; and
segmenting and extracting the calibration plate according to the gradient computation;
wherein the calibration plate is segmented from the neighborhood of the marked checkerboard calibration plate region by calculating the probability density distribution of the Z-axis heights of the point cloud in the region and binning the discrete probabilities into a histogram;
the gradient between each pair of adjacent bins of the histogram is then calculated, where a bin is one of the small intervals into which the value range is divided (by analogy with a color histogram, in which the color space is divided into many small intervals and the histogram counts the pixels falling into each one; the more bins, the finer the histogram's resolution); since no occluding object exists above the calibration plate, the top height Z_max of the calibration plate is the maximum height in the region;
then the K bins with the highest gradient values are selected and their average heights computed; the difference between each average height and the top of the calibration plate is calculated, and the height whose difference is closest to the diagonal length of the calibration plate is selected as the segmentation threshold; the calibration plate can then be segmented with this height threshold.
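The segmentation heuristic above can be sketched in a few lines of numpy on a synthetic scene. This is an illustrative reading of the claim, not the patent's implementation: the function name `segment_board_by_height` and the parameter choices `k=5` and `n_bins=50` are ours.

```python
import numpy as np

def segment_board_by_height(points, board_diagonal, k=5, n_bins=50):
    """Split calibration-board points from the scene using the Z-height
    histogram-gradient heuristic: find the k steepest histogram steps,
    and cut at the one whose drop below the board top best matches the
    board's diagonal length.
    points: (N, 3) array; board_diagonal: diagonal length in meters."""
    z = points[:, 2]
    hist, edges = np.histogram(z, bins=n_bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    grad = np.abs(np.diff(hist))          # gradient between adjacent bins
    z_top = z.max()                       # nothing hangs above the board
    # average heights of the k steepest steps are the candidate cut heights
    idx = np.argsort(grad)[-k:]
    cand = 0.5 * (centers[idx] + centers[idx + 1])
    # keep the candidate whose drop below the top best matches the diagonal
    cut = cand[np.argmin(np.abs((z_top - cand) - board_diagonal))]
    return points[z >= cut]
```

On a scene made of a dense ground layer near z = 0 plus a board occupying heights 1.0 to 2.0 m, the steepest histogram step near the board's bottom edge is the one whose drop below the top matches the 1.0 m diagonal, so the cut lands at the board's lower edge and the ground is discarded.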
8. A storage medium having a computer program stored therein, wherein the computer program is arranged to perform the method of any one of claims 1 to 6 when run.
9. An electronic device comprising a memory and a processor, characterized in that the memory has a computer program stored therein, and the processor is arranged to run the computer program to perform the method of any one of claims 1 to 6.
CN202011094343.0A 2020-10-14 2020-10-14 Calibration method and device, storage medium and electronic device Active CN112270713B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011094343.0A CN112270713B (en) 2020-10-14 2020-10-14 Calibration method and device, storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011094343.0A CN112270713B (en) 2020-10-14 2020-10-14 Calibration method and device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN112270713A CN112270713A (en) 2021-01-26
CN112270713B true CN112270713B (en) 2024-06-14

Family

ID=74338185

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011094343.0A Active CN112270713B (en) 2020-10-14 2020-10-14 Calibration method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN112270713B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112462348B (en) * 2021-02-01 2021-04-27 知行汽车科技(苏州)有限公司 Method and device for amplifying laser point cloud data and storage medium
CN113139569B (en) * 2021-03-04 2022-04-22 山东科技大学 Target classification detection method, device and system
US20220300681A1 (en) * 2021-03-16 2022-09-22 Yuan Ren Devices, systems, methods, and media for point cloud data augmentation using model injection
CN113256729B (en) * 2021-03-17 2024-06-18 广西综合交通大数据研究院 External parameter calibration method, device and equipment for laser radar and camera and storage medium
CN113126115B (en) * 2021-04-06 2023-11-17 北京航空航天大学杭州创新研究院 Semantic SLAM method and device based on point cloud, electronic equipment and storage medium
CN115239815B (en) * 2021-06-23 2023-10-27 上海仙途智能科技有限公司 Camera calibration method and device
CN113567964B (en) * 2021-06-29 2023-07-25 苏州一径科技有限公司 Laser radar automatic test method, device and system
CN113406604A (en) * 2021-06-30 2021-09-17 山东新一代信息产业技术研究院有限公司 Device and method for calibrating positions of laser radar and camera
CN114689106B (en) * 2022-03-31 2024-03-08 上海擎朗智能科技有限公司 Sensor calibration method, robot and computer readable storage medium
CN114624683A (en) * 2022-04-07 2022-06-14 苏州知至科技有限公司 Calibration method for external rotating shaft of laser radar
CN114677429B (en) * 2022-05-27 2022-08-30 深圳广成创新技术有限公司 Positioning method and device of manipulator, computer equipment and storage medium
CN116452439A (en) * 2023-03-29 2023-07-18 中国工程物理研究院计算机应用研究所 Noise reduction method and device for laser radar point cloud intensity image

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111179358A (en) * 2019-12-30 2020-05-19 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9811714B2 (en) * 2013-08-28 2017-11-07 Autodesk, Inc. Building datum extraction from laser scanning data
CN107976668B (en) * 2016-10-21 2020-03-31 法法汽车(中国)有限公司 Method for determining external parameters between camera and laser radar
CN107976669B (en) * 2016-10-21 2020-03-31 法法汽车(中国)有限公司 Device for determining external parameters between camera and laser radar
CN106651863A (en) * 2016-11-30 2017-05-10 厦门大学 Point cloud data based automatic tree cutting method
KR101948569B1 (en) * 2017-06-07 2019-02-15 국방과학연구소 Flying object identification system using lidar sensors and pan/tilt zoom cameras and method for controlling the same
CN108983248A (en) * 2018-06-26 2018-12-11 长安大学 It is a kind of that vehicle localization method is joined based on the net of 3D laser radar and V2X
CN109919237B (en) * 2019-03-13 2021-02-26 武汉海达数云技术有限公司 Point cloud processing method and device
CN110349221A (en) * 2019-07-16 2019-10-18 北京航空航天大学 A kind of three-dimensional laser radar merges scaling method with binocular visible light sensor
CN110599541B (en) * 2019-08-28 2022-03-11 贝壳技术有限公司 Method and device for calibrating multiple sensors and storage medium
CN111127563A (en) * 2019-12-18 2020-05-08 北京万集科技股份有限公司 Combined calibration method and device, electronic equipment and storage medium
CN111260668B (en) * 2020-01-20 2023-05-26 南方电网数字电网研究院有限公司 Power line extraction method, system and terminal
CN111192331B (en) * 2020-04-09 2020-09-25 浙江欣奕华智能科技有限公司 External parameter calibration method and device for laser radar and camera
CN111612845A (en) * 2020-04-13 2020-09-01 江苏大学 Laser radar and camera combined calibration method based on mobile calibration plate

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111179358A (en) * 2019-12-30 2020-05-19 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN112270713A (en) 2021-01-26

Similar Documents

Publication Publication Date Title
CN112270713B (en) Calibration method and device, storage medium and electronic device
CN107316325B (en) Airborne laser point cloud and image registration fusion method based on image registration
CN107463918B (en) Lane line extraction method based on fusion of laser point cloud and image data
CN111179358B (en) Calibration method, device, equipment and storage medium
CN112907676B (en) Calibration method, device and system of sensor, vehicle, equipment and storage medium
CN111383279B (en) External parameter calibration method and device and electronic equipment
CN109961468B (en) Volume measurement method and device based on binocular vision and storage medium
JP6821712B2 (en) Calibration of integrated sensor in natural scene
CN111563921B (en) Underwater point cloud acquisition method based on binocular camera
CN109801333B (en) Volume measurement method, device and system and computing equipment
CN108474658B (en) Ground form detection method and system, unmanned aerial vehicle landing method and unmanned aerial vehicle
CN110555407B (en) Pavement vehicle space identification method and electronic equipment
CN111815716A (en) Parameter calibration method and related device
EP2686827A1 (en) 3d streets
CN112907675B (en) Calibration method, device, system, equipment and storage medium of image acquisition equipment
CN113205604A (en) Feasible region detection method based on camera and laser radar
CN111382591B (en) Binocular camera ranging correction method and vehicle-mounted equipment
CN114494466B (en) External parameter calibration method, device and equipment and storage medium
CN112446926A (en) Method and device for calibrating relative position of laser radar and multi-eye fisheye camera
GB2569609A (en) Method and device for digital 3D reconstruction
CN115546216B (en) Tray detection method, device, equipment and storage medium
CN115407338A (en) Vehicle environment information sensing method and system
JP6492603B2 (en) Image processing apparatus, system, image processing method, and program
CN112767498A (en) Camera calibration method and device and electronic equipment
CN111414848B (en) Full-class 3D obstacle detection method, system and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant