CN114076918A - Millimeter wave radar, laser radar and camera combined calibration method and device - Google Patents

Publication number: CN114076918A
Authority: CN (China)
Prior art keywords: point cloud, millimeter wave, camera, laser, image
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202010843901.2A
Other languages: Chinese (zh)
Inventors: 王邓江, 马冰, 杨光, 邓永强
Current assignee: Beijing Wanji Technology Co Ltd
Original assignee: Beijing Wanji Technology Co Ltd
Application filed by Beijing Wanji Technology Co Ltd
Priority application CN202010843901.2A, published as CN114076918A

Classifications

    • G01S 7/40: Means for monitoring or calibrating (details of radar systems according to G01S 13/00)
    • G01S 7/497: Means for monitoring or calibrating (details of lidar systems according to G01S 17/00)
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/10028: Range image; depth image; 3D point clouds
    • G06T 2207/10044: Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The application relates to a method and device for the joint calibration of a millimeter wave radar, a laser radar and a camera, comprising the following steps. The point cloud corresponding to the millimeter wave point cloud data is mapped onto the image to obtain a millimeter wave mapped point cloud, and the point cloud corresponding to the laser point cloud data is mapped onto the image to obtain a laser mapped point cloud. The overlapping areas between the millimeter wave point cloud target frame and the laser point cloud target frame of the calibration object on the image and the image recognition target frame are calculated respectively. The first initial joint calibration parameter and the second initial joint calibration parameter are adjusted based on the overlapping areas until the overlapping areas meet a preset threshold, and the adjusted first and second joint calibration parameters are output as the target joint calibration parameters for the joint calibration of the millimeter wave radar, the laser radar and the camera. Because the millimeter wave radar, the laser radar and the camera calibrate themselves against shared targets, no specific calibration object is needed, which avoids wasting manpower and material resources and improves reusability.

Description

Millimeter wave radar, laser radar and camera combined calibration method and device
Technical Field
The application relates to the technical field of camera calibration, in particular to a millimeter wave radar, laser radar and camera combined calibration method and device.
Background
With the continuous development of automatic driving and unmanned driving technology, autonomous vehicles are gradually entering people's daily lives and bringing convenience. A core problem faced by this technology is that a vehicle's visual perception capability is insufficient to form an overall perception of the entire road traffic environment, so driving safety cannot be comprehensively guaranteed.
In recent years, schemes have appeared in which a millimeter wave radar, a laser radar and a camera together improve the vehicle's visual perception. However, when the millimeter wave radar, the laser radar and the camera are calibrated by traditional methods, the calibration must be carried out with a specific calibration object, which wastes manpower and material resources, and the resulting setup has low reusability.
Disclosure of Invention
The embodiment of the application provides a millimeter wave radar, laser radar and camera combined calibration method and device, which can avoid waste of manpower and material resources and improve reusability of a calibration process.
A millimeter wave radar, laser radar and camera combined calibration method comprises the following steps:
acquiring images, millimeter wave point cloud data and laser point cloud data which are acquired at the same time and in the same scene;
mapping the point cloud corresponding to the millimeter wave point cloud data onto the image according to a first initial joint calibration parameter of the millimeter wave radar relative to the camera to obtain a millimeter wave mapped point cloud, and mapping the point cloud corresponding to the laser point cloud data onto the image according to a second initial joint calibration parameter of the laser radar relative to the camera to obtain a laser mapped point cloud;
calculating the overlapping area between a millimeter wave point cloud target frame and a laser point cloud target frame of a calibration object on the image and an image identification target frame respectively, wherein the calibration object is at least one target in the overlapping detection area of the camera and the millimeter wave radar;
and adjusting the first initial combined calibration parameter and the second initial combined calibration parameter based on the overlapping area corresponding to each calibration object until the overlapping area meets a preset threshold, and taking the adjusted first combined calibration parameter and second combined calibration parameter as target combined calibration parameters for combined calibration of the millimeter wave radar, the laser radar and the camera.
A millimeter wave radar, laser radar and camera combined calibration device comprises:
the data acquisition module is used for acquiring images, millimeter wave point cloud data and laser point cloud data acquired at the same time and in the same scene;
the point cloud mapping module is used for mapping the point cloud corresponding to the millimeter wave point cloud data to the image according to a first initial joint calibration parameter of the millimeter wave radar relative to the camera to obtain millimeter wave mapping point cloud, and mapping the point cloud corresponding to the laser point cloud data to the image according to a second initial joint calibration parameter of the laser radar relative to the camera to obtain laser mapping point cloud;
the overlapping area calculation module is used for calculating the overlapping area between a millimeter wave point cloud target frame and a laser point cloud target frame of a calibration object on the image and an image identification target frame respectively, wherein the calibration object is at least one target in the overlapping detection area of the camera and the millimeter wave radar;
and the target joint calibration parameter output module is used for adjusting the first initial joint calibration parameter and the second initial joint calibration parameter based on the overlapping area corresponding to each calibration object until the overlapping area meets a preset threshold value, and taking the adjusted first joint calibration parameter and the adjusted second joint calibration parameter as target joint calibration parameters for joint calibration of the millimeter wave radar, the laser radar and the camera.
A server comprising a memory and a processor, the memory having stored therein a computer program which, when executed by the processor, causes the processor to carry out the steps of the above method.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method as above.
According to the millimeter wave radar, laser radar and camera combined calibration method and device, the point cloud corresponding to the millimeter wave point cloud data is mapped onto the image to obtain the millimeter wave mapped point cloud, and the point cloud corresponding to the laser point cloud data is mapped onto the image to obtain the laser mapped point cloud. The overlapping areas between the millimeter wave point cloud target frame and the laser point cloud target frame of the calibration object on the image and the image recognition target frame are calculated respectively. The first initial joint calibration parameter and the second initial joint calibration parameter are adjusted based on the overlapping areas until the overlapping areas meet a preset threshold, and the adjusted first and second joint calibration parameters are output as the target joint calibration parameters for the joint calibration of the millimeter wave radar, the laser radar and the camera. Because the sensors calibrate themselves in this way, no specific calibration object is needed, which avoids wasting manpower and material resources and improves reusability.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from them without creative effort.
FIG. 1 is a diagram of an application scenario of a millimeter wave radar, lidar and camera joint calibration method in an embodiment;
FIG. 2 is a flowchart of a millimeter wave radar, lidar and camera joint calibration method in one embodiment;
FIG. 3 is a flow diagram of a process for calculating a first initial joint calibration parameter for the millimeter wave radar relative to the camera in one embodiment;
FIG. 4 is a diagram illustrating conversion of a spherical coordinate system to a world coordinate system in one embodiment;
FIG. 5 is a flowchart of the method for mapping the point cloud corresponding to the millimeter wave point cloud data onto the image to obtain the millimeter wave mapped point cloud shown in FIG. 2;
FIG. 6 is a flowchart of a method for mapping a point cloud corresponding to the laser point cloud data onto an image to obtain a laser mapped point cloud in FIG. 2;
FIG. 7 is a flowchart of a millimeter wave radar, lidar and camera joint calibration method in an exemplary embodiment;
FIG. 8 is a block diagram of the millimeter wave radar, lidar and camera combined calibration apparatus in one embodiment;
FIG. 9 is a block diagram of a millimeter-wave radar, lidar and camera combined calibration apparatus in another embodiment;
fig. 10 is a schematic diagram of an internal configuration of a server in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another.
In the traditional method, when the millimeter wave radar, the laser radar and the camera are jointly calibrated, a calibration target usually has to be specially designed because the three sensors have different imaging principles. The traditional method therefore needs a specific calibration object arranged on site, which wastes manpower and material resources and offers low reusability.
As shown in fig. 1, fig. 1 is an application scenario diagram of the millimeter wave radar, laser radar and camera combined calibration method in an embodiment. The application environment includes a millimeter wave radar 120, a laser radar 140, a camera 160 and a server 180. An image is captured by the camera while, at the same time and in the same scene, millimeter wave point cloud data is collected by the millimeter wave radar and laser point cloud data is collected by the laser radar. The server acquires the image, the millimeter wave point cloud data and the laser point cloud data collected at the same time and in the same scene; maps the point cloud corresponding to the millimeter wave point cloud data onto the image according to a first initial joint calibration parameter of the millimeter wave radar relative to the camera to obtain a millimeter wave mapped point cloud, and maps the point cloud corresponding to the laser point cloud data onto the image according to a second initial joint calibration parameter of the laser radar relative to the camera to obtain a laser mapped point cloud; and calculates a first overlapping area between the millimeter wave point cloud target frame of a calibration object on the image and the image recognition target frame, wherein the calibration object is at least one target in the overlapping detection area of the camera and the millimeter wave radar.
The server then calculates a second overlapping area between the laser point cloud target frame of the calibration object on the image and the image recognition target frame; adjusts the first initial joint calibration parameter based on the first overlapping area until the first overlapping area meets a first preset threshold and outputs the adjusted first joint calibration parameter; adjusts the second initial joint calibration parameter based on the second overlapping area until the second overlapping area meets a second preset threshold and outputs the adjusted second joint calibration parameter; and takes the adjusted first and second joint calibration parameters as the target joint calibration parameters for the joint calibration of the millimeter wave radar, the laser radar and the camera. A millimeter wave radar is a radar that detects targets in the millimeter wave band; millimeter waves generally refer to electromagnetic waves in the 30 to 300 GHz frequency range (wavelengths of 1 to 10 mm). A laser radar is a radar system that detects characteristic quantities of a target, such as its position and velocity, by emitting a laser beam. Its working principle is to transmit a detection signal (the laser beam) at a target, compare the received signal (the target echo) reflected from the target with the transmitted signal, and, after appropriate processing, obtain information about the target such as its distance, azimuth, height, speed, attitude and even shape.
A millimeter wave radar is good at judging the position and estimating the speed of targets, but performs poorly when acquiring information such as the number and features of targets. A camera has great advantages in extracting target features, but cannot accurately acquire a target's speed and position. Fusing the millimeter wave radar with the camera therefore extracts more target feature information while also providing more accurate speed and position estimates. A laser radar has a relatively short detection range but high measurement accuracy: it can accurately acquire the three-dimensional and speed information of an object, but it is expensive, lacks RGB information, and cannot work normally in bad weather such as rain or fog. Fusing the millimeter wave radar, the laser radar and the camera therefore lets each sensor play to its strengths; on the basis of this fusion, the sensors exchange information, and information about the surrounding environment is acquired to the maximum extent.
Fig. 2 is a flowchart of a millimeter wave radar, laser radar, and camera combined calibration method in an embodiment, and as shown in fig. 2, a millimeter wave radar, laser radar, and camera combined calibration method applied to a server is provided, which includes steps 220 to 280.
And step 220, acquiring images, millimeter wave point cloud data and laser point cloud data acquired at the same time and in the same scene.
The image is shot by a camera, the millimeter wave point cloud data is acquired by a millimeter wave radar, and the laser point cloud data is acquired by a laser radar.
When the millimeter wave radar, the laser radar and the camera are jointly calibrated, the three sensors are first adjusted to preset angles and their relative positions are fixed. Then an image is captured by the camera while, at the same time and in the same scene, millimeter wave point cloud data is collected by the millimeter wave radar and laser point cloud data is collected by the laser radar. Finally, the server takes one frame of image together with the millimeter wave point cloud data and the laser point cloud data corresponding to that frame to form a data set; multiple such data sets may be acquired during joint calibration. The millimeter wave point cloud data used here is only the point cloud data corresponding to targets screened out of the raw millimeter wave point cloud, not all of the point cloud data, which reduces the amount of computation to a certain extent.
Specifically, the system timestamps of the millimeter wave radar, the laser radar and the camera are obtained, and the system time differences between the millimeter wave radar and the industrial personal computer, between the laser radar and the industrial personal computer, and between the camera and the industrial personal computer are calculated. The data corresponding to the millimeter wave point cloud, the laser point cloud and the image are collected by the same industrial personal computer, and their system timestamps are converted onto the industrial personal computer's time axis based on these three time differences, so that time-synchronized (same-moment) millimeter wave radar data, laser point cloud data and image data can be acquired.
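The time-synchronization step described above can be sketched as follows. The function names, offsets and the 10 ms pairing tolerance are illustrative assumptions, not values taken from the patent.

```python
# Sketch of the timestamp-alignment step: each sensor's timestamps are
# shifted onto the industrial personal computer's time axis using the
# measured system time offsets, then the nearest frames within a
# tolerance are paired. All names and numbers here are illustrative.

def to_ipc_axis(sensor_timestamps, offset_to_ipc):
    """Convert sensor timestamps (seconds) onto the industrial PC time axis."""
    return [t + offset_to_ipc for t in sensor_timestamps]

def match_nearest(ref_ts, other_ts, tolerance=0.010):
    """For each reference timestamp, return the index of the closest
    timestamp in other_ts, or None if nothing lies within tolerance."""
    matches = []
    for t in ref_ts:
        best = min(range(len(other_ts)), key=lambda i: abs(other_ts[i] - t))
        matches.append(best if abs(other_ts[best] - t) <= tolerance else None)
    return matches

# Example: camera and radar clocks offset from the industrial PC by a few ms.
camera_ts = to_ipc_axis([0.000, 0.033, 0.066], offset_to_ipc=0.002)
radar_ts = to_ipc_axis([0.001, 0.034, 0.070], offset_to_ipc=-0.001)
pairs = match_nearest(camera_ts, radar_ts)
```

With the example offsets, each camera frame pairs with the radar frame of the same index, since the residual differences are well under the tolerance.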
And step 240, according to the first initial joint calibration parameter of the millimeter wave radar relative to the camera, mapping the point cloud corresponding to the millimeter wave point cloud data onto the image to obtain a millimeter wave mapped point cloud; and according to the second initial joint calibration parameter of the laser radar relative to the camera, mapping the point cloud corresponding to the laser point cloud data onto the image to obtain a laser mapped point cloud.
Here, the internal reference (intrinsic) matrix of the camera is

    K = | f_x   0    c_x |
        |  0   f_y   c_y |
        |  0    0     1  |

where (c_x, c_y) are the coordinates of the image center point and f_x, f_y are the focal lengths expressed in units of pixels.
In addition, the millimeter wave radar is located in a spherical coordinate system, the camera in a camera coordinate system, and the image captured by the camera in a pixel coordinate system. The spherical coordinate system can be converted to the camera coordinate system through the world coordinate system, and the camera coordinate system can be converted to the pixel coordinate system. Therefore, when the server calculates the initial joint calibration parameters of the millimeter wave radar relative to the camera, it can use the world coordinate system as a bridge and compute the first initial joint calibration parameter from the coordinates of the millimeter wave radar and of the camera in the world coordinate system. The joint calibration parameters may also be referred to as the extrinsic matrix.
Then, according to the camera's intrinsic matrix, the camera's distortion coefficients and the first initial joint calibration parameter of the millimeter wave radar relative to the camera, the point cloud corresponding to the millimeter wave point cloud data is mapped onto the image through the perspective transformation to obtain the millimeter wave mapped point cloud. This establishes a correspondence between the millimeter wave point cloud (i.e., the point cloud corresponding to the millimeter wave point cloud data) and the pixels on the image.
The laser radar natively uses the world coordinate system, so its joint calibration parameters are expressed relative to the world coordinate system, and the world coordinate system and the camera coordinate system can be converted into each other directly. The server can therefore directly calculate the second initial joint calibration parameter of the laser radar relative to the camera, again using the world coordinate system as a bridge, from the laser radar's calibration parameters and the camera's calibration parameters relative to the world coordinate system.
Then, according to the camera's intrinsic matrix, the camera's distortion coefficients and the second initial joint calibration parameter of the laser radar relative to the camera, the point cloud corresponding to the laser point cloud data is mapped onto the image through the perspective transformation to obtain the laser mapped point cloud. This establishes a correspondence between the laser point cloud (i.e., the point cloud corresponding to the laser point cloud data) and the pixels on the image.
And step 260, calculating the overlapping area between the millimeter wave point cloud target frame and the laser point cloud target frame of the calibration object on the image and the image identification target frame respectively, wherein the calibration object is at least one target in the overlapping detection area of the camera and the millimeter wave radar.
The server maps the point cloud corresponding to the millimeter wave point cloud data onto the image to obtain the millimeter wave mapped point cloud, maps the point cloud corresponding to the laser point cloud data onto the image to obtain the laser mapped point cloud, and then obtains the millimeter wave point cloud target frame, the laser point cloud target frame and the image recognition target frame of the calibration object on the image. The first overlapping area between the millimeter wave point cloud target frame of the calibration object and the image recognition target frame is calculated, and the accuracy of the first initial joint calibration parameter of the millimeter wave radar relative to the camera is verified based on this overlapping area: the larger the overlapping area, the higher the accuracy of the spatial alignment between the millimeter wave point cloud and the image captured by the camera.
A second overlapping area between the laser point cloud target frame of the calibration object on the image and the image recognition target frame is then calculated, and the accuracy of the second initial joint calibration parameter of the laser radar relative to the camera is verified based on this overlapping area: likewise, the larger the overlapping area, the higher the accuracy of the spatial alignment between the laser point cloud and the image captured by the camera.
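The overlap measure between a mapped point-cloud target frame and the image-recognition target frame can be sketched as follows. Intersection-over-union (IoU) is a common normalized form of "overlapping area"; the patent itself only specifies an overlapping area, so treating it as IoU is an assumption. Boxes are (x_min, y_min, x_max, y_max) in pixels.

```python
# Sketch of the overlap measure between two axis-aligned target frames.
# IoU in [0, 1]: 1 means identical boxes, 0 means no overlap.

def box_iou(a, b):
    """Intersection-over-union of boxes (x_min, y_min, x_max, y_max)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A well-aligned calibration drives the IoU between the mapped point-cloud frame and the image-recognition frame toward 1.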
And step 280, adjusting the first initial combined calibration parameter and the second initial combined calibration parameter based on the overlapping area corresponding to each calibration object until the overlapping area meets a preset threshold value, and taking the adjusted first combined calibration parameter and the adjusted second combined calibration parameter as target combined calibration parameters for the combined calibration of the millimeter wave radar, the laser radar and the camera.
Specifically, step 280 includes adjusting a first initial joint calibration parameter based on the first overlap area until the first overlap area satisfies a first preset threshold, and outputting the adjusted first joint calibration parameter. Adjusting a second initial combined calibration parameter based on the second overlapping area until the second overlapping area meets a second preset threshold, and outputting the adjusted second combined calibration parameter; and taking the adjusted first combined calibration parameter and the adjusted second combined calibration parameter as target combined calibration parameters for combined calibration of the millimeter wave radar, the laser radar and the camera.
The larger the overlapping area, the higher the accuracy of the spatial alignment between the millimeter wave point cloud (or the laser point cloud) and the image. The server therefore adjusts the first initial joint calibration parameter based on the first overlapping area until the first overlapping area meets the first preset threshold and outputs the adjusted first joint calibration parameter, and adjusts the second initial joint calibration parameter based on the second overlapping area until the second overlapping area meets the second preset threshold and outputs the adjusted second joint calibration parameter. The adjusted first and second joint calibration parameters are then used as the target joint calibration parameters, completing the joint calibration of the millimeter wave radar, the laser radar and the camera.
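The adjustment loop of step 280 can be sketched as a simple search over the calibration parameters that keeps any perturbation increasing the overlap, stopping once the overlap meets the preset threshold. The patent does not prescribe a specific optimizer, so the greedy coordinate search, step size and threshold below are illustrative assumptions.

```python
# Illustrative sketch of the parameter-adjustment loop in step 280:
# perturb each calibration parameter, keep perturbations that increase
# the overlap score, and stop when the preset threshold is met.
# The optimizer, step size and threshold are assumptions, not the patent's.

def refine_params(params, overlap_fn, threshold, step=0.01, max_iters=1000):
    """Greedy coordinate search maximizing overlap_fn(params)."""
    params = list(params)
    best = overlap_fn(params)
    for _ in range(max_iters):
        if best >= threshold:          # overlap meets the preset threshold
            break
        improved = False
        for i in range(len(params)):
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                score = overlap_fn(trial)
                if score > best:       # keep the improving perturbation
                    params, best, improved = trial, score, True
        if not improved:               # local optimum below threshold
            break
    return params, best

# Toy overlap surface peaking at (0.03, -0.02), standing in for the IoU
# between mapped point-cloud frames and image-recognition frames.
params, best = refine_params(
    [0.0, 0.0],
    lambda p: 1.0 - abs(p[0] - 0.03) - abs(p[1] + 0.02),
    threshold=0.99)
```

In practice the parameters would be the rotation and translation components of the extrinsic matrices, and the overlap function would re-project the point clouds and recompute the overlapping areas.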
In the embodiment of the application, the point cloud corresponding to the millimeter wave point cloud data is mapped onto the image to obtain the millimeter wave mapped point cloud, and the point cloud corresponding to the laser point cloud data is mapped onto the image to obtain the laser mapped point cloud. The overlapping areas between the millimeter wave point cloud target frame and the laser point cloud target frame of the calibration object on the image and the image recognition target frame are calculated respectively. The first and second initial joint calibration parameters are adjusted based on the overlapping areas until the overlapping areas meet the preset thresholds, and the adjusted first and second joint calibration parameters are output as the target joint calibration parameters for the joint calibration of the millimeter wave radar, the laser radar and the camera. Because the sensors calibrate themselves in this way, no specific calibration object is needed, which avoids wasting manpower and material resources and improves reusability.
In one embodiment, as shown in fig. 3, the calculation process of the first initial joint calibration parameter of the millimeter wave radar with respect to the camera includes:
and step 320, converting the coordinates of the millimeter wave radar in the spherical coordinate system into the coordinates of the millimeter wave radar in the world coordinate system.
Specifically, fig. 4 illustrates the conversion from the spherical coordinate system to the world coordinate system. The spherical coordinate system takes the coordinate origin as its reference point and is defined by an azimuth angle α, an elevation angle θ and a distance r; the world coordinate system is defined by the axes Xw, Yw, Zw. Since the millimeter wave radar's native coordinate system is spherical, its coordinates are converted into world coordinates through the conversion formula between the spherical and world coordinate systems.
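The conversion in step 320 can be sketched as the standard spherical-to-Cartesian formula. The exact axis convention depends on the definition in fig. 4, which is not reproduced here, so the axis assignment below is an assumption.

```python
import math

# Sketch of the spherical-to-world conversion of step 320: a millimeter
# wave radar return given by range r, azimuth alpha and elevation theta
# is converted to Cartesian world coordinates (Xw, Yw, Zw).
# The axis convention is an assumption (z up, azimuth in the Xw-Yw plane).

def spherical_to_world(r, azimuth, elevation):
    """Convert (r, alpha, theta) to Cartesian (Xw, Yw, Zw)."""
    xw = r * math.cos(elevation) * math.cos(azimuth)
    yw = r * math.cos(elevation) * math.sin(azimuth)
    zw = r * math.sin(elevation)
    return xw, yw, zw
```

For example, a return at zero azimuth and zero elevation lies directly along the Xw axis at distance r.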
Step 340, converting the coordinates of the camera in the camera coordinate system into the coordinates of the camera in the world coordinate system.
The coordinate system of the camera is a camera coordinate system, and the coordinates of the camera in the camera coordinate system are converted into the coordinates of the camera in the world coordinate system through a conversion formula between the camera coordinate system and the world coordinate system. Here, a point in the camera coordinate system may be represented by (x, y, z).
And step 360, calculating a first initial joint calibration parameter of the millimeter wave radar relative to the camera according to the coordinate of the millimeter wave radar in the world coordinate system and the coordinate of the camera in the world coordinate system.
After the calculation in steps 320 and 340, the coordinates of the millimeter wave radar in the world coordinate system and the coordinates of the camera in the world coordinate system are obtained, and the first initial external reference matrix of the millimeter wave radar relative to the camera can be calculated using the world coordinate system as an intermediate coordinate system. The joint calibration parameters may also be referred to as an external parameter matrix. [R T] comprises a 3×3 rotation matrix R and a 3×1 translation vector T, specifically:
[R T] = [ r11 r12 r13 t1 ; r21 r22 r23 t2 ; r31 r32 r33 t3 ]
in the embodiment of the application, firstly, the coordinates of the millimeter wave radar in a spherical coordinate system are converted into the coordinates of the millimeter wave radar in a world coordinate system; secondly, converting the coordinates of the camera in a camera coordinate system into the coordinates of the camera in a world coordinate system; finally, a first initial external reference matrix of the millimeter wave radar with respect to the camera may be calculated based on the world coordinate system as an intermediate coordinate system. By means of the world coordinate system as an intermediate coordinate system, the first initial external parameter matrix of the millimeter wave radar relative to the camera can be accurately calculated.
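As a minimal illustrative sketch of steps 320 to 360 (all function and variable names are hypothetical, and the radar-to-world and camera-to-world poses are assumed to be already known), the first initial external parameter matrix can be composed through the world coordinate system as follows:

```python
import numpy as np

def compose_extrinsic(R_rw, t_rw, R_cw, t_cw):
    """Compose the radar-to-camera extrinsic [R T] via the world frame.

    R_rw, t_rw: rotation/translation taking radar coordinates to world
    coordinates; R_cw, t_cw: the same for the camera. A radar point p maps
    to camera coordinates as R_cw^T (R_rw p + t_rw - t_cw)."""
    R = R_cw.T @ R_rw
    t = R_cw.T @ (t_rw - t_cw)
    return np.hstack([R, t.reshape(3, 1)])  # 3x4 matrix [R T]
```

With both poses expressed in the shared world coordinate system, the radar-to-camera extrinsic drops out of a single composition, which is why the world coordinate system serves as the intermediate coordinate system.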
In an embodiment, as shown in fig. 5, mapping a point cloud corresponding to millimeter wave point cloud data onto an image according to a first initial joint calibration parameter of a millimeter wave radar relative to a camera to obtain a millimeter wave mapping point cloud, including:
step 242, converting the coordinates of the millimeter wave point cloud data in the spherical coordinate system into coordinates in the world coordinate system.
As shown in fig. 4, it is assumed that the millimeter wave radar detects a point cloud corresponding to a target from a shooting scene, where the radial distance between the point cloud and the radar is r, the azimuth angle is α, and the elevation angle is θ. Then the coordinates of the millimeter wave point cloud data corresponding to the target in the spherical coordinate system are (r, θ, α).
The radial distance can be decomposed into:
x=rsin(α) y=rcos(α) (1-1)
If the installation height of the radar is h, the radar beam is tilted obliquely downward, and s is the radial distance from the intersection point of the central axis of the radar beam and the ground plane to the radar, characterizing the tilt of the radar installation. A three-dimensional world coordinate system is established with the radar as the origin O: the direction in which the beam axis slopes toward the ground is the positive O-Zw axis; viewed from the radar, the leftward direction in the plane of the radar is the positive O-Xw axis, and the upward direction is the positive O-Yw axis. Mapping the detected position information r and α of the target into this three-dimensional world coordinate system gives
Xw=-x (1-2)
Yw=-ysin(θ) (1-3)
Zw=ycos(θ) (1-4)
where the relation between x, y and r, α is given by equation (1-1) above. From the geometric relationship, θ satisfies:
θ = arcsin(h/s)    (1-5)
therefore, the process of converting the coordinates of the millimeter wave point cloud data in the spherical coordinate system into the coordinates in the world coordinate system is completed.
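The conversion of equations (1-1) through (1-4) can be sketched in a few lines of Python (a minimal sketch with illustrative names; the tilt-angle relation is assumed to take the form θ = arcsin(h/s) from the mounting geometry described above):

```python
import numpy as np

def radar_spherical_to_world(r, alpha, theta):
    """Map a radar detection (distance r, azimuth alpha, tilt theta, angles
    in radians) into the world coordinate system per (1-1)..(1-4)."""
    x = r * np.sin(alpha)      # (1-1): horizontal decomposition of r
    y = r * np.cos(alpha)
    Xw = -x                    # (1-2)
    Yw = -y * np.sin(theta)    # (1-3)
    Zw = y * np.cos(theta)     # (1-4)
    return np.array([Xw, Yw, Zw])

def tilt_angle(h, s):
    """Tilt angle from mounting height h and slant distance s to the
    beam/ground intersection (assumed form of (1-5))."""
    return np.arcsin(h / s)
```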
And 243, inputting the internal reference matrix of the camera, the distortion coefficient of the camera, the first initial joint calibration parameter of the millimeter wave radar relative to the camera and the coordinate of the millimeter wave point cloud data in the world coordinate system into the transmission transformation matrix, and calculating the coordinate of the millimeter wave point cloud corresponding to the millimeter wave point cloud data in the pixel coordinate system.
First, an internal reference matrix of the camera and a distortion coefficient of the camera are calculated. And calibrating by the camera to obtain an internal reference matrix of the camera and a distortion coefficient of the camera. Wherein, the internal reference matrix of the camera
A = [ fx 0 cx ; 0 fy cy ; 0 0 1 ]
where (cx, cy) represents the coordinates of the image center point, and fx, fy are the focal lengths expressed in units of pixels.
Among them, the distortion coefficient of the camera can be divided into radial distortion and tangential distortion.
Through the above calculation, the internal parameter matrix A of the camera, the distortion coefficient K of the camera, the initial external parameter matrix [R T] of the millimeter wave radar relative to the camera, and the coordinates (Xw, Yw, Zw) of the millimeter wave point cloud data in the world coordinate system are obtained. The joint calibration parameters may also be referred to as an external parameter matrix. The transmission transformation matrix is:
s·[u v 1]ᵀ = A·[R T]·[Xw Yw Zw 1]ᵀ    (1-6)
where

[R T] = [ r11 r12 r13 t1 ; r21 r22 r23 t2 ; r31 r32 r33 t3 ]    (1-7)
The internal parameter matrix A of the camera, the distortion coefficient K of the camera, the initial joint calibration parameter [R T] of the millimeter wave radar relative to the camera, and the coordinates (Xw, Yw, Zw) of the millimeter wave point cloud data in the world coordinate system are input into the transmission transformation matrix, and the coordinates (u, v) of the millimeter wave point cloud corresponding to the millimeter wave point cloud data in the pixel coordinate system are calculated.
The process of calculating the coordinates (u, v) of the millimeter wave point cloud corresponding to the millimeter wave point cloud data in the pixel coordinate system is as follows:
[x y z]ᵀ = [R T]·[Xw Yw Zw 1]ᵀ    (1-8)
x′=x/z y′=y/z (1-9)
x″ = x′·(1 + k1·r² + k2·r⁴ + k3·r⁶)/(1 + k4·r² + k5·r⁴ + k6·r⁶) + 2·p1·x′·y′ + p2·(r² + 2·x′²)    (1-10)
y″ = y′·(1 + k1·r² + k2·r⁴ + k3·r⁶)/(1 + k4·r² + k5·r⁴ + k6·r⁶) + p1·(r² + 2·y′²) + 2·p2·x′·y′    (1-11)
where r² = x′² + y′².
u = fx·x″ + cx    v = fy·y″ + cy    (1-12)
where k1, k2, k3, k4, k5, k6 represent the radial distortion coefficients, and p1, p2 represent the tangential distortion coefficients.
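The projection chain (1-8) to (1-12) mirrors the standard pinhole-plus-distortion camera model (as used, for example, in OpenCV's rational model). A minimal sketch, with illustrative names and the distortion coefficients defaulting to zero:

```python
import numpy as np

def project_point(Pw, RT, fx, fy, cx, cy, k=(0.0,) * 6, p=(0.0, 0.0)):
    """Project a world point (Xw, Yw, Zw) to pixel coordinates (u, v).

    RT: 3x4 external parameter matrix [R T]; k = (k1..k6) radial and
    p = (p1, p2) tangential distortion coefficients."""
    x, y, z = RT @ np.append(Pw, 1.0)                               # (1-8)
    xp, yp = x / z, y / z                                           # (1-9)
    r2 = xp ** 2 + yp ** 2
    k1, k2, k3, k4, k5, k6 = k
    p1, p2 = p
    radial = (1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3) / \
             (1 + k4 * r2 + k5 * r2 ** 2 + k6 * r2 ** 3)
    xpp = xp * radial + 2 * p1 * xp * yp + p2 * (r2 + 2 * xp ** 2)  # (1-10)
    ypp = yp * radial + p1 * (r2 + 2 * yp ** 2) + 2 * p2 * xp * yp  # (1-11)
    return fx * xpp + cx, fy * ypp + cy                             # (1-12)
```

The same routine serves both sensors: only the external parameter matrix [R T] differs between the millimeter wave radar branch and the laser radar branch.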
Step 244, extracting pixel points of the millimeter wave point cloud at the coordinates in the pixel coordinate system from the image to serve as the millimeter wave mapping point cloud.
After the coordinates (u, v) of the millimeter wave point cloud corresponding to the millimeter wave point cloud data in the pixel coordinate system are calculated, the millimeter wave point cloud corresponding to the millimeter wave point cloud data is mapped onto the image to obtain the millimeter wave mapping point cloud. Specifically, the pixel point of the point cloud at the coordinates (u, v) in the pixel coordinate system is extracted from the image as the millimeter wave mapping point cloud. When the millimeter wave radar and the camera are initially jointly calibrated, the millimeter wave point cloud corresponding to each selected piece of millimeter wave radar point cloud data is mapped onto the image to obtain the millimeter wave mapping point cloud.
In the embodiment of the application, firstly, coordinates of the millimeter wave point cloud data in the spherical coordinate system are converted into coordinates in the world coordinate system. Then, an internal reference matrix of the camera, a distortion coefficient of the camera, a first initial combined calibration parameter of the millimeter wave radar relative to the camera and coordinates of the millimeter wave point cloud data in a world coordinate system are input into the transmission transformation matrix, and coordinates of the millimeter wave point cloud corresponding to the millimeter wave point cloud data in a pixel coordinate system are calculated. And extracting pixel points of the millimeter wave point cloud at the coordinates in the pixel coordinate system from the image to serve as millimeter wave mapping point cloud. The millimeter wave point cloud and the image shot by the camera are matched in space, so that the millimeter wave radar, the laser radar and the camera can be conveniently calibrated in a follow-up mode.
In an embodiment, as shown in fig. 6, mapping the point cloud corresponding to the laser point cloud data onto the image according to the second initial joint calibration parameter of the laser radar relative to the camera to obtain a laser mapping point cloud, including:
and step 245, acquiring the coordinates of the laser point cloud data in a world coordinate system.
For the laser radar, because the world coordinate system is originally used by the laser radar, the coordinates of the laser point cloud data in the world coordinate system are directly acquired.
Step 246, inputting the internal reference matrix of the camera, the distortion coefficient of the camera, the second initial joint calibration parameter of the laser radar relative to the camera and the coordinates of the laser point cloud data in the world coordinate system into the transmission transformation matrix, and calculating the coordinates of the laser point cloud corresponding to the laser point cloud data in the pixel coordinate system.
Similar to the previous embodiment, camera calibration is used to obtain the internal reference matrix of the camera and the distortion coefficient of the camera. The calculation process of the second initial joint calibration parameter of the laser radar relative to the camera is similar to that of the first initial joint calibration parameter of the millimeter wave radar relative to the camera, and comprises the following steps: calculating the joint calibration parameter of the laser radar relative to the world coordinate system; calculating the joint calibration parameter of the camera relative to the world coordinate system; and calculating the second initial joint calibration parameter of the laser radar relative to the camera according to the joint calibration parameter of the laser radar relative to the world coordinate system and the joint calibration parameter of the camera relative to the world coordinate system.
And then, inputting an internal reference matrix of the camera, a distortion coefficient of the camera, a second initial joint calibration parameter of the laser radar relative to the camera and the coordinates of the laser point cloud data in a world coordinate system into the transmission transformation matrix, and calculating the coordinates of the laser point cloud corresponding to the laser point cloud data in a pixel coordinate system.
Specifically, the internal parameter matrix A of the camera, the distortion coefficient K of the camera, the initial external parameter matrix [R T] of the laser radar relative to the camera, and the coordinates (Xw, Yw, Zw) of the laser point cloud data in the world coordinate system are obtained through the above calculation. The transmission transformation matrix is shown in the above formula (1-6), and the initial external reference matrix [R T] is shown in the above formula (1-7).
The internal reference matrix A of the camera, the distortion coefficient K of the camera, the initial external reference matrix [R T] of the laser radar relative to the camera, and the coordinates (Xw, Yw, Zw) of the laser point cloud data in the world coordinate system are input into the transmission transformation matrix, and the coordinates (u, v) of the laser point cloud corresponding to the laser point cloud data in the pixel coordinate system are calculated.
The process of calculating the coordinates (u, v) of the laser point cloud corresponding to the laser point cloud data in the pixel coordinate system is shown in the above formulas (1-8) to (1-12), and is not described herein again.
And 247, extracting pixel points of the laser point cloud at the coordinates in the pixel coordinate system from the image to serve as laser mapping point cloud.
After the coordinates (u, v) of the laser point cloud corresponding to the laser point cloud data in the pixel coordinate system are calculated, the laser point cloud corresponding to the laser point cloud data is mapped onto the image to obtain the laser mapping point cloud. Specifically, the pixel point of the point cloud at the coordinates (u, v) in the pixel coordinate system is extracted from the image as the laser mapping point cloud. When the laser radar and the camera are initially jointly calibrated, the laser point cloud corresponding to each selected piece of laser radar point cloud data is mapped onto the image to obtain the laser mapping point cloud.
In the embodiment of the present application, first, the coordinates of the laser point cloud data in the world coordinate system are acquired directly, since the laser radar natively uses the world coordinate system. Then, the internal reference matrix of the camera, the distortion coefficient of the camera, the second initial joint calibration parameter of the laser radar relative to the camera, and the coordinates of the laser point cloud data in the world coordinate system are input into the transmission transformation matrix, and the coordinates of the laser point cloud corresponding to the laser point cloud data in the pixel coordinate system are calculated. The pixel points of the laser point cloud at those coordinates in the pixel coordinate system are extracted from the image as the laser mapping point cloud. This realizes the spatial matching between the laser point cloud and the image shot by the camera, thereby facilitating the subsequent joint calibration of the millimeter wave radar, the laser radar, and the camera.
In one embodiment, before calculating the overlapping areas between the millimeter wave point cloud target frame and the laser point cloud target frame of the calibration object on the image and the image recognition target frame respectively, the method further comprises the following steps:
aiming at each calibration object on the image, drawing a millimeter wave point cloud target frame on the image by taking the millimeter wave mapping point cloud as a center and a preset size; acquiring a circumscribed rectangular frame of laser mapping point cloud corresponding to a target on the image as a laser point cloud target frame; carrying out image recognition on the image to obtain an image recognition target frame;
and acquiring a millimeter wave point cloud target frame, a laser point cloud target frame and an image identification target frame of the calibration object on the image.
Specifically, a millimeter wave point cloud target frame is drawn for each millimeter wave mapping point cloud on the image, taking the millimeter wave mapping point cloud as the center and drawing the frame with a preset size. For example, a millimeter wave point cloud target frame (circular frame) can be drawn taking the millimeter wave mapping point cloud as the center and p as a preset radius; or a millimeter wave point cloud target frame (rectangular frame) can be drawn taking the millimeter wave mapping point cloud as the center, l as the length, and w as the width.
The laser radar generally acquires a plurality of laser point clouds for a calibration object (target), and then a plurality of laser mapping point clouds are correspondingly obtained for one target after the laser point clouds are mapped on an image. Therefore, the minimum circumscribed rectangle frame of the laser mapping point cloud can be obtained on the image to form a laser point cloud target frame.
For the image obtained by shooting with the camera, the target is recognized through an image recognition technology to obtain the image recognition target frame. Similarly, the shape of the image recognition target frame here may be a circular frame or a rectangular frame; the shape of the target frame is not limited in the present application. For ease of comparison, the point cloud target frame and the image recognition target frame preferably have the same shape.
Since the image captured by the camera includes a plurality of targets, the millimeter wave point cloud target frame, the laser point cloud target frame, and the image recognition target frame corresponding to the same calibration object (the same target) need to be acquired from the image by matching the millimeter wave point cloud target frames, laser point cloud target frames, and image recognition target frames on the image.
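For rectangular target frames, the overlapping area used in the following steps reduces to the intersection of two axis-aligned rectangles. A minimal sketch (the frame representation (x_min, y_min, x_max, y_max) is an illustrative choice, not specified in this application):

```python
def rect_overlap_area(a, b):
    """Overlap area of two axis-aligned pixel rectangles
    given as (x_min, y_min, x_max, y_max)."""
    w = min(a[2], b[2]) - max(a[0], b[0])  # overlap width (<= 0 if disjoint)
    h = min(a[3], b[3]) - max(a[1], b[1])  # overlap height (<= 0 if disjoint)
    return max(0, w) * max(0, h)
```

The same computation applies whether the second frame is the laser point cloud target frame or the image recognition target frame, yielding the first and second overlapping areas respectively.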
In the embodiment of the application, the millimeter wave mapping point cloud is taken as the center on the image, and the millimeter wave point cloud target frame is drawn in a preset size. And acquiring a circumscribed rectangular frame of the laser mapping point cloud corresponding to the target on the image as a laser point cloud target frame, and performing image recognition on the image to obtain an image recognition target frame. And acquiring a millimeter wave point cloud target frame, a laser point cloud target frame and an image identification target frame corresponding to the calibration object. The millimeter wave point cloud target frame and the laser point cloud target frame of the calibration object are matched with the image identification target frame of the calibration object on the image shot by the camera in space, so that millimeter wave radar, laser radar and camera combined calibration can be performed according to the matching degree of the target frames conveniently in the follow-up process.
In one embodiment, adjusting the first initial joint calibration parameter and the second initial joint calibration parameter based on the overlapping area corresponding to each calibration object until the overlapping area satisfies a preset threshold, and using the adjusted first joint calibration parameter and second joint calibration parameter as target joint calibration parameters for joint calibration of the millimeter wave radar, the laser radar, and the camera includes:
calculating a first overlapping area between a millimeter wave point cloud target frame and an image recognition target frame of the same calibration object on the image, and calculating a second overlapping area between a laser point cloud target frame and an image recognition target frame of the same calibration object on the image;
adjusting the first initial joint calibration parameter based on the first overlapping area corresponding to each calibration object until the first overlapping area meets a first preset threshold, and outputting the adjusted first joint calibration parameter;
adjusting the second initial combined calibration parameters based on the second overlapping area corresponding to each calibration object until the second overlapping area meets a second preset threshold, and outputting adjusted second combined calibration parameters;
and taking the adjusted first combined calibration parameter and the adjusted second combined calibration parameter as target combined calibration parameters for combined calibration of the millimeter wave radar, the laser radar and the camera.
In the embodiment of the application, first, a first overlapping area and a second overlapping area are calculated for the same calibration object; and secondly, adjusting a first initial combined calibration parameter based on the first overlapping area, and adjusting a second initial combined calibration parameter based on the second overlapping area. And finally, taking the adjusted first combined calibration parameter and the adjusted second combined calibration parameter as target combined calibration parameters for combined calibration of the millimeter wave radar, the laser radar and the camera. The self-calibration is realized through the millimeter wave radar, the laser radar and the camera, a specific calibration object is not needed, and the waste of manpower and material resources is avoided. And finally, the efficiency and reusability of the calibration process are improved.
In one embodiment, adjusting the first initial joint calibration parameter based on the first overlap area corresponding to each calibration object until the first overlap area satisfies the first preset threshold, and outputting the adjusted first joint calibration parameter includes:
and constructing a first objective function based on the first overlapping area corresponding to each calibration object, adjusting the first initial combined calibration parameter through the first objective function until the value of the first objective function meets a first preset threshold, and outputting the adjusted first combined calibration parameter.
Specifically, the objective function is also called a loss function (loss function) or a cost function (cost function). And after the server maps the millimeter wave point cloud corresponding to the millimeter wave point cloud data to the image to obtain the millimeter wave mapping point cloud, acquiring a millimeter wave point cloud target frame and an image recognition target frame of the calibration object on the image. Calculating a first overlapping area between a millimeter wave point cloud target frame of a calibration object on an image and an image recognition target frame, then constructing a first target function based on the first overlapping area, adjusting a first initial combined calibration parameter through the first target function until the value of the first target function meets a first preset threshold, and outputting the adjusted first combined calibration parameter.
For the laser radar, similarly, a second overlapping area between the laser point cloud target frame of the calibration object on the image and the image recognition target frame is calculated, a second target function is constructed based on the second overlapping area, a second initial combined calibration parameter is adjusted through the second target function until the value of the second target function meets a second preset threshold, and the adjusted second combined calibration parameter is output.
In the embodiment of the application, the larger the overlapping area is, the higher the accuracy of the spatial alignment between the millimeter wave point cloud and the image shot by the camera is. The accuracy of the first initial joint calibration parameter of the millimeter wave radar relative to the camera can be verified through the first objective function constructed from the first overlapping area; the first initial joint calibration parameter is continuously optimized until the value of the first objective function meets the first preset threshold, and the adjusted joint calibration parameter is output. The adjusted second joint calibration parameter for the laser radar is calculated in the same way. Self-calibration is realized through the millimeter wave radar, the laser radar, and the camera, no specific calibration object is needed, and the waste of manpower and material resources is avoided. Finally, the efficiency and reusability of the calibration process are improved.
In one embodiment, constructing the first objective function based on the first overlap area comprises:
calculating the ratio of the first overlapping area corresponding to each calibration object to the area of the millimeter wave point cloud target frame or the image identification frame;
calculating the mean, variance or standard deviation of the ratio corresponding to each calibration object;
a first objective function is constructed based on the mean, variance, or standard deviation.
Specifically, for each calibration object, the first overlapping area between its millimeter wave point cloud target frame and image recognition target frame on the image is calculated, and then the ratio R of the first overlapping area to the area of the millimeter wave point cloud target frame or image recognition frame corresponding to that calibration object is calculated. For example, assuming that 10 calibration objects (targets) are selected from the image, the ratio R of the overlapping area to the area of the millimeter wave point cloud target frame is calculated for each target, yielding 10 data values R1–R10. The mean M, variance V, or standard deviation SD is then calculated over the 10 data values R1–R10.
After the mean M, the variance V, or the standard deviation SD is calculated, a first objective function may be constructed based on the mean M. For example, the first objective function is constructed as follows:
Mk+1 = Mk − γk·Gk    (1-13)
where Mk+1 and Mk denote the mean values at successive iterations, γk denotes the step size, and Gk denotes the gradient.
For lidar, the process of constructing the second objective function based on the second overlap area is also similar to the process of constructing the first objective function based on the first overlap area.
In the embodiment of the application, the ratio of the first overlapping area to the area of the millimeter wave point cloud target frame is calculated, the mean value, the variance or the standard deviation of the ratio of the first overlapping area corresponding to each calibration object to the area of the millimeter wave point cloud target frame or the image identification frame is calculated, and then the first target function is constructed based on the mean value, the variance or the standard deviation. For multiple groups of millimeter wave point cloud data selected at the beginning and corresponding images, calculating the mean value, the variance or the standard deviation of the ratio of the first overlapping area corresponding to each calibration object on the image to the area of the millimeter wave point cloud target frame or the image identification frame for each frame of image, and then constructing a first target function based on the mean value, the variance or the standard deviation. Similarly, a second objective function is constructed based on the second overlap area. Therefore, the sample data for constructing the target function is abundant, and the accuracy of the constructed target function is improved to a certain extent. And outputting the final target joint calibration parameters of the millimeter wave radar, the laser radar and the camera joint calibration based on the first target function and the second target function respectively.
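The per-calibration-object ratios and their statistics described above can be sketched as follows (illustrative names; the choice among mean, variance, and standard deviation is left to the objective-function design):

```python
import numpy as np

def overlap_ratio_stats(overlap_areas, frame_areas):
    """Ratios R_i = overlap area / target-frame area for each calibration
    object, with the mean M, variance V, and standard deviation SD over
    all objects, as used to build the objective function."""
    R = np.asarray(overlap_areas, float) / np.asarray(frame_areas, float)
    return R.mean(), R.var(), R.std()
```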
In one embodiment, adjusting the first initial joint calibration parameters by the first objective function until the value of the first objective function satisfies a first preset threshold, and outputting the adjusted first joint calibration parameters includes:
calculating the value of the first objective function by adopting a gradient descent algorithm;
and adjusting the first initial combined calibration parameter according to the value of the first objective function until the value of the first objective function meets a first preset threshold, and outputting the adjusted first combined calibration parameter.
Specifically, the value of the first objective function may be calculated using a gradient descent algorithm, as shown in equation (1-13) above. The gradient descent algorithm may be applied to linear regression problems or nonlinear regression problems; it is used to find the minimum value of the objective function through iteration, or to converge to the minimum value.
Gradient descent is a type of iterative method that can be used to solve least squares problems, and it is one of the most commonly used methods for solving the parameters of neural networks. When solving for the minimum value of the loss function, the gradient descent method may be used to solve iteratively, step by step, obtaining the minimized loss function and the parameter values; that is, the loss function is optimized until its value is smaller than the preset threshold. Adam is a first-order optimization algorithm that can replace the traditional stochastic gradient descent process; it iteratively updates each weight of a neural network based on training data and can design independent adaptive learning rates for different parameters. The specific formulas of Adam are as follows:
mt = β1·mt−1 + (1 − β1)·Gt    vt = β2·vt−1 + (1 − β2)·Gt²

θt+1 = θt − α·m̂t/(√v̂t + ε)

where t denotes the iteration number, m̂t = mt/(1 − β1^t) is the bias correction of mt, v̂t = vt/(1 − β2^t) is the bias correction of vt, β1 and β2 are constants that control the exponential decay, mt is the exponential moving average of the gradient, obtained from the first moment of the gradient Gt, vt is the squared gradient, obtained from the second moment of the gradient Gt, and α and ε are both coefficients.
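A minimal sketch of one Adam iteration matching the formulas above (the hyperparameter defaults are the commonly used values, not values taken from this application):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, alpha=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: moving averages of the first/second gradient
    moments, bias corrections m_hat/v_hat, then the parameter step."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

Iterating this step on a toy loss such as f(θ) = θ² drives θ toward 0, mirroring how the joint calibration parameters are pushed toward the values that optimize the objective function.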
Specifically, after the value of the first objective function is calculated by using the gradient descent algorithm, because a first preset threshold is preset for the value of the first objective function, the first initial joint calibration parameter is adjusted according to the value of the first objective function until the value of the first objective function meets the first preset threshold, and the adjusted first joint calibration parameter is output. Similarly, adjusting the second initial combined calibration parameter by the second objective function until the value of the second objective function satisfies a second preset threshold, and outputting the adjusted second combined calibration parameter, including: calculating the value of a second objective function by adopting a gradient descent algorithm; and adjusting the second initial combined calibration parameter according to the value of the second objective function until the value of the second objective function meets a second preset threshold, and outputting the adjusted second combined calibration parameter. Of course, in the present application, the value of the objective function may be calculated by using any one of a gradient descent algorithm and an adaptive learning rate algorithm, or by using another optimization algorithm, which is not limited in the present application.
In the embodiment of the application, the larger the overlapping area is, the higher the accuracy rate of the spatial alignment between the millimeter wave point cloud and the image shot by the camera is. And calculating the ratio of the overlapping area to the area of the point cloud target frame, calculating the average value of the ratio of the overlapping area corresponding to each calibration object (target) to the area of the point cloud target frame, and constructing a target function based on the average value. And adjusting the initial combined calibration parameters according to the value of the target function until the value of the target function meets a preset threshold, and taking the adjusted first combined calibration parameters and the adjusted second combined calibration parameters as target combined calibration parameters for combined calibration of the millimeter wave radar, the laser radar and the camera. The method realizes that the millimeter wave point cloud, the laser point cloud and the image shot by the camera are continuously aligned in space through the overlapping area until the value of the target function meets the preset threshold value, and outputs the adjusted first and second combined calibration parameters as target combined calibration parameters for the combined calibration of the millimeter wave radar, the laser radar and the camera. Finally, self-calibration is achieved through the millimeter wave radar, the laser radar and the camera, a specific calibration object does not need to be used, and waste of manpower and material resources is avoided.
In one embodiment, a millimeter wave radar, laser radar and camera combined calibration method is provided, which further includes: and adjusting the millimeter wave radar, the laser radar and the camera to a preset angle, and fixing the relative positions of the millimeter wave radar, the laser radar and the camera.
In the embodiment of the application, before the millimeter wave radar, the laser radar and the camera are calibrated, they are adjusted to a preset angle and their relative positions are fixed. This makes it convenient to subsequently convert between the spherical coordinate system of the millimeter wave radar and the camera coordinate system through the world coordinate system, or among the world coordinate system, the camera coordinate system and the pixel coordinate system. The preset angle is an angle at which the detection regions of the millimeter wave radar, the laser radar and the camera share a large overlapping area. For example, the millimeter wave radar may be adjusted to a downward inclination of 3° to 5°, the camera to a downward inclination of 5° to 15°, and the laser radar mounted horizontally. Of course, the present application is not limited thereto.
In a specific embodiment, as shown in fig. 7, there is provided a millimeter wave radar, lidar and camera combined calibration method, including:
step 702, adjusting the millimeter wave radar, the laser radar and the camera to proper angles, and assembling a fusion suite of the millimeter wave radar, the laser radar and the camera;
step 704, acquiring images acquired at the same time and in the same scene, and millimeter wave point cloud data and laser point cloud data corresponding to the images;
step 706, calculating a first initial external parameter matrix of the millimeter wave radar relative to the camera and a second initial external parameter matrix of the laser radar relative to the camera;
step 708, based on the camera intrinsic matrix, the camera distortion coefficients and the first initial external parameter matrix, mapping the millimeter wave point cloud onto the image according to the perspective transformation principle to obtain the two-dimensional coordinates of each millimeter wave mapping point cloud on the image;
step 710, based on the camera intrinsic matrix, the camera distortion coefficients and the second initial external parameter matrix, mapping the laser point cloud onto the image according to the perspective transformation principle to obtain the two-dimensional coordinates of each laser mapping point cloud on the image;
step 712, drawing a millimeter wave point cloud target frame of a preset size on the image centered on each millimeter wave mapping point cloud, obtaining the laser point cloud target frame formed by the laser mapping point cloud on the image, and performing image recognition on the image to obtain an image recognition target frame;
step 714, calculating a first overlapping area between the millimeter wave point cloud target frame and the image recognition target frame of the calibration object on the image;
step 716, calculating the ratio of the first overlapping area to the area of the millimeter wave point cloud target frame; and calculating a first mean value M1 of the ratios of the first overlapping areas corresponding to the respective targets to the areas of their millimeter wave point cloud target frames;
step 718, calculating the second overlapping area between the laser point cloud target frame of the calibration object on the image and the image recognition target frame;
step 720, calculating the ratio of the second overlapping area to the area of the laser point cloud target frame; and calculating a second mean value M2 of the ratios of the second overlapping areas corresponding to the respective targets to the areas of their laser point cloud target frames;
step 722, determining whether the first mean value M1 is greater than or equal to 25% and whether the second mean value M2 is greater than or equal to 70%; if so, performing step 724; if not, performing step 726;
step 724, when both conditions hold, the joint calibration precision of the millimeter wave radar, the laser radar and the camera meets the preset condition, and the current first external parameter matrix and second external parameter matrix are output as the target external parameter matrices for joint calibration of the millimeter wave radar, the laser radar and the camera, for subsequent fusion of the millimeter wave point cloud data and the image data;
step 726, when either condition fails, the joint calibration precision of the millimeter wave radar, the laser radar and the camera does not meet the preset condition; the first initial external parameter matrix is updated with the current first external parameter matrix, the second initial external parameter matrix is updated with the current second external parameter matrix, and the process returns to step 708 to perform the joint calibration again, until the first mean value is greater than or equal to 25% and the second mean value is greater than or equal to 70%, whereupon step 724 is performed.
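The iterative structure of steps 708 to 726 can be sketched as the loop below. The `evaluate` and `refine` callbacks are hypothetical stand-ins for the projection/overlap computation and the parameter update respectively; the 25% and 70% thresholds are the ones given in step 722.

```python
def joint_calibration(mmw_extr, lidar_extr, evaluate, refine, max_iter=50):
    """Skeleton of steps 708-726: evaluate the mean overlap ratios M1
    and M2 for the current external parameter matrices and refine them
    until M1 >= 25% and M2 >= 70%."""
    for _ in range(max_iter):
        m1, m2 = evaluate(mmw_extr, lidar_extr)
        if m1 >= 0.25 and m2 >= 0.70:            # step 722 satisfied
            return mmw_extr, lidar_extr          # step 724: output target matrices
        mmw_extr, lidar_extr = refine(mmw_extr, lidar_extr)  # step 726
    raise RuntimeError("joint calibration did not converge")

# Toy usage with scalars standing in for the external parameter matrices:
# the "ratios" simply equal the parameters, and each refinement adds 0.25
result = joint_calibration(
    0.0, 0.0,
    evaluate=lambda a, b: (a, b),
    refine=lambda a, b: (a + 0.25, b + 0.25))
```

Note the asymmetric thresholds: the laser point cloud is denser and projects more accurately than the sparse millimeter wave returns, which is consistent with requiring a higher mean ratio for the laser target frames.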
In the embodiment of the application, the point cloud corresponding to the millimeter wave point cloud data is mapped onto the image to obtain the millimeter wave mapping point cloud, and the point cloud corresponding to the laser point cloud data is mapped onto the image to obtain the laser mapping point cloud. The first overlapping area between the millimeter wave point cloud target frame and the image recognition target frame of the calibration object on the image, and the second overlapping area between the laser point cloud target frame and the image recognition target frame, are calculated respectively. The first initial external parameter matrix is then adjusted continually based on the first overlapping area, and the second initial external parameter matrix based on the second overlapping area, until the overlapping areas meet the preset thresholds, whereupon the adjusted first and second external parameter matrices are output as the target external parameter matrices for joint calibration of the millimeter wave radar, the laser radar and the camera. Self-calibration is performed by the millimeter wave radar, the laser radar and the camera themselves, without a dedicated calibration object, which avoids wasting manpower and materials and also improves reusability.
In one embodiment, as shown in fig. 8, there is provided a millimeter wave radar, lidar and camera combined calibration apparatus 800, including:
a data acquisition module 820, configured to acquire images, millimeter wave point cloud data, and laser point cloud data acquired at the same time and in the same scene;
the point cloud mapping module 840 is used for mapping the point cloud corresponding to the millimeter wave point cloud data to the image to obtain a millimeter wave mapping point cloud according to a first initial joint calibration parameter of the millimeter wave radar relative to the camera, and mapping the point cloud corresponding to the laser point cloud data to the image to obtain a laser mapping point cloud according to a second initial joint calibration parameter of the laser radar relative to the camera;
an overlap area calculation module 860, configured to calculate the overlapping areas between the millimeter wave point cloud target frame of a calibration object on the image and the image recognition target frame, and between the laser point cloud target frame and the image recognition target frame, respectively, where the calibration object is at least one target in the overlapping detection area of the camera and the millimeter wave radar;
and a target joint calibration parameter output module 880, configured to adjust the first initial joint calibration parameter and the second initial joint calibration parameter based on the overlap area corresponding to each calibration object until the overlap area meets a preset threshold, and use the adjusted first joint calibration parameter and second joint calibration parameter as target joint calibration parameters for joint calibration of the millimeter wave radar, the laser radar, and the camera.
In one embodiment, as shown in fig. 9, there is provided a millimeter wave radar, lidar and camera combined calibration apparatus 800, further comprising:
the first initial joint calibration parameter calculation module 830 is configured to convert coordinates of the millimeter wave radar in the spherical coordinate system into coordinates of the millimeter wave radar in the world coordinate system; converting the coordinates of the camera in a camera coordinate system into the coordinates of the camera in a world coordinate system; and calculating a first initial combined calibration parameter of the millimeter wave radar relative to the camera according to the coordinate of the millimeter wave radar in the world coordinate system and the coordinate of the camera in the world coordinate system.
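The first conversion this module performs — from the millimeter wave radar's spherical coordinate system to Cartesian world coordinates — can be sketched as below. The axis convention (x forward, y left, z up) is an assumption; the patent does not fix one.

```python
import math

def spherical_to_cartesian(r, azimuth, elevation):
    """Convert a millimeter wave radar return given as (range, azimuth,
    elevation), with angles in radians, into Cartesian coordinates."""
    x = r * math.cos(elevation) * math.cos(azimuth)  # forward
    y = r * math.cos(elevation) * math.sin(azimuth)  # left
    z = r * math.sin(elevation)                      # up
    return x, y, z

# A return 10 m straight ahead lies on the x axis
x, y, z = spherical_to_cartesian(10.0, 0.0, 0.0)
```

A rigid-body transform (rotation plus translation, determined by the fixed mounting positions) would then carry these radar-frame coordinates into the shared world coordinate system in which the camera pose is also expressed.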
In one embodiment, the point cloud mapping module 840 includes:
the coordinate conversion unit is used for converting the coordinates of the millimeter wave point cloud data in the spherical coordinate system into coordinates in a world coordinate system;
the perspective transformation unit is used for inputting the intrinsic matrix of the camera, the distortion coefficients of the camera, the first initial joint calibration parameter of the millimeter wave radar relative to the camera and the coordinates of the millimeter wave point cloud data in the world coordinate system into the perspective transformation matrix, and calculating the coordinates in the pixel coordinate system of the millimeter wave point cloud corresponding to the millimeter wave point cloud data;
and the millimeter wave mapping point cloud obtaining unit is used for extracting pixel points of the millimeter wave point cloud at the coordinates in the pixel coordinate system from the image to be used as the millimeter wave mapping point cloud.
In one embodiment, the point cloud mapping module 840 is further configured to obtain the coordinates of the laser point cloud data in the world coordinate system; input the intrinsic matrix of the camera, the distortion coefficients of the camera, the second initial joint calibration parameter of the laser radar relative to the camera and the coordinates of the laser point cloud data in the world coordinate system into the perspective transformation matrix, and calculate the coordinates in the pixel coordinate system of the laser point cloud corresponding to the laser point cloud data; and extract from the image the pixel points at those coordinates in the pixel coordinate system as the laser mapping point cloud.
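The world-to-pixel mapping both branches rely on is a standard pinhole projection. The sketch below applies the extrinsics (the joint calibration parameters) and the intrinsic matrix; the distortion-coefficient correction the patent additionally feeds in is omitted for brevity.

```python
import numpy as np

def project_points(points_world, K, R, t):
    """Project 3-D world points into pixel coordinates: world -> camera
    via the extrinsics (R, t), then camera -> pixels via the intrinsic
    matrix K, with a final homogeneous divide."""
    pts = np.asarray(points_world, dtype=float)   # (N, 3)
    cam = R @ pts.T + t.reshape(3, 1)             # world -> camera coordinates
    uvw = K @ cam                                 # apply intrinsics
    return (uvw[:2] / uvw[2]).T                   # homogeneous divide -> (N, 2)

# Toy usage: with identity extrinsics, a point on the optical axis lands
# on the principal point (320, 240)
K = np.array([[100.0, 0.0, 320.0],
              [0.0, 100.0, 240.0],
              [0.0, 0.0, 1.0]])
uv = project_points([[0.0, 0.0, 1.0], [1.0, 0.0, 2.0]],
                    K, np.eye(3), np.zeros(3))
```

The resulting pixel coordinates are where the mapping point clouds — millimeter wave or laser — fall on the image, from which the corresponding pixel points are extracted.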
In one embodiment, a millimeter wave radar, lidar and camera combined calibration apparatus 800 is provided, further comprising: a target frame determining module, configured to, for each calibration object on the image, draw a millimeter wave point cloud target frame of a preset size on the image centered on the millimeter wave mapping point cloud; acquire the circumscribed rectangular frame of the laser mapping point cloud corresponding to the target on the image as the laser point cloud target frame; perform image recognition on the image to obtain an image recognition target frame; and acquire the millimeter wave point cloud target frame, the laser point cloud target frame and the image recognition target frame of the calibration object on the image.
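The two kinds of target frame differ in construction: the sparse millimeter wave points get a fixed-size frame around each mapped point, while the dense laser points get their bounding rectangle. A sketch, where the 40×40-pixel preset size is an arbitrary example, not a value from the patent:

```python
def mmw_target_frame(center, size=(40.0, 40.0)):
    """Millimeter wave point cloud target frame: a frame of preset size
    centered on the mapped point, returned as (x1, y1, x2, y2)."""
    cx, cy = center
    w, h = size
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

def lidar_target_frame(points):
    """Laser point cloud target frame: the circumscribed (axis-aligned
    bounding) rectangle of the mapped laser points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

# Toy usage
mmw_box = mmw_target_frame((100.0, 100.0))                # -> (80, 80, 120, 120)
lidar_box = lidar_target_frame([(1, 2), (5, 9), (3, 4)])  # -> (1, 2, 5, 9)
```

The image recognition target frame would come from a separate detector run on the camera image; these frames are then intersected to obtain the overlapping areas.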
In one embodiment, the target joint calibration parameter output module 880 is further configured to calculate a first overlapping area between the millimeter-wave point cloud target frame and the image recognition target frame of the same calibration object on the image, and calculate a second overlapping area between the laser point cloud target frame and the image recognition target frame of the same calibration object on the image; adjusting a first initial combined calibration parameter based on a first overlapping area corresponding to each calibration object until the first overlapping area meets a first preset threshold, and outputting the adjusted first combined calibration parameter; adjusting a second initial combined calibration parameter based on a second overlapping area corresponding to each calibration object until the second overlapping area meets a second preset threshold, and outputting the adjusted second combined calibration parameter; and taking the adjusted first combined calibration parameter and the adjusted second combined calibration parameter as target combined calibration parameters for combined calibration of the millimeter wave radar, the laser radar and the camera.
In an embodiment, the target joint calibration parameter output module 880 is further configured to construct a first objective function based on the first overlap area corresponding to each calibration object, adjust the first initial joint calibration parameter through the first objective function until the value of the first objective function meets a first preset threshold, and output the adjusted first joint calibration parameter.
In one embodiment, the target joint calibration parameter output module 880 is further configured to calculate a ratio of the first overlapping area corresponding to each calibration object to the area of the millimeter wave point cloud target frame; calculating the mean, variance or standard deviation of the ratio corresponding to each calibration object; a first objective function is constructed based on the mean, variance, or standard deviation.
In one embodiment, the target joint calibration parameter output module 880 is further configured to calculate a value of the first target function using a gradient descent algorithm; and adjusting the first initial combined calibration parameter according to the value of the first objective function until the value of the first objective function meets a first preset threshold, and outputting the adjusted first combined calibration parameter.
In one embodiment, a millimeter wave radar, lidar and camera combined calibration apparatus 800 is provided, further comprising: and the fixing module is used for adjusting the millimeter wave radar, the laser radar and the camera to a preset angle and fixing the relative positions of the millimeter wave radar, the laser radar and the camera.
The division of each module in the calibration apparatus combining the millimeter wave radar, the laser radar and the camera is only used for illustration, and in other embodiments, the calibration apparatus combining the millimeter wave radar, the laser radar and the camera may be divided into different modules as required to complete all or part of the functions of the calibration apparatus combining the millimeter wave radar, the laser radar and the camera.
Fig. 10 is a schematic diagram of the internal configuration of a server in one embodiment. As shown in fig. 10, the server includes a processor and a memory connected by a system bus. The processor provides computing and control capability to support the operation of the whole server. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the millimeter wave radar, lidar and camera joint calibration method provided in the embodiments of the present application. The internal memory provides a cached execution environment for the operating system and the computer program in the non-volatile storage medium. The server may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
The implementation of each module in the millimeter wave radar, laser radar and camera joint calibration apparatus provided in the embodiments of the present application may take the form of a computer program. The computer program may run on a terminal or a server, and the program modules constituted by it may be stored in the memory of the terminal or server. When the computer program is executed by a processor, the steps of the methods described in the embodiments of the present application are performed.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the millimeter wave radar, lidar and camera joint calibration method.
A computer program product containing instructions which, when run on a computer, cause the computer to perform a millimeter wave radar, lidar and camera joint calibration method.
Any reference to memory, storage, database, or other medium used by embodiments of the present application may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above examples express only several embodiments of the present application, and while their descriptions are specific and detailed, they are not to be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (13)

1. A millimeter wave radar, laser radar and camera combined calibration method is characterized by comprising the following steps:
acquiring images, millimeter wave point cloud data and laser point cloud data which are acquired at the same time and in the same scene;
mapping the point cloud corresponding to the millimeter wave point cloud data to the image to obtain millimeter wave mapping point cloud according to a first initial joint calibration parameter of the millimeter wave radar relative to the camera, and mapping the point cloud corresponding to the laser point cloud data to the image to obtain laser mapping point cloud according to a second initial joint calibration parameter of the laser radar relative to the camera;
calculating the overlapping area between a millimeter wave point cloud target frame and a laser point cloud target frame of a calibration object on the image and an image identification target frame respectively, wherein the calibration object is at least one target in the overlapping detection area of the camera and the millimeter wave radar;
and adjusting the first initial combined calibration parameter and the second initial combined calibration parameter based on the overlapping area corresponding to each calibration object until the overlapping area meets a preset threshold, and taking the adjusted first combined calibration parameter and second combined calibration parameter as target combined calibration parameters for combined calibration of the millimeter wave radar, the laser radar and the camera.
2. The method of claim 1, wherein the calculation of the first initial joint calibration parameter of the millimeter wave radar relative to the camera comprises:
converting the coordinate of the millimeter wave radar in a spherical coordinate system into the coordinate of the millimeter wave radar in a world coordinate system;
converting the coordinates of the camera in a camera coordinate system to the coordinates of the camera in the world coordinate system;
and calculating a first initial joint calibration parameter of the millimeter wave radar relative to the camera according to the coordinate of the millimeter wave radar in the world coordinate system and the coordinate of the camera in the world coordinate system.
3. The method according to claim 1 or 2, wherein the mapping the point cloud corresponding to the millimeter wave point cloud data onto the image according to the first initial joint calibration parameter of the millimeter wave radar relative to the camera to obtain a millimeter wave mapped point cloud comprises:
converting the coordinates of the millimeter wave point cloud data in a spherical coordinate system into coordinates in a world coordinate system;
inputting an internal reference matrix of the camera, a distortion coefficient of the camera, a first initial joint calibration parameter of the millimeter wave radar relative to the camera and a coordinate of the millimeter wave point cloud data in a world coordinate system into a perspective transformation matrix, and calculating the coordinate of the millimeter wave point cloud corresponding to the millimeter wave point cloud data in a pixel coordinate system;
and extracting pixel points of the millimeter wave point cloud at the coordinates in a pixel coordinate system from the image to serve as millimeter wave mapping point cloud.
4. The method of claim 1 or 2, wherein mapping the point cloud corresponding to the laser point cloud data onto the image according to a second initial joint calibration parameter of the lidar relative to the camera to obtain a laser mapped point cloud, comprises:
acquiring coordinates of the laser point cloud data in a world coordinate system;
inputting an internal reference matrix of the camera, a distortion coefficient of the camera, a second initial joint calibration parameter of the laser radar relative to the camera and a coordinate of the laser point cloud data in a world coordinate system into a perspective transformation matrix, and calculating the coordinate of the laser point cloud corresponding to the laser point cloud data in a pixel coordinate system;
and extracting pixel points of the laser point cloud at the coordinates in a pixel coordinate system from the image to serve as laser mapping point cloud.
5. The method of claim 1 or 2, wherein, before the calculating of the overlapping areas between the millimeter wave point cloud target frame and the laser point cloud target frame of the calibration object on the image and the image recognition target frame, the method further comprises:
for each calibration object on the image, drawing a millimeter wave point cloud target frame of a preset size on the image centered on the millimeter wave mapping point cloud; acquiring the circumscribed rectangular frame of the laser mapping point cloud corresponding to the target on the image as a laser point cloud target frame; and carrying out image recognition on the image to obtain an image recognition target frame;
and acquiring a millimeter wave point cloud target frame, a laser point cloud target frame and an image identification target frame of the calibration object on the image.
6. The method according to claim 1 or 2, wherein the adjusting the first initial combined calibration parameter and the second initial combined calibration parameter based on the overlapping area corresponding to each calibration object until the overlapping area satisfies a preset threshold value, and the using the adjusted first combined calibration parameter and second combined calibration parameter as a target combined calibration parameter for combined calibration of the millimeter wave radar, the laser radar, and the camera includes:
calculating a first overlapping area between a millimeter wave point cloud target frame and an image recognition target frame of the same calibration object on the image, and calculating a second overlapping area between a laser point cloud target frame and an image recognition target frame of the same calibration object on the image;
adjusting the first initial joint calibration parameter based on the first overlapping area corresponding to each calibration object until the first overlapping area meets a first preset threshold, and outputting the adjusted first joint calibration parameter;
adjusting the second initial combined calibration parameters based on the second overlapping area corresponding to each calibration object until the second overlapping area meets a second preset threshold, and outputting adjusted second combined calibration parameters;
and taking the adjusted first combined calibration parameter and the adjusted second combined calibration parameter as target combined calibration parameters for combined calibration of the millimeter wave radar, the laser radar and the camera.
7. The method according to claim 6, wherein the adjusting the first initial joint calibration parameter based on the first overlap area corresponding to each calibration object until the first overlap area satisfies a first preset threshold value, and outputting the adjusted first joint calibration parameter comprises:
and constructing a first objective function based on the first overlapping area corresponding to each calibration object, adjusting the first initial combined calibration parameter through the first objective function until the value of the first objective function meets a first preset threshold, and outputting the adjusted first combined calibration parameter.
8. The method of claim 7, wherein constructing a first objective function based on the first overlap area for each calibration object comprises:
calculating the ratio of the first overlapping area corresponding to each calibration object to the area of the millimeter wave point cloud target frame or the image identification frame;
calculating the mean, variance or standard deviation of the ratio corresponding to each calibration object;
a first objective function is constructed based on the mean, variance, or standard deviation.
9. The method as claimed in claim 7, wherein said adjusting the first initial joint calibration parameters by the first objective function until the value of the first objective function satisfies a first preset threshold value, and outputting the adjusted first joint calibration parameters comprises:
calculating the value of the first objective function by adopting a gradient descent algorithm;
and adjusting the first initial joint calibration parameter according to the value of the first objective function until the value of the first objective function meets a first preset threshold, and outputting the adjusted first joint calibration parameter.
10. The method according to claim 1 or 2, characterized in that the method further comprises:
and adjusting the millimeter wave radar, the laser radar and the camera to a preset angle, and fixing the relative positions of the millimeter wave radar, the laser radar and the camera.
11. A millimeter wave radar, laser radar and camera joint calibration apparatus, characterized by comprising:
the data acquisition module is used for acquiring images, millimeter wave point cloud data and laser point cloud data acquired at the same time and in the same scene;
the point cloud mapping module is used for mapping the point cloud corresponding to the millimeter wave point cloud data to the image according to a first initial joint calibration parameter of the millimeter wave radar relative to the camera to obtain millimeter wave mapping point cloud, and mapping the point cloud corresponding to the laser point cloud data to the image according to a second initial joint calibration parameter of the laser radar relative to the camera to obtain laser mapping point cloud;
the overlapping area calculation module is used for calculating the overlapping area between a millimeter wave point cloud target frame and a laser point cloud target frame of a calibration object on the image and an image identification target frame respectively, wherein the calibration object is at least one target in the overlapping detection area of the camera and the millimeter wave radar;
and the target joint calibration parameter output module is used for adjusting the first initial joint calibration parameter and the second initial joint calibration parameter based on the overlapping area corresponding to each calibration object until the overlapping area meets a preset threshold value, and taking the adjusted first joint calibration parameter and the adjusted second joint calibration parameter as target joint calibration parameters for joint calibration of the millimeter wave radar, the laser radar and the camera.
12. A server comprising a memory and a processor, the memory having stored thereon a computer program, wherein the computer program, when executed by the processor, causes the processor to perform the steps of the millimeter wave radar, lidar and camera joint calibration method of any of claims 1 to 10.
13. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the millimeter wave radar, lidar and camera joint calibration method according to any one of claims 1 to 10.
CN202010843901.2A 2020-08-20 2020-08-20 Millimeter wave radar, laser radar and camera combined calibration method and device Pending CN114076918A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010843901.2A CN114076918A (en) 2020-08-20 2020-08-20 Millimeter wave radar, laser radar and camera combined calibration method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010843901.2A CN114076918A (en) 2020-08-20 2020-08-20 Millimeter wave radar, laser radar and camera combined calibration method and device

Publications (1)

Publication Number Publication Date
CN114076918A true CN114076918A (en) 2022-02-22

Family

ID=80281941

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010843901.2A Pending CN114076918A (en) 2020-08-20 2020-08-20 Millimeter wave radar, laser radar and camera combined calibration method and device

Country Status (1)

Country Link
CN (1) CN114076918A (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114526746A (en) * 2022-03-15 2022-05-24 智道网联科技(北京)有限公司 Method, device and equipment for generating high-precision map lane line and storage medium
CN114758504A (en) * 2022-06-13 2022-07-15 之江实验室 Online vehicle overspeed early warning method and system based on filtering correction
CN114814758A (en) * 2022-06-24 2022-07-29 国汽智控(北京)科技有限公司 Camera-millimeter wave radar-laser radar combined calibration method and device
CN114814758B (en) * 2022-06-24 2022-09-06 国汽智控(北京)科技有限公司 Camera-millimeter wave radar-laser radar combined calibration method and device
CN115079168A (en) * 2022-07-19 2022-09-20 陕西欧卡电子智能科技有限公司 Mapping method, device and equipment based on fusion of laser radar and millimeter wave radar
CN115079168B (en) * 2022-07-19 2022-11-22 陕西欧卡电子智能科技有限公司 Mapping method, device and equipment based on fusion of laser radar and millimeter wave radar
CN115471574A (en) * 2022-11-02 2022-12-13 北京闪马智建科技有限公司 External parameter determination method and device, storage medium and electronic device
CN115471574B (en) * 2022-11-02 2023-02-03 北京闪马智建科技有限公司 External parameter determination method and device, storage medium and electronic device
CN115797401A (en) * 2022-11-17 2023-03-14 昆易电子科技(上海)有限公司 Verification method and device of alignment parameters, storage medium and electronic equipment
CN115797401B (en) * 2022-11-17 2023-06-06 昆易电子科技(上海)有限公司 Verification method and device for alignment parameters, storage medium and electronic equipment
CN116594028A (en) * 2022-11-17 2023-08-15 昆易电子科技(上海)有限公司 Verification method and device for alignment parameters, storage medium and electronic equipment
CN116594028B (en) * 2022-11-17 2024-02-06 昆易电子科技(上海)有限公司 Verification method and device for alignment parameters, storage medium and electronic equipment
WO2024113207A1 (en) * 2022-11-30 2024-06-06 华为技术有限公司 Data processing method and apparatus
CN116047440A (en) * 2023-03-29 2023-05-02 陕西欧卡电子智能科技有限公司 End-to-end millimeter wave radar and camera external parameter calibration method
CN116047440B (en) * 2023-03-29 2023-06-09 陕西欧卡电子智能科技有限公司 End-to-end millimeter wave radar and camera external parameter calibration method

Similar Documents

Publication Publication Date Title
CN114076918A (en) Millimeter wave radar, laser radar and camera combined calibration method and device
CN112132972B (en) Three-dimensional reconstruction method and system for fusing laser and image data
CN111553859B (en) Laser radar point cloud reflection intensity completion method and system
CN109270534B (en) Intelligent vehicle laser sensor and camera online calibration method
CN114076937A (en) Laser radar and camera combined calibration method and device, server and computer readable storage medium
CN110008843B (en) Vehicle target joint cognition method and system based on point cloud and image data
CN114076919A (en) Millimeter wave radar and camera combined calibration method and device, server and computer readable storage medium
KR102586208B1 (en) Method and apparatus for analyzing communication channel considering information related to material and exterior shape of a object
CN112215306B (en) Target detection method based on fusion of monocular vision and millimeter wave radar
CN109283538A Ship target size detection method based on fusion of vision and laser sensor data
CN108574929A Method and apparatus for networked scene reconstruction and enhancement in a vehicle environment in an autonomous driving system
CN111207762B (en) Map generation method and device, computer equipment and storage medium
CN105139350A (en) Ground real-time reconstruction processing system for unmanned aerial vehicle reconnaissance images
CN110889829A (en) Monocular distance measurement method based on fisheye lens
CN113050074B (en) Camera and laser radar calibration system and calibration method in unmanned environment perception
CN115797454B (en) Multi-camera fusion sensing method and device under bird's eye view angle
CN113192182A (en) Multi-sensor-based live-action reconstruction method and system
CN112037249A (en) Method and device for tracking object in image of camera device
CN110718137A (en) Method and device for constructing density distribution map of target object, terminal and mobile device
CN114076936A (en) Precision evaluation method and device of combined calibration parameters, server and computer readable storage medium
CN114332494A (en) Three-dimensional target detection and identification method based on multi-source fusion under vehicle-road cooperation scene
CN114413958A (en) Monocular vision distance and speed measurement method of unmanned logistics vehicle
CN114076935A (en) Laser radar and camera combined calibration method and device, server and computer readable storage medium
CN112132900A (en) Visual repositioning method and system
CN117058051A (en) Method and device based on fusion of laser point cloud and low-light-level image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination