CN113514803A - Combined calibration method for monocular camera and millimeter wave radar - Google Patents

Publication number: CN113514803A
Authority: CN (China)
Prior art keywords: camera, millimeter wave radar, calibration
Legal status: Pending (assumed; Google has not performed a legal analysis)
Application number: CN202110322844.8A
Other languages: Chinese (zh)
Inventors: 程德心, 张伟, 胡早阳, 汤戈, 蔡幼波
Current and original assignee: Wuhan Kotei Informatics Co Ltd (listed assignees may be inaccurate)
Application filed by Wuhan Kotei Informatics Co Ltd
Priority: CN202110322844.8A
Publication: CN113514803A (legal status: Pending)


Classifications

    • G01S7/40 — Details of radar systems: means for monitoring or calibrating
    • G01S13/867 — Combination of radar systems with cameras
    • G01S13/931 — Radar specially adapted for anti-collision purposes of land vehicles
    • G01S19/46 — Satellite positioning combined with a supplementary radio-wave measurement


Abstract

The invention provides a combined calibration method for a monocular camera and a millimeter wave radar. The method first performs time synchronization and spatial data fusion on the millimeter wave radar and the camera, and then adjusts the installation angle and installation position of both sensors based on the spatial data fusion result, so that when the same detection point is observed, the coordinates of the point detected by the millimeter wave radar coincide with the coordinates of the point detected by the camera. By time-synchronizing and spatially fusing the two sensors, the method determines the coordinate relation between the camera and the millimeter wave radar mounted on an unmanned vehicle, performs their combined calibration, and improves calibration precision. In addition, the calibration bracket used by the invention consists of a checkerboard calibration plate and a radar corner reflector fixed on the bracket, and is simple to manufacture and low in cost.

Description

Combined calibration method for monocular camera and millimeter wave radar
Technical Field
The invention relates to the technical field of sensor calibration, in particular to a combined calibration method of a monocular camera and a millimeter wave radar.
Background
Advanced driver assistance systems (ADAS) and high-level automated driving aim to reduce road traffic accidents and improve driver comfort. Advanced driving assistance must first solve the perception problem, and perception necessarily relies on different sensors, such as cameras, millimeter wave radars and laser radars. When several sensors are mounted on one vehicle, the coordinate relations between them must be determined; sensor calibration is therefore a basic requirement of automated driving and an important basis for judging whether a perception system is correct.
In addition, sensor calibration determines the relation between sensor input and output. Since a highly automated vehicle driving on the road must plan a safe and fast path by recognizing its surroundings, sensor calibration is the foundation of both advanced driver assistance and high-level automated driving.
Disclosure of Invention
To solve the above problems, embodiments of the present invention provide a combined calibration method for a monocular camera and a millimeter wave radar, which overcomes or at least partially solves the above problems.
The combined calibration method for a monocular camera and a millimeter wave radar provided by the embodiment of the invention comprises the following steps:
S1, carrying out time synchronization and spatial data fusion on the millimeter wave radar and the camera;
and S2, adjusting the installation angles and the installation positions of the millimeter wave radar and the camera based on the spatial data fusion result, so that the coordinates of the detection points detected by the millimeter wave radar are consistent with the coordinates of the detection points detected by the camera when the same detection point is detected.
Preferably, the time synchronization of the millimeter wave radar and the monocular camera includes:
and GPS time service of the millimeter wave radar and the monocular camera is unified, so that time synchronization of the millimeter wave radar and the monocular camera is realized.
Preferably, the spatial data fusion of the millimeter wave radar and the monocular camera includes:
establishing a millimeter wave radar coordinate system and a camera coordinate system, and establishing a pixel level data fusion equation of the millimeter wave radar and the camera by utilizing the space constraint relation between the millimeter wave radar data point and the camera image;
and solving a pixel-level data fusion equation to obtain a spatial transformation relation between the millimeter wave radar coordinate system and the camera coordinate system, and completing spatial data fusion of the millimeter wave radar and the camera.
Preferably, the pixel-level data fusion equation of the millimeter wave radar and the camera is:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K_c \begin{bmatrix} R_w^c & T_w^c \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}, \qquad K_c = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

where $(u, v)$ are the pixel coordinates of the image point, $K_c$ is the intrinsic parameter matrix of the camera, $(u_0, v_0)$ are the coordinates of the image pixel center, and $(f_x, f_y)$ are the equivalent focal lengths in the x and y directions; $R_w^c$ is a 3 × 3 rotation matrix and $T_w^c$ is a 3 × 1 coordinate translation vector, which respectively represent the installation angle and the installation position of the camera relative to the vehicle and are also called the extrinsic parameters of the camera.

The installation angle of the camera and the millimeter wave radar is determined by the heading angle $\varphi$, the pitch angle $\delta$ and the roll angle $\xi$, as follows:

$$R_w^c = R_z(\varphi)\,R_y(\delta)\,R_x(\xi) = \begin{bmatrix} \cos\varphi\cos\delta & \cos\varphi\sin\delta\sin\xi - \sin\varphi\cos\xi & \cos\varphi\sin\delta\cos\xi + \sin\varphi\sin\xi \\ \sin\varphi\cos\delta & \sin\varphi\sin\delta\sin\xi + \cos\varphi\cos\xi & \sin\varphi\sin\delta\cos\xi - \cos\varphi\sin\xi \\ -\sin\delta & \cos\delta\sin\xi & \cos\delta\cos\xi \end{bmatrix}, \qquad T_w^c = \begin{bmatrix} t_1 \\ t_2 \\ t_3 \end{bmatrix}$$

where $t_1, t_2, t_3$ indicate the mounting position of the camera.
Preferably, step S2 specifically includes:
s21, mounting the camera below the windshield of the vehicle, and mounting the millimeter wave radar on the front bumper of the vehicle;
s22, according to the installation position of the camera and the millimeter wave radar, a calibration support is arranged in front of the vehicle, and the calibration support comprises a checkerboard calibration plate and a radar corner reflector which are fixed on the support;
and S23, adjusting the installation angle and the installation position of the millimeter wave radar and the camera, and when the millimeter wave radar and the camera detect the same detection point on the calibration support, calculating through a pixel-level data fusion equation to obtain that the coordinates of the detection point detected by the radar are consistent with the coordinates of the detection point detected by the camera.
Preferably, after step S2, the method further comprises:
and S3, moving the calibration support, repeating the step S2, and calibrating the millimeter wave radar and the camera for multiple times.
Preferably, arranging the calibration bracket in front of the vehicle according to the installation positions of the camera and the millimeter wave radar includes:
setting the height difference between the camera and the millimeter wave radar at their installation positions to H, and setting the height difference between the checkerboard calibration plate and the radar corner reflector on the calibration bracket to the same value H.
Correspondingly, the height difference between the detection point of the camera on the checkerboard calibration plate and the detection point of the millimeter wave radar on the radar corner reflector is also H.
Compared with the prior art, the monocular camera and millimeter wave radar combined calibration method provided by the embodiment of the invention has the following beneficial effects:
1) the millimeter wave radar and the camera are subjected to time synchronization and spatial data fusion, the coordinate relation between the camera installed on the unmanned automobile and the millimeter wave radar is determined, then the multi-sensor combined calibration is carried out, and the calibration precision of the camera and the millimeter wave radar is improved.
2) Because the calibration precision of the camera and the millimeter wave radar is improved, the precision of obstacle fusion detection by the camera and the millimeter wave radar can also be improved.
3) The calibration support used by the invention consists of the chessboard pattern calibration plate fixed on the support and the radar corner reflector, and has the advantages of simple manufacture and low cost.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic flow chart of a combined calibration method for a monocular camera and a millimeter wave radar according to an embodiment of the present invention;
fig. 2 is a schematic view illustrating an installation of a camera and a millimeter wave radar according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a calibration bracket according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Advanced driver assistance techniques first solve the problem of perception, which inevitably uses different sensors, such as cameras, millimeter-wave radars, laser radars, etc. A plurality of sensors are arranged on a vehicle, and the coordinate relation between the sensors is required to be determined, so that the calibration of the sensors is the basic requirement of automatic driving and is an important basis for determining whether a sensing system is correct or not. Additionally, the relationship between sensor input and output can be determined by sensor calibration. Since a safe and fast driving path needs to be planned by identifying the surrounding environment when the advanced unmanned vehicle drives on the road, the sensor calibration is the basis of advanced auxiliary driving and high-level automatic driving.
Therefore, the invention develops a combined calibration method of the monocular camera and the millimeter wave radar, the millimeter wave radar and the camera are subjected to time synchronization and spatial data fusion, the coordinate relation between the camera installed on the unmanned automobile and the millimeter wave radar is determined, then the combined calibration of the multiple sensors is carried out, and the calibration precision of the camera and the millimeter wave radar is improved. The following description and description of various embodiments are presented in conjunction with the following drawings.
Sensor calibration can be divided into two parts: intrinsic parameter calibration and extrinsic parameter calibration. The intrinsic parameters determine the mapping relation inside the sensor, such as a camera's focal length, eccentricity, pixel aspect ratio and distortion coefficients; the extrinsic parameters determine the transformation between the sensor and an external coordinate system, i.e. the pose parameters (rotation and translation).
The invention addresses extrinsic calibration between different sensors, in particular the calibration between a millimeter wave radar and a monocular camera, i.e. the problems of temporal and spatial data fusion of the two sensors, thereby achieving the purpose of calibration. The "camera" in this application always refers to a "monocular camera".
Fig. 1 is a schematic flow chart of a combined calibration method for a monocular camera and a millimeter wave radar according to an embodiment of the present invention, and as shown in fig. 1, the combined calibration method for a monocular camera and a millimeter wave radar according to the embodiment of the present invention includes, but is not limited to, the following steps:
s1, carrying out time synchronization and spatial data fusion on the millimeter wave radar and the camera;
and S2, adjusting the installation angles and the installation positions of the millimeter wave radar and the camera based on the spatial data fusion result, so that the coordinates of the detection points detected by the millimeter wave radar are consistent with the coordinates of the detection points detected by the camera when the same detection point is detected.
Specifically, the invention first performs time synchronization on the millimeter wave radar and the camera. Different sensors use different acquisition channels: for example, the millimeter wave radar outputs data over a CAN bus, while the camera is acquired through a CAN or USB interface. Their acquisition periods also differ; the acquisition period of the millimeter wave radar may be 60 ms while that of the camera may be 50 ms, so the sensors must be synchronized in time. The invention adopts GPS time service: the millimeter wave radar and the camera receive unified time from GPS, and GPS is used as the standard time to stamp the data of both sensors, achieving time synchronization. Furthermore, a memory of a certain size is designed as a cache pool for buffering the current millimeter wave radar and camera data, which solves the problem of data lag and guarantees that the data to be processed are the latest synchronized data.
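The timestamp-and-cache-pool scheme described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the class name, the buffer length and the 30 ms pairing tolerance are assumptions chosen to bridge the 60 ms radar and 50 ms camera periods.

```python
from collections import deque

class SyncBuffer:
    """Cache pool pairing radar and camera frames by their GPS timestamps.

    Hypothetical sketch: the patent only states that both sensors are
    stamped with GPS time and buffered; the 30 ms tolerance and buffer
    size are illustrative assumptions.
    """
    def __init__(self, tolerance_s=0.03, maxlen=64):
        self.tolerance_s = tolerance_s
        self.radar = deque(maxlen=maxlen)   # (gps_time, radar_frame)
        self.camera = deque(maxlen=maxlen)  # (gps_time, image)

    def push_radar(self, t, frame):
        self.radar.append((t, frame))

    def push_camera(self, t, image):
        self.camera.append((t, image))

    def latest_pair(self):
        """Return the newest radar/camera pair whose GPS stamps agree,
        scanning from the most recent entries backwards."""
        for tr, fr in reversed(self.radar):
            for tc, im in reversed(self.camera):
                if abs(tr - tc) <= self.tolerance_s:
                    return (tr, fr), (tc, im)
        return None
```

With a 60 ms radar period and a 50 ms camera period, the newest radar frame at t = 0.12 s pairs with the camera frame at t = 0.10 s (20 ms apart), so the consumer always processes the latest synchronized pair rather than lagging data.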
Further, in step S1, performing spatial data fusion on the millimeter wave radar and the monocular camera, including: establishing a millimeter wave radar coordinate system and a camera coordinate system, and establishing a pixel level data fusion equation of the millimeter wave radar and the camera by utilizing the space constraint relation between the millimeter wave radar data point and the camera image; and solving a pixel-level data fusion equation to obtain a spatial transformation relation between the millimeter wave radar coordinate system and the camera coordinate system, and completing spatial data fusion of the millimeter wave radar and the camera.
When the millimeter wave radar and the camera are installed, the millimeter wave radar is rigidly connected to the vehicle; ignoring vehicle vibration, its attitude and displacement relative to the vehicle are fixed, so every data point returned by the millimeter wave radar has a unique corresponding position in the world coordinate system. Similarly, the monocular camera is rigidly connected to the vehicle with fixed relative attitude and displacement, and every point in world coordinates corresponds to exactly one image pixel. Therefore, each millimeter wave radar data point in the same space has a unique corresponding point in image space.
Therefore, by establishing a reasonable millimeter wave radar coordinate system and a camera coordinate system and utilizing the space constraint relation between the millimeter wave radar data point and the camera image, the space transformation relation between the millimeter wave radar coordinate system and the camera coordinate system can be solved, so that the space fusion of the millimeter wave radar and the camera is completed, and therefore, the space fusion problem of the millimeter wave radar and the camera is converted into the function fitting problem of corresponding points of the radar and the image.
After the extrinsic parameters of the camera are solved through the pixel-level data fusion equation, the relation between the millimeter wave radar and the camera's environment coordinate system is completely determined, so millimeter wave radar data points can be projected onto the image pixel coordinate system through the camera model. The pixel-level data fusion equation is:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K_c \begin{bmatrix} R_w^c & T_w^c \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}, \qquad K_c = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

where $(u, v)$ are the pixel coordinates of the image point, $K_c$ is the intrinsic parameter matrix of the camera, $(u_0, v_0)$ are the coordinates of the image pixel center, and $(f_x, f_y)$ are the equivalent focal lengths in the x and y directions; $R_w^c$ is a 3 × 3 rotation matrix and $T_w^c$ is a 3 × 1 coordinate translation vector, which respectively represent the installation angle and the installation position of the camera relative to the vehicle and are also called the extrinsic parameters of the camera.

The installation angle of the camera and the millimeter wave radar is determined by the heading angle $\varphi$, the pitch angle $\delta$ and the roll angle $\xi$, as follows:

$$R_w^c = R_z(\varphi)\,R_y(\delta)\,R_x(\xi) = \begin{bmatrix} \cos\varphi\cos\delta & \cos\varphi\sin\delta\sin\xi - \sin\varphi\cos\xi & \cos\varphi\sin\delta\cos\xi + \sin\varphi\sin\xi \\ \sin\varphi\cos\delta & \sin\varphi\sin\delta\sin\xi + \cos\varphi\cos\xi & \sin\varphi\sin\delta\cos\xi - \cos\varphi\sin\xi \\ -\sin\delta & \cos\delta\sin\xi & \cos\delta\cos\xi \end{bmatrix}, \qquad T_w^c = \begin{bmatrix} t_1 \\ t_2 \\ t_3 \end{bmatrix}$$

where $t_1, t_2, t_3$ indicate the mounting position of the camera.
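The projection of a radar point onto the image pixel plane can be sketched numerically as below. This is an illustrative sketch under stated assumptions: the patent does not fix the Euler-angle convention, so a Z-Y-X (heading, pitch, roll) composition is assumed, and all numeric values are hypothetical.

```python
import numpy as np

def rotation_from_angles(heading, pitch, roll):
    """Compose a rotation matrix from heading, pitch and roll angles.

    Assumption: Z-Y-X Euler composition; the patent only names the three
    angles without specifying the convention."""
    cph, sph = np.cos(heading), np.sin(heading)
    cd, sd = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cph, -sph, 0.0], [sph, cph, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cd, 0.0, sd], [0.0, 1.0, 0.0], [-sd, 0.0, cd]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def project_radar_point(p_world, K, R, T):
    """Project a 3-D radar point (world frame) onto the image pixel plane
    via the pixel-level data fusion equation Z_c [u v 1]^T = K [R T] P."""
    p_cam = R @ p_world + T   # world frame -> camera frame
    uvw = K @ p_cam           # camera frame -> homogeneous pixel coords
    return uvw[:2] / uvw[2]   # perspective divide -> (u, v)
```

For instance, with $f_x = f_y = 1000$, $(u_0, v_0) = (640, 360)$, identity rotation and zero translation, a point 10 m straight ahead projects exactly onto the image center.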
To calibrate the camera and the millimeter wave radar, a detection point observed by both sensors at the same time must be computed according to the above method and projected onto the pixel plane. Fig. 2 is a schematic view of the installation of the camera and the millimeter wave radar according to an embodiment of the present invention. Referring to fig. 2, in this embodiment the millimeter wave radar is installed in the front bumper of the vehicle and the camera is installed at the windshield, with a height difference between them. The calibration is considered successful only when several radar observation points coincide with the corresponding camera pixel points.
Based on the content of the foregoing embodiment, step S2 specifically includes:
s21, mounting the camera below the windshield of the vehicle, and mounting the millimeter wave radar on the front bumper of the vehicle; as shown with reference to fig. 2.
S22, according to the installation position of the camera and the millimeter wave radar, a calibration support is arranged in front of the vehicle, and the calibration support comprises a checkerboard calibration plate and a radar corner reflector which are fixed on the support; fig. 3 is a schematic structural diagram of a calibration bracket according to an embodiment of the present invention. And setting the height difference between the camera and the millimeter wave radar at the installation position as H, and setting the height difference between the chessboard pattern calibration plate on the calibration support and the radar corner reflector as H.
And S23, adjusting the installation angle and the installation position of the millimeter wave radar and the camera, and when the millimeter wave radar and the camera detect the same detection point on the calibration support, calculating through a pixel-level data fusion equation to obtain that the coordinates of the detection point detected by the radar are consistent with the coordinates of the detection point detected by the camera.
Referring to fig. 2 and 3, the calibration bracket comprises a checkerboard calibration plate and a radar corner reflector. The checkerboard calibration plate is used to calibrate the camera: according to the pixel-level data fusion equation, the projection of the camera detection point on the checkerboard plate onto the camera pixel plane is found, and at the same time the projection of the reflection point of the radar corner reflector onto the camera pixel plane is found. In this embodiment, the height difference between the camera's detection point on the checkerboard plate and the millimeter wave radar's detection point on the corner reflector equals the height difference between the actual camera and radar mounted on the vehicle, namely H; this arrangement improves calibration precision and reduces computational difficulty. The installation angle and position of the camera and the millimeter wave radar are adjusted so that the same detection point coincides on the camera pixel plane. Because the radar corner reflector and the checkerboard calibration plate differ in height by H on the bracket, this height difference H must be subtracted during projection so that the converted radar pixel point coincides with the camera pixel point.
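The height-difference compensation and the coincidence check can be sketched as follows. This is an illustrative sketch: the value of H, the choice of the vertical axis, and the pixel tolerance are assumptions not fixed by the patent.

```python
import numpy as np

def compensate_height(radar_point_world, H):
    """Subtract the bracket height difference H from the radar point's
    vertical coordinate before projection.

    Assumption: axis index 2 of the world frame points upward; the patent
    only states that H must be subtracted during projection."""
    p = np.asarray(radar_point_world, dtype=float).copy()
    p[2] -= H
    return p

def pixels_coincide(uv_camera, uv_radar, tol_px=2.0):
    """Check whether the converted radar pixel coincides with the camera
    pixel within an illustrative tolerance (here 2 px)."""
    d = np.asarray(uv_camera, dtype=float) - np.asarray(uv_radar, dtype=float)
    return float(np.hypot(d[0], d[1])) <= tol_px
```

In use, each corner-reflector point is first height-compensated, then projected with the pixel-level data fusion equation, and `pixels_coincide` decides whether the current installation angle and position already make the two projections agree.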
Further, starting 1 m in front of the vehicle, the calibration bracket is moved once every 0.5 m and step S2 is repeated, so that the millimeter wave radar and the camera are calibrated multiple times to improve calibration precision. Calibration tests show that the precision improves markedly after more than four calibrations.
Compared with the prior art, the monocular camera and millimeter wave radar combined calibration method provided by the embodiment of the invention has the following beneficial effects:
1) the time synchronization and the spatial data fusion are carried out on the millimeter wave radar and the camera, the coordinate relation between the camera installed on the unmanned automobile and the millimeter wave radar is determined, the camera and the millimeter wave radar are jointly calibrated, and the calibration precision of the camera and the millimeter wave radar is improved.
2) Because the calibration precision of the camera and the millimeter wave radar is improved, the precision of obstacle fusion detection by the camera and the millimeter wave radar can also be improved.
3) The calibration support used by the invention consists of the chessboard pattern calibration plate fixed on the support and the radar corner reflector, and has the advantages of simple manufacture and low cost.
The embodiments of the present invention can be arbitrarily combined to achieve different technical effects.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (7)

1. A combined calibration method for a monocular camera and a millimeter wave radar, characterized by comprising the following steps:
S1, carrying out time synchronization and spatial data fusion on the millimeter wave radar and the camera;
and S2, adjusting the installation angles and the installation positions of the millimeter wave radar and the camera based on the spatial data fusion result, so that the coordinates of the detection points detected by the millimeter wave radar are consistent with the coordinates of the detection points detected by the camera when the same detection point is detected.
2. The method for the combined calibration of the monocular camera and the millimeter wave radar according to claim 1, wherein the time synchronization of the millimeter wave radar and the monocular camera comprises:
and GPS time service of the millimeter wave radar and the monocular camera is unified, so that time synchronization of the millimeter wave radar and the monocular camera is realized.
3. The method for jointly calibrating a monocular camera and a millimeter wave radar according to claim 1, wherein the spatial data fusion of the millimeter wave radar and the monocular camera comprises:
establishing a millimeter wave radar coordinate system and a camera coordinate system, and establishing a pixel level data fusion equation of the millimeter wave radar and the camera by utilizing the space constraint relation between the millimeter wave radar data point and the camera image;
and solving a pixel-level data fusion equation to obtain a spatial transformation relation between the millimeter wave radar coordinate system and the camera coordinate system, and completing spatial data fusion of the millimeter wave radar and the camera.
4. The method for combined calibration of a monocular camera and a millimeter wave radar according to claim 3, wherein the pixel-level data fusion equation of the millimeter wave radar and the camera is:

$$
Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= K_c \begin{bmatrix} R & T^{\top} \end{bmatrix}
\begin{bmatrix} X_r \\ Y_r \\ Z_r \\ 1 \end{bmatrix},
\qquad
K_c = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}
$$

where $(u, v)$ are the pixel coordinates of the image point in the camera image, $(X_r, Y_r, Z_r)$ are the coordinates of the detection point in the millimeter wave radar coordinate system, $K_c$ is the internal parameter matrix of the camera, $(u_0, v_0)$ are the image pixel center coordinates, and $(f_x, f_y)$ are the equivalent focal lengths in the $x$ and $y$ directions; $R$ is a $3 \times 3$ rotation matrix and $T$ is a $1 \times 3$ coordinate translation matrix, which respectively represent the installation angle and the installation position of the camera relative to the vehicle, also called the external parameters of the camera.

The installation angles of the camera and the millimeter wave radar, namely the direction (yaw) angle $\gamma$, the pitch angle $\delta$ and the roll angle $\xi$, determine the rotation matrix as follows:

$$
R =
\begin{bmatrix} \cos\gamma & -\sin\gamma & 0 \\ \sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\delta & 0 & \sin\delta \\ 0 & 1 & 0 \\ -\sin\delta & 0 & \cos\delta \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\xi & -\sin\xi \\ 0 & \sin\xi & \cos\xi \end{bmatrix},
\qquad
T = \begin{bmatrix} t_1 & t_2 & t_3 \end{bmatrix}
$$

where $t_1, t_2, t_3$ indicate the mounting position of the camera.
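The fusion equation of claim 4 can be exercised numerically. The sketch below builds the intrinsic matrix, composes the rotation from yaw, pitch, and roll, and projects a radar-frame point to pixel coordinates. The yaw-pitch-roll rotation order and all numeric values are illustrative assumptions, not values from the patent.

```python
import numpy as np

def intrinsics(fx, fy, u0, v0):
    """Camera internal parameter matrix Kc."""
    return np.array([[fx, 0, u0],
                     [0, fy, v0],
                     [0,  0,  1]], dtype=float)

def rotation(yaw, pitch, roll):
    """3x3 rotation matrix, assumed here as R = Rz(yaw) Ry(pitch) Rx(roll)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def project(radar_xyz, K, R, t):
    """Map a radar-frame point to image pixel coordinates (u, v)."""
    Xc = R @ np.asarray(radar_xyz, dtype=float) + np.asarray(t, dtype=float)
    p = K @ Xc          # homogeneous pixel coordinates, scale = depth Zc
    return p[0] / p[2], p[1] / p[2]
```

With zero installation angles and zero translation, a point on the optical axis projects to the image center (u0, v0), which is a quick sanity check of the sign conventions.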
5. The method for combined calibration of a monocular camera and a millimeter wave radar according to claim 4, wherein step S2 specifically comprises:
S21, mounting the camera below the windshield of the vehicle, and mounting the millimeter wave radar on the front bumper of the vehicle;
S22, arranging a calibration support in front of the vehicle according to the installation positions of the camera and the millimeter wave radar, the calibration support comprising a checkerboard calibration plate and a radar corner reflector fixed on the support;
S23, adjusting the installation angles and installation positions of the millimeter wave radar and the camera until, when both sensors detect the same detection point on the calibration support, the coordinates of the detection point obtained from the radar through the pixel-level data fusion equation are consistent with the coordinates of the detection point detected by the camera.
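The check in step S23 amounts to projecting the radar detection through the fusion equation and comparing it with the camera detection of the same target. A hedged sketch follows; the function name and the 3-pixel tolerance are assumptions, since the patent only requires the coordinates to be "consistent" without specifying a threshold.

```python
import numpy as np

def detections_consistent(radar_xyz, cam_uv, K, R, t, tol_px=3.0):
    """Project a radar detection into the image via the fusion equation
    and check it lands within tol_px pixels of the camera's detection."""
    Xc = R @ np.asarray(radar_xyz, dtype=float) + np.asarray(t, dtype=float)
    u = K[0, 0] * Xc[0] / Xc[2] + K[0, 2]
    v = K[1, 1] * Xc[1] / Xc[2] + K[1, 2]
    return np.hypot(u - cam_uv[0], v - cam_uv[1]) <= tol_px
```

In practice the installation angles and positions would be adjusted iteratively until this check passes for the calibration target.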
6. The method for combined calibration of a monocular camera and a millimeter wave radar according to claim 5, further comprising, after step S2:
S3, moving the calibration support and repeating step S2, so as to calibrate the millimeter wave radar and the camera multiple times.
7. The method for combined calibration of a monocular camera and a millimeter wave radar according to claim 5, wherein arranging a calibration support in front of the vehicle according to the installation positions of the camera and the millimeter wave radar comprises:
setting the height difference between the camera and the millimeter wave radar at their installation positions as H, and setting the height difference between the checkerboard calibration plate and the radar corner reflector on the calibration support as H as well.
Correspondingly, the height difference between the detection point of the camera on the checkerboard calibration plate and the detection point of the millimeter wave radar on the radar corner reflector is H.
CN202110322844.8A 2021-03-25 2021-03-25 Combined calibration method for monocular camera and millimeter wave radar Pending CN113514803A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110322844.8A CN113514803A (en) 2021-03-25 2021-03-25 Combined calibration method for monocular camera and millimeter wave radar


Publications (1)

Publication Number Publication Date
CN113514803A true CN113514803A (en) 2021-10-19

Family

ID=78062064

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110322844.8A Pending CN113514803A (en) 2021-03-25 2021-03-25 Combined calibration method for monocular camera and millimeter wave radar

Country Status (1)

Country Link
CN (1) CN113514803A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107966700A * 2017-11-20 2018-04-27 天津大学 Front obstacle detection system and method for a driverless vehicle
CN109598765A * 2018-12-21 2019-04-09 浙江大学 External parameter joint calibration method for a monocular camera and millimeter-wave radar based on a spherical calibration object
CN110849362A (en) * 2019-11-28 2020-02-28 湖南率为控制科技有限公司 Laser radar and vision combined navigation algorithm based on vehicle-mounted inertia
CN111368706A (en) * 2020-03-02 2020-07-03 南京航空航天大学 Data fusion dynamic vehicle detection method based on millimeter wave radar and machine vision
CN111812649A (en) * 2020-07-15 2020-10-23 西北工业大学 Obstacle identification and positioning method based on fusion of monocular camera and millimeter wave radar
CN111815717A (en) * 2020-07-15 2020-10-23 西北工业大学 Multi-sensor fusion external parameter combination semi-autonomous calibration method
CN112013877A (en) * 2020-08-31 2020-12-01 广州景骐科技有限公司 Detection method and related device for millimeter wave radar and inertial measurement unit
CN112070841A (en) * 2020-07-01 2020-12-11 北京中科原动力科技有限公司 Rapid combined calibration method for millimeter wave radar and camera


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GUO Ming et al.: "LiDAR Technology and Structural Analysis Methods", 28 February 2017, Surveying and Mapping Press *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114187365A (en) * 2021-12-09 2022-03-15 联陆智能交通科技(上海)有限公司 Camera and millimeter wave radar combined calibration method and system for roadside sensing system
CN114488047A (en) * 2022-01-27 2022-05-13 中国第一汽车股份有限公司 Vehicle sensor calibration system
CN114814758A (en) * 2022-06-24 2022-07-29 国汽智控(北京)科技有限公司 Camera-millimeter wave radar-laser radar combined calibration method and device
CN114814758B (en) * 2022-06-24 2022-09-06 国汽智控(北京)科技有限公司 Camera-millimeter wave radar-laser radar combined calibration method and device

Similar Documents

Publication Publication Date Title
CN110146869B (en) Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium
US10659677B2 (en) Camera parameter set calculation apparatus, camera parameter set calculation method, and recording medium
CN113514803A (en) Combined calibration method for monocular camera and millimeter wave radar
CN109920246B (en) Collaborative local path planning method based on V2X communication and binocular vision
JP6825569B2 (en) Signal processor, signal processing method, and program
JP7016058B2 (en) Camera parameter set calculation method, camera parameter set calculation program and camera parameter set calculation device
EP3367677B1 (en) Calibration apparatus, calibration method, and calibration program
EP2939211B1 (en) Method and system for generating a surround view
CN112419385A (en) 3D depth information estimation method and device and computer equipment
CN112284416B (en) Automatic driving positioning information calibration device, method and storage medium
CN112802092B (en) Obstacle sensing method and device and electronic equipment
US11783507B2 (en) Camera calibration apparatus and operating method
CN115797454B (en) Multi-camera fusion sensing method and device under bird's eye view angle
CN111382591B (en) Binocular camera ranging correction method and vehicle-mounted equipment
CN111508027A (en) Method and device for calibrating external parameters of camera
CN110341621B (en) Obstacle detection method and device
CN113989766A (en) Road edge detection method and road edge detection equipment applied to vehicle
CN113985405A (en) Obstacle detection method and obstacle detection equipment applied to vehicle
CN113658262B (en) Camera external parameter calibration method, device, system and storage medium
CN110398747B (en) All-solid-state laser radar field angle dynamic expansion method, system and storage medium
CN117452410A (en) Millimeter wave radar-based vehicle detection system
CN111538008B (en) Transformation matrix determining method, system and device
CN110796604A (en) Image correction method and device
CN112347825B (en) Adjusting method and system for vehicle body looking-around model
US10643077B2 (en) Image processing device, imaging device, equipment control system, equipment, image processing method, and recording medium storing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20211019)