CN112180362B - Method and device for determining conversion pose between radar and camera and electronic equipment - Google Patents

Method and device for determining conversion pose between radar and camera and electronic equipment

Info

Publication number
CN112180362B
Authority
CN
China
Prior art keywords
calibration object
center point
radar
coordinate system
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910606517.8A
Other languages
Chinese (zh)
Other versions
CN112180362A (en)
Inventor
闫明 (Yan Ming)
杨德刚 (Yang Degang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Horizon Robotics Technology Research and Development Co Ltd
Original Assignee
Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority to CN201910606517.8A
Publication of CN112180362A
Application granted
Publication of CN112180362B

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • G01S7/4004Means for monitoring or calibrating of parts of a radar system
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415Identification of targets based on measurements of movement associated with the target
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Electromagnetism (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present disclosure provides a method and an apparatus for determining a conversion pose between a radar and a camera, an electronic device, and a storage medium, and relates to the field of image technology, where the method includes: acquiring a first calibration object plane equation and a first center point coordinate of a calibration object and a calibration object center point under a radar coordinate system respectively based on first point cloud data acquired by a radar; acquiring a second calibration object plane equation and a second center point coordinate of the calibration object and the calibration object center point under a camera coordinate system respectively based on image data acquired by a camera; and determining the conversion pose between the radar coordinate system and the camera coordinate system according to the first normal vector of the first calibration object plane equation, the second normal vector of the second calibration object plane equation, the first center point coordinate and the second center point coordinate. The method, the device, the electronic equipment and the storage medium realize automation of the calibration process, reduce the manual participation in the calibration process, have low dependence on the state of the calibration plate and the external environment, and have stable calibration result.

Description

Method and device for determining conversion pose between radar and camera and electronic equipment
Technical Field
The disclosure relates to the field of computer technology, and in particular, to a method and a device for determining a conversion pose between a radar and a camera, an electronic device, and a storage medium.
Background
An autonomous driving vehicle is equipped with multiple sensors such as a radar and a camera, and calibrating the conversion pose relationship between the radar and the camera is a precondition for fusing their information. By determining the conversion pose between the radar and the camera, the two sensors can complement each other and more effective perception information can be obtained. Currently, there are two general schemes for calibrating the conversion pose relationship between a radar and a camera: calibration schemes based on a calibration plate and calibration schemes based on a natural scene. Calibration-plate-based schemes usually determine the radar and camera external parameters by reconstructing the pose of the calibration plate in the radar and camera coordinate systems and optimizing the distance from the point cloud to the plate plane. Natural-scene-based schemes solve for the sensor external parameters by matching geometric edge features, colors, laser reflectivity and the like in the natural environment as observed by the different sensors.
However, calibration-plate-based schemes are easily affected by the camera internal parameters and the flatness of the calibration plate, so their robustness is poor. Natural-scene-based schemes need a high-precision initial value, otherwise convergence is poor, and such an initial value is hard to obtain in many practical applications; they also place high demands on the external environment, requiring obvious, easily extracted features with strong contrast, and are therefore difficult to apply in most scenes.
Disclosure of Invention
In order to solve the technical problems, embodiments of the present disclosure provide a method and apparatus for determining a conversion pose between a radar and a camera, an electronic device, and a storage medium.
According to an aspect of the embodiments of the present disclosure, there is provided a method for determining a transition pose between a radar and a camera, including: receiving first point cloud data acquired by a radar for a calibration object, and acquiring a first calibration object plane equation and a first center point coordinate of the calibration object and a calibration object center point under a radar coordinate system respectively based on the first point cloud data; receiving image data acquired by a camera for the calibration object, and acquiring a second calibration object plane equation and a second center point coordinate of the calibration object and the center point of the calibration object under a camera coordinate system respectively based on the image data; and determining a conversion pose between the radar coordinate system and the camera coordinate system according to the first normal vector of the first calibration object plane equation, the second normal vector of the second calibration object plane equation, the first center point coordinate and the second center point coordinate.
According to another aspect of the embodiments of the present disclosure, there is provided a conversion pose determining apparatus between a radar and a camera, including: the first data processing module is used for receiving point cloud data acquired by a radar for a calibration object, and acquiring a first calibration object plane equation and a first center point coordinate of the calibration object and a calibration object center point under a radar coordinate system respectively based on the point cloud data; the second data processing module is used for receiving image data acquired by a camera for the calibration object, and obtaining a second calibration object plane equation and a second center point coordinate of the calibration object and the center point of the calibration object under a camera coordinate system respectively based on the image data; and the first parameter determining module is used for determining the conversion pose between the radar coordinate system and the camera coordinate system according to the first normal vector of the first calibration object plane equation, the second normal vector of the second calibration object plane equation, the first center point coordinate and the second center point coordinate.
According to another aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium storing a computer program for executing the above-described method.
According to still another aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including: a processor; a memory for storing the processor-executable instructions; the processor is used for executing the method.
Based on the method and the device for determining the conversion pose between the radar and the camera, the electronic equipment and the storage medium provided by the embodiment of the disclosure, automation of a calibration process is realized, manual participation in the calibration process is reduced, the dependence on the state of a calibration plate and the external environment is low, and the calibration result is stable.
The technical scheme of the present disclosure is described in further detail below through the accompanying drawings and examples.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing embodiments thereof in more detail with reference to the accompanying drawings. The accompanying drawings are included to provide a further understanding of embodiments of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure, not to limit the disclosure. In the drawings, like reference numerals generally refer to like parts or steps.
FIG. 1 is a flow chart of one embodiment of a method of determining a transition pose between a radar and a camera of the present disclosure;
FIG. 2 is a flow chart of one embodiment of processing radar point cloud data in a method of determining a transition pose between radar and camera of the present disclosure;
FIG. 3 is a flow chart of one embodiment of motion compensation for radar point cloud data in a method of determining a transition pose between a radar and a camera of the present disclosure;
FIG. 4 is a flow chart of one embodiment of determining a transition pose between a radar and a camera in a transition pose determination method between a radar and a camera of the present disclosure;
FIG. 5 is a flow chart of one embodiment of determining a transition pose between a radar and a camera from an initial transition pose in a transition pose determination method between a radar and a camera of the present disclosure;
FIG. 6 is a flow chart of one embodiment of determining a transition pose between a radar and a camera based on a reprojection error in a transition pose determination method between a radar and a camera of the present disclosure;
FIG. 7 is a flow chart of one embodiment of a process for verifying a transition pose in a transition pose determination method between a radar and a camera of the present disclosure;
FIG. 8 is a flow chart of one embodiment of optimizing a transition pose in a transition pose determination method between a radar and a camera of the present disclosure;
FIG. 9 is a block diagram of one embodiment of a conversion pose determination apparatus between radar and camera of the present disclosure;
FIG. 10 is a block diagram of one embodiment of a transition pose determination module in a transition pose determination device between radar and camera of the present disclosure;
FIG. 11 is a block diagram of one embodiment of a first data processing module in a conversion pose determination device between radar and camera of the present disclosure;
fig. 12 is a block diagram of one embodiment of an electronic device of the present disclosure.
Detailed Description
Example embodiments according to the present disclosure will be described in detail below with reference to the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present disclosure and not all of the embodiments of the present disclosure, and that the present disclosure is not limited by the example embodiments described herein.
It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
It will be appreciated by those of skill in the art that the terms "first," "second," etc. in embodiments of the present disclosure are used merely to distinguish between different steps, devices or modules, etc., and do not represent any particular technical meaning nor necessarily logical order between them.
It should also be understood that in embodiments of the present disclosure, "plurality" may refer to two or more, and "at least one" may refer to one, two or more.
It should also be appreciated that any component, data, or structure referred to in the presently disclosed embodiments may be generally understood as one or more without explicit limitation or the contrary in the context.
In addition, the term "and/or" in this disclosure is merely an association relationship describing an association object, and indicates that three relationships may exist, such as a and/or B, and may indicate: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" in the present disclosure generally indicates that the front and rear association objects are an or relationship.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and that the same or similar features may be referred to each other, and for brevity, will not be described in detail.
Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
Embodiments of the present disclosure are applicable to electronic devices such as terminal devices, computer systems, servers, etc., which are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known terminal devices, computing systems, environments, and/or configurations that may be suitable for use with the terminal device, computer system, or server, include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network personal computers, minicomputer systems, mainframe computer systems, distributed cloud computing environments that include any of the above systems, and the like.
Electronic devices such as terminal devices, computer systems, servers, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc., that perform particular tasks or implement particular abstract data types. The computer system/server may be implemented in a distributed cloud computing environment. In a distributed cloud computing environment, tasks may be performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computing system storage media including memory storage devices.
Summary of the application
In the process of implementing the present disclosure, the inventors found that there are currently two main types of schemes for calibrating the conversion pose relationship between a radar and a camera: a calibration scheme based on a calibration plate and a calibration scheme based on a natural scene. The existing calibration-plate-based schemes fall into two categories: the first reconstructs the pose of the calibration plate in the radar and camera coordinate systems and determines the radar and camera external parameters by transforming the calibration plate point cloud into the camera coordinate system and optimizing the difference between its distance and the calibration plate distance computed in the camera coordinate system; the second extracts corner points or edge points of the calibration plate with the different sensors and solves the external parameters from them simultaneously. The natural-scene-based scheme solves for the sensor external parameters by matching geometric edge features, colors, laser reflectivity and the like in the natural environment as observed by the different sensors.
Schemes that optimize the distance from the radar point cloud to the calibration plate are easily affected by the camera internal parameters and the flatness of the calibration plate, so their robustness is poor; schemes that extract corner points suffer from large errors and poor precision in the corner points extracted from the radar point cloud, because of the limited vertical and horizontal angular resolution of the radar. Natural-scene-based schemes need a high-precision initial value, otherwise convergence is poor, which is difficult to satisfy in many practical applications; they also place high demands on the external environment, requiring obvious, easily extracted features with strong contrast, and are therefore difficult to apply in most scenes.
According to the conversion pose determining scheme between the radar and the camera, a plane equation and a center point of a calibration object are automatically extracted, the conversion pose between a radar coordinate system and a camera coordinate system is determined, and the external parameters between the radar coordinate system and the camera coordinate system are optimized by using the reprojection error of the center point of the calibration object; the automatic calibration process can be realized, the dependence on the state of a calibration object and the external environment is low, and the robustness is high.
Exemplary method
Fig. 1 is a flowchart of one embodiment of a method for determining a transition pose between a radar and a camera of the present disclosure, the method shown in fig. 1 including the steps of: s101, S102, and S103, the respective steps are described below.
S101, receiving first point cloud data acquired by a radar for a calibration object, and acquiring a first calibration object plane equation and a first center point coordinate of the calibration object and a calibration object center point under a radar coordinate system respectively based on the first point cloud data. The calibration object can take various forms, for example a round calibration plate; the calibration object plane equation is then the plane equation of the round calibration plate in the radar coordinate system, and the center point coordinates are the coordinates of the center of the round calibration plate in the radar coordinate system. The radar, which is mounted on the vehicle, may be a lidar or the like, and the first point cloud data may be three-dimensional lidar point cloud data or the like.
S102, receiving image data acquired by a camera for a calibration object, and obtaining a second calibration object plane equation and a second center point coordinate of the calibration object and a calibration object center point under a camera coordinate system respectively based on the image data. The camera may be a variety of cameras mounted on a vehicle, and the image data collected by the camera may be two-dimensional image data or the like.
And S103, determining the conversion pose between the radar coordinate system and the camera coordinate system according to the first normal vector of the first calibration object plane equation, the second normal vector of the second calibration object plane equation, the first center point coordinate and the second center point coordinate. Determining the transformed pose between the radar coordinate system and the camera coordinate system includes calibrating a rotation matrix and an initial translation vector between the radar coordinate system and the camera coordinate system, and the like, and various methods can be adopted.
In one embodiment, a plurality of synchronous modes such as hardware connection of the radar and the camera can be adopted to ensure the time consistency of the data collected by the radar and the camera. The calibration object is a disc-shaped planar calibration object, fig. 2 is a flowchart of one embodiment of processing radar point cloud data in the method for determining the conversion pose between radar and camera according to the present disclosure, and the method shown in fig. 2 includes the steps of: s201 to S206, the respective steps are described below.
S201, performing segmentation processing on the first point cloud data by adopting a deep learning method to obtain three-dimensional calibration object point cloud data corresponding to the calibration object.
Point cloud segmentation is performed on the first point cloud data by using a point cloud segmentation algorithm based on deep learning, and the three-dimensional calibration object point cloud data scanned by the radar on the calibration object is extracted. Deep-learning-based point cloud segmentation algorithms include the PointNet and PointNet++ algorithms and the like; the PointNet algorithm is a point cloud classification/segmentation deep learning algorithm proposed by Stanford University.
The point cloud data scanned by the radar on the calibration object is taken as positive samples, the point cloud data scanned on other objects is taken as negative samples, and a deep learning model is trained on them. The first point cloud data is then input into the trained deep learning model, which performs semantic segmentation and other processing on the point cloud and outputs the three-dimensional calibration object point cloud data corresponding to the calibration object.
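As a rough illustration of this step, the sketch below shows a minimal PointNet-style per-point binary segmentation network in PyTorch (a shared per-point MLP combined with a global max-pooled feature). The layer sizes, class labels and training details are illustrative assumptions and not the network actually used in this disclosure.

```python
import torch
import torch.nn as nn

class TinyPointSeg(nn.Module):
    """Minimal PointNet-style segmentation head: label each point as
    calibration-object (1) or background (0). Sizes are illustrative."""
    def __init__(self):
        super().__init__()
        self.point_mlp = nn.Sequential(              # shared per-point MLP
            nn.Conv1d(3, 64, 1), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.ReLU())
        self.seg_head = nn.Sequential(               # per-point + global feature
            nn.Conv1d(128 + 128, 128, 1), nn.ReLU(),
            nn.Conv1d(128, 2, 1))                    # 2 classes

    def forward(self, xyz):                          # xyz: (B, 3, N)
        feat = self.point_mlp(xyz)                   # (B, 128, N)
        glob = feat.max(dim=2, keepdim=True).values  # global max pool, (B, 128, 1)
        glob = glob.expand(-1, -1, xyz.shape[2])     # broadcast to every point
        return self.seg_head(torch.cat([feat, glob], dim=1))  # (B, 2, N)

# Training-step sketch: points on the calibration object are positive samples,
# points on other objects are negative samples (placeholder tensors below).
model = TinyPointSeg()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
points = torch.randn(4, 3, 1024)                     # placeholder batch of scans
labels = torch.randint(0, 2, (4, 1024))              # placeholder per-point labels
loss = loss_fn(model(points), labels)
loss.backward()
optim.step()
```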
S202, clustering is carried out on the three-dimensional calibration object point cloud data, and a point cloud cluster, and a feature vector and a feature value corresponding to the point cloud cluster are obtained.
And obtaining a feature vector corresponding to the minimum feature value of the point cloud cluster, determining the feature vector as a plane normal vector, and determining the first point cloud data as invalid data if the included angle between the plane normal vector and the normal vector of the surface of the preset calibration object is larger than a preset included angle threshold value.
In one embodiment, clustering is performed on three-dimensional calibration object point cloud data to obtain a point cloud cluster, a covariance matrix of the point cloud cluster is calculated, a characteristic value and a characteristic vector of the covariance matrix are calculated, and if the ratio of the minimum characteristic value to the second minimum characteristic value is smaller than a preset threshold value, the cluster is considered to be a plane. And determining the feature vector corresponding to the minimum feature value as the normal vector corresponding to the point cloud cluster to obtain the placement posture of the calibration object, and judging the plane orientation range of the calibration object approximately.
In one embodiment, the XOY plane of the radar coordinate system is parallel to the ground plane and the disc-shaped calibration object is placed parallel to the ground plane, so that the normal vector of the disc-shaped calibration object is approximately (0, 0, 1). The included angle between the normal vector corresponding to the point cloud cluster and the calibration object normal vector (0, 0, 1) is calculated, and if the included angle is larger than a preset angle threshold value, the detection result is discarded.
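To make the planarity and orientation checks concrete, the following sketch (with assumed threshold values and an assumed expected normal) computes the covariance eigen-decomposition of a point cloud cluster, takes the eigenvector of the smallest eigenvalue as the plane normal, and rejects clusters whose smallest-to-second-smallest eigenvalue ratio or normal angle is out of range.

```python
import numpy as np

def check_cluster_plane(cluster_xyz, expected_normal=(0.0, 0.0, 1.0),
                        ratio_thresh=0.1, angle_thresh_deg=30.0):
    """cluster_xyz: (N, 3) points of one cluster. Thresholds are assumptions."""
    cov = np.cov(cluster_xyz.T)                     # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    if eigvals[0] / eigvals[1] >= ratio_thresh:     # not flat enough -> not a plane
        return None
    normal = eigvecs[:, 0]                          # eigenvector of smallest eigenvalue
    expected = np.asarray(expected_normal, dtype=float)
    cos_angle = abs(normal @ expected) / (np.linalg.norm(normal) * np.linalg.norm(expected))
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    if angle > angle_thresh_deg:                    # orientation inconsistent with setup
        return None
    return normal, eigvals, eigvecs
```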
And S203, fitting the point cloud cluster by using a preset first fitting algorithm to obtain a first calibration object plane equation. The first fitting algorithm comprises a RANSAC algorithm and the like, and the first calibration object plane equation is a calibration object plane equation under a radar coordinate system. The RANSAC (Random Sample Consensus) algorithm can calculate mathematical model parameters of data according to a group of sample data sets containing abnormal data, and obtain effective sample data.
And S204, projecting the point cloud cluster to a plane corresponding to the first calibration object plane equation according to the characteristic value and the characteristic vector to obtain two-dimensional point cloud data.
And S205, fitting the two-dimensional point cloud data by using a preset second fitting algorithm to obtain a fitting circle.
The second fitting algorithm includes a minimum bounding circle (Smallest Enclosing Disks) algorithm, or the like. And if the difference value between the radius of the fitting circle and the preset calibration object radius is determined to be outside the range of the difference value threshold value interval, determining the three-dimensional calibration object point cloud data as invalid data.
S206, converting the two-dimensional coordinates of the fitted circle center into three-dimensional first center point coordinates in a radar coordinate system.
In one embodiment, a feature value and a feature vector are obtained, and the point cloud cluster is projected onto a plane corresponding to the first calibration object plane equation to obtain two-dimensional point cloud data. And using two-dimensional point cloud data to fit a circle, comparing the radius of the fitted circle with the radius of a real known calibration object, and if the difference value between the two is smaller than a preset difference value threshold, considering that the extraction is successful.
For a three-dimensional point cloud scanning point (X, Y, Z) of the point cloud cluster and the feature vector matrix M, the following correspondence holds:

[x, y, z]^T = M · [X, Y, Z]^T   (1-1)

where (x, y) are the two-dimensional coordinates of the three-dimensional point cloud scanning point projected into the plane corresponding to the first calibration object plane equation. The calibration object can be approximately regarded as a plane, so the z value is approximately the average z coordinate of all three-dimensional point cloud scanning points of the point cloud cluster:

z ≈ (1/N) · Σ_k z_k   (1-2)

The relationship between the two-dimensional coordinates (x_c, y_c) of the fitted circle center and its three-dimensional coordinates is:

[X_c, Y_c, Z_c]^T = M^{-1} · [x_c, y_c, z]^T   (1-3)

Through formula (1-3), the two-dimensional coordinates of the center of the fitted circle can be converted into the three-dimensional first center point coordinates (X_c, Y_c, Z_c) in the radar coordinate system.
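The following sketch ties formulas (1-1) to (1-3) together: project the cluster onto the fitted plane, fit a circle, check the radius, and map the circle center back to the radar frame. A least-squares circle fit is used here as a stand-in for the minimum enclosing circle mentioned above, and the radius tolerance is an assumed value.

```python
import numpy as np

def cluster_center_in_radar_frame(cluster_xyz, eigvecs, radius_true, radius_tol=0.05):
    """Project the cluster onto the fitted plane, fit a circle, and map the
    circle center back to 3D radar coordinates (formulas (1-1)-(1-3))."""
    M = eigvecs[:, ::-1].T                          # rows: eigenvectors, largest eigenvalue first
    pts = cluster_xyz @ M.T                         # per point: (x, y, z) = M (X, Y, Z)^T
    x, y = pts[:, 0], pts[:, 1]                     # in-plane coordinates
    z = pts[:, 2].mean()                            # out-of-plane coordinate, ~ constant (1-2)

    # Algebraic circle fit: x^2 + y^2 + D*x + E*y + F = 0
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x ** 2 + y ** 2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    xc, yc = -D / 2.0, -E / 2.0
    r = np.sqrt(xc ** 2 + yc ** 2 - F)
    if abs(r - radius_true) > radius_tol:           # radius check against the known disc
        return None
    center_proj = np.array([xc, yc, z])
    return np.linalg.inv(M) @ center_proj           # back to the radar frame, formula (1-3)
```

Because the eigenvector matrix is orthonormal, the inverse in the last line is simply its transpose; it is written as an inverse to mirror formula (1-3).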
Fig. 3 is a flowchart of one embodiment of motion compensation for radar point cloud data in a method of determining a transition pose between a radar and a camera of the present disclosure, the method shown in fig. 3 comprising the steps of: s301, S302, and S303, each of which is described below.
S301, acquiring the position and normal vector of the calibration object center corresponding to each frame of three-dimensional calibration object point cloud data and the acquisition time difference of two adjacent frames of three-dimensional calibration object point cloud data.
S302, calculating the speed and the angular speed of the movement of the calibration object based on the position and the normal vector of the center of the calibration object and the acquisition time difference.
And S303, performing motion compensation processing on the three-dimensional calibration object point cloud data of each frame according to the speed and the angular speed and the acquisition time difference.
In one embodiment, for three-dimensional calibration object point cloud data of each frame, according to the position and normal vector of the center of the calibration object between adjacent frames, calculating the speed and angular speed of the motion of the calibration object, according to the speed and angular speed and the acquisition time difference, performing motion compensation processing on the three-dimensional calibration object point cloud data of each frame, removing distortion generated by the motion, and performing calibration processing again by using the three-dimensional calibration object point cloud data after the motion compensation processing.
The speed of the calibration object motion is v = dx/dt and the angular speed is w = dθ/dt, where dx is the translation of the calibration object center position between two adjacent frames of three-dimensional calibration object point cloud data, dθ is the angle between the normal vectors of the two adjacent frames of three-dimensional calibration object point cloud data, and dt is the acquisition time difference between the two adjacent frames.
Let the scanning time of the laser scanning point x_0 corresponding to the first frame of three-dimensional calibration object point cloud data be t_0, and the scanning time of the laser scanning point x_i corresponding to the i-th frame of three-dimensional calibration object point cloud data be t_i. During this period the calibration object motion can be represented as a rotation θ = w·(t_i − t_0) and a translation t = v·(t_i − t_0). The laser scanning point corresponding to the three-dimensional calibration object point cloud data after motion compensation is x'_i = R^{-1}·x_i − R^{-1}·t, where R is the rotation matrix corresponding to the rotation, which can be obtained by converting the rotation angle θ.
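A minimal numeric sketch of this compensation is given below. The rotation is assumed to be about a known unit axis (for example the plane normal) and the velocity estimate is taken directly from two adjacent frame centers and normals; these are illustrative assumptions, not requirements of the disclosure.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def estimate_motion(c0, c1, n0, n1, dt):
    """Velocity estimates from two adjacent frames (centers c0, c1; normals n0, n1)."""
    v = (c1 - c0) / dt                               # v = dx / dt (3-vector)
    cos_a = np.clip(np.dot(n0, n1) / (np.linalg.norm(n0) * np.linalg.norm(n1)), -1, 1)
    w = np.arccos(cos_a) / dt                        # w = dtheta / dt (scalar angular speed)
    return v, w

def motion_compensate(points_i, t_i, t_0, v, w, axis):
    """Undo calibration-object motion between t_0 and t_i:
    x'_i = R^{-1} x_i - R^{-1} t, with theta = w*(t_i - t_0), t = v*(t_i - t_0).
    'axis' (unit rotation axis) is an assumption of this sketch."""
    dt = t_i - t_0
    theta = w * dt                                   # accumulated rotation angle
    trans = v * dt                                   # accumulated translation (3-vector)
    R = Rotation.from_rotvec(theta * np.asarray(axis)).as_matrix()
    return (points_i - trans) @ R.T                  # per point: R^{-1} x_i - R^{-1} t
```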
FIG. 4 is a flowchart of one embodiment of determining a transition pose between a radar and a camera in a transition pose determination method between a radar and a camera of the present disclosure, the method shown in FIG. 4 comprising the steps of: s401 and S402, the respective steps are described below.
S401, determining an initial conversion pose between a radar coordinate system and a camera coordinate system according to a first normal vector of a first calibration object plane equation, a second normal vector of a second calibration object plane equation, a first center point coordinate and a second center point coordinate.
The initial transition pose includes a rotation matrix R, a translation vector t, and the like. There are a number of ways to determine the initial rotation matrix between the radar coordinate system and the camera coordinate system based on the first normal vector and the second normal vector. In one embodiment, a normal vector angle difference constraint relationship between a first normal vector and a second normal vector is obtained, a first objective function is established based on the normal vector angle difference constraint relationship, and an initial rotation matrix is obtained by minimizing the first objective function.
There are a number of ways to determine the initial translation vector between the radar coordinate system and the camera coordinate system based on the first center point coordinates and the second center point coordinates. In one embodiment, a distance constraint relationship between the positions of the calibration object center point in the lidar coordinate system and in the camera coordinate system is obtained from the first center point coordinate and the second center point coordinate, a second objective function is established based on the distance constraint relationship, and the initial translation vector is obtained based on the second objective function.
S402, determining the conversion pose between the radar coordinate system and the camera coordinate system according to the initial conversion pose, the first center point coordinate and the second center point coordinate. The determination of the shift position based on the initial shift position obtained in advance or the like may employ various methods.
In one embodiment, the position of the vehicle body is kept unchanged, the position and posture of the calibration object are changed, and several groups of images and corresponding point cloud data are acquired. In the i-th image, the rotation matrix and translation vector of the calibration object relative to the camera coordinate system are R_ci and T_ci, the calibration object normal vector (the second normal vector of the second calibration object plane equation) is θ_ci, a 1×3 vector, and the distance between the calibration object plane and the camera origin is α_ci; the second calibration object plane equation is then:

θ_ci · X = α_ci   (1-4)

where X is a three-dimensional point on the calibration object plane expressed in the camera coordinate system.
The first calibration object plane equation in the radar coordinate system can be obtained by fitting with the least squares principle or the RANSAC algorithm:

θ_li · X = α_li   (1-5)

where θ_li is the calibration object normal vector (the first normal vector of the first calibration object plane equation) in the radar coordinate system, and α_li is the distance between the calibration object plane and the origin of the radar coordinate system.
For the rotation matrix R, an optimal estimate can be obtained by minimizing the angle difference between the calibration object plane normal vectors observed in an image and in the corresponding frame of radar point cloud data, that is, the difference between θ_c and θ_l. The objective function can take the form of maximizing the sum of the cosines of these angles:

R_1 = argmax_R Σ_i θ_ci · R · θ_li^T   (1-6)

In the above formula, R is a rotation matrix and is an orthogonal matrix satisfying R·R^T = I and det(R) = 1, so the objective can equivalently be written as:

R_1 = argmax_R tr(R · Σ_i θ_li^T · θ_ci)   (1-7)

Based on equation (1-7), the problem is converted into an orthogonal Procrustes problem (OPP), whose solution has the form:

R_1 = V·U^T   (1-8)

where U·Σ·V^T = Σ_i θ_li^T · θ_ci is the corresponding singular value decomposition.
For the translation vector t, the objective function can be obtained from the constraint relating the calibration plate center in the radar coordinate system to the calibration plate plane in the camera coordinate system:

t_1 = argmin_t Σ_i (θ_ci · (R_1 · p_li + t) − α_ci)^2   (1-9)

Its analytical solution is:

t_1 = (Σ_i θ_ci^T · θ_ci)^{-1} · Σ_i θ_ci^T · (α_ci − θ_ci · R_1 · p_li)   (1-10)

where p_li is the coordinate of the calibration plate center in the lidar coordinate system in the i-th frame.
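The closed-form initial estimate described by formulas (1-6) to (1-10) can be sketched as follows. The normals are assumed to be unit row vectors, and the reflection/sign handling of the SVD solution is omitted, as in the text above.

```python
import numpy as np

def initial_extrinsics(theta_c, theta_l, alpha_c, p_l):
    """theta_c, theta_l: (n, 3) plane normals in the camera / radar frames;
    alpha_c: (n,) plane-to-camera-origin distances;
    p_l: (n, 3) calibration object centers in the radar frame."""
    # Rotation: maximize sum_i theta_ci * R * theta_li^T -> orthogonal Procrustes
    H = theta_l.T @ theta_c                        # 3x3, equals sum_i theta_li^T theta_ci
    U, _, Vt = np.linalg.svd(H)
    R1 = Vt.T @ U.T                                # R_1 = V U^T, formula (1-8)

    # Translation: least squares on theta_ci (R1 p_li + t) = alpha_ci
    A = theta_c                                    # row i is theta_ci
    b = alpha_c - np.einsum('ij,ij->i', theta_c, p_l @ R1.T)  # alpha_ci - theta_ci R1 p_li
    t1, *_ = np.linalg.lstsq(A, b, rcond=None)     # normal equations match formula (1-10)
    return R1, t1
```

The least-squares call solves (Σ θ_ci^T θ_ci)·t = Σ θ_ci^T (α_ci − θ_ci R_1 p_li), which is exactly the analytical solution of formula (1-10).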
Fig. 5 is a flowchart of one embodiment of determining a transition pose between a radar and a camera from an initial transition pose in a transition pose determination method between a radar and a camera of the present disclosure, the method as shown in fig. 5 comprising the steps of: s501, S502, and S503, each of which is described below.
S501, according to the camera internal parameters, the second center point coordinates are projected into an image plane, and the third center point coordinates are determined.
In one embodiment, two-dimensional image data acquired by a camera is obtained, and a calibration object in the image is detected. The angular point positions in the images are extracted to obtain internal parameters and calibration object parameters of the camera, the internal parameters of the camera are divided into an internal parameter matrix, a distortion parameter matrix and the like, and the internal parameters of the camera can be calibrated by using a Zhang Zhengyou calibration method and the like. And obtaining camera external parameters by using PnP and other algorithms, and obtaining a second calibration object plane equation (three-dimensional plane equation) and a second center point coordinate (three-dimensional coordinate) of the calibration object and the calibration object center point under a camera coordinate system respectively.
The second center point coordinate of the calibration object center point in the camera coordinate system is projected onto the image plane through the camera internal parameters to obtain the third center point coordinate (a two-dimensional coordinate) of the calibration object center point in the image, the projection formula being x = K·(R·X + t), where x represents the third center point coordinate of the calibration object center point, K represents the camera internal parameters, R and t are respectively the rotation matrix and translation vector in the external parameters between the calibration object and the camera, and X represents the second center point coordinate of the calibration object center point in the camera coordinate system obtained with an algorithm such as PnP.
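A small sketch of the projection x = K(RX + t) used to obtain the third center point, with the perspective division made explicit; the function and variable names are illustrative.

```python
import numpy as np

def project_center(K, R, t, X):
    """Project a 3D calibration-object center X into pixel coordinates.
    K: 3x3 intrinsic matrix, R: 3x3 rotation, t: 3-vector, X: 3-vector."""
    p = K @ (R @ X + t)          # homogeneous image point, x = K (R X + t)
    return p[:2] / p[2]          # perspective division to pixel coordinates (u, v)
```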
S502, according to the initial conversion pose, the first center point coordinate is projected into an image plane, and the fourth center point coordinate is determined.
S503, determining the conversion pose based on the distance between the third center point coordinate and the fourth center point coordinate. Various methods can be used to determine the conversion pose from this distance, for example determining the conversion pose between the radar and the camera via a reprojection error.
Fig. 6 is a flowchart of one embodiment of determining a transition pose between a radar and a camera based on a re-projection error in a transition pose determination method between a radar and a camera of the present disclosure. The method as shown in fig. 6 comprises the steps of: s601 and S602, the respective steps are described below.
S601, establishing a reprojection error function according to the distance difference between the third center point coordinate and the fourth center point coordinate;
S602, optimizing the initial conversion pose by minimizing the reprojection error function.
In one embodiment, the optimization of the initial rotation matrix R_1 and translation vector t_1 can be achieved by minimizing the reprojection error of the calibration object center point:

min_{R,t} Σ_{i=1}^{n} || u_i − K·(R·p_li + t) ||^2   (1-11)

where n represents the total number of frames of calibration object point cloud data participating in calibration, u_i represents the 2D center point (third center point) of the calibration object obtained in the image, p_li represents the center point (fourth center point) of the calibration object calculated from the radar point cloud, K represents the camera internal parameters, and R and t represent the quantities to be optimized, namely the rotation matrix and translation vector in the external parameters between the radar and the camera.
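One way to carry out the minimization of formula (1-11) is a generic nonlinear least-squares refinement, sketched below with scipy; parameterizing the rotation as a rotation vector is an implementation choice of this sketch, not mandated by the disclosure.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_extrinsics(K, R1, t1, p_l, u):
    """Refine (R, t) by minimizing the center-point reprojection error.
    K: 3x3 intrinsics; p_l: (n, 3) radar-frame centers; u: (n, 2) image centers."""
    def residuals(params):
        R = Rotation.from_rotvec(params[:3]).as_matrix()
        t = params[3:]
        proj = (p_l @ R.T + t) @ K.T               # rows: K (R p_li + t)
        proj = proj[:, :2] / proj[:, 2:3]          # perspective division
        return (proj - u).ravel()                  # stacked 2D reprojection errors

    x0 = np.concatenate([Rotation.from_matrix(R1).as_rotvec(), np.asarray(t1, float)])
    sol = least_squares(residuals, x0)             # nonlinear least-squares solve
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```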
FIG. 7 is a flowchart of one embodiment of a process for verifying a transition pose in a transition pose determination method between a radar and a camera of the present disclosure, the method shown in FIG. 7 comprising the steps of: s701 and S702, the respective steps are described below.
S701, using the first coordinate set and optimizing the initial rotation matrix and the initial translation vector by minimizing the reprojection error function, to obtain a second rotation matrix and a second translation vector.
S702, using the second coordinate set and performing checking processing on the second rotation matrix and the second translation vector through a reprojection error function.
The first coordinate set and the second coordinate set comprise a plurality of groups of first center point coordinates and second center point coordinates, the second coordinate set, the second rotation matrix and the second translation vector are used for obtaining function values of a plurality of re-projection error functions, and if the average value of the function values is smaller than a preset error threshold value or if the number of the function values smaller than the error threshold value is larger than a preset number threshold value, the second rotation matrix and the second translation vector are determined to be a third rotation matrix and a third translation vector which are finally calibrated.
In one embodiment, m frames of radar point cloud and image data are synchronously acquired in the process of calibrating the conversion pose between the radar coordinate system and the camera coordinate system, and a reprojection error threshold, a model frame number threshold and a maximum number of iterations are set. Following the RANSAC approach, n frames of radar point cloud and image data are randomly drawn from the m frames each time and calibration processing is performed on them to obtain the calibration parameters (the rotation matrix R and translation vector t); the average reprojection error of the remaining m−n frames of data is then calculated with the obtained calibration parameters (R, t):

e = (1/(m−n)) · Σ_i || u_i − K·(R·p_li + t) ||
Counting the number of frames with the re-projection error smaller than the threshold value, and repeating the sampling-calibration iterative process until the number of frames with the re-projection error smaller than the threshold value is larger than the threshold value of the model frame number or the maximum number of iterations is reached, so as to obtain the final calibration parameters.
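The sampling-calibration-verification loop can be sketched as below. The helpers `calibrate` and `reprojection_error` stand for the routines described above, and the threshold values are placeholders.

```python
import numpy as np

def ransac_calibrate(frames, n_sample, err_thresh, inlier_frames_thresh, max_iters,
                     calibrate, reprojection_error):
    """frames: list of per-frame data (image center, radar center, ...).
    calibrate(subset) -> (R, t); reprojection_error(frame, R, t) -> float."""
    best = None
    for _ in range(max_iters):
        idx = np.random.choice(len(frames), n_sample, replace=False)
        subset = [frames[i] for i in idx]
        rest = [frames[i] for i in range(len(frames)) if i not in set(idx)]
        R, t = calibrate(subset)                           # calibrate on the sample
        errs = np.array([reprojection_error(f, R, t) for f in rest])
        inliers = int((errs < err_thresh).sum())           # frames with small error
        if best is None or inliers > best[0]:
            best = (inliers, R, t)
        if inliers > inlier_frames_thresh:                 # enough consistent frames
            break
    return best[1], best[2]
```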
FIG. 8 is a flowchart of one embodiment of optimizing a transition pose in a method for determining a transition pose between a radar and a camera according to the present disclosure, the method shown in FIG. 8 including the steps of: s801 to S805, the respective steps are described below.
S801, extracting a target object image with preset edge characteristics from an image acquired by a camera, performing edge detection on the target object image, and performing fitting processing to obtain an edge image in the target object image.
S802, second point cloud data obtained by scanning the target object by the radar is obtained, and edge point cloud data corresponding to the edge of the target object is extracted from the second point cloud data.
S803, converting the edge point cloud based on the third rotation matrix and the third translation vector.
S804, constructing a distance error function between the converted edge point cloud data and the corresponding pixel point data positioned on the edge straight line image.
And S805, optimizing the third rotation matrix and the third translation vector by minimizing an error function.
In one embodiment, the external parameters such as the rotation matrix and the translation vector can be further optimized online, that is, the external parameters are calibrated online during actual use. An image acquired by the camera is obtained, semantic segmentation is performed on the image using a deep-learning-based semantic segmentation algorithm such as FCN or SegNet, and a target object image with preset edge characteristics is extracted; the target object can be an object with obvious straight-line edges, such as a lane line or a lamp post, or a circular calibration plate.
And performing edge detection and fitting treatment on the target object image to obtain an edge image in the target object image. For example, edge detection is performed on an object region, edges of objects in an image are extracted, and 2D edge lines are fitted. Obtaining second point cloud data obtained by scanning the target object by the radar, extracting edge point cloud data corresponding to the edge of the target object from the second point cloud data, projecting edge points onto a 2D image, constructing an error function of a point-to-edge linear distance, optimizing and minimizing the error function, and optimizing external parameters in real time. The projection formula:
x=K(RX+t) (1-14);
where x represents the coordinates of the point projected into the image, K represents the camera internal parameters, R and t are respectively the rotation matrix and translation vector of the calibrated radar-camera external parameters, and X represents the coordinates of the laser scanning point. According to formula (1-14), a one-to-one correspondence between radar scanning points and image pixel points can be obtained for pixel-level registration, and the point cloud scanned on the target object can be obtained by combining the semantic segmentation results.
A distance error function is constructed for the optimization:

min_{R,t} Σ_{i=1}^{n} Σ_{j=1}^{m} Σ_{k=1}^{l} (a_ij·x_ijk + b_ij·y_ijk + c_ij)^2 / (a_ij^2 + b_ij^2)   (1-15)

where (x_ijk, y_ijk) is the pixel obtained by projecting the lidar point X_ijk with formula (1-14), n represents the number of image frames, m represents the number of lines in an image, l represents the number of points on a line, K represents the camera internal parameters, R and t represent the rotation matrix and translation vector in the lidar-camera external parameters, a_ij, b_ij, c_ij represent the line parameters of the j-th line in the i-th frame image (a_ij·x + b_ij·y + c_ij = 0), and X_ijk represents the lidar 3D scanning point corresponding to the k-th point on the j-th line in the i-th frame image.
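A sketch of the point-to-line reprojection cost of formula (1-15) is given below; the line parameters are assumed to describe a_ij·x + b_ij·y + c_ij = 0 in pixel coordinates, and the inner `project` helper is the projection of formula (1-14).

```python
import numpy as np

def line_distance_cost(K, R, t, lines, points):
    """lines: list of (a, b, c) tuples, one per fitted 2D edge line;
    points: list of (n_j, 3) arrays of lidar points matched to each line."""
    def project(X):                                  # x = K (R X + t), formula (1-14)
        p = (X @ R.T + t) @ K.T
        return p[:, :2] / p[:, 2:3]

    cost = 0.0
    for (a, b, c), X in zip(lines, points):
        uv = project(X)                              # pixel coordinates of the edge points
        d = (a * uv[:, 0] + b * uv[:, 1] + c) / np.hypot(a, b)  # signed point-to-line distance
        cost += float(np.sum(d ** 2))
    return cost
```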
The method for determining the conversion pose between the radar and the camera can automatically extract the plane equation and the center point of the calibration object, determine the conversion pose between the radar coordinate system and the camera coordinate system, and optimize the external parameters between the radar coordinate system and the camera coordinate system by using the reprojection error of the center point of the calibration object; the automation of the calibration process can be realized, the manual participation of the calibration process is reduced, and the high automation of the calibration process is realized; the dependence on the state of a calibration object and the external environment is low, and the calibration result is stable and the robustness is high; the pixel level fusion of radar data and image data can be realized, and the calibration precision is high.
Exemplary apparatus
In one embodiment, as shown in fig. 9, the present disclosure provides an apparatus for determining a conversion pose between a radar and a camera, including: a first data processing module 910, a second data processing module 920, and a transition pose determination module 930. The first data processing module 910 receives point cloud data acquired by the radar for the calibration object, and obtains a first calibration object plane equation and a first center point coordinate of the calibration object and the calibration object center point under a radar coordinate system respectively based on the point cloud data. The second data processing module 920 receives image data acquired by the camera for the calibration object, and obtains a second calibration object plane equation and a second center point coordinate of the calibration object and the calibration object center point under the camera coordinate system respectively based on the image data. The conversion pose determination module 930 determines a conversion pose between the radar coordinate system and the camera coordinate system according to the first normal vector of the first calibration object plane equation, the second normal vector of the second calibration object plane equation, the first center point coordinate and the second center point coordinate.
As shown in fig. 10, the conversion pose determination module 930 includes: a first determination unit 931, a second determination unit 932, and an inspection processing unit 933. The first determining unit 931 determines an initial conversion pose between the radar coordinate system and the camera coordinate system from the first normal vector of the first calibration object plane equation, the second normal vector of the second calibration object plane equation, the first center point coordinate and the second center point coordinate. The second determination unit 932 determines a transition pose between the radar coordinate system and the camera coordinate system based on the initial transition pose, the first center point coordinate, and the second center point coordinate.
The first determining unit 931 may determine an initial rotation matrix between the radar coordinate system and the camera coordinate system according to the first normal vector and the second normal vector, and determine an initial translation vector between the radar coordinate system and the camera coordinate system according to the first center point coordinate and the second center point coordinate. The first determining unit 931 obtains a normal vector angle difference constraint relationship between the first normal vector and the second normal vector, and establishes a first objective function based on the normal vector angle difference constraint relationship; an initial rotation matrix is obtained by minimizing the first objective function.
The first determining unit 931 may obtain a distance constraint relationship between two positions of the calibration object center point in the laser radar coordinate system and the camera coordinate system according to the first center coordinate and the second center coordinate, establish a second objective function based on the distance constraint relationship, and obtain an initial translation vector based on the second objective function.
In one embodiment, the second determination unit 932 projects the second center point coordinates into the image plane, determining third center point coordinates, based on camera parameters. The second determining unit 932 projects the first center point coordinates into the image plane according to the initial transition pose, determines fourth center point coordinates, and determines the transition pose based on a distance between the third center point and the fourth center point coordinates. The second determining unit 932 may establish a reprojection error function according to a distance difference between the third center point coordinate and the fourth center point coordinate, and perform optimization processing on the initial conversion pose by minimizing the reprojection error function.
The initial conversion pose includes an initial rotation matrix, an initial translation vector, and the like. The inspection processing unit 933 obtains a second rotation matrix and a second translation vector by optimizing the initial rotation matrix and the initial translation vector using the first coordinate set and by minimizing the reprojection error function. The inspection processing unit 933 performs inspection processing on the second rotation matrix and the second translation vector using the second coordinate set and through the reprojection error function. The first coordinate set and the second coordinate set include: a plurality of sets of first center point coordinates and second center point coordinates.
In one embodiment, the verification processing unit 933 obtains a plurality of function values of the reprojection error function using the second coordinate set, the second rotation matrix, and the second translation vector. If the average of the function values is smaller than a preset error threshold, or if the number of function values smaller than the error threshold is larger than a preset number threshold, the verification processing unit 933 determines the second rotation matrix and the second translation vector as the finally calibrated third rotation matrix and third translation vector.
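A minimal sketch of this verification step, reusing the pinhole projection from the previous sketch, might look as follows; the threshold parameters are placeholders for the patent's preset error and number thresholds.

```python
import numpy as np

def reprojection_errors(K, centers_radar, centers_camera, R, t):
    """Per-pose pixel distance between projected camera-frame centers and
    projected, transformed radar-frame centers (sketch)."""
    P = np.asarray(centers_radar, dtype=float)
    Q = np.asarray(centers_camera, dtype=float)
    q = Q @ K.T
    uv_cam = q[:, :2] / q[:, 2:3]
    p = (P @ R.T + t) @ K.T
    uv_radar = p[:, :2] / p[:, 2:3]
    return np.linalg.norm(uv_radar - uv_cam, axis=1)

def passes_verification(errors, error_threshold, count_threshold):
    """Accept (R, t) if the held-out errors are small on average, or if enough
    individual errors fall below the threshold (the two acceptance criteria)."""
    errors = np.asarray(errors, dtype=float)
    return errors.mean() < error_threshold or np.sum(errors < error_threshold) > count_threshold
```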
The verification processing unit 933 extracts a target object image having preset edge features from an image acquired by the camera, performs edge detection on the target object image, and performs fitting processing to obtain an edge line image in the target object image. The verification processing unit 933 obtains second point cloud data acquired by the radar scanning the target object, and extracts edge point cloud data corresponding to the edge of the target object from the second point cloud data. The verification processing unit 933 converts the edge point cloud data based on the third rotation matrix and the third translation vector, and constructs a distance error function between the converted edge point cloud data and the corresponding pixel point data located on the edge line image. The verification processing unit 933 then optimizes the third rotation matrix and the third translation vector by minimizing the distance error function.
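One possible reading of this edge-based refinement is sketched below, with OpenCV's Canny edge detector and line fitting standing in for the unspecified edge detection and fitting steps, and assuming an 8-bit grayscale image with a single dominant edge line; all names and thresholds are illustrative.

```python
import numpy as np
import cv2
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def fit_edge_line(gray_image):
    """Detect edges in the target-object image and fit a 2-D line to them,
    returned as (a, b, c) with a*u + b*v + c = 0 and a^2 + b^2 = 1 (sketch)."""
    edges = cv2.Canny(gray_image, 50, 150)
    pts = np.column_stack(np.nonzero(edges)[::-1]).astype(np.float32)  # (u, v) pixels
    vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    a, b = vy, -vx                      # unit normal of the fitted direction (vx, vy)
    return a, b, -(a * x0 + b * y0)

def refine_with_edges(K, edge_points_radar, line_abc, R0, t0):
    """Refine (R0, t0) by minimizing the distance between projected radar edge
    points and the fitted image edge line (illustrative sketch)."""
    a, b, c = line_abc
    P = np.asarray(edge_points_radar, dtype=float)

    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        p = (P @ R.T + x[3:]) @ K.T
        uv = p[:, :2] / p[:, 2:3]
        return a * uv[:, 0] + b * uv[:, 1] + c   # signed point-to-line pixel distances

    x0 = np.hstack([Rotation.from_matrix(R0).as_rotvec(), t0])
    sol = least_squares(residuals, x0)
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```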
In one embodiment, as shown in fig. 11, the first data processing module 910 includes: a point cloud obtaining unit 911, a point cloud processing unit 912, a fitting processing unit 913, and a data conversion unit 914. The point cloud obtaining unit 911 segments the first point cloud data by using a deep learning method to obtain three-dimensional calibration object point cloud data corresponding to the calibration object. The point cloud processing unit 912 clusters the three-dimensional calibration object point cloud data to obtain a point cloud cluster, together with the feature vectors and feature values corresponding to the point cloud cluster.
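The clustering and eigen-analysis step could, for instance, be realized as in the following sketch, where DBSCAN stands in for the patent's unspecified clustering algorithm and the feature vectors/values are obtained as eigenvectors/eigenvalues of each cluster's covariance matrix; this is an assumption-laden illustration, not the disclosed implementation.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_board_points(board_points, eps=0.05, min_samples=20):
    """Cluster the segmented calibration-board points and compute each cluster's
    PCA eigenvalues and eigenvectors (sketch; parameters are illustrative)."""
    P = np.asarray(board_points, dtype=float)
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(P)
    clusters = []
    for lab in set(labels) - {-1}:                    # -1 marks DBSCAN noise points
        C = P[labels == lab]
        eigvals, eigvecs = np.linalg.eigh(np.cov((C - C.mean(axis=0)).T))
        clusters.append((C, eigvals, eigvecs))
    return clusters
```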
The fitting processing unit 913 fits the point cloud cluster using a preset first fitting algorithm to obtain the first calibration object plane equation. The fitting processing unit 913 then projects the point cloud cluster onto the plane corresponding to the first calibration object plane equation according to the feature values and feature vectors to obtain two-dimensional point cloud data, and fits the two-dimensional point cloud data using a preset second fitting algorithm to obtain a fitted circle. The data conversion unit 914 converts the two-dimensional coordinates of the fitted circle's center into the three-dimensional first center point coordinate in the radar coordinate system.
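A sketch of the plane fitting, in-plane projection, and circle fitting chain is given below, using a PCA plane fit in place of the "first fitting algorithm" and an algebraic (Kasa) circle fit in place of the "second fitting algorithm"; both choices are illustrative stand-ins, and RANSAC-style variants would serve equally well.

```python
import numpy as np

def fit_board_plane_and_center(cluster):
    """Fit a plane to the board cluster, project the points into that plane,
    fit a circle in 2-D, and return the plane (normal, d), the 3-D circle
    center in the radar frame, and the fitted radius (sketch)."""
    P = np.asarray(cluster, dtype=float)          # (N, 3) board points in the radar frame
    centroid = P.mean(axis=0)
    # PCA: the eigenvector of the smallest covariance eigenvalue is the plane normal.
    eigvals, eigvecs = np.linalg.eigh(np.cov((P - centroid).T))
    normal = eigvecs[:, 0]                        # eigh returns eigenvalues in ascending order
    d = -normal @ centroid                        # plane: normal . x + d = 0
    # In-plane basis from the two largest principal directions.
    u_axis, v_axis = eigvecs[:, 2], eigvecs[:, 1]
    uv = np.column_stack([(P - centroid) @ u_axis, (P - centroid) @ v_axis])
    # Algebraic (Kasa) circle fit: 2*cu*u + 2*cv*v + k = u^2 + v^2.
    A = np.column_stack([2 * uv, np.ones(len(uv))])
    b = (uv ** 2).sum(axis=1)
    (cu, cv, k), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(k + cu ** 2 + cv ** 2)
    center_3d = centroid + cu * u_axis + cv * v_axis   # "first center point" in 3-D
    return (normal, d), center_3d, radius
```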
The point cloud processing unit 912 obtains the feature vector corresponding to the minimum feature value of the point cloud cluster and determines it as the plane normal vector; if the included angle between this plane normal vector and the normal vector of the preset calibration object surface is greater than a preset included angle threshold, the first point cloud data is determined to be invalid. If the difference between the radius of the fitted circle and the preset calibration object radius falls outside the difference threshold interval, the point cloud obtaining unit 911 determines that the three-dimensional calibration object point cloud data is invalid.
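These two plausibility checks can be expressed compactly as in the sketch below; the angle and radius thresholds shown are arbitrary placeholders for the preset thresholds mentioned above.

```python
import numpy as np

def detection_is_valid(normal, fitted_radius, expected_normal, expected_radius,
                       max_angle_rad=np.deg2rad(30.0), radius_tol=0.05):
    """Reject a board detection whose plane normal deviates too far from the
    expected board orientation, or whose fitted circle radius is off
    (threshold values are illustrative, not from the patent)."""
    cos_angle = abs(np.dot(normal, expected_normal) /
                    (np.linalg.norm(normal) * np.linalg.norm(expected_normal)))
    angle_ok = np.arccos(np.clip(cos_angle, -1.0, 1.0)) <= max_angle_rad
    radius_ok = abs(fitted_radius - expected_radius) <= radius_tol
    return angle_ok and radius_ok
```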
The point cloud obtaining unit 911 obtains the position and normal vector of the calibration object center corresponding to each frame of three-dimensional calibration object point cloud data, and the acquisition time difference between two adjacent frames of three-dimensional calibration object point cloud data. The point cloud obtaining unit 911 calculates the velocity and angular velocity of the calibration object's motion based on the position and normal vector of the calibration object center and the acquisition time difference, and then performs motion compensation on each frame of three-dimensional calibration object point cloud data according to the velocity, the angular velocity, and the acquisition time difference.
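A possible form of this motion compensation, assuming per-point timestamps relative to the frame reference time and constant velocity/angular velocity over the frame, is sketched below; the kinematic model and all names are assumptions for illustration.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def motion_compensate(points, timestamps, center_prev, center_curr,
                      normal_prev, normal_curr, dt):
    """De-skew one frame of board points: estimate the board's linear and angular
    velocity from two consecutive frames, then move every point back to the frame
    reference time (illustrative sketch)."""
    velocity = (center_curr - center_prev) / dt              # m/s
    # Angular velocity from the rotation taking the previous normal to the current one.
    axis = np.cross(normal_prev, normal_curr)
    sin_a = np.linalg.norm(axis)
    cos_a = float(np.dot(normal_prev, normal_curr))
    angle = np.arctan2(sin_a, cos_a)
    omega = (axis / sin_a) * (angle / dt) if sin_a > 1e-9 else np.zeros(3)

    P = np.asarray(points, dtype=float)
    ts = np.asarray(timestamps, dtype=float)                  # seconds, one per point
    compensated = np.empty_like(P)
    for i, (p, t) in enumerate(zip(P, ts)):
        R_back = Rotation.from_rotvec(-omega * t).as_matrix()  # undo rotation accumulated over t
        compensated[i] = R_back @ (p - center_curr - velocity * t) + center_curr
    return compensated
```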
Fig. 12 is a block diagram of one embodiment of an electronic device of the present disclosure. As shown in fig. 12, the electronic device 121 includes one or more processors 1211 and memory 1212.
The processor 1211 may be a Central Processing Unit (CPU) or other form of processing unit having data processing and/or instruction execution capabilities, and may control other components in the electronic device 121 to perform the desired functions.
Memory 1212 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. Volatile memory may include, for example, random access memory (RAM) and/or cache memory. Non-volatile memory may include, for example, read-only memory (ROM), hard disks, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 1211 may execute the program instructions to implement the method of determining a conversion pose between a radar and a camera according to the various embodiments of the present disclosure described above and/or other desired functions. Various contents such as an input signal, a signal component, a noise component, and the like may also be stored in the computer-readable storage medium.
In one example, the electronic device 121 may further include an input device 1213 and an output device 1214, which are interconnected by a bus system and/or other forms of connection mechanisms (not shown). The input device 1213 may include, for example, a keyboard, a mouse, and the like. The output device 1214 may output various information to the outside and may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto.
Of course, for simplicity, only some of the components of the electronic device 121 that are relevant to the present disclosure are shown in fig. 12; components such as buses and input/output interfaces are omitted. In addition, the electronic device 121 may include any other suitable components depending on the particular application.
In addition to the methods and apparatus described above, embodiments of the present disclosure may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps in the method of determining a pose of transition between radar and camera according to various embodiments of the present disclosure described in the "exemplary methods" section of the present description.
Program code for performing the operations of embodiments of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present disclosure may also be a computer-readable storage medium, having stored thereon computer program instructions, which when executed by a processor, cause the processor to perform steps in a method of determining a transition pose between radar and camera according to various embodiments of the present disclosure described in the above-described "exemplary method" section of the present disclosure.
The computer readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present disclosure have been described above in connection with specific embodiments, but it should be noted that the advantages, benefits, effects, etc. mentioned in the present disclosure are merely examples and not limiting, and these advantages, benefits, effects, etc. are not to be considered as necessarily possessed by the various embodiments of the present disclosure. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, since the disclosure is not necessarily limited to practice with the specific details described.
In this specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts reference may be made between the embodiments. The system embodiments are described relatively simply because they substantially correspond to the method embodiments; for relevant details, reference may be made to the description of the method embodiments.
According to the method and device for determining the conversion pose between a radar and a camera, the electronic equipment and the storage medium of the present disclosure, a first calibration object plane equation and a first center point coordinate of the calibration object and the calibration object center point in the radar coordinate system are obtained from point cloud data, a second calibration object plane equation and a second center point coordinate of the calibration object and the calibration object center point in the camera coordinate system are obtained from image data, and the conversion pose between the radar coordinate system and the camera coordinate system is determined based on the obtained information. The calibration object plane equation and center point can be extracted automatically, the conversion pose between the radar coordinate system and the camera coordinate system is determined, and the external parameters between the two coordinate systems are optimized using the reprojection error of the calibration object center point. The calibration process is thus highly automated with little manual involvement, depends only weakly on the state of the calibration object and the external environment, yields stable and robust calibration results, and enables pixel-level fusion of radar data and image data with high calibration accuracy.
The block diagrams of devices, apparatuses, and systems referred to in this disclosure are merely illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by one of skill in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including", "comprising", "having", and the like are open-ended words meaning "including but not limited to" and may be used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or" unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as, but not limited to".
The methods and apparatus of the present disclosure may be implemented in a number of ways. For example, the methods and apparatus of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, firmware. The above-described sequence of steps for the method is for illustration only, and the steps of the method of the present disclosure are not limited to the sequence specifically described above unless specifically stated otherwise. Furthermore, in some embodiments, the present disclosure may also be implemented as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the method according to the present disclosure.
It is also noted that in the apparatus, devices and methods of the present disclosure, components or steps may be disassembled and/or assembled. Such decomposition and/or recombination should be considered equivalent to the present disclosure.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit embodiments of the disclosure to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, changes, additions, and sub-combinations thereof.

Claims (15)

1. A method of determining a conversion pose between a radar and a camera, comprising:
Receiving first point cloud data acquired by a radar for a calibration object, and acquiring a first calibration object plane equation and a first center point coordinate of the calibration object and a calibration object center point under a radar coordinate system respectively based on the first point cloud data;
Receiving image data acquired by a camera for the calibration object, and acquiring a second calibration object plane equation and a second center point coordinate of the calibration object and the center point of the calibration object under a camera coordinate system respectively based on the image data;
Determining an initial conversion pose between the radar coordinate system and the camera coordinate system according to a first normal vector of the first calibration object plane equation, a second normal vector of the second calibration object plane equation, the first center point coordinate and the second center point coordinate;
According to camera internal parameters, projecting the second center point coordinate into an image plane, and determining a third center point coordinate;
According to the initial conversion pose, projecting the first center point coordinate into an image plane, and determining a fourth center point coordinate;
And determining a conversion pose between the radar coordinate system and the camera coordinate system based on the distance between the third center point coordinate and the fourth center point coordinate.
2. The method of claim 1, the determining a conversion pose between the radar coordinate system and the camera coordinate system based on the distance between the third center point coordinate and the fourth center point coordinate comprising:
Establishing a reprojection error function according to the distance difference between the third center point coordinate and the fourth center point coordinate;
and optimizing the initial conversion pose by minimizing the reprojection error function.
3. The method of claim 2, the initial conversion pose comprising: an initial rotation matrix and an initial translation vector, the method further comprising:
Optimizing the initial rotation matrix and the initial translation vector by using a first coordinate set and minimizing the reprojection error function to obtain a second rotation matrix and a second translation vector;
performing an inspection process on the second rotation matrix and the second translation vector using a second coordinate set and by the reprojection error function;
wherein the first coordinate set and the second coordinate set each comprise a plurality of sets of first center point coordinates and second center point coordinates.
4. The method of claim 3, the performing an inspection process on the second rotation matrix and the second translation vector using a second coordinate set and by the reprojection error function comprising:
Obtaining a plurality of function values of the reprojection error function using the second coordinate set, the second rotation matrix and the second translation vector;
If the average value of the plurality of function values is smaller than a preset error threshold value, or if the number of function values smaller than the error threshold value is larger than a preset number threshold value, determining the second rotation matrix and the second translation vector as a third rotation matrix and a third translation vector which are finally calibrated.
5. The method of claim 4, further comprising:
Extracting a target object image with preset edge characteristics from an image acquired by the camera, performing edge detection on the target object image, and performing fitting processing to obtain an edge image in the target object image;
obtaining second point cloud data obtained by scanning the target object by the radar, and extracting edge point cloud data corresponding to the edge of the target object from the second point cloud data;
converting the edge point cloud based on the third rotation matrix and the third translation vector;
Constructing a distance error function between the converted edge point cloud data and the corresponding pixel point data positioned on the edge linear image;
And optimizing the third rotation matrix and the third translation vector by minimizing the error function.
6. The method of claim 1, the determining an initial conversion pose between the radar coordinate system and the camera coordinate system from a first normal vector of the first calibration object plane equation, a second normal vector of the second calibration object plane equation, the first center point coordinate, and the second center point coordinate comprising:
Determining an initial rotation matrix between the radar coordinate system and the camera coordinate system according to the first normal vector and the second normal vector;
and determining an initial translation vector between the radar coordinate system and the camera coordinate system according to the first center point coordinate and the second center point coordinate.
7. The method of claim 6, the determining an initial rotation matrix between the radar coordinate system and the camera coordinate system from the first normal vector, the second normal vector comprising:
Obtaining a normal vector angle difference constraint relationship between the first normal vector and the second normal vector;
establishing a first objective function based on the normal vector angle difference constraint relation;
The initial rotation matrix is obtained by minimizing the first objective function.
8. The method of claim 6, the determining an initial translation vector between the radar coordinate system and the camera coordinate system from the first center point coordinates, second center point coordinates comprising:
Obtaining a distance constraint relation between two positions of the calibration object center point in the radar coordinate system and the camera coordinate system according to the first center point coordinate and the second center point coordinate;
And establishing a second objective function based on the distance constraint relation, and obtaining the initial translation vector based on the second objective function.
9. The method of claim 1, wherein the calibration object comprises a disc-shaped planar calibration object and the first point cloud data is three-dimensional point cloud data; the obtaining a first calibration object plane equation and a first center point coordinate of the calibration object and the calibration object center point under a radar coordinate system respectively comprising:
dividing the first point cloud data by adopting a deep learning method to obtain three-dimensional calibration object point cloud data corresponding to the calibration object;
Clustering the three-dimensional calibration object point cloud data to obtain a point cloud cluster, and a feature vector and a feature value corresponding to the point cloud cluster;
Fitting the point cloud cluster by using a preset first fitting algorithm to obtain the first calibration object plane equation;
Projecting the point cloud cluster to a plane corresponding to the first calibration object plane equation according to the characteristic value and the characteristic vector to obtain two-dimensional point cloud data;
Fitting the two-dimensional point cloud data by using a preset second fitting algorithm to obtain a fitting circle;
and converting the two-dimensional coordinates of the fitted circle center into three-dimensional first center point coordinates under the radar coordinate system.
10. The method of claim 9, further comprising:
Obtaining a feature vector corresponding to the minimum feature value of the point cloud cluster, and determining the feature vector as a plane normal vector;
And if the included angle between the plane normal vector and the normal vector of the surface of the preset calibration object is larger than the preset included angle threshold, determining that the first point cloud data is invalid data.
11. The method of claim 9, further comprising:
And if the difference value between the radius of the fitting circle and the preset calibration object radius is determined to be outside the range of the difference value threshold value interval, determining the three-dimensional calibration object point cloud data as invalid data.
12. The method of claim 9, further comprising:
Acquiring the position and normal vector of the center of the calibration object corresponding to each frame of three-dimensional calibration object point cloud data and the acquisition time difference of two adjacent frames of three-dimensional calibration object point cloud data;
Calculating the speed and angular speed of the movement of the calibration object based on the position and normal vector of the center of the calibration object and the acquisition time difference;
and carrying out motion compensation processing on the three-dimensional calibration object point cloud data of each frame according to the speed and the angular speed and the acquisition time difference.
13. A conversion pose determining apparatus between a radar and a camera, comprising:
The first data processing module is used for receiving point cloud data acquired by a radar for a calibration object, and acquiring a first calibration object plane equation and a first center point coordinate of the calibration object and a calibration object center point under a radar coordinate system respectively based on the point cloud data;
the second data processing module is used for receiving image data acquired by a camera for the calibration object, and obtaining a second calibration object plane equation and a second center point coordinate of the calibration object and the center point of the calibration object under a camera coordinate system respectively based on the image data;
the conversion pose determining module comprises:
The first determining unit is used for determining an initial conversion pose between the radar coordinate system and the camera coordinate system according to a first normal vector of the first calibration object plane equation, a second normal vector of the second calibration object plane equation, the first center point coordinate and the second center point coordinate;
A second determining unit, configured to project the second center point coordinate into an image plane according to the camera internal parameter, and determine a third center point coordinate; according to the initial conversion pose, projecting the first center point coordinate into an image plane, and determining a fourth center point coordinate; and determining a conversion pose between the radar coordinate system and the camera coordinate system based on the distance between the third center point coordinate and the fourth center point coordinate.
14. A computer readable storage medium storing a computer program for performing the method of any one of the preceding claims 1-12.
15. An electronic device, the electronic device comprising:
A processor;
a memory for storing the processor-executable instructions;
the processor being configured to perform the method of any of the preceding claims 1-12.
CN201910606517.8A 2019-07-05 2019-07-05 Method and device for determining conversion pose between radar and camera and electronic equipment Active CN112180362B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910606517.8A CN112180362B (en) 2019-07-05 2019-07-05 Method and device for determining conversion pose between radar and camera and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910606517.8A CN112180362B (en) 2019-07-05 2019-07-05 Method and device for determining conversion pose between radar and camera and electronic equipment

Publications (2)

Publication Number Publication Date
CN112180362A CN112180362A (en) 2021-01-05
CN112180362B true CN112180362B (en) 2024-04-23

Family

ID=73918765

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910606517.8A Active CN112180362B (en) 2019-07-05 2019-07-05 Method and device for determining conversion pose between radar and camera and electronic equipment

Country Status (1)

Country Link
CN (1) CN112180362B (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113847930A (en) 2020-06-28 2021-12-28 图森有限公司 Multi-sensor calibration system
US11960276B2 (en) 2020-11-19 2024-04-16 Tusimple, Inc. Multi-sensor collaborative calibration system
US11702089B2 (en) * 2021-01-15 2023-07-18 Tusimple, Inc. Multi-sensor sequential calibration system
CN112561990B (en) * 2021-01-21 2022-05-31 禾多科技(北京)有限公司 Positioning information generation method, device, equipment and computer readable medium
CN112802124B (en) * 2021-01-29 2023-10-31 北京罗克维尔斯科技有限公司 Calibration method and device for multiple stereo cameras, electronic equipment and storage medium
CN112819861B (en) * 2021-02-26 2024-06-04 广州小马慧行科技有限公司 Point cloud motion compensation method, device and computer readable storage medium
CN113176544B (en) * 2021-03-05 2022-11-11 河海大学 Mismatching correction method for slope radar image and terrain point cloud
CN113077518B (en) * 2021-03-15 2022-02-11 中移(上海)信息通信科技有限公司 Camera parameter calibration method, device and storage medium
CN113077521B (en) * 2021-03-19 2022-11-01 浙江华睿科技股份有限公司 Camera calibration method and device
CN113034613B (en) * 2021-03-25 2023-09-19 ***股份有限公司 External parameter calibration method and related device for camera
CN113359116B (en) * 2021-05-12 2023-09-12 武汉中仪物联技术股份有限公司 Method, system, device, equipment and medium for relative calibration of range radar
CN113436233A (en) * 2021-06-29 2021-09-24 阿波罗智能技术(北京)有限公司 Registration method and device of automatic driving vehicle, electronic equipment and vehicle
CN113436277A (en) * 2021-07-15 2021-09-24 无锡先导智能装备股份有限公司 3D camera calibration method, device and system
CN113436278A (en) * 2021-07-22 2021-09-24 深圳市道通智能汽车有限公司 Calibration method, calibration device, distance measurement system and computer readable storage medium
CN113484840B (en) * 2021-07-23 2024-07-09 青岛海尔空调电子有限公司 Target positioning method for household appliance working space based on radar and household appliance system
CN113487684A (en) * 2021-07-23 2021-10-08 浙江华睿科技股份有限公司 Calibration parameter determination method and device, storage medium and electronic device
CN113639782A (en) * 2021-08-13 2021-11-12 北京地平线信息技术有限公司 External parameter calibration method and device for vehicle-mounted sensor, equipment and medium
CN114452545B (en) * 2021-08-25 2024-07-09 西安大医集团股份有限公司 Method, device and system for confirming coordinate system conversion relation
CN113655453B (en) * 2021-08-27 2023-11-21 阿波罗智能技术(北京)有限公司 Data processing method and device for sensor calibration and automatic driving vehicle
CN113744348A (en) * 2021-08-31 2021-12-03 南京慧尔视智能科技有限公司 Parameter calibration method and device and radar vision fusion detection equipment
CN115060229A (en) * 2021-09-30 2022-09-16 西安荣耀终端有限公司 Method and device for measuring a moving object
CN114265042A (en) * 2021-12-09 2022-04-01 上海禾赛科技有限公司 Calibration method, calibration device, calibration system and readable storage medium
CN114399555B (en) * 2021-12-20 2022-11-11 禾多科技(北京)有限公司 Data online calibration method and device, electronic equipment and computer readable medium
CN114923453B (en) * 2022-05-26 2024-03-05 杭州海康机器人股份有限公司 Calibration method and device for external parameters of linear profiler and electronic equipment
CN115439634B (en) * 2022-09-30 2024-02-23 如你所视(北京)科技有限公司 Interactive presentation method of point cloud data and storage medium
CN115471574B (en) * 2022-11-02 2023-02-03 北京闪马智建科技有限公司 External parameter determination method and device, storage medium and electronic device
CN115712111A (en) * 2022-11-07 2023-02-24 北京斯年智驾科技有限公司 Camera and radar combined calibration method and system, electronic device, computer equipment and storage medium
CN115856849B (en) * 2023-02-28 2023-05-05 季华实验室 Depth camera and 2D laser radar calibration method and related equipment
CN117928680B (en) * 2024-03-21 2024-06-07 青岛清万水技术有限公司 Automatic positioning method and system for transducer, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3217355A1 (en) * 2016-03-07 2017-09-13 Lateral Reality Kft. Methods and computer program products for calibrating stereo imaging systems by using a planar mirror

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090092150A (en) * 2008-02-26 2009-08-31 울산대학교 산학협력단 Method of 3d inspection of the object using ccd camera laser beam and apparutus thereof
CN103837869A (en) * 2014-02-26 2014-06-04 北京工业大学 Vector-relation-based method for calibrating single-line laser radar and CCD camera
CN106556825A (en) * 2015-09-29 2017-04-05 北京自动化控制设备研究所 A kind of combined calibrating method of panoramic vision imaging system
CN106228537A (en) * 2016-07-12 2016-12-14 北京理工大学 A kind of three-dimensional laser radar and the combined calibrating method of monocular-camera
CN106408011A (en) * 2016-09-09 2017-02-15 厦门大学 Laser scanning three-dimensional point cloud tree automatic classifying method based on deep learning
CN109323650A (en) * 2018-01-31 2019-02-12 黑龙江科技大学 Image visual transducer and the unified approach for putting ligh-ranging sensor measurement coordinate system
CN108932736A (en) * 2018-05-30 2018-12-04 南昌大学 Two-dimensional laser radar Processing Method of Point-clouds and dynamic robot pose calibration method
CN109598765A (en) * 2018-12-21 2019-04-09 浙江大学 Join combined calibrating method outside monocular camera and millimetre-wave radar based on spherical calibration object
CN109946680A (en) * 2019-02-28 2019-06-28 北京旷视科技有限公司 External parameters calibration method, apparatus, storage medium and the calibration system of detection system
CN109920011A (en) * 2019-05-16 2019-06-21 长沙智能驾驶研究院有限公司 Outer ginseng scaling method, device and the equipment of laser radar and binocular camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Joint calibration method for stereo vision and a three-dimensional laser ***; Dong Fangxin et al.; Chinese Journal of Scientific Instrument (仪器仪表学报); Vol. 38, No. 10; pp. 2589-2596 *

Also Published As

Publication number Publication date
CN112180362A (en) 2021-01-05

Similar Documents

Publication Publication Date Title
CN112180362B (en) Method and device for determining conversion pose between radar and camera and electronic equipment
David et al. Simultaneous pose and correspondence determination using line features
US10574974B2 (en) 3-D model generation using multiple cameras
US8953847B2 (en) Method and apparatus for solving position and orientation from correlated point features in images
CA2687213C (en) System and method for stereo matching of images
EP3016071B1 (en) Estimating device and estimation method
Takimoto et al. 3D reconstruction and multiple point cloud registration using a low precision RGB-D sensor
US20140253679A1 (en) Depth measurement quality enhancement
CN102289803A (en) Image Processing Apparatus, Image Processing Method, and Program
CN112184799B (en) Lane line space coordinate determination method and device, storage medium and electronic equipment
Stommel et al. Inpainting of missing values in the Kinect sensor's depth maps based on background estimates
CN110793441B (en) High-precision object geometric dimension measuring method and device
Mei et al. Radial lens distortion correction using cascaded one-parameter division model
CN113763478A (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN113592706B (en) Method and device for adjusting homography matrix parameters
CN113689508B (en) Point cloud labeling method and device, storage medium and electronic equipment
US10628968B1 (en) Systems and methods of calibrating a depth-IR image offset
Cui et al. ACLC: Automatic Calibration for non-repetitive scanning LiDAR-Camera system based on point cloud noise optimization
KR100933304B1 (en) An object information estimator using the single camera, a method thereof, a multimedia device and a computer device including the estimator, and a computer-readable recording medium storing a program for performing the method.
CN116630423A (en) ORB (object oriented analysis) feature-based multi-target binocular positioning method and system for micro robot
CN111179331A (en) Depth estimation method, depth estimation device, electronic equipment and computer-readable storage medium
Bartczak et al. Extraction of 3D freeform surfaces as visual landmarks for real-time tracking
Hartley et al. Camera models
KR20220085693A (en) A multi-view camera-based iterative calibration method for the generation of a 3D volume model
Paudel et al. 2D–3D synchronous/asynchronous camera fusion for visual odometry

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant