CN113759348A - Radar calibration method, device, equipment and storage medium - Google Patents


Info

Publication number: CN113759348A
Application number: CN202110075819.4A
Authority: CN (China)
Legal status: Granted; active
Other languages: Chinese (zh)
Other versions: CN113759348B
Prior art keywords: cloud data, point cloud, plane, target, auxiliary
Inventors: 林金表, 徐卓然, 许新玉
Original and current assignee: Jingdong Kunpeng Jiangsu Technology Co Ltd
Events: application filed by Jingdong Kunpeng Jiangsu Technology Co Ltd; priority to CN202110075819.4A; publication of CN113759348A; application granted; publication of CN113759348B

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 — Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48 — Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 17/00
    • G01S 7/497 — Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The embodiment of the invention discloses a radar calibration method, a device, equipment and a storage medium. The method comprises the following steps: acquiring main point cloud data and auxiliary point cloud data collected by a main radar and an auxiliary radar respectively, the main point cloud data and the auxiliary point cloud data each comprising plane point cloud data corresponding to three calibration planes that intersect pairwise; determining the plane equation corresponding to each set of plane point cloud data based on the plane point cloud data corresponding to the main point cloud data and the auxiliary point cloud data; and determining a target conversion matrix between the main radar and the auxiliary radar based on the plane equations, and determining calibration point cloud data based on the target conversion matrix. The embodiment of the invention solves the problem that the existing radar calibration process has many limiting factors, and widens the applicable scenarios of radar calibration.

Description

Radar calibration method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of radars, in particular to a radar calibration method, a device, equipment and a storage medium.
Background
Lidar helps robots accurately perceive the surrounding environment and is often mounted on devices such as unmanned vehicles or drones. However, because an unmanned vehicle is large, a single lidar leaves a detection blind area no matter where on the vehicle it is installed, so an unmanned vehicle is usually equipped with one main lidar and several auxiliary lidars whose information complements each other.
Because each lidar reports detected point cloud coordinates in its own coordinate system, radar calibration is an indispensable step for fusing the detection results of all the lidars. Radar calibration unifies the point cloud coordinates reported by multiple lidars into a single lidar coordinate system, based on the conversion relations between the different coordinate systems, so that the point cloud data can be fused. Common radar calibration methods mainly calibrate using the point cloud data in the overlapping area of multiple lidars, under the condition that the point cloud shapes acquired by those lidars are similar.
In the process of implementing the invention, at least the following technical problems are found in the prior art:
the existing radar calibration method has two limiting conditions: the point cloud shapes must be similar, and a sufficiently large overlapping area must exist. It is therefore not applicable to scenes that use several lidars of different types, or to scenes where the lidar overlapping area is small, which limits the applicable scenarios of radar calibration.
Disclosure of Invention
The embodiment of the invention provides a radar calibration method, a device, equipment and a storage medium, which are used for solving the problem that the existing radar calibration process has many limiting factors and for widening the applicable scenarios of radar calibration.
In a first aspect, an embodiment of the present invention provides a radar calibration method, where the method includes:
acquiring main point cloud data and auxiliary point cloud data which are respectively acquired by a main radar and an auxiliary radar; the main point cloud data and the auxiliary point cloud data respectively comprise plane point cloud data corresponding to three calibration planes which are intersected pairwise;
determining a plane equation corresponding to each plane point cloud data based on the plane point cloud data corresponding to the main point cloud data and the auxiliary point cloud data;
and determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining calibration point cloud data based on the target conversion matrix.
In a second aspect, an embodiment of the present invention further provides a radar calibration apparatus, where the apparatus includes:
the point cloud data acquisition module is used for acquiring main point cloud data and auxiliary point cloud data which are respectively acquired by the main radar and the auxiliary radar; the main point cloud data and the auxiliary point cloud data respectively comprise plane point cloud data corresponding to three calibration planes which are intersected pairwise;
the plane equation determining module is used for determining a plane equation corresponding to each plane point cloud data based on the plane point cloud data corresponding to the main point cloud data and the auxiliary point cloud data;
and the point cloud data calibration module is used for determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation and determining calibration point cloud data based on the target conversion matrix.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement any of the radar calibration methods referred to above.
In a fourth aspect, embodiments of the present invention further provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform any of the above-mentioned radar calibration methods.
The embodiment of the invention has the following advantages or beneficial effects:
according to the embodiment of the invention, the plane point cloud data corresponding to three calibration planes intersected in pairs are collected by the main radar and the auxiliary radar, the plane equation of each calibration plane is determined based on the plane point cloud data, and the target conversion matrix between the main radar and the auxiliary radar is determined based on the plane equation, so that the problem that the radar calibration process in the prior art has more limiting factors is solved, the installation position relation between the radars and the radar types are not limited, and the applicable scene of radar calibration is widened.
Drawings
Fig. 1 is a flowchart of a radar calibration method according to an embodiment of the present invention.
Fig. 2 is a flowchart of a radar calibration method according to a second embodiment of the present invention.
Fig. 3 is a schematic diagram of the coordinates of the feature point of the intersection line according to the second embodiment of the present invention.
Fig. 4A is a flowchart of a method for determining a main feature set according to a second embodiment of the present invention.
Fig. 4B is a flowchart of a method for determining an assistant feature set according to a second embodiment of the present invention.
Fig. 5 is a schematic diagram of a radar calibration apparatus according to a third embodiment of the present invention.
Fig. 6 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a radar calibration method according to an embodiment of the present invention, where the present embodiment is applicable to radar calibration in a multi-radar navigation or positioning scenario, and the method may be executed by a radar calibration apparatus, and the apparatus may be implemented in a software and/or hardware manner. The method specifically comprises the following steps:
and S110, acquiring main point cloud data and auxiliary point cloud data which are respectively acquired by the main radar and the auxiliary radar.
In this embodiment, at least two radars may be set in a radar navigation or radar positioning scene, specifically, any one radar may be used as a main radar, and other radars may be used as auxiliary radars.
In this embodiment, the main point cloud data and the auxiliary point cloud data each include plane point cloud data corresponding to three calibration planes that intersect pairwise. Specifically, the three calibration planes corresponding to the main radar and the three corresponding to the auxiliary radar may be the same set of calibration planes or different sets. Illustratively, the main radar collects plane point cloud data corresponding to calibration planes 1, 2 and 3 respectively to obtain the main point cloud data, while the auxiliary radar collects plane point cloud data corresponding to calibration planes 4, 5 and 6 respectively to obtain the auxiliary point cloud data. The auxiliary radar may equally collect plane point cloud data corresponding to calibration planes 1, 2 and 4 to obtain the auxiliary point cloud data, provided calibration planes 1, 2 and 4 intersect pairwise.
In one embodiment, when the calibration planes corresponding to the main radar differ from those corresponding to the auxiliary radar, the plane angles between the calibration planes corresponding to the main radar are the same as the plane angles between the calibration planes corresponding to the auxiliary radar. Specifically, the two sets of calibration planes may differ in all three planes, or differ in some planes and coincide in others. For example, assuming the plane angles between the calibration planes corresponding to the main radar are 90°, 90° and 30° respectively, the plane angles between the calibration planes corresponding to the auxiliary radar are likewise 90°, 90° and 30° respectively.
In one embodiment, optionally, the three calibration planes include two perpendicular calibration planes. In an exemplary embodiment, the three calibration planes are the ground and two wall surfaces that are perpendicular to each other and to the ground; specifically, the plane angles between the calibration planes are then 90°, 90° and 90° respectively.
In an embodiment, optionally, the point cloud data acquired by the radar is acquired according to a scanning mode of the radar. In an exemplary case, taking the main radar as an example, when the main radar is a mechanical rotation laser radar, a scanning mode of the mechanical rotation laser radar is single-frame scanning, and single-frame point cloud data acquired by the mechanical rotation laser radar is used as main point cloud data. When the main radar is a rotating mirror type laser radar, the scanning mode of the rotating mirror type laser radar is non-repeated scanning, and the non-repeated scanning means that point cloud data collected by each frame are different even if the main radar is in a static state. Therefore, multi-frame point clouds collected by the rotary mirror type laser radar are accumulated to obtain main point cloud data.
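The accumulation of multiple frames from a non-repetitive-scanning lidar can be sketched as follows (a minimal NumPy illustration; the function name is our own, and real pipelines may additionally compensate for sensor motion between frames):

```python
import numpy as np

def accumulate_frames(frames):
    """Concatenate several (N_i, 3) point cloud frames into one cloud, as done
    when a non-repetitive-scanning lidar needs several frames to cover the
    calibration planes densely."""
    return np.vstack(frames)

# Two toy frames of 5 and 7 points each.
frames = [np.zeros((5, 3)), np.ones((7, 3))]
cloud = accumulate_frames(frames)
print(cloud.shape)  # (12, 3)
```

For a mechanical rotating lidar, by contrast, a single frame already suffices and no accumulation is needed.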
Illustratively, the main point cloud data is denoted P0, and the auxiliary point cloud data is denoted Q0.
S120, determining the plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the auxiliary point cloud data.
In an embodiment, optionally, determining the plane equation corresponding to each set of plane point cloud data based on the plane point cloud data corresponding to the main point cloud data and the auxiliary point cloud data includes: performing coordinate conversion on the auxiliary point cloud data based on a preset conversion matrix to obtain pre-auxiliary point cloud data; and determining the plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data.
For example, the predetermined transformation matrix may be determined based on an installation pose of the primary radar and an installation pose of the secondary radar, wherein the installation poses include an installation position coordinate and a central axis angle.
In this embodiment, coordinate conversion is performed on the auxiliary point cloud data based on the preset conversion matrix, and the auxiliary point cloud data can be converted into a coordinate system similar to the coordinate system of the main radar, so that the coordinate system similarity between the main point cloud data and the auxiliary point cloud data is realized, the consistency of subsequent parameter setting is ensured, and the algorithm calculation difficulty is reduced.
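Applying such a preset conversion matrix to the auxiliary point cloud can be sketched as follows (a minimal NumPy illustration; the matrix values are illustrative only, not taken from the patent):

```python
import numpy as np

def apply_transform(T, points):
    """Apply a 4x4 homogeneous conversion matrix T to an (N, 3) point cloud,
    e.g. the preset matrix that moves the auxiliary cloud close to the main
    radar's coordinate system."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return (homogeneous @ T.T)[:, :3]

# Illustrative preset matrix: pure translation by (1, 0, 0).
T1 = np.eye(4)
T1[0, 3] = 1.0
Q0 = np.array([[0.0, 0.0, 0.0], [1.0, 2.0, 3.0]])
Q1 = apply_transform(T1, Q0)  # rows become (1, 0, 0) and (2, 2, 3)
```

In practice T1 would be assembled from the installation position coordinates and central-axis angles of the two radars.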
Wherein, for example, the pre-auxiliary point cloud data is denoted Q1.
In an embodiment, optionally, determining a plane equation corresponding to each planar point cloud data based on the planar point cloud data corresponding to each of the main point cloud data and the pre-auxiliary point cloud data includes: respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data; determining a first target plane equation corresponding to a first calibration plane based on first plane point cloud data corresponding to the target point cloud data; determining a second target plane equation corresponding to the second calibration plane and a third target plane equation corresponding to the third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; the reference plane point cloud data comprises second plane point cloud data and third plane point cloud data.
Specifically, the first plane point cloud data is plane point cloud data corresponding to the first calibration plane. In one embodiment, optionally, the method further comprises: selecting point cloud data with the minimum vertical coordinate in the target point cloud data based on the preset selection number, and determining the average height corresponding to the selected point cloud data; and screening the target point cloud data based on the average height and a preset height difference threshold value to obtain first plane point cloud data corresponding to the target point cloud data.
In this embodiment, the first calibration plane is the ground. For example, the preset selection number may be 1000 or 2000, and the preset height difference threshold may be 0.05. For example, suppose the 1000 point cloud data with the smallest vertical coordinate in the target point cloud data are selected to calculate the average height h; the point cloud data whose vertical coordinate does not exceed h + α are then used as the first plane point cloud data, where α denotes the preset height difference threshold. Illustratively, the first plane point cloud data corresponding to the main point cloud data is denoted PG, and the first plane point cloud data corresponding to the auxiliary point cloud data is denoted QG.
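The ground-selection step just described (pick the lowest points, average their height, keep everything within a threshold of that height) can be sketched as follows; the function and parameter names are our own:

```python
import numpy as np

def select_ground_points(points, n_lowest=1000, alpha=0.05):
    """Estimate the ground height h as the mean z of the n_lowest points with
    the smallest vertical coordinate, then keep every point whose z does not
    exceed h + alpha (alpha is the preset height-difference threshold)."""
    z = points[:, 2]
    n = min(n_lowest, len(points))
    h = np.sort(z)[:n].mean()
    return points[z <= h + alpha]

# Toy cloud: 50 ground points near z = 0 plus 20 wall points with z >= 0.5.
rng = np.random.default_rng(0)
ground = np.column_stack([rng.uniform(-5, 5, 50),
                          rng.uniform(-5, 5, 50),
                          rng.uniform(0.0, 0.01, 50)])
wall = np.column_stack([rng.uniform(-5, 5, 20),
                        np.full(20, 5.0),
                        rng.uniform(0.5, 2.0, 20)])
cloud = np.vstack([ground, wall])
picked = select_ground_points(cloud, n_lowest=30, alpha=0.05)  # the 50 ground points
```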
For example, the algorithm for determining the first target plane equation based on the first plane point cloud data may be a least squares method or a random sample consensus (RANSAC) algorithm.
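A plane equation can be fitted with a random sample consensus step followed by a least-squares refinement, for example (a rough sketch of the general technique, not the patent's exact implementation):

```python
import numpy as np

def fit_plane_ransac(points, n_iter=100, tol=0.02, seed=0):
    """Fit a plane a*x + b*y + c*z + d = 0 (unit normal) with RANSAC, then
    refine on the inliers by least squares via SVD."""
    rng = np.random.default_rng(seed)
    best_mask = None
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:
            continue  # degenerate (collinear) sample
        normal = normal / norm
        dist = np.abs((points - sample[0]) @ normal)
        mask = dist < tol
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask = mask
    inliers = points[best_mask]
    centroid = inliers.mean(axis=0)
    # Least-squares normal = right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(inliers - centroid)
    n = vt[-1]
    return np.append(n, -n @ centroid)  # coefficients (a, b, c, d)

# Points on the plane z = 1 plus a few far-away outliers.
rng = np.random.default_rng(1)
xy = rng.uniform(-1, 1, (200, 2))
plane_pts = np.column_stack([xy, np.ones(200)])
outliers = rng.uniform(-1, 1, (10, 3)) + np.array([0.0, 0.0, 3.0])
a, b, c, d = fit_plane_ransac(np.vstack([plane_pts, outliers]))
```

The recovered normal is ±(0, 0, 1) with d = ∓1, i.e. the plane z = 1.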
In one embodiment, optionally, the method further comprises: determining reference plane point cloud data corresponding to the target point cloud data based on the target point cloud data and a first target plane equation corresponding to the target point cloud data; when the target point cloud data is main point cloud data, the first target plane equation is a first main plane equation, and when the target point cloud data is pre-auxiliary point cloud data, the first target plane equation is a first auxiliary plane equation.
Specifically, the target point cloud data includes the plane point cloud data corresponding to the three calibration planes; the point cloud data corresponding to the first target plane equation is deleted from the target point cloud data, that is, the point cloud data corresponding to the first calibration plane is deleted, so as to obtain the reference plane point cloud data. Wherein, illustratively, the reference plane point cloud data corresponding to the main point cloud data is denoted PAB, and the reference plane point cloud data corresponding to the pre-auxiliary point cloud data is denoted QAB.
In one embodiment, optionally, determining a second target plane equation corresponding to the second calibration plane and a third target plane equation corresponding to the third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data includes: determining a second target plane equation corresponding to a second calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; determining third plane point cloud data corresponding to the target point cloud data based on the reference plane point cloud data and the second target plane equation; and determining a third target plane equation corresponding to the third calibration plane based on the third plane point cloud data.
Specifically, the algorithm for determining the second target plane equation based on the reference plane point cloud data may be a random sampling consensus algorithm.
Specifically, the reference plane point cloud data includes plane point cloud data corresponding to two calibration planes, and point cloud data corresponding to the second target plane equation in the reference plane point cloud data is deleted, that is, point cloud data corresponding to the second calibration plane in the reference plane point cloud data is deleted, so that third plane point cloud data is obtained.
For example, the algorithm for determining the third target plane equation based on the third plane point cloud data may be a least square method or a random sampling consensus algorithm.
In this example, the first principal plane equation corresponding to the main point cloud data is denoted GP, the second principal plane equation AP, and the third principal plane equation BP; the first auxiliary plane equation corresponding to the pre-auxiliary point cloud data is denoted GQ, the second auxiliary plane equation AQ, and the third auxiliary plane equation BQ.
S130, determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining calibration point cloud data based on the target conversion matrix.
In an embodiment, optionally, if the coordinate conversion is performed on the auxiliary point cloud data based on a preset conversion matrix to obtain pre-auxiliary point cloud data, and a plane equation corresponding to each plane point cloud data is determined based on the plane point cloud data corresponding to each main point cloud data and the pre-auxiliary point cloud data, determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation includes: and determining a reference conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining a target conversion matrix between the main radar and the auxiliary radar based on a preset conversion matrix and the reference conversion matrix.
Wherein, for example, the preset conversion matrix is denoted T1 and the reference conversion matrix T2; the target conversion matrix T then satisfies the formula: T = T1 * T2.
In one embodiment, optionally, determining a reference transformation matrix between the primary radar and the secondary radar based on the plane equations comprises: determining feature point coordinates respectively corresponding to the principal point cloud data and the pre-auxiliary point cloud data based on three plane equations respectively corresponding to the principal point cloud data and the pre-auxiliary point cloud data; the characteristic point coordinates comprise plane intersection point coordinates and intersection line characteristic point coordinates, the plane intersection point coordinates are used for representing intersection point coordinates of the three calibration planes, and the intersection line characteristic point coordinates are used for representing characteristic point coordinates on an intersection line between the three calibration planes; and performing pose estimation operation on a main feature set generated based on the feature point coordinates corresponding to the main point cloud data and an auxiliary feature set generated based on the feature point coordinates corresponding to the pre-auxiliary point cloud data to obtain a reference conversion matrix between the main radar and the auxiliary radar.
In this example, the feature point coordinates of the intersection line include a feature point coordinate AB on the intersection line between the calibration plane a and the calibration plane B, a feature point coordinate AC on the intersection line between the calibration plane a and the calibration plane C, and a feature point coordinate BC on the intersection line between the calibration plane B and the calibration plane C.
A detailed explanation about the specific implementation of determining the coordinates of the feature points based on the plane equation is given in the following examples.
In one embodiment, the pose estimation algorithm may optionally include a Singular Value Decomposition (SVD) algorithm.
Specifically, assume the main feature set is X = {x1, x2, …, xn} and the auxiliary feature set is Y = {y1, y2, …, yn}. The reference conversion matrix is solved so that the distance between the feature point coordinates in the main feature set after coordinate conversion and the feature point coordinates in the auxiliary feature set is shortest, i.e. it satisfies the formula:

(R*, t*) = argmin_{R, t} Σ_{i=1}^{n} ‖ y_i − (R·x_i + t) ‖²

where R represents a rotation matrix and t represents a translation vector.
Specifically, the mean x0 of the main feature set X and the mean y0 of the auxiliary feature set Y are calculated; the mean x0 is subtracted from each feature point coordinate in X to obtain xi′, and the mean y0 is subtracted from each feature point coordinate in Y to obtain yi′. A matrix H is then obtained, where H satisfies the formula:

H = Σ_{i=1}^{n} xi′ · (yi′)ᵀ
The matrix H is decomposed by the singular value decomposition algorithm as H = U·S·Vᵀ, where U and V are unit orthogonal matrices. The optimal rotation matrix R* giving the shortest distance satisfies the formula R* = V·Uᵀ, and the optimal translation vector t* satisfies the formula: t* = y0 − R*·x0.
Wherein, specifically, the reference conversion matrix T2 satisfies the formula:

T2 = [ R*  t* ]
     [ 0    1 ]
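The SVD-based pose estimation just described can be sketched as follows (a minimal NumPy rendition of the Kabsch procedure; the reflection-correcting matrix D is a standard safeguard that the text does not spell out):

```python
import numpy as np

def estimate_reference_matrix(X, Y):
    """Given matched feature sets X (main) and Y (auxiliary) as (n, 3) arrays,
    find R*, t* minimising sum ||y_i - (R x_i + t)||^2 and pack them into the
    4x4 reference conversion matrix T2 = [[R*, t*], [0, 1]]."""
    x0, y0 = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - x0, Y - y0
    H = Xc.T @ Yc                          # H = sum_i x_i' (y_i')^T
    U, _, Vt = np.linalg.svd(H)            # H = U S V^T
    # Force det(R) = +1 so the result is a rotation, not a reflection.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                     # R* = V U^T (up to the reflection fix)
    t = y0 - R @ x0                        # t* = y0 - R* x0
    T2 = np.eye(4)
    T2[:3, :3], T2[:3, 3] = R, t
    return T2

# Check on synthetic data: rotate 30 degrees about Z, then translate.
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
X = np.random.default_rng(0).uniform(-1, 1, (6, 3))
Y = X @ R_true.T + t_true
T2 = estimate_reference_matrix(X, Y)  # recovers R_true and t_true
```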
specifically, determining calibration point cloud data based on the target transformation matrix includes: and performing coordinate conversion on the main point cloud data or the auxiliary point cloud data based on the target conversion matrix to obtain calibrated main point cloud data or calibrated auxiliary point cloud data. The coordinate systems of the calibrated main point cloud data and the auxiliary point cloud data are the same, or the coordinate systems of the calibrated auxiliary point cloud data and the main point cloud data are the same.
According to the technical scheme of this embodiment, plane point cloud data corresponding to three pairwise-intersecting calibration planes are collected by the main radar and the auxiliary radar, the plane equation of each calibration plane is determined from the plane point cloud data, and the target conversion matrix between the main radar and the auxiliary radar is determined from the plane equations. This solves the problem that the radar calibration process in the prior art has many limiting factors, and places no restriction on the relative installation positions of the radars or on the radar types, thereby widening the applicable scenarios of radar calibration.
Example two
Fig. 2 is a flowchart of a radar calibration method according to a second embodiment of the present invention, and the technical solution of the present embodiment is further detailed based on the above-mentioned embodiment. Optionally, determining, based on the target point cloud data and a first target plane equation corresponding to the target point cloud data, reference plane point cloud data corresponding to the target point cloud data includes: determining parameter data between the target point cloud data and a main radar central point, and taking point cloud data which meets a preset parameter range in the target point cloud data as reference target point cloud data; wherein the parameter data comprises distance data and/or offset angle data; and determining reference plane point cloud data corresponding to the target point cloud data based on the reference target point cloud data and a first target plane equation corresponding to the target point cloud data.
The specific implementation steps of this embodiment include:
s210, main point cloud data and auxiliary point cloud data which are respectively collected by the main radar and the auxiliary radar are obtained.
S220, respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data, and determining a first target plane equation corresponding to a first calibration plane based on first plane point cloud data corresponding to the target point cloud data.
S230, determining parameter data between the target point cloud data and the main radar central point, and taking the point cloud data which meets a preset parameter range in the target point cloud data as reference target point cloud data.
In this embodiment, the parameter data comprises distance data and/or offset angle data. Specifically, the offset angle data is an X-axis deflection angle, a Y-axis deflection angle or a Z-axis deflection angle of the target point cloud data relative to the center point of the main radar.
For example, assuming the coordinates of the point cloud data are (x, y, z) and the coordinates of the main radar center point are (0, 0, 0), the distance data L satisfies the formula:

L = √(x² + y² + z²)

the X-axis deflection angle θx satisfies the formula:

θx = arctan(z / y)

the Y-axis deflection angle θy satisfies the formula:

θy = arctan(x / z)

and the Z-axis deflection angle θz satisfies the formula:

θz = arctan(y / x)
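The distance and deflection-angle parameters can be computed as follows. Since the original formula figures are not reproduced in this text, the arctan conventions below are the usual ones, with the Z-axis deflection being the horizontal angle arctan(y/x):

```python
import numpy as np

def point_parameters(point):
    """Distance and axis deflection angles of a point relative to the main
    radar centre at the origin (assumed conventions; see lead-in)."""
    x, y, z = point
    L = np.sqrt(x ** 2 + y ** 2 + z ** 2)
    theta_x = np.arctan2(z, y)   # deflection about the X axis
    theta_y = np.arctan2(x, z)   # deflection about the Y axis
    theta_z = np.arctan2(y, x)   # horizontal deflection, about the Z axis
    return L, theta_x, theta_y, theta_z

# Point at 45 degrees horizontally, on the ground plane: theta_z = pi/4.
L, tx, ty, tz = point_parameters((1.0, 1.0, 0.0))
```

Points whose L and relevant deflection angle fall inside the preset ranges would then be kept as reference target point cloud data.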
here, the preset distance range may be [0,5m ], and the preset angle range may be [0,90 ° ], for example. The preset distance range and the preset angle range are related to the position relation of the main radar relative to the three calibration planes, and specific parameters of the preset distance range and the preset angle range are not limited.
Illustratively, the reference main point cloud data is denoted P2, and the reference auxiliary point cloud data is denoted Q2.
S240, determining reference plane point cloud data corresponding to the target point cloud data based on the reference target point cloud data and a first target plane equation corresponding to the target point cloud data.
In an embodiment, optionally, when the first calibration plane is an XOY plane, the offset angle data is a Z-axis deflection angle, when the first calibration plane is an XOZ plane, the offset angle data is a Y-axis deflection angle, and when the first calibration plane is a YOZ plane, the offset angle data is an X-axis deflection angle. Specifically, when the first calibration plane is the ground, the offset angle data is the Z-axis offset angle. The Z-axis deflection angle may also be referred to as a horizontal deflection angle.
Specifically, the reference target point cloud data is obtained by screening the target point cloud data based on the preset distance range and the preset angle range. In theory it contains only the plane point cloud data corresponding to the second and third calibration planes, but owing to the limited screening precision it also contains a small amount of plane point cloud data corresponding to the first calibration plane. The point cloud data corresponding to the first target plane equation is therefore deleted from the reference target point cloud data, i.e. the small amount of point cloud data corresponding to the first calibration plane is deleted, so as to obtain the reference plane point cloud data.
And S250, determining a second target plane equation corresponding to the second calibration plane and a third target plane equation corresponding to the third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data.
And S260, determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining calibration point cloud data based on the target conversion matrix.
On the basis of the foregoing embodiment, optionally, determining the feature point coordinates respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data based on the three plane equations respectively corresponding to them includes: taking the main point cloud data and the pre-auxiliary point cloud data respectively as target point cloud data; determining three target line equations corresponding to the three calibration planes based on the three plane equations corresponding to the target point cloud data, and determining the plane intersection point coordinate corresponding to the target point cloud data based on the target line equations; and determining the intersection-line feature point coordinate corresponding to each target line equation based on the plane intersection point coordinate and the preset distance standard corresponding to that target line equation. The preset distance standard requires that the distance between the intersection-line feature point coordinate and the plane intersection point coordinate equals a preset standard distance, and that the distance between the intersection-line feature point coordinate and the main radar center point is larger (or smaller) than the distance between the other candidate feature point coordinate satisfying the preset standard distance and the main radar center point.
In this embodiment, the main point cloud data is taken as an example for explanation, and accordingly, a specific implementation manner corresponding to the pre-auxiliary point cloud data is similar to a specific implementation manner corresponding to the main point cloud data.
Specifically, based on the first main plane equation G_P, the second main plane equation A_P and the third main plane equation B_P corresponding to the main point cloud data, the main line equation GA_P between the first calibration plane and the second calibration plane, the main line equation GB_P between the first calibration plane and the third calibration plane, and the main line equation AB_P between the second calibration plane and the third calibration plane are determined. Further, the plane intersection point coordinate corresponding to the main point cloud data is determined based on the main line equations GA_P, GB_P and AB_P.
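A possible numeric sketch of this step (function names are illustrative): the direction of each intersection line is the cross product of the two plane normals, and the common plane intersection point solves the 3×3 linear system formed by the three plane equations:

```python
import numpy as np

def intersection_line_direction(plane1, plane2):
    """Unit direction of the line where two planes ax+by+cz+d=0 meet."""
    d = np.cross(np.asarray(plane1[:3], float), np.asarray(plane2[:3], float))
    return d / np.linalg.norm(d)

def plane_intersection_point(plane1, plane2, plane3):
    """Common point of three pairwise-intersecting planes (normals independent)."""
    A = np.array([plane1[:3], plane2[:3], plane3[:3]], dtype=float)
    b = -np.array([plane1[3], plane2[3], plane3[3]], dtype=float)
    return np.linalg.solve(A, b)
```

With G_P, A_P and B_P as inputs, `plane_intersection_point` would yield o_P, and `intersection_line_direction` the directions of GA_P, GB_P and AB_P.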
Here, the intersection-line feature points are feature points lying on the main line equations. Specifically, the preset standard distances in the preset distance standards corresponding to different main line equations may be the same or different, but the distance magnitude relations in the preset distance standards are the same; that is, the preset distance standards corresponding to the different main line equations all take the feature point with the larger distance as the intersection-line feature point coordinate, or all take the feature point with the smaller distance.
In one embodiment, optionally, the preset distance standards corresponding to the main line equations GA_P and GB_P are the same. In particular, the preset standard distance in the preset distance standard corresponding to GA_P is the same as the preset standard distance in the preset distance standard corresponding to GB_P.
Fig. 3 is a schematic diagram of intersection-line feature point coordinates according to the second embodiment of the present invention. Taking the main line equation AB_P as an example, let the preset standard distance in the corresponding preset distance standard be β. Two feature points g_P1 and g_P2 whose distances from the plane intersection point coordinate equal β are selected in the two directions along AB_P, the distances between the two feature point coordinates and the main radar center point are calculated respectively, and either the feature point with the larger distance or the feature point with the smaller distance is taken as the intersection-line feature point. For example, β may be 0.5. Taking the main line equation GA_P as an example, let the preset standard distance in the corresponding preset distance standard be γ. Two feature points b_P1 and b_P2 whose distances from the plane intersection point coordinate equal γ are selected in the two directions along GA_P, the distances between the two feature point coordinates and the main radar center point are calculated respectively, and either the feature point with the larger distance or the feature point with the smaller distance is taken as the intersection-line feature point. For example, γ may be 1.0.
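This selection rule can be sketched as follows, assuming the intersection line is given by the plane intersection point o and a unit direction vector (the function name and parameter names are assumptions for illustration):

```python
import numpy as np

def select_line_feature(o, direction, standard_dist, radar_center, use_farther=True):
    """Pick the candidate at +/- standard_dist along the line whose distance
    to the main radar center point is larger (or smaller)."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    candidates = np.stack([o + standard_dist * d, o - standard_dist * d])
    dists = np.linalg.norm(candidates - radar_center, axis=1)
    idx = int(np.argmax(dists)) if use_farther else int(np.argmin(dists))
    return candidates[idx]
```

Calling it once per main line equation (with β or γ as `standard_dist`) yields g_P, b_P and a_P; the same `use_farther` flag must be used for all three lines so the magnitude relation stays consistent.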
Specifically, the intersection-line feature points respectively corresponding to the main line equations AB_P, GA_P and GB_P are either g_P1, b_P1 and a_P1, or g_P2, b_P2 and a_P2.
Illustratively, the plane intersection point coordinate corresponding to the main point cloud data is denoted as o_P, and the intersection-line feature point coordinates corresponding to the main line equations AB_P, GA_P and GB_P are denoted as g_P, b_P and a_P, respectively. The plane intersection point coordinate corresponding to the pre-auxiliary point cloud data is denoted as o_Q, and the intersection-line feature point coordinates corresponding to the pre-auxiliary line equations AB_Q, GA_Q and GB_Q are denoted as g_Q, b_Q and a_Q, respectively.
On the basis of the foregoing embodiment, optionally, the method further includes: respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data; determining offset angle data corresponding to the characteristic point coordinates of each intersection line respectively based on the characteristic point coordinates of the three intersection lines corresponding to the target point cloud data; based on the data of each offset angle, sorting the coordinates of the characteristic points of each intersection line to obtain a target characteristic set; and when the target point cloud data is the main point cloud data, the target feature set is the main feature set, and when the target point cloud data is the pre-auxiliary point cloud data, the target feature set is the auxiliary feature set.
Specifically, offset angle data between the intersection characteristic point coordinates and the main radar center point is determined. In an embodiment, optionally, when the first calibration plane is an XOY plane, the offset angle data is a Z-axis deflection angle, when the first calibration plane is an XOZ plane, the offset angle data is a Y-axis deflection angle, and when the first calibration plane is a YOZ plane, the offset angle data is an X-axis deflection angle.
Specifically, the intersection-line feature point coordinates are sorted based on the offset angle data; the sorting order may be, for example, from large to small or from small to large. The advantage of such an arrangement is as follows. The determination order of the second target plane equation and the third target plane equation is random. For example, assume that the reference plane point cloud data includes plane point cloud data corresponding to calibration plane A and plane point cloud data corresponding to calibration plane B. The second main plane equation determined based on the reference plane point cloud data corresponding to the main point cloud data may be the plane equation corresponding to either calibration plane A or calibration plane B, and similarly the second auxiliary plane equation determined based on the reference plane point cloud data corresponding to the pre-auxiliary point cloud data may be the plane equation corresponding to either calibration plane A or calibration plane B. As a result, when the main feature set and the auxiliary feature set are generated, b_P in the main feature set may correspond to a_Q in the auxiliary feature set, and a large error then occurs in the reference conversion matrix obtained by the subsequent pose estimation operation. By sorting the intersection-line feature point coordinates based on the offset angle data, the order of the intersection-line feature point coordinates in the main feature set and the auxiliary feature set is kept consistent, which ensures the accuracy of the reference conversion matrix obtained by the subsequent pose estimation operation.
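One way to realize this angle-consistent ordering, under the assumption that the first calibration plane is the ground so the offset angle is the horizontal (Z-axis) deflection angle atan2(y, x) about the radar center at the origin, is a simple sort (a sketch, not the patent's exact procedure):

```python
import math

def sort_by_horizontal_angle(feature_points, descending=False):
    """Sort 3-D feature points by their Z-axis deflection angle about the origin."""
    return sorted(feature_points,
                  key=lambda p: math.atan2(p[1], p[0]),
                  reverse=descending)
```

Applying the same sort to the intersection-line feature points of both the main and the pre-auxiliary point cloud data makes the element order of the two feature sets agree regardless of which plane was fitted second or third.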
Fig. 4A is a flowchart of a method for determining the main feature set according to the second embodiment of the present invention. Specifically, the main point cloud data P_0 collected by the main radar is acquired, the first plane point cloud data P_G corresponding to the first calibration plane is determined from P_0 based on the average height and the preset height difference threshold, and plane fitting is performed on P_G to obtain the first main plane equation G_P. The main point cloud data P_0 is screened based on the preset parameter range to obtain the reference main point cloud data P_2, and the point cloud data in P_2 corresponding to the first main plane equation G_P is deleted to obtain the reference plane point cloud data P_AB. Plane fitting is performed on P_AB to obtain the second main plane equation A_P. The point cloud data in P_AB corresponding to the second main plane equation A_P is then deleted to obtain the third plane point cloud data, and plane fitting is performed on the third plane point cloud data to obtain the third main plane equation B_P.
Based on the first main plane equation G_P, the second main plane equation A_P and the third main plane equation B_P, the main line equations GA_P, GB_P and AB_P are determined, and based on these, the intersection-line feature point coordinates a_P, b_P and g_P and the plane intersection point coordinate o_P are determined. The intersection-line feature point coordinates are sorted based on the offset angle data corresponding to each of them to obtain the main feature set; illustratively, the main feature set X = [o_P, g_P, a_P, b_P].
Fig. 4B is a flowchart of a method for determining the auxiliary feature set according to the second embodiment of the present invention. Specifically, the auxiliary point cloud data Q_0 collected by the auxiliary radar is acquired, and coordinate conversion is performed on Q_0 based on the preset conversion matrix to obtain the pre-auxiliary point cloud data Q_1. The first plane point cloud data Q_G corresponding to the first calibration plane is determined from Q_1 based on the average height and the preset height difference threshold, and plane fitting is performed on Q_G to obtain the first auxiliary plane equation G_Q. The pre-auxiliary point cloud data Q_1 is screened based on the preset parameter range to obtain the reference auxiliary point cloud data Q_2, and the point cloud data in Q_2 corresponding to the first auxiliary plane equation G_Q is deleted to obtain the reference plane point cloud data Q_AB. Plane fitting is performed on Q_AB to obtain the second auxiliary plane equation A_Q. The point cloud data in Q_AB corresponding to the second auxiliary plane equation A_Q is then deleted to obtain the third plane point cloud data, and plane fitting is performed on the third plane point cloud data to obtain the third auxiliary plane equation B_Q.
Based on the first auxiliary plane equation G_Q, the second auxiliary plane equation A_Q and the third auxiliary plane equation B_Q, the auxiliary line equations GA_Q, GB_Q and AB_Q are determined, and based on these, the intersection-line feature point coordinates a_Q, b_Q and g_Q and the plane intersection point coordinate o_Q are determined. The intersection-line feature point coordinates are sorted based on the offset angle data corresponding to each of them to obtain the auxiliary feature set; illustratively, the auxiliary feature set Y = [o_Q, g_Q, a_Q, b_Q].
According to the technical scheme of this embodiment, the parameter data between the target point cloud data and the main radar center point is determined, and the target point cloud data is screened based on the preset parameter range to obtain the reference target point cloud data; the reference plane point cloud data is then determined based on the reference target point cloud data and the first target plane equation corresponding to the target point cloud data. The plane point cloud data corresponding to the first calibration plane is thus filtered out of the target point cloud data by two successive screening methods, which solves the problem of poor screening of the reference plane point cloud data: the finally obtained reference plane point cloud data contains, as far as possible, only the second plane point cloud data and the third plane point cloud data, which improves the fitting of the subsequent plane equations and further improves the accuracy of the radar calibration result.
EXAMPLE III
Fig. 5 is a schematic diagram of a radar calibration apparatus according to the third embodiment of the present invention. This embodiment is applicable to radar calibration in multi-radar navigation or positioning scenarios, and the apparatus can be implemented in software and/or hardware. The radar calibration apparatus includes: a point cloud data acquisition module 310, a plane equation determination module 320, and a point cloud data calibration module 330.
The point cloud data acquisition module 310 is configured to acquire main point cloud data and auxiliary point cloud data acquired by a main radar and an auxiliary radar respectively; the main point cloud data and the auxiliary point cloud data respectively comprise plane point cloud data corresponding to three calibration planes which are intersected pairwise;
a plane equation determining module 320, configured to determine, based on the plane point cloud data corresponding to the main point cloud data and the auxiliary point cloud data, a plane equation corresponding to each plane point cloud data;
and the point cloud data calibration module 330 is configured to determine a target transformation matrix between the main radar and the auxiliary radar based on each plane equation, and determine calibration point cloud data based on the target transformation matrix.
According to the technical scheme of this embodiment, plane point cloud data corresponding to three pairwise-intersecting calibration planes are collected by the main radar and the auxiliary radar, the plane equation of each calibration plane is determined based on the plane point cloud data, and the target conversion matrix between the main radar and the auxiliary radar is determined based on the plane equations. This solves the problem in the prior art that the radar calibration process is subject to many limiting factors, places no restriction on the installation position relationship between the radars or on the radar types, and thereby widens the applicable scenarios of radar calibration.
On the basis of the above technical solution, optionally, the plane equation determining module 320 includes:
the plane equation determining unit is used for performing coordinate conversion on the auxiliary point cloud data based on a preset conversion matrix to obtain pre-auxiliary point cloud data, and determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data;
correspondingly, the point cloud data calibration module 330 includes:
and the reference conversion matrix determining unit is used for determining a reference conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining a target conversion matrix between the main radar and the auxiliary radar based on a preset conversion matrix and the reference conversion matrix.
On the basis of the above technical solution, optionally, the plane equation determining unit includes:
the first target plane equation determining subunit is used for respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data; determining a first target plane equation corresponding to a first calibration plane based on first plane point cloud data corresponding to the target point cloud data;
the second target plane equation determining subunit is used for determining a second target plane equation corresponding to the second calibration plane and a third target plane equation corresponding to the third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; the reference plane point cloud data comprises second plane point cloud data and third plane point cloud data.
On the basis of the above technical solution, optionally, the apparatus further includes:
the first plane point cloud data determining module is used for selecting point cloud data with the minimum vertical coordinate in the target point cloud data based on the preset selecting number and determining the average height corresponding to the selected point cloud data; and screening the target point cloud data based on the average height and a preset height difference threshold value to obtain first plane point cloud data corresponding to the target point cloud data.
On the basis of the above technical solution, optionally, the apparatus further includes:
the reference plane point cloud data determining module is used for determining reference plane point cloud data corresponding to the target point cloud data based on the target point cloud data and a first target plane equation corresponding to the target point cloud data; when the target point cloud data is main point cloud data, the first target plane equation is a first main plane equation, and when the target point cloud data is pre-auxiliary point cloud data, the first target plane equation is a first auxiliary plane equation.
On the basis of the above technical solution, optionally, the reference plane point cloud data determining module is specifically configured to:
determining parameter data between the target point cloud data and a main radar central point, and taking point cloud data which meets a preset parameter range in the target point cloud data as reference target point cloud data; wherein the parameter data comprises distance data and/or offset angle data;
and determining reference plane point cloud data corresponding to the target point cloud data based on the reference target point cloud data and a first target plane equation corresponding to the target point cloud data.
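The screening by parameter data can be sketched as a combined distance and horizontal deflection-angle filter, assuming the main radar center point is at the origin and using the example ranges given earlier ([0, 5 m] and [0, 90°]); the function name is illustrative:

```python
import numpy as np

def screen_by_parameter_range(points, max_dist=5.0, angle_range=(0.0, 90.0)):
    """Keep points whose range and horizontal deflection angle (degrees)
    fall inside the preset parameter ranges."""
    dist = np.linalg.norm(points, axis=1)
    angle = np.degrees(np.arctan2(points[:, 1], points[:, 0]))
    mask = (dist <= max_dist) & (angle >= angle_range[0]) & (angle <= angle_range[1])
    return points[mask]
```

The surviving points form the reference target point cloud data from which the first-plane residue is subsequently deleted.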
On the basis of the above technical solution, optionally, the second target plane equation determining subunit is specifically configured to:
determining a second target plane equation corresponding to a second calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; determining third plane point cloud data corresponding to the target point cloud data based on the reference plane point cloud data and the second target plane equation; and determining a third target plane equation corresponding to the third calibration plane based on the third plane point cloud data.
On the basis of the foregoing technical solution, optionally, the reference transformation matrix determining unit includes:
the characteristic point coordinate determination subunit is used for determining characteristic point coordinates respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data based on three plane equations respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data; the characteristic point coordinates comprise plane intersection point coordinates and intersection line characteristic point coordinates, the plane intersection point coordinates are used for representing intersection point coordinates of the three calibration planes, and the intersection line characteristic point coordinates are used for representing characteristic point coordinates on an intersection line between the three calibration planes;
and the reference conversion matrix determining subunit is used for performing pose estimation operation on a main feature set generated based on the feature point coordinates corresponding to the main point cloud data and an auxiliary feature set generated based on the feature point coordinates corresponding to the pre-auxiliary point cloud data to obtain a reference conversion matrix between the main radar and the auxiliary radar.
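The pose estimation operation between the ordered main and auxiliary feature sets can be sketched with the standard SVD-based (Kabsch/Umeyama) rigid alignment of corresponding points; this is one common technique, not necessarily the patent's exact procedure:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Return (R, t) minimizing ||R @ src_i + t - dst_i|| over correspondences.

    src, dst: (N, 3) arrays of corresponding feature points, N >= 3
    (e.g. the auxiliary feature set Y and main feature set X).
    """
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t
```

The resulting (R, t) plays the role of the reference conversion matrix; composing it with the preset conversion matrix would give the target conversion matrix.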
On the basis of the above technical solution, optionally, the feature point coordinate determination subunit is specifically configured to:
respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data;
determining three target linear equations corresponding to the three calibration planes based on three plane equations corresponding to the target point cloud data, and determining plane intersection point coordinates corresponding to the target point cloud data based on each target linear equation;
determining the intersection-line feature point coordinate corresponding to each target line equation based on the plane intersection point coordinate and the preset distance standard corresponding to that target line equation; the preset distance standard requires that the distance between the intersection-line feature point coordinate and the plane intersection point coordinate equals a preset standard distance, and that the distance between the intersection-line feature point coordinate and the main radar center point is larger (or smaller) than the distance between the other candidate feature point coordinate satisfying the preset standard distance and the main radar center point.
On the basis of the above technical solution, optionally, the apparatus further includes:
the intersection line characteristic point coordinate ordering module is used for respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data; determining offset angle data corresponding to the characteristic point coordinates of each intersection line respectively based on the characteristic point coordinates of the three intersection lines corresponding to the target point cloud data; based on the data of each offset angle, sorting the coordinates of the characteristic points of each intersection line to obtain a target characteristic set; and when the target point cloud data is the main point cloud data, the target feature set is the main feature set, and when the target point cloud data is the pre-auxiliary point cloud data, the target feature set is the auxiliary feature set.
The radar calibration device provided by the embodiment of the invention can be used for executing the radar calibration method provided by the embodiment of the invention, and has corresponding functions and beneficial effects of the execution method.
It should be noted that, in the embodiment of the radar calibration apparatus, each included unit and each included module are only divided according to functional logic, but are not limited to the above division, as long as the corresponding function can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
Example four
Fig. 6 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention, where the embodiment of the present invention provides a service for implementing the radar calibration method according to the foregoing embodiment of the present invention, and the radar calibration device according to the foregoing embodiment may be configured. FIG. 6 illustrates a block diagram of an exemplary electronic device 12 suitable for use in implementing embodiments of the present invention. The electronic device 12 shown in fig. 6 is only an example and should not bring any limitation to the function and the scope of use of the embodiment of the present invention.
As shown in FIG. 6, electronic device 12 is embodied in the form of a general purpose computing device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Electronic device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 6, and commonly referred to as a "hard drive"). Although not shown in FIG. 6, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with electronic device 12, and/or with any devices (e.g., network card, modem, etc.) that enable electronic device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown in FIG. 6, the network adapter 20 communicates with the other modules of the electronic device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes programs stored in the system memory 28 to perform various functional applications and data processing, such as implementing the radar calibration method provided by the embodiments of the present invention.
Through the electronic equipment, the problem that the limiting factors are more in the radar calibration process in the prior art is solved, the installation position relation between the radars and the radar type are not limited, and therefore the applicable scene of radar calibration is widened.
EXAMPLE five
An embodiment of the present invention further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform a radar calibration method, the method including:
acquiring main point cloud data and auxiliary point cloud data respectively acquired by a main radar and an auxiliary radar; the main point cloud data and the auxiliary point cloud data each comprise plane point cloud data corresponding to three calibration planes that intersect pairwise;
determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the auxiliary point cloud data;
and determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining calibration point cloud data based on the target conversion matrix.
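Concretely, the three steps above reduce to: fit a plane to each calibration surface, intersect the three planes into a corner feature point per radar, and estimate the rigid transform aligning the two radars' feature sets. The NumPy sketch below illustrates one possible realization; the function names, the SVD plane fit, and the Kabsch pose solver are assumptions of this sketch, not taken from the patent.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: unit normal n and offset d with n.x + d = 0."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]                      # singular vector of the smallest singular value
    return n, -n.dot(centroid)

def plane_intersection(planes):
    """Intersection point of three pairwise-intersecting planes (n_i . x = -d_i)."""
    N = np.array([n for n, _ in planes])
    rhs = np.array([-d for _, d in planes])
    return np.linalg.solve(N, rhs)

def rigid_transform(src, dst):
    """Kabsch: 4x4 homogeneous T with dst ~ R @ src + t, from matched 3D points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = cd - R @ cs
    return T
```

In practice the plane fits would operate on the segmented plane point cloud data of each radar, and at least three non-collinear corresponding feature points are needed for a unique pose, which is why the later claims derive additional intersection-line feature points beyond the single plane intersection point.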
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
Of course, the computer-executable instructions contained in the storage medium provided by the embodiments of the present invention are not limited to the method operations described above, and may also perform related operations in the radar calibration method provided by any embodiment of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments illustrated herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (13)

1. A radar calibration method is characterized by comprising the following steps:
acquiring main point cloud data and auxiliary point cloud data respectively acquired by a main radar and an auxiliary radar; the main point cloud data and the auxiliary point cloud data each comprise plane point cloud data corresponding to three calibration planes that intersect pairwise;
determining a plane equation corresponding to each plane point cloud data based on the plane point cloud data corresponding to the main point cloud data and the auxiliary point cloud data;
and determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining calibration point cloud data based on the target conversion matrix.
2. The method of claim 1, wherein determining a plane equation corresponding to each of the planar point cloud data based on the planar point cloud data corresponding to each of the primary point cloud data and the secondary point cloud data comprises:
performing coordinate conversion on the auxiliary point cloud data based on a preset conversion matrix to obtain pre-auxiliary point cloud data, and determining plane equations respectively corresponding to the plane point cloud data based on the plane point cloud data respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data;
correspondingly, the determining a target transformation matrix between the main radar and the auxiliary radar based on each plane equation comprises:
and determining a reference conversion matrix between the main radar and the auxiliary radar based on each plane equation, and determining a target conversion matrix between the main radar and the auxiliary radar based on the preset conversion matrix and the reference conversion matrix.
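The composition in claim 2 chains two transforms: the preset (coarse) matrix is applied to the auxiliary points first, and the reference (refining) matrix corrects the residual misalignment. With homogeneous 4×4 matrices this is a single product; the helper names below are illustrative, not from the patent.

```python
import numpy as np

def compose_target_transform(T_preset, T_reference):
    """Full auxiliary-to-main mapping: T_preset acts first, then T_reference,
    so the target conversion matrix is the product T_reference @ T_preset."""
    return T_reference @ T_preset

def apply_transform(T, points):
    """Apply a 4x4 homogeneous transform T to an (N, 3) point array."""
    return points @ T[:3, :3].T + T[:3, 3]
```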
3. The method of claim 2, wherein determining a plane equation corresponding to each of the plane point cloud data based on the plane point cloud data corresponding to each of the primary point cloud data and the pre-auxiliary point cloud data comprises:
respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data;
determining a first target plane equation corresponding to a first calibration plane based on first plane point cloud data corresponding to the target point cloud data;
determining a second target plane equation corresponding to a second calibration plane and a third target plane equation corresponding to a third calibration plane based on the reference plane point cloud data corresponding to the target point cloud data; wherein the reference plane point cloud data comprises second plane point cloud data and third plane point cloud data.
4. The method of claim 3, further comprising:
selecting, according to a preset selection quantity, the point cloud data with the smallest vertical coordinates from the target point cloud data, and determining the average height of the selected point cloud data;
and screening the target point cloud data based on the average height and a preset height difference threshold to obtain the first plane point cloud data corresponding to the target point cloud data.
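Claim 4 describes extracting the first (ground-like) calibration plane by height. A minimal sketch, in which the hypothetical parameters `n_lowest` and `height_diff` stand in for the preset selection quantity and preset height difference threshold:

```python
import numpy as np

def extract_first_plane(points, n_lowest=100, height_diff=0.2):
    """Average the n_lowest smallest z-values, then keep every point whose
    height lies within height_diff of that average (illustrative thresholds)."""
    z = points[:, 2]
    avg_h = np.sort(z)[:min(n_lowest, len(z))].mean()
    return points[np.abs(z - avg_h) <= height_diff]
```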
5. The method of claim 3, further comprising:
determining reference plane point cloud data corresponding to the target point cloud data based on the target point cloud data and a first target plane equation corresponding to the target point cloud data; the first target plane equation is a first main plane equation when the target point cloud data is main point cloud data, and the first target plane equation is a first auxiliary plane equation when the target point cloud data is pre-auxiliary point cloud data.
6. The method of claim 5, wherein determining reference plane point cloud data corresponding to the target point cloud data based on the target point cloud data and a first target plane equation corresponding to the target point cloud data comprises:
determining parameter data between the target point cloud data and a main radar central point, and taking the point cloud data whose parameter data falls within a preset parameter range as reference target point cloud data; wherein the parameter data comprises distance data and/or offset angle data;
determining reference plane point cloud data corresponding to the target point cloud data based on the reference target point cloud data and a first target plane equation corresponding to the target point cloud data.
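The parameter screening of claim 6 can be sketched as a distance and offset-angle gate around the main radar central point; the thresholds and function name below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def filter_by_range_and_angle(points, center, max_dist=20.0, max_angle_deg=60.0):
    """Keep points whose distance to the main radar central point and whose
    horizontal offset angle both fall inside preset ranges."""
    rel = points - center
    dist = np.linalg.norm(rel, axis=1)
    angle = np.degrees(np.abs(np.arctan2(rel[:, 1], rel[:, 0])))
    return points[(dist <= max_dist) & (angle <= max_angle_deg)]
```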
7. The method of claim 3, wherein determining a second target plane equation corresponding to a second calibration plane and a third target plane equation corresponding to a third calibration plane based on reference plane point cloud data corresponding to the target point cloud data comprises:
determining a second target plane equation corresponding to a second calibration plane based on the reference plane point cloud data corresponding to the target point cloud data;
determining third plane point cloud data corresponding to the target point cloud data based on the reference plane point cloud data and a second target plane equation;
and determining a third target plane equation corresponding to a third calibration plane based on the third plane point cloud data.
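Claim 7 fits the second plane from the reference plane points, peels off that plane's inliers, and fits the third plane from the remainder. The sketch below uses a small 3-point RANSAC to isolate the dominant second plane before a least-squares refit; RANSAC is one common choice for this step, not necessarily the patent's, and all names and thresholds are illustrative.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane n.x + d = 0 via SVD of the centered points."""
    c = points.mean(axis=0)
    n = np.linalg.svd(points - c)[2][-1]
    return n, -n.dot(c)

def ransac_plane(points, thresh=0.05, iters=200, seed=0):
    """Inlier mask of the dominant plane (simple 3-point RANSAC)."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-9:        # degenerate (collinear) sample
            continue
        n = n / np.linalg.norm(n)
        mask = np.abs((points - p0) @ n) < thresh
        if best is None or mask.sum() > best.sum():
            best = mask
    return best

def split_second_third(reference_pts, thresh=0.05):
    """Second plane = dominant plane of the reference points; the third plane
    is fitted to whatever the second plane's inliers leave behind."""
    mask2 = ransac_plane(reference_pts, thresh)
    second = fit_plane(reference_pts[mask2])
    third = fit_plane(reference_pts[~mask2])
    return second, third
```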
8. The method of claim 2, wherein determining a reference transformation matrix between the primary radar and the secondary radar based on each of the plane equations comprises:
determining feature point coordinates respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data based on three plane equations respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data; the feature point coordinates comprise plane intersection point coordinates and intersection-line feature point coordinates, the plane intersection point coordinates representing the intersection point of the three calibration planes, and the intersection-line feature point coordinates representing feature points on the intersection lines between the three calibration planes;
and carrying out pose estimation operation on a main feature set generated based on the feature point coordinates corresponding to the main point cloud data and an auxiliary feature set generated based on the feature point coordinates corresponding to the pre-auxiliary point cloud data to obtain a reference conversion matrix between the main radar and the auxiliary radar.
9. The method of claim 8, wherein determining feature point coordinates respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data based on three plane equations respectively corresponding to the main point cloud data and the pre-auxiliary point cloud data comprises:
respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data;
determining three target linear equations corresponding to the three calibration planes based on three plane equations corresponding to the target point cloud data, and determining plane intersection point coordinates corresponding to the target point cloud data based on each target linear equation;
respectively determining the intersection-line feature point coordinates corresponding to the target linear equations based on the plane intersection point coordinates and preset distance standards respectively corresponding to the target linear equations; the preset distance standard comprises: the distance between the intersection-line feature point coordinate and the plane intersection point coordinate meets a preset standard distance, and the distance between the intersection-line feature point coordinate and the main radar central point is larger than (or smaller than) the distance between the other candidate feature point coordinate meeting the preset standard distance and the main radar central point.
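For each intersection line (whose direction is obtainable as the cross product of the two plane normals), the preset distance standard of claim 9 picks one of the two points lying at the standard distance from the plane intersection point, disambiguated by its distance to the main radar central point. A sketch with illustrative names and parameters:

```python
import numpy as np

def line_feature_point(corner, n_a, n_b, center, standard_dist=1.0, farther=True):
    """Of the two candidates at +/-standard_dist along the intersection line
    from the corner, keep the one farther from (or, if farther=False, closer
    to) the main radar central point."""
    d = np.cross(n_a, n_b)                  # line direction = cross of plane normals
    d = d / np.linalg.norm(d)
    cands = np.array([corner + standard_dist * d, corner - standard_dist * d])
    dists = np.linalg.norm(cands - center, axis=1)
    return cands[np.argmax(dists) if farther else np.argmin(dists)]
```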
10. The method of claim 8, further comprising:
respectively taking the main point cloud data and the pre-auxiliary point cloud data as target point cloud data;
determining offset angle data corresponding to each intersection-line feature point coordinate based on the three intersection-line feature point coordinates corresponding to the target point cloud data;
and sorting the intersection-line feature point coordinates based on the offset angle data to obtain a target feature set; when the target point cloud data is the main point cloud data, the target feature set is the main feature set, and when the target point cloud data is the pre-auxiliary point cloud data, the target feature set is the auxiliary feature set.
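Claim 10 orders each radar's three intersection-line feature points by offset angle so that the main and auxiliary feature sets correspond index by index before pose estimation. A minimal sketch, measuring the angle in the horizontal plane about the corner point (names are illustrative):

```python
import numpy as np

def sort_by_offset_angle(feature_pts, corner):
    """Sort (N, 3) feature points by their horizontal offset angle about the
    corner, giving both radars' feature sets a matching order."""
    rel = feature_pts - corner
    angles = np.arctan2(rel[:, 1], rel[:, 0])
    return feature_pts[np.argsort(angles)]
```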
11. A radar calibration device, comprising:
the point cloud data acquisition module is used for acquiring main point cloud data and auxiliary point cloud data respectively acquired by the main radar and the auxiliary radar; the main point cloud data and the auxiliary point cloud data each comprise plane point cloud data corresponding to three calibration planes that intersect pairwise;
the plane equation determining module is used for determining a plane equation corresponding to each plane point cloud data based on the plane point cloud data corresponding to the main point cloud data and the auxiliary point cloud data;
and the point cloud data calibration module is used for determining a target conversion matrix between the main radar and the auxiliary radar based on each plane equation and determining calibration point cloud data based on the target conversion matrix.
12. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the radar calibration method of any one of claims 1-10.
13. A storage medium containing computer executable instructions for performing the radar calibration method of any one of claims 1 to 10 when executed by a computer processor.
CN202110075819.4A 2021-01-20 2021-01-20 Radar calibration method, device, equipment and storage medium Active CN113759348B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110075819.4A CN113759348B (en) 2021-01-20 2021-01-20 Radar calibration method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113759348A 2021-12-07
CN113759348B CN113759348B (en) 2024-05-17

Family

ID=78786387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110075819.4A Active CN113759348B (en) 2021-01-20 2021-01-20 Radar calibration method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113759348B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109613543A (en) * 2018-12-06 2019-04-12 深圳前海达闼云端智能科技有限公司 Method and device for correcting laser point cloud data, storage medium and electronic equipment
CN110031824A (en) * 2019-04-12 2019-07-19 杭州飞步科技有限公司 Laser radar combined calibrating method and device
CN110148180A (en) * 2019-04-22 2019-08-20 河海大学 A kind of laser radar and camera fusing device and scaling method
CN111815716A (en) * 2020-07-13 2020-10-23 北京爱笔科技有限公司 Parameter calibration method and related device
CN112233182A (en) * 2020-12-15 2021-01-15 北京云测网络科技有限公司 Method and device for marking point cloud data of multiple laser radars


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114694138A (en) * 2022-06-01 2022-07-01 远峰科技股份有限公司 Road surface detection method, device and equipment applied to intelligent driving
CN114694138B (en) * 2022-06-01 2022-09-20 远峰科技股份有限公司 Road surface detection method, device and equipment applied to intelligent driving
CN115236690A (en) * 2022-09-20 2022-10-25 图达通智能科技(武汉)有限公司 Data fusion method and device for laser radar system and readable storage medium
CN115236690B (en) * 2022-09-20 2023-02-10 图达通智能科技(武汉)有限公司 Data fusion method and device for laser radar system and readable storage medium

Also Published As

Publication number Publication date
CN113759348B (en) 2024-05-17

Similar Documents

Publication Publication Date Title
CN109297510B (en) Relative pose calibration method, device, equipment and medium
CN109270545B (en) Positioning true value verification method, device, equipment and storage medium
CN112654886B (en) External parameter calibration method, device, equipment and storage medium
CN110163903B (en) Three-dimensional image acquisition and image positioning method, device, equipment and storage medium
EP3620823B1 (en) Method and device for detecting precision of internal parameter of laser radar
CN110095752B (en) Positioning method, apparatus, device and medium
CN109190573B (en) Ground detection method applied to unmanned vehicle, electronic equipment and vehicle
CN110927708B (en) Calibration method, device and equipment of intelligent road side unit
CN111127563A (en) Combined calibration method and device, electronic equipment and storage medium
CN110988849B (en) Calibration method and device of radar system, electronic equipment and storage medium
EP3617997A1 (en) Method, apparatus, device, and storage medium for calibrating posture of moving obstacle
KR20210052409A (en) Lane line determination method and apparatus, lane line positioning accuracy evaluation method and apparatus, device, and program
CN110849363B (en) Pose calibration method, system and medium for laser radar and combined inertial navigation
CN110782531A (en) Method and computing device for processing three-dimensional point cloud data
CN113759348B (en) Radar calibration method, device, equipment and storage medium
WO2022179094A1 (en) Vehicle-mounted lidar external parameter joint calibration method and system, medium and device
CN112684478A (en) Parameter calibration method and device based on double antennas, storage medium and electronic equipment
CN114926549A (en) Three-dimensional point cloud processing method, device, equipment and storage medium
CN114049401A (en) Binocular camera calibration method, device, equipment and medium
CN109389119B (en) Method, device, equipment and medium for determining interest point region
CN110853098A (en) Robot positioning method, device, equipment and storage medium
WO2022160879A1 (en) Method and apparatus for determining conversion parameters
CN115063489A (en) External parameter calibration method, device, equipment and storage medium
CN110634159A (en) Target detection method and device
CN111366172A (en) Quality detection method and device of digital elevation model and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant