CN111443337B - Radar-IMU calibration method based on hand-eye calibration - Google Patents

Radar-IMU calibration method based on hand-eye calibration

Info

Publication number
CN111443337B
Authority
CN
China
Prior art keywords
radar
imu
relative pose
solving
hand
Prior art date
Legal status
Active
Application number
CN202010230937.3A
Other languages
Chinese (zh)
Other versions
CN111443337A (en)
Inventor
赵龙
冀磊
Current Assignee
Beihang University
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN202010230937.3A priority Critical patent/CN111443337B/en
Publication of CN111443337A publication Critical patent/CN111443337A/en
Application granted granted Critical
Publication of CN111443337B publication Critical patent/CN111443337B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/02 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 13/00
    • G01S 7/40 Means for monitoring or calibrating
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C 25/005 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass; initial alignment, calibration or starting-up of inertial devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Manufacturing & Machinery (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Gyroscopes (AREA)

Abstract

The invention discloses a radar-IMU calibration method based on hand-eye calibration, which extracts feature points from radar point cloud data using curvature information; matches two adjacent radar scans, representing the previous time and the current time respectively, using the feature points, to obtain the relative pose of the two adjacent scans; aligns the radar timestamps with the IMU timestamps and solves the relative pose of the IMU between two adjacent timestamps; and, from the relative pose of the two adjacent radar scans and the relative pose of the IMU between the two adjacent timestamps, solves the relative pose of the radar and the IMU using a hand-eye calibration method. The method effectively weakens the influence of the radar's own motion on the point cloud data and improves the accuracy of point cloud matching and extrinsic calibration.

Description

Radar-IMU calibration method based on hand-eye calibration
Technical Field
The invention relates to the field of electronic information, in particular to a radar-IMU calibration method based on hand-eye calibration.
Background
SLAM, short for Simultaneous Localization and Mapping, is a technique for estimating one's own motion and constructing a map of the surroundings in an unfamiliar environment. Because of its important theoretical and application value, scholars regard it as a key technology for realizing autonomous driving. Early SLAM relied on only a single sensor such as a camera or radar; as hardware developed and the theory matured, SLAM incorporating multiple sensors became a research focus. To realize multi-sensor fusion, the basic task is to solve the relative pose between the sensors, which can also be called extrinsic calibration. An inertial measurement unit (IMU) is a device that measures the three-axis attitude angles (or angular rates) and acceleration of an object. An IMU generally contains three single-axis accelerometers and three single-axis gyroscopes: the accelerometers measure the acceleration of the object along the three independent axes of the carrier coordinate system, the gyroscopes measure the angular velocity of the carrier relative to the navigation coordinate system, and the attitude of the object is computed from these angular velocity and acceleration signals. Calibration methods for a camera and an IMU are mature and calibration tools are readily available online, but there is currently no established calibration algorithm for a radar and an IMU. Unlike a camera, a radar completes one scan over a whole period; in other words, the point cloud data acquired by the radar carries the radar's own motion over the scan period, so how to eliminate or weaken the influence of this point cloud distortion on radar-IMU calibration accuracy has long been a research hotspot.
Therefore, how to eliminate or weaken the influence of point cloud distortion on radar-IMU calibration accuracy is an urgent problem for practitioners in the field.
Disclosure of Invention
The invention aims to provide a radar-IMU calibration method based on hand-eye calibration that effectively weakens the influence of the radar's own motion on the point cloud data and improves the accuracy of point cloud matching and extrinsic calibration.
In order to solve the above technical problem, an embodiment of the present invention provides a radar-IMU calibration method based on hand-eye calibration, including:
S1, extracting feature points from the radar point cloud data using curvature information;
S2, matching two adjacent radar scans, which represent the previous moment and the current moment respectively, using the feature points, to obtain the relative pose of the two adjacent scans;
S3, aligning the radar timestamps with the IMU timestamps, and solving the relative pose of the IMU between two adjacent timestamps;
S4, solving the relative pose of the radar and the IMU using a hand-eye calibration method, from the relative pose of the two adjacent radar scans and the relative pose of the IMU between the two adjacent timestamps;
and S5, reducing the distortion of the radar point cloud data using the relative pose of the radar and the IMU, and completing the calibration of the relative pose of the radar and the IMU.
In one embodiment, step S1 includes: taking a point P_i in the point cloud data set P as the center, solving the curvature of P_i from the coordinate information of the 10 points nearest to P_i on the scan line where P_i lies, and marking the points whose curvature is larger than a preset threshold as the feature points.
In one embodiment, the formula for solving the curvature is:

$$\eta = \frac{1}{|S| \cdot \|P_i\|} \left\| \sum_{j \in S,\, j \neq i} (P_i - P_j) \right\|$$

where P_j denotes a point in the three-dimensional neighborhood of P_i, S denotes the set of neighborhood points of P_i, the set S comprising 10 neighborhood points, and η denotes the curvature of the point P_i.
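By way of illustration, the following minimal Python sketch implements this curvature test under stated assumptions: one scan line is processed at a time, the 10 nearest points form the neighborhood S, and the threshold uses the example value of 10^-5 given in the detailed description. The function and variable names are illustrative, not the patent's.

```python
import numpy as np
from scipy.spatial import cKDTree

def extract_feature_points(points, k_neighbors=10, threshold=1e-5):
    """Mark points whose curvature eta exceeds a threshold as feature points.

    points: (n, 3) array holding the points of one scan line.
    Curvature per the formula above:
        eta = || sum_{j in S, j != i} (P_i - P_j) || / (|S| * ||P_i||)
    """
    tree = cKDTree(points)
    eta = np.zeros(len(points))
    for i, p in enumerate(points):
        # k+1 because the query returns P_i itself as its own nearest neighbor
        _, idx = tree.query(p, k=k_neighbors + 1)
        neighbors = points[idx[1:]]          # the 10 nearest points: the set S
        eta[i] = (np.linalg.norm((p - neighbors).sum(axis=0))
                  / (k_neighbors * np.linalg.norm(p)))
    return points[eta > threshold]
```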
In one embodiment, step S2 includes:
S21, matching the feature points obtained by two adjacent scans using the point-to-line distance constraint;
and S22, solving the relative pose between the previous and current radar scans over all matched points using the Gauss-Newton method.
In one embodiment, the point-to-line distance constraint equation in S21 is:

$$d_1 = \frac{\left\| (\tilde{P}_i^{c} - P_l^{p}) \times (\tilde{P}_i^{c} - P_m^{p}) \right\|}{\left\| P_l^{p} - P_m^{p} \right\|}$$

where P_i^c denotes a feature point acquired by the scan at the current time, \tilde{P}_i^c denotes P_i^c projected into the scanning coordinate system of the previous moment, \tilde{P}_i^c = R_A P_i^c + t_A, R_A denotes rotation and t_A denotes translation; P_l^p denotes the point among the feature points of the previous scan that lies on the same scan line as \tilde{P}_i^c and has the shortest distance to it; P_m^p denotes the point of the previous scan whose scan line is adjacent to that of \tilde{P}_i^c and which has the shortest distance to it; and d_1 denotes the distance from the point \tilde{P}_i^c to the straight line through P_l^p and P_m^p.

S22 includes: taking the relative pose solved at the previous moment as the initial value, the relative pose is solved over all matched points using the Gauss-Newton method, with the constraint equation:

$$d = \sum_{i=1}^{k} d_1^{(i)}$$

where k denotes the number of feature points extracted from the scan at the current moment and d_1^{(i)} is the point-to-line distance of the i-th feature point. R_A and t_A are iteratively optimized using the Gauss-Newton method so that the value of d is minimized, thereby obtaining the relative pose R_A and t_A of the radar.
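A compact sketch of this matching and solve is given below, assuming the correspondences from S21 have already been found and parameterizing the pose as xi = [r_x, r_y, r_z, t_x, t_y, t_z] with a numerically differentiated Jacobian; the names (`matches`, `rot`, `solve_relative_pose`) are illustrative, and the patent does not prescribe this exact implementation.

```python
import numpy as np

def rot(rx, ry, rz):
    """Rotation matrix from angles about the x, y, z axes (applied z-y-x)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def point_to_line(p, p_l, p_m):
    """d1: distance from point p to the straight line through p_l and p_m."""
    return np.linalg.norm(np.cross(p - p_l, p - p_m)) / np.linalg.norm(p_l - p_m)

def solve_relative_pose(matches, xi0, iters=10, eps=1e-6):
    """Gauss-Newton over xi = [r_x, r_y, r_z, t_x, t_y, t_z], minimizing the
    stacked d1 residuals; xi0 is the relative pose solved at the previous moment.

    matches: list of (p_cur, p_l, p_m) triples produced by the S21 matching.
    """
    def residuals(xi):
        R, t = rot(*xi[:3]), xi[3:]
        return np.array([point_to_line(R @ p + t, pl, pm) for p, pl, pm in matches])

    xi = np.asarray(xi0, dtype=float).copy()
    for _ in range(iters):
        f = residuals(xi)
        J = np.empty((f.size, 6))
        for c in range(6):                        # numerical Jacobian, column c
            d = np.zeros(6); d[c] = eps
            J[:, c] = (residuals(xi + d) - f) / eps
        dxi = np.linalg.solve(J.T @ J, -J.T @ f)  # J^T J dxi = -J^T f
        xi += dxi
        if np.linalg.norm(dxi) < 1e-9:
            break
    return rot(*xi[:3]), xi[3:]
```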
In one embodiment, the step S3 includes:
the timestamps of the IMU are aligned to the timestamps of the radar point cloud data, taken as the reference, using linear interpolation; and the relative pose of the IMU between the current moment and the previous moment is solved using an inertial navigation pose solving algorithm.
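A minimal sketch of the timestamp alignment follows, assuming each IMU channel (gyroscope and accelerometer readings) can be interpolated independently; all names are illustrative.

```python
import numpy as np

def align_imu_to_radar(imu_times, imu_samples, radar_times):
    """Resample IMU data at the radar point-cloud timestamps (the reference)
    by per-channel linear interpolation.

    imu_times:   (n,) IMU sample timestamps
    imu_samples: (n, m) IMU readings, one column per channel
    radar_times: (k,) radar point-cloud timestamps
    """
    return np.stack(
        [np.interp(radar_times, imu_times, imu_samples[:, c])
         for c in range(imu_samples.shape[1])],
        axis=1,
    )
```

The relative pose between two adjacent aligned timestamps then follows from standard inertial navigation mechanization (integrating the gyroscope for rotation and the gravity-compensated acceleration twice for translation), which is not reproduced here.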
In one embodiment, the step S4 includes:
S41, establishing the basic equations of the radar-IMU hand-eye calibration;
S42, deriving the least-squares cost function of the radar-IMU hand-eye calibration from the basic equations;
and S43, solving the Jacobian matrix from the cost function, and solving the relative pose of the radar and the IMU using the Gauss-Newton method.
In one embodiment, the basic equations of the radar-IMU hand-eye calibration in S41 are:

$$R_A R_X = R_X R_B$$
$$(R_A - I_3)\, t_X = R_X t_B - t_A$$

where R_A and t_A denote the rotation and translation of the radar respectively, R_B and t_B denote the rotation and translation of the IMU, and R_X and t_X denote the relative rotation and translation between the IMU and the radar.

The least-squares cost function of the radar-IMU hand-eye calibration in S42 is derived as follows:

$$\min_{R_X, t_X} \sum \left( \|R_A R_X - R_X R_B\|^2 + \|(R_A - I_3)\, t_X - R_X t_B + t_A\|^2 \right)$$

Substituting the R_X and t_X to be solved yields the residuals; let

$$f_1 = R_A R_X - R_X R_B, \quad f_2 = (R_A - I_3)\, t_X - R_X t_B + t_A, \quad f = \begin{bmatrix} \mathrm{vec}(f_1) \\ f_2 \end{bmatrix}$$

then

$$\min_{R_X, t_X} \|f\|^2$$

The solving process of the relative pose in S43 is as follows. The Jacobian matrix is solved as

$$J = \frac{\partial f}{\partial \xi}$$

Using r_x, r_y, r_z to denote the rotations of R_X about the three coordinate axes and t_x, t_y, t_z to denote the translations of t_X along the three axes, the Jacobian matrix can be rewritten as:

$$J = \left[ \frac{\partial f}{\partial r_x},\ \frac{\partial f}{\partial r_y},\ \frac{\partial f}{\partial r_z},\ \frac{\partial f}{\partial t_x},\ \frac{\partial f}{\partial t_y},\ \frac{\partial f}{\partial t_z} \right]$$

By solving the equation J^T J Δξ = -J^T f, the relative pose increment Δξ = [Δr_x, Δr_y, Δr_z, Δt_x, Δt_y, Δt_z]^T is obtained; taking the relative pose acquired at the previous moment as the iteration initial value, the increment is superposed on the initial value for iterative solution, finally obtaining the optimal relative pose R_X and t_X of the radar and the IMU.
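The following Python sketch stacks the two basic equations over all pose pairs and iterates J^T J Δξ = -J^T f as described; the Euler-angle parameterization of R_X and the numerical Jacobian are implementation assumptions, and the function names are illustrative.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def hand_eye_residual(xi, pose_pairs):
    """Stack, for every (R_A, t_A, R_B, t_B) pair, the residuals of the two
    basic equations: R_A R_X - R_X R_B = 0 and (R_A - I_3) t_X - R_X t_B + t_A = 0."""
    R_X = Rotation.from_euler("xyz", xi[:3]).as_matrix()
    t_X = xi[3:]
    res = []
    for R_A, t_A, R_B, t_B in pose_pairs:
        res.append((R_A @ R_X - R_X @ R_B).ravel())
        res.append((R_A - np.eye(3)) @ t_X - R_X @ t_B + t_A)
    return np.concatenate(res)

def solve_hand_eye(pose_pairs, xi0=None, iters=20, eps=1e-6):
    """Gauss-Newton on xi = [r_x, r_y, r_z, t_x, t_y, t_z] via J^T J dxi = -J^T f."""
    xi = np.zeros(6) if xi0 is None else np.asarray(xi0, dtype=float).copy()
    for _ in range(iters):
        f = hand_eye_residual(xi, pose_pairs)
        J = np.empty((f.size, 6))
        for c in range(6):                      # numerical Jacobian, column c
            d = np.zeros(6); d[c] = eps
            J[:, c] = (hand_eye_residual(xi + d, pose_pairs) - f) / eps
        dxi = np.linalg.solve(J.T @ J, -J.T @ f)
        xi += dxi
        if np.linalg.norm(dxi) < 1e-10:
            break
    return Rotation.from_euler("xyz", xi[:3]).as_matrix(), xi[3:]
```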
In one embodiment, S5 includes:

S51, solving the linear-motion and nonlinear-motion parts between two adjacent moments using the IMU data;

S52, projecting the nonlinear motion of the IMU into the radar coordinate system using the relative pose obtained by the first calibration so as to simulate the radar motion, and subtracting the projected nonlinear-motion part from the point cloud data to reduce the distortion generated during the radar's motion;

and S53, extracting feature points from the processed point cloud data, matching the two adjacent radar scans, solving the relative pose of the uniform motion, adding back the nonlinear-motion part subtracted in step S52 to obtain the relative pose of the two adjacent radar scans, and calibrating the relative pose of the radar and the IMU.

In one embodiment, S51 includes: let

$$B_{II} = B_{II}^{l} \, B_{II}^{n}$$

where B_II denotes the integral pose information acquired from the IMU during calibration, decomposed into the combination of B_II^l and B_II^n, with B_II^l denoting the linear-motion part of B_II and B_II^n denoting the nonlinear-motion part of B_II.

S52 includes: suppose the result X_I of the calibration in S43 has been obtained, X_I denoting the relative pose of the radar and the IMU solved in S43; B_II^l and B_II^n are now projected into the radar coordinate system:

$$B_{LI}^{l} = X_I \, B_{II}^{l} \, X_I^{-1}$$
$$B_{LI}^{n} = X_I \, B_{II}^{n} \, X_I^{-1}$$

where B_LI^l denotes the linear part of the radar motion estimated from the IMU, and B_LI^n denotes the nonlinear part of the radar motion estimated from the IMU; after the nonlinear part is subtracted, the remaining information in the point cloud data is generated by linear motion.
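A sketch of this projection and de-skew follows, under stated assumptions: motions are 4x4 homogeneous transforms, the projection into the radar frame uses the hand-eye relation B = X_I B_IMU X_I^{-1} as above, and each point's share of the nonlinear motion is scaled linearly in time through the rotation vector, mirroring the timestamp interpolation used for matching; all names are illustrative.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def project_to_radar_frame(B_imu, X_I):
    """Express an IMU relative motion in the radar frame through the
    first-pass extrinsics: B_radar = X_I @ B_imu @ inv(X_I)."""
    return X_I @ B_imu @ np.linalg.inv(X_I)

def remove_nonlinear_motion(points, frac, B_n_radar):
    """Subtract each point's share of the projected nonlinear motion (S52).

    points:    (n, 3) scan points
    frac:      (n,) per-point time fractions (t_i - t_0) / (t_n - t_0)
    B_n_radar: 4x4 nonlinear motion over the scan, already in the radar frame
    """
    R, t = B_n_radar[:3, :3], B_n_radar[:3, 3]
    rotvec = Rotation.from_matrix(R).as_rotvec()  # rotation scaled linearly in time
    out = np.empty_like(points)
    for i, (p, s) in enumerate(zip(points, frac)):
        R_s = Rotation.from_rotvec(s * rotvec).as_matrix()
        out[i] = R_s.T @ (p - s * t)              # undo this point's motion share
    return out
```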
The invention has the advantages that it provides a radar-IMU calibration method based on hand-eye calibration. The algorithm first solves the relative pose between adjacent radar moments using point-to-line constraints, and solves the relative pose (also called the extrinsic parameters) between the radar and the IMU from these poses; on this basis, the pose of the IMU is used to simulate the radar motion, the radar relative pose with reduced point cloud distortion is solved, and the relative pose between the radar and the IMU is then solved again using this pose. Because the IMU information reduces the influence of point cloud distortion on the matching accuracy of the two radar scans during the second calibration, the method effectively weakens the influence of the radar's nonlinear motion on the calibration result and obtains more accurate radar/IMU extrinsic parameters.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
fig. 1 is a flowchart of a radar-IMU calibration method based on hand-eye calibration according to an embodiment of the present invention;
fig. 2 is a flowchart of step S4 according to an embodiment of the present invention;
fig. 3 is a flowchart of step S5 according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As shown in fig. 1, an embodiment of the present invention provides a radar-IMU calibration method based on hand-eye calibration, including:
S1, extracting feature points from the radar point cloud data using curvature information;
S2, matching two adjacent radar scans, which represent the previous moment and the current moment respectively, using the feature points, to obtain the relative pose of the two adjacent scans;
S3, aligning the radar timestamps with the IMU timestamps, and solving the relative pose of the IMU between two adjacent timestamps;
S4, solving the relative pose of the radar and the IMU using a hand-eye calibration method, from the relative pose of the two adjacent radar scans and the relative pose of the IMU between the two adjacent timestamps;
and S5, reducing the distortion of the radar point cloud data using the relative pose of the radar and the IMU, and completing the calibration of the relative pose of the radar and the IMU.
In this embodiment, in steps S1-S2, the relative pose of two adjacent radar scans is solved by matching the acquired feature points;
in step S3, the timestamps of the IMU are aligned to the timestamps of the radar point cloud data, taken as the reference, using linear interpolation, and the relative poses of the IMU at the two moments are solved using an inertial navigation pose solving algorithm;
in step S4, the relative pose (also called the extrinsic parameters) between the radar and the IMU is solved from the relative pose of the two adjacent radar scans and the relative pose of the IMU obtained in steps S2 and S3;
in step S5, the pose of the IMU is used to simulate the radar motion, the radar relative pose with reduced point cloud distortion is solved, and the relative pose between the radar and the IMU is then solved again using this pose.
In one embodiment, step S1 includes:

Define the point cloud data set as P = {P_1, P_2, …, P_n}, and mark the points of P lying on the same scan line as the same class of points. Taking a point P_i (i ∈ [1, n]) as the center, the curvature of P_i is solved from the coordinate information of the 10 points nearest to P_i on the scan line where P_i lies; the formula for solving the curvature is:

$$\eta = \frac{1}{|S| \cdot \|P_i\|} \left\| \sum_{j \in S,\, j \neq i} (P_i - P_j) \right\|$$

where P_j denotes a point in the three-dimensional neighborhood of the point P_i, S denotes the neighborhood point set of P_i with |S| = 10, and η denotes the curvature of the point P_i.

Taking the point X_{(k,i)} as an example, the formula for solving the curvature is:

$$\eta = \frac{1}{|S| \cdot \|X_{(k,i)}\|} \left\| \sum_{j \in S,\, j \neq i} (X_{(k,i)} - X_{(k,j)}) \right\|$$

where X_{(k,i)} and X_{(k,j)} denote points on the same scan line acquired by the k-th scan, |S| denotes the number of selected neighborhood points (generally 10), and η denotes the curvature of the point X_{(k,i)}. Points whose curvature is larger than a certain threshold, for example 10^{-5}, are marked as feature points.
In one embodiment, the specific steps of the relative pose solving part of step S2 are as follows.

The distance between a point and a straight line is used as the basis for matching the feature points obtained by two adjacent scans. Suppose F^c denotes the feature point set obtained by the scan at the current moment, F^p denotes the feature point set obtained by the scan at the previous moment, and P_i^c denotes one point in the set F^c. Let the relative pose solved for the previous moment with respect to the moment before it, R_A (rotation) and t_A (translation), be the initial value of the relative pose between the current moment and the previous moment; one scan period of the radar has been completed.

Since the data acquired by the radar carries its own motion information, and the motion during the radar scan period is simply regarded as linear at this stage of processing, the motion information carried by each point of the cloud is proportional to its timestamp. During matching, the point cloud of the previous scan must be projected to the start moment of the current scan, and the point cloud of the current scan must likewise be projected to the start moment of the current scan. For the last point obtained by the current scan, the projection equation is

$$\tilde{P}_i^{c} = R_A P_i^{c} + t_A$$

where \tilde{P}_i^c denotes P_i^c projected into the scanning coordinate system of the previous moment. The other points acquired during the scan, taking R_A and t_A as the relative pose over the whole scan period, obtain their relative pose with respect to the scan start moment through linear interpolation according to their timestamps. During radar scanning every point carries a timestamp; for example, one scan period runs from time t_0 to time t_n and all points are obtained within this period: the last point contains the motion R_A of the whole period, while a point at some intermediate moment t_i is linearly interpolated by its timestamp as ((t_i - t_0)/(t_n - t_0)) R_A. Matching points are then found to construct the point-to-line distance constraint equation, the distance from a point to a line being the distance between a feature point of the current moment and the straight line formed by its nearest-neighbor feature points of the previous moment:

$$d_1 = \frac{\left\| (\tilde{P}_i^{c} - P_l^{p}) \times (\tilde{P}_i^{c} - P_m^{p}) \right\|}{\left\| P_l^{p} - P_m^{p} \right\|}$$

where P_l^p is the point among the feature points of the previous scan that lies on the same scan line as \tilde{P}_i^c and has the shortest distance to it, P_m^p is the point whose scan line is adjacent to that of \tilde{P}_i^c and which, among the points not on the same scan line, has the shortest distance to it, and d_1 denotes the distance from the point \tilde{P}_i^c to the straight line through P_l^p and P_m^p.

Pose solving: with the relative pose solved at the previous moment as the initial value, the points obtained at the current moment are projected to the start moment of the current scan; because a radar scan is completed in one period, the points obtained at the previous moment are projected to the end moment of the previous scan, i.e., the start moment of the current scan. The constraint equation over all points is then constructed:

$$d = \sum_{i=1}^{k} d_1^{(i)}$$

where k denotes the number of feature points extracted from the scan at the current moment. R_A and t_A are iteratively optimized using the Gauss-Newton method so that the value of d is minimized, thereby obtaining the relative pose of the radar.
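A small sketch of the per-point projection described above, assuming the fractional rotation is obtained by linearly scaling the rotation vector of R_A (one reasonable reading of the linear interpolation); the names are illustrative.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def project_point_to_scan_start(p, t_i, t0, tn, R_A, t_A):
    """Project a point captured at time t_i within the scan period [t0, tn]
    using the timestamp fraction s = (t_i - t0)/(tn - t0) of the full-period
    motion (R_A, t_A); the last point (s = 1) reproduces the projection
    equation P~ = R_A P + t_A given above."""
    s = (t_i - t0) / (tn - t0)
    R_s = Rotation.from_rotvec(
        s * Rotation.from_matrix(R_A).as_rotvec()).as_matrix()
    return R_s @ np.asarray(p) + s * np.asarray(t_A)
```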
In one embodiment, the process of calculating the relative pose of the IMU in step S3 is: the timestamps of the IMU are aligned, with the timestamps of the radar point cloud data as the reference, using linear interpolation; the relative poses of the IMU at the two adjacent moments are then solved using an inertial navigation pose solving algorithm.
As shown in fig. 2, in one embodiment, the process of solving the radar-IMU extrinsic parameters using the hand-eye calibration algorithm in step S4 includes the following specific steps:

S41: establish the basic equations of the radar-IMU hand-eye calibration:

$$R_A R_X = R_X R_B$$
$$(R_A - I_3)\, t_X = R_X t_B - t_A$$

where R_A and t_A denote the rotation and translation of the radar respectively, R_B and t_B denote the rotation and translation of the IMU, and R_X and t_X denote the relative rotation and translation between the IMU and the radar; the rotation and translation of the radar and of the IMU have now been acquired separately, and R_X and t_X remain to be solved.

S42: derive the least-squares cost function of the radar-IMU hand-eye calibration from the basic equations:

$$\min_{R_X, t_X} \sum \left( \|R_A R_X - R_X R_B\|^2 + \|(R_A - I_3)\, t_X - R_X t_B + t_A\|^2 \right)$$

which can be further written in residual form; let

$$f_1 = R_A R_X - R_X R_B, \quad f_2 = (R_A - I_3)\, t_X - R_X t_B + t_A, \quad f = \begin{bmatrix} \mathrm{vec}(f_1) \\ f_2 \end{bmatrix}$$

then there is

$$\min_{R_X, t_X} \|f\|^2$$

S43: solve the Jacobian matrix from the cost function and solve the extrinsic parameters using the Gauss-Newton method. The Jacobian matrix solved from the cost function is

$$J = \frac{\partial f}{\partial \xi}$$

Using r_x, r_y, r_z to denote the rotations of R_X about the three coordinate axes and t_x, t_y, t_z to denote the translations of t_X along the three axes, the Jacobian matrix can be rewritten as

$$J = \left[ \frac{\partial f}{\partial r_x},\ \frac{\partial f}{\partial r_y},\ \frac{\partial f}{\partial r_z},\ \frac{\partial f}{\partial t_x},\ \frac{\partial f}{\partial t_y},\ \frac{\partial f}{\partial t_z} \right]$$

By solving the equation J^T J Δξ = -J^T f, the relative pose increment Δξ = [Δr_x, Δr_y, Δr_z, Δt_x, Δt_y, Δt_z]^T is obtained; taking the relative pose acquired at the previous moment as the iteration initial value, the increment is superposed on the initial value for iterative solution, and the optimal relative pose R_X and t_X of the radar and the IMU is finally obtained through iterative optimization.
As shown in fig. 3, in one embodiment, step S5, the process of performing the second calibration to handle the distortion caused by the radar's nonlinear motion, specifically includes the following steps:

S51: solve the linear-motion and nonlinear-motion parts between adjacent moments using the IMU data, where the linear motion is specifically uniform motion, obtainable from the linear velocity and angular velocity at the initial moment, and the nonlinear motion is accelerated motion, obtained from the acceleration and angular velocity output by the IMU.

Let

$$B_{II} = B_{II}^{l} \, B_{II}^{n}$$

where B_II denotes the integral pose information acquired from the IMU during the second calibration, decomposed into the combination of B_II^l and B_II^n, with B_II^l denoting the linear-motion part of B_II and B_II^n denoting the nonlinear-motion part of B_II.

S52: project the nonlinear motion into the radar coordinate system using the extrinsic parameters obtained by the first calibration so as to simulate the radar motion, and then subtract the projected nonlinear-motion part from the point cloud data, thereby reducing the distortion generated during the radar's motion. Suppose the result X_I of the first calibration has been obtained, X_I denoting the relative pose R_X and t_X solved in S43; B_II^l and B_II^n are now projected into the radar coordinate system:

$$B_{LI}^{l} = X_I \, B_{II}^{l} \, X_I^{-1}$$
$$B_{LI}^{n} = X_I \, B_{II}^{n} \, X_I^{-1}$$

where B_LI^l denotes the linear part of the radar motion estimated from the IMU, and B_LI^n denotes the nonlinear part of the radar motion estimated from the IMU; after the nonlinear-motion part is subtracted, the remaining information of the point cloud data can be regarded as generated by linear motion.

S53: extract feature points from the processed point cloud data, match the two adjacent radar scans, solve the relative pose of the uniform motion, add back the nonlinear-motion part subtracted in step S52 to obtain the relative pose of the two adjacent radar scans, and calibrate the relative pose of the radar and the IMU.
Those of ordinary skill in the art will understand that all or part of the steps involved in implementing the method of the above embodiment may be implemented by software running on a processor to form a radar-IMU calibration algorithm based on hand-eye calibration, where the processor may be a notebook, a server, a workstation, a general-purpose computer, an embedded processor, or a portable terminal device.
The embodiment of the invention discloses a radar-IMU calibration algorithm based on hand-eye calibration, which first solves the relative pose between adjacent radar moments using point-to-line constraints and solves the relative pose (also called the extrinsic parameters) between the radar and the IMU from these poses; on this basis, the pose of the IMU is used to simulate the radar motion, the radar relative pose with reduced point cloud distortion is solved, and the relative pose between the radar and the IMU is then solved again using this pose. Because the IMU information reduces the influence of point cloud distortion on the matching accuracy of the two radar scans during the second calibration, the method effectively weakens the influence of the radar's nonlinear motion on the calibration result and obtains more accurate radar/IMU extrinsic parameters.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (9)

1. A radar-IMU calibration method based on hand-eye calibration, characterized by comprising the following steps:
S1, extracting feature points from the radar point cloud data using curvature information;
S2, matching two adjacent radar scans, which represent the previous moment and the current moment respectively, using the feature points, to obtain the relative pose of the two adjacent scans;
S3, aligning the radar timestamps with the IMU timestamps, and solving the relative pose of the IMU between two adjacent timestamps;
S4, solving the relative pose of the radar and the IMU using a hand-eye calibration method, from the relative pose of the two adjacent radar scans and the relative pose of the IMU between the two adjacent timestamps;
and S5, reducing the distortion of the radar point cloud data using the relative pose of the radar and the IMU, and completing the calibration of the relative pose of the radar and the IMU;
wherein S5 comprises:
S51, solving the linear-motion and nonlinear-motion parts between two adjacent moments using the IMU data;
S52, projecting the nonlinear motion of the IMU into the radar coordinate system using the relative pose obtained by the calibration in step S4 so as to simulate the radar motion, and subtracting the projected nonlinear-motion part from the point cloud data to reduce the distortion generated during the radar's motion;
and S53, extracting feature points from the processed point cloud data, matching the two adjacent radar scans, solving the relative pose of the uniform motion, adding back the nonlinear-motion part subtracted in step S52 to obtain the relative pose of the two adjacent radar scans, and calibrating the relative pose of the radar and the IMU.
2. The radar-IMU calibration method based on hand-eye calibration as claimed in claim 1, wherein step S1 comprises: taking a point P_i in the point cloud data set P as the center, solving the curvature of P_i from the coordinate information of the 10 points nearest to P_i on the scan line where P_i lies; and marking the points whose curvature is larger than a preset threshold as the feature points.
3. The radar-IMU calibration method based on hand-eye calibration as claimed in claim 2, wherein the formula for solving the curvature is:

$$\eta = \frac{1}{|S| \cdot \|P_i\|} \left\| \sum_{j \in S,\, j \neq i} (P_i - P_j) \right\|$$

where P_j denotes a point in the three-dimensional neighborhood of P_i, S denotes the set of neighborhood points of P_i, the set S comprising 10 neighborhood points, and η denotes the curvature of the point P_i.
4. The radar-IMU calibration method based on hand-eye calibration as claimed in claim 1, wherein step S2 comprises:
S21, matching the feature points obtained by two adjacent scans using the point-to-line distance constraint;
and S22, solving the relative pose between the previous and current radar scans over all matched points using the Gauss-Newton method.
5. The radar-IMU calibration method based on hand-eye calibration as claimed in claim 4, wherein the point-to-line distance constraint equation in S21 is:

$$d_1 = \frac{\left\| (\tilde{P}_i^{c} - P_l^{p}) \times (\tilde{P}_i^{c} - P_m^{p}) \right\|}{\left\| P_l^{p} - P_m^{p} \right\|}$$

where P_i^c denotes a feature point acquired by the scan at the current time, \tilde{P}_i^c denotes P_i^c projected into the scanning coordinate system of the previous moment, \tilde{P}_i^c = R_A P_i^c + t_A, R_A denotes rotation and t_A denotes translation; P_l^p denotes the point among the feature points of the previous scan that lies on the same scan line as \tilde{P}_i^c and has the shortest distance to it; P_m^p denotes the point of the previous scan whose scan line is adjacent to that of \tilde{P}_i^c and which has the shortest distance to it; and d_1 denotes the distance from the point \tilde{P}_i^c to the straight line through P_l^p and P_m^p;

S22 comprises: taking the relative pose solved at the previous moment as the initial value, the relative pose is solved over all matched points using the Gauss-Newton method, with the constraint equation:

$$d = \sum_{i=1}^{k} d_1^{(i)}$$

where k denotes the number of feature points extracted from the scan at the current moment; R_A and t_A are iteratively optimized using the Gauss-Newton method so that the value of d is minimized, thereby obtaining the relative pose R_A and t_A of the radar.
6. The radar-IMU calibration method based on hand-eye calibration as claimed in claim 1, wherein step S3 comprises:
aligning the timestamps of the IMU to the timestamps of the radar point cloud data, taken as the reference, using linear interpolation; and solving the relative pose of the IMU between the current moment and the previous moment using an inertial navigation pose solving algorithm.
7. The radar-IMU calibration method based on hand-eye calibration as claimed in claim 1, wherein step S4 comprises:
S41, establishing the basic equations of the radar-IMU hand-eye calibration;
S42, deriving the least-squares cost function of the radar-IMU hand-eye calibration from the basic equations;
and S43, solving the Jacobian matrix from the cost function, and solving the relative pose of the radar and the IMU using the Gauss-Newton method.
8. The radar-IMU calibration method based on hand-eye calibration as claimed in claim 7, wherein the basic equations of the radar-IMU hand-eye calibration in S41 are:

$$R_A R_X = R_X R_B$$
$$(R_A - I_3)\, t_X = R_X t_B - t_A$$

where R_A and t_A denote the rotation and translation of the radar respectively, R_B and t_B denote the rotation and translation of the IMU, and R_X and t_X denote the relative rotation and translation between the IMU and the radar;

the least-squares cost function of the radar-IMU hand-eye calibration in S42 is derived as follows:

$$\min_{R_X, t_X} \sum \left( \|R_A R_X - R_X R_B\|^2 + \|(R_A - I_3)\, t_X - R_X t_B + t_A\|^2 \right)$$

substituting the R_X and t_X to be solved yields the residuals; let

$$f_1 = R_A R_X - R_X R_B, \quad f_2 = (R_A - I_3)\, t_X - R_X t_B + t_A, \quad f = \begin{bmatrix} \mathrm{vec}(f_1) \\ f_2 \end{bmatrix}$$

then

$$\min_{R_X, t_X} \|f\|^2$$

the solving process of the relative pose in S43 is as follows:

the Jacobian matrix is solved as

$$J = \frac{\partial f}{\partial \xi}$$

using r_x, r_y, r_z to denote the rotations of R_X about the three coordinate axes and t_x, t_y, t_z to denote the translations of t_X along the three axes, the Jacobian matrix can be rewritten as:

$$J = \left[ \frac{\partial f}{\partial r_x},\ \frac{\partial f}{\partial r_y},\ \frac{\partial f}{\partial r_z},\ \frac{\partial f}{\partial t_x},\ \frac{\partial f}{\partial t_y},\ \frac{\partial f}{\partial t_z} \right]$$

by solving the equation J^T J Δξ = -J^T f, the relative pose increment Δξ = [Δr_x, Δr_y, Δr_z, Δt_x, Δt_y, Δt_z]^T is obtained; taking the relative pose obtained at the previous moment as the iteration initial value, the increment is superposed on the initial value for iterative solution, finally obtaining the relative pose R_X and t_X of the radar and the IMU.
9. The radar-IMU calibration method based on hand-eye calibration as claimed in claim 1, wherein S51 comprises:
let

$$B_{II} = B_{II}^{l} \, B_{II}^{n}$$

where B_II denotes the integral pose information acquired from the IMU during calibration, decomposed into the combination of B_II^l and B_II^n, with B_II^l denoting the linear-motion part of B_II and B_II^n denoting the nonlinear-motion part of B_II;

S52 comprises: supposing the calibrated result X_I has been obtained, X_I denoting the relative pose of the radar and the IMU solved in S43, B_II^l and B_II^n are projected into the radar coordinate system:

$$B_{LI}^{l} = X_I \, B_{II}^{l} \, X_I^{-1}$$
$$B_{LI}^{n} = X_I \, B_{II}^{n} \, X_I^{-1}$$

where B_LI^l denotes the linear part of the radar motion estimated from the IMU, and B_LI^n denotes the nonlinear part of the radar motion estimated from the IMU; after the nonlinear part is subtracted, the remaining information of the point cloud data is generated by linear motion.
CN202010230937.3A 2020-03-27 2020-03-27 Radar-IMU calibration method based on hand-eye calibration Active CN111443337B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010230937.3A CN111443337B (en) 2020-03-27 2020-03-27 Radar-IMU calibration method based on hand-eye calibration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010230937.3A CN111443337B (en) 2020-03-27 2020-03-27 Radar-IMU calibration method based on hand-eye calibration

Publications (2)

Publication Number Publication Date
CN111443337A CN111443337A (en) 2020-07-24
CN111443337B true CN111443337B (en) 2022-03-08

Family

ID=71649072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010230937.3A Active CN111443337B (en) 2020-03-27 2020-03-27 Radar-IMU calibration method based on hand-eye calibration

Country Status (1)

Country Link
CN (1) CN111443337B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112034438B (en) * 2020-08-27 2024-04-09 江苏智能网联汽车创新中心有限公司 Radar calibration method and device, electronic equipment and storage medium
CN112285676B (en) * 2020-10-22 2024-02-09 知行汽车科技(苏州)股份有限公司 Laser radar and IMU external parameter calibration method and device
CN112362054B (en) * 2020-11-30 2022-12-16 上海商汤临港智能科技有限公司 Calibration method, calibration device, electronic equipment and storage medium
CN112767493B (en) * 2020-12-30 2023-06-13 浙江大学 Machine vision calibration method for kinematic parameters of Stewart platform
CN112729344B (en) * 2020-12-30 2022-09-13 珠海市岭南大数据研究院 Sensor external reference calibration method without reference object

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9031809B1 (en) * 2010-07-14 2015-05-12 Sri International Method and apparatus for generating three-dimensional pose using multi-modal sensor fusion
CN108801166A (en) * 2018-05-29 2018-11-13 北京航空航天大学 Fiber grating wing distortion measurement modeling based on cantilever beam theory and scaling method
CN109166140A (en) * 2018-07-27 2019-01-08 长安大学 A kind of vehicle movement track estimation method and system based on multi-line laser radar
CN109270534A (en) * 2018-05-07 2019-01-25 西安交通大学 A kind of intelligent vehicle laser sensor and camera online calibration method
CN110109146A (en) * 2019-04-30 2019-08-09 北京云迹科技有限公司 Pavement detection method and device based on multi-line laser radar
CN110109144A (en) * 2019-04-30 2019-08-09 北京云迹科技有限公司 Road shoulder detection method and device based on multi-line laser radar
CN110238820A (en) * 2019-07-12 2019-09-17 易思维(杭州)科技有限公司 Hand and eye calibrating method based on characteristic point
CN110261870A (en) * 2019-04-15 2019-09-20 浙江工业大学 It is a kind of to synchronize positioning for vision-inertia-laser fusion and build drawing method
CN110428467A (en) * 2019-07-30 2019-11-08 四川大学 A kind of camera, imu and the united robot localization method of laser radar
CN110501036A (en) * 2019-08-16 2019-11-26 北京致行慕远科技有限公司 The calibration inspection method and device of sensor parameters
CN110514225A (en) * 2019-08-29 2019-11-29 中国矿业大学 The calibrating external parameters and precise positioning method of Multi-sensor Fusion under a kind of mine
CN110686704A (en) * 2019-10-18 2020-01-14 深圳市镭神智能系统有限公司 Pose calibration method, system and medium for laser radar and combined inertial navigation
CN110703229A (en) * 2019-09-25 2020-01-17 禾多科技(北京)有限公司 Point cloud distortion removal method and external reference calibration method for vehicle-mounted laser radar reaching IMU

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9916703B2 (en) * 2015-11-04 2018-03-13 Zoox, Inc. Calibration for autonomous vehicle operation
CN109544638B (en) * 2018-10-29 2021-08-03 浙江工业大学 Asynchronous online calibration method for multi-sensor fusion
CN109945856B (en) * 2019-02-18 2021-07-06 天津大学 Unmanned aerial vehicle autonomous positioning and mapping method based on inertia/radar

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9031809B1 (en) * 2010-07-14 2015-05-12 Sri International Method and apparatus for generating three-dimensional pose using multi-modal sensor fusion
CN109270534A (en) * 2018-05-07 2019-01-25 西安交通大学 A kind of intelligent vehicle laser sensor and camera online calibration method
CN108801166A (en) * 2018-05-29 2018-11-13 北京航空航天大学 Fiber grating wing distortion measurement modeling based on cantilever beam theory and scaling method
CN109166140A (en) * 2018-07-27 2019-01-08 长安大学 A kind of vehicle movement track estimation method and system based on multi-line laser radar
CN110261870A (en) * 2019-04-15 2019-09-20 浙江工业大学 It is a kind of to synchronize positioning for vision-inertia-laser fusion and build drawing method
CN110109146A (en) * 2019-04-30 2019-08-09 北京云迹科技有限公司 Pavement detection method and device based on multi-line laser radar
CN110109144A (en) * 2019-04-30 2019-08-09 北京云迹科技有限公司 Road shoulder detection method and device based on multi-line laser radar
CN110238820A (en) * 2019-07-12 2019-09-17 易思维(杭州)科技有限公司 Hand and eye calibrating method based on characteristic point
CN110428467A (en) * 2019-07-30 2019-11-08 四川大学 A kind of camera, imu and the united robot localization method of laser radar
CN110501036A (en) * 2019-08-16 2019-11-26 北京致行慕远科技有限公司 The calibration inspection method and device of sensor parameters
CN110514225A (en) * 2019-08-29 2019-11-29 中国矿业大学 The calibrating external parameters and precise positioning method of Multi-sensor Fusion under a kind of mine
CN110703229A (en) * 2019-09-25 2020-01-17 禾多科技(北京)有限公司 Point cloud distortion removal method and external reference calibration method for vehicle-mounted laser radar reaching IMU
CN110686704A (en) * 2019-10-18 2020-01-14 深圳市镭神智能系统有限公司 Pose calibration method, system and medium for laser radar and combined inertial navigation

Non-Patent Citations (9)

* Cited by examiner, † Cited by third party
Title
Research on cooperative path tracking control of AGV multi-guidance drive units (AGV多导引驱动单元协同路径跟踪控制研究); Zhao Long; China Master's Theses Full-text Database, Information Science and Technology; 2018-03-15 (No. 3); pp. I140-542 *
IMU-Aided High-Frequency Lidar Odometry for Autonomous Driving; Xue, HZ, et al.; Applied Sciences-Basel; 2019-04-01; Vol. 9, No. 7; pp. 1-7 *
On-Line Initialization and Extrinsic Calibration of an Inertial Navigation System With a Relative Preintegration Method on Manifold; Dongshin Kim, et al.; IEEE Transactions on Automation Science and Engineering; 2017-12-12; Vol. 15, No. 3; pp. 1272-1285 *
Optimal Hand-Eye Calibration of IMU and Camera; Yang, G, et al.; 2017 Chinese Automation Congress (CAC); 2017-12-22; pp. 1023-1028 *
Real-time 3D Grid Map Building for Autonomous Driving in Dynamic Environment; Hanzhang, et al.; 2019 IEEE International Conference on Unmanned Systems (ICUS); 2019-12-31; pp. 40-45 *
A Point Cloud Distortion Removing and Mapping Algorithm based on Lidar and IMU UKF Fusion; Zhang B, et al.; 2019 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM); 2019 *
Three-dimensional map construction based on the cooperation of laser and integrated navigation systems (基于激光与组合导航系统协作的三维地图构建); Dai Dong; Wanfang Database; 2019-08-27; pp. 1-67 *
Research on mobile robot localization algorithms based on vision-odometry fusion (基于视觉-里程计融合的移动机器人定位算法研究); Wang Xuefeng; China Master's Theses Full-text Database, Information Science and Technology; 2019-01-15 (No. 1); pp. I140-931 *
A tightly coupled LiDAR/IMU real-time localization method (LiDAR/IMU紧耦合的实时定位方法); Li Shuaixin, et al.; Acta Automatica Sinica; 2019 *

Also Published As

Publication number Publication date
CN111443337A (en) 2020-07-24

Similar Documents

Publication Publication Date Title
CN111443337B (en) Radar-IMU calibration method based on hand-eye calibration
US7768534B2 (en) Method of and system for determining inaccuracy information in an augmented reality system
CN111207774A (en) Method and system for laser-IMU external reference calibration
CN112183171B (en) Method and device for building beacon map based on visual beacon
CN110570449B (en) Positioning and mapping method based on millimeter wave radar and visual SLAM
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
WO2020140431A1 (en) Camera pose determination method and apparatus, electronic device and storage medium
CN111665512B (en) Ranging and mapping based on fusion of 3D lidar and inertial measurement unit
JP2018124787A (en) Information processing device, data managing device, data managing system, method, and program
CN110388919B (en) Three-dimensional model positioning method based on feature map and inertial measurement in augmented reality
CN112781586B (en) Pose data determination method and device, electronic equipment and vehicle
CN112146682B (en) Sensor calibration method and device for intelligent automobile, electronic equipment and medium
CN114526745A (en) Drawing establishing method and system for tightly-coupled laser radar and inertial odometer
CN109740487B (en) Point cloud labeling method and device, computer equipment and storage medium
CN109186596A (en) IMU measurement data generation method, system, computer installation and readable storage medium storing program for executing
CN113763549A (en) Method, device and storage medium for simultaneous positioning and mapping by fusing laser radar and IMU
CN111998870B (en) Calibration method and device of camera inertial navigation system
CN113310505B (en) External parameter calibration method and device of sensor system and electronic equipment
CN113554712B (en) Registration method and device of automatic driving vehicle, electronic equipment and vehicle
CN109506617B (en) Sensor data processing method, storage medium, and electronic device
CN114397642A (en) Three-dimensional laser radar and IMU external reference calibration method based on graph optimization
CN114049401A (en) Binocular camera calibration method, device, equipment and medium
CN117554976A (en) Laser radar distortion removal method
JP2002046087A (en) Three-dimensional position measuring method and apparatus, and robot controller
TWI822423B (en) Computing apparatus and model generation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant