CN113721260A - Online combined calibration method for laser radar, binocular camera and inertial navigation - Google Patents
- Publication number
- CN113721260A (application CN202110993256.7A)
- Authority
- CN
- China
- Prior art keywords
- inertial navigation
- data
- laser radar
- camera
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G01S17/93 — Lidar systems specially adapted for anti-collision purposes
- G01C21/165 — Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments
- G01S13/865 — Combination of radar systems with lidar systems
- G01S13/867 — Combination of radar systems with cameras
- G01S13/93 — Radar or analogous systems specially adapted for anti-collision purposes
- G01S17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T2207/10004 — Still image; photographic image
- Y02T10/40 — Engine management systems
Abstract
The invention discloses an online joint calibration method for a laser radar, a binocular camera and inertial navigation. The method constructs camera-inertial navigation data pairs and laser radar-inertial navigation data pairs; from the information in these pairs, it calculates the optimal extrinsic parameters between each monocular camera and inertial navigation and between the laser radar and inertial navigation by nonlinear optimization. Taking the timestamp of the inertial navigation data as the reference, it aligns the timestamps of the laser radar data and the image data, projects straight-line features from the raw laser radar data onto the two-dimensional image plane of the nearest timestamp, constructs laser radar-camera data pairs, and calculates the extrinsic parameters between the laser radar and the monocular camera. It then unifies the straight-line features of the two monocular cameras under a world coordinate system, constructs camera-camera data pairs, registers them, and calculates the extrinsic parameters between the two monocular cameras. Finally, all extrinsic parameters are optimized globally and updated iteratively until a global error threshold is reached.
Description
Technical Field
The invention relates to the technical field of multi-sensor combined calibration, in particular to an online combined calibration method for a laser radar, a binocular camera and inertial navigation.
Background
Multi-sensor joint calibration is a key challenge in fields such as robotics and autonomous driving. Most current joint calibration algorithms either rely on offline manual calibration that requires a calibration board, or fuse only two sensors, without addressing the fusion and calibration of three or more sensors. In practice, however, a robot or autonomous vehicle usually carries multiple sensors for observing the environment, and those sensors drift over time because of mechanical vibration or wear, so an online automatic calibration algorithm for multiple sensors is of great significance.
Disclosure of Invention
The invention aims to provide an online joint calibration method for a laser radar, a binocular camera and inertial navigation that requires neither a man-made calibration target nor a specific environment, improving the convenience and robustness of calibration.
To achieve this aim, the invention adopts the following technical solution:
the invention provides an online combined calibration method for a laser radar, a binocular camera and inertial navigation, which comprises the following steps:
acquiring laser radar data, binocular camera image data and inertial navigation data;
performing real-time linear feature extraction on image data of a binocular camera and laser radar data to obtain an image feature frame and a laser radar feature frame;
combining each image feature frame with the pre-integration result of its adjacent inertial navigation data to construct a monocular camera-inertial navigation data pair, and combining each laser radar feature frame with the pre-integration result of its adjacent inertial navigation data to construct a laser radar-inertial navigation data pair;
calculating the optimal extrinsic parameters between the monocular camera and inertial navigation and between the laser radar and inertial navigation from the information in the monocular camera-inertial navigation and laser radar-inertial navigation data pairs;
aligning timestamps of the laser radar data and the binocular camera image data by taking the timestamp of the inertial navigation data as a reference, constructing a laser radar-camera data pair, optimizing a reprojection error and calculating external parameters between the laser radar and the monocular camera;
unifying the coordinate system of the linear feature in the image feature frame under a world coordinate system, constructing a camera-camera data pair, registering, and calculating external parameters between the two monocular cameras;
and carrying out global optimization on the external parameters of the monocular camera-inertial navigation, the laser radar-monocular camera and the monocular camera-monocular camera to complete online combined calibration.
Further, laser radar data, binocular camera image data and inertial navigation data are acquired, the inertial navigation data are pre-integrated, and the system pose is initialized.
Further, the method for acquiring the laser radar data, the binocular camera image data and the inertial navigation data, performing pre-integration processing on the inertial navigation data and initializing the system pose comprises the following steps:
acquiring point cloud data of a laser radar, image data of a binocular camera and acceleration and angular velocity data of inertial navigation in real time through ROS;
and performing pre-integration processing on the acceleration and the angular velocity of the inertial navigation, and calculating to obtain an initial pose of the system, wherein the initial pose comprises velocity, rotation and translation.
Further, the method for respectively extracting the real-time linear features of the single-frame image frame and the single-frame laser radar frame data comprises the following steps:
extracting and filtering the linear features of the image frame by using an improved LSD algorithm;
extracting and filtering the linear characteristics of the laser radar frame by using an improved RANSAC algorithm;
and storing the linear characteristic information of each frame of image and point cloud data.
Further, the method for calculating the optimal external parameters of the monocular camera and inertial navigation and the lidar and inertial navigation comprises the following steps:
the motion transformation calculated by inertial navigation pre-integration is applied to the image straight-line feature frame of the current timestamp to predict the next image straight-line feature frame; the predicted frame and the real next frame extracted from the raw image data are then fed to the LM (Levenberg-Marquardt) nonlinear optimization algorithm, and the reprojection error between them is minimized to obtain the camera-inertial navigation extrinsic parameters;
the motion transformation calculated by inertial navigation pre-integration is applied to the laser radar straight-line feature frame of the current timestamp to predict the next laser radar straight-line feature frame; the predicted frame and the real next frame extracted from the raw laser radar data are then fed to an LM + NDT nonlinear optimization algorithm, and the registration error between them is minimized to obtain the laser radar-inertial navigation extrinsic parameters.
Further, the method of aligning timestamps of lidar data and binocular camera image data includes:
taking the time stamp of the inertial navigation data as a reference, and uniformly receiving data of the laser radar and the two monocular cameras through a TimeSynchronitor filter; and when all the data have the same time stamp, generating a callback function of a synchronization result, and processing the data after the synchronization time in the callback function.
Further, the method of calculating the external reference between the lidar and the monocular camera includes:
and performing PNP solution on the linear characteristics of the adjacent laser radar-camera data pairs on the timestamp, and calculating the external parameters between the camera coordinate system and the radar coordinate system.
Further, the method of calculating the external reference between the two monocular cameras includes:
and matching the linear features in the two image frames by utilizing an SIFT algorithm, unifying two-dimensional coordinates of the images to a world coordinate system, and calculating external parameters between the two cameras.
Further, the method for performing global optimization on the external parameters of the monocular camera-inertial navigation, the lidar-monocular camera and the monocular camera-monocular camera and continuously performing iterative updating along with the movement of the system until the global error threshold is reached comprises the following steps:
substituting the external parameters of the monocular camera-inertial navigation, laser radar-monocular camera and monocular camera-monocular camera into the corresponding data pairs;
the mobile system calculates whether the current frame feature matching error result obtained by the external reference of the previous frame is within the acceptable threshold range along with the data updating;
and if the global error threshold is reached, accepting the external parameter of the previous frame, otherwise, updating and optimizing the external parameter of the previous frame on the new error result, and continuously iterating.
Further, the system comprises a laser radar, a binocular camera and an inertial navigation hardware platform which are fixed through a support, and the coincidence range of the visual fields of the laser radar and the binocular camera is more than 80%.
The invention has the following beneficial effects:
the method can simultaneously calibrate external parameters of three different sensors, namely the laser radar sensor, the binocular camera sensor and the inertial navigation sensor, and provides a novel algorithm for a multi-mode data fusion technology;
the calibration method among the three sensors is online and fully automatic, does not need to manually set a calibration reference object, does not need a specific environment, and improves the convenience and the robustness of the calibration method.
Drawings
FIG. 1 is a flowchart of an online joint calibration method for a laser radar, a binocular camera and inertial navigation according to an embodiment of the present invention;
FIG. 2 is a diagram of a transformation relationship among coordinate systems of a lidar, a binocular camera and inertial navigation according to an embodiment of the present invention;
fig. 3 is a schematic diagram of data after alignment of a laser radar, a binocular camera, and inertial navigation timestamps according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, embodiment 1 of the present invention provides an online joint calibration method for a laser radar, a binocular camera and inertial navigation, including the following steps:
building hardware platforms of a laser radar, a binocular camera and inertial navigation;
acquiring raw laser radar, binocular camera and inertial navigation data in real time, pre-integrating the inertial navigation data, and initializing the system pose; aligning the timestamps of the laser radar data and image data with the inertial navigation timestamps as reference; extracting straight-line features from the image frames and laser radar frames in real time; fusing each image feature frame with the inertial navigation data pre-integrated between images, and each laser radar feature frame with the inertial navigation data pre-integrated between laser radar frames, to construct camera-inertial navigation data pairs and laser radar-inertial navigation data pairs;
the inertial navigation pre-integration result provides motion compensation, and the optimal extrinsic parameters between the monocular camera and inertial navigation and between the laser radar and inertial navigation are calculated by nonlinear optimization, so that the reprojection error of adjacent image feature frames and the point-cloud registration error of adjacent laser radar feature frames are minimized; the straight-line features of the laser radar data are projected onto the two-dimensional image plane of the adjacent timestamp, laser radar-camera data pairs are constructed, the reprojection error is optimized and the extrinsic parameters between the laser radar and the monocular camera are calculated; the coordinate systems of the straight-line features of the two monocular cameras are unified under a world coordinate system, camera-camera data pairs are constructed and registered, and the extrinsic parameters between the two monocular cameras are calculated;
and performing global optimization on the camera-inertial navigation, the laser radar-camera and the camera-camera external parameters, and continuously performing iterative updating along with the slow movement of the system until a global error threshold value is reached.
The hardware platform for building the laser radar, the binocular camera and the inertial navigation specifically comprises: the laser radar, the binocular camera and the inertial navigation device are fixed by the metal frame, the visual field coincidence range of the laser radar and the binocular camera is guaranteed to reach more than 80%, and therefore feature matching can be conducted quickly during calibration.
The real-time acquisition of the raw data of the laser radar, the binocular camera and inertial navigation specifically comprises: creating and publishing laser radar data-frame messages in ROS, recording the timestamp of each laser radar frame and the XYZ position of each point in the cloud; creating and publishing binocular camera data-frame messages, recording the timestamps and RGB pixel information of the left and right image frames; and creating and publishing inertial navigation data-frame messages, recording the acceleration and angular velocity acquired by inertial navigation.
The method for pre-integrating the inertial navigation data and initializing the system pose specifically comprises: subscribing through ROS to the acceleration a_B and angular velocity ω_B published in the inertial navigation data frames. Taking into account the slow drift b_a, b_g of the raw measurements over time and the Gaussian white noise η_a, η_g, the acceleration and angular velocity are integrated over a timestamp interval Δt_ij to obtain the rotation ΔR_ij, translation Δp_ij and velocity Δv_ij relative to the initial position, following the standard pre-integration relations

ΔR_ij = ∏_{k=i}^{j−1} Exp((ω_B,k − b_g − η_g) Δt)
Δv_ij = Σ_{k=i}^{j−1} ΔR_ik (a_B,k − b_a − η_a) Δt
Δp_ij = Σ_{k=i}^{j−1} [ Δv_ik Δt + (1/2) ΔR_ik (a_B,k − b_a − η_a) Δt² ]

where W and B denote the world and inertial navigation (body) coordinate systems respectively, g denotes the gravitational acceleration (which enters when the increments are expressed in W), and the indices i, j of the interval Δt_ij may also denote two adjacent key data frames. Through the pre-integration algorithm the system pose can be initialized, which facilitates later estimation and calculation of the system state.
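A planar (yaw-only) numerical sketch of this pre-integration, with gravity and noise omitted and a constant sampling interval `dt` assumed; it integrates the bias-corrected angular rate into a heading increment and the bias-corrected, frame-rotated body acceleration once into velocity and twice into translation (all names illustrative):

```python
from math import cos, sin

def preintegrate(acc_body, gyro_z, dt, bias_a=(0.0, 0.0), bias_g=0.0):
    """Planar inertial pre-integration over one timestamp interval:
    returns (translation, velocity, heading) relative to the pose at the
    start of the interval."""
    theta = 0.0
    vx = vy = px = py = 0.0
    for (ax, ay), wz in zip(acc_body, gyro_z):
        ax -= bias_a[0]; ay -= bias_a[1]
        wz -= bias_g
        # rotate body acceleration into the frame of the first timestamp
        axw = cos(theta) * ax - sin(theta) * ay
        ayw = sin(theta) * ax + cos(theta) * ay
        px += vx * dt + 0.5 * axw * dt * dt
        py += vy * dt + 0.5 * ayw * dt * dt
        vx += axw * dt; vy += ayw * dt
        theta += wz * dt
    return (px, py), (vx, vy), theta
```

At a 200 Hz inertial navigation rate, one second of constant unit acceleration integrates to a velocity of 1 m/s and a translation of 0.5 m, as expected from the double integral.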
Aligning the timestamps of the laser radar data and image data with the inertial navigation timestamps as reference specifically comprises: the image frame rate is 30 Hz, the laser radar frame rate is 60 Hz, and the inertial navigation frame rate is 200 Hz. Ignoring the alignment problem, for every two image frames received through ROS, four laser radar frames and many inertial navigation frames arrive at the same time; but because the hardware of the three sensors processes at different speeds, the timestamps of the acquired data deviate from one another. The algorithm takes the timestamps of two adjacent image frames as the start and end of a common timestamp interval, records the timestamps of all inertial navigation data inside the interval, searches for valid laser radar frames inside the interval with the inertial navigation timestamps as reference, and aligns the timestamps of the three sensors by constant-offset compensation; the effect is shown in figure 3.
The real-time straight-line feature extraction from the image frames and laser radar frames specifically comprises: to extract straight-line features, the improved LSD algorithm first Gaussian-downsamples the image frame, then computes the gradient magnitude and direction of every pixel and pseudo-orders the pixels accordingly; a status list is established, pixels around the maximum-gradient pixel whose directions lie within a threshold are gathered into a rectangle, the rectangle is truncated according to the density of aligned points inside it, and the line is accepted if its number of false alarms is below a threshold, so that the image frame is filtered into a set L_CAM of straight-line features. The improved RANSAC algorithm filters the three-dimensional point cloud of the laser radar frame: if a three-dimensional point and its neighbours within a threshold distance satisfy a spatial straight-line equation, they are extracted to build a point-cloud straight-line feature, so that the laser radar frame is likewise filtered into a set L_LIDAR of straight-line features. Since the overlap of the fields of view of the laser radar and the camera exceeds 80%, most straight-line features are available for registration calculations.
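The RANSAC-style extraction of one spatial line from the point cloud can be illustrated with a minimal two-point-sample version; the threshold, iteration count and seed are illustrative, and the patent's "improved" variant (neighbourhood-constrained sampling) is not reproduced here:

```python
import random

def ransac_line3d(points, threshold, iters=200, seed=0):
    """Minimal RANSAC for one 3-D line: repeatedly sample two points,
    count points whose distance to the candidate line is below the
    threshold, and return the best-supported set of inliers."""
    rng = random.Random(seed)
    best = []
    for _ in range(iters):
        p, q = rng.sample(points, 2)
        d = tuple(qi - pi for pi, qi in zip(p, q))
        n2 = sum(di * di for di in d)
        if n2 == 0:
            continue  # degenerate sample
        inliers = []
        for x in points:
            w = tuple(xi - pi for pi, xi in zip(p, x))
            t = sum(wi * di for wi, di in zip(w, d)) / n2
            # squared distance from x to the infinite line through p, q
            dist2 = sum((wi - t * di) ** 2 for wi, di in zip(w, d))
            if dist2 <= threshold ** 2:
                inliers.append(x)
        if len(inliers) > len(best):
            best = inliers
    return best
```

In practice the consensus set would be refitted by least squares and removed from the cloud before searching for the next line.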
Fusing each image feature frame with the inertial navigation data pre-integrated between images, and each laser radar feature frame with the inertial navigation data pre-integrated between laser radar frames, to construct the camera-inertial navigation and laser radar-inertial navigation data pairs specifically comprises: using ROS, the image feature frame L_CAM is fused with the pre-integrated inertial navigation frame into a camera-inertial navigation data pair CAM_IMU = {T, x_C, y_C, R, p, v}, recording the aligned timestamp, the pixel coordinates of the straight lines in the image frame and the system pose; the laser radar feature frame L_LIDAR is fused with the pre-integrated inertial navigation frame into a laser radar-inertial navigation data pair LIDAR_IMU = {T, X_L, Y_L, Z_L, R, p, v}, recording the aligned timestamp, the point-cloud coordinates of the straight lines in the laser radar frame and the system pose.
The inertial navigation pre-integration result provides motion compensation, and the optimal extrinsic parameters between the monocular camera and inertial navigation and between the laser radar and inertial navigation are calculated by nonlinear optimization so that the reprojection error of adjacent image feature frames and the point-cloud registration error of adjacent laser radar feature frames are minimized, specifically comprising: the image feature frames L_CAM and L′_CAM of two adjacent timestamps are unified into the world coordinate system, and the motion-compensated image feature frame L̂_CAM is obtained from the rotation and translation calculated by inertial navigation pre-integration; the LM nonlinear optimization algorithm finds suitable camera-inertial navigation extrinsic parameters so that the reprojection error loss_CAM_IMU between L̂_CAM and L′_CAM is minimal. The inertial navigation pre-integration result is applied directly to the laser radar feature frame L_LIDAR to obtain the motion-compensated frame L̂_LIDAR; the LM nonlinear optimization algorithm finds suitable laser radar-inertial navigation extrinsic parameters so that the registration error loss_LIDAR_IMU between L̂_LIDAR and L′_LIDAR is minimal.
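For intuition, the extrinsic search that minimizes the error between a motion-compensated frame and the really-extracted next frame can be reduced to a single rotational degree of freedom, where the least-squares answer is closed-form (LM plays this role in the full multi-parameter problem). A toy sketch with illustrative names:

```python
from math import atan2, cos, sin

def fit_rotation_extrinsic(pred, real):
    """Closed-form planar rotation minimising sum |R(theta) p - r|^2 over
    matched point pairs: theta = atan2(sum p x r, sum p . r)."""
    s = sum(px * ry - py * rx for (px, py), (rx, ry) in zip(pred, real))
    c = sum(px * rx + py * ry for (px, py), (rx, ry) in zip(pred, real))
    return atan2(s, c)

def registration_error(pred, real, theta):
    """Sum of squared residuals after rotating the predicted features."""
    e = 0.0
    for (px, py), (rx, ry) in zip(pred, real):
        qx = cos(theta) * px - sin(theta) * py
        qy = sin(theta) * px + cos(theta) * py
        e += (qx - rx) ** 2 + (qy - ry) ** 2
    return e
```

When the "real" frame is an exact rotation of the predicted one, the recovered angle matches and the residual vanishes.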
Projecting the straight-line features of the laser radar data onto the two-dimensional image plane of the adjacent timestamp to construct a laser radar-camera data pair LIDAR_CAM = {T, X_L, Y_L, Z_L, x_C, y_C}, optimizing the reprojection error and calculating the extrinsic parameters between the laser radar and the monocular camera specifically comprises: a three-dimensional point [X_L, Y_L, Z_L] of a laser radar straight-line feature projects to the image plane as [x_L, y_L] = [X_L/Z_L, Y_L/Z_L]; the image-plane coordinates [x_C, y_C] of the image feature frame are calculated from the camera intrinsics; the LM (Levenberg-Marquardt) nonlinear optimization algorithm finds suitable laser radar-camera extrinsic parameters so that the reprojection error loss_LIDAR_CAM between the image-plane projections of the image feature frame and the laser radar feature frame is minimal.
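The projection [X_L/Z_L, Y_L/Z_L] followed by the camera intrinsics can be sketched as below; the intrinsic names fx, fy, cx, cy follow the usual pinhole convention and are assumptions of the sketch, not parameters named in the patent:

```python
def project_to_image(points_lidar_cam, fx, fy, cx, cy):
    """Project 3-D line points (already expressed in the camera frame)
    onto the image plane: normalised coordinates [X/Z, Y/Z] scaled and
    shifted by the pinhole intrinsics."""
    pixels = []
    for X, Y, Z in points_lidar_cam:
        if Z <= 0:
            continue  # point behind the camera cannot be projected
        pixels.append((fx * X / Z + cx, fy * Y / Z + cy))
    return pixels
```

The reprojection error is then the distance between these pixels and the matched image line features.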
Unifying the coordinate systems of the straight-line features of the two monocular cameras into the world coordinate system, constructing a camera-camera data pair CAM1_CAM2 = {T, x_C1, y_C1, x_C2, y_C2}, registering and calculating the extrinsic parameters between the two monocular cameras specifically comprises: the straight-line features are unified into the world coordinate system according to the intrinsics and focal lengths of the left and right cameras, and the LM nonlinear optimization algorithm finds suitable camera-camera extrinsic parameters so that the registration error loss_CAM1_CAM2 of the left and right image feature-frame coordinates in the world coordinate system is minimal.
Globally optimizing the camera-inertial navigation, laser radar-camera and camera-camera extrinsic parameters, and iterating continuously as the system moves slowly until the global error threshold is reached, specifically comprises: as the system moves slowly, new sensor data messages are published by ROS, and new extrinsic-parameter errors loss′_CAM_IMU, loss′_LIDAR_IMU, loss′_LIDAR_CAM and loss′_CAM1_CAM2 are obtained by continuous iterative calculation; if the local and global error thresholds are reached, the new extrinsic parameters are accepted.
The method comprises the steps that ROS is utilized to issue data messages of three sensors, the laser radar data messages comprise time stamps and XYZ space coordinates of point clouds, the image data messages comprise time stamps and RGB pixel information of pixel points, and inertial navigation data comprise time stamps, acceleration and angular velocity information. These three data frames may be subscribed to in real time.
Given that the image frame rate is 30 Hz, the laser radar frame rate is 60 Hz, and the inertial navigation data rate is 200 Hz, the lowest-frequency image frames are taken as the reference: the inertial navigation data between two adjacent image frames are processed by pre-integration to preliminarily estimate the velocity, rotation, and translation of the system over that time interval.
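The pre-integration between two image frames can be sketched as below. This is a deliberately simplified illustration, not the patent's method: it uses first-order Euler integration and ignores gravity compensation, IMU biases, and rotation of the body frame over the short (~33 ms) interval.

```python
def preintegrate(samples):
    """Integrate IMU samples between two adjacent image frames.
    samples: [(t, (ax, ay, az), (wx, wy, wz)), ...] sorted by timestamp.
    Returns (delta_velocity, delta_position, delta_rotation_angles)."""
    dv = [0.0, 0.0, 0.0]   # velocity change
    dp = [0.0, 0.0, 0.0]   # position change
    dth = [0.0, 0.0, 0.0]  # accumulated rotation (small-angle approximation)
    for (t0, a, w), (t1, _, _) in zip(samples, samples[1:]):
        dt = t1 - t0
        for i in range(3):
            dp[i] += dv[i] * dt + 0.5 * a[i] * dt * dt
            dv[i] += a[i] * dt
            dth[i] += w[i] * dt
    return dv, dp, dth
```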
Considering that the inertial navigation data has the highest frame rate (200 Hz), the inertial navigation pre-integration in step 002 is calculated with the inertial navigation timestamp as the reference; the timestamp deviations of the image frames and laser radar frames are accounted for, and timestamp alignment is performed on the adjacent data frames of the three sensors.
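One way to realise this alignment is a nearest-timestamp search against the inertial navigation stream, sketched below. This is an illustrative assumption, not the patent's exact procedure; the tolerance parameter is hypothetical.

```python
import bisect

def align_to_imu(imu_stamps, sensor_stamps, tol):
    """Match each camera/radar stamp to the nearest IMU stamp within `tol`
    seconds. imu_stamps must be sorted ascending."""
    pairs = []
    for s in sensor_stamps:
        i = bisect.bisect_left(imu_stamps, s)
        # the nearest stamp is either just before or just after s
        candidates = [c for c in (i - 1, i) if 0 <= c < len(imu_stamps)]
        best = min(candidates, key=lambda c: abs(imu_stamps[c] - s))
        if abs(imu_stamps[best] - s) <= tol:
            pairs.append((s, imu_stamps[best]))
    return pairs
```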
The linear features of the image frames and laser radar data frames are extracted as follows: an improved LSD algorithm extracts and filters the linear features of the image frames, and an improved RANSAC algorithm extracts and filters the linear features of the laser radar frames; only the linear feature information of each frame of image and point cloud data is stored, and other invalid data are deleted, reducing the computational complexity of later matching.
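The RANSAC idea behind the radar line extraction can be illustrated on 2-D points as below. This is a generic textbook RANSAC line fit, not the patent's "improved" variant, and the parameter values are assumptions.

```python
import math
import random

def ransac_line(points, iters=200, thresh=0.05, seed=0):
    """Fit a 2-D line ax + by + c = 0 by RANSAC and return the inlier set."""
    rng = random.Random(seed)
    best = []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        a, b = y2 - y1, x1 - x2          # line normal from the two samples
        n = math.hypot(a, b)
        if n == 0.0:                      # degenerate pair, skip
            continue
        c = -(a * x1 + b * y1)
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c) / n <= thresh]
        if len(inliers) > len(best):
            best = inliers
    return best
```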
New data pairs are constructed pairwise from the image frames, laser radar frames, and inertial navigation data frames after linear feature extraction. Specifically: taking the timestamps of two adjacent image frames as the start and end points, in the ideal case there are four laser radar frames and several inertial navigation data frames between the two image timestamps; the two adjacent image frames and the inertial navigation data frames form a camera-inertial navigation data pair; the four laser radar frames and the inertial navigation data frames form a laser radar-inertial navigation data pair; the two adjacent image frames and the four laser radar frames form a camera-laser radar data pair; and the image frames of the left and right cameras form a camera-camera data pair.
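The grouping of frames into the interval between two adjacent image timestamps might be sketched as below; function and key names are assumptions for illustration.

```python
def build_pairs(img_stamps, lidar_stamps, imu_stamps):
    """For each pair of adjacent image frames, collect the laser radar and
    inertial navigation frames whose timestamps fall inside the interval."""
    pairs = []
    for t0, t1 in zip(img_stamps, img_stamps[1:]):
        lidar = [t for t in lidar_stamps if t0 <= t < t1]
        imu = [t for t in imu_stamps if t0 <= t < t1]
        pairs.append({"cam": (t0, t1), "lidar": lidar, "imu": imu})
    return pairs
```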
As shown in FIG. 2, the extrinsic parameters of camera-inertial navigation, laser radar-inertial navigation, camera-laser radar, and camera-camera are calculated as follows: the Levenberg-Marquardt (LM) nonlinear optimization algorithm computes the reprojection error between the current image linear features after motion transformation and the image linear features of the next frame, yielding the camera-inertial navigation extrinsic parameters; an LM + NDT algorithm registers the point cloud of the current laser radar linear features after motion transformation against the laser radar linear features of the next frame, yielding the laser radar-inertial navigation extrinsic parameters; and a PnP solution on the linear features of the nearest laser radar-camera data pair yields the extrinsic parameters between the camera coordinate system and the radar coordinate system.
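The damped least-squares update at the heart of the LM steps above can be illustrated on a one-parameter problem. This is a minimal sketch, not the patent's multi-parameter extrinsic solver; the derivative is taken numerically.

```python
def levenberg_marquardt(residual, x0, iters=50, lam=1e-3):
    """Minimise residual(x)^2 for a scalar parameter with a damped
    Gauss-Newton (Levenberg-Marquardt) update."""
    x = x0
    h = 1e-6  # step for the numeric derivative
    for _ in range(iters):
        r = residual(x)
        J = (residual(x + h) - r) / h
        if J == 0.0:
            break
        # damped normal equation: (J^2 + lam) dx = -J r
        dx = -J * r / (J * J + lam)
        if abs(residual(x + dx)) < abs(r):
            x += dx
            lam *= 0.5   # good step: behave more like Gauss-Newton
        else:
            lam *= 2.0   # bad step: increase damping
    return x
```

The same accept/reject-with-damping structure carries over when x is a 6-DoF extrinsic parameter vector and J a Jacobian matrix.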
Global optimization is carried out on the extrinsic parameters of camera-inertial navigation, laser radar-inertial navigation, laser radar-camera, and camera-camera, and they are continuously iteratively updated as the system moves slowly, until the global error threshold is reached. Specifically: the extrinsic parameters of each pair are substituted into the corresponding data pairs; the hardware system is moved slowly; as all sensor data update, it is checked whether the current-frame feature matching error obtained with the previous frame's extrinsic parameters is within the acceptable threshold range; if the global error threshold is reached, the previous frame's extrinsic parameters are accepted; otherwise they are updated and optimized on the new error result, iterating continuously.
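The accept-or-keep-iterating control flow can be sketched as below; the function names and the shape of the error dictionary are illustrative assumptions.

```python
def converged(errors, threshold):
    """True when every sensor pair's matching error meets the global threshold."""
    return all(e <= threshold for e in errors.values())

def iterate_until_converged(step, threshold, max_iters=100):
    """Repeatedly apply one optimisation `step` (which re-estimates the
    extrinsics on fresh sensor data and returns the per-pair error dict)
    until the global threshold is met or the iteration budget runs out."""
    errors = {}
    for _ in range(max_iters):
        errors = step()
        if converged(errors, threshold):
            break
    return errors
```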
The above description is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, several modifications and variations can be made without departing from the technical principle of the present invention, and these modifications and variations should also be regarded as the protection scope of the present invention.
Claims (10)
1. An online combined calibration method for a laser radar, a binocular camera and inertial navigation is characterized by comprising the following steps:
acquiring laser radar data, binocular camera image data and inertial navigation data;
performing real-time linear feature extraction on image data of a binocular camera and laser radar data to obtain an image feature frame and a laser radar feature frame; combining the single image characteristic frame with the corresponding adjacent inertial navigation data pre-integration result to construct a monocular camera-inertial navigation data pair, and combining the single laser radar characteristic frame with the corresponding adjacent inertial navigation data pre-integration result to construct a laser radar-inertial navigation data pair;
calculating optimal parameters of the monocular camera and inertial navigation and the laser radar and inertial navigation according to the information of the monocular camera-inertial navigation data pair and the information of the laser radar-inertial navigation data pair;
aligning timestamps of the laser radar data and the binocular camera image data by taking the timestamp of the inertial navigation data as a reference, constructing a laser radar-camera data pair, optimizing a reprojection error and calculating external parameters between the laser radar and the monocular camera;
unifying the coordinate system of the linear feature in the image feature frame under a world coordinate system, constructing a camera-camera data pair, registering, and calculating external parameters between the two monocular cameras;
and carrying out global optimization on the external parameters of the monocular camera-inertial navigation, the laser radar-monocular camera and the monocular camera-monocular camera to complete online combined calibration.
2. The on-line joint calibration method for the lidar, the binocular camera and the inertial navigation system according to claim 1, characterized in that lidar data, binocular camera image data and inertial navigation data are obtained, pre-integration processing is performed on the inertial navigation data, and system pose is initialized.
3. The on-line combined calibration method for the lidar, the binocular camera and the inertial navigation system according to claim 2, wherein the method for acquiring lidar data, binocular camera image data and inertial navigation data, performing pre-integration processing on the inertial navigation data and initializing the system pose comprises the following steps:
acquiring point cloud data of a laser radar, image data of a binocular camera and acceleration and angular velocity data of inertial navigation in real time through ROS;
and performing pre-integration processing on the acceleration and the angular velocity of the inertial navigation, and calculating to obtain an initial pose of the system, wherein the initial pose comprises velocity, rotation and translation.
4. The online joint calibration method for the lidar, the binocular camera and the inertial navigation according to claim 1, wherein the method for respectively extracting the real-time linear features of the single-frame image frame and the single-frame lidar frame data comprises the following steps:
extracting and filtering the linear features of the image frame by using an improved LSD algorithm;
extracting and filtering the linear characteristics of the laser radar frame by using an improved RANSAC algorithm;
and storing the linear characteristic information of each frame of image and point cloud data.
5. The on-line joint calibration method for the lidar, the binocular camera and the inertial navigation according to claim 1, wherein the method for calculating the optimal parameters of the monocular camera and the inertial navigation and the lidar and the inertial navigation comprises the following steps:
applying the motion transformation calculated by inertial navigation pre-integration to the image linear feature frame of the current timestamp to obtain a predicted next-frame image linear feature frame; then, using the Levenberg-Marquardt (LM) nonlinear optimization algorithm with the predicted next-frame image linear feature frame and the real next-frame image linear feature frame extracted from the raw image data as inputs, calculating the reprojection error between the two to obtain the camera-inertial navigation extrinsic parameters;
and applying the motion transformation calculated by inertial navigation pre-integration to the laser radar linear feature frame of the current timestamp to obtain a predicted next-frame laser radar linear feature frame; then, using the LM + NDT nonlinear optimization algorithm with the predicted next-frame laser radar linear feature frame and the real next-frame laser radar linear feature frame extracted from the raw laser radar data as inputs, calculating the reprojection error between the two to obtain the laser radar-inertial navigation extrinsic parameters.
6. The method for online joint calibration of lidar, a binocular camera, and inertial navigation according to claim 1, wherein the method of aligning timestamps of lidar data and binocular camera image data comprises:
taking the timestamp of the inertial navigation data as the reference, uniformly receiving the data of the laser radar and the two monocular cameras through a TimeSynchronizer filter; and when all the data share the same timestamp, generating a synchronization-result callback function and processing the time-synchronized data within it.
7. The on-line joint calibration method for the lidar, the binocular camera and the inertial navigation according to claim 1, wherein the method for calculating the external parameters between the lidar and the monocular camera comprises the following steps:
and performing a PnP solution on the linear features of the laser radar-camera data pairs adjacent in timestamp, and calculating the extrinsic parameters between the camera coordinate system and the radar coordinate system.
8. The on-line joint calibration method for the lidar, the binocular camera and the inertial navigation according to claim 1, wherein the method for calculating the external parameters between the two monocular cameras comprises the following steps:
and matching the linear features in the two image frames by utilizing an SIFT algorithm, unifying two-dimensional coordinates of the images to a world coordinate system, and calculating external parameters between the two cameras.
9. The on-line joint calibration method for the lidar, the binocular camera and the inertial navigation according to claim 1, wherein the method for performing global optimization on external parameters of the monocular camera-inertial navigation, the lidar-monocular camera and the monocular camera-monocular camera comprises the following steps of:
substituting the external parameters of the monocular camera-inertial navigation, laser radar-monocular camera and monocular camera-monocular camera into the corresponding data pairs;
the mobile system calculates whether the current frame feature matching error result obtained by the external reference of the previous frame is within the acceptable threshold range along with the data updating;
and if the global error threshold is reached, accepting the external parameter of the previous frame, otherwise, updating and optimizing the external parameter of the previous frame on the new error result, and continuously iterating.
10. The on-line joint calibration method for the lidar, the binocular camera and the inertial navigation system according to claim 1, wherein the system comprises a lidar, binocular camera and inertial navigation hardware platform fixed by a support, and the field-of-view overlap between the lidar and the binocular camera is greater than 80%.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110993256.7A CN113721260B (en) | 2021-08-26 | 2021-08-26 | Online combined calibration method for laser radar, binocular camera and inertial navigation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113721260A true CN113721260A (en) | 2021-11-30 |
CN113721260B CN113721260B (en) | 2023-12-12 |
Family
ID=78678359
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114114178A (en) * | 2021-12-10 | 2022-03-01 | 南京邮电大学 | Calibration device for radar and video image |
CN114648584A (en) * | 2022-05-23 | 2022-06-21 | 北京理工大学前沿技术研究院 | Robustness control method and system for multi-source fusion positioning |
CN115200608A (en) * | 2022-06-10 | 2022-10-18 | 北京航天控制仪器研究所 | Method for calibrating installation error of water laser radar and inertial navigation |
CN116524014A (en) * | 2023-05-23 | 2023-08-01 | 斯乾(上海)科技有限公司 | Method and device for calibrating external parameters on line |
CN116721166A (en) * | 2023-06-09 | 2023-09-08 | 江苏集萃清联智控科技有限公司 | Binocular camera and IMU rotation external parameter online calibration method, device and storage medium |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1850150A1 (en) * | 2006-04-27 | 2007-10-31 | Omron Corporation | Radar device |
CN105910602A (en) * | 2016-05-30 | 2016-08-31 | 南京航空航天大学 | Combined navigation method |
CN109143205A (en) * | 2018-08-27 | 2019-01-04 | 深圳清创新科技有限公司 | Integrated transducer external parameters calibration method, apparatus |
CN109544638A (en) * | 2018-10-29 | 2019-03-29 | 浙江工业大学 | A kind of asynchronous online calibration method for Multi-sensor Fusion |
CN110262546A (en) * | 2019-06-18 | 2019-09-20 | 武汉大学 | A kind of tunnel intelligent unmanned plane cruising inspection system and method |
CN110428467A (en) * | 2019-07-30 | 2019-11-08 | 四川大学 | A kind of camera, imu and the united robot localization method of laser radar |
CN110842940A (en) * | 2019-11-19 | 2020-02-28 | 广东博智林机器人有限公司 | Building surveying robot multi-sensor fusion three-dimensional modeling method and system |
CN111983639A (en) * | 2020-08-25 | 2020-11-24 | 浙江光珀智能科技有限公司 | Multi-sensor SLAM method based on Multi-Camera/Lidar/IMU |
CN112945233A (en) * | 2021-01-15 | 2021-06-11 | 北京理工大学 | Global drift-free autonomous robot simultaneous positioning and map building method |
CN113052908A (en) * | 2021-04-16 | 2021-06-29 | 南京工业大学 | Mobile robot pose estimation method based on multi-sensor data fusion |
CN113091771A (en) * | 2021-04-13 | 2021-07-09 | 清华大学 | Laser radar-camera-inertial navigation combined calibration method and system |
Non-Patent Citations (4)
Title |
---|
刘广彬; 赵鹏; 姜洲; 焦明东; 密兴刚: "Application of airborne Lidar *** in topographic map surveying and mapping", Beijing Surveying and Mapping, no. 07 *
孙楠; 裴信彪; 王春军; 李继辉; 彭程; 白越: "Quadrotor UAV navigation algorithm based on stereo vision-inertial SLAM", Microelectronics & Computer, no. 05 *
方章云; 袁亮; 侯爱萍; 吴金强: "Research on combined indoor positioning based on monocular vision and odometry", Modular Machine Tool & Automatic Manufacturing Technique, no. 12 *
蒋林; 夏旭洪; 韩璐; 邱存勇; 张泰; 宋杰: "A binocular simultaneous localization and mapping method fusing point and line features", Science Technology and Engineering, no. 12 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113721260B (en) | Online combined calibration method for laser radar, binocular camera and inertial navigation | |
CN109166149B (en) | Positioning and three-dimensional line frame structure reconstruction method and system integrating binocular camera and IMU | |
CN112634451B (en) | Outdoor large-scene three-dimensional mapping method integrating multiple sensors | |
CN113269837B (en) | Positioning navigation method suitable for complex three-dimensional environment | |
CN109506642B (en) | Robot multi-camera visual inertia real-time positioning method and device | |
WO2021035669A1 (en) | Pose prediction method, map construction method, movable platform, and storage medium | |
CN106873619B (en) | Processing method of flight path of unmanned aerial vehicle | |
CN112304307A (en) | Positioning method and device based on multi-sensor fusion and storage medium | |
CN110261870A (en) | It is a kind of to synchronize positioning for vision-inertia-laser fusion and build drawing method | |
US20220292711A1 (en) | Pose estimation method and device, related equipment and storage medium | |
US8213706B2 (en) | Method and system for real-time visual odometry | |
CN105783913A (en) | SLAM device integrating multiple vehicle-mounted sensors and control method of device | |
US11430199B2 (en) | Feature recognition assisted super-resolution method | |
JP7369847B2 (en) | Data processing methods and devices, electronic devices, storage media, computer programs, and self-driving vehicles for self-driving vehicles | |
CN105141807A (en) | Video signal image processing method and device | |
CN113409459A (en) | Method, device and equipment for producing high-precision map and computer storage medium | |
CN115272494B (en) | Calibration method and device for camera and inertial measurement unit and computer equipment | |
Zienkiewicz et al. | Extrinsics autocalibration for dense planar visual odometry | |
CN112991400B (en) | Multi-sensor auxiliary positioning method for unmanned ship | |
CN111366153A (en) | Positioning method for tight coupling of laser radar and IMU | |
CN116295412A (en) | Depth camera-based indoor mobile robot dense map building and autonomous navigation integrated method | |
CN113587934A (en) | Robot, indoor positioning method and device and readable storage medium | |
Peng et al. | Vehicle odometry with camera-lidar-IMU information fusion and factor-graph optimization | |
CN115218906A (en) | Indoor SLAM-oriented visual inertial fusion positioning method and system | |
CN114638897A (en) | Multi-camera system initialization method, system and device based on non-overlapping views |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||