CN114659514B - LiDAR-IMU-GNSS fusion positioning method based on voxelized fine registration - Google Patents

LiDAR-IMU-GNSS fusion positioning method based on voxelized fine registration Download PDF

Info

Publication number
CN114659514B
CN114659514B CN202210278286.4A CN202210278286A CN114659514B CN 114659514 B CN114659514 B CN 114659514B CN 202210278286 A CN202210278286 A CN 202210278286A CN 114659514 B CN114659514 B CN 114659514B
Authority
CN
China
Prior art keywords
gnss
point cloud
point
voxel
global
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210278286.4A
Other languages
Chinese (zh)
Other versions
CN114659514A (en
Inventor
潘树国
何璇
高旺
刘宏
陆新宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN202210278286.4A priority Critical patent/CN114659514B/en
Publication of CN114659514A publication Critical patent/CN114659514A/en
Application granted granted Critical
Publication of CN114659514B publication Critical patent/CN114659514B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/48Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/485Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/48Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/49Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a LiDAR-IMU-GNSS fusion positioning method based on voxel precise registration. Firstly, a voxel point cloud downsampling method based on curvature segmentation is provided, coarse classification is carried out through curvature threshold values, point cloud voxel downsampling is carried out by utilizing a Hash mapping function, and spatial feature distribution properties of source point clouds are reserved to a greater extent. Secondly, a point cloud registration model based on the nearest neighborhood of the point and the neighborhood point set is constructed, an iteration termination threshold is set to reduce the occurrence probability of the local optimal solution problem, and the time consumption of registering single-frame point clouds is increased by one order of magnitude. Finally, an optimized LiDAR-IMU-GNSS fusion positioning model is constructed, the GNSS observation value based on confidence weighting is utilized to carry out global correction on the local pose estimation value, and more continuous and more accurate pose estimation and map reconstruction than the similar advanced method can be realized on average in a complex urban environment.

Description

LiDAR-IMU-GNSS fusion positioning method based on voxelized fine registration
Technical Field
The invention belongs to the field of multi-sensor instant positioning and mapping, and particularly relates to a LiDAR-IMU-GNSS fusion positioning method based on voxel-based fine registration.
Background
For any autonomous robotic system, accurate and robust mobile carrier navigation positioning techniques are one of the most basic techniques. Aiming at the positioning problem of a mobile observation platform. In recent years, multisource fusion localization techniques based on SLAM (Simultaneous Localization AND MAPPING, instant localization and mapping) have received widespread attention from related businesses and researchers. The SLAM-based multisource fusion localization schemes can be classified into vision-based SLAM and lidar-based SLAM according to the difference of the main sensors. Due to the superiority of the sensor, the SLAM scheme based on the laser radar can collect the space fingerprint information more frequently and more accurately, and is more suitable for the real-time high-precision positioning requirement of a vehicle-mounted platform. Therefore, a SLAM scheme based on a laser radar, represented by a LiDAR-inertial meter (LiDAR-Inertial Odometry, LIO), is widely used to acquire three-dimensional geographic information of a complex environment and for carrier localization and map reconstruction.
The point cloud registration of the laser radar is a key step of pose estimation of a mobile carrier, and strictly influences the pose estimation and map reconstruction results of a positioning method. Common point cloud filtering registration methods include normal distribution transformation (Normal Distributions Transform, NDT), iterative closest point (ITERATIVE CLOSEST POINT, ICP) and various modifications. The NDT method is characterized in that a probability density function of a source point cloud and a probability density function of a target point cloud are used as target functions, a nonlinear optimization method is utilized to minimize the probability density function between the source point cloud and the target point cloud to obtain an optimal solution, and although the real-time performance is better, a covariance matrix is needed to be built through multiple points, so that the method has low robustness in a point cloud sparse area. As another method for point cloud registration, the ICP method has higher positioning accuracy, but in each iteration process, nearest neighbors are required to be searched again, a transformation matrix is obtained, and the high calculation cost is difficult to adapt to a vehicle-mounted platform with limited calculation resources. In summary, on the basis of the compression calculation cost, a high-precision real-time point cloud registration method suitable for an on-board platform is yet to be studied.
In addition, as a local sensor, there is a cumulative offset between the local map and the global map when the LIO pose estimates the current frame, which limits the positioning accuracy of the LIO positioning mapping scheme in a large outdoor environment. The need to use global observations from a global satellite positioning navigation system (Global Navigation SATELLITE SYSTEM, GNSS) can provide a trusted global constraint correction for LIO systems. The traditional fusion method often introduces an optimization framework for assisting LIO by three-dimensional GNSS measurement factors, but measurement information of a single key frame is redundant, and the reliability of the GNSS factors added when the vehicle travels to a GNSS multipath area is poor. In order to reduce the data redundancy, the conventional method provides that GNSS corner factors can be screened to restrict the local pose, but the condition that the number of the corner points of the straight road section is insufficient is not considered, the utilization rate of measurement information is low, and the application of the method in a large-scale complex outdoor environment is limited.
From the above analysis, it is highly desirable to propose a LIO-GNSS fusion scheme suitable for an on-board platform, which mainly includes: (1) Based on the compression calculation cost, the real-time high-precision point cloud registration is realized. (2) On the basis of fully utilizing GNSS measurement information, global accumulated error correction is carried out on LIO by utilizing GNSS.
Disclosure of Invention
In order to solve the problems, the invention discloses a LiDAR-IMU-GNSS fusion positioning method based on voxel precise registration, which realizes higher positioning precision in an actual urban environment than other similar methods and can provide precise and continuous pose estimation results for a vehicle-mounted platform in the running process of a complex urban environment.
In order to achieve the above purpose, the technical scheme of the invention is as follows:
a LiDAR-IMU-GNSS fusion positioning method based on voxel precise registration comprises the following steps:
(1) The voxel downsampling method based on curvature segmentation is adopted: giving an original point cloud sequence acquired by a group of laser radars, traversing all points in the original point cloud sequence, and performing coarse clustering by using a breadth-first method;
Specifically, the point cloud clusters with similar depths are finely segmented by using a geometric angle threshold based on Euclidean distance. Let the laser radar scan center be O, two similar edge points p a、pb in the point cloud cluster, the depth is d a、db(da>db respectively), and if the point cloud number of the point cloud cluster where the point p i is located is M, the roughness of the point cloud of p i is:
Setting the roughness threshold as Traversing M, will/>Points of (1) are classified as edge feature point sets, will/>The points of the curve are classified into a plane characteristic point set, downsampling operation is respectively carried out, and then a hash function mapping value is used for replacing random sampling to rapidly downsample the point cloud cluster after the curvature coarse classification; let the coordinates of a feature point in a set of point cloud sequences in voxel space be p (x, y, z). Let the voxel grid size be r, then the dimension of the voxel grid in the x-direction be D x=(xmax-xmin)/r, and the x-direction index of p in the voxel grid be h x=(x-xmin)/r, and the y and z directions are the same.
After obtaining the three-dimensional index of the feature point in the voxel space, rapidly sequencing the feature point index by adopting a hash function, and mapping the feature point index into N containers (N=80), wherein the hash mapping function is as follows:
hash(hx,hy,hz)=(hx+hy·Dx+hz·Dx·Dy)%N (2)
in order to avoid hash collision, the collision detection conditions are set as follows:
hash(hx,hy,hz)=hash(h'x,h'y,h'z)(hx≠h'x|hy≠h'y|hz≠h'z) (3)
once a hash collision is detected, the index value in the current container is output and the container is emptied, and a new index value is placed into the container.
(2) Point cloud registration based on smoothing voxel processing
Adopting a method for solving T based on characteristic point distribution in voxels, and converting a problem of constructing a single-point nearest neighbor model by using a dendrogram into a nearest neighbor model of a construction point and a neighborhood point set; first, two sets of point cloud sequences are approximated as gaussian distributions, i.e Wherein i is E (1, n),/>And/>Respectively two groups of point cloud sequence covariance matrixes; let the neighborhood point set of a i be/>Wherein λ is a neighborhood decision threshold, thereby calculating rigid body transformation error/>The method comprises the following steps:
The step realizes the smooth processing of all adjacent point clouds in the a i neighborhood; the maximum likelihood estimation of the frame pose transformation matrix T can be estimated as follows, where N i is the number of point clouds in the a i neighborhood.
In addition to smoothing all neighboring point clouds in the a i neighborhood, to avoid trapping in a locally optimal blind zone after multiple iterations, an iteration termination threshold epsilon is set as follows:
|RMSEk+1-RMSEk|>ε (6)
Wherein, RMSE k+1 and RMSE k are root mean square errors of the previous k+1 iterations and the previous k iterations, respectively, and when the absolute value of the variation of the root mean square errors |rmse k+1-RMSEk |ε or the maximum number of iterations is reached, the iterations are completed.
(3) The method suitable for unifying the pose space of the laser radar-IMU-GNSS multi-sensor is adopted;
Is provided with For IMU to GNSS extrinsic conversion matrix,/>An external parameter conversion matrix for the laser radar reaching the IMU; because the hardware is fixedly connected with the mobile carrier, the two are fixed values after calibration; introducing spatial conversion parameters/>(/>Including translation parametersAnd rotation parameter/>) Associating two pose spaces; the problem of multi-source pose space unification at the time t is expressed as follows:
In the method, in the process of the invention, Setting the initial value as a unit array, and updating the/>, at the next moment after adding GNSS factors to solve the global optimal solution each timeA value correcting an accumulated offset between the local coordinate system and the global coordinate system;
(4) Global map construction
And (3) using a graph optimization method, regarding the construction global map as a maximum posterior estimation problem, and carrying out nonlinear optimization on the state vector in the sliding window. The global optimization function is constructed as follows:
Wherein ρ is GNSS confidence; For the current global point cloud/> Local point cloud derived after local matching with inter-frameA pose conversion matrix between the two; the specific meaning of each sensor cost function in the formula is as follows;
① Lidar-inertial odometer factor: the carrier position under the local coordinate system at the moment t can be obtained according to the local pose estimation result of the laser radar-inertial odometer And rotation/>The local residual factor can thus be constructed as follows:
② GNSS factors: setting the time interval of two frames of GNSS observation values as delta t, and realizing time alignment with LIO pose estimation values in an interpolation mode; GNSS measurements in an ENU coordinate system are now given Pose observations of LIO in global coordinate systemThe GNSS residual factor is expressed as follows:
When the carrier moves to a GNSS signal good area, GNSS factors are added by taking GNSS confidence as a weight in order to fully and reliably utilize GNSS observation values, wherein the GNSS confidence is determined by the number of visible effective satellites of the GNSS;
③ Voxelized loop-back factor: considering the possible coincidence of the running areas of the mobile carrier, a loop detection link needs to be added to establish loop constraint existing between non-adjacent frames; the loop back factor can be constructed according to equation (5) as follows:
The invention has the beneficial effects that:
The invention provides a LiDAR-IMU-GNSS fusion positioning method based on voxel precise registration. Firstly, a voxel point cloud downsampling method based on curvature segmentation is provided, coarse classification is carried out through curvature threshold values, point cloud voxel downsampling is carried out by utilizing a Hash mapping function, and spatial feature distribution attributes of source point clouds are reserved to a greater extent. Secondly, a point cloud registration model based on the nearest neighborhood of the point and the neighborhood point set is constructed, and an iteration termination threshold is set to reduce the occurrence probability of the local optimal solution problem. The time consumption of registering the single-frame point cloud is improved by one order of magnitude. And finally, constructing a fusion positioning model based on graph optimization, and carrying out global correction on the local pose of the LIO by using the GNSS observation value weighted based on the confidence coefficient. Experimental results show that in a large-scale actual measurement outdoor environment, the mean root mean square error of absolute positioning track errors of the method is 2.317m, and compared with similar methods, the accuracy is improved. Experimental results fully prove that the method can realize more continuous and more accurate pose estimation and map reconstruction than the similar advanced methods on average in a complex urban environment.
Drawings
FIG. 1 is a unified schematic diagram of a multi-sensor pose space;
FIG. 2 is a comparative bin diagram of the absolute track error of the present system and the like;
FIG. 3 is a diagram showing the effect of the system.
Detailed Description
The present invention is further illustrated in the following drawings and detailed description, which are to be understood as being merely illustrative of the invention and not limiting the scope of the invention.
As shown in the figure, the LiDAR-IMU-GNSS fusion positioning method based on voxel precise registration comprises the following specific steps:
(1) Voxelized downsampling based on curvature segmentation
The invention provides a voxelized downsampling method based on curvature segmentation. Giving an original point cloud sequence acquired by a group of laser radars, traversing all points in the original point cloud sequence, and performing coarse clustering by using a breadth-first method. Further, the point cloud clusters with similar depths are finely segmented by utilizing a geometric angle threshold value based on Euclidean distance. Let the laser radar scan center be O, two similar edge points p a、pb in the point cloud cluster, the depth is d a、db(da>db respectively), and if the point cloud number of the point cloud cluster where the point p i is located is M, the roughness of the point cloud of p i is:
Setting the roughness threshold as Traversing M, will/>Points of (1) are classified as edge feature point sets, will/>The points of (2) are classified into a plane characteristic point set, and downsampling operations are respectively carried out. The invention uses hash function mapping value to replace random sampling to rapidly downsample the point cloud cluster after curvature coarse classification. Let the coordinates of a feature point in a set of point cloud sequences in voxel space be p (x, y, z). Let the voxel grid size be r, then the dimension of the voxel grid in the x-direction be D x=(xmax-xmin)/r, and the x-direction index of p in the voxel grid be h x=(x-xmin)/r, and the y and z directions are the same.
After obtaining the three-dimensional index of the feature point in the voxel space, rapidly sequencing the feature point index by adopting a hash function, and mapping the feature point index into N containers (N=80), wherein the hash mapping function is as follows:
hash(hx,hy,hz)=(hx+hy·Dx+hz·Dx·Dy)%N (2)
in order to avoid hash collision, the collision detection conditions are set as follows:
hash(hx,hy,hz)=hash(h'x,h'y,h'z)(hx≠h'x|hy≠h'y|hz≠h'z) (3)
once a hash collision is detected, the index value in the current container is output and the container is emptied, and a new index value is placed into the container.
(2) Point cloud registration based on smoothing voxel processing
The invention provides a method for solving T based on feature point distribution in voxels, which converts a point-to-nearest neighbor model problem constructed by using a dendrogram into a nearest neighbor model of a constructed point and a neighborhood point set. First, two sets of point cloud sequences are approximated as gaussian distributions, i.eWherein i is E (1, n),/>And/>Two groups of point cloud sequence covariance arrays are respectively adopted. Let the neighborhood point set of a i be/>Wherein λ is a neighborhood decision threshold, thereby calculating a rigid body transformation errorThe method comprises the following steps:
The step realizes the smoothing processing of all the adjacent point clouds in the neighborhood of a i. The maximum likelihood estimation of the frame pose transformation matrix T can be estimated as follows, where N i is the number of point clouds in the a i neighborhood.
In addition to the smoothing process for all neighboring point clouds in the a i neighborhood, in order to avoid the dead zone which is trapped in the local optimum after multiple iterations, further, an iteration termination threshold epsilon is set as follows:
|RMSEk+1-RMSEk|>ε (6)
Where RMSE k+1 and RMSE k are root mean square errors for the first k+1 iterations and the first k iterations, respectively. The iteration is completed when the absolute value of the variation of the root mean square error |rmse k+1-RMSEk |ε or the maximum number of iterations is reached.
(3) Multi-sensor pose space unification
The invention provides a method suitable for unifying the pose space of multiple sensors of a laser radar-IMU-GNSS. Is provided withFor IMU to GNSS extrinsic conversion matrix,/>The matrix is an external parameter conversion matrix of the laser radar reaching the IMU. Because the hardware is fixedly connected with the mobile carrier, the two are fixed values after calibration. Introducing spatial conversion parameters/>(Including translation parameters/>)And rotation parameters) Associating the two pose spaces. the problem of multi-source pose space unification at time t can be expressed as:
In the method, in the process of the invention, Setting the initial value as a unit array, and updating the/>, at the next moment after adding GNSS factors to solve the global optimal solution each timeAnd correcting the accumulated offset between the local coordinate system and the global coordinate system.
(4) Global map construction
The invention uses the graph optimization method to consider the construction global map as a maximum posterior estimation problem and carries out nonlinear optimization on the state vector in the sliding window. The global optimization function is constructed as follows:
where ρ is GNSS confidence. For the current global point cloud/>Local point cloud derived after local matching with inter-frameA pose conversion matrix between the two. The specific meaning of each sensor cost function in the formula is as follows.
① Lidar-inertial odometer factor: the carrier position under the local coordinate system at the moment t can be obtained according to the local pose estimation result of the laser radar-inertial odometerAnd rotation/>The local residual factor can thus be constructed as follows:
② GNSS factors: let the time interval of two frames of GNSS observations be Deltat, and realize time alignment with LIO pose estimation values in an interpolation mode. GNSS measurements in an ENU coordinate system are now given Pose observations of LIO in global coordinate systemThe GNSS residual factor is expressed as follows:
To fully and reliably utilize the GNSS observations when the carrier moves to a GNSS signal good region, GNSS factors are added with GNSS confidence as a weight, the GNSS confidence being determined by the number of GNSS visible satellites in effect.
③ Voxelized loop-back factor: considering the possible coincidence of the running areas of the mobile carrier, a loop detection link needs to be added to establish loop constraint existing between non-adjacent frames. The loop back factor can be constructed according to equation (5) as follows:
The positioning accuracy of the technical scheme of the invention is quantitatively evaluated according to a public data set experiment. And evaluating the positioning accuracy of the fusion method by using the KITTI _RAW data set, and comparing with other similar advanced methods. Four different outdoor scenes are selected for verifying the performance of the fusion method, including urban environments, open areas, highways and in-forest roads. The size of the voxel grid of the fusion method is set to be 0.3 multiplied by 0.2 multiplied by 0.3, the maximum iteration frequency threshold is set to be 30, and the iteration termination tolerance threshold is set to be 1e-8, so that the real-time requirement is met, and the stability of the quantity of the feature point clouds participating in matching in the outdoor point cloud sparse area is ensured. The evaluation strategy is to output pose estimation results by comparing the track true values with different methods and calculate the root mean square error (APE_RMSE) value of the absolute track error.
Data set test results are shown in the following table:
TABLE 1 absolute track error (ATE_RMSE (m)) for each method within KITTI datasets
From the table, thanks to the accurate point cloud registration and the multi-sensor weighted nonlinear optimization framework introducing global constraint of GNSS, the mean root mean square error of absolute track errors of the method is about 2.317m, and compared with the positioning errors of the three similar advanced methods, the positioning errors of the three advanced methods are obviously reduced, and the positioning accuracy of the three advanced methods is superior to that of the three advanced methods. The experimental result shows that the global optimization link in the method well completes the effect of inhibiting the local accumulated drift, so that the pose estimation result tends to the global optimal solution. And because the local sensor is not interfered by the signal refraction and reflection environment, the positioning abnormal value generated by the multipath effect in the urban environment of the traditional GNSS positioning is compensated, and the complete reliability of the positioning of the fusion system is ensured.
According to the LiDAR-IMU-GNSS fusion positioning method based on voxel precise registration, which is provided by the invention, the positioning accuracy is higher than that of other similar methods in the actual urban environment, and a precise and continuous pose estimation result can be provided for the vehicle-mounted platform in the running process of the complex urban environment.

Claims (5)

1. A LiDAR-IMU-GNSS fusion positioning method based on voxel precise registration comprises the following steps:
(1) The voxel downsampling method based on curvature segmentation is adopted: giving an original point cloud sequence acquired by a group of laser radars, traversing all points in the original point cloud sequence, and performing coarse clustering by using a breadth-first method;
Specifically, the point cloud clusters with similar depths are finely segmented by utilizing a geometric angle threshold value based on Euclidean distance; let the laser radar scan center be O, two similar edge points p a、pb in the point cloud cluster, the depth is d a、db,da>db respectively, the point cloud number in the point cloud cluster where the set point p i is located is M, and then the p i point cloud roughness is:
Setting the roughness threshold as Traversing M, will/>Points of (1) are classified as edge feature point sets, will/>The points of the curve are classified into a plane characteristic point set, downsampling operation is respectively carried out, and then a hash function mapping value is used for replacing random sampling to rapidly downsample the point cloud cluster after the curvature coarse classification; setting the coordinates of a certain characteristic point in a group of point cloud sequences in a voxel space as p (x, y, z); setting the size of the voxel grid as r, wherein the dimension of the voxel grid in the x direction is D x=(xmax-xmin)/r, and the x direction index of p in the voxel grid is h x=(x-xmin)/r, and the y and z directions are the same;
after three-dimensional indexes of the feature points in the voxel space are obtained, the feature point indexes are rapidly ordered by adopting a hash function, the feature point indexes are mapped into N containers, N=80, and the hash mapping function is as follows:
hash(hx,hy,hz)=(hx+hy·Dx+hz·Dx·Dy)%N (2)
in order to avoid hash collision, the collision detection conditions are set as follows:
hash(hx,hy,hz)=hash(h'x,h'y,h'z) (hx≠h'x|hy≠h'y|hz≠h'z) (3)
Outputting the index value in the current container and emptying the container once the hash collision is detected, and placing a new index value into the container;
(2) Point cloud registration based on smoothing voxel processing
Adopting a method for solving T based on characteristic point distribution in voxels, and converting a problem of constructing a single-point nearest neighbor model by using a dendrogram into a nearest neighbor model of a construction point and a neighborhood point set; first, two sets of point cloud sequences are approximated as gaussian distributions, i.e Wherein i is E (1, n),/>And/>Respectively two groups of point cloud sequence covariance matrixes; let the neighborhood point set of a i be/>Wherein λ is a neighborhood decision threshold, thereby calculating rigid body transformation error/>The method comprises the following steps:
The step realizes the smooth processing of all adjacent point clouds in the a i neighborhood; the maximum likelihood estimation of the pose transformation matrix T of the frame can be estimated as follows, wherein N i is the number of point clouds in a i neighborhood;
in addition to smoothing all neighboring point clouds in the a i neighborhood, to avoid trapping in a locally optimal blind zone after multiple iterations, an iteration termination threshold epsilon is set as follows:
|RMSEk+1-RMSEk|>ε (6)
Wherein, RMSE k+1 and RMSE k are root mean square errors of the previous k+1 iterations and the previous k iterations respectively, and when the absolute value of the variation of the root mean square errors is |rmse k+1-RMSEk |ε or reaches the maximum number of iterations, the iterations are completed;
(3) The method suitable for unifying the pose space of the laser radar-IMU-GNSS multi-sensor is adopted;
Is provided with For IMU to GNSS extrinsic conversion matrix,/>An external parameter conversion matrix for the laser radar reaching the IMU; because the hardware is fixedly connected with the mobile carrier, the two are fixed values after calibration; introducing spatial conversion parameters/>Associating two pose spaces; /(I)Including translation parameters/>And rotation parameter/>The problem of multi-source pose space unification at the time t is expressed as follows:
In the method, in the process of the invention, Setting the initial value as a unit array, and updating the/>, at the next moment after adding GNSS factors to solve the global optimal solution each timeA value correcting an accumulated offset between the local coordinate system and the global coordinate system;
(4) Global map construction
The method comprises the steps that a map optimization method is utilized, a global map is constructed as a maximum posterior estimation problem, and nonlinear optimization is carried out on state vectors in a sliding window; the global optimization function is constructed as follows:
Wherein ρ is GNSS confidence; For the current global point cloud/> Local point cloud derived after local matching with inter-frameA pose conversion matrix between the two; the specific meaning of each sensor cost function in the formula is as follows;
① Lidar-inertial odometer factor: the carrier position under the local coordinate system at the moment t can be obtained according to the local pose estimation result of the laser radar-inertial odometer And rotation/>The local residual factor can thus be constructed as follows:
② GNSS factors: setting the time interval of two frames of GNSS observation values as delta t, and realizing time alignment with LIO pose estimation values in an interpolation mode; GNSS measurements in an ENU coordinate system are now given Pose observations of LIO in global coordinate System/>The GNSS residual factor is expressed as follows:
When the carrier moves to a GNSS signal good area, GNSS factors are added by taking GNSS confidence as a weight in order to fully and reliably utilize GNSS observation values, wherein the GNSS confidence is determined by the number of visible effective satellites of the GNSS;
③ Voxelized loop-back factor: considering the possible coincidence of the running areas of the mobile carrier, a loop detection link needs to be added to establish loop constraint existing between non-adjacent frames; the loop back factor can be constructed according to equation (5) as follows:
2. The LiDAR-IMU-GNSS fusion positioning method based on voxel precise registration according to claim 1, wherein the point cloud voxel downsampling method based on curvature segmentation in the step (1) is characterized in that firstly coarse classification is carried out through a curvature threshold value, then the point cloud voxel downsampling is carried out by utilizing a Hash mapping function instead of a random sampling consistency method, and the spatial feature distribution attribute of source point clouds is reserved to a greater extent.
3. The LiDAR-IMU-GNSS fusion positioning method based on voxel-based fine registration of claim 1, wherein the point cloud registration based on smooth voxel-based processing in the step (2) performs smooth processing on the point cloud sequence aiming at single-to-multiple distribution in the point cloud sequence, a point cloud registration model based on the nearest neighborhood of the point and neighborhood point set is constructed, and an iteration termination threshold is set to reduce the occurrence probability of a local optimal solution problem.
4. The LiDAR-IMU-GNSS fusion positioning method based on voxel-based fine registration according to claim 1, wherein in the multi-sensor pose space unification method in the step (3), a pose estimation result of a laser radar and an IMU in a local map and a GNSS measurement value in a global map are spatially unified by a lever arm compensation method, and a pose conversion matrix at the next moment is updated after a GNSS factor is added to solve a global optimal solution each time, so that the accumulated offset between a local coordinate system and the global coordinate system is corrected.
5. The LiDAR-IMU-GNSS fusion positioning method based on voxel-based fine registration of claim 1, wherein in the step (4), a global map construction framework based on graph optimization is adopted, and weighted GNSS global position observations and voxel loop factors are used as global constraints to inhibit the accumulated drift of a local sensor; and optimizing and correcting the historical track by utilizing an optimized smooth voxel point cloud registration method through registration between the point cloud of the prior local map and the current global point cloud, so that the pose estimated value is converged to the global optimal result.
CN202210278286.4A 2022-03-21 2022-03-21 LiDAR-IMU-GNSS fusion positioning method based on voxelized fine registration Active CN114659514B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210278286.4A CN114659514B (en) 2022-03-21 2022-03-21 LiDAR-IMU-GNSS fusion positioning method based on voxelized fine registration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210278286.4A CN114659514B (en) 2022-03-21 2022-03-21 LiDAR-IMU-GNSS fusion positioning method based on voxelized fine registration

Publications (2)

Publication Number Publication Date
CN114659514A CN114659514A (en) 2022-06-24
CN114659514B true CN114659514B (en) 2024-05-31

Family

ID=82032309

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210278286.4A Active CN114659514B (en) 2022-03-21 2022-03-21 LiDAR-IMU-GNSS fusion positioning method based on voxelized fine registration

Country Status (1)

Country Link
CN (1) CN114659514B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115407357B (en) * 2022-07-05 2024-05-31 东南大学 Low-harness laser radar-IMU-RTK positioning mapping algorithm based on large scene
CN115508844B (en) * 2022-11-23 2023-03-21 江苏新宁供应链管理有限公司 Intelligent detection method for deviation of logistics conveyor based on laser radar
WO2024112255A1 (en) * 2022-11-23 2024-05-30 Dconstruct Technologies Pte. Ltd. Device and method for performing localization of a robot device
CN115540875B (en) * 2022-11-24 2023-03-07 成都运达科技股份有限公司 Method and system for high-precision detection and positioning of train vehicles in station track
CN115631314B (en) * 2022-12-19 2023-06-09 中汽研(天津)汽车工程研究院有限公司 Point cloud map construction method based on multi-feature and self-adaptive key frames
CN118379526A (en) * 2024-06-26 2024-07-23 集美大学 Multi-feature point cloud matching method based on probability distribution sampling constraint

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107709928A (en) * 2015-04-10 2018-02-16 欧洲原子能共同体由欧洲委员会代表 For building figure and the method and apparatus of positioning in real time
CN111710027A (en) * 2020-05-25 2020-09-25 南京林业大学 Tunnel three-dimensional geometric reconstruction method considering data-driven segment segmentation and model-driven segment assembly
CN111949943A (en) * 2020-07-24 2020-11-17 北京航空航天大学 Vehicle fusion positioning method for V2X and laser point cloud registration for advanced automatic driving
CN114140761A (en) * 2020-08-13 2022-03-04 长沙智能驾驶研究院有限公司 Point cloud registration method and device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108230379B (en) * 2017-12-29 2020-12-04 百度在线网络技术(北京)有限公司 Method and device for fusing point cloud data

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107709928A (en) * 2015-04-10 2018-02-16 欧洲原子能共同体由欧洲委员会代表 For building figure and the method and apparatus of positioning in real time
CN111710027A (en) * 2020-05-25 2020-09-25 南京林业大学 Tunnel three-dimensional geometric reconstruction method considering data-driven segment segmentation and model-driven segment assembly
CN111949943A (en) * 2020-07-24 2020-11-17 北京航空航天大学 Vehicle fusion positioning method for V2X and laser point cloud registration for advanced automatic driving
CN114140761A (en) * 2020-08-13 2022-03-04 长沙智能驾驶研究院有限公司 Point cloud registration method and device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于激光雷达的SLAM和融合定位方法综述;代凯 等;《汽车文摘》;20211231;第1-8页 *

Also Published As

Publication number Publication date
CN114659514A (en) 2022-06-24

Similar Documents

Publication Publication Date Title
CN114659514B (en) LiDAR-IMU-GNSS fusion positioning method based on voxelized fine registration
CN113432600B (en) Robot instant positioning and map construction method and system based on multiple information sources
CN105760811B (en) Global map closed loop matching process and device
CN112230243B (en) Indoor map construction method for mobile robot
CN114526745B (en) Drawing construction method and system for tightly coupled laser radar and inertial odometer
CN103345757B (en) Optics under multilevel multi-feature constraint and SAR image autoegistration method
CN104657981B (en) Dynamic compensation method for three-dimensional laser distance metering data of mobile robot in moving process
CN115407357B (en) Low-harness laser radar-IMU-RTK positioning mapping algorithm based on large scene
CN109949350B (en) Multi-temporal point cloud automatic registration method based on morphological invariant features
CN111242996B (en) SLAM method based on Apriltag and factor graph
CN105758408A (en) Method and device for building local maps
CN112215958B (en) Laser radar point cloud data projection method based on distributed computation
WO2024001629A1 (en) Multi-sensor fusion method and system for intelligent driving vehicle
CN114777775B (en) Positioning method and system for multi-sensor fusion
CN116758153A (en) Multi-factor graph-based back-end optimization method for accurate pose acquisition of robot
CN111274529B (en) Robust Gao Sini Weisal PHD multi-expansion target tracking algorithm
CN114563000B (en) Indoor and outdoor SLAM method based on improved laser radar odometer
CN110688440B (en) Map fusion method suitable for less sub-map overlapping parts
CN110285805A (en) A kind of adaptive-interpolation/division processing method of data void holes
CN113670290B (en) Mobile robot indoor map construction method based on multi-robot cooperation
CN113627569A (en) Data fusion method for radar video all-in-one machine used for traffic large scene
CN117031440A (en) Laser SLAM method based on road surface point cloud plane
CN117541631B (en) Blade profile data registration method based on multi-scale features and bidirectional distances
CN116736330A (en) Method for acquiring laser odometer of robot based on dynamic target tracking
Li et al. Plane Detection Based on an Improved RANSAC Algorithm

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant