CN115963508A - Tightly-coupled SLAM method and system for laser radar and IMU - Google Patents


Publication number
CN115963508A
Authority
CN
China
Prior art keywords: data, IMU, point cloud, frame, laser
Prior art date
Legal status
Pending
Application number
CN202210521720.7A
Other languages
Chinese (zh)
Inventor
刘弨
段章领
仰劲涛
王坤
周行云
路然
吴世博
Current Assignee
Hefei Hegong Anchi Intelligent Technology Co ltd
Original Assignee
Hefei Hegong Anchi Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hefei Hegong Anchi Intelligent Technology Co ltd filed Critical Hefei Hegong Anchi Intelligent Technology Co ltd
Priority to CN202210521720.7A priority Critical patent/CN115963508A/en
Publication of CN115963508A publication Critical patent/CN115963508A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a tightly-coupled SLAM method and system for a laser radar and an IMU (inertial measurement unit), belonging to the technical field of intelligent mine positioning and navigation. The method comprises: performing IMU pre-integration based on the data of the laser odometer and the data of the IMU odometer to obtain IMU integral data; performing motion distortion correction on the point cloud data generated by the laser radar to obtain distortion-corrected point cloud data, and extracting point cloud feature information from the distortion-corrected point cloud data; selecting a candidate closed-loop matching frame from the historical key frames, matching the scanning data of the laser radar with a map to obtain a pose transformation, and constructing a closed-loop factor; and, for the point cloud feature information, adding the IMU integral data, the data of the laser odometer and the closed-loop factor as factors of a graph optimization algorithm into a factor graph, optimizing the factor graph, and updating the poses of all key frames to obtain a reconstructed map. The invention improves the accuracy of the whole system in an unstructured environment.

Description

Tightly-coupled SLAM method and system for laser radar and IMU
Technical Field
The invention relates to the technical field of mine intelligent positioning and navigation, in particular to a method and a system for tightly coupling SLAM of a laser radar and an IMU.
Background
Operation areas such as coal mine tunnels and excavation working faces have typical unstructured environmental characteristics, and Global Positioning System (GPS) technology cannot be applied directly underground; mine accidents therefore occur frequently during coal mining, and there is an urgent need to replace people with machinery, reduce underground personnel through automation and improve the intelligence level of mines. By constructing an autonomous positioning system suitable for coal mine robots, the problems of accurate positioning and posture sensing of underground robots can be solved. Rapidly mastering an underground-robot accurate sensing and positioning technology that fuses multiple information sources such as inertial navigation and laser is the key to realizing local autonomy of the underground robot, so a method that fuses simultaneous localization and mapping (SLAM) of a laser radar and an inertial measurement unit (IMU) is very important.
In the related art, the invention with application publication number CN113379910A discloses a SLAM-based mobile robot mine scene reconstruction method and system, which obtains synchronously calibrated laser point cloud data and visual point cloud data measured by a mobile robot; fuses the acquired laser point cloud data and visual point cloud data; performs point cloud motion-distortion removal and point cloud filtering on the fused point cloud data; adds IMU pre-integration data, point cloud key frame data and GNSS data into a constraint subgraph by combining the processed point cloud data with a graph-optimization-based multi-constraint factor graph algorithm, and performs loop detection to obtain a reconstructed three-dimensional map. That method can effectively realize three-dimensional reconstruction of a mine scene, finally obtains a colored point cloud map, and improves the accuracy of mine scene reconstruction.
However, that reconstruction method operates on the original point cloud of the laser radar together with the IMU data; although the precision of the whole system in an unstructured environment is improved, the amount of calculation increases, and the accuracy and the real-time performance of the system cannot both be satisfied.
Disclosure of Invention
The invention aims to solve the technical problem of how to balance the precision and the calculated amount of the system and ensure the accuracy and the real-time performance of the system.
The invention solves the technical problems through the following technical means:
in one aspect, the present invention provides a method for tightly coupling SLAM between a laser radar and an IMU, where the method includes:
performing IMU pre-integration based on the data of the laser odometer and the data of the IMU odometer to obtain IMU integration data;
carrying out motion distortion correction on point cloud data generated by a laser radar to obtain point cloud data after distortion correction, and extracting point cloud characteristic information from the point cloud data after distortion correction;
selecting a candidate closed-loop matching frame from the historical key frames, matching the scanning data of the laser radar with a map to obtain pose transformation, and constructing a closed-loop factor;
and regarding the point cloud characteristic information, adding the IMU integral data, the data of the laser odometer and the closed-loop factor as factors of a graph optimization algorithm into a factor graph, optimizing the factor graph, and updating the poses of all key frames to obtain a reconstructed map.
The method of tightly coupling the laser radar and the IMU operates jointly on the original point cloud of the laser radar and the IMU data, so that the precision of the whole system in an unstructured environment is improved. The data of the laser odometer are added to the factor graph as an optimization factor, and the distortion in the laser odometer data can be removed, further improving precision. Meanwhile, to address the increased amount of calculation caused by operating on the laser radar point cloud data and the IMU data simultaneously, the method matches key frames, striking a balance between precision and computation and guaranteeing the accuracy and real-time performance of the system. In addition, because the IMU odometer and the laser odometer are provided and the laser radar is tightly coupled with the IMU, the method does not need a Global Navigation Satellite System (GNSS), can be used without satellite signals, and has high universality.
Further, the performing IMU pre-integration based on the data of the laser odometer and the data of the IMU odometer to obtain IMU integration data includes:
receiving data of the laser odometer and storing a current time stamp of the laser odometer;
calculating relative pose transformation between the IMU odometers corresponding to the starting time and the ending time in the IMU queue, wherein the starting time is the current timestamp of the laser odometer;
adding IMU original data into an IMU queue, and performing integral calculation on data between two frames in the IMU queue by using the data of the laser odometer and the relative pose transformation as initial values of integral to obtain IMU integral data.
Further, the extracting point cloud feature information from the point cloud data after distortion correction includes:
for each point in each point cloud frame of the distortion-corrected point cloud data, taking the five points before and the five points after the current point, and calculating the square of the sum of the distance differences from these points to the current (middle) point as the curvature of the current point;
dividing each point cloud frame into a plurality of sections, extracting a set number of edge points and plane points from each section, and respectively sequencing the edge points and the plane points in each section by taking smoothness as an index to obtain a first sequencing result and a second sequencing result;
extracting edge point features according to smoothness from low to high based on the first sequencing result;
extracting plane point features according to smoothness from high to low based on the second sorting result;
and encapsulating the point cloud data with the edge point characteristics and the plane point characteristics to obtain the point cloud characteristic information.
Further, the process of determining the data of the laser odometer comprises:
determining a key frame based on a point cloud frame of a laser radar, wherein the point cloud frame of the laser radar consists of a plane feature and an edge feature;
constructing a point cloud map comprising a certain number of scanning point clouds by a sliding window method, registering the key frames into a fixed-size prior sub key frame set, and converting the sub key frame set into the world coordinate system by a transformation relation to obtain a map formed by merging the sub key frames;
adding the obtained current key frame of the laser radar into the map through scanning matching;
calculating a first distance between the edge feature and the corresponding edge block, and calculating a second distance between the planar feature and the corresponding planar block;
and solving the optimal transformation matrix relation by adopting a Gauss-Newton method based on the first distance and the second distance to obtain the pose transformation relation of state nodes of the robot at adjacent moments as the data of the laser odometer.
Further, the determining a key frame based on the point cloud frame of the laser radar comprises:
defining the edge features F_t^e and the planar features F_t^p of the laser radar scanning point cloud at time t, and constructing a laser radar point cloud frame F_t = {F_t^e, F_t^p} based on the edge features and the planar features;
When the difference value between the current state node of the robot and the state node at the last moment exceeds a defined threshold value, determining a laser radar point cloud frame corresponding to the current state node as the key frame;
and associating the current state node with the key frame.
Further, the solving an optimal transformation matrix relation by using a gauss-newton method based on the first distance and the second distance to obtain a pose transformation relation of state nodes of the robot at adjacent moments as data of the laser odometer includes:
based on the first distance and the second distance, solving the optimal transformation matrix relation by adopting a Gauss-Newton method as follows:

min_{T_{k+1}} { Σ_{p_{k+1}^e ∈ 'F_{k+1}^e} d_ek + Σ_{p_{k+1}^p ∈ 'F_{k+1}^p} d_pk }

in the formula: d_ek is the first distance, d_pk is the second distance; 'F_{k+1}^e and 'F_{k+1}^p are the edge features and the planar features of the key frame converted from the robot coordinate system to the world coordinate system; p_{k+1}^e is an edge point taken at time k+1 and p_{k+1}^p is a planar point taken at time k+1.

Obtaining, based on the transformation matrix, the transformation relation ΔT_{k,k+1} between the state nodes X_k and X_{k+1}:

ΔT_{k,k+1} = T_k^{−1} T_{k+1}

in the formula: T_k is the rotation and translation at time k, and T_{k+1} is the rotation and translation at time k+1.
Further, the method further comprises:
when a new state node is added into the factor graph, searching a prior state node which is closest to the new state node in the factor graph in Euclidean space;
and adding the pose transformation relation of the closest prior state node and the state node at the next moment into the factor graph as a closed-loop factor.
Further, the selecting a candidate closed-loop matching frame from the historical key frames, matching the scanning data of the laser radar with a map to obtain pose transformation, and constructing a closed-loop factor includes:
constructing a KD-tree from the historical key frames, searching for matching key frames within a specified search radius, and taking the closest key frame whose time interval from the current frame satisfies the specified range as the matching frame;
constructing a first feature set of the current historical key frame through inherent edge points and plane points of the current historical key frame, and constructing a second feature set of the matching frame by using m key frame feature points which are adjacent to each other before and after the matching frame;
and matching the first feature set and the second feature set based on the specified number of iterations and the minimum iteration threshold, correcting the pose of the current historical key frame according to the matching result, and storing the correspondence between the current historical key frame and the matching frame as a closed-loop factor.
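The candidate search described above can be sketched as follows (a simplified, hypothetical illustration: the patent builds a KD-tree over the historical key-frame poses, whereas this self-contained sketch uses a linear scan with the same radius and time-gap criteria):

```python
import math

def find_loop_candidate(history, cur_pos, cur_time,
                        search_radius=15.0, min_time_gap=30.0):
    """history: list of (index, (x, y, z) position, timestamp) key frames.
    Returns the index of the nearest sufficiently old key frame, or None."""
    best, best_d = None, search_radius
    for idx, pos, t in history:
        if cur_time - t < min_time_gap:   # too recent, skip
            continue
        d = math.dist(cur_pos, pos)       # Euclidean distance
        if d <= best_d:
            best_d, best = d, idx
    return best

history = [(0, (0.0, 0.0, 0.0), 0.0),
           (1, (50.0, 0.0, 0.0), 40.0),
           (2, (2.0, 2.0, 0.0), 80.0)]
# the robot returns near its starting point at t = 120 s
cand = find_loop_candidate(history, (1.0, 0.5, 0.0), 120.0)
```

The radius and time-gap values here are illustrative assumptions, not the patent's parameters.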
Further, the method further comprises:
and displaying the IMU integral data, the point cloud data after distortion correction, the point cloud characteristic information and the optimization result of the factor graph through Rviz.
In addition, the invention also provides a tightly coupled SLAM system of the laser radar and the IMU, and the system comprises:
the pre-integration module is used for performing IMU pre-integration based on the data of the laser odometer and the data of the IMU odometer to obtain IMU integration data;
the correction extraction module is used for carrying out motion distortion correction on the point cloud data generated by the laser radar to obtain the point cloud data after the distortion correction, and extracting point cloud characteristic information from the point cloud data after the distortion correction;
the closed-loop factor determination module is used for selecting a candidate closed-loop matching frame from historical key frames, matching the scanning data of the laser radar with a map to obtain pose transformation and constructing a closed-loop factor;
and the optimization module is used for adding the IMU integral data, the data of the laser odometer and the closed-loop factor as factors of a graph optimization algorithm for the point cloud characteristic information into a factor graph, optimizing the factor graph, and updating the poses of all key frames to obtain a reconstructed map.
The invention has the advantages that:
(1) The method of tightly coupling the laser radar and the IMU operates jointly on the original point cloud of the laser radar and the IMU data, so that the precision of the whole system in an unstructured environment is improved; the data of the laser odometer are added to the factor graph as an optimization factor, and the distortion in the laser odometer data can be removed, further improving precision. Meanwhile, to address the increased amount of calculation caused by operating on the laser radar point cloud data and the IMU data simultaneously, the method matches key frames, striking a balance between precision and computation and guaranteeing the accuracy and real-time performance of the system. In addition, because the IMU odometer and the laser odometer are provided and the laser radar is tightly coupled with the IMU, the method does not need to rely on GNSS data, can be used without satellite signals, and has high universality.
(2) In the invention, key frames are defined using the node state of the robot; because the state parameter is a multi-dimensional vector, it perceives the environment more distinctly than a single distance value, so the key-frame definition in this method is more accurate, further improving the accuracy of the system.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
FIG. 1 is a schematic flow diagram of a method for tightly coupling SLAM between a lidar and an IMU in a first embodiment of the invention;
FIG. 2 is a schematic block diagram of a tightly coupled SLAM of a lidar and an IMU in a first embodiment of the invention;
FIG. 3 is a diagram illustrating IMU pre-integration data transfer according to a first embodiment of the present invention;
FIG. 4 is a schematic illustration of a first embodiment of a laser distortion correction data delivery system in accordance with the present invention;
FIG. 5 is a diagram illustrating the transmission of point cloud feature extraction data according to a first embodiment of the present invention;
FIG. 6 is a diagram illustrating factor graph optimized data delivery according to a first embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a tightly coupled SLAM system for a lidar and an IMU in a second embodiment of the invention;
FIG. 8 is a graph of experimental results of a tightly coupled SLAM method of the laser radar and IMU of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1 to 2, a first embodiment of the present invention proposes a tightly coupled SLAM method of a laser radar and an IMU, the method including the steps of:
s10, performing IMU pre-integration based on the data of the laser odometer and the data of the IMU odometer to obtain IMU integration data;
s20, carrying out motion distortion correction on the point cloud data generated by the laser radar to obtain point cloud data after distortion correction, and extracting point cloud characteristic information from the point cloud data after distortion correction;
s30, selecting a candidate closed-loop matching frame from the historical key frames, matching the scanning data of the laser radar with a map to obtain pose transformation, and constructing a closed-loop factor;
and S40, regarding the point cloud characteristic information, adding the IMU integral data, the data of the laser odometer and a closed-loop factor as factors of a graph optimization algorithm into a factor graph, optimizing the factor graph, and updating the poses of all key frames to obtain a reconstructed map.
Considering that mine roadways have unstructured environmental features and that GPS technology cannot be applied directly underground, the laser radar may under certain conditions be unable to acquire enough feature point clouds; the IMU, however, does not rely on external information and is independent of the other sensors, so IMU pre-integration can compensate when feature point clouds are insufficient, after which feature extraction is performed on the laser radar point clouds, keeping the whole system stable.
In this embodiment, the method of tightly coupling the laser radar and the IMU operates jointly on the original point cloud of the laser radar and the IMU data, so that the precision of the whole system in an unstructured environment is improved; the data of the laser odometer are added to the factor graph as an optimization factor, and the distortion in the laser odometer data can be removed, which further improves the precision but increases the computational cost. To address the increased amount of calculation caused by operating on the laser radar point cloud data and the IMU data simultaneously, this application matches key frames, so that precision and computation are balanced and the accuracy and real-time performance of the system are guaranteed.
Furthermore, in this embodiment, by providing the IMU odometer and tightly coupling the laser radar with the IMU, the system does not need to rely on GNSS data, can be used without satellite signals, and has high universality.
In an embodiment, as shown in fig. 3, the step S10 includes the following steps:
s11, receiving data of the laser odometer and storing a current time stamp of the laser odometer;
Specifically, the present embodiment unifies the received laser odometer data into a designated format and saves the current time stamp of the laser odometer.
S12, calculating relative pose transformation between the IMU odometers corresponding to the starting time and the ending time in the IMU queue, wherein the starting time is a current timestamp of the laser odometer;
specifically, the embodiment removes, based on the current timestamp of the laser odometer, IMU incremental data in the IMU queue whose timestamp is earlier than the current timestamp of the laser odometer; then calculating the relative pose transformation between the IMU odometers corresponding to the starting time and the ending time in the IMU queue; and combining and issuing the data received by the laser odometer and the data obtained by the relative pose transformation obtained by calculation.
And S13, adding IMU original data into an IMU queue, and performing integral calculation on data between two frames in the IMU queue by using the data of the laser odometer and the relative pose transformation as initial values of integral to obtain IMU integral data.
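The queue maintenance in steps S11 to S13 can be sketched as follows (a simplified illustration with an assumed (timestamp, sample) queue layout; not the patent's exact implementation): IMU samples older than the laser odometer stamp are dropped so that integration starts from the laser frame.

```python
from collections import deque

def trim_imu_queue(imu_queue, laser_stamp):
    """Drop IMU samples whose timestamp precedes the laser odometer stamp,
    so that integration starts from the laser frame (step S12's start time)."""
    while imu_queue and imu_queue[0][0] < laser_stamp:
        imu_queue.popleft()
    return imu_queue

q = deque([(0.00, 'imu0'), (0.05, 'imu1'), (0.11, 'imu2'), (0.16, 'imu3')])
trim_imu_queue(q, 0.10)   # laser odometer frame stamped at t = 0.10 s
```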
Further, the IMU observation data comprise the angular velocity and the acceleration of the IMU, and their measurement models are defined as follows:

angular velocity: ŵ_t = w_t + b_t^w + n_t^w

acceleration: â_t = R_t^{BW}(a_t − g) + b_t^a + n_t^a

Then: the IMU observation source data at time t are ŵ_t and â_t, and ŵ_t and â_t are affected by the white noise n_t and the IMU bias (zero offset) b_t; the matrix R_t^{BW} is the transformation matrix from the world coordinate system to the robot coordinate system; the gravity constant vector g belongs to the world coordinate system W; w_t is the true angular velocity at time t, b_t^w is the zero offset of the angular velocity at time t, and n_t^w is the white noise of the angular velocity at time t; a_t is the true acceleration at time t, b_t^a is the zero offset of the acceleration at time t, and n_t^a is the white noise of the acceleration at time t.
Further, this embodiment further includes:
the optimizer resets every time 100 frames of laser odometer data are received.
Specifically, initialization and reset are different: when reset is carried out, the current velocity and pose are kept consistent with those obtained by the previous optimization, and the noise model adopts the marginal distribution of the previously optimized model; an IMU factor is added and integral calculation is performed between two frames of IMU data; a prior factor is added, taking the currently received laser odometer as the prior factor; initial values are given to the variable nodes, optimization is performed and the state of the previous moment is updated; and the pre-integrator is initialized with the current optimized state to compute the IMU data after the current frame.
Further, by extrapolating the motion of the robot through the IMU increments, the position, velocity and rotation of the robot at time t + Δt are calculated as follows:

v_{t+Δt} = v_t + g·Δt + R_t·(â_t − b_t^a − n_t^a)·Δt

p_{t+Δt} = p_t + v_t·Δt + ½·g·Δt² + ½·R_t·(â_t − b_t^a − n_t^a)·Δt²

R_{t+Δt} = R_t · exp((ŵ_t − b_t^w − n_t^w)·Δt)

where R_t = (R_t^{BW})^T; it is assumed that both the angular velocity and the acceleration of the robot remain unchanged during the integration.

The relative motion between different time stamps is obtained by IMU pre-integration; the position change Δp_mn, the velocity change Δv_mn and the rotation change ΔR_mn between time m and time n obtained by pre-integration are calculated as follows:

Δv_mn = R_m^T·(v_n − v_m − g·Δt_mn)

Δp_mn = R_m^T·(p_n − p_m − v_m·Δt_mn − ½·g·Δt_mn²)

ΔR_mn = R_m^T·R_n

in the formula: R_n is the rotation at time n, R_m is the rotation at time m, p_n is the position at time n, p_m is the position at time m, v_n is the velocity at time n, and v_m is the velocity at time m.
In an embodiment, as shown in fig. 4, in the step S20, the motion distortion correction is performed on the point cloud data generated by the laser radar, which includes the following steps:
(1) Subscribing to the raw data and the IMU increments acquired by the IMU; when new IMU data are received, the pose at the current moment is estimated from the IMU integral data and the newly obtained IMU data.
(2) Subscribing to the original point cloud data acquired by the laser radar, assuming by default that only rotation and translation occur within the time of scanning one frame; aligning the IMU data with the time stamps, calculating the IMU integral data, including the roll, pitch and yaw (RPY) of each frame in the IMU queue, and then performing linear interpolation for each scanning point in the laser frame.
(3) Publishing the valid point cloud of the current frame after distortion correction, together with information such as the distortion-corrected point cloud data, the attitude angles and the initial pose of the current frame.
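The per-point interpolation of step (2) can be sketched for the translation-only case (an assumed simplification; the method also interpolates rotation): each scan point is shifted by the linearly interpolated ego-motion between the start and end of the sweep.

```python
def deskew_point(point, rel_time, sweep_time, motion):
    """point: (x, y, z) in the frame at scan start; rel_time: time since the
    scan started; motion: (dx, dy, dz) of the sensor over the whole sweep."""
    a = rel_time / sweep_time                   # interpolation ratio in [0, 1]
    return tuple(p - a * m for p, m in zip(point, motion))

# the sensor moved 0.2 m forward during a 0.1 s sweep;
# correct a point captured halfway through the sweep
corrected = deskew_point((10.0, 0.0, 0.0), 0.05, 0.1, (0.2, 0.0, 0.0))
```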
In an embodiment, as shown in fig. 5, in the step S20, extracting point cloud feature information from the point cloud data after distortion correction includes the following steps:
(1) For each point in each point cloud frame of the distortion-corrected point cloud data, taking the five points before and the five points after the current point, and calculating the square of the sum of the distance differences from these points to the current (middle) point as the curvature of the current point;
(2) Dividing each point cloud frame into a plurality of sections, extracting a set number of edge points and plane points from each section, and respectively sequencing the edge points and the plane points in each section by taking smoothness as an index to obtain a first sequencing result and a second sequencing result;
specifically, one frame of point cloud data is divided into 6 segments, 20 edge points and 20 plane points are extracted from each segment, and all points in the segment are sorted by taking smoothness as an index.
(3) Extracting edge point features according to smoothness from low to high based on the first sequencing result; extracting plane point features according to smoothness from high to low based on the second sorting result;
it should be noted that the remaining unprocessed points in a frame of point cloud data are classified as plane points.
(4) Encapsulating the point cloud data with the edge point features and the plane point features to obtain the point cloud feature information.
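The curvature computation of step (1) can be sketched as follows (range values and indices are illustrative assumptions; the criterion is the LOAM-style squared sum of range differences over five neighbours on each side, where high values mark edge points and low values plane points):

```python
def smoothness(ranges, i, k=5):
    """Curvature of point i from its k neighbours on each side: the squared
    sum of range differences to the middle point."""
    diff = sum(ranges[j] - ranges[i] for j in range(i - k, i + k + 1))
    return diff * diff

flat_scan = [1.0] * 16
edge_scan = [1.0] * 8 + [3.0] + [1.0] * 8   # one range discontinuity

flat = smoothness(flat_scan, 7)   # smooth neighbourhood -> low curvature
edge = smoothness(edge_scan, 8)   # range jump at the point -> high curvature
```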
It should be noted that, considering the unstructured environmental characteristics of mine roadways, the laser radar cannot acquire enough feature point clouds under certain conditions; the IMU can perform pre-integration to make up for this deficiency, and the method of tightly coupling the laser radar and the IMU operates jointly on the original point cloud of the laser radar and the IMU data, so that the accuracy of the whole system in an unstructured environment is improved.
In an embodiment, as shown in fig. 6, in the step S40, the determining process of the data of the laser odometer includes the following steps:
(1) Determining a key frame based on a point cloud frame of a laser radar, wherein the point cloud frame of the laser radar is composed of a plane feature and an edge feature.
The method specifically comprises the following steps: extracting planar and edge features by calculating curvature, wherein points with smaller curvature are classified as planar features and points with larger curvature as edge features; defining the edge features F_t^e and the planar features F_t^p of the laser radar scanning point cloud at time t, and constructing a laser radar point cloud frame F_t = {F_t^e, F_t^p} based on the edge features and the planar features.
When the difference value between the current state node of the robot and the state node at the last moment exceeds a defined threshold value, determining a laser radar point cloud frame corresponding to the current state node as the key frame; and associating the current state node with the key frame.
Such as: when the difference between the robot's current state X_{k+1} and the previous state X_k exceeds a defined threshold, the lidar frame R_{k+1} is taken as a key frame, and the robot state node X_{k+1} is associated with the key frame F_{k+1}.
It should be noted that after determining the key frames, the non-key frames between two key frames are discarded.
It should be understood that the threshold described in this embodiment is a value, obtained through extensive experiments, that is compared with the translation and rotation state quantities; depending on the map, the translation and rotation thresholds may for example be set to 1 meter and 10 degrees.
It should be noted that the key frame is defined using the node state of the robot. Since the state parameter is a multi-dimensional vector, it captures the environment more fully than a single distance value, so the key-frame definition in this method is more accurate, which further improves the accuracy of the system.
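A minimal sketch of this key-frame decision, assuming the state nodes are represented as 4x4 homogeneous pose matrices (the 1 m / 10 deg defaults follow the values suggested above; the function name and interface are illustrative, not from the patent):

```python
import numpy as np

def is_keyframe(T_prev, T_curr, trans_thresh=1.0, rot_thresh_deg=10.0):
    """Return True when the motion between state nodes X_k and X_{k+1}
    exceeds the translation or rotation threshold."""
    dT = np.linalg.inv(T_prev) @ T_curr              # relative motion X_k -> X_{k+1}
    trans = np.linalg.norm(dT[:3, 3])                # translation magnitude in meters
    # rotation angle recovered from the trace of the 3x3 rotation block
    cos_a = np.clip((np.trace(dT[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    rot_deg = np.degrees(np.arccos(cos_a))
    return trans > trans_thresh or rot_deg > rot_thresh_deg
```

A frame would be promoted to key frame (and the non-key frames between it and the previous key frame discarded) whenever this predicate fires.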
(2) A point cloud map comprising a certain number of scanning point clouds is constructed by a sliding window method, key frames are registered into a prior sub-key-frame set of fixed size, and the sub-key-frame set is converted into the world coordinate system by a transformation relation to obtain a map formed by merging the sub-key frames.
Specifically, a point cloud map comprising a certain number of the most recent laser radar scanning point clouds is constructed by a sliding window method; the associated sub-key-frame set {F_{k-n}, ..., F_k} is converted into the world coordinate system using the transformation relations {T_{k-n}, ..., T_k}, where T_k denotes the rotation and translation at time k and T_{k-n} the rotation and translation at time k-n; the converted sub-key frames are merged into a map M_k.
Wherein the map M_k is expressed as:

M_k = {M_k^e, M_k^p}

M_k^e = 'F_k^e ∪ 'F_{k-1}^e ∪ ... ∪ 'F_{k-n}^e

M_k^p = 'F_k^p ∪ 'F_{k-1}^p ∪ ... ∪ 'F_{k-n}^p

in the formula: M_k^e is the edge feature map, M_k^p is the plane feature map, and 'F^e and 'F^p are the transformed plane and edge features in the world coordinate system.
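The sub-key-frame merging in step (2) can be sketched as follows (a simplified illustration, assuming each frame is an N×3 numpy point array and each T a 4×4 homogeneous transform; names are illustrative):

```python
import numpy as np

def merge_submap(frames, poses):
    """Transform the sub-key frames {F_{k-n}, ..., F_k} into the world
    coordinate system with {T_{k-n}, ..., T_k} and merge them into M_k."""
    world_pts = []
    for pts, T in zip(frames, poses):
        homo = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
        world_pts.append((homo @ T.T)[:, :3])            # apply rotation + translation
    return np.vstack(world_pts)
```

In the actual method the edge and plane feature sets would be merged separately to form M_k^e and M_k^p.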
(3) The obtained current key frame of the laser radar is added to the map through scan matching to obtain the actual coordinate position.

For example: the latest radar key frame F_{k+1} = {F_{k+1}^e, F_{k+1}^p} is added to the map M_k through scan matching, and a coordinate system conversion is performed to transform the key frame from the robot coordinate system into the world coordinate system, obtaining 'F_{k+1} = {'F_{k+1}^e, 'F_{k+1}^p}.
(4) A first distance between the edge feature and the corresponding edge block is calculated, and a second distance between the planar feature and the corresponding planar block is calculated.
Specifically, the calculation formulas of the first distance and the second distance are respectively:

d_ek = |(p_{i,k+1}^e − p_{u,k}^e) × (p_{i,k+1}^e − p_{v,k}^e)| / |p_{u,k}^e − p_{v,k}^e|

d_pk = |(p_{i,k+1}^p − p_{u,k}^p) · ((p_{u,k}^p − p_{v,k}^p) × (p_{u,k}^p − p_{w,k}^p))| / |(p_{u,k}^p − p_{v,k}^p) × (p_{u,k}^p − p_{w,k}^p)|

in the formula: i, u, v and w are index numbers in the sets of plane features or edge features; p_{i,k+1}^e is the edge point taken at time k+1 under index i, and p_{u,k}^e and p_{v,k}^e are the edge points taken at time k under indices u and v; p_{i,k+1}^p is the plane point taken at time k+1 under index i, and p_{u,k}^p, p_{v,k}^p and p_{w,k}^p are the plane points taken at time k under indices u, v and w.
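The two residuals are the familiar point-to-line and point-to-plane distances and can be computed directly with numpy (a sketch; the function names mirror d_ek and d_pk above, with the edge line defined by two edge points of frame k and the plane by three plane points):

```python
import numpy as np

def d_ek(p, pu, pv):
    """First distance: edge point p of frame k+1 to the line through
    the edge points pu, pv of frame k."""
    return np.linalg.norm(np.cross(p - pu, p - pv)) / np.linalg.norm(pu - pv)

def d_pk(p, pu, pv, pw):
    """Second distance: plane point p of frame k+1 to the plane through
    the plane points pu, pv, pw of frame k."""
    n = np.cross(pu - pv, pu - pw)               # plane normal
    return abs(np.dot(p - pu, n)) / np.linalg.norm(n)
```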
(5) And solving the optimal transformation matrix relation by adopting a Gauss-Newton method based on the first distance and the second distance to obtain the pose transformation relation of state nodes of the robot at adjacent moments as the data of the laser odometer.
Specifically, based on the first distance and the second distance, the optimal transformation matrix relation solved by the Gauss-Newton method is:

min_{T_{k+1}} { Σ_{p_{i,k+1}^e ∈ 'F_{k+1}^e} d_ek + Σ_{p_{i,k+1}^p ∈ 'F_{k+1}^p} d_pk }

in the formula: d_ek is the first distance, d_pk is the second distance, 'F_{k+1}^e and 'F_{k+1}^p are the edge and plane features obtained by converting the key frame from the robot coordinate system into the world coordinate system, p_{i,k+1}^e is the edge point selected at time k+1, and p_{i,k+1}^p is the plane point selected at time k+1.
Note that the edge points are distinctive points, and the plane points are not distinctive points.
Based on the transformation matrix, the transformation relation ΔT_{k,k+1} between state nodes X_k and X_{k+1} is obtained:

ΔT_{k,k+1} = T_k^{-1} T_{k+1}

in the formula: T_k is the rotation and translation at time k, and T_{k+1} is the rotation and translation at time k+1.
It should be noted that the data of the laser odometer is added to the factor graph as an optimization factor, and distortion can be removed from the laser odometer data, so that the accuracy is further improved.
In an embodiment, the method further comprises:
when a new state node is added into the factor graph, searching a prior state node which is closest to the new state node in the factor graph in Euclidean space;
and adding the pose transformation relation of the closest prior state node and the state node at the next moment into the factor graph as a closed-loop factor.
Specifically, when a new state X_{k+1} is added to the factor graph, the factor graph is first searched to find a prior state close to X_{k+1} in Euclidean space. Suppose X_5 is one of the returned candidate states; F_{k+1} is then matched against the sub-key frames {F_{5-m}, ..., F_5, ..., F_{5+m}} by scan matching. Before scan matching, F_{k+1} and the prior sub-key frames are transformed into the world coordinate system W. The resulting ΔT_{5,k+1} is added to the factor graph as a closed-loop factor.
Further, after the factor graph optimization, this embodiment also corrects all previous point cloud data of the laser radar using the optimized historical key frame poses.
In an embodiment, the method further comprises: and displaying the IMU integral data, the point cloud data after distortion correction, the point cloud characteristic information and the optimization result of the factor graph through Rviz.
In an embodiment, the step S30 specifically includes the following steps:
S31, constructing a Kdtree from the historical key frames, searching for matching key frames within a specified search radius, and taking the closest key frame whose time interval meets the specified range as the matched frame;
S32, constructing a first feature set of the current historical key frame from its inherent edge points and plane points, and constructing a second feature set of the matched frame from the feature points of the m key frames adjacent before and after the matched frame;
and S33, matching the first feature point set and the second feature point set based on the specified iteration times and the minimum iteration threshold, correcting the pose of the current historical key frame according to the matching result, and storing the corresponding relation between the current historical key frame and the matching frame as a closed-loop factor.
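The candidate search of step S31 can be sketched as follows (for clarity a linear scan stands in for the Kd-tree, and the radius and time-gap parameters are illustrative assumptions, not values from the patent):

```python
import numpy as np

def find_loop_candidate(hist_xyz, hist_stamps, query_xyz, query_stamp,
                        radius=10.0, min_time_gap=30.0):
    """Return the index of the closest historical key frame within `radius`
    whose time interval to the query exceeds `min_time_gap`, or None."""
    hist_xyz = np.asarray(hist_xyz, dtype=float)
    dists = np.linalg.norm(hist_xyz - np.asarray(query_xyz, dtype=float), axis=1)
    gaps = np.abs(np.asarray(hist_stamps, dtype=float) - query_stamp)
    ok = (dists <= radius) & (gaps > min_time_gap)   # spatially close, temporally distant
    if not ok.any():
        return None
    return int(np.argmin(np.where(ok, dists, np.inf)))
```

The returned frame would then go through the feature-set matching of steps S32 and S33 to produce the closed-loop factor.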
In this embodiment, matching key frames reduces the computational load of the tightly coupled laser radar and IMU, balancing accuracy against computation and ensuring both the accuracy and the real-time performance of the system. An experimental result diagram of the tightly coupled SLAM method of the laser radar and the IMU under the unstructured characteristics of a mine roadway is shown in fig. 7.
In addition, as shown in fig. 8, a second embodiment of the present invention further provides a tightly coupled SLAM system of a laser radar and an IMU, the system comprising:
the pre-integration module 10 is used for performing IMU pre-integration based on the data of the laser odometer and the data of the IMU odometer to obtain IMU integration data;
the correction extraction module 20 is configured to perform motion distortion correction on the point cloud data generated by the laser radar to obtain point cloud data after distortion correction, and extract point cloud feature information from the point cloud data after distortion correction;
the closed-loop factor determination module 30 is configured to select a candidate closed-loop matching frame from the historical key frames, perform matching between the scanning data of the laser radar and a map, obtain pose transformation, and construct a closed-loop factor;
and the optimization module 40 is configured to add the IMU integral data, the data of the laser odometer, and the closed-loop factor as factors of a graph optimization algorithm to a factor graph for the point cloud feature information, optimize the factor graph, and update poses of all keyframes to obtain a reconstructed map.
In this embodiment, the method of tightly coupling the laser radar and the IMU operates on the raw laser radar point cloud together with the IMU data, so that the precision of the whole system in an unstructured environment is improved; the data of the laser odometer is added to the factor graph as an optimization factor, and distortion can be removed from the laser odometer data, further improving the precision, though at an increased computational cost. To address the increased computation caused by operating on the laser radar point cloud data and the IMU data simultaneously, the key frames are matched, so that precision and computation are balanced and both the accuracy and the real-time performance of the system are guaranteed.
Furthermore, in this embodiment, by providing the IMU odometer and tightly coupling the laser radar with the IMU, GNSS data is not required; the system can be used without satellite signals and therefore has high universality.
In one embodiment, the pre-integration module 10 includes:
the receiving unit is used for receiving the data of the laser odometer and storing the current time stamp of the laser odometer;
Specifically, this embodiment unifies the received laser odometer data into a designated format and saves the current time stamp of the laser odometer.
The relative pose transformation calculation unit is used for calculating the relative pose transformation between the IMU odometers corresponding to the starting time and the ending time in the IMU queue, wherein the starting time is the current timestamp of the laser odometer;
specifically, the embodiment removes, based on the current timestamp of the laser odometer, IMU incremental data in the IMU queue whose timestamp is earlier than the current timestamp of the laser odometer; then calculating the relative pose transformation between the IMU odometers corresponding to the starting time and the ending time in the IMU queue; and combining and issuing the data received by the laser odometer and the data obtained by the relative pose transformation obtained by calculation.
And the pre-integration unit is used for adding IMU original data into an IMU queue, and performing integration calculation on data between two frames in the IMU queue by using the data of the laser odometer and the relative pose transformation as initial values of integration to obtain IMU integral data.
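The integration step of the pre-integration unit can be sketched as follows (a deliberately simplified Euler integration: attitude propagation, bias estimation, and gravity compensation are omitted, and accelerations are assumed to be already expressed in the world frame — a full IMU pre-integration would handle all three):

```python
import numpy as np

def integrate_imu(samples, p0, v0):
    """Integrate buffered IMU samples between two laser odometer frames.
    samples: iterable of (dt, accel) pairs; p0, v0: initial position and
    velocity taken from the laser odometer / relative pose transform."""
    p = np.asarray(p0, dtype=float)
    v = np.asarray(v0, dtype=float)
    for dt, acc in samples:
        acc = np.asarray(acc, dtype=float)
        p = p + v * dt + 0.5 * acc * dt * dt     # constant-acceleration step
        v = v + acc * dt
    return p, v
```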
In one embodiment, the correction extraction module 20 includes:
the distortion correction unit is used for carrying out motion distortion correction on the point cloud data generated by the laser radar to obtain the point cloud data after distortion correction;
the characteristic extraction unit is used for extracting point cloud characteristic information from the point cloud data after distortion correction, and is specifically used for realizing the following steps:
(1) For each point in the point cloud data after distortion correction, take the five points before and the five points after the current point, and calculate the square of the sum of the distance differences from each of these points to the current point as the curvature of the current point;
(2) Dividing each point cloud frame into a plurality of sections, extracting a set number of edge points and plane points from each section, and respectively sequencing the edge points and the plane points in each section by taking smoothness as an index to obtain a first sequencing result and a second sequencing result;
specifically, one frame of point cloud data is divided into 6 segments, 20 edge points and 20 plane points are extracted from each segment, and all points in the segment are sorted by taking smoothness as an index.
(3) Extracting edge point features according to smoothness from low to high based on the first sequencing result; extracting plane point features according to smoothness from high to low based on the second sorting result;
it should be noted that the remaining unprocessed points in the frame of point cloud data are classified as plane points.
(4) And encapsulating the point cloud data with the edge point characteristics and the plane point characteristics to obtain the point cloud characteristic information.
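Steps (1) through (4) can be sketched in pure Python (a simplified version: `ranges` is the list of point ranges of one scan, and the 6/20/20 segment and point counts are the illustrative values mentioned above):

```python
def point_curvature(ranges, i):
    """Curvature proxy of point i: squared sum of range differences to the
    five points before and the five points after it."""
    window = ranges[i - 5:i] + ranges[i + 1:i + 6]
    return sum(r - ranges[i] for r in window) ** 2

def extract_features(ranges, n_segments=6, n_edge=20, n_plane=20):
    """Split the scan into segments, sort each by smoothness, and pick the
    sharpest points as edge features and the smoothest as plane features."""
    curv = [(point_curvature(ranges, i), i) for i in range(5, len(ranges) - 5)]
    seg_len = max(1, len(curv) // n_segments)
    edges, planes = [], []
    for s in range(0, len(curv), seg_len):
        seg = sorted(curv[s:s + seg_len])        # ascending curvature
        planes += [i for _, i in seg[:n_plane]]  # smoothest -> plane points
        edges += [i for _, i in seg[-n_edge:]]   # sharpest -> edge points
    return edges, planes
```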
In one embodiment, the closed loop factor determination module 30 includes:
the searching and matching unit is used for constructing a Kdtree from the historical key frames, searching for matching key frames within a specified search radius, and taking the closest key frame whose time interval meets the specified range as the matched frame;
the feature set construction unit is used for constructing a first feature set of the current historical key frame from its inherent edge points and plane points, and constructing a second feature set of the matched frame from the feature points of the m key frames adjacent before and after the matched frame;
and the matching unit is used for matching the first characteristic point set and the second characteristic point set based on the appointed iteration times and the minimum iteration threshold, correcting the pose of the current historical key frame according to the matching result, and storing the corresponding relation between the current historical key frame and the matching frame as a closed-loop factor.
In an embodiment, the optimization module 40 includes a laser odometry data factor adding unit, specifically configured to:
(1) And determining a key frame based on a point cloud frame of the laser radar, wherein the point cloud frame of the laser radar consists of a plane feature and an edge feature.
(2) A point cloud map comprising a certain amount of scanning point clouds is constructed through a sliding window method, and a transformation relation is adopted to convert an associated key frame set into a world coordinate system, so that a map formed by combining key frames is obtained.
(3) And adding the obtained current key frame of the laser radar into the map through scanning matching.
(4) A first distance between the edge feature and the corresponding edge block is calculated, and a second distance between the planar feature and the corresponding planar block is calculated.
(5) And solving the optimal transformation matrix relation by adopting a Gauss-Newton method based on the first distance and the second distance to obtain the pose transformation relation of state nodes of the robot at adjacent moments as the data of the laser odometer.
In an embodiment, the optimization module 40 includes a closed-loop factor adding unit, specifically configured to:
when a new state node is added into the factor graph, searching a prior state node which is closest to the new state node in the factor graph in Euclidean space;
and adding the pose transformation relation of the closest prior state node and the state node at the next moment into the factor graph as a closed-loop factor.
It should be noted that other embodiments or methods of implementing the tightly coupled SLAM system of the lidar and the IMU of the present invention can refer to the above embodiments and are not repeated herein.
It should be noted that the logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Further, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description of the specification, reference to the description of "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or to implicitly indicate the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one of the feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (10)

1. A method for tightly coupling SLAM of a lidar and an IMU, the method comprising:
performing IMU pre-integration based on the data of the laser odometer and the data of the IMU odometer to obtain IMU integration data;
carrying out motion distortion correction on point cloud data generated by a laser radar to obtain point cloud data after distortion correction, and extracting point cloud characteristic information from the point cloud data after distortion correction;
selecting a candidate closed-loop matching frame from the historical key frames, matching the scanning data of the laser radar with a map to obtain pose transformation, and constructing a closed-loop factor;
and regarding the point cloud characteristic information, adding the IMU integral data, the data of the laser odometer and the closed-loop factor as factors of a graph optimization algorithm into a factor graph, optimizing the factor graph, and updating the poses of all key frames to obtain a reconstructed map.
2. The method of claim 1, wherein performing IMU pre-integration based on the data of the laser odometer and the data of the IMU odometer to obtain IMU integration data comprises:
receiving data of the laser odometer and storing a current time stamp of the laser odometer;
calculating relative pose transformation between the IMU odometers corresponding to a starting time and an ending time in the IMU queue, wherein the starting time is a current timestamp of the laser odometer;
and adding IMU original data into an IMU queue, and performing integral calculation on data between two frames in the IMU queue by using the data of the laser odometer and the relative pose transformation as initial values of integral to obtain IMU integral data.
3. The method of claim 1, wherein extracting point cloud feature information from the point cloud data after distortion correction comprises:
for each point in the point cloud data after distortion correction, taking the five points before and the five points after the current point, and calculating the square of the sum of the distance differences from each of these points to the current point as the curvature of the current point;
dividing each point cloud frame into a plurality of sections, extracting a set number of edge points and plane points from each section, and respectively sequencing the edge points and the plane points in each section by taking smoothness as an index to obtain a first sequencing result and a second sequencing result;
extracting edge point features according to smoothness from low to high based on the first sequencing result;
extracting plane point features according to smoothness from high to low based on the second sorting result;
and encapsulating the point cloud data with the edge point characteristics and the plane point characteristics to obtain the point cloud characteristic information.
4. The method for tightly-coupled SLAM of lidar and an IMU of claim 1, wherein the determining of the data for the laser odometer comprises:
determining a key frame based on a point cloud frame of a laser radar, wherein the point cloud frame of the laser radar consists of a plane feature and an edge feature;
constructing a point cloud map comprising a certain amount of scanning point clouds by a sliding window method, registering a key frame into a prior sub key frame set with a fixed size, and converting the sub key frame set into a world coordinate system by adopting a transformation relation to obtain a map formed by combining sub key frames;
adding the obtained current key frame of the laser radar into the map through scanning matching;
calculating a first distance between the edge feature and the corresponding edge block, and calculating a second distance between the planar feature and the corresponding planar block;
and solving the optimal transformation matrix relation by adopting a Gauss-Newton method based on the first distance and the second distance to obtain the pose transformation relation of state nodes of the robot at adjacent moments as the data of the laser odometer.
5. The method for tightly-coupled SLAM of lidar and an IMU according to claim 4, wherein determining a key frame based on the point cloud frame of the lidar comprises:
defining the edge features F_t^e and the plane features F_t^p of the laser radar scanning point cloud at time t, and constructing a laser radar point cloud frame F_t = {F_t^e, F_t^p} based on the edge features and the plane features;
When the difference value between the current state node and the last state node of the robot exceeds a defined threshold value, determining a laser radar point cloud frame corresponding to the current state node as the key frame;
and associating the current state node with the key frame.
6. The method of claim 4, wherein the step of solving the optimal transformation matrix relationship by using a Gauss-Newton method based on the first distance and the second distance to obtain the pose transformation relationship of the state nodes of the robot at the adjacent moments as the data of the laser odometer comprises the steps of:
based on the first distance and the second distance, solving the optimal transformation matrix relation by adopting a Gauss-Newton method as follows:
min_{T_{k+1}} { Σ_{p_{i,k+1}^e ∈ 'F_{k+1}^e} d_ek + Σ_{p_{i,k+1}^p ∈ 'F_{k+1}^p} d_pk }

in the formula: d_ek is the first distance, d_pk is the second distance, 'F_{k+1}^e and 'F_{k+1}^p are the edge and plane features obtained by converting the key frame from the robot coordinate system into the world coordinate system, p_{i,k+1}^e is the edge point at time k+1, and p_{i,k+1}^p is the plane point at time k+1;

obtaining the transformation relation ΔT_{k,k+1} between state nodes X_k and X_{k+1} based on the transformation matrix:

ΔT_{k,k+1} = T_k^{-1} T_{k+1}

in the formula: T_k is the rotation and translation at time k, and T_{k+1} is the rotation and translation at time k+1.
7. The method of tightly coupled SLAM of lidar and an IMU of claim 6, wherein the method further comprises:
when a new state node is added into the factor graph, searching a prior state node which is closest to the new state node in the factor graph in Euclidean space;
and adding the pose transformation relation of the closest prior state node and the state node at the next moment into the factor graph as a closed-loop factor.
8. The method of claim 1, wherein the selecting a candidate closed-loop matching frame from historical keyframes, matching scan data of the lidar with a map to obtain pose transformation, and constructing a closed-loop factor comprises:
constructing a Kdtree from the historical key frames, searching for matching key frames within a specified search radius, and taking the closest key frame whose time interval meets the specified range as the matched frame;
constructing a first feature set of the current historical key frame from its inherent edge points and plane points, and constructing a second feature set of the matched frame from the feature points of the m key frames adjacent before and after the matched frame;
and matching the first characteristic point set and the second characteristic point set based on the appointed iteration times and the minimum iteration threshold, correcting the pose of the current historical key frame according to the matching result, and storing the corresponding relation between the current historical key frame and the matched frame as a closed-loop factor.
9. The method of tightly coupled SLAM of lidar and an IMU of claim 1, wherein the method further comprises:
and displaying the IMU integral data, the point cloud data after distortion correction, the point cloud characteristic information and the optimization result of the factor graph through Rviz.
10. A tightly coupled SLAM system of a lidar and an IMU, the system comprising:
the pre-integration module is used for performing IMU pre-integration based on the data of the laser odometer and the data of the IMU odometer to obtain IMU integration data;
the correction extraction module is used for carrying out motion distortion correction on the point cloud data generated by the laser radar to obtain the point cloud data after the distortion correction, and extracting point cloud characteristic information from the point cloud data after the distortion correction;
the closed-loop factor determination module is used for selecting a candidate closed-loop matching frame from historical key frames, matching the scanning data of the laser radar with a map to obtain pose transformation and constructing a closed-loop factor;
and the optimization module is used for adding the IMU integral data, the data of the laser odometer and the closed-loop factor as factors of a graph optimization algorithm for the point cloud characteristic information into a factor graph, optimizing the factor graph, and updating the poses of all key frames to obtain a reconstructed map.
CN202210521720.7A 2022-05-13 2022-05-13 Tightly-coupled SLAM method and system for laser radar and IMU Pending CN115963508A (en)

Publications (1)

Publication Number Publication Date
CN115963508A (en) 2023-04-14

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117974926A (en) * 2024-03-28 2024-05-03 苏州艾吉威机器人有限公司 Positioning and mapping method and system based on mechanical rotation type three-dimensional laser

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination