CN117387598A - Tightly-coupled lightweight real-time positioning and mapping method - Google Patents

Tightly-coupled lightweight real-time positioning and mapping method

Info

Publication number
CN117387598A
CN117387598A (Application No. CN202311296493.3A)
Authority
CN
China
Prior art keywords
point cloud
state
points
unmanned
motion system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311296493.3A
Other languages
Chinese (zh)
Inventor
张金会
魏嘉桐
吕千一
李思杭
孟焕
蔡吉山
邵之玥
赵凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202311296493.3A
Publication of CN117387598A
Legal status: Pending

Classifications

    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G01C21/1652: Dead reckoning by integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G01C21/3804: Creation or updating of map data
    • G01C21/3811: Point data, e.g. Point of Interest [POI]
    • G01C21/3841: Map data obtained from two or more sources, e.g. probe vehicles
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G06T17/05: Three-dimensional modelling; geographic models
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/277: Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a tightly-coupled lightweight real-time positioning and mapping method, which comprises: acquiring a state predicted value of an unmanned motion system through an Inertial Measurement Unit (IMU) and acquiring an original point cloud of the unmanned motion system through a laser radar; preprocessing the original point cloud; performing motion compensation on the preprocessed original point cloud using the state predicted value to obtain a de-distorted point cloud; combining the state predicted value and the de-distorted point cloud and updating the state of the unmanned motion system through Kalman filtering to obtain an optimal state estimate; and updating the map according to the optimal state estimate. Based on Kalman filtering, the method tightly couples the laser radar and inertial navigation, enabling accurate and real-time motion estimation and mapping.

Description

Tightly-coupled lightweight real-time positioning and mapping method
Technical Field
The invention belongs to the field of positioning and mapping of unmanned systems.
Background
For automatic driving tasks, high-precision positioning information is particularly important as the input to subsequent modules such as decision-making and planning. In the positioning field, and particularly in high-speed driving scenarios, high-frequency motion estimation is indispensable. To this end, active sensors such as lidar have been adopted to accomplish this task, known as simultaneous localization and mapping (SLAM). Key advantages of a typical 3D lidar include a wide horizontal field of view and invariance to ambient lighting conditions; however, lidar-based navigation systems are sensitive to the surrounding environment. Furthermore, motion distortion and the sparsity of the point cloud can degrade the performance of the algorithm in certain challenging scenarios (e.g., wide open areas).
Recent studies have shown that the shortcomings of lidar can be remedied by fusing an IMU. Unlike lidar, an IMU is not sensitive to the environment: it provides accurate short-term motion constraints and typically operates at a high frequency (e.g., 100 Hz to 500 Hz). These properties can help a lidar navigation system recover the point cloud from highly dynamic motion distortion, thereby improving accuracy.
Disclosure of Invention
The present invention aims to solve at least one of the technical problems in the related art to some extent.
Therefore, the invention aims to provide a tightly-coupled lightweight real-time positioning and mapping method which is used for helping a laser radar navigation system to recover point clouds from highly dynamic motion distortion, thereby improving accuracy.
To achieve the above objective, an embodiment of the first aspect of the present invention provides a tightly-coupled lightweight real-time positioning and mapping method, comprising:
acquiring a state predicted value of the unmanned motion system through an Inertial Measurement Unit (IMU), and acquiring an original point cloud of the unmanned motion system through a laser radar;
preprocessing the original point cloud; performing motion compensation on the preprocessed original point cloud through the state predicted value to obtain a de-distorted point cloud;
combining the state predicted value and the de-distortion point cloud, and carrying out state update on the unmanned motion system through Kalman filtering to obtain an optimal state estimation;
and updating the map according to the optimal state estimation.
In addition, the method for positioning and mapping in real time of tight coupling lightweight according to the above embodiment of the present invention may further have the following additional technical features:
further, in an embodiment of the present invention, the acquiring, by the inertial measurement unit IMU, a state prediction value of the unmanned motion system includes:
acquiring acceleration and angular velocity data of the unmanned motion system in real time through the IMU;
integrating the acceleration and angular velocity data and combining them with the state estimate and the motion equation at the previous moment to obtain the position, velocity and attitude of the unmanned motion system at the current moment, which are taken as initial values for subsequent optimization; and meanwhile, updating the covariance matrix of the state estimate according to the noise model of the unmanned motion system to quantify the uncertainty of the prediction.
Further, in an embodiment of the present invention, the preprocessing the original point cloud includes:
performing point cloud segmentation and feature extraction on the original point cloud; wherein,
the point cloud segmentation comprises re-projecting the original point cloud into a distance image, marking points on a scan line that fall within a preset angle range as ground points, which do not participate in subsequent segmentation; segmenting the non-ground points of the distance image into a plurality of clusters, removing clusters containing few points as noise, and marking the points in the remaining clusters as segmentation points;
the feature extraction comprises extracting features from the ground points and the segmentation points; specifically, for each point $P_i$ of the point cloud $P_t$ at time t, the 5 points on its left and the 5 points on its right in the distance image form a point set S, and the roughness is calculated by the following formula:

$c = \dfrac{1}{|S| \cdot r_i}\,\Bigg|\sum_{j \in S,\, j \neq i}\big(r_j - r_i\big)\Bigg|$

where $r_i$ represents the distance from $P_i$ to the sensor and $r_j$ represents the distance from $P_j$ to the sensor;
dividing the distance image horizontally into a plurality of sub-images; and selecting, from the segmentation points of each row in each sub-image, several points with the largest roughness as edge feature points, and selecting, from the ground points or segmentation points of each row in each sub-image, several points with the smallest roughness as planar feature points.
Further, in an embodiment of the present invention, the motion compensation of the preprocessed original point cloud by the state prediction value includes:
transforming the point cloud according to the state predicted value and projecting each point to its position at the extraction moment, thereby removing the distortion of the point cloud within one frame.
Further, in an embodiment of the present invention, the performing, by kalman filtering, the state update on the unmanned motion system to obtain the optimal state estimate includes:
comparing the undistorted point cloud with a predicted state to obtain a measurement residual;
calculating Kalman gain according to covariance matrixes of measurement residual errors and measurement noise, and weighing predicted values and measured values;
carrying out weighted average on the predicted value and the measurement residual error by using Kalman gain to obtain an optimized state estimation value, and updating a covariance matrix;
continuously optimizing the state estimation of the point cloud; and finally, continuously converging the state estimation of the unmanned motion system to an optimal value through Kalman filtering.
Further, in an embodiment of the present invention, the updating the map according to the optimal state estimation includes:
converting the radar feature points ${}^{L_k}\bar{p}_j$ to the world coordinate system according to the optimal state estimate:

${}^{G}\bar{p}_j = {}^{G}\bar{T}_{I_k}\, {}^{I}T_L\, {}^{L_k}\bar{p}_j$

where ${}^{G}\bar{p}_j$ represents the optimal estimate of the world coordinates of a radar feature point, ${}^{G}\bar{T}_{I_k}$ represents the optimal pose estimate at time $t_k$, ${}^{I}T_L$ represents the relative pose of the IMU and the radar, and ${}^{L_k}\bar{p}_j$ represents the position of the feature point obtained by the j-th scan of the radar transformed to the radar coordinate system at time $t_k$;

and adding the feature points ${}^{G}\bar{p}_j$ to the existing map.
In order to achieve the above objective, a second aspect of the present invention provides a tightly coupled lightweight real-time positioning and mapping device, which includes the following modules:
the acquisition module is used for acquiring a state predicted value of the unmanned motion system through the Inertial Measurement Unit (IMU), and acquiring an original point cloud of the unmanned motion system through a laser radar;
the motion compensation module is used for preprocessing the original point cloud; performing motion compensation on the preprocessed original point cloud through the state predicted value to obtain a de-distorted point cloud;
the state updating module is used for carrying out state updating on the unmanned motion system through Kalman filtering by combining the state predicted value and the de-distortion point cloud to obtain an optimal state estimation;
and the map updating module is used for updating the map according to the optimal state estimation.
To achieve the above object, an embodiment of the present invention provides a computer device, which is characterized by comprising a memory, a processor and a computer program stored in the memory and capable of running on the processor, wherein the processor implements a tightly coupled lightweight real-time positioning and mapping method as described above when executing the computer program.
To achieve the above object, a fourth aspect of the present invention provides a computer readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements a tightly coupled lightweight real-time positioning and mapping method as described above.
According to the tightly-coupled lightweight real-time positioning and mapping method provided by the embodiment of the invention, firstly, the state predicted value is obtained through the IMU to carry out state propagation, and meanwhile, the input radar point cloud is preprocessed, including point cloud segmentation and feature extraction. And then carrying out motion compensation on the preprocessed point cloud through the state predicted value to remove distortion. Next, the state is updated by kalman filtering to obtain an optimal state estimate. Finally, the map is updated and managed, and the updated map is used for registering new points in the next round. The method is based on Kalman filtering, and the laser radar and inertial navigation are tightly coupled, so that accurate and real-time motion estimation and mapping can be realized.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
fig. 1 is a flow chart of a method for positioning and mapping in real time of tight coupling lightweight according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a time relationship between IMU input and radar point cloud input in a radar frame time provided in an embodiment of the present invention.
FIG. 3 is a flow chart of a method for tightly coupling lightweight real-time localization and mapping according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a tightly coupled lightweight real-time positioning and mapping device according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present invention and should not be construed as limiting the invention.
The following describes a tightly coupled lightweight real-time localization and mapping method of an embodiment of the present invention with reference to the accompanying drawings.
Fig. 1 is a flow chart of a method for positioning and mapping in real time of tight coupling lightweight according to an embodiment of the present invention.
As shown in fig. 1, the method for positioning and mapping the tight-coupling lightweight in real time comprises the following steps:
s101: acquiring a state predicted value of the unmanned motion system through an Inertial Measurement Unit (IMU), and acquiring an original point cloud of the unmanned motion system through a laser radar;
s102: preprocessing an original point cloud; performing motion compensation on the preprocessed original point cloud through a state predicted value to obtain a de-distorted point cloud;
s103: combining the state predicted value and the de-distortion point cloud, and carrying out state update on the unmanned motion system through Kalman filtering to obtain an optimal state estimation;
s104: and updating the map according to the optimal state estimation.
Further, in an embodiment of the present invention, the acquiring, by the inertial measurement unit IMU, a state prediction value of the unmanned motion system includes:
acquiring acceleration and angular velocity data of the unmanned motion system in real time through the IMU;
integrating the acceleration and angular velocity data and combining them with the state estimate and the motion equation at the previous moment to obtain the position, velocity and attitude of the unmanned motion system at the current moment, which are taken as initial values for subsequent optimization; and meanwhile, updating the covariance matrix of the state estimate according to the noise model of the unmanned motion system to quantify the uncertainty of the prediction.
Specifically, a state x of the unmanned system is first defined as

$x = \big[\,{}^{G}R_I^{\top}\ \ {}^{G}p_I^{\top}\ \ {}^{G}v_I^{\top}\ \ b_{\omega}^{\top}\ \ b_{a}^{\top}\ \ {}^{G}g^{\top}\,\big]^{\top}$

where the I system is the IMU coordinate system and also the coordinate system of the unmanned system, the G system is the world coordinate system, and the first IMU coordinate system is taken as the G system. ${}^{G}R_I$, ${}^{G}p_I$ and ${}^{G}v_I$ are respectively the attitude, position and velocity of the unmanned system in world coordinates, $b_{\omega}$ and $b_{a}$ are respectively the biases of the IMU angular velocity and acceleration measurements, and ${}^{G}g$ is the unknown gravitational acceleration in the world coordinate system. The IMU input u is

$u = \big[\,\omega_m^{\top}\ \ a_m^{\top}\,\big]^{\top}$

where $\omega_m$ represents the IMU measurement of angular velocity and $a_m$ represents the IMU measurement of acceleration. The time relationship between the IMU inputs and the radar point cloud input within one radar frame is shown in fig. 2.

As shown in fig. 2, the acquisition time of the k-th radar frame is $[t_{k-1}, t_k]$; within the k-th frame, the times of the IMU inputs are denoted $\tau_1, \tau_2, \ldots, \tau_i, \tau_{i+1}, \ldots$, and the times of the radar scans are denoted $\rho_1, \rho_2, \ldots, \rho_{j-1}, \rho_j, \ldots, \rho_m$, where $\rho_{j-1} \in [\tau_{i-1}, \tau_i)$.

Each time an IMU input is received, the state is propagated once, as follows:

$\hat{x}_{\tau_{i+1}} = \hat{x}_{\tau_i} \boxplus \big(\Delta t\, f(\hat{x}_{\tau_i}, u_i)\big), \qquad \hat{x}_{\tau_0} = \bar{x}_{t_{k-1}}$

where $\hat{x}_{\tau_{i+1}}$ denotes the state propagated to time $\tau_{i+1}$, $\hat{x}_{\tau_i}$ denotes the state propagated to time $\tau_i$, $\Delta t = \tau_{i+1} - \tau_i$ denotes the time difference between two consecutive IMU inputs, $u_i$ denotes the IMU measurement at time $\tau_i$, $\bar{x}_{t_{k-1}}$ denotes the optimal estimate of the state at time $t_{k-1}$, and $\boxplus$ denotes a defined operation on the state.

The covariance $\hat{P}$ of the state error propagates as follows:

$\hat{P}_{\tau_{i+1}} = F_{\tilde{x}}\, \hat{P}_{\tau_i}\, F_{\tilde{x}}^{\top} + F_w\, Q\, F_w^{\top}$

where $\hat{P}_{\tau_{i+1}}$ denotes the covariance of the error of $\hat{x}_{\tau_{i+1}}$, $\hat{P}_{\tau_i}$ denotes the covariance of the error of $\hat{x}_{\tau_i}$, $\bar{P}_{t_{k-1}}$ denotes the covariance of the error of $\bar{x}_{t_{k-1}}$ and serves as the initial value, $Q$ denotes the covariance of the IMU noise, and $F_{\tilde{x}}$ and $F_w$ denote defined matrices (the Jacobians of the error-state dynamics), evaluated using ${}^{G}\hat{R}_{I_{\tau_i}}$, the attitude of the unmanned system in world coordinates at time $\tau_i$, the time difference $\Delta t = \tau_{i+1} - \tau_i$, the IMU angular velocity estimate $\hat{\omega}_i$ at time $\tau_i$, and the IMU acceleration estimate $\hat{a}_i$ at time $\tau_i$.
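For illustration only, the forward propagation described above can be sketched in Python as follows. This is not the patented implementation: the 18-dimensional error-state ordering, the helper functions so3_exp and skew, and the simplified Jacobian blocks F_x and F_w are assumptions made for the example.

```python
import numpy as np

def skew(x):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0, -x[2], x[1]],
                     [x[2], 0, -x[0]],
                     [-x[1], x[0], 0]])

def so3_exp(phi):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3)
    K = skew(phi / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def imu_propagate(state, P, omega_m, a_m, dt, Q):
    """One forward-propagation step between two consecutive IMU samples.

    state: dict with rotation R (3x3), position p, velocity v, gyro bias bg,
           accel bias ba and gravity g, all expressed in the world frame G.
    P: 18x18 error-state covariance; Q: 12x12 IMU noise covariance.
    Error-state order assumed: [attitude, position, velocity, bg, ba, g].
    """
    R, p, v = state["R"], state["p"], state["v"]
    bg, ba, g = state["bg"], state["ba"], state["g"]
    w = omega_m - bg                     # bias-corrected angular rate
    a = a_m - ba                         # bias-corrected specific force

    # Nominal-state propagation: x_{i+1} = x_i [+] dt * f(x_i, u_i)
    nxt = dict(state)
    nxt["p"] = p + v * dt
    nxt["v"] = v + (R @ a + g) * dt
    nxt["R"] = R @ so3_exp(w * dt)

    # Simplified error-state Jacobians.
    F_x = np.eye(18)
    F_x[0:3, 0:3] = so3_exp(-w * dt)     # attitude error w.r.t. attitude error
    F_x[0:3, 9:12] = -np.eye(3) * dt     # attitude error w.r.t. gyro bias
    F_x[3:6, 6:9] = np.eye(3) * dt       # position error w.r.t. velocity error
    F_x[6:9, 0:3] = -R @ skew(a) * dt    # velocity error w.r.t. attitude error
    F_x[6:9, 12:15] = -R * dt            # velocity error w.r.t. accel bias
    F_x[6:9, 15:18] = np.eye(3) * dt     # velocity error w.r.t. gravity error

    F_w = np.zeros((18, 12))
    F_w[0:3, 0:3] = -np.eye(3) * dt      # gyro noise
    F_w[6:9, 3:6] = -R * dt              # accel noise
    F_w[9:12, 6:9] = np.eye(3) * dt      # gyro bias random walk
    F_w[12:15, 9:12] = np.eye(3) * dt    # accel bias random walk

    P_next = F_x @ P @ F_x.T + F_w @ Q @ F_w.T
    return nxt, P_next
```

In the method above, a step of this kind would be repeated for every IMU input time τ_i within the radar frame [t_{k-1}, t_k], starting from the optimal estimate at t_{k-1}.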
Further, in an embodiment of the present invention, the preprocessing the original point cloud includes:
performing point cloud segmentation and feature extraction on the original point cloud; wherein,
the point cloud segmentation comprises re-projecting the original point cloud into a distance image, marking points on a scan line that fall within a preset angle range as ground points, which do not participate in subsequent segmentation; segmenting the non-ground points of the distance image into a plurality of clusters, removing clusters containing few points as noise, and marking the points in the remaining clusters as segmentation points;
the feature extraction comprises extracting features from the ground points and the segmentation points; specifically, for each point $P_i$ of the point cloud $P_t$ at time t, the 5 points on its left and the 5 points on its right in the distance image form a point set S, and the roughness is calculated by:

$c = \dfrac{1}{|S| \cdot r_i}\,\Bigg|\sum_{j \in S,\, j \neq i}\big(r_j - r_i\big)\Bigg|$

where $r_i$ represents the distance from $P_i$ to the sensor and $r_j$ represents the distance from $P_j$ to the sensor;
dividing the distance image horizontally into a plurality of sub-images; and selecting, from the segmentation points of each row in each sub-image, several points with the largest roughness as edge feature points, and selecting, from the ground points or segmentation points of each row in each sub-image, several points with the smallest roughness as planar feature points.
Specifically, the point cloud segmentation converts the three-dimensional point cloud into a two-dimensional image whose horizontal resolution is 360° divided by the horizontal angular resolution of the radar, whose vertical resolution is the number of radar scan lines, and whose pixel value is the distance from the point to the sensor. Points on scan lines whose vertical angle falls within a preset range (e.g., [-15°, -1°]) are marked as ground points and do not participate in the subsequent segmentation.
Subsequently, the non-ground points of the distance image are segmented into a plurality of clusters, the clusters with fewer points are removed as noise, and the points in the remaining clusters are marked as segmented points.
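As a rough illustration of this preprocessing stage, the following Python sketch projects a point cloud into a distance image, marks a ground band, and computes the roughness of a point from its row neighbours. The 16-line radar, the ±15° vertical field of view, the ground band [-15°, -1°] and all function names are assumptions for the example, not parameters fixed by the patent.

```python
import numpy as np

def build_range_image(points, h_res_deg=0.2, n_lines=16,
                      fov_up_deg=15.0, fov_down_deg=-15.0):
    """Project an (N,3) point cloud into a 2-D distance (range) image.

    Width = 360 deg / horizontal resolution, height = number of scan lines,
    pixel value = distance from the point to the sensor.
    """
    width = int(360.0 / h_res_deg)
    r = np.linalg.norm(points, axis=1)                                # range of each point
    yaw = np.degrees(np.arctan2(points[:, 1], points[:, 0]))          # [-180, 180]
    pitch = np.degrees(np.arcsin(points[:, 2] / np.maximum(r, 1e-9)))

    col = np.clip(((yaw + 180.0) / 360.0 * width).astype(int), 0, width - 1)
    row = np.clip(((fov_up_deg - pitch) / (fov_up_deg - fov_down_deg)
                   * (n_lines - 1)).astype(int), 0, n_lines - 1)

    image = np.zeros((n_lines, width))
    image[row, col] = r

    # Mark pixels whose vertical angle lies in the ground band, e.g. [-15 deg, -1 deg].
    gb = (pitch >= -15.0) & (pitch <= -1.0)
    ground_mask = np.zeros((n_lines, width), dtype=bool)
    ground_mask[row[gb], col[gb]] = True
    return image, ground_mask

def roughness(ranges, i, half_window=5):
    """Roughness of point i from the 5 neighbours on each side in the same row."""
    lo, hi = max(0, i - half_window), min(len(ranges), i + half_window + 1)
    neighbours = np.concatenate([ranges[lo:i], ranges[i + 1:hi]])
    if len(neighbours) == 0 or ranges[i] <= 0:
        return 0.0
    return abs(np.sum(neighbours - ranges[i])) / (len(neighbours) * ranges[i])
```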
Further, in an embodiment of the present invention, the motion compensation of the preprocessed original point cloud by the state prediction value includes:
transforming the point cloud according to the state predicted value and projecting each point to its position at the extraction moment, thereby removing the distortion of the point cloud within one frame.
Specifically, the original point cloud is converted from its own acquisition time to the end time of the frame by the backward propagation of the following three formulas:

${}^{I_k}\hat{p}_{\rho_{j-1}} = {}^{I_k}\hat{p}_{\rho_j} - {}^{I_k}\hat{v}_{\rho_j}\,\Delta t$

where ${}^{I_k}\hat{p}_{\rho_{j-1}}$ represents the position of the unmanned system at $\rho_{j-1}$ in the $I_k$ coordinate system, ${}^{I_k}\hat{p}_{\rho_j}$ represents its position at $\rho_j$ in the $I_k$ coordinate system, ${}^{I_k}\hat{v}_{\rho_j}$ represents its velocity at $\rho_j$ in the $I_k$ coordinate system, $\Delta t$ represents the time interval between two scans of the radar, and ${}^{I_k}\hat{p}_{\rho_m}$, the position of the unmanned system at $\rho_m$ in the $I_k$ coordinate system, serves as the starting value;

${}^{I_k}\hat{v}_{\rho_{j-1}} = {}^{I_k}\hat{v}_{\rho_j} - {}^{I_k}\hat{R}_{\rho_j}\big(a_{m,i-1} - \hat{b}_{a_k}\big)\Delta t - {}^{I_k}\hat{g}_k\,\Delta t$

where ${}^{I_k}\hat{v}_{\rho_{j-1}}$ represents the velocity of the unmanned system at $\rho_{j-1}$ in the $I_k$ coordinate system, ${}^{I_k}\hat{v}_{\rho_j}$ represents its velocity at $\rho_j$ in the $I_k$ coordinate system, ${}^{I_k}\hat{R}_{\rho_j}$ represents its attitude at $\rho_j$ in the $I_k$ coordinate system, $a_{m,i-1}$ represents the IMU measurement of acceleration at $\tau_{i-1}$, $\hat{b}_{a_k}$ represents the estimate of the IMU acceleration measurement bias in the k-th radar frame, $\Delta t$ represents the time interval between two scans of the radar, ${}^{I_k}\hat{g}_k$ represents the gravitational acceleration in the $I_k$ coordinate system in the k-th radar frame, and ${}^{I_k}\hat{v}_{\rho_m}$, the velocity of the unmanned system at $\rho_m$ in the $I_k$ coordinate system, serves as the starting value;

${}^{I_k}\hat{R}_{\rho_{j-1}} = {}^{I_k}\hat{R}_{\rho_j}\,\mathrm{Exp}\big(-\big(\omega_{m,i-1} - \hat{b}_{\omega_k}\big)\Delta t\big)$

where ${}^{I_k}\hat{R}_{\rho_{j-1}}$ represents the attitude of the unmanned system at $\rho_{j-1}$ in the $I_k$ coordinate system, ${}^{I_k}\hat{R}_{\rho_j}$ represents its attitude at $\rho_j$ in the $I_k$ coordinate system, $\omega_{m,i-1}$ represents the IMU measurement of angular velocity at $\tau_{i-1}$, $\hat{b}_{\omega_k}$ represents the estimate of the IMU angular velocity measurement bias in the k-th radar frame, $\Delta t$ represents the time interval between two scans of the radar, and ${}^{I_k}\hat{R}_{\rho_m}$, the attitude of the unmanned system at $\rho_m$ in the $I_k$ coordinate system, serves as the starting value.
After these quantities are obtained, the pose transformation ${}^{I_k}\check{T}_{I_{\rho_j}}$ of the unmanned system at $\rho_j$ relative to time $t_k$ is obtained, and the feature point obtained by the j-th scan of the radar is thereby converted to its position in the radar coordinate system at time $t_k$:

${}^{L_k}\hat{p}_j = \big({}^{I}T_L\big)^{-1}\, {}^{I_k}\check{T}_{I_{\rho_j}}\, {}^{I}T_L\, {}^{L_j}p_j$

where the relative pose ${}^{I}T_L$ of the IMU and the radar is known, ${}^{I_k}\check{T}_{I_{\rho_j}}$ represents the pose change of the unmanned system at $\rho_j$ relative to time $t_k$, and ${}^{L_j}p_j$ represents the feature point obtained by the j-th scan of the radar.
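A compact sketch of this undistortion step is given below. It assumes a user-supplied callable pose_at(t) that returns the pose of the IMU at time t relative to the IMU frame at the scan end (for example, obtained from the backward propagation above), together with known lidar-to-IMU extrinsics; it illustrates the projection ${}^{L_k}\hat{p}_j = ({}^{I}T_L)^{-1}\,{}^{I_k}\check{T}_{I_{\rho_j}}\,{}^{I}T_L\,{}^{L_j}p_j$ rather than reproducing the patented implementation.

```python
import numpy as np

def undistort_scan(points, timestamps, pose_at, R_IL, p_IL):
    """Project each raw lidar point to the radar frame at the scan-end time.

    points:     (N,3) raw points, each expressed in the lidar frame at its own time
    timestamps: (N,) acquisition time of each point
    pose_at:    callable t -> (R_rel, p_rel), pose of the IMU at time t relative
                to the IMU frame at the scan-end time t_k
    R_IL, p_IL: known lidar-to-IMU extrinsic rotation and translation (I_T_L)
    """
    R_LI = R_IL.T
    p_LI = -R_IL.T @ p_IL                      # inverse extrinsic (L_T_I)
    out = np.empty_like(points)
    for n, (pt, t) in enumerate(zip(points, timestamps)):
        R_rel, p_rel = pose_at(t)              # I_k <- I_{rho_j}
        p_imu = R_IL @ pt + p_IL               # lidar point -> IMU frame at time t
        p_imu_end = R_rel @ p_imu + p_rel      # -> IMU frame at scan end
        out[n] = R_LI @ p_imu_end + p_LI       # -> lidar frame at scan end
    return out
```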
Further, in an embodiment of the present invention, the performing, by kalman filtering, the state update on the unmanned motion system to obtain the optimal state estimate includes:
comparing the undistorted point cloud with a predicted state to obtain a measurement residual;
calculating Kalman gain according to covariance matrixes of measurement residual errors and measurement noise, and weighing predicted values and measured values;
carrying out weighted average on the predicted value and the measurement residual error by using Kalman gain to obtain an optimized state estimation value, and updating a covariance matrix;
continuously optimizing the state estimation of the point cloud; and finally, continuously converging the state estimation of the unmanned motion system to an optimal value through Kalman filtering.
First, an initial value is set.
Let κ = −1, where κ denotes the number of iterations, and take as the initial value the state $\hat{x}_{t_k}$ propagated to time $t_k$.
Second, the calculation is iterated.

Let κ = κ + 1, where κ represents the number of iterations.

Recompute $\hat{x}_{t_k}^{\kappa} \boxminus \hat{x}_{t_k}$, where $\hat{x}_{t_k}^{\kappa}$ is the state obtained at the end of the previous iteration and $\hat{x}_{t_k}$ is obtained from the IMU state prediction calculation; then calculate the covariance P used in the update below from the propagated covariance.
Next, the residual is calculated. The original feature points are converted to the world coordinate system according to

${}^{G}\hat{p}_j^{\kappa} = {}^{G}\hat{T}_{I_k}^{\kappa}\, {}^{I}T_L\, {}^{L_k}\hat{p}_j$

where ${}^{G}\hat{p}_j^{\kappa}$ represents the world coordinates of the feature point obtained by the j-th scan of the radar, as calculated in the κ-th iteration of the Kalman filter, ${}^{G}\hat{T}_{I_k}^{\kappa}$ represents the pose at time $t_k$ calculated in the κ-th iteration, ${}^{I}T_L$ represents the relative pose of the IMU and the radar, and ${}^{L_k}\hat{p}_j$ represents the feature point obtained by the j-th scan of the radar transformed to the radar coordinate system at time $t_k$. For each lidar feature point, the residual is defined as the distance from the estimated world coordinate of the feature point to the nearest plane (or edge) in the map:

$z_j^{\kappa} = G_j^{\top}\big({}^{G}\hat{p}_j^{\kappa} - {}^{G}q_j\big)$

where $z_j^{\kappa}$ is the residual, $G_j$ is the normal vector (or direction vector) of the corresponding plane (or edge), ${}^{G}\hat{p}_j^{\kappa}$ represents the estimated world coordinates of the feature point, and ${}^{G}q_j$ represents a point on the corresponding plane (or edge). The nearest points in the map are found by building a KD-tree of the most recent map points. Only residuals whose norm is below a certain threshold are considered; residuals exceeding the threshold correspond either to outliers or to newly observed points.
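The residual construction can be sketched as follows. This is an illustrative example only: the neighbour count, the outlier threshold and the plane fit via the eigenvector of the smallest eigenvalue are assumptions for the sketch, not values prescribed by the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def point_to_plane_residuals(feat_world, map_points, map_tree,
                             k_neighbors=5, max_residual=0.5):
    """Point-to-plane residuals against the nearest local plane in the map.

    feat_world: (N,3) feature points already transformed to world coordinates
    map_points: (M,3) map points; map_tree: cKDTree built over map_points
    Returns lists of residuals z_j, plane normals G_j and plane points q_j.
    """
    residuals, normals, plane_pts = [], [], []
    for p in feat_world:
        _, idx = map_tree.query(p, k=k_neighbors)     # nearest map points
        nbrs = map_points[idx]
        q = nbrs.mean(axis=0)                         # a point on the local plane
        # Plane normal = eigenvector of the smallest eigenvalue of the covariance.
        cov = np.cov((nbrs - q).T)
        eigval, eigvec = np.linalg.eigh(cov)
        g = eigvec[:, 0]
        z = float(g @ (p - q))                        # signed point-to-plane distance
        if abs(z) > max_residual:                     # outlier or newly observed point
            continue
        residuals.append(z)
        normals.append(g)
        plane_pts.append(q)
    return residuals, normals, plane_pts
```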
Then, the measurement matrix $H_j^{\kappa}$ is calculated such that $z_j^{\kappa} = H_j^{\kappa}\,\tilde{x}_{t_k}^{\kappa} + v_j$, where $z_j^{\kappa}$ is the residual, $v_j$ is the measurement noise from the radar, and $\tilde{x}_{t_k}^{\kappa}$ represents the error of the state at time $t_k$ calculated in the κ-th iteration of the Kalman filter.
Next, the state update is calculated. Let

$H = \big[(H_1^{\kappa})^{\top}, \ldots, (H_m^{\kappa})^{\top}\big]^{\top}, \qquad R = \mathrm{diag}(R_1, \ldots, R_m), \qquad z^{\kappa} = \big[z_1^{\kappa}, \ldots, z_m^{\kappa}\big]^{\top}$

and compute the Kalman gain

$K = P H^{\top}\big(H P H^{\top} + R\big)^{-1}$

the state is then updated as

$\hat{x}_{t_k}^{\kappa+1} = \hat{x}_{t_k}^{\kappa} \boxplus \Big(-K z^{\kappa} - (I - K H)\big(\hat{x}_{t_k}^{\kappa} \boxminus \hat{x}_{t_k}\big)\Big)$

where $\hat{x}_{t_k}^{\kappa+1}$ represents the state at time $t_k$ calculated in the (κ+1)-th iteration of the Kalman filter, $\hat{x}_{t_k}^{\kappa}$ represents the state at time $t_k$ calculated in the κ-th iteration, $z^{\kappa}$ represents the residual at time $t_k$ calculated in the κ-th iteration, and $\hat{x}_{t_k}$ represents the state propagated to time $t_k$. The Kalman gain can equivalently be computed as

$K = \big(H^{\top} R^{-1} H + P^{-1}\big)^{-1} H^{\top} R^{-1}$
Finally, convergence is judged.

If $\big\|\hat{x}_{t_k}^{\kappa+1} \boxminus \hat{x}_{t_k}^{\kappa}\big\| < \epsilon$ (where ε is a preset small value), the iteration is stopped and the optimal estimate of the state, $\bar{x}_{t_k} = \hat{x}_{t_k}^{\kappa+1}$, is output.
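A minimal sketch of this iterated update is shown below. It assumes generic boxplus/boxminus operators for the state and a user-supplied build_measurement function that stacks the residuals, their Jacobians and the measurement-noise covariance (for example, by calling a residual routine such as the KD-tree sketch above for every feature point); it omits the Jacobian of the manifold parametrization, so it illustrates the scheme rather than the exact patented update.

```python
import numpy as np

def iterated_kalman_update(x_pred, P_pred, build_measurement,
                           boxplus, boxminus, max_iter=5, eps=1e-3):
    """Iterated error-state Kalman update sketch.

    x_pred, P_pred:    propagated state and covariance at t_k
    build_measurement: callable x -> (z, H, R) stacking all feature residuals,
                       their Jacobians and the measurement-noise covariance
    boxplus/boxminus:  manifold operators for the state
    """
    x = x_pred
    for _ in range(max_iter):
        z, H, R = build_measurement(x)
        # Gain in information form: K = (H^T R^-1 H + P^-1)^-1 H^T R^-1
        S = H.T @ np.linalg.solve(R, H) + np.linalg.inv(P_pred)
        K = np.linalg.solve(S, H.T @ np.linalg.inv(R))
        dx_prior = boxminus(x, x_pred)               # deviation from the prediction
        delta = -K @ z - (np.eye(len(dx_prior)) - K @ H) @ dx_prior
        x_new = boxplus(x, delta)
        if np.linalg.norm(boxminus(x_new, x)) < eps: # convergence test
            x = x_new
            break
        x = x_new
    P_upd = (np.eye(P_pred.shape[0]) - K @ H) @ P_pred
    return x, P_upd
```

Because each update only involves the propagated state and covariance at the current time, a step of this kind keeps the per-frame cost bounded, which is what gives the filtering back end its real-time character.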
Further, in one embodiment of the present invention, updating the map according to the optimal state estimate includes:
The radar feature points ${}^{L_k}\bar{p}_j$ are converted to the world coordinate system according to the optimal state estimate:

${}^{G}\bar{p}_j = {}^{G}\bar{T}_{I_k}\, {}^{I}T_L\, {}^{L_k}\bar{p}_j$

where ${}^{G}\bar{p}_j$ represents the optimal estimate of the world coordinates of a radar feature point, ${}^{G}\bar{T}_{I_k}$ represents the optimal pose estimate at time $t_k$, ${}^{I}T_L$ represents the relative pose of the IMU and the radar, and ${}^{L_k}\bar{p}_j$ represents the position of the feature point obtained by the j-th scan of the radar transformed to the radar coordinate system at time $t_k$;

the feature points ${}^{G}\bar{p}_j$ are then added to the existing map.
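For completeness, a minimal sketch of the map update is given below; the function name and the full KD-tree rebuild are assumptions for the example (an incremental tree, as typically used in practice, would avoid rebuilding from scratch).

```python
import numpy as np
from scipy.spatial import cKDTree

def update_map(map_points, feat_lidar_k, R_opt, p_opt, R_IL, p_IL):
    """Add newly registered feature points to the map.

    feat_lidar_k:   (N,3) feature points in the radar frame at time t_k
    (R_opt, p_opt): optimal IMU pose estimate at t_k in the world frame
    (R_IL, p_IL):   known lidar-to-IMU extrinsics (I_T_L)
    Returns the enlarged map and a KD-tree over it for registering the next scan.
    """
    # G_p = G_T_Ik * I_T_L * Lk_p  (compose rotations and translations)
    R_WL = R_opt @ R_IL
    p_WL = R_opt @ p_IL + p_opt
    feat_world = feat_lidar_k @ R_WL.T + p_WL       # transform all points at once
    map_points = np.vstack([map_points, feat_world])
    return map_points, cKDTree(map_points)
```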
The above is a complete process flow of the tight coupling lightweight real-time positioning and mapping method, and fig. 3 is a schematic diagram of the technical route of the present invention.
According to the tightly-coupled lightweight real-time positioning and mapping method provided by the embodiment of the invention, the IMU is used for obtaining the state predicted value to perform state propagation, and meanwhile, the input radar point cloud is subjected to preprocessing operation, including point cloud segmentation and feature extraction. And then carrying out motion compensation on the preprocessed point cloud through the state predicted value to remove distortion. Next, the state is updated by kalman filtering to obtain an optimal state estimate. Finally, the map is updated and managed, and the updated map is used for registering new points in the next round. The method is based on Kalman filtering, and the laser radar and inertial navigation are tightly coupled, so that accurate and real-time motion estimation and mapping can be realized.
Compared with the prior art, the invention has the following advantages:
1) Firstly, acquiring a state prediction value through an IMU to perform state propagation, and simultaneously performing preprocessing operation on input radar point clouds, including point cloud segmentation and feature extraction. And then carrying out motion compensation on the preprocessed point cloud through the state predicted value to remove distortion. Next, the state is updated by kalman filtering to obtain an optimal state estimate. Finally, the map is updated and managed, and the updated map is used for registering new points in the next round. The method tightly couples the laser radar and the inertial navigation, and can realize accurate and real-time motion estimation and mapping.
2) The ground points marked before the point cloud segmentation do not participate in the subsequent segmentation, so that the calculated amount is greatly reduced.
3) The motion compensation method is to remove distortion from the preprocessed point cloud by using state prediction values.
4) The back-end optimization adopts a Kalman filtering method, so that the optimal estimation of the current state is only related to the variable at the last moment, and the real-time performance of calculation is improved.
In order to realize the embodiment, the invention also provides a tightly-coupled lightweight real-time positioning and mapping device.
Fig. 4 is a schematic structural diagram of a tightly-coupled lightweight real-time positioning and mapping device according to an embodiment of the present invention.
As shown in fig. 4, the tightly-coupled lightweight real-time positioning and mapping apparatus includes: an acquisition module 100, a motion compensation module 200, a status update module 300, a map update module 400, wherein,
the acquisition module is used for acquiring a state predicted value of the unmanned motion system through the Inertial Measurement Unit (IMU), and acquiring an original point cloud of the unmanned motion system through the laser radar;
the motion compensation module is used for preprocessing the original point cloud; performing motion compensation on the preprocessed original point cloud through a state predicted value to obtain a de-distorted point cloud;
the state updating module is used for carrying out state updating on the unmanned motion system through Kalman filtering by combining the state predicted value and the undistorted point cloud to obtain the optimal state estimation;
and the map updating module is used for updating the map according to the optimal state estimation.
To achieve the above object, an embodiment of the present invention provides a computer device, which is characterized by comprising a memory, a processor, and a computer program stored in the memory and capable of running on the processor, wherein the processor implements the tightly coupled lightweight real-time positioning and mapping method as described above when executing the computer program.
To achieve the above object, a fourth aspect of the present invention provides a computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the tightly-coupled lightweight real-time positioning and mapping method as described above.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present invention, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the invention.

Claims (9)

1. A tightly-coupled lightweight real-time positioning and mapping method, characterized by comprising the following steps:
acquiring a state predicted value of the unmanned motion system through an Inertial Measurement Unit (IMU), and acquiring an original point cloud of the unmanned motion system through a laser radar;
preprocessing the original point cloud; performing motion compensation on the preprocessed original point cloud through the state predicted value to obtain a de-distorted point cloud;
combining the state predicted value and the de-distortion point cloud, and carrying out state update on the unmanned motion system through Kalman filtering to obtain an optimal state estimation;
and updating the map according to the optimal state estimation.
2. The method according to claim 1, wherein the obtaining, by the inertial measurement unit IMU, a state prediction value of the unmanned motion system comprises:
acquiring acceleration and angular velocity data of the unmanned motion system in real time through the IMU;
integrating the acceleration and angular velocity data and combining them with the state estimate and the motion equation at the previous moment to obtain the position, velocity and attitude of the unmanned motion system at the current moment, which are taken as initial values for subsequent optimization; and meanwhile, updating the covariance matrix of the state estimate according to the noise model of the unmanned motion system to quantify the uncertainty of the prediction.
3. The method of claim 1, wherein the preprocessing the original point cloud comprises:
performing point cloud segmentation and feature extraction on the original point cloud; wherein,
the point cloud segmentation comprises re-projecting the original point cloud into a distance image, marking points on a scan line that fall within a preset angle range as ground points, which do not participate in subsequent segmentation; segmenting the non-ground points of the distance image into a plurality of clusters, removing clusters containing few points as noise, and marking the points in the remaining clusters as segmentation points;
the feature extraction comprises extracting features from the ground points and the segmentation points; specifically, for each point $P_i$ of the point cloud $P_t$ at time t, the 5 points on its left and the 5 points on its right in the distance image form a point set S, and the roughness is calculated by the following formula:

$c = \dfrac{1}{|S| \cdot r_i}\,\Bigg|\sum_{j \in S,\, j \neq i}\big(r_j - r_i\big)\Bigg|$

where $r_i$ represents the distance from $P_i$ to the sensor and $r_j$ represents the distance from $P_j$ to the sensor;

dividing the distance image horizontally into a plurality of sub-images; and selecting, from the segmentation points of each row in each sub-image, several points with the largest roughness as edge feature points, and selecting, from the ground points or segmentation points of each row in each sub-image, several points with the smallest roughness as planar feature points.
4. The method of claim 1, wherein performing motion compensation on the preprocessed original point cloud with the state prediction value comprises:
and transforming the point cloud according to the state predicted value, and projecting the point cloud to the position of the extraction moment, so that the distortion of the point cloud in one frame is removed.
5. The method of claim 1, wherein the performing a state update on the unmanned motion system by kalman filtering to obtain an optimal state estimate comprises:
comparing the undistorted point cloud with a predicted state to obtain a measurement residual;
calculating Kalman gain according to covariance matrixes of measurement residual errors and measurement noise, and weighing predicted values and measured values;
carrying out weighted average on the predicted value and the measurement residual error by using Kalman gain to obtain an optimized state estimation value, and updating a covariance matrix;
continuously optimizing the state estimation of the point cloud; and finally, continuously converging the state estimation of the unmanned motion system to an optimal value through Kalman filtering.
6. The method of claim 1, wherein updating the map based on the optimal state estimate comprises:
converting the radar feature points ${}^{L_k}\bar{p}_j$ to the world coordinate system according to the optimal state estimate:

${}^{G}\bar{p}_j = {}^{G}\bar{T}_{I_k}\, {}^{I}T_L\, {}^{L_k}\bar{p}_j$

where ${}^{G}\bar{p}_j$ represents the optimal estimate of the world coordinates of a radar feature point, ${}^{G}\bar{T}_{I_k}$ represents the optimal pose estimate at time $t_k$, ${}^{I}T_L$ represents the relative pose of the IMU and the radar, and ${}^{L_k}\bar{p}_j$ represents the position of the feature point obtained by the j-th scan of the radar transformed to the radar coordinate system at time $t_k$;

and adding the feature points ${}^{G}\bar{p}_j$ to the existing map.
7. The tight coupling lightweight real-time positioning and mapping device is characterized by comprising the following modules:
the acquisition module is used for acquiring a state predicted value of the unmanned motion system through the Inertial Measurement Unit (IMU), and acquiring an original point cloud of the unmanned motion system through a laser radar;
the motion compensation module is used for preprocessing the original point cloud; performing motion compensation on the preprocessed original point cloud through the state predicted value to obtain a de-distorted point cloud;
the state updating module is used for carrying out state updating on the unmanned motion system through Kalman filtering by combining the state predicted value and the de-distortion point cloud to obtain an optimal state estimation;
and the map updating module is used for updating the map according to the optimal state estimation.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the tightly coupled lightweight real-time localization and mapping method of any of claims 1-6 when the computer program is executed by the processor.
9. A computer readable storage medium, having stored thereon a computer program, which when executed by a processor implements the tightly coupled lightweight real-time localization and mapping method of any of claims 1-6.
CN202311296493.3A 2023-10-08 2023-10-08 Tightly-coupled lightweight real-time positioning and mapping method Pending CN117387598A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311296493.3A CN117387598A (en) 2023-10-08 2023-10-08 Tightly-coupled lightweight real-time positioning and mapping method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311296493.3A CN117387598A (en) 2023-10-08 2023-10-08 Tightly-coupled lightweight real-time positioning and mapping method

Publications (1)

Publication Number Publication Date
CN117387598A 2024-01-12

Family

ID=89440219

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311296493.3A Pending CN117387598A (en) 2023-10-08 2023-10-08 Tightly-coupled lightweight real-time positioning and mapping method

Country Status (1)

Country Link
CN (1) CN117387598A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111929699A (en) * 2020-07-21 2020-11-13 北京建筑大学 Laser radar inertial navigation odometer considering dynamic obstacles and mapping method and system
CN114136311A (en) * 2021-11-08 2022-03-04 上海应用技术大学 Laser SLAM positioning method based on IMU pre-integration
CN115326068A (en) * 2022-10-17 2022-11-11 北京理工大学 Design method and system for laser radar-inertial measurement unit fusion odometer
CN115984463A (en) * 2022-12-21 2023-04-18 泉州装备制造研究所 Three-dimensional reconstruction method and system suitable for narrow roadway
CN116448100A (en) * 2023-03-10 2023-07-18 华南理工大学 Multi-sensor fusion type offshore unmanned ship SLAM method
CN116449384A (en) * 2023-03-17 2023-07-18 天津大学 Radar inertial tight coupling positioning mapping method based on solid-state laser radar
CN116452763A (en) * 2023-04-17 2023-07-18 浙江大学 Three-dimensional point cloud map construction method based on error Kalman filtering and factor graph


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination