CN112859051A - Method for correcting laser radar point cloud motion distortion - Google Patents
- Publication number
- Publication number: CN112859051A
- Application number: CN202110030119.3A
- Authority
- CN
- China
- Prior art keywords
- point cloud
- frame
- pose
- laser radar
- imu
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
Abstract
A method for correcting laser radar point cloud motion distortion, belonging to the technical field of unmanned-vehicle automatic driving. The method addresses the problem that external sensors used for laser radar point cloud motion distortion correction accumulate large displacement errors. The method comprises: using an RTK as a pulse generator to synchronously trigger the laser radar and the IMU; establishing a global coordinate system at the first frame of the laser radar point cloud; establishing a local coordinate system for each non-first frame, and simultaneously converting the corresponding IMU coordinate system into that local coordinate system; synchronously obtaining the IMU pose; obtaining, by interpolation, the IMU pose corresponding to each data point in the current frame point cloud and correcting the motion distortion of the current laser radar frame according to that pose; converting the local coordinate system into the global coordinate system and calibrating the IMU pose with the RTK pose; and correcting the next frame of point cloud in the same way until the motion distortion of the last laser radar frame has been corrected. The invention enhances the robustness and stability of laser radar point cloud correction.
Description
Technical Field
The invention relates to a method for correcting laser radar point cloud motion distortion, and belongs to the technical field of unmanned vehicle automatic driving.
Background
With the development of artificial intelligence, laser radar has attracted wide attention for its advantages of high resolution, long measurement range, insensitivity to illumination and strong anti-interference capability.
In the field of unmanned-vehicle automatic driving, laser radar can build map information in an unknown environment, providing a good foundation for subsequent positioning, navigation, path planning and the like. When a laser radar based on mechanical rotary scanning moves with the unmanned vehicle, the coordinate system of the laser radar changes continuously, so the points within one frame of point cloud are not expressed in the same coordinate system and motion distortion arises; the acquired point cloud data therefore needs motion distortion correction.
At present, the motion distortion of a multi-line laser radar is corrected by using a single inertial measurement unit (IMU), or an IMU combined with a wheel-speed odometer, as the external sensor, and deducing the motion of the laser radar under the assumption that the unmanned vehicle undergoes only heading-angle changes. However, both the IMU and the wheel-speed odometer must obtain displacement by integration, and therefore suffer from large accumulated displacement error during long operation.
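As a hedged illustration of this accumulated-error problem (not part of the patent), double-integrating even a small constant accelerometer bias yields a displacement error that grows roughly quadratically with time:

```python
# Illustrative sketch: a constant accelerometer bias, double-integrated,
# produces displacement error growing ~ (bias * t^2) / 2 -- the drift
# problem the patent's RTK calibration is meant to bound.

def integrated_drift(bias_mps2: float, dt: float, steps: int) -> float:
    """Double-integrate a constant acceleration bias; return the final
    displacement error in metres."""
    v = 0.0  # accumulated velocity error
    x = 0.0  # accumulated displacement error
    for _ in range(steps):
        v += bias_mps2 * dt
        x += v * dt
    return x

# A modest 0.01 m/s^2 bias sampled at 300 Hz (the IMU rate cited later):
after_10s = integrated_drift(0.01, 1.0 / 300.0, 3000)    # ~0.5 m
after_60s = integrated_drift(0.01, 1.0 / 300.0, 18000)   # ~18 m
print(round(after_10s, 3), round(after_60s, 2))
```

The bias value and sampling rate here are assumed for illustration; the point is only the quadratic growth, which no amount of filtering of the raw IMU alone removes.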
Disclosure of Invention
The invention provides a method for correcting laser radar point cloud motion distortion, aimed at the problem that the external sensors adopted by conventional laser radar point cloud motion distortion correction methods accumulate large displacement errors.
The invention relates to a method for correcting laser radar point cloud motion distortion, which comprises the following steps,
the method comprises the following steps: an RTK is used as a pulse generator, and a laser radar and an IMU are synchronously triggered; establishing a global coordinate system for the collected first frame point cloud of the laser radar by taking the first frame point cloud as an origin;
step two: collecting current frame point clouds of a laser radar of a non-first frame, and establishing a local coordinate system by taking the current frame point clouds as an origin; simultaneously converting an IMU coordinate system corresponding to the current frame point cloud acquisition moment into the local coordinate system; synchronously obtaining IMU poses;
step three: obtaining an IMU (inertial measurement Unit) pose corresponding to each data point in the current frame point cloud by adopting an interpolation method according to the local coordinate system, and correcting the motion distortion of the current frame point cloud of the laser radar according to the IMU pose;
step four: converting the local coordinate system into the global coordinate system, and calibrating the pose of the IMU by using the RTK pose; and then returning to the step two until the correction of the point cloud motion distortion of the laser radar of the last frame is finished.
According to the method for correcting the laser radar point cloud motion distortion of the invention,
in the second step, for the collected laser radar point cloud, under the local coordinate system, the coordinate of the ith point of the kth frame is $p_i^k = (x_{ki}, y_{ki}, z_{ki})$, wherein $x_{ki}$, $y_{ki}$ and $z_{ki}$ are respectively the coordinates of the ith point of the kth frame on the X, Y and Z axes of the local coordinate system, $k \in (1,2,3,\ldots,m)$, $i \in (1,2,3,\ldots,n)$, $m$ is the total number of point cloud frames, and $n$ is the number of data points contained in one frame of data;
according to the method for correcting the laser radar point cloud motion distortion of the invention,
in the second step, the IMU pose is expressed in matrix form as
$$P_i^k = \begin{bmatrix} R_i^k & s_i^k \\ 0^{T} & 1 \end{bmatrix},$$
wherein $P_i^k$ is the pose of the ith point of the kth frame of the laser radar point cloud, $R_i^k$ is the rotation matrix from the ith point of the kth frame of the laser radar point cloud to the corresponding local coordinate system, and $s_i^k$ is the translation matrix from the ith point of the kth frame of the laser radar point cloud to the corresponding local coordinate system;
the rotation matrix is composed as $R_i^k = R_z(\gamma)\,R_y(\beta)\,R_x(\alpha)$, wherein $\gamma$ is the angle through which the ith point of the kth frame of the laser radar point cloud rotates around the Z axis of the local coordinate system, $\beta$ is the angle around the Y axis of the local coordinate system, and $\alpha$ is the angle around the X axis of the local coordinate system;
the translation matrix is $s_i^k = (s_x, s_y, s_z)^{T}$, wherein $s_x$, $s_y$ and $s_z$ are the displacements of the ith point of the kth frame of the laser radar point cloud along the X, Y and Z axes of the local coordinate system;
$P_d^k$ is the starting pose of the current kth frame of the laser radar point cloud, with $R_d^k$ the rotation matrix and $s_d^k$ the translation matrix from the current kth frame data of the laser radar point cloud to the corresponding local coordinate system.
According to the method for correcting the laser radar point cloud motion distortion of the invention,
correcting the laser radar point cloud motion distortion according to the IMU pose in the third step comprises the following steps:
marking the frame head time stamp of the kth frame of the laser radar point cloud as $t_d$, the mid-frame time stamp as $t_e$, and the frame tail time stamp as $t_f$; the IMU poses of the frames immediately before and after the frame head of the kth frame are $P_{d-}$ and $P_{d+}$, the IMU poses of the frames immediately before and after the middle of the kth frame are $P_{e-}$ and $P_{e+}$, and the IMU poses of the frames immediately before and after the frame tail of the kth frame are $P_{f-}$ and $P_{f+}$;
interpolating these IMU poses yields the poses at the head, middle and tail of the kth frame, $P_d^k$, $P_e^k$ and $P_f^k$; quadratic curve fitting is then carried out to obtain the point cloud pose curve fitting equation
$$P_t^k = At^2 + Bt + C,$$
wherein $P_t^k$ is the pose corresponding to each data point in the kth frame point cloud, $t$ is time, $A$ is the quadratic coefficient, $B$ is the linear coefficient, and $C$ is a constant;
the point cloud motion distortion of the current laser radar frame is corrected according to the point cloud pose curve fitting equation, giving
$$\hat{p}_i^k = (P_d^k)^{-1}\, P_i^k\, p_i^k,$$
wherein $p_i^k$ is the coordinate of the ith point before correction and $\hat{p}_i^k$ its coordinate after correction.
According to the method for correcting the laser radar point cloud motion distortion of the invention,
the step four of utilizing the RTK pose to calibrate the IMU pose comprises the following steps:
judging whether the timestamp of the IMU pose is smaller than the timestamp of the RTK pose according to the RTK pose and the IMU pose in the local coordinate system;
if the timestamp of the IMU pose is smaller than the timestamp of the RTK pose, taking the RTK pose as an IMU pose observation value; if the time stamp of the IMU pose is larger than or equal to the time stamp of the RTK pose, searching two frames of RTK poses before and after the current frame IMU pose time stamp, and using the pose obtained by RTK interpolation as an IMU pose observation value;
and performing Kalman fusion on the IMU pose under the local coordinate system and the IMU pose observation value to obtain a calibrated IMU pose.
The invention has the following beneficial effects: the method uses IMU pose interpolation to obtain the pose of every point of each frame in the local coordinate system, converts each undistorted frame of point cloud into the local coordinate system of the current frame, converts each local coordinate system into the global coordinate system, and thereby completes the correction of the laser radar point cloud motion distortion.
The method uses the real-time kinematic (RTK) carrier-phase differential technique. The plane and elevation accuracy of RTK can reach centimeter level without accumulated error, and the RTK can serve as a clock source to unify the time of all sensors in hardware. After the IMU pose is corrected in real time with the RTK, the displacement, heading angle, pitch angle and roll angle of the unmanned vehicle can be computed from the IMU, and the laser radar point cloud motion distortion corrected accordingly. When the unmanned vehicle drives through a severely occluded area, the RTK-corrected IMU can still provide a fairly accurate pose even though the RTK's own operation is affected to some extent, which enhances the robustness and stability of the system.
Drawings
FIG. 1 is an overall flow chart of the correction method for the laser radar point cloud motion distortion;
FIG. 2 is a flowchart of an embodiment of a method for correcting the point cloud motion distortion of a laser radar;
FIG. 3 is a flow chart of interpolation IMU pose correction lidar point cloud distortion;
FIG. 4 is a flow chart for calibrating IMU pose using RTK pose.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The invention is further described with reference to the following drawings and specific examples, which are not intended to be limiting.
First embodiment, as shown in fig. 1 and fig. 2, the present invention provides a method for correcting laser radar point cloud motion distortion, including,
the method comprises the following steps: an RTK is used as a pulse generator, and a laser radar and an IMU are synchronously triggered; establishing a global coordinate system for the collected first frame point cloud of the laser radar by taking the first frame point cloud as an origin; after each trigger, the RTK corrects its own clock, so that clock-source time synchronization is completed in hardware;
step two: collecting current frame point clouds of a laser radar of a non-first frame, and establishing a local coordinate system by taking the current frame point clouds as an origin; simultaneously converting an IMU coordinate system corresponding to the current frame point cloud acquisition moment into the local coordinate system; synchronously obtaining IMU poses;
step three: obtaining an IMU (inertial measurement Unit) pose corresponding to each data point in the current frame point cloud by adopting an interpolation method according to the local coordinate system, and correcting the motion distortion of the current frame point cloud of the laser radar according to the IMU pose;
step four: converting the local coordinate system into the global coordinate system, and calibrating the pose of the IMU by using the RTK pose; and then returning to the step two until the correction of the point cloud motion distortion of the laser radar of the last frame is finished.
The method corrects laser radar point cloud distortion based on an inertial measurement unit (IMU) and the real-time kinematic (RTK) carrier-phase differential technique. The RTK serves two purposes: it provides time synchronization, and, because the IMU pose drifts over time and becomes inaccurate during long operation, it is used to correct the IMU pose. Consequently, when the RTK is severely occluded and cannot deliver data, the previously corrected IMU pose keeps the continuously produced data trustworthy. In this way complementary operation of the three sensors (laser radar, RTK and IMU) is achieved.
In the embodiment, when collecting the laser radar point cloud, firstly, judging whether the point cloud is a first frame point cloud, and establishing a global coordinate system according to the first frame point cloud; establishing a local coordinate system for the non-first frame point cloud; and after each correction is finished, judging whether the current frame point cloud is the last frame point cloud or not, if not, repeating the correction process, and if so, finishing the operation. And after finishing the correction of one frame of point cloud, correcting the pose of the IMU once until finishing all correction tasks.
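The per-frame control flow described above can be sketched as follows; the helper callables here are simplified stand-ins for the steps detailed in the later sections, not the patent's algorithms, and all names are illustrative.

```python
# Hypothetical skeleton of the per-frame loop: the first frame fixes
# the global coordinate system and is kept as-is; every later frame is
# undistorted and then the IMU pose is recalibrated once per frame.

def run_pipeline(frames, undistort, calibrate):
    corrected = []
    for k, frame in enumerate(frames):
        if k == 0:
            corrected.append(frame)   # first frame defines the global frame
        else:
            corrected.append(undistort(frame))
            calibrate(k)              # IMU recalibrated after each correction
    return corrected

# Synthetic data: stand-in "frames" and trivial stand-in steps.
calls = []
frames = [[1, 2], [3, 4], [5, 6]]
out = run_pipeline(frames,
                   undistort=lambda f: [x + 0.5 for x in f],
                   calibrate=lambda k: calls.append(k))
print(out, calls)  # first frame untouched; calibration ran for frames 1 and 2
```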
Further, in the second step, for the collected laser radar point cloud, under the local coordinate system, the coordinate of the ith point of the kth frame is $p_i^k = (x_{ki}, y_{ki}, z_{ki})$, wherein $x_{ki}$, $y_{ki}$ and $z_{ki}$ are respectively the coordinates of the ith point of the kth frame on the X, Y and Z axes of the local coordinate system, $k \in (1,2,3,\ldots,m)$, $i \in (1,2,3,\ldots,n)$, $m$ is the total number of point cloud frames, and $n$ is the number of data points contained in one frame of data;
the laser radar point cloud is expressed in matrix form as $p_i^k = (x_{ki}, y_{ki}, z_{ki}, 1)^{T}$; appending the final element 1 turns each point into a four-element homogeneous vector, which facilitates the subsequent calculation with the 4 × 4 pose matrices.
Still further, in step two, the IMU pose is expressed in matrix form as
$$P_i^k = \begin{bmatrix} R_i^k & s_i^k \\ 0^{T} & 1 \end{bmatrix},$$
wherein $P_i^k$ is the pose of the ith point of the kth frame of the laser radar point cloud, $R_i^k$ is the rotation matrix from the ith point of the kth frame of the laser radar point cloud to the corresponding local coordinate system, and $s_i^k$ is the translation matrix from the ith point of the kth frame of the laser radar point cloud to the corresponding local coordinate system;
the rotation matrix is composed as $R_i^k = R_z(\gamma)\,R_y(\beta)\,R_x(\alpha)$, wherein $\gamma$ is the angle through which the ith point of the kth frame of the laser radar point cloud rotates around the Z axis of the local coordinate system, $\beta$ is the angle around the Y axis of the local coordinate system, and $\alpha$ is the angle around the X axis of the local coordinate system;
the translation matrix is $s_i^k = (s_x, s_y, s_z)^{T}$, wherein $s_x$, $s_y$ and $s_z$ are the displacements of the ith point of the kth frame of the laser radar point cloud along the X, Y and Z axes of the local coordinate system;
$P_d^k$ is the starting pose of the current kth frame of the laser radar point cloud, with $R_d^k$ the rotation matrix and $s_d^k$ the translation matrix from the current kth frame data of the laser radar point cloud to the corresponding local coordinate system.
Each frame of data of the lidar point cloud has its own local coordinate system.
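As an illustration of the pose matrix just described, the following sketch assembles the 4 × 4 pose from the three rotation angles and the displacement vector. The Z-Y-X composition order and the name `pose_matrix` are assumptions for illustration; the patent names the three angles but the exact composition appears only in an omitted figure.

```python
import numpy as np

def pose_matrix(alpha, beta, gamma, s):
    """4x4 pose with R = Rz(gamma) @ Ry(beta) @ Rx(alpha) and
    translation s = (s_x, s_y, s_z). The Z-Y-X order is assumed."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = s
    return T

# 90 degrees about Z carries the X axis onto the Y axis, then translates:
T = pose_matrix(0.0, 0.0, np.pi / 2, (1.0, 0.0, 0.0))
print(np.round(T @ np.array([1.0, 0.0, 0.0, 1.0]), 6))
```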
Still further, with reference to fig. 3, the correcting the laser radar point cloud motion distortion according to the IMU pose in step three includes:
marking the frame head time stamp of the kth frame of the laser radar point cloud as $t_d$, the mid-frame time stamp as $t_e$, and the frame tail time stamp as $t_f$; the IMU poses of the frames immediately before and after the frame head of the kth frame are $P_{d-}$ and $P_{d+}$, the IMU poses of the frames immediately before and after the middle of the kth frame are $P_{e-}$ and $P_{e+}$, and the IMU poses of the frames immediately before and after the frame tail of the kth frame are $P_{f-}$ and $P_{f+}$;
interpolating these IMU poses yields the poses at the head, middle and tail of the kth frame, $P_d^k$, $P_e^k$ and $P_f^k$; quadratic curve fitting is then carried out to obtain the point cloud pose curve fitting equation
$$P_t^k = At^2 + Bt + C,$$
wherein $P_t^k$ is the pose corresponding to each data point in the kth frame point cloud, $t$ is time, $A$ is the quadratic coefficient, $B$ is the linear coefficient, and $C$ is a constant;
the point cloud motion distortion of the current laser radar frame is corrected according to the point cloud pose curve fitting equation, giving
$$\hat{p}_i^k = (P_d^k)^{-1}\, P_i^k\, p_i^k,$$
wherein $(P_d^k)^{-1}$ represents the inverse of the starting pose matrix of the current kth frame, $P_i^k$ represents the pose of the ith point of the current kth frame, and $p_i^k$ represents the point cloud coordinate of the ith point of the current kth frame before correction. This completes the correction of the motion distortion of the current kth frame point cloud.
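A minimal sketch of the quadratic pose fit and the per-point correction, reduced to translation-only poses for brevity (the patent fits full poses; fitting rotation entries elementwise follows the same mechanics). All names and data here are illustrative assumptions.

```python
import numpy as np

def fit_quadratic(t_d, t_e, t_f, P_d, P_e, P_f):
    """Solve A, B, C elementwise so P(t) = A t^2 + B t + C passes through
    the head/middle/tail poses (here: 3-vector translations)."""
    M = np.array([[t_d**2, t_d, 1.0],
                  [t_e**2, t_e, 1.0],
                  [t_f**2, t_f, 1.0]])
    return np.linalg.solve(M, np.vstack([P_d, P_e, P_f]))  # rows: A, B, C

def undistort(points, stamps, coeffs, t_d):
    """Translation-only simplification of p_hat = (P_d)^-1 P_i p_i:
    add the sensor motion accrued since the frame-head time t_d."""
    A, B, C = coeffs
    P = lambda t: A * t * t + B * t + C
    return np.array([p + (P(t) - P(t_d)) for p, t in zip(points, stamps)])

# Sensor moving at a constant 1 m/s along X during a 0.1 s sweep:
t_d, t_e, t_f = 0.0, 0.05, 0.1
coeffs = fit_quadratic(t_d, t_e, t_f, np.zeros(3),
                       np.array([0.05, 0.0, 0.0]), np.array([0.1, 0.0, 0.0]))
pts = np.array([[5.0, 0.0, 0.0]])
fixed = undistort(pts, [t_f], coeffs, t_d)
print(np.round(fixed[0], 6))  # a point seen at sweep end shifts by 0.1 m
```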
Still further, as shown in fig. 4, the calibrating the position of the IMU by using the RTK position in the fourth step includes:
judging whether the timestamp of the IMU pose is smaller than the timestamp of the RTK pose according to the RTK pose and the IMU pose in the local coordinate system;
if the timestamp of the IMU pose is smaller than the timestamp of the RTK pose, taking the RTK pose as an IMU pose observation value; if the time stamp of the IMU pose is larger than or equal to the time stamp of the RTK pose, searching two frames of RTK poses before and after the current frame IMU pose time stamp, and using the pose obtained by RTK interpolation as an IMU pose observation value;
and performing Kalman fusion on the IMU pose under the local coordinate system and the IMU pose observation value to obtain a calibrated IMU pose.
In the embodiment, the position and pose of the IMU under the original local coordinate system can be used as a predicted value, then Kalman fusion of the IMU position and pose predicted value and observation value is carried out, and finally the calibrated IMU position and pose are obtained.
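A scalar sketch of this step four under stated assumptions: the observation rule follows the timestamp logic described above (RTK pose taken directly when it leads the IMU stamp, otherwise interpolated between the two bracketing RTK frames), and a one-dimensional Kalman step fuses the predicted IMU pose with the observation. The noise variances `p_pred` and `r_obs` are assumed values, not from the patent.

```python
import bisect

def rtk_observation(t_imu, rtk_stamps, rtk_poses):
    """Observation for the IMU pose at t_imu (scalar pose for brevity).
    When the IMU stamp precedes the first RTK stamp that pose is used
    directly; otherwise the two bracketing RTK frames are linearly
    interpolated (our reading of the patent's timestamp rule)."""
    if t_imu <= rtk_stamps[0]:
        return rtk_poses[0]
    j = min(bisect.bisect_right(rtk_stamps, t_imu), len(rtk_stamps) - 1)
    t0, t1 = rtk_stamps[j - 1], rtk_stamps[j]
    w = (t_imu - t0) / (t1 - t0)
    return (1 - w) * rtk_poses[j - 1] + w * rtk_poses[j]

def kalman_update(x_pred, p_pred, z, r_obs):
    """One scalar Kalman step: fuse the predicted IMU pose x_pred
    (variance p_pred) with the RTK observation z (variance r_obs)."""
    k = p_pred / (p_pred + r_obs)
    return x_pred + k * (z - x_pred), (1 - k) * p_pred

# IMU pose drifted to 1.30 m; RTK frames at 0.0 s and 0.1 s bracket it.
z = rtk_observation(0.05, [0.0, 0.1], [1.0, 1.2])
x, p = kalman_update(1.30, p_pred=0.09, z=z, r_obs=0.01)
print(round(z, 3), round(x, 3))  # observation 1.1, fused pose pulled toward it
```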
In the present invention, as an example, the update frequency of the laser radar may be 10 Hz and the update frequency of the IMU may be 300 Hz.
Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims. It should be understood that features described in different dependent claims and herein may be combined in ways different from those described in the original claims. It is also to be understood that features described in connection with individual embodiments may be used in other described embodiments.
Claims (5)
1. A method for correcting laser radar point cloud motion distortion, comprising:
the method comprises the following steps: an RTK is used as a pulse generator, and a laser radar and an IMU are synchronously triggered; establishing a global coordinate system for the collected first frame point cloud of the laser radar by taking the first frame point cloud as an origin;
step two: collecting current frame point clouds of a laser radar of a non-first frame, and establishing a local coordinate system by taking the current frame point clouds as an origin; simultaneously converting an IMU coordinate system corresponding to the current frame point cloud acquisition moment into the local coordinate system; synchronously obtaining IMU poses;
step three: obtaining an IMU (inertial measurement Unit) pose corresponding to each data point in the current frame point cloud by adopting an interpolation method according to the local coordinate system, and correcting the motion distortion of the current frame point cloud of the laser radar according to the IMU pose;
step four: converting the local coordinate system into the global coordinate system, and calibrating the pose of the IMU by using the RTK pose; and then returning to the step two until the correction of the point cloud motion distortion of the laser radar of the last frame is finished.
2. The method for correcting laser radar point cloud motion distortion according to claim 1,
in the second step, for the collected laser radar point cloud, under the local coordinate system, the coordinate of the ith point of the kth frame is $p_i^k = (x_{ki}, y_{ki}, z_{ki})$, wherein $x_{ki}$, $y_{ki}$ and $z_{ki}$ are respectively the coordinates of the ith point of the kth frame on the X, Y and Z axes of the local coordinate system, $k \in (1,2,3,\ldots,m)$, $i \in (1,2,3,\ldots,n)$, $m$ is the total number of point cloud frames, and $n$ is the number of data points contained in one frame of data;
3. the method for correcting laser radar point cloud motion distortion according to claim 2,
in the second step, the IMU pose is expressed in matrix form as
$$P_i^k = \begin{bmatrix} R_i^k & s_i^k \\ 0^{T} & 1 \end{bmatrix},$$
wherein $P_i^k$ is the pose of the ith point of the kth frame of the laser radar point cloud, $R_i^k$ is the rotation matrix from the ith point of the kth frame of the laser radar point cloud to the corresponding local coordinate system, and $s_i^k$ is the translation matrix from the ith point of the kth frame of the laser radar point cloud to the corresponding local coordinate system;
the rotation matrix is composed as $R_i^k = R_z(\gamma)\,R_y(\beta)\,R_x(\alpha)$, wherein $\gamma$ is the angle through which the ith point of the kth frame of the laser radar point cloud rotates around the Z axis of the local coordinate system, $\beta$ is the angle around the Y axis of the local coordinate system, and $\alpha$ is the angle around the X axis of the local coordinate system;
the translation matrix is $s_i^k = (s_x, s_y, s_z)^{T}$, wherein $s_x$, $s_y$ and $s_z$ are the displacements of the ith point of the kth frame of the laser radar point cloud along the X, Y and Z axes of the local coordinate system;
$P_d^k$ is the starting pose of the current kth frame of the laser radar point cloud, with $R_d^k$ the rotation matrix and $s_d^k$ the translation matrix from the current kth frame data of the laser radar point cloud to the corresponding local coordinate system.
4. The method for correcting laser radar point cloud motion distortion according to claim 3,
correcting the laser radar point cloud motion distortion according to the IMU pose in the third step comprises the following steps:
marking the frame head time stamp of the kth frame of the laser radar point cloud as $t_d$, the mid-frame time stamp as $t_e$, and the frame tail time stamp as $t_f$; the IMU poses of the frames immediately before and after the frame head of the kth frame are $P_{d-}$ and $P_{d+}$, the IMU poses of the frames immediately before and after the middle of the kth frame are $P_{e-}$ and $P_{e+}$, and the IMU poses of the frames immediately before and after the frame tail of the kth frame are $P_{f-}$ and $P_{f+}$;
interpolating these IMU poses yields the poses at the head, middle and tail of the kth frame, $P_d^k$, $P_e^k$ and $P_f^k$; quadratic curve fitting is then carried out to obtain the point cloud pose curve fitting equation
$$P_t^k = At^2 + Bt + C,$$
wherein $P_t^k$ is the pose corresponding to each data point in the kth frame point cloud, $t$ is time, $A$ is the quadratic coefficient, $B$ is the linear coefficient, and $C$ is a constant;
the point cloud motion distortion of the current laser radar frame is corrected according to the point cloud pose curve fitting equation, giving
$$\hat{p}_i^k = (P_d^k)^{-1}\, P_i^k\, p_i^k.$$
5. The method for correcting laser radar point cloud motion distortion according to claim 4,
the step four of utilizing the RTK pose to calibrate the IMU pose comprises the following steps:
judging whether the timestamp of the IMU pose is smaller than the timestamp of the RTK pose according to the RTK pose and the IMU pose in the local coordinate system;
if the timestamp of the IMU pose is smaller than the timestamp of the RTK pose, taking the RTK pose as an IMU pose observation value; if the time stamp of the IMU pose is larger than or equal to the time stamp of the RTK pose, searching two frames of RTK poses before and after the current frame IMU pose time stamp, and using the pose obtained by RTK interpolation as an IMU pose observation value;
and performing Kalman fusion on the IMU pose under the local coordinate system and the IMU pose observation value to obtain a calibrated IMU pose.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110030119.3A CN112859051B (en) | 2021-01-11 | 2021-01-11 | Laser radar point cloud motion distortion correction method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110030119.3A CN112859051B (en) | 2021-01-11 | 2021-01-11 | Laser radar point cloud motion distortion correction method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112859051A true CN112859051A (en) | 2021-05-28 |
CN112859051B CN112859051B (en) | 2024-04-09 |
Family
ID=76002243
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110030119.3A Active CN112859051B (en) | 2021-01-11 | 2021-01-11 | Laser radar point cloud motion distortion correction method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112859051B (en) |
Application Events
- 2021-01-11: CN application CN202110030119.3A granted as patent CN112859051B (status: Active)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180080828A (en) * | 2017-01-05 | 2018-07-13 | 서울대학교산학협력단 | Method for recognizing lane-level vehicle positioning information based on lidar map matching, recording medium and device for performing the method |
CN109975792A (en) * | 2019-04-24 | 2019-07-05 | 福州大学 | Method based on Multi-sensor Fusion correction multi-line laser radar point cloud motion distortion |
CN110703229A (en) * | 2019-09-25 | 2020-01-17 | 禾多科技(北京)有限公司 | Point cloud distortion removal method and external reference calibration method for vehicle-mounted laser radar reaching IMU |
CN111045017A (en) * | 2019-12-20 | 2020-04-21 | 成都理工大学 | Method for constructing transformer substation map of inspection robot by fusing laser and vision |
CN111578957A (en) * | 2020-05-07 | 2020-08-25 | 泉州装备制造研究所 | Intelligent pure vehicle tracking and tracking method based on three-dimensional point cloud map positioning |
CN112082545A (en) * | 2020-07-29 | 2020-12-15 | 武汉威图传视科技有限公司 | Map generation method, device and system based on IMU and laser radar |
Non-Patent Citations (2)
Title |
---|
WU YUHAN: "A LiDAR/IMU joint calibration method based on point cloud matching", Application of Electronic Technique (《电子技术应用》), vol. 45, no. 12, pages 78-82 *
LI XU: "Odometry optimization and loop closure detection based on multi-line lidar mapping", China Master's Theses Full-text Database, Information Science and Technology (《中国优秀硕士学位论文全文数据库 信息科技辑》), no. 2, pages 136-1991 *
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113376638A (en) * | 2021-06-08 | 2021-09-10 | 武汉理工大学 | Unmanned logistics trolley environment sensing method and system |
CN113495281B (en) * | 2021-06-21 | 2023-08-22 | 杭州飞步科技有限公司 | Real-time positioning method and device for movable platform |
CN113495281A (en) * | 2021-06-21 | 2021-10-12 | 杭州飞步科技有限公司 | Real-time positioning method and device for movable platform |
CN113790738A (en) * | 2021-08-13 | 2021-12-14 | 上海智能网联汽车技术中心有限公司 | Data compensation method based on intelligent cradle head IMU |
CN113935904A (en) * | 2021-08-27 | 2022-01-14 | 清华大学 | Laser odometer error correction method, system, storage medium and computing equipment |
CN113838143A (en) * | 2021-09-13 | 2021-12-24 | 三一专用汽车有限责任公司 | Method and device for determining calibration external parameter, engineering vehicle and readable storage medium |
CN114372914A (en) * | 2022-01-12 | 2022-04-19 | 吉林大学 | Mechanical laser radar point cloud preprocessing method applied to mining electric shovel |
CN114569011A (en) * | 2022-03-25 | 2022-06-03 | 微思机器人(深圳)有限公司 | Wall-following walking method and device, floor sweeping robot and storage medium |
CN114569011B (en) * | 2022-03-25 | 2023-09-05 | 微思机器人(深圳)有限公司 | Wall-following walking method and device, sweeping robot and storage medium |
CN114862932A (en) * | 2022-06-20 | 2022-08-05 | 安徽建筑大学 | BIM global positioning-based pose correction method and motion distortion correction method |
CN114862932B (en) * | 2022-06-20 | 2022-12-30 | 安徽建筑大学 | BIM global positioning-based pose correction method and motion distortion correction method |
CN114820392B (en) * | 2022-06-28 | 2022-10-18 | 新石器慧通(北京)科技有限公司 | Laser radar detection moving target distortion compensation method, device and storage medium |
CN114820392A (en) * | 2022-06-28 | 2022-07-29 | 新石器慧通(北京)科技有限公司 | Laser radar detection moving target distortion compensation method, device and storage medium |
CN115840234A (en) * | 2022-10-28 | 2023-03-24 | 苏州知至科技有限公司 | Radar data acquisition method, device and storage medium |
CN115840234B (en) * | 2022-10-28 | 2024-04-19 | 苏州知至科技有限公司 | Radar data acquisition method, device and storage medium |
CN116359938A (en) * | 2023-05-31 | 2023-06-30 | 未来机器人(深圳)有限公司 | Object detection method, device and carrying device |
CN116359938B (en) * | 2023-05-31 | 2023-08-25 | 未来机器人(深圳)有限公司 | Object detection method, device and carrying device |
Also Published As
Publication number | Publication date |
---|---|
CN112859051B (en) | 2024-04-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112859051A (en) | Method for correcting laser radar point cloud motion distortion | |
CN101241011B (en) | High precision positioning and posture-fixing device on laser radar platform and method | |
WO2023131123A1 (en) | External parameter calibration method and apparatus for combined navigation device and laser radar | |
WO2022127532A1 (en) | Method and apparatus for calibrating external parameter of laser radar and imu, and device | |
CN109934920A (en) | High-precision three-dimensional point cloud map constructing method based on low-cost equipment | |
CN113358112B (en) | Map construction method and laser inertia odometer | |
CN109471146B (en) | Self-adaptive fault-tolerant GPS/INS integrated navigation method based on LS-SVM | |
CN113933818A (en) | Method, device, storage medium and program product for calibrating laser radar external parameter | |
CN112731358B (en) | Multi-laser-radar external parameter online calibration method | |
CN110570449A (en) | positioning and mapping method based on millimeter wave radar and visual SLAM | |
CN114526745A (en) | Drawing establishing method and system for tightly-coupled laser radar and inertial odometer | |
CN111880207A (en) | Visual inertial satellite tight coupling positioning method based on wavelet neural network | |
CN114612348B (en) | Laser point cloud motion distortion correction method and device, electronic equipment and storage medium | |
CN112946681B (en) | Laser radar positioning method fusing combined navigation information | |
CN109507706B (en) | GPS signal loss prediction positioning method | |
CN113721248B (en) | Fusion positioning method and system based on multi-source heterogeneous sensor | |
CN114413887A (en) | Method, equipment and medium for calibrating external parameters of sensor | |
CN114485654A (en) | Multi-sensor fusion positioning method and device based on high-precision map | |
CN115728753A (en) | External parameter calibration method and device for laser radar and integrated navigation and intelligent vehicle | |
CN114915913A (en) | UWB-IMU combined indoor positioning method based on sliding window factor graph | |
CN112883134A (en) | Data fusion graph building method and device, electronic equipment and storage medium | |
CN111521996A (en) | Laser radar installation calibration method | |
CN116481543A (en) | Multi-sensor fusion double-layer filtering positioning method for mobile robot | |
CN116753948A (en) | Positioning method based on visual inertial GNSS PPP coupling | |
CN111307176B (en) | Online calibration method for visual inertial odometer in VR head-mounted display equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||