CN113325428A - Distance and course measuring method based on laser radar - Google Patents

Distance and course measuring method based on laser radar

Info

Publication number
CN113325428A
CN113325428A (application CN202110471992.6A)
Authority
CN
China
Prior art keywords
point cloud
laser radar
distance
plane
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110471992.6A
Other languages
Chinese (zh)
Inventor
卜春光
张建刚
范晓亮
赵忆文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Institute of Automation of CAS
Original Assignee
Shenyang Institute of Automation of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Institute of Automation of CAS filed Critical Shenyang Institute of Automation of CAS
Priority to CN202110471992.6A priority Critical patent/CN113325428A/en
Publication of CN113325428A publication Critical patent/CN113325428A/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/4802 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a lidar-based distance and heading measurement method. Based on a 16-line lidar, the method measures the distance from a robot to the wall surfaces on both sides of a long corridor and the yaw angle of the robot relative to its direction of travel. The method comprises the following steps: 1) mount the lidar sensor on a rotating pan-tilt platform of the robot and scan to obtain three-dimensional point cloud data of the environment; 2) extract the features required for measurement from the scanned point cloud; specifically, apply pass-through filtering, statistical filtering, Euclidean clustering and random sample consensus segmentation to the point cloud, and compute the centroid of each clustered point cloud cluster; 3) from the resulting cluster centroids and plane parameters, compute the distance from the lidar to the corridor walls and the yaw angle of the robot. The invention meets the robot's need for the distances to the two side walls and the yaw angle during autonomous navigation in a long corridor environment; with the provided data, the robot can navigate autonomously in this scenario.

Description

Distance and course measuring method based on laser radar
Technical Field
The invention relates to the technical field of environment perception, and in particular to a lidar-based method for measuring the distance to the wall surfaces on both sides of a long corridor and computing the yaw angle.
Background
As an active measurement sensor, lidar can accurately acquire the depth information of a scene, has good anti-interference capability, and is of great practical value in fields such as robot navigation and engineering surveying. In a long corridor environment, however, light is dim and environmental texture features are sparse, so vision- or laser-based localization methods cannot extract effective environmental features; localization easily fails or drifts significantly.
For these environmental conditions, a lightweight distance and yaw angle measurement method is proposed, which supports autonomous navigation of the robot by providing the robot's distance to the two side walls and its yaw angle.
Disclosure of Invention
To address the tendency of the prior art to fail in long corridor environments, the invention provides a lidar-based distance and heading measurement method. It measures the distance from a robot to the walls on both sides of a long corridor and the yaw angle of the robot relative to its direction of travel, and outputs the measurement results to the robot control system so that the robot can travel autonomously in the corridor.
To this end, the invention adopts the following technical scheme:
a distance and course measuring method based on laser radar is disclosed, which is based on 16-line laser radar to realize the measurement of the distance of two side walls of a robot in a long corridor environment and the measurement of the yaw angle of the robot in the advancing direction; the method comprises the following steps:
step 1: scanning by a laser radar sensor arranged on a robot body to obtain three-dimensional point cloud data of an environment;
step 2: carrying out data processing on the acquired three-dimensional point cloud data, wherein the specific operations comprise: performing straight-through filtering, statistical filtering, Euclidean clustering and random sampling consistency segmentation on the point cloud, and calculating the mass center of a clustered point cloud cluster;
and step 3: and calculating the distance from the laser radar to the wall surfaces on two sides of the corridor and the yaw angle of the robot according to the obtained mass center and plane parameters of the point cloud cluster.
The lidar sensor is mounted on a rotating pan-tilt platform of the robot, whose horizontal rotation angle ranges from -90° to 90°; by controlling the rotation of the platform, the 360° transverse scanning plane of the lidar sensor is adjusted to be horizontal.
Extracting the features required for measurement from the scanned three-dimensional point cloud comprises the following steps:
Step 2.1: apply a pass-through filter based on the distances of the scanned spatial 3-D points to obtain the data within the lidar region of interest;
Step 2.2: apply statistical filtering to the pass-through-filtered point cloud to remove outliers caused by noise;
Step 2.3: apply Euclidean clustering to the statistically filtered data, then apply random sample consensus segmentation to each resulting cluster to obtain the inliers of a fitted plane;
Step 2.4: apply Euclidean clustering a second time to the inlier data from step 2.3 and compute the centroid of each resulting cluster;
Step 2.5: apply plane segmentation to the clusters from step 2.4 to obtain the parameters of the plane.
The pass-through filtering operation based on the distances of the scanned spatial 3-D points comprises the following steps:
Traverse every point of the acquired three-dimensional point cloud and cut off points outside the configured candidate-region coordinate thresholds, reducing the data volume.
The candidate-region coordinate thresholds are: data along the X axis whose Y coordinate lies within (-1.0, 1.0) m and whose Z coordinate lies within (-0.3, 1.0) m.
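A minimal sketch of this pass-through filter in Python with numpy (the patent's actual pipeline uses PCL-style filter objects via ROS; the function name and array layout here are illustrative):

```python
import numpy as np

def passthrough_filter(points, y_range=(-1.0, 1.0), z_range=(-0.3, 1.0)):
    """Keep only points whose Y and Z coordinates fall inside the given
    ranges; the X axis is left unrestricted, as in the description above.
    `points` is an (N, 3) array of XYZ coordinates in meters."""
    y, z = points[:, 1], points[:, 2]
    mask = (y > y_range[0]) & (y < y_range[1]) & \
           (z > z_range[0]) & (z < z_range[1])
    return points[mask]

cloud = np.array([[0.5, 0.2, 0.1],    # inside both ranges -> kept
                  [0.5, 1.5, 0.1],    # Y outside -> dropped
                  [0.5, 0.2, -0.5]])  # Z outside -> dropped
filtered = passthrough_filter(cloud)
```

Only the first point survives the Y/Z thresholds; the X coordinate is never tested, matching the thresholds stated above.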
Applying statistical filtering to the pass-through-filtered point cloud to remove noise-induced outliers proceeds as follows:
First, compute the average distance from each point to all K of its neighborhood points, with K = 50; second, compute the mean and sample standard deviation of these average distances over the whole point set, assuming they follow a Gaussian distribution; finally, remove as outliers the points whose average distance lies outside the standard range, yielding point cloud data with the noise points removed.
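The statistical filtering step can be sketched as follows, assuming a brute-force neighbour search in numpy (the real pipeline would use a KD-tree; `std_mult` is an illustrative threshold parameter, and K is reduced for the toy cloud):

```python
import numpy as np

def statistical_outlier_removal(points, k=50, std_mult=1.0):
    """For each point, compute the mean distance to its k nearest
    neighbours; points whose mean distance exceeds
    (global mean + std_mult * sample std) are treated as outliers.
    Brute-force neighbour search, adequate for a small demo cloud."""
    n = len(points)
    k = min(k, n - 1)
    # All pairwise distances, sorted so column 0 is the zero self-distance
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)       # skip distance-to-self
    mu, sigma = mean_knn.mean(), mean_knn.std(ddof=1)
    return points[mean_knn <= mu + std_mult * sigma]

# Dense cluster near the origin plus one far-away noise point
rng = np.random.default_rng(0)
cloud = np.vstack([rng.normal(0, 0.01, (20, 3)), [[5.0, 5.0, 5.0]]])
clean = statistical_outlier_removal(cloud, k=5)
```

The isolated point at (5, 5, 5) has a large mean neighbour distance and is removed; the 20 dense points are kept.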
Applying Euclidean clustering to the statistically filtered data and random sample consensus segmentation to the resulting clusters to obtain the inliers of a fitted plane proceeds as follows:
For a given three-dimensional point P in space, find the K points nearest to P via a KD-tree neighbor search, and cluster those whose distance is below a threshold into a set Q; when the point count of Q no longer grows and meets the specified size requirement, the clustering is complete; otherwise, select a new point P and repeat.
On each resulting cluster, create a plane-model random sample consensus object according to the RANSAC algorithm, and test all remaining points in the cluster against the plane model obtained by repeated iteration to obtain the cluster points satisfying the model.
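The Euclidean clustering step can be sketched as a greedy region-growing loop; this is an illustrative numpy version using brute-force search in place of the KD-tree described above:

```python
import numpy as np

def euclidean_cluster(points, tol=0.1, min_size=3):
    """Greedy region growing: a seed point collects every point within
    `tol`; newly added points then grow the set in turn, until the set
    stops growing -- mirroring the set-Q description above."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = [seed], {seed}
        while queue:
            idx = queue.pop()
            d = np.linalg.norm(points - points[idx], axis=1)
            for j in np.flatnonzero(d < tol):
                if j in unvisited:
                    unvisited.discard(j)
                    cluster.add(int(j))
                    queue.append(j)
        if len(cluster) >= min_size:      # size requirement
            clusters.append(sorted(cluster))
    return clusters

# Two well-separated groups of points along the X axis
cloud = np.array([[0, 0, 0], [0.05, 0, 0], [0.1, 0, 0],
                  [3, 0, 0], [3.05, 0, 0], [3.1, 0, 0]], dtype=float)
clusters = euclidean_cluster(cloud, tol=0.2, min_size=2)
```

With a 0.2 m tolerance the two groups, 3 m apart, form two separate clusters.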
A second Euclidean clustering is applied to the obtained inlier data, the centroid of each clustered point cloud cluster is computed, and plane segmentation is then applied to the clusters to obtain the plane parameters. The specific process is as follows:
For each cluster obtained by Euclidean clustering, the centroid is computed as:

r_c = (1/n) Σ_{i=1}^{n} r_i

where r_i = (x_i, y_i, z_i), i = 1, 2, …, n are the three-dimensional coordinates of the points in the cluster;
A plane model of the point cloud is fitted by the random sample consensus algorithm; the fitted plane model has the form:

ax + by + cz + d = 0

According to the plane model, the normal vector of the plane is:

n = (a, b, c)
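The centroid formula and the plane normal can be sketched directly in numpy (an illustrative fragment, not the patent's implementation):

```python
import numpy as np

def centroid(points):
    """Centroid r_c = (1/n) * sum_{i=1..n} r_i of a point cloud cluster."""
    return points.mean(axis=0)

def plane_normal(a, b, c):
    """Unit normal vector n = (a, b, c) of the plane ax + by + cz + d = 0."""
    n = np.array([a, b, c], dtype=float)
    return n / np.linalg.norm(n)

# Four points on the vertical plane x = 1
pts = np.array([[1.0, 0.0, 0.0], [1.0, 1.0, 0.0],
                [1.0, 0.0, 1.0], [1.0, 1.0, 1.0]])
rc = centroid(pts)                  # centroid of the cluster
n = plane_normal(1.0, 0.0, 0.0)     # normal of plane x - 1 = 0
```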
The distance from the lidar to the corridor walls and the yaw angle of the robot are computed from the obtained cluster centroids and plane parameters as follows:
a. assuming the fitted plane is one of the side walls, the distance from the lidar to that wall is the distance from the lidar coordinate origin to the fitted plane, computed as:

l = |d| / √(a² + b² + c²)
b. the yaw angle is the angle between the positive Y-axis direction of the lidar and the fitted plane, i.e. the angle between the Y-axis direction vector n_y = [0, 1, 0] and the plane. Considering that the robot's yaw angle during travel lies in the range (-90°, 90°), its value is:

θ = |90° - <n, n_y>|

where <n, n_y> = arccos( (n · n_y) / (|n| |n_y|) )
The sign of the yaw angle is determined by a case analysis of whether the plane of the extracted normal vector lies in the positive or negative X-axis direction of the lidar.
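The distance and yaw-magnitude formulas above can be sketched as follows (illustrative numpy fragment; the sign determination is handled separately by the case analysis):

```python
import numpy as np

def wall_distance(a, b, c, d):
    """Distance from the lidar origin to the plane ax + by + cz + d = 0:
    l = |d| / sqrt(a^2 + b^2 + c^2)."""
    return abs(d) / np.sqrt(a * a + b * b + c * c)

def yaw_magnitude(a, b, c):
    """Yaw magnitude |90 deg - <n, n_y>|: the angle between the lidar's
    positive Y axis and the fitted wall plane."""
    n = np.array([a, b, c], dtype=float)
    ny = np.array([0.0, 1.0, 0.0])
    ang = np.degrees(np.arccos(np.clip(n @ ny / np.linalg.norm(n), -1.0, 1.0)))
    return abs(90.0 - ang)

# Wall plane x = 2 (a=1, b=0, c=0, d=-2): distance 2 m, zero yaw
l = wall_distance(1.0, 0.0, 0.0, -2.0)
theta = yaw_magnitude(1.0, 0.0, 0.0)
# Wall rotated by 10 degrees about Z: yaw magnitude 10 degrees
theta2 = yaw_magnitude(np.cos(np.radians(10.0)), np.sin(np.radians(10.0)), 0.0)
```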
A lidar-based distance and heading measurement system, which uses a 16-line lidar to measure the distance from a robot to the walls on both sides of a long corridor and the yaw angle of the robot relative to its direction of travel. The system comprises: a rotating pan-tilt platform on the robot, a lidar sensor mounted on the platform, and an embedded computing platform. The embedded computing platform stores the following program modules: a data acquisition module, a point cloud data processing module, and a distance and heading calculation module; the embedded computing platform executes these modules to complete the lidar distance and heading measurement.
The data acquisition module issues commands that, via the rotating platform, keep the lidar sensor's 360° transverse scanning plane horizontal, and controls the lidar sensor to scan and acquire three-dimensional point cloud data of the environment.
The point cloud data processing module extracts the feature information required for measurement from the scanned point cloud; specifically, it applies pass-through filtering, statistical filtering, Euclidean clustering and random sample consensus segmentation to the point cloud and computes the centroid of each clustered point cloud cluster.
The distance and heading calculation module computes the distance from the lidar to the corridor walls and the yaw angle of the robot from the resulting cluster centroids and plane parameters.
The invention has the following benefits and advantages:
1. The invention addresses localization drift or loss of the robot in long corridors with poor lighting and sparse environmental texture features, and supports autonomous navigation in the corridor by providing the distance from the robot body to the walls on both sides and the robot's yaw angle.
2. The invention uses a 16-line lidar sensor as the measuring device, which has strong anti-interference capability and high measurement precision, achieving a ranging precision of 3 cm.
3. The invention mounts the lidar on a rotating pan-tilt platform rather than fixing it, so the scanning angle can be controlled and richer point cloud data obtained.
4. The proposed method is lightweight and saves computing resources.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a schematic diagram of a lidar sensor model;
FIG. 3 is schematic diagram I of the lidar coordinate system relative to the fitted plane;
FIG. 4 is schematic diagram II of the lidar coordinate system relative to the fitted plane;
FIG. 5 is schematic diagram III of the lidar coordinate system relative to the fitted plane;
FIG. 6 is schematic diagram IV of the lidar coordinate system relative to the fitted plane;
FIG. 7(a) is a laser point cloud of the experimental corridor;
FIG. 7(b) is the point cloud after processing by the method of the present invention.
Detailed Description
In order to make the aforementioned objects and features of the present invention more comprehensible, embodiments of the present invention will be described in detail with reference to the accompanying drawings. Unless defined otherwise, technical and scientific terms used herein are to be understood as commonly understood by one of ordinary skill in the art to which this invention belongs.
As shown in the method flow chart of FIG. 1, the lidar-based distance and heading measurement method proceeds as follows:
First, the rotation of the lidar pan-tilt platform is controlled so that, while scanning the environment, the lidar's scanning plane is parallel to the robot body; the lidar sensor model is shown in FIG. 2. The lidar is a Robosense 16-line lidar with a 360° horizontal field of view and a vertical field of view of -15° to +15°. The lidar communicates with the industrial PC over Ethernet, transmitting three-dimensional measurement data via the UDP protocol.
The raw lidar data are parsed with Robosense's ROS package; a workspace for the algorithm is established, ROS nodes are created in the workspace, and the lidar's three-dimensional point cloud data are obtained.
The point cloud read in the above step contains points far from the robot body and a large number of noise points; to facilitate subsequent operations, the point cloud is filtered.
The invention preprocesses the data with two filtering methods. First, pass-through filtering is applied to crop away points far from the robot body, reducing the data volume. The selection is as follows:
Traverse every point of the acquired point cloud and retain, along the X-axis direction, all points whose Y coordinate lies within (-1.0, 1.0) m and whose Z coordinate lies within (-0.3, 1.0) m. This selects the data points around the robot body, allowing more accurate measurement.
After the pass-through filtering, the data are statistically filtered to remove noise-induced outliers as follows:
First, compute the average distance from each point to all K of its neighborhood points, with K = 50; second, compute the mean and sample standard deviation of these average distances over the whole point set, assuming they follow a Gaussian distribution; finally, remove as outliers the points whose average distance lies outside the standard range, yielding point cloud data with the noise points removed.
After these two filtering passes over the lidar point cloud, clean point cloud data are available for the clustering and segmentation operations. First, Euclidean clustering is applied to the statistically filtered data as follows:
For a given three-dimensional point P in space, find the K points nearest to P via a KD-tree neighbor search and cluster those whose distance is below a threshold into a set Q; when the point count of Q no longer grows and meets the specified size requirement, the clustering is complete; otherwise, select a new point P and repeat.
After the Euclidean clustering, random sample consensus segmentation is applied to the point cloud to obtain the inliers satisfying a plane model, further removing outlier data. The operation is as follows:
Create a plane-model random sample consensus object according to the RANSAC (Random Sample Consensus) algorithm, and test all remaining points in the cluster against the plane model obtained by repeated iteration to obtain the cluster points satisfying the model.
A second Euclidean clustering is applied to the obtained inlier data and the centroid of each point cloud cluster is computed as follows:
For each cluster obtained by the clustering, the centroid is computed as:

r_c = (1/n) Σ_{i=1}^{n} r_i

where r_i = (x_i, y_i, z_i), i = 1, 2, …, n are the three-dimensional coordinates of the points in the cluster.
Plane segmentation is applied to the clusters obtained from the Euclidean clustering of the inlier data to obtain the plane parameters, as follows:
A plane model of the point cloud is fitted by the random sample consensus algorithm; the fitted plane model has the form:

ax + by + cz + d = 0

where a, b, c and d are the parameters of the plane equation;
According to the plane model, the normal vector of the plane is:

n = (a, b, c)
and finally, calculating the distance from the laser radar to the wall surfaces on two sides of the corridor and the yaw angle of the robot according to the obtained mass center and plane parameters of the point cloud cluster, wherein the calculation mode is as follows:
the distance from the laser radar to the two side wall surfaces is the distance l from the laser radar coordinate origin to the fitting plane, and the calculation formula is as follows:
Figure BDA0003045806150000083
the yaw angle is the included angle between the positive direction of the Y axis of the laser radar and the fitting plane, namely the direction vector n of the Y axisy=[0,1,0]And the angle from the plane is formed, the range of the yaw angle theta of the robot is considered to be (-90 degrees and 90 degrees) in the process of traveling, and the value of the yaw angle theta is as follows:
θ=|90°-<n,ny>|
wherein the content of the first and second substances,
Figure BDA0003045806150000091
in the course of calculating the yaw angle, there are 4 cases, as shown in fig. 3-6, according to the orientation of the plane and the coordinate system that depend on. In the calculation process, when the Y axis of the laser radar coordinate system is assumed to be positioned on the left side of a dotted line in the figure, the yaw angle is taken to be positive; otherwise, when the Y axis of the radar coordinate system is positioned at the right side of the dotted line in the figure, the heading angle is negative.
Computing the yaw angle requires the normal vector n = (a, b, c) of the reference plane. Because the planes on both sides of the lidar may be fitted simultaneously, the centroid P_C = [p_x, p_y, p_z] of the fitted plane's point cloud cluster is introduced to distinguish the two planes.
When p_x < 0, the yaw angle is computed from the right-hand plane; when p_x > 0, from the left-hand plane. In addition, to determine the sign of the yaw angle, the X-axis direction vector n_x is introduced. Let the angle between the normal vector n and the X-axis direction vector n_x be

α = arccos( (n · n_x) / (|n| |n_x|) )

and the angle between the normal vector n and the Y-axis direction vector n_y be

β = arccos( (n · n_y) / (|n| |n_y|) )

From p_x and the relations of α and β to 90°, the magnitude and sign of the yaw angle θ can be determined.
As shown in FIG. 3, the reference plane lies to the right of the lidar coordinate system, i.e. p_x < 0, and the corresponding plane normal vector is n. Because the direction of the normal vector n cannot be determined in advance, two cases are discussed:
When the normal vector n points to the outside of the plane, i.e. the solid line in FIG. 3, then α > 90° and β < 90°, giving a yaw angle θ = 90° - β; when the normal vector n points to the inside of the plane, i.e. the dashed line in FIG. 3, then α < 90° and β > 90°, giving a yaw angle θ = β - 90°.
Similarly, as shown in FIG. 4, the reference plane lies to the left of the lidar coordinate system, i.e. p_x > 0. When the normal vector n points to the outside of the plane, i.e. the solid line in FIG. 4, then α < 90° and β > 90°, giving a yaw angle θ = β - 90°; when the normal vector n points to the inside of the plane, i.e. the dashed line in FIG. 4, then α > 90° and β < 90°, giving a yaw angle θ = 90° - β.
Similarly, as shown in FIG. 5, the reference plane lies to the right of the lidar coordinate system, i.e. p_x < 0. When the normal vector n points to the outside of the plane, i.e. the solid line in FIG. 5, then α > 90° and β > 90°, giving a yaw angle θ = β - 90°; when the normal vector n points to the inside of the plane, i.e. the dashed line in FIG. 5, then α < 90° and β < 90°, giving a yaw angle θ = 90° - β.
Similarly, as shown in FIG. 6, the reference plane lies to the left of the lidar coordinate system, i.e. p_x > 0. When the normal vector n points to the outside of the plane, i.e. the solid line in FIG. 6, then α < 90° and β < 90°, giving a yaw angle θ = 90° - β; when the normal vector n points to the inside of the plane, i.e. the dashed line in FIG. 6, then α > 90° and β > 90°, giving a yaw angle θ = β - 90°.
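The four-case analysis above can be sketched as follows; note that in every case the yaw magnitude reduces to |90° - β|, with the case split only deciding which of (90° - β) or (β - 90°) is the positive expression. This is an illustrative numpy fragment; the left/right assignment from p_x follows the text, while n_x = [1, 0, 0] is an assumed value for the X-axis direction vector:

```python
import numpy as np

def yaw_from_plane(normal, p_x):
    """Compute the yaw magnitude per the four cases: alpha is the angle
    between the plane normal and the X axis, beta the angle to the Y
    axis; p_x (the cluster centroid's x component) tells whether the
    fitted wall is the right one (p_x < 0) or the left one (p_x > 0)."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    alpha = np.degrees(np.arccos(np.clip(n @ [1.0, 0.0, 0.0], -1, 1)))
    beta = np.degrees(np.arccos(np.clip(n @ [0.0, 1.0, 0.0], -1, 1)))
    # In all four cases of FIGS. 3-6 the positive magnitude is |90 - beta|
    theta = 90.0 - beta if beta < 90.0 else beta - 90.0
    side = 'right' if p_x < 0 else 'left'
    return theta, alpha, side

# Right-hand wall rotated 10 degrees: normal makes beta = 80 deg with Y
n = [np.cos(np.radians(10.0)), np.sin(np.radians(10.0)), 0.0]
theta, alpha, side = yaw_from_plane(n, p_x=-1.2)
```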
Furthermore, the invention also provides a lidar-based distance and heading measurement system.
The system uses a 16-line lidar to measure the distance from a robot to the walls on both sides of a long corridor and the yaw angle of the robot relative to its direction of travel. The system comprises: a rotating pan-tilt platform on the robot, a lidar sensor mounted on the platform, and an embedded computing platform. The embedded computing platform stores the following program modules: a data acquisition module, a point cloud data processing module, and a distance and heading calculation module; the embedded computing platform executes these modules to complete the lidar distance and heading measurement.
The data acquisition module issues commands that, via the rotating platform, keep the lidar sensor's 360° transverse scanning plane horizontal, and controls the lidar sensor to scan and acquire three-dimensional point cloud data of the environment. The point cloud data processing module extracts the feature information required for measurement from the scanned point cloud; specifically, it applies pass-through filtering, statistical filtering, Euclidean clustering and random sample consensus segmentation to the point cloud and computes the centroid of each clustered point cloud cluster. The distance and heading calculation module computes the distance from the lidar to the corridor walls and the yaw angle of the robot from the resulting cluster centroids and plane parameters. The embedded computing platform is a Mohua MIO-5391 single-board computer.
To verify the effectiveness of the method, a Robosense 16-line lidar was used to scan a corridor and obtain three-dimensional point cloud data of the scene; the distances to the two side walls and the robot's yaw angle were returned in real time at a data refresh rate of 30 Hz. An autonomous navigation test of the robot was carried out in the corridor environment using the computed data; the average error of the distance measurement is about 0.026 m, meeting the task requirement. FIGS. 7(a) and 7(b) show the three-dimensional point cloud of the experimental corridor environment and the processed point cloud; it can be seen that the method of the present invention meets the robot's need for the distances to the two side walls and the yaw angle during autonomous navigation in a long corridor, and autonomous navigation in this scenario is achieved with the provided data.
The invention is preferably implemented according to the above example. It is to be understood that any equivalent or obvious modifications made by those skilled in the art in light of the present description are within the scope of the present invention.

Claims (10)

1. A distance and course measuring method based on a laser radar is characterized in that: the method is based on 16-line laser radar to realize the measurement of the distance of the wall surfaces at two sides of the robot in the long corridor environment and the measurement of the yaw angle of the robot in the advancing direction; the method comprises the following steps:
step 1: scanning by a laser radar sensor arranged on a robot body to obtain three-dimensional point cloud data of an environment;
step 2: carrying out data processing on the acquired three-dimensional point cloud data, wherein the specific operations comprise: performing straight-through filtering, statistical filtering, Euclidean clustering and random sampling consistency segmentation on the point cloud, and calculating the mass center of a clustered point cloud cluster;
step 3: calculating the distances from the laser radar to the wall surfaces on both sides of the corridor and the yaw angle of the robot according to the obtained centroids and plane parameters of the point cloud clusters.
2. The laser-radar-based distance and course measuring method according to claim 1, wherein the laser radar sensor is installed on a rotating holder of the robot, and the rotation angle of the holder in the horizontal direction is -90 degrees to 90 degrees; the rotation of the holder is controlled so that the 360-degree transverse scanning plane of the laser radar sensor is horizontal.
3. The laser-radar-based distance and course measuring method according to claim 1, wherein extracting the features required for measurement from the scanned three-dimensional point cloud data comprises the following steps:
step 2.1: performing a pass-through filtering operation on the spatial three-dimensional points scanned by the laser radar according to their distances, to obtain the data within the candidate region of the laser radar;
step 2.2: performing statistical filtering on the pass-through-filtered point cloud data to remove outliers caused by noise;
step 2.3: performing Euclidean clustering on the statistically filtered data, and performing random sample consensus (RANSAC) segmentation on each point cloud cluster obtained by clustering, to obtain the inlier points of a fitted plane;
step 2.4: performing a second Euclidean clustering on the inlier point data obtained in step 2.3, and calculating the centroid of each clustered point cloud cluster;
step 2.5: performing plane segmentation on the point cloud clusters obtained by the Euclidean clustering of step 2.4, to obtain the parameter information of the planes.
4. The laser-radar-based distance and course measuring method according to claim 3, wherein the pass-through filtering operation according to the distances of the spatial three-dimensional points scanned by the laser radar comprises:
traversing each point of the acquired three-dimensional point cloud, and discarding the points outside the set candidate-region coordinate threshold ranges, so as to reduce the data volume.
5. The laser-radar-based distance and course measuring method according to claim 4, wherein the candidate-region coordinate threshold ranges are: data with the Y-axis coordinate within (-1.0, 1.0) meters and the Z-axis coordinate within (-0.3, 1.0) meters are retained; the X axis is unconstrained.
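The pass-through filter of claims 4 and 5 can be sketched with NumPy as below. This is a minimal illustration, not the PCL `PassThrough` filter that would typically be used in practice; the function name is illustrative, and the truncated claim is read here as leaving the X axis unconstrained.

```python
import numpy as np

def passthrough_filter(points, y_range=(-1.0, 1.0), z_range=(-0.3, 1.0)):
    """Keep points whose Y and Z coordinates fall inside the candidate
    region of claim 5; the X axis (the corridor direction) is read here
    as unconstrained. `points` is an (N, 3) array of (x, y, z) in metres."""
    y, z = points[:, 1], points[:, 2]
    keep = (y > y_range[0]) & (y < y_range[1]) & \
           (z > z_range[0]) & (z < z_range[1])
    return points[keep]

# Three points: only the first lies inside the candidate region.
cloud = np.array([[2.0, 0.5, 0.2],
                  [2.0, 1.5, 0.2],    # Y outside (-1.0, 1.0)
                  [2.0, 0.5, -0.5]])  # Z outside (-0.3, 1.0)
inside = passthrough_filter(cloud)
```

Pass-through filtering is the cheapest step of the pipeline (one boolean mask), which is why it is applied first to cut the data volume before the more expensive filters.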
6. The laser-radar-based distance and course measuring method according to claim 3, wherein performing statistical filtering on the pass-through-filtered point cloud data to remove outliers caused by noise comprises the following process:
firstly, calculating the average distance from each point to its K nearest neighbors, where K is 50; secondly, calculating the mean and the sample standard deviation of these average distances over the whole point set, assuming they obey a Gaussian distribution; finally, removing as outliers the points whose average distance falls outside the standard range, so as to obtain the point cloud data with noise points removed.
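The statistical filtering described above can be sketched as follows. This is a brute-force NumPy illustration of the same idea as PCL's `StatisticalOutlierRemoval`, not the implementation the patent uses; the function name and the one-standard-deviation cutoff are assumptions.

```python
import numpy as np

def statistical_outlier_removal(points, k=50, std_mul=1.0):
    """Drop points whose mean distance to their k nearest neighbours
    deviates from the global mean of those distances by more than
    std_mul sample standard deviations (the Gaussian assumption of the
    claim). Brute-force O(N^2) distances, acceptable for a sketch."""
    k = min(k, len(points) - 1)
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    dists.sort(axis=1)                       # column 0 is the self-distance 0
    mean_knn = dists[:, 1:k + 1].mean(axis=1)
    mu, sigma = mean_knn.mean(), mean_knn.std(ddof=1)
    return points[np.abs(mean_knn - mu) <= std_mul * sigma]

# A tight 20-point grid plus one far outlier: only the outlier is removed.
grid = np.array([[0.01 * i, 0.01 * j, 0.0] for i in range(4) for j in range(5)])
pts = np.vstack([grid, [[10.0, 10.0, 10.0]]])
filtered = statistical_outlier_removal(pts, k=50)
```

A production version would use a KD-tree for the neighbour search rather than the full pairwise distance matrix.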
7. The laser-radar-based distance and course measuring method according to claim 3, wherein performing Euclidean clustering on the statistically filtered data and performing random sample consensus segmentation on the clustered point cloud clusters to obtain the inlier points of a fitted plane comprises the following steps:
for a three-dimensional point P in space, the K points nearest to P are obtained by a KDTree neighbor search, and the points among them whose distance is smaller than a threshold are gathered into a set Q; when the number of points in Q no longer increases and meets the specified size requirement, the clustering process is complete; otherwise, a new point P is selected and the operation repeated;
on each clustered point cloud cluster thus obtained, a random sample consensus object of the plane model is created according to the RANSAC algorithm, and all the other points in the cluster are tested against the plane model obtained through multiple iterations, to obtain the points in the cluster that satisfy the model.
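The Euclidean clustering step can be sketched as a region-growing loop. This is a brute-force NumPy stand-in for the KDTree neighbour search of the claim (comparable to PCL's `EuclideanClusterExtraction`); the function name and parameters are illustrative.

```python
import numpy as np

def euclidean_cluster(points, tol=0.1, min_size=5):
    """Grow clusters by repeated neighbour expansion: a point joins a
    cluster when it lies within `tol` of any point already in it.
    Brute-force distances stand in for the KDTree search of the claim."""
    visited = np.zeros(len(points), dtype=bool)
    clusters = []
    for seed in range(len(points)):
        if visited[seed]:
            continue
        visited[seed] = True
        queue, cluster = [seed], [seed]
        while queue:
            p = queue.pop()
            near = np.linalg.norm(points - points[p], axis=1) < tol
            for q in np.flatnonzero(near & ~visited):
                visited[q] = True
                queue.append(q)
                cluster.append(q)
        if len(cluster) >= min_size:   # the "specified number requirement"
            clusters.append(points[sorted(cluster)])
    return clusters

# Two well-separated strings of points -> two clusters of six points each.
a = np.array([[0.05 * i, 0.0, 0.0] for i in range(6)])
b = a + [5.0, 0.0, 0.0]
clusters = euclidean_cluster(np.vstack([a, b]), tol=0.1, min_size=3)
```

In a corridor, the two walls naturally fall out as two such clusters, one on each side of the robot.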
8. The laser-radar-based distance and course measuring method according to claim 3, wherein performing a further Euclidean clustering on the obtained inlier point data, calculating the centroid of each clustered point cloud cluster, and performing plane segmentation on the clusters to obtain the parameter information of the planes comprises the following specific process:
calculating the centroid of each point cloud cluster obtained by the Euclidean clustering, in the following manner:
r_c = (1/n) ∑_{i=1}^{n} r_i
wherein r_i = (x_i, y_i, z_i), i = 1, …, n, is the three-dimensional coordinate of each point in the point cloud cluster;
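The centroid formula above is simply the arithmetic mean of the cluster's points, e.g.:

```python
import numpy as np

# Centroid of a point cloud cluster: r_c = (1/n) * sum(r_i).
cluster = np.array([[0.0, 0.0, 0.0],
                    [2.0, 0.0, 0.0],
                    [1.0, 3.0, 0.0]])
centroid = cluster.mean(axis=0)
```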
fitting a plane model of the point cloud through the random sample consensus algorithm, the parameters of the finally fitted plane model having the following form:
ax+by+cz+d=0
according to the plane model, the normal vector of the fitted plane is:
n = (a, b, c)
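The RANSAC plane fit of claims 7 and 8 can be sketched as below: sample three points, form the plane through them, and keep the candidate with the most inliers. This is a minimal illustration (comparable to PCL's `SACSegmentation` with a plane model), not the patent's implementation; iteration count and inlier tolerance are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

def ransac_plane(points, iters=200, tol=0.02):
    """Fit ax + by + cz + d = 0 by repeatedly sampling three points and
    keeping the candidate plane with the most inliers (points closer
    than `tol` to the plane). Returns (a, b, c, d) with a unit normal."""
    best_model, best_inliers = None, None
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                 # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal.dot(p0)             # unit normal, so |d| is origin distance
        inliers = np.abs(points @ normal + d) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_model, best_inliers = (*normal, d), inliers
    return best_model, best_inliers

# 200 noisy samples of the plane y = 1, i.e. 0x + 1y + 0z - 1 = 0.
pts = np.column_stack([rng.uniform(0, 5, 200),
                       1.0 + rng.normal(0, 0.005, 200),
                       rng.uniform(0, 1, 200)])
(a, b, c, d), inliers = ransac_plane(pts)
```

With a unit normal, the fitted (a, b, c) is directly the plane normal n of the claim, and |d| is the distance used in claim 9.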
9. The laser-radar-based distance and course measuring method according to claim 1, wherein calculating the distances from the laser radar to the wall surfaces on both sides of the corridor and the yaw angle of the robot according to the obtained centroids and plane parameters of the point cloud clusters comprises the following steps:
a. the fitted planes are assumed to be the wall surfaces on both sides; the distance from the laser radar to each wall surface of the corridor is the distance from the laser radar coordinate origin to the fitted plane, calculated as:
D = |d| / √(a² + b² + c²)
b. the yaw angle is the included angle between the positive direction of the Y axis of the laser radar and the fitted plane, i.e. the included angle between the Y-axis direction vector n_y = [0, 1, 0] and the fitted plane; during the traveling of the robot, the yaw angle is considered to lie within (-90°, 90°), and its value is:
θ = |90° − <n, n_y>|
<n, n_y> = arccos( n · n_y / (|n| · |n_y|) )
the sign of the yaw angle is determined by a case analysis of whether the plane of the extracted normal vector lies in the positive or the negative X direction of the laser radar.
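The two formulas of claim 9 can be combined into one small helper. A minimal sketch, assuming the wall normals point along ±X as in the sign rule above; the function name is illustrative.

```python
import numpy as np

def distance_and_yaw(a, b, c, d):
    """Distance from the lidar origin to the plane ax + by + cz + d = 0,
    and the yaw angle theta = |90 deg - <n, n_y>| with n_y = [0, 1, 0],
    following the formulas of claim 9."""
    n = np.array([a, b, c], dtype=float)
    dist = abs(d) / np.linalg.norm(n)                # D = |d| / sqrt(a^2+b^2+c^2)
    cos_angle = n[1] / np.linalg.norm(n)             # n . n_y / (|n| |n_y|)
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return dist, abs(90.0 - angle)

# Wall plane x = 2 (normal along X): the lidar is 2 m from the wall and
# the robot travels parallel to it, so the yaw angle is 0.
dist, yaw = distance_and_yaw(1.0, 0.0, 0.0, -2.0)

# A wall whose normal is rotated 10 deg toward Y yields a 10 deg yaw.
_, yaw10 = distance_and_yaw(np.cos(np.radians(10.0)),
                            np.sin(np.radians(10.0)), 0.0, -2.0)
```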
10. A distance and course measuring system based on a laser radar, characterized in that the system uses a 16-line laser radar to measure the distances from the robot to the wall surfaces on both sides in a long corridor environment and the yaw angle of the robot in the traveling direction; the system comprises: a rotating holder of the robot, a laser radar sensor mounted on the rotating holder, and an embedded computing platform; the embedded computing platform stores the following program modules: a data acquisition module, a point cloud data processing module, and a distance and course calculation module; the embedded computing platform executes these program modules to complete the laser radar distance and course measurement;
the data acquisition module outputs instructions to control, through the rotating holder, the 360-degree transverse scanning plane of the laser radar sensor to be horizontal, and controls the laser radar sensor to scan and obtain three-dimensional point cloud data of the environment;
the point cloud data processing module extracts the feature information required for measurement from the scanned three-dimensional point cloud data, the specific operations comprising: performing pass-through filtering, statistical filtering, Euclidean clustering and random sample consensus (RANSAC) segmentation on the obtained three-dimensional point cloud, and calculating the centroid of each clustered point cloud cluster;
the distance and course calculation module calculates the distances from the laser radar to the wall surfaces on both sides of the corridor and the yaw angle of the robot according to the obtained centroids and plane parameters of the point cloud clusters.
CN202110471992.6A 2021-04-29 2021-04-29 Distance and course measuring method based on laser radar Pending CN113325428A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110471992.6A CN113325428A (en) 2021-04-29 2021-04-29 Distance and course measuring method based on laser radar


Publications (1)

Publication Number Publication Date
CN113325428A true CN113325428A (en) 2021-08-31

Family

ID=77413935

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110471992.6A Pending CN113325428A (en) 2021-04-29 2021-04-29 Distance and course measuring method based on laser radar

Country Status (1)

Country Link
CN (1) CN113325428A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113920134A (en) * 2021-09-27 2022-01-11 山东大学 Slope ground point cloud segmentation method and system based on multi-line laser radar
CN114494301A (en) * 2022-02-14 2022-05-13 北京智弘通达科技有限公司 Railway scene point cloud segmentation method based on airborne radar point cloud
CN116447977A (en) * 2023-06-16 2023-07-18 北京航天计量测试技术研究所 Round hole feature measurement and parameter extraction method based on laser radar

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014132509A1 (en) * 2013-02-27 2014-09-04 シャープ株式会社 Surrounding environment recognition device, autonomous mobile system using same, and surrounding environment recognition method
WO2015017691A1 (en) * 2013-08-02 2015-02-05 Irobot Corporation Time-dependent navigation of telepresence robots
US20180299557A1 (en) * 2017-04-17 2018-10-18 Baidu Online Network Technology (Beijing) Co., Ltd Method and apparatus for updating maps
US20190033459A1 (en) * 2017-07-25 2019-01-31 Waymo Llc Determining Yaw Error from Map Data, Lasers, and Cameras
CN110142805A (en) * 2019-05-22 2019-08-20 武汉爱速达机器人科技有限公司 A kind of robot end's calibration method based on laser radar
CN110700839A (en) * 2019-10-21 2020-01-17 北京易联创安科技发展有限公司 Heading machine pose measuring device based on laser scanner and measuring method thereof
CN110780305A (en) * 2019-10-18 2020-02-11 华南理工大学 Track cone bucket detection and target point tracking method based on multi-line laser radar


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SHIFEI LIU, TASHFEEN KARAMAT, ET AL: "A Dual-Rate Multi-filter Algorithm for LiDAR- Aided Indoor Navigation Systems", IEEE XPLORE, vol. 978, pages 1014 - 1019, XP032616499, DOI: 10.1109/PLANS.2014.6851467 *
范晶晶;王力;褚文博;罗禹贡;: "基于KDTree树和欧式聚类的越野环境下行人识别的研究", 汽车工程, no. 12, pages 16 - 20 *


Similar Documents

Publication Publication Date Title
CN113325428A (en) Distance and course measuring method based on laser radar
Zou et al. A comparative analysis of LiDAR SLAM-based indoor navigation for autonomous vehicles
CN111337941B (en) Dynamic obstacle tracking method based on sparse laser radar data
CN110780305B (en) Track cone detection and target point tracking method based on multi-line laser radar
Huang Review on LiDAR-based SLAM techniques
CN113534844B (en) Method and device for inspecting transmission line of rotorcraft in unknown environment
CN113516108B (en) Construction site dust suppression data matching processing method based on data identification
CN114581681A (en) Fuselage profile analysis method for aircraft complete machine measurement data
CN116630403A (en) Lightweight semantic map construction method and system for mowing robot
CN113820682B (en) Millimeter wave radar-based target detection method and device
Wu et al. Autonomous UAV landing system based on visual navigation
CN116630411B (en) Mining electric shovel material surface identification method, device and system based on fusion perception
CN115797490B (en) Graph construction method and system based on laser vision fusion
CN115971004A (en) Intelligent putty spraying method and system for carriage
CN115686073A (en) Unmanned aerial vehicle-based power transmission line inspection control method and system
CN114862908A (en) Dynamic target tracking method and system based on depth camera
Zhang et al. Hybrid iteration and optimization-based three-dimensional reconstruction for space non-cooperative targets with monocular vision and sparse lidar fusion
CN113554705B (en) Laser radar robust positioning method under changing scene
Jia et al. A Mobile Robot Mapping Method Integrating Lidar and Depth Camera
CN115307646A (en) Multi-sensor fusion robot positioning method, system and device
Zhao et al. Design of 3D reconstruction system on quadrotor Fusing LiDAR and camera
Ma et al. Robust visual-inertial odometry with point and line features for blade inspection UAV
CN111239761B (en) Method for indoor real-time establishment of two-dimensional map
CN114217641A (en) Unmanned aerial vehicle power transmission and transformation equipment inspection method and system in non-structural environment
CN112911535A (en) Underwater AUV path forming method based on Dirichlet vertex

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination