CN114663850A - Obstacle detection method and device, rail vehicle and storage medium - Google Patents

Obstacle detection method and device, rail vehicle and storage medium

Info

Publication number
CN114663850A
Authority
CN
China
Prior art keywords
point cloud
target
map
bounding
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011529056.8A
Other languages
Chinese (zh)
Inventor
沈剑罡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BYD Co Ltd
Original Assignee
BYD Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BYD Co Ltd filed Critical BYD Co Ltd
Priority to CN202011529056.8A priority Critical patent/CN114663850A/en
Publication of CN114663850A publication Critical patent/CN114663850A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses an obstacle detection method and device, a rail vehicle, and a storage medium. The method is applied to a rail vehicle carrying a lidar and comprises the following steps: acquiring a first current frame point cloud collected by the lidar; selecting, from an offline map, a target frame offline point cloud matching the first current frame point cloud; converting the first current frame point cloud into the map coordinate system of the offline map according to the first current frame point cloud and the target frame offline point cloud, to obtain a second current frame point cloud and the current position of the rail vehicle in the offline map; selecting, from the offline map according to the current position, the bounding node coordinates of at least one target bounding region within a preset range, where a target bounding region is any bounding region within the preset range of the current position along the direction of travel of the track; and determining the distance between an obstacle within the preset range and the rail vehicle according to the second current frame point cloud and the bounding node coordinates of each target bounding region.

Description

Obstacle detection method, obstacle detection device, rail vehicle and storage medium
Technical Field
The present disclosure relates to the field of vehicle technologies, and more particularly, to an obstacle detection method, an obstacle detection apparatus, a rail vehicle, and a computer-readable storage medium.
Background
During the running of a rail vehicle, foreign objects (e.g., structures along the rail, broken stones, pedestrians) may intrude into the rail vehicle's clearance gauge due to natural or human factors, which seriously jeopardizes the safety and stability of operation. In a traditional manual driving scenario, the driver relies mainly on visual observation to judge whether an obstacle lies within the clearance ahead of the rail vehicle and, if so, how far it is from the vehicle. In more intelligent driverless scenarios, a series of active detection devices (such as cameras) are mainly used to detect whether an obstacle exists within the clearance ahead of the vehicle and its distance from the rail vehicle.
However, because the alignment of the track lines on which rail vehicles run is geometrically complex, neither manual observation by a driver nor active detection equipment can accurately judge the distance between the rail vehicle and an obstacle within the clearance.
Disclosure of Invention
It is an object of the present application to provide a new solution for obstacle detection.
According to a first aspect of the present application, there is provided an obstacle detection method applied to a railway vehicle mounted with a laser radar, comprising:
acquiring a first current frame point cloud collected by the laser radar;
selecting a target frame off-line point cloud matched with the first current frame point cloud from an off-line map;
converting the first current frame point cloud into a map coordinate system of the off-line map according to the first current frame point cloud and the target frame off-line point cloud to obtain a second current frame point cloud and a current position of the rail vehicle corresponding to the off-line map;
selecting, from the off-line map according to the current position, bounding node coordinates of at least one target bounding region located within a preset range, wherein a target bounding region is any bounding region within the preset range of the current position along the direction of travel of the track;
and determining the distance between an obstacle located within the preset range and the rail vehicle according to the second current frame point cloud and the bounding node coordinates of each target bounding region.
Optionally, the offline map is generated through the following steps:
acquiring multi-frame laser point cloud collected by the laser radar along a track line;
determining a positioning map according to each frame of laser point cloud, wherein the positioning map comprises a frame of off-line point cloud corresponding to each frame of laser point cloud, and a translation distance and an Euler angle of the corresponding laser radar coordinate system relative to the map coordinate system of the positioning map, a frame of off-line point cloud being the corresponding frame of laser point cloud converted into the map coordinate system;
determining a base space and a coordinate origin of the corresponding frame of laser point cloud under the map coordinate system according to each group of translation distance and Euler angle;
determining the bounding node coordinates of each bounding region in the map coordinate system according to the base space and coordinate origin of each frame of laser point cloud in the map coordinate system and the distance, in the radar coordinate system, between the coordinate origin of the corresponding frame of laser point cloud and the corresponding bounding region;
and generating an offline map according to the boundary node coordinates of each boundary area and the corresponding frame of offline point cloud.
Optionally, each bounding region is a cuboid, and the bounding node coordinates of each bounding region in the map coordinate system are represented by the eight vertex coordinates of the corresponding cuboid;
after determining the bounding node coordinates of each bounding region in the map coordinate system according to the base space and coordinate origin of each frame of laser point cloud in the map coordinate system and the distance, in the radar coordinate system, between the coordinate origin of the corresponding frame of laser point cloud and the corresponding bounding region, the method further comprises:
updating the bounding node coordinates of each bounding region in the map coordinate system to the coordinates of four target vertices of the cuboid, the four target vertices being the vertex shared by three edges of the cuboid together with the vertices at the other ends of those three edges.
Optionally, each bounding region is a cuboid, and after selecting, according to the current position, the bounding node coordinates of at least one target bounding region located within the preset range from the offline map, the method further includes:
calculating the included angle between two edges at the same position in a first bounding region and a second bounding region, the first bounding region and the second bounding region being two adjacent bounding regions of the at least one target bounding region;
merging the first bounding region and the second bounding region into a third bounding region, and updating them to the third bounding region, if the difference between the included angle and a straight angle is less than a preset angle;
determining the bounding node coordinates of the third bounding region according to the bounding node coordinates of the first bounding region and the bounding node coordinates of the second bounding region.
Optionally, the determining, according to the second current frame point cloud and the bounding node coordinates of each target bounding region, the distance between an obstacle located within the preset range and the rail vehicle includes:
determining the points located in each target bounding region according to the second current frame point cloud and the bounding node coordinates of each target bounding region;
clustering the points in each target bounding region to obtain an obstacle point cloud within the preset range;
and determining the distance between the obstacle within the preset range and the rail vehicle according to the obstacle point cloud.
Optionally, the determining, according to the second current frame point cloud, the points located in each target bounding region includes:
for each target bounding region, determining the maximum value and the minimum value of the target bounding region on the three coordinate axes of the map coordinate system from the bounding node coordinates of the target bounding region;
and performing pass-through filtering on the second current frame point cloud according to the maximum value and the minimum value corresponding to each target bounding region, to obtain the points located in each target bounding region in the second current frame point cloud.
Optionally, the performing, according to the maximum value and the minimum value corresponding to each target bounding region, pass-through filtering on the second current frame point cloud to obtain the points located in each target bounding region in the second current frame point cloud includes:
performing pass-through filtering on the second current frame point cloud according to the maximum value and the minimum value corresponding to each target bounding region, to obtain a coarse point cloud located in each target bounding region in the second current frame point cloud;
for each coarse point in the coarse point cloud located within each target bounding region, determining whether the coarse point is located on opposite sides of a set of parallel faces of the corresponding target bounding region;
and if so, determining that the coarse point is located in the corresponding target bounding region.
According to a second aspect of the present application, there is provided an obstacle detecting apparatus comprising a lidar, a memory, and a processor, wherein:
a lidar to collect a point cloud;
the memory is to store computer instructions;
the processor is configured to invoke the computer instructions from the memory to perform the method of any of the first aspects.
According to a third aspect of the present application, there is provided a rail vehicle comprising an obstacle detecting device according to the second aspect.
According to a fourth aspect of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any one of the first aspects.
In the embodiment of the application, a first current frame point cloud collected by the laser radar is acquired, and a target frame offline point cloud matching it is selected from an offline map. According to the first current frame point cloud and the target frame offline point cloud, the first current frame point cloud can then be converted into the map coordinate system of the offline map, yielding a second current frame point cloud and the current position of the rail vehicle in the offline map. On this basis, the bounding node coordinates of at least one target bounding region within a preset range are selected from the offline map according to the current position, after which the points of the second current frame point cloud lying within the target bounding regions can be determined from the second current frame point cloud and the bounding node coordinates of each target bounding region. The points so determined are the point cloud of the obstacle within the preset range, from which the distance between the obstacle and the rail vehicle can be determined. Because the offline map includes the bounding node coordinates of the bounding regions of the actual track on which the rail vehicle runs, i.e., prior track clearance information is available, the embodiment of the present application can accurately obtain the distance between an obstacle within the preset range and the rail vehicle.
Further features of the present application and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which is to be read in connection with the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic flow chart of an obstacle detection method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a positional relationship between a rough point and a cuboid corresponding to a target bounding region according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a first bounded region and a second bounded region provided by an embodiment of the present application;
fig. 4 is a schematic structural diagram of an obstacle detection device according to an embodiment of the present application.
Detailed Description
Various exemplary embodiments of the present application will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present application unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the application, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< method examples >
The embodiment of the application provides an obstacle detection method, which is applied to a rail vehicle carrying a laser radar; for example, the lidar may be arranged on the roof of the rail vehicle. The rail vehicle may be a train, a high-speed railway train, a subway train, or the like.
As shown in fig. 1, the obstacle detection method provided in the embodiment of the present application includes the following steps S1100 to S1500:
s1100, acquiring a first current frame point cloud acquired by the laser radar.
In the embodiment of the application, the frame of point cloud acquired by the laser radar at the current acquisition time is recorded as the first current frame point cloud. The first current frame point cloud is a frame of point cloud in the lidar coordinate system; it contains a plurality of points, each carrying a coordinate vector in the lidar coordinate system (usually expressed as X, Y, Z three-dimensional coordinates) and possibly an RGB color, gray value, depth, segmentation result, and the like.
It is understood that the first current frame point cloud can be used to describe the environment around the lidar, i.e., the environment in which the rail vehicle is currently located.
S1200, selecting a target frame off-line point cloud matched with the first current frame point cloud from the off-line map.
In the embodiment of the application, the offline map is built offline in advance; it contains, in its map coordinate system, the bounding node coordinates of the bounding regions of the actual track on which the rail vehicle runs, together with one frame of offline point cloud corresponding to each bounding region. It should be noted that, in the embodiments of the present application, the entire continuous bounding region of the track on which the rail vehicle runs is divided into a plurality of overlapping or non-overlapping bounding regions.
Here, the bound (clearance gauge) is the contour line, set to ensure transport safety, that must not be crossed between buildings and equipment along a rail line and the rail vehicles running on it.
In this embodiment of the application, S1200 may be implemented as follows: match the first current frame point cloud against each offline frame point cloud in the offline map using a point cloud matching algorithm, and record the offline frame point cloud that best matches the first current frame point cloud as the target frame offline point cloud. The point cloud matching algorithm may be an NDT algorithm and/or an ICP algorithm. When both are used, the matching may specifically be: first perform probability-based coarse point cloud matching with the NDT algorithm, then perform distance-based fine point cloud matching with the ICP algorithm, and record the offline frame point cloud obtained by the fine matching as the target frame offline point cloud.
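For illustration only, a minimal sketch of the fine-matching stage using the open-source Open3D library follows; the patent names NDT and ICP but prescribes no library, so Open3D, the correspondence threshold, and the function names here are assumptions.

```python
import numpy as np
import open3d as o3d

def fine_match(current_frame: o3d.geometry.PointCloud,
               offline_frame: o3d.geometry.PointCloud,
               init_guess: np.ndarray):
    """Refine the current frame's pose against one offline frame with ICP.

    init_guess is the 4x4 transform from the coarse (e.g. NDT) stage; the
    returned result carries the refined 4x4 transform and a fitness score
    that can be compared across offline frames to pick the best match.
    """
    return o3d.pipelines.registration.registration_icp(
        current_frame, offline_frame,
        max_correspondence_distance=0.5,  # assumed threshold, in metres
        init=init_guess,
        estimation_method=o3d.pipelines.registration
            .TransformationEstimationPointToPoint())
```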
S1300, converting the first current frame point cloud into a map coordinate system of an offline map according to the first current frame point cloud and the target frame offline point cloud to obtain a second current frame point cloud and the current position of the rail vehicle corresponding to the offline map.
In this embodiment of the present application, S1300 is specifically implemented as follows: according to the first current frame point cloud and the target frame offline point cloud, determine the rotation matrix R and translation matrix T that map the lidar coordinate system of the first current frame point cloud to the map coordinate system of the target frame offline point cloud. Then, convert each point of the first current frame point cloud into the map coordinate system of the offline map using the rotation matrix R and the translation matrix T, so as to obtain the second current frame point cloud. That is, the second current frame point cloud is the representation of the first current frame point cloud in the map coordinate system. Meanwhile, the current position of the rail vehicle in the offline map is represented by the target offline frame point cloud, because each offline frame point cloud corresponds to a position of the rail vehicle in the offline map.
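As an illustration, a minimal numpy sketch of this conversion follows; the array shapes and names are assumptions, not part of the patent.

```python
import numpy as np

def to_map_frame(points_lidar: np.ndarray,      # (N, 3) points, lidar frame
                 R: np.ndarray,                 # (3, 3) rotation matrix
                 T: np.ndarray) -> np.ndarray:  # (3,) translation vector
    """Apply p_map = R @ p_lidar + T to every point, vectorised."""
    return points_lidar @ R.T + T
```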
S1400, selecting, from the offline map according to the current position, the bounding node coordinates of at least one target bounding region located within a preset range.
A target bounding region is any bounding region within the preset range of the current position along the direction of travel of the track.
In the embodiment of the present application, the preset range may be set according to actual requirements, and in one example, the preset range may be 100 m.
In one embodiment of the application, when the offline frame point clouds are collected, the distance along the track between the vehicle position corresponding to a first offline frame point cloud and the vehicle position corresponding to a second offline frame point cloud is determined, where the first and second offline frame point clouds are two adjacent frames. The quotient of the preset range and this distance is then calculated, and that many bounding regions ahead of the offline point cloud frame matching the current position, along the direction of travel of the track, are taken as the target bounding regions.
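A hedged reading of this step as a sketch: the count of target bounding regions is the preset range divided by the spacing of consecutive offline frames along the track. The names and the use of rounding are assumptions.

```python
import math

def num_target_regions(preset_range_m: float, frame_spacing_m: float) -> int:
    # e.g. a 100 m preset range with offline frames every 10 m -> 10 regions
    return math.ceil(preset_range_m / frame_spacing_m)
```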
S1500, determining the distance between an obstacle within the preset range and the rail vehicle according to the second current frame point cloud and the bounding node coordinates of each target bounding region.
In this embodiment of the present application, S1500 is specifically implemented as follows: for each point in the second current frame point cloud, determine whether the point lies within the region represented by the bounding node coordinates of any target bounding region. If so, the point is determined to be a point of an obstacle, and the object composed of all such points is determined to be the obstacle located within the preset range.
In one embodiment, each point in the second current frame point cloud carries a coordinate vector in the map coordinate system, so the distance between the rail vehicle and each determined obstacle point can be computed from that vector and taken as the distance between the obstacle within the preset range and the rail vehicle.
In another embodiment, when an obstacle exists within the preset range, it usually corresponds to multiple points in the second current frame point cloud, so the determined obstacle points can be clustered, and the distance between each cluster's center and the rail vehicle taken as the distance between the corresponding obstacle and the rail vehicle. On this basis, the above S1500 can be implemented by the following S1510-S1530:
S1510, determining the points located in each target bounding region according to the second current frame point cloud and the bounding node coordinates of each target bounding region.
S1520, clustering the points in each target bounding region to obtain the obstacle point clouds within the preset range.
S1530, determining the distance between each obstacle within the preset range and the rail vehicle according to the obstacle point clouds.
In the embodiment of the application, the points of the second current frame point cloud located within each target bounding region are determined according to the second current frame point cloud and the bounding node coordinates of each target bounding region. The points within any one target bounding region are then clustered with a point cloud segmentation and clustering algorithm to obtain at least one class; the points of one class correspond to one obstacle point cloud. The distance between each obstacle within the preset range and the rail vehicle can then be determined from the center point of the corresponding obstacle point cloud.
The point cloud segmentation and clustering algorithm may be one based on Euclidean distance: a distance threshold (e.g., 1 meter) is set, and all points whose distance to a neighboring point is less than or equal to the threshold are assigned to the same class.
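A minimal sketch of such Euclidean clustering follows, using scipy's cKDTree and a breadth-first flood fill; the patent does not prescribe an implementation, so the library and names are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def euclidean_cluster(points: np.ndarray, threshold: float = 1.0):
    """Group (N, 3) points so neighbors within `threshold` share a cluster."""
    tree = cKDTree(points)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        frontier, cluster = [seed], [seed]
        while frontier:
            idx = frontier.pop()
            for nb in tree.query_ball_point(points[idx], threshold):
                if nb in unvisited:
                    unvisited.remove(nb)
                    frontier.append(nb)
                    cluster.append(nb)
        clusters.append(np.asarray(cluster))
    return clusters

def obstacle_distances(points, clusters, vehicle_pos):
    """Distance from the vehicle to the center point of each cluster."""
    return [np.linalg.norm(points[c].mean(axis=0) - vehicle_pos)
            for c in clusters]
```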
In an embodiment of the present application, the above S1510 may be implemented by the following S1511 and S1512:
S1511, for each target bounding region, determining the maximum value and the minimum value of the target bounding region on the three coordinate axes of the map coordinate system according to the bounding node coordinates of the target bounding region.
S1512, performing pass-through filtering on the second current frame point cloud according to the maximum value and the minimum value corresponding to each target bounding region, to obtain the points located in each target bounding region in the second current frame point cloud.
In the embodiment of the present application, the specific process of pass-through filtering is as follows: for each point in the second current frame point cloud, if its x value lies between the maximum and minimum x values of the target bounding region in the map coordinate system, its y value lies between the maximum and minimum y values, and its z value lies between the maximum and minimum z values, the point lies within the target bounding region; otherwise, the point lies outside it.
It should be noted that the x, y, and z values of each point in the second current frame point cloud are given by its coordinate vector (usually expressed as X, Y, Z three-dimensional coordinates) in the map coordinate system.
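A numpy sketch of this filter (names illustrative): a point is kept only when all three of its coordinates fall inside the region's per-axis bounds.

```python
import numpy as np

def pass_through(points: np.ndarray,
                 bounds_min: np.ndarray,
                 bounds_max: np.ndarray) -> np.ndarray:
    """points: (N, 3); bounds_min / bounds_max: (3,) per-axis limits."""
    mask = np.all((points >= bounds_min) & (points <= bounds_max), axis=1)
    return points[mask]
```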
In an embodiment of the present application, the step S1512 may be specifically implemented by the following steps S1512-1, S1512-2, and S1512-3:
S1512-1, performing pass-through filtering on the second current frame point cloud according to the maximum value and the minimum value corresponding to each target bounding region, to obtain the coarse point cloud located in each target bounding region in the second current frame point cloud.
In this embodiment of the application, S1512-1 is implemented similarly to S1512 and is not described again here.
S1512-2, for each coarse point in the coarse point cloud located within each target bounding region, determining whether the coarse point is located on opposite sides of a set of parallel faces of the corresponding target bounding region.
S1512-3, if so, determining that the coarse point is located in the corresponding target bounding region.
In the embodiment of the present application, as exemplarily shown in fig. 2, when the target bounding region is represented by a cuboid, a coarse point located on the same side of a pair of parallel faces of the target bounding region lies outside the cuboid, i.e., outside the target bounding region. Correspondingly, a coarse point located on opposite sides of (i.e., between) the pair of parallel faces lies inside the cuboid, i.e., within the target bounding region.
Here, "same side" means that, for a pair of parallel faces, the coarse point lies entirely to the left or entirely to the right of both faces.
Taking the left and right parallel faces of the cuboid corresponding to the target bounding region in fig. 2 as an example, the normal vector of the pair of faces is AB. For a coarse point P′ on the same side of the pair of faces, the included angle between vector AP′ and vector AB and the included angle between vector BP′ and vector AB are necessarily of the same kind (both acute or both obtuse). For a coarse point P between the pair of faces, of the included angle between vector AP and vector AB and the included angle between vector BP and vector AB, one is necessarily acute and the other obtuse. On this basis, it can be judged whether a coarse point is located on opposite sides of a pair of parallel faces of the corresponding target bounding region.
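A numpy sketch of this test, with A and B the feet of the common normal on the two faces (the names are assumptions); applied to all three face pairs of the cuboid, it confirms that a coarse point lies inside.

```python
import numpy as np

def between_parallel_faces(P: np.ndarray, A: np.ndarray, B: np.ndarray) -> bool:
    """True when P lies strictly between the parallel faces through A and B."""
    AB = B - A                 # common normal of the face pair
    d1 = np.dot(P - A, AB)     # > 0  <=>  angle(AP, AB) is acute
    d2 = np.dot(P - B, AB)     # < 0  <=>  angle(BP, AB) is obtuse
    return d1 > 0 and d2 < 0
```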
In the embodiment of the application, a first current frame point cloud collected by the laser radar is acquired, and a target frame offline point cloud matching it is selected from an offline map. According to the first current frame point cloud and the target frame offline point cloud, the first current frame point cloud can then be converted into the map coordinate system of the offline map, yielding a second current frame point cloud and the current position of the rail vehicle in the offline map. On this basis, the bounding node coordinates of at least one target bounding region within a preset range are selected from the offline map according to the current position, after which the points of the second current frame point cloud lying within the target bounding regions can be determined from the second current frame point cloud and the bounding node coordinates of each target bounding region. The points so determined are the point cloud of the obstacle within the preset range, from which the distance between the obstacle and the rail vehicle can be determined. Because the offline map includes the bounding node coordinates of the bounding regions of the actual track on which the rail vehicle runs, i.e., prior track clearance information is available, the embodiment of the present application can accurately obtain the distance between an obstacle within the preset range and the rail vehicle.
In an embodiment of the present application, the method further includes a step of generating the offline map. The offline map is generated through the following S1210-S1214:
and S1210, acquiring multi-frame laser point cloud collected by the laser radar along the track line.
In the embodiment of the application, the rail vehicle carrying the laser radar starts from the track starting position and runs at the speed of its actual operation; meanwhile, the lidar periodically collects laser point clouds at a set time interval, each collection forming one frame of laser point cloud, until the rail vehicle reaches the track end position. On this basis, the frames of laser point cloud collected by the lidar constitute the multi-frame laser point cloud in S1210.
And S1211, determining a positioning map according to each frame of laser point cloud.
The positioning map comprises a frame of off-line point cloud corresponding to each frame of laser point cloud, and a translation distance and an Euler angle of a corresponding laser radar coordinate system relative to a map coordinate system of the positioning map, wherein the frame of off-line point cloud is converted from the frame of laser point cloud to a point cloud under the map coordinate system.
It is understood that each frame of laser point cloud is a frame of laser point cloud in the radar coordinate system of the laser radar. It should be noted that the map coordinate system corresponding to the positioning map and the map coordinate system corresponding to the off-line map in the embodiment of the present application are the same coordinate system.
In this embodiment of the application, S1211 may be implemented as follows: a high-precision positioning map is determined from the frames of laser point cloud using a SLAM algorithm, which may be any one of the LOAM, LIOM, and LIO-SAM algorithms.
And S1212, determining a base space and a coordinate origin of the corresponding frame of laser point cloud under the map coordinate system according to each group of translation distance and Euler angle.
In this embodiment of the present application, S1212 may be implemented as follows: for a group consisting of a translation distance and an Euler angle, the rotation matrix R' corresponding to the Euler angle is obtained from the standard conversion relation between Euler angles and rotation matrices in robotics, and the translation matrix T' corresponding to the translation distance is determined from the translation distance. The base space and coordinate origin of the corresponding frame of laser point cloud in the map coordinate system are then determined from the rotation matrix R', the translation matrix T', and the base space and coordinate origin of the corresponding frame in the radar coordinate system.
Where the translation distance is expressed as Ti = [t_xi, t_yi, t_zi], the translation matrix T' corresponding to the translation distance is the column vector

T' = [t_xi, t_yi, t_zi]^T

where i denotes the frame number of the corresponding frame of laser point cloud.

Determining the base space and origin coordinate of the corresponding frame of laser point cloud in the map coordinate system from the rotation matrix R', the translation matrix T', and the base space and origin coordinate of that frame in the radar coordinate system is realized by the following expressions:

B_map = R' · B_lidar

C_map = R' · C_lidar + T'

where B_lidar, the base space of the corresponding frame of laser point cloud in the lidar coordinate system, is the identity basis, and C_lidar = (0, 0, 0) is the coordinate origin of the corresponding frame of laser point cloud in the lidar coordinate system.
S1213, determining the bounding node coordinates of each bounding region in the map coordinate system according to the base space and the coordinate origin of each frame of laser point cloud in the map coordinate system and the distance between the coordinate origin of the corresponding frame of laser point cloud in the radar coordinate system and the corresponding bounding region.
In the embodiment of the present application, for each frame of laser point cloud, the distance between the origin of the corresponding lidar coordinate system and the corresponding bounding region may be calculated in advance. The position of that origin in the map coordinate system, obtained from the base space and coordinate origin of the corresponding frame in the map coordinate system, is then combined with this distance to obtain the bounding node coordinates of the corresponding bounding region.
In one example, if a bounding region is represented by a cuboid, the distance between the origin of the corresponding lidar coordinate system and the bounding region is the set of distances between that origin and the six faces of the cuboid. In this embodiment, taking any vertex of the cuboid as a bounding node of the corresponding bounding region, the coordinates of the bounding node are calculated by the following formula:
x = X + m·d, y = Y + n·d, z = Z + p·d

where (x, y, z) are the coordinates of the point to be solved, the point to be solved being a bounding node; (X, Y, Z) are the coordinates of the known point, the known point being the position in the map coordinate system of the origin of the corresponding lidar coordinate system; (m, n, p) is the unit direction vector from the known point to the point to be solved; and d is the distance between the point to be solved and the known point. Both (m, n, p) and d are derived from the distances between the origin of the lidar coordinate system and the corresponding bounding region.
And S1214, generating an offline map according to the boundary node coordinates of each boundary area and the corresponding frame of offline point cloud.
In the embodiment of the application, the boundary node coordinates of each boundary area and the corresponding frame of off-line point cloud are directly used as an off-line map.
In one embodiment of the present application, each bounded region is a cuboid, and the bounded node coordinates of each bounded region in the map coordinate system are represented by eight vertex coordinates of the corresponding cuboid. On this basis, the method provided by the embodiment of the present application further includes, after the above S1213, S1215:
S1215, updating the bounding node coordinates of each bounding region in the map coordinate system to the coordinates of four target vertices of the cuboid, the four target vertices being the vertex shared by three edges of the cuboid together with the vertices at the other ends of those three edges.
In the present embodiment, each bounding region is a cuboid, and the common vertex of three edges of a cuboid, together with the other endpoints of those edges, fully determines the cuboid. Thus, through S1215, a bounding region described by 8 bounding node coordinates can instead be described by 4 bounding node coordinates, which greatly reduces the storage required.
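A numpy sketch of why four vertices suffice (names illustrative): with A the common vertex and B, C, D the far ends of the three edges meeting at A, the remaining four cuboid vertices are sums of the edge vectors.

```python
import numpy as np

def expand_cuboid(A, B, C, D):
    """Rebuild all eight cuboid vertices from the four stored target vertices."""
    u, v, w = B - A, C - A, D - A       # the three edge vectors at A
    return np.array([A, B, C, D,
                     A + u + v, A + u + w, A + v + w,
                     A + u + v + w])
```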
In an embodiment of the present application, each of the bounded regions is a rectangular parallelepiped, and the method for detecting an obstacle provided in the embodiment of the present application further includes the following steps S1410-S1412 after the step S1400:
S1410, calculating the included angle between the two edges at the same position in the first bounding region and the second bounding region; the first bounding region and the second bounding region are two adjacent bounding regions of the at least one target bounding region.
S1411, merging the first bounding region and the second bounding region into a third bounding region, and updating them to the third bounding region, when the difference between the included angle and a straight angle is smaller than a preset angle.
S1412, determining the bounding node coordinates of the third bounding region according to the bounding node coordinates of the first bounding region and the bounding node coordinates of the second bounding region.
In the embodiment of the present application, the two edges at the same position in the first and second bounding regions are two edges, one from each region, that are substantially the same in direction and share an intersection point.
For example, the first and second bounding regions may be as shown in fig. 3, where the edge CD and the edge C'D' are the two edges of the first and second bounding regions at the same position.
In the embodiment of the present application, the vector corresponding to the edge CD can be determined in the first bounding region, and the vector corresponding to the edge C'D' in the second bounding region; the included angle between the edge CD and the edge C'D' can then be calculated from these two vectors.
When the difference between this included angle and a straight angle is smaller than the preset angle, the included angle is essentially a straight angle, i.e., the edge CD and the edge C'D' essentially lie on one straight line. Given that the first and second bounding regions are cuboids, if the edge CD and the edge C'D' essentially lie on one straight line, the two regions together form an approximate cuboid. The first bounding region and the second bounding region can therefore be merged into one and the same cuboid, i.e., into a third bounding region, whose bounding nodes may be represented by the coordinates of the 8 vertices of this approximate cuboid, or by the coordinates of the four target vertices corresponding to its three edges sharing a common vertex, as sketched below.
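A numpy sketch of the merge test; which vertices coincide at the intersection and the threshold value are assumptions read from fig. 3. The angle between the two edges, with each vector pointing away from the shared point, is compared with 180 degrees.

```python
import numpy as np

def should_merge(C, D, D2, max_dev_deg: float = 5.0) -> bool:
    """Edges CD and C'D' are assumed to meet at D (= C'); D2 stands for D'."""
    v1 = C - D                  # along the first edge, away from the joint
    v2 = D2 - D                 # along the second edge, away from the joint
    cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    angle = np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))
    # collinear edges meeting head-to-head subtend an angle near 180 degrees
    return abs(180.0 - angle) < max_dev_deg
```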
It should be noted that, since the bounding node coordinates are known, the vector corresponding to an edge can be determined from the coordinates of its two bounding nodes.
In the embodiment of the present application, merging the first bounding region and the second bounding region reduces the number of target bounding regions, which greatly reduces the amount of calculation in the subsequently executed S1500.
< apparatus embodiment >
An embodiment of the present application provides an obstacle detection apparatus 400. As shown in fig. 4, the apparatus 400 includes:
a lidar 410, the lidar 410 being configured to collect a point cloud;
the memory 420 is used for storing computer instructions;
the processor 430 is configured to call the computer instructions from the memory 420 to perform the method according to any one of the above method embodiments.
< vehicle embodiment >
The embodiment of the application provides a railway vehicle comprising the obstacle detection apparatus 400 provided in the apparatus embodiment above.
In one embodiment of the present application, the rail vehicle may be a train, high-speed rail, subway, or the like.
< storage medium embodiment >
An embodiment of the present application provides a computer-readable storage medium, characterized in that a computer program is stored thereon, which, when being executed by a processor, implements the method according to any one of the method embodiments provided above.
The present application may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present application.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present application may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), can execute computer-readable program instructions to implement aspects of the present application by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry.
Various aspects of the present application are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
The foregoing description of the embodiments of the present application has been presented for purposes of illustration and description and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terms used herein were chosen in order to best explain the principles of the embodiments, the practical application, or technical improvements to the techniques in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the application is defined by the appended claims.

Claims (10)

1. An obstacle detection method applied to a railway vehicle mounted with a laser radar, comprising:
acquiring a first current frame point cloud collected by the laser radar;
selecting a target frame off-line point cloud matched with the first current frame point cloud from an off-line map;
converting the first current frame point cloud into a map coordinate system of the off-line map according to the first current frame point cloud and the target frame off-line point cloud to obtain a second current frame point cloud and a current position of the rail vehicle corresponding to the off-line map;
selecting, from the off-line map according to the current position, bounding node coordinates of at least one target bounding region located within a preset range, wherein a target bounding region is any bounding region within the preset range of the current position along the direction of travel of the track;
and determining the distance between an obstacle located within the preset range and the rail vehicle according to the second current frame point cloud and the bounding node coordinates of each target bounding region.
2. The method of claim 1, wherein the offline map is generated by:
acquiring multi-frame laser point cloud collected by the laser radar along a track line;
determining a positioning map according to each frame of laser point cloud, wherein the positioning map comprises a frame of off-line point cloud corresponding to each frame of laser point cloud, and a translation distance and an Euler angle of the corresponding laser radar coordinate system relative to the map coordinate system of the positioning map, a frame of off-line point cloud being the corresponding frame of laser point cloud converted into the map coordinate system;
determining a base space and a coordinate origin of the corresponding frame of laser point cloud under the map coordinate system according to each group of translation distance and Euler angle;
determining the bounding node coordinates of each bounding region in the map coordinate system according to the base space and coordinate origin of each frame of laser point cloud in the map coordinate system and the distance, in the radar coordinate system, between the coordinate origin of the corresponding frame of laser point cloud and the corresponding bounding region;
and generating an offline map according to the boundary node coordinates of each boundary area and the corresponding frame of offline point cloud.
3. The method of claim 2, wherein each bounding region is a cuboid, and the bounding node coordinates of each bounding region in the map coordinate system are represented by the eight vertex coordinates of the corresponding cuboid;
after determining the bounding node coordinates of each bounding region in the map coordinate system according to the base space and coordinate origin of each frame of laser point cloud in the map coordinate system and the distance, in the radar coordinate system, between the coordinate origin of the corresponding frame of laser point cloud and the corresponding bounding region, the method further comprises:
updating the bounding node coordinates of each bounding region in the map coordinate system to the coordinates of four target vertices of the cuboid, the four target vertices being the vertex shared by three edges of the cuboid together with the vertices at the other ends of those three edges.
4. The method of claim 1, wherein each bounding region is a cuboid, and after the selecting, according to the current position, the bounding node coordinates of at least one target bounding region located within a preset range from the offline map, the method further comprises:
calculating the included angle between two edges at the same position in a first bounding region and a second bounding region, the first bounding region and the second bounding region being two adjacent bounding regions of the at least one target bounding region;
merging the first bounding region and the second bounding region into a third bounding region, and updating them to the third bounding region, if the difference between the included angle and a straight angle is less than a preset angle;
determining the bounding node coordinates of the third bounding region according to the bounding node coordinates of the first bounding region and the bounding node coordinates of the second bounding region.
5. The method of claim 1, wherein determining a distance between an obstacle located within a preset range and the rail vehicle according to the second current frame point cloud and the bounding node coordinates of each target bounding region comprises:
determining the points located in each target bounding region according to the second current frame point cloud and the bounding node coordinates of each target bounding region;
clustering the points in each target bounding region to obtain an obstacle point cloud within the preset range;
and determining the distance between the obstacle in the preset range and the rail vehicle according to the obstacle point cloud.
6. The method of claim 5, wherein the determining, according to the second current frame point cloud, the points located in each target bounding region comprises:
for each target bounding region, determining the maximum value and the minimum value of the target bounding region on the three coordinate axes of the map coordinate system from the bounding node coordinates of the target bounding region;
and performing pass-through filtering on the second current frame point cloud according to the maximum value and the minimum value corresponding to each target bounding region, to obtain the points located in each target bounding region in the second current frame point cloud.
7. The method of claim 6, wherein the performing, according to the maximum value and the minimum value corresponding to each target bounding region, pass-through filtering on the second current frame point cloud to obtain the points located in each target bounding region in the second current frame point cloud comprises:
performing pass-through filtering on the second current frame point cloud according to the maximum value and the minimum value corresponding to each target bounding region, to obtain a coarse point cloud located in each target bounding region in the second current frame point cloud;
for each coarse point in the coarse point cloud located within each target bounding region, determining whether the coarse point is located on opposite sides of a set of parallel faces of the corresponding target bounding region;
and if so, determining that the coarse point is located in the corresponding target bounding region.
8. An obstacle detection apparatus, comprising a lidar, a memory, and a processor, wherein:
a lidar to collect a point cloud;
the memory is to store computer instructions;
the processor is configured to invoke the computer instructions from the memory to perform the method of any of claims 1-7.
9. A rail vehicle, characterized in that the rail vehicle comprises an obstacle detecting device according to claim 8.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202011529056.8A 2020-12-22 2020-12-22 Obstacle detection method and device, rail vehicle and storage medium Pending CN114663850A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011529056.8A CN114663850A (en) 2020-12-22 2020-12-22 Obstacle detection method and device, rail vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011529056.8A CN114663850A (en) 2020-12-22 2020-12-22 Obstacle detection method and device, rail vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN114663850A true CN114663850A (en) 2022-06-24

Family

ID=82024829

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011529056.8A Pending CN114663850A (en) 2020-12-22 2020-12-22 Obstacle detection method and device, rail vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN114663850A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117148837A (en) * 2023-08-31 2023-12-01 上海木蚁机器人科技有限公司 Dynamic obstacle determination method, device, equipment and medium

Similar Documents

Publication Publication Date Title
EP3620966A1 (en) Object detection method and apparatus for object detection
EP3201881B1 (en) 3-dimensional model generation using edges
CN108629231B (en) Obstacle detection method, apparatus, device and storage medium
WO2021254019A1 (en) Method, device and system for cooperatively constructing point cloud map
CN112859110B (en) Positioning navigation method based on three-dimensional laser radar
CN114556442A (en) Three-dimensional point cloud segmentation method and device and movable platform
Asvadi et al. Two-stage static/dynamic environment modeling using voxel representation
CN114663850A (en) Obstacle detection method and device, rail vehicle and storage medium
CN114170596A (en) Posture recognition method and device, electronic equipment, engineering machinery and storage medium
CN113126120B (en) Data labeling method, device, equipment, storage medium and computer program product
Wang et al. Improved LeGO-LOAM method based on outlier points elimination
WO2021131990A1 (en) Information processing device, information processing method, and program
CN117197419A (en) Lei Dadian cloud labeling method and device, electronic equipment and storage medium
CN115359089A (en) Point cloud target tracking method, electronic device, medium and vehicle
JP7451628B2 (en) Vehicle attitude estimation method, device, electronic device, storage medium, and program
CN114897974B (en) Target object space positioning method, system, storage medium and computer equipment
CN113935946B (en) Method and device for detecting underground obstacle in real time
CN117635721A (en) Target positioning method, related system and storage medium
CN113763468A (en) Positioning method, device, system and storage medium
CN113808196A (en) Plane fusion positioning method and device, electronic equipment and storage medium
Kovacs et al. Edge detection in discretized range images
CN109613553B (en) Method, device and system for determining number of objects in scene based on laser radar
Zhao et al. The construction method of the digital operation environment for bridge cranes
Chu et al. Convergent application for trace elimination of dynamic objects from accumulated lidar point clouds
Luo et al. Comparison of an L1-regression-based and a RANSAC-based planar segmentation procedure for urban terrain data with many outliers

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination