CN110658531B - Dynamic target tracking method for port automatic driving vehicle - Google Patents


Info

Publication number
CN110658531B
Authority
CN
China
Prior art keywords
obstacle
point cloud
current frame
tracking
frame
Prior art date
Legal status
Active
Application number
CN201910785988.XA
Other languages
Chinese (zh)
Other versions
CN110658531A (en
Inventor
张祖锋
殷嘉伦
刘凯
闵文芳
杨迪海
Current Assignee
Changjia Fengxing Suzhou Intelligent Technology Co ltd
Original Assignee
Changjia Fengxing Suzhou Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Changjia Fengxing Suzhou Intelligent Technology Co ltd
Priority to CN201910785988.XA
Publication of CN110658531A
Application granted
Publication of CN110658531B
Legal status: Active

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/66Tracking systems using electromagnetic waves other than radio waves

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a dynamic target tracking method for port automatic driving vehicles. First, consecutive point cloud frames are fused using the lidars mounted on the truck head together with GPS positioning information, yielding relatively dense point cloud sensing data. The fused point cloud is then voxelized into a 3D grid, reducing the randomness of the point cloud. Next, the ground is removed using the height data of the 3D voxels, and a difference against the fused data of the previous frame eliminates most static obstacles, retaining only dynamic target information. The remaining points are clustered with a density-based algorithm to obtain the moving target information of the frame. Finally, a heuristic tracking algorithm applied to the moving targets of adjacent frames computes each current-frame target's center, size, speed, direction of motion, life cycle and historical track, and outputs the tracking obstacle list of the current frame. The method features a small data processing load, a wide sensing area and low cost.

Description

Dynamic target tracking method for port automatic driving vehicle
Technical Field
The invention relates to a dynamic target tracking method for port automatic driving vehicles, and belongs to the technical field of artificial intelligence.
Background
With the development of vehicle-mounted sensors and automobile automation technology, automatic driving in specific scenarios has become feasible. Optical imaging with monocular or binocular cameras, millimeter-wave radar based on the Doppler effect, and multi-line lidar based on active laser are currently the mainstream unmanned-driving sensing schemes. However, cameras suffer from ambient-light and field-of-view limitations: although they can provide high-definition road semantic information, the data processing load is large, and the accuracy of monocular or binocular ranging decays steadily with distance. Millimeter-wave radar can exploit the Doppler effect to detect and track dynamic targets at high speed, but its false-detection rate at low speed is higher; it is also easily disturbed by surrounding non-obstacles such as metal manhole covers and road signs. Ultrasonic radar attenuates severely in air, so its effective detection accuracy and range are very limited.
In port scenarios there are large numbers of container trucks, which typically travel at low speed; the scene also contains many metal containers and other metal obstacles. A sensing scheme built mainly around lidar can cover a wide sensing area, guarantees comprehensive sensing coverage, and effectively compensates for the limited camera field of view and millimeter-wave false detections.
In early research, target detection and tracking algorithms based on optical cameras were relatively mainstream, but they are limited by the camera and cannot provide reliable obstacle distances; binocular-ranging methods are only effective at short range in indoor scenes and cannot meet medium- and long-distance target tracking requirements. In addition, early lidar target detection methods mainly targeted small vehicles in open road scenes, sensing the surroundings with a single omnidirectional lidar on the roof. Because container trucks are large, with heights above 3 meters, a single roof-mounted radar can hardly achieve full coverage of the body and its surroundings, especially near the ground. Meanwhile, high-beam-count lidar is far more expensive, typically tens of times the cost of a single 16-line radar. Deep-learning target detection also depends heavily on the obstacle features of dense point clouds produced by dense laser beams, and stable targets are difficult to obtain from the sparse point clouds of low-line-count radars; moreover, high-power GPU equipment cannot be deployed in the vehicle environment, so deep-learning schemes are currently of limited practicality.
Disclosure of Invention
The invention aims to provide a dynamic target tracking method for a port automatic driving vehicle, which has the advantages of small data volume, wide sensing area and low cost.
Therefore, the technical scheme of the invention is as follows:
a dynamic target tracking method for port autonomous vehicles, comprising the steps of:
s1, in the process of vehicle driving, acquiring environment point cloud data through laser radars arranged on two sides of a vehicle head, acquiring longitude and latitude coordinates of a vehicle body through a vehicle-mounted inertial navigation system, calibrating a coordinate system of the laser radars to a vehicle body coordinate system taking the position of the inertial navigation system as an origin, and converting the acquired environment point cloud coordinates from the vehicle body coordinate system to a geodetic coordinate system; overlapping the environmental point cloud of the current frame and the environmental point cloud of the previous frame under the geodetic coordinate system to be used as the environmental point cloud of the current frame;
s2, setting a straight-through filter, filtering the length, width and height of the environmental point cloud of the current frame obtained in the step 1, and reserving the environmental point cloud within a fixed distance range; performing voxelization processing on the filtered environment point cloud to obtain a 3D voxel grid; then setting a ground height threshold, traversing all voxel grids, and deleting the voxel grids with the height below the ground height threshold;
s3, performing difference operation on the voxel grid of the current frame obtained in the step 2 and the voxel grid of the previous frame, and eliminating the voxel grids of static obstacles to obtain the voxel grids of dynamic obstacles;
s4, scaling the voxel grids according to the height information, then clustering the scaled voxel grids to obtain a clustering result list of the current frame, and then removing the obstacle information which does not meet the requirements according to the heuristic conditions;
s5, according to the obstacle list of the current frame obtained in the step 4, combining the obstacle list information of the previous frame to carry out target matching and tracking, calculating the moving target center, size, speed, moving direction, life cycle and historical track of the current frame, and outputting the tracking obstacle list of the current frame;
and S6, repeating the steps 1-5 until the automatic driving is finished.
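The loop of steps S1-S6 can be sketched as follows. This is a minimal illustrative skeleton, not the disclosed implementation: the stage functions are simplified stand-ins (the clustering and tracking stages are left as placeholders), and all names and parameter values are assumptions.

```python
import numpy as np

# Hypothetical stage stubs; each stands for one of steps S1-S5 of the method.
def fuse_and_transform(cloud, prev_cloud, pose):        # S1: overlay frames
    return np.vstack([cloud, prev_cloud])

def filter_and_voxelize(cloud, side=0.2):               # S2: occupied voxels
    return np.unique(np.floor(cloud / side).astype(np.int64), axis=0)

def remove_static(voxels, prev_voxels):                 # S3: frame difference
    prev = {tuple(v) for v in prev_voxels}
    return np.array([v for v in voxels if tuple(v) not in prev])

def cluster(voxels):                                    # S4: placeholder
    return []   # DBSCAN clustering + heuristic filtering would go here

def track(obstacles, prev_obstacles, dt):               # S5: placeholder
    return obstacles  # heuristic matching and velocity update would go here

def run_pipeline(frames, poses, dt=0.1):
    """Iterate steps S1-S5 over a sequence of lidar frames (step S6)."""
    prev_cloud = np.empty((0, 3))
    prev_voxels = np.empty((0, 3), dtype=np.int64)
    prev_obstacles, tracked = [], []
    for cloud, pose in zip(frames, poses):
        fused = fuse_and_transform(cloud, prev_cloud, pose)
        voxels = filter_and_voxelize(fused)
        dynamic = remove_static(voxels, prev_voxels)
        obstacles = cluster(dynamic)
        tracked = track(obstacles, prev_obstacles, dt)
        prev_cloud, prev_voxels, prev_obstacles = cloud, voxels, obstacles
    return tracked
```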
Preferably, in step S1, the laser radar is a 16-line laser radar, the data acquisition frequency of the laser radar is 10 Hz, and the GPS clock is used as the clock reference; the geodetic coordinate system is the WGS-84 geodetic coordinate system.
Preferably, in step S2, the filtering retains the point cloud within 50 meters ahead and behind, 50 meters to the left and right, and a height of -1.9 to 2.4 meters; the retained point cloud is then projected into a voxel grid with a side length of 0.2 meter and stored in a KDTree data structure.
Preferably, in step S4, the specific method for scaling the voxel grid is to scale the z-axis data, compress the height information to 0.01 times of the original height information, and keep the length and width of the voxel grid unchanged. The clustering method is a DBSCAN clustering method.
Preferably, the minimum search radius of the clustering parameters is 1 to 1.5 times the side length of the voxel grid; more preferably, it is 1.3 times the side length of the voxel grid.
In step S4, the heuristic conditions remove: 1) obstacles whose area is less than a minimum area threshold; 2) obstacles whose length is greater than a maximum vehicle length threshold.
In step S5, the steps of performing target matching and tracking are as follows:
(1) traverse each obstacle Pi of the previous frame, with position center coordinates (xi, yi); traverse the position centers of the obstacles of the current frame and find the obstacle Oj, with position center coordinates (xj, yj), whose geometric distance to (xi, yi) is minimal; denote this minimum geometric distance min_dis;
(2) set a limit movement distance threshold, obtained by multiplying the limit speed by the time difference δt of adjacent frames; if min_dis is greater than the limit movement distance threshold, the track of Pi is lost; if min_dis is smaller than the limit movement distance threshold, calculate the speed of Pi according to formula (1), and update the tracking life cycle and accumulated distance of Pi:
Vx = (xj − xi)/δt, Vy = (yj − yi)/δt    (1)
wherein:
Vx, Vy: the lateral and longitudinal speeds of Pi, unit: m/s;
xi, yi: the horizontal- and vertical-axis coordinates of Pi;
xj, yj: the horizontal- and vertical-axis coordinates of Oj;
δt: the time difference of adjacent frames, unit: seconds;
(3) an obstacle whose track is lost for two consecutive frames is deleted; for a tracked obstacle lost for one frame, the center position is estimated according to formula (2), and the accumulated distance is updated, but the life cycle is not updated:
xj = xi + Vx·δt, yj = yi + Vy·δt    (2)
(4) traverse each obstacle Oj of the current frame and the position centers of the obstacles of the previous frame to find the obstacle Pi whose geometric distance to (xj, yj) is minimal; if min_dis is greater than the limit movement distance threshold, Oj is a newly tracked obstacle and is added to the obstacle sequence of the current frame;
(5) output the tracking obstacle list of the current frame.
The invention takes into account the characteristics of existing lidar sensors and of the driving task in port scenarios: with two 16-line lidars mounted on the two sides of the container truck's head, it achieves effective sensing coverage within a 50 m radius around the vehicle and automatic identification and tracking of moving obstacles for low-speed container trucks on port roads. The method can sense the moving obstacles of port scenes, mainly trucks, automobiles and gantry cranes.
Compared with the prior art, the invention has the following beneficial effects:
(1) for dynamic target detection of unmanned container trucks in low-speed port scenes, only two 16-line lidars are needed, so the cost is low;
(2) dynamic target detection is performed with a combination of heuristic conditions and runs entirely on a CPU (central processing unit), making deployment convenient;
(3) point cloud superposition and voxelization based on the dual 16-line lidars effectively reduce the instability of the point cloud features; the data volume is small, the sensing area is wide, and the computational load is far below that of image processing.
Drawings
FIG. 1 is a flow chart of a dynamic target tracking method of a port according to the present invention;
FIG. 2 is raw point cloud data acquired by a lidar in an embodiment of the invention;
FIG. 3 is a comparison of a point cloud before and after overlay in an embodiment of the invention;
FIG. 4 is a 3D voxel grid obtained in an embodiment of the present invention;
FIG. 5 is a comparison graph of voxel grids before and after ground filtering in an embodiment of the invention;
FIG. 6 is a comparison graph of voxel grids before and after filtering for inter-frame difference operations in an embodiment of the present invention;
FIG. 7 is a diagram of DBSCAN clustering effect in the embodiment of the present invention;
FIG. 8 is a set of trace result sequences in an embodiment of the present invention.
Detailed Description
The invention discloses a dynamic target tracking method for port automatic driving vehicles, which mainly comprises the following steps:
First, the two lidars mounted on the two sides of the vehicle head are calibrated, their sensing results are mapped to the inertial-navigation center point, and the environmental point cloud data output by the lidars is acquired. Before acquisition, the data acquisition frequency of the lidar sensors is set, with the GPS clock as the clock reference. During real-time driving, the lidar point cloud coordinates obtained in the vehicle body coordinate system are converted into the geodetic coordinate system using the body's latitude and longitude from inertial navigation, and the geodetic-coordinate point clouds of two adjacent frames are superposed. The superposed point cloud is then filtered and voxelized to obtain 3D voxel grid information, reducing the randomness of the point cloud; a ground height threshold is set, all voxel grids are traversed, and those below the threshold are deleted. Static obstacles of the current frame are filtered out by a difference operation on the voxel grids of adjacent frames, retaining the grids of dynamic obstacles. The voxel grid is then scaled according to the height information, and the scaled grid is clustered with the density-based DBSCAN method to obtain the clustering result list of the current frame; clustering results that do not satisfy the heuristic conditions are eliminated. Finally, targets are tracked by combining the clustering result list of the previous frame, the tracking result of the current frame is updated, and the center, size, speed, movement direction and historical track of the current frame's moving targets are obtained.
The method of the present invention is described in detail below with reference to the figures and specific examples.
Example one
Referring to fig. 1, the inventive harbor dynamic target tracking method includes the following steps:
step 1, acquiring environmental point cloud data, converting point cloud coordinates and performing point cloud superposition:
in this embodiment, 16 line laser radars are respectively installed at positions about 1.7m high on both sides of the vehicle head, and the two 16 line laser radars are used to acquire the original point cloud data of the surrounding environment, and the acquired original point cloud data is shown in fig. 2. The data acquisition frequency of the laser radar sensor is set to be 10Hz, and a GPS clock is used as a clock reference. And obtaining the longitude and latitude coordinates of the vehicle body through a vehicle-mounted inertial navigation system (inertial navigation for short). Because the point cloud data obtained by the laser radar is in a vehicle body coordinate system, and the point cloud data obtained by the inertial navigation is latitude and longitude, and along with the movement of the vehicle, relative movement errors exist in the two adjacent frames of radar point cloud data, the point cloud is projected to a uniform coordinate system through coordinate conversion and then overlapped. Therefore, a laser radar coordinate system is calibrated to a vehicle body coordinate system with an inertial navigation position as an origin, and coordinate conversion is performed according to longitude and latitude coordinates obtained by inertial navigation to obtain a laser radar point cloud under a geodetic coordinate system, wherein the geodetic coordinate system under WGS-84 is adopted in the embodiment. Because the sampling density of the 16-line laser radar is low and the characteristics are not obvious, the environmental point cloud of the current frame and the environmental point cloud of the previous frame under the geodetic coordinate system are superposed to obtain relatively dense point cloud sensing data, so that the characteristics of the obstacle are more stable, and the superposed environmental point cloud is used as the point cloud of the current frame. The comparison graph before and after the point cloud superposition is shown in fig. 
3, the left graph is before the point cloud superposition, the right graph is after the point cloud superposition, and the graph shows that the point clouds after the superposition are much denser, and the characteristics are strengthened to a certain extent.
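The projection to a unified coordinate system followed by superposition can be illustrated with a simplified planar pose. The function names and the (x, y, yaw) pose representation are assumptions for the sketch; the embodiment itself converts via inertial-navigation latitude and longitude under WGS-84.

```python
import numpy as np

def body_to_world(points, pose):
    """Rotate and translate Nx3 body-frame points into the world frame.

    pose = (x, y, yaw): planar vehicle pose (illustrative assumption; the
    method itself uses inertial-navigation coordinates under WGS-84).
    """
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return points @ R.T + np.array([x, y, 0.0])

def overlay_frames(curr_world, prev_world):
    """Superpose the previous frame's world-frame cloud onto the current one."""
    return np.vstack([curr_world, prev_world])
```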
Step 2, performing through filtering, voxelization and ground filtering on the point cloud:
and (3) setting a straight-through filter, filtering the length, width and height of the point cloud of the current frame obtained in the step (1) according to the distance, and reserving the point cloud within a fixed distance range. In this embodiment, the retained point cloud range is set as follows: 50 meters in front and back, 50 meters in left and right, and-1.9 to 2.4 meters in height (using laser radar as the origin of coordinates). The filtered point cloud is then projected into a voxel grid with a side length of 0.2 meters (i.e., voxelized resolution 250 x 18) and stored in a KDTree data structure, with the resulting voxel grid being shown in fig. 4. Due to the fact that the point cloud is high in volatility, the characteristics of the point cloud can be stabilized through down-sampling, and the clustering speed is improved. Then ground filtration is carried out: a ground height threshold is set, for example 1.6m, all voxel grids are traversed and voxel grids with heights below the ground height threshold are deleted. A comparison of voxel grids before and after ground filtering is shown in fig. 5, with the grid of pre-elimination pixels on the left and the grid of post-elimination pixels on the right.
Step 3, removing static obstacles:
and (3) eliminating the voxel grid of the static obstacle through the inter-frame information, and performing difference operation on the voxel grid of the current frame obtained in the step (2) and the voxel grid of the previous frame, wherein the voxel grid after the difference operation is the voxel grid of the dynamic obstacle, as shown in fig. 6, the left image is the voxel grid before the difference operation, and the right image is the voxel grid after the difference operation.
Step 4, obstacle clustering and filtering:
The remaining voxel grids obtained in step 3 are clustered. First the z-axis data is scaled (i.e., the height information of the voxel grid): the height is compressed to 0.01 times its original value while the length and width of the grid are kept unchanged. This preserves the height information while bringing all voxels into an approximate plane. The voxel grid is then clustered with Density-Based Spatial Clustering of Applications with Noise (DBSCAN). The minimum search radius of the clustering parameters is tied to the voxel side length, typically set to 1 to 1.5 times it; in this embodiment it is 1.3 times the side length. Clustering yields the obstacle list of the current frame, whose entries contain the x and y coordinates of the arithmetic center of each target's planar projection (the obstacle center), the obstacle category serial number, and the serial number of the point cloud data frame; fig. 7 shows the DBSCAN clustering result. Obstacle information that does not meet the requirements is then rejected according to the following heuristic conditions:
1) an obstacle having an area less than a minimum area threshold;
2) an obstacle having a length greater than a maximum vehicle length threshold.
The threshold parameter in the heuristic condition can be set according to actual conditions.
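The height compression and density clustering of this step can be sketched as follows. To stay self-contained, the sketch writes out a minimal DBSCAN by hand instead of calling a library; the 0.2 m voxel side and 1.3x search radius follow the embodiment, while min_pts and the function names are assumptions.

```python
import numpy as np

def dbscan(pts, eps, min_pts):
    """Minimal DBSCAN: returns one label per point, -1 for noise."""
    n = len(pts)
    labels = np.full(n, -1)
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        if visited[i]:
            continue
        visited[i] = True
        nbrs = np.where(np.linalg.norm(pts - pts[i], axis=1) <= eps)[0]
        if len(nbrs) < min_pts:
            continue  # noise (may later be absorbed as a border point)
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster
            if not visited[j]:
                visited[j] = True
                jn = np.where(np.linalg.norm(pts - pts[j], axis=1) <= eps)[0]
                if len(jn) >= min_pts:   # j is a core point: expand from it
                    seeds.extend(jn)
        cluster += 1
    return labels

def cluster_obstacles(voxels, side=0.2, z_scale=0.01, min_pts=2):
    """Step 4 sketch: compress height to 1% so voxels lie near a plane,
    then cluster with search radius 1.3x the voxel side length."""
    pts = voxels.astype(float) * side
    pts[:, 2] *= z_scale
    return dbscan(pts, eps=1.3 * side, min_pts=min_pts)
```

The heuristic conditions of this step (minimum area, maximum vehicle length) would then be applied to each labelled group.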
Step 5, heuristic obstacle tracking:
According to the obstacle list of the current frame obtained in step 4, target matching and tracking are performed using the obstacle list information of the previous frame. Each obstacle Pi of the previous frame is traversed, with position center (xi, yi); the position centers of the current frame's obstacles are traversed to find the obstacle Oj, with center (xj, yj), whose geometric distance to (xi, yi) is minimal; this minimum geometric distance is min_dis.
A limit movement distance threshold is set, equal to the limit speed multiplied by the time difference δt of adjacent frames. If min_dis is greater than this threshold, the track of Pi is lost; if min_dis is smaller than the threshold, the speed of Pi is calculated according to formula (1), and Pi's tracking life cycle (incremented by 1) and accumulated distance (accumulated geometric distance) are updated:
Vx = (xj − xi)/δt, Vy = (yj − yi)/δt    (1)
wherein:
Vx, Vy: the lateral and longitudinal speeds of Pi, unit: m/s;
xi, yi: the horizontal- and vertical-axis coordinates of Pi;
xj, yj: the horizontal- and vertical-axis coordinates of Oj;
δt: the time difference of adjacent frames, unit: s.
An obstacle whose track is lost for two consecutive frames is deleted. For a tracked obstacle lost for one frame, the center position is estimated according to formula (2), and the accumulated distance is updated, but the life cycle is not updated:
xj = xi + Vx·δt, yj = yi + Vy·δt    (2)
Each obstacle Oj of the current frame is traversed, along with the position centers of the previous frame's obstacles, to find the obstacle Pi whose geometric distance to (xj, yj) is minimal; if min_dis is greater than the limit movement distance threshold, Oj is a newly tracked obstacle and is added to the obstacle sequence of the current frame. Finally, the center, size, speed, movement direction, life cycle and historical track of the current frame's moving targets are calculated, and the tracking obstacle list of the current frame is output. Fig. 8 shows a set of tracking result sequences produced by the method.
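The matching and update rules of this step can be sketched as a greedy nearest-centre association; the obstacle record layout and the limit-speed value are assumptions, and the single-missed-frame prediction of formula (2) is only indicated in a comment.

```python
import math

LIMIT_SPEED = 15.0  # m/s, illustrative limit speed for the port scene

def match_and_track(prev_obs, curr_obs, dt):
    """Step 5 sketch: for each previous-frame obstacle, find the nearest
    current-frame centre; if within the limit movement distance, update
    velocity (formula 1), life cycle and accumulated distance."""
    max_move = LIMIT_SPEED * dt
    tracked = []
    for p in prev_obs:
        if not curr_obs:
            break
        o = min(curr_obs,
                key=lambda c: math.hypot(c["x"] - p["x"], c["y"] - p["y"]))
        d = math.hypot(o["x"] - p["x"], o["y"] - p["y"])
        if d > max_move:
            continue  # track lost this frame; formula (2) would predict
                      # the centre for a single missed frame
        vx = (o["x"] - p["x"]) / dt          # formula (1)
        vy = (o["y"] - p["y"]) / dt
        tracked.append({**o, "vx": vx, "vy": vy,
                        "life": p.get("life", 0) + 1,
                        "dist": p.get("dist", 0.0) + d})
    return tracked
```

Current-frame obstacles not matched by any previous-frame obstacle would then be appended as newly tracked obstacles, as described above.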
And 6, repeating the steps 1-5 until the automatic driving is finished.

Claims (10)

1. A dynamic target tracking method for port autonomous vehicles, comprising the steps of:
s1, in the process of vehicle driving, acquiring environment point cloud data through laser radars arranged on two sides of a vehicle head, acquiring longitude and latitude coordinates of a vehicle body through a vehicle-mounted inertial navigation system, calibrating a coordinate system of the laser radars to a vehicle body coordinate system taking the position of the inertial navigation system as an origin, and converting the acquired environment point cloud coordinates from the vehicle body coordinate system to a geodetic coordinate system; overlapping the environmental point cloud of the current frame and the environmental point cloud of the previous frame under the geodetic coordinate system to be used as the environmental point cloud of the current frame;
s2, setting a straight-through filter, filtering the length, width and height of the environmental point cloud of the current frame obtained in the step 1, and reserving the environmental point cloud within a fixed distance range; performing voxelization processing on the filtered environment point cloud to obtain a 3D voxel grid; then setting a ground height threshold, traversing all voxel grids, and deleting the voxel grids with the height below the ground height threshold;
s3, performing difference operation on the voxel grid of the current frame obtained in the step 2 and the voxel grid of the previous frame, and eliminating the voxel grids of static obstacles to obtain the voxel grids of dynamic obstacles;
s4, scaling the voxel grids according to the height information, then clustering the scaled voxel grids to obtain a clustering result list of the current frame, and then removing the obstacle information which does not meet the requirements according to the heuristic conditions;
s5, according to the obstacle list of the current frame obtained in the step 4, combining the obstacle list information of the previous frame to carry out target matching and tracking, calculating the moving target center, size, speed, moving direction, life cycle and historical track of the current frame, and outputting the tracking obstacle list of the current frame;
and S6, repeating the steps 1-5 until the automatic driving is finished.
2. The dynamic target tracking method of claim 1, wherein: in step S1, the laser radar is a 16-line laser radar, the data acquisition frequency of the laser radar is 10Hz, and the GPS clock is used as a clock reference.
3. The dynamic target tracking method of claim 1, wherein: in step S1, the geodetic coordinate system is the geodetic coordinate system in WGS-84.
4. The dynamic target tracking method of claim 1, wherein: in step S2, the filtering retains the point cloud within 50 meters ahead and behind, 50 meters to the left and right, and a height of -1.9 to 2.4 meters; the retained point cloud is then projected into a voxel grid with a side length of 0.2 meter and stored in a KDTree data structure.
5. The dynamic target tracking method of claim 1, wherein: in step S4, the specific method of scaling the voxel grid is to scale the data of the z-axis, compress the height information to 0.01 times of the original height information, and keep the length and width of the voxel grid unchanged.
6. The dynamic target tracking method of claim 1, wherein: in step S4, the clustering method is a DBSCAN clustering method.
7. The dynamic target tracking method of claim 6, wherein: the minimum search radius of the clustering parameters is 1-1.5 times the side length of the voxel grid.
8. The dynamic target tracking method of claim 7, wherein: the minimum search radius of the clustering parameters is 1.3 times the side length of the voxel grid.
9. The dynamic object tracking method according to claim 1, wherein in step S4, the heuristic conditions are: 1) an obstacle having an area less than a minimum area threshold; 2) an obstacle having a length greater than a maximum vehicle length threshold.
10. The dynamic target tracking method according to claim 1, wherein in step S5, the steps of performing target matching and tracking are as follows:
(1) traverse each obstacle Pi of the previous frame, with position center coordinates (xi, yi); traverse the position centers of the obstacles of the current frame and find the obstacle Oj, with position center coordinates (xj, yj), whose geometric distance to (xi, yi) is minimal; denote this minimum geometric distance min_dis;
(2) set a limit movement distance threshold, obtained by multiplying the limit speed by the time difference δt of adjacent frames; if min_dis is greater than the limit movement distance threshold, the track of Pi is lost; if min_dis is smaller than the limit movement distance threshold, calculate the speed of Pi according to formula (1), and update the tracking life cycle and accumulated distance of Pi:
Vx = (xj − xi)/δt, Vy = (yj − yi)/δt    (1)
wherein:
Vx, Vy: the lateral and longitudinal speeds of Pi, unit: m/s;
xi, yi: the horizontal- and vertical-axis coordinates of Pi;
xj, yj: the horizontal- and vertical-axis coordinates of Oj;
δ t is the time difference of adjacent frames, unit: second;
(3) an obstacle whose track is lost for two consecutive frames is deleted; for a tracked obstacle lost for one frame, the center position is estimated according to formula (2), and the accumulated distance is updated, but the life cycle is not updated:
xj = xi + Vx·δt, yj = yi + Vy·δt    (2)
(4) traverse each obstacle Oj of the current frame and the position centers of the obstacles of the previous frame to find the obstacle Pi whose geometric distance to (xj, yj) is minimal; if min_dis is greater than the limit movement distance threshold, Oj is a newly tracked obstacle and is added to the obstacle sequence of the current frame;
(5) and outputting a tracking obstacle list of the current frame.
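Steps (1)–(5) describe a nearest-neighbour tracker gated by the limit movement distance (limit speed × δt). The following Python sketch is one illustrative reading of those steps, not the patented implementation; the `Track` fields and the `update_tracks` name are assumptions, and obstacle centers are simplified to (x, y) tuples in place of clustered point-cloud detections:

```python
from dataclasses import dataclass
import math

@dataclass
class Track:
    x: float
    y: float
    vx: float = 0.0
    vy: float = 0.0
    life: int = 0       # tracking life cycle: frames with a successful match
    dist: float = 0.0   # accumulated travel distance
    missed: int = 0     # consecutive frames without a match

def update_tracks(tracks, detections, dt, v_limit):
    """One cycle of steps (1)-(5): nearest-neighbour association
    gated by the limit movement distance v_limit * dt."""
    gate = v_limit * dt
    prev = [(t.x, t.y) for t in tracks]  # previous-frame centres, for step (4)
    survivors = []
    for t in tracks:                                          # step (1)
        if detections:
            j, d = min(((k, math.hypot(ox - t.x, oy - t.y))
                        for k, (ox, oy) in enumerate(detections)),
                       key=lambda kd: kd[1])
        else:
            d = float("inf")
        if d < gate:                                          # step (2): matched
            ox, oy = detections[j]
            t.vx, t.vy = (ox - t.x) / dt, (oy - t.y) / dt     # formula (1)
            t.dist += d
            t.life += 1
            t.missed = 0
            t.x, t.y = ox, oy
            survivors.append(t)
        else:                                                 # step (3): lost
            t.missed += 1
            if t.missed < 2:                                  # coast one frame only
                t.x += t.vx * dt                              # formula (2)
                t.y += t.vy * dt
                t.dist += math.hypot(t.vx * dt, t.vy * dt)    # distance, not life
                survivors.append(t)
    # step (4): detections far from every previous-frame centre start new tracks
    for ox, oy in detections:
        if all(math.hypot(ox - px, oy - py) >= gate for px, py in prev):
            survivors.append(Track(ox, oy))
    return survivors                                          # step (5): track list
```

Note that this greedy per-track nearest-neighbour search, like the claim, allows two tracks to claim the same detection; a production tracker would typically resolve conflicts with one-to-one assignment.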
CN201910785988.XA 2019-08-23 2019-08-23 Dynamic target tracking method for port automatic driving vehicle Active CN110658531B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910785988.XA CN110658531B (en) 2019-08-23 2019-08-23 Dynamic target tracking method for port automatic driving vehicle


Publications (2)

Publication Number Publication Date
CN110658531A CN110658531A (en) 2020-01-07
CN110658531B true CN110658531B (en) 2022-03-29

Family

ID=69036404

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910785988.XA Active CN110658531B (en) 2019-08-23 2019-08-23 Dynamic target tracking method for port automatic driving vehicle

Country Status (1)

Country Link
CN (1) CN110658531B (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111337941B (en) * 2020-03-18 2022-03-04 中国科学技术大学 Dynamic obstacle tracking method based on sparse laser radar data
CN111505662B (en) * 2020-04-29 2021-03-23 北京理工大学 Unmanned vehicle positioning method and system
CN111776948B (en) * 2020-07-01 2022-08-02 上海汽车集团股份有限公司 Tire crane positioning method and device
CN111781608B (en) * 2020-07-03 2023-04-25 浙江光珀智能科技有限公司 Moving target detection method and system based on FMCW laser radar
CN112101092A (en) * 2020-07-31 2020-12-18 北京智行者科技有限公司 Automatic driving environment sensing method and system
CN114066739A (en) * 2020-08-05 2022-02-18 北京万集科技股份有限公司 Background point cloud filtering method and device, computer equipment and storage medium
CN112162297B (en) * 2020-09-24 2022-07-19 燕山大学 Method for eliminating dynamic obstacle artifacts in laser point cloud map
CN112348848A (en) * 2020-10-26 2021-02-09 国汽(北京)智能网联汽车研究院有限公司 Information generation method and system for traffic participants
CN112526993B (en) * 2020-11-30 2023-08-08 广州视源电子科技股份有限公司 Grid map updating method, device, robot and storage medium
CN112666569B (en) * 2020-12-01 2023-03-24 天津优控智行科技有限公司 Compression method of laser radar continuous point cloud of unmanned system
CN112731324A (en) * 2020-12-16 2021-04-30 中交第一公路勘察设计研究院有限公司 Multi-radar cross-regional networking multi-target tracking method for expressway
CN112750114A (en) * 2021-01-14 2021-05-04 北京斯年智驾科技有限公司 Port obstacle detection method and device, electronic device and storage medium
CN113144264A (en) * 2021-03-18 2021-07-23 武汉联一合立技术有限公司 Intelligent killing system and method
CN113126115B (en) * 2021-04-06 2023-11-17 北京航空航天大学杭州创新研究院 Semantic SLAM method and device based on point cloud, electronic equipment and storage medium
CN113312992A (en) * 2021-05-18 2021-08-27 中山方显科技有限公司 Dynamic object sensing and predicting method based on multi-source sensor information fusion
CN113253293B (en) * 2021-06-03 2021-09-21 中国人民解放军国防科技大学 Method for eliminating laser point cloud distortion and computer readable storage medium
CN115797900B (en) * 2021-09-09 2023-06-27 廊坊和易生活网络科技股份有限公司 Vehicle-road gesture sensing method based on monocular vision
CN113911174B (en) * 2021-11-04 2024-04-12 北京埃福瑞科技有限公司 Speed measuring method and device for train
CN114137509B (en) * 2021-11-30 2023-10-13 南京慧尔视智能科技有限公司 Millimeter wave Lei Dadian cloud clustering method and device
CN116168036B (en) * 2023-04-26 2023-07-04 深圳市岑科实业有限公司 Abnormal intelligent monitoring system for inductance winding equipment
CN117148837A (en) * 2023-08-31 2023-12-01 上海木蚁机器人科技有限公司 Dynamic obstacle determination method, device, equipment and medium
CN116859406B (en) * 2023-09-05 2023-11-28 武汉煜炜光学科技有限公司 Calculation method and device for vehicle speed based on laser radar
CN117491983B (en) * 2024-01-02 2024-03-08 上海几何伙伴智能驾驶有限公司 Method for realizing passable region boundary acquisition and target relative position discrimination

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07104066A (en) * 1993-10-01 1995-04-21 Mazda Motor Corp Obstacle detecting device for vehicle
CN106541945A (en) * 2016-11-15 2017-03-29 广州大学 A kind of unmanned vehicle automatic parking method based on ICP algorithm
CN106772434A (en) * 2016-11-18 2017-05-31 北京联合大学 A kind of unmanned vehicle obstacle detection method based on TegraX1 radar datas
CN108845579A (en) * 2018-08-14 2018-11-20 苏州畅风加行智能科技有限公司 A kind of automated driving system and its method of port vehicle
CN109212532A (en) * 2017-07-04 2019-01-15 百度在线网络技术(北京)有限公司 Method and apparatus for detecting barrier


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Key technologies and application algorithms of vehicle-mounted LiDAR for intelligent driving; Chen Xiaodong et al.; Opto-Electronic Engineering; 2019-07-31; Vol. 46, No. 7; pp. 190182-1 to 190182-13 *

Also Published As

Publication number Publication date
CN110658531A (en) 2020-01-07

Similar Documents

Publication Publication Date Title
CN110658531B (en) Dynamic target tracking method for port automatic driving vehicle
US11393097B2 (en) Using light detection and ranging (LIDAR) to train camera and imaging radar deep learning networks
CN110531376B (en) Obstacle detection and tracking method for port unmanned vehicle
US20200217950A1 (en) Resolution of elevation ambiguity in one-dimensional radar processing
JP6441993B2 (en) Method and system for object detection using a laser point cloud
EP3745158B1 (en) Methods and systems for computer-based determining of presence of dynamic objects
CN111192295B (en) Target detection and tracking method, apparatus, and computer-readable storage medium
EP4057227A1 (en) Pose estimation of inertial measurement unit and camera mounted on a moving object
KR20190082291A (en) Method and system for creating and updating vehicle environment map
CN115803781A (en) Method and system for generating a bird's eye view bounding box associated with an object
CN112162297B (en) Method for eliminating dynamic obstacle artifacts in laser point cloud map
US20200218909A1 (en) Lane marker detection and lane instance recognition
Pantilie et al. Real-time obstacle detection using dense stereo vision and dense optical flow
US20230027622A1 (en) Automated real-time calibration
Sakic et al. Camera-LIDAR object detection and distance estimation with application in collision avoidance system
CN111781606A (en) Novel miniaturization implementation method for fusion of laser radar and ultrasonic radar
EP4148599A1 (en) Systems and methods for providing and using confidence estimations for semantic labeling
CN115965847A (en) Three-dimensional target detection method and system based on multi-modal feature fusion under cross view angle
US11557129B2 (en) Systems and methods for producing amodal cuboids
CN117635721A (en) Target positioning method, related system and storage medium
US20220221585A1 (en) Systems and methods for monitoring lidar sensor health
Madake et al. Visualization of 3D Point Clouds for Vehicle Detection Based on LiDAR and Camera Fusion
JP2018194417A (en) Position estimation device, mobile device
CN116635919A (en) Object tracking device and object tracking method
CN115376365B (en) Vehicle control method, device, electronic equipment and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant