CN113866742A - Method for point cloud processing and target classification of 4D millimeter wave radar - Google Patents

Method for point cloud processing and target classification of 4D millimeter wave radar

Info

Publication number
CN113866742A
Authority
CN
China
Prior art keywords
point
target
trace
track
frame
Prior art date
Legal status
Granted
Application number
CN202111466169.2A
Other languages
Chinese (zh)
Other versions
CN113866742B (en)
Inventor
宋玛君
王奇
朱彦博
吴军
张洁
张我弓
张吉
汪玮喆
Current Assignee
Nanjing Chuhang Technology Co ltd
Original Assignee
Nanjing Chuhang Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Chuhang Technology Co ltd filed Critical Nanjing Chuhang Technology Co ltd
Priority to CN202111466169.2A priority Critical patent/CN113866742B/en
Publication of CN113866742A publication Critical patent/CN113866742A/en
Application granted granted Critical
Publication of CN113866742B publication Critical patent/CN113866742B/en
Priority to PCT/CN2022/092002 priority patent/WO2023097971A1/en
Priority to DE112022000017.1T priority patent/DE112022000017T5/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415Identification of targets based on measurements of movement associated with the target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/418Theoretical aspects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06Systems determining position data of a target
    • G01S13/42Simultaneous measurement of distance and other co-ordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50Systems of measurement based on relative movement of target
    • G01S13/58Velocity or trajectory determination systems; Sense-of-movement determination systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • G01S13/72Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G01S13/726Multiple target tracking
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/414Discriminating targets with respect to background clutter

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a point cloud processing and target classification method for a 4D millimeter wave radar. The method comprises the steps of point trace input, point trace preprocessing, Kalman filtering prediction, association of point traces with tracks, point trace clustering, track initiation, track updating and track management. The invention realizes the transformation from a two-dimensional plane to three-dimensional space, so that the point trace characteristics of the target are more obvious. A virtual measurement point trace of the target is constructed by signal-to-noise-ratio weighting; the associated point traces are projected onto the xoy plane and rotated clockwise about the virtual point trace by the heading angle; and the size information of the target is calculated with a multi-frame sliding window from the displacements given by the positional relation of the virtual points between frames, which mitigates the problem that the size of the target is not obvious because the millimeter wave point cloud is sparse. A class probability is assigned to the target in each single frame according to its features, the classification probability of the current frame is obtained by weighting the historical probability and the single-frame probability, and the class with the largest probability in the current frame is taken as the final classification result of that frame.

Description

Method for point cloud processing and target classification of 4D millimeter wave radar
Technical Field
The invention relates to the technical field of 4D millimeter wave radar point cloud processing and target classification, and in particular to a point cloud processing and target classification method for a 4D millimeter wave radar.
Background
At present, the point clouds of millimeter wave radars are sparse and contain few target features, and classification methods based on point trace features do not fully consider the influence that the different orientations of a target during driving have on target classification. As a result, target classification methods based on millimeter wave radar have low accuracy and poor practicability, which poses great challenges for the development of millimeter wave radar in practical applications. Moreover, existing millimeter-wave-radar-based target classification is mainly used to distinguish pedestrians from vehicles, while the market demand for target classification goes beyond this. The 4D millimeter wave radar adds pitch angle information to the original distance, horizontal angle and velocity measurements, expanding a target from a two-dimensional plane to three-dimensional space, so that the shape characteristics of the target become more obvious. Given that the point cloud of a traditional millimeter wave radar is sparse and that, in reality, targets at different azimuth angles affect the classification decision, the 4D radar further increases the number and quality of point traces. Point cloud processing is carried out in three-dimensional space to extract the heading information of the target; the point traces associated with the target over a multi-frame sliding window are rotated according to the heading angle; and the length, width and height of the target are calculated from the point trace displacements based on the positional relation of the virtual point traces across multiple frames. This mitigates the problem that the size features of the target are not obvious and improves the applicability of the classification to targets driving in different directions. Meanwhile, the length, width, height, RCS, volume and the like of the target are used as target features to give the probability of each class for the single frame; the classification probability of the current frame is obtained by weighting the historical probability and the single-frame probability, and the class with the maximum probability in the current frame is taken as the final classification result of that frame. Pedestrians, two-wheeled vehicles, cars and commercial vehicles can thus be distinguished in real time, further improving the accuracy and universality of target classification.
Disclosure of Invention
The invention aims to provide a method for point cloud processing and target classification of a 4D millimeter wave radar that addresses the above defects in the prior art.
In order to achieve the above object, the present invention provides a method for point cloud processing and target classification of a 4D millimeter wave radar, comprising:
inputting point traces of targets acquired by the 4D millimeter wave radar and preprocessing the point traces, wherein each point trace measurement comprises the target distance $r$, the horizontal angle $\theta$, the pitch angle $\varphi$ and the radial velocity $v$;
performing Kalman filtering prediction on all existing tracks, then increasing the loss count and the life cycle of every predicted track by 1, and increasing the extrapolation time of every track by the radar period T;
associating the preprocessed point traces with the tracks existing before the current frame;
clustering the preprocessed point traces that are not associated with any track, using density clustering;
initiating tracks from the clustering result;
and updating the tracks, wherein the track update comprises a Kalman filtering update carried out as follows: traversing all valid tracks and judging whether each track has associated point traces in the current frame; if so, initializing a Kalman filter for the track, and if the track has existed for more than two frames, performing the Kalman filtering update; then performing the following operations on every track that has associated point traces in the current frame: updating the radar cross section of the track to the maximum radar cross section among the point traces associated in the current frame, resetting the number of associated point traces of the track to 0, decreasing the loss count of the track by 1, and resetting the extrapolation time of the track to 0;
and performing track management to delete tracks that do not correspond to real targets or tracks that cannot be stably tracked because the target is no longer within the radar detection range.
Further, the preprocessing comprises angle correction, and the corrected horizontal angle $\theta'$ and pitch angle $\varphi'$ are respectively:
$\theta' = \theta - \theta_0$, $\varphi' = \varphi - \varphi_0$
where $\theta_0$ and $\varphi_0$ are respectively the calibrated horizontal installation angle and pitch installation angle of the radar.
Further, the preprocessing also comprises dynamic and static separation, carried out as follows:
the radial velocity $v$ of the point trace is decomposed into the xoy plane to obtain the velocity $v_1$; the velocity of the vehicle on which the radar is mounted is decomposed onto the radial direction of the point trace projected in the xoy plane to obtain the velocity $v_2$; and the radial velocity $v$ of the point trace is decomposed onto the Z axis to obtain the velocity $v_3$. If the sum of the velocities $v_1$ and $v_2$ is less than a threshold $\varepsilon_1$ and the velocity $v_3$ is less than a threshold $\varepsilon_2$, the point trace is judged to be a static target; otherwise, the point trace is judged to be a moving target.
Further, associating the preprocessed point traces with the tracks existing before the current frame specifically comprises: when a preprocessed point trace falls within the distance gate, horizontal angle gate, pitch angle gate and radial velocity gate set for a track, the distance between the point trace and that track is recorded; finally, nearest-neighbor association is adopted so that the point trace is associated with the nearest of the predicted tracks.
Further, initiating tracks from the clustering result specifically comprises:
traversing the clustering result and the track storage, and if a position in the array storing the tracks is empty, storing a new track at that position;
performing a dynamic/static check on the point traces in each class, as follows: if the proportion of moving points among all associated point traces exceeds a set threshold, marking the track as non-stationary; otherwise, marking the track as stationary;
constructing a virtual measurement point trace for the initiated track, specifically as follows:
$r_c = \sum_{i=1}^{n} w_i r_i$, $\theta_c = \sum_{i=1}^{n} w_i \theta_i$, $\varphi_c = \sum_{i=1}^{n} w_i \varphi_i$, $v_c = \sum_{i=1}^{n} w_i v_i$
where $n$ is the number of point traces in the class; $r_c$, $\theta_c$, $\varphi_c$ and $v_c$ are the distance, horizontal angle, pitch angle and radial velocity of the constructed virtual measurement point trace; $r_i$, $\theta_i$, $\varphi_i$ and $v_i$ are the distance, horizontal angle, pitch angle and radial velocity of the i-th point trace in the class; and $w_i$ is the weight of the i-th point trace, equal to its signal-to-noise ratio divided by the sum of the signal-to-noise ratios of all point traces in the class;
for the initiated track, the life cycle is initialized to 1, the loss count is initialized to 0, the filtering state is set to uninitialized, the number of associated point traces is initialized to 0, and the distance, horizontal angle, pitch angle and radial velocity of the initiated track are set to $r_c$, $\theta_c$, $\varphi_c$ and $v_c$ respectively.
Further, the track update further comprises target classification, carried out as follows:
a sliding window stores the track information of a plurality of frames of the target; in the xoy plane, for each frame, all point traces associated with the track are projected onto the xoy plane and rotated clockwise by the heading angle α, and the maximum values $x_{max}$, $y_{max}$ and the minimum values $x_{min}$, $y_{min}$ of the rotated projection points on the X axis and the Y axis are calculated; the maximum value $z_{max}$ and the minimum value $z_{min}$ of all the point traces associated with the track in each frame on the Z axis are also calculated;
the per-frame values $x_{max}$, $y_{max}$, $x_{min}$ and $y_{min}$ of the multi-frame associated point traces are each translated so that the virtual measurement point trace coordinates of each frame coincide with those of the first frame, and after the translation the maximum values $X_{max}$ and $Y_{max}$ and the minimum values $X_{min}$ and $Y_{min}$ on the X axis and the Y axis are obtained; the per-frame values $z_{max}$ and $z_{min}$ of the multiple frames are likewise translated so that the Z-axis coordinate of the virtual measurement point trace of each frame coincides with that of the first frame, and the maximum value $Z_{max}$ and the minimum value $Z_{min}$ after the translation are taken;
the following are then calculated:
the length of the target $L = X_{max} - X_{min}$;
the width of the target $W = Y_{max} - Y_{min}$;
the height of the target $H = Z_{max} - Z_{min}$;
the volume of the target $V = L \cdot W \cdot H$;
a length, radar cross section and volume threshold is given for each class according to the distance and horizontal angle of the target; the target length $L$ and the speed $v_t = \sqrt{v_{x,k}^2 + v_{y,k}^2}$ are determined; the class probability of the single-frame target is calculated based on these target features; the classification probability of the target for the current frame is calculated in weighted form by combining the historical probability with the single-frame class probability; and the class with the maximum classification probability in the current frame is taken as the final classification result of that frame, where $v_{x,k}$ and $v_{y,k}$ are respectively the velocities on the x axis and the y axis after the k-th Kalman filter update of the target, k being an integer greater than zero.
Advantageous effects: the target tracking method based on 4D millimeter wave point cloud data realizes the transformation from a two-dimensional plane to three-dimensional space, so that the point trace characteristics of the target are more obvious. On the basis of target tracking, the heading angle α in the xoy plane is calculated from the filtered velocity of a moving track; a virtual measurement point trace of the target is constructed by signal-to-noise-ratio weighting; the associated point traces are projected onto the xoy plane and rotated clockwise by α about the virtual point trace; and the size information of the target is calculated with a multi-frame sliding window from the displacements given by the positional relation of the virtual points between frames, which mitigates the problem that the size of the target is not obvious because the millimeter wave point cloud is sparse. Meanwhile, the length, RCS, volume and the like of the target are used as target features to give the class probability of the single-frame target; the classification probability of the current frame is obtained by weighting the historical probability and the single-frame probability, and the class with the largest probability in the current frame is taken as the final classification result of that frame, so that pedestrians, two-wheeled vehicles, cars and commercial vehicles can be classified. The invention meets the future intelligence requirements of vehicle-mounted millimeter wave radar development and has important practical significance for research on autonomous driving perception capability.
Drawings
FIG. 1 is a schematic diagram of a coordinate system employed for trace point processing;
FIG. 2 is a flow chart of a method of 4D millimeter wave radar point cloud processing and target classification in an embodiment of the invention;
FIG. 3 is a flow chart of trace point clustering according to an embodiment of the invention;
FIG. 4 is a flow diagram illustrating object classification according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a sliding window storing multiple frames of target information;
FIG. 6 is a schematic diagram of calculating the length, width and height of a target;
FIG. 7 is a flow diagram of classification based on calculating class probabilities.
Detailed Description
The present invention will be further illustrated with reference to the accompanying drawings and specific embodiments, which are given on the basis of the technical solution of the present invention; it should be understood that these embodiments are only intended to illustrate the present invention and not to limit its scope.
As shown in fig. 1 to 7, an embodiment of the present invention provides a method for point cloud processing and target classification of a 4D millimeter wave radar. Point trace processing is performed in the coordinate system shown in fig. 1, where the radar is installed at the coordinate origin; fig. 1 depicts the case in which the radar is installed correctly, that is, its normal coincides with the X axis. The method specifically comprises the following steps:
Point traces of targets acquired by the 4D millimeter wave radar are input and preprocessed, each point trace measurement giving the target distance $r$, the horizontal angle $\theta$, the pitch angle $\varphi$ and the radial velocity $v$. The preprocessing of the point traces comprises angle correction; the corrected horizontal angle $\theta'$ and pitch angle $\varphi'$ are respectively:
$\theta' = \theta - \theta_0$, $\varphi' = \varphi - \varphi_0$
where $\theta_0$ and $\varphi_0$ are respectively the calibrated horizontal installation angle and pitch installation angle of the radar.
The preprocessing of the point traces preferably also comprises dynamic and static separation, carried out as follows:
the radial velocity $v$ of the point trace is decomposed into the xoy plane to obtain the velocity $v_1$; the velocity of the vehicle on which the radar is mounted is decomposed onto the radial direction of the point trace projected in the xoy plane to obtain the velocity $v_2$; and the radial velocity $v$ of the point trace is decomposed onto the Z axis to obtain the velocity $v_3$. If the sum of the velocities $v_1$ and $v_2$ is less than a threshold $\varepsilon_1$ and the velocity $v_3$ is less than a threshold $\varepsilon_2$, the point trace is judged to be a static target; otherwise, the point trace is judged to be a moving target.
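By way of illustration, the following sketch shows one possible reading of this dynamic/static check, assuming the coordinate system of fig. 1 and an ego vehicle moving along its x axis; the decomposition details, threshold values and function names are illustrative assumptions and not taken from the patent.

```python
import numpy as np

def is_static_point(theta, phi, v_radial, ego_speed, eps1=0.5, eps2=0.5):
    """Dynamic/static separation of one point trace (illustrative interpretation).

    theta, phi, v_radial: corrected horizontal angle, pitch angle and radial
    velocity of the point trace; ego_speed: speed of the ego vehicle along its
    x axis. eps1/eps2 are example thresholds, not values from the patent.
    """
    # v1: radial velocity of the point resolved into the xoy plane.
    v1 = v_radial * np.cos(phi)
    # v2: ego-vehicle speed projected onto the point's radial direction in the
    # xoy plane. For a point that is stationary in the world frame these two
    # roughly cancel, so |v1 + v2| is near zero.
    v2 = ego_speed * np.cos(theta)
    # v3: radial velocity of the point resolved onto the Z axis.
    v3 = v_radial * np.sin(phi)
    return abs(v1 + v2) < eps1 and abs(v3) < eps2
```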
Kalman filtering prediction is performed on all existing tracks: the prediction covariance and the predicted distance, horizontal angle, pitch angle and radial velocity of the target are calculated, the loss count and the life cycle of every track are each increased by 1, and the extrapolation time of every track is increased by the radar period T. Specifically, the Kalman filtering prediction uses a uniform linear motion model, whose equation is:
$X_{k+1} = F X_k + w_k$
where $X_{k+1}$ is the predicted state vector for the (k+1)-th measurement point trace of the target, $w_k$ is the process noise, and $X_k$ is the state vector of the k-th measurement point trace of the target, which can be expressed as:
$X_k = [x_k,\ y_k,\ z_k,\ v_{x,k},\ v_{y,k},\ v_{z,k}]^{T}$
where $x_k$, $y_k$, $z_k$, $v_{x,k}$, $v_{y,k}$ and $v_{z,k}$ are, in order, the position on the x axis, the position on the y axis, the position on the z axis, the velocity on the x axis, the velocity on the y axis and the velocity on the z axis of the target after the k-th filtering update, k being an integer greater than 0; $F$ is the state transition matrix, expressed as:
$F = \begin{bmatrix} 1 & 0 & 0 & T & 0 & 0 \\ 0 & 1 & 0 & 0 & T & 0 \\ 0 & 0 & 1 & 0 & 0 & T \\ 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix}$
where T is the radar single-frame time.
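A minimal sketch of this constant-velocity prediction step follows, using the state layout and frame time T described above; the process-noise model and its scale are illustrative assumptions.

```python
import numpy as np

def cv_transition(T):
    """State transition matrix F of the uniform linear motion model."""
    F = np.eye(6)
    F[0, 3] = F[1, 4] = F[2, 5] = T
    return F

def predict(x, P, T, q=0.1):
    """One Kalman prediction step for a track state x = [x, y, z, vx, vy, vz].

    P is the state covariance; q scales a simple illustrative process-noise term.
    """
    F = cv_transition(T)
    Q = q * np.eye(6)           # process noise (assumed form)
    x_pred = F @ x              # X_{k+1} = F X_k
    P_pred = F @ P @ F.T + Q    # predicted covariance
    return x_pred, P_pred
```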
For the update, the observation $Z_k$ input to the Kalman filter is:
$Z_k = [r_k,\ \theta_k,\ \varphi_k,\ v_k]^{T}$
where $r_k$, $\theta_k$, $\varphi_k$ and $v_k$ are respectively the distance, horizontal angle, pitch angle and radial velocity of the k-th measurement point trace of the target. The observation equation is then:
$Z_k = h(X_k) + u_k$
where $u_k$ is the measurement noise and $h$ is the observation function:
$h(X_k) = \begin{bmatrix} \sqrt{x_k^2 + y_k^2 + z_k^2} \\ \operatorname{atan2}(y_k,\ x_k) \\ \operatorname{atan2}\big(z_k,\ \sqrt{x_k^2 + y_k^2}\big) \\ (x_k v_{x,k} + y_k v_{y,k} + z_k v_{z,k}) / \sqrt{x_k^2 + y_k^2 + z_k^2} \end{bmatrix}$
where $\operatorname{atan2}(a, b)$ denotes the four-quadrant arctangent, a and b being its two variables.
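A sketch of this observation function, mapping the Cartesian track state to range, horizontal angle, pitch angle and radial velocity; the angle conventions follow the coordinate system of fig. 1 and are an assumption for illustration.

```python
import numpy as np

def observe(x):
    """Observation function h(X): state [x, y, z, vx, vy, vz] -> [r, theta, phi, v]."""
    px, py, pz, vx, vy, vz = x
    r = np.sqrt(px**2 + py**2 + pz**2)          # range
    theta = np.arctan2(py, px)                   # horizontal angle
    phi = np.arctan2(pz, np.hypot(px, py))       # pitch angle
    v_r = (px * vx + py * vy + pz * vz) / r      # radial velocity
    return np.array([r, theta, phi, v_r])
```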
The preprocessed point traces are then associated with the tracks existing before the current frame, specifically as follows: when a preprocessed point trace falls within the distance gate, horizontal angle gate, pitch angle gate and radial velocity gate set for a track, the distance between the point trace and that track is recorded; finally, nearest-neighbor association is adopted so that the point trace is associated with the nearest of the predicted tracks. The association principle is that a single point trace can be associated with only a single track, while a single track can be associated with a plurality of point traces.
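A compact sketch of this gated nearest-neighbor association; the gate widths and the residual used as the distance measure are illustrative assumptions, not values from the patent.

```python
import numpy as np

def associate(points, tracks, gates=(5.0, 0.1, 0.1, 2.0)):
    """Associate each point trace with the nearest gated track.

    points: array of shape (N, 4) with rows [r, theta, phi, v];
    tracks: array of shape (M, 4) with the predicted [r, theta, phi, v] of each track.
    Returns a list mapping point index -> track index (-1 if unassociated).
    A point may join only one track; a track may collect several points.
    """
    gates = np.asarray(gates)
    assignment = []
    for p in points:
        diff = np.abs(np.asarray(tracks) - p)     # per-dimension residuals
        inside = np.all(diff < gates, axis=1)     # must pass all four gates
        if not inside.any():
            assignment.append(-1)
            continue
        dist = diff[:, 0].astype(float)           # range residual as distance (assumed metric)
        dist[~inside] = np.inf
        assignment.append(int(np.argmin(dist)))
    return assignment
```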
Referring to fig. 3, the embodiment of the present invention preferably uses Density-Based Spatial Clustering of Applications with Noise (DBSCAN) to cluster the preprocessed point traces that are not associated with any track. Specifically, the variables $\epsilon$ (the neighborhood radius) and $MinPts$ (the minimum number of points) are set. If the Euclidean distance between point trace A and point trace B is smaller than $\epsilon$, point trace B is said to lie in the neighborhood of point trace A; if the number of point traces in the neighborhood of point trace A is not less than $MinPts$, point trace A is called a core object, and all points in the neighborhood of A are directly density-reachable from A.
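A brief sketch of clustering the unassociated point traces with DBSCAN, here using scikit-learn for illustration; the eps and min_samples values are assumptions, as the patent does not give them.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_unassociated(points_xyz, eps=1.5, min_samples=2):
    """Cluster unassociated point traces in Cartesian coordinates.

    points_xyz: array of shape (N, 3) with the x, y, z positions of the point
    traces not associated with any existing track.
    Returns an array of cluster labels; label -1 marks noise points.
    """
    if len(points_xyz) == 0:
        return np.array([], dtype=int)
    return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xyz)
```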
Track initiation is carried out on the clustering result. Specifically, the clustering result and the track storage are traversed, and if a position in the array storing the tracks is empty, a new track is stored at that position.
A dynamic/static check is performed on the point traces in each class, as follows: if the proportion of moving points among all associated point traces exceeds a set threshold, the track is marked as non-stationary; otherwise, the track is marked as stationary.
A measurement point trace is constructed for the initiated track. The signal-to-noise ratio of a point trace reflects the strength of its reflected signal, and a point trace with a higher signal-to-noise ratio is generally considered more accurate in every respect, so the virtual measurement point trace is preferably constructed by signal-to-noise-ratio weighting, specifically as follows:
$r_c = \sum_{i=1}^{n} w_i r_i$, $\theta_c = \sum_{i=1}^{n} w_i \theta_i$, $\varphi_c = \sum_{i=1}^{n} w_i \varphi_i$, $v_c = \sum_{i=1}^{n} w_i v_i$
where $n$ is the number of point traces in the class; $r_c$, $\theta_c$, $\varphi_c$ and $v_c$ are the distance, horizontal angle, pitch angle and radial velocity of the constructed virtual measurement point trace; $r_i$, $\theta_i$, $\varphi_i$ and $v_i$ are the distance, horizontal angle, pitch angle and radial velocity of the i-th point trace in the class; and $w_i$ is the weight of the i-th point trace, equal to its signal-to-noise ratio divided by the sum of the signal-to-noise ratios of all point traces in the class.
For the initiated track, the life cycle is initialized to 1, the loss count is initialized to 0, the filtering state is set to uninitialized, the number of associated point traces is initialized to 0, and the distance, horizontal angle, pitch angle and radial velocity of the initiated track are set to $r_c$, $\theta_c$, $\varphi_c$ and $v_c$ respectively.
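A small sketch of this SNR-weighted construction of the virtual measurement point trace; the array layout is an assumption for illustration.

```python
import numpy as np

def virtual_measurement(cluster):
    """Build the virtual measurement point of one cluster by SNR weighting.

    cluster: array of shape (n, 5) with rows [r, theta, phi, v, snr].
    Returns the weighted [r_c, theta_c, phi_c, v_c] of the virtual point.
    """
    cluster = np.asarray(cluster, dtype=float)
    snr = cluster[:, 4]
    w = snr / snr.sum()            # weight of each point trace
    return w @ cluster[:, :4]      # SNR-weighted sums of r, theta, phi, v
```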
The tracks are then updated; the track update comprises a Kalman filtering update. Specifically, all valid tracks are traversed and it is judged whether each track has associated point traces in the current frame; if so, a Kalman filter is initialized for the track, and if the track has existed for more than two frames, the Kalman filtering update is performed, thereby updating the track filter state and the filter covariance. Then the following operations are performed on every track that has associated point traces in the current frame: the radar cross section (RCS) of the track is updated to the maximum radar cross section among the point traces associated in the current frame, the number of associated point traces of the track is reset to 0, the loss count of the track is decreased by 1, and the extrapolation time of the track is reset to 0.
Track management is carried out to delete tracks that do not correspond to real targets or tracks that cannot be stably tracked because the target is no longer within the radar detection range. Specifically, all tracks are traversed and the loss count and extrapolation time of each valid track are checked: if a track is still in its initiation stage (within about the first L frames) and its loss count is greater than or equal to m, the track is deleted; if the extrapolation time of a track is greater than the radar cycle time of s frames, the track is deleted. In general, L may be 4 to 8, m may be 2 to 4, and s may be 5 to 8.
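A minimal sketch of this deletion logic; the Track fields are illustrative names, and L, m and s take values in the ranges given above.

```python
from dataclasses import dataclass

@dataclass
class Track:
    age: int             # life cycle, in frames
    lost: int            # loss count
    extrapolated: float  # accumulated extrapolation time, in seconds

def manage(tracks, T, L=6, m=3, s=6):
    """Keep only tracks that are still trustworthy.

    T is the radar single-frame time; a track is dropped if it is still in the
    initiation stage (age <= L) with lost >= m, or if it has been extrapolated
    for longer than s radar frames.
    """
    kept = []
    for trk in tracks:
        if trk.age <= L and trk.lost >= m:
            continue                      # unstable newly-initiated track
        if trk.extrapolated > s * T:
            continue                      # target lost for too long
        kept.append(trk)
    return kept
```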
The track update of the embodiment of the present invention further comprises target classification; referring to fig. 4, it is carried out as follows:
A sliding window stores the track information of a plurality of frames of the target. In the xoy plane, for each frame, all point traces associated with the track are projected onto the xoy plane and rotated clockwise by the heading angle α, and the maximum values $x_{max}$, $y_{max}$ and the minimum values $x_{min}$, $y_{min}$ of the rotated projection points on the X axis and the Y axis are calculated. Specifically, referring to fig. 5, the small black dots are the points obtained by projecting the point traces associated with the track onto the xoy plane, and the small grey dot is the point obtained by projecting the virtual measurement point trace of the track onto the xoy plane; α is the heading angle calculated from the velocity of the track in the xoy plane; all the small black dots are rotated clockwise by α about the small grey dot taken as the centre, and after the rotation the maximum and minimum values on the X and Y axes, $x_{max}$, $y_{max}$, $x_{min}$ and $y_{min}$, are calculated. The maximum value $z_{max}$ and the minimum value $z_{min}$ of all the point traces associated with the track in each frame on the Z axis are taken directly from the highest and lowest associated point traces.
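A sketch of this per-frame step, rotating the projected point traces clockwise by the heading angle α about the virtual point and taking the extremes; the argument layout is an illustrative assumption.

```python
import numpy as np

def frame_extent(points_xy, virtual_xy, alpha, points_z):
    """Per-frame extremes of a track's associated point traces.

    points_xy: (n, 2) xoy projections of the associated point traces;
    virtual_xy: (2,) xoy projection of the frame's virtual measurement point;
    alpha: heading angle in radians; points_z: (n,) heights of the point traces.
    Rotates the points clockwise by alpha about the virtual point and returns
    (x_max, x_min, y_max, y_min, z_max, z_min) for this frame.
    """
    c, s = np.cos(alpha), np.sin(alpha)
    rot = np.array([[c, s], [-s, c]])                # clockwise rotation by alpha
    local = (np.asarray(points_xy) - virtual_xy) @ rot.T + virtual_xy
    z = np.asarray(points_z)
    return (local[:, 0].max(), local[:, 0].min(),
            local[:, 1].max(), local[:, 1].min(),
            z.max(), z.min())
```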
Referring to fig. 6, the black dots are the points obtained by projecting the measurement point traces of the first frame onto the xoy plane, the white dots are the points obtained by projecting the measurement point traces of some non-first frame onto the xoy plane, and the grey dots are the points obtained by projecting the virtual measurement point trace constructed for each frame onto the xoy plane. The per-frame values $x_{max}$, $y_{max}$, $x_{min}$ and $y_{min}$ of the multi-frame associated point traces are each translated so that the virtual measurement point trace coordinates of each frame coincide with those of the first frame; after the translation, the maximum values $X_{max}$ and $Y_{max}$ and the minimum values $X_{min}$ and $Y_{min}$ on the X axis and the Y axis are obtained. The per-frame values $z_{max}$ and $z_{min}$ of the multiple frames are likewise translated so that the Z-axis coordinate of the virtual measurement point trace of each frame coincides with that of the first frame, and the maximum value $Z_{max}$ and the minimum value $Z_{min}$ after the translation are taken. The following are then calculated:
the length of the target $L = X_{max} - X_{min}$;
the width of the target $W = Y_{max} - Y_{min}$;
the height of the target $H = Z_{max} - Z_{min}$;
the volume of the target $V = L \cdot W \cdot H$.
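A sketch of the multi-frame sliding-window size estimate, aligning each frame's extremes to the first frame through the virtual measurement points and then taking the overall extent; the volume formula V = L·W·H is an assumption consistent with the length, width and height definitions above.

```python
import numpy as np

def target_size(frame_extents, virtual_points):
    """Estimate target length, width, height and volume over a sliding window.

    frame_extents: (k, 6) rows of per-frame (x_max, x_min, y_max, y_min, z_max, z_min);
    virtual_points: (k, 3) per-frame virtual measurement point (x, y, z).
    Each frame's extremes are shifted so its virtual point coincides with that
    of the first frame, then the global extremes give the size.
    """
    vp = np.asarray(virtual_points, dtype=float)
    ext = np.array(frame_extents, dtype=float)
    shift = vp[0] - vp                      # per-frame translation (k, 3)
    ext[:, 0:2] += shift[:, [0]]            # x_max, x_min
    ext[:, 2:4] += shift[:, [1]]            # y_max, y_min
    ext[:, 4:6] += shift[:, [2]]            # z_max, z_min
    L = ext[:, 0].max() - ext[:, 1].min()   # X_max - X_min
    W = ext[:, 2].max() - ext[:, 3].min()   # Y_max - Y_min
    H = ext[:, 4].max() - ext[:, 5].min()   # Z_max - Z_min
    return L, W, H, L * W * H
```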
Referring to fig. 7, according to empirical values of each class obtained by offline analysis, a length, radar cross section and volume threshold is given for each class according to the distance and horizontal angle of the target; the target length $L$ and the speed $v_t = \sqrt{v_{x,k}^2 + v_{y,k}^2}$ are determined; the class probability of the single-frame target is calculated based on these target features; the classification probability of the target for the current frame is calculated in weighted form by combining the historical probability with the single-frame class probability; and the class with the maximum classification probability in the current frame is taken as the final classification result of that frame, where $v_{x,k}$ and $v_{y,k}$ are respectively the velocities on the x axis and the y axis after the k-th Kalman filter update of the target, k being an integer greater than zero. The above classes include pedestrians, motorcycles, cars and commercial vehicles (trucks, buses), and the like.
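A sketch of the weighted class-probability update described above; the feature-to-probability scoring rule and the weighting factor are illustrative assumptions, since the patent only specifies that historical and single-frame probabilities are combined by weighting.

```python
import numpy as np

CLASSES = ("pedestrian", "two_wheeler", "car", "commercial_vehicle")

def single_frame_probability(features, thresholds):
    """Score each class from the frame's features (e.g. length, RCS, volume, speed).

    thresholds: per-class reference values chosen offline for the target's
    range and horizontal angle; the scoring rule here is purely illustrative.
    """
    scores = np.array([1.0 / (1.0 + np.abs(np.asarray(features) - np.asarray(t)).sum())
                       for t in thresholds])
    return scores / scores.sum()

def update_classification(history_prob, frame_prob, w_hist=0.7):
    """Combine historical and single-frame probabilities by weighting."""
    prob = w_hist * np.asarray(history_prob) + (1.0 - w_hist) * np.asarray(frame_prob)
    prob /= prob.sum()
    return prob, CLASSES[int(np.argmax(prob))]   # classification result for the frame
```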
The foregoing is only a preferred embodiment of the present invention, and it should be noted that the parts not specifically described belong to the prior art or to the common general knowledge of those of ordinary skill in the art. Several improvements and modifications can be made without departing from the principle of the invention, and these improvements and modifications should also be construed as falling within the scope of protection of the invention.

Claims (6)

1. A method for point cloud processing and target classification of a 4D millimeter wave radar is characterized by comprising the following steps:
inputting point traces of targets acquired by the 4D millimeter wave radar and preprocessing the point traces, wherein each point trace measurement comprises the target distance $r$, the horizontal angle $\theta$, the pitch angle $\varphi$ and the radial velocity $v$;
performing Kalman filtering prediction on all existing tracks, then increasing the loss count and the life cycle of every predicted track by 1, and increasing the extrapolation time of every track by the radar period T;
associating the preprocessed point traces with the existing tracks;
clustering the preprocessed point traces that are not associated with any track, using density clustering;
initiating tracks from the clustering result;
and updating the tracks, wherein the track update comprises a Kalman filtering update carried out as follows: traversing all valid tracks and judging whether each track has associated point traces in the current frame; if so, initializing a Kalman filter for the track, and if the track has existed for more than two frames, performing the Kalman filtering update; then performing the following operations on every track that has associated point traces in the current frame: updating the radar cross section of the track to the maximum radar cross section among the point traces associated in the current frame, resetting the number of associated point traces of the track to 0, decreasing the loss count of the track by 1, and resetting the extrapolation time of the track to 0;
and performing track management to delete tracks that do not correspond to real targets or tracks that cannot be stably tracked because the target is no longer within the radar detection range.
2. The method for point cloud processing and target classification of a 4D millimeter wave radar according to claim 1, wherein the preprocessing comprises angle correction, and the corrected horizontal angle $\theta'$ and pitch angle $\varphi'$ are respectively:
$\theta' = \theta - \theta_0$, $\varphi' = \varphi - \varphi_0$
where $\theta_0$ and $\varphi_0$ are respectively the calibrated horizontal installation angle and pitch installation angle of the radar.
3. The method for point cloud processing and target classification of a 4D millimeter wave radar according to claim 1, wherein the preprocessing further comprises dynamic and static separation, carried out as follows:
the radial velocity $v$ of the point trace is decomposed into the xoy plane to obtain the velocity $v_1$; the velocity of the vehicle on which the radar is mounted is decomposed onto the radial direction of the point trace projected in the xoy plane to obtain the velocity $v_2$; and the radial velocity $v$ of the point trace is decomposed onto the Z axis to obtain the velocity $v_3$; if the sum of the velocities $v_1$ and $v_2$ is less than a threshold $\varepsilon_1$ and the velocity $v_3$ is less than a threshold $\varepsilon_2$, the point trace is judged to be a static target; otherwise, the point trace is judged to be a moving target.
4. The method for point cloud processing and target classification of a 4D millimeter wave radar according to claim 1, wherein associating the preprocessed point traces with the tracks existing before the current frame specifically comprises: when a preprocessed point trace falls within the distance gate, horizontal angle gate, pitch angle gate and radial velocity gate set for a track, the distance between the point trace and that track is recorded; finally, nearest-neighbor association is adopted so that the point trace is associated with the nearest of the predicted tracks.
5. The method for point cloud processing and target classification of a 4D millimeter wave radar according to claim 1, wherein initiating tracks from the clustering result specifically comprises:
traversing the clustering result and the track storage, and if a position in the array storing the tracks is empty, storing a new track at that position;
performing a dynamic/static check on the point traces in each class, as follows: if the proportion of moving points among all associated point traces exceeds a set threshold, marking the track as non-stationary; otherwise, marking the track as stationary;
constructing a virtual measurement point trace for the initiated track, specifically as follows:
$r_c = \sum_{i=1}^{n} w_i r_i$, $\theta_c = \sum_{i=1}^{n} w_i \theta_i$, $\varphi_c = \sum_{i=1}^{n} w_i \varphi_i$, $v_c = \sum_{i=1}^{n} w_i v_i$
where $n$ is the number of point traces in the class; $r_c$, $\theta_c$, $\varphi_c$ and $v_c$ are the distance, horizontal angle, pitch angle and radial velocity of the constructed virtual measurement point trace; $r_i$, $\theta_i$, $\varphi_i$ and $v_i$ are the distance, horizontal angle, pitch angle and radial velocity of the i-th point trace in the class; and $w_i$ is the weight of the i-th point trace, equal to its signal-to-noise ratio divided by the sum of the signal-to-noise ratios of all point traces in the class;
for the initiated track, the life cycle is initialized to 1, the loss count is initialized to 0, the filtering state is set to uninitialized, the number of associated point traces is initialized to 0, and the distance, horizontal angle, pitch angle and radial velocity of the initiated track are set to $r_c$, $\theta_c$, $\varphi_c$ and $v_c$ respectively.
6. The method for point cloud processing and target classification of a 4D millimeter wave radar according to claim 1, wherein the track update further comprises target classification, carried out as follows:
a sliding window stores the track information of a plurality of frames of the target; in the xoy plane, for each frame, all point traces associated with the track are projected onto the xoy plane and rotated clockwise by the heading angle α, and the maximum values $x_{max}$, $y_{max}$ and the minimum values $x_{min}$, $y_{min}$ of the rotated projection points on the X axis and the Y axis are calculated; the maximum value $z_{max}$ and the minimum value $z_{min}$ of all the point traces associated with the track in each frame on the Z axis are calculated;
the per-frame values $x_{max}$, $y_{max}$, $x_{min}$ and $y_{min}$ of the multi-frame associated point traces are each translated so that the virtual measurement point trace coordinates of each frame coincide with those of the first frame, and after the translation the maximum values $X_{max}$ and $Y_{max}$ and the minimum values $X_{min}$ and $Y_{min}$ on the X axis and the Y axis are obtained; the per-frame values $z_{max}$ and $z_{min}$ of the multiple frames are likewise translated so that the Z-axis coordinate of the virtual measurement point trace of each frame coincides with that of the first frame, and the maximum value $Z_{max}$ and the minimum value $Z_{min}$ after the translation are taken;
the following are then calculated:
the length of the target $L = X_{max} - X_{min}$;
the width of the target $W = Y_{max} - Y_{min}$;
the height of the target $H = Z_{max} - Z_{min}$;
the volume of the target $V = L \cdot W \cdot H$;
a length, radar cross section and volume threshold is given for each class according to the distance and horizontal angle of the target; the target length $L$ and the speed $v_t = \sqrt{v_{x,k}^2 + v_{y,k}^2}$ are determined; the class probability of the single-frame target is calculated based on these target features; the classification probability of the target for the current frame is calculated in weighted form by combining the historical probability with the single-frame class probability; and the class with the maximum classification probability in the current frame is taken as the final classification result of that frame, where $v_{x,k}$ and $v_{y,k}$ are respectively the velocities on the x axis and the y axis after the k-th Kalman filter update of the target, k being an integer greater than zero.
CN202111466169.2A 2021-12-03 2021-12-03 Method for point cloud processing and target classification of 4D millimeter wave radar Active CN113866742B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202111466169.2A CN113866742B (en) 2021-12-03 2021-12-03 Method for point cloud processing and target classification of 4D millimeter wave radar
PCT/CN2022/092002 WO2023097971A1 (en) 2021-12-03 2022-05-10 4d millimeter wave radar data processing method
DE112022000017.1T DE112022000017T5 (en) 2021-12-03 2022-05-10 DATA PROCESSING METHODS FOR 4D MILLIMETER WAVE RADAR

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111466169.2A CN113866742B (en) 2021-12-03 2021-12-03 Method for point cloud processing and target classification of 4D millimeter wave radar

Publications (2)

Publication Number Publication Date
CN113866742A true CN113866742A (en) 2021-12-31
CN113866742B CN113866742B (en) 2022-02-22

Family

ID=78985803

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111466169.2A Active CN113866742B (en) 2021-12-03 2021-12-03 Method for point cloud processing and target classification of 4D millimeter wave radar

Country Status (3)

Country Link
CN (1) CN113866742B (en)
DE (1) DE112022000017T5 (en)
WO (1) WO2023097971A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115236674A (en) * 2022-06-15 2022-10-25 北京踏歌智行科技有限公司 Mining area environment sensing method based on 4D millimeter wave radar
CN115656962A (en) * 2022-12-26 2023-01-31 南京楚航科技有限公司 Method for identifying height-limited object based on millimeter wave radar
CN115825912A (en) * 2023-01-09 2023-03-21 南京隼眼电子科技有限公司 Radar signal processing method and device and storage medium
CN115840221A (en) * 2023-02-20 2023-03-24 上海几何伙伴智能驾驶有限公司 Method for realizing target feature extraction and multi-target tracking based on 4D millimeter wave radar
WO2023097971A1 (en) * 2021-12-03 2023-06-08 南京楚航科技有限公司 4d millimeter wave radar data processing method
CN116593973A (en) * 2023-03-29 2023-08-15 深圳承泰科技有限公司 Method and system for automatically calibrating installation angle of vehicle-mounted millimeter wave radar
CN116990773A (en) * 2023-09-27 2023-11-03 广州辰创科技发展有限公司 Low-speed small target detection method and device based on self-adaptive threshold and storage medium
CN117250595A (en) * 2023-11-20 2023-12-19 长沙莫之比智能科技有限公司 False alarm suppression method for vehicle-mounted millimeter wave radar metal well lid target
CN117491965A (en) * 2024-01-02 2024-02-02 上海几何伙伴智能驾驶有限公司 Target track starting method based on 4D millimeter wave radar
CN117647807A (en) * 2024-01-30 2024-03-05 安徽隼波科技有限公司 Motor vehicle size estimation method based on millimeter wave radar

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116881385B (en) * 2023-09-08 2023-12-01 中国铁塔股份有限公司 Track smoothing method, track smoothing device, electronic equipment and readable storage medium
CN117647806B (en) * 2024-01-30 2024-04-12 安徽隼波科技有限公司 Point trace condensation and target tracking method based on millimeter wave radar

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110031834A (en) * 2018-01-12 2019-07-19 西安艾索信息技术有限公司 A kind of improved multiple target radar track processing method
US20190302252A1 (en) * 2018-03-27 2019-10-03 Infineon Technologies Ag System and method of monitoring an air flow using a millimeter-wave radar sensor
CN111428573A (en) * 2020-03-02 2020-07-17 南京莱斯电子设备有限公司 Infrared weak and small target detection false alarm suppression method under complex background
CN111929655A (en) * 2020-09-08 2020-11-13 中国电子科技集团公司第三十八研究所 Automobile millimeter wave radar road target tracking method and system
CN112166336A (en) * 2019-09-27 2021-01-01 深圳市大疆创新科技有限公司 Method and device for calibrating pitching installation angle of millimeter wave radar, vehicle control system and vehicle
CN113671481A (en) * 2021-07-21 2021-11-19 西安电子科技大学 3D multi-target tracking processing method based on millimeter wave radar
CN113721234A (en) * 2021-08-30 2021-11-30 南京慧尔视智能科技有限公司 Vehicle-mounted millimeter wave radar point cloud data dynamic and static separation filtering method and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019234795A1 (en) * 2018-06-04 2019-12-12 三菱電機株式会社 Light impingement device
CN113866742B (en) * 2021-12-03 2022-02-22 南京楚航科技有限公司 Method for point cloud processing and target classification of 4D millimeter wave radar

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110031834A (en) * 2018-01-12 2019-07-19 西安艾索信息技术有限公司 A kind of improved multiple target radar track processing method
US20190302252A1 (en) * 2018-03-27 2019-10-03 Infineon Technologies Ag System and method of monitoring an air flow using a millimeter-wave radar sensor
CN112166336A (en) * 2019-09-27 2021-01-01 深圳市大疆创新科技有限公司 Method and device for calibrating pitching installation angle of millimeter wave radar, vehicle control system and vehicle
CN111428573A (en) * 2020-03-02 2020-07-17 南京莱斯电子设备有限公司 Infrared weak and small target detection false alarm suppression method under complex background
CN111929655A (en) * 2020-09-08 2020-11-13 中国电子科技集团公司第三十八研究所 Automobile millimeter wave radar road target tracking method and system
CN113671481A (en) * 2021-07-21 2021-11-19 西安电子科技大学 3D multi-target tracking processing method based on millimeter wave radar
CN113721234A (en) * 2021-08-30 2021-11-30 南京慧尔视智能科技有限公司 Vehicle-mounted millimeter wave radar point cloud data dynamic and static separation filtering method and device

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023097971A1 (en) * 2021-12-03 2023-06-08 南京楚航科技有限公司 4d millimeter wave radar data processing method
CN115236674B (en) * 2022-06-15 2024-06-04 北京踏歌智行科技有限公司 Mining area environment sensing method based on 4D millimeter wave radar
CN115236674A (en) * 2022-06-15 2022-10-25 北京踏歌智行科技有限公司 Mining area environment sensing method based on 4D millimeter wave radar
CN115656962A (en) * 2022-12-26 2023-01-31 南京楚航科技有限公司 Method for identifying height-limited object based on millimeter wave radar
CN115825912A (en) * 2023-01-09 2023-03-21 南京隼眼电子科技有限公司 Radar signal processing method and device and storage medium
CN115825912B (en) * 2023-01-09 2023-05-23 南京隼眼电子科技有限公司 Radar signal processing method, device and storage medium
CN115840221A (en) * 2023-02-20 2023-03-24 上海几何伙伴智能驾驶有限公司 Method for realizing target feature extraction and multi-target tracking based on 4D millimeter wave radar
CN116593973A (en) * 2023-03-29 2023-08-15 深圳承泰科技有限公司 Method and system for automatically calibrating installation angle of vehicle-mounted millimeter wave radar
CN116990773A (en) * 2023-09-27 2023-11-03 广州辰创科技发展有限公司 Low-speed small target detection method and device based on self-adaptive threshold and storage medium
CN117250595A (en) * 2023-11-20 2023-12-19 长沙莫之比智能科技有限公司 False alarm suppression method for vehicle-mounted millimeter wave radar metal well lid target
CN117250595B (en) * 2023-11-20 2024-01-12 长沙莫之比智能科技有限公司 False alarm suppression method for vehicle-mounted millimeter wave radar metal well lid target
CN117491965A (en) * 2024-01-02 2024-02-02 上海几何伙伴智能驾驶有限公司 Target track starting method based on 4D millimeter wave radar
CN117491965B (en) * 2024-01-02 2024-03-19 上海几何伙伴智能驾驶有限公司 Target track starting method based on 4D millimeter wave radar
CN117647807A (en) * 2024-01-30 2024-03-05 安徽隼波科技有限公司 Motor vehicle size estimation method based on millimeter wave radar
CN117647807B (en) * 2024-01-30 2024-04-19 安徽隼波科技有限公司 Motor vehicle size estimation method based on millimeter wave radar

Also Published As

Publication number Publication date
DE112022000017T5 (en) 2023-07-27
CN113866742B (en) 2022-02-22
WO2023097971A1 (en) 2023-06-08

Similar Documents

Publication Publication Date Title
CN113866742B (en) Method for point cloud processing and target classification of 4D millimeter wave radar
US11093799B2 (en) Rare instance classifiers
Wang et al. A point cloud-based robust road curb detection and tracking method
CN109087510B (en) Traffic monitoring method and device
Lin et al. A Real‐Time Vehicle Counting, Speed Estimation, and Classification System Based on Virtual Detection Zone and YOLO
He et al. Obstacle detection of rail transit based on deep learning
CN106228125B (en) Method for detecting lane lines based on integrated study cascade classifier
CN111340855A (en) Road moving target detection method based on track prediction
CN110940971B (en) Radar target point trace recording method and device and storage medium
CN114488194A (en) Method for detecting and identifying targets under structured road of intelligent driving vehicle
CN112990004A (en) Black smoke vehicle detection method based on optical flow method and deep learning convolutional neural network
CN114358140A (en) Rapid capturing method for sparse point cloud aircraft under low visibility
CN114879192A (en) Decision tree vehicle type classification method based on road side millimeter wave radar and electronic equipment
CN117075097B (en) Maritime radar target tracking method and system based on expanded target cluster division
CN117689995A (en) Unknown spacecraft level detection method based on monocular image
Piroli et al. Towards robust 3D object detection in rainy conditions
Huang et al. An improved YOLOv3‐tiny algorithm for vehicle detection in natural scenes
Yang et al. Learn to model and filter point cloud noise for a near-infrared ToF LiDAR in adverse weather
CN113313008B (en) Target and identification tracking method based on YOLOv3 network and mean shift
Qi et al. Vehicle detection under unmanned aerial vehicle based on improved YOLOv3
CN114895274A (en) Guardrail identification method
CN114170196A (en) SAR image small target identification method based on CenterNet2
Wang et al. An Improved Object Detection Method for Underwater Sonar Image Based on PP‐YOLOv2
Abdalwohab et al. Deep learning based camera and radar fusion for object detection and classification
Ding et al. Lane line detection based on YOLOv4

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant