CN109581345A - Object detecting and tracking method and system based on millimetre-wave radar - Google Patents
- Publication number
- CN109581345A (application CN201811432907.XA)
- Authority
- CN
- China
- Prior art keywords
- location data
- target
- data
- tracking
- radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/08—Systems for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/50—Systems of measurement based on relative movement of target
- G01S13/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
The invention discloses an object detecting and tracking method and system based on millimetre-wave radar. The method comprises the following steps: obtaining radar detection data of a target area and a monitor video of the same area; processing the radar detection data to obtain location data A of a target; recognising the monitor video to obtain location data B of the target; comparing location data A with location data B to obtain their centroid distance d; judging whether the centroid distance d is less than or equal to a threshold d_min; if so, judging the two to be the same target and fusing location data A and location data B as the object location data; if not, judging them to be different targets. When location data A and B are judged to represent the same target, they are fused to obtain the true location data of the target; fusing the radar coordinate data with the monitor-video location data improves positioning accuracy, and the target can be efficiently positioned and tracked in combination with the video image.
Description
Technical field
The present invention relates to the field of radar-assisted positioning, and in particular to an object detecting and tracking method and system based on millimetre-wave radar.
Background technique
With economic development, living standards keep improving and the automobile has become widespread. While the automobile brings convenience to people's travel, traffic accidents occur increasingly often. If road conditions are judged only by the driver's vision and hearing, there are obvious perception blind areas; since vehicles generally travel fast, the reaction time and reaction distance left to the driver when a problem appears are often insufficient. This not only seriously endangers people's lives and property, but also brings great hidden dangers to social stability.
Advanced driver-assistance systems (ADAS) use various on-board sensors to sense the surrounding environment at all times while the vehicle is driving, collect data, recognise, detect and track static and dynamic objects, and, combined with navigation map data, perform systematic computation and analysis, allowing the driver to perceive possible dangers in advance and effectively improving the comfort and safety of driving.
However, existing ADAS suffer from inaccurate target positioning.
Summary of the invention
To overcome the above defects of the prior art, the object of the present invention is to provide an object detecting and tracking method and system based on millimetre-wave radar.
To achieve this object, the technical scheme of the invention is as follows:
An intelligent target detecting and tracking method based on millimetre-wave radar, comprising the following steps:
obtaining radar detection data of a target area and a monitor video of the target area;
processing the radar detection data to obtain location data A of a target;
recognising the monitor video to obtain location data B of the target;
comparing location data A with location data B to obtain the centroid distance d of location data A and location data B;
judging whether the centroid distance d is less than or equal to a threshold d_min;
if so, judging the two to be the same target and fusing location data A and location data B as the object location data;
if not, judging them to be different targets.
Further, the step of processing the radar detection data to obtain location data A of the target comprises:
calculating the distance and speed data of the target from the difference-frequency (beat) signal of the radar transmitted signal and the echo signal;
converting the distance and speed data into pixel coordinates in the image coordinate system, taken as location data A.
Further, the step of comparing location data A with location data B to obtain their centroid distance d comprises:
calculating the centroid distance matrix of location data A and location data B,
D_{n,m} = centroid_distance{A, B};
extracting the minimum value of each row of the centroid distance matrix,
D' = min{ d_{k,j} | 1 ≤ j ≤ m }, 1 ≤ k ≤ n;
where D_{n,m} denotes the n × m centroid distance matrix of location data A and location data B, d_{k,j} denotes its entry in row k and column j, and D' collects the minimum of each row of D_{n,m}.
Further, the step of judging whether the centroid distance d is less than or equal to the threshold d_min comprises:
comparing d_{k,j} with the threshold d_min to judge whether d_{k,j} ≤ d_min, by the formula:
M_d = { d_{k,j} | d_{k,j} ≤ d_min, d_{k,j} ∈ D' }
where d_min is the set threshold and M_d denotes the set of targets whose centroid distance does not exceed d_min.
Further, the step of fusing location data A and location data B as the object location data comprises:
taking the centroid of location data B as the reference and, with reference to location data A, adjusting the location data according to the geometrical properties of the target.
Further, after the step of fusing location data A and location data B as the object location data, the method comprises:
extracting feature information of the target region of interest;
fusing the multiple feature information of the target, and obtaining the target trajectory and tracking the target with a multiple-target tracking algorithm.
Further, after the step of judging the targets to be different, the method comprises:
obtaining the location data B' of an adjacent target in the monitor video as new location data B;
repeating the step of comparing location data A with location data B to obtain the centroid distance of location data A and location data B.
The invention also provides an intelligent target detecting and tracking system based on millimetre-wave radar, comprising:
a data capture unit for obtaining radar detection data of a target area and a monitor video of the target area;
a data processing unit for processing the radar detection data to obtain location data A of a target;
a data identification unit for recognising the monitor video to obtain location data B of the target;
a centroid comparing unit for comparing location data A with location data B to obtain their centroid distance d;
a centroid judging unit for judging whether the centroid distance d is less than or equal to a threshold d_min, and if not, judging the targets to be different;
a data fusion unit for, when the two are judged to be the same target, fusing location data A and location data B as the object location data.
Further, the system comprises a feature extraction unit, a target tracking unit and a neighbouring comparison unit:
the feature extraction unit extracts feature information of the target region of interest;
the target tracking unit fuses the multiple feature information of the target and obtains the target trajectory and tracks the target with a multiple-target tracking algorithm;
the neighbouring comparison unit obtains the location data B' of an adjacent target in the monitor video as new location data B.
Further, the data fusion unit comprises a data fusion module that takes the centroid of location data B as the reference and, with reference to location data A, adjusts the location data according to the geometrical properties of the target.
The data processing unit comprises a data processing module and a coordinate conversion module: the data processing module calculates the distance and speed data of the target from the beat signal of the radar transmitted signal and the echo signal; the coordinate conversion module converts the distance and speed data into pixel coordinates in the image coordinate system, taken as location data A.
The beneficial effects of the present invention are: location data A of the target is obtained from the radar data and location data B of the target is obtained from the monitor video; after location data A and location data B are determined to represent the same target, they are fused to obtain the true location data of the target. Fusing the radar coordinate data with the monitor-video location data improves positioning accuracy, and the target can be efficiently positioned and tracked in combination with the video image.
Detailed description of the invention
Fig. 1 is a functional block diagram of a millimetre-wave radar of the present invention;
Fig. 2 is an antenna layout diagram of a millimetre-wave radar of the present invention;
Fig. 3 is a flow chart of an object detecting and tracking method based on millimetre-wave radar according to one embodiment of the invention;
Fig. 4 is a flow chart of obtaining location data A according to the present invention;
Fig. 5 is a flow chart of obtaining the centroid distance d of location data A and location data B according to the present invention;
Fig. 6 is a flow chart of an object detecting and tracking method based on millimetre-wave radar according to another embodiment of the invention;
Fig. 7 is a functional block diagram of an object detecting and tracking system based on millimetre-wave radar of the present invention;
Fig. 8 is a functional block diagram of the data processing unit of the present invention;
Fig. 9 is a functional block diagram of the data fusion module of the present invention.
Specific embodiment
To illustrate the idea and purpose of the invention, the present invention is further described below in conjunction with the drawings and specific embodiments.
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
It is to be understood that the directional indications (up, down, left, right, front, rear, etc.) in the embodiments of the present invention are only used to explain the relative positional relationships and motion of the components under a certain particular pose (as shown in the drawings); if that particular pose changes, the directional indication changes correspondingly. A connection may be a direct connection or an indirect connection.
In addition, descriptions such as "first" and "second" in the present invention are used for description purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the technical features concerned; a feature defined with "first" or "second" may thus explicitly or implicitly include at least one such feature. The technical solutions of the embodiments may be combined with each other, but only where the combination can be realised by those of ordinary skill in the art; when a combination of technical solutions is contradictory or cannot be realised, the combination shall be deemed not to exist and falls outside the protection scope claimed by the present invention.
Unless otherwise indicated, "/" herein means "or".
FPGA stands for Field-Programmable Gate Array. DSP stands for Digital Signal Processing.
Referring to Figs. 1-5, a specific embodiment of the invention proposes an intelligent target detecting and tracking method based on millimetre-wave radar, comprising the following steps:
S1, obtaining radar detection data of a target area and a monitor video of the target area;
S2, processing the radar detection data to obtain location data A of a target;
S3, recognising the monitor video to obtain location data B of the target;
S4, comparing location data A with location data B to obtain the centroid distance matrix D_{n,m} of location data A and location data B;
S5, judging whether the centroid distances d_{k,j} are less than or equal to the threshold d_min;
S6, if so, judging the two to be the same target and fusing location data A and location data B as the object location data;
S7, if not, judging them to be different targets.
For step S1, the radar data is obtained by the millimetre-wave radar, and the monitor video is obtained in real time by a monitoring camera. Specifically, the millimetre-wave radar and the monitoring camera are aligned to the same target area; by obtaining the radar data and the video data and fusing them, moving and non-moving targets in the target area can be measured and tracked in real time.
As shown in Figs. 1-2, a millimetre-wave radar comprises a radio-frequency module for sending and receiving radar signals, a frequency generator, an analog-to-digital conversion module, an FPGA module and two DSP modules. In operation, the frequency generator generates the transmitted signal, which is emitted by the radio-frequency module; after the radio-frequency module receives the echo signal, it is sampled by the analog-to-digital conversion (ADC) module and transferred to the FPGA module for data pre-processing, and finally passed on for data processing. One of the two DSP modules receives the I/Q quadrature signals pre-processed by the FPGA module and runs the signal-processing algorithms; the other DSP runs the track-related algorithms to realise target tracking.
The FPGA module together with the DSP modules constitutes a multi-core heterogeneous signal-processing board, which processes the linear frequency-modulated continuous wave with the corresponding algorithms to obtain the distance and speed data of the target.
The radio-frequency module is mainly responsible for signal generation and frequency multiplication up to 24 GHz. The signal source is the frequency generator, which produces linear frequency-modulated and single-frequency signals. Specifically, the frequency generator is an AD9910 chip controlled by the FPGA module; the AD9910 can produce single-frequency, linear frequency-modulated and frequency-hopping signals, and has the advantages of low cost, low power consumption, high resolution and fast conversion times. The 24 GHz transmitted signal is generated through stage-by-stage up-conversion, filtering, amplification and power splitting in the radio-frequency module and emitted through the transmitting antenna. The radio-frequency module also performs down-conversion, low-noise amplification and similar operations; its output is mixed with the signal received by the receiving antennas to obtain an intermediate-frequency signal, which passes through an anti-aliasing filter and is then sampled by the ADC module.
The radio-frequency module comprises one transmitting antenna and three receiving antennas (receiving antenna 1, receiving antenna 2 and receiving antenna 3), the three receiving antennas being arranged in an L shape. The transmitting frequency is adjustable up to 24 GHz with a maximum bandwidth of 500 MHz. With this one-transmit, three-receive radio-frequency unit, the distance and speed data of the target can be measured, as well as the azimuth and pitch angle of the target.
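As an illustration of how azimuth and pitch can be derived from such a one-transmit, three-receive L-shaped array, the sketch below uses the standard phase-comparison relation sin(θ) = Δφ·λ/(2π·d). The function name and the half-wavelength baseline are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def angle_from_phase(delta_phi, wavelength, baseline):
    """Arrival angle (rad) from the phase difference between two receive
    antennas: sin(theta) = delta_phi * wavelength / (2 * pi * baseline).
    With an L-shaped layout, the horizontal antenna pair yields azimuth
    and the vertical pair yields pitch."""
    s = delta_phi * wavelength / (2.0 * np.pi * baseline)
    return np.arcsin(np.clip(s, -1.0, 1.0))

# At 24 GHz the wavelength is c / f = 3e8 / 24e9 = 12.5 mm; a common
# (assumed) baseline choice is half a wavelength.
WAVELENGTH_24GHZ = 3e8 / 24e9
```

With a half-wavelength baseline, a phase difference of π/2 corresponds to an arrival angle of 30 degrees.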
For step S2, owing to the special waveform system of the linear frequency-modulated continuous wave, the difference-frequency (beat) signal is obtained by mixing the transmitted and received signals of the radar. The centre frequency of the beat signal contains the range information and the velocity information brought by the Doppler effect, so the distance and speed data of the targets contained in the radar data can be acquired with signal-processing algorithms such as the FFT.
With reference to Fig. 4, step S2 comprises:
S21, calculating the distance and speed data of the target from the beat signal of the radar transmitted signal and the echo signal;
S22, converting the distance and speed data into pixel coordinates in the image coordinate system, taken as location data A.
For steps S21 and S22, owing to the special waveform system of the linear frequency-modulated continuous wave, the corresponding beat signal can be obtained from the transmitted signal and the echo signal. Its centre frequency contains the range information and the Doppler-effect velocity information, so operations such as the FFT and peak-frequency search can be applied to the beat signal to obtain the distance and velocity information in the radar data.
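The FFT-and-peak-search step above can be sketched as follows. This is a minimal single-target example assuming an ideal sawtooth FMCW chirp, with generic parameter names (sampling rate, sweep bandwidth, chirp duration) rather than the radar's actual settings.

```python
import numpy as np

C = 3e8  # speed of light, m/s

def estimate_range(beat_signal, fs, bandwidth, chirp_time):
    """Locate the peak of the beat-signal spectrum and convert the peak
    (beat) frequency to range via R = c * f_beat * T / (2 * B)."""
    n = len(beat_signal)
    spectrum = np.abs(np.fft.rfft(beat_signal))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    f_beat = freqs[np.argmax(spectrum)]
    return C * f_beat * chirp_time / (2.0 * bandwidth)
```

For instance, a target 30 m away under a 500 MHz sweep over 1 ms produces a 100 kHz beat tone, and feeding such a tone to `estimate_range` recovers the 30 m range. Velocity would be obtained analogously from the Doppler shift across successive chirps (a second FFT), omitted here for brevity.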
Meanwhile according to monitor video obtain location data B include coordinate value be image coordinate system in coordinate value, institute
To need to convert the distance and speed data that obtain according to radar data to the coordinate value in image coordinate system, guarantee coordinate system
Unanimously, it is convenient for subsequent carry out data fusion.
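The radar-to-image conversion can be sketched as below. The use of a ground-plane homography `H` is an assumption for illustration; in practice `H` (or full camera intrinsics and extrinsics) would come from an offline radar-camera calibration, which the patent does not detail.

```python
import numpy as np

def radar_to_pixel(range_m, azimuth_rad, H):
    """Map a radar detection (range, azimuth) to image pixel coordinates.
    The detection is first laid out on the ground plane, then projected
    by the 3x3 homography H from a radar-camera calibration."""
    x = range_m * np.sin(azimuth_rad)  # lateral offset on the ground plane
    y = range_m * np.cos(azimuth_rad)  # forward distance
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```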
For step S3, the coordinate values of the boundary of the target object in the video can be obtained from the monitor video, and the specific location of the target object in the image coordinate system can be determined from these boundary coordinates. Specifically, location data A contains the position coordinate data and speed data of the target, while location data B contains the boundary coordinate data, target category data and class probability data of the target; all the above coordinate values are in the image coordinate system.
For step S4, after location data A and location data B are obtained, deviations exist between the two kinds of location data in both the number of targets and the coordinate data, so the corresponding centroid distances D_{n,m} of the two must be calculated in order to match them.
With reference to Fig. 5, step S4 is implemented as follows:
S41, calculating the centroid distance matrix of location data A and location data B,
D_{n,m} = centroid_distance{A, B};
S42, extracting the minimum value of each row of the centroid distance matrix,
D' = min{ d_{k,j} | 1 ≤ j ≤ m }, 1 ≤ k ≤ n;
where D_{n,m} denotes the n × m centroid distance matrix of location data A and location data B, d_{k,j} denotes its entry in row k and column j, and D' collects the minimum of each row of D_{n,m}.
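Steps S41-S42 translate directly into code. The sketch below computes the n × m Euclidean centroid distance matrix D_{n,m} and the row minima D'; the function names are illustrative.

```python
import numpy as np

def centroid_distance_matrix(A, B):
    """D_{n,m}: Euclidean distance between every radar centroid in A
    (shape n x 2) and every video centroid in B (shape m x 2)."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    return np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)

def row_minima(D):
    """D': for each radar target k, the distance to its nearest video target."""
    return D.min(axis=1)
```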
For step S5, after the centroid distances d_{k,j} are calculated, they are compared with the set threshold d_min; when d_{k,j} is less than or equal to d_min, location data A and location data B are matched with each other.
With reference to figure X, step S5 is implemented as follows: d_{k,j} is compared with the threshold d_min to judge whether d_{k,j} ≤ d_min, by the formula:
M_d = { d_{k,j} | d_{k,j} ≤ d_min, d_{k,j} ∈ D' }
where d_min is the set threshold and M_d denotes the set of targets whose centroid distance does not exceed d_min.
Afterwards, the same targets and the different targets also need to be processed separately: A' denotes the targets in A other than those judged to be the same target in A and B, and B' denotes the targets in B other than those judged to be the same target.
Through the above calculation, the set M_d of targets judged to be the same in location data A and location data B is obtained, together with the sets A' and B' of the remaining, unmatched targets.
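The threshold test and the split into the matched set and the leftover sets A' and B' can be sketched as a greedy nearest-neighbour matching. The one-to-one constraint and tie-breaking below are illustrative assumptions, since the patent gives only the membership formula.

```python
def match_targets(D, d_min):
    """Greedily match each radar target to its nearest video target when the
    centroid distance is within d_min. Returns (matches, a_rest, b_rest):
    matched index pairs, plus the unmatched radar (A') and video (B') indices."""
    n, m = len(D), len(D[0])
    matches, used_b = [], set()
    for k in range(n):
        j = min(range(m), key=lambda col: D[k][col])  # nearest video target
        if D[k][j] <= d_min and j not in used_b:
            matches.append((k, j))
            used_b.add(j)
    matched_a = {k for k, _ in matches}
    a_rest = [k for k in range(n) if k not in matched_a]
    b_rest = [j for j in range(m) if j not in used_b]
    return matches, a_rest, b_rest
```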
For step S6, after step S5 determines that location data A and location data B are location data of the same target, data fusion is performed on location data A and location data B to guarantee the accuracy of the final location data of the target.
Step S6 specifically is: taking the centroid of location data B as the reference and, with reference to location data A, adjusting the location data according to the geometrical properties of the target.
Specifically, with the centroid of location data B as the reference and the coordinate information of location data A for comparison, the geometrically symmetric characteristics of the target are used for fine adjustment. For example, when the target is a road vehicle, the colour histogram of the target vehicle can be calculated to find the optimum symmetrically matched position.
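As a toy illustration of symmetry-based fine adjustment, the sketch below scans a grayscale patch for the column about which it is most left-right symmetric. The real method would use the vehicle colour histogram; this simplified per-pixel score is an assumption made for brevity.

```python
import numpy as np

def best_symmetry_column(patch):
    """Return the column index about which the patch is most left-right
    symmetric, by minimising the mean squared difference between the left
    half and the mirrored right half."""
    h, w = patch.shape
    best_c, best_err = w // 2, float("inf")
    for c in range(1, w - 1):
        half = min(c, w - 1 - c)           # widest window that fits
        left = patch[:, c - half:c]
        right = patch[:, c + 1:c + 1 + half][:, ::-1]  # mirrored right half
        err = np.mean((left - right) ** 2)
        if err < best_err:
            best_c, best_err = c, err
    return best_c
```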
For step S7, after step S5 determines that location data A and location data B are location data of different targets, fine adjustment is performed on the basis of the centroid positions corresponding to location data A and location data B; the fine-adjustment method is the same as for the same target, with reference to the nearest-neighbour information.
Specifically, the intelligent target detecting and tracking method based on millimetre-wave radar of the invention can be used for road vehicle positioning or other fields requiring precise positioning, and the detected target may be a moving target or a static target.
Location data A of the target is obtained from the radar data and location data B of the target is obtained from the monitor video; after location data A and location data B are determined to represent the same target, they are fused to obtain the true location data of the target. Fusing the radar coordinate data with the monitor-video location data improves positioning accuracy, and the target can be efficiently positioned and tracked in combination with the video image.
With reference to Fig. 6, another embodiment of the invention proposes a further intelligent target detecting and tracking method based on millimetre-wave radar. It differs from the previous embodiment in that, after step S6, it comprises the following steps:
S81, extracting feature information of the target region of interest;
S82, fusing the multiple feature information of the target, and obtaining the target trajectory and tracking the target with a multiple-target tracking algorithm.
For steps S81 and S82, in order to improve the subsequent tracking of the target, target feature information is extracted for the region of interest (ROI) of the precisely positioned target using the shallow residual neural network proposed in the Deep SORT algorithm. Track following is then performed on the fused-data targets with a multiple-target tracking algorithm, which includes two parts: track management and tracking processing.
(1) Track management: mainly responsible for judging and deciding the track state, accurately finding the appearance of new tracks and deleting lost tracks.
(2) Target tracking: the core is the multi-target data association algorithm. In the algorithm, the association between targets and observations is represented by intuitionistic fuzzy membership degrees, which are obtained with an improved intuitionistic fuzzy c-means clustering algorithm.
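The track-management behaviour described above (new-track creation, lost-track deletion, per-frame association) can be sketched minimally as follows. The patent's intuitionistic-fuzzy c-means association is replaced here by a plain nearest-neighbour distance gate for clarity; the class name and parameters are illustrative assumptions.

```python
import numpy as np

class SimpleTracker:
    """Minimal multi-target tracker: per-frame nearest-neighbour association
    with a distance gate, plus track birth and lost-track deletion."""

    def __init__(self, gate=5.0, max_missed=3):
        self.gate, self.max_missed = gate, max_missed
        self.tracks = {}  # track id -> (position, consecutive missed frames)
        self._next_id = 0

    def update(self, detections):
        """Associate one frame of detections; return the live track ids."""
        unmatched = list(range(len(detections)))
        for tid, (pos, missed) in list(self.tracks.items()):
            if unmatched:
                j = min(unmatched,
                        key=lambda j: np.linalg.norm(np.subtract(detections[j], pos)))
                if np.linalg.norm(np.subtract(detections[j], pos)) <= self.gate:
                    self.tracks[tid] = (np.asarray(detections[j], float), 0)
                    unmatched.remove(j)
                    continue
            missed += 1
            if missed > self.max_missed:
                del self.tracks[tid]  # lost track: delete
            else:
                self.tracks[tid] = (pos, missed)
        for j in unmatched:  # detection explained by no track: start a new one
            self.tracks[self._next_id] = (np.asarray(detections[j], float), 0)
            self._next_id += 1
        return sorted(self.tracks)
```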
After step S7, the method comprises the following step:
S9, obtaining the location data B' of an adjacent target in the monitor video as new location data B.
For step S9, after it is executed, steps S4-S5 are repeated until a location data B' correspondingly matches location data A, after which the location data are fused by step S6.
With reference to Figs. 7-9, another embodiment of the invention also proposes an intelligent target detecting and tracking system based on millimetre-wave radar, comprising:
a data capture unit 10 for obtaining radar detection data of a target area and a monitor video of the target area;
a data processing unit 20 for processing the radar detection data to obtain location data A of a target;
a data identification unit 30 for recognising the monitor video to obtain location data B of the target;
a centroid comparing unit 40 for comparing location data A with location data B to obtain their centroid distance d;
a centroid judging unit 50 for judging whether the centroid distance d is less than or equal to the threshold d_min, and if not, judging the targets to be different;
a data fusion unit 60 for, when the two are judged to be the same target, fusing location data A and location data B as the object location data;
a feature extraction unit 70 for extracting feature information of the target region of interest;
a target tracking unit 80 for fusing the multiple feature information of the target and obtaining the target trajectory and tracking the target with a multiple-target tracking algorithm;
a neighbouring comparison unit 90 for obtaining the location data B' of an adjacent target in the monitor video as new location data B.
For the data capture unit 10, the radar data is obtained by the millimetre-wave radar and the monitor video is obtained in real time by a monitoring camera; the millimetre-wave radar and the monitoring camera are aligned to the same target area, and by obtaining and fusing the radar data and the video data, moving and non-moving targets in the target area can be measured and tracked in real time. The structure and operation of the millimetre-wave radar (the radio-frequency module, frequency generator, ADC module, FPGA module and two DSP modules, with one transmitting antenna and three L-shaped receiving antennas) are as described above with reference to Figs. 1-2.
For the data processing unit 20: owing to the particular waveform regime of the linear frequency-modulated continuous wave, mixing the signal transmitted by the radar with the received signal yields a beat signal whose center frequency contains both range information and the velocity information introduced by the Doppler effect. Signal-processing algorithms such as the FFT can therefore recover the range and velocity data of the targets contained in the radar data.
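The range recovery described above can be sketched as follows. This is an illustrative example, not the patent's implementation; the chirp duration, sample rate, and simulated target are assumed values, with only the 500MHz bandwidth taken from the specification:

```python
import numpy as np

# Hedged sketch: recover target range from an FMCW beat signal via an
# FFT peak search. All parameters except the bandwidth are assumptions.
c = 3e8            # speed of light, m/s
B = 500e6          # sweep bandwidth (500 MHz, per the specification)
T = 1e-3           # chirp duration, s (assumed)
fs = 1e6           # ADC sample rate, Hz (assumed)
slope = B / T      # chirp slope, Hz/s

R_true = 75.0                     # simulated target range, m
f_beat = 2 * R_true * slope / c   # beat frequency produced by that range

t = np.arange(int(fs * T)) / fs
beat = np.cos(2 * np.pi * f_beat * t)     # idealized noise-free beat signal

spec = np.abs(np.fft.rfft(beat))
freqs = np.fft.rfftfreq(len(beat), 1 / fs)
f_peak = freqs[np.argmax(spec)]           # peak-frequency search
R_est = f_peak * c / (2 * slope)          # invert the beat-frequency relation
```

Velocity follows analogously from a second FFT across chirps (the Doppler dimension), which this single-chirp sketch omits.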
With reference to Fig. 8, the data processing unit 20 includes a data processing module 21 and a coordinate conversion module 22.
The data processing module 21 is configured to calculate the range and velocity data of the target from the beat signal of the radar transmit signal and the echo signal.
The coordinate conversion module 22 is configured to convert the range and velocity data into pixel coordinates in the image coordinate system, as location data A.
For the data processing module 21 and the coordinate conversion module 22: because of the particular waveform regime of the linear frequency-modulated continuous wave, the corresponding beat signal can be obtained from the transmitted signal and the echo signal. The center frequency of the beat signal contains range information and Doppler-induced velocity information, so operations such as the FFT and peak-frequency search can be applied to the beat signal to obtain the range and velocity information in the radar data.
Meanwhile, the coordinate values contained in location data B, obtained from the surveillance video, are expressed in the image coordinate system. The range and velocity data obtained from the radar data must therefore be converted into coordinate values in the image coordinate system, so that the coordinate systems are consistent and the subsequent data fusion is straightforward.
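The conversion into the image coordinate system might look like the following sketch. The patent does not specify the transform; the ground-plane homography `H` below is a purely hypothetical calibration result, standing in for whatever radar-camera calibration the system uses:

```python
import numpy as np

# Hedged sketch: project a radar measurement (range, azimuth) onto image
# pixel coordinates via an assumed ground-plane-to-image homography.
H = np.array([[8.0,  0.0, 640.0],
              [0.0, -8.0, 720.0],
              [0.0,  0.0,   1.0]])      # hypothetical calibration matrix

def radar_to_pixel(rng_m, azimuth_rad):
    # Polar radar measurement -> Cartesian point on the ground plane
    x = rng_m * np.sin(azimuth_rad)
    y = rng_m * np.cos(azimuth_rad)
    p = H @ np.array([x, y, 1.0])       # homogeneous projection
    return p[0] / p[2], p[1] / p[2]     # pixel coordinates (u, v)

u, v = radar_to_pixel(50.0, 0.1)        # 50 m range, ~5.7 deg azimuth
```

After this step, location data A and location data B live in the same pixel coordinate frame, which is the precondition for the centroid comparison that follows.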
For the data identification unit 30: from the surveillance video, the coordinate values of the boundary of a target object in the video can be obtained, and the specific position of the target object in the image coordinate system can be determined from those boundary coordinate values.
Specifically, location data A contains the position-coordinate data and velocity data of the target, while location data B contains the target's boundary-coordinate data, target-category data, and class-probability data; all of the above coordinate values are expressed in the image coordinate system.
For the centroid comparing unit 40: after location data A and location data B have been obtained, the two kinds of location data deviate from each other in both target count and coordinate values, so the centroid distance d_{n,m} between the two kinds of location data must be calculated in order to match them.
Specifically, the centroid comparing unit 40 operates as follows:
Calculate the centroid-distance matrix of location data A and location data B:
D_{n,m} = centroid_distance{A, B}.
Extract the minimum value of each row of the centroid-distance matrix:
D' = min{ d_{k,j} | 1 ≤ j ≤ m } (1 ≤ k ≤ n).
Here D_{n,m} denotes the n × m centroid-distance matrix between location data A and location data B, d_{k,j} denotes its elements, and D' collects the minimum value of each row of D_{n,m}.
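The centroid-distance matrix and its row minima can be computed directly, for example as below (the centroid coordinates are made-up values for illustration only):

```python
import numpy as np

# Sketch of the centroid-matching step: pairwise centroid distances between
# radar detections (A) and video detections (B), then the per-row minima D'.
A = np.array([[100.0, 200.0],
              [400.0, 250.0]])                  # n radar centroids (pixels)
B = np.array([[102.0, 198.0],
              [600.0,  90.0],
              [405.0, 252.0]])                  # m video centroids (pixels)

# n x m centroid-distance matrix D_{n,m}
D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)

D_prime = D.min(axis=1)      # minimum of each row: best candidate distance
best_j = D.argmin(axis=1)    # column index (target in B) of each row minimum
```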
For the centroid judging unit 50: after the centroid distance d_{k,j} has been calculated, it is compared against a set threshold d_min; if d_{k,j} is less than or equal to d_min, location data A and location data B are considered to match each other.
Specifically, the centroid judging unit 50 works as follows:
Compare d_{k,j} with the threshold d_min to judge whether d_{k,j} is less than or equal to d_min, using the following formula:
M_d = { d_{k,j} | d_{k,j} ≤ d_min, d_{k,j} ∈ D' }
where d_min is the set threshold and M_d denotes the set of targets whose centroid distance does not exceed d_min.
Afterwards, the same targets and the different targets are separated, where A' denotes the targets in A that are not the same target as any target in B, and B' denotes the targets in B that are not the same target as any target in A.
Through the above calculation, the set M_d of targets judged to be the same in location data A and location data B, and the sets A' and B' of non-identical targets, are obtained.
For the data fusion unit 60: after the centroid judging unit 50 determines that location data A and location data B are location data of the same target, data-fusion processing is applied to location data A and location data B to guarantee the accuracy of the final location data of the target.
With reference to Fig. 9, the data fusion unit 60 includes a data fusion module 61, which takes the centroid of location data B as the reference and, consulting location data A, adjusts the location data according to the geometric properties of the target.
For the data fusion module 61: taking the centroid of location data B as the reference, the coordinate information of location data A is consulted, and the geometric symmetry of the target is then used for fine adjustment. For example, when the target is a road vehicle, the color histogram of the target vehicle can be computed to find the optimal symmetric-match position.
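The symmetric-match search can be illustrated as follows. The patent mentions color histograms but gives no scoring formula, so the mean-absolute-difference score and the synthetic grayscale patch below are assumptions:

```python
import numpy as np

# Hedged sketch of symmetry-based fine adjustment: slide a candidate vertical
# symmetry axis across a patch and pick the position where the left half best
# mirrors the right half (vehicles are roughly left-right symmetric).
def best_symmetry_axis(patch, half_width):
    h, w = patch.shape
    best_x, best_score = None, np.inf
    for x in range(half_width, w - half_width):
        left = patch[:, x - half_width:x]
        right = patch[:, x:x + half_width][:, ::-1]   # mirrored right half
        score = np.abs(left - right).mean()           # assumed match score
        if score < best_score:
            best_x, best_score = x, score
    return best_x

# Synthetic patch that is exactly symmetric about column 12
base = np.random.default_rng(0).random((8, 12))
patch = np.hstack([base, base[:, ::-1]])              # width 24, axis at x = 12
axis = best_symmetry_axis(patch, half_width=6)
```

In a real deployment the score would be computed on the color histograms the specification mentions rather than raw pixels, but the sliding-axis search is the same.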
After the centroid judging unit 50 determines that location data A and location data B are location data of different targets, fine adjustment is performed with respect to the centroid positions corresponding to location data A and location data B; the fine-adjustment method is the same as for the same target, and nearest-neighbor information is also consulted.
Specifically, the millimeter-wave-radar-based intelligent target detection and tracking of the present invention can be used for road-vehicle localization or other fields requiring precise positioning, and the detected target may be either a moving target or a stationary target.
For the feature extraction unit 70 and the target tracking unit 80: to improve the subsequent tracking performance, the feature extraction unit 70 applies the shallow residual neural network proposed in the Deep SORT algorithm to the region of interest (ROI) of the precisely located target to extract the target's feature information. The target tracking unit 80 tracks the trajectories of the targets corresponding to the fused data with a multi-target tracking algorithm, which comprises two parts: track management and tracking processing.
(1) Track management: mainly responsible for judging and deciding track states, accurately detecting the appearance of new tracks and deleting lost tracks.
(2) Target tracking: its core is the multi-target data-association algorithm. In this algorithm, the association between targets and observations is represented by intuitionistic fuzzy membership degrees, which are obtained by an improved intuitionistic fuzzy c-means clustering algorithm.
For the neighbouring comparison unit 90: the location data B' of a neighbouring target in the surveillance video is obtained and used as the new location data B. After the neighbouring comparison unit 90 executes, the data are returned to the centroid comparing unit 40 and the judgement is repeated until a location data B' that matches location data A is found, and the location data are then fused.
In this scheme, the location data A of a target is obtained from the radar data, and the location data B of the target is obtained from the surveillance video. After it is determined that location data A and location data B represent the same target, location data A and location data B are fused to obtain the true location data of the target. Fusing the radar location data with the surveillance-video location data improves the positioning accuracy and, in combination with the video image, allows the target to be located and tracked efficiently.
The above description is only a preferred embodiment of the present invention and is not intended to limit the scope of the invention; any equivalent structure or equivalent process transformation made using the contents of the description and drawings of the present invention, applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.
Claims (10)
1. A millimeter-wave-radar-based intelligent target detection and tracking method, characterized by comprising the following steps:
obtaining radar detection data of a target area and a surveillance video of the target area;
processing the radar detection data to obtain location data A of a target;
identifying the surveillance video to obtain location data B of the target;
comparing location data A with location data B to obtain the centroid distance d between location data A and location data B;
judging whether the centroid distance d is less than or equal to a threshold d_min;
if so, judging them to be the same target and fusing location data A and location data B as the target location data;
if not, judging them to be different targets.
2. The millimeter-wave-radar-based intelligent target detection and tracking method according to claim 1, characterized in that the step of processing the radar detection data to obtain location data A of the target comprises:
calculating the range and velocity data of the target from the beat signal of the radar transmit signal and the echo signal;
converting the range and velocity data into pixel coordinates in the image coordinate system, as location data A.
3. The millimeter-wave-radar-based intelligent target detection and tracking method according to claim 1, characterized in that the step of comparing location data A with location data B to obtain the centroid distance d comprises:
calculating the centroid-distance matrix of location data A and location data B:
D_{n,m} = centroid_distance{A, B};
extracting the minimum value of each row of the centroid-distance matrix:
D' = min{ d_{k,j} | 1 ≤ j ≤ m } (1 ≤ k ≤ n);
where D_{n,m} denotes the n × m centroid-distance matrix of location data A and location data B, d_{k,j} denotes its elements, and D' collects the minimum value of each row of D_{n,m}.
4. The millimeter-wave-radar-based intelligent target detection and tracking method according to claim 3, characterized in that the step of judging whether the centroid distance d is less than or equal to the threshold d_min comprises:
comparing d_{k,j} with the threshold d_min to judge whether d_{k,j} is less than or equal to d_min, using the following formula:
M_d = { d_{k,j} | d_{k,j} ≤ d_min, d_{k,j} ∈ D' }
where d_min is the set threshold and M_d denotes the set of targets whose centroid distance does not exceed d_min.
5. The millimeter-wave-radar-based intelligent target detection and tracking method according to claim 1, characterized in that the step of fusing location data A and location data B as the target location data comprises:
taking the centroid of location data B as the reference and, consulting location data A, adjusting the location data according to the geometric properties of the target.
6. The millimeter-wave-radar-based intelligent target detection and tracking method according to claim 5, characterized in that, after the step of fusing location data A and location data B as the target location data, the method comprises:
extracting feature information of the target region of interest;
fusing the multi-feature information of the target, and obtaining the target trajectory and tracking the target using a multi-target tracking algorithm.
7. The millimeter-wave-radar-based intelligent target detection and tracking method according to claim 1, characterized in that, after the step of judging them to be different targets, the method comprises:
obtaining the location data B' of a neighbouring target in the surveillance video as the new location data B;
repeating the step of comparing location data A with location data B to obtain the centroid distance d between location data A and location data B.
8. A millimeter-wave-radar-based intelligent target detection and tracking system, characterized by comprising:
a data acquisition unit, for obtaining radar detection data of a target area and a surveillance video of the target area;
a data processing unit, for processing the radar detection data to obtain location data A of a target;
a data identification unit, for identifying the surveillance video to obtain location data B of the target;
a centroid comparing unit, for comparing location data A with location data B to obtain the centroid distance d between location data A and location data B;
a centroid judging unit, for judging whether the centroid distance d is less than or equal to a threshold d_min, and if not, judging them to be different targets;
a data fusion unit, for fusing location data A and location data B as the target location data when they are judged to be the same target.
9. The millimeter-wave-radar-based intelligent target detection and tracking system according to claim 8, characterized by further comprising a feature extraction unit, a target tracking unit, and a neighbouring comparison unit, wherein:
the feature extraction unit is for extracting feature information of the target region of interest;
the target tracking unit is for fusing the multi-feature information of the target, and for obtaining the target trajectory and tracking the target using a multi-target tracking algorithm;
the neighbouring comparison unit is for obtaining the location data B' of a neighbouring target in the surveillance video as the new location data B.
10. The millimeter-wave-radar-based intelligent target detection and tracking system according to claim 8, characterized in that the data fusion unit includes a data fusion module, the data fusion module being for taking the centroid of location data B as the reference and, consulting location data A, adjusting the location data according to the geometric properties of the target;
the data processing unit includes a data processing module and a coordinate conversion module, the data processing module being for calculating the range and velocity data of the target from the beat signal of the radar transmit signal and the echo signal, and the coordinate conversion module being for converting the range and velocity data into pixel coordinates in the image coordinate system, as location data A.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811432907.XA CN109581345A (en) | 2018-11-28 | 2018-11-28 | Object detecting and tracking method and system based on millimetre-wave radar |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109581345A true CN109581345A (en) | 2019-04-05 |
Family
ID=65924744
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811432907.XA Pending CN109581345A (en) | 2018-11-28 | 2018-11-28 | Object detecting and tracking method and system based on millimetre-wave radar |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109581345A (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109946703A (en) * | 2019-04-10 | 2019-06-28 | 北京小马智行科技有限公司 | A kind of sensor attitude method of adjustment and device |
CN110118966A (en) * | 2019-05-28 | 2019-08-13 | 长沙莫之比智能科技有限公司 | Personnel's detection and number system based on millimetre-wave radar |
CN110161505A (en) * | 2019-05-21 | 2019-08-23 | 一汽轿车股份有限公司 | One kind being based on millimetre-wave radar rear anti-crash method for early warning |
CN110187334A (en) * | 2019-05-28 | 2019-08-30 | 深圳大学 | A kind of target monitoring method, apparatus and computer readable storage medium |
CN110208793A (en) * | 2019-04-26 | 2019-09-06 | 纵目科技(上海)股份有限公司 | DAS (Driver Assistant System), method, terminal and medium based on millimetre-wave radar |
CN110225449A (en) * | 2019-05-22 | 2019-09-10 | 东南大学 | It is a kind of based on millimeter wave CRAN 3D positioning, test the speed and environment mapping method |
CN110320507A (en) * | 2019-06-25 | 2019-10-11 | 成都九洲迪飞科技有限责任公司 | A kind of low small slow target detects automatically, tracks, identifying system |
CN110673606A (en) * | 2019-09-24 | 2020-01-10 | 芜湖酷哇机器人产业技术研究院有限公司 | Edge cleaning method and system of sweeper |
CN110738846A (en) * | 2019-09-27 | 2020-01-31 | 同济大学 | Vehicle behavior monitoring system based on radar and video group and implementation method thereof |
CN110865368A (en) * | 2019-11-30 | 2020-03-06 | 山西禾源科技股份有限公司 | Radar video data fusion method based on artificial intelligence |
CN111007880A (en) * | 2019-12-24 | 2020-04-14 | 桂林电子科技大学 | Extended target tracking method based on automobile radar |
CN111127701A (en) * | 2019-12-24 | 2020-05-08 | 武汉光庭信息技术股份有限公司 | Vehicle failure scene detection method and system |
CN111257882A (en) * | 2020-03-19 | 2020-06-09 | 北京三快在线科技有限公司 | Data fusion method and device, unmanned equipment and readable storage medium |
CN111856446A (en) * | 2020-05-22 | 2020-10-30 | 青岛若愚科技有限公司 | Network monitoring system based on millimeter wave radar and millimeter wave antenna array structure |
CN111856445A (en) * | 2019-04-11 | 2020-10-30 | 杭州海康威视数字技术股份有限公司 | Target detection method, device, equipment and system |
CN111986232A (en) * | 2020-08-13 | 2020-11-24 | 上海高仙自动化科技发展有限公司 | Target object detection method, target object detection device, robot and storage medium |
WO2020237501A1 (en) * | 2019-05-28 | 2020-12-03 | 深圳大学 | Multi-source collaborative road vehicle monitoring system |
CN112034445A (en) * | 2020-08-17 | 2020-12-04 | 东南大学 | Vehicle motion trail tracking method and system based on millimeter wave radar |
CN112379362A (en) * | 2020-10-23 | 2021-02-19 | 连云港杰瑞电子有限公司 | Event self-adaptive acquisition equipment and method based on multi-source data fusion |
CN112565343A (en) * | 2020-11-13 | 2021-03-26 | 杭州捷尚智能电网技术有限公司 | Substation operation track management and control method based on millimeter wave radar fusion video analysis |
CN112699319A (en) * | 2021-03-23 | 2021-04-23 | 上海迹寻科技有限公司 | Space clutter signal calibration method and device |
CN113687341A (en) * | 2021-08-16 | 2021-11-23 | 山东沂蒙交通发展集团有限公司 | Holographic intersection sensing method based on multi-source sensor |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103823208A (en) * | 2012-11-16 | 2014-05-28 | 通用汽车环球科技运作有限责任公司 | Method and apparatus for state of health estimation of object sensing fusion system |
CN104637059A (en) * | 2015-02-09 | 2015-05-20 | 吉林大学 | Night preceding vehicle detection method based on millimeter-wave radar and machine vision |
CN106842188A (en) * | 2016-12-27 | 2017-06-13 | 上海思致汽车工程技术有限公司 | A kind of object detection fusing device and method based on multisensor |
CN106908783A (en) * | 2017-02-23 | 2017-06-30 | 苏州大学 | Obstacle detection method based on multi-sensor information fusion |
CN107089231A (en) * | 2017-03-27 | 2017-08-25 | 中国第汽车股份有限公司 | It is a kind of automatic with car drive-control system and its method |
CN108509918A (en) * | 2018-04-03 | 2018-09-07 | 中国人民解放军国防科技大学 | Target detection and tracking method fusing laser point cloud and image |
Non-Patent Citations (1)
Title |
---|
Zhao Chunling et al.: "A radar and infrared data fusion method for improving the accuracy of target parameter estimation", Electronics Optics & Control (《电光与控制》) * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190405 |