WO2020262772A1 - Radar signal processing method and radar system for human detection and number-of-people detection using deep learning - Google Patents
- Publication number
- WO2020262772A1 (PCT/KR2019/015552)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- index
- distance
- data
- radar signal
- fourier transform
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/015—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/04—Systems determining presence of a target
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Definitions
- This patent relates to radar and AI (Artificial Intelligence) deep learning. It is a technology that detects a target using radar and then applies a clutter removal algorithm and deep learning to classify the detected signal as a person or an object and to determine the number and locations of people.
- AI Artificial Intelligence
- Radar technology has long been used in military applications such as aircraft and has recently been adopted in automobiles. Deep learning is widely used in image processing, medical care, and robotics. Existing radar technology detects surrounding objects and measures their distance and moving speed, but cannot distinguish whether a detected object is a person or a thing.
- An object of the present invention is to monitor, in real time, the presence and number of people in a vehicle or other specific space, to prevent the problems that may arise when a person is left unattended in such a space, for example a vehicle or a building.
- the radar system includes a radar front-end stage that transmits and receives radio signals, a radar signal processing stage that performs primary processing on the collected signals, a clutter removal and data extraction algorithm, and a deep learning stage that makes the final decision.
- the radar front-end stage can generate raw data by receiving the radio wave reflected back from a target.
- the radar signal processing stage first processes the generated raw data.
- in the clutter removal and data extraction algorithm, the clutter generated by objects is removed so that people are distinguished from objects, and data is extracted in a form suitable for building a deep learning model.
- the deep learning stage is composed of algorithms that determine and detect the presence and number of people through deep learning training.
- the FMCW digital radar signal is obtained through N samples per chirp, for each of C chirps, at each of M receivers.
- a step of generating three-dimensional window data by accumulating, on the time axis of the chirp period, two-dimensional data consisting of the data corresponding to a range index and angle index of interest and their neighboring range and angle indexes, and inputting the three-dimensional window data into a convolutional neural network to calculate the target existence probability at the range and angle index of interest, and
- a step of selecting the range and angle index at which a target exists from the target existence probabilities of each range index and each angle index
- a radar system includes a radar front-end stage that acquires an FMCW digital radar signal through N samples per chirp, for C chirps, at each of M receivers,
- a radar signal processing stage that generates 3D window data by accumulating, on the time axis of the chirp period, 2D data consisting of the data corresponding to a range index and angle index of interest and their neighboring range and angle indexes,
- a deep learning stage that inputs the 3D window data into the convolutional neural network to calculate the target existence probability at the range and angle index of interest, and
- a bio-signal and number of people detection stage that selects the range and angle index at which a target exists from the target existence probabilities of each range index and each angle index. The radar signal processing stage performs a third-order Fourier transform on the FMCW digital radar signal, calculates data cubes based on the result of the third-order Fourier transform, generates range-angle maps for each chirp from the data cubes, and subtracts the range-angle maps from one another on the time axis of the chirp period.
- a very complex algorithm is required to detect information such as the presence and number of people using general radar signal processing alone. In addition, the radar sensor's detection time may be long and its false detection rate high.
- the complexity of the overall algorithm can be greatly reduced, so the radar sensor's detection time and false positive rate can be greatly reduced as well. A simpler algorithm also lowers the performance required of the hardware, which can reduce hardware cost.
- FIG. 1 is a flowchart illustrating a configuration of a radar signal processing method according to an exemplary embodiment.
- FIG. 2 illustrates a method of calculating a data cube through a third-order fast Fourier transform.
- FIG. 4 shows a flow of determining the location and number of people through a convolutional neural network.
- FIG. 5 is a diagram illustrating a radar system according to an embodiment of the present invention.
- FIG. 6 is a flowchart illustrating a method of detecting a biosignal according to an embodiment of the present invention.
- FIG. 7 is a flowchart illustrating a method of detecting the presence or absence of persons and the number of persons according to an embodiment of the present invention.
- the terms terminal, unit, block, and stage (-or, -er) mentioned in the present specification may be implemented in hardware, software, or a combination thereof.
- the hardware may include a central processing unit (CPU), graphics processing unit (GPU), vision processing unit (VPU), neural processing unit (NPU), digital signal processor (DSP), system on chip (SoC), field-programmable gate array (FPGA), or application-specific integrated circuit (ASIC).
- the software may be machine language, firmware, embedded code, and application software.
- the terms terminal, unit, block, and stage (-or, -er) mentioned in the present specification may be used with substantially equivalent meanings and may be used interchangeably in some cases.
- the radar signal received from the antenna is amplified, frequency synthesized, filtered, digitally sampled, and supplied as a frequency-modulated continuous-wave (FMCW) digital radar signal according to the proposed invention.
- FMCW frequency-modulated continuous-wave
- FIG. 2 shows a method of calculating (or generating) a data cube by applying a third-order fast Fourier transform (FFT) to the FMCW radar raw data obtained through M receivers, C chirps, and N samples per chirp.
- FFT fast Fourier transform
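As an illustration of the data-cube computation described for FIG. 2, the third-order FFT can be sketched with NumPy; the array sizes and axis ordering here are illustrative assumptions, not taken from the patent:

```python
import numpy as np

# Hypothetical sizes (the patent only fixes the roles of M, C, N):
M, C, N = 4, 16, 64          # M receivers, C chirps, N samples per chirp

# Simulated FMCW raw data cube: samples x chirps x receivers
raw = np.random.randn(N, C, M) + 1j * np.random.randn(N, C, M)

# Third-order FFT: sample axis -> range bins, receiver axis -> angle bins,
# chirp axis -> slow-time bins, yielding the range-angle-time data cube.
data_cube = np.fft.fftn(raw, axes=(0, 2, 1))

print(data_cube.shape)  # (64, 16, 4)
```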
- the clutter removal algorithm is performed by subtracting the range-angle maps generated for each chirp in the data cube from one another on the time axis of the chirp period.
- in each range-angle map, when the target is a person, the signal strength varies on the time axis of the chirp period; when the target is an object, the signal strength remains constant on that axis.
- when the generated range-angle maps are subtracted from one another on the time axis, the components where the target is a person remain, while the components where the target is an object cancel out and disappear. This can be used to distinguish people from objects.
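The subtraction-based clutter removal can be sketched as follows; the map sizes, target positions, and signal model are illustrative assumptions:

```python
import numpy as np

# Range-angle maps stacked along the chirp (slow-time) axis.
n_range, n_angle, n_chirp = 8, 8, 5
rng = np.random.default_rng(0)

static_object = np.zeros((n_range, n_angle))
static_object[2, 3] = 10.0                     # object: constant over chirps
maps = np.repeat(static_object[..., None], n_chirp, axis=2)

# A person's reflection fluctuates over slow time (breathing, micro-motion).
person = rng.normal(0, 1, n_chirp)
maps[5, 6, :] += 5.0 + person

# Subtract consecutive maps along the chirp-period time axis:
clutter_removed = np.diff(maps, axis=2)

# The static object cancels exactly; the fluctuating (human) cell survives.
print(np.abs(clutter_removed[2, 3]).max())   # 0.0
```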
- two-dimensional data is formed from the data corresponding to the range index and angle index of interest and their neighboring indexes, and three-dimensional data accumulated on the time axis of the chirp period is extracted from it.
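A minimal sketch of extracting such a 3D window; the cube sizes, the `window_3d` helper, and the neighborhood width are hypothetical:

```python
import numpy as np

n_range, n_angle, n_chirp = 32, 16, 10
cube = np.random.randn(n_range, n_angle, n_chirp)   # clutter-removed cube

def window_3d(cube, r_idx, a_idx, half=1):
    """2-D patch (index of interest plus neighbors) stacked over chirps."""
    r0, r1 = r_idx - half, r_idx + half + 1
    a0, a1 = a_idx - half, a_idx + half + 1
    return cube[r0:r1, a0:a1, :]   # shape: (2*half+1, 2*half+1, n_chirp)

win = window_3d(cube, r_idx=10, a_idx=7)
print(win.shape)  # (3, 3, 10)
```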
- FIG. 4 illustrates a flow of determining the locations and number of people through a convolutional neural network that receives the extracted range and angle indexes of interest as inputs.
- the range and angle index at which a target exists are selected from the target existence probabilities of each range index and each angle index.
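The selection step can be sketched as simple thresholding of the CNN output probabilities; the probability values and the 0.5 threshold are illustrative assumptions:

```python
import numpy as np

# Rows: range indexes, columns: angle indexes (values are made up).
probs = np.array([[0.05, 0.10, 0.92],
                  [0.08, 0.85, 0.12],
                  [0.03, 0.07, 0.09]])

threshold = 0.5
range_idx, angle_idx = np.nonzero(probs > threshold)
print(range_idx.tolist(), angle_idx.tolist())   # [0, 1] [2, 1]
```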
- a computer-readable recording medium storing a program comprising instructions for executing the radar signal processing method of the present invention described with reference to FIGS. 1 to 4 may be provided.
- the radar system 100 may include a radar front end stage 110, a radar signal processing stage 120, a deep learning stage 130, and a bio-signal and number of people detection stage 140.
- the radar front end stage 110 may detect radar raw data.
- the radar front-end stage 110 may transmit and receive radio signals.
- the radar front-end stage 110 may acquire an FMCW digital radar signal through N samples per chirp, for C chirps, at each of M receivers (M, C, and N are integers greater than or equal to 1).
- the radar signal processing stage 120 may pre-process raw data in a form suitable for deep learning.
- the radar signal processing stage 120 may primarily process the collected signals.
- the radar signal processing stage 120 may store processed data in a form for building a deep learning model.
- the radar signal processing stage 120 may detect a phase of a signal based on raw data.
- the radar signal processing stage 120 may generate three-dimensional window data by accumulating, on the time axis of the chirp period, two-dimensional data consisting of the data corresponding to a range index and angle index of interest and their neighboring range and angle indexes.
- the radar front-end stage 110 may also generate the 3D window data by accumulating, on the time axis of the chirp period, the 2D data consisting of the data corresponding to the range index, angle index, and their neighboring indexes.
- the radar signal processing stage 120 may perform third-order Fourier transform on the FMCW digital radar signal.
- the radar signal processing stage 120 may calculate data cubes based on the result of the third-order Fourier transform.
- the radar signal processing stage 120 may generate range-angle maps for each chirp in the data cubes.
- the radar signal processing stage 120 may perform a clutter removal algorithm by subtracting the range-angle maps from one another on the time axis of the chirp period.
- the radar signal processing stage 120 may perform a first Fourier transform that Fourier-transforms the FMCW digital radar signal in units of the sampling period to generate range data, which is a coefficient value for each range index;
- a second Fourier transform that collects the results of the first Fourier transform from the M receivers, Fourier-transforms them in units of the receive-antenna spacing, and generates angle data, which is a coefficient value for each angle index; and
- a third Fourier transform that collects the results of the second Fourier transform over the C chirps and Fourier-transforms them in units of the chirp period to generate time data, which is a coefficient value for each chirp index.
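The three successive Fourier transforms described above can be sketched axis by axis; as a sanity check, the staged result equals a single third-order FFT over all three axes (sizes and axis layout are illustrative):

```python
import numpy as np

M, C, N = 4, 8, 32                  # receivers, chirps, samples per chirp
raw = np.random.randn(N, M, C) + 1j * np.random.randn(N, M, C)

# First FFT: over the fast-time samples -> range bins.
first = np.fft.fft(raw, axis=0)
# Second FFT: over the M receive antennas -> angle bins.
second = np.fft.fft(first, axis=1)
# Third FFT: over the C chirps -> chirp-index (slow-time) bins.
third = np.fft.fft(second, axis=2)

# Equivalent to one third-order FFT applied over all three axes at once.
print(np.allclose(third, np.fft.fftn(raw)))   # True
```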
- at least one of the first to third Fourier transforms may instead be performed by the deep learning stage 130.
- the deep learning stage 130 may apply deep learning by exploiting the different characteristics of the signals entering the radar when a person is present versus when an object is present, based on the preprocessed raw data.
- the deep learning stage 130 may process the detected phase using deep learning.
- the deep learning stage 130 may form algorithms for detecting and determining the presence of people, the number of people, and biosignals through deep learning.
- the deep learning stage 130 may input the 3D window data into a convolutional neural network to calculate (or derive) the target existence probability at the range and angle index of interest.
- the deep learning stage 130 may perform a third-order Fourier transform on an FMCW digital radar signal, calculate data cubes based on the result of the third-order Fourier transform, generate range-angle maps for each chirp from the data cubes, and/or perform a clutter removal algorithm by subtracting the range-angle maps from one another on the time axis of the chirp period.
- the radar signal processing stage 120 or the deep learning stage 130 may subtract the range-angle maps generated for each chirp on the time axis of the chirp period, thereby retaining the components where the target is a person while the components where the target is an object cancel out.
- this is because, when the target is an object, the signal intensity remains constant on the time axis of the chirp period.
- the bio-signal and number of people detection stage 140 may detect and determine biosignals and the presence and number of people based on the deep learning results.
- the bio-signal and number of people detection stage 140 may select the range and angle index at which a target exists from the target existence probabilities of each range index and each angle index.
- the radar system 100 may include a convolutional neural network.
- the convolutional neural network included in the radar system 100 receives the clutter-removed data cubes as input and exploits three properties: the index value at a location where a person exists is large; that index value varies with the chirp-period time; and, because the positions a person can occupy in a specific space such as a vehicle are limited, the range and angle indexes can be pre-designated for each seat. Using these properties, the network calculates the locations and number of people. By learning the incoming power values and calculating a threshold power for each location, the presence and number of people can be reliably detected.
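The per-seat threshold decision can be sketched as below; the seat cells, threshold values, and power map are all illustrative assumptions, and in the patent the per-location thresholds would come from training rather than being hand-set:

```python
import numpy as np

# Pre-designated (range, angle) cell per seat, as in a vehicle cabin.
seat_cells = {"driver": (10, 3), "passenger": (10, 12),
              "rear_left": (18, 4), "rear_right": (18, 11)}
# Hypothetical learned threshold power per seat.
thresholds = {"driver": 4.0, "passenger": 4.0,
              "rear_left": 3.0, "rear_right": 3.0}

power_map = np.zeros((32, 16))       # clutter-removed power, range x angle
power_map[10, 3] = 7.5               # driver seat occupied
power_map[18, 11] = 5.1              # rear-right seat occupied

# Seats whose received power exceeds the learned threshold count as occupied.
occupied = [s for s, (r, a) in seat_cells.items()
            if power_map[r, a] > thresholds[s]]
print(occupied, len(occupied))       # ['driver', 'rear_right'] 2
```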
- the radar system 100 and each of its components 110, 120, 130, and/or 140 may be implemented as combinations of logic elements such as AND, OR, XOR, and NOR gates, latches, and flip-flops, with dedicated circuits (e.g., Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs)), or as a System on Chip (SoC).
- FPGAs Field Programmable Gate Arrays
- ASICs Application Specific Integrated Circuits
- SoC System on Chip
- FIG. 6 is a flowchart illustrating a method of detecting a biosignal according to an embodiment of the present invention. FIG. 6 will be described with reference to FIG. 5.
- the radar front end 110 may detect radar raw data.
- the radar signal processing stage 120 may perform first and second fast Fourier transforms (FFT) on the raw data.
- the radar signal processing stage 120 may apply a CFAR (Constant False Alarm Rate) algorithm to the result of the fast Fourier transform.
- the radar signal processing stage 120 may detect the phase of the signal based on the raw data.
- the deep learning stage 130 may process the detected phase using deep learning.
- the bio-signal and number of people detection stage 140 may detect the biosignal based on the deep learning results.
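The CFAR step in the flow above can be sketched with a simple 1-D cell-averaging CFAR; this is one common CFAR variant, chosen here for illustration (the patent does not specify which variant is used), and the guard/training window sizes are illustrative:

```python
import numpy as np

def ca_cfar(power, n_train=4, n_guard=2, scale=3.0):
    """Return indices whose power exceeds scale * local noise average."""
    detections = []
    half = n_train + n_guard
    for i in range(half, len(power) - half):
        # Training cells on each side, skipping the guard cells.
        left = power[i - half : i - n_guard]
        right = power[i + n_guard + 1 : i + half + 1]
        noise = np.mean(np.concatenate([left, right]))
        if power[i] > scale * noise:
            detections.append(i)
    return detections

power = np.ones(40)
power[20] = 20.0                     # strong target in flat unit noise
print(ca_cfar(power))                # [20]
```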
- FIG. 7 is a flowchart illustrating a method of detecting the presence or absence of persons and the number of persons according to an embodiment of the present invention. FIG. 7 will be described with reference to FIG. 5.
- the radar front end 110 may detect radar raw data.
- the radar signal processing stage 120 may perform first and second fast Fourier transforms (FFT) on the raw data.
- the radar signal processing stage 120 may apply a CFAR (Constant False Alarm Rate) algorithm to the result of the fast Fourier transform.
- the deep learning stage 130 may apply deep learning by exploiting the different characteristics of the signals entering the radar when a person is present versus when an object is present, based on the preprocessed raw data.
- the bio-signal and number of people detection stage 140 may detect and determine the presence and number of people based on the deep learning results.
- the present invention proposes a radar signal processing method for detecting people and detecting the number of people using deep learning.
- An object of the present invention is to be used for accident prevention and safety management in various application fields by monitoring the presence or absence of people and the number of people in real time.
- a clutter removal algorithm is proposed to distinguish people from objects, and the final presence and locations of people are determined using a deep learning-based convolutional neural network (CNN).
- CNN convolutional neural network
- the monitoring system according to the present invention can achieve a low false positive rate across various situations through deep-learning-based model training. Moreover, unlike systems that pair one sensor with one application, it can simultaneously detect various situations such as driver drowsiness, intoxication, and sudden changes in the driver's condition. The monitoring system can also be applied in industrial fields such as disaster relief sites, indoor navigation, autonomous driving sensors, and people/posture recognition systems.
- the contents described above are specific examples for carrying out the present invention.
- the present invention will include not only the embodiments described above, but also embodiments that can be changed in design or easily changed.
- the present invention will also include techniques that can be easily modified and implemented in the future using the above-described embodiments.
- the field of application utilizing deep learning technology is increasing at a very rapid rate.
- the present invention relates to a radar sensor to which deep learning technology is applied, and industrial applicability is recognized in that the false detection rate of the radar sensor can be greatly reduced.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Mathematical Physics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Computational Linguistics (AREA)
- Software Systems (AREA)
- Artificial Intelligence (AREA)
- Mechanical Engineering (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
Description
Claims (11)
- A radar signal processing method performed by a radar system, the method comprising: acquiring an FMCW digital radar signal through N samples per chirp, for C chirps, at each of M receivers; performing a third-order Fourier transform on the FMCW digital radar signal and calculating data cubes based on the result of the third-order Fourier transform (M, C, and N are integers greater than or equal to 1); generating range-angle maps for the respective chirps from the data cubes; performing a clutter removal algorithm by subtracting the range-angle maps from one another on the time axis of the chirp period; generating three-dimensional window data by accumulating, on the time axis of the chirp period, two-dimensional data consisting of the data corresponding to a range index and angle index of interest and their neighboring range and angle indexes; inputting the three-dimensional window data into a convolutional neural network to calculate a target existence probability at the range and angle index of interest; and selecting the range and angle index at which a target exists from the target existence probabilities of each range index and each angle index.
- The method of claim 1, wherein calculating the data cubes comprises: a first Fourier transform step of Fourier-transforming the FMCW digital radar signal in units of the sampling period to generate range data, which is a coefficient value for each range index; a second Fourier transform step of collecting the results of the first Fourier transform step from the M receivers, Fourier-transforming them in units of the receive-antenna spacing, and generating angle data, which is a coefficient value for each angle index; and a third Fourier transform step of collecting the results of the second Fourier transform step over the C chirps and Fourier-transforming them in units of the chirp period to generate time data, which is a coefficient value for each chirp index.
- The method of claim 1, wherein performing the clutter removal algorithm comprises subtracting the range-angle maps generated for the respective chirps from one another on the time axis of the chirp period, thereby retaining the parts where the target is a person and canceling out the parts where the target is an object.
- The method of claim 3, wherein, in the range-angle maps generated for the respective chirps in the data cubes of performing the clutter removal algorithm: when the target is the person, the signal intensity varies on the time axis of the chirp period; and when the target is the object, the signal intensity remains constant on the time axis of the chirp period.
- The method of claim 1, further comprising calculating the locations and number of people using the convolutional neural network and calculating a threshold power for each location by learning the values of the incoming power at each location, wherein the convolutional neural network is characterized by: receiving the clutter-removed data cubes as input, the index value of a location where a person exists being large; the index value of a location where a person exists changing with the chirp-period time; and, because the positions a person can occupy in a specific space such as a vehicle are limited, the range index and angle index being pre-designated for each seat.
- A computer-readable recording medium storing a program comprising instructions for causing a computer to execute the radar signal processing method of claim 1.
- A radar system comprising: a radar front-end stage that acquires an FMCW digital radar signal through N samples per chirp, for C chirps, at each of M receivers (M, C, and N are integers greater than or equal to 1); a radar signal processing stage that generates three-dimensional window data by accumulating, on the time axis of the chirp period, two-dimensional data consisting of the data corresponding to a range index and angle index of interest and their neighboring range and angle indexes; a deep learning stage that inputs the three-dimensional window data into a convolutional neural network to calculate a target existence probability at the range and angle index of interest; and a bio-signal and number of people detection stage that selects the range and angle index at which a target exists from the target existence probabilities of each range index and each angle index, wherein the radar signal processing stage: performs a third-order Fourier transform on the FMCW digital radar signal, calculates data cubes based on the result of the third-order Fourier transform, generates range-angle maps for the respective chirps from the data cubes, and performs a clutter removal algorithm by subtracting the range-angle maps from one another on the time axis of the chirp period.
- The radar system of claim 7, wherein the radar signal processing stage performs: a first Fourier transform that Fourier-transforms the FMCW digital radar signal in units of the sampling period to generate range data, which is a coefficient value for each range index; a second Fourier transform that collects the results of the first Fourier transform from the M receivers, Fourier-transforms them in units of the receive-antenna spacing, and generates angle data, which is a coefficient value for each angle index; and a third Fourier transform that collects the results of the second Fourier transform over the C chirps and Fourier-transforms them in units of the chirp period to generate time data, which is a coefficient value for each chirp index.
- The radar system of claim 7, wherein the radar signal processing stage subtracts the range-angle maps generated for the respective chirps from one another on the time axis of the chirp period, thereby retaining the parts where the target is a person and canceling out the parts where the target is an object.
- The radar system of claim 9, wherein, in the range-angle maps generated by the radar signal processing stage: when the target is the person, the signal intensity varies on the time axis of the chirp period; and when the target is the object, the signal intensity remains constant on the time axis of the chirp period.
- The radar system of claim 7, wherein the radar system calculates the locations and number of people using the convolutional neural network and calculates a threshold power for each location by learning the values of the incoming power at each location, wherein the convolutional neural network is characterized by: receiving the clutter-removed data cubes as input, the index value of a location where a person exists being large; the index value of a location where a person exists changing with the chirp-period time; and, because the positions a person can occupy in a specific space such as a vehicle are limited, the range index and angle index being pre-designated for each seat.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2019-0076734 | 2019-06-27 | ||
KR20190076734 | 2019-06-27 | ||
KR1020190144252A KR20210001840A (en) | 2019-06-27 | 2019-11-12 | Radar signal processing and radar system for occupant detection and people counting using deep learning |
KR10-2019-0144252 | 2019-11-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020262772A1 true WO2020262772A1 (en) | 2020-12-30 |
Family
ID=74060936
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/015552 WO2020262772A1 (en) | 2019-06-27 | 2019-11-14 | Radar signal processing method and radar system for human detection and number-of-people detection using deep learning |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020262772A1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20010099500A (en) * | 2001-10-11 | 2001-11-09 | 윤준섭 | Artificial intelligence type radar system for auto traffic control |
KR20170124984A (en) * | 2017-10-24 | 2017-11-13 | Industry-Academia Cooperation Foundation, Sejong University | Method for processing data of ground penetrating radar |
KR101927364B1 (en) * | 2017-12-13 | 2018-12-10 | S-1 Corporation | Outside Intruding and Monitoring Radar System Based on Deep-Learning and Method thereof |
KR101987846B1 (en) * | 2018-07-26 | 2019-06-11 | Korea Institute of Ocean Science and Technology | Apparatus and method for avoiding ship collision by image analysis of monitor of radar device |
2019
- 2019-11-14 WO PCT/KR2019/015552 patent/WO2020262772A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
GURBUZ, SEVGI ZUBEYDE: "Radar-Based Human-Motion Recognition With Deep Learning: Promising Applications for Indoor Monitoring", IEEE SIGNAL PROCESSING MAGAZINE, vol. 36, no. 4, 26 June 2019 (2019-06-26), pages 16 - 28, XP011732376, Retrieved from the Internet <URL:https://ieeexplore.ieee.org/document/8746862> [retrieved on 20200306], DOI: 10.1109/MSP.2018.2890128 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9429650B2 (en) | Fusion of obstacle detection using radar and camera | |
JP5297078B2 (en) | Method for detecting moving object in blind spot of vehicle, and blind spot detection device | |
KR20210001840A (en) | Radar signal processing and radar system for occupant detection and people counting using deep learning | |
US9812008B2 (en) | Vehicle detection and tracking based on wheels using radar and vision | |
AU2004269298B2 (en) | Target detection improvements using temporal integrations and spatial fusion | |
CN109597065B (en) | False alarm suppression method and device for through-wall radar detection | |
CN109891262A (en) | Object detection device | |
CN104290730B (en) | A kind of radar applied to senior emergency braking system and video information fusion method | |
CN113640792B (en) | Machine learning-based millimeter wave radar detection method for in-vehicle living body | |
CN111497741B (en) | Collision early warning method and device | |
CN114814832A (en) | Millimeter wave radar-based real-time monitoring system and method for human body falling behavior | |
CN110703272B (en) | Surrounding target vehicle state estimation method based on vehicle-to-vehicle communication and GMPHD filtering | |
CN108839614A (en) | A kind of vehicle safety deceleration system for electric vehicle | |
WO2020262772A1 (en) | Radar signal processing method and radar system for human detection and number-of-people detection using deep learning | |
KR20220052526A (en) | Radar signal processing for people counting and localization in vehicle using doppler effect and cfar for each range | |
CN110287957B (en) | Low-slow small target positioning method and positioning device | |
CN113740855B (en) | Space occupation identification method and device, millimeter wave radar and storage medium | |
CN107256382A (en) | Virtual bumper control method and system based on image recognition | |
JPWO2022190719A5 (en) | ||
CN113682259A (en) | Vehicle door opening early warning anti-collision system and control method | |
KR20210136542A (en) | Radar System for Vehicle And Control Method Therefor | |
CN112672047B (en) | Image acquisition system and image processing method | |
US20230184920A1 (en) | Object detection device and object detection method | |
CN114814778B (en) | Carrier speed calculation method based on millimeter wave radar | |
CN116125466B (en) | Ship personnel hidden threat object carrying detection method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19935651 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19935651 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 04/07/2022) |
|