CN112633101A - Obstacle speed detection method and device - Google Patents


Info

Publication number
CN112633101A
CN112633101A (application CN202011476736.8A)
Authority
CN
China
Prior art keywords
obstacle
video data
radar
speed
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011476736.8A
Other languages
Chinese (zh)
Inventor
陈海波
李逸岳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenlan Artificial Intelligence Shenzhen Co Ltd
Original Assignee
Shenlan Artificial Intelligence Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenlan Artificial Intelligence Shenzhen Co Ltd
Priority to CN202011476736.8A
Publication of CN112633101A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S 13/50 Systems of measurement based on relative movement of target
    • G01S 13/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S 13/585 Velocity or trajectory determination systems; Sense-of-movement determination systems processing the video signal in order to evaluate or display the velocity value
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S 13/50 Systems of measurement based on relative movement of target
    • G01S 13/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S 13/589 Velocity or trajectory determination systems; Sense-of-movement determination systems measuring the velocity vector
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S 13/50 Systems of measurement based on relative movement of target
    • G01S 13/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S 13/60 Velocity or trajectory determination systems; Sense-of-movement determination systems wherein the transmitter and receiver are mounted on the moving object, e.g. for determining ground speed, drift angle, ground track
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S 15/50 Systems of measurement, based on relative movement of the target
    • G01S 15/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S 15/588 Velocity or trajectory determination systems; Sense-of-movement determination systems measuring the velocity vector
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S 15/50 Systems of measurement, based on relative movement of the target
    • G01S 15/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S 15/60 Velocity or trajectory determination systems; Sense-of-movement determination systems wherein the transmitter and receiver are mounted on the moving object, e.g. for determining ground speed, drift angle, ground track
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/93 Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S 15/931 Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/50 Systems of measurement based on relative movement of target
    • G01S 17/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The application provides an obstacle speed detection method and device. The obstacle speed detection method comprises the following steps: acquiring radar data acquired by a radar and video data acquired by a camera; matching the radar data and the video data; fusing the matched radar data and video data based on the weight characteristics of the radar data and the weight characteristics of the video data, and determining position information of the obstacle; and determining an obstacle speed based on a target time threshold and the obstacle position information corresponding to the target time threshold. According to the obstacle speed detection method and device, the radar data and the video data are matched, fused based on their respective weight characteristics, and the obstacle position information is determined from the fusion result, so that the obstacle speed within the target time threshold is obtained, the confidence of the detection is more objective, and the accuracy of obstacle speed detection can be improved.

Description

Obstacle speed detection method and device
Technical Field
The application relates to the technical field of unmanned driving, and in particular to an obstacle speed detection method and device.
Background
With the development of vehicle technology, vehicles are becoming increasingly information-driven and intelligent. For example, unmanned driving technology needs to detect obstacles appearing in front of a vehicle and obtain their speed, so as to control the vehicle or remind the driver for driving assistance.
At present, speed detection of dangerous targets in the driving area in front of a vehicle using a single sensor frequently suffers from high false-detection and missed-detection rates: the amount of information obtained with a single sensor is small, the recognition result is inaccurate, and such a method cannot adapt to complex road environments.
Disclosure of Invention
The application provides an obstacle speed detection method and device, which aim to make the confidence of obstacle speed detection more objective and to improve the accuracy of obstacle speed detection.
The application provides an obstacle speed detection method, which comprises the following steps: acquiring radar data acquired by a radar and video data acquired by a camera; matching the radar data and the video data; fusing the matched radar data and video data based on the weight characteristics of the radar data and the weight characteristics of the video data, and determining position information of the obstacle; and determining an obstacle speed based on a target time threshold and the obstacle position information corresponding to the target time threshold.
According to the obstacle speed detection method provided by the application, the weight characteristic of the radar data is obtained based on a first detection speed obtained by the radar detecting an obstacle sample and a speed label corresponding to the obstacle sample; the weight characteristic of the video data is obtained based on a second detection speed obtained by the camera detecting the obstacle sample and the speed label corresponding to the obstacle sample.
According to the obstacle speed detection method provided by the application, the weight characteristic of the radar data is a first covariance matrix corresponding to the first detection speed and the speed label, and the weight characteristic of the video data is a second covariance matrix corresponding to the second detection speed and the speed label.
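As a rough sketch of how such weight characteristics might be computed offline, the snippet below estimates each sensor's error covariance against labelled obstacle samples. The estimator, the 1-D speeds, and all sample values are illustrative assumptions, not details given by the application:

```python
import numpy as np

def weight_characteristic(detected_speeds, speed_labels):
    """Estimate a sensor's weight characteristic as the covariance of its
    detection errors relative to the labelled speeds of obstacle samples."""
    errors = np.asarray(detected_speeds) - np.asarray(speed_labels)
    # Covariance of the detection errors; for 1-D speeds this is a scalar variance.
    return np.cov(errors, rowvar=False)

# Hypothetical calibration data: radar and camera detections vs. ground-truth labels.
labels = [10.0, 12.0, 8.0, 15.0]
radar_cov = weight_characteristic([10.1, 11.9, 8.2, 15.1], labels)
camera_cov = weight_characteristic([10.5, 11.2, 8.8, 14.3], labels)
# A sensor with a lower error covariance would receive a larger weight during fusion.
```
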
According to the obstacle speed detection method provided by the application, determining the obstacle speed based on the obstacle position information and a preset time threshold comprises: acquiring an obstacle displacement characteristic from the obstacle position information based on the preset time threshold; and obtaining the obstacle speed based on the obstacle displacement characteristic and the preset time threshold.
According to the obstacle speed detection method provided by the application, matching the radar data and the video data comprises: performing spatial calibration processing on the radar data and the video data; and performing time synchronization processing on the radar data and the video data.
According to the obstacle speed detection method provided by the application, performing the spatial calibration processing on the radar data and the video data comprises: setting the coordinate system of the radar data and the coordinate system of the video data to coincide.
According to the obstacle speed detection method provided by the application, performing the time synchronization processing on the radar data and the video data comprises: aligning the respective time points of the radar data with the respective time points of the video data.
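A minimal sketch of such time-point alignment, assuming a simple nearest-timestamp pairing with a tolerance (both the pairing strategy and the 0.05 s tolerance are illustrative choices not specified by the application):

```python
from bisect import bisect_left

def align_time_points(radar_times, video_times, tolerance=0.05):
    """Pair each radar time point with the nearest video time point.
    A simple nearest-neighbour alignment over sorted timestamp lists."""
    pairs = []
    for t in radar_times:
        i = bisect_left(video_times, t)
        # Candidate video time points immediately before and after t.
        candidates = [c for c in (i - 1, i) if 0 <= c < len(video_times)]
        best = min(candidates, key=lambda c: abs(video_times[c] - t))
        if abs(video_times[best] - t) <= tolerance:
            pairs.append((t, video_times[best]))
    return pairs

# Hypothetical timestamps in seconds: radar at 20 Hz, video at 30 Hz.
pairs = align_time_points([0.00, 0.05, 0.10], [0.000, 0.033, 0.067, 0.100])
```
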
The present application further provides an obstacle speed detection device, comprising: an acquisition module for acquiring radar data acquired by a radar and video data acquired by a camera; a matching module for matching the radar data and the video data; a fusion module for fusing the matched radar data and video data based on the weight characteristics of the radar data and the weight characteristics of the video data to determine the position information of the obstacle; and a determination module for determining an obstacle speed based on a target time threshold and the obstacle position information corresponding to the target time threshold.
According to the obstacle speed detection device provided by the application, the determination module comprises: a first determination submodule for acquiring an obstacle displacement characteristic from the obstacle position information based on a preset time threshold; and a second determination submodule for obtaining the obstacle speed based on the obstacle displacement characteristic and the preset time threshold.
According to the obstacle speed detection device provided by the application, the matching module comprises: a first matching submodule for performing spatial calibration processing on the radar data and the video data; and a second matching submodule for performing time synchronization processing on the radar data and the video data.
The present application further provides an electronic device, comprising a memory, a processor and a computer program stored on the memory and operable on the processor, wherein the processor implements the steps of any of the above-mentioned obstacle speed detection methods when executing the computer program.
The present application also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of obstacle speed detection as described in any of the above.
According to the obstacle speed detection method, the radar data and the video data are matched, fused based on the weight characteristics of the radar data and of the video data, and the obstacle position information is determined from the fusion result, so that the obstacle speed within the target time threshold is obtained, the confidence of the obstacle speed detection is more objective, and the accuracy of obstacle speed detection can be improved.
Drawings
To illustrate the technical solutions in the present application or the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. The drawings described below are obviously only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of an obstacle speed detection method provided in the present application;
fig. 2 is a second schematic flow chart of the obstacle speed detection method provided in the present application;
fig. 3 is a third schematic flow chart of the obstacle speed detection method provided in the present application;
fig. 4 is a schematic structural diagram of an obstacle speed detection apparatus provided in the present application;
fig. 5 is a schematic structural diagram of a matching module of the obstacle speed detection apparatus provided in the present application;
fig. 6 is a schematic structural diagram of a determination module of the obstacle speed detection apparatus provided in the present application;
fig. 7 is a schematic structural diagram of an electronic device provided in the present application.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The obstacle speed detection method and apparatus of the present application are described below with reference to fig. 1 to 5.
In the obstacle speed detection method, a radar and a camera are used in combination to detect an obstacle in front of a vehicle. The radar and the camera can be mounted at the front of the vehicle, for example on the bumper.
The obstacle speed detection method is applicable to various types of vehicles: for example, the vehicle can be an automobile, a truck, a logistics vehicle, a garbage truck, a sweeper, or another vehicle capable of travelling. The obstacle in front of the vehicle can be a moving object such as a motor vehicle, a pedestrian, a bicycle, an electric vehicle or an animal, or a traffic or municipal facility such as a guardrail, a signal-lamp post or a roadside tree.
As shown in fig. 1, the present application provides an obstacle speed detection method, which includes the following steps 110 to 140.
Step 110: acquiring radar data acquired by a radar and video data acquired by a camera.
The radar may be: laser radar, millimeter wave radar, ultrasonic radar, or the like.
Here, a millimeter wave radar is taken as an example. A millimeter wave radar performs detection in the millimeter wave band, with an operating frequency generally of 24 GHz to 300 GHz and a wavelength of 1 to 10 mm, between the centimeter-wave and light-wave bands. It accurately detects the direction and distance of a target by transmitting electromagnetic waves toward an obstacle and receiving the echoes, enabling accurate speed and distance measurement all day and in all weather. The millimeter wave radar combines the advantages of microwave radar and photoelectric radar; compared with an ultrasonic radar, it is small, lightweight, and has high spatial resolution. Compared with optical sensors such as infrared sensors, laser sensors and cameras, the millimeter wave radar penetrates fog, smoke and dust well and works around the clock in all weather. In addition, its anti-interference capability is superior to that of other vehicle-mounted sensors. The operating frequencies of millimeter wave radars used on vehicles are typically 24 GHz and 77 GHz.
Meanwhile, the laser radar (lidar) is also an important sensor in the field of automatic driving. It detects a target using laser light: by scanning at 600 or 1200 revolutions per minute, the lidar obtains very detailed real-time three-dimensional point cloud data, including the three-dimensional coordinates, distance, azimuth angle, reflected laser intensity, laser code, time, and so on of the target. Common lidars come in single-line, 4-line, 16-line, 32-line, 64-line and 128-line versions; they are high-precision sensors with good stability and high robustness. However, lidar is costly, and the laser is strongly affected by the atmosphere and weather: atmospheric attenuation and severe weather reduce the working distance, atmospheric turbulence reduces the measurement accuracy, and with a narrow laser beam it is difficult to search for and capture a target. Generally, other equipment is first used to capture the target roughly and quickly over a large airspace, after which the lidar performs precision tracking measurement of the target.
The camera can be a monocular camera, a binocular stereo vision camera, a panoramic vision camera, or an infrared camera. A monocular camera is taken as an example here; the monocular camera can capture video of the area in front of the vehicle.
The monocular camera is mainly used to detect and identify characteristic symbols, such as lane line detection, traffic sign recognition, traffic light recognition, and pedestrian and vehicle detection. Although the reliability of vision detection is currently not high, vision computation based on machine learning is an essential part of popularizing automatic driving.
The radar and the camera face the area in front of the vehicle. When they are installed, the field of view of the radar and the field of view of the camera can be made to overlap, that is, the acquisition ranges of the two are approximately the same.
The radar can acquire radar data containing the obstacle, and the camera can acquire video data containing the obstacle; both the radar data and the video data are continuous files ordered by time point.
In practical application, the radar transmits electromagnetic waves outward, receives the radar signals reflected by an obstacle, and processes these signals to obtain radar data in the form of plots changing over time; the radar data includes the distance, direction and pitch of the obstacle. The radar signals can also be preprocessed: for example, the measurement sets obtained over multiple scan revolutions can be associated to obtain the track of the obstacle, and errors in the radar data can be corrected through filtering algorithms and data processing.
The camera captures multiple frames of images of the obstacle, and over time these frames form the video data. Video data is a continuous image sequence composed of a group of consecutive images; apart from their order of appearance, the images carry no further structural information.
Step 120: matching the radar data and the video data.
It can be understood that the radar data and the video data have different formats, so their formats are adjusted here. For example, the size of each frame of the radar data and the video data can be scaled so that the two can be matched correspondingly, and the radar data and the video data intercepted at a given moment describe the same obstacle.
Step 130: fusing the matched radar data and video data based on the weight characteristics of the radar data and the weight characteristics of the video data, and determining the position information of the obstacle.
It can be understood that the weight characteristics of the radar data and of the video data are determined when the vehicle is commissioned. The weight characteristic of the radar data is related to the physical characteristics of the radar and can represent the recognition accuracy of the radar; the weight characteristic of the video data is related to the physical characteristics of the camera and can represent the recognition accuracy of the camera.
The matched radar data and video data are put together and weighted according to their weight characteristics to determine the respective proportions of the radar data and the video data, for example 60% for the radar data and 40% for the video data. The radar data and the video data are then fused: corresponding pixel points of the radar data and of the video data are searched and matched, and the fusion result is output. In other words, the radar data consists of a number of pixel points and the video data consists of a number of pixel points; the pixel points of the two are matched correspondingly in temporal and spatial order, and the radar data and the video data are fused to obtain fusion data. For example, the feature points in the radar data may be weighted at 60% and the feature points in the video data at 40%, and the weighted feature points of the radar data and the weighted feature points of the video data are then superimposed. That is, each feature point in the radar data and each feature point in the video data are superimposed according to the weight characteristics to obtain the fusion data: 60% · x + 40% · y = z, where x denotes a feature point of the radar data, y a feature point of the video data, and z a feature point of the fusion data.
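The weighted superposition described above can be sketched as follows; the 60%/40% split mirrors the example, and the 1-D feature points are hypothetical values:

```python
def fuse_feature_points(radar_points, video_points, w_radar=0.6, w_video=0.4):
    """Fuse matched radar and video feature points by weighted superposition,
    z = w_radar * x + w_video * y. The 60%/40% weights are illustrative."""
    assert abs(w_radar + w_video - 1.0) < 1e-9, "weights should sum to 1"
    return [w_radar * x + w_video * y for x, y in zip(radar_points, video_points)]

# Matched 1-D feature points from both sensors (hypothetical values).
fused = fuse_feature_points([1.0, 2.0, 3.0], [1.5, 2.5, 3.5])
# fused is approximately [1.2, 2.2, 3.2]
```
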
The fusion data contains the position information of the obstacle, which may include the position of the obstacle and the moving direction of the obstacle.
Step 140: determining an obstacle speed based on a target time threshold and the obstacle position information corresponding to the target time threshold.
It can be understood that the obstacle position information contains position information of the obstacle at different time points, and may include the position of the obstacle and the moving direction of the obstacle. A preset target time threshold is given; the target time threshold is a time period, and the obstacle position information within that period is looked up in the obstacle position information.
The change in the obstacle position information over the target time threshold can then be divided by the target time threshold to obtain the obstacle speed.
The preset target time threshold may be, for example, 0.01 s, and the obstacle speed is determined from the change of the obstacle position information within that 0.01 s; the obstacle speed includes the moving direction of the obstacle and the absolute value of its moving speed.
For example, taking the vehicle start time as time 0: at time 0, the obstacle position information includes position (0, 0) and direction due north; at time 0.01 s, it includes position (0, 0.2) and direction due north. The obstacle speed is then (0.2 - 0)/0.01 = 20 m/s.
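The worked example above can be reproduced with a small sketch; treating the positions as 2-D (x, y) coordinates is an assumption consistent with the pairs in the example:

```python
def obstacle_speed(position_a, position_b, target_time_threshold):
    """Obstacle speed as displacement over the target time threshold,
    for 2-D (x, y) positions as in the worked example."""
    dx = position_b[0] - position_a[0]
    dy = position_b[1] - position_a[1]
    displacement = (dx ** 2 + dy ** 2) ** 0.5
    return displacement / target_time_threshold

# Position changes from (0, 0) to (0, 0.2) within a 0.01 s target time threshold.
speed = obstacle_speed((0.0, 0.0), (0.0, 0.2), 0.01)
# speed is approximately 20 m/s, matching the example above.
```
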
After the obstacle speed is recognized, the vehicle's host computer can make a corresponding judgment based on the current vehicle speed and the obstacle speed, so as to control the direction of the vehicle, or a voice prompt can be issued to the driver or shown on the in-vehicle display, reminding the driver to adjust in time and avoid safety accidents.
It is to be noted that the millimeter wave radar obtains the distance, speed, and angle of a target object mainly by transmitting an electromagnetic wave toward it and receiving the echo. The vision scheme is somewhat more complex: a monocular camera must first recognize the target and then estimate its distance from the pixel size of the target in the image. The camera scheme is low in cost, can distinguish different objects, and has advantages in measuring object height and width, recognizing lane lines, and identifying pedestrians; it is an indispensable sensor for functions such as lane departure warning and traffic sign recognition. However, its working distance and ranging accuracy are inferior to those of the millimeter wave radar, and it is easily affected by illumination, weather, and similar factors. The millimeter wave radar is less affected by illumination and weather and has high ranging accuracy, but it has difficulty recognizing elements such as lane lines and traffic signs. In addition, by exploiting the Doppler shift, the millimeter wave radar can achieve high-accuracy target speed detection.
It is worth mentioning that the radar device has a confidence level, obtained from its physical characteristics, and the camera device likewise has a confidence level, related to parameters such as the obstacle type, the camera frame rate, and the camera acquisition frequency. The confidence of the fusion scheme that obtains obstacle data by fusing radar data and video data is computed as follows: the confidence of the camera and the confidence of the radar are summed; if the sum is greater than or equal to 1, the confidence of the fusion scheme is 1; if the sum is less than 1, the confidence of the fusion scheme equals the sum. Experimental verification shows that this confidence is more objective, that the obstacle position information obtained after fusing the radar data and the video data better reflects the real movement of the obstacle, and that the respective measurement errors of the radar and the camera can be suppressed, making the obstacle speed detection more accurate.
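The confidence rule just described, sum the two sensor confidences and clamp at 1, can be sketched as:

```python
def fused_confidence(conf_camera: float, conf_radar: float) -> float:
    """Confidence of the fusion scheme: the camera and radar
    confidences are summed; sums of 1 or more are capped at 1."""
    total = conf_camera + conf_radar
    return 1.0 if total >= 1.0 else total
```

For instance, `fused_confidence(0.6, 0.5)` yields 1.0, while `fused_confidence(0.3, 0.4)` yields 0.7.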
In a scheme where the radar alone measures the obstacle speed, a first obstacle position feature is extracted directly from the radar data and a first obstacle speed is calculated from it; likewise, where the camera alone measures the obstacle speed, a second obstacle position feature is extracted directly from the video data and a second obstacle speed is calculated from it. Simply averaging the first and second obstacle speeds would discard some of the information contained in the two position features, so the radar data and the video data would not be fully exploited. By fusing the radar data and the video data directly, the obstacle position information obtained after fusion retains both data sources in full, the important information in each can be fully exploited, and the real movement of the obstacle is reflected, making the obstacle speed detection more accurate.
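The patent does not fix the exact fusion formula. One standard way to fuse two position measurements according to covariance-style weight features is inverse-covariance weighting, where the less uncertain sensor receives the larger weight; the sketch below is that general technique, not the claimed method itself:

```python
import numpy as np

def fuse_positions(z_radar, R_radar, z_cam, R_cam):
    """Inverse-covariance fusion of a radar and a camera position
    measurement. z_* are 2-vectors (x, y); R_* are 2x2 measurement
    covariances acting as the sensors' weight features."""
    W_r = np.linalg.inv(R_radar)           # information (weight) of radar
    W_c = np.linalg.inv(R_cam)             # information (weight) of camera
    P = np.linalg.inv(W_r + W_c)           # fused covariance
    z = P @ (W_r @ z_radar + W_c @ z_cam)  # fused position
    return z, P
```

Unlike averaging two separately computed speeds, this keeps each sensor's measurement and its uncertainty in play, which is the spirit of fusing the raw data.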
According to the obstacle speed detection method, the radar data and the video data are matched, fused based on their respective weight features, and the obstacle position information is determined from the fusion result, so that the obstacle speed within the target time threshold is obtained; the confidence of the obstacle speed detection is thus more objective, and its accuracy can be improved.
As shown in fig. 2, in some embodiments, step 120, matching the radar data and the video data, includes the following steps 121-122.
Wherein, step 121: and carrying out spatial calibration processing on the radar data and the video data.
It is understood that the spatial calibration process adjusts the radar data and the video data to a unified reference frame; the coordinate system of the radar data and that of the video data may be set to coincide.
It will be appreciated that accurate coordinate transformations may be established between the radar coordinate system, the three-dimensional world coordinate system, the camera coordinate system, the image coordinate system, and the pixel coordinate system; setting the coordinate systems of the radar data and the video data to coincide translates measurements from different coordinate systems into the same one. Since the video data collected by the camera is primarily visual, a measurement point in the radar coordinate system can be converted, through this chain of coordinate systems, into the camera's pixel coordinate system; from these transformations the relation between the radar coordinate system and the camera pixel coordinate system is obtained, so that the coordinate system of the radar data and that of the video data can be set to coincide.
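As a sketch of the last step of that chain, a radar point can be projected into the camera pixel frame using an extrinsic transform and the camera intrinsics. All matrix values below are illustrative placeholders, not calibration results from the patent:

```python
import numpy as np

def radar_to_pixel(p_radar, T_radar_to_cam, K):
    """Project a 3-D radar measurement point into pixel coordinates.
    T_radar_to_cam: 4x4 homogeneous extrinsic transform obtained from
    spatial calibration; K: 3x3 camera intrinsic matrix."""
    p_h = np.append(p_radar, 1.0)          # homogeneous radar point
    p_cam = (T_radar_to_cam @ p_h)[:3]     # into the camera frame
    uvw = K @ p_cam                        # onto the image plane
    return uvw[:2] / uvw[2]                # pixel (u, v)

# Placeholder calibration: identity extrinsics, f = 500 px, center (320, 240)
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
uv = radar_to_pixel(np.array([0.0, 0.0, 10.0]), np.eye(4), K)
```

A point 10 m straight ahead projects to the image center under these placeholder values.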
Step 122: and performing time synchronization processing on the radar data and the video data.
It is to be understood that the time synchronization process aligns the time sequence of the radar data with that of the video data, which may be done by aligning the respective time points of the radar data with those of the video data.
It will be appreciated that the radar data and the video data may be time-synchronized by having the camera and the radar acquire data synchronously. For example, the millimeter wave radar has a sampling period of 50 ms, i.e. a sampling rate of 20 frames per second, while the camera samples at 25 frames per second. To ensure data reliability, the camera's sampling rate is taken as the reference: each time the camera acquires a frame of image, the most recently cached frame of millimeter wave radar data is selected, completing the time alignment of one jointly sampled frame of radar and video data and thereby ensuring the time synchronization of the radar data and the video data.
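Taking the camera rate as the reference, each image frame can be paired with the most recently cached radar frame; a minimal sketch (timestamps in seconds, names illustrative):

```python
import bisect

def latest_radar_frame(radar_timestamps, t_camera):
    """Index of the most recent cached radar frame at camera time
    t_camera; radar_timestamps must be sorted ascending. Returns
    None if no radar frame has arrived yet."""
    i = bisect.bisect_right(radar_timestamps, t_camera)
    return i - 1 if i > 0 else None

# Radar at 20 Hz (50 ms period), camera at 25 Hz (40 ms period):
radar_ts = [k * 0.05 for k in range(5)]    # 0.00 .. 0.20 s
camera_ts = [k * 0.04 for k in range(6)]   # 0.00 .. 0.20 s
pairs = [(t, latest_radar_frame(radar_ts, t)) for t in camera_ts]
```

Each camera frame thus reuses the last radar frame at most once per radar period, matching the "select the cached radar frame" scheme in the text.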
In some embodiments, the weight characteristics of the radar data are obtained based on a first detection speed obtained by the radar through detection of the obstacle sample and a speed tag corresponding to the obstacle sample; the weight characteristic of the video data is obtained based on a second detection speed obtained by the camera for detecting the obstacle sample and a speed label corresponding to the obstacle sample.
It can be understood that the target sample and the speed label corresponding to the target sample can be obtained; obtaining a first detection speed based on the detection of the radar on the target sample, and obtaining a second detection speed based on the detection of the camera on the target sample; and obtaining the weight characteristics of the radar data based on the first detection speed and the speed label, and obtaining the weight characteristics of the video data based on the second detection speed and the speed label.
In other words, during vehicle commissioning, the radar and the camera are tested against a plurality of simulated target samples, each with a corresponding speed tag, which may be the true speed value of the target sample.
The radar detects a target sample to obtain a first detection speed, which is compared with the speed tag to obtain the weight feature of the radar, representing the radar's recognition accuracy; meanwhile, the camera detects the target sample to obtain a second detection speed, which is compared with the speed tag to obtain the weight feature of the camera, representing the camera's recognition accuracy.
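One way to realise such a weight feature, consistent with the covariance interpretation given below in the text, is to compute the covariance of a sensor's calibration-run errors against the speed tags; this is a sketch under that assumption, with illustrative names:

```python
import numpy as np

def weight_feature(detected_speeds, speed_labels):
    """Weight feature of one sensor: the variance (1-D covariance) of
    its detection errors against the ground-truth speed labels
    gathered during commissioning. Smaller means more reliable."""
    errors = np.asarray(detected_speeds) - np.asarray(speed_labels)
    return float(np.cov(errors))

# Hypothetical commissioning run: radar readings vs a 10 m/s ground truth
radar_wf = weight_feature([10.1, 9.9, 10.2, 9.8], [10.0, 10.0, 10.0, 10.0])
```

The same computation applied to the camera's readings yields the camera's weight feature, and the two can then weight the fusion.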
In some embodiments, the weight characteristic of the radar data is a first covariance matrix corresponding to the first detected velocity and the velocity tag; the weight characteristic of the video data is a second covariance matrix corresponding to a second detection speed and a speed label.
It is understood that covariance (Covariance) is used in probability theory and statistics to measure the joint variability of two variables. Variance is the special case of covariance in which the two variables are identical.
Whereas variance describes the dispersion of a single variable, covariance describes how two variables vary together. If the two variables tend in the same direction, that is, when one exceeds its expected value the other also exceeds its expected value, the covariance between them is positive. If they tend in opposite directions, one above its expected value while the other is below its own, the covariance between them is negative.
It is to be understood that the covariance matrix of X and Y is an m × n matrix, where X comprises variables X1, X2, ..., Xm and Y comprises variables Y1, Y2, ..., Yn. If X1 has expected value μ1 and Y2 has expected value μ2, then the element at position (1, 2) of the covariance matrix is the covariance of X1 and Y2. The covariance matrix thus clearly shows the correlation between the variables.
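A two-variable illustration with NumPy:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])   # rises with x -> positive covariance
C = np.cov(x, y)                     # 2x2 covariance matrix
# C[0, 0] = Var(x), C[1, 1] = Var(y), C[0, 1] = C[1, 0] = Cov(x, y)
```

Here `np.cov` uses the unbiased (n - 1) normalization, so `C[0, 0]` is 5/3 and `C[0, 1]` is 10/3, the positive off-diagonal reflecting that y grows with x.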
As shown in fig. 3, in some embodiments, step 140, determining the obstacle speed based on the obstacle position information and the preset time threshold, includes the following steps 141-142.
Step 141: and acquiring the obstacle displacement characteristics from the obstacle azimuth information based on a preset time threshold.
It is understood that the data corresponding to a preset time threshold, i.e. a preselected time period, is extracted from the obstacle position information, and an obstacle displacement feature is selected from that data. The obstacle displacement feature characterizes the displacement of the obstacle over the preset time threshold and may include the position change of the obstacle and its moving direction.
Step 142: and obtaining the obstacle speed based on the obstacle displacement feature and the preset time threshold.
It is understood that the displacement characterized by the obstacle displacement feature over the preset time threshold may be divided by the preset time threshold to obtain the obstacle speed.
The obstacle speed detection device provided by the present application is described below, and the obstacle speed detection device described below and the obstacle speed detection method described above may be referred to in correspondence with each other.
As shown in fig. 4, an embodiment of the present application further provides an obstacle speed detection apparatus, including: an acquisition module 410, a matching module 420, a fusion module 430, and a determination module 440.
The obtaining module 410 is configured to obtain radar data collected by a radar and video data collected by a camera.
The matching module 420 is configured to perform matching processing on the radar data and the video data.
The fusion module 430 is configured to fuse the radar data and the video data subjected to matching processing based on the weight features of the radar data and the weight features of the video data, and determine the position information of the obstacle.
The determining module 440 is configured to determine an obstacle speed based on a target time threshold and the obstacle location information corresponding to the target time threshold.
In some embodiments, the weight characteristics of the radar data are obtained based on a first detection speed obtained by the radar through detection of the obstacle sample and a speed tag corresponding to the obstacle sample; the weight characteristic of the video data is obtained based on a second detection speed obtained by the camera for detecting the obstacle sample and a speed label corresponding to the obstacle sample.
In some embodiments, the weight characteristic of the radar data is a first covariance matrix corresponding to the first detected velocity and the velocity tag; the weight characteristic of the video data is a second covariance matrix corresponding to a second detection speed and a speed label.
As shown in fig. 5, in some embodiments, the matching module 420 includes: a first matching submodule 421 and a second matching submodule 422.
The first matching sub-module 421 is configured to perform spatial calibration processing on the radar data and the video data.
The second matching sub-module 422 is configured to perform time synchronization processing on the radar data and the video data.
In some embodiments, the first matching submodule 421 is further configured to set the coordinate system of the radar data and the coordinate system of the video data to coincide.
In some embodiments, the second matching sub-module 422 is also used to align respective points in time of the radar data with respective points in time of the video data.
As shown in fig. 6, in some embodiments, the determining module 440 includes: a first determination submodule 441 and a second determination submodule 442.
The first determining submodule 441 is configured to obtain an obstacle displacement feature from the obstacle azimuth information based on a preset time threshold;
the second determining submodule 442 is configured to obtain an obstacle speed based on the obstacle displacement characteristic and a preset time threshold.
The obstacle speed detection device provided in the embodiment of the present application is used to execute the obstacle speed detection method, and a specific implementation manner thereof is consistent with the implementation manner described in the embodiment of the method, and can achieve the same beneficial effects, and details are not repeated here.
Fig. 7 illustrates a physical structure diagram of an electronic device, and as shown in fig. 7, the electronic device may include: a processor (processor)710, a communication Interface (Communications Interface)720, a memory (memory)730, and a communication bus 740, wherein the processor 710, the communication Interface 720, and the memory 730 communicate with each other via the communication bus 740. Processor 710 may invoke logic instructions in memory 730 to perform an obstacle speed detection method comprising: acquiring radar data acquired by a radar and video data acquired by a camera; matching the radar data and the video data; fusing the radar data and the video data which are subjected to matching processing based on the weight characteristics of the radar data and the weight characteristics of the video data, and determining the position information of the obstacle; and determining the speed of the obstacle based on the target time threshold and the obstacle azimuth information corresponding to the target time threshold.
The processor 710 may further execute the obstacle speed detection method, wherein determining the obstacle speed based on the obstacle position information and a preset time threshold includes: acquiring an obstacle displacement feature from the obstacle position information based on the preset time threshold; and obtaining the obstacle speed based on the obstacle displacement feature and the preset time threshold.
The processor 710 may further execute to perform an obstacle speed detection method, the matching the radar data and the video data, comprising: carrying out spatial calibration processing on radar data and the video data; and carrying out time synchronization processing on the radar data and the video data.
In addition, the logic instructions in the memory 730 can be implemented in the form of software functional units and stored in a computer readable storage medium when the software functional units are sold or used as independent products. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The processor 710 in the electronic device according to the embodiment of the present application may call a logic instruction in the memory 730 to implement the method for detecting the speed of the obstacle, and the specific implementation manner of the method is consistent with that of the method, and the same beneficial effects may be achieved, which is not described herein again.
In another aspect, the present application also provides a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the obstacle speed detection method provided by the above methods, the obstacle speed detection method comprising: acquiring radar data acquired by a radar and video data acquired by a camera; matching the radar data and the video data; fusing the radar data and the video data which are subjected to matching processing based on the weight characteristics of the radar data and the weight characteristics of the video data, and determining the position information of the obstacle; and determining the speed of the obstacle based on the target time threshold and the obstacle azimuth information corresponding to the target time threshold.
Meanwhile, the present application also provides a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the obstacle speed detection method provided by the above methods, wherein determining the obstacle speed based on the obstacle position information and a preset time threshold includes: acquiring an obstacle displacement feature from the obstacle position information based on the preset time threshold; and obtaining the obstacle speed based on the obstacle displacement feature and the preset time threshold.
Meanwhile, the present application also provides a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions that, when executed by a computer, enable the computer to execute the obstacle speed detection method provided by the above methods, the matching process of the radar data and the video data comprising: carrying out spatial calibration processing on radar data and the video data; and carrying out time synchronization processing on the radar data and the video data.
When the computer program product provided in the embodiment of the present application is executed, the method for detecting a speed of an obstacle described above is implemented, and the specific implementation manner is consistent with the method implementation manner, and the same beneficial effects can be achieved, which is not described herein again.
In yet another aspect, the present application also provides a non-transitory computer readable storage medium having stored thereon a computer program that when executed by a processor is implemented to perform the obstacle speed detection methods provided above, the obstacle speed detection method comprising: acquiring radar data acquired by a radar and video data acquired by a camera; matching the radar data and the video data; fusing the radar data and the video data which are subjected to matching processing based on the weight characteristics of the radar data and the weight characteristics of the video data, and determining the position information of the obstacle; and determining the speed of the obstacle based on the target time threshold and the obstacle azimuth information corresponding to the target time threshold.
Meanwhile, the present application also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the obstacle speed detection methods provided above, wherein determining the obstacle speed based on the obstacle position information and a preset time threshold includes: acquiring an obstacle displacement feature from the obstacle position information based on the preset time threshold; and obtaining the obstacle speed based on the obstacle displacement feature and the preset time threshold.
Meanwhile, the present application also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, is implemented to perform the obstacle speed detection methods provided above, the matching processing of the radar data and the video data including: carrying out spatial calibration processing on radar data and the video data; and carrying out time synchronization processing on the radar data and the video data.
When the computer program stored on the non-transitory computer-readable storage medium provided in the embodiment of the present application is executed, the method for detecting the speed of the obstacle is implemented, and the specific implementation manner is consistent with the method implementation manner and can achieve the same beneficial effects, which is not described herein again.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (12)

1. An obstacle speed detection method, comprising:
acquiring radar data acquired by a radar and video data acquired by a camera;
matching the radar data and the video data;
fusing the radar data and the video data which are subjected to matching processing based on the weight characteristics of the radar data and the weight characteristics of the video data, and determining the azimuth information of the obstacle;
determining an obstacle speed based on a target time threshold and the obstacle location information corresponding to the target time threshold.
2. The method according to claim 1, wherein the weight characteristic of the radar data is obtained based on a first detection speed obtained by detecting an obstacle sample by the radar and a speed tag corresponding to the obstacle sample;
the weight characteristic of the video data is obtained based on a second detection speed obtained by detecting the obstacle sample by the camera and a speed label corresponding to the obstacle sample.
3. The obstacle speed detection method according to claim 2,
the weight characteristics of the radar data are the first detection speed and a first covariance matrix corresponding to the speed label;
and the weight characteristic of the video data is a second covariance matrix corresponding to the second detection speed and the speed label.
4. The obstacle speed detection method according to claim 1, wherein the determining an obstacle speed based on the obstacle location information and a preset time threshold comprises:
acquiring an obstacle displacement feature from the obstacle position information based on a preset time threshold;
and obtaining the obstacle speed based on the obstacle displacement feature and the preset time threshold.
5. The obstacle speed detection method according to any one of claims 1 to 4, wherein the matching the radar data and the video data includes:
carrying out space calibration processing on the radar data and the video data;
and carrying out time synchronization processing on the radar data and the video data.
6. The method of obstacle speed detection according to claim 5, wherein said spatially scaling said radar data and said video data comprises:
setting the coordinate system of the radar data and the coordinate system of the video data to be coincident.
7. The obstacle speed detection method according to claim 5, wherein the time-synchronizing the radar data and the video data includes:
aligning respective points in time of the radar data with respective points in time of the video data.
8. An obstacle speed detection apparatus, comprising:
the acquisition module is used for acquiring radar data acquired by a radar and video data acquired by a camera;
the matching module is used for matching the radar data and the video data;
the fusion module is used for fusing the radar data and the video data which are subjected to matching processing based on the weight characteristics of the radar data and the weight characteristics of the video data to determine the position information of the obstacle;
a determination module for determining an obstacle speed based on a target time threshold and the obstacle position information corresponding to the target time threshold.
9. The obstacle speed detection apparatus according to claim 8, wherein the determination module includes:
the first determining submodule is used for acquiring an obstacle displacement feature from the obstacle position information based on a preset time threshold;
and the second determining submodule is used for obtaining the obstacle speed based on the obstacle displacement feature and the preset time threshold.
10. The obstacle speed detection apparatus according to claim 8, wherein the matching module includes:
the first matching sub-module is used for carrying out space calibration processing on the radar data and the video data;
and the second matching submodule is used for carrying out time synchronization processing on the radar data and the video data.
11. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, carries out the steps of the obstacle speed detection method according to any of claims 1 to 7.
12. A non-transitory computer readable storage medium, having stored thereon a computer program, wherein the computer program, when being executed by a processor, is adapted to carry out the steps of the obstacle speed detection method according to any one of the claims 1 to 7.
CN202011476736.8A 2020-12-14 2020-12-14 Obstacle speed detection method and device Pending CN112633101A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011476736.8A CN112633101A (en) 2020-12-14 2020-12-14 Obstacle speed detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011476736.8A CN112633101A (en) 2020-12-14 2020-12-14 Obstacle speed detection method and device

Publications (1)

Publication Number Publication Date
CN112633101A true CN112633101A (en) 2021-04-09

Family

ID=75313047

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011476736.8A Pending CN112633101A (en) 2020-12-14 2020-12-14 Obstacle speed detection method and device

Country Status (1)

Country Link
CN (1) CN112633101A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113125795A (en) * 2021-04-20 2021-07-16 广州文远知行科技有限公司 Obstacle speed detection method, device, equipment and storage medium
CN115144843A (en) * 2022-06-28 2022-10-04 海信集团控股股份有限公司 Fusion method and device for object positions
CN117079416A (en) * 2023-10-16 2023-11-17 德心智能科技(常州)有限公司 Multi-person 5D radar falling detection method and system based on artificial intelligence algorithm

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110018496A (en) * 2018-01-10 2019-07-16 北京京东尚科信息技术有限公司 Obstacle recognition method and device, electronic equipment, storage medium
CN110850413A (en) * 2019-11-26 2020-02-28 奇瑞汽车股份有限公司 Method and system for detecting front obstacle of automobile
CN111191600A (en) * 2019-12-30 2020-05-22 深圳元戎启行科技有限公司 Obstacle detection method, obstacle detection device, computer device, and storage medium
CN111578894A (en) * 2020-06-02 2020-08-25 北京经纬恒润科技有限公司 Method and device for determining heading angle of obstacle


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Li Yang: "A Survey of Obstacle Detection Technology for Intelligent Vehicles", Popular Science & Technology, 20 June 2019 (2019-06-20), pages 65-68 *
Wang He: "Application of Radar-Camera Data Fusion in Intelligent Driver Assistance", China Master's Theses Full-text Database, Engineering Science and Technology II, 15 November 2019 (2019-11-15), pages 34-35 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113125795A (en) * 2021-04-20 2021-07-16 广州文远知行科技有限公司 Obstacle speed detection method, device, equipment and storage medium
CN115144843A (en) * 2022-06-28 2022-10-04 海信集团控股股份有限公司 Fusion method and device for object positions
CN117079416A (en) * 2023-10-16 2023-11-17 德心智能科技(常州)有限公司 Multi-person 5D radar falling detection method and system based on artificial intelligence algorithm
CN117079416B (en) * 2023-10-16 2023-12-26 德心智能科技(常州)有限公司 Multi-person 5D radar falling detection method and system based on artificial intelligence algorithm

Similar Documents

Publication Publication Date Title
US11915470B2 (en) Target detection method based on fusion of vision, lidar, and millimeter wave radar
US11276189B2 (en) Radar-aided single image three-dimensional depth reconstruction
CN112572430A (en) Collision risk determination method and device
EP3792660B1 (en) Method, apparatus and system for measuring distance
EP4027167A1 (en) Sensor calibration method and apparatus
CN112784679A (en) Vehicle obstacle avoidance method and device
CN112633101A (en) Obstacle speed detection method and device
KR100521119B1 (en) Obstacle detecting apparatus for vehicle
CN109085570A (en) Automobile detecting following algorithm based on data fusion
US20030011509A1 (en) Method for detecting stationary object on road
CN109583416B (en) Pseudo lane line identification method and system
CN109747530A (en) A kind of dual camera and millimeter wave merge automobile sensory perceptual system
KR101180621B1 (en) Apparatus and method for detecting a vehicle
Cui et al. 3D detection and tracking for on-road vehicles with a monovision camera and dual low-cost 4D mmWave radars
CN113850102A (en) Vehicle-mounted vision detection method and system based on millimeter wave radar assistance
CN112906777A (en) Target detection method and device, electronic equipment and storage medium
CN110750153A (en) Dynamic virtualization device of unmanned vehicle
CN111123262A (en) Automatic driving 3D modeling method, device and system
CN113988197A (en) Multi-camera and multi-laser radar based combined calibration and target fusion detection method
CN113884090A (en) Intelligent platform vehicle environment sensing system and data fusion method thereof
CN117173666A (en) Automatic driving target identification method and system for unstructured road
KR101704635B1 (en) Method and apparatus for detecting target using radar and image raster data
CN113895482B (en) Train speed measuring method and device based on trackside equipment
Godfrey et al. Evaluation of Flash LiDAR in Adverse Weather Conditions towards Active Road Vehicle Safety
CN112784678A (en) Danger prompting method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination