CN115019512A - Road event detection system based on radar video fusion - Google Patents

Road event detection system based on radar video fusion

Info

Publication number: CN115019512A
Application number: CN202210784009.0A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: radar, data, coordinate system, traffic, time
Inventors: 沈炜, 裴植嵩, 马乙恒
Current Assignee: Beijing Dongshiyuan Technology Co ltd
Original Assignee: Beijing Dongshiyuan Technology Co ltd
Priority date: 2022-07-05
Filing date: 2022-07-05
Publication date: 2022-09-06
Legal status: Pending
Application filed by Beijing Dongshiyuan Technology Co ltd

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0125 Traffic data processing


Abstract

The invention relates to the technical field of road event detection and discloses a road event detection system based on radar-video fusion, comprising roadside sensing equipment, a computing unit and a cloud server. The method of the detection system comprises the following steps: S1: road surface information is acquired by the roadside sensing equipment, and the acquired data are then transmitted to the computing unit; S2: the computing unit performs timestamp unification and calibration of unknown coordinates on the information extracted by the roadside sensing equipment, and performs real-time track acquisition and prediction through a deep learning algorithm. Through the detection system and its algorithm, multi-target tracking can be realized in a complex environment, which improves the accuracy of target tracking and the efficiency with which the subsequent intelligent traffic system analyzes, judges and handles the traffic state.

Description

Road event detection system based on radar video fusion
Technical Field
The invention relates to the technical field of road event detection, in particular to a road event detection system based on radar video fusion.
Background
At present, with the rapid growth of motor vehicle ownership in China, urban road traffic flow is rising quickly, and problems such as traffic congestion are becoming more serious by the day. Intersections are the main nodes of urban traffic; constrained by signal control, their traffic capacity is lower than that of ordinary road sections. During morning and evening peaks, road intersections are frequent congestion points, and these areas are also the main focus of urban traffic control. In particular, when events such as wrong-way driving or traffic accidents occur at an intersection, whether they can be detected and responded to in real time directly determines whether the traffic event can be handled properly and in time, and whether secondary events or the expansion and spread of traffic congestion can be avoided. Thus, the timeliness and effectiveness of traffic event detection are particularly important at road intersections.
The existing urban traffic supervision infrastructure mainly comprises electronic police cameras, checkpoint (bayonet) cameras, millimeter-wave radars, induction coils and the like, and generally detects traffic events such as spilled objects and vehicle anomalies by recognizing the road state from video. However, the image features of traffic events such as accidents and spilled objects occurring in the intersection area are varied, and the driving behavior of vehicles in the intersection area is also complex; if traffic events are detected by purely visual deep learning, the recognition rate is low, the false detection rate is high, and practicality suffers. With the construction of intelligent networked cities, the construction of holographic intersections based on multi-sensor fusion of radar, cameras and the like is steadily advancing. On this basis, refined real-time perception data of all traffic participants at the intersection can be obtained, so that their trajectories can be depicted precisely. Intersection event recognition based on roadside edge holographic sensing can effectively avoid the problems of pure video detection, greatly improve intersection event response speed, link traffic management facilities such as signal lights in time, and avoid large-scale urban traffic congestion caused by intersection traffic events.
Traffic events at an intersection, such as abnormal vehicles, spilled objects and traffic accidents, often cause changes in the trajectories of other vehicles; the invention therefore determines whether a traffic event has occurred by detecting and comparing vehicle trajectory changes in real time.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects of the prior art, the invention provides a road event detection system based on radar video fusion.
(II) technical scheme
In order to achieve the above purpose, the invention provides the following technical scheme: a road event detection system based on radar video fusion comprises roadside sensing equipment, a computing unit and a cloud server, and the implementation method of the detection system comprises the following steps:
S1: road surface information is acquired by the roadside sensing equipment, and the acquired data are then transmitted to the computing unit;
S2: the computing unit performs timestamp unification and calibration of unknown coordinates on the information extracted by the roadside sensing equipment, and performs real-time track acquisition and prediction through a deep learning algorithm;
S3: determining the single-sensor target tracking problem: according to the traffic application scene, an overpass test scene is selected for radar and video data acquisition and analysis, and the resulting target tracking results serve as input and comparison data for the subsequent decision-level fusion algorithm;
S4: performing space-time matching: a relatively open field with a high mounting position is selected and the number of targets is controlled; single-target test scene data are used to debug the coordinate system matching and obtain accurate conversion parameters between the two sensors, so that the matched data can be displayed in the same dimension and the accuracy of subsequent data fusion is guaranteed;
S5: constructing a radar and video fusion algorithm based on fuzzy set theory: target classification and fuzzy evaluation problem classification are carried out respectively, a target judgment criterion is determined, and the single-sensor tracking problem is solved or improved according to the judgment criterion;
S6: on the basis of acquiring motor vehicle target tracks at the intersection, the computing unit identifies road traffic events based on the characteristics of vehicle running tracks and reports the occurrence range of the traffic event to the cloud server;
S7: the cloud server receives the traffic event data reported by the computing unit, feeds back the occurrence position and live real-time video of the traffic event on a visual map, and performs early warning and secondary confirmation of the traffic event.
Preferably, the sensing equipment comprises a plurality of sensor devices deployed on the road side for acquiring intersection sensing data in real time.
Preferably, the sensor devices comprise a millimeter-wave radar and a camera as traffic information acquisition sensors; the millimeter-wave radar acquires real-time point cloud data, and the camera acquires video image data.
Preferably, the space-time matching comprises time matching and spatial matching. The time matching comprises: during data testing, the radar acquisition rate is 20 frames/second and the video acquisition rate is 60 frames/second. For time matching, the starting points are aligned first: the Gaussian mixture background model needs the first 30 video frames for training, so the radar data start time is 0.5 second later than the video data start time. After the time starting points are aligned, data frames are matched; since the camera acquisition rate is an integral multiple of the radar acquisition rate, the video data are sampled every 2 frames (taking every third frame) at the radar data timestamps, achieving temporal matching.
Preferably, the spatial matching comprises: converting data of two different dimensions into the same coordinate system. When traffic target information is collected, the equipment is generally mounted at a higher position with a certain inclination angle, monitoring downward, and the spatial coordinate system conversion method is designed according to the viewing angle of the set scene. The spatial coordinate system conversion involves conversions among five coordinate systems, and finally converts data in the radar coordinate system into the pixel coordinate system.
Preferably, the computing unit is arranged in a circuit box at the road side; it performs space-time synchronization, target extraction of road traffic participants, and target fusion and tracking on the original data obtained by the roadside sensing equipment, provides real-time trajectory data for traffic event analysis, and transmits the data to the cloud server through the network.
Preferably, because the scenes and equipment for acquiring data differ, the angular conversion relation from the radar coordinate system to the world coordinate system also differs; the two coordinate systems are therefore converted according to the set radar coordinate system and the height and inclination angle of the installed equipment;
wherein, if the radar detects a target at radial distance s and angle γ, the measurement is converted into the two-dimensional radar coordinate system:
x_r = s·sin(γ);
y_r = s·cos(γ);
if the installation height of the radar is h, the radar beam is emitted obliquely downward, and the inclination angle of the equipment is α, with θ determined by this mounting geometry: assume that the origin of the world coordinate system coincides with that of the radar coordinate system, the x_w axis coincides with the x_r axis of the radar coordinate system, the z_w axis of the world coordinate system lies along the y_r axis of the radar coordinate system, and the y_w axis of the world coordinate system is perpendicular to the radar emission plane and points downward; by the geometric angle relationship, the conversion from the radar coordinate system to the world coordinate system is:
x_w = x_r;
y_w = y_r·sin(θ);
z_w = y_r·cos(θ).
preferably, when the test equipment is installed, the radar development board and the camera are placed side by side, the offset is small, and the conversion between the world coordinate system and the camera coordinate system is mainly realized by simply debugging to obtain the external parameters of the camera for conversion; and secondly, converting the camera coordinate system to a pixel coordinate system through calibrating the obtained camera internal parameters, wherein in an actual single-target test scene, the height h of the sensor from the ground is 4.6m, and the inclination angle alpha is 30.
(III) advantageous effects
Compared with the prior art, the invention provides a road event detection system based on radar video fusion, which has the following beneficial effects:
1. Through the detection system and its algorithm, this road event detection system based on radar video fusion can perform multi-target tracking in a complex environment, which improves the accuracy of target tracking and the efficiency with which the subsequent intelligent traffic system analyzes, judges and handles the traffic state.
2. This road event detection system based on radar video fusion effectively addresses the problems of pure video detection, namely that algorithm training data cannot cover every case and that the differing image features of different traffic events and of spilled objects of various shapes are hard to recognize, which otherwise leads to a low detection rate and a high false detection rate.
3. This road event detection system based on radar video fusion is not affected by weather or illumination when collecting data, has a strong capacity for acquiring depth information such as target speed, and maintains relatively high processing speed and efficiency despite the large volume of data computed.
4. This road event detection system based on radar video fusion solves the problem that a single sensor's data are too limited to support the construction of an intelligent transportation system, and that its structural and monitoring-range limits always lead to false detections and missed or lost tracking.
Drawings
FIG. 1 is a flow chart of the system of the present invention;
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
Examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
In the description of the present invention, it is to be understood that the terms "central," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," "axial," "radial," "circumferential," and the like are used in the orientations and positional relationships indicated in the drawings for convenience in describing the invention and to simplify the description, and are not intended to indicate or imply that the referenced device or element must have a particular orientation, be constructed and operated in a particular orientation, and are not to be considered limiting of the invention.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
As shown in fig. 1, the invention provides a road event detection system based on radar video fusion, which comprises roadside sensing equipment, a computing unit and a cloud server; the implementation method of the detection system comprises the following steps:
S1: road surface information is acquired by the roadside sensing equipment, and the acquired data are then transmitted to the computing unit;
S2: the computing unit performs timestamp unification and calibration of unknown coordinates on the information extracted by the roadside sensing equipment, and performs real-time track acquisition and prediction through a deep learning algorithm;
S3: determining the single-sensor target tracking problem: according to the traffic application scene, an overpass test scene is selected for radar and video data acquisition and analysis, and the resulting target tracking results serve as input and comparison data for the subsequent decision-level fusion algorithm;
S4: performing space-time matching: a relatively open field with a high mounting position is selected and the number of targets is controlled; single-target test scene data are used to debug the coordinate system matching and obtain accurate conversion parameters between the two sensors, so that the matched data can be displayed in the same dimension and the accuracy of subsequent data fusion is guaranteed;
S5: constructing a radar and video fusion algorithm based on fuzzy set theory: target classification and fuzzy evaluation problem classification are carried out respectively, a target judgment criterion is determined, and the single-sensor tracking problem is solved or improved according to the judgment criterion (a minimal sketch of such a criterion follows these steps);
S6: on the basis of acquiring motor vehicle target tracks at the intersection, the computing unit identifies road traffic events based on the characteristics of vehicle running tracks and reports the occurrence range of the traffic event to the cloud server;
S7: the cloud server receives the traffic event data reported by the computing unit, feeds back the occurrence position and live real-time video of the traffic event on a visual map, and performs early warning and secondary confirmation of the traffic event.
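To make step S5 concrete, the following minimal Python sketch illustrates one way a fuzzy-evaluation judgment criterion could combine radar and video evidence for a candidate target; the membership function, weights and threshold are illustrative assumptions and are not values specified by the patent.

# Hypothetical sketch of a fuzzy-evaluation judgment criterion for
# radar/video decision-level fusion. The membership function, weights
# and the 0.5 threshold are illustrative assumptions, not from the patent.
from dataclasses import dataclass

@dataclass
class Candidate:
    radar_conf: float    # detection confidence from the radar tracker, 0..1
    video_conf: float    # detection confidence from the video tracker, 0..1
    position_gap: float  # distance (m) between radar and video position estimates

def membership_same_target(gap_m: float, scale_m: float = 2.0) -> float:
    # Fuzzy membership grade for "both sensors see the same target":
    # 1.0 when the estimates coincide, falling linearly to 0 at scale_m.
    return max(0.0, 1.0 - gap_m / scale_m)

def fused_grade(c: Candidate, w_radar: float = 0.4, w_video: float = 0.4,
                w_match: float = 0.2) -> float:
    # Weighted fuzzy evaluation over the three membership grades.
    return (w_radar * c.radar_conf
            + w_video * c.video_conf
            + w_match * membership_same_target(c.position_gap))

def is_confirmed_target(c: Candidate, threshold: float = 0.5) -> bool:
    # Judgment criterion: keep the track if the fused grade passes the threshold.
    return fused_grade(c) >= threshold

# A target the radar sees strongly but the video sees weakly (e.g. at night)
# is still confirmed, which is how fusion mitigates single-sensor tracking loss.
print(is_confirmed_target(Candidate(0.9, 0.3, 0.8)))  # True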
The sensing equipment comprises a plurality of sensor devices deployed on the road side for acquiring intersection sensing data in real time.
The sensor devices comprise a millimeter-wave radar and a camera as traffic information acquisition sensors; the millimeter-wave radar acquires real-time point cloud data, and the camera acquires video image data.
The space-time matching comprises time matching and spatial matching. The time matching comprises: during data testing, the radar acquisition rate is 20 frames/second and the video acquisition rate is 60 frames/second. For time matching, the starting points are aligned first: the Gaussian mixture background model needs the first 30 video frames for training, so the radar data start time is 0.5 second later than the video data start time. After the time starting points are aligned, data frames are matched; since the camera acquisition rate is an integral multiple of the radar acquisition rate, the video data are sampled every 2 frames (taking every third frame) at the radar data timestamps, achieving temporal matching.
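The temporal matching just described can be expressed as a short sketch, assuming a 60 frames/second video stream, a 20 frames/second radar stream and the 0.5-second (30-frame) training offset given above; the list-based frame representation is an illustrative assumption.

# Minimal sketch of the temporal matching described above. The video runs
# at 60 fps and the radar at 20 fps; the radar stream starts 0.5 s
# (30 video frames) later because the Gaussian mixture background model
# is trained on the first 30 video frames.
VIDEO_FPS = 60
RADAR_FPS = 20
STEP = VIDEO_FPS // RADAR_FPS   # 3: skip 2 video frames, keep every third
TRAIN_FRAMES = 30               # 0.5 s of video used for background training

def match_frames(video_frames: list, radar_frames: list) -> list:
    # Pair each radar frame with the video frame sharing its timestamp.
    usable_video = video_frames[TRAIN_FRAMES:]  # align the time starting points
    pairs = []
    for i, radar in enumerate(radar_frames):
        v_idx = i * STEP                        # sample video every 2 frames
        if v_idx >= len(usable_video):
            break
        pairs.append((usable_video[v_idx], radar))
    return pairs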
The spatial matching comprises: converting data of two different dimensions into the same coordinate system. When traffic target information is collected, the equipment is generally mounted at a higher position with a certain inclination angle, monitoring downward, and the spatial coordinate system conversion method is designed according to the viewing angle of the set scene. The spatial coordinate system conversion involves conversions among five coordinate systems, and finally converts data in the radar coordinate system into the pixel coordinate system.
The computing unit is arranged in a circuit box at the road side; it performs space-time synchronization, target extraction of road traffic participants, and target fusion and tracking on the original data obtained by the roadside sensing equipment, provides real-time trajectory data for traffic event analysis, and transmits the data to the cloud server through the network.
Because the scenes and equipment for acquiring data differ, the angular conversion relation from the radar coordinate system to the world coordinate system also differs; the two coordinate systems are therefore converted according to the set radar coordinate system and the height and inclination angle of the installed equipment;
wherein, if the radar detects a target at radial distance s and angle γ, the measurement is converted into the two-dimensional radar coordinate system:
x_r = s·sin(γ);
y_r = s·cos(γ);
if the installation height of the radar is h, the radar beam is emitted obliquely downward, and the inclination angle of the equipment is α, with θ determined by this mounting geometry: assume that the origin of the world coordinate system coincides with that of the radar coordinate system, the x_w axis coincides with the x_r axis of the radar coordinate system, the z_w axis of the world coordinate system lies along the y_r axis of the radar coordinate system, and the y_w axis of the world coordinate system is perpendicular to the radar emission plane and points downward; by the geometric angle relationship, the conversion from the radar coordinate system to the world coordinate system is:
x_w = x_r;
y_w = y_r·sin(θ);
z_w = y_r·cos(θ).
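The two conversions above translate directly into code. The following minimal sketch takes θ as a given parameter, since the patent leaves its exact relation to the tilt angle α implicit; the function names are illustrative.

import math

def radar_polar_to_2d(s: float, gamma_rad: float) -> tuple:
    # Radial distance s and azimuth gamma -> 2-D radar coordinates (x_r, y_r).
    return s * math.sin(gamma_rad), s * math.cos(gamma_rad)

def radar_to_world(x_r: float, y_r: float, theta_rad: float) -> tuple:
    # 2-D radar coordinates -> world coordinates (x_w, y_w, z_w), with the
    # y_w axis perpendicular to the radar emission plane and pointing down.
    return x_r, y_r * math.sin(theta_rad), y_r * math.cos(theta_rad)

# Example: a target at 50 m, 10 degrees off boresight, with a 30-degree beam angle.
x_r, y_r = radar_polar_to_2d(50.0, math.radians(10.0))
print(radar_to_world(x_r, y_r, math.radians(30.0)))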
when the test equipment is installed, the radar development board and the camera are placed side by side, the offset is small, and the conversion between the world coordinate system and the camera coordinate system is mainly carried out by obtaining the external parameters of the camera through simple debugging; and secondly, converting the camera coordinate system to a pixel coordinate system through calibrating the obtained camera internal parameters, wherein in an actual single-target test scene, the height h of the sensor from the ground is 4.6m, and the inclination angle alpha is 30.
It should be noted that, in this document, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not preclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.

Claims (8)

1. A road event detection system based on radar video fusion, comprising roadside sensing equipment, a computing unit and a cloud server, characterized in that the implementation method of the detection system comprises the following steps:
S1: road surface information is acquired by the roadside sensing equipment, and the acquired data are then transmitted to the computing unit;
S2: the computing unit performs timestamp unification and calibration of unknown coordinates on the information extracted by the roadside sensing equipment, and performs real-time track acquisition and prediction through a deep learning algorithm;
S3: determining the single-sensor target tracking problem: according to the traffic application scene, an overpass test scene is selected for radar and video data acquisition and analysis, and the resulting target tracking results serve as input and comparison data for the subsequent decision-level fusion algorithm;
S4: performing space-time matching: a relatively open field with a high mounting position is selected and the number of targets is controlled; single-target test scene data are used to debug the coordinate system matching and obtain accurate conversion parameters between the two sensors, so that the matched data can be displayed in the same dimension and the accuracy of subsequent data fusion is guaranteed;
S5: constructing a radar and video fusion algorithm based on fuzzy set theory: target classification and fuzzy evaluation problem classification are carried out respectively, a target judgment criterion is determined, and the single-sensor tracking problem is solved or improved according to the judgment criterion;
S6: on the basis of acquiring motor vehicle target tracks at the intersection, the computing unit identifies road traffic events based on the characteristics of vehicle running tracks and reports the occurrence range of the traffic event to the cloud server;
S7: the cloud server receives the traffic event data reported by the computing unit, feeds back the occurrence position and live real-time video of the traffic event on a visual map, and performs early warning and secondary confirmation of the traffic event.
2. The radar video fusion-based road event detection system according to claim 1, wherein: the sensing equipment comprises a plurality of sensor devices deployed on the road side for acquiring intersection sensing data in real time.
3. The radar video fusion-based road event detection system according to claim 2, wherein: the sensor devices comprise a millimeter-wave radar and a camera as traffic information acquisition sensors; the millimeter-wave radar acquires real-time point cloud data, and the camera acquires video image data.
4. The radar video fusion-based road event detection system according to claim 1, wherein: the space-time matching comprises time matching and spatial matching; the time matching comprises: during data testing, the radar acquisition rate is 20 frames/second and the video acquisition rate is 60 frames/second; for time matching, the starting points are aligned first: the Gaussian mixture background model needs the first 30 video frames for training, so the radar data start time is 0.5 second later than the video data start time; after the time starting points are aligned, data frames are matched; since the camera acquisition rate is an integral multiple of the radar acquisition rate, the video data are sampled every 2 frames (taking every third frame) at the radar data timestamps, achieving temporal matching.
5. The radar video fusion-based road event detection system according to claim 4, wherein: the spatial matching comprises: converting data of two different dimensions into the same coordinate system; when traffic target information is collected, the equipment is generally mounted at a higher position with a certain inclination angle, monitoring downward, and the spatial coordinate system conversion method is designed according to the viewing angle of the set scene; the spatial coordinate system conversion involves conversions among five coordinate systems, and finally converts data in the radar coordinate system into the pixel coordinate system.
6. The radar video fusion-based road event detection system according to claim 5, wherein: the computing unit is arranged in a circuit box at the road side; it performs space-time synchronization, target extraction of road traffic participants, and target fusion and tracking on the original data obtained by the roadside sensing equipment, provides real-time trajectory data for traffic event analysis, and transmits the data to the cloud server through the network.
7. The radar video fusion-based road event detection system according to claim 5, wherein: because the scenes and equipment for acquiring data differ, the angular conversion relation from the radar coordinate system to the world coordinate system also differs; the two coordinate systems are therefore converted according to the set radar coordinate system and the height and inclination angle of the installed equipment;
wherein, if the radar detects a target at radial distance s and angle γ, the measurement is converted into the two-dimensional radar coordinate system:
x_r = s·sin(γ);
y_r = s·cos(γ);
if the installation height of the radar is h, the radar beam is emitted obliquely downward, and the inclination angle of the equipment is α, with θ determined by this mounting geometry: assume that the origin of the world coordinate system coincides with that of the radar coordinate system, the x_w axis coincides with the x_r axis of the radar coordinate system, the z_w axis of the world coordinate system lies along the y_r axis of the radar coordinate system, and the y_w axis of the world coordinate system is perpendicular to the radar emission plane and points downward; by the geometric angle relationship, the conversion from the radar coordinate system to the world coordinate system is:
x_w = x_r;
y_w = y_r·sin(θ);
z_w = y_r·cos(θ).
8. The radar video fusion-based road event detection system according to claim 5, wherein: when the test equipment is installed, the radar development board and the camera are placed side by side with only a small offset, so the conversion between the world coordinate system and the camera coordinate system mainly uses camera extrinsic parameters obtained by simple debugging; the camera coordinate system is then converted to the pixel coordinate system through calibrated camera intrinsic parameters; in the actual single-target test scene, the height h of the sensor above the ground is 4.6 m and the inclination angle α is 30°.
CN202210784009.0A (filed 2022-07-05, priority date 2022-07-05): Road event detection system based on radar video fusion. Status: Pending.

Priority Applications (1)

Application number: CN202210784009.0A; priority date: 2022-07-05; filing date: 2022-07-05; title: Road event detection system based on radar video fusion

Publications (1)

Publication number: CN115019512A; publication date: 2022-09-06

Family ID: 83078714

Country Status (1)

CN: CN115019512A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination