CN116935631A - Abnormal traffic situation detection method, device and system based on radar fusion - Google Patents

Abnormal traffic situation detection method, device and system based on radar fusion

Info

Publication number
CN116935631A
CN116935631A (application CN202310749007.2A)
Authority
CN
China
Prior art keywords
vehicles
information
traffic situation
vehicle
target area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310749007.2A
Other languages
Chinese (zh)
Inventor
石鑫
张希庆
刘洁
舒国明
吴琼
李华伟
殷秀玉
孟玉文
高金毓
王海岗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei Jiaotong Vocational and Technical College
Original Assignee
Hebei Jiaotong Vocational and Technical College
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei Jiaotong Vocational and Technical College
Priority to CN202310749007.2A
Publication of CN116935631A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/0104 - Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125 - Traffic data processing
    • G08G1/0133 - Traffic data processing for classifying traffic situation
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/0104 - Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125 - Traffic data processing
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/04 - Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides an abnormal traffic situation detection method, device and system based on radar fusion, and relates to the technical field of abnormal traffic situation identification. The method overcomes the limitation of relying on data from a single roadside camera, solves the problem of insufficient precision caused by abnormal traffic situation identification being easily affected by bad weather, improves the accuracy and efficiency of abnormal traffic situation identification, and provides efficient and accurate evidence data for handling abnormal road traffic conditions.

Description

Abnormal traffic situation detection method, device and system based on radar fusion
Technical Field
The invention relates to the technical field of abnormal traffic situation identification, in particular to a method, a device and a system for detecting abnormal traffic situations based on radar fusion.
Background
With the development of urbanization, the problem of urban traffic congestion has become prominent and seriously affects the operation and economic development of cities. Comprehensively and effectively obtaining real-time dynamic traffic information of roads, and rapidly and accurately judging the state of traffic flow in the road network through information processing and analysis, makes it possible to discover traffic congestion in time and formulate reasonable and effective congestion guidance strategies. This can greatly reduce the influence range of traffic congestion, reduce its harm in terms of time, economy and the environment, and avoid road network paralysis caused by local traffic congestion.
At present, traffic situation analysis on most roads is performed mainly by roadside camera devices. However, under the influence of weather, especially severe weather such as rainstorms, roadside camera devices suffer from inaccurate acquisition, so the traffic situation analysis results are inaccurate.
Disclosure of Invention
The invention provides an abnormal traffic situation detection method, device and system based on radar fusion, which can realize the fusion of radar data and video data, comprehensively judge traffic situations and improve the accuracy of traffic situation analysis.
In a first aspect, the invention provides a method for detecting an abnormal traffic situation based on radar fusion, which comprises the following steps: acquiring radar detection road condition data and video detection road condition data of a target area; performing image recognition and feature extraction on the video detection road condition data, and determining vision measurement information of a plurality of vehicles in the target area, wherein the vision measurement information comprises identification information and lane information; performing feature extraction on the radar detection road condition data, and determining radar measurement information of the plurality of vehicles in the target area, wherein the radar measurement information comprises vehicle speed information and position coordinates; performing data fusion based on the vision measurement information and the radar measurement information of the plurality of vehicles to obtain radar-vision fusion information of the plurality of vehicles, wherein the radar-vision fusion information comprises identification information, track information, vehicle speed information and lane information; and performing traffic situation analysis based on the radar-vision fusion information of the plurality of vehicles, and determining whether an abnormal traffic situation exists in the target area.
In one possible implementation, the video detection road condition data includes video images at a plurality of moments; performing image recognition and feature extraction on the video detection road condition data and determining the vision measurement information of the plurality of vehicles in the target area comprises: inputting the video image at each moment into a pre-trained vehicle identification model in chronological order to obtain identification information of the plurality of vehicles and a plurality of detection frames of each of the plurality of vehicles at each moment, wherein the vehicle identification model takes a vehicle image as input and outputs the identification information of the vehicle and a plurality of detection frames of the vehicle; for each vehicle, calculating the similarity between the plurality of detection frames of the vehicle and preset track information of the vehicle, wherein the preset track information of each vehicle is determined from the detection frames of that vehicle at one or more moments before the current moment; and determining the lane information of each vehicle according to the calculated similarity and pre-stored lane line information of the target area.
In one possible implementation, the radar detection road condition data includes radar images at a plurality of moments; performing feature extraction on the radar detection road condition data and determining the radar measurement information of the plurality of vehicles in the target area comprises: converting the radar image at each moment into a binary image; performing edge feature detection on the binary image to obtain contour points of the plurality of vehicles in the binary image; determining position coordinates of the plurality of vehicles in the radar image at each moment based on the contour points of the plurality of vehicles in the binary image; and determining vehicle speed information of the plurality of vehicles based on the position coordinates of the plurality of vehicles in the radar images at the plurality of moments.
In one possible implementation, the video detection road condition data includes video images at a plurality of moments; performing data fusion based on the vision measurement information and the radar measurement information of the plurality of vehicles to obtain the radar-vision fusion information of the plurality of vehicles comprises: determining contour information and pixel point information of the plurality of vehicles in the video images at the plurality of moments based on the identification information and the lane information of the plurality of vehicles in the vision measurement information; correcting and fusing the contour information and the pixel point information of the plurality of vehicles in the video images at the plurality of moments based on the position coordinates and the vehicle speed information of the plurality of vehicles in the radar measurement information to obtain a plurality of radar-vision images; and analyzing the plurality of vehicles in the plurality of radar-vision images to obtain the radar-vision fusion information of the plurality of vehicles.
In one possible implementation, performing traffic situation analysis based on the radar-vision fusion information of the plurality of vehicles and determining whether an abnormal traffic situation exists in the target area comprises: calculating the number of vehicles and the average vehicle speed on each lane of the target area based on the radar-vision fusion information of the plurality of vehicles; and determining whether an abnormal traffic situation exists in the target area based on the number of vehicles and the average vehicle speed on each lane of the target area.
In one possible implementation, the abnormal traffic situation includes mild congestion, moderate congestion, or severe congestion; determining whether an abnormal traffic situation exists in the target area based on the number of vehicles and the average vehicle speed on each lane of the target area comprises: if the average vehicle speed is less than or equal to a first vehicle speed and greater than a second vehicle speed, and the number of vehicles is less than a first number, determining that an abnormal traffic situation exists and is mild congestion, wherein the first vehicle speed is greater than the second vehicle speed; if the average vehicle speed is less than or equal to the second vehicle speed and greater than a third vehicle speed, and the number of vehicles is greater than or equal to the first number and less than a second number, determining that an abnormal traffic situation exists and is moderate congestion, wherein the second vehicle speed is greater than the third vehicle speed and the first number is less than the second number; and if the average vehicle speed is less than or equal to the third vehicle speed and the number of vehicles is greater than or equal to the second number, determining that an abnormal traffic situation exists and is severe congestion.
In one possible implementation, after performing the traffic situation analysis based on the radar-vision fusion information of the plurality of vehicles and determining whether an abnormal traffic situation exists in the target area, the method further includes: if an abnormal traffic situation exists in the target area, analyzing violation events of the plurality of vehicles based on the radar-vision fusion information of the plurality of vehicles to obtain violation events of the target area, wherein the violation events include at least one of: illegal lane change events, overspeed events, low-speed events, wrong-way driving events, emergency lane occupation events, queue over-limit events, overflow events and congestion events; and generating first prompt information based on the violation events of the target area, and sending the first prompt information to the on-duty traffic police to instruct the on-duty traffic police to handle the situation on site.
In one possible implementation, after performing the traffic situation analysis based on the radar-vision fusion information of the plurality of vehicles and determining whether an abnormal traffic situation exists in the target area, the method further includes: acquiring destinations of the plurality of vehicles and traffic situations of a plurality of areas within a preset range of the target area; planning a plurality of paths by taking the current location of a target vehicle as the starting point and the destination of the target vehicle as the end point, wherein the target vehicle is any one of the plurality of vehicles; calculating the travel duration of each of the plurality of paths based on the traffic situations of the plurality of areas; determining the path with the shortest duration as a target path and generating second prompt information, wherein the second prompt information carries the target path; and sending the second prompt information to the target vehicle to prompt the target vehicle to travel along the target path.
In a second aspect, an embodiment of the present invention provides an abnormal traffic situation detection device based on radar fusion, including: a communication module, configured to acquire radar detection road condition data and video detection road condition data of a target area; and a processing module, configured to perform image recognition and feature extraction on the video detection road condition data and determine vision measurement information of a plurality of vehicles in the target area, wherein the vision measurement information comprises identification information and lane information; perform feature extraction on the radar detection road condition data and determine radar measurement information of the plurality of vehicles in the target area, wherein the radar measurement information comprises vehicle speed information and position coordinates; perform data fusion based on the vision measurement information and the radar measurement information of the plurality of vehicles to obtain radar-vision fusion information of the plurality of vehicles, wherein the radar-vision fusion information comprises identification information, track information, vehicle speed information and lane information; and perform traffic situation analysis based on the radar-vision fusion information of the plurality of vehicles and determine whether an abnormal traffic situation exists in the target area.
In one possible implementation, the video detection road condition data includes video images at a plurality of moments; the processing module is specifically configured to input the video image at each moment into a pre-trained vehicle identification model in chronological order to obtain identification information of the plurality of vehicles and a plurality of detection frames of each of the plurality of vehicles at each moment, wherein the vehicle identification model takes a vehicle image as input and outputs the identification information of the vehicle and a plurality of detection frames of the vehicle; for each vehicle, calculate the similarity between the plurality of detection frames of the vehicle and preset track information of the vehicle, wherein the preset track information of each vehicle is determined from the detection frames of that vehicle at one or more moments before the current moment; and determine the lane information of each vehicle according to the calculated similarity and pre-stored lane line information of the target area.
In one possible implementation, the radar detection road condition data includes radar images at a plurality of moments; the processing module is specifically configured to convert the radar image at each moment into a binary image; perform edge feature detection on the binary image to obtain contour points of the plurality of vehicles in the binary image; determine position coordinates of the plurality of vehicles in the radar image at each moment based on the contour points of the plurality of vehicles in the binary image; and determine vehicle speed information of the plurality of vehicles based on the position coordinates of the plurality of vehicles in the radar images at the plurality of moments.
In one possible implementation, the video detection road condition data includes video images at a plurality of moments; the processing module is specifically configured to determine contour information and pixel point information of the plurality of vehicles in the video images at the plurality of moments based on the identification information and the lane information of the plurality of vehicles in the vision measurement information; correct and fuse the contour information and the pixel point information of the plurality of vehicles in the video images at the plurality of moments based on the position coordinates and the vehicle speed information of the plurality of vehicles in the radar measurement information to obtain a plurality of radar-vision images; and analyze the plurality of vehicles in the plurality of radar-vision images to obtain the radar-vision fusion information of the plurality of vehicles.
In one possible implementation, the processing module is specifically configured to calculate, based on the radar-vision fusion information of the plurality of vehicles, the number of vehicles and the average vehicle speed on each lane of the target area; and determine whether an abnormal traffic situation exists in the target area based on the number of vehicles and the average vehicle speed on each lane of the target area.
In one possible implementation, the abnormal traffic situation includes mild congestion, moderate congestion, or severe congestion; the processing module is specifically configured to determine that an abnormal traffic situation exists and is mild congestion if the average vehicle speed is less than or equal to a first vehicle speed and greater than a second vehicle speed, and the number of vehicles is less than a first number, wherein the first vehicle speed is greater than the second vehicle speed; determine that an abnormal traffic situation exists and is moderate congestion if the average vehicle speed is less than or equal to the second vehicle speed and greater than a third vehicle speed, and the number of vehicles is greater than or equal to the first number and less than a second number, wherein the second vehicle speed is greater than the third vehicle speed and the first number is less than the second number; and determine that an abnormal traffic situation exists and is severe congestion if the average vehicle speed is less than or equal to the third vehicle speed and the number of vehicles is greater than or equal to the second number.
In one possible implementation, the processing module is further configured to, if an abnormal traffic situation exists in the target area, analyze violation events of the plurality of vehicles based on the radar-vision fusion information of the plurality of vehicles to obtain violation events of the target area, wherein the violation events include at least one of: illegal lane change events, overspeed events, low-speed events, wrong-way driving events, emergency lane occupation events, queue over-limit events, overflow events and congestion events; and generate first prompt information based on the violation events of the target area, and send the first prompt information to the on-duty traffic police to instruct the on-duty traffic police to handle the situation on site.
In one possible implementation, the communication module is further configured to acquire destinations of the plurality of vehicles and traffic situations of a plurality of areas within a preset range of the target area; the processing module is further configured to plan a plurality of paths by taking the current location of a target vehicle as the starting point and the destination of the target vehicle as the end point, wherein the target vehicle is any one of the plurality of vehicles; calculate the travel duration of each of the plurality of paths based on the traffic situations of the plurality of areas; determine the path with the shortest duration as a target path and generate second prompt information, wherein the second prompt information carries the target path; and send the second prompt information to the target vehicle to prompt the target vehicle to travel along the target path.
In a third aspect, embodiments of the present invention provide an electronic system comprising a wide area radar monitor, a video vehicle detector and a data fusion processor, wherein the data fusion processor is configured to invoke and run a computer program to perform the steps of the method according to the first aspect and any possible implementation of the first aspect.
In a fourth aspect, an embodiment of the present invention provides an electronic device, including a memory storing a computer program and a processor for calling and running the computer program stored in the memory to perform the steps of the method according to the first aspect and any possible implementation manner of the first aspect.
In a fifth aspect, embodiments of the present invention provide a computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to the first aspect and any one of the possible implementations of the first aspect.
The invention provides an abnormal traffic situation detection method, device and system based on radar fusion. The method overcomes the limitation of relying on data from a single roadside camera, solves the problem of insufficient precision caused by abnormal traffic situation identification being easily affected by bad weather, improves the accuracy and efficiency of abnormal traffic situation identification, and provides efficient and accurate evidence data for handling abnormal road traffic conditions.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic structural diagram of a highway all-weather risk identification system according to an embodiment of the present invention;
FIG. 2 is a schematic architecture diagram of a highway all-weather risk identification system according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of an abnormal traffic situation detection method based on radar fusion provided by the embodiment of the invention;
fig. 4 is a schematic structural diagram of an abnormal traffic situation detection device based on radar fusion according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic system according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In the description of the present application, "/" means "or" unless otherwise indicated; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. Further, "at least one" or "a plurality" means two or more. The terms "first," "second," and the like do not limit the quantity or order of execution, nor do they necessarily indicate that the objects so described are different.
In embodiments of the application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g." in an embodiment should not be taken as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion that may be readily understood.
Furthermore, references to the terms "comprising" and "having" and any variations thereof in the description of the present application are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to only those steps or modules, but may include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus.
In the related art, data from a single roadside camera is limited by the influence of bad weather, so that abnormal traffic situation identification has low precision and low accuracy.
In order to solve this technical problem, an embodiment of the invention provides an abnormal traffic situation detection method based on radar fusion: radar detection data and video detection data are fused to obtain radar-vision fusion information of a plurality of vehicles, the traffic situation of a target area is comprehensively determined according to the radar-vision fusion information of the plurality of vehicles, and the accuracy of traffic situation analysis is improved.
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the following description will be made with reference to the accompanying drawings of the present invention by way of specific embodiments.
As shown in fig. 1, the embodiment of the invention provides a structural schematic diagram of an all-weather risk identification system for a highway. The system includes a video sensor and a lidar. The video sensor is used to capture the video detection road condition data, and the lidar is used to detect and obtain the radar detection road condition data. The radar detection road condition data and the video detection road condition data are then fused, and the traffic situation is judged.
As shown in fig. 2, the embodiment of the invention provides a schematic architecture of an all-weather risk identification system for a highway. The system has the functions of road video monitoring, traffic target identification, traffic flow detection, traffic event detection, traffic information release and the like.
Exemplary road video detection includes ultra-low illuminance detection, glare suppression, 120 dB wide dynamic range detection, intelligent video coding, deep intelligent detection, and the like.
Exemplary traffic target identification includes license plate identification, vehicle model identification, vehicle logo identification, body color identification, unmanned aerial vehicle capture, and the like.
Exemplary traffic flow detection includes vehicle position detection, vehicle speed detection, real-time flow detection, lane occupancy detection, vehicle queue length detection, headway/vehicle spacing detection, and the like.
Exemplary traffic event detection includes vehicle lane change, abnormal parking, traffic congestion, wrong-way driving, vehicle overspeed, vehicle queue overflow, and the like.
Exemplary traffic information release includes release of vehicle information, vehicle speed, vehicle distance, overspeed warnings, and the like.
As shown in fig. 3, the embodiment of the invention provides an abnormal traffic situation detection method based on radar fusion. The method comprises steps S101-S105.
S101, radar detection road condition data and video detection road condition data of a target area are obtained.
In some embodiments, the video detection road condition data includes video images at a plurality of moments. The radar detection road condition data includes radar images at a plurality of moments.
S102, performing image recognition and feature extraction on the video detection road condition data, and determining the vision measurement information of a plurality of vehicles in the target area.
In the embodiment of the application, the vision measurement information comprises identification information and lane information.
As a possible implementation manner, the embodiment of the present application may determine the vision measurement information of a plurality of vehicles in the target area through steps S1021-S1023.
S1021, inputting the video image at each moment into a pre-trained vehicle identification model in chronological order to obtain identification information of the plurality of vehicles and a plurality of detection frames of each of the plurality of vehicles at each moment.
The vehicle identification model takes a vehicle image as input and outputs the identification information of the vehicle and a plurality of detection frames of the vehicle.
S1022, for each vehicle, calculating the similarity between the plurality of detection frames of the vehicle and the preset track information of the vehicle.
In some embodiments, the preset track information of each vehicle is determined from the detection frames of that vehicle at one or more moments before the current moment.
S1023, determining the lane information of each vehicle according to the calculated similarity and the pre-stored lane line information of the target area.
For example, the present embodiment may determine the detection frame with the greatest similarity, determine information of each vehicle, such as its driving track, based on that detection frame, and further determine the lane information of each vehicle based on the driving track of each vehicle and the pre-stored lane line information of the target area, such as the lane markings formed by the lane lines and the lane positions.
It should be noted that the present application first obtains a plurality of detection frames through the vehicle identification model, then matches these detection frames with the track information to determine their similarity, and finally determines the vehicle information. Detection by the vehicle identification model increases detection speed and efficiency, while combining it with track matching ensures the accuracy of the detection result, so that both the efficiency and the accuracy of vehicle information detection are taken into account.
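Purely as an illustration of how the matching in S1021-S1023 could be realised, the following Python sketch matches detection frames to a vehicle's preset track with an intersection-over-union (IoU) similarity and then assigns a lane from pre-stored lane boundaries. The box format, the choice of IoU as the similarity measure, and the lane boundaries expressed as pixel x-ranges are assumptions, not details taken from the patent.

```python
# Minimal sketch of detection-frame / track matching and lane assignment (assumed details).
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union similarity between two boxes (x1, y1, x2, y2)."""
    xa, ya = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    xb, yb = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, xb - xa) * max(0.0, yb - ya)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def match_boxes_to_track(boxes, track_boxes):
    """Pick the detection frame most similar to the vehicle's preset track
    (here: its detection frames at earlier moments), as in S1022."""
    last_box = track_boxes[-1]
    sims = [iou(b, last_box) for b in boxes]
    best = int(np.argmax(sims))
    return boxes[best], sims[best]

def lane_of(box, lane_bounds):
    """Assign a lane by comparing the box centre with pre-stored lane x-ranges (S1023)."""
    cx = 0.5 * (box[0] + box[2])
    for lane_id, (x_left, x_right) in lane_bounds.items():
        if x_left <= cx < x_right:
            return lane_id
    return None

# usage example with made-up numbers
lane_bounds = {"lane_1": (0, 400), "lane_2": (400, 800)}
track = [(100, 300, 180, 380), (110, 290, 190, 370)]
detections = [(115, 285, 195, 365), (600, 300, 680, 380)]
box, sim = match_boxes_to_track(detections, track)
print(lane_of(box, lane_bounds), round(sim, 2))   # -> lane_1 0.78
```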
S103, feature extraction is performed on the radar detection road condition data, and the radar measurement information of the plurality of vehicles in the target area is determined.
In the embodiment of the application, the radar measurement information comprises vehicle speed information and position coordinates.
As a possible implementation manner, the embodiment of the present application may determine the radar measurement information of the plurality of vehicles in the target area through steps S1031 to S1034.
S1031, converting the radar image at each moment into a binary image.
For example, the embodiment of the invention may first convert the radar image into a grayscale image and then convert the grayscale image into a binary image.
S1032, edge feature detection is carried out on the binary image, and contour points of a plurality of vehicles in the binary image are obtained.
S1033, determining position coordinates of the plurality of vehicles in the radar image at each moment based on the contour points of the plurality of vehicles in the binary image.
S1034, determining vehicle speed information of the plurality of vehicles based on the position coordinates of the plurality of vehicles in the radar images at the plurality of times.
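The following OpenCV sketch is one possible reading of S1031-S1034: threshold each radar image into a binary image, extract vehicle contours (standing in for the edge feature detection step), use the contour centroids as position coordinates, and estimate speed from the displacement between two moments. The threshold value, the pixel-to-metre scale and the synthetic test frame are illustrative assumptions.

```python
# Minimal sketch of S1031-S1034 with OpenCV (assumed parameters).
import cv2
import numpy as np

def vehicle_positions(radar_img_gray, thresh=127):
    """Return centroid coordinates of vehicle blobs in one radar image."""
    _, binary = cv2.threshold(radar_img_gray, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    positions = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            positions.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return positions

def speed(pos_prev, pos_curr, dt, metres_per_pixel=0.1):
    """Speed in m/s from positions of the same vehicle at two consecutive moments."""
    dx = (pos_curr[0] - pos_prev[0]) * metres_per_pixel
    dy = (pos_curr[1] - pos_prev[1]) * metres_per_pixel
    return np.hypot(dx, dy) / dt

# usage with a synthetic 8-bit radar frame
frame = np.zeros((200, 200), dtype=np.uint8)
cv2.rectangle(frame, (40, 40), (60, 70), 255, -1)   # one vehicle blob
p0 = vehicle_positions(frame)[0]
p1 = (p0[0] + 20, p0[1])                            # assume it moved 20 px in 1 s
print(round(speed(p0, p1, dt=1.0), 2), "m/s")       # -> 2.0 m/s
```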
S104, performing data fusion based on the vision measurement information and the radar measurement information of the plurality of vehicles to obtain the radar-vision fusion information of the plurality of vehicles.
In some embodiments, the radar-vision fusion information includes identification information, track information, vehicle speed information, and lane information.
As a possible implementation manner, the embodiment of the present invention may determine the radar-vision fusion information of the plurality of vehicles through steps S1041 to S1043.
S1041, determining contour information and pixel information of a plurality of vehicles in the video images at a plurality of moments based on the identification information and lane information of the plurality of vehicles in the vision measurement information.
S1042, correcting and fusing the contour information and the pixel point information of the plurality of vehicles in the video images based on the position coordinates and the vehicle speed information of the plurality of vehicles in the radar measurement information to obtain a plurality of radar-vision images.
Optionally, this embodiment may determine the lane on which each vehicle is traveling at each moment, the track information on that lane, the vehicle speed information, and the like according to the identification information and lane information of the vehicle, the position coordinates of the vehicle, and the vehicle speed information of the vehicle, so as to fuse the vision measurement information and the radar measurement information of the vehicle and obtain the radar-vision fusion information of the vehicle.
S1043, analyzing the plurality of vehicles in the plurality of radar-vision images to obtain the radar-vision fusion information of the plurality of vehicles.
Therefore, the invention can fuse the video image and the radar image at each moment to obtain a more accurate radar-vision image and improve the accuracy of traffic situation analysis.
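As a structural illustration of the fusion in S1041-S1043 (not the patent's data format), the sketch below merges per-vehicle vision measurements (identification, lane, contour pixels) with radar measurements (position, speed) into radar-vision fusion records, assuming the vision and radar detections have already been associated and keyed by the same vehicle identifier; all class and field names are hypothetical.

```python
# Minimal sketch of per-vehicle fusion of vision and radar measurements (assumed format).
from dataclasses import dataclass, field

@dataclass
class VisionMeasurement:
    vehicle_id: str
    lane: str
    contour_pixels: list          # pixel coordinates from the video image

@dataclass
class RadarMeasurement:
    position: tuple               # (x, y) road-plane coordinates
    speed: float                  # m/s

@dataclass
class FusionRecord:
    vehicle_id: str
    lane: str
    speed: float
    track: list = field(default_factory=list)   # positions accumulated over time

def fuse(vision: dict, radar: dict, records: dict) -> dict:
    """Merge one moment's measurements, keyed by vehicle_id, into fusion records."""
    for vid, v in vision.items():
        r = radar.get(vid)
        if r is None:                            # no radar echo for this vehicle now
            continue
        rec = records.setdefault(vid, FusionRecord(vid, v.lane, r.speed))
        rec.lane, rec.speed = v.lane, r.speed    # radar corrects speed, video gives lane/ID
        rec.track.append(r.position)             # accumulate track information
    return records

# usage example
vision = {"A123": VisionMeasurement("A123", "lane_2", [(10, 20), (12, 22)])}
radar = {"A123": RadarMeasurement(position=(35.0, 120.5), speed=18.4)}
print(fuse(vision, radar, {})["A123"])
```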
S105, performing traffic situation analysis based on the radar-vision fusion information of the plurality of vehicles, and determining whether an abnormal traffic situation exists in the target area.
In some embodiments, traffic situations include normal situations and abnormal situations. Normal situations include free flow and basically unimpeded flow; abnormal situations include mild congestion, moderate congestion, and severe congestion.
As a possible implementation manner, the embodiment of the present invention may determine the traffic situation of the target area through steps S1051-S1052.
S1051, calculating the number of vehicles and the average vehicle speed on each lane of the target area based on the radar-vision fusion information of the plurality of vehicles.
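A minimal sketch of the per-lane statistics in S1051 follows, assuming the radar-vision fusion information has been flattened into per-vehicle records carrying a lane label and a speed; the field names and units are assumptions.

```python
# Minimal sketch of per-lane vehicle count and average speed (assumed record format).
from collections import defaultdict

def per_lane_statistics(fused_records):
    """Return {lane: (vehicle_count, average_speed)} for the target area."""
    speeds = defaultdict(list)
    for rec in fused_records:
        speeds[rec["lane"]].append(rec["speed"])
    return {lane: (len(v), sum(v) / len(v)) for lane, v in speeds.items()}

# usage example
records = [{"lane": "lane_1", "speed": 15.0},
           {"lane": "lane_1", "speed": 12.5},
           {"lane": "lane_2", "speed": 27.0}]
print(per_lane_statistics(records))   # {'lane_1': (2, 13.75), 'lane_2': (1, 27.0)}
```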
S1052, determining whether an abnormal traffic situation exists in the target area based on the number of vehicles and the average vehicle speed on each lane of the target area.
Step S1052 may be embodied as A1-A5, for example.
A1, if the average vehicle speed is less than or equal to a first vehicle speed and greater than a second vehicle speed, and the number of vehicles is less than a first number, it is determined that an abnormal traffic situation exists and is mild congestion.
The first vehicle speed is greater than the second vehicle speed.
A2, if the average vehicle speed is less than or equal to the second vehicle speed and greater than a third vehicle speed, and the number of vehicles is greater than or equal to the first number and less than a second number, it is determined that an abnormal traffic situation exists and is moderate congestion.
The second vehicle speed is greater than the third vehicle speed, and the first number is less than the second number.
A3, if the average vehicle speed is less than or equal to the third vehicle speed and the number of vehicles is greater than or equal to the second number, it is determined that an abnormal traffic situation exists and is severe congestion.
A4, if the average vehicle speed is greater than the first vehicle speed and the number of vehicles is less than the first number, it is determined that no abnormal traffic situation exists; the traffic situation is a normal situation, namely free flow.
A5, if the average vehicle speed is greater than the first vehicle speed and the number of vehicles is greater than or equal to the first number and less than the second number, it is determined that no abnormal traffic situation exists; the traffic situation is a normal situation, namely basically unimpeded flow.
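The threshold logic of A1-A5 can be written compactly as below. The patent does not give numeric values for the first, second and third vehicle speeds or the first and second vehicle counts, so the numbers here are illustrative assumptions only.

```python
# Minimal sketch of the A1-A5 classification; thresholds are assumed, not from the patent.
FIRST_SPEED, SECOND_SPEED, THIRD_SPEED = 60.0, 40.0, 20.0   # km/h, first > second > third
FIRST_COUNT, SECOND_COUNT = 20, 40                          # vehicles per lane, first < second

def classify_lane(avg_speed: float, vehicle_count: int) -> str:
    """Return the traffic situation of one lane from average speed and vehicle count."""
    if avg_speed <= THIRD_SPEED and vehicle_count >= SECOND_COUNT:
        return "severe congestion"                                          # A3
    if SECOND_SPEED >= avg_speed > THIRD_SPEED and FIRST_COUNT <= vehicle_count < SECOND_COUNT:
        return "moderate congestion"                                        # A2
    if FIRST_SPEED >= avg_speed > SECOND_SPEED and vehicle_count < FIRST_COUNT:
        return "mild congestion"                                            # A1
    if avg_speed > FIRST_SPEED and vehicle_count < FIRST_COUNT:
        return "free flow"                                                  # A4
    if avg_speed > FIRST_SPEED and FIRST_COUNT <= vehicle_count < SECOND_COUNT:
        return "basically unimpeded"                                        # A5
    return "unclassified"    # combinations not enumerated in A1-A5

print(classify_lane(avg_speed=35.0, vehicle_count=25))   # -> moderate congestion
```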
The invention provides an abnormal traffic situation detection method based on radar fusion, which fuses radar detection data and video detection data to obtain radar-vision fusion information of a plurality of vehicles, comprehensively determines the traffic situation of a target area according to the radar-vision fusion information of the plurality of vehicles, and improves the accuracy of traffic situation analysis. The method overcomes the limitation of relying on data from a single roadside camera, solves the problem of insufficient precision caused by abnormal traffic situation identification being easily affected by bad weather, improves the accuracy and efficiency of abnormal traffic situation identification, and provides efficient and accurate evidence data for handling abnormal road traffic conditions.
Optionally, the method for detecting abnormal traffic situation based on the radar fusion provided by the embodiment of the present invention further includes steps S201 to S202 after step S105.
S201, if an abnormal traffic situation exists in the target area, analyzing violation events of the plurality of vehicles based on the radar-vision fusion information of the plurality of vehicles to obtain violation events of the target area.
In some embodiments, the violation events include at least one of: illegal lane change events, overspeed events, low-speed events, wrong-way driving events, emergency lane occupation events, queue over-limit events, overflow events, and congestion events.
S202, generating first prompt information based on the violation events of the target area, and sending the first prompt information to the on-duty traffic police to instruct the on-duty traffic police to handle the situation on site.
Therefore, when the traffic situation is abnormal, the embodiment of the invention can analyze the violation situation in the target area, determine the cause of the congestion, and send it to the on-duty traffic police, which facilitates on-site handling and quickly resolves the traffic congestion problem.
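For illustration, the sketch below shows the kind of per-vehicle checks S201 could apply to the radar-vision fusion records, here only for overspeed and wrong-way driving; the speed limit, the assumed lane direction and the record fields are hypothetical, and the patent's full set of violation events would need additional rules.

```python
# Minimal sketch of violation-event checks on fusion records (assumed rules and fields).
from collections import namedtuple

SPEED_LIMIT = 33.3          # m/s, about 120 km/h; illustrative value
LANE_DIRECTION = +1         # assumed positive travel direction along the road axis

def violations(record) -> list:
    """record: object with .speed (m/s) and .track (list of (x, y) positions)."""
    events = []
    if record.speed > SPEED_LIMIT:
        events.append("overspeed")
    if len(record.track) >= 2:
        dy = record.track[-1][1] - record.track[0][1]
        if dy * LANE_DIRECTION < 0:              # moving against the assumed lane direction
            events.append("wrong-way driving")
    return events

# usage example
Rec = namedtuple("Rec", "speed track")
print(violations(Rec(speed=40.0, track=[(0, 100), (0, 60)])))
# -> ['overspeed', 'wrong-way driving']
```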
Optionally, after step S105, S301-S305 are further included.
S301, acquiring destinations of a plurality of vehicles and traffic situations of a plurality of areas within a preset range of a target area.
S302, planning a plurality of paths by taking the current location of the target vehicle as the starting point and the destination of the target vehicle as the end point.
In some embodiments, the target vehicle is any one of a plurality of vehicles.
S303, calculating the travel duration of each of the plurality of paths based on the traffic situations of the plurality of areas.
S304, determining the path with the shortest duration as a target path, and generating second prompt information.
In some embodiments, the second hint information carries the target path.
S305, sending second prompt information to the target vehicle so as to prompt the target vehicle to run along the target path.
Therefore, the embodiment of the invention can optimize the navigation path of each vehicle based on the traffic situations of the plurality of areas, determine the path with the shortest travel duration, recommend it to the vehicle user, and reduce the probability of congestion in the target area.
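A minimal sketch of the route selection in S301-S305 is given below, assuming the road network is represented as a graph whose edge weights are travel durations already adjusted for the traffic situation of each area; the node names and durations are made up for illustration, and Dijkstra's algorithm is used simply as one common way to find the shortest-duration path.

```python
# Minimal sketch of shortest-duration path selection over congestion-weighted durations.
import heapq

def shortest_duration_path(graph, start, destination):
    """Dijkstra over duration-weighted edges; returns (total_duration, path)."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        duration, node, path = heapq.heappop(queue)
        if node == destination:
            return duration, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, edge_duration in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (duration + edge_duration, nxt, path + [nxt]))
    return float("inf"), []

# durations (minutes) already scaled up on congested areas
graph = {
    "vehicle_location": {"area_A": 5, "area_B": 8},
    "area_A": {"destination": 20},   # area_A currently congested
    "area_B": {"destination": 9},
    "destination": {},
}
duration, target_path = shortest_duration_path(graph, "vehicle_location", "destination")
print(duration, target_path)   # -> 17.0 ['vehicle_location', 'area_B', 'destination']
```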
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
The following are device embodiments of the invention; for details not described therein, reference may be made to the corresponding method embodiments described above.
Fig. 4 shows a schematic structural diagram of an abnormal traffic situation detection device based on radar fusion according to an embodiment of the present invention. The detection device 400 comprises a communication module 401 and a processing module 402.
The communication module 401 is configured to obtain radar detection road condition data and video detection road condition data of a target area.
The processing module 402 is configured to perform image recognition and feature extraction on the video detection road condition data and determine vision measurement information of a plurality of vehicles in the target area, wherein the vision measurement information comprises identification information and lane information; perform feature extraction on the radar detection road condition data and determine radar measurement information of the plurality of vehicles in the target area, wherein the radar measurement information comprises vehicle speed information and position coordinates; perform data fusion based on the vision measurement information and the radar measurement information of the plurality of vehicles to obtain radar-vision fusion information of the plurality of vehicles, wherein the radar-vision fusion information comprises identification information, track information, vehicle speed information and lane information; and perform traffic situation analysis based on the radar-vision fusion information of the plurality of vehicles and determine whether an abnormal traffic situation exists in the target area.
In one possible implementation, the video detection road condition data includes video images at a plurality of moments; the processing module 402 is specifically configured to input the video image at each moment into a pre-trained vehicle identification model in chronological order to obtain identification information of the plurality of vehicles and a plurality of detection frames of each of the plurality of vehicles at each moment, wherein the vehicle identification model takes a vehicle image as input and outputs the identification information of the vehicle and a plurality of detection frames of the vehicle; for each vehicle, calculate the similarity between the plurality of detection frames of the vehicle and preset track information of the vehicle, wherein the preset track information of each vehicle is determined from the detection frames of that vehicle at one or more moments before the current moment; and determine the lane information of each vehicle according to the calculated similarity and pre-stored lane line information of the target area.
In one possible implementation, the radar detection road condition data includes radar images at a plurality of moments; the processing module 402 is specifically configured to convert the radar image at each moment into a binary image; perform edge feature detection on the binary image to obtain contour points of the plurality of vehicles in the binary image; determine position coordinates of the plurality of vehicles in the radar image at each moment based on the contour points of the plurality of vehicles in the binary image; and determine vehicle speed information of the plurality of vehicles based on the position coordinates of the plurality of vehicles in the radar images at the plurality of moments.
In one possible implementation, the video detection road condition data includes video images at a plurality of moments; the processing module 402 is specifically configured to determine contour information and pixel point information of the plurality of vehicles in the video images at the plurality of moments based on the identification information and the lane information of the plurality of vehicles in the vision measurement information; correct and fuse the contour information and the pixel point information of the plurality of vehicles in the video images at the plurality of moments based on the position coordinates and the vehicle speed information of the plurality of vehicles in the radar measurement information to obtain a plurality of radar-vision images; and analyze the plurality of vehicles in the plurality of radar-vision images to obtain the radar-vision fusion information of the plurality of vehicles.
In one possible implementation, the processing module 402 is specifically configured to calculate, based on the radar-vision fusion information of the plurality of vehicles, the number of vehicles and the average vehicle speed on each lane of the target area; and determine whether an abnormal traffic situation exists in the target area based on the number of vehicles and the average vehicle speed on each lane of the target area.
In one possible implementation, the abnormal traffic situation includes mild congestion, moderate congestion, or severe congestion; the processing module 402 is specifically configured to determine that an abnormal traffic situation exists and is mild congestion if the average vehicle speed is less than or equal to a first vehicle speed and greater than a second vehicle speed, and the number of vehicles is less than a first number, wherein the first vehicle speed is greater than the second vehicle speed; determine that an abnormal traffic situation exists and is moderate congestion if the average vehicle speed is less than or equal to the second vehicle speed and greater than a third vehicle speed, and the number of vehicles is greater than or equal to the first number and less than a second number, wherein the second vehicle speed is greater than the third vehicle speed and the first number is less than the second number; and determine that an abnormal traffic situation exists and is severe congestion if the average vehicle speed is less than or equal to the third vehicle speed and the number of vehicles is greater than or equal to the second number.
In a possible implementation manner, the processing module 402 is further configured to, if an abnormal traffic situation exists in the target area, analyze violation events of the plurality of vehicles based on the radar-vision fusion information of the plurality of vehicles to obtain violation events of the target area, wherein the violation events include at least one of: illegal lane change events, overspeed events, low-speed events, wrong-way driving events, emergency lane occupation events, queue over-limit events, overflow events and congestion events; and generate first prompt information based on the violation events of the target area, and send the first prompt information to the on-duty traffic police to instruct the on-duty traffic police to handle the situation on site.
In a possible implementation manner, the communication module 401 is further configured to acquire destinations of the plurality of vehicles and traffic situations of a plurality of areas within a preset range of the target area; the processing module 402 is further configured to plan a plurality of paths by taking the current location of a target vehicle as the starting point and the destination of the target vehicle as the end point, wherein the target vehicle is any one of the plurality of vehicles; calculate the travel duration of each of the plurality of paths based on the traffic situations of the plurality of areas; determine the path with the shortest duration as a target path and generate second prompt information, wherein the second prompt information carries the target path; and send the second prompt information to the target vehicle to prompt the target vehicle to travel along the target path.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 5, the electronic device 500 of this embodiment includes: a processor 501, a memory 502 and a computer program 503 stored in the memory 502 and executable on the processor 501. The steps of the method embodiments described above, such as steps S101-S105 shown in fig. 3, are implemented when the processor 501 executes the computer program 503. Alternatively, the processor 501 may implement the functions of the modules/units in the above-described device embodiments when executing the computer program 503, for example, the functions of the communication module 401 and the processing module 402 shown in fig. 4.
Illustratively, the computer program 503 may be split into one or more modules/units that are stored in the memory 502 and executed by the processor 501 to accomplish the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing the specified functions, which instruction segments are used to describe the execution of the computer program 503 in the electronic device 500. For example, the computer program 503 may be divided into the communication module 401 and the processing module 402 shown in fig. 4.
The processor 501 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 502 may be an internal storage unit of the electronic device 500, such as a hard disk or a memory of the electronic device 500. The memory 502 may also be an external storage device of the electronic device 500, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the electronic device 500. Further, the memory 502 may also include both internal storage units and external storage devices of the electronic device 500. The memory 502 is used for storing the computer program and other programs and data required by the terminal. The memory 502 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal and method may be implemented in other manners. For example, the apparatus/terminal embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present invention may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the computer program may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included within the protection scope of the present invention.

Claims (10)

1. An abnormal traffic situation detection method based on radar fusion, characterized by comprising the following steps:
acquiring radar-detected road condition data and video-detected road condition data of a target area;
performing image recognition and feature extraction on the video-detected road condition data to determine video detection information of a plurality of vehicles in the target area, the video detection information comprising identification information and lane information;
performing feature extraction on the radar-detected road condition data to determine radar detection information of the plurality of vehicles in the target area, the radar detection information comprising vehicle speed information and position coordinates;
performing data fusion based on the video detection information and the radar detection information of the plurality of vehicles to obtain fused radar-video information of the plurality of vehicles, the fused radar-video information comprising identification information, track information, vehicle speed information and lane information; and
performing traffic situation analysis based on the fused radar-video information of the plurality of vehicles to determine whether an abnormal traffic situation exists in the target area.
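Read as an algorithm, claim 1 chains four processing stages: video detection, radar detection, fusion, and situation analysis. The Python sketch below only fixes illustrative data structures and the composition of the last two stages; the type names, field names and function signatures are assumptions made for illustration, not terminology from the patent.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class VideoDetection:          # per-vehicle output of image recognition
    vehicle_id: str            # identification information
    lane: int                  # lane information

@dataclass
class RadarDetection:          # per-vehicle output of radar feature extraction
    speed_mps: float           # vehicle speed information
    position: tuple            # position coordinates (x, y)

@dataclass
class FusedTrack:              # fused radar-video information
    vehicle_id: str
    lane: int
    speed_mps: float
    track: List[tuple]         # track information: history of positions

def detect_abnormal_situation(
    video_dets: List[VideoDetection],
    radar_dets: List[RadarDetection],
    fuse: Callable[[List[VideoDetection], List[RadarDetection]], List[FusedTrack]],
    analyse: Callable[[List[FusedTrack]], bool],
) -> bool:
    # Compose the two final stages of claim 1: data fusion, then situation analysis.
    fused = fuse(video_dets, radar_dets)
    return analyse(fused)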
2. The abnormal traffic situation detection method based on radar fusion according to claim 1, wherein the video-detected road condition data comprises video images at a plurality of moments;
the performing image recognition and feature extraction on the video-detected road condition data to determine video detection information of a plurality of vehicles in the target area comprises:
inputting the video image at each moment into a pre-trained vehicle recognition model in chronological order to obtain identification information of the plurality of vehicles and a plurality of detection boxes of each of the plurality of vehicles at each moment, wherein the vehicle recognition model takes a vehicle image as input and outputs the identification information and the detection boxes of the vehicles;
for each vehicle, calculating the similarity between the plurality of detection boxes of the vehicle and preset track information of the vehicle, wherein the preset track information of each vehicle is determined from the detection boxes of the vehicle at one or more moments before each moment; and
determining the lane information of each vehicle according to the calculated similarity and pre-stored lane line information of the target area.
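Claim 2 does not fix a similarity measure or a lane-assignment rule. A common choice, shown here only as a hedged sketch, is intersection-over-union (IoU) between a candidate detection box and the vehicle's last tracked box, followed by assigning the lane from the box centre and stored lane-line x-coordinates; the box format and the lane_boundaries parameter are assumptions.

import numpy as np

def iou(box_a, box_b):
    # Intersection-over-union of two boxes given as (x1, y1, x2, y2).
    xa1, ya1, xa2, ya2 = box_a
    xb1, yb1, xb2, yb2 = box_b
    ix1, iy1 = max(xa1, xb1), max(ya1, yb1)
    ix2, iy2 = min(xa2, xb2), min(ya2, yb2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((xa2 - xa1) * (ya2 - ya1)
             + (xb2 - xb1) * (yb2 - yb1) - inter)
    return inter / union if union > 0 else 0.0

def match_box_to_track(candidate_boxes, previous_box):
    # Pick the detection box most similar to the vehicle's last known box.
    scores = [iou(b, previous_box) for b in candidate_boxes]
    best = int(np.argmax(scores))
    return candidate_boxes[best], scores[best]

def assign_lane(box, lane_boundaries):
    # lane_boundaries: sorted x-coordinates of lane lines in image space.
    cx = (box[0] + box[2]) / 2.0               # horizontal centre of the box
    for lane_idx in range(len(lane_boundaries) - 1):
        if lane_boundaries[lane_idx] <= cx < lane_boundaries[lane_idx + 1]:
            return lane_idx
    return None                                # outside the stored lane lines

In practice the matching would also gate on a minimum IoU before accepting a box as belonging to the track, but that threshold is a tuning choice rather than something the claim specifies.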
3. The abnormal traffic situation detection method based on radar fusion according to claim 1, wherein the radar-detected road condition data comprises radar images at a plurality of moments;
the performing feature extraction on the radar-detected road condition data to determine radar detection information of the plurality of vehicles in the target area comprises:
converting the radar image at each moment into a binary image;
performing edge feature detection on the binary image to obtain contour points of the plurality of vehicles in the binary image;
determining position coordinates of the plurality of vehicles in the radar image at each moment based on the contour points of the plurality of vehicles in the binary image; and
determining the vehicle speed information of the plurality of vehicles based on the position coordinates of the plurality of vehicles in the radar images at the plurality of moments.
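One way to realise the binarisation, contour extraction and speed estimation of claim 3 is with standard OpenCV primitives. The sketch below assumes an 8-bit grayscale radar intensity image and a known pixel-to-metre scale; the noise-area threshold and the frame-to-frame association of centroids are simplifications, not details taken from the patent.

import cv2
import numpy as np

def vehicle_positions(radar_image_gray):
    # Binarise a grayscale radar intensity image and return vehicle centroids.
    _, binary = cv2.threshold(radar_image_gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < 20:            # drop small noise blobs
            continue
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

def estimate_speed(pos_prev, pos_curr, dt_s, metres_per_pixel):
    # Speed of one vehicle from its displacement between two consecutive frames.
    dx = (pos_curr[0] - pos_prev[0]) * metres_per_pixel
    dy = (pos_curr[1] - pos_prev[1]) * metres_per_pixel
    return float(np.hypot(dx, dy)) / dt_s      # metres per second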
4. The abnormal traffic situation detection method based on radar fusion according to claim 1, wherein the video-detected road condition data comprises video images at a plurality of moments;
the performing data fusion based on the video detection information and the radar detection information of the plurality of vehicles to obtain fused radar-video information of the plurality of vehicles comprises:
determining contour information and pixel point information of the plurality of vehicles in the video images at the plurality of moments based on the identification information and lane information in the video detection information of the plurality of vehicles;
correcting and fusing the contour information and pixel point information of the plurality of vehicles in the video images at the plurality of moments based on the position coordinates and vehicle speed information in the radar detection information of the plurality of vehicles to obtain a plurality of radar-video fusion images; and
analysing the plurality of vehicles in the plurality of radar-video fusion images to obtain the fused radar-video information of the plurality of vehicles.
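Claim 4 leaves open how camera detections are associated with radar targets. A minimal sketch under the assumption of a calibrated camera-to-road homography and greedy nearest-neighbour matching is shown below; the dictionary keys, the homography and the 3 m distance gate are all illustrative assumptions.

import numpy as np

def project_to_road_plane(pixel_xy, homography):
    # Map an image pixel to road-plane coordinates with a 3x3 homography
    # (a calibration assumed here; the claim itself does not fix the method).
    p = homography @ np.array([pixel_xy[0], pixel_xy[1], 1.0])
    return p[:2] / p[2]

def fuse_detections(video_dets, radar_dets, homography, max_dist_m=3.0):
    # Greedy nearest-neighbour association of camera boxes with radar targets.
    # video_dets: list of dicts with 'vehicle_id', 'lane', 'box_centre' (pixels).
    # radar_dets: list of dicts with 'position' (road metres) and 'speed_mps'.
    fused, used = [], set()
    for v in video_dets:
        cam_pos = project_to_road_plane(v["box_centre"], homography)
        best_j, best_d = None, max_dist_m
        for j, r in enumerate(radar_dets):
            if j in used:
                continue
            d = float(np.linalg.norm(cam_pos - np.asarray(r["position"])))
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            used.add(best_j)
            fused.append({"vehicle_id": v["vehicle_id"], "lane": v["lane"],
                          "position": radar_dets[best_j]["position"],
                          "speed_mps": radar_dets[best_j]["speed_mps"]})
    return fused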
5. The abnormal traffic situation detection method based on radar fusion according to claim 1, wherein the performing traffic situation analysis based on the fused radar-video information of the plurality of vehicles to determine whether an abnormal traffic situation exists in the target area comprises:
calculating the number of vehicles and the average vehicle speed on each lane of the target area based on the fused radar-video information of the plurality of vehicles; and
determining whether an abnormal traffic situation exists in the target area based on the number of vehicles and the average vehicle speed on each lane of the target area.
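The per-lane aggregation of claim 5 amounts to a count and a mean speed per lane. A minimal sketch, assuming each fused record carries a 'lane' and a 'speed_mps' field (field names are illustrative):

from collections import defaultdict

def per_lane_statistics(fused_tracks):
    # fused_tracks: iterable of dicts with 'lane' and 'speed_mps' keys.
    # Returns {lane: (vehicle_count, average_speed_mps)}.
    counts = defaultdict(int)
    speed_sums = defaultdict(float)
    for t in fused_tracks:
        counts[t["lane"]] += 1
        speed_sums[t["lane"]] += t["speed_mps"]
    return {lane: (counts[lane], speed_sums[lane] / counts[lane])
            for lane in counts}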
6. The abnormal traffic situation detection method based on radar fusion according to claim 5, wherein the abnormal traffic situation comprises mild congestion, moderate congestion or severe congestion;
the determining whether an abnormal traffic situation exists in the target area based on the number of vehicles and the average vehicle speed on each lane of the target area comprises:
if the average vehicle speed is less than or equal to a first vehicle speed and greater than a second vehicle speed, and the number of vehicles is less than a first number, determining that an abnormal traffic situation exists and is mild congestion, wherein the first vehicle speed is greater than the second vehicle speed;
if the average vehicle speed is less than or equal to the second vehicle speed and greater than a third vehicle speed, and the number of vehicles is greater than or equal to the first number and less than a second number, determining that an abnormal traffic situation exists and is moderate congestion, wherein the second vehicle speed is greater than the third vehicle speed and the first number is less than the second number; and
if the average vehicle speed is less than or equal to the third vehicle speed and the number of vehicles is greater than or equal to the second number, determining that an abnormal traffic situation exists and is severe congestion.
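Claim 6 only orders the thresholds (first speed > second speed > third speed, first number < second number) without giving values. The sketch below encodes the three-tier rule with placeholder thresholds; the default values are assumptions, not values from the patent, and in practice they would be calibrated per road section.

def classify_congestion(avg_speed_kmh, n_vehicles,
                        v1=60.0, v2=40.0, v3=20.0,   # speed thresholds in km/h (illustrative)
                        n1=15, n2=30):               # vehicle-count thresholds (illustrative)
    # Apply the three-tier rule of claim 6; return None when no abnormal
    # traffic situation is detected.
    if v2 < avg_speed_kmh <= v1 and n_vehicles < n1:
        return "mild congestion"
    if v3 < avg_speed_kmh <= v2 and n1 <= n_vehicles < n2:
        return "moderate congestion"
    if avg_speed_kmh <= v3 and n_vehicles >= n2:
        return "severe congestion"
    return None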
7. The abnormal traffic situation detection method based on radar fusion according to any one of claims 1 to 6, further comprising, after the performing traffic situation analysis based on the fused radar-video information of the plurality of vehicles to determine whether an abnormal traffic situation exists in the target area:
if an abnormal traffic situation exists in the target area, analysing violation events of the plurality of vehicles based on the fused radar-video information of the plurality of vehicles to obtain violation events of the target area, the violation events comprising at least one of: an illegal lane change event, an overspeed event, a low-speed event, a wrong-way driving event, an emergency lane occupation event, an over-limit queuing event, an overflow event and a congestion event; and
generating first prompt information based on the violation events of the target area and sending the first prompt information to an on-duty traffic police officer to instruct the officer to attend the scene.
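Claim 7 does not specify how each violation type is detected from the fused information. The sketch below is one plausible rule set for a single vehicle track, covering only the overspeed, low-speed, wrong-way and emergency-lane cases; the thresholds, field names and the wrong-way test along a single road axis are all assumptions made for illustration.

def detect_violations(track, lane_speed_limit_kmh, emergency_lane_id,
                      low_speed_kmh=20.0):
    # track: dict with 'speed_kmh', 'lane', 'positions' (list of (x, y), newest
    # last) and 'lane_direction' (+1 or -1 along the road axis).
    events = []
    if track["speed_kmh"] > lane_speed_limit_kmh:
        events.append("overspeed")
    elif track["speed_kmh"] < low_speed_kmh:
        events.append("low-speed")
    if len(track["positions"]) >= 2:
        dx = track["positions"][-1][0] - track["positions"][0][0]
        if dx * track["lane_direction"] < 0:       # moving against the lane direction
            events.append("wrong-way driving")
    if track["lane"] == emergency_lane_id:
        events.append("emergency lane occupation")
    return events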
8. The abnormal traffic situation detection method based on radar fusion according to any one of claims 1 to 6, further comprising, after the performing traffic situation analysis based on the fused radar-video information of the plurality of vehicles to determine whether an abnormal traffic situation exists in the target area:
acquiring destinations of the plurality of vehicles and traffic situations of a plurality of areas within a preset range of the target area;
planning a plurality of paths with the current location of a target vehicle as the starting point and the destination of the target vehicle as the end point, the target vehicle being any one of the plurality of vehicles;
calculating the travel duration of each of the plurality of paths based on the traffic situations of the plurality of areas;
determining the path with the shortest duration as a target path and generating second prompt information carrying the target path; and
sending the second prompt information to the target vehicle to prompt the target vehicle to travel along the target path.
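Once candidate paths exist, the selection rule of claim 8 reduces to estimating a duration per path from the surrounding traffic situations and taking the minimum. A hedged sketch, assuming each path is a list of (length, free-flow speed, congestion level) segments and that congestion is modelled as a speed reduction factor (the factors are illustrative):

def path_duration_h(path, congestion_speed_factor):
    # path: list of (segment_length_km, free_flow_speed_kmh, congestion_level).
    # congestion_speed_factor: e.g. {'none': 1.0, 'mild': 0.7, 'moderate': 0.5,
    # 'severe': 0.3} - illustrative slow-down factors, not patent values.
    total = 0.0
    for length_km, free_speed_kmh, level in path:
        effective = free_speed_kmh * congestion_speed_factor.get(level, 1.0)
        total += length_km / effective
    return total

def pick_target_path(candidate_paths, congestion_speed_factor):
    # Return the index and duration of the candidate path with the shortest
    # predicted travel time (the selection rule of claim 8).
    durations = [path_duration_h(p, congestion_speed_factor)
                 for p in candidate_paths]
    best = min(range(len(durations)), key=durations.__getitem__)
    return best, durations[best]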
9. An abnormal traffic situation detection device based on radar fusion, characterized by comprising:
a communication module configured to acquire radar-detected road condition data and video-detected road condition data of a target area; and
a processing module configured to: perform image recognition and feature extraction on the video-detected road condition data to determine video detection information of a plurality of vehicles in the target area, the video detection information comprising identification information and lane information; perform feature extraction on the radar-detected road condition data to determine radar detection information of the plurality of vehicles in the target area, the radar detection information comprising vehicle speed information and position coordinates; perform data fusion based on the video detection information and the radar detection information of the plurality of vehicles to obtain fused radar-video information of the plurality of vehicles, the fused radar-video information comprising identification information, track information, vehicle speed information and lane information; and perform traffic situation analysis based on the fused radar-video information of the plurality of vehicles to determine whether an abnormal traffic situation exists in the target area.
10. An electronic system, characterized by comprising: a wide-area radar monitor, a video vehicle detector and a data fusion processor, wherein the data fusion processor is configured to invoke and run a computer program to perform the steps of the method according to any one of claims 1 to 8.
CN202310749007.2A 2023-06-25 2023-06-25 Abnormal traffic situation detection method, device and system based on radar fusion Pending CN116935631A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310749007.2A CN116935631A (en) 2023-06-25 2023-06-25 Abnormal traffic situation detection method, device and system based on radar fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310749007.2A CN116935631A (en) 2023-06-25 2023-06-25 Abnormal traffic situation detection method, device and system based on radar fusion

Publications (1)

Publication Number Publication Date
CN116935631A true CN116935631A (en) 2023-10-24

Family

ID=88385483

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310749007.2A Pending CN116935631A (en) 2023-06-25 2023-06-25 Abnormal traffic situation detection method, device and system based on radar fusion

Country Status (1)

Country Link
CN (1) CN116935631A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117292551A (en) * 2023-11-27 2023-12-26 辽宁邮电规划设计院有限公司 Urban traffic situation adjustment system and method based on Internet of Things
CN117292551B (en) * 2023-11-27 2024-02-23 辽宁邮电规划设计院有限公司 Urban traffic situation adjustment system and method based on Internet of things

Similar Documents

Publication Publication Date Title
CN108986465B (en) Method, system and terminal equipment for detecting traffic flow
CN109087510B (en) Traffic monitoring method and device
CN110164130B (en) Traffic incident detection method, device, equipment and storage medium
CN106255899B (en) Device for signaling an object to a navigation module of a vehicle equipped with such a device
CN108764042B (en) Abnormal road condition information identification method and device and terminal equipment
CN110796007B (en) Scene recognition method and computing device
US20130093895A1 (en) System for collision prediction and traffic violation detection
KR20220047732A (en) Vehicle monitoring method and apparatus, electronic device, storage medium and computer program, cloud control platform and vehicle road cooperation system
CN108550258B (en) Vehicle queuing length detection method and device, storage medium and electronic equipment
CN111898491B (en) Identification method and device for reverse driving of vehicle and electronic equipment
CN109284801B (en) Traffic indicator lamp state identification method and device, electronic equipment and storage medium
CN111723854B (en) Expressway traffic jam detection method, equipment and readable storage medium
CN112818792A (en) Lane line detection method, lane line detection device, electronic device, and computer storage medium
CN116935631A (en) Abnormal traffic situation detection method, device and system based on radar fusion
CN113888860A (en) Method and device for detecting abnormal running of vehicle, server and readable storage medium
CN114639085A (en) Traffic signal lamp identification method and device, computer equipment and storage medium
CN115618932A (en) Traffic incident prediction method and device based on internet automatic driving and electronic equipment
CN111160132B (en) Method and device for determining lane where obstacle is located, electronic equipment and storage medium
CN110969864A (en) Vehicle speed detection method, vehicle driving event detection method and electronic equipment
CN114730492A (en) Assertion vehicle detection model generation and implementation
CN115762230A (en) Parking lot intelligent guiding method and device based on remaining parking space amount prediction
CN115019511A (en) Method and device for identifying illegal lane change of motor vehicle based on automatic driving vehicle
CN114895274A (en) Guardrail identification method
CN113994391A (en) Vehicle passing reminding method and device and vehicle-mounted terminal
CN113380039A (en) Data processing method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination