CN114670852A - Method, device, equipment and medium for identifying abnormal driving behaviors - Google Patents


Info

Publication number
CN114670852A
CN114670852A (application CN202210192029.9A)
Authority
CN
China
Prior art keywords
information
vehicle
radar
average speed
lane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210192029.9A
Other languages
Chinese (zh)
Inventor
吴冬升
王传奇
郑泽彬
郑廷钊
倪泓鑫
刘双广
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gosuncn Technology Group Co Ltd
Original Assignee
Gosuncn Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gosuncn Technology Group Co Ltd filed Critical Gosuncn Technology Group Co Ltd
Priority to CN202210192029.9A priority Critical patent/CN114670852A/en
Publication of CN114670852A publication Critical patent/CN114670852A/en
Pending legal-status Critical Current


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B60W40/105Speed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408Radar; Laser, e.g. lidar

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a method for identifying abnormal driving behavior, comprising the following steps: performing time synchronization and spatial synchronization on a roadside lidar, a roadside millimeter-wave radar, and a roadside camera; after synchronization is completed, acquiring first radar information from the roadside lidar, second radar information from the roadside millimeter-wave radar, and video information from the roadside camera; selecting an adaptive fusion algorithm according to environmental information, and using it to fuse the first radar information, the second radar information, and the video information into target-object operating-state information; and identifying abnormal driving behavior of target objects according to that information. Through fusion, the data gain richer dimensions and higher confidence, the different fusion algorithms cover all weather conditions, the accuracy of the operating-state information is improved, and the accuracy of identifying abnormal driving behavior is improved accordingly.

Description

Method, device, equipment and medium for identifying abnormal driving behaviors
Technical Field
The invention relates to the technical fields of the Internet of Vehicles and artificial intelligence, and in particular to a method, an apparatus, a device, and a medium for identifying abnormal driving behavior.
Background
Automated driving is a complex, high-level system built from multiple on-board sensors, such as a lidar (LiDAR), a millimeter-wave radar (RADAR), a camera, a Global Positioning System (GPS) receiver, and an Inertial Measurement Unit (IMU), together with subsystems such as the driving computer (ECU).
In practice, the prior art feeds abnormal-vehicle identification mainly from on-board devices such as the lidar, the millimeter-wave radar, and the camera. However, the mounting angles of these on-board devices readily cause interference from leading vehicles (for example, a large vehicle blocking a small one so that the traffic-light state cannot be recognized by the camera), blind spots, and limited detection coverage, all of which degrade the accuracy of the detection data. Moreover, merely combining the raw detections of the lidar, millimeter-wave radar, and camera is hard to apply to every scene, lacks target and event detection, and yields poor accuracy in identifying abnormal driving behavior.
Disclosure of Invention
Embodiments of the invention provide a method, an apparatus, a device, and a medium for identifying abnormal driving behavior, aiming to solve the prior-art problems of low detection-data precision, lack of data fusion, and poor accuracy in detecting abnormal driving behavior during automated driving.
A method of identifying abnormal driving behavior, the method comprising:
performing time synchronization and spatial synchronization on a roadside lidar, a roadside millimeter-wave radar, and a roadside camera;
after time synchronization and spatial synchronization are completed, acquiring first radar information from the roadside lidar, second radar information from the roadside millimeter-wave radar, and video information from the roadside camera;
selecting an adaptive fusion algorithm according to environmental information, and performing a fusion operation on the first radar information, the second radar information, and the video information with the selected algorithm to obtain target-object operating-state information;
and identifying abnormal driving behavior of the target objects according to the target-object operating-state information.
Optionally, the environmental information includes lighting information and weather information;
selecting the adaptive fusion algorithm according to the environmental information and performing the fusion operation with it comprises:
when the lighting information indicates adequate lighting and the weather information indicates no rain, fog, or snow, adopting a first fusion algorithm that fuses the first radar information, the second radar information, and the video information;
when the weather information indicates rain, fog, or snow, adopting a second fusion algorithm that fuses the video information and the second radar information;
when the lighting information indicates poor lighting, adopting a third fusion algorithm that fuses the first radar information and the second radar information;
and when the lighting information indicates poor lighting and the weather information indicates rain, fog, or snow, adopting a fourth fusion algorithm that operates on the second radar information alone.
Optionally, the target-object operating-state information includes at least the vehicles on each lane and their speeds;
identifying abnormal driving behavior of the target objects according to the target-object operating-state information comprises:
traversing each lane and calculating the average speed of the lane from the speeds of the vehicles on it;
calculating the average speed of each vehicle;
and identifying slow-moving/fast-moving vehicle information on the lane according to the average speed of the lane and the average speed of the vehicle.
Optionally, identifying slow-moving vehicle information on the lane according to the average speed of the lane and the average speed of the vehicle comprises:
comparing the average speed V1 of the lane with the average speed V2 of the vehicle;
and if V2 < K1 × V1, treating the vehicle with average speed V2 as a slow-moving vehicle and acquiring its slow-moving vehicle information, where the first scaling factor K1 is less than or equal to 1.
Optionally, identifying fast-moving vehicle information on the lane according to the average speed of the lane and the average speed of the vehicle comprises:
comparing the average speed V1 of the lane with the average speed V2 of the vehicle;
and if V2 > K2 × V1, treating the vehicle with average speed V2 as a fast-moving vehicle and acquiring its fast-moving vehicle information, where the second scaling factor K2 is greater than or equal to 1.
Optionally, identifying abnormal driving behavior of the target objects according to the target-object operating-state information further comprises:
acquiring speed-limit information V3 for the lane;
and identifying fast-moving vehicle information on the lane according to the speed-limit information V3, the lane average speed V1, and the vehicle average speed V2.
Optionally, identifying fast-moving vehicle information on the lane according to the speed-limit information V3, the lane average speed V1, and the vehicle average speed V2 comprises:
when K3 × V3 ≥ K2 × V1, comparing the lane average speed V1 with the vehicle average speed V2, and if V2 > K2 × V1, treating the vehicle with average speed V2 as a fast-moving vehicle and acquiring its fast-moving vehicle information;
and when K3 × V3 < K2 × V1, comparing the vehicle average speed V2 with the speed-limit information V3, and if V2 > K3 × V3, treating the vehicle with average speed V2 as a fast-moving vehicle and acquiring its fast-moving vehicle information, where the second scaling factor K2 and the third scaling factor K3 are both greater than or equal to 1.
An apparatus for identifying abnormal driving behavior, the apparatus comprising:
a synchronization module configured to perform time synchronization and spatial synchronization on a roadside lidar, a roadside millimeter-wave radar, and a roadside camera;
an acquisition module configured to acquire, after time synchronization and spatial synchronization are completed, first radar information from the roadside lidar, second radar information from the roadside millimeter-wave radar, and video information from the roadside camera;
a fusion module configured to select an adaptive fusion algorithm according to environmental information and perform a fusion operation on the first radar information, the second radar information, and the video information to obtain target-object operating-state information;
and an identification module configured to identify abnormal driving behavior of the target objects according to the target-object operating-state information.
A computer-readable storage medium storing a computer program which, when executed by a processor, implements the method of identifying abnormal driving behavior described above.
A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of identifying abnormal driving behavior described above when executing the computer program.
In embodiments of the invention, time synchronization and spatial synchronization are performed on the roadside lidar, the roadside millimeter-wave radar, and the roadside camera; after synchronization is completed, first radar information is acquired from the roadside lidar, second radar information from the roadside millimeter-wave radar, and video information from the roadside camera; an adaptive fusion algorithm is selected according to environmental information and used to fuse the first radar information, the second radar information, and the video information into target-object operating-state information, so that the data gain richer dimensions and higher confidence, the different fusion algorithms cover all weather conditions, and the accuracy of the operating-state information improves; abnormal driving behavior of the target objects is then identified from that information, improving the accuracy of identification.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in their description are briefly introduced below. The drawings show only some embodiments of the invention; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a flowchart of a method for identifying an abnormal driving behavior according to an embodiment of the present invention;
fig. 2 is a flowchart of step S104 in the method for identifying abnormal driving behaviors according to an embodiment of the present invention;
fig. 3 is a flowchart of step S104 in the method for identifying abnormal driving behavior according to another embodiment of the present invention;
fig. 4 is a schematic structural diagram of an abnormal driving behavior recognition apparatus according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a computer device according to an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In embodiments of the invention, the detection data of the roadside sensing devices (lidar, millimeter-wave radar, and camera) are fused, giving the data richer dimensions and higher confidence and improving detection precision for target vehicles; a vehicle driving-state recognition mechanism built on this higher-confidence data effectively improves the accuracy of identifying abnormal driving behavior.
The method for identifying abnormal driving behavior according to this embodiment is described in detail below. As shown in Fig. 1, the method includes:
in step S101, time synchronization and spatial synchronization are performed on the roadside lidar, the roadside millimeter wave radar, and the roadside camera.
Here, the embodiment performs time synchronization on the roadside lidar, the roadside millimeter-wave radar, and the roadside camera through a heterogeneous time-synchronization module, ensuring a consistent time base across the devices. The time-synchronization module may obtain standard clock information from navigation satellites via GPS and then synchronize the roadside lidar, millimeter-wave radar, and camera using the Network Time Protocol (NTP) or the Precision Time Protocol (PTP).
Spatial synchronization of the roadside lidar, millimeter-wave radar, and camera is performed through a heterogeneous spatial-synchronization module, for example by pixel-coordinate transformation, ensuring a consistent spatial frame across the devices.
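A pixel-coordinate transformation of the kind mentioned above can be sketched with a planar homography mapping ground-plane radar/lidar coordinates into camera pixels. The matrix `H` and the sample point below are illustrative assumptions; the patent does not specify the calibration procedure.

```python
import numpy as np

def to_pixel(h: np.ndarray, xy: tuple) -> tuple:
    """Map a ground-plane point (x, y) in metres to pixel coordinates (u, v)
    via a 3x3 homography h obtained from an (assumed) offline calibration."""
    p = h @ np.array([xy[0], xy[1], 1.0])
    return (p[0] / p[2], p[1] / p[2])  # perspective division

# Hypothetical calibration result: 100 px per metre, principal point (320, 240)
H = np.array([[100.0,    0.0, 320.0],
              [  0.0, -100.0, 240.0],
              [  0.0,    0.0,   1.0]])

u, v = to_pixel(H, (1.0, 2.0))  # radar detection projected into the image
```

In practice each roadside sensor would get its own transform into a common frame, so that detections of the same vehicle can later be matched during fusion.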
In step S102, after time synchronization and space synchronization are completed, first radar information is acquired from the roadside lidar, second radar information is acquired from the roadside millimeter wave radar, and video information is acquired from a roadside camera.
After time-space synchronization is finished, the relevant data are acquired from the roadside lidar, the roadside millimeter-wave radar, and the roadside camera respectively. The first radar information is preliminary target-object identification information output by the roadside lidar after applying deep-learning point-cloud and radar algorithms to the acquired data, and includes but is not limited to target type, position, speed, and direction. The second radar information is preliminary target-object identification information output by the roadside millimeter-wave radar in the same way, and includes but is not limited to target type, speed, and direction. The video information is preliminary target-object identification information output by the roadside camera using a deep-learning image algorithm, and includes but is not limited to target type, vehicle matching, speed, position, and direction.
In step S103, an adaptive fusion algorithm is selected according to environmental information and used to fuse the first radar information, the second radar information, and the video information into target-object operating-state information.
In this embodiment, the fusion algorithm performs fusion operations such as target matching and weighted information fusion on the first radar information, the second radar information, and the video information to obtain the operating-state information of the target objects, which includes but is not limited to target identity, speed, position, direction, and license-plate information.
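The weighted information fusion mentioned above can be sketched as a confidence-weighted average over matched per-sensor estimates. The weights and speed values below are illustrative assumptions; the patent does not specify how confidences are assigned.

```python
def fuse_speeds(estimates):
    """Confidence-weighted fusion of per-sensor speed estimates for one
    matched target. `estimates` is a list of (speed, weight) pairs; the
    weights are assumed per-sensor confidences, not taken from the patent."""
    total_w = sum(w for _, w in estimates)
    return sum(s * w for s, w in estimates) / total_w

# lidar, millimeter-wave radar, camera estimates for one vehicle (km/h)
fused = fuse_speeds([(60.2, 0.5), (59.0, 0.3), (61.0, 0.2)])
```

The same pattern would apply to position and direction; attributes only one sensor provides (e.g. license plate from the camera) are carried through unweighted.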
The embodiment supports adapting the fusion algorithm to the environmental information. Optionally, the environmental information includes lighting information and weather information, which may be obtained from the video captured by the roadside camera and from environmental sensors. Selecting the adaptive fusion algorithm according to the environmental information and performing the fusion operation with it comprises:
when the lighting information indicates adequate lighting and the weather information indicates no rain, fog, or snow, adopting a first fusion algorithm that fuses the first radar information, the second radar information, and the video information;
when the weather information indicates rain, fog, or snow, adopting a second fusion algorithm that fuses the video information and the second radar information;
when the lighting information indicates poor lighting, adopting a third fusion algorithm that fuses the first radar information and the second radar information;
and when the lighting information indicates poor lighting and the weather information indicates rain, fog, or snow, adopting a fourth fusion algorithm that operates on the second radar information alone.
Here, when lighting is normal and there is no rain, fog, or snow, the first fusion algorithm fuses the first radar information, the second radar information, and the video information. Under rain, fog, or snow the recognition accuracy of the lidar degrades, so the second fusion algorithm fuses the video information and the second radar information. When lighting strongly affects the scene, the accuracy of the video information degrades, so the third fusion algorithm fuses the first radar information and the second radar information. When lighting is poor and rain, fog, or snow is present, the fourth fusion algorithm operates on the second radar information alone.
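The four-way selection described above amounts to a dispatch over two booleans. The sensor-set labels below are shorthand for which inputs each fusion algorithm consumes:

```python
def select_fusion(lighting_ok: bool, rain_fog_snow: bool) -> str:
    """Pick the sensor subset for the adaptive fusion algorithm,
    mirroring the four cases in the text."""
    if lighting_ok and not rain_fog_snow:
        return "lidar+radar+camera"   # first fusion algorithm
    if lighting_ok and rain_fog_snow:
        return "camera+radar"         # second: lidar degraded by precipitation
    if not lighting_ok and not rain_fog_snow:
        return "lidar+radar"          # third: camera degraded by lighting
    return "radar-only"               # fourth: only millimeter-wave radar reliable
```

A real implementation would return the corresponding fusion routine rather than a label, but the branching logic is the same.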
By adapting among different fusion algorithms according to lighting and rain/fog/snow conditions, the embodiment suits all-weather scenes and helps improve the accuracy of target-data acquisition and recognition.
In step S104, an abnormal driving behavior in the target object is identified based on the target object operating condition information.
Here, as described above, the target-object operating-state information includes but is not limited to target identity, speed, position, direction, and license-plate information, i.e., the vehicles on each lane and their speeds. Recognition of abnormal driving behavior covers both slow-moving and fast-moving vehicles. Optionally, as shown in Fig. 2, identifying abnormal driving behavior according to the target-object operating-state information in step S104 further includes:
in step S201, each lane is traversed, and an average speed corresponding to the lane is calculated according to the speed of the vehicle on the lane.
The embodiment integrates with the traffic-light signal controller and obtains the traffic-light state directly, avoiding the accuracy issues of recognizing traffic lights from video. Under a green light, the travel speed of each vehicle within a preset time T is acquired, and the average speed V1 of all vehicles on each lane is calculated, covering motor vehicles and non-motor vehicles alike. Optionally, the highest and lowest speed readings may be filtered out of the calculation, depending on the number of vehicles.
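The lane average with optional filtering of extremes can be sketched as a trimmed mean. The rule of dropping exactly one highest and one lowest reading when enough samples exist is an assumption; the text only says the extremes may be filtered depending on the vehicle count.

```python
def lane_average_speed(speeds, trim: int = 1) -> float:
    """Average speed V1 of the vehicles on one lane over the window T.
    When more than 2*trim+1 samples exist, the `trim` highest and lowest
    readings are dropped (assumed filtering rule, see lead-in)."""
    s = sorted(speeds)
    if len(s) > 2 * trim + 1:
        s = s[trim:-trim]  # discard extremes on both ends
    return sum(s) / len(s)

v1 = lane_average_speed([30, 42, 45, 48, 80])  # extremes 30 and 80 dropped
```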
In step S202, an average speed corresponding to each vehicle is calculated.
Then, for each vehicle, its average speed V2 within the preset time T is calculated.
In step S203, the slow/fast vehicle information on the lane is identified according to the average speed corresponding to the lane and the average speed corresponding to the vehicle.
Optionally, in a preferred example directed to slow-moving vehicles, identifying slow-moving vehicle information on the lane according to the lane average speed and the vehicle average speed in step S203 further includes:
in step S301, the average speed V1 corresponding to the lane and the average speed V2 corresponding to the vehicle are compared.
In step S302, if V2 < K1 × V1, the vehicle with average speed V2 is regarded as a slow-moving vehicle and its slow-moving vehicle information is acquired, where the first scaling factor K1 is less than or equal to 1.
Here, K1 is the first scaling factor, K1 ≤ 1, and may be adjusted to the situation, for example K1 = 0.5. Each vehicle's average speed V2 is compared with the lane average speed V1 multiplied by K1. When V2 < K1 × V1, i.e., a vehicle's average speed falls below K1 × V1, the vehicle is regarded as slow-moving, and the slow-moving vehicle information of each lane, including but not limited to target type, speed, position, and direction, is screened out.
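The slow-vehicle test above reduces to a single comparison; K1 = 0.5 follows the example in the text:

```python
def is_slow(v2: float, v1: float, k1: float = 0.5) -> bool:
    """True when the vehicle's average speed v2 falls below k1 times the
    lane average v1 (k1 <= 1; 0.5 taken from the example in the text)."""
    return v2 < k1 * v1

flag = is_slow(20.0, 50.0)  # 20 < 0.5 * 50, so flagged as slow-moving
```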
Optionally, in another preferred example directed to fast-moving vehicles, identifying fast-moving vehicle information on the lane according to the lane average speed and the vehicle average speed in step S203 further includes:
in step S401, the average speed V1 corresponding to the lane and the average speed V2 corresponding to the vehicle are compared.
In step S402, if V2> K2 × V1, the vehicle corresponding to the average speed V2 is taken as a fast-driving vehicle, and fast-driving vehicle information is acquired, where the second proportionality coefficient K2 is greater than or equal to 1.
Here, K2 is the second scaling factor, K2 ≥ 1, and may be adjusted to the situation, for example K2 = 1.5. Each vehicle's average speed V2 is compared with the lane average speed V1 multiplied by K2. When V2 > K2 × V1, i.e., a vehicle's average speed exceeds K2 × V1, the vehicle is regarded as fast-moving, and the fast-moving vehicle information of each lane, including but not limited to target type, speed, position, and direction, is screened out.
Optionally, another preferred example builds on the embodiment of Fig. 2; as shown in Fig. 3, identifying abnormal driving behavior according to the target-object operating-state information further includes:
In step S501, each lane is traversed, and an average speed corresponding to the lane is calculated according to the speed of the vehicle in the lane.
In step S502, an average speed corresponding to each vehicle is calculated.
Steps S501 to S502 are the same as steps S201 to S202 described in the above embodiment, and for details, refer to the description of the above embodiment, which is not repeated herein.
In step S503, the speed limit information V3 of the lane is acquired.
In step S504, the fast-moving vehicle information on the lane is identified based on the speed-limit information V3, the lane average speed V1, and the vehicle average speed V2.
Here, the lane's speed-limit information V3 (scaled by the third factor K3) is compared with the lane average speed V1 multiplied by K2, and the judgment condition for fast-moving vehicles is chosen according to the result. Optionally, step S504 further includes:
in step S601, when K3 × V3> -K2 × V1, the average speed V1 corresponding to the lane is compared with the average speed V2 corresponding to the vehicle, and if V2> K2 × V1, the vehicle corresponding to the average speed V2 is regarded as a fast-moving vehicle, and fast-moving vehicle information is acquired.
In step S602, when K3 × V3< K2 × V1, the average speed V1 corresponding to the lane is compared with the speed limit information V3, and if V2> K3 × V3, the vehicle corresponding to the average speed V2 is regarded as a fast-traveling vehicle, and fast-traveling vehicle information is acquired.
Here, the second scaling factor K2 and the third scaling factor K3 are both greater than or equal to 1, with K3 preferably 1.2. When 1.2 × V3 ≥ K2 × V1, K2 × V1 serves as the judgment threshold and is compared with each vehicle's V2; when V2 > K2 × V1, the vehicle is taken as fast-moving, and its information, including but not limited to target type, speed, position, and direction, is screened out.
When 1.2 × V3 < K2 × V1, 1.2 × V3 serves as the judgment threshold and each vehicle's average speed V2 is compared with it; when V2 > 1.2 × V3, the vehicle is taken as fast-moving, and its information, including but not limited to target type, speed, position, and direction, is screened out.
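The two-branch judgment above can be collapsed into a single threshold: use K2 × V1 while it does not exceed K3 × V3, otherwise cap the threshold at K3 × V3. K2 = 1.5 and K3 = 1.2 follow the examples in the text.

```python
def fast_threshold(v1: float, v3: float, k2: float = 1.5, k3: float = 1.2) -> float:
    """Fast-vehicle threshold: k2*v1 when k3*v3 >= k2*v1 (step S601),
    else k3*v3 (step S602). k2, k3 >= 1 per the text."""
    return k2 * v1 if k3 * v3 >= k2 * v1 else k3 * v3

def is_fast(v2: float, v1: float, v3: float) -> bool:
    """True when the vehicle's average speed v2 exceeds the threshold."""
    return v2 > fast_threshold(v1, v3)

# Lane flowing at 55 km/h with a 60 km/h limit: threshold capped at 1.2*60 = 72
flagged = is_fast(75.0, 55.0, 60.0)
```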
Optionally, in this embodiment the slow-moving and fast-moving vehicle information of each lane is sent through the mobile edge computing module (MEC) to the roadside unit (RSU), which broadcasts it (via V2X) to the on-board units (OBUs) of motor vehicles. Each OBU computes relative positions from its own position (lane) and the broadcast information, determines whether an abnormally driving vehicle is ahead of or behind it in its lane, calculates the distance to that vehicle, and presents its driving information, e.g. "large vehicle XXXX driving fast X meters ahead/behind; please slow down or change lanes".
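The OBU-side check can be sketched as filtering broadcast alerts to the ego lane and reporting relative distances. The alert tuple layout and field names are assumptions; the text does not specify the V2X payload format.

```python
def abnormal_warnings(own_pos_m: float, own_lane: int, alerts):
    """OBU-side sketch: `alerts` is a list of broadcast tuples
    (lane, position_m, kind) for abnormally driving vehicles (assumed
    layout). Returns human-readable warnings for the ego lane only."""
    msgs = []
    for lane, pos_m, kind in alerts:
        if lane != own_lane:
            continue  # abnormal vehicle is on another lane, no warning
        dist = pos_m - own_pos_m
        side = "ahead" if dist >= 0 else "behind"
        msgs.append(f"{kind} vehicle {abs(dist):.0f} m {side}")
    return msgs

# Ego vehicle at 100 m on lane 2; one fast vehicle ahead on the same lane
warnings = abnormal_warnings(100.0, 2, [(2, 180.0, "fast"), (1, 90.0, "slow")])
```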
In summary, the prior art acquires driving-scene data from multiple vehicle-mounted sensors such as lidar, millimeter-wave radar, and cameras, without fusing the data for target and event detection. By contrast, the embodiment of the present invention fuses roadside sensing devices (lidar, millimeter-wave radar, and camera) so that the data (vehicle identification and tracking, license plate detection, driving direction, driving speed, and other relevant parameters) has richer dimensionality and higher confidence. It avoids the problems caused by the installation angles of vehicle-mounted devices such as lidar, millimeter-wave radar, and cameras, including front-vehicle interference (for example, a large vehicle blocking a small vehicle so that the traffic light state cannot be identified by a camera), single-vehicle intelligence blind spots, and limited detection coverage, thereby extending the detection range of vehicle running states and effectively improving the detection precision of target vehicles. Integrating the higher-confidence data into the vehicle running state recognition mechanism effectively improves the recognition accuracy of abnormal driving behaviors, such as slow-traveling vehicles whose speed is significantly lower than that of other vehicles and fast-traveling vehicles whose speed is significantly higher than that of other vehicles. The system supports all-weather operation and is unaffected by weather conditions, and the driving behavior detection results are broadcast through the RSU (via V2X) to the OBUs of nearby relevant vehicles, assisting those vehicles in making correct decisions and control.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic and does not limit the implementation of the embodiments of the present invention in any way.
In an embodiment, the present invention further provides an abnormal driving behavior recognition device, corresponding one-to-one to the abnormal driving behavior recognition method in the above embodiment. As shown in fig. 4, the abnormal driving behavior recognition device includes a synchronization module 41, an acquisition module 42, a fusion module 43, and an identification module 44. Each functional module is described in detail as follows:
a synchronization module 41, configured to perform time synchronization and space synchronization on the roadside laser radar, the roadside millimeter wave radar, and the roadside camera;
an obtaining module 42, configured to obtain first radar information from the roadside lidar, obtain second radar information from the roadside millimeter wave radar, and obtain video information from a roadside camera after time synchronization and space synchronization are completed;
the fusion module 43 is configured to perform, using an adaptive fusion algorithm selected according to environment information, a fusion operation on the first radar information, the second radar information, and the video information to obtain target object running condition information;
And the identification module 44 is used for identifying abnormal driving behaviors in the target object according to the target object running condition information.
Optionally, the environmental information includes lighting information and weather information;
the fusion module 43 is specifically configured to:
when the illumination information indicates good illumination and the weather information indicates no rain, fog, or snow, a first fusion algorithm is adopted, which performs the fusion operation on the first radar information, the second radar information, and the video information;
when the illumination information indicates good illumination and the weather information indicates rain, fog, or snow, a second fusion algorithm is adopted, which performs the fusion operation on the video information and the second radar information;
when the illumination information indicates poor illumination and the weather information indicates no rain, fog, or snow, a third fusion algorithm is adopted, which performs the fusion operation on the first radar information and the second radar information;
and when the illumination information indicates poor illumination and the weather information indicates rain, fog, or snow, a fourth fusion algorithm is adopted, which performs the fusion operation on the second radar information.
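The four-branch selection above amounts to a simple dispatch on two environment flags. The sketch below is illustrative; the function and stream names are assumptions, where "lidar" stands for the first radar information, "mmw" for the second radar information, and "video" for the camera video information:

```python
def select_fusion_inputs(good_illumination: bool, rain_fog_snow: bool) -> tuple:
    """Pick which sensor streams to fuse, per the four environment cases."""
    if good_illumination and not rain_fog_snow:
        return ("lidar", "mmw", "video")   # first fusion algorithm
    if good_illumination and rain_fog_snow:
        return ("mmw", "video")            # second fusion algorithm
    if not good_illumination and not rain_fog_snow:
        return ("lidar", "mmw")            # third fusion algorithm
    return ("mmw",)                        # fourth fusion algorithm
```

In every branch the millimeter-wave radar remains in the fused set, reflecting its robustness to both darkness and precipitation.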
Optionally, the target object operating condition information includes at least a vehicle on each lane and its speed;
The identification module 44 includes:
the first calculation unit is used for traversing each lane and calculating the average speed corresponding to the lane according to the speed of the vehicle on the lane;
a second calculation unit for calculating an average speed corresponding to each vehicle;
and the identification unit is used for identifying slow/fast vehicle information on the lane according to the average speed corresponding to the lane and the average speed corresponding to the vehicle.
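The two averages computed by the first and second calculation units can be sketched from per-frame detections. The data layout below (an iterable of (lane_id, vehicle_id, speed) samples) is an assumption for illustration:

```python
from collections import defaultdict

def average_speeds(samples):
    """Compute the lane average speed V1 and per-vehicle average speed V2.

    samples: iterable of (lane_id, vehicle_id, speed) detections.
    Returns (lane_avg, veh_avg): dicts mapping lane_id -> V1 and
    vehicle_id -> V2.
    """
    lane_sums = defaultdict(lambda: [0.0, 0])
    veh_sums = defaultdict(lambda: [0.0, 0])
    for lane, vid, speed in samples:
        lane_sums[lane][0] += speed
        lane_sums[lane][1] += 1
        veh_sums[vid][0] += speed
        veh_sums[vid][1] += 1
    lane_avg = {k: total / n for k, (total, n) in lane_sums.items()}
    veh_avg = {k: total / n for k, (total, n) in veh_sums.items()}
    return lane_avg, veh_avg
```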
Optionally, the identification unit is configured to:
comparing the average speed V1 corresponding to the lane with the average speed V2 corresponding to the vehicle;
and if V2 < K1 × V1, the vehicle corresponding to the average speed V2 is taken as a slow-traveling vehicle, and slow-traveling vehicle information is acquired, wherein the first proportionality coefficient K1 is less than or equal to 1.
Optionally, the identification unit is configured to:
comparing the average speed V1 corresponding to the lane with the average speed V2 corresponding to the vehicle;
and if V2 > K2 × V1, the vehicle corresponding to the average speed V2 is taken as a fast-traveling vehicle, and fast-traveling vehicle information is acquired, wherein the second proportionality coefficient K2 is greater than or equal to 1.
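The two unit checks above are a pair of one-line comparisons. In the sketch below, the K1 and K2 defaults are illustrative; the description only bounds them by K1 ≤ 1 ≤ K2:

```python
def classify_vehicle(v2: float, v1: float,
                     k1: float = 0.6, k2: float = 1.5) -> str:
    """Classify a vehicle as 'slow', 'fast', or 'normal'.

    v2: the vehicle's average speed; v1: the lane average speed.
    Slow test: V2 < K1 * V1 (K1 <= 1).
    Fast test: V2 > K2 * V1 (K2 >= 1).
    """
    if v2 < k1 * v1:
        return "slow"
    if v2 > k2 * v1:
        return "fast"
    return "normal"
```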
Optionally, the identification module 44 further includes:
an acquisition unit for acquiring speed limit information V3 of a lane;
the identification unit is further configured to identify fast-traveling vehicle information on the lane according to the speed limit information V3, the average speed V1 corresponding to the lane, and the average speed V2 corresponding to the vehicle.
Optionally, identifying the fast-traveling vehicle information on the lane according to the speed limit information V3, the average speed V1 corresponding to the lane, and the average speed V2 corresponding to the vehicle includes:
when K3 × V3 ≥ K2 × V1, comparing the average speed V1 corresponding to the lane with the average speed V2 corresponding to the vehicle, and if V2 > K2 × V1, taking the vehicle corresponding to the average speed V2 as a fast-traveling vehicle and acquiring fast-traveling vehicle information;
and when K3 × V3 < K2 × V1, comparing the average speed V2 corresponding to each vehicle with the speed-limit-based threshold K3 × V3, and if V2 > K3 × V3, taking the vehicle corresponding to the average speed V2 as a fast-traveling vehicle and acquiring fast-traveling vehicle information, wherein the second proportionality coefficient K2 and the third proportionality coefficient K3 are both greater than or equal to 1.
For the specific definition of the abnormal driving behavior recognition device, reference may be made to the above definition of the abnormal driving behavior recognition method, which is not repeated here. Each module of the above abnormal driving behavior recognition device may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in or independent of a processor in a computer device in hardware form, or stored in a memory in software form, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 5. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a method of identifying abnormal driving behavior.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
time synchronization and space synchronization are carried out on the roadside laser radar, the roadside millimeter wave radar and the roadside camera;
After time synchronization and space synchronization are completed, acquiring first radar information from the roadside laser radar, acquiring second radar information from the roadside millimeter wave radar, and acquiring video information from a roadside camera;
performing, using an adaptive fusion algorithm selected according to environment information, a fusion operation on the first radar information, the second radar information, and the video information to obtain target object running condition information;
and identifying abnormal driving behaviors in the target object according to the target object running condition information.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing relevant hardware; the computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the above method embodiments. Any reference to memory, storage, databases, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), direct Rambus RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated; in practical applications, the above functions may be allocated to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A method for identifying abnormal driving behavior, the method comprising:
time synchronization and space synchronization are carried out on the roadside laser radar, the roadside millimeter wave radar and the roadside camera;
After time synchronization and space synchronization are completed, first radar information is obtained from the roadside laser radar, second radar information is obtained from the roadside millimeter wave radar, and video information is obtained from a roadside camera;
performing, using an adaptive fusion algorithm selected according to environment information, a fusion operation on the first radar information, the second radar information, and the video information to obtain target object running condition information;
and identifying abnormal driving behaviors in the target object according to the target object running condition information.
2. The method for identifying abnormal driving behavior according to claim 1, wherein the environmental information includes illumination information and weather information;
the performing, using an adaptive fusion algorithm selected according to environment information, a fusion operation on the first radar information, the second radar information, and the video information comprises:
when the illumination information indicates good illumination and the weather information indicates no rain, fog, or snow, adopting a first fusion algorithm, the first fusion algorithm being used for performing the fusion operation on the first radar information, the second radar information, and the video information;
when the illumination information indicates good illumination and the weather information indicates rain, fog, or snow, adopting a second fusion algorithm, the second fusion algorithm being used for performing the fusion operation on the video information and the second radar information;
when the illumination information indicates poor illumination and the weather information indicates no rain, fog, or snow, adopting a third fusion algorithm, the third fusion algorithm being used for performing the fusion operation on the first radar information and the second radar information;
and when the illumination information indicates poor illumination and the weather information indicates rain, fog, or snow, adopting a fourth fusion algorithm, the fourth fusion algorithm being used for performing the fusion operation on the second radar information.
3. The abnormal driving behavior recognition method according to claim 1, wherein the target object running condition information includes at least a vehicle on each lane and its speed;
the step of identifying abnormal driving behaviors in the target object according to the target object running condition information comprises the following steps:
traversing each lane, and calculating the average speed corresponding to the lane according to the speed of the vehicle on the lane;
calculating the average speed corresponding to each vehicle;
and identifying slow/fast vehicle information on the lane according to the average speed corresponding to the lane and the average speed corresponding to the vehicle.
4. The method for identifying abnormal driving behavior according to claim 3, wherein the identifying slow-moving vehicle information on the lane according to the average speed corresponding to the lane and the average speed corresponding to the vehicle comprises:
Comparing the average speed V1 corresponding to the lane with the average speed V2 corresponding to the vehicle;
and if V2 < K1 × V1, taking the vehicle corresponding to the average speed V2 as a slow-traveling vehicle and acquiring slow-traveling vehicle information, wherein the first proportionality coefficient K1 is less than or equal to 1.
5. The method for identifying abnormal driving behavior according to claim 3, wherein the identifying the fast-moving vehicle information on the lane according to the average speed corresponding to the lane and the average speed corresponding to the vehicle comprises:
comparing the average speed V1 corresponding to the lane with the average speed V2 corresponding to the vehicle;
and if V2 > K2 × V1, taking the vehicle corresponding to the average speed V2 as a fast-traveling vehicle and acquiring fast-traveling vehicle information, wherein the second proportionality coefficient K2 is greater than or equal to 1.
6. The method for identifying abnormal driving behavior according to claim 3, wherein the identifying abnormal driving behavior in an object based on the object operating condition information further comprises:
acquiring speed limit information V3 of a lane;
and identifying fast-traveling vehicle information on the lane according to the speed limit information V3, the average speed V1 corresponding to the lane, and the average speed V2 corresponding to the vehicle.
7. The abnormal driving behavior recognition method according to claim 6, wherein the identifying fast-traveling vehicle information on the lane according to the speed limit information V3, the average speed V1 corresponding to the lane, and the average speed V2 corresponding to the vehicle comprises:
when K3 × V3 ≥ K2 × V1, comparing the average speed V1 corresponding to the lane with the average speed V2 corresponding to the vehicle, and if V2 > K2 × V1, taking the vehicle corresponding to the average speed V2 as a fast-traveling vehicle and acquiring fast-traveling vehicle information;
and when K3 × V3 < K2 × V1, comparing the average speed V2 corresponding to each vehicle with the speed-limit-based threshold K3 × V3, and if V2 > K3 × V3, taking the vehicle corresponding to the average speed V2 as a fast-traveling vehicle and acquiring fast-traveling vehicle information, wherein the second proportionality coefficient K2 and the third proportionality coefficient K3 are both greater than or equal to 1.
8. An apparatus for recognizing an abnormal traveling behavior, characterized by comprising:
the synchronization module is used for carrying out time synchronization and space synchronization on the roadside laser radar, the roadside millimeter wave radar and the roadside camera;
the acquisition module is used for acquiring first radar information from the roadside laser radar, second radar information from the roadside millimeter wave radar and video information from a roadside camera after time synchronization and space synchronization are completed;
the fusion module is used for carrying out fusion operation on the first radar information, the second radar information and the video information by adopting a self-adaptive fusion algorithm according to an environment information self-adaptive fusion algorithm to obtain target object running state information;
And the identification module is used for identifying abnormal driving behaviors in the target object according to the target object running state information.
9. A computer-readable storage medium, in which a computer program is stored, which computer program, when being executed by a processor, carries out a method of identifying abnormal driving behavior according to any one of claims 1 to 7.
10. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method of identifying abnormal driving behavior according to any one of claims 1 to 7 when executing the computer program.
CN202210192029.9A 2022-02-28 2022-02-28 Method, device, equipment and medium for identifying abnormal driving behaviors Pending CN114670852A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210192029.9A CN114670852A (en) 2022-02-28 2022-02-28 Method, device, equipment and medium for identifying abnormal driving behaviors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210192029.9A CN114670852A (en) 2022-02-28 2022-02-28 Method, device, equipment and medium for identifying abnormal driving behaviors

Publications (1)

Publication Number Publication Date
CN114670852A true CN114670852A (en) 2022-06-28

Family

ID=82072268

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210192029.9A Pending CN114670852A (en) 2022-02-28 2022-02-28 Method, device, equipment and medium for identifying abnormal driving behaviors

Country Status (1)

Country Link
CN (1) CN114670852A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114879177A (en) * 2022-07-11 2022-08-09 浙江大华技术股份有限公司 Target analysis method and device based on radar information
CN114879177B (en) * 2022-07-11 2022-10-28 浙江大华技术股份有限公司 Target analysis method and device based on radar information
CN115527364A (en) * 2022-08-25 2022-12-27 西安电子科技大学广州研究院 Traffic accident tracing method and system based on radar vision data fusion
CN115527364B (en) * 2022-08-25 2023-11-21 西安电子科技大学广州研究院 Traffic accident tracing method and system based on radar data fusion
CN115909728A (en) * 2022-11-02 2023-04-04 智道网联科技(北京)有限公司 Road side sensing method and device, electronic equipment and storage medium
CN116381674A (en) * 2023-06-02 2023-07-04 陕西欧卡电子智能科技有限公司 Fusion method of water surface laser radar point cloud and millimeter wave Lei Dadian cloud
CN116381674B (en) * 2023-06-02 2023-08-22 陕西欧卡电子智能科技有限公司 Fusion method of water surface laser radar point cloud and millimeter wave Lei Dadian cloud

Similar Documents

Publication Publication Date Title
CN114670852A (en) Method, device, equipment and medium for identifying abnormal driving behaviors
CN109920246B (en) Collaborative local path planning method based on V2X communication and binocular vision
CN113379805B (en) Multi-information resource fusion processing method for traffic nodes
CN105793669B (en) Vehicle position estimation system, device, method, and camera device
CN113223317B (en) Method, device and equipment for updating map
CN111108538B (en) System for generating and/or updating digital models of digital maps
JP2021099793A (en) Intelligent traffic control system and control method for the same
EP3657464A1 (en) Control device, control method, and program
EP3674161A1 (en) A failure detection device for an external sensor and a failure detection method for an external sensor
EP3895950A1 (en) Methods and systems for automated driving system monitoring and management
CN111753639A (en) Perception map generation method and device, computer equipment and storage medium
CN112835030A (en) Data fusion method and device for obstacle target and intelligent automobile
CN114091626B (en) True value detection method, device, equipment and storage medium
EP3896639A1 (en) Methods and systems for managing an automated driving system of a vehicle
CN111753901A (en) Data fusion method, device and system and computer equipment
US20200256682A1 (en) Method and device
CN115953748A (en) Multi-sensor fusion sensing method, system, device and medium for Internet of vehicles
CN116524311A (en) Road side perception data processing method and system, storage medium and electronic equipment thereof
CN114528040A (en) Environment self-adaption method, device, medium and roadside perception and calculation system
CN111681430B (en) Method for predicting number of stop lines of signal lamp intersection in future in real time
CN114821531A (en) Lane line recognition image display system based on electronic outside rear-view mirror ADAS
WO2023145738A1 (en) Map update system, vehicle-mounted device, and management server
CN110874549A (en) Target visual field determining method, system, computer device and storage medium
CN117496483B (en) Night image recognition method and system
US11804131B2 (en) Communication system for determining vehicle context and intent of a target vehicle based on perceived lane of travel

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination