CN117382644B - Distraction driving detection method, computer device, storage medium and intelligent device - Google Patents


Info

Publication number
CN117382644B
CN117382644B (application CN202311684597.1A)
Authority
CN
China
Prior art keywords
target
collision
vehicle
duration
risk
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311684597.1A
Other languages
Chinese (zh)
Other versions
CN117382644A (en)
Inventor
Zhou Hang (周航)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Weilai Zhijia Technology Co Ltd
Original Assignee
Anhui Weilai Zhijia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Weilai Zhijia Technology Co Ltd
Priority to CN202311684597.1A
Publication of CN117382644A
Application granted
Publication of CN117382644B


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W30/0953 Predicting travel path or likelihood of collision, the prediction being responsive to vehicle dynamic parameters
    • B60W30/0956 Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0097 Predicting future conditions
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2520/00 Input parameters relating to overall vehicle dynamics
    • B60W2520/06 Direction of travel
    • B60W2520/10 Longitudinal speed
    • B60W2520/12 Lateral speed
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/225 Direction of gaze
    • B60W2540/229 Attention level, e.g. attentive to driving, reading or sleeping
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
    • B60W2554/404 Characteristics
    • B60W2554/4041 Position
    • B60W2554/4042 Longitudinal speed
    • B60W2554/4043 Lateral speed
    • B60W2554/4044 Direction of movement, e.g. backwards

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application relates to the technical field of vehicles, and in particular provides a distraction driving detection method, a computer device, a storage medium and an intelligent device, aiming to solve the problem of how to improve the reliability of distraction driving detection. To this end, the method provided by the application comprises: according to the motion information of the vehicle and of targets outside the vehicle, obtaining an external target whose trajectory overlaps that of the vehicle within a preset future duration as a candidate target; obtaining, from the candidate targets, a risk target having a collision risk with the vehicle; obtaining the gazing state of the vehicle's driver on the risk target; and judging, according to the gazing state, whether the driver is in a distracted driving state. With this method, no false detection occurs when the road has no targets or no risk targets; and when there are many vehicles or many risk vehicles on the road, whether the driver is in a distracted driving state can be judged accurately from the gazing state on any risk target, giving higher sensitivity and reliability.

Description

Distraction driving detection method, computer device, storage medium and intelligent device
Technical Field
The application relates to the technical field of vehicles, and in particular to a distraction driving detection method, a computer device, a storage medium and an intelligent device.
Background
To ensure safe driving of a vehicle, it is common to detect whether the driver is in a distracted driving state and, when distraction is detected, to issue a timely alarm to remind the driver. The conventional detection method mainly divides the vehicle's field of view into distraction areas (such as the in-vehicle area, the rearview mirror and the like) and non-distraction areas (such as the front window area, the left rearview mirror and the like), identifies at each moment whether the driver's line of sight falls in a distraction area or a non-distraction area, obtains the proportion of time the line of sight stays in the distraction areas over a period, judges from this proportion whether the driver is in a distracted driving state, and may further grade the degree of distraction (such as light, medium and heavy) according to the same proportion.
However, this method only makes a binary judgment on whether the driver's line of sight falls in a distraction area or a non-distraction area, which is inflexible and ill-suited to increasingly complex driving conditions. For example, in some driving situations the road has no vehicles or no risk vehicles, yet the driver is still judged distracted and warned whenever the line of sight satisfies the above conditions, which can adversely affect the driver's safe operation of the vehicle. For another example, when there are many vehicles or many risk vehicles on the road, distraction must be detected accurately and in time, but the above rule is rigid: an alarm is issued only once the line-of-sight proportion meets the conditions, so the alarm is likely to be delayed, affecting the safe driving of the vehicle and possibly even causing a traffic accident.
Accordingly, there is a need in the art for a new solution to the above-mentioned problems.
Disclosure of Invention
In order to overcome the above-mentioned drawbacks, the present application provides a distraction driving detection method, a computer device, a storage medium and an intelligent device, which solve or at least partially solve the technical problem of how to improve the reliability of distraction driving detection.
In a first aspect, there is provided a method of detecting distraction driving, the method comprising:
according to the motion information of the vehicle and the motion information of an external target, acquiring, as a candidate target, the external target whose trajectory overlaps that of the vehicle within a preset future duration;
acquiring a risk target with collision risk with the vehicle according to the candidate target;
acquiring the gazing state of a driver of the vehicle on the risk target;
and judging whether the driver is in a distraction driving state according to the gazing state.
In one technical solution of the above distraction driving detection method, the acquiring, according to the candidate target, a risk target having a collision risk with the vehicle includes:
acquiring the overlapping moment when the candidate target overlaps with the vehicle track;
acquiring the remaining collision duration of the candidate target at the current moment according to the duration from the current moment to the overlapping moment;
and judging, according to the remaining collision duration, whether the candidate target is to be used as a risk target.
In one technical solution of the above distraction driving detection method, the acquiring the remaining collision duration of the candidate target at the current moment includes:
taking the duration from the current moment to the overlapping moment as a first duration;
acquiring the longitudinal overlap distance between the candidate target and the vehicle at the overlapping moment and the longitudinal speed difference at the overlapping moment;
acquiring the ratio of the longitudinal overlap distance to the longitudinal speed difference as a second duration;
and acquiring the remaining collision duration according to the difference between the first duration and the second duration.
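Written out, the steps above compute the following quantity (a reconstruction from the claim wording; the symbols below are introduced here for illustration and do not appear in the patent):

```latex
t_{\mathrm{rem}}
  = \underbrace{\left(t_{\mathrm{ov}} - t_{\mathrm{now}}\right)}_{\text{first duration}}
  \;-\; \underbrace{\frac{d_{\mathrm{ov}}}{\Delta v}}_{\text{second duration}}
```

where \(t_{\mathrm{now}}\) is the current moment, \(t_{\mathrm{ov}}\) the overlapping moment, \(d_{\mathrm{ov}}\) the longitudinal overlap distance and \(\Delta v\) the longitudinal speed difference at that moment; one plausible reading is that the second duration discounts the time spent traversing the longitudinal overlap itself.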
In one technical solution of the above distraction driving detection method, the judging, according to the remaining collision duration, whether to use the candidate target as a risk target includes:
judging whether the remaining collision duration satisfies a first preset condition;
if yes, taking the candidate target as a risk target;
and if not, not taking the candidate target as a risk target.
In one technical solution of the above distraction driving detection method, the judging, according to the remaining collision duration, whether to use the candidate target as a risk target further includes:
acquiring the remaining collision durations of the candidate target at a plurality of consecutive moments before and/or after the current moment;
acquiring the change trend of the remaining collision duration according to the remaining collision durations of the candidate target at the current moment and at the plurality of consecutive moments before and/or after it;
judging whether the remaining collision durations at the current moment and at the plurality of consecutive moments before and/or after it satisfy the first preset condition, and whether the change trend satisfies a second preset condition;
if yes, taking the candidate target as a risk target;
and if not, not taking the candidate target as a risk target.
In one technical solution of the above distraction driving detection method, the method further includes judging whether the remaining collision duration satisfies the first preset condition by:
judging whether the remaining collision duration is less than or equal to a preset duration threshold;
if yes, the remaining collision duration satisfies the first preset condition;
if not, the remaining collision duration does not satisfy the first preset condition.
In one technical solution of the above distraction driving detection method, the method further includes judging whether the change trend of the remaining collision duration satisfies the second preset condition by:
judging whether the change trend is a decreasing trend in which the difference between the remaining collision durations at every two adjacent moments is larger than the time difference between those two moments;
if yes, the change trend satisfies the second preset condition;
if not, the change trend does not satisfy the second preset condition.
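As a minimal sketch, the second preset condition described above could be checked as follows (function and parameter names are assumptions for illustration, not from the patent):

```python
def trend_satisfies_condition(durations, timestamps):
    """Check whether remaining collision durations form a decreasing
    trend in which each drop between adjacent moments exceeds the
    elapsed time between those moments."""
    for (d0, t0), (d1, t1) in zip(
            zip(durations, timestamps),
            zip(durations[1:], timestamps[1:])):
        # drop in remaining duration must exceed the sampling interval
        if d0 - d1 <= t1 - t0:
            return False
    return True
```

Requiring each drop to exceed the sampling interval means the remaining duration shrinks faster than real time passes, i.e. the target is actively closing in rather than the countdown merely ticking along with the clock.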
In one technical solution of the above distraction driving detection method, the acquiring the remaining collision durations of the candidate target at a plurality of consecutive moments before and/or after the current moment includes:
acquiring azimuth information of the candidate target relative to the vehicle;
determining the error level of the candidate target's motion information according to the azimuth information;
acquiring the preset number of remaining collision durations matched with the error level;
and acquiring, according to that number, the remaining collision durations of the candidate target at the plurality of consecutive moments before and/or after the current moment.
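A sketch of this matching step, assuming a concrete azimuth-to-error-level mapping (the angle bands and sample counts below are illustrative assumptions; the patent does not specify them):

```python
def sample_count_for_azimuth(azimuth_deg):
    """Map the candidate target's azimuth relative to the vehicle to an
    error level, then to the number of remaining-collision-duration
    samples required (all thresholds here are illustrative)."""
    a = abs(azimuth_deg) % 360
    if a > 180:
        a = 360 - a          # fold onto [0, 180] degrees off the nose
    if a <= 30:              # roughly ahead: low sensing error
        return 3
    elif a <= 90:            # oblique: medium sensing error
        return 5
    else:                    # side/rear: high sensing error
        return 8
```

The intuition is that targets sensed less accurately (e.g. toward the side or rear) need more consecutive samples before their remaining collision durations are trusted.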
In one technical solution of the above distraction driving detection method, the judging whether the driver is in a distracted driving state according to the gazing state includes:
if the driver is not gazing at the risk target, the driver is in a distracted driving state;
if the driver is gazing at the risk target, the driver is not in a distracted driving state.
In one technical solution of the above distraction driving detection method, the method further includes: outputting alarm information when the driver is judged to be in the distracted driving state;
wherein the alarm information includes the orientation information of the risk target relative to the vehicle.
In a second aspect, a computer device is provided, the computer device comprising a processor and a storage means, the storage means being adapted to store a plurality of program codes, the program codes being adapted to be loaded and run by the processor to perform the method according to any one of the above solutions of the method for detecting distraction driving.
In a third aspect, a computer-readable storage medium is provided, in which a plurality of program codes are stored, the program codes being adapted to be loaded and run by a processor to perform the method according to any one of the above solutions of the distraction driving detection method.
In a fourth aspect, a smart device is provided, the smart device comprising at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program that when executed by the at least one processor implements the method of any one of the above-described solutions for the distraction driving detection method.
The above technical solution has at least one or more of the following beneficial effects:
in the technical solution implementing the above distraction driving detection method, according to the motion information of the vehicle and of external targets (such as other vehicles), the external targets whose trajectories overlap that of the vehicle within a preset future duration can be obtained as candidate targets; a risk target having a collision risk with the vehicle is obtained from the candidate targets; and the gazing state of the vehicle's driver on the risk target is obtained to judge whether the driver is in a distracted driving state. On this basis, it is possible to distinguish which targets outside the vehicle are risk targets and which are not, and to judge distraction only from the driver's gazing state on the risk targets, without considering non-risk targets. Thus, when the road has no targets or no risk targets, the false detection of the prior art does not occur and the driver's safe driving is not disturbed. When there are many vehicles or many risk vehicles on the road, whether the driver is in a distracted driving state can be judged accurately from the gazing state on any risk target, with higher sensitivity and reliability.
Drawings
The disclosure of the present application will become more readily understood with reference to the accompanying drawings. As will be readily appreciated by those skilled in the art: these drawings are for illustrative purposes only and are not intended to limit the scope of the present application. Wherein:
FIG. 1 is a flow chart of the main steps of a method of detecting distraction driving according to one embodiment of the present application;
FIG. 2 is a flow chart of the main steps of a method of acquiring a risk objective according to one embodiment of the present application;
FIG. 3 is a flow chart of the main steps of a method for detecting distraction driving according to another embodiment of the present application;
fig. 4 is a schematic diagram of the main structure of a computer device according to one embodiment of the present application.
Fig. 5 is a schematic diagram of the main structure of a smart device according to an embodiment of the present application.
List of reference numerals:
11: a storage device; 12: a processor; 21: a memory; 22: a processor.
Detailed Description
Some embodiments of the present application are described below with reference to the accompanying drawings. It should be understood by those skilled in the art that these embodiments are merely for explaining the technical principles of the present application, and are not intended to limit the scope of the present application.
In the description of the present application, a "processor" may include hardware, software, or a combination of both. The processor may be a central processor, a microprocessor, an image processor, a digital signal processor, or any other suitable processor. The processor has data and/or signal processing functions. The processor may be implemented in software, hardware, or a combination of both. The computer readable storage medium includes any suitable medium that can store program code, such as magnetic disks, hard disks, optical disks, flash memory, read-only memory, random access memory, and the like.
The personal information of relevant users that may be involved in the embodiments of this application is personal information that is strictly required by laws and regulations, is actively provided by the user in the course of using the product/service or is generated by that use, and is obtained with the user's authorization, in accordance with the principles of legality, legitimacy and necessity and for reasonable purposes of the business scenario.
The personal information of the user processed by this application may differ according to the specific product/service scenario; depending on the scenario in which the user uses the product/service, it may involve the user's account information, device information, driving information, vehicle information or other related information. This application treats the user's personal information and its processing with a high degree of diligence.
This application attaches great importance to the security of users' personal information and adopts reasonable and feasible protective measures that meet industry standards to protect user information and prevent personal information from unauthorized access, disclosure, use, modification, damage or loss.
Embodiments of the method for detecting distraction driving provided in the present application are described below.
Referring to fig. 1, fig. 1 is a schematic flow chart of main steps of a method for detecting distraction driving according to an embodiment of the present application. As shown in fig. 1, the method for detecting the distraction driving in the embodiment of the present application mainly includes the following steps S101 to S104.
Step S101: and acquiring the external target overlapping with the existing track of the vehicle in the future preset time length as a candidate target according to the motion information of the vehicle and the motion information of the external target.
External targets include, but are not limited to, motor vehicles, non-motor vehicles, pedestrians, cyclists and the like.
The motion information of the vehicle and of an external target includes, but is not limited to, the direction, speed and position of motion. From this information, the motion trajectories of the vehicle and of the external target can be predicted, and whether the two will collide can be judged from those trajectories: if a collision would occur, the external target has a collision risk with the vehicle and needs to be taken as a risk target; otherwise it does not.
Specifically, a first motion trajectory of the vehicle within the preset future duration can be predicted from the vehicle's motion information, and a second motion trajectory of each external target within the same duration can be predicted from that target's motion information; it is then judged whether the first and second motion trajectories overlap, the overlapping second motion trajectories are screened out, and the external targets to which they belong are taken as candidate targets. If an external target's trajectory overlaps the vehicle's, the two may collide at the overlap position, so that external target has a higher collision risk.
In addition, when predicting a motion trajectory, the trajectory points at each moment within the preset future duration can be predicted in sequence and connected to form the trajectory, so the first and second motion trajectories actually consist of a number of trajectory points. When judging whether the two trajectories overlap, it can be judged in sequence whether the trajectory points at each moment overlap; once an overlapping trajectory point is detected, the trajectories can be judged to overlap.
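The two paragraphs above can be sketched as follows, assuming a constant-velocity rollout and an overlap test on track points at matching times (the 2 m overlap radius and all identifiers are illustrative assumptions, not from the patent):

```python
import math

def predict_track(pos, vel, horizon_s, dt_s):
    """Constant-velocity prediction: one (x, y) track point per step."""
    steps = int(horizon_s / dt_s)
    return [(pos[0] + vel[0] * k * dt_s, pos[1] + vel[1] * k * dt_s)
            for k in range(1, steps + 1)]

def first_overlap_time(ego_track, target_track, dt_s, radius_m=2.0):
    """Walk both tracks in lockstep and return the first moment at
    which the two track points come within radius_m, else None."""
    for k, (e, t) in enumerate(zip(ego_track, target_track), start=1):
        if math.hypot(e[0] - t[0], e[1] - t[1]) <= radius_m:
            return k * dt_s
    return None
```

A target whose `first_overlap_time` within the preset horizon is not `None` would be kept as a candidate target, and that moment corresponds to the overlapping moment used in the later steps.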
Step S102: and acquiring a risk target with collision risk with the vehicle according to the candidate target. In this embodiment, the candidate target may be used as a risk target, or the candidate target may be further screened, and the screened candidate target may be used as a risk target.
Step S103: a state of gaze of a driver of the vehicle on the risk target is obtained.
The gazing state may include gazing at the risk target and not gazing at the risk target.
In this embodiment, the position at which the driver's line of sight is directed may be obtained; if that position coincides with the risk target, the driver is gazing at the risk target, otherwise the driver is not.
Step S104: and judging whether the driver is in a distraction driving state according to the gazing state. Based on the gaze status, it may be determined whether the driver is looking at the risk target, thereby determining whether the driver is in a distracted driving state.
In some embodiments, if it is determined from the gazing state that the driver is not gazing at the risk target, the driver is judged to be in a distracted driving state; if the driver is gazing at the risk target, the driver is judged not to be distracted. On this basis, whether the driver is in a distracted driving state can be judged conveniently and accurately.
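A minimal sketch of this judgment (identifiers are assumptions for illustration):

```python
def is_distracted(gazed_target_id, risk_target_ids):
    """Driver is distracted iff at least one risk target exists and the
    gazed-at target is not one of them (None means no gaze fix)."""
    if not risk_target_ids:
        return False          # no risk target: never flag distraction
    return gazed_target_id not in risk_target_ids
```

Note that returning `False` when there is no risk target mirrors the stated benefit that no false detection occurs on an empty or risk-free road.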
In some embodiments, alarm information is output when the driver is judged to be in a distracted driving state, and the alarm information may include the orientation of the risk target relative to the vehicle. With this information the driver can, while refocusing, quickly locate the risk target, take countermeasures in time and ensure the running safety of the vehicle. Further, in some embodiments, while the alarm information is being output, whether the driver has started gazing at the risk target may be monitored; if so, the output of the alarm information may be stopped, otherwise the alarm continues until the driver gazes at the risk target.
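The monitor-until-refocus behaviour described here could look like the following (callback names and the iteration cap are illustrative assumptions):

```python
def run_alarm(is_gazing_at_risk, emit_alarm, max_cycles=100):
    """Emit the alarm each cycle until the driver starts gazing at the
    risk target; returns True if the driver refocused in time."""
    for _ in range(max_cycles):
        if is_gazing_at_risk():
            return True       # driver refocused: stop alarming
        emit_alarm()
    return False              # still distracted after max_cycles
```

In a real system the loop would be driven by the gaze-tracking sample rate rather than a fixed cycle count.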
Based on the methods described in the above steps S101 to S104, it is possible to distinguish which targets outside the vehicle are risk targets and which are not, and to judge whether the driver is in a distracted driving state only from the driver's gazing state on the risk targets, without considering non-risk targets. Thus, when the road has no targets or no risk targets, no false detection occurs and the driver's safe driving is not disturbed. When there are many vehicles or many risk vehicles on the road, whether the driver is in a distracted driving state can be judged accurately from the gazing state on any risk target, with high sensitivity and reliability.
In an application scenario according to the embodiment of the present application, a device capable of executing the distraction driving detection method of the embodiment may be configured on a vehicle. While the user drives the vehicle, the device monitors in real time whether the driver is driving distractedly; if so, it outputs alarm information to remind the driver until the driver concentrates and is no longer distracted, thereby ensuring the running safety of the vehicle.
Step S102 is further described below.
In some embodiments of the above step S102, the risk target may be acquired from the candidate target through the following steps S201 to S203 shown in fig. 2.
Step S201: and acquiring the overlapping moment when the candidate target overlaps with the track of the vehicle.
As described in the foregoing step S101, when judging whether the first and second motion trajectories overlap, the trajectory points at each moment may be checked in turn, and the moment at which overlapping trajectory points appear taken as the overlapping moment. In addition, to err on the side of driving safety, the moment at which the trajectories of the candidate target and the vehicle first overlap may be acquired, and the subsequent steps executed with respect to this first overlapping moment.
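The search for the first overlapping moment over two discretely sampled trajectories can be sketched as follows. This is a minimal illustration, assuming each trajectory is a list of (t, x, y) points sampled at the same moments, with a simple center-distance threshold standing in for the footprint-overlap test of step S101; the function name and threshold value are illustrative, not the patent's.

```python
def first_overlap_time(ego_track, target_track, overlap_dist=2.0):
    """Return the first moment at which the two tracks overlap, else None.

    Both tracks are lists of (t, x, y) points sampled at the same moments.
    Overlap is approximated by a center-distance threshold; a real system
    would compare the two vehicles' footprints instead.
    """
    for (t_e, x_e, y_e), (_, x_t, y_t) in zip(ego_track, target_track):
        dist = ((x_e - x_t) ** 2 + (y_e - y_t) ** 2) ** 0.5
        if dist <= overlap_dist:
            return t_e  # earliest overlapping trajectory point
    return None
```

Returning on the first hit implements the "first overlapping moment" rule described above.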
Step S202: and acquiring the collision residual duration of the candidate target at the current time according to the duration from the current time to the overlapping time.
In this embodiment, the above duration may be used directly as the collision remaining duration, or it may first be shortened and the shortened duration used as the collision remaining duration.
Step S203: and judging whether the candidate target is to be taken as a risk target according to the collision remaining duration. The larger the collision remaining duration, the more time there is to avoid a collision and the lower the risk of the candidate target colliding with the vehicle; conversely, the smaller the collision remaining duration, the higher the collision risk. Therefore, the degree of collision risk between the candidate target and the vehicle can be determined from the size of the collision remaining duration, and from that it can be judged whether the candidate target should be taken as a risk target. For example, if the risk degree is high, the candidate target is taken as a risk target; otherwise it is not.
Based on the methods described in the above steps S201 to S203, after the candidate target is obtained, the risk degree may be determined according to the remaining duration of the collision of the candidate target, so as to accurately obtain the risk target.
The above-described step S202 and step S203 are further described below.
Step S202 is further described.
In some embodiments of step S202 described above, the remaining duration of collision of the candidate target at the current time may be acquired through the following steps S2021 to S2024.
Step S2021: and taking the time length from the current time to the overlapped time as a first time length.
Step S2022: and acquiring the longitudinal overlapping distance between the candidate object and the vehicle at the overlapping moment and the longitudinal speed difference between the candidate object and the vehicle at the overlapping moment.
The longitudinal direction refers to the forward running direction of the vehicle.
As described in the foregoing step S201, when judging whether the first and second motion trajectories overlap, the moment of the overlapping trajectory point can be obtained as the overlapping moment. However, the first and second motion trajectories consist of discrete trajectory points separated by a fixed interval, such as 0.1 s. The trajectories of the vehicle and the off-vehicle target may therefore already have begun to overlap before the detected overlapping trajectory point, so that by the time they reach that point a longitudinal overlap distance has accumulated between them; this overlap distance is what is acquired in this step.
The longitudinal speed difference is the difference between the longitudinal speed of the candidate object at the overlap time and the longitudinal speed of the vehicle at the overlap time.
Step S2023: and acquiring the ratio of the longitudinal overlap distance to the longitudinal speed difference as a second duration. The longitudinal speed of the candidate target may be greater or smaller than that of the vehicle, so the longitudinal speed difference may be positive or negative. Therefore, after computing the ratio of the longitudinal overlap distance to the longitudinal speed difference, its absolute value may be taken as the second duration.
Step S2024: and acquiring the residual collision duration according to the difference value between the first duration and the second duration.
As explained in step S2022, the trajectories of the vehicle and the off-vehicle target may already have overlapped before the detected overlapping trajectory point, so the overlapping moment obtained from that point cannot exactly represent the moment at which the collision would occur. The longitudinal overlap distance represents the distance by which the two bodies would already have interpenetrated after the true collision moment, so the second duration computed from it represents how long the overlap has been accumulating. Subtracting the second duration from the first duration therefore gives an accurate duration from the current moment to the moment the vehicle would collide with the off-vehicle target, i.e., the collision remaining duration.
Based on the methods described in the steps S2021 to S2024, a more accurate remaining duration of collision can be obtained, accuracy of the risk target is improved, and running safety of the vehicle is ensured.
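Steps S2021 to S2024 condense into a single formula, ttc = (t_overlap − t_now) − |d_overlap / Δv|. A minimal sketch follows; the names are mine, units are assumed consistent, and the longitudinal speed difference is assumed nonzero at the overlapping moment.

```python
def collision_remaining_duration(t_now, t_overlap, lon_overlap_dist, lon_speed_diff):
    """Refined time-to-collision following steps S2021-S2024.

    first duration  : time from the current moment to the overlapping moment
    second duration : |overlap distance / longitudinal speed difference|,
                      i.e. how long the overlap had already been accumulating
    remaining ttc   : first duration minus second duration
    """
    first_duration = t_overlap - t_now
    second_duration = abs(lon_overlap_dist / lon_speed_diff)
    return first_duration - second_duration
```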
Step S203 will be further described.
In some embodiments of step S203 described above, it may be determined whether or not the candidate target is to be a risk target by the following steps 11 to 13.
Step 11: judging whether the residual collision duration meets a first preset condition or not;
if yes, go to step 12; if not, go to step 13.
In some embodiments, it may be judged whether the collision remaining duration is less than or equal to a preset duration threshold. If yes, the collision remaining duration satisfies the first preset condition, i.e., the collision remaining duration is relatively short; if not, it does not satisfy the first preset condition, i.e., the collision remaining duration is relatively long. The value of the preset duration threshold can be set flexibly by those skilled in the art according to actual requirements and is not specifically limited by this embodiment. For example, in some preferred embodiments, the preset duration threshold may be 3 s.
Step 12: and taking the candidate target as a risk target.
Step 13: candidate targets are not taken as risk targets.
Based on the methods described in the above steps 11 to 13, the risk targets may be rapidly and accurately screened by performing condition matching on the remaining duration of the collision.
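The first preset condition reduces to a single comparison; a minimal sketch using the 3 s example threshold mentioned above (the constant and function names are mine):

```python
PRESET_TTC_THRESHOLD_S = 3.0  # example value from the text; tunable in practice

def satisfies_first_condition(ttc_remaining, threshold=PRESET_TTC_THRESHOLD_S):
    """True when the collision remaining duration is short enough to flag risk."""
    return ttc_remaining <= threshold
```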
In other embodiments of step S203, it may be determined whether or not the candidate target is to be a risk target through the following steps 21 to 25.
Step 21: and acquiring the collision residual time length of the candidate target at a plurality of continuous moments before and/or after the current moment.
The foregoing step S202 describes how to acquire the collision remaining duration of a candidate target at the current moment; the same method may be used for the moments before and after the current moment. In this embodiment, each time the collision remaining duration of a candidate target is acquired, it may be stored, so that whenever collision remaining durations are needed they can be retrieved directly.
Step 22: and acquiring the change trend of the residual duration of the collision according to the residual duration of the candidate target at the current moment and a plurality of continuous moments before and/or after the current moment.
The change trend refers to how the candidate target's collision remaining duration changes over the period formed by the above moments. For example, the trend may be gradually decreasing, gradually increasing, fluctuating without regularity, and so on.
Step 23: judging whether the residual duration of collision at the current moment and a plurality of continuous moments before and/or after the current moment meets a first preset condition and the change trend meets a second preset condition.
If the first and second preset conditions are satisfied, the process goes to step 24;
if the first and second preset conditions cannot be satisfied at the same time, the process goes to step 25.
The method for determining whether the first preset condition is satisfied is the same as the related method in the foregoing steps 11 to 13, and will not be described herein.
And in the current moment and a plurality of continuous moments before and/or after the current moment, if the residual duration of the collision at each moment meets the first preset condition, indicating that the residual duration of the collision at each moment is relatively short. If the change trend of the residual duration of the collision meets a second preset condition, the change trend is indicated to be capable of representing that the candidate target and the vehicle have higher collision risk.
As described in the foregoing embodiments, acquiring the collision remaining duration requires motion information of the vehicle, the off-vehicle target, and so on, which is mainly obtained through on-vehicle devices such as sensors: for example, the vehicle speed is acquired through the vehicle's IMU, and off-vehicle targets are acquired by performing target recognition on images captured by the vehicle's camera. Because the motion information comes from such devices, a device error at some moment may corrupt the motion information, and the collision remaining duration computed from it at that moment may then deviate. Judging whether a candidate target is a risk target from the collision remaining duration at a single moment alone may therefore produce misjudgments. This embodiment compensates for that defect, and prevents misjudgment, by additionally considering the change trend of the collision remaining duration. If the collision remaining duration at every moment satisfies the first preset condition and the change trend also satisfies the second preset condition, the candidate target can be determined to carry a very high collision risk and can be taken as a risk target.
Step 24: and taking the candidate target as a risk target.
Step 25: candidate targets are not taken as risk targets.
Based on the methods described in the above steps 21 to 25, the value of the remaining duration of the collision at each moment and the trend of the remaining duration of the collision in a period of time can be combined at the same time to accurately determine whether the candidate target is a risk target, so as to prevent erroneous determination.
The above steps 21 and 23 are further described below.
1. Step 21 is described.
In some embodiments of step 21, the remaining duration of the collision of the candidate object at a plurality of consecutive moments before and/or after the current moment may be obtained through the following steps 211 to 214.
Step 211: and acquiring the azimuth information of the candidate target relative to the vehicle.
The azimuth information may include the direction of movement and the position of the candidate target relative to the vehicle.
The direction of movement may be the same direction, the opposite direction, lateral, and so on: same direction means traveling in the same direction as the vehicle, opposite direction means traveling toward the vehicle, and lateral means moving sideways across the front of the vehicle.
The location may include being in the same lane as the vehicle, being in an adjacent lane of the vehicle, and so on.
Step 212: and determining the error level of the motion information of the candidate target according to the azimuth information.
From the foregoing description of step 23, the motion information is mainly acquired through on-vehicle devices such as sensors, and the motion information obtained through these devices may carry different errors for off-vehicle targets in different orientations. In this embodiment, a correspondence between azimuth information and error levels may be preset; after the azimuth information is obtained in step 211, it is matched against this correspondence to obtain the error level.
In some embodiments, four types of azimuth information may be set, namely:
1. The off-vehicle target is in the same lane as the vehicle and travels in front of the vehicle in the same direction.
2. The off-vehicle target is in an adjacent lane and travels in front of the vehicle in the same direction.
3. The off-vehicle target is in the vehicle's lane or an adjacent lane but travels toward the vehicle.
4. The off-vehicle target is in an adjacent lane, in front of the vehicle, and moves laterally.
Of these four types of azimuth information, the 1st corresponds to the smallest error level, the 2nd to the next, and the 3rd and 4th to the largest.
Step 213: and acquiring the preset number of collision remaining durations matched to the error level. The higher the error level, the more collision remaining durations need to be obtained to ensure the accuracy of the change trend of the collision remaining duration. Based on this, a correspondence between error levels and numbers of collision remaining durations may be preset; after the error level is obtained in step 212, it is matched against this correspondence to obtain the number of collision remaining durations.
Taking the foregoing embodiment in step 212 as an example, the smallest, next, and largest error levels may correspond to the collision remaining durations at 5, 10, and 20 moments, respectively.
Step 214: and acquiring the collision residual duration of the candidate target at a plurality of continuous moments before and/or after the current moment according to the number.
Taking the acquisition of collision remaining durations at 10 moments as an example: in addition to the current moment, the collision remaining durations at 9 consecutive moments after the current moment may be acquired, or at 9 consecutive moments before it, or at several consecutive moments both before and after it, as long as 9 further moments are covered in total.
Based on the methods described in the above steps 211 to 214, a certain amount of residual duration of collision may be obtained in a targeted manner according to the azimuth information of the candidate object relative to the vehicle, so as to ensure that whether a higher collision risk exists between the candidate object and the vehicle can be accurately determined according to the variation trend obtained from the residual duration of collision.
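The orientation-to-sample-count mapping of steps 211 to 214 might be tabulated as below. The category labels are illustrative names for the four azimuth types listed above, and the counts 5/10/20 are the example values from the text.

```python
# Error level per orientation category (0 = smallest error, 2 = largest);
# the string keys are illustrative labels, not the patent's terms.
ERROR_LEVEL = {
    "same_lane_same_direction": 0,
    "adjacent_lane_same_direction": 1,
    "oncoming": 2,
    "lateral_crossing": 2,
}

# Number of stored collision remaining durations to use per error level.
SAMPLES_PER_LEVEL = {0: 5, 1: 10, 2: 20}

def ttc_sample_count(orientation):
    """Return how many ttc samples to gather for a target in this orientation."""
    return SAMPLES_PER_LEVEL[ERROR_LEVEL[orientation]]
```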
2. Step 23 is explained.
In some embodiments of step 23, it may be determined whether the trend of the remaining duration of the collision satisfies the second preset condition through the following steps 231 to 233.
Step 231: judging whether the change trend is a decreasing trend or not, and judging whether the difference between the collision residual time lengths at every two adjacent moments is larger than the time difference between every two adjacent moments.
If the trend is decreasing and the difference is greater than the time difference, it indicates that the vehicle is approaching the candidate object quickly, and the risk of collision with the candidate object is high, and it may be determined that the second preset condition is satisfied, and the process goes to step 232. Otherwise, go to step 233.
Step 232: and judging that the change trend meets a second preset condition.
Step 233: and judging that the change trend does not meet the second preset condition.
Based on the methods described in steps 231 to 233, accuracy and reliability of risk target determination based on the second preset condition can be improved.
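The second preset condition of steps 231 to 233, a decreasing trend whose per-step drop exceeds the sampling interval, can be sketched as follows; the names are mine, and the samples are assumed oldest-first at a fixed spacing dt.

```python
def satisfies_second_condition(ttc_samples, dt):
    """True when ttc is strictly decreasing and each drop exceeds dt.

    A drop larger than the elapsed time dt between samples means the
    vehicle is closing on the target faster than time itself passes.
    """
    return all(prev - cur > dt for prev, cur in zip(ttc_samples, ttc_samples[1:]))
```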
The following is a brief description of the method for detecting distraction driving according to the embodiment of the present application, with reference to fig. 3. As shown in fig. 3, the distraction driving detection can be performed by the following steps.
Firstly, acquiring the motion parameters of the vehicle and the motion parameters of the object outside the vehicle, and respectively carrying out track calculation according to the motion parameters of the vehicle and the object outside the vehicle to obtain the motion track of the vehicle (namely the track of the vehicle in fig. 3) and the motion track of the object outside the vehicle (namely the track of the object in fig. 3).
The host vehicle motion parameters may include the position of the vehicle, longitudinal/lateral speed, longitudinal/lateral acceleration, and size information of the vehicle (at least its length, width, and height); the target motion parameters may include the position of the target, longitudinal speed, longitudinal acceleration, yaw rate, and size information of the target (at least its length, width, and height). During trajectory estimation, the three-dimensional positions of the vehicle at preset intervals (such as 0.1 second) over a preset future duration (such as 5 seconds) may be obtained as trajectory points and connected to form the vehicle trajectory. The target trajectory may be obtained in the same way.
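A trajectory rollout of the kind described can be sketched with a constant-acceleration point model. This ignores yaw rate and vehicle size, which the full method also uses, so it is only an illustration; all names are mine.

```python
def roll_out_track(x, y, vx, vy, ax, ay, horizon_s=5.0, step_s=0.1):
    """Predict (t, x, y) trajectory points under constant acceleration.

    Samples the future at step_s intervals over horizon_s, matching the
    0.1 s / 5 s example values in the text.
    """
    n_steps = int(round(horizon_s / step_s))
    track = []
    for i in range(1, n_steps + 1):
        t = i * step_s
        track.append((t,
                      x + vx * t + 0.5 * ax * t * t,
                      y + vy * t + 0.5 * ay * t * t))
    return track
```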
After the vehicle trajectory and the target trajectory are obtained, the collision remaining duration ttc (time to collision) of each off-vehicle target at the current moment is calculated; the calculation may adopt the related method of the foregoing method embodiments.
After obtaining the collision remaining duration ttc of an off-vehicle target at the current moment, it is judged whether ttc satisfies the condition. The condition may be the first preset condition of the foregoing method embodiments, and the judging method may likewise be the related method of those embodiments. If the condition is satisfied, the off-vehicle target is a risk target and its attribute is updated to the risk attribute; if not, the off-vehicle target is not a risk target, its attribute is updated to the non-risk attribute, and its ttc at the current moment is stored as one frame of the target's multi-frame risk information.
Then, target multi-frame risk judgment is performed on the off-vehicle targets carrying the risk attribute. Specifically, the collision remaining durations of the off-vehicle target at the current moment and at several consecutive moments before and/or after it may be obtained, and whether the change trend of the collision remaining duration satisfies the second preset condition is judged; the judging method may adopt the related method of the foregoing method embodiments.
If the condition is satisfied, the driver's gazing state on the off-vehicle target (i.e., the gazing state in the foregoing embodiments) is queried, and whether a multi-frame risk exists, i.e., whether the change trend of the collision remaining duration shows a collision risk, is judged from the gazing state. If the risk exists, an alarm reminder is issued; if not, the multi-frame risk degree is cleared, i.e., the information describing the degree of collision risk in the change trend is cleared and the change trend is reset to no collision risk.
If the condition is not satisfied, continuing to judge the risk of the target multiframe.
The above is a simple explanation of the distraction driving detection method shown in fig. 3.
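The per-target decision loop of fig. 3 might be condensed as follows. This is a hedged sketch that folds the first condition (threshold) and the multi-frame trend test into one stateful monitor; the class and parameter names are mine, and the parameter values are the example values from the text (3 s threshold, 0.1 s frame spacing, 5-frame window).

```python
class TargetRiskMonitor:
    """Per-frame risk confirmation for one off-vehicle target.

    Each frame stores the target's ttc; the target is confirmed as a
    multi-frame risk only when, over the last `window` frames, every ttc
    satisfies the first condition (<= threshold) and the trend satisfies
    the second condition (each drop larger than the frame spacing dt).
    """

    def __init__(self, ttc_threshold=3.0, window=5, dt=0.1):
        self.ttc_threshold = ttc_threshold
        self.window = window
        self.dt = dt
        self.history = []

    def update(self, ttc):
        """Feed one frame's ttc; return True when the risk is confirmed."""
        self.history.append(ttc)
        recent = self.history[-self.window:]
        if len(recent) < self.window:
            return False  # not enough frames yet to judge a trend
        first_ok = all(t <= self.ttc_threshold for t in recent)
        trend_ok = all(p - c > self.dt for p, c in zip(recent, recent[1:]))
        return first_ok and trend_ok
```

A confirmed True here would then be gated on the driver's gazing state before an alarm is raised, as described above.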
It should be noted that, although the foregoing embodiments describe the steps in a specific sequential order, it should be understood by those skilled in the art that, in order to achieve the effects of the present application, the steps need not necessarily be performed in such an order, and may be performed simultaneously (in parallel) or in other orders, and those solutions after these adjustments belong to equivalent solutions to those described in the present application, and therefore will also fall within the protection scope of the present application.
It will be appreciated by those skilled in the art that all or part of the methods of the above embodiments may be implemented by a computer program instructing relevant hardware, where the computer program may be stored in a computer readable storage medium, and when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, and so on. The computer readable storage medium may include: any entity or device capable of carrying the computer program code, a medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory, a random access memory, electrical carrier signals, telecommunication signals, software distribution media, and the like. It should be noted that the content contained in the computer readable storage medium may be increased or decreased appropriately according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, a computer readable storage medium does not include electrical carrier signals and telecommunication signals.
Another aspect of the present application also provides a computer device.
In an embodiment of a computer device according to the present application, the computer device mainly includes a storage device and a processor, the storage device may be configured to store a program for executing the method of detecting the distraction driving of the above-described method embodiment, and the processor may be configured to execute the program in the storage device, the program including, but not limited to, the program for executing the method of detecting the distraction driving of the above-described method embodiment. Referring to fig. 4, a memory device 11 and a processor 12 are illustratively shown in fig. 4 as being communicatively coupled via a bus. For convenience of description, only the parts related to the present embodiment are shown, and specific technical details are not disclosed, please refer to the method parts of the embodiment of the present application.
In some possible implementations, a computer device may include a plurality of storage devices and a plurality of processors. The plurality of processors may be processors disposed on the same device, for example, the computer device may be a high-performance device composed of a plurality of processors, and the plurality of processors may be processors configured on the high-performance device. In addition, the plurality of processors may be processors disposed on different devices, for example, the computer device may be a server cluster, and the plurality of processors may be processors on different servers in the server cluster.
Another aspect of the present application also provides a computer-readable storage medium.
In an embodiment of a computer-readable storage medium according to the present application, the computer-readable storage medium may be configured to store a program for executing the distraction driving detection method of the above method embodiments, and the program may be loaded and executed by a processor to implement the distraction driving detection method. For convenience of explanation, only the portions relevant to the embodiments of the present application are shown; for specific technical details not disclosed, refer to the method portions of the embodiments. The computer readable storage medium may be a storage device including various electronic devices; optionally, in embodiments of the present application, the computer readable storage medium is a non-transitory computer readable storage medium.
Another aspect of the present application also provides an intelligent device.
In an embodiment of a smart device according to the present application, the smart device may include at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program which, when executed by the at least one processor, implements the method of any of the embodiments described above. The intelligent device can comprise driving equipment, intelligent vehicles, robots and the like. Referring to fig. 5, memory 21 and processor 22 are illustratively shown in fig. 5 as being communicatively coupled via a bus.
In some embodiments of the present application, the smart device may further include at least one sensor for sensing information. The sensor is communicatively coupled to any of the types of processors referred to herein. Optionally, the smart device may further comprise an autopilot system for guiding the smart device to drive itself or assist in driving. The processor communicates with the sensors and/or the autopilot system for performing the method of any one of the embodiments described above.
Thus far, the technical solution of the present application has been described in connection with one embodiment shown in the drawings, but it is easily understood by those skilled in the art that the scope of protection of the present application is not limited to these specific embodiments. Equivalent modifications and substitutions for related technical features may be made by those skilled in the art without departing from the principles of the present application, and such modifications and substitutions will be within the scope of the present application.

Claims (10)

1. A method of detecting distraction, the method comprising:
according to the motion information of the vehicle and the motion information of the external target, acquiring the external target overlapping with the track of the vehicle in the future preset time length as a candidate target;
acquiring a risk target with collision risk with the vehicle according to the candidate target;
acquiring the gazing state of a driver of the vehicle on the risk target;
judging whether the driver is in a distraction driving state according to the gazing state;
the step of acquiring the risk target with collision risk with the vehicle according to the candidate target comprises the following steps: acquiring the overlapping moment when the candidate target overlaps with the vehicle track; acquiring the collision residual duration of the candidate target at the current moment according to the duration from the current moment to the overlapping moment; judging whether the candidate target is used as a risk target or not according to the collision residual duration;
judging whether the candidate target is used as a risk target according to the collision residual duration, wherein the judging comprises the following steps: acquiring the collision residual duration of the candidate target at a plurality of continuous moments before and/or after the current moment; acquiring a change trend of the collision residual duration according to the collision residual duration of the candidate target at the current moment and a plurality of continuous moments before and/or after the current moment; judging whether the residual collision time lengths at the current moment and a plurality of continuous moments before and/or after the current moment meet a first preset condition, and the change trend meets a second preset condition; if yes, taking the candidate target as a risk target; and if not, the candidate target is not taken as a risk target.
2. The method according to claim 1, wherein the obtaining the remaining duration of collision of the candidate target at the current time includes:
taking the time length from the current time to the overlapped time as a first time length;
acquiring a longitudinal overlapping distance between the candidate target and the vehicle at the overlapping moment and a longitudinal speed difference at the overlapping moment;
acquiring the ratio of the longitudinal overlapping distance to the longitudinal speed difference as a second duration;
and acquiring the residual collision duration according to the difference value between the first duration and the second duration.
3. The method of claim 1, further comprising determining whether the remaining duration of the collision satisfies a first preset condition by:
judging whether the collision residual duration is smaller than or equal to a preset duration threshold value;
if yes, the residual collision duration meets a first preset condition;
if not, the residual collision duration does not meet a first preset condition.
4. The method according to claim 1, further comprising determining whether the trend of change in the remaining duration of the collision satisfies a second preset condition by:
Judging whether the change trend is a decreasing trend or not, and whether the difference value between the collision residual time lengths at every two adjacent moments is larger than the time difference between every two adjacent moments;
if yes, the change trend meets a second preset condition;
if not, the change trend does not meet a second preset condition.
5. The method according to claim 1, wherein the obtaining the collision remaining duration of the candidate target at a plurality of consecutive moments before and/or after the current moment comprises:
acquiring azimuth information of the candidate target relative to the vehicle;
determining the error level of the motion information of the candidate target according to the azimuth information;
acquiring the number of the preset collision residual time length matched with the error grade;
and acquiring the residual collision duration of the candidate target at a plurality of continuous moments before and/or after the current moment according to the number.
6. The method of claim 1, wherein the determining whether the driver is in a distracted driving state based on the gaze state comprises:
if the driver does not look at the risk target, the driver is in a distraction driving state;
If the driver is looking at the risk target, the driver is not in a distracted driving state.
7. The method according to claim 1 or 6, characterized in that the method further comprises: outputting alarm information when the driver is judged to be in the distraction driving state;
wherein the alert information includes location information of the risk target relative to the vehicle.
8. A computer device comprising a processor and a storage means adapted to store a plurality of program code, characterized in that the program code is adapted to be loaded and run by the processor to perform the method of detecting distraction driving of any one of claims 1 to 7.
9. A computer-readable storage medium in which a plurality of program codes are stored, characterized in that the program codes are adapted to be loaded and executed by a processor to perform the distracted driving detection method according to any one of claims 1 to 7.
10. An intelligent device, comprising:
at least one processor;
and a memory communicatively coupled to the at least one processor;
wherein the memory has stored therein a computer program which, when executed by the at least one processor, implements the distracted driving detection method of any one of claims 1 to 7.
CN202311684597.1A 2023-12-11 2023-12-11 Distraction driving detection method, computer device, storage medium and intelligent device Active CN117382644B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311684597.1A CN117382644B (en) 2023-12-11 2023-12-11 Distraction driving detection method, computer device, storage medium and intelligent device


Publications (2)

Publication Number Publication Date
CN117382644A CN117382644A (en) 2024-01-12
CN117382644B true CN117382644B (en) 2024-02-27

Family

ID=89439553

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311684597.1A Active CN117382644B (en) 2023-12-11 2023-12-11 Distraction driving detection method, computer device, storage medium and intelligent device

Country Status (1)

Country Link
CN (1) CN117382644B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011198247A (en) * 2010-03-23 2011-10-06 Toyota Motor Corp Driving support device
CN103770780A (en) * 2014-01-15 2014-05-07 中国人民解放军国防科学技术大学 Vehicle active safety system alarm shielding device
KR20190104274A (en) * 2019-07-05 2019-09-09 엘지전자 주식회사 Method for displaying driving situation of vehicle by sensing driver's gaze and apparatus hereof
FR3085927A1 (en) * 2018-09-18 2020-03-20 Valeo Comfort And Driving Assistance DEVICE, SYSTEM AND METHOD FOR DETECTING DISTRACTION OF A CONDUCTOR
CN111709264A (en) * 2019-03-18 2020-09-25 北京市商汤科技开发有限公司 Driver attention monitoring method and device and electronic equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11238739B2 (en) * 2020-05-11 2022-02-01 Hyundai Mobis Co., Ltd. Method and apparatus to prevent VRU accidents
US11584378B2 (en) * 2020-10-27 2023-02-21 Arm Limited Vehicle-assist system




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant