WO2022062659A1 - Intelligent driving control method and apparatus, vehicle, electronic device, and storage medium - Google Patents

Intelligent driving control method and apparatus, vehicle, electronic device, and storage medium

Info

Publication number
WO2022062659A1
WO2022062659A1 (PCT/CN2021/109831, CN2021109831W)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driving
driving state
driver
information
Prior art date
Application number
PCT/CN2021/109831
Other languages
English (en)
French (fr)
Inventor
伍俊
范亦卿
陶莹
许亮
祁凯悦
Original Assignee
上海商汤临港智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海商汤临港智能科技有限公司
Priority to JP2023518924A (JP2023542992A)
Publication of WO2022062659A1

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W40/09Driving style or behaviour
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60TVEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
    • B60T7/00Brake-action initiating means
    • B60T7/12Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143Alarm means

Definitions

  • the present disclosure relates to the field of computer technology, and in particular, to an intelligent driving control method and device, a vehicle, an electronic device, and a storage medium.
  • the present disclosure proposes a technical solution for intelligent driving control.
  • an intelligent driving control method including:
  • acquiring driving state information of the vehicle; acquiring image information of the driving area of the vehicle; determining the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area; and performing intelligent driving control in response to detecting that the driver of the vehicle is in a preset dangerous driving state.
  • the determining the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area includes:
  • the driving state of the driver is determined according to the driving state information of the vehicle and the detection result of the driver's behavior.
  • the performing behavior detection on the driver of the vehicle includes:
  • the determining of the driving state of the driver according to the driving state information of the vehicle and the behavior detection result of the driver includes:
  • the determining the driving state of the driver according to the driving state information of the vehicle and the behavior detection result of the driver includes:
  • the determining the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area includes:
  • the abnormal driving state of the vehicle includes at least one of the following:
  • the number of times the vehicle crosses the lane line within the first set time period reaches the set lane-crossing count threshold; the left-right sway of the vehicle reaches the set amplitude threshold; the vehicle does not proceed as indicated by traffic signs or traffic lights; the speed of the vehicle exceeds the set speed threshold.
  • the performing intelligent driving control includes generating alarm information based on at least one of the following: the driving state information of the vehicle; the behavior detection result of the driver; the driving state of the driver.
  • the performing intelligent driving control includes:
  • the warning device is controlled to continuously output warning information until both the driving state of the vehicle and the driving state of the driver return to the normal state.
  • the performing intelligent driving control includes:
  • the assisted driving/automatic driving function is controlled to be activated, and/or the vehicle is controlled to decelerate to a stop.
  • the method further includes:
  • the determining of the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area includes:
  • the driving state of the driver of the vehicle is determined according to the environmental information around the vehicle, the driving state information of the vehicle, and the image information of the driving area.
  • the environmental information includes road condition information
  • the acquiring environmental information around the vehicle includes: acquiring road condition information of the road on which the vehicle is traveling;
  • the determining of the driving state of the driver of the vehicle according to the environmental information around the vehicle, the driving state information of the vehicle and the image information of the driving area includes:
  • in response to the driving state information of the vehicle and the road condition information not satisfying a preset matching relationship, and the result of detecting the behavior of the driver of the vehicle based on the image information of the driving area indicating that the behavior state of the driver is an abnormal state, it is determined that the driver is in a preset dangerous driving state.
  • the acquiring the driving state information of the vehicle includes:
  • the sensing data includes at least one of the following: image information collected by an advanced driving assistance system of the vehicle, image information collected by a driving recorder of the vehicle, and speed information sensed by a speed sensor of the vehicle;
  • the driving state information of the vehicle is determined according to the sensing data.
  • the determining the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area includes:
  • the obtained driving state information of the vehicle and the image information of the driving area are input into a trained neural network, and the driving state of the driver of the vehicle is determined through the neural network.
  • an intelligent driving control device comprising:
  • a first acquisition module used for acquiring the driving state information of the vehicle
  • a second acquisition module configured to acquire image information of the driving area of the vehicle
  • a driving state determination module configured to determine the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area;
  • the control module is configured to perform intelligent driving control in response to detecting that the driver of the vehicle is in a preset dangerous driving state.
  • the driving state determination module includes a first driving state determination submodule and a second driving state determination submodule, wherein:
  • the first driving state determination sub-module is configured to perform behavior detection on the driver of the vehicle according to the image information of the driving area;
  • the second driving state determination sub-module is configured to determine the driving state of the driver according to the driving state information of the vehicle and the behavior detection result of the driver.
  • the first driving state determination sub-module is configured to determine the danger level corresponding to the behavior detection result of the driver
  • the second driving state determination submodule is used for:
  • the second driving state determination sub-module is configured to determine, in response to determining according to the driving state information of the vehicle and the behavior detection result of the driver that both the driving state of the vehicle and the behavior state of the driver are abnormal states, that the driver is in a preset dangerous driving state.
  • the driving state determination module includes a third driving state determination submodule and a fourth driving state determination submodule, wherein:
  • the third driving state determination sub-module is configured to perform behavior detection of the driver according to the image information of the driving area in response to detecting that the vehicle is in an abnormal driving state according to the driving state information of the vehicle;
  • the fourth driving state determination sub-module is configured to determine that the driver is in a preset dangerous driving state in response to determining that the driver's behavior detection result is the driver's distracted driving or fatigued driving.
  • the abnormal driving state of the vehicle includes at least one of the following:
  • the number of times the vehicle crosses the lane line within the first set time period reaches the set lane-crossing count threshold; the left-right sway of the vehicle reaches the set amplitude threshold; the vehicle does not proceed as indicated by traffic signs or traffic lights; the speed of the vehicle exceeds the set speed threshold.
  • control module is configured to generate alarm information based on at least one of the following: the driving state information of the vehicle; the behavior detection result of the driver; the driving state of the driver.
  • control module is configured to control the warning device to continuously output warning information until both the driving state of the vehicle and the driving state of the driver return to a normal state.
  • control module is configured to, in response to detecting that the driver is in a preset dangerous driving state for a duration exceeding a preset duration, control the assisted driving/automatic driving function to be activated, and/or control the vehicle to decelerate to a stop.
  • the apparatus further includes:
  • a third acquiring module configured to acquire environmental information around the vehicle
  • the driving state determination module is configured to determine the driving state of the driver of the vehicle according to the environment information around the vehicle, the driving state information of the vehicle and the image information of the driving area.
  • the environmental information includes road condition information
  • the third obtaining module is configured to obtain road condition information of the road on which the vehicle is traveling;
  • the driving state determination module is configured to, in response to the driving state information of the vehicle and the road condition information not satisfying a preset matching relationship and the result of detecting the behavior of the driver of the vehicle based on the image information of the driving area indicating that the behavior state of the driver is an abnormal state, determine that the driver is in a preset dangerous driving state.
  • the first acquisition module includes a sensor data acquisition sub-module and a driving state information determination sub-module, wherein:
  • the sensing data acquisition sub-module is used for acquiring driving sensing data, and the sensing data includes at least one of the following: image information collected by the vehicle's advanced driving assistance system, image information collected by the vehicle's driving recorder, and speed information sensed by the vehicle's speed sensor;
  • the driving state information determination sub-module is configured to determine the driving state information of the vehicle according to the sensing data.
  • the driving state determination module is configured to input the acquired driving state information of the vehicle and the image information of the driving area into a trained neural network, and determine the driving state of the driver of the vehicle through the neural network.
  • a vehicle characterized in that it includes:
  • a first sensor used for collecting image information of the driving area of the vehicle
  • a controller configured to acquire the driving state information of the vehicle, determine the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area, and perform intelligent driving control in response to detecting that the driver of the vehicle is in a preset dangerous driving state.
  • the vehicle further includes: a second sensor for collecting driving sensing data;
  • the controller is configured to acquire the driving state information of the vehicle according to the driving sensing data.
  • an electronic device comprising: a processor; a memory for storing instructions executable by the processor; wherein the processor is configured to invoke the instructions stored in the memory to execute the above method.
  • a computer-readable storage medium having computer program instructions stored thereon, the computer program instructions implementing the above method when executed by a processor.
  • a computer program comprising computer-readable code which, when executed in an electronic device, causes a processor in the electronic device to implement the above method.
  • the driving state of the driver of the vehicle is determined according to the driving state information of the vehicle and the image information of the driving area, and intelligent driving control is performed in response to detecting that the driver of the vehicle is in a preset dangerous driving state.
  • the driving state of the driver of the vehicle can be determined by combining the driving state of the vehicle and the image information of the driving area, which improves the accuracy of the determined driving state of the driver.
  • when it is detected that the driver is in a preset dangerous driving state, intelligent driving control is performed, which further improves the driving safety of the vehicle.
  • FIG. 1 shows a flowchart of an intelligent driving control method provided by an embodiment of the present disclosure.
  • FIG. 2 shows a block diagram of an intelligent driving control device provided by an embodiment of the present disclosure.
  • FIG. 3 shows a block diagram of an electronic device 800 provided by an embodiment of the present disclosure.
  • FIG. 4 shows a block diagram of an electronic device 1900 provided by an embodiment of the present disclosure.
  • the execution body of the method may be an intelligent driving control device installed on a vehicle.
  • the method may be executed by a terminal device or a server or other processing device.
  • the terminal device may be a vehicle-mounted device, a user equipment (User Equipment, UE), a mobile device, a user terminal, a terminal, a cellular phone, a cordless phone, a personal digital assistant (Personal Digital Assistant, PDA), a handheld device, a computing device, or a wearable device, etc.
  • the in-vehicle device may be the vehicle itself or a domain controller in the cabin, or may be a device host in an ADAS (Advanced Driving Assistance System), an OMS (Occupant Monitoring System), or a DMS (Driver Monitoring System) used for executing the intelligent driving control method, etc.
  • the intelligent driving control method may be implemented by the processor calling computer-readable instructions stored in the memory.
  • FIG. 1 shows a flowchart of an intelligent driving control method according to an embodiment of the present disclosure. As shown in FIG. 1 , the intelligent driving control method includes:
  • in step S11, the driving state information of the vehicle is acquired;
  • the vehicle here may be at least one type of vehicle, such as a private car, a shared car, a ride-hailing car, a taxi, a truck, and the like; the specific type of the vehicle is not limited in the present disclosure.
  • the driving state information is used to characterize the driving state of the vehicle.
  • the driving state can be divided into a normal driving state and an abnormal driving state.
  • the normal driving state may be that the vehicle drives in accordance with predetermined rules such as speed, direction, route, and traffic regulations.
  • the abnormal driving state corresponds to the normal driving state, which may be driving not in accordance with predetermined rules such as speed, direction, route, and traffic regulations.
  • the driving state information may include information that characterizes the normal or abnormal driving state of the vehicle, such as at least one of the speed, direction, and acceleration of the vehicle, information on the lane in which it is located, lane-line crossing information, body sway information, lane-change information, speed-change information, braking information, route information, speeding situations, consistency between the driving state and traffic signs, and the like.
  • the abnormal driving state of the vehicle includes at least one of the following: the number of times the vehicle crosses the lane line within the first set period of time reaches the set lane-crossing count threshold; the left-right sway of the vehicle reaches the set amplitude threshold; the vehicle does not proceed as indicated by traffic signs or traffic lights; the speed of the vehicle exceeds the set speed threshold.
  • the number of lane-line crossings of the vehicle within the first set period of time reaching the set lane-crossing count threshold indicates that the vehicle repeatedly crosses the lane line within a relatively short period of time.
  • the first set duration can be set according to the actual situation, for example, it can be 1 minute or 15 seconds
  • the lane-crossing count threshold can also be set according to the actual situation, for example, it can be 2 times, which is not specifically limited in the present disclosure.
  • the set amplitude threshold may be set according to actual conditions such as the type of the vehicle, which is not specifically limited in the present disclosure.
  • similarly, if the vehicle does not proceed as indicated by traffic signs or traffic lights, or if the speed of the vehicle exceeds the set speed threshold, this also indicates that the vehicle is in an abnormal driving state, which will not be repeated here.
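As a non-limiting illustration of the abnormal-driving-state conditions listed above, the following Python sketch checks a window of aggregated driving state information against the four conditions. The field names and concrete threshold values are assumptions for illustration; the disclosure leaves them to the implementation.

```python
from dataclasses import dataclass

# Hypothetical thresholds; the disclosure leaves the concrete values to the implementer.
LANE_CROSSING_COUNT_THRESHOLD = 2      # crossings within the first set duration
FIRST_SET_DURATION_S = 15.0            # e.g. 15 seconds
SWAY_AMPLITUDE_THRESHOLD_M = 0.5       # left-right sway amplitude, assumed unit: metres
SPEED_THRESHOLD_KMH = 120.0            # set speed threshold


@dataclass
class VehicleDrivingState:
    """Driving state information aggregated over a short observation window."""
    lane_crossings_in_window: int       # lane-line crossings within FIRST_SET_DURATION_S
    sway_amplitude_m: float             # left-right sway amplitude
    obeys_traffic_signals: bool         # proceeds as indicated by signs / traffic lights
    speed_kmh: float


def is_abnormal_driving_state(state: VehicleDrivingState) -> bool:
    """The vehicle is treated as abnormal if ANY of the listed conditions holds."""
    return (
        state.lane_crossings_in_window >= LANE_CROSSING_COUNT_THRESHOLD
        or state.sway_amplitude_m >= SWAY_AMPLITUDE_THRESHOLD_M
        or not state.obeys_traffic_signals
        or state.speed_kmh > SPEED_THRESHOLD_KMH
    )
```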
  • the driving state information of the vehicle can be determined according to the driving speed, driving direction, and position of the vehicle, the environment around the vehicle, and so on, which can be obtained from sensors used for sensing the environment outside the vehicle cabin and the running state of the vehicle.
  • the acquiring the driving state information of the vehicle includes: acquiring driving sensing data, where the sensing data includes at least one of the following: image information collected by an ADAS of the vehicle, image information collected by the driving recorder of the vehicle, and speed information sensed by the speed sensor of the vehicle; and determining the driving state information of the vehicle according to the sensing data.
  • these sensors can be installed in the front and rear bumpers of the vehicle, the side mirrors, inside the steering column, or on the windshield to capture perception data outside the cabin.
  • the environment around the vehicle can be detected, including objects such as pedestrians and vehicles around the vehicle, as well as information such as lane lines, traffic lights, and traffic signs.
  • information such as the speed, acceleration, position, and attitude of the vehicle can also be determined through a speed/acceleration sensor, an attitude sensor, a positioning device, and the like.
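The following sketch illustrates, under assumed field names, how driving sensing data (ADAS lane offsets, dash-cam frames, speed-sensor readings) might be aggregated into driving state information such as lane-crossing counts and sway amplitude. It is not the patent's own algorithm, only one plausible reduction for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class SensingFrame:
    """One time-stamped bundle of driving sensing data (names are illustrative)."""
    timestamp_s: float
    adas_lane_offset_m: Optional[float]   # lateral offset from lane centre, from ADAS images
    recorder_frame: Optional[bytes]       # raw frame from the driving recorder, if available
    speed_kmh: Optional[float]            # from the vehicle speed sensor


@dataclass
class DrivingStateInfo:
    mean_speed_kmh: float
    lane_crossings: int
    sway_amplitude_m: float


def derive_driving_state(frames: List[SensingFrame],
                         half_lane_width_m: float = 1.8) -> DrivingStateInfo:
    """Aggregate a window of sensing data into driving state information."""
    speeds = [f.speed_kmh for f in frames if f.speed_kmh is not None]
    offsets = [f.adas_lane_offset_m for f in frames if f.adas_lane_offset_m is not None]

    # A lane-line crossing is counted each time |offset| grows past half the lane width.
    crossings, was_outside = 0, False
    for off in offsets:
        outside = abs(off) >= half_lane_width_m
        if outside and not was_outside:
            crossings += 1
        was_outside = outside

    return DrivingStateInfo(
        mean_speed_kmh=sum(speeds) / len(speeds) if speeds else 0.0,
        lane_crossings=crossings,
        sway_amplitude_m=(max(offsets) - min(offsets)) / 2 if offsets else 0.0,
    )
```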
  • in step S12, image information of the driving area of the vehicle is acquired;
  • the driving area here may be the area where the driver is located in the cabin, or any area including the area where the driver is located, and the area is usually the area of the main driver's seat.
  • the image information of the driving area may be the image information of the area where the driver is located in the cabin; the image information may be collected by an in-vehicle image acquisition device disposed inside or outside the cabin of the vehicle, and the in-vehicle image acquisition device may be a vehicle-mounted camera or an image acquisition device equipped with a camera.
  • the camera may be a camera for collecting image information inside the vehicle, or a camera for collecting image information outside the vehicle.
  • the camera may include a camera in a DMS and/or a camera in an OMS, etc., which may be used to collect image information inside the vehicle; the camera may also include a camera in an ADAS, which may be used to collect image information outside the vehicle.
  • the in-vehicle image acquisition device may also be a camera in other systems, or may also be a separately configured camera, and the embodiment of the present disclosure does not limit the specific in-vehicle image acquisition device.
  • the carrier of the image information here can be a two-dimensional image or a video.
  • the image information can be a visible light image/video, or an infrared light image/video; it can also be a three-dimensional image formed by a point cloud scanned by a radar, and so on. It may be determined according to the actual application scenario, which is not limited in the present disclosure.
  • the image information collected by the vehicle can be obtained through the communication connection established with the vehicle image collection device.
  • the vehicle-mounted image acquisition device can transmit the collected image information to the vehicle-mounted controller or remote server through the bus or wireless communication channel in real time, and the vehicle-mounted controller or the remote server can receive the real-time image information through the bus or wireless communication channel .
  • in step S13, the driving state of the driver of the vehicle is determined according to the driving state information of the vehicle and the image information of the driving area;
  • the driving state of the driver may be the state of the driver when driving the vehicle, and the driving state may be divided into a normal driving state and a dangerous driving state.
  • the dangerous driving state may be a state in which the driver has preset irregular driving behaviors, for example, a state under behaviors such as making a phone call, looking at a mobile phone, taking a hand off the steering wheel, and fatigued driving.
  • the determination of the driving state of the driver may be based on the image information of the driving area and the driving state information of the vehicle, so as to improve the reliability of the determined driving state of the driver.
  • the image information of the driving area is the image information of the area where the driver is located in the cabin. By analyzing and processing the image information of the driving area through image processing technology, the characteristics of the driver, for example body characteristics and facial features, can be detected. Based on the detected characteristics of the driver, the behavior of the driver can be analyzed to determine whether the driver is making a phone call, drinking water, taking his hands off the steering wheel, closing his eyes, and so on.
  • the driving state of the driver can be determined by combining the driving state information of the vehicle with the image information of the driving area. For example, if analysis of the image information of the driving area determines that the confidence level of the driver being in a fatigued driving state is 0.5, and the vehicle is at the same time in a driving state of continuously crossing the lane line, the confidence level of the driver being in a fatigued driving state can be increased, or a mild fatigue driving state can be upgraded to a moderate or severe fatigue driving state.
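A minimal sketch of the confidence-fusion idea in the example above: the image-based fatigue confidence of 0.5 is raised when the vehicle is continuously crossing the lane line, which upgrades the fatigue level. The boost value and level thresholds are assumed for illustration.

```python
def fuse_fatigue_confidence(image_confidence: float,
                            continuous_lane_crossing: bool,
                            boost: float = 0.2) -> float:
    """Combine the image-based fatigue confidence with the vehicle driving state.

    `boost` is an assumed adjustment; the disclosure only states that the
    confidence (or the fatigue level) is raised when the vehicle keeps
    crossing the lane line.
    """
    if continuous_lane_crossing:
        return min(1.0, image_confidence + boost)
    return image_confidence


def fatigue_level(confidence: float) -> str:
    """Map a confidence value to a fatigue level (illustrative thresholds)."""
    if confidence >= 0.8:
        return "severe"
    if confidence >= 0.6:
        return "moderate"
    if confidence >= 0.4:
        return "mild"
    return "none"


# Example from the text: image analysis alone gives 0.5 (mild fatigue); with
# continuous lane crossing the state is upgraded to moderate fatigue.
print(fatigue_level(fuse_fatigue_confidence(0.5, continuous_lane_crossing=False)))  # mild
print(fatigue_level(fuse_fatigue_confidence(0.5, continuous_lane_crossing=True)))   # moderate
```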
  • in step S14, intelligent driving control is performed in response to detecting that the driver of the vehicle is in a preset dangerous driving state.
  • the preset dangerous driving state is a preset driving state with potential driving safety hazards, which can include one or more predefined driver states, such as a moderate fatigue driving state, a severe fatigue driving state, a severe distracted driving state, etc.
  • the preset dangerous driving state may also be a state defined by a preset driver behavior combined with a preset vehicle state; for example, the state in which the driver is making a phone call and the vehicle crosses the lane line more than two consecutive times may be defined as the preset dangerous driving state.
  • intelligent driving control can be performed.
  • the intelligent driving control here may be intervention in driving by the vehicle control center or a remote control terminal according to the road conditions on which the vehicle is driving.
  • the specific intelligent driving control measures may include, for example, issuing warning information, controlling the activation of assisted driving/autonomous driving functions, and controlling vehicle deceleration. The details will be described with reference to the possible implementations later in the present disclosure, and will not be repeated here.
  • the danger level may be determined according to the dangerous driving state of the driver and the driving state of the vehicle, and the corresponding intelligent driving control mode may then be selected according to the danger level. For example, when the driver is in a dangerous driving state such as making a phone call, looking at a mobile phone, or taking a hand off the steering wheel, and the vehicle continuously crosses the lane line for less than a preset period of time (for example, 5 seconds), intelligent driving control is performed by issuing warning information; when the driver is in such a dangerous driving state and the vehicle is exceeding the speed limit by 50%, the risk is higher than when the vehicle continuously crosses the lane line for less than the preset period of time (for example, 5 seconds), so intelligent driving control is performed by controlling the deceleration of the vehicle, thereby realizing flexible and safe driving control.
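By way of illustration, the danger-level example above could be realized as follows. The 5-second lane-crossing duration and the 50% overspeed figure come from the example; everything else (names, the enum, the structure) is assumed.

```python
from enum import IntEnum


class ControlAction(IntEnum):
    NONE = 0
    WARN = 1         # issue warning information
    DECELERATE = 2   # control the vehicle to decelerate


def select_control_action(driver_dangerous: bool,
                          continuous_crossing_s: float,
                          overspeed_ratio: float) -> ControlAction:
    """Pick a control mode from the combined danger of driver and vehicle state.

    The 5 s crossing duration and 50 % overspeed follow the example in the text;
    names and structure are assumptions.
    """
    if not driver_dangerous:
        return ControlAction.NONE
    if overspeed_ratio >= 0.5:              # higher risk: 50 % over the speed limit
        return ControlAction.DECELERATE
    if continuous_crossing_s < 5.0:         # lower risk: short continuous lane crossing
        return ControlAction.WARN
    return ControlAction.WARN               # default to at least warning


print(select_control_action(True, continuous_crossing_s=3.0, overspeed_ratio=0.0))   # ControlAction.WARN
print(select_control_action(True, continuous_crossing_s=0.0, overspeed_ratio=0.6))   # ControlAction.DECELERATE
```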
  • the driving state of the driver of the vehicle is determined according to the driving state information of the vehicle and the image information of the driving area, and intelligent driving control is performed in response to detecting that the driver of the vehicle is in a preset dangerous driving state.
  • the driving state of the driver of the vehicle can be determined in combination with the driving state of the vehicle and the image information of the driving area, which improves the accuracy of the determined driving state of the driver.
  • when it is detected that the driver is in a preset dangerous driving state, intelligent driving control is performed, which further improves the driving safety of the vehicle.
  • the image information of the driving area may be image information in the vehicle cabin
  • the driving state information of the vehicle may be information determined according to perception data from outside the vehicle cabin. That is, in the present disclosure, the driving state of the driver may be determined by combining information from inside and outside the vehicle cabin, which improves the accuracy of the determined driving state of the driver, and intelligent driving control is then carried out to improve the driving safety of the vehicle.
  • determining the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area includes: performing behavior detection on the driver of the vehicle according to the image information of the driving area; and determining the driving state of the driver according to the driving state information of the vehicle and the behavior detection result of the driver.
  • in this way, the behavior of the driver of the vehicle is detected according to the image information of the driving area to obtain the behavior detection result, and the driving state of the driver is then determined in combination with the driving state information of the vehicle and the behavior detection result, which improves the accuracy of the determined driving state of the driver.
  • the image information of the driving area can be analyzed by image processing technology to detect the driver's behavior and obtain the behavior detection result.
  • the preset dangerous driving state may be that the driver has a preset irregular driving behavior.
  • the behavior detection of the driver of the vehicle according to the image information of the driving area may be detection of the driver's non-standard driving behavior. For example, when it is detected that the driver is holding a phone and the phone is near the ear, it can be determined that the behavior detection result is that the driver is making a phone call while driving; when it is detected that the driver's hands are not on the steering wheel, it can be determined that the behavior detection result is that the driver has taken his hands off the steering wheel; when it is detected that the duration for which the driver's eyes are closed reaches a second duration, it can be determined that the behavior detection result is that the driver is driving with mild fatigue; and when it is detected that the duration for which the driver's eyes are closed reaches a third duration, it can be determined that the behavior detection result is that the driver is driving with moderate fatigue.
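The rule-based mapping described above might look like the following sketch, assuming upstream image processing already yields per-frame driver features; the second and third eye-closure durations are placeholders.

```python
from dataclasses import dataclass
from typing import List

SECOND_DURATION_S = 1.5   # assumed eye-closure duration for mild fatigue
THIRD_DURATION_S = 3.0    # assumed (longer) duration for moderate fatigue


@dataclass
class DriverObservation:
    """Driver features extracted from the driving-area image (illustrative names)."""
    phone_near_ear: bool
    hands_on_wheel: bool
    eyes_closed_duration_s: float


def detect_behaviors(obs: DriverObservation) -> List[str]:
    """Map extracted driver features to behavior detection results."""
    results: List[str] = []
    if obs.phone_near_ear:
        results.append("making_phone_call")
    if not obs.hands_on_wheel:
        results.append("hands_off_wheel")
    if obs.eyes_closed_duration_s >= THIRD_DURATION_S:
        results.append("moderate_fatigue")
    elif obs.eyes_closed_duration_s >= SECOND_DURATION_S:
        results.append("mild_fatigue")
    return results


print(detect_behaviors(DriverObservation(True, True, 0.0)))    # ['making_phone_call']
print(detect_behaviors(DriverObservation(False, True, 3.2)))   # ['moderate_fatigue']
```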
  • if the behavior detection result indicates that the driver's behavior is normal, and the driving state information of the vehicle indicates that the driving state of the vehicle is also a normal state, the driving state of the driver can be determined to be normal driving.
  • if the behavior detection result is that unsafe driving behavior of the driver is detected, while the driving state information of the vehicle indicates that the driving state of the vehicle is a normal state, the driving state of the driver is determined according to the detected irregular behavior. For example, if the behavior detection result is that the driver's eye-closing frequency or number of consecutive eye closures falls within the interval corresponding to mild fatigue driving, and the driving state of the vehicle is normal, the driver's driving state is still determined to be mild fatigue driving.
  • if the driving state information of the vehicle indicates that the vehicle is in an abnormal driving state, the behavior detection result can be further modified in combination with the driving state information of the vehicle to obtain the driver's driving state, so as to improve the reliability of the result. For example, since an abnormal driving state of the vehicle is often caused by the driver's irregular driving behavior, the confidence level that the driver's behavior in the behavior detection result is an unsafe driving behavior can be increased, improving the reliability of the determined dangerous driving state; in addition, since the vehicle has already been in an abnormal driving state, the danger level of the detection result can be raised so that higher-level response measures can be taken.
  • the performing behavior detection on the driver of the vehicle includes: determining a danger level corresponding to the behavior detection result of the driver; and the determining the driving state of the driver according to the driving state information of the vehicle and the behavior detection result of the driver includes: in response to determining that the vehicle is in an abnormal driving state according to the driving state information of the vehicle, upgrading the danger level of the driver's behavior detection result to a first danger level; and in response to determining that the first danger level reaches a preset warning level, determining that the driving state of the driver is a preset dangerous driving state.
  • the driver's behavior detection result may correspond to the danger level, and the danger level represents the dangerous degree of the driver's dangerous driving state.
  • for example, when it is detected that the duration for which the driver's eyes are closed reaches the second duration, the behavior detection result can be determined as the driver driving with mild fatigue; when it is detected that the driver's eyes are closed for the third duration, it may be determined that the behavior detection result is that the driver is driving with moderate fatigue.
  • the third duration is longer than the second duration. Obviously, the longer the driver's eyes are closed, the higher the degree of danger and the higher the probability of vehicle accidents.
  • the danger level of the detection result can be upgraded to take higher-level response measures.
  • the upgraded danger level is referred to as the first danger level here.
  • after determining the first danger level, it can be determined whether the first danger level reaches the preset alarm level, and if so, an alarm or other intelligent control operations can be performed.
  • for a behavior whose danger level does not reach the preset alarm level, an alarm may not be issued, while for the behavior of severe fatigue driving, because it belongs to a preset dangerous driving state, an alarm is issued.
  • in this way, when it is determined according to the driving state information that the vehicle is in an abnormal driving state, the danger level of the driver's behavior detection result is upgraded to the first danger level, and when the first danger level reaches the preset warning level, it is determined that the driving state of the driver is a preset dangerous driving state. Therefore, in the case where the vehicle has already been in an abnormal driving state, the danger level of the detection result is increased, so that response measures can be taken in a timely and accurate manner, thereby improving the driving safety of the vehicle.
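A compact sketch of the danger-level upgrade logic described above, with an assumed four-level scale and an assumed preset warning level:

```python
from enum import IntEnum


class DangerLevel(IntEnum):
    NONE = 0
    LOW = 1        # e.g. mild fatigue
    MEDIUM = 2     # e.g. moderate fatigue
    HIGH = 3       # e.g. severe fatigue / severe distraction


PRESET_WARNING_LEVEL = DangerLevel.MEDIUM   # assumed alarm threshold


def upgrade_danger_level(detected: DangerLevel, vehicle_abnormal: bool) -> DangerLevel:
    """Raise the level by one step (the 'first danger level') when the vehicle
    itself is already driving abnormally."""
    if vehicle_abnormal and detected < DangerLevel.HIGH:
        return DangerLevel(detected + 1)
    return detected


def in_preset_dangerous_state(detected: DangerLevel, vehicle_abnormal: bool) -> bool:
    return upgrade_danger_level(detected, vehicle_abnormal) >= PRESET_WARNING_LEVEL


print(in_preset_dangerous_state(DangerLevel.LOW, vehicle_abnormal=False))  # False
print(in_preset_dangerous_state(DangerLevel.LOW, vehicle_abnormal=True))   # True
```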
  • the determining the driving state of the driver according to the driving state information of the vehicle and the behavior detection result of the driver includes: in response to determining, according to the driving state information of the vehicle and the behavior detection result of the driver, that both the driving state of the vehicle and the behavior state of the driver are abnormal states, determining that the driver is in a preset dangerous driving state.
  • considering that a traffic accident may be caused when the vehicle is in an abnormal driving state, if the driver's behavior state is also abnormal at this time, it indicates that the driver is not driving safely, which poses a great potential safety hazard. Therefore, when both the driving state of the vehicle and the behavior state of the driver are abnormal states, it can be determined that the driver is in a preset dangerous driving state, so that intelligent driving control is performed in response to detecting that the driver of the vehicle is in the preset dangerous driving state, improving the driving safety of the vehicle.
  • the determining the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area includes: in response to detecting, according to the driving state information of the vehicle, that the vehicle is in an abnormal driving state, performing behavior detection on the driver according to the image information of the driving area; and in response to determining that the driver's behavior detection result is distracted driving or fatigued driving of the driver, determining that the driver is in a preset dangerous driving state.
  • the driver's distracted driving here may refer to behaviors of the driver such as taking the hands off the steering wheel, talking on the phone, drinking water while driving, or playing with a mobile phone while driving; fatigued driving may refer to behaviors such as closing the eyes for a long time or yawning, etc.
  • considering that an abnormal driving state of the vehicle may lead to a traffic accident, if at this time the driver's behavior detection result is that the driver is driving distracted or fatigued, it can be determined that the distracted or fatigued driving has put the vehicle in an unsafe state. Therefore, when the driving state of the vehicle is abnormal and the driver's behavior detection result is distracted or fatigued driving, it can be determined that the driver is in the preset dangerous driving state, so that intelligent driving control can intervene in time to improve safety.
  • the performing intelligent driving control includes generating alarm information based on at least one of the following: the driving state information of the vehicle; the behavior detection result of the driver; the driving state of the driver.
  • in order to achieve a better warning effect, the driver can be informed of at least one of the vehicle's driving state information, the driver's behavior detection result, and the driver's driving state, so that the driver is aware of the severity of the current dangerous driving state. For example, if the driving state information of the vehicle is "the vehicle has crossed the lane line three times in a row", and the result of behavior detection of the driver of the vehicle based on the image information of the driving area is "the driver is on the phone", the generated warning information can be: "You have crossed the lane line three times in a row, please do not make phone calls while driving."
  • the specific form of sending the alarm information may be in the form of voice or video, which is not limited in the present disclosure.
  • the warning information is generated based on at least one of the driving state information of the vehicle, the behavior detection result of the driver, and the driving state of the driver, so that the driver is informed of the unsafe state of the vehicle and the current dangerous driving state to achieve a better warning effect.
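One possible way to compose the warning text from whichever of the three inputs is available; the message wording mirrors the example above, while the function name and formatting are assumptions.

```python
from typing import Optional


def build_warning_message(vehicle_state: Optional[str] = None,
                          behavior_result: Optional[str] = None,
                          driver_state: Optional[str] = None) -> str:
    """Compose alarm text from whichever of the three inputs is available."""
    parts = []
    if vehicle_state:
        parts.append(vehicle_state)
    if behavior_result:
        parts.append(f"please stop {behavior_result} while driving")
    if driver_state:
        parts.append(f"current driver state: {driver_state}")
    return (", ".join(parts) + ".") if parts else ""


# Reproduces the example in the text.
print(build_warning_message(
    vehicle_state="You have crossed the lane line three times in a row",
    behavior_result="making phone calls"))
# "You have crossed the lane line three times in a row, please stop making phone calls while driving."
```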
  • the performing intelligent driving control includes: controlling the warning device to continuously output warning information until both the driving state of the vehicle and the driving state of the driver return to a normal state.
  • the warning effect can be further improved, the continuous monitoring and warning of the driver's state can be realized, and the driving safety can be further improved.
  • when both the driving state of the vehicle and the driving state of the driver return to the normal state, the alarm is stopped, which reduces the interference caused to the driver by unnecessary alarms.
  • the performing intelligent driving control includes: in response to detecting that the driver is in a preset dangerous driving state for a duration exceeding a preset duration, controlling the assisted driving/automatic driving function to be activated, and/or controlling the vehicle to decelerate to a stop.
  • the assisted driving/automatic driving function can be controlled to start, and the vehicle can also be controlled to decelerate and stop.
  • the assisted driving/autonomous driving function can control the accelerator, braking, and steering according to the driving sensing data collected by the vehicle's sensors to adapt the vehicle to changing traffic conditions, and may include driving functions such as adaptive cruise control, keeping the vehicle driving in its lane, and controlling the following distance between the vehicle and the vehicle in front.
  • in this way, the assisted driving/automatic driving function is controlled to start, and/or the vehicle is controlled to decelerate to a stop, so that the driving state of the vehicle is controlled when the driver has been unable to normally control the vehicle for a long time, thereby improving the driving safety of the vehicle.
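The duration-based escalation could be sketched as a small state holder, with an assumed preset duration of 10 seconds and string return values standing in for the actual control commands:

```python
import time
from typing import Optional

PRESET_DANGER_DURATION_S = 10.0   # assumed preset duration before takeover


class EscalationController:
    """Keep alarming while the dangerous state lasts; escalate to assisted/automatic
    driving or deceleration once it persists longer than the preset duration."""

    def __init__(self) -> None:
        self._danger_since: Optional[float] = None

    def update(self, driver_dangerous: bool, now: Optional[float] = None) -> str:
        now = time.monotonic() if now is None else now
        if not driver_dangerous:
            self._danger_since = None               # back to normal: stop alarming
            return "stop_alarm"
        if self._danger_since is None:
            self._danger_since = now
        if now - self._danger_since > PRESET_DANGER_DURATION_S:
            # hand over to assisted/automatic driving and/or decelerate to a stop
            return "activate_assisted_driving_or_decelerate"
        return "keep_alarming"


ctrl = EscalationController()
print(ctrl.update(True, now=0.0))     # keep_alarming
print(ctrl.update(True, now=12.0))    # activate_assisted_driving_or_decelerate
print(ctrl.update(False, now=13.0))   # stop_alarm
```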
  • the method further includes: acquiring environmental information around the vehicle;
  • the determining of the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area includes: according to the environment information around the vehicle, the driving state information of the vehicle and the The image information of the driving area determines the driving state of the driver of the vehicle.
  • the driving state of the vehicle may be affected by the surrounding environment of the vehicle.
  • the speed of the vehicle may be fast or slow due to road congestion, and the vehicle swaying left and right may be due to the vehicle driving on a curved road. Therefore, in order to further improve the accuracy of the determined driving state of the vehicle driver, the driving state of the vehicle driver may be determined according to the surrounding environment information of the vehicle, the driving state information of the vehicle, and the image information of the driving area.
  • the environmental information around the vehicle may include road condition information of the road on which the vehicle travels; the road condition information may represent the condition of the road on which the vehicle travels, and may include at least one of information indicating the degree of road congestion, the shape of the road, the type of the road, the speed limit of the road, and the like.
  • the road conditions can be divided into multiple grades, such as very crowded, crowded, unobstructed, etc.
  • the shape of the road can be divided into curved and straight, and the type of road can be divided into urban roads, rural roads, highways, and so on.
  • the acquiring the environmental information around the vehicle includes: acquiring road condition information of the road on which the vehicle travels; and the determining the driving state of the driver of the vehicle according to the environmental information around the vehicle, the driving state information of the vehicle, and the image information of the driving area includes: in response to the driving state information of the vehicle and the road condition information not satisfying a preset matching relationship, and the result of detecting the behavior of the driver of the vehicle based on the image information of the driving area indicating that the behavior state of the driver is an abnormal state, determining that the driver is in a preset dangerous driving state.
  • the road condition information of the road can be obtained through a third-party platform according to the current geographic location of the vehicle, for example, through a navigation service platform.
  • the preset matching relationship refers to the matching relationship between the road condition information and the driving state information of the vehicle that runs safely under the road conditions represented by the road condition information.
  • the preset matching relationship may include at least one of the following: the matching relationship between "unobstructed" road conditions and the vehicle running at a constant speed; the matching relationship between straight road conditions and a left-right sway amplitude of the vehicle that does not exceed the set amplitude threshold; and the matching relationship between the road speed-limit information and the vehicle driving within the speed limit.
  • cases in which the driving state information of the vehicle and the road condition information do not satisfy the preset matching relationship may include, for example: the left-right sway amplitude of the vehicle on a straight road section reaches the set amplitude threshold, the vehicle speed fluctuates between fast and slow on an unobstructed road section, the vehicle exceeds the speed limit, and so on, all of which indicate that the vehicle is in an abnormal driving state.
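A sketch of the preset matching-relationship checks listed above, with assumed data classes and thresholds; returning False marks a mismatch, i.e. an abnormal driving state.

```python
from dataclasses import dataclass

SWAY_AMPLITUDE_THRESHOLD_M = 0.5      # assumed
SPEED_VARIATION_THRESHOLD_KMH = 20.0  # assumed "fast or slow" spread on a clear road


@dataclass
class RoadCondition:
    congestion: str          # "very_crowded" / "crowded" / "unobstructed"
    shape: str               # "straight" / "curved"
    speed_limit_kmh: float


@dataclass
class VehicleObservation:
    speed_kmh: float
    speed_variation_kmh: float   # spread of the speed over the observation window
    sway_amplitude_m: float


def matches_road_condition(vehicle: VehicleObservation, road: RoadCondition) -> bool:
    """Check the preset matching relationship; False means the driving state does
    not match the road conditions, i.e. an abnormal driving state."""
    if road.congestion == "unobstructed" and vehicle.speed_variation_kmh > SPEED_VARIATION_THRESHOLD_KMH:
        return False        # speed keeps jumping although the road is clear
    if road.shape == "straight" and vehicle.sway_amplitude_m >= SWAY_AMPLITUDE_THRESHOLD_M:
        return False        # swaying left and right on a straight section
    if vehicle.speed_kmh > road.speed_limit_kmh:
        return False        # exceeding the road speed limit
    return True


print(matches_road_condition(VehicleObservation(80.0, 5.0, 0.1),
                             RoadCondition("unobstructed", "straight", 100.0)))  # True
```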
  • in this way, when the vehicle is in an abnormal driving state and the driver's behavior state is abnormal, it can be determined that the driver is in a preset dangerous driving state, which improves the accuracy of the determined driving state of the driver and further improves the driving safety of the vehicle.
  • the determining the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area includes: inputting the obtained driving state information of the vehicle and the image information of the driving area into a trained neural network, and determining the driving state of the driver of the vehicle via the neural network.
  • the driving state of the vehicle driver is determined through the neural network, which can improve the accuracy and speed of determining the driving state.
  • the neural network can be pre-trained and deployed, and can quickly detect images or video streams with a large amount of data, so it can be applied to intelligent driving control in real-time driving scenarios.
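The disclosure does not specify a network architecture; the following PyTorch sketch only illustrates one way a trained neural network could fuse the two inputs (a driving-area image plus a driving-state feature vector) into a driver-state classification.

```python
import torch
import torch.nn as nn


class DriverStateNet(nn.Module):
    """A minimal fusion network: a small CNN encodes the driving-area image and
    an MLP encodes the vehicle driving-state vector; both are concatenated and
    classified into driver states (e.g. normal / distracted / mild / moderate / severe fatigue)."""

    def __init__(self, state_dim: int = 8, num_classes: int = 5) -> None:
        super().__init__()
        self.image_encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),           # -> (batch, 32)
        )
        self.state_encoder = nn.Sequential(nn.Linear(state_dim, 32), nn.ReLU())
        self.classifier = nn.Linear(32 + 32, num_classes)

    def forward(self, image: torch.Tensor, state: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.image_encoder(image), self.state_encoder(state)], dim=1)
        return self.classifier(fused)


# Shapes only, to show how the two inputs are combined.
net = DriverStateNet()
logits = net(torch.randn(2, 3, 224, 224), torch.randn(2, 8))
print(logits.shape)  # torch.Size([2, 5])
```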
  • the image information of the driving area can be collected by the DMS camera
  • the processor can obtain the image information of the driving area from the DMS camera, detect the behavior of the driver of the vehicle based on the image information of the driving area, and obtain the behavior detection result.
  • the ADAS camera collects image information outside the cabin
  • the processor can obtain the image information outside the cabin from the ADAS camera to determine the driving state information of the vehicle.
  • in an example, if it is determined from the image information outside the cabin that the vehicle has crossed the lane line three times in a row, and the behavior detection result is that the driver is on the phone, a voice warning message is generated: "You have crossed the lane line three times in a row, please do not make phone calls while driving", and a voice broadcast is made.
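The worked example above, reduced to a single decision function; the three-crossing threshold and the message text follow the example, while the interface is assumed.

```python
def run_control_cycle(dms_result: str, lane_crossings: int,
                      crossing_threshold: int = 3) -> str:
    """One cycle of the described pipeline: the ADAS camera yields the lane-crossing
    count, the DMS camera yields the behavior detection result, and a voice warning
    is produced when both indicate a dangerous situation."""
    if lane_crossings >= crossing_threshold and dms_result == "on_the_phone":
        return ("You have crossed the lane line three times in a row, "
                "please do not make phone calls while driving")
    return ""


message = run_control_cycle(dms_result="on_the_phone", lane_crossings=3)
if message:
    print("voice broadcast:", message)   # a TTS engine would speak this in a real system
```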
  • the present disclosure also provides intelligent driving control apparatuses, electronic devices, computer-readable storage media, and programs, all of which can be used to implement any intelligent driving control method provided by the present disclosure; the corresponding descriptions will not be repeated here.
  • FIG. 2 shows a block diagram of an intelligent driving control device according to an embodiment of the present disclosure. As shown in FIG. 2 , the device includes:
  • the first acquisition module 21 is used to acquire the driving state information of the vehicle
  • a second acquisition module 22 configured to acquire image information of the driving area of the vehicle
  • a driving state determination module 23 configured to determine the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area;
  • the control module 24 is configured to perform intelligent driving control in response to detecting that the driver of the vehicle is in a preset dangerous driving state.
  • the driving state determination module 23 includes a first driving state determination submodule and a second driving state determination submodule, wherein:
  • the first driving state determination sub-module is configured to perform behavior detection on the driver of the vehicle according to the image information of the driving area;
  • the second driving state determination sub-module is configured to determine the driving state of the driver according to the driving state information of the vehicle and the behavior detection result of the driver.
  • the first driving state determination sub-module is configured to determine the danger level corresponding to the behavior detection result of the driver
  • the second driving state determination submodule is used for:
  • the second driving state determination sub-module is configured to determine, in response to determining according to the driving state information of the vehicle and the behavior detection result of the driver that both the driving state of the vehicle and the behavior state of the driver are abnormal states, that the driver is in a preset dangerous driving state.
  • the driving state determination module 23 includes a third driving state determination submodule and a fourth driving state determination submodule, wherein:
  • the third driving state determination sub-module is configured to perform behavior detection of the driver according to the image information of the driving area in response to detecting that the vehicle is in an abnormal driving state according to the driving state information of the vehicle;
  • the fourth driving state determination sub-module is configured to determine that the driver is in a preset dangerous driving state in response to determining that the driver's behavior detection result is the driver's distracted driving or fatigued driving.
  • the abnormal driving state of the vehicle includes at least one of the following:
  • the number of times the vehicle crosses the lane line within the first set time period reaches the set lane-crossing count threshold; the left-right sway of the vehicle reaches the set amplitude threshold; the vehicle does not proceed as indicated by traffic signs or traffic lights; the speed of the vehicle exceeds the set speed threshold.
  • control module 24 is configured to generate alarm information based on at least one of the following: the driving state information of the vehicle; the behavior detection result of the driver; the driving state of the driver.
  • control module 24 is configured to control the warning device to continuously output warning information until both the driving state of the vehicle and the driving state of the driver return to the normal state.
  • control module 24 is configured to, in response to detecting that the driver is in a preset dangerous driving state for a duration exceeding a preset period of time, control the assisted driving/automatic driving function to be activated and/or control the vehicle to decelerate to a stop.
  • the apparatus further includes:
  • a third acquiring module configured to acquire environmental information around the vehicle
  • the driving state determination module 23 is configured to determine the driving state of the driver of the vehicle according to the environmental information around the vehicle, the driving state information of the vehicle and the image information of the driving area.
  • the environmental information includes road condition information
  • the third obtaining module is configured to obtain road condition information of the road on which the vehicle is traveling;
  • the driving state determination module 23 is configured to, in response to the driving state information of the vehicle and the road condition information not satisfying a preset matching relationship and the result of detecting the behavior of the driver of the vehicle based on the image information of the driving area indicating that the behavior state of the driver is an abnormal state, determine that the driver is in a preset dangerous driving state.
  • the first acquisition module 21 includes a sensor data acquisition sub-module and a driving state information determination sub-module, wherein:
  • the sensing data acquisition sub-module is used for acquiring driving sensing data, and the sensing data includes at least one of the following: image information collected by the vehicle's advanced driving assistance system, image information collected by the vehicle's driving recorder, and speed information sensed by the vehicle's speed sensor;
  • the driving state information determination sub-module is configured to determine the driving state information of the vehicle according to the sensing data.
  • the driving state determination module 23 is configured to input the acquired driving state information of the vehicle and the image information of the driving area into a trained neural network, and determine the driving state of the driver of the vehicle through the neural network.
  • the functions or modules included in the apparatuses provided in the embodiments of the present disclosure may be used to execute the methods described in the above method embodiments.
  • the embodiments of the present disclosure also provide a vehicle, including:
  • a first sensor used for collecting image information of the driving area of the vehicle
  • a controller configured to acquire the driving state information of the vehicle, determine the driving state of the driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area, and perform intelligent driving control in response to detecting that the driver of the vehicle is in a preset dangerous driving state.
  • the vehicle of this embodiment determines the driving state of the driver of the vehicle by combining the driving state of the vehicle and the image information of the driving area, thereby improving the accuracy of determining the driving state of the driver.
  • the vehicle further includes a second sensor for collecting driving sensor data; the controller is used for acquiring driving state information of the vehicle according to the driving sensor data.
  • Embodiments of the present disclosure further provide a computer-readable storage medium, on which computer program instructions are stored, and when the computer program instructions are executed by a processor, the foregoing method is implemented.
  • the computer-readable storage medium may be a volatile computer-readable storage medium or a non-volatile computer-readable storage medium.
  • An embodiment of the present disclosure further provides an electronic device, including: a processor; a memory for storing instructions executable by the processor; wherein the processor is configured to invoke the instructions stored in the memory to execute the above method.
  • Embodiments of the present disclosure also provide a computer program product including computer-readable code. When the computer-readable code runs on a device, a processor in the device executes instructions for implementing the intelligent driving control method provided by any of the above embodiments.
  • Embodiments of the present disclosure further provide another computer program product for storing computer-readable instructions, which, when executed, cause the computer to execute the operations of the intelligent driving control method provided by any of the foregoing embodiments.
  • the electronic device may be provided as a terminal, server or other form of device.
  • FIG. 3 shows a block diagram of an electronic device 800 according to an embodiment of the present disclosure.
  • the electronic device 800 may be a terminal such as a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, fitness device, or personal digital assistant.
  • the electronic device 800 may include one or more of the following components: a processing component 802, a memory 804, a power supply component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
  • the processing component 802 generally controls the overall operation of the electronic device 800, such as operations associated with display, phone calls, data communications, camera operations, and recording operations.
  • the processing component 802 can include one or more processors 820 to execute instructions to perform all or some of the steps of the methods described above.
  • processing component 802 may include one or more modules that facilitate interaction between processing component 802 and other components.
  • processing component 802 may include a multimedia module to facilitate interaction between multimedia component 808 and processing component 802.
  • Memory 804 is configured to store various types of data to support operation of the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk.
  • Power supply assembly 806 provides power to various components of electronic device 800 .
  • Power supply components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power to electronic device 800 .
  • Multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and the user.
  • the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user.
  • the touch panel includes one or more touch sensors to sense touch, swipe, and gestures on the touch panel. The touch sensor may not only sense the boundaries of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action.
  • the multimedia component 808 includes a front-facing camera and/or a rear-facing camera. When the electronic device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each of the front and rear cameras can be a fixed optical lens system or have focal length and optical zoom capability.
  • Audio component 810 is configured to output and/or input audio signals.
  • audio component 810 includes a microphone (MIC) that is configured to receive external audio signals when electronic device 800 is in operating modes, such as calling mode, recording mode, and voice recognition mode.
  • the received audio signal may be further stored in memory 804 or transmitted via communication component 816 .
  • audio component 810 also includes a speaker for outputting audio signals.
  • the I/O interface 812 provides an interface between the processing component 802 and a peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to: home button, volume buttons, start button, and lock button.
  • Sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of electronic device 800 .
  • the sensor assembly 814 can detect the on/off state of the electronic device 800 and the relative positioning of components, for example, the display and keypad of the electronic device 800; the sensor assembly 814 can also detect a change in the position of the electronic device 800 or of a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and changes in the temperature of the electronic device 800.
  • Sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact.
  • Sensor assembly 814 may also include a light sensor, such as a complementary metal oxide semiconductor (CMOS) or charge coupled device (CCD) image sensor, for use in imaging applications.
  • the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 816 is configured to facilitate wired or wireless communication between electronic device 800 and other devices.
  • the electronic device 800 may access a wireless network based on a communication standard, such as wireless network (WiFi), second generation mobile communication technology (2G) or third generation mobile communication technology (3G), or a combination thereof.
  • the communication component 816 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel.
  • the communication component 816 also includes a near field communication (NFC) module to facilitate short-range communication.
  • the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
  • the electronic device 800 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
  • a non-volatile computer-readable storage medium such as a memory 804 comprising computer program instructions executable by the processor 820 of the electronic device 800 to perform the above method is also provided.
  • FIG. 4 shows a block diagram of an electronic device 1900 according to an embodiment of the present disclosure.
  • the electronic device 1900 may be provided as a server.
  • electronic device 1900 includes processing component 1922, which further includes one or more processors, and a memory resource represented by memory 1932 for storing instructions executable by processing component 1922, such as applications.
  • An application program stored in memory 1932 may include one or more modules, each corresponding to a set of instructions.
  • the processing component 1922 is configured to execute instructions to perform the above-described methods.
  • the electronic device 1900 may also include a power supply assembly 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input output (I/O) interface 1958 .
  • the electronic device 1900 can operate based on an operating system stored in the memory 1932, such as the Microsoft server operating system (Windows Server™), the graphical-user-interface-based operating system from Apple (Mac OS X™), the multi-user multi-process computer operating system (Unix™), the free and open-source Unix-like operating system (Linux™), the open-source Unix-like operating system (FreeBSD™), or the like.
  • a non-volatile computer-readable storage medium is also provided, such as the memory 1932 comprising computer program instructions executable by the processing component 1922 of the electronic device 1900 to perform the above-described method.
  • the present disclosure may be a system, method and/or computer program product.
  • the computer program product may include a computer-readable storage medium having computer-readable program instructions loaded thereon for causing a processor to implement various aspects of the present disclosure.
  • a computer-readable storage medium may be a tangible device that can hold and store instructions for use by the instruction execution device.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • A non-exhaustive list of more specific examples of the computer-readable storage medium includes: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions stored thereon, and any suitable combination of the above.
  • Computer-readable storage media are not to be construed as transient signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., light pulses through fiber-optic cables), or electrical signals transmitted through wires.
  • the computer readable program instructions described herein may be downloaded to various computing/processing devices from a computer readable storage medium, or to an external computer or external storage device over a network such as the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer-readable program instructions from a network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in each computing/processing device .
  • Computer program instructions for carrying out operations of the present disclosure may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages.
  • the computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (e.g., through the Internet using an Internet service provider).
  • custom electronic circuits, such as programmable logic circuits, field-programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), can be personalized by utilizing state information of the computer-readable program instructions, and these electronic circuits can execute the computer-readable program instructions to implement various aspects of the present disclosure.
  • These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • These computer-readable program instructions can also be stored in a computer-readable storage medium; these instructions cause a computer, a programmable data processing apparatus, and/or other equipment to operate in a specific manner, so that the computer-readable medium on which the instructions are stored comprises an article of manufacture that includes instructions implementing various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • Computer-readable program instructions can also be loaded onto a computer, other programmable data processing apparatus, or other equipment, causing a series of operational steps to be performed on the computer, other programmable data processing apparatus, or other equipment to produce a computer-implemented process, so that the instructions executed on the computer, other programmable data processing apparatus, or other equipment implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by dedicated hardware-based systems that perform the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
  • the computer program product can be specifically implemented by hardware, software or a combination thereof.
  • in one optional embodiment, the computer program product is embodied as a computer storage medium, and in another optional embodiment, the computer program product is embodied as a software product, such as a software development kit (SDK).

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The present disclosure relates to an intelligent driving control method and apparatus, a vehicle, an electronic device, and a storage medium. The method includes: acquiring driving state information of a vehicle; acquiring image information of a driving area of the vehicle; determining a driving state of a driver of the vehicle according to the driving state information of the vehicle and the image information of the driving area; and performing intelligent driving control in response to detecting that the driver of the vehicle is in a preset dangerous driving state.

Description

智能驾驶控制方法及装置、车辆、电子设备和存储介质 技术领域
本公开涉及计算机技术领域,尤其涉及一种智能驾驶控制方法及装置、车辆、电子设备和存储介质。
背景技术
在车辆驾驶过程中,驾驶员处于危险驾驶状态对行车安全造成隐患,容易引起交通事故。因此,如何准确地检测驾驶员的危险驾驶状态并及时采取措施,对于安全驾驶具有重要意义。
发明内容
本公开提出了一种智能驾驶控制技术方案。
根据本公开的一方面,提供了一种智能驾驶控制方法,包括:
获取车辆的行驶状态信息;获取所述车辆的驾驶区域的影像信息;根据所述车辆的行驶状态信息和所述驾驶区域的影像信息,确定所述车辆的驾驶员的驾驶状态;响应于检测到所述车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。
在一种可能的实现方式中,所述根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:
根据所述驾驶区域的影像信息对所述车辆的驾驶员进行行为检测;
根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态。
在一种可能的实现方式中,所述对所述车辆的驾驶员进行行为检测,包括:
确定所述驾驶员的行为检测结果对应的危险级别;
所述根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态,包括:
响应于根据所述车辆的行驶状态信息确定所述车辆处于异常行驶状态,将所述驾驶员的行为检测结果的危险级别升级为第一危险级别;
响应于确定所述第一危险级别达到预设的告警级别,确定所述驾驶员的驾驶状态处于预设的危险驾驶状态。
在一种可能的实现方式中,所述根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态,包括:
响应于根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述车辆的行驶状态和所述驾驶员的行为状态均为异常状态,确定检测到所述驾驶员处于预设的危险驾驶状态。
在一种可能的实现方式中,所述根据所述车辆的行驶状态信息和所述驾驶区域的影 像信息确定所述车辆的驾驶员的驾驶状态,包括:
响应于根据所述车辆的行驶状态信息检测到所述车辆处于异常行驶状态,根据所述驾驶区域的影像信息对所述驾驶员进行行为检测;
响应于确定所述驾驶员的行为检测结果为驾驶员分心驾驶或疲劳驾驶,确定所述驾驶员处于预设的危险驾驶状态。
在一种可能的实现方式中,所述车辆的异常行驶状态包括以下至少一项:
车辆在第一设定时长内的压线次数达到设定压线次数阈值;车辆的左右摆动幅度达到设定幅度阈值;车辆未按照交通标志或交通信号灯的指示通行;车辆的车速超出设定速度阈值。
在一种可能的实现方式中,所述进行智能驾驶控制,包括基于以下至少一项生成告警信息:
所述车辆的行驶状态信息;
基于所述驾驶区域的影像信息对所述车辆的驾驶员的行为检测结果;
所述驾驶员的驾驶状态。
在一种可能的实现方式中,所述进行智能驾驶控制包括:
控制告警设备持续输出告警信息,直至所述车辆行驶状态和驾驶员的驾驶状态均恢复正常状态。
在一种可能的实现方式中,所述进行智能驾驶控制包括:
响应于检测到所述驾驶员处于预设的危险驾驶状态的持续时长超过预设时长,控制辅助驾驶/自动驾驶功能启动,和/或控制所述车辆减速停车。
在一种可能的实现方式中,所述方法还包括:
获取所述车辆周围的环境信息;
所述根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:
根据所述车辆周围的环境信息、所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态。
在一种可能的实现方式中,所述环境信息包括路况信息;
所述获取所述车辆周围的环境信息,包括:获取所述车辆行驶的道路的路况信息;
所述根据所述车辆周围的环境信息、所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:
响应于所述车辆的行驶状态信息与所述路况信息不满足预设的匹配关系,且基于所述驾驶区域的影像信息对所述车辆的驾驶员的行为检测结果指示所述驾驶员的行为状态为异常状态,确定检测到所述驾驶员处于预设的危险驾驶状态。
在一种可能的实现方式中,所述获取车辆的行驶状态信息,包括:
获取行车传感数据,所述传感数据包括以下至少一项:车辆的高级驾驶辅助***采集的影像信息、车辆的行车记录仪采集的影像信息、车辆的速度传感器感知的速度信息;
根据所述传感数据,确定所述车辆的行驶状态信息。
在一种可能的实现方式中,所述根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:
将获取的所述车辆的行驶状态信息、和驾驶区域的影像信息,输入经过训练的神经网络,经由所述神经网络,确定所述车辆的驾驶员的驾驶状态。
根据本公开的一方面,提供了一种智能驾驶控制装置,包括:
第一获取模块,用于获取车辆的行驶状态信息;
第二获取模块,用于获取所述车辆的驾驶区域的影像信息;
驾驶状态确定模块,用于根据所述车辆的行驶状态信息和所述驾驶区域的影像信息,确定所述车辆的驾驶员的驾驶状态;
控制模块,用于响应于检测到所述车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。
在一种可能的实现方式中,所述驾驶状态确定模块包括第一驾驶状态确定子模块和第二驾驶状态确定子模块,其中:
所述第一驾驶状态确定子模块,用于根据所述驾驶区域的影像信息对所述车辆的驾驶员进行行为检测;
所述第二驾驶状态确定子模块,用于根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态。
在一种可能的实现方式中,所述第一驾驶状态确定子模块,用于确定所述驾驶员的行为检测结果对应的危险级别;
所述第二驾驶状态确定子模块,用于:
响应于根据所述车辆的行驶状态信息确定所述车辆处于异常行驶状态,将所述驾驶员的行为检测结果的危险级别升级为第一危险级别;
响应于确定所述第一危险级别达到预设的告警级别,确定所述驾驶员的驾驶状态处于预设的危险驾驶状态。
在一种可能的实现方式中,所述第二驾驶状态确定子模块,用于响应于根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述车辆的行驶状态和所述驾驶员的行为状态均为异常状态,确定检测到所述驾驶员处于预设的危险驾驶状态。
在一种可能的实现方式中,所述驾驶状态确定模块包括第三驾驶状态确定子模块和第四驾驶状态确定子模块,其中:
所述第三驾驶状态确定子模块,用于响应于根据所述车辆的行驶状态信息检测到所述车辆处于异常行驶状态,根据所述驾驶区域的影像信息对所述驾驶员进行行为检测;
所述第四驾驶状态确定子模块,用于响应于确定所述驾驶员的行为检测结果为驾驶员分心驾驶或疲劳驾驶,确定所述驾驶员处于预设的危险驾驶状态。
在一种可能的实现方式中,所述车辆的异常行驶状态包括以下至少一项:
车辆在第一设定时长内的压线次数达到设定压线次数阈值;车辆的左右摆动幅度达 到设定幅度阈值;车辆未按照交通标志或交通信号灯的指示通行;车辆的车速超出设定速度阈值。
在一种可能的实现方式中,所述控制模块,用于基于以下至少一项生成告警信息:
所述车辆的行驶状态信息;
基于所述驾驶区域的影像信息对所述车辆的驾驶员的行为检测结果;
所述驾驶员的驾驶状态。
在一种可能的实现方式中,所述控制模块,用于控制告警设备持续输出告警信息,直至所述车辆行驶状态和驾驶员的驾驶状态均恢复正常状态。
在一种可能的实现方式中,所述控制模块,用于响应于检测到所述驾驶员处于预设的危险驾驶状态的持续时长超过预设时长,控制辅助驾驶/自动驾驶功能启动,和/或控制所述车辆减速停车。
在一种可能的实现方式中,所述装置还包括:
第三获取模块,用于获取所述车辆周围的环境信息;
所述驾驶状态确定模块,用于根据所述车辆周围的环境信息、所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态。
在一种可能的实现方式中,所述环境信息包括路况信息;
所述第三获取模块,用于获取所述车辆行驶的道路的路况信息;
所述驾驶状态确定模块,用于响应于所述车辆的行驶状态信息与所述路况信息不满足预设的匹配关系,且基于所述驾驶区域的影像信息对所述车辆的驾驶员的行为检测结果指示所述驾驶员的行为状态为异常状态,确定检测到所述驾驶员处于预设的危险驾驶状态。
在一种可能的实现方式中,所述第一获取模块包括传感数据获取子模块和行驶状态信息确定子模块,其中:
所述传感数据获取子模块,用于获取行车传感数据,所述传感数据包括以下至少一项:车辆的高级驾驶辅助***采集的影像信息、车辆的行车记录仪采集的影像信息、车辆的速度传感器感知的速度信息;
所述行驶状态信息确定子模块,用于根据所述传感数据,确定所述车辆的行驶状态信息。
在一种可能的实现方式中,所述驾驶状态确定模块,用于将获取的所述车辆的行驶状态信息、和驾驶区域的影像信息,输入经过训练的神经网络,经由所述神经网络,确定所述车辆的驾驶员的驾驶状态。
一种车辆,其特征在于,包括:
第一传感器,用于采集车辆的驾驶区域的影像信息;
控制器,用于获取车辆的行驶状态信息,根据所述车辆的行驶状态信息和所述驾驶区域的影像信息,确定所述车辆的驾驶员的驾驶状态,并响应于检测到所述车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。
在一种可能的实现方式中,所述车辆还包括:第二传感器,用于采集行车传感数据;
所述控制器用于根据所述行车传感数据获取所述车辆的行驶状态信息。
根据本公开的一方面,提供了一种电子设备,包括:处理器;用于存储处理器可执行指令的存储器;其中,所述处理器被配置为调用所述存储器存储的指令,以执行上述方法。
根据本公开的一方面,提供了一种计算机可读存储介质,其上存储有计算机程序指令,所述计算机程序指令被处理器执行时实现上述方法。
根据本公开的一方面,提供了一种计算机程序,包括计算机可读代码,当所述计算机可读代码在电子设备中运行时,所述电子设备中的处理器执行用于实现上述方法。
在本公开实施例中,通过获取车辆的行驶状态信息,以及车辆的驾驶区域的影像信息,根据车辆的行驶状态信息和驾驶区域的影像信息,确定车辆的驾驶员的驾驶状态,并响应于检测到车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。由此可以结合车辆的行驶状态与驾驶区域的影像信息,确定车辆的驾驶员的驾驶状态,提高了确定的驾驶员驾驶状态的准确性。另外,响应于检测到车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制,进一步提高了车辆的驾驶安全。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,而非限制本公开。根据下面参考附图对示例性实施例的详细说明,本公开的其它特征及方面将变得清楚。
附图说明
此处的附图被并入说明书中并构成本说明书的一部分,这些附图示出了符合本公开的实施例,并与说明书一起用于说明本公开的技术方案。
图1示出本公开实施例提供的智能驾驶控制方法的流程图。
图2示出本公开实施例提供的智能驾驶控制装置的框图。
图3示出本公开实施例提供的一种电子设备800的框图。
图4示出本公开实施例提供的一种电子设备1900的框图。
具体实施方式
以下将参考附图详细说明本公开的各种示例性实施例、特征和方面。附图中相同的附图标记表示功能相同或相似的元件。尽管在附图中示出了实施例的各种方面,但是除非特别指出,不必按比例绘制附图。
在这里专用的词“示例性”意为“用作例子、实施例或说明性”。这里作为“示例性”所说明的任何实施例不必解释为优于或好于其它实施例。
本文中术语“和/或”,仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,本文中术语“至少一种”表示多种中的任意一种或多种中的至少两种的任意组合, 例如,包括A、B、C中的至少一种,可以表示包括从A、B和C构成的集合中选择的任意一个或多个元素。
另外,为了更好地说明本公开,在下文的具体实施方式中给出了众多的具体细节。本领域技术人员应当理解,没有某些具体细节,本公开同样可以实施。在一些实例中,对于本领域技术人员熟知的方法、手段、元件和电路未作详细描述,以便于凸显本公开的主旨。
本公开提供一种智能驾驶控制方法,该方法的执行主体可以是安装于车辆上的智能驾驶控制装置。在一种可能的实现方式中,该方法可以由终端设备或服务器或其它处理设备执行。其中,终端设备可以是车载设备、用户设备(User Equipment,UE)、移动设备、用户终端、终端、蜂窝电话、无绳电话、个人数字助理(Personal Digital Assistant,PDA)、手持设备、计算设备或者可穿戴设备等。其中,车载设备可以是车舱内的车机或者域控制器,还可以是ADAS(Advanced Driving Assistance System,高级驾驶辅助***)、OMS(Occupant Monitoring System,乘员监控***)或者DMS(Driver Monitoring System,驾驶员监控***)中用于执行智能驾驶控制方法的设备主机等。在一些可能的实现方式中,所述智能驾驶控制方法可以通过处理器调用存储器中存储的计算机可读指令的方式来实现。
图1示出根据本公开实施例的智能驾驶控制方法的流程图,如图1所示,所述智能驾驶控制方法包括:
在步骤S11中,获取车辆的行驶状态信息;
这里的车辆可以是私家车、共享汽车、网约车、出租车、货车等类型的车辆中的至少一种车辆,本公开对车辆的具体类型不作限定。
行驶状态信息用于表征车辆的行驶状态,在一种可能的实现方式中,行驶状态可分为正常行驶状态和异常行驶状态,正常行驶状态可以是车辆按预定的速度、方向、路线和交通法规等规则来行驶。而异常行驶状态与正常行驶状态相对应,可以是未按照预定的速度、方向、路线和交通法规等规则来行驶。
行驶状态信息可以包括表征车辆的正常行驶状态或异常行驶状态的信息,例如车辆的速度、方向、加速度、所处车道的信息、压线信息、车身摆动信息、变道信息、变速信息、刹车信息、制动信息、路线信息、超速情况、行驶状态与交通标志的一致性信息等中的至少一项。
在一种可能的实现方式中,所述车辆的异常行驶状态包括以下至少一项:车辆在第一设定时长内的压线次数达到设定压线次数阈值;车辆的左右摆动幅度达到设定幅度阈值;车辆未按照交通标志或交通信号灯的指示通行;车辆的车速超出设定速度阈值。
其中,车辆在第一设定时长内的压线次数达到设定压线次数阈值,表明车辆在较短的时间内连续压线,这种情况即可确定车辆处于异常行驶状态。第一设定时长可以根据实际情况来设定,例如可以是1分钟或15秒,设定压线次数阈值也可以根据实际情况来设定,例如可以是2次,本公开对此不作具体限定。
车辆的左右摆动幅度达到设定幅度阈值,表明车身左右摆动的幅度较大,这种情况即可确定车辆处于异常行驶状态。设定幅度阈值可以根据车辆的类型等实际情况来设定,本公开对此不作具体限定。
另外,对于车辆未按照交通标志或交通信号灯的指示通行,以及车辆的车速超出设定速度阈值,也表明车辆处于异常行驶状态,此处不做赘述。
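For illustration, the abnormal-driving criteria described above can be written as a few simple predicates. The sketch below uses the 15-second window and the two-press threshold given as examples in the text; the swing-amplitude and speed limits are placeholders, since the disclosure leaves the concrete thresholds open.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DrivingSample:
    timestamp: float          # seconds
    speed_kmh: float
    swing_amplitude_m: float  # lateral sway of the vehicle body
    pressed_line: bool
    obeyed_signal: bool       # traffic signs / signal lights respected

def is_abnormal_driving(samples: List[DrivingSample],
                        window_s: float = 15.0,      # example "first set duration"
                        max_line_presses: int = 2,   # example threshold from the text
                        max_swing_m: float = 0.5,    # placeholder
                        speed_limit_kmh: float = 120.0) -> bool:  # placeholder
    """True if any of the four abnormal-driving criteria is met."""
    if not samples:
        return False
    latest = samples[-1].timestamp
    recent = [s for s in samples if latest - s.timestamp <= window_s]
    return (sum(s.pressed_line for s in recent) >= max_line_presses
            or any(s.swing_amplitude_m >= max_swing_m for s in recent)
            or any(not s.obeyed_signal for s in recent)
            or any(s.speed_kmh > speed_limit_kmh for s in recent))
```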
车辆的行驶状态信息可以依据车辆的行驶速度、行驶方向、位置以及车辆的周围环境等信息来确定,这些信息可以利用用于对车舱外部环境信息进行感知的传感器和/或用于对车辆的运行状态进行感知的传感器来获取。
在一种可选的实现方式中,所述获取车辆的行驶状态信息,包括:获取行车传感数据,所述传感数据包括以下至少一项:车辆的高级驾驶辅助***ADAS采集的影像信息、车辆的行车记录仪采集的影像信息、车辆的速度传感器感知的速度信息;根据所述传感数据,确定所述车辆的行驶状态信息。
例如,通过车身上的图像采集装置、毫米波/激光雷达、声纳传感器等,来获取车舱外部的图像、温度、压力等用于监测汽车状态的信息,这些传感器可以设置于车辆的前后保险杠、侧视镜、驾驶杆内部或者挡风玻璃上,以采集车舱外部的感知数据。通过这些感知数据能够检测车辆周围的环境,包括车辆周围的行人、车辆等对象,以及车道线、交通信号灯、交通标志灯等信息。另外,也可以通过速度/加速度传感器、姿态传感器、定位装置等来确定车辆的速度、加速度、位置和姿态等信息。
在步骤S12中,获取所述车辆的驾驶区域的影像信息;
这里的驾驶区域可以是车舱内驾驶员所在的区域,或者包含驾驶员所在区域的任意区域,该区域往往是主驾驶座的区域。
那么,驾驶区域的影像信息可以是车舱内部驾驶员所在区域的影像信息,该影像信息可以通过设置于车辆的车舱内或车舱外的车载影像采集设备来采集,车载影像采集设备可以是车载摄像头或配置有摄像头的影像采集装置。该摄像头既可以是用于采集车辆内部影像信息的摄像头,也可是用于采集车辆外部影像信息的摄像头。
例如,该摄像头可以包括DMS中的摄像头和/或OMS中的摄像头等,这些摄像头可以用于采集车辆内部的影像信息;该摄像头还可以包括ADAS中的摄像头,可以用于采集车辆外部的影像信息。当然,车载影像采集设备也可以是其它***中的摄像头,或者也可以是单独配置的摄像头,本公开实施例对具体的车载影像采集设备不做限定。
这里的影像信息的载体可以是二维图像或视频,例如,影像信息具体可以是可见光图像/视频,或者红外光图像/视频;也可以是雷达扫描的点云构成的三维影像,等等,具体可依据实际应用场景而定,本公开对此不做限定。
可以通过与车载影像采集设备之间建立的通信连接获取其采集的影像信息。在一个示例中,车载影像采集设备可以实时地将采集的影像信息通过总线或无线通信通道传输至车载控制器或远程服务器,车载控制器或远程服务器可以通过总线或无线通信通道接收实时的影像信息。
在步骤S13中,根据所述车辆的行驶状态信息和所述驾驶区域的影像信息,确定所述车辆的驾驶员的驾驶状态;
驾驶员的驾驶状态可以是驾驶员在驾驶车辆时的状态,驾驶状态可分为正常驾驶状态和危险驾驶状态。危险驾驶状态可以是驾驶员存在预设的不规范驾驶行为的状态,例如,打电话、看手机、手脱离方向盘、疲劳驾驶等行为下的状态。
在本公开实施例中,考虑到在驾驶员存在不规范驾驶行为的情况下,往往会伴随着车辆行驶状态的改变,例如,当驾驶员处于分心、疲劳驾驶等危险驾驶状态时,车辆的行驶状态往往也会出现异常,例如车速不稳定、连续压线等异常行驶状态。因此,本公开实施例中,驾驶员驾驶状态的确定,可以依据驾驶区域的影像信息和车辆的行驶状态信息,以提高确定的驾驶员驾驶状态的可靠性。
如前文所述,驾驶区域的影像信息是车舱内部驾驶员所在区域的影像信息,那么,通过图像处理技术对驾驶区域的影像信息进行分析处理,可以检测驾驶员的特征,例如,肢体特征、面部特征等等,基于检测到的驾驶员的特征,可以分析驾驶员的行为,判断驾驶员是否有打电话、喝水、手脱离方向盘、闭眼等行为。
在车辆的行驶状态已经异常的情况下,驾驶员处于预设的危险驾驶状态的概率较高。因而,可以将车辆的行驶状态信息与驾驶区域的影像信息相结合,来确定驾驶员的驾驶状态。例如,通过对驾驶区域的影像信息进行分析,确定驾驶员处于疲劳驾驶状态的置信度是0.5,而此时车辆出现了连续压线的行驶状态,则可以将驾驶员处于疲劳驾驶状态的置信度提升为0.8;又如,通过对驾驶区域的影像信息进行分析,确定驾驶员处于轻度疲劳驾驶状态,而此时车辆出现了连续压线的行驶状态,则可以将轻度疲劳驾驶状态升级为中度疲劳驾驶状态或重度疲劳驾驶状态。
在步骤S14中,响应于检测到所述车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。
预设的危险驾驶状态,是预先设定的、有行车安全隐患的驾驶状态,可以包括一种或多种预先定义的驾驶员状态,例如中度疲劳驾驶状态、重度疲劳状态,重度分心驾驶状态,等等。预设的危险驾驶状态也可以是由预设驾驶员行为结合预设车辆状态定义的状态,例如定义驾驶员打电话并且车辆连续两次以上压线时的状态为预设的危险驾驶状态。针对预设的危险驾驶状态,可以进行智能驾驶控制,这里的智能驾驶控制可以是由车载控制中心或远程控制端根据车辆所行驶的道路情况干预驾驶,具体的智能驾驶控制方式例如可以是发出告警信息、控制辅助驾驶/自动驾驶功能启动、控制车辆减速等措施。具体将结合本公开后文可能的实现方式进行详细描述,此处不做赘述。
在一些可选的实现方式中,可以根据驾驶员所处的危险驾驶状态以及车辆的行驶状态确定危险等级,进而根据危险等级选择相应的智能驾驶控制方式。例如,当驾驶员处于打电话、看手机、手脱离方向盘等危险驾驶状态且车辆连续压线时长不超过预设时长(例如5秒)时,采用发出告警信息的方式进行智能驾驶控制;当驾驶员处于打电话、看手机、手脱离方向盘等危险驾驶状态且车辆处于超速50%的状态时,其危险高于车辆连 续压线时长不超过预设时长(例如5秒)的状态,这时可以采用控制车辆减速的方式进行智能驾驶控制,从而实现灵活的安全驾驶控制。
在本公开实施例中,通过获取车辆的行驶状态信息,以及车辆的驾驶区域的影像信息,根据车辆的行驶状态信息和驾驶区域的影像信息,确定车辆的驾驶员的驾驶状态,并响应于检测到车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。由此可以结合车辆的行驶状态与驾驶区域的影像信息,确定车辆的驾驶员的驾驶状态,提高了确定的驾驶员驾驶状态的准确性。另外,响应于检测到车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制,进一步提高了车辆的驾驶安全。
在一些可能的实现方式中,驾驶区域的影像信息可以是车舱内的影像信息,车辆的行驶状态信息可以是根据车舱外的感知数据确定的信息,也就是说,在本公开中,可以结合车舱内和车舱外的信息确定驾驶员的驾驶状态,提高了确定的驾驶员驾驶状态的准确性,并且,结合车舱内和车舱外的信息确定驾驶员的驾驶状态,对车辆进行智能驾驶控制,提高了车辆的驾驶安全。
在一种可能的实现方式中,根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:根据所述驾驶区域的影像信息对所述车辆的驾驶员进行行为检测;根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态。
在该实现方式中,通过根据驾驶区域的影像信息对车辆的驾驶员进行行为检测,得到行为检测结果,然后再结合车辆的行驶状态信息和行为检测结果来确定驾驶员的驾驶状态,提高了得到驾驶员驾驶状态的准确性。
可以通过图像处理技术对驾驶区域的影像信息进行分析,以对驾驶员进行行为检测,得到行为检测结果。在本公开实施例中,预设的危险驾驶状态可以是驾驶员存在预设的不规范驾驶行为,那么,根据驾驶区域的影像信息对车辆的驾驶员进行行为检测,可以是对驾驶员的不规范驾驶行为的检测。例如,可以在检测到驾驶员的手中持有电话且电话在耳部附近的情况下,确定行为检测结果为驾驶员存在开车打电话的行为;可以在检测到驾驶员的手未在方向盘上的时长达到第一时长的情况下,确定行为检测结果为驾驶员存在手部脱离方向盘的行为;可以在检测到驾驶员闭眼的时长达到第二时长的情况下,确定行为检测结果为驾驶员存在轻度疲劳驾驶的行为;可以在检测到驾驶员闭眼的时长达到第三时长的情况下,确定行为检测结果为驾驶员存在中度疲劳驾驶的行为。
在一些可选的实现方式中,在检测结果为未检测到驾驶员的不安全驾驶行为的情况下,如果车辆的行驶状态信息指示车辆的行驶状态也是正常状态,则可以确定驾驶员的驾驶状态为正常驾驶状态。
在一些可选的实现方式中,在检测结果为检测到驾驶员的不安全驾驶行为的情况下,如果车辆的行驶状态信息指示车辆的行驶状态是正常状态,则可以根据检测到的该不规范驾驶行为确定驾驶员的驾驶状态。例如,在行为检测结果为驾驶员的闭眼频率或连续闭眼次数轻度疲劳驾驶对应的区间内的情况下,如果车辆的行驶状态是正常状态,则驾 驶员的驾驶状态仍然确定为轻度疲劳驾驶。
在一些可选的实现方式中,在检测结果为检测到驾驶员的不安全驾驶行为的情况下,如果车辆的行驶状态信息指示车辆的行驶状态是异常状态,则可以通过车辆的行驶状态信息对驾驶员的行为检测结果进行进一步的更改,得到驾驶员的驾驶状态,以提高得到的行为检测结果的可靠度。例如,由于车辆的异常行驶状态往往是由驾驶员的不规范驾驶行为导致的,因此可以提高行为检测结果中驾驶员行为属于不安全驾驶行为的置信度,以提高得到的危险驾驶状态的可靠度;另外,由于车辆已经出现了异常行驶状态,因此可以提高检测结果的危险级别,以便采取更高级别的响应措施。
在一种可能的实现方式中,所述对所述车辆的驾驶员进行行为检测,包括:确定所述驾驶员的行为检测结果对应的危险级别;所述根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态,包括:响应于根据所述车辆的行驶状态信息确定所述车辆处于异常行驶状态,将所述驾驶员的行为检测结果的危险级别升级为第一危险级别;响应于确定所述第一危险级别达到预设的告警级别,确定所述驾驶员的驾驶状态处于预设的危险驾驶状态。
驾驶员的行为检测结果可以与危险级别相对应,危险级别表征驾驶员危险驾驶状态的危险程度,例如,可以在检测到驾驶员闭眼的时长达到第二时长的情况下,确定行为检测结果为驾驶员存在轻度疲劳驾驶的行为;可以在检测到驾驶员闭眼的时长达到第三时长的情况下,确定行为检测结果为驾驶员存在中度疲劳驾驶的行为。其中,第三时长大于第二时长,显然,驾驶员闭眼的时间越长,表明危险程度越高,车辆出现事故的概率越高。
由于车辆已经出现了异常行驶状态,因此,可以对检测结果的危险级别进行升级,以便采取更高级别的响应措施,为便于描述,这里将升级后的危险级别称为第一危险级别。针对第一危险级别,可以判断第一危险级别是否达到预设的告警级别,如果达到,则可以进行告警或执行其他智能控制操作。
例如,对于轻度疲劳驾驶的行为,可以不进行告警,而对于重度疲劳驾驶的行为,由于其属于预设的危险驾驶状态,则进行告警。
在该实现方式中,通过确定驾驶员的行为检测结果对应的危险级别,在车辆处于异常行驶状态的情况下,将驾驶员的行为检测结果的危险级别升级为第一危险级别,并且在第一危险级别达到预设的告警级别的情况下,确定驾驶员的驾驶状态处于预设的危险驾驶状态。由此,在车辆已经出现了异常行驶状态的情况下,提高检测结果的危险级别,以便及时、准确采取响应措施,从而提升了车辆行驶的安全性。
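A minimal sketch of the danger-level escalation described above; the level ordering, the single-step upgrade, and the choice of alarm threshold are assumptions made only for illustration.

```python
from enum import IntEnum

class DangerLevel(IntEnum):
    NONE = 0
    MILD = 1        # e.g. mild fatigue
    MODERATE = 2    # e.g. moderate fatigue, phone call while driving
    SEVERE = 3      # e.g. severe fatigue

ALARM_LEVEL = DangerLevel.MODERATE  # assumed preset alarm level

def driver_in_dangerous_state(behavior_level: DangerLevel,
                              vehicle_abnormal: bool) -> bool:
    """Upgrade the behavior danger level (to the "first danger level") when the
    vehicle is already driving abnormally, then compare with the alarm level."""
    level = behavior_level
    if vehicle_abnormal and level < DangerLevel.SEVERE:
        level = DangerLevel(level + 1)
    return level >= ALARM_LEVEL

# mild fatigue alone does not trigger the alarm, but it does once the
# vehicle keeps pressing the lane line
assert driver_in_dangerous_state(DangerLevel.MILD, vehicle_abnormal=False) is False
assert driver_in_dangerous_state(DangerLevel.MILD, vehicle_abnormal=True) is True
```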
在一种可能的实现方式中,所述根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态,包括:响应于根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述车辆的行驶状态和所述驾驶员的行为状态均为异常状态,确定检测到所述驾驶员处于预设的危险驾驶状态。
在该实现方式中,考虑到车辆的行驶状态为异常状态时可能会导致交通事故,如果 此时驾驶员的行为状态也是异常状态,则表明驾驶员未安全驾驶,具有较大的安全隐患。因此,在车辆的行驶状态和驾驶员的行为状态均为异常状态的情况下,即可确定驾驶员处于预设的危险驾驶状态,以便响应于检测到所述车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制,提高车辆的行驶安全。
在一种可能的实现方式中,所述根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:响应于根据所述车辆的行驶状态信息检测到所述车辆处于异常行驶状态,根据所述驾驶区域的影像信息对所述驾驶员进行行为检测;响应于确定所述驾驶员的行为检测结果为驾驶员分心驾驶或疲劳驾驶,确定所述驾驶员处于预设的危险驾驶状态。
这里的驾驶员分心驾驶可以是驾驶员存在手脱离方向盘、开车打电话、开车喝水、开车玩手机等会对驾驶造成分心的行为,疲劳驾驶可以是驾驶员存在长时间闭眼、打哈欠等行为。
在该实现方式中,考虑到车辆的行驶状态为异常状态时可能会导致交通事故,此时如果驾驶员的行为检测结果为驾驶员分心驾驶或疲劳驾驶,可以确定由于驾驶员分心驾驶或疲劳驾驶使得车辆处于不安全的状态,因此,在车辆的行驶状态为异常行驶状态,且驾驶员的行为检测结果为驾驶员分心驾驶或疲劳驾驶的情况下,即可确定驾驶员处于预设的危险驾驶状态,以便及时地通过智能驾驶控制介入,提高安全性。
在一种可能的实现方式中,所述进行智能驾驶控制,包括基于以下至少一项生成告警信息:所述车辆的行驶状态信息;基于所述驾驶区域的影像信息对所述车辆的驾驶员的行为检测结果;所述驾驶员的驾驶状态。
在该实现方式中,为达到较好的告警效果,可以将车辆的行驶状态信息、驾驶员的行为检测结果、驾驶员的驾驶状态中的至少一项告知驾驶员,以使得驾驶员意识到当前危险驾驶状态的严重程度。例如:若车辆的行驶状态信息为“车辆已连续三次压线”,基于驾驶区域的影像信息对车辆的驾驶员的行为检测结果为“驾驶员在打电话”,则生成的告警信息可以是:“您已连续三次压线,请勿开车打电话”。具体发出告警信息的形式可以是语音或者视频等形式,本公开对此不作限定。
在该实现方式中,通过基于车辆的行驶状态信息、驾驶员的行为检测结果和驾驶员的驾驶状态中的至少一项生成告警信息,以使得驾驶员获知车辆的不安全状态和当前危险驾驶状态的严重程度,达到较好的告警效果。
在一种可能的实现方式中,所述进行智能驾驶控制包括:控制告警设备持续输出告警信息,直至所述车辆行驶状态和驾驶员的驾驶状态均恢复正常状态。通过持续输出告警信息以对驾驶员进行持续的告警,能够进一步提升告警效果,实现驾驶员状态的持续监控和告警,有利于进一步提升行车安全。在车辆的行驶状态和驾驶员的驾驶状态均恢复正常后停止告警,可以减少不必要的告警对驾驶员造成的干扰。
在一种可能的实现方式中,所述进行智能驾驶控制包括:响应于检测到所述驾驶员处于预设的危险驾驶状态的持续时长超过预设时长,控制辅助驾驶/自动驾驶功能启动, 和/或控制所述车辆减速停车。
驾驶员处于预设的危险驾驶状态的持续时长超过预设时长,表明驾驶员长时间无法正常地操控车辆,这会严重威胁车辆的行驶安全,带来极大的安全隐患,因此,在该情况下,可以控制辅助驾驶/自动驾驶功能启动,也可以控制车辆减速停车,控制车辆减速停车也可以是通过辅助驾驶/自动驾驶功能来实现。
辅助驾驶/自动驾驶功能能够根据车辆的传感器采集的行车传感数据,控制油门、制动和、方向使车辆与变化的交通状况相适应,实现诸如自适应定速巡航、将车辆保持在车道内行驶、控制车辆与前车的跟车距离等驾驶功能。
在该实现方式中,在驾驶员处于预设的危险驾驶状态的持续时长超过预设时长的情况下,控制辅助驾驶/自动驾驶功能启动,和/或控制所述车辆减速停车,能够在驾驶员长时间无法正常地操控车辆的情况下,对车辆的行驶状态进行控制,提高了车辆的行车安全。
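The duration-gated takeover described above could be sketched as follows; the 10-second preset duration and the returned action labels are placeholders, not values taken from the disclosure.

```python
import time
from typing import Optional

PRESET_DURATION_S = 10.0   # placeholder "preset duration"

class TakeoverMonitor:
    """Tracks how long the driver has stayed in the preset dangerous driving
    state and requests assisted/automated driving (or a controlled stop) once
    that duration exceeds the preset threshold."""
    def __init__(self) -> None:
        self._since: Optional[float] = None

    def update(self, dangerous: bool, now: Optional[float] = None) -> str:
        now = time.monotonic() if now is None else now
        if not dangerous:
            self._since = None        # state recovered, reset the timer
            return "monitor"
        if self._since is None:
            self._since = now
        if now - self._since > PRESET_DURATION_S:
            return "engage assisted/automated driving or decelerate to a stop"
        return "warn"

monitor = TakeoverMonitor()
print(monitor.update(True, now=0.0))   # warn
print(monitor.update(True, now=11.0))  # takeover action
```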
在一种可能的实现方式中,所述方法还包括:获取所述车辆周围的环境信息;
所述根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:根据所述车辆周围的环境信息、所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态。
在实际驾驶场景中,车辆的行驶状态可能会受车辆周围环境的影响,例如,车辆的车速时快时慢可能是由于道路拥堵导致的,车辆左右摇摆行驶可能是由于车辆在弯曲的路段行驶,因此,为进一步提高确定的车辆驾驶员的驾驶状态的准确性,可以根据车辆周围的环境信息、车辆的行驶状态信息和驾驶区域的影像信息确定车辆的驾驶员的驾驶状态。
这里的车辆周围的环境信息可以包括车辆行驶的道路的路况信息,该路况信息可以表征车辆所行使的道路的状况,可以包括指示道路拥挤程度、指示道路形状、指示道路类、指示道路限速情况的信息等中的至少一项。依据道路的拥挤程度,可以将路况分为多个等级,例如,非常拥挤、拥挤、通畅等多个等级,道路的形状可以分为曲线和直线等形状,道路的类型可以分为城市道路、乡村公路、高速公路等类型。
在一种可能的实现方式中,所述获取所述车辆周围的环境信息,包括:获取所述车辆行驶的道路的路况信息;所述根据所述车辆周围的环境信息、所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:响应于所述车辆的行驶状态信息与所述路况信息不满足预设的匹配关系,且基于所述驾驶区域的影像信息对所述车辆的驾驶员的行为检测结果指示所述驾驶员的行为状态为异常状态,确定检测到所述驾驶员处于预设的危险驾驶状态。
道路的路况信息可以根据车辆的当前地理位置通过第三方平台获取,例如通过导航服务平台获取。
可以根据车辆周围的环境信息和车辆的行驶状态信息,确定车辆的行驶状态信息与路况信息是否满足预设的匹配关系,车辆的行驶状态信息与路况信息不满足预设的匹配 关系,则表明车辆处于异常行驶状态。
预设的匹配关系是指路况信息与在该路况信息表征的路况下安全行驶的车辆的行驶状态信息之间的匹配关系。例如,可以包括下述至少一种:“畅通”路况与车辆匀速行驶的匹配关系;直线路况与车辆的左右摆动幅度不超过设定幅度阈值的匹配关系;道路限速信息与在限速范围内行驶的匹配关系。
相应地,车辆的行驶状态信息与路况信息不满足预设的匹配关系,例如可以包括:车辆在直线路段左右摆动幅度达到设定幅度阈值,车辆在道路畅通的路段行驶速度时快时慢,车辆超速,等等,这些情况均表明车辆处于异常行驶状态。
那么,在车辆处于异常行驶状态,且驾驶员的行为状态为异常状态的情况下,即可确定检测到驾驶员处于预设的危险驾驶状态,提高了确定的驾驶员驾驶状态的准确性,进一步提高了车辆的驾驶安全。
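A sketch of checking the preset matching relationship between road-condition information and driving-state information, encoding only the example matching rules given above (steady speed on a clear road, limited sway on a straight road, driving within the speed limit); the thresholds are placeholders.

```python
from dataclasses import dataclass

@dataclass
class RoadCondition:
    congestion: str          # e.g. "clear", "congested"
    shape: str               # "straight" or "curved"
    speed_limit_kmh: float

@dataclass
class VehicleState:
    speed_stable: bool
    swing_amplitude_m: float
    speed_kmh: float

SWING_THRESHOLD_M = 0.5      # placeholder amplitude threshold

def matches_road_condition(road: RoadCondition, vehicle: VehicleState) -> bool:
    """True only if the driving state fits the example matching rules."""
    if road.congestion == "clear" and not vehicle.speed_stable:
        return False
    if road.shape == "straight" and vehicle.swing_amplitude_m >= SWING_THRESHOLD_M:
        return False
    if vehicle.speed_kmh > road.speed_limit_kmh:
        return False
    return True

def preset_dangerous_state(road: RoadCondition, vehicle: VehicleState,
                           behavior_abnormal: bool) -> bool:
    """Dangerous driving state: mismatch with the road conditions AND an
    abnormal driver-behavior detection result."""
    return (not matches_road_condition(road, vehicle)) and behavior_abnormal
```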
在一种可选的实现方式中,所述根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:将获取的所述车辆的行驶状态信息、和驾驶区域的影像信息,输入经过训练的神经网络,经由所述神经网络,确定所述车辆的驾驶员的驾驶状态。在该实现方式中,通过神经网络确定车辆驾驶员的驾驶状态,能够提高确定驾驶状态的准确性和速度。且神经网络可以预先训练并部署,可以快速实现较大数据量的图像或视频流的检测,因此可以应用于实时驾驶场景的智能驾驶控制中。
下面对本公开实施例的一个应用场景进行说明。在该应用场景中,可以通过DMS摄像头采集驾驶区域的影像信息,处理器可以从DMS摄像头获取驾驶区域的影像信息,并基于驾驶区域的影像信息,对车辆的驾驶员进行行为检测,得到行为检测结果;另外,通过ADAS摄像头采集车舱外部的影像信息,处理器可以从ADAS摄像头获取车舱外部的影像信息,对车辆的行驶状态信息进行确定。若确定的车辆的行驶状态信息为“车辆连续三次压线”,对车辆的驾驶员的行为检测结果为“驾驶员在打电话”,则生成语音告警信息:“您已连续三次压线,请勿开车打电话”,并进行语音播报。
可以理解,本公开提及的上述各个方法实施例,在不违背原理逻辑的情况下,均可以彼此相互结合形成结合后的实施例,限于篇幅,本公开不再赘述。本领域技术人员可以理解,在具体实施方式的上述方法中,各步骤的具体执行顺序应当以其功能和可能的内在逻辑确定。
此外,本公开还提供了智能驾驶控制装置、电子设备、计算机可读存储介质、程序,上述均可用来实现本公开提供的任一种智能驾驶控制方法,相应技术方案和描述和参见方法部分的相应记载,不再赘述。
图2示出根据本公开实施例的智能驾驶控制装置的框图,如图2所示,所述装置包括:
第一获取模块21,用于获取车辆的行驶状态信息;
第二获取模块22,用于获取所述车辆的驾驶区域的影像信息;
驾驶状态确定模块23,用于根据所述车辆的行驶状态信息和所述驾驶区域的影像信息,确定所述车辆的驾驶员的驾驶状态;
控制模块24,用于响应于检测到所述车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。
在一种可能的实现方式中,所述驾驶状态确定模块23包括第一驾驶状态确定子模块和第二驾驶状态确定子模块,其中:
所述第一驾驶状态确定子模块,用于根据所述驾驶区域的影像信息对所述车辆的驾驶员进行行为检测;
所述第二驾驶状态确定子模块,用于根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态。
在一种可能的实现方式中,所述第一驾驶状态确定子模块,用于确定所述驾驶员的行为检测结果对应的危险级别;
所述第二驾驶状态确定子模块,用于:
响应于根据所述车辆的行驶状态信息确定所述车辆处于异常行驶状态,将所述驾驶员的行为检测结果的危险级别升级为第一危险级别;
响应于确定所述第一危险级别达到预设的告警级别,确定所述驾驶员的驾驶状态处于预设的危险驾驶状态。
在一种可能的实现方式中,所述第二驾驶状态确定子模块,用于响应于根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述车辆的行驶状态和所述驾驶员的行为状态均为异常状态,确定检测到所述驾驶员处于预设的危险驾驶状态。
在一种可能的实现方式中,所述驾驶状态确定模块23包括第三驾驶状态确定子模块和第四驾驶状态确定子模块,其中:
所述第三驾驶状态确定子模块,用于响应于根据所述车辆的行驶状态信息检测到所述车辆处于异常行驶状态,根据所述驾驶区域的影像信息对所述驾驶员进行行为检测;
所述第四驾驶状态确定子模块,用于响应于确定所述驾驶员的行为检测结果为驾驶员分心驾驶或疲劳驾驶,确定所述驾驶员处于预设的危险驾驶状态。
在一种可能的实现方式中,所述车辆的异常行驶状态包括以下至少一项:
车辆在第一设定时长内的压线次数达到设定压线次数阈值;车辆的左右摆动幅度达到设定幅度阈值;车辆未按照交通标志或交通信号灯的指示通行;车辆的车速超出设定速度阈值。
在一种可能的实现方式中,所述控制模块24,用于基于以下至少一项生成告警信息:
所述车辆的行驶状态信息;
基于所述驾驶区域的影像信息对所述车辆的驾驶员的行为检测结果;
所述驾驶员的驾驶状态。
在一种可能的实现方式中,所述控制模块24,用于控制告警设备持续输出告警信息,直至所述车辆行驶状态和驾驶员的驾驶状态均恢复正常状态。
在一种可能的实现方式中,所述控制模块24,用于响应于检测到所述驾驶员处于预设的危险驾驶状态的持续时长超过预设时长,控制辅助驾驶/自动驾驶功能启动,和/或控 制所述车辆减速停车。
在一种可能的实现方式中,所述装置还包括:
第三获取模块,用于获取所述车辆周围的环境信息;
所述驾驶状态确定模块23,用于根据所述车辆周围的环境信息、所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态。
在一种可能的实现方式中,所述环境信息包括路况信息;
所述第三获取模块,用于获取所述车辆行驶的道路的路况信息;
所述驾驶状态确定模块23,用于响应于所述车辆的行驶状态信息与所述路况信息不满足预设的匹配关系,且基于所述驾驶区域的影像信息对所述车辆的驾驶员的行为检测结果指示所述驾驶员的行为状态为异常状态,确定检测到所述驾驶员处于预设的危险驾驶状态。
在一种可能的实现方式中,所述第一获取模块21包括传感数据获取子模块和行驶状态信息确定子模块,其中:
所述传感数据获取子模块,用于获取行车传感数据,所述传感数据包括以下至少一项:车辆的高级驾驶辅助***采集的影像信息、车辆的行车记录仪采集的影像信息、车辆的速度传感器感知的速度信息;
所述行驶状态信息确定子模块,用于根据所述传感数据,确定所述车辆的行驶状态信息。
在一种可能的实现方式中,所述驾驶状态确定模块23,用于将获取的所述车辆的行驶状态信息、和驾驶区域的影像信息,输入经过训练的神经网络,经由所述神经网络,确定所述车辆的驾驶员的驾驶状态。
在一些实施例中,本公开实施例提供的装置具有的功能或包含的模块可以用于执行上文方法实施例描述的方法,其具体实现可以参照上文方法实施例的描述,为了简洁,这里不再赘述。
本公开实施例还提出一种车辆,包括:
第一传感器,用于采集车辆的驾驶区域的影像信息;
控制器,用于获取车辆的行驶状态信息,根据所述车辆的行驶状态信息和所述驾驶区域的影像信息,确定所述车辆的驾驶员的驾驶状态,并响应于检测到所述车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。
本实施例的车辆通过结合车辆的行驶状态与驾驶区域的影像信息,确定车辆的驾驶员的驾驶状态,提高了确定的驾驶员驾驶状态的准确性。
在一些可能的实现方式中,车辆还包括第二传感器,用于采集行车传感数据;所述控制器用于根据所述行车传感数据获取所述车辆的行驶状态信息。从而通过用于感知车辆的不同数据的传感器实现了驾驶状态的准确检测。
本公开实施例还提出一种计算机可读存储介质,其上存储有计算机程序指令,所述计算机程序指令被处理器执行时实现上述方法。计算机可读存储介质可以是易失性计算 机可读存储介质或非易失性计算机可读存储介质。
本公开实施例还提出一种电子设备,包括:处理器;用于存储处理器可执行指令的存储器;其中,所述处理器被配置为调用所述存储器存储的指令,以执行上述方法。
本公开实施例还提供了一种计算机程序产品,包括计算机可读代码,当计算机可读代码在设备上运行时,设备中的处理器执行用于实现如上任一实施例提供的智能驾驶控制方法的指令。
本公开实施例还提供了另一种计算机程序产品,用于存储计算机可读指令,指令被执行时使得计算机执行上述任一实施例提供的智能驾驶控制方法的操作。
电子设备可以被提供为终端、服务器或其它形态的设备。
图3示出根据本公开实施例的一种电子设备800的框图。例如,电子设备800可以是移动电话,计算机,数字广播终端,消息收发设备,游戏控制台,平板设备,医疗设备,健身设备,个人数字助理等终端。
参照图3,电子设备800可以包括以下一个或多个组件:处理组件802,存储器804,电源组件806,多媒体组件808,音频组件810,输入/输出(I/O)的接口812,传感器组件814,以及通信组件816。
处理组件802通常控制电子设备800的整体操作,诸如与显示,电话呼叫,数据通信,相机操作和记录操作相关联的操作。处理组件802可以包括一个或多个处理器820来执行指令,以完成上述的方法的全部或部分步骤。此外,处理组件802可以包括一个或多个模块,便于处理组件802和其他组件之间的交互。例如,处理组件802可以包括多媒体模块,以方便多媒体组件808和处理组件802之间的交互。
存储器804被配置为存储各种类型的数据以支持在电子设备800的操作。这些数据的示例包括用于在电子设备800上操作的任何应用程序或方法的指令,联系人数据,电话簿数据,消息,图片,视频等。存储器804可以由任何类型的易失性或非易失性存储设备或者它们的组合实现,如静态随机存取存储器(SRAM),电可擦除可编程只读存储器(EEPROM),可擦除可编程只读存储器(EPROM),可编程只读存储器(PROM),只读存储器(ROM),磁存储器,快闪存储器,磁盘或光盘。
电源组件806为电子设备800的各种组件提供电力。电源组件806可以包括电源管理***,一个或多个电源,及其他与为电子设备800生成、管理和分配电力相关联的组件。
多媒体组件808包括在所述电子设备800和用户之间的提供一个输出接口的屏幕。在一些实施例中,屏幕可以包括液晶显示器(LCD)和触摸面板(TP)。如果屏幕包括触摸面板,屏幕可以被实现为触摸屏,以接收来自用户的输入信号。触摸面板包括一个或多个触摸传感器以感测触摸、滑动和触摸面板上的手势。所述触摸传感器可以不仅感测触摸或滑动动作的边界,而且还检测与所述触摸或滑动操作相关的持续时间和压力。在一些实施例中,多媒体组件808包括一个前置摄像头和/或后置摄像头。当电子设备800处于操作模式,如拍摄模式或视频模式时,前置摄像头和/或后置摄像头可以接收外部的多媒体数据。每个前置摄像头和后置摄像头可以是一个固定的光学透镜***或具有焦距和光 学变焦能力。
音频组件810被配置为输出和/或输入音频信号。例如,音频组件810包括一个麦克风(MIC),当电子设备800处于操作模式,如呼叫模式、记录模式和语音识别模式时,麦克风被配置为接收外部音频信号。所接收的音频信号可以被进一步存储在存储器804或经由通信组件816发送。在一些实施例中,音频组件810还包括一个扬声器,用于输出音频信号。
I/O接口812为处理组件802和***接口模块之间提供接口,上述***接口模块可以是键盘,点击轮,按钮等。这些按钮可包括但不限于:主页按钮、音量按钮、启动按钮和锁定按钮。
传感器组件814包括一个或多个传感器,用于为电子设备800提供各个方面的状态评估。例如,传感器组件814可以检测到电子设备800的打开/关闭状态,组件的相对定位,例如所述组件为电子设备800的显示器和小键盘,传感器组件814还可以检测电子设备800或电子设备800一个组件的位置改变,用户与电子设备800接触的存在或不存在,电子设备800方位或加速/减速和电子设备800的温度变化。传感器组件814可以包括接近传感器,被配置用来在没有任何的物理接触时检测附近物体的存在。传感器组件814还可以包括光传感器,如互补金属氧化物半导体(CMOS)或电荷耦合装置(CCD)图像传感器,用于在成像应用中使用。在一些实施例中,该传感器组件814还可以包括加速度传感器,陀螺仪传感器,磁传感器,压力传感器或温度传感器。
通信组件816被配置为便于电子设备800和其他设备之间有线或无线方式的通信。电子设备800可以接入基于通信标准的无线网络,如无线网络(WiFi),第二代移动通信技术(2G)或第三代移动通信技术(3G),或它们的组合。在一个示例性实施例中,通信组件816经由广播信道接收来自外部广播管理***的广播信号或广播相关信息。在一个示例性实施例中,所述通信组件816还包括近场通信(NFC)模块,以促进短程通信。例如,在NFC模块可基于射频识别(RFID)技术,红外数据协会(IrDA)技术,超宽带(UWB)技术,蓝牙(BT)技术和其他技术来实现。
在示例性实施例中,电子设备800可以被一个或多个应用专用集成电路(ASIC)、数字信号处理器(DSP)、数字信号处理设备(DSPD)、可编程逻辑器件(PLD)、现场可编程门阵列(FPGA)、控制器、微控制器、微处理器或其他电子元件实现,用于执行上述方法。
在示例性实施例中,还提供了一种非易失性计算机可读存储介质,例如包括计算机程序指令的存储器804,上述计算机程序指令可由电子设备800的处理器820执行以完成上述方法。
图4示出根据本公开实施例的一种电子设备1900的框图。例如,电子设备1900可以被提供为一服务器。参照图4,电子设备1900包括处理组件1922,其进一步包括一个或多个处理器,以及由存储器1932所代表的存储器资源,用于存储可由处理组件1922的执行的指令,例如应用程序。存储器1932中存储的应用程序可以包括一个或一个以上的每一个 对应于一组指令的模块。此外,处理组件1922被配置为执行指令,以执行上述方法。
电子设备1900还可以包括一个电源组件1926被配置为执行电子设备1900的电源管理,一个有线或无线网络接口1950被配置为将电子设备1900连接到网络,和一个输入输出(I/O)接口1958。电子设备1900可以操作基于存储在存储器1932的操作***,例如微软服务器操作***(Windows Server TM),苹果公司推出的基于图形用户界面操作***(Mac OS X TM),多用户多进程的计算机操作***(Unix TM),自由和开放原代码的类Unix操作***(Linux TM),开放原代码的类Unix操作***(FreeBSD TM)或类似。
在示例性实施例中,还提供了一种非易失性计算机可读存储介质,例如包括计算机程序指令的存储器1932,上述计算机程序指令可由电子设备1900的处理组件1922执行以完成上述方法。
本公开可以是***、方法和/或计算机程序产品。计算机程序产品可以包括计算机可读存储介质,其上载有用于使处理器实现本公开的各个方面的计算机可读程序指令。
计算机可读存储介质可以是可以保持和存储由指令执行设备使用的指令的有形设备。计算机可读存储介质例如可以是(但不限于)电存储设备、磁存储设备、光存储设备、电磁存储设备、半导体存储设备或者上述的任意合适的组合。计算机可读存储介质的更具体的例子(非穷举的列表)包括:便携式计算机盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、静态随机存取存储器(SRAM)、便携式压缩盘只读存储器(CD-ROM)、数字多功能盘(DVD)、记忆棒、软盘、机械编码设备、例如其上存储有指令的打孔卡或凹槽内凸起结构、以及上述的任意合适的组合。这里所使用的计算机可读存储介质不被解释为瞬时信号本身,诸如无线电波或者其他自由传播的电磁波、通过波导或其他传输媒介传播的电磁波(例如,通过光纤电缆的光脉冲)、或者通过电线传输的电信号。
这里所描述的计算机可读程序指令可以从计算机可读存储介质下载到各个计算/处理设备,或者通过网络、例如因特网、局域网、广域网和/或无线网下载到外部计算机或外部存储设备。网络可以包括铜传输电缆、光纤传输、无线传输、路由器、防火墙、交换机、网关计算机和/或边缘服务器。每个计算/处理设备中的网络适配卡或者网络接口从网络接收计算机可读程序指令,并转发该计算机可读程序指令,以供存储在各个计算/处理设备中的计算机可读存储介质中。
用于执行本公开操作的计算机程序指令可以是汇编指令、指令集架构(ISA)指令、机器指令、机器相关指令、微代码、固件指令、状态设置数据、或者以一种或多种编程语言的任意组合编写的源代码或目标代码,所述编程语言包括面向对象的编程语言—诸如Smalltalk、C++等,以及常规的过程式编程语言—诸如“C”语言或类似的编程语言。计算机可读程序指令可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络—包括局域网(LAN)或广域网(WAN)—连接到用户计算机,或者,可以连接到外 部计算机(例如利用因特网服务提供商来通过因特网连接)。在一些实施例中,通过利用计算机可读程序指令的状态信息来个性化定制电子电路,例如可编程逻辑电路、现场可编程门阵列(FPGA)或可编程逻辑阵列(PLA),该电子电路可以执行计算机可读程序指令,从而实现本公开的各个方面。
这里参照根据本公开实施例的方法、装置(***)和计算机程序产品的流程图和/或框图描述了本公开的各个方面。应当理解,流程图和/或框图的每个方框以及流程图和/或框图中各方框的组合,都可以由计算机可读程序指令实现。
这些计算机可读程序指令可以提供给通用计算机、专用计算机或其它可编程数据处理装置的处理器,从而生产出一种机器,使得这些指令在通过计算机或其它可编程数据处理装置的处理器执行时,产生了实现流程图和/或框图中的一个或多个方框中规定的功能/动作的装置。也可以把这些计算机可读程序指令存储在计算机可读存储介质中,这些指令使得计算机、可编程数据处理装置和/或其他设备以特定方式工作,从而,存储有指令的计算机可读介质则包括一个制造品,其包括实现流程图和/或框图中的一个或多个方框中规定的功能/动作的各个方面的指令。
也可以把计算机可读程序指令加载到计算机、其它可编程数据处理装置、或其它设备上,使得在计算机、其它可编程数据处理装置或其它设备上执行一系列操作步骤,以产生计算机实现的过程,从而使得在计算机、其它可编程数据处理装置、或其它设备上执行的指令实现流程图和/或框图中的一个或多个方框中规定的功能/动作。
附图中的流程图和框图显示了根据本公开的多个实施例的***、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段或指令的一部分,所述模块、程序段或指令的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个连续的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或动作的专用的基于硬件的***来实现,或者可以用专用硬件与计算机指令的组合来实现。
该计算机程序产品可以具体通过硬件、软件或其结合的方式实现。在一个可选实施例中,所述计算机程序产品具体体现为计算机存储介质,在另一个可选实施例中,计算机程序产品具体体现为软件产品,例如软件开发包(Software Development Kit,SDK)等等。
以上已经描述了本公开的各实施例,上述说明是示例性的,并非穷尽性的,并且也不限于所披露的各实施例。在不偏离所说明的各实施例的范围和精神的情况下,对于本技术领域的普通技术人员来说许多修改和变更都是显而易见的。本文中所用术语的选择,旨在最好地解释各实施例的原理、实际应用或对市场中的技术的改进,或者使本技术领域的其它普通技术人员能理解本文披露的各实施例。

Claims (19)

  1. 一种智能驾驶控制方法,其特征在于,包括:
    获取车辆的行驶状态信息;
    获取所述车辆的驾驶区域的影像信息;
    根据所述车辆的行驶状态信息和所述驾驶区域的影像信息,确定所述车辆的驾驶员的驾驶状态;
    响应于检测到所述车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:
    根据所述驾驶区域的影像信息对所述车辆的驾驶员进行行为检测;
    根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态。
  3. 根据权利要求2所述的方法,其特征在于,所述对所述车辆的驾驶员进行行为检测,包括:
    确定所述驾驶员的行为检测结果对应的危险级别;
    所述根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态,包括:
    响应于根据所述车辆的行驶状态信息确定所述车辆处于异常行驶状态,将所述驾驶员的行为检测结果的危险级别升级为第一危险级别;
    响应于确定所述第一危险级别达到预设的告警级别,确定所述驾驶员的驾驶状态处于预设的危险驾驶状态。
  4. 根据权利要求2所述的方法,其特征在于,所述根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述驾驶员的驾驶状态,包括:
    响应于根据所述车辆的行驶状态信息和所述驾驶员的行为检测结果确定所述车辆的行驶状态和所述驾驶员的行为状态均为异常状态,确定检测到所述驾驶员处于预设的危险驾驶状态。
  5. 根据权利要求2-4任一项所述的方法,其特征在于,所述根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:
    响应于根据所述车辆的行驶状态信息检测到所述车辆处于异常行驶状态,根据所述驾驶区域的影像信息对所述驾驶员进行行为检测;
    响应于确定所述驾驶员的行为检测结果为驾驶员分心驾驶或疲劳驾驶,确定所述驾驶员处于预设的危险驾驶状态。
  6. 根据权利要求1-5任一项所述的方法,其特征在于,所述车辆的异常行驶状态包括以下至少一项:
    车辆在第一设定时长内的压线次数达到设定压线次数阈值;车辆的左右摆动幅度达到设定幅度阈值;车辆未按照交通标志或交通信号灯的指示通行;车辆的车速超出设定速度阈值。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,所述进行智能驾驶控制,包括基于以下至少一项生成告警信息:
    所述车辆的行驶状态信息;
    基于所述驾驶区域的影像信息对所述车辆的驾驶员的行为检测结果;
    所述驾驶员的驾驶状态。
  8. 根据权利要求1-7任一项所述的方法,其特征在于,所述进行智能驾驶控制包括:
    控制告警设备持续输出告警信息,直至所述车辆行驶状态和驾驶员的驾驶状态均恢复正常状态。
  9. 根据权利要求1-8任一所述的方法,其特征在于,所述进行智能驾驶控制包括:
    响应于检测到所述驾驶员处于预设的危险驾驶状态的持续时长超过预设时长,控制辅助驾驶/自动驾驶功能启动,和/或控制所述车辆减速停车。
  10. 根据权利要求1-9任一所述的方法,其特征在于,所述方法还包括:
    获取所述车辆周围的环境信息;
    所述根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:
    根据所述车辆周围的环境信息、所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态。
  11. 根据权利要求10所述方法,其特征在于,所述环境信息包括路况信息;
    所述获取所述车辆周围的环境信息,包括:获取所述车辆行驶的道路的路况信息;
    所述根据所述车辆周围的环境信息、所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:
    响应于所述车辆的行驶状态信息与所述路况信息不满足预设的匹配关系,且基于所述驾驶区域的影像信息对所述车辆的驾驶员的行为检测结果指示所述驾驶员的行为状态为异常状态,确定检测到所述驾驶员处于预设的危险驾驶状态。
  12. 根据权利要求1-11任一所述的方法,其特征在于,所述获取车辆的行驶状态信息,包括:
    获取行车传感数据,所述传感数据包括以下至少一项:车辆的高级驾驶辅助***采集的影像信息、车辆的行车记录仪采集的影像信息、车辆的速度传感器感知的速度信息;
    根据所述传感数据,确定所述车辆的行驶状态信息。
  13. 根据权利要求1-12任一所述的方法,其特征在于,所述根据所述车辆的行驶状态信息和所述驾驶区域的影像信息确定所述车辆的驾驶员的驾驶状态,包括:
    将获取的所述车辆的行驶状态信息、和驾驶区域的影像信息,输入经过训练的神经网络,经由所述神经网络,确定所述车辆的驾驶员的驾驶状态。
  14. 一种智能驾驶控制装置,其特征在于,包括:
    第一获取模块,用于获取车辆的行驶状态信息;
    第二获取模块,用于获取所述车辆的驾驶区域的影像信息;
    驾驶状态确定模块,用于根据所述车辆的行驶状态信息和所述驾驶区域的影像信息,确定所述车辆的驾驶员的驾驶状态;
    控制模块,用于响应于检测到所述车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。
  15. 一种车辆,其特征在于,包括:
    第一传感器,用于采集车辆的驾驶区域的影像信息;
    控制器,用于获取车辆的行驶状态信息,根据所述车辆的行驶状态信息和所述驾驶区域的影像信息,确定所述车辆的驾驶员的驾驶状态,并响应于检测到所述车辆的驾驶员处于预设的危险驾驶状态,进行智能驾驶控制。
  16. 根据权利要求15所述的车辆,其特征在于,还包括:
    第二传感器,用于采集行车传感数据;
    所述控制器用于根据所述行车传感数据获取所述车辆的行驶状态信息。
  17. 一种电子设备,其特征在于,包括:
    处理器;
    用于存储处理器可执行指令的存储器;
    其中,所述处理器被配置为调用所述存储器存储的指令,以执行权利要求1至13中任意一项所述的方法。
  18. 一种计算机可读存储介质,其上存储有计算机程序指令,其特征在于,所述计算机程序指令被处理器执行时实现权利要求1至13中任意一项所述的方法。
  19. 一种计算机程序,包括计算机可读代码,当所述计算机可读代码在电子设备中运行时,所述电子设备中的处理器执行用于实现权利要求1-13中的任一权利要求所述的方法。
PCT/CN2021/109831 2020-09-23 2021-07-30 智能驾驶控制方法及装置、车辆、电子设备和存储介质 WO2022062659A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023518924A JP2023542992A (ja) 2020-09-23 2021-07-30 インテリジェントドライブ制御方法及び装置、車両、電子機器並びに記憶媒体

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011010951.9A CN112141119B (zh) 2020-09-23 2020-09-23 智能驾驶控制方法及装置、车辆、电子设备和存储介质
CN202011010951.9 2020-09-23

Publications (1)

Publication Number Publication Date
WO2022062659A1 true WO2022062659A1 (zh) 2022-03-31

Family

ID=73896284

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/109831 WO2022062659A1 (zh) 2020-09-23 2021-07-30 智能驾驶控制方法及装置、车辆、电子设备和存储介质

Country Status (3)

Country Link
JP (1) JP2023542992A (zh)
CN (1) CN112141119B (zh)
WO (1) WO2022062659A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116129653A (zh) * 2023-04-17 2023-05-16 创意信息技术股份有限公司 卡口车辆检测方法、装置、设备及存储介质
CN116698445A (zh) * 2023-06-08 2023-09-05 北京速度时空信息有限公司 一种自动驾驶安全性检测方法及***
CN117657170A (zh) * 2024-02-02 2024-03-08 江西五十铃汽车有限公司 一种新能源汽车智能安全和整车控制方法及***

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112141119B (zh) * 2020-09-23 2022-03-11 上海商汤临港智能科技有限公司 智能驾驶控制方法及装置、车辆、电子设备和存储介质
CN112721730A (zh) * 2021-01-07 2021-04-30 广州橙行智动汽车科技有限公司 车辆电源管理方法、装置、车辆及存储介质
CN112849149A (zh) * 2021-01-28 2021-05-28 上海商汤临港智能科技有限公司 驾驶员状态检测方法、装置、设备、介质及程序产品
CN112861677A (zh) * 2021-01-28 2021-05-28 上海商汤临港智能科技有限公司 轨交驾驶员的动作检测方法及装置、设备、介质及工具
CN114291097B (zh) * 2021-04-21 2022-07-19 浙江大学 实时反馈的驾驶行为差异化矫正方法、装置、设备
CN113096409A (zh) * 2021-04-25 2021-07-09 华蓝设计(集团)有限公司 基于5g物联网技术的运输车辆全过程安全监控***
WO2022241598A1 (zh) * 2021-05-17 2022-11-24 海南师范大学 一种自动检测驾驶员玩手机的装置及方法
CN114103963A (zh) * 2021-11-23 2022-03-01 重庆大学 一种驾驶员不安全行为检测预警***及方法
WO2023092490A1 (zh) * 2021-11-26 2023-06-01 吉林大学 自动驾驶汽车接管紧迫度判别方法和警示方法及***
CN114030475A (zh) * 2021-12-22 2022-02-11 清华大学苏州汽车研究院(吴江) 一种车辆辅助驾驶方法、装置、车辆及存储介质
CN114708628A (zh) * 2022-03-07 2022-07-05 深圳市德驰微视技术有限公司 基于域控制器平台的车辆驾驶员监测方法及装置
CN115631626A (zh) * 2022-10-11 2023-01-20 重庆长安新能源汽车科技有限公司 一种车辆数据监控分析方法、装置、设备及介质
CN115892051B (zh) * 2023-03-08 2023-05-16 禾多科技(北京)有限公司 自动驾驶辅助公共道路测试方法及***

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3637165A1 (de) * 1986-10-31 1988-05-05 Rainer Ashauer Verfahren und einrichtung zum verhindern von zusammenstoessen, insbesondere fuer kraftfahrzeuge im strassenverkehr
CN102509418A (zh) * 2011-10-11 2012-06-20 东华大学 一种多传感信息融合的疲劳驾驶评估预警方法及装置
CN110901650A (zh) * 2019-11-02 2020-03-24 芜湖职业技术学院 一种车辆压实线自调整***和方法
CN112141119A (zh) * 2020-09-23 2020-12-29 上海商汤临港智能科技有限公司 智能驾驶控制方法及装置、车辆、电子设备和存储介质

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8698639B2 (en) * 2011-02-18 2014-04-15 Honda Motor Co., Ltd. System and method for responding to driver behavior
DE102013204882A1 (de) * 2013-03-20 2014-09-25 Robert Bosch Gmbh Verfahren und Vorrichtung zum Zuordnen eines Fahrers eines Fahrzeugs zu einer im Fahrzeug hinterlegten, ein spezifisches Fahrverhalten des Fahrers repräsentierenden, Fahrerklasse
DE102013009339A1 (de) * 2013-06-04 2014-12-04 Volkswagen Aktiengesellschaft Verfahren und Vorrichtung zur Notfallassistenz
US20150166059A1 (en) * 2013-12-18 2015-06-18 Automotive Research & Testing Center Autonomous vehicle driving support system and autonomous driving method performed by the same
US9290174B1 (en) * 2014-10-23 2016-03-22 GM Global Technology Operations LLC Method and system for mitigating the effects of an impaired driver
US20160267335A1 (en) * 2015-03-13 2016-09-15 Harman International Industries, Incorporated Driver distraction detection system
CN106228755A (zh) * 2016-08-12 2016-12-14 深圳市元征科技股份有限公司 疲劳驾驶预警方法及云端服务器
US10640117B2 (en) * 2016-08-17 2020-05-05 Allstate Insurance Company Driving cues and coaching
US11608085B2 (en) * 2017-01-31 2023-03-21 Mitsubishi Electric Corporation Automatic driving control device
US10592785B2 (en) * 2017-07-12 2020-03-17 Futurewei Technologies, Inc. Integrated system for detection of driver condition
CN107316436B (zh) * 2017-07-31 2021-06-18 努比亚技术有限公司 危险驾驶状态处理方法、电子设备及存储介质
CN107640152B (zh) * 2017-08-08 2019-11-22 吉利汽车研究院(宁波)有限公司 一种车道保持辅助***的可靠性控制装置及方法
US10235859B1 (en) * 2017-08-17 2019-03-19 State Farm Mutual Automobile Insurance Company Systems and methods for the mitigation of drowsy or sleepy driving
EP3456599A1 (en) * 2017-09-18 2019-03-20 The Hi-Tech Robotic Systemz Ltd Monitoring drivers and external environment for vehicles
CN109747656A (zh) * 2017-11-02 2019-05-14 罗曦明 人工智能车辆辅助驾驶方法、装置、设备及存储介质
JP6812952B2 (ja) * 2017-11-15 2021-01-13 オムロン株式会社 脇見判定装置、脇見判定方法、およびプログラム
CN108423006A (zh) * 2018-02-02 2018-08-21 辽宁友邦网络科技有限公司 一种辅助驾驶预警方法及***
JP6637091B2 (ja) * 2018-03-07 2020-01-29 本田技研工業株式会社 車両制御装置
CN108275156A (zh) * 2018-03-27 2018-07-13 斑马网络技术有限公司 驾驶行为检测方法、存储介质、设备及车辆
CN109159667A (zh) * 2018-07-28 2019-01-08 上海商汤智能科技有限公司 智能驾驶控制方法和装置、车辆、电子设备、介质、产品
CN109050520A (zh) * 2018-08-16 2018-12-21 上海小蚁科技有限公司 车辆驾驶状态提醒方法及装置、计算机可读存储介质
US20200062246A1 (en) * 2018-08-27 2020-02-27 Mando Corporation Emergency braking device for vehicle
KR20200056187A (ko) * 2018-11-14 2020-05-22 현대자동차주식회사 운전자의 부주의 경고 제어 장치 및 방법
US11623564B2 (en) * 2018-12-17 2023-04-11 Auto Telematics Ltd. Method and system for determining driving information
US20200207359A1 (en) * 2018-12-27 2020-07-02 Southern Taiwan University Of Science And Technology Smart driving management system and method
US11400944B2 (en) * 2019-01-04 2022-08-02 Byton North America Corporation Detecting and diagnosing anomalous driving behavior using driving behavior models
WO2020246637A1 (ko) * 2019-06-05 2020-12-10 엘지전자 주식회사 자율 주행 차량 제어 방법
CN110371133A (zh) * 2019-06-28 2019-10-25 神龙汽车有限公司 汽车驾驶员疲劳检测***和方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3637165A1 (de) * 1986-10-31 1988-05-05 Rainer Ashauer Verfahren und einrichtung zum verhindern von zusammenstoessen, insbesondere fuer kraftfahrzeuge im strassenverkehr
CN102509418A (zh) * 2011-10-11 2012-06-20 东华大学 一种多传感信息融合的疲劳驾驶评估预警方法及装置
CN110901650A (zh) * 2019-11-02 2020-03-24 芜湖职业技术学院 一种车辆压实线自调整***和方法
CN112141119A (zh) * 2020-09-23 2020-12-29 上海商汤临港智能科技有限公司 智能驾驶控制方法及装置、车辆、电子设备和存储介质

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116129653A (zh) * 2023-04-17 2023-05-16 创意信息技术股份有限公司 卡口车辆检测方法、装置、设备及存储介质
CN116698445A (zh) * 2023-06-08 2023-09-05 北京速度时空信息有限公司 一种自动驾驶安全性检测方法及***
CN116698445B (zh) * 2023-06-08 2024-01-30 北京速度时空信息有限公司 一种自动驾驶安全性检测方法及***
CN117657170A (zh) * 2024-02-02 2024-03-08 江西五十铃汽车有限公司 一种新能源汽车智能安全和整车控制方法及***
CN117657170B (zh) * 2024-02-02 2024-05-17 江西五十铃汽车有限公司 一种新能源汽车智能安全和整车控制方法及***

Also Published As

Publication number Publication date
CN112141119A (zh) 2020-12-29
CN112141119B (zh) 2022-03-11
JP2023542992A (ja) 2023-10-12

Similar Documents

Publication Publication Date Title
WO2022062659A1 (zh) 智能驾驶控制方法及装置、车辆、电子设备和存储介质
RU2656933C2 (ru) Способ и устройство для предупреждения о встречном транспортном средстве
US20200160724A1 (en) Vehicle operation assistance
US10181266B2 (en) System and method to provide driving assistance
JP7163407B2 (ja) 衝突制御方法及び装置、電子機器並びに記憶媒体
WO2022041671A1 (zh) 方向盘脱手检测方法及装置、电子设备和存储介质
CN109017554B (zh) 行驶提醒方法、装置及计算机可读存储介质
CN105513427A (zh) 车辆和车辆的行车预警方法及装置
EP3665564B1 (en) Autonomous vehicle notification system and method
CN104134371A (zh) 一种车辆碰撞预警的方法和装置
US11873007B2 (en) Information processing apparatus, information processing method, and program
CN113205088B (zh) 障碍物图像展示方法、电子设备和计算机可读介质
CN112991684A (zh) 一种驾驶预警的方法及装置
JP2021163345A (ja) 運転支援装置及びデータ収集システム
CN111186435B (zh) 汽车的防碰撞方法、装置及存储介质
CN115269097A (zh) 导航界面的显示方法、装置、设备、存储介质及程序产品
KR20180112336A (ko) 복수의 센서를 이용하여 객체를 인식하는 전자 기기 및 방법
KR101947473B1 (ko) 후방 차량을 고려한 안전 운전 지원 장치 및 방법
KR102319383B1 (ko) 블랙박스 영상을 이용하여 교통 법규 위반 차량을 자동으로 신고하는 장치 및 방법
TWI573713B (zh) 行車距離提示裝置及其方法
KR20170016579A (ko) 운전중 통화 검출 및 경보 방법과 그 시스템
JP7294483B2 (ja) 運転支援装置、運転支援方法及びプログラム
KR102531722B1 (ko) 차량 단말을 이용한 주차위치안내 서비스 방법 및 장치
WO2024046353A2 (en) Presentation control method, device for in-vehicle glass of vehicle, and vehicle
JP2024047112A (ja) 運転支援装置、運転支援方法、及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21871037

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023518924

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21871037

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 21871037

Country of ref document: EP

Kind code of ref document: A1