GB2606018A - Emotion recognition for artificially-intelligent system - Google Patents

Emotion recognition for artificially-intelligent system

Info

Publication number
GB2606018A
GB2606018A (application GB2105780.7A / GB202105780A)
Authority
GB
United Kingdom
Prior art keywords
data
driver
emotional
vehicle
assistance system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2105780.7A
Other versions
GB202105780D0 (en)
Inventor
Kim Sangho
Kuang Hondson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Priority to GB2105780.7A priority Critical patent/GB2606018A/en
Publication of GB202105780D0 publication Critical patent/GB202105780D0/en
Publication of GB2606018A publication Critical patent/GB2606018A/en
Withdrawn legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0098Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0872Driver physiology
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062Adapting control system settings
    • B60W2050/0075Automatic parameter input, automatic initialising or calibrating means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/22Psychological state; Stress level or workload
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/221Physiology, e.g. weight, heartbeat, health or special needs

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method for controlling an assistance system of a vehicle, the method including the steps of collecting driver data (S1), determining (S2) emotional data from the driver data, the emotional data representing an emotion of the driver, and determining (S3) control data depending on the emotional data, characterized by controlling (S4) the assistance system of the vehicle with the control data. The sensors may include a camera, a microphone, a blood pressure monitor or an oximeter. The data may be collected by acoustic, optical, or biometric means. The system may have the ability to learn via a self-learning algorithm.

Description

EMOTION RECOGNITION FOR ARTIFICIALLY-INTELLIGENT SYSTEM
FIELD OF THE INVENTION
[0001] The invention relates to a method for controlling an assistance system of a vehicle. The method includes the steps of collecting driver data of a driver of the vehicle, determining emotional data from the driver data, the emotional data representing an emotion of the driver, and determining control data depending on the emotional data. Furthermore, the present invention relates to a device for controlling an assistance system of a vehicle. The device comprises sensor means for collecting driver data of a driver of the vehicle and processing means for determining emotional data from the driver data, the emotional data representing an emotion of the driver, and for determining control data depending on the emotional data.
BACKGROUND INFORMATION
[0002] Document CN 109815817 A discloses a driver emotion recognition method. It comprises the steps of obtaining a current face image of a driver, processing the current face image to obtain an enhanced image, extracting image feature values from the enhanced image, inputting the image feature values into a neural network model for training, and outputting a training result. According to the training result, the emotion category is identified to which the image feature values belong. The emotional state of the driver can thus be known in time, the cloud server can push music corresponding to each emotion of the driver to the vehicle-mounted terminal in time, the emotion of the driver can be adjusted, and it is ensured that the driver drives safely in a stable emotional state.
[0003] Furthermore, document CN 109243490 A also discloses a driver emotion recognition method. This method includes obtaining a driver's current voice data, extracting a frequency domain feature from the voice data, and inputting the frequency domain feature into a trained convolutional neural network to extract short-time acoustic features of the speech data. Text features are extracted from the speech data and input into a trained long short-term memory neural network, from which long-term linguistic features of the speech data are extracted. The emotion type corresponding to the speech data is determined according to the acoustic features and the linguistic features.
[0004] Document US 2017/0140757 A1 discloses a method for coordinating and processing user input provided to vehicles during use. One example is a method for processing voice inputs at a vehicle. The method includes determining a profile for a user of the vehicle using electronics of the vehicle. The method further includes receiving, via a microphone of the vehicle, a voice command from the user of the vehicle. The electronics of the vehicle process the voice command to enable a two-way conversation exchange between the user and the vehicle.
[0005] Additionally, document CN 105303829 A discloses a vehicle driver emotion recognition method. The method comprises the steps that the relevant driving information of a currently driven vehicle is acquired, and relevant driving features are extracted from the relevant driving information. The relevant driving features are classified by a default classifier. A driver's emotion corresponding to the relevant driving features is recognized. The recognized emotion is in line with the current actual driving situation of the vehicle and is not affected by the facial features and facial environment of the driver or other factors.
[0006] Finally, document EP 3 166 833 A1 discloses an automated device control system for use in a vehicle. In the system, a user identification module is configured to identify the vehicle user, and a monitor module is configured to obtain a value of a parameter indicative of a user emotion state. The system has an input for obtaining an instruction from a user for control of a vehicle device. A training module is configured to use user vehicle device control instructions and parameter values indicative of user emotion states to train an algorithm configured to provide an automated vehicle device control instruction in response to a measured parameter indicative of a user emotion state.
SUMMARY OF THE INVENTION
[0007] The object of the present invention is to provide a method for controlling an assistance system of a vehicle more reliably based on emotional data.
[0008] This object is solved with a method according to claim 1 and a device according to claim 5. Further favorable developments of the invention are defined in the subclaims.
[0009] According to the invention, there is provided a method for controlling an assistance system of a vehicle, the method including the steps of collecting driver data of a driver of the vehicle, determining emotional data from the driver data, the emotional data representing an emotion of the driver, and determining control data depending on the emotional data, characterized by controlling the assistance system of the vehicle with the control data.
[0010] The assistance system of the vehicle may be any system of the vehicle assisting the driver. Specifically, the assistance system may alter the state of the car at least partially autonomously. In the first step mentioned above, driver data are collected from the driver of the vehicle by respective sensors. These driver data shall be used to obtain an impression of the emotion of the driver. Thus, any driver data, such as temperature data, face recognition data, blood pressure data or other biometrical data, may be collected. The driver data are used to determine respective emotional data of the driver. For instance, the emotional data represent categories of the driver's emotions. Specifically, the emotional data may represent an upset state or a calm state of the driver.
[0011] Based on the emotional data, i.e. the category of the driver's emotion, control data are determined by respective processing means. Such control data may include specific control signals for controlling components of the vehicle. Specifically, these components of the vehicle belong to an assistance system of the vehicle. More specifically, these control data are used for controlling the output of a user interface of the assistance system.
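Purely by way of illustration, the four steps may be organized as a small processing pipeline. The following Python sketch is a non-authoritative example; the sensor interface, the threshold rule and all names are hypothetical assumptions and do not correspond to any concrete implementation of the invention.

```python
from dataclasses import dataclass

@dataclass
class ControlData:
    voice_volume: float  # relative loudness of the voice assistant output
    voice_rate: float    # relative speech rate of the voice assistant output

def collect_driver_data(sensors):                 # step S1
    """Poll all registered sensors (camera, microphone, oximeter, ...)."""
    return {name: sensor.read() for name, sensor in sensors.items()}

def determine_emotion(driver_data):               # step S2
    """Map raw driver data to an emotion category (placeholder rule)."""
    # A real system would use a trained classifier here.
    return "upset" if driver_data.get("pulse_bpm", 70) > 100 else "calm"

def determine_control_data(emotion):              # step S3
    """Derive control data from the emotion category."""
    if emotion == "upset":
        return ControlData(voice_volume=0.5, voice_rate=0.8)  # softer, slower
    return ControlData(voice_volume=1.0, voice_rate=1.0)

def control_assistance_system(system, control):   # step S4
    """Apply the control data to the assistance system."""
    system.apply(control)
```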
[0012] In a specific embodiment, the driver data are collected from acoustical, optical and/or biometrical signals. Acoustical signals may include vocal signals of the driver. Optical signals may include video signals of the face or the body of the driver. Biometrical signals may include, but are not limited to, blood pressure, oxygen saturation of the blood, humidity of the skin, etc. All these signals can be used for judging the emotional state of the driver.
[0013] The emotional data may be determined by an algorithm, preferably by a self-learning algorithm. In a simple case, a predefined algorithm is used to determine the emotional data. However, in an advanced method a self-learning algorithm or a machine learning algorithm is used. Such a self-learning algorithm may be based on a convolutional neural network.
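As a hedged illustration of such a machine learning approach, a minimal convolutional neural network for classifying face images into emotion categories could be sketched in PyTorch as follows; the architecture, the input size and the two assumed categories (upset vs. calm) are examples only, not the disclosed method.

```python
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    """Minimal CNN mapping a 64x64 grayscale face image to emotion logits."""

    def __init__(self, num_emotions: int = 2):  # e.g. upset vs. calm
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_emotions)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                       # (N, 32, 16, 16)
        return self.classifier(x.flatten(start_dim=1))

# usage: logits = EmotionCNN()(torch.randn(1, 1, 64, 64))
```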
[0014] Furthermore, augmented voice commands may be output from the assistance system, wherein the augmented voice commands differ depending on the emotional data. Such augmented voice commands assist the driver when driving the vehicle. For instance, the augmented voice commands may be loud or quiet. Alternatively or additionally, the augmented voice commands may be fast or slow. All these differences may influence the mood of the driver. Thus, it is advantageous if the voice commands are adapted to the emotion of the driver.
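The adaptation of loudness and speed to the emotional data could, for example, be expressed as a simple lookup from emotion category to voice output parameters. A minimal sketch, assuming hypothetical parameter names understood by a text-to-speech engine:

```python
# Hypothetical emotion categories and parameter values, for illustration only.
VOICE_PROFILES = {
    "upset":  {"volume_db": -6.0, "words_per_minute": 120},  # quiet and slow
    "calm":   {"volume_db":  0.0, "words_per_minute": 150},  # normal delivery
    "sleepy": {"volume_db":  3.0, "words_per_minute": 170},  # loud and brisk
}

def voice_profile(emotion: str) -> dict:
    """Return text-to-speech parameters for the detected emotion."""
    return VOICE_PROFILES.get(emotion, VOICE_PROFILES["calm"])
```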
[0015] According to the present invention, the above object is also solved by a device for controlling an assistance system of a vehicle, the device including sensor means for collecting driver data of a driver of the vehicle and processing means for determining emotional data from the driver data, the emotional data representing an emotion of the driver, and for determining control data depending on the emotional data, characterized by controlling means for controlling the assistance system of the vehicle with the control data.
[0016] Preferably, the sensor means may comprise a camera, a microphone and/or an oximeter. Furthermore, the sensor means may also include a blood pressure monitor, a humidity sensor, or the like. Specifically, the camera may detect visible light, infrared light, or ultraviolet light.
[0017] Moreover, there may be provided an arrangement according to the present invention. Such an arrangement may comprise an assistance system and a device as described above. The assistance system includes a user interface, an interior light or a visualization unit, the output of which is controlled by the controlling means depending on the emotional data. The output of the user interface may be an acoustic output, an optical output, or a haptic output. Each of these specific outputs may be changed on the basis of the emotional state of the driver. For example, the haptic output may have a lower amplitude if the driver is in an upset state, whereas, if the driver is in a sleepy state, the amplitude of the haptic signal may be relatively high.
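This haptic amplitude logic could, again purely as an assumed example, be written as:

```python
def haptic_amplitude(emotion: str) -> float:
    """Scale the haptic output amplitude by the driver's emotional state.

    The values are illustrative assumptions, normalized to [0, 1].
    """
    if emotion == "upset":
        return 0.3  # lower amplitude so as not to agitate the driver further
    if emotion == "sleepy":
        return 0.9  # relatively high amplitude to rouse the driver
    return 0.6      # moderate default for a calm driver
```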
[0018] Further advantages, features, and details of the invention derive from the following description of preferred embodiments as well as from the drawings. The features and feature combinations previously mentioned in the description as well as the features and feature combinations mentioned in the following description of the figures and/or shown in the figures alone can be employed not only in the respectively indicated combination but also in any other combination or taken alone without leaving the scope of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] The novel features and characteristic of the disclosure are set forth in the appended claims. The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and together with the description, serve to explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of system and/or methods in accordance with embodiments of the present subject matter are now described below, by way of example only, and with reference to the accompanying figures.
[0020] The drawings show in:
[0021] Fig. 1 a schematic view of a driver area; and
[0022] Fig. 2 a block diagram of an exemplary method according to the present invention.
[0023] In the figures the same elements or elements having the same function are indicated by the same reference signs.
DETAILED DESCRIPTION
[0024] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration". Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0025] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawing and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0026] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion so that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by "comprises" or "comprise" does not or do not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
[0027] In the following detailed description of the embodiment of the disclosure, reference is made to the accompanying drawings that form part hereof, and in which is shown by way of illustration a specific embodiment in which the disclosure may be practiced. This embodiment is described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0028] Research in the fields of voice recognition and speech synthesis has made significant leaps over the past few years. These advancements have been compelling enough to create consumer products like voice assistance systems. Car companies have also created their own assistants catering to in-car usage. Hardware and software systems have been investigated that will be capable of augmenting digital assistants through emotion recognition and response.
[0029] The path to emotional intelligence starts with the hardware. Thermal cameras, cameras and oximeters integrated into the cabin and steering wheel will help provide the machine with insight into biological indicators of the emotional state. Microphones throughout the cabin will also be used to detect vocal signals of the emotional state.
[0030] On the software side, machine learning models are used to aid the detection of emotional cues. For example, model-based facial recognition will be used with cabin cameras to detect facial expressions commonly correlated with various emotions. Emotions in vocal signals can be detected and verified through changes in frequency, pitch and intensity, as sketched below. Thermal cameras and oximeters can easily detect signals of distress in many situations.
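Such changes in frequency, pitch and intensity can be extracted with standard audio signal processing tools. A sketch using the librosa library is given below; the chosen pitch range and feature set are assumptions for illustration, not the patent's specification.

```python
import numpy as np
import librosa

def vocal_emotion_features(path: str) -> dict:
    """Extract coarse pitch and intensity statistics from a speech recording."""
    y, sr = librosa.load(path, sr=16000)
    # Fundamental frequency track (pitch); unvoiced frames come back as NaN.
    f0, _, _ = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
    rms = librosa.feature.rms(y=y)[0]          # frame-wise intensity
    return {
        "pitch_mean": float(np.nanmean(f0)),
        "pitch_std": float(np.nanstd(f0)),     # pitch variability
        "intensity_mean": float(rms.mean()),
        "intensity_std": float(rms.std()),
    }
```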
[0031] This combination of hardware and software will provide the in-car system with contextual data on the user's emotional state at any given time, creating a baseline of emotion recognition. To create proper responses to conversations with emotional content, proper modelling of human interactions is necessary.
[0032] The emotion recognition system described above will be used in a similar manner to build up a model of emotionally intelligent conversations. These models will then be used to form responses of a digital assistant. For instance, if the user is detected to be in an upset state, the car voice assistance system will speak in a softer manner.
[0033] In this example, the assistance system of the vehicle is a car voice assistance system. The driver data are voice signals of the driver. The emotional data include the information that the driver is in an upset state. Control data for controlling the car voice assistance system are determined depending on the emotional data. Finally, the car voice assistance system is controlled on the basis of these control data.
[0034] In the present example, acoustic signals are used to alter the state of the car as a system. Specifically, voice signals are captured to assess the emotional state of the user. This emotional state is then used to trigger an emotional response by the car. The car's reaction can in turn affect the user's emotional state, forming a closed feedback loop.
[0035] There are several differences with respect to the prior art. Specifically, based on the user's emotion, there are several different outputs that match the user's emotion and respond accordingly, with a specific set of lexical and acoustic values being developed. Furthermore, the system can not only respond via a voice output. Alternatively, or additionally, a user interface, interior lights and an artificial intelligence visualization may also interact differently based on the user's emotions.
[0036] Fig. 1 shows a concrete embodiment of an arrangement comprising an assistance system controlled according to the invention. Specifically, Fig. 1 shows a driver's area. A camera 1 may be mounted at a windscreen 2 of a vehicle; however, the camera may be placed wherever applicable to this invention. Various camera sensors, represented by camera 1, may be used to detect visual and thermal cues of emotion in the driver's facial expressions.
[0037] An infotainment system 3 provides visual or acoustical information to the driver. The infotainment system 3 may employ a machine-learning algorithm to detect the user's current emotional state. In this case, the processing means for determining the emotional data from the driver data and the controlling means for controlling the assistance system are integrated into the infotainment system 3, i.e. the assistance system.
[0038] A steering wheel 4 may house a sensor for gathering biometrical data from the driver. In particular, a pulse oximeter may be embedded in the steering wheel 4 in order to detect biological signals of the emotional state of the user/driver.
[0039] Furthermore, a seat 5 of the driver may be equipped with additional sensors to provide driver data. For instance, these additional sensors may capture indications of the emotional state of the driver when the driver is acting nervously in the seat, which is clear information about the driver's emotional state. In another situation, the driver may calmly rest in his seat. Thus, emotion detectors in the seat 5 may provide respective driver data.
[0040] Additionally, haptic actuators may be built into the steering wheel 4 or the seat 5. These actuators may be controlled depending on the emotion of the driver.
[0041] On the software side, known algorithms can be employed for emotion detection. Thereby, cues from facial recognition can be used. Optionally, also cues from thermal imaging can be used for emotion detection.
[0042] In a specific embodiment, an ECG (electrocardiogram) signal may be gathered as driver data. Heart rate variability (HRV) information may be obtained from the ECG signal, and Poincaré plots and Poincaré indices can be computed from the HRV data. In parallel, the finger pulse activity of the driver may be measured, from which the pulse rate variability (PRV) can be determined. Data from the ECG and the finger pulse activity can be used individually or in combination when feeding a classifier. The classifier may determine the respective class of emotion of the driver. Alternatively, the data from the ECG signal and the data from the finger pulse activity are classified separately in individual classifiers. The classification results of both classifiers can then be combined in a combining classifier which outputs a combined emotional result. This combined emotional result can be used together with the results of the other two classifiers to determine the emotional state of the driver.
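A non-authoritative sketch of the Poincaré computation and of one possible combination rule follows; the majority-vote fusion is an assumption chosen for illustration, since the patent does not prescribe a specific combiner.

```python
import numpy as np
from collections import Counter

def poincare_indices(rr_ms):
    """SD1/SD2 Poincaré indices from successive RR intervals (milliseconds)."""
    rr = np.asarray(rr_ms, dtype=float)
    x, y = rr[:-1], rr[1:]                       # each interval vs. its successor
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)   # short-term variability
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)   # long-term variability
    return sd1, sd2

def fuse_emotions(ecg_label, prv_label, combined_label):
    """Majority vote over the two individual classifiers and the combiner."""
    return Counter([ecg_label, prv_label, combined_label]).most_common(1)[0][0]
```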
[0043] Emotional data can be gathered with several other techniques. For instance, data may be collected from human conversations that exhibit emotional intelligence. Furthermore, data from an in-car system may be recorded and used to build conversational models for training the artificial intelligence that creates the responses of the in-car digital assistant.
[0044] Fig. 2 shows a block diagram of an embodiment of a method according to the present invention. The method starts in a first step with collecting S1 driver data of a driver 6 of the vehicle. Determining S2 emotional data from the driver data is the second step. The emotional data represent an emotional state of the driver 6. Determining S3 control data depending on the emotional data is performed in a third step. Controlling S4 the assistance system of the vehicle with the control data is performed in a fourth step. The variation of the assistance system, specifically the user interface, will affect the emotion of the driver 6. In turn, the sensors for collecting S1 the driver data will detect the actual emotion of the driver. Thus, there may be a closed feedback loop including the steps S1 to S4 and the emotional reaction of the driver, as depicted in Fig. 2.
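Under the same hypothetical interfaces as in the pipeline sketch above, this closed feedback loop can be summarized as:

```python
import time

def emotion_feedback_loop(sensors, system, period_s: float = 1.0):
    """Run steps S1-S4 repeatedly; the system's reaction feeds back into S1."""
    while True:
        driver_data = collect_driver_data(sensors)   # S1
        emotion = determine_emotion(driver_data)     # S2
        control = determine_control_data(emotion)    # S3
        control_assistance_system(system, control)   # S4
        time.sleep(period_s)  # the driver reacts; new data is sensed next pass
```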
Reference signs
1 Camera
2 Windscreen
3 Infotainment system
4 Steering wheel
5 Seat
6 Driver
S1 Step 1
S2 Step 2
S3 Step 3
S4 Step 4

Claims (7)

CLAIMS
1. A method for controlling an assistance system (3) of a vehicle, the method including the steps of
- collecting (S1) driver data of a driver (6) of the vehicle,
- determining (S2) emotional data from the driver data, the emotional data representing an emotion of the driver (6), and
- determining (S3) control data depending on the emotional data,
characterized by
- controlling (S4) the assistance system of the vehicle with the control data.
2. The method according to claim 1, characterized in that the driver data are collected from acoustical, optical and/or biometrical signals.
3. The method according to claim 1 or 2, characterized in that the emotional data is determined by an algorithm, preferably by a self-learning algorithm.
4. The method according to any one of claims 1 to 3, characterized in that augmented voice commands are output from the assistance system, wherein the augmented voice commands differ depending on the emotional data.
5. A device for controlling an assistance system (3) of a vehicle, the device including
- sensor means (1) for collecting driver data of a driver (6) of the vehicle and
- processing means for determining emotional data from the driver data, the emotional data representing an emotion of the driver (6), and for determining control data depending on the emotional data,
characterized by
- controlling means for controlling the assistance system of the vehicle with the control data.
6. The device according to claim 5, characterized in that the sensor means (1) comprise a camera, a microphone, a blood pressure monitor, a humidity sensor and/or an oximeter.
7. Arrangement comprising an assistance system and a device according to any one of claims 5 and 6, characterized in that the assistance system (3) includes a user interface, an interior light or a visualization unit, the output of which is controlled by the controlling means depending on the emotional data.
GB2105780.7A 2021-04-23 2021-04-23 Emotion recognition for artificially-intelligent system Withdrawn GB2606018A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2105780.7A GB2606018A (en) 2021-04-23 2021-04-23 Emotion recognition for artificially-intelligent system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2105780.7A GB2606018A (en) 2021-04-23 2021-04-23 Emotion recognition for artificially-intelligent system

Publications (2)

Publication Number Publication Date
GB202105780D0 GB202105780D0 (en) 2021-06-09
GB2606018A true GB2606018A (en) 2022-10-26

Family

ID=76193434

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2105780.7A Withdrawn GB2606018A (en) 2021-04-23 2021-04-23 Emotion recognition for artificially-intelligent system

Country Status (1)

Country Link
GB (1) GB2606018A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2528083A (en) * 2014-07-08 2016-01-13 Jaguar Land Rover Ltd System and method for automated device control for vehicles using driver emotion
US20180281811A1 (en) * 2017-03-29 2018-10-04 Mazda Motor Corporation Method and system of assisting driving of vehicle
CN110910881A (en) * 2019-12-02 2020-03-24 苏州思必驰信息科技有限公司 Control method and device based on voice recognition and computer readable storage medium
KR20200055171A (en) * 2018-11-07 2020-05-21 현대모비스 주식회사 Apparatus and method for supporting safe driving
CN112078588A (en) * 2020-08-11 2020-12-15 大众问问(北京)信息科技有限公司 Vehicle control method and device and electronic equipment
WO2021039779A1 (en) * 2019-08-30 2021-03-04 株式会社デンソー Vehicle control device

Also Published As

Publication number Publication date
GB202105780D0 (en) 2021-06-09

Similar Documents

Publication Publication Date Title
US10322728B1 (en) Method for distress and road rage detection
Hoch et al. Bimodal fusion of emotional data in an automotive environment
US7729914B2 (en) Method for detecting emotions involving subspace specialists
Kröger et al. Towards a neurocomputational model of speech production and perception
US7373301B2 (en) Method for detecting emotions from speech using speaker identification
CN110422174A (en) Biometric sensor is merged to classify to Vehicular occupant state
WO2017219319A1 (en) Automatic vehicle driving method and automatic vehicle driving system
KR102476675B1 (en) Method and server for smart home control based on interactive brain-computer interface
CN112766173B (en) Multi-mode emotion analysis method and system based on AI deep learning
JP2017007652A (en) Method for recognizing a speech context for speech control, method for determining a speech control signal for speech control, and apparatus for executing the method
CN107554528A (en) Level of fatigue detection method and device, storage medium, the terminal of driver and crew
KR20200115692A (en) A Deep learning-based real time emotional recognition system using bi-signal and methodology.
EP2224425A1 (en) An audio signal processing system and autonomous robot having such system
CN115205729A (en) Behavior recognition method and system based on multi-mode feature fusion
US11697420B2 (en) Method and device for evaluating a degree of fatigue of a vehicle occupant in a vehicle
CN108875617A (en) Auxiliary driving method and device, vehicle
JP2018031918A (en) Interactive control device for vehicle
CN112617829A (en) Method and device for recognizing a safety-relevant emotional state of a driver
KR101950721B1 (en) Safety speaker with multiple AI module
CN111477226B (en) Control method, intelligent device and storage medium
GB2606018A (en) Emotion recognition for artificially-intelligent system
JP2004314750A (en) Vehicle instrument operation control device
JP2018190318A (en) Data collecting apparatus and learning apparatus
CN110503943A (en) A kind of voice interactive method and voice interactive system
Bojanić et al. Application of neural networks in emotional speech recognition

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)