WO2020255238A1 - Information processing device, program, and information processing method - Google Patents

Information processing device, program, and information processing method Download PDF

Info

Publication number
WO2020255238A1
Authority
WO
WIPO (PCT)
Prior art keywords
level
head
driver
unit
evaluation
Prior art date
Application number
PCT/JP2019/024021
Other languages
English (en)
Japanese (ja)
Inventor
Takuya Murakami
Yudai Nakamura
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2019557640A (JP6689470B1)
Priority to DE112019007484.9T (DE112019007484T5)
Priority to PCT/JP2019/024021 (WO2020255238A1)
Publication of WO2020255238A1

Links

Images

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/06 Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 Determining posture transitions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0818 Inactivity or incapacity of driver
    • B60W2040/0827 Inactivity or incapacity of driver due to sleepiness
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2520/00 Input parameters relating to overall vehicle dynamics
    • B60W2520/10 Longitudinal speed
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/223 Posture, e.g. hand, foot, or seat position, turned or inclined

Definitions

  • the present invention relates to an information processing device, a program, and an information processing method.
  • Driver monitoring systems have been put into practical use.
  • Driver monitoring systems are becoming more widespread not only in commercial vehicles such as trucks and buses, but also in general vehicles.
  • The main functions installed in driver monitoring systems are safe-driving support functions that detect drowsy driving and inattentive (looking-aside) driving. Further, at automated driving Level 3 (Lv 3), driving authority is handed over from the automated driving system to a human, so these functions are also required as a technique for determining whether the driver is in a state in which he or she can drive.
  • In such systems, the driver's face is photographed by a camera installed in the vehicle, and the driver's condition is estimated from the facial expression and the like.
  • As a technique for estimating drowsiness, a technique that detects the degree of eye opening is often used, as described in Patent Document 1.
  • an object of the present invention is to enable the driver's alertness level to be detected even when the driver's eyes are hidden.
  • The information processing device according to one aspect of the present invention is characterized by comprising a posture holding function evaluation unit that evaluates, from an image including the driver's head, the state of the posture holding function, which is the driver's function of holding a posture, and an alertness estimation unit that estimates the driver's alertness level using the evaluation.
  • The program according to one aspect of the present invention is characterized by causing a computer to function as a posture holding function evaluation unit that evaluates, from an image including the driver's head, the state of the posture holding function, which is the driver's function of holding a posture, and as an alertness estimation unit that estimates the driver's alertness level using the evaluation.
  • In the information processing method according to one aspect of the present invention, a posture holding function evaluation unit evaluates, from an image including the driver's head, the state of the posture holding function, which is the driver's function of holding a posture, and an alertness estimation unit estimates the driver's alertness level using the evaluation.
  • According to the present invention, the driver's alertness level can be detected even when the driver's eyes are hidden.
  • FIG. 1 is a block diagram showing the schematic configuration of the information processing device according to Embodiment 1.
  • FIG. 2 is a schematic diagram for explaining the coordinate system in Embodiment 1.
  • FIG. 3 is a schematic diagram showing an arrangement example, in a vehicle, of the information processing device according to Embodiment 1.
  • FIG. 4 is a block diagram showing a hardware configuration example of the information processing device according to Embodiment 1.
  • FIG. 5 is a flowchart showing the operation of the information processing device according to Embodiment 1.
  • FIG. 6 is a block diagram showing the schematic configuration of the information processing device according to Embodiment 2.
  • FIG. 7 is a schematic diagram showing an arrangement example, in a vehicle, of the information processing device according to Embodiment 2.
  • FIG. 8 is a flowchart showing the operation of the information processing device according to Embodiment 2.
  • FIG. 1 is a block diagram schematically showing the configuration of the information processing device 100, which is the alertness estimation device according to the first embodiment. As shown in FIG. 1, the information processing device 100 is connected to the image pickup device 130 and the alertness decrease notification device 140.
  • The information processing device 100 receives input of image data representing the image captured by the image pickup device 130, estimates the driver's alertness level, and, if it determines from the estimated level that the driver's alertness has decreased, notifies the alertness decrease notification device 140.
  • The image pickup device 130 may be, for example, a CCD (Charge Coupled Device) camera or the like, and the alertness decrease notification device 140 may be a display, a speaker, or the like.
  • The information processing device 100 includes an input interface unit (hereinafter, input I/F unit) 101, a posture holding function evaluation unit 102, an alertness estimation unit 106, and an output interface unit (hereinafter, output I/F unit) 107.
  • the posture holding function evaluation unit 102 includes a head coordinate detection unit 103, a head inclination angle detection unit 104, and an evaluation unit 105.
  • The input I/F unit 101 is an input unit that receives input of image data from the image pickup device 130.
  • The input I/F unit 101 gives the input image data to the head coordinate detection unit 103 and the head inclination angle detection unit 104.
  • In the image indicated by the image data, only the driver may appear, or other occupants may also appear. Further, as long as the head coordinates and the head inclination angle can be detected from it, the image may be an RGB (Red Green Blue) image, an IR (infrared) image, or a distance image.
  • the posture holding function evaluation unit 102 evaluates the state of the posture holding function, which is a function for the driver to hold the posture, from the image shown by the image data.
  • The head coordinate detection unit 103 detects the head coordinates, which are the coordinates of the driver's head, from the image indicated by the image data given from the input I/F unit 101.
  • The head coordinate detection unit 103 gives the detected head coordinates to the evaluation unit 105.
  • Specifically, the head coordinate detection unit 103 detects the driver's face from the image indicated by the image data given from the input I/F unit 101.
  • the face detection method which is a method of detecting the driver's face, is not particularly limited.
  • For example, the head coordinate detection unit 103 may detect the driver's face using an AdaBoost-based detector with the commonly used Haar-like features.
  • the "Haar-like feature amount” is a feature amount that captures the feature by the difference in brightness of the image, and is less susceptible to fluctuations in lighting conditions or noise than when the pixel value is used as the feature amount as it is.
  • AdaBoost is an abbreviation for Adaptive Boosting, and is a machine learning algorithm that improves performance by using a plurality of weak classifiers in combination.
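  • As a concrete illustration (not part of the disclosure), the following minimal Python sketch shows a face detector of the kind described above, using OpenCV's pretrained Haar cascade, which is an AdaBoost-trained cascade over Haar-like features; the parameter values are illustrative assumptions:

```python
# Illustrative sketch only: the patent does not prescribe OpenCV or these
# parameters. OpenCV's bundled cascade is an AdaBoost-trained classifier
# over Haar-like features, matching the technique described above.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face(gray_image):
    """Return the largest face detection frame as (x, y, w, h), or None."""
    faces = cascade.detectMultiScale(
        gray_image, scaleFactor=1.1, minNeighbors=5, minSize=(60, 60))
    if len(faces) == 0:
        return None
    # Assume the driver is the largest detected face in the cabin image.
    return max(faces, key=lambda f: int(f[2]) * int(f[3]))
```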
  • the head coordinate detection unit 103 detects the head coordinates, which are the coordinates of the driver's head, from the face detection frame, which is an area including the detected face.
  • The head coordinates may be calculated as the center of the face detection frame, or a representative point of the head may be set from the facial feature points and used as the head coordinates.
  • For example, the midpoint between the left and right eyes may be used as the representative point, or the representative point may be estimated as the midpoint between the outer corners of the eyes.
  • the "face detection frame” is a frame indicating the area where the face exists in the image, and includes the feature points of the face. In many cases, the shape of the frame is square. "Face feature points” are coordinates that constitute faciality, and refer to coordinates such as contour, nostrils, outer corners of eyes, inner corners of eyes, corners of mouth, lips, and eyebrows.
  • the head coordinate detection unit 103 provides the head tilt angle detection unit 104 with face area information indicating the detected face detection frame.
  • The head inclination angle detection unit 104 detects the head inclination angle, which is the inclination angle of the driver's head, from the image indicated by the image data given from the input I/F unit 101.
  • The head tilt angle detection unit 104 gives the detected head tilt angle to the evaluation unit 105.
  • Specifically, the head tilt angle detection unit 104 detects the tilt angle of the driver's head from the face area image, i.e., the portion of the image indicated by the image data given from the input I/F unit 101 that lies within the face detection frame indicated by the face area information given from the head coordinate detection unit 103.
  • For example, the head inclination angle detection unit 104 may create detectors from images captured at every 10 degrees of head inclination and apply these detectors to the face area image to detect the head inclination angle.
  • the head tilt angle detection unit 104 may specify a plurality of feature points from the face region image and detect the head tilt angle from the distance between the feature points.
  • For example, the head tilt angle detection unit 104 may identify two feature points aligned in the vertical direction of the face area image and detect the head tilt angle from the distance between them: as the head tilt angle increases, the distance between the two feature points decreases. In this case, if image data captured by a sensor that provides distance, such as a stereo camera or a TOF (Time of Flight) camera, is used, the distance can be obtained accurately and easily.
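  • The following sketch illustrates one plausible geometric reading of the passage above, in which the observed vertical distance between two feature points shrinks roughly with the cosine of the pitch angle; this foreshortening model is an assumption for illustration, not a formula given in the disclosure:

```python
import math

def pitch_from_foreshortening(observed_dist: float,
                              upright_dist: float) -> float:
    """Estimate head pitch in degrees from the shrinkage of the vertical
    distance between two facial feature points, assuming the simple model
    observed_dist = upright_dist * cos(pitch)."""
    ratio = max(-1.0, min(1.0, observed_dist / upright_dist))
    return math.degrees(math.acos(ratio))
```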
  • The evaluation unit 105 evaluates the state of the driver's posture holding function using the variation, over a predetermined period, of the head coordinates detected by the head coordinate detection unit 103 and the variation, over the same period, of the head inclination angle detected by the head inclination angle detection unit 104.
  • Specifically, the evaluation unit 105 evaluates the state of the driver's posture holding function using the variance of the head coordinates detected by the head coordinate detection unit 103 over a predetermined period and the variance of the head inclination angle detected by the head inclination angle detection unit 104 over the same period.
  • The evaluation unit 105 calculates the variance of the head coordinates over a predetermined period. Specifically, the evaluation unit 105 calculates the variance of the head coordinates for the last 20 seconds. In this case, assuming that the image pickup device 130 supplies image data at 30 fps (frames per second), the evaluation unit 105 records 600 frames of the head coordinate along the direction of the earth's gravity and calculates the variance of those coordinates. As shown in FIG. 2, the direction of the earth's gravity is taken as the Y axis here. The frame rate of the image pickup device 130 and the period over which the variance is calculated are not limited to this example. Further, in the example shown in FIG. 2, the direction around the X axis is the Pitch direction, the direction around the Y axis is the Yaw direction, and the direction around the Z axis is the Roll direction.
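  • The rolling-window variance just described can be sketched as follows (illustrative Python; the 30 fps rate and 20-second window are the example values from the text):

```python
from collections import deque
import statistics

FPS = 30
WINDOW_SECONDS = 20
WINDOW_FRAMES = FPS * WINDOW_SECONDS  # 600 frames, as in the example above

y_history = deque(maxlen=WINDOW_FRAMES)

def record_and_variance(head_y: float) -> float:
    """Record the head Y coordinate (direction of gravity) for the current
    frame and return its variance over the most recent 20-second window."""
    y_history.append(head_y)
    if len(y_history) < 2:
        return 0.0
    return statistics.pvariance(y_history)
```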
  • In the above, the evaluation unit 105 calculates the variance of the coordinates along the direction of the earth's gravity, but the variance of the coordinates along the direction horizontal to the ground may be calculated in the same way. The direction horizontal to the ground is taken as the X axis here.
  • The evaluation unit 105 also calculates the variance of the head inclination angle over a predetermined period. Specifically, the evaluation unit 105 calculates the variance of the head inclination angle for the last 20 seconds. In this case, assuming that the image pickup device 130 supplies image data at 30 fps, the evaluation unit 105 records 600 frames of the angle in the Pitch direction, which is the inclination angle with respect to the direction of the earth's gravity, and calculates its variance.
  • In the above, the evaluation unit 105 records the angle in the Pitch direction and calculates its variance, but, as described above, it may instead record the angle in the Yaw direction or the Roll direction and calculate that variance.
  • The evaluation unit 105 calculates an evaluation value indicating the state of the posture holding function from the variance of the head coordinates and the variance of the head inclination angle. For example, the evaluation unit 105 calculates as the evaluation value either the variance of the head coordinates over the predetermined period divided by the variance of the head inclination angle over the same period, or the variance of the head inclination angle over the predetermined period divided by the variance of the head coordinates over the same period.
  • For example, the evaluation unit 105 calculates the evaluation value by the following equation (1):
  • Evaluation value = (variance of Pitch angle) ÷ (variance of Y coordinate) … (1)
  • Alternatively, the evaluation unit 105 may calculate the evaluation value by the following equation (2) or (3):
  • Evaluation value = (variance of X coordinate) ÷ (variance of Yaw angle) … (2)
  • Evaluation value = (variance of Roll angle) ÷ (variance of X coordinate) … (3)
  • the evaluation unit 105 gives the evaluation value calculated as described above to the alertness estimation unit 106.
  • With equation (1), a large evaluation value means that the head is swinging in the vertical (Pitch) direction while its position changes little, so it can be judged that the driver's alertness has decreased and the posture holding function is not being exercised.
  • With equation (2), a large evaluation value means that the head is swaying from side to side without rotating, so it can likewise be judged that the driver's alertness has decreased and the posture holding function is not being exercised.
  • With equation (3), a large evaluation value means that the head is rolling from side to side while its position changes little, so it can be judged that the driver's alertness has decreased and the posture holding function is not being exercised.
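  • For illustration, equations (1) to (3) can be computed as below; the small epsilon guard is an implementation assumption added to avoid division by zero, not part of the disclosure:

```python
EPSILON = 1e-6  # assumed guard for near-zero variance, not in the disclosure

def evaluation_values(var_pitch: float, var_yaw: float, var_roll: float,
                      var_x: float, var_y: float):
    """Evaluation values per equations (1)-(3): ratios of angle variance to
    position variance (or vice versa) over the same observation window."""
    ev1 = var_pitch / (var_y + EPSILON)  # (1) nodding with little translation
    ev2 = var_x / (var_yaw + EPSILON)    # (2) lateral sway with little rotation
    ev3 = var_roll / (var_x + EPSILON)   # (3) head roll with little translation
    return ev1, ev2, ev3
```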
  • The alertness estimation unit 106 estimates the driver's alertness level using the evaluation performed by the posture holding function evaluation unit 102. For example, the alertness estimation unit 106 estimates the driver's alertness level based on the evaluation value given from the evaluation unit 105.
  • The alertness level may be divided into, for example, the six levels that can be estimated from facial expressions or body movements, as in the known evaluation method defined by NEDO (New Energy and Industrial Technology Development Organization), or it may be divided into other alertness levels.
  • The alertness estimation unit 106 sets a threshold of the evaluation value for each level and estimates the alertness level; in other words, a corresponding level is specified in advance for each range of the evaluation value. Then, when the estimated alertness level falls below a predetermined threshold, the alertness estimation unit 106 issues a notification instruction to the alertness decrease notification device 140 via the output I/F unit 107.
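  • A sketch of the per-level thresholding described above follows; the disclosure sets a threshold per level but gives no numbers, so the boundaries here are placeholders:

```python
# Placeholder boundaries, descending: a larger evaluation value indicates a
# less exercised posture holding function, i.e. lower alertness.
BOUNDARIES = [8.0, 4.0, 2.0, 1.0, 0.5]

def alertness_level(evaluation_value: float) -> int:
    """Map an evaluation value to one of six levels (1 = least alert,
    6 = fully alert) by comparing against per-level thresholds."""
    for level, boundary in enumerate(BOUNDARIES, start=1):
        if evaluation_value >= boundary:
            return level
    return 6
```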
  • The output I/F unit 107 is an output unit for outputting instructions and the like to the alertness decrease notification device 140.
  • The alertness decrease notification device 140 notifies the driver and passengers of the decrease in alertness in response to the notification instruction from the alertness estimation unit 106.
  • For example, the alertness decrease notification device 140 uses the speaker and the display to notify the driver or the like of the decrease in alertness, urging a break or indicating that driving authority cannot be handed over.
  • FIG. 3 is a schematic view showing an example of arrangement of the information processing device 100 according to the first embodiment in the vehicle 150.
  • In the vehicle 150, an in-vehicle camera 151 that functions as the image pickup device 130, the information processing device 100, a storage device 152, and a speaker 153 and a display 154 that function as the alertness decrease notification device 140 are arranged.
  • the in-vehicle camera 151 captures an image of the upper body including the driver's head to be detected.
  • the in-vehicle camera 151 is connected to the information processing device 100.
  • the in-vehicle camera 151 shown in FIG. 3 may be connected via wiring such as a wire harness, or may be connected wirelessly.
  • the in-vehicle camera 151 may be a general RGB camera or an IR camera. Further, the in-vehicle camera 151 may be a Depth sensor.
  • An "RGB camera” is a camera that communicates signals of three colors of red, green, and blue using three different cables and the like, and generally uses three independent CCD sensors.
  • An "IR camera” is an infrared camera, which is sensitive to wavelengths in the infrared region.
  • the "Dept sensor” is also referred to as a 3D (three dimensions) sensor, a distance measuring sensor, or a Dept camera, and is a sensor capable of generating a depth image in addition to a two-dimensional image.
  • Distance measuring methods include the stereo camera method, the pattern projection method, the TOF method, and the like; a sensor of any of these types may be used.
  • FIG. 4 is a block diagram showing a hardware configuration example of the information processing apparatus 100 according to the first embodiment.
  • The information processing device 100 includes a CPU (Central Processing Unit) 160, a program memory 161, a data memory 162, an interface 163, and a bus 164 connecting them.
  • An in-vehicle camera 151, a speaker 153, and a display 154 are connected to the interface 163.
  • The head coordinate detection unit 103, the head inclination angle detection unit 104, the evaluation unit 105, and the alertness estimation unit 106 can be implemented by the CPU 160 executing a program stored in the program memory 161.
  • The input I/F unit 101 and the output I/F unit 107 can be configured by the interface 163.
  • the information processing apparatus 100 can be realized by a computer that stores a specific program.
  • the CPU 160 operates according to the program stored in the program memory 161.
  • the CPU 160 stores various data in the data memory 162 in the process of operation.
  • the speaker 153 and the display 154 are controlled via the interface 163.
  • the operation performed by the information processing apparatus 100 will be described with reference to FIG.
  • FIG. 5 is a flowchart showing the operation of the information processing apparatus 100 according to the first embodiment.
  • The flowchart shown in FIG. 5 includes an image data acquisition step (S10), a face detection step (S11), a head coordinate detection step (S12), a head coordinate variance calculation step (S13), a head inclination angle detection step (S14), a head inclination angle variance calculation step (S15), an evaluation value calculation step (S16), an alertness estimation step (S17), and an alertness decrease notification step (S18).
  • In the image data acquisition step S10, the input I/F unit 101 acquires image data of an image of the upper body including the driver's head.
  • In the face detection step S11, the head coordinate detection unit 103 detects the driver's face from the image indicated by the acquired image data.
  • The head coordinate detection unit 103 then provides the head inclination angle detection unit 104 with face area information indicating the face detection frame, which is the area including the detected face.
  • In the head coordinate detection step S12, the head coordinate detection unit 103 detects the head coordinates from the face detection frame, which is the area including the face detected in the face detection step S11.
  • In the head coordinate variance calculation step S13, the evaluation unit 105 calculates the variance of the head coordinates.
  • In the head inclination angle detection step S14, the head tilt angle detection unit 104 detects the head tilt angle from the face detection frame, which is the area including the face detected in the face detection step S11.
  • In the head inclination angle variance calculation step S15, the evaluation unit 105 calculates the variance of the head tilt angle detected in the head inclination angle detection step S14.
  • In the evaluation value calculation step S16, the evaluation unit 105 calculates an evaluation value indicating the state of the posture holding function from the variance of the head coordinates and the variance of the head inclination angle.
  • In the alertness estimation step S17, the alertness estimation unit 106 estimates the alertness level from the evaluation value calculated in the evaluation value calculation step S16.
  • In the alertness decrease notification step S18, when the alertness level estimated in the alertness estimation step S17 falls below a predetermined threshold, the alertness estimation unit 106 instructs the alertness decrease notification device 140, via the output I/F unit 107, to notify the driver and the like of the decrease in alertness.
  • Upon receiving such an instruction, the alertness decrease notification device 140 uses at least one of the speaker 153 and the display 154 to notify the driver, passengers, and the like of the decrease in alertness, to prompt a break, or to indicate that driving authority cannot be handed over.
  • As described above, according to Embodiment 1, the driver's alertness level can be estimated from the state of the posture holding function.
  • the driver, passengers, and the like can be notified according to the estimation result.
  • Since the alertness level is estimated by evaluating the state of the head's posture holding function from the head coordinates and the head inclination angle, the alertness level can be estimated even when the eyes are hidden.
  • Since no sensor other than a camera is required to estimate the alertness level, cost reduction and miniaturization are possible.
  • Since the inclination angle of the head is used for the estimation, it is possible to distinguish whether a movement of the head is due to the driver turning his or her face or due to vibration of the vehicle. As a result, erroneous detection due to head movement caused by the driver turning around to confirm safety can be prevented.
  • FIG. 6 is a block diagram schematically showing the configuration of the information processing device 200, which is the alertness estimation device according to the second embodiment. As shown in FIG. 6, the information processing device 200 is connected to the image pickup device 130 and the alertness decrease notification device 140.
  • the image pickup device 130 and the alertness decrease notification device 140 in the second embodiment are the same as the image pickup device 130 and the alertness decrease notification device 140 in the first embodiment.
  • CAN is an abbreviation for Controller Area Network, a network standard for connecting electronic circuits and various devices, developed as a communication technology between in-vehicle devices.
  • The information processing device 200 includes an input I/F unit 201, a posture holding function evaluation unit 102, an alertness estimation unit 206, an output I/F unit 107, a PERCLOS calculation unit 208, and a yawn detection unit 209.
  • the posture holding function evaluation unit 102 includes a head coordinate detection unit 103, a head inclination angle detection unit 104, and an evaluation unit 105.
  • the posture holding function evaluation unit 102 and the output I / F unit 107 in the second embodiment are the same as the posture holding function evaluation unit 102 and the output I / F unit 107 in the first embodiment.
  • The head coordinate detection unit 103 of the posture holding function evaluation unit 102 gives the face area information indicating the detected face detection frame not only to the head inclination angle detection unit 104 but also to the PERCLOS calculation unit 208 and the yawn detection unit 209.
  • The input I/F unit 201 receives input of image data from the image pickup device 130.
  • The input I/F unit 201 gives the input image data to the head coordinate detection unit 103, the head inclination angle detection unit 104, the PERCLOS calculation unit 208, and the yawn detection unit 209. Further, the input I/F unit 201 receives, from the CAN 270, input of vehicle information indicating the state of the vehicle driven by the driver, such as the steering angle of the steering wheel, the gear position, or the vehicle speed.
  • the input I / F unit 201 gives the input vehicle information to the alertness estimation unit 206.
  • The PERCLOS calculation unit 208 is an eye closure ratio calculation unit that detects, from the image indicated by the image data given from the input I/F unit 201, the eye opening degree, which is the extent to which the driver's eyes are open, and calculates from it the eye closure ratio, which is the proportion of time within a fixed period during which the driver's eyes are closed.
  • the PERCLOS calculation unit 208 calculates the PERCLOS of the driver.
  • PERCLOS is an abbreviation for percentage of eye closure, the proportion of eye-closure time within a certain period. PERCLOS is known to correlate highly with the subjective drowsiness of the subject (here, the driver) and can be used as a drowsiness index.
  • The PERCLOS calculation unit 208 calculates the eye opening degree from the distance between the upper and lower eyelids in the face area image, i.e., the portion of the image indicated by the image data given from the input I/F unit 201 that corresponds to the face detection frame indicated by the face area information given from the head coordinate detection unit 103. Specifically, the PERCLOS calculation unit 208 obtains the midpoint between the outer and inner corners of the eye on the arc of the upper eyelid and on the arc of the lower eyelid, and takes as the eye opening degree the ratio of the current distance between the upper and lower eyelids to the distance between them at maximum eye opening.
  • The distance between the upper and lower eyelids at maximum eye opening is determined in advance. The eye opening degree so defined is calculated for each frame, taking the value 0% when the eyes are closed and 100% when fully awake.
  • the PERCLOS calculation unit 208 calculates PERCLOS from the calculated eye opening degree.
  • For each frame, the PERCLOS calculation unit 208 compares the calculated eye opening degree against a predetermined threshold to determine whether the eyes are open or closed, and counts a frame judged closed as an eye-closure frame. For example, when the PERCLOS calculation unit 208 has a one-minute buffer and calculates PERCLOS from images captured by the image pickup device 130 at 10 fps, PERCLOS can be calculated by dividing the number of eye-closure frames accumulated in one minute by 600 frames.
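  • A minimal sketch of this PERCLOS computation follows (the closed-eye threshold is a placeholder; the 10 fps rate and one-minute buffer are the example values from the text):

```python
from collections import deque

PERCLOS_FPS = 10
BUFFER_FRAMES = PERCLOS_FPS * 60   # one-minute buffer: 600 frames at 10 fps
CLOSED_THRESHOLD = 0.2             # placeholder eye-opening-degree threshold

closed_history = deque(maxlen=BUFFER_FRAMES)

def update_perclos(eye_opening_degree: float) -> float:
    """Mark the current frame as an eye-closure frame when the eye opening
    degree (0.0 = closed, 1.0 = fully open) is below the threshold, then
    return the closed-frame ratio over the last minute (PERCLOS)."""
    closed_history.append(eye_opening_degree < CLOSED_THRESHOLD)
    return sum(closed_history) / len(closed_history)
```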
  • The yawn detection unit 209 detects, from the image indicated by the image data given from the input I/F unit 201, the mouth opening degree, which is the extent to which the driver's mouth is open, and detects from it whether the driver has yawned.
  • The yawn detection unit 209 detects the mouth opening degree from the distance between the upper and lower lips, using the facial feature points in the face area image, i.e., the portion of the image indicated by the image data given from the input I/F unit 201 that corresponds to the face detection frame indicated by the face area information given from the head coordinate detection unit 103.
  • Specifically, the yawn detection unit 209 obtains the midpoint between the left and right mouth corners on the arc of the upper lip and on the arc of the lower lip, and calculates as the mouth opening degree the ratio of the current distance between the upper and lower midpoints to the distance between them at maximum mouth opening.
  • the yawn detection unit 209 detects the driver's yawn based on the calculated opening degree.
  • Characteristics of yawning include the mouth remaining open continuously for a certain period, the nostrils widening, tears forming, and a deep inhalation occurring before the yawn; here, the case where the feature that the mouth remains open continuously for a certain period is used will be described.
  • The yawn detection unit 209 determines that a frame is a yawn frame when the calculated mouth opening degree is equal to or greater than a predetermined threshold. The yawn detection unit 209 then determines that a yawn has occurred when yawn frames continue for at least a predetermined threshold time. Further, since yawns do not occur in rapid succession, the yawn detection unit 209 does not detect another yawn for a certain period after detecting one, which prevents erroneous detection.
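  • The yawn detector just described (opening threshold, minimum duration, refractory period) can be sketched as follows; all three constants are placeholders, since the disclosure fixes no values:

```python
OPEN_THRESHOLD = 0.6     # placeholder mouth-opening-degree threshold
MIN_YAWN_FRAMES = 20     # placeholder duration, e.g. 2 s at 10 fps
REFRACTORY_FRAMES = 100  # placeholder suppression window after a yawn

_open_run = 0
_cooldown = 0

def update_yawn(opening_degree: float) -> bool:
    """Return True on the frame a yawn is detected: the mouth stays open for
    a minimum number of consecutive frames, after which detection is
    suppressed for a refractory period to avoid double counting."""
    global _open_run, _cooldown
    if _cooldown > 0:
        _cooldown -= 1
        return False
    _open_run = _open_run + 1 if opening_degree >= OPEN_THRESHOLD else 0
    if _open_run >= MIN_YAWN_FRAMES:
        _open_run = 0
        _cooldown = REFRACTORY_FRAMES
        return True
    return False
```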
  • The alertness estimation unit 206 estimates the driver's alertness level from the vehicle information given from the input I/F unit 201, the evaluation value given from the evaluation unit 105, the PERCLOS given from the PERCLOS calculation unit 208, and the presence or absence of yawning given from the yawn detection unit 209.
  • For example, the alertness estimation unit 206 refers to the vehicle state indicated by the vehicle information and estimates the driver's alertness level only when the vehicle state does not correspond to a predetermined condition.
  • Here, the predetermined conditions are that reverse or park is selected as the gear, that the vehicle speed is below a predetermined threshold, or that the steering angle is above a predetermined threshold. When none of these conditions is met, the alertness estimation unit 206 estimates the alertness level, as sketched below.
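  • These gating conditions amount to a simple predicate like the following; the numeric thresholds are placeholders, since the disclosure only says "predetermined":

```python
def should_estimate(gear: str, speed_kmh: float,
                    steering_angle_deg: float) -> bool:
    """Return True only when none of the excluded vehicle states applies:
    reverse/park gear, low speed, or large steering angle."""
    if gear in ("reverse", "park"):
        return False
    if speed_kmh < 30.0:                # placeholder speed threshold
        return False
    if abs(steering_angle_deg) > 15.0:  # placeholder steering threshold
        return False
    return True
```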
  • In other words, the alertness estimation unit 206 estimates the driver's alertness level when the vehicle state indicated by the vehicle information shows that the vehicle is traveling straight ahead at or above a predetermined vehicle speed.
  • When the alertness estimation unit 206 determines that the driver's alertness should be estimated, it estimates the driver's alertness level from the vehicle information given from the input I/F unit 201, the evaluation value given from the evaluation unit 105, the PERCLOS given from the PERCLOS calculation unit 208, and the presence or absence of yawning given from the yawn detection unit 209.
  • Specifically, the alertness estimation unit 206 estimates an alertness level corresponding to each of the vehicle information given from the input I/F unit 201, the evaluation value given from the evaluation unit 105, the PERCLOS given from the PERCLOS calculation unit 208, and the presence or absence of yawning given from the yawn detection unit 209. The alertness estimation unit 206 then adopts the lowest of the alertness levels obtained at the same timing as the alertness level at that timing.
  • For the vehicle information, alertness levels may be associated in advance with the magnitude of the temporal change in the steering angle. Likewise, alertness levels may be associated in advance with the values of PERCLOS, and with the presence or absence of yawning.
  • For example, let the alertness level corresponding to the evaluation value be the first level, the alertness level corresponding to the vehicle state indicated by the vehicle information be the second level, the alertness level corresponding to PERCLOS be the third level, and the alertness level corresponding to the presence or absence of yawning be the fourth level. The alertness estimation unit 206 then identifies the final alertness level from the first to fourth levels. However, if the third level cannot be obtained for reasons such as the driver wearing sunglasses, the final alertness level is identified from the first, second, and fourth levels. Here, among these levels, the one indicating the lowest alertness is identified as the final alertness level.
  • For the alertness levels, the NEDO evaluation method may be used as in Embodiment 1, but any alertness level scale may be used. It is desirable to calculate the alertness level for each index together with its reliability.
  • The reliability is an index of how plausible an output value is: values output with high reliability are likely to be correct, while values output with low reliability are often incorrect.
  • For example, the lowest level may be adopted from among the indices whose reliability exceeds a predetermined threshold, as in the sketch below. As a result, even for a driver whose eye opening and closing shows no distinctive pattern, the alertness level can be determined correctly from yawning or from variation in the steering angle.
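  • The reliability-filtered fusion just described can be sketched as follows (the reliability threshold is a placeholder):

```python
from typing import Optional, Sequence, Tuple

RELIABILITY_THRESHOLD = 0.5  # placeholder; the disclosure fixes no value

def fuse_levels(per_index: Sequence[Tuple[Optional[int], float]]) -> Optional[int]:
    """per_index holds (alertness level, reliability) for each index:
    evaluation value, vehicle information, PERCLOS, and yawning. An
    unavailable index (e.g. PERCLOS behind sunglasses) is (None, 0.0).
    The lowest level among sufficiently reliable indices is adopted."""
    candidates = [level for level, reliability in per_index
                  if level is not None and reliability > RELIABILITY_THRESHOLD]
    return min(candidates) if candidates else None
```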
  • It is known that estimating the driver's alertness level from the degree of eyelid opening is effective, and that PERCLOS increases as the driver's alertness decreases.
  • A method of detecting open and closed eyes using an image pickup device 130 sensitive to infrared light is common, but the state of the eyes cannot be detected when the driver wears sunglasses with low infrared transmittance.
  • In that case, the alertness level based on PERCLOS is not estimated while sunglasses are worn, but a level estimated by another method can be applied.
  • FIG. 7 is a schematic view showing an example of arrangement of the information processing device 200 according to the second embodiment in the vehicle 150.
  • In the vehicle 150, an in-vehicle camera 151 that functions as the image pickup device 130, the information processing device 200, a storage device 152, a speaker 153 and a display 154 that function as the alertness decrease notification device 140, a vehicle speed sensor 255, a gearbox 256, and a steering wheel 257 are arranged.
  • the vehicle-mounted camera 151, the storage device 152, the speaker 153, and the display 154 in the second embodiment are the same as the vehicle-mounted camera 151, the storage device 152, the speaker 153, and the display 154 in the first embodiment.
  • the information that can be acquired from the vehicle speed sensor 255, the gearbox 256, and the steering wheel 257 is sent to the information processing device 200 as vehicle information via the CAN 270.
  • The information processing device 200 also includes a CPU 160, a program memory 161, a data memory 162, an interface 163, and a bus 164 connecting them.
  • An in-vehicle camera 151, a speaker 153, and a display 154 are connected to the interface 163.
  • the interface 163 is also connected to the CAN 270.
  • FIG. 8 is a flowchart showing the operation of the information processing apparatus 200 according to the second embodiment.
  • The flowchart shown in FIG. 8 includes an image data acquisition step (S20), a face detection step (S21), a head coordinate detection step (S22), a head coordinate variance calculation step (S23), a head inclination angle detection step (S24), a head inclination angle variance calculation step (S25), an eye opening degree detection step (S26), a PERCLOS calculation step (S27), a mouth opening degree detection step (S28), a yawn detection step (S29), an evaluation value calculation step (S30), an alertness estimation step (S31), and an alertness decrease notification step (S32).
  • In the image data acquisition step S20, the input I/F unit 201 acquires image data of an image of the upper body including the driver's head.
  • The image indicated by the image data may be any image, such as an RGB image, an IR image, or a distance image, as long as the head coordinates, the head tilt angle, PERCLOS, and yawning can be calculated from it in the subsequent processing.
  • In the face detection step S21, the head coordinate detection unit 103 detects the driver's face from the image indicated by the acquired image data.
  • The head coordinate detection unit 103 provides face area information indicating the face detection frame, which is the area including the detected face, to the head inclination angle detection unit 104, the PERCLOS calculation unit 208, and the yawn detection unit 209.
  • In the head coordinate detection step S22, the head coordinate detection unit 103 detects the head coordinates from the face detection frame, which is the area including the face detected in the face detection step S21.
  • In the head coordinate variance calculation step S23, the evaluation unit 105 calculates the variance of the head coordinates.
  • In the head inclination angle detection step S24, the head tilt angle detection unit 104 detects the head tilt angle from the face detection frame, which is the area including the face detected in the face detection step S21.
  • In the head inclination angle variance calculation step S25, the evaluation unit 105 calculates the variance of the head tilt angle detected in the head inclination angle detection step S24.
  • In the eye opening degree detection step S26, the PERCLOS calculation unit 208 detects the eye opening degree using the driver's facial feature points from the face detection frame, which is the area including the face detected in the face detection step S21. In the PERCLOS calculation step S27, the PERCLOS calculation unit 208 calculates PERCLOS from the eye opening degree detected in the eye opening degree detection step S26.
  • In the mouth opening degree detection step S28, the yawn detection unit 209 detects the mouth opening degree using the driver's facial feature points from the face detection frame, which is the area including the face detected in the face detection step S21.
  • In the yawn detection step S29, the yawn detection unit 209 detects the presence or absence of a yawn using the mouth opening degree detected in the mouth opening degree detection step S28.
  • In the evaluation value calculation step S30, the evaluation unit 105 calculates an evaluation value indicating the state of the posture holding function from the variance of the head coordinates and the variance of the head inclination angle.
  • In the alertness estimation step S31, the alertness estimation unit 206 refers to the vehicle state indicated by the vehicle information and estimates the driver's alertness level when the vehicle state does not correspond to a predetermined condition.
  • Specifically, the alertness estimation unit 206 estimates the driver's alertness level from the vehicle information given from the input I/F unit 201, the evaluation value given from the evaluation unit 105, the PERCLOS given from the PERCLOS calculation unit 208, and the presence or absence of yawning given from the yawn detection unit 209.
  • In the alertness decrease notification step S32, when the alertness level estimated in the alertness estimation step S31 falls below a predetermined threshold, the alertness estimation unit 206 instructs the alertness decrease notification device 140, via the output I/F unit 107, to notify the driver and the like of the decrease in alertness.
  • Upon receiving such an instruction, the alertness decrease notification device 140 uses at least one of the speaker 153 and the display 154 to notify the driver, passengers, and the like of the decrease in alertness, to prompt a break, or to indicate that driving authority cannot be handed over.
  • As described above, according to Embodiment 2, the evaluation can be performed using indices suited to evaluating the driver's alertness level. Further, by controlling the speaker 153 and the display 154 according to the evaluation result, the driver and passengers can be notified of the alertness level.
  • 100, 200 information processing device; 101, 201 input I/F unit; 102 posture holding function evaluation unit; 103 head coordinate detection unit; 104 head tilt angle detection unit; 105 evaluation unit; 106, 206 alertness estimation unit; 107 output I/F unit; 208 PERCLOS calculation unit; 209 yawn detection unit; 130 image pickup device; 140 alertness decrease notification device; 150 vehicle; 151 in-vehicle camera; 152 storage device; 153 speaker; 154 display; 160 CPU; 161 program memory; 162 data memory; 163 interface; 164 bus.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Psychology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Human Computer Interaction (AREA)
  • Molecular Biology (AREA)
  • Social Psychology (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Mechanical Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The present invention comprises: a posture holding function evaluation unit (102) for evaluating, from an image including the head of a driver, the state of a posture holding function, which is the function for holding the driver's posture; and an alertness estimation unit (106) for estimating the driver's alertness level using the evaluation result.
PCT/JP2019/024021 2019-06-18 2019-06-18 Information processing device, program, and information processing method WO2020255238A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2019557640A JP6689470B1 (ja) 2019-06-18 2019-06-18 Information processing device, program, and information processing method
DE112019007484.9T DE112019007484T5 (de) 2019-06-18 2019-06-18 Information processing device, program, and information processing method
PCT/JP2019/024021 WO2020255238A1 (fr) 2019-06-18 2019-06-18 Information processing device, program, and information processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/024021 WO2020255238A1 (fr) 2019-06-18 2019-06-18 Information processing device, program, and information processing method

Publications (1)

Publication Number Publication Date
WO2020255238A1 true WO2020255238A1 (fr) 2020-12-24

Family

ID=70413776

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/024021 WO2020255238A1 (fr) 2019-06-18 2019-06-18 Information processing device, program, and information processing method

Country Status (3)

Country Link
JP (1) JP6689470B1 (fr)
DE (1) DE112019007484T5 (fr)
WO (1) WO2020255238A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7389078B2 (ja) * 2021-03-26 2023-11-29 矢崎総業株式会社 ドライバ評価装置及びドライバ評価システム

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000515829A * 1995-09-28 2000-11-28 Advanced Safety Concepts Inc. System for responding to abnormally operated transport vehicles
JP2018127112A * 2017-02-08 2018-08-16 Panasonic Intellectual Property Management Co., Ltd. Alertness level estimation device and alertness level estimation method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61175129A 1985-01-29 1986-08-06 Nissan Motor Co Ltd Drowsy driving prevention device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000515829A * 1995-09-28 2000-11-28 Advanced Safety Concepts Inc. System for responding to abnormally operated transport vehicles
JP2018127112A * 2017-02-08 2018-08-16 Panasonic Intellectual Property Management Co., Ltd. Alertness level estimation device and alertness level estimation method

Also Published As

Publication number Publication date
JPWO2020255238A1 (ja) 2021-09-13
JP6689470B1 (ja) 2020-04-28
DE112019007484T5 (de) 2022-05-05

Similar Documents

Publication Publication Date Title
CN107665330B (zh) System, method, and computer-readable medium for detecting head posture in a vehicle
EP2860664B1 (fr) Face detection apparatus
US10369926B2 (en) Driver state sensing system, driver state sensing method, and vehicle including the same
US9526448B2 (en) State estimation device and state estimation program
EP3588372B1 (fr) Control of an autonomous vehicle based on passenger behavior
JP5790762B2 (ja) Eyelid detection device
US9105172B2 (en) Drowsiness-estimating device and drowsiness-estimating method
JP7118136B2 (ja) Occupant state determination device, warning output control device, and occupant state determination method
US11453401B2 (en) Closed eye determination device
JP2009294753A (ja) Image processing device and image processing method
KR20190083155A (ko) Driver state detection device and method therefor
EP3440592B1 (fr) Method and system for distinguishing between an eye-contact event and an eye-closure event
JP2009166783A (ja) Symptom estimation device
KR20190134909A (ko) Device and method for recognizing driver state based on driving-situation judgment information
WO2020255238A1 (fr) Information processing device, program, and information processing method
US10945651B2 (en) Arousal level determination device
JP2019087018A (ja) Driver monitor system
JP7046748B2 (ja) Driver state determination device and driver state determination method
WO2022113275A1 (fr) Sleep detection device and sleep detection system
CN115299948A (zh) Driver fatigue detection method and detection system
WO2021262166A1 (fr) Operator evaluation and vehicle control based on eyewear data
Srivastava Driver's drowsiness identification using eye aspect ratio with adaptive thresholding
JP2019079285A (ja) Safe driving promotion device and safe driving promotion method
US20230196797A1 (en) Non-contact depth sensing monitoring in vehicles
JP7127661B2 (ja) Eye opening degree calculation device

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019557640

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19933802

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 19933802

Country of ref document: EP

Kind code of ref document: A1