WO2017056401A1 - 制御装置、制御方法及びプログラム - Google Patents
制御装置、制御方法及びプログラム Download PDFInfo
- Publication number
- WO2017056401A1 (PCT/JP2016/004043)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- driver
- mirror
- image
- person
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/103—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using camera systems provided with artificial illumination device, e.g. IR light source
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/107—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using stereoscopic cameras
Definitions
- the present technology relates to a control device capable of executing processing for preventing a vehicle accident, a control method in the control device, and a program.
- In Patent Document 1, it is described that, when a side mirror of another vehicle is detected and the head of the driver of the other vehicle is detected in the mirror image of that side mirror, the host vehicle is determined to exist outside the blind spot area of the other vehicle; when the head of the driver is not detected, the host vehicle is determined to be present in the blind spot area of the other vehicle; and when the host vehicle is present in the blind spot area, the occupants of the host vehicle are notified of this.
- However, since the technique described in Patent Document 1 only determines whether or not a person's head is detected in the mirror image of the side mirror, it can tell that the host vehicle exists outside the blind spot area of the other vehicle, but not whether the driver of the other vehicle has actually recognized the host vehicle. Therefore, with the technique described in Patent Document 1, if the driver of the other vehicle does not recognize the host vehicle, an accident may still occur between the host vehicle and the other vehicle.
- an object of the present technology is to recognize an occupant state of another vehicle and execute a process according to the state to prevent an accident of the own vehicle or the other vehicle.
- the control device includes an input unit and a control unit.
- a captured image of a camera provided in the host vehicle is input to the input unit.
- the control unit detects, from the input captured image, a mirror provided in another vehicle existing in front of the host vehicle, detects a person in the mirror image of the detected mirror, and recognizes the state of the person from the detected person image. Further, the control unit executes a warning process or a control process for the host vehicle to prevent an accident of the host vehicle or the other vehicle according to the recognized state of the person.
- control device can prevent an accident of the own vehicle or the other vehicle by recognizing the state of the occupant of the other vehicle and executing a process according to the state.
- the control unit may recognize a part of the person's body from the detected person image in the mirror image, recognize a component of the other vehicle from the portion of the mirror image other than the person, and recognize the state of the detected person based on the relationship between the recognized body part and the component.
- the control device recognizes the body part of the person and the components of the other vehicle (for example, a handle (steering wheel), a seat, a seat belt, etc.) from the mirror image, and can recognize the state of the person based on the relationship between the body part and the component.
- the control unit may recognize the state of the other vehicle from the portion of the input captured image other than the mirror, and estimate the action of the person based on the recognized state of the person and the state of the other vehicle.
- the control device recognizes the state of the other vehicle (for example, direction indicators, speed, vehicle body direction, etc.) and integrates it with the state of the person, so that it can estimate the action the person will take.
- the control unit may determine whether or not the detected person is a driver based on the relationship between the recognized body part and the component.
- when a plurality of persons are detected in the mirror image, the control device can identify the driver among them and recognize the state of the driver.
- a storage unit that stores driver's seat information regarding the driver's seat position for each vehicle type may be further included.
- the control unit may recognize the vehicle type of the other vehicle from the captured image, and estimate the position of the driver of the other vehicle based on the recognized vehicle type and the stored driver's seat information.
- the control device can estimate the position of the driver of the other vehicle based on the vehicle type of the other vehicle and the driver's seat information.
- when it is detected that the other vehicle is in a lane different from that of the host vehicle, the control unit detects an outer rear view mirror provided in the other vehicle; when it is detected that the other vehicle is in the same lane as the host vehicle, the control unit may detect an inner rear view mirror provided in the other vehicle.
- the control device can appropriately recognize the state of the person in the other vehicle by switching the mirror to be observed in the other vehicle depending on whether the forward vehicle is in the same lane as the host vehicle or in a different lane.
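The lane-based mirror switching described above can be sketched as a small selection rule (a hypothetical illustration; the patent gives no code):

```python
def select_mirror_type(same_lane: bool) -> str:
    """Choose which rear view mirror of the forward vehicle to analyze.

    If the forward vehicle is in the same lane, its inner rear view
    mirror (seen through the rear glass) is likely to reflect the
    driver; in a different lane, the outer mirror (door or fender
    mirror) is the better target.
    """
    return "inner" if same_lane else "outer"
```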
- the outer rear view mirror is a mirror installed outside the vehicle such as a fender mirror or a door mirror (side mirror), and the inner rear view mirror is a mirror installed inside the vehicle.
- the control unit may control an infrared light emitting unit provided in the host vehicle so that the mirror detected at night is irradiated with infrared light.
- control device can robustly acquire a mirror image by irradiating the mirror with infrared light at night, and can recognize the state of the person from the mirror image even at night.
- the control unit may determine whether it is day or night from the brightness of the captured image, or from the current time.
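A rough sketch of such a day/night decision follows; the luminance threshold and daytime hours are invented for illustration:

```python
def is_night(mean_luminance: float, hour: int,
             lum_threshold: float = 40.0) -> bool:
    """Combine the two cues mentioned in the text: a dark captured
    image (mean luminance on a 0-255 scale below a threshold) or a
    clock time outside daytime hours. Thresholds are hypothetical."""
    dark_image = mean_luminance < lum_threshold
    night_hours = hour < 6 or hour >= 18
    return dark_image or night_hours
```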
- the control unit may recognize the state of the person based on a captured image, input via a polarization camera provided in the host vehicle, from which the reflection component produced by the window glass of the other vehicle has been removed.
- the polarization camera can remove the reflection component of the front door glass of the other vehicle from the mirror image of the outer rear view mirror of the other vehicle, and the reflection component of the rear glass of the other vehicle from the mirror image of the inner rear view mirror of the other vehicle.
- the control unit may detect from the captured image that the other vehicle is in a stopped state and, when it detects from the relationship between the image of the door and the image of the person's hand in the mirror image that the person has touched the door, execute a warning process to warn the driver of the host vehicle that the person is about to get off the other vehicle, or to warn the driver of the other vehicle of the approach of the host vehicle.
- this makes it possible for the control device to detect that a person is about to get off the other vehicle while it is stopped, and to warn of the associated danger.
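One simple way to test whether the detected hand region touches the door region is an axis-aligned bounding-box overlap check (an illustrative sketch, not the patent's stated method):

```python
def boxes_overlap(a, b) -> bool:
    """a and b are bounding boxes (x1, y1, x2, y2) for, e.g., the
    person's hand and the door in the mirror image; True when the
    rectangles intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]
```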
- when a state in which the detected driver's eyelids are closed continues for a first time or longer at a predetermined frequency or more, or a state in which the driver's head is lowered by a predetermined angle or more continues for a second time or longer, the control unit may execute warning processing to warn the driver of the host vehicle that the driver of the other vehicle is dozing, or to warn the driver of the other vehicle.
- the control device can thus detect the drowsiness of the driver of the other vehicle based on the state of that driver's eyelids and head, and execute a warning process accordingly.
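The two-condition dozing rule can be sketched as follows; the time and count thresholds are placeholders, since the patent only names them "first time", "second time", and "predetermined frequency":

```python
def is_dozing(closure_durations_s, head_down_duration_s,
              first_time_s=0.5, min_closures=3, second_time_s=2.0):
    """closure_durations_s: durations of recent eyelid-closure episodes
    (seconds); head_down_duration_s: how long the head has stayed
    lowered beyond the angle threshold. Either cue alone triggers
    the warning."""
    long_closures = sum(1 for d in closure_durations_s
                        if d >= first_time_s)
    return (long_closures >= min_closures
            or head_down_duration_s >= second_time_s)
```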
- the control unit may estimate the gaze direction of the detected driver and, when the estimated gaze direction does not match the direction of the mirror, execute a warning process to warn that the driver of the other vehicle does not recognize the host vehicle.
- the control device can thus alert the driver of the host vehicle when, based on the driver's line-of-sight direction detected from the mirror, it is determined that the driver of the other vehicle does not recognize the host vehicle. Conversely, if it can be recognized from the mirror image that the driver of the other vehicle is looking in the direction of the mirror, it can be estimated that the driver of the other vehicle is looking in the direction of the host vehicle via the mirror.
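Matching the estimated gaze direction against the mirror direction reduces to an angular-difference test; a minimal sketch, where the tolerance is an assumed parameter:

```python
def gaze_matches_mirror(gaze_deg: float, mirror_deg: float,
                        tol_deg: float = 10.0) -> bool:
    """True when the driver's line of sight points at the mirror to
    within tol_deg, handling wrap-around at 360 degrees."""
    diff = abs((gaze_deg - mirror_deg + 180.0) % 360.0 - 180.0)
    return diff <= tol_deg
```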
- the control unit may recognize, from the mirror image, the detected driver's hand and the direction indicator lighting lever of the other vehicle and, when it detects that the driver's hand is placed on the direction indicator lighting lever, execute a warning process to warn the driver of the host vehicle that the driver of the other vehicle will change course.
- the control device can thus detect from the mirror image that the driver of the other vehicle has touched the direction indicator lighting lever, and notify the driver of the host vehicle of the possibility that the other vehicle will change course.
- the control unit may generate an image indicating the recognized state of the driver of the other vehicle, and control the display control unit of the host vehicle so that the generated image is displayed on the windshield of the host vehicle, superimposed on the position corresponding to the driver's seat of the other vehicle that the driver of the host vehicle can see through the windshield.
- by displaying the state of the driver of the other vehicle, which the driver of the host vehicle cannot observe directly, as an AR (Augmented Reality) overlay on the windshield, the control device allows the driver of the host vehicle to grasp the state of that driver intuitively.
- a control method according to an embodiment of the present technology includes: detecting, from a captured image input from a camera provided in the host vehicle, a mirror provided in another vehicle existing in front of the host vehicle; detecting a person in the mirror image of the detected mirror; recognizing the state of the person from the detected person image; and executing a warning process or a control process for the host vehicle to prevent an accident of the host vehicle or the other vehicle according to the recognized state of the person.
- a program according to another embodiment of the present technology causes a control device to execute the steps of: detecting, from a captured image input from a camera provided in the host vehicle, a mirror provided in another vehicle existing in front of the host vehicle; detecting a person in the mirror image of the detected mirror; recognizing the state of the person from the detected person image; and executing, according to the recognized state of the person, a warning process or a control process for the host vehicle to prevent an accident of the host vehicle or the other vehicle.
- FIG. 1 is a block diagram illustrating a configuration of an automobile including a control device according to an embodiment of the present technology.
- the automobile 1 includes a control device 100, a front camera 51, a front polarization camera 52, an IR projector 53, an alarm device 54, a vehicle body control unit 55, and a display 56.
- the automobile 1 also has components provided in a typical automobile, such as a driving device (seat, steering wheel, seat belt, etc.), a steering device (power steering, etc.), a braking device, a vehicle acceleration device, rear-view mirrors, head lamps, tail lamps, and direction indicators.
- the control device 100 can recognize, from images captured by the front camera 51 and the front polarization camera 52, a rear view mirror provided in another vehicle present in the front field of view of the automobile 1 (own vehicle) (hereinafter referred to as a forward vehicle), and can recognize the state of a person detected from the mirror image of that mirror.
- rear view mirrors include outer rear view mirrors, such as a side mirror (door mirror) provided on a front door outside the vehicle or a fender mirror provided at the front end of the bonnet, and an inner rear view mirror provided at the front of the vehicle interior.
- the control device 100 has hardware necessary for a computer such as a CPU, RAM, and ROM, and is incorporated in the automobile 1 as a control circuit or a control unit.
- the control method according to the present technology is executed by the CPU loading a program according to the present technology pre-recorded in the ROM into the RAM and executing the program.
- the specific configuration of the control device 100 is not limited to this.
- a device such as an FPGA (Field Programmable Gate Array) or another PLD (Programmable Logic Device), or an ASIC (Application Specific Integrated Circuit) may be used.
- the control device 100 may be configured as a part of the control unit of the automobile 1.
- the front camera 51 is installed, for example, on the roof portion or front bumper portion of the automobile 1.
- the front camera 51 is composed of, for example, an image sensor such as a CMOS or a CCD, and takes an RGB-IR image of the front field of view of the automobile 1 at a predetermined frame rate.
- when the front camera 51 is configured as a stereo camera, depth (distance) information is acquired in addition to the RGB-IR video.
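Depth from a stereo pair follows the standard triangulation relation depth = f·B/d; a sketch with made-up parameter values:

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Standard stereo relation: distance to a point equals
    focal length (pixels) * baseline (meters) / disparity (pixels)."""
    return focal_px * baseline_m / disparity_px
```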
- the image acquired by the front camera 51 is input to the control device 100 for recognition of the vehicle ahead of the automobile 1 and its rear view mirror.
- the front polarizing camera 52 is also installed, for example, on the roof portion or front bumper portion of the automobile 1.
- the front polarization camera 52 removes the noise of the image of the other vehicle by removing the reflection component due to the window glass of the other vehicle and taking out only the transmission component from the image of the other vehicle existing in front of the automobile 1.
- the front polarization camera 52 removes, from a captured image of an outer rear view mirror of the forward vehicle, the reflection component of the forward vehicle's front door glass (in the case of a door mirror) or of its front glass (in the case of a fender mirror).
- the front polarization camera 52 removes the reflection component of the rear glass of the front vehicle from the captured image of the inner rear view mirror of the front vehicle.
- the captured image from which the reflection component has been removed by the front polarization camera 52 is input to the control device 100 for recognition of the front vehicle and its rear view mirror.
- Either one of the image captured by the front camera 51 and the image captured by the front polarization camera 52 may be used in the processing by the control device 100, or both may be used.
- the control device 100 includes control blocks of a forward vehicle recognition unit 10, an in-mirror recognition unit 20, and an integrated determination unit 30.
- the control device 100 includes a day / night determination unit 41, an IR light emission control unit 42, and a driver position database 43. Each of these control blocks may be configured as a control circuit or may be configured as a software module.
- the front vehicle recognition unit 10 recognizes the front vehicle of the automobile 1 (own vehicle) from the image captured by the front camera 51 or the front polarization camera 52.
- the forward vehicle recognition unit 10 includes a forward vehicle position detection unit 11, a direction indicator detection unit 12, a forward vehicle speed estimation unit 13, a recognition mirror type determination unit 14, and a mirror detection unit 15.
- the front vehicle position detection unit 11 detects the position of the forward vehicle from the captured image input from the front camera 51 or the front polarization camera 52. Specifically, the forward vehicle position detection unit 11 detects at which position in the captured image the forward vehicle is present, and determines whether or not the forward vehicle is in the same lane as the automobile 1 (own vehicle).
- the direction indicator detection unit 12 detects the direction indicator of the front vehicle from the image of the front vehicle detected by the front vehicle position detection unit 11 and detects its lighting state.
- the front vehicle speed estimation unit 13 estimates the speed of the front vehicle from the movement of the front vehicle detected by the front vehicle position detection unit 11 and the speed of the automobile 1 indicated by the speedometer of the automobile 1.
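A minimal version of this estimate adds the rate of change of the inter-vehicle gap (e.g. from the tracked image position or stereo depth) to the own speedometer reading; the interface below is invented for illustration:

```python
def estimate_forward_speed(own_speed_mps: float, gap_now_m: float,
                           gap_prev_m: float, dt_s: float) -> float:
    """The gap grows when the forward vehicle is faster than the own
    vehicle, so relative speed = d(gap)/dt and the forward vehicle's
    speed is the own speed plus that relative speed."""
    relative_mps = (gap_now_m - gap_prev_m) / dt_s
    return own_speed_mps + relative_mps
```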
- the information detected by the direction indicator detection unit 12 and the forward vehicle speed estimation unit 13 is output to the integrated determination unit 30 and is used for estimation of the behavior of the driver of the forward vehicle.
- the recognition mirror type determination unit 14 determines, based on the position of the forward vehicle detected by the front vehicle position detection unit 11 (that is, whether or not it is in the same lane as the automobile 1), the type of rear view mirror of the forward vehicle to be detected.
- the mirror detection unit 15 detects the type of mirror determined by the recognition mirror type determination unit 14 from the image of the forward vehicle detected by the front vehicle position detection unit 11. The detected image of the mirror of the forward vehicle is output to the in-mirror recognition unit 20.
- the in-mirror recognition unit 20 recognizes the state of the person from the mirror image of the front vehicle mirror detected by the front vehicle recognition unit 10.
- the in-mirror recognition unit 20 includes a mirror image correction unit 21, an image region division unit 22, a driver position estimation unit 23, an upper body posture estimation unit 24, a head / eyeball direction estimation unit 25, a hand detail estimation unit 26, and a handle / door position estimation unit 27.
- the mirror image correction unit 21 acquires the mirror image (hereinafter simply referred to as a mirror image) of the rear view mirror of the forward vehicle detected by the mirror detection unit 15 and corrects its distortion for the subsequent recognition processing. This is because a convex mirror is often used for the rear view mirror.
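Convex-mirror distortion is commonly modeled radially; a one-term radial correction for a single normalized point is sketched below (the coefficient k would come from calibration and is purely hypothetical here):

```python
def undistort_point(x: float, y: float, k: float = 0.3):
    """Map a point (normalized coordinates, origin at the mirror image
    center) toward its undistorted position using the one-term radial
    model x_u = x_d * (1 + k * r^2)."""
    r2 = x * x + y * y
    scale = 1.0 + k * r2
    return x * scale, y * scale
```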
- the corrected mirror image is output to the image area dividing unit 22.
- the corrected mirror image may be output to the display device 56 as a high-resolution mirror image that is not normally visible to the driver of the automobile 1.
- the image area dividing unit 22 divides the corrected mirror image into an image area of the human body and an image area of other areas (components in the front vehicle).
- the driver position estimation unit 23 estimates the position of the driver from the divided image regions.
- the driver position database 43 stores information indicating the position of the driver (right handle / left handle) for each vehicle type.
- the driver position estimation unit 23 looks up, in the driver position database 43, the driver position corresponding to the vehicle type that the forward vehicle recognition unit 10 separately estimates from the captured image by image analysis (including character recognition), and estimates that the person present at the specified driver position in the divided image is the driver.
- when a person exists at the specified driver position, the driver position estimation unit 23 identifies that person as the driver.
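The driver position database 43 can be pictured as a vehicle-type-to-handle-side lookup; the entries and the fallback below are invented examples:

```python
# Hypothetical database: vehicle type -> driver (handle) position.
DRIVER_POSITION_DB = {
    "sedan_x": "right",   # right-hand-drive model
    "truck_y": "left",    # left-hand-drive model
}

def estimate_driver_side(vehicle_type: str, default: str = "right") -> str:
    """Return the stored driver position for a recognized vehicle
    type, falling back to a default when the type is unknown."""
    return DRIVER_POSITION_DB.get(vehicle_type, default)
```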
- the upper body posture estimation unit 24 estimates the posture of the driver's upper body from the divided human body image.
- the upper body posture estimation unit 24 recognizes the head region and the hand region in the image of the human body, and instructs the head / eyeball direction estimation unit 25 to process the head region and the hand detail estimation unit 26 to process the hand region.
- the head / eyeball direction estimating unit 25 estimates (detects) the direction in which the head is facing, the facial expression, and the direction in which the eyeball (line of sight) is facing in the human body image (see FIG. 9).
- known methods can be used for these estimations (detections): for example, the head direction can be estimated from the angles and positional relationships of head parts such as the eyes, nose, and mouth in the image, and the eye direction from the position of the iris (the moving point) relative to the eye contour.
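The iris-versus-contour cue mentioned above can be reduced to a horizontal ratio between the eye corners; an illustrative sketch with coordinates in image pixels:

```python
def horizontal_gaze_ratio(iris_x: float, inner_corner_x: float,
                          outer_corner_x: float) -> float:
    """Position of the iris center between the eye corners:
    0.0 at the inner corner, 1.0 at the outer corner, 0.5 centered."""
    width = outer_corner_x - inner_corner_x
    return (iris_x - inner_corner_x) / width
```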
- the hand detail estimation unit 26 estimates (detects) the movement of the hand in the human body image, particularly the movement of holding the handle.
- for the estimation processing by the upper body posture estimation unit 24 (and by the head / eyeball direction estimation unit 25 and the hand detail estimation unit 26), the control device 100 stores various data, such as joint model data indicating the joint point positions of the human body (see FIG. 7) and 3D model data of the human body (the shape and size of each part of the human body, the positional relationship of the parts, etc.; see FIG. 8).
- the control device 100 estimates the posture of the person's upper body under the limited condition that the person is sitting on a seat in the forward vehicle and that the upper body of the person is reflected in the mirror image.
- FIGS. 7 and 8 show, as an example, a whole-body model of a person standing with both arms raised horizontally.
- the 3D model data and the joint model data of the human body are deformed according to various states, such as sitting in the driver's seat, getting into the car (sitting down), and getting out of the car, and are used for the estimation.
- the upper body posture estimation unit 24 may estimate (detect) the posture of the driver in the mirror image by fitting each part of the human body detected from the mirror image to the joint model or the 3D model of the human body, or may estimate the position and orientation of the joints, skeleton, body surface, etc. directly without fitting.
- the upper body posture estimation unit 24 may use a sensor different from the front camera 51 or may use a plurality of sensors. For example, by using a camera sensor that can acquire depth (distance) information, a human body can be recognized with high accuracy.
- The information estimated by the upper body posture estimation unit 24, the head/eyeball direction estimation unit 25, and the hand detail estimation unit 26 is output to the integrated determination unit 30 and used to estimate the behavior of the driver of the forward vehicle.
- Instead of the driver position estimation unit 23, the upper body posture estimation unit 24 may estimate the position of the driver based on the estimated posture, or based on the relationship between the posture and the various components detected by the steering wheel/door position estimation unit 27.
- the upper body posture estimation unit 24 may estimate, for example, the position of a person who is estimated to be performing a steering operation or an accelerator / brake operation as the position of the driver.
- The steering wheel/door position estimation unit 27 analyzes the image regions other than the segmented human body by pattern matching or the like, and estimates the positions of various components in the vicinity of the driver's seat, such as the steering wheel, the (front) door, the turn indicator lighting lever, and the seat belt. Information on the positions of these components is also output to the integrated determination unit 30 and used to estimate the behavior of the driver of the forward vehicle.
- The integrated determination unit 30 integrates the information on the forward vehicle recognized by the forward vehicle recognition unit 10 and the information on the state of the person recognized by the in-mirror recognition unit 20, and thereby estimates the behavior of the driver of the forward vehicle and determines the corresponding degree of risk.
- the integrated determination unit 30 includes a driver behavior estimation unit 31 and a risk determination unit 32.
- The driver behavior estimation unit 31 integrates the information recognized and output by the forward vehicle recognition unit 10 and the in-mirror recognition unit 20, and estimates the behavior or state of the driver of the forward vehicle. Specifically, the driver behavior estimation unit 31 calculates the probabilities of occurrence of various driver behaviors from the data input from the forward vehicle recognition unit 10 and the in-mirror recognition unit 20, based on learning, experiments, and the like, and estimates the behavior with the highest probability of occurrence as the driver's behavior. The method for calculating the probability of occurrence of each behavior may be determined in advance, by learning or experiment, from learning data equivalent to the data input from the in-mirror recognition unit and from learned behavior data. The estimated behavior information is output to the risk determination unit 32. The behavior information may also be output to the display 56 as, for example, character information or image information.
- The risk determination unit 32 determines the degree of risk of the situation that the automobile 1 (host vehicle) or the forward vehicle is facing or will face. Specifically, the risk determination unit 32 calculates the probabilities of occurrence of various dangerous events from the data indicating the driver's behavior estimated by the driver behavior estimation unit 31, based on learning, experiments, and the like, and determines the occurrence probability of the dangerous event with the highest probability as the degree of risk. The method for calculating the probability of occurrence of each dangerous event may be determined in advance, by learning or experiment, from learning data equivalent to the data indicating the driver's behavior and from learned dangerous event data. When it determines that the degree of risk is high, the risk determination unit 32 instructs the alarm device 54 to issue a warning, or instructs the vehicle body control unit 55 to automatically control the vehicle body of the automobile 1. Details of the behavior estimation process and the risk determination process will be described later.
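The two-stage estimation performed by the driver behavior estimation unit 31 and the risk determination unit 32 (selecting the most probable driver behavior, then the most probable dangerous event for that behavior) can be sketched as follows. The behavior and event tables are illustrative placeholders for models that would be obtained by learning or experiment; none of the names or probabilities come from the patent.

```python
# Hedged sketch of the two-stage estimation: the driver behavior
# estimation unit 31 picks the behavior with the highest occurrence
# probability, and the risk determination unit 32 picks the most
# probable dangerous event for that behavior.

def estimate_behavior(features, behavior_model):
    """Return (behavior, probability) with the highest occurrence probability."""
    scores = {name: fn(features) for name, fn in behavior_model.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

def determine_risk(behavior, event_model):
    """Return (event, probability) of the most probable dangerous event."""
    probs = event_model.get(behavior, {})
    if not probs:
        return None, 0.0
    event = max(probs, key=probs.get)
    return event, probs[event]

# Illustrative stand-ins for models obtained by learning or experiment:
behavior_model = {
    "exiting_vehicle": lambda f: 0.9 if f["speed"] == 0 and f["hand_on_door"] else 0.05,
    "normal_driving": lambda f: 0.8 if f["speed"] > 0 else 0.1,
}
event_model = {
    "exiting_vehicle": {"door_collision": 0.7, "none": 0.3},
    "normal_driving": {"none": 0.95},
}

features = {"speed": 0, "hand_on_door": True}
behavior, p_behavior = estimate_behavior(features, behavior_model)
event, risk = determine_risk(behavior, event_model)
```

In this sketch the degree of risk is simply the probability of the most likely dangerous event given the estimated behavior, mirroring the argmax-then-argmax structure of the text.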
- The alarm device 54 is, for example, an audio output unit for notifying the driver of the automobile 1 (host vehicle) or the driver of the forward vehicle.
- The warning output from the alarm device 54 is, for example, a message or buzzer in a predetermined language for the driver of the automobile 1, and a horn (klaxon) sound for the driver of the forward vehicle.
- the voice guidance output unit of the car navigation device in the automobile 1 may function as the warning device 54.
- The vehicle body control unit 55 is connected to, for example, the steering device, braking device, and acceleration device of the automobile 1, and, according to the determined degree of risk, executes vehicle body control processing of the automobile 1 for avoiding danger such as a collision with the forward vehicle, for example deceleration or a course change.
- the display 56 is, for example, a display of a car navigation device of the automobile 1 or a projection device that causes at least a part of the windshield of the automobile 1 to function as a head-up display.
- The display 56 displays, for the driver of the automobile 1, information (characters or images) indicating the behavior of the driver of the forward vehicle estimated by the driver behavior estimation unit 31, and the mirror image corrected by the mirror image correction unit 21.
- A display control unit that controls image display on the display 56 may be provided.
- The alarm device 54 and the display 56 may be activated simultaneously to warn the driver of the automobile 1 of the danger, or only one of them may be activated.
- The day/night determination unit 41 determines whether it is currently daytime or nighttime based on the brightness of the captured image input from the front camera 51.
- the day / night determination unit 41 may determine the day / night using the current time in addition to or instead of the brightness of the captured image.
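The brightness-based day/night determination described above can be sketched as follows; the luminance threshold and the fallback hour range are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of the day/night determination unit 41: classify day
# vs. night from the mean brightness of the captured image, optionally
# falling back to the current time when no image statistics are
# available. Threshold and hour range are assumed values.

DAY_BRIGHTNESS_THRESHOLD = 80  # mean 8-bit luminance; illustrative assumption

def is_daytime(pixels, hour=None):
    """pixels: iterable of 0-255 luminance values; hour: optional 0-23 clock hour."""
    if pixels:
        mean = sum(pixels) / len(pixels)
        return mean >= DAY_BRIGHTNESS_THRESHOLD
    # Fall back to the current time when no image is available.
    if hour is not None:
        return 6 <= hour < 18
    return True
```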
- When the day/night determination unit 41 determines that it is currently nighttime, the IR light emission control unit 42 controls the IR projector 53 to project infrared light toward the detected forward vehicle, so that the image of the forward vehicle and the mirror image can be acquired robustly and the human body in the mirror image can be recognized even at night. When the rear view mirror of the forward vehicle has been detected by the mirror detection unit 15, the IR light emission control unit 42 may control the IR projector 53 so that it projects infrared light focused only on the detected rear view mirror.
- Next, the operation of the control device 100 configured as described above will be described. This operation is executed through cooperation between the hardware and software of the control device 100.
- FIG. 2 is a flowchart showing an outline of the operation of the control device 100.
- the forward vehicle recognition unit 10 of the control device 100 acquires a front captured image from the front camera 51 or the front polarization camera 52 (step 61).
- the forward vehicle recognition unit 10 recognizes information on the forward vehicle from the captured image (step 62).
- The forward vehicle recognition unit 10 recognizes the position of the forward vehicle from the captured image by means of the forward vehicle position detection unit 11, recognizes the state of the direction indicators of the forward vehicle by means of the direction indicator detection unit 12, and recognizes the speed of the forward vehicle by means of the vehicle speed estimation unit 13.
- the forward vehicle recognition unit 10 outputs the recognized information to the integrated determination unit 30.
- The forward vehicle recognition unit 10 then determines, by means of the recognition mirror type determination unit 14, the type of rear view mirror to be detected among the rear view mirrors of the forward vehicle (step 63).
- FIG. 3 is a diagram for explaining the determination process of the mirror type to be recognized.
- When the recognition mirror type determination unit 14 detects from the captured image that the forward vehicle F exists in a lane different from that of the host vehicle 1 (diagonally ahead of the host vehicle 1), the outer rear view mirror (door mirror or fender mirror) of the forward vehicle is determined as the recognition target.
- When the recognition mirror type determination unit 14 detects from the captured image that the forward vehicle F is in the same lane as the host vehicle 1 (directly ahead of the host vehicle 1), the inner rear view mirror among the rear view mirrors of the forward vehicle is determined as the recognition target.
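The mirror type selection of step 63 reduces to a lane comparison and can be sketched as follows; the integer lane representation is a hypothetical simplification of what would be detected from the captured image.

```python
# Sketch of the recognition mirror type determination unit 14 (step 63):
# a forward vehicle in the same lane exposes its inner rear view mirror
# to the host vehicle, while one in a different (diagonal) lane exposes
# an outer rear view mirror (door or fender mirror).

def determine_mirror_type(own_lane, forward_vehicle_lane):
    """Return which rear view mirror of the forward vehicle to recognize."""
    if forward_vehicle_lane == own_lane:
        return "inner_rear_view_mirror"
    return "outer_rear_view_mirror"
```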
- the forward vehicle recognition unit 10 detects the rear view mirror determined as the recognition target from the image of the forward vehicle by the mirror detection unit 15 (step 64).
- the forward vehicle recognition unit 10 determines whether or not the mirror detection unit 15 has successfully detected the rear view mirror (step 65). When the detection is successful (Yes), the forward vehicle recognition unit 10 extracts a region of the detected mirror image from the captured image and outputs it to the in-mirror recognition unit 20.
- the in-mirror recognition unit 20 recognizes human body information in the mirror from the mirror image (step 66).
- the details of the recognition processing of the human body information in the mirror will be described.
- FIG. 4 is a flowchart showing a detailed flow of the in-mirror recognition process.
- the in-mirror recognition unit 20 corrects the distortion or the like of the mirror image for the subsequent recognition processing by the mirror image correction unit 21 (step 71).
- The in-mirror recognition unit 20 divides the corrected mirror image into an image region of the human body and image regions of the other parts (components inside the forward vehicle) by means of the image region dividing unit 22 (step 72).
- The in-mirror recognition unit 20 estimates, by means of the driver position estimation unit 23, the position of the driver of the forward vehicle based on the position of the segmented human body image and the driver position data for each vehicle type stored in the driver position database 43 (step 73).
- the in-mirror recognition unit 20 uses the upper body posture estimation unit 24 (the head / eyeball direction estimation unit 25 and the hand detail estimation unit 26) to determine the posture of the driver's upper body from the divided human body images. Estimate (step 74).
- Specifically, the upper body posture estimation unit 24 estimates and tracks the posture of the driver by fitting each body part detected from the human body image to the joint model or the 3D model of the human body (see FIGS. 7 and 8).
- The head/eyeball direction estimation unit 25 estimates the direction in which the driver's head is facing, the facial expression, and the line of sight, using the positional relationships and shapes of the eyes, nose, and mouth in the driver image, the positional relationship of the iris with respect to the eye contour, and so on.
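One element of this estimation, inferring a coarse gaze direction from the position of the iris within the eye contour, might be sketched as follows; the 1/3-2/3 split of the eye width is an illustrative assumption, not a value from the patent.

```python
# Hedged sketch of gaze classification from the horizontal position of
# the iris center within the detected eye contour, as used by the
# head/eyeball direction estimation unit 25. Thresholds are assumed.

def gaze_direction(eye_left_x, eye_right_x, iris_x):
    """Classify gaze as 'left', 'center', or 'right' from image x-coordinates."""
    width = eye_right_x - eye_left_x
    if width <= 0:
        raise ValueError("invalid eye contour")
    ratio = (iris_x - eye_left_x) / width
    if ratio < 1 / 3:
        return "left"
    if ratio > 2 / 3:
        return "right"
    return "center"
```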
- The hand detail estimation unit 26 detects, based on the states of the head, upper body, arm, and hand joints in the driver image and the 3D shape of the driver, states such as a hand being placed on in-vehicle components (for example, the turn indicator lighting lever, seat belt, front door, and the like).
- The in-mirror recognition unit 20 estimates, by means of the steering wheel/door position estimation unit 27, the positions of various components in the vicinity of the driver's seat, such as the steering wheel, the front door, the turn indicator lighting lever, and the seat belt, from the images other than the human body in the mirror image (step 75).
- the in-mirror recognition unit 20 outputs each piece of information related to the human body and in-vehicle components estimated by the above processing to the integrated determination unit 30 (step 76).
- FIG. 5 is a diagram showing an example of a mirror image of the outer rear view mirror (door mirror) of the front vehicle detected by the mirror detection unit 15.
- FIG. 5 shows mirror images of the door mirror of a forward vehicle existing diagonally ahead to the right as viewed from the automobile 1. FIG. 5A shows a mirror image when the driver's seat (steering wheel) of the forward vehicle is on the left side of the vehicle, and FIG. 5B shows a mirror image when the driver's seat of the forward vehicle is on the right side of the vehicle.
- When the recognition target is the door mirror 70 of the forward vehicle F, images of the driver's upper body (including the head and arms) and of the driver's seat can be recognized in the mirror image I, regardless of whether the driver's seat is on the same side as the recognition target door mirror 70 or on the opposite side. In particular, since most of the area of the driver's head is visible, the in-mirror recognition unit 20 can obtain information on the head and the line-of-sight direction by means of the head/eyeball direction estimation unit 25.
- The in-mirror recognition unit 20 can also recognize information related to the driver's driving state by recognizing the driver's hands and the steering wheel by means of the hand detail estimation unit 26.
- For example, the upper body posture estimation unit 24 can estimate whether or not the driver is holding the steering wheel from the angles between the driver's head, upper body, and arms, the angle of the elbow joint, and the like.
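The kind of joint-angle reasoning described above could be sketched as follows; the elbow-angle range, the hand-to-wheel distance threshold, and the 2D coordinate simplification are illustrative assumptions, not parameters from the patent.

```python
# Hedged sketch of judging "holding the steering wheel" from joint
# positions, in the spirit of the upper body posture estimation unit 24:
# a bent elbow within a plausible range plus a wrist near the estimated
# wheel position is taken as "holding". All thresholds are assumed.

import math

def angle_at(p, a, b):
    """Angle in degrees at point p formed by points a and b (2D tuples)."""
    v1 = (a[0] - p[0], a[1] - p[1])
    v2 = (b[0] - p[0], b[1] - p[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def is_holding_wheel(shoulder, elbow, wrist, wheel_center, max_dist=0.15):
    """True if the elbow angle and wrist position suggest a hand on the wheel."""
    elbow_angle = angle_at(elbow, shoulder, wrist)
    hand_near = math.dist(wrist, wheel_center) <= max_dist
    return 60.0 <= elbow_angle <= 175.0 and hand_near
```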
- FIG. 6 is a diagram showing an example of the mirror image of the inner rear view mirror of the front vehicle detected by the mirror detection unit 15.
- In this case, the in-mirror recognition unit 20 does not perform the processes of the upper body posture estimation unit 24 and the hand detail estimation unit 26, and executes only the estimation process of the head/eyeball direction estimation unit 25.
- The integrated determination unit 30 integrates the forward vehicle information output from the forward vehicle recognition unit 10 and the information on the face, eyes, hands, and so on of the person (driver) output from the in-mirror recognition unit 20, estimates the driver's behavior, and estimates the risk corresponding to that behavior (step 68).
- Specifically, the integrated determination unit 30 calculates the probabilities of occurrence of various driver behaviors using the above vehicle information, the information on the driver's face, eyes, hands, and so on, and a method determined in advance by learning (experiment); it estimates the behavior with the highest probability of occurrence as the driver's behavior, and then, for the estimated behavior, calculates the probabilities of occurrence of various dangerous events using a method likewise determined in advance by learning (experiment), taking the occurrence probability of the dangerous event with the highest probability as the degree of risk.
- The method data defined by the above learning (experiment) may define, for each driver behavior, the conditions under which that behavior is established (that is, if the driver is performing a certain behavior, the actions taken by the driver, the driver's state, and the presupposed states of the in-vehicle components).
- According to the determined degree of risk, the integrated determination unit 30 executes a warning process using at least one of the alarm device 54 and the display 56, or a vehicle body control process of the automobile 1 by the vehicle body control unit 55 (step 69).
- the integrated determination unit 30 can detect a driver who is about to get off by opening a door while the preceding vehicle is stopped.
- the recognition target mirror in this case is an outer rear view mirror.
- The integrated determination unit 30 uses as determination conditions at least that the estimated speed of the forward vehicle output from the forward vehicle recognition unit 10 is 0 (stopped state), and that, from the information output from the in-mirror recognition unit 20, the driver of the forward vehicle is estimated to have put a hand on the front door.
- The fact that the driver has put a hand on the front door is determined, for example, by the steering wheel/door position estimation unit 27 recognizing the position of the front door from the mirror image of the outer rear view mirror, and the hand detail estimation unit 26 recognizing that the driver's hand is directed toward the front door.
- The integrated determination unit 30 may use, as additional determination conditions, at least one of the following: both direction indicators are blinking (hazard lamps blinking), the driver is looking outside through the front door glass, and the driver has removed the seat belt.
- The fact that the driver is looking outside through the front door glass is determined, for example, by recognizing the position of the front door glass from the mirror image and recognizing that the driver's line-of-sight direction is directed toward the front door glass.
- The fact that the driver has removed the seat belt is determined, for example, by recognizing whether or not a belt-shaped object (the seat belt) at a predetermined angle and of a predetermined width is present on the driver's upper body (chest), or by recognizing the motion of the driver touching and removing the seat belt.
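The determination conditions for this exit scenario (stopped vehicle plus a hand on the front door, with hazard lamps, looking out the door glass, or seat belt removal as additional evidence) can be sketched as follows; the field names and the confidence levels are hypothetical, not taken from the patent.

```python
# Hedged sketch of the driver-exit determination: the base conditions
# (speed 0 and hand on the front door) are required, and the optional
# conditions raise confidence. Representation is an assumption.

def detect_exit_intent(obs):
    """Return 'none', 'possible', or 'likely' for driver-exit intent.

    obs: dict of observations output by the recognition units."""
    if not (obs["estimated_speed"] == 0 and obs["hand_on_front_door"]):
        return "none"
    extra = sum([
        obs.get("hazard_lamps_blinking", False),
        obs.get("looking_out_door_glass", False),
        obs.get("seat_belt_removed", False),
    ])
    return "likely" if extra >= 1 else "possible"
```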
- As a warning process, the integrated determination unit 30 may, for example, output a voice message such as "a person is getting out of the car ahead" from the alarm device 54 toward the driver of the automobile 1, or display character information indicating a similar message on the display 56, thereby warning the driver of the automobile 1 about the forward vehicle.
- Alternatively, the integrated determination unit 30 may warn the forward vehicle of the approach of the following vehicle (automobile 1) by outputting toward the forward vehicle a voice message such as "a car is approaching" or a horn sound.
- As a dangerous event, the integrated determination unit 30 determines whether the automobile 1 is traveling in a position where it may collide with the forward vehicle or its driver when the front door of the forward vehicle opens. If it determines that there is a possibility of a collision (a dangerous event), the integrated determination unit 30 may automatically control the steering angle of the steering device via the vehicle body control unit 55 so that the automobile 1 steers left or right to avoid the forward vehicle or the driver. On the other hand, if it determines that there is no possibility of a collision even though the exiting behavior has been detected (not a dangerous event), the integrated determination unit 30 may, for example, merely display that fact on the display 56.
- the integrated determination unit 30 can detect the drowsy driving of the driver of the preceding vehicle.
- In this case, the integrated determination unit 30 uses at least one of the following as a determination condition: 1) a state in which the driver's eyelids are closed for a predetermined time or more has occurred at a predetermined frequency or more; 2) a state in which the driver's head is drooping by more than a predetermined angle has continued for a predetermined time or more; 3) the driver has a sleepy expression; 4) the driver's hand has slipped off the steering wheel; 5) the driver is not continuing to operate the steering wheel; 6) the accelerator input is weakening; 7) the vehicle body is weaving.
- the recognition target mirror in this case is typically an outer rear view mirror, but the conditions 1) to 3) above can be determined even when the inner rear view mirror is the recognition target.
- In this case, conditions 1) and 2) above may be used as the determination conditions.
- the conditions 1) to 6) are determined from the estimation result by the in-mirror recognition unit 20, and the condition 7) is determined from the estimation result by the front vehicle recognition unit 10.
- The above condition 1) is determined by counting the duration and the number of occurrences of states in which no iris is detected by the head/eyeball direction estimation unit 25.
- The above condition 2) is determined from the estimation results for the upper body posture (the degree of bending of the neck joint) and the head direction by the upper body posture estimation unit 24.
- the determination of the condition of 3) above uses the driver's facial expression recognition technique by the head / eyeball direction estimation unit 25.
- the conditions 4) and 5) are determined based on the tracking result of the driver's hand movement by the hand detail estimation unit 26.
- the condition of 6) above is determined from the deceleration pattern of the vehicle body of the preceding vehicle when it is difficult to determine by estimating the driver's foot posture.
- the condition of 7) is determined based on the change information of the position of the forward vehicle detected by the forward vehicle position detection unit 11.
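The counting described for condition 1), measuring the duration and number of episodes in which no iris is detected, might be sketched as follows; the frame-based input and all thresholds are illustrative assumptions rather than values from the patent.

```python
# Hedged sketch of condition 1): count eyes-closed episodes (runs of
# frames in which the head/eyeball direction estimation unit detects
# no iris) that last at least a minimum duration, and require a
# minimum number of such episodes. Thresholds are assumed.

def eyelid_condition_met(iris_detected_frames, fps, min_closed_sec=1.0, min_episodes=3):
    """iris_detected_frames: booleans per frame (True = iris visible)."""
    min_frames = min_closed_sec * fps
    episodes = 0
    run = 0  # length of the current eyes-closed run, in frames
    for detected in iris_detected_frames:
        if not detected:
            run += 1
        else:
            if run >= min_frames:
                episodes += 1
            run = 0
    if run >= min_frames:  # a closed-eye run may end the sequence
        episodes += 1
    return episodes >= min_episodes
```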
- When the integrated determination unit 30 detects, for example, conditions 1) and 2) through the estimation of the head/eyeball direction and the upper body posture, it determines that, as a dangerous event, there is a high possibility that the driver is dozing. As a warning process when drowsy driving is detected, the integrated determination unit 30 may, for example, output from the alarm device 54 a voice message informing the driver of the automobile 1 of that fact, or display character information indicating a similar message on the display 56.
- Alternatively, the integrated determination unit 30 may warn the driver of the forward vehicle by outputting a buzzer or horn sound from the alarm device 54.
- The integrated determination unit 30 can also estimate whether or not the driver of the forward vehicle recognizes the automobile 1 (host vehicle).
- the recognition target mirror may be an outer rear view mirror or an inner rear view mirror.
- The integrated determination unit 30 estimates the line-of-sight direction of the driver of the forward vehicle based on the estimation results of the head/eyeball direction estimation unit 25; if the estimated direction matches the direction of the mirror, it determines that the driver of the forward vehicle recognizes the automobile 1, and if not, it determines that the driver does not recognize the automobile 1.
- The integrated determination unit 30 may set a discriminant function such that the degree of recognition increases as the number of times the driver of the forward vehicle has looked in the direction of the automobile 1 increases, as the looking time lengthens, and as the most recent look becomes more recent.
- The integrated determination unit 30 treats it as a dangerous event when the time or frequency with which the driver of the forward vehicle looks in the direction of the automobile 1 is estimated to be below a predetermined threshold.
- When the degree of recognition determined by the discriminant function is equal to or less than a predetermined threshold, the integrated determination unit 30 issues a warning by outputting from the alarm device 54, toward the driver of the automobile 1, a voice message such as "the forward vehicle does not recognize this car."
- Alternatively, the integrated determination unit 30 may issue the warning by displaying character information indicating a similar message on the display 56.
- the display device 56 may be a display of a car navigation device.
- Alternatively, the integrated determination unit 30 may use the windshield 110 of the automobile 1 as a display and indicate the position of the forward vehicle that does not recognize the automobile 1, by superimposing an AR display at the position of the forward vehicle F as visually recognized by the driver of the automobile 1 through the windshield 110.
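The discriminant function described in this scenario, in which the degree of recognition grows with the number, duration, and recency of glances toward the host vehicle, could be sketched as follows; the exponential recency weighting and all thresholds are illustrative assumptions, not from the patent.

```python
# Hedged sketch of a recognition-degree discriminant function: each
# glance toward the host vehicle contributes a score that grows with
# its duration and decays with its age (half-life decay is an assumed
# form). A warning is issued when the total falls below a threshold.

import math

def recognition_degree(glances, now, half_life=10.0):
    """glances: list of (timestamp_sec, duration_sec) of looks toward the host car."""
    score = 0.0
    for t, dur in glances:
        recency = math.exp(-(now - t) * math.log(2) / half_life)
        score += (1.0 + dur) * recency  # more and longer glances raise the score
    return score

def warn_not_recognized(glances, now, threshold=1.0):
    """True when the forward driver is judged not to recognize the host vehicle."""
    return recognition_degree(glances, now) <= threshold
```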
- The integrated determination unit 30 can also estimate that the driver of the forward vehicle is drunk.
- the recognition target mirror in this case may be an outer rear view mirror or an inner rear view mirror.
- the integrated determination unit 30 may determine whether or not the driver of the preceding vehicle is drunk driving depending on whether the determination result of the drunk driving determination device is Yes or No.
- The drunk driving determination device is created, for example, by machine learning on the facial movements, facial expressions, facial color, and the like of drivers of forward vehicles; whether or not the driver is driving drunk is then determined, as a dangerous event, based on the behavior estimation results for such facial movements, expressions, and color obtained by the head/eyeball direction estimation unit 25.
- the drunk driving determination device may also take into account the degree of wobbling of the forward vehicle detected by the forward vehicle recognition unit 10.
- When drunk driving is determined, the integrated determination unit 30 may warn the driver of the automobile 1 by outputting a voice message to that effect from the alarm device 54.
- Alternatively, the integrated determination unit 30 may display character information indicating a message to that effect on the display of the car navigation device, or, as in the example shown in the figure described above, may display the character information superimposed as AR on the position of the forward vehicle on the windshield 110.
- the integrated determination unit 30 can detect that the driver of the preceding vehicle is driving aside.
- the recognition target mirror in this case may be an outer rear view mirror or an inner rear view mirror.
- As a dangerous event, the integrated determination unit 30 determines, through the behavior estimation of the head/eyeball direction estimation unit 25, whether the driver of the forward vehicle is looking in a direction other than the front (the direction substantially perpendicular to the windshield), and whether the number and frequency of episodes in which a direction other than the front is viewed for a predetermined time or more exceed thresholds.
- Certain actions are in principle excluded from this threshold determination; however, when their duration or number exceeds a predetermined threshold, they are regarded as looking aside and the above exclusion is cancelled.
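The looking-aside determination above, counting off-front gaze episodes while excluding routine actions unless they become too long, might be sketched as follows; the set of excluded targets and all thresholds are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch of the looking-aside determination: count gaze episodes
# away from the front that exceed a minimum duration, excluding routine
# checks (assumed here to be mirror/instrument glances) unless a check
# itself lasts too long. All names and thresholds are assumptions.

MIRROR_TARGETS = {"rear_view_mirror", "door_mirror", "instrument_panel"}

def is_looking_aside(gaze_episodes, min_sec=2.0, max_episodes=3, mirror_max_sec=4.0):
    """gaze_episodes: list of (target, duration_sec); target 'front' is safe."""
    count = 0
    for target, dur in gaze_episodes:
        if target == "front":
            continue
        if target in MIRROR_TARGETS and dur <= mirror_max_sec:
            # Routine checks are excluded in principle; an overly long
            # check falls through and counts as looking aside.
            continue
        if dur >= min_sec:
            count += 1
    return count >= max_episodes
```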
- Based on the detection results of the forward vehicle recognition unit 10, the integrated determination unit 30 also considers, as dangerous events, whether the vehicle body of the forward vehicle is weaving, is approaching another vehicle, is likely to stray from its lane, and so on.
- When looking-aside driving is detected, the integrated determination unit 30 may warn the driver of the automobile 1 by outputting a voice message to that effect from the alarm device 54.
- Alternatively, the integrated determination unit 30 may display character information indicating a message to that effect on the display of the car navigation device, or, as in the example shown in the figure described above, may display the character information superimposed as AR on the position of the forward vehicle on the windshield 110.
- The integrated determination unit 30 may also generate an image 115 indicating the behavior of the driver of the forward vehicle, determined based on the estimation results of the head/eyeball direction estimation unit 25, and display it superimposed as AR at the position of the driver's seat of the forward vehicle F on the windshield 110 of the automobile 1.
- The integrated determination unit 30 may likewise generate the above image 115 and display it in a superimposed manner.
- That is, an image indicating the state or action of the driver of the forward vehicle is generated and superimposed at the position of the driver's seat of the forward vehicle on the windshield 110.
- Since the in-mirror recognition unit 20 uses a 3D model of the human body when estimating the posture of the driver of the forward vehicle, it can also generate an image of the driver of the forward vehicle viewed from behind, which the driver of the automobile 1 could not otherwise see.
- This allows the driver of the automobile 1 to recognize how the driver of the forward vehicle appears.
- For example, rather than being alerted by voice or text that the driver of the preceding car is 70% likely to be drunk, it may be easier for the driver of the automobile 1 to understand the situation by directly viewing the motions of the drunken driver. Furthermore, not only drunkenness but also subtler elements, such as whether the driver's attention is distracted or the driver's temper is rough, can often be grasped intuitively when the in-mirror recognition unit 20 recognizes the body movements of the driver of the forward vehicle and shows them to the driver of the automobile 1 as an AR display.
- the integrated judgment unit 30 can predict whether the vehicle ahead is about to change course.
- the recognition target mirror in this case is an outer rear view mirror.
- The integrated determination unit 30 uses at least one of the following as a determination condition: 1) the line of sight of the driver of the forward vehicle is directed at the outer rear view mirror; 2) the driver of the forward vehicle glanced at the adjacent lane; 3) the driver of the forward vehicle looked back toward the rear of the lane; 4) the forward vehicle has slowed down; 5) the driver of the forward vehicle has started to turn the steering wheel; 6) the driver of the forward vehicle has put a hand on the turn indicator lighting lever; 7) the forward vehicle has lit its direction indicator.
- Among the above conditions 1) to 7), the integrated determination unit 30 may use in particular 6) as the determination condition.
- FIG. 11 is a view showing a state where the driver puts his hand H on the turn indicator lighting lever 120.
- The action of 6) is a preliminary operation for turning on the direction indicator; depending on the person, the hand may be placed on the lever for a long time before the indicator is actually turned on. By detecting this, a course change can be detected with high probability.
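The course-change prediction from conditions 1) to 7), with condition 6) (a hand on the turn indicator lighting lever) acting as a strong preliminary cue, can be sketched as a weighted scoring; all weights and the threshold are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of course-change prediction: each observed condition
# contributes an assumed weight, with 6) (hand on the turn indicator
# lighting lever) weighted heavily as the preliminary cue the text
# highlights. In practice these would come from learning (experiment).

COURSE_CHANGE_WEIGHTS = {
    1: 0.2,  # gaze directed at the outer rear view mirror
    2: 0.2,  # glanced at the adjacent lane
    3: 0.2,  # looked back toward the rear of the lane
    4: 0.1,  # vehicle slowed down
    5: 0.3,  # started turning the steering wheel
    6: 0.5,  # hand on the turn indicator lighting lever (preliminary action)
    7: 0.6,  # direction indicator already lit
}

def course_change_score(conditions):
    """conditions: iterable of observed condition numbers (1-7)."""
    return min(1.0, sum(COURSE_CHANGE_WEIGHTS.get(c, 0.0) for c in conditions))

def predict_course_change(conditions, threshold=0.5):
    return course_change_score(conditions) >= threshold
```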
- As a warning process, a voice message such as "the forward vehicle is likely to change course" may be output from the alarm device 54, or character information indicating the message may be displayed on the windshield 110 or on the display of the car navigation device.
- As described above, the control device 100 according to the present embodiment can recognize the state of an occupant of another vehicle and execute processing according to that state, thereby preventing an accident of the host vehicle or the other vehicle. Specifically, the control device 100 realizes the following.
- the driver of the own vehicle can grasp that the driver of the preceding vehicle has overlooked the own vehicle, leading to accident prevention.
- The driver of the host vehicle can grasp in advance that a person is about to open the door from inside the stopped forward vehicle, which leads to accident prevention.
- The driver of the host vehicle can grasp that a person is trying to open a window and put out a face or hand from inside the forward vehicle, and can drive carefully.
- The driver of the host vehicle can know whether the driver of the forward vehicle is looking aside, drunk, or dozing, and driving while paying attention to the forward vehicle leads to accident prevention.
- When it is detected that the driver of the forward vehicle is dozing, an accident can be prevented by alerting the driver of the forward vehicle with the horn.
- The driver of the host vehicle can grasp detailed actions of the driver of the forward vehicle that cannot be conveyed by the forward vehicle's direction indicators and the like, which leads to accident prevention (a direction indicator alone cannot distinguish a lane change, a left or right turn, or a U-turn, but if, for example, it can be detected that the driver of the forward vehicle is looking back, the possibility of a U-turn is high).
- The driver of the host vehicle can predict the next driving action even if the forward vehicle forgets to turn on its direction indicator.
- FIG. 13 is a block diagram showing the configuration of the automobile 1 including the control device 100 when the recognition target is only the inner rear view mirror. As shown in the figure, compared with the block diagram of FIG. 1, the control device 100 of FIG. 13 does not have the upper body posture estimation unit 24 or the hand detail estimation unit 26. This is because it is difficult to recognize the posture of the driver's upper body and the details of the hands from the mirror image of the inner rear view mirror.
- In the above embodiment, the control device 100 estimates the position of the driver of the preceding vehicle and recognizes the driver's state and behavior. However, the control device 100 may instead execute warning processing or automatic vehicle body control processing corresponding to the state and behavior of an occupant of the preceding vehicle other than the driver.
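The driver-position estimation mentioned here (configuration (5) below stores driver-seat information per vehicle type) can be sketched as a table lookup keyed by the vehicle type recognized from the captured image. The table contents, fractions, and the `estimate_driver_position` helper are illustrative assumptions, not data from the patent.

```python
# Hypothetical driver-seat database: vehicle type -> driver seat offset as a
# fraction of the vehicle's bounding-box (width, height); right-hand drive assumed.
DRIVER_SEAT_DB = {
    "sedan":   (0.75, 0.25),
    "minivan": (0.75, 0.30),
    "truck":   (0.80, 0.25),
}

def estimate_driver_position(vehicle_type, vehicle_box):
    """Estimate the driver's position within the detected vehicle region.

    vehicle_type -- type recognized from the captured image (e.g. "sedan")
    vehicle_box  -- (x, y, w, h) bounding box of the other vehicle in pixels
    Returns (px, py) pixel coordinates, or None for an unknown type.
    """
    seat = DRIVER_SEAT_DB.get(vehicle_type)
    if seat is None:
        return None  # unknown vehicle type: fall back to generic detection
    x, y, w, h = vehicle_box
    return (x + seat[0] * w, y + seat[1] * h)
```

A real driver position database (reference numeral 43 in FIG. 1) would of course hold measured seat geometry per model rather than these stand-in fractions.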
- In the above embodiment, the control device 100 warns the driver of the automobile 1 or of the preceding vehicle of danger according to the state and behavior of the driver of the preceding vehicle. A warning may also be issued to a vehicle other than the preceding vehicle whose mirror is the recognition target.
- In the above embodiment, an example in which the present technology is applied to the outer rear view mirror or the inner rear view mirror of an automobile (four-wheeled vehicle) is shown, but the present technology may also be applied to the rear view mirror of a two-wheeled vehicle.
- The host vehicle may likewise be either a two-wheeled vehicle or a four-wheeled vehicle.
- The display device may be a wearable display, such as a transmissive head-mounted display configured integrally with the helmet of the driver of the host vehicle.
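Taken together, the embodiment's processing loop is: detect the preceding vehicle's mirror in the front-camera image, detect a person within the mirror image, recognize the person's state, and warn or control accordingly. The sketch below shows that loop only; the detector callbacks, the `Rect` type, and the state labels are hypothetical stand-ins for the recognition units described in this document, not its actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Rect:
    """An image region in pixels: top-left corner (x, y), width, height."""
    x: int
    y: int
    w: int
    h: int

def monitor_front_vehicle(frame,
                          detect_mirror: Callable,
                          detect_person: Callable,
                          recognize_state: Callable,
                          warn: Callable) -> Optional[str]:
    """One cycle of the mirror-based occupant-monitoring loop.

    frame           -- captured image from the host vehicle's front camera
    detect_mirror   -- frame -> Rect of the preceding vehicle's mirror, or None
    detect_person   -- (frame, mirror Rect) -> Rect of a person, or None
    recognize_state -- (frame, person Rect) -> state label, e.g. "dozing"
    warn            -- callback that performs the warning processing
    """
    mirror = detect_mirror(frame)
    if mirror is None:
        return None               # no mirror of another vehicle detected
    person = detect_person(frame, mirror)
    if person is None:
        return None               # mirror found, but no occupant reflected
    state = recognize_state(frame, person)
    if state != "normal":
        warn(state)               # warning or vehicle-control processing
    return state
```

In a real system the three callbacks would wrap the forward vehicle recognition unit 10, the in-mirror recognition unit 20, and the integrated determination unit 30 of FIG. 1.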
- Note that the present technology can also adopt the following configurations.
(1) A control device including:
an input unit to which a captured image from a camera provided in a host vehicle is input; and
a control unit that detects, from the input captured image, a mirror provided on another vehicle present ahead of the host vehicle, detects a person from the mirror image of the detected mirror, recognizes the state of the person from the image of the detected person, and executes, according to the recognized state of the person, warning processing for preventing an accident of the host vehicle or the other vehicle, or control processing of the host vehicle.
(2) The control device according to (1), in which the control unit recognizes a body part of the person from the image of the detected person within the mirror image, recognizes a component of the other vehicle from a portion of the mirror image other than the person, and recognizes the state of the detected person based on the relationship between the recognized body part and the component.
(3) The control device according to (1) or (2), in which the control unit recognizes the state of the other vehicle from a portion of the input captured image other than the mirror, and estimates the behavior of the person based on the recognized state of the person and the state of the other vehicle.
(4) The control device according to (2), in which the control unit determines whether or not the detected person is a driver based on the relationship between the recognized body part and the component.
(5) The control device according to any one of (1) to (4), further including a storage unit that stores driver-seat information on the driver's seat position for each vehicle type, in which the control unit recognizes the vehicle type of the other vehicle from the captured image and estimates the position of the driver of the other vehicle based on the recognized vehicle type and the stored driver-seat information.
(6) The control device according to any one of (1) to (5), in which the control unit detects an outer rear view mirror provided on the other vehicle when it is detected from the captured image that the other vehicle is in a lane different from that of the host vehicle, and detects an inner rear view mirror provided on the other vehicle when it is detected that the other vehicle is in the same lane as the host vehicle.
(7) The control device according to any one of (1) to (6), in which the control unit controls an infrared light emitting unit provided in the host vehicle so that the mirror detected at night is irradiated with infrared light.
(8) The control device according to any one of (1) to (7), in which the control unit recognizes the state of the person based on the captured image that is input via a polarization camera provided in the host vehicle and from which the reflection component produced by the window glass of the other vehicle has been removed.
(9) The control device according to (1), in which the control unit executes warning processing that, when it is detected from the captured image that the other vehicle is stopped and it is detected, from the relationship between the image of a door and the image of the person's hand within the mirror image, that the person has put a hand on the door, warns the driver of the host vehicle that the person will get out of the other vehicle, or warns the driver of the other vehicle of the approach of the host vehicle.
(10) The control device according to (4), in which the control unit executes warning processing that, when it is detected that a state in which the detected driver's eyelids are closed for a first time or longer has occurred at a predetermined frequency or more and a state in which the driver's head is lowered by a predetermined angle or more has continued for a second time or longer, warns the driver of the host vehicle that the driver of the other vehicle is dozing off, or alerts the driver of the other vehicle.
(11) The control device according to (4), in which the control unit estimates the gaze direction of the detected driver and, when the estimated gaze direction does not match the direction of the mirror, executes warning processing to warn that the driver of the other vehicle has not recognized the host vehicle.
(12) The control device according to (4), in which the control unit recognizes, from the mirror image, the detected driver's hand and the direction indicator lever of the other vehicle, and, when it detects that the driver's hand is on the direction indicator lever, executes warning processing to warn the driver of the host vehicle that the driver of the other vehicle will change course.
(13) The control device according to (4), in which the control unit generates an image indicating the recognized state of the driver of the other vehicle, and controls a display control unit of the host vehicle so that the generated image is superimposed and displayed on the windshield of the host vehicle at a position corresponding to the driver's seat of the other vehicle that the driver of the host vehicle can see through the windshield.
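Configuration (8) relies on a polarization camera to remove the reflection component produced by the other vehicle's window glass. A common realization (assumed here; the removal method itself is not specified in the text) captures the scene at several polarizer angles and keeps the per-pixel minimum, since glare reflected off glass is strongly polarized while light transmitted from the cabin is largely not:

```python
def suppress_reflection(polarized_frames):
    """Per-pixel minimum over frames captured at different polarizer angles.

    polarized_frames -- list of same-sized grayscale images, each given as a
                        list of rows of intensity values (one image per angle)
    Returns a single image with the polarized glare component reduced.
    """
    assert polarized_frames, "need at least one polarization channel"
    height = len(polarized_frames[0])
    width = len(polarized_frames[0][0])
    return [
        [min(frame[r][c] for frame in polarized_frames) for c in range(width)]
        for r in range(height)
    ]
```

The in-mirror recognition would then run on the returned image instead of the raw capture, so the person behind the glass is not masked by reflected sky or scenery.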
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Health & Medical Sciences (AREA)
- Mathematical Physics (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Traffic Control Systems (AREA)
- Rear-View Mirror Devices That Are Mounted On The Exterior Of The Vehicle (AREA)
- Image Analysis (AREA)
Abstract
Description
The method includes:
detecting, from a captured image input from a camera provided in a host vehicle, a mirror provided on another vehicle present ahead of the host vehicle;
detecting a person from the mirror image of the detected mirror;
recognizing the state of the person from the image of the detected person; and
executing, according to the recognized state of the person, warning processing for preventing an accident of the host vehicle or the other vehicle, or control processing of the host vehicle.
The program causes a control device to execute:
a step of detecting, from a captured image input from a camera provided in the host vehicle, a mirror provided on another vehicle present ahead of the host vehicle;
a step of detecting a person from the mirror image of the detected mirror;
a step of recognizing the state of the person from the image of the detected person; and
a step of executing, according to the recognized state of the person, warning processing for preventing an accident of the host vehicle or the other vehicle, or control processing of the host vehicle.
FIG. 1 is a block diagram showing the configuration of an automobile including a control device according to an embodiment of the present technology.
Next, the operation of the control device 100 configured as described above will be described. This operation is executed through the cooperation of the hardware and software of the control device 100.
FIG. 2 is a flowchart showing an overview of the operation of the control device 100.
FIG. 4 is a flowchart showing the detailed flow of the in-mirror recognition processing described above.
The integrated determination unit 30 can detect a driver who is about to open a door and get out while the preceding vehicle is stopped. The recognition-target mirror in this case is the outer rear view mirror.
The integrated determination unit 30 can detect that the driver of the preceding vehicle is dozing off at the wheel. For example, the following conditions are used:
1) A state in which the driver's eyelids are closed for a predetermined time or longer has occurred at a predetermined frequency or more.
2) A state in which the driver's head is lowered by a predetermined angle or more has continued for a predetermined time or longer.
3) The driver has a sleepy facial expression.
4) The driver's hand has slipped off the steering wheel.
5) The driver is not continuously operating the steering wheel.
6) The accelerator input is weakening.
7) The vehicle body is wandering.
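Conditions 1) to 7) above can be fused by a simple rule: require the two primary conditions that configuration (10) names (eyelid-closure frequency and sustained head-down posture), or, alternatively, several of the supporting cues firing at once. This is a hedged sketch — the document specifies the cues but not the fusion rule, so the thresholds and the "three supporting cues" alternative below are assumptions.

```python
def is_dozing(eyelid_closed_episodes, head_down_seconds,
              min_episodes=3, min_head_down_s=2.0, supporting_cues=()):
    """Rule-based dozing judgment over cues 1)-7).

    eyelid_closed_episodes -- how many times the eyelids stayed closed beyond
                              a first duration threshold (cue 1)
    head_down_seconds      -- how long the head has stayed lowered beyond a
                              given angle (cue 2)
    supporting_cues        -- booleans for cues 3)-7): sleepy expression,
                              hand slipped off the wheel, no steering input,
                              weakening accelerator, wandering vehicle body
    Threshold values are illustrative assumptions, not from the document.
    """
    # Mandatory pair of conditions, mirroring configuration (10).
    primary = (eyelid_closed_episodes >= min_episodes
               and head_down_seconds >= min_head_down_s)
    # Alternatively, several supporting cues observed at once.
    return primary or sum(bool(c) for c in supporting_cues) >= 3
```

When this returns true, the warning processing would alert the host driver or sound the horn toward the preceding vehicle, as described in the effects above.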
The integrated determination unit 30 can estimate whether or not the preceding vehicle has recognized the automobile 1 (host vehicle). The recognition-target mirror in this case may be either the outer rear view mirror or the inner rear view mirror.
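Estimating whether the driver ahead has noticed the host vehicle reduces, as in configuration (11), to comparing the driver's estimated gaze direction with the direction of the detected mirror. The angular-tolerance comparison below is one plausible realization; the tolerance value and function names are assumptions.

```python
import math

def driver_sees_mirror(gaze_dir, mirror_dir, tolerance_deg=10.0):
    """Return True if the gaze direction matches the mirror direction.

    gaze_dir, mirror_dir -- 2D direction vectors (x, y) in the image plane;
                            tolerance_deg is an assumed angular margin.
    """
    dot = gaze_dir[0] * mirror_dir[0] + gaze_dir[1] * mirror_dir[1]
    norm = math.hypot(*gaze_dir) * math.hypot(*mirror_dir)
    if norm == 0.0:
        return False
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= tolerance_deg

def check_awareness(gaze_dir, mirror_dir, warn):
    """Warn, per configuration (11), when the gaze misses the mirror."""
    if not driver_sees_mirror(gaze_dir, mirror_dir):
        warn("driver of the other vehicle has not recognized the host vehicle")
```

Run every frame, this check flags the "overlooked" situation that the effects section says the host driver should be told about.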
The integrated determination unit 30 can estimate that the driver of the preceding vehicle is driving under the influence of alcohol. The recognition-target mirror in this case may be either the outer or the inner rear view mirror.
The integrated determination unit 30 can detect that the driver of the preceding vehicle is driving inattentively. The recognition-target mirror in this case may be either the outer or the inner rear view mirror.
The integrated determination unit 30 can predict whether the preceding vehicle is about to change course. The recognition-target mirror in this case is the outer rear view mirror. For example, the following signals are used:
1) The gaze of the driver of the preceding vehicle is directed at the outer rear view mirror.
2) The driver of the preceding vehicle has visually checked the adjacent lane.
3) The driver of the preceding vehicle has looked back toward the rear of the lane.
4) The preceding vehicle has been decelerating.
5) The driver of the preceding vehicle has started to turn the steering wheel.
6) The driver of the preceding vehicle has put a hand on the direction indicator lever.
7) The preceding vehicle has turned on its direction indicator.
FIG. 11 is a diagram showing a state in which the driver has put a hand H on the direction indicator lever 120. Action 6) is a preparatory action for turning on the direction indicator, and some drivers put a hand on the lever well before actually turning the indicator on; by detecting this action, the integrated determination unit 30 can therefore detect a course change with high probability.
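Signals 1) to 7) above carry different strength: the lever grip of FIG. 11 is a deliberate preparatory action, while deceleration alone is weak evidence. One plausible (assumed, not specified in the text) fusion is a weighted score compared against a threshold:

```python
# Illustrative weights for course-change cues 1)-7); not values from the text.
CUE_WEIGHTS = {
    "gaze_at_outer_mirror": 1.0,    # 1)
    "checked_adjacent_lane": 2.0,   # 2)
    "looked_back": 3.0,             # 3) strong hint of a U-turn/course change
    "decelerating": 1.0,            # 4)
    "steering_started": 3.0,        # 5)
    "hand_on_indicator_lever": 3.0, # 6) preparatory action of FIG. 11
    "indicator_on": 4.0,            # 7) explicit signal
}

def course_change_score(observed_cues, weights=CUE_WEIGHTS):
    """Sum the weights of the observed cues (an assumed fusion rule)."""
    return sum(weights.get(cue, 0.0) for cue in observed_cues)

def predicts_course_change(observed_cues, threshold=4.0):
    """Predict a course change when the cue score reaches the threshold."""
    return course_change_score(observed_cues) >= threshold
```

Because cue 6) alone can fire well before the indicator, a fusion like this predicts the course change earlier than waiting for cue 7), which is the point of the FIG. 11 observation.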
以上説明したように、本実施形態によれば、制御装置100は、他車両の乗員の状態を認識してその状態に応じた処理を実行することで自車両または他車両の事故を未然に防ぐことができる。具体的には、制御装置100によって以下のことが実現される。
・自車両の運転者は、停止している前方車の車内から人がドアを開けて出て来ることが未然に把握でき、事故防止につながる。
・自車両の運転者は、前方車の車内から人が窓を開けて顔や手を出そうとしていることを把握でき、注意して走行することができる。
・自車両の運転者は、前方車の運転者がわき見/飲酒/居眠り運転をしているかどうかを把握でき、その前方車に気を付けながら運転することで、事故防止につながる。
・前方車の運転者が居眠り運転をしていることが検知された場合に、それを前方車の運転者にクラクション等で知らせることで、事故防止につながる。
・自車両の運転者は、前方車の方向指示器等で表しきれない前方車の運転者の詳細動作を把握でき、事故防止につながる(車線変更か右左折かUターンかは方向指示器だけでは判断つかないが、例えば前方車の運転者が後ろまでしっかり振り返って見ていることが検知できればUターンである可能性が高いと判断される)。
・自車両の運転者は、前方車が方向指示器の点灯を忘れていても、次の運転動作を予測できる。
The present technology is not limited to the embodiments described above, and various modifications can be made without departing from the gist of the present disclosure.
10… forward vehicle recognition unit
20… in-mirror recognition unit
30… integrated determination unit
41… day/night determination unit
42… IR light emission control unit
43… driver position database
51… front camera
52… front polarization camera
53… IR projector
54… alarm device
55… vehicle body control unit
56… display
70… door mirror
80… inner rear view mirror
100… control device
110… windshield
115… driver image
120… direction indicator lever
F… forward vehicle
I… mirror image
Claims (15)
1. A control device comprising:
an input unit to which a captured image from a camera provided in a host vehicle is input; and
a control unit that detects, from the input captured image, a mirror provided on another vehicle present ahead of the host vehicle, detects a person from the mirror image of the detected mirror, recognizes the state of the person from the image of the detected person, and executes, according to the recognized state of the person, warning processing for preventing an accident of the host vehicle or the other vehicle, or control processing of the host vehicle.
2. The control device according to claim 1, wherein the control unit recognizes a body part of the person from the image of the detected person within the mirror image, recognizes a component of the other vehicle from a portion of the mirror image other than the person, and recognizes the state of the detected person based on the relationship between the recognized body part and the component.
3. The control device according to claim 1, wherein the control unit recognizes the state of the other vehicle from a portion of the input captured image other than the mirror, and estimates the behavior of the person based on the recognized state of the person and the state of the other vehicle.
4. The control device according to claim 2, wherein the control unit determines whether or not the detected person is a driver based on the relationship between the recognized body part and the component.
5. The control device according to claim 1, further comprising a storage unit that stores driver-seat information on the driver's seat position for each vehicle type, wherein the control unit recognizes the vehicle type of the other vehicle from the captured image and estimates the position of the driver of the other vehicle based on the recognized vehicle type and the stored driver-seat information.
6. The control device according to claim 1, wherein the control unit detects an outer rear view mirror provided on the other vehicle when it is detected from the captured image that the other vehicle is in a lane different from that of the host vehicle, and detects an inner rear view mirror provided on the other vehicle when it is detected that the other vehicle is in the same lane as the host vehicle.
7. The control device according to claim 1, wherein the control unit controls an infrared light emitting unit provided in the host vehicle so that the mirror detected at night is irradiated with infrared light.
8. The control device according to claim 1, wherein the control unit recognizes the state of the person based on the captured image that is input via a polarization camera provided in the host vehicle and from which the reflection component produced by the window glass of the other vehicle has been removed.
9. The control device according to claim 1, wherein the control unit executes warning processing that, when it is detected from the captured image that the other vehicle is stopped and it is detected, from the relationship between the image of a door and the image of the person's hand within the mirror image, that the person has put a hand on the door, warns the driver of the host vehicle that the person will get out of the other vehicle, or warns the driver of the other vehicle of the approach of the host vehicle.
10. The control device according to claim 4, wherein the control unit executes warning processing that, when it is detected that a state in which the detected driver's eyelids are closed for a first time or longer has occurred at a predetermined frequency or more and a state in which the driver's head is lowered by a predetermined angle or more has continued for a second time or longer, warns the driver of the host vehicle that the driver of the other vehicle is dozing off, or alerts the driver of the other vehicle.
11. The control device according to claim 4, wherein the control unit estimates the gaze direction of the detected driver and, when the estimated gaze direction does not match the direction of the mirror, executes warning processing to warn that the driver of the other vehicle has not recognized the host vehicle.
12. The control device according to claim 4, wherein the control unit recognizes, from the mirror image, the detected driver's hand and the direction indicator lever of the other vehicle, and, when it detects that the driver's hand is on the direction indicator lever, executes warning processing to warn the driver of the host vehicle that the driver of the other vehicle will change course.
13. The control device according to claim 4, wherein the control unit generates an image indicating the recognized state of the driver of the other vehicle, and controls a display control unit of the host vehicle so that the generated image is superimposed and displayed on the windshield of the host vehicle at a position corresponding to the driver's seat of the other vehicle that the driver of the host vehicle can see through the windshield.
14. A control method comprising:
detecting, from a captured image input from a camera provided in a host vehicle, a mirror provided on another vehicle present ahead of the host vehicle;
detecting a person from the mirror image of the detected mirror;
recognizing the state of the person from the image of the detected person; and
executing, according to the recognized state of the person, warning processing for preventing an accident of the host vehicle or the other vehicle, or control processing of the host vehicle.
15. A program causing a control device to execute:
a step of detecting, from a captured image input from a camera provided in the host vehicle, a mirror provided on another vehicle present ahead of the host vehicle;
a step of detecting a person from the mirror image of the detected mirror;
a step of recognizing the state of the person from the image of the detected person; and
a step of executing, according to the recognized state of the person, warning processing for preventing an accident of the host vehicle or the other vehicle, or control processing of the host vehicle.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA2999671A CA2999671A1 (en) | 2015-09-30 | 2016-09-05 | Control apparatus, control method, and program |
US15/761,901 US10793149B2 (en) | 2015-09-30 | 2016-09-05 | Control apparatus, control method, and program |
MX2018003613A MX2018003613A (es) | 2015-09-30 | 2016-09-05 | Dispositivo de control, metodo de control y programa. |
EP16850582.4A EP3358548A4 (en) | 2015-09-30 | 2016-09-05 | CONTROL DEVICE, CONTROL METHOD, AND PROGRAM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015193320 | 2015-09-30 | ||
JP2015-193320 | 2015-09-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017056401A1 true WO2017056401A1 (ja) | 2017-04-06 |
Family
ID=58422969
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/004043 WO2017056401A1 (ja) | 2015-09-30 | 2016-09-05 | 制御装置、制御方法及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US10793149B2 (ja) |
EP (1) | EP3358548A4 (ja) |
CA (1) | CA2999671A1 (ja) |
MX (1) | MX2018003613A (ja) |
WO (1) | WO2017056401A1 (ja) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2553616A (en) * | 2016-06-13 | 2018-03-14 | Ford Global Tech Llc | Blind spot detection systems and methods |
JP2019012445A (ja) * | 2017-06-30 | 2019-01-24 | パイオニア株式会社 | 制御装置、制御方法及びプログラム |
JP2019098811A (ja) * | 2017-11-29 | 2019-06-24 | トヨタ自動車株式会社 | 車間距離制御装置 |
WO2019130483A1 (ja) * | 2017-12-27 | 2019-07-04 | 本田技研工業株式会社 | 車両制御装置、車両制御方法、およびプログラム |
WO2019138661A1 (ja) * | 2018-01-12 | 2019-07-18 | ソニー株式会社 | 情報処理装置および情報処理方法 |
JP2019200548A (ja) * | 2018-05-15 | 2019-11-21 | 日産自動車株式会社 | 車載装置、車載装置の制御方法、及び予備動作推定システム |
CN110682913A (zh) * | 2018-07-03 | 2020-01-14 | 矢崎总业株式会社 | 监视*** |
JP2020016950A (ja) * | 2018-07-23 | 2020-01-30 | 株式会社デンソーテン | 衝突判定装置および衝突判定方法 |
WO2020070941A1 (ja) * | 2018-10-05 | 2020-04-09 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 情報処理方法、及び、情報処理システム |
JP2020095356A (ja) * | 2018-12-10 | 2020-06-18 | トヨタ自動車株式会社 | 異常検出装置、異常検出システム及び異常検出プログラム |
JP2020166479A (ja) * | 2019-03-29 | 2020-10-08 | 本田技研工業株式会社 | 運転支援装置 |
JP2021007717A (ja) * | 2019-07-03 | 2021-01-28 | 本田技研工業株式会社 | 乗員観察装置、乗員観察方法、及びプログラム |
WO2022153592A1 (ja) * | 2021-01-12 | 2022-07-21 | 日立Astemo株式会社 | 車両制御装置 |
JP7379545B2 (ja) | 2022-01-06 | 2023-11-14 | 本田技研工業株式会社 | 車両用警告システム |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9919648B1 (en) | 2016-09-27 | 2018-03-20 | Robert D. Pedersen | Motor vehicle artificial intelligence expert system dangerous driving warning and control system and method |
JP6820533B2 (ja) * | 2017-02-16 | 2021-01-27 | パナソニックIpマネジメント株式会社 | 推定装置、学習装置、推定方法、及び推定プログラム |
US10699143B2 (en) * | 2017-03-10 | 2020-06-30 | Gentex Corporation | System and method for vehicle occupant identification and monitoring |
US11151883B2 (en) | 2017-11-03 | 2021-10-19 | International Business Machines Corporation | Empathic autonomous vehicle |
US10762785B1 (en) * | 2018-01-09 | 2020-09-01 | State Farm Mutual Automobile Insurance Company | Vehicle collision alert system and method |
JP7014032B2 (ja) * | 2018-04-23 | 2022-02-01 | 株式会社デンソー | 車両衝突推定装置 |
EP3819891A1 (en) * | 2019-11-07 | 2021-05-12 | Ningbo Geely Automobile Research & Development Co. Ltd. | Threat mitigation for vehicles |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009211309A (ja) * | 2008-03-03 | 2009-09-17 | Honda Motor Co Ltd | 走行支援装置 |
JP2010095187A (ja) * | 2008-10-17 | 2010-04-30 | Toyota Motor Corp | 車両状態検出装置 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9292471B2 (en) * | 2011-02-18 | 2016-03-22 | Honda Motor Co., Ltd. | Coordinated vehicle response system and method for driver behavior |
JP6292054B2 (ja) * | 2013-11-29 | 2018-03-14 | 富士通株式会社 | 運転支援装置、方法、及びプログラム |
JP2016127333A (ja) * | 2014-12-26 | 2016-07-11 | 株式会社リコー | 撮像素子および撮像装置および撮像情報認識システム |
-
2016
- 2016-09-05 WO PCT/JP2016/004043 patent/WO2017056401A1/ja active Application Filing
- 2016-09-05 US US15/761,901 patent/US10793149B2/en active Active
- 2016-09-05 EP EP16850582.4A patent/EP3358548A4/en not_active Withdrawn
- 2016-09-05 CA CA2999671A patent/CA2999671A1/en not_active Abandoned
- 2016-09-05 MX MX2018003613A patent/MX2018003613A/es unknown
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009211309A (ja) * | 2008-03-03 | 2009-09-17 | Honda Motor Co Ltd | 走行支援装置 |
JP2010095187A (ja) * | 2008-10-17 | 2010-04-30 | Toyota Motor Corp | 車両状態検出装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3358548A4 * |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2553616A (en) * | 2016-06-13 | 2018-03-14 | Ford Global Tech Llc | Blind spot detection systems and methods |
JP2019012445A (ja) * | 2017-06-30 | 2019-01-24 | パイオニア株式会社 | 制御装置、制御方法及びプログラム |
JP2019098811A (ja) * | 2017-11-29 | 2019-06-24 | トヨタ自動車株式会社 | 車間距離制御装置 |
WO2019130483A1 (ja) * | 2017-12-27 | 2019-07-04 | 本田技研工業株式会社 | 車両制御装置、車両制御方法、およびプログラム |
WO2019138661A1 (ja) * | 2018-01-12 | 2019-07-18 | ソニー株式会社 | 情報処理装置および情報処理方法 |
JPWO2019138661A1 (ja) * | 2018-01-12 | 2020-11-19 | ソニー株式会社 | 情報処理装置および情報処理方法 |
JP2019200548A (ja) * | 2018-05-15 | 2019-11-21 | 日産自動車株式会社 | 車載装置、車載装置の制御方法、及び予備動作推定システム |
JP7147259B2 (ja) | 2018-05-15 | 2022-10-05 | 日産自動車株式会社 | 車載装置、車載装置の制御方法、及び予備動作推定システム |
CN110682913A (zh) * | 2018-07-03 | 2020-01-14 | 矢崎总业株式会社 | 监视*** |
JP2020008931A (ja) * | 2018-07-03 | 2020-01-16 | 矢崎総業株式会社 | 監視システム |
US11132534B2 (en) | 2018-07-03 | 2021-09-28 | Yazaki Corporation | Monitoring system |
CN110682913B (zh) * | 2018-07-03 | 2022-08-16 | 矢崎总业株式会社 | 监视*** |
JP2020016950A (ja) * | 2018-07-23 | 2020-01-30 | 株式会社デンソーテン | 衝突判定装置および衝突判定方法 |
WO2020070941A1 (ja) * | 2018-10-05 | 2020-04-09 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 情報処理方法、及び、情報処理システム |
JP2020095356A (ja) * | 2018-12-10 | 2020-06-18 | トヨタ自動車株式会社 | 異常検出装置、異常検出システム及び異常検出プログラム |
JP7234614B2 (ja) | 2018-12-10 | 2023-03-08 | トヨタ自動車株式会社 | 異常検出装置、異常検出システム及び異常検出プログラム |
JP2020166479A (ja) * | 2019-03-29 | 2020-10-08 | 本田技研工業株式会社 | 運転支援装置 |
JP2021007717A (ja) * | 2019-07-03 | 2021-01-28 | 本田技研工業株式会社 | 乗員観察装置、乗員観察方法、及びプログラム |
WO2022153592A1 (ja) * | 2021-01-12 | 2022-07-21 | 日立Astemo株式会社 | 車両制御装置 |
JP7379545B2 (ja) | 2022-01-06 | 2023-11-14 | 本田技研工業株式会社 | 車両用警告システム |
Also Published As
Publication number | Publication date |
---|---|
CA2999671A1 (en) | 2017-04-06 |
US10793149B2 (en) | 2020-10-06 |
EP3358548A1 (en) | 2018-08-08 |
MX2018003613A (es) | 2018-04-30 |
EP3358548A4 (en) | 2019-10-16 |
US20180229725A1 (en) | 2018-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017056401A1 (ja) | 制御装置、制御方法及びプログラム | |
US11787408B2 (en) | System and method for controlling vehicle based on condition of driver | |
JP6540663B2 (ja) | 車両システム | |
EP3142902B1 (en) | Display device and vehicle | |
CN108621923B (zh) | 车辆的显示***及车辆的显示***的控制方法 | |
JP6454368B2 (ja) | 車両の表示システム及び車両の表示システムの制御方法 | |
CN107074246B (zh) | 动态控制装置及相关方法 | |
WO2017104794A1 (ja) | 視覚認知支援システムおよび視認対象物の検出システム | |
JP4825868B2 (ja) | 車両用警報装置 | |
CN108621794B (zh) | 车辆的显示***及车辆的显示***的控制方法 | |
JP5817843B2 (ja) | 車両用情報伝達装置 | |
US20190367038A1 (en) | Driver monitoring device | |
JP6205640B2 (ja) | 車両用警告装置 | |
CN108621922B (zh) | 车辆的显示***及车辆的显示***的控制方法 | |
JP2016045713A (ja) | 車載制御装置 | |
WO2015166059A1 (en) | Control apparatus and related method | |
JP5925163B2 (ja) | 運転支援装置 | |
JP2016045714A (ja) | 車載制御装置 | |
US11708079B2 (en) | Method, device, and system for influencing at least one driver assistance system of a motor vehicle | |
JP6730096B2 (ja) | 乗員状態監視装置 | |
JP2016055801A (ja) | 車載表示装置 | |
JP6187155B2 (ja) | 注視対象物推定装置 | |
JP6213435B2 (ja) | 注意過多状態判定装置及び注意過多状態判定プログラム | |
WO2019193715A1 (ja) | 運転支援装置 | |
JP2018154329A (ja) | 車載制御装置及び車載システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16850582 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15761901 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2999671 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2018/003613 Country of ref document: MX |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
NENP | Non-entry into the national phase |
Ref country code: JP |