WO2013136740A1 - Driving support device and driving support method - Google Patents
Driving support device and driving support method
- Publication number
- WO2013136740A1 (PCT/JP2013/001487)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- driver
- gaze
- area
- image
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 156
- 238000001514 detection method Methods 0.000 claims abstract description 39
- 238000003384 imaging method Methods 0.000 claims abstract 2
- 230000033001 locomotion Effects 0.000 claims description 11
- 230000002093 peripheral effect Effects 0.000 claims description 4
- 230000004424 eye movement Effects 0.000 claims 1
- 230000001815 facial effect Effects 0.000 abstract 2
- 238000010586 diagram Methods 0.000 description 6
- 230000000007 visual effect Effects 0.000 description 4
- 238000004891 communication Methods 0.000 description 3
- 230000007423 decrease Effects 0.000 description 2
- 239000004973 liquid crystal related substance Substances 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000004378 air conditioning Methods 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 239000013598 vector Substances 0.000 description 1
Images
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
- B60K35/285—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver for improving awareness by directing driver's gaze direction or eye points
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/02—Rear-view mirror arrangements
- B60R1/06—Rear-view mirror arrangements mounted on vehicle exterior
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/24—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/149—Instrument input by detecting viewing direction not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/18—Information management
- B60K2360/191—Highlight information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/21—Optical features of instruments using cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/33—Illumination features
- B60K2360/334—Projection means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/307—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene
- B60R2300/308—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene by overlaying the real scene, e.g. through a head-up display on the windscreen
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present disclosure relates to a driving support apparatus and a driving support method for displaying an image for supporting a driver of a vehicle.
- a driving assistance device is known in which the line of sight of the driver of the vehicle is detected, a gaze target toward which the driver's line of sight is directed is photographed based on the detected gaze direction, and the captured image is displayed in enlarged form on a display screen provided in front of the driver, thereby assisting the driver (see, for example, Patent Document 1).
- This disclosure is intended to provide a driving support device and a driving support method for displaying an image for supporting a driver of a vehicle.
- with this driving support device and driving support method, display of the target being watched by the driver can be performed appropriately when the driver so desires.
- the driving support device includes: face image capturing means that continuously captures the face of the driver of the vehicle to acquire a face image; gaze direction detection means that detects the driver's gaze direction from the face image acquired by the face image capturing means; gaze area setting means that sets the gaze area at which the driver is gazing, based on the gaze direction detected by the gaze direction detection means; display means having a display screen arranged at a position visible to the driver; line-of-sight movement determination means that determines, based on the detection result of the gaze direction detection means, whether the driver's eye movement matches a display start instruction operation set in advance for instructing the start of display by the display means; and display start means that, when the line-of-sight movement determination means determines a match, starts display of display information whose contents are set in advance according to the gaze area set by the gaze area setting means.
- with this configuration, by moving the eyes so as to match the display start instruction operation, the driver causes display information whose contents are set in advance according to the gaze area at which the driver was gazing to be displayed on the display screen arranged at a position visible to the driver. That is, in order to have the display information corresponding to a gaze area shown on the display screen, the driver must not only gaze at a certain area but also, after gazing, move the eyes so as to match the preset display start instruction operation. For this reason, the display is not triggered merely by gazing at a certain area, and the display information corresponding to the gaze area can be displayed appropriately when the driver desires it.
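The determination performed by the line-of-sight movement determination means can be sketched as follows. This is an illustrative assumption, not the patented implementation: it supposes the display start instruction operation is "gaze at an area for at least a dwell time, then shift the eyes to the display screen region DR within a short interval", with the gaze direction detection results reduced to timestamped zone labels.

```python
def matches_display_start(samples, dwell_s=1.0, max_shift_s=0.5):
    """Hypothetical display-start gesture check.

    samples: list of (timestamp, zone) pairs in time order, where zone is
    'outside' (line of sight directed outside the vehicle) or 'DR' (directed
    at the display screen region).  Returns True when the driver gazed
    outside for at least dwell_s seconds and then moved the eyes to DR
    within max_shift_s seconds.
    """
    for i in range(len(samples) - 1):
        t0, z0 = samples[i]
        t1, z1 = samples[i + 1]
        if z0 == 'outside' and z1 == 'DR':
            # measure how long the preceding run of 'outside' samples lasted
            j = i
            while j > 0 and samples[j - 1][1] == 'outside':
                j -= 1
            dwell = t0 - samples[j][0]
            if dwell >= dwell_s and (t1 - t0) <= max_shift_s:
                return True
    return False
```

A gesture of 1.2 s of outside gaze followed by a 0.2 s shift to DR matches; a glance at DR without the dwell does not.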
- the driving support method continuously captures the face of the driver of the vehicle to acquire a face image, detects the driver's gaze direction using the face image, sets the gaze area at which the driver is gazing based on the detected gaze direction, determines from the detection result whether the driver's eye movement matches a display start instruction operation set in advance for instructing the start of display, and, when they match, displays display information whose contents are set in advance according to the gaze area on a display screen arranged at a position visible to the driver.
- also with this method, by moving the eyes so as to match the display start instruction operation, the driver causes the display information whose contents are set in advance according to the gaze area to be displayed on the display screen arranged at a position visible to the driver. Gazing at a certain area alone does not trigger the display; the preset display start instruction operation must follow, so the display information corresponding to the gaze area can be shown appropriately when the driver desires it.
- FIG. 1 is a block diagram showing a schematic configuration of a driving support device.
- FIG. 2 is a diagram showing the arrangement of display screen areas in the windshield.
- FIG. 3 is a flowchart showing the display processing of the first embodiment.
- FIG. 4 is a diagram showing a gaze area in the foreground image data.
- FIG. 5 is a flowchart showing the display processing of the second embodiment.
- FIG. 6 is a flowchart showing the display adjustment process of the second embodiment.
- FIG. 7 is a flowchart showing the display processing of the third embodiment.
- FIG. 8 is a flowchart showing display adjustment processing according to the third embodiment.
- FIG. 9 is a flowchart showing display adjustment processing according to the fourth embodiment.
- FIG. 10 is a diagram showing scrolling of the display image in the display screen area.
- FIG. 11 is a flowchart showing the first half of the display processing of the fifth embodiment.
- FIG. 12 is a flowchart showing the latter half of the display process of the fifth embodiment.
- FIG. 13 is a block diagram illustrating a schematic configuration of the driving support device.
- FIG. 14 is a diagram illustrating a situation when the front of the vehicle is viewed from the passenger compartment.
- FIG. 15A is a flowchart showing the first half of the display processing of the sixth embodiment.
- FIG. 15B is a flowchart showing the first half of the display processing of the sixth embodiment.
- FIG. 16 is a flowchart showing the latter half of the display processing of the sixth embodiment.
- FIG. 17A is a flowchart showing the first half of the display processing of the seventh embodiment.
- FIG. 17B is a flowchart showing the first half of the display processing of the seventh embodiment.
- a driving support device 1 is mounted on a vehicle and, as shown in FIG. 1, comprises an IR-LED 2, cameras 3 and 5, image capture boards 4 and 6, a head-up display device (hereinafter referred to as a HUD device) 7, and a control unit 8.
- the IR-LED 2 irradiates near infrared light toward the face of a person sitting in the driver's seat of the vehicle (hereinafter referred to as the driver).
- the camera 3 is a near-infrared camera and continuously captures the driver's face.
- the image data acquired by photographing by the camera 3 is referred to as face image data.
- the image capture board 4 temporarily stores the face image data acquired by the camera 3.
- the camera 5 continuously captures the scenery in front of the host vehicle (hereinafter also referred to as the foreground) that the driver can visually recognize through the windshield.
- the image data acquired by photographing by the camera 5 is referred to as foreground image data.
- the image capture board 6 temporarily stores the foreground image data acquired by the camera 5.
- the HUD device 7 projects display light for displaying an image onto the windshield from below. As a result, the driver visually recognizes the projected virtual image superimposed on the actual scenery in front of the vehicle.
- the HUD device 7 displays an image on a rectangular display screen region DR provided below the windshield (see FIG. 2).
- the control unit 8 executes various processes in accordance with inputs from the image capture boards 4 and 6 and controls the HUD device 7 using images taken by the cameras 3 and 5. Note that a detection signal from a vehicle speed sensor 9 that detects the traveling speed of the vehicle on which the driving support device 1 is mounted is input to the control unit 8.
- the control unit 8 executes display processing for displaying an image on the display screen region DR provided on the windshield.
- This display process is a process that is repeatedly executed during the operation of the driving support device 1.
- in the display process, the control unit 8 first acquires in S10, from the image capture board 4, the face image data that was captured by the camera 3 and stored in the image capture board 4 but has not yet been acquired by the control unit 8. In S20, the control unit 8 likewise acquires from the image capture board 6 the foreground image data that was captured by the camera 5 and stored in the image capture board 6 but has not yet been acquired.
- the face direction and the line-of-sight direction of the driver are detected using the face image data acquired from the image capture board 4.
- the driver's face direction is detected by fitting a face shape model to the face image data acquired from the image capture board 4.
- the face shape model expresses a face using a basic shape, in which the frontal face is represented by a plurality of triangular meshes, and n shape vectors (n is a natural number) indicating the deviation of the face direction from the basic shape.
- the driver's gaze direction is detected by extracting the driver's eyes from the face image data fitted using the face shape model and performing image recognition processing (for example, pattern matching) on the extracted eye regions.
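The fitting step in S30 can be illustrated with a minimal sketch of the face shape model described above: an observed set of mesh points is modelled as the basic shape plus a weighted sum of the n shape vectors, and the recovered weights indicate the face orientation. For simplicity this sketch assumes the shape vectors are mutually orthogonal, so each weight is a direct projection; the function names and the flattened-coordinate representation are illustrative, not from the patent.

```python
def dot(a, b):
    """Inner product of two equal-length coordinate lists."""
    return sum(x * y for x, y in zip(a, b))

def fit_face_direction(observed, basic, shape_vecs):
    """Recover the weights a_i in: observed ~ basic + sum_i a_i * v_i.

    observed, basic: flattened mesh-point coordinates of the face.
    shape_vecs: the n shape vectors (assumed orthogonal here).
    The returned weights characterize the face direction relative to the
    frontal basic shape.
    """
    d = [o - b for o, b in zip(observed, basic)]  # deviation from frontal face
    return [dot(d, v) / dot(v, v) for v in shape_vecs]
```

For a face synthesized as basic + 2·v1 + 3·v2, the fit recovers the weights [2.0, 3.0].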
- in S40, based on the detection result in S30, it is determined whether or not the driver's line of sight is directed to the display screen region DR. If it is (S40: YES), the display instruction flag F1 is set in S50, and the process proceeds to S110. If it is not (S40: NO), the display instruction flag F1 is cleared in S60.
- in S70, based on the detection result in S30, it is determined whether or not the driver's line of sight is directed outside the vehicle. If it is not (S70: NO), the gaze flag F2 is cleared in S80, and the process proceeds to S110.
- in S90, the gaze area GR being watched by the driver is set within the foreground image data. Specifically, a gaze point GP at which the driver is gazing in the foreground image data is determined based on the driver's gaze direction, and a rectangular area of a preset size centered on the gaze point GP is set as the gaze area GR.
- then the gaze flag F2 is set in S100, and the process proceeds to S110.
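The gaze point and gaze area computation of S90 can be sketched as follows. The projection model is an assumption for illustration (a pinhole camera aligned with the driver's straight-ahead view); the patent only specifies that GP is derived from the gaze direction and that GR is a preset-size rectangle centered on GP, clipped here so it stays inside the foreground image.

```python
import math

def gaze_point(yaw_deg, pitch_deg, focal_px, cx, cy):
    """Project a gaze direction (yaw/pitch in degrees) onto the foreground
    image plane of an assumed pinhole camera with principal point (cx, cy)."""
    u = cx + focal_px * math.tan(math.radians(yaw_deg))
    v = cy - focal_px * math.tan(math.radians(pitch_deg))
    return int(round(u)), int(round(v))

def gaze_region(gp, size, frame):
    """Preset-size rectangle centered on the gaze point GP, shifted as needed
    so the whole rectangle lies inside the foreground image."""
    (gx, gy), (w, h), (fw, fh) = gp, size, frame
    x = min(max(gx - w // 2, 0), fw - w)
    y = min(max(gy - h // 2, 0), fh - h)
    return x, y, w, h
```

A straight-ahead gaze maps to the image center, and a gaze point near the image corner yields a gaze area clamped to the image boundary.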
- then, until a preset moving image display time (for example, 5 seconds) elapses, the foreground image data in the gaze area GR is displayed as a moving image by the HUD device 7, and the display process is temporarily terminated.
- in S140, until a preset still image display time (for example, 5 seconds) elapses from the time when the driver's line of sight was directed to the display screen region DR, the foreground image data in the gaze region GR at that time is displayed as a still image by the HUD device 7, and the display process is temporarily terminated.
- in the driving support device 1 configured as described above, the face of the driver of the vehicle is first continuously photographed to acquire face image data (S10), and the scenery in front of the vehicle is continuously photographed to acquire foreground image data (S20). The driver's gaze direction is then detected using the acquired face image data (S30), and the gaze area GR being watched by the driver is set based on the detected gaze direction (S90). The HUD device 7 displays the image data in the gaze area GR, out of the acquired foreground image data, on the display screen region DR arranged at a position visible to the driver.
- thus, by directing the line of sight toward the display screen region DR, the driver causes the portion of the scenery in front of the vehicle corresponding to the gaze area GR to be displayed in the display screen region DR arranged at a position visible to the driver. That is, in order to display the gaze area GR in the display screen area DR, the driver must not only gaze at a certain area in front of the vehicle but also, after gazing, direct the line of sight toward the display screen region DR. For this reason, the gaze area GR is not displayed merely because the driver gazes at the scenery in front of the vehicle, and it can be displayed appropriately when the driver desires.
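The flag logic of S40 through S110 can be condensed into a small step function. This is a sketch of the control flow stated above, with the gaze direction reduced to a zone label ('DR', 'outside', or 'other' for elsewhere in the cabin); note that when the gaze is on DR, the process skips to S110, so F2 is retained from the previous cycle.

```python
def display_step(zone, state):
    """One cycle of the S40-S110 flag logic.

    state holds the display instruction flag F1 and the gaze flag F2.
    Returns True when display of the gaze area GR should start
    (driver gazed outside, then looked at the display screen region DR).
    """
    if zone == 'DR':
        state['F1'] = True            # S50; skip to S110, F2 retained
    else:
        state['F1'] = False           # S60
        state['F2'] = (zone == 'outside')  # S100 (GR set in S90) or S80
    return state['F1'] and state['F2']     # S110 decision
```

Gazing outside arms F2 without starting the display; a subsequent glance at DR then triggers it, while looking elsewhere in the cabin clears both flags.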
- the process of S10 corresponds to the face image photographing means and the face image photographing procedure,
- the process of S30 corresponds to the gaze direction detecting means and the gaze direction detecting procedure,
- the process of S20 corresponds to the landscape image photographing means and the landscape image photographing procedure,
- the process of S90 corresponds to the gaze area setting means and the gaze area setting procedure,
- the HUD device 7 corresponds to the display means,
- the process of S40 corresponds to the gaze movement determination means and the gaze movement determination procedure,
- the processes of S50 and S110 correspond to the display start means and the display start procedure, and
- the process of S120 corresponds to the wink determination means.
- the driving support device 1 of the second embodiment is the same as that of the first embodiment except that the display process is changed and the display adjustment process is added.
- the display process of the second embodiment is the same as that of the first embodiment except that the processes of S120 to S140 are omitted and the processes of S210 to S240 are added.
- a moving object existing in front of the host vehicle is detected in S210. Specifically, an optical flow is obtained by image processing using the foreground image data, an area that moves independently of the host vehicle is extracted from the foreground image data, and the extracted area is detected as a moving object.
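The idea of S210 can be illustrated with a deliberately simple flow estimator. This sketch uses exhaustive block matching as a stand-in for the optical flow computation (a real system would use a dense pyramidal flow method); blocks whose motion disagrees with the estimated ego motion are treated as parts of an independently moving object. All names and thresholds are illustrative.

```python
def _cost(prev, curr, by, bx, y, x, block):
    """Sum of absolute differences between a block in prev and a candidate in curr."""
    return sum(abs(curr[y + i][x + j] - prev[by + i][bx + j])
               for i in range(block) for j in range(block))

def block_flow(prev, curr, block=8, search=4):
    """Per-block motion vectors between two grayscale frames (lists of rows),
    found by exhaustive search within +/- search pixels."""
    h, w = len(prev), len(prev[0])
    flows = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            # prefer zero displacement unless a strictly better match exists
            best, best_v = _cost(prev, curr, by, bx, by, bx, block), (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if (dy, dx) == (0, 0) or y < 0 or x < 0 \
                            or y + block > h or x + block > w:
                        continue
                    c = _cost(prev, curr, by, bx, y, x, block)
                    if c < best:
                        best, best_v = c, (dy, dx)
            flows[(by, bx)] = best_v
    return flows

def moving_blocks(flows, ego=(0, 0), thresh=1):
    """Blocks whose flow deviates from the ego-motion estimate by more than
    thresh (L1 distance) are flagged as belonging to a moving object."""
    return [pos for pos, (dy, dx) in flows.items()
            if abs(dy - ego[0]) + abs(dx - ego[1]) > thresh]
```

With a bright square shifted 3 pixels to the right between frames, the block containing it reports a (0, 3) flow and is flagged as moving, while static background blocks are not.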
- S220 based on the detection result in S210, it is determined whether or not there is a moving object in the gaze region GR.
- a moving image display time set in advance for example, from the time when the driver's line of sight is directed to the display screen area DR) 5 seconds
- the foreground image data in the display area set in the display adjustment process described later is displayed as a moving image on the HUD device 7 to display the moving object as a moving image, and the display process is temporarily terminated.
- in S240, until a preset still image display time (for example, 5 seconds) elapses from the time when the driver's line of sight is directed to the display screen region DR, the foreground image data in the gaze region GR at the time when the driver's line of sight was directed to the display screen region DR is displayed on the HUD device 7 as a still image, and the display process is temporarily terminated.
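The S220–S240 branching (moving object present → moving-image display; otherwise → still image, each for its preset duration) can be sketched with a small timer-driven class. The class and its injectable clock are our illustration; the 5-second values are the example values from the text.

```python
import time

VIDEO_SECONDS = 5.0  # "moving image display time" (example value from the text)
STILL_SECONDS = 5.0  # "still image display time" (example value from the text)

class DrDisplay:
    """Sketch of S220-S240: when the driver's gaze reaches DR, start either
    a moving-image view or a frozen still, and stop after the preset time."""
    def __init__(self, now=time.monotonic):
        self.now = now
        self.mode = None      # "video" | "still" | None
        self.started = None

    def on_gaze_enters_dr(self, has_moving_object: bool):
        # S220: a moving object in GR selects moving-image display.
        self.mode = "video" if has_moving_object else "still"
        self.started = self.now()

    def current_mode(self):
        if self.mode is None:
            return None
        limit = VIDEO_SECONDS if self.mode == "video" else STILL_SECONDS
        if self.now() - self.started >= limit:
            self.mode = None  # display process ends
        return self.mode
```

Injecting the clock (`now=`) makes the timeout behavior testable without real waiting.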
- the display adjustment process is a process repeatedly executed by the control unit 8 during the operation of the driving support device 1.
- the control unit 8 first determines in S260, based on the detection result in S210, whether or not a moving object exists in the gaze region GR.
- the display adjustment process is temporarily ended.
- the display area is set in S270 so that the detected moving object is positioned at the center of the display screen region DR, and the display adjustment process is temporarily terminated.
- in the driving support device 1 configured as described above, it is determined whether or not a moving object exists in the gaze region GR using the acquired foreground image data (S220), and when it is determined that a moving object exists in the gaze region GR (S220: YES), the image data is displayed as a moving image in the display screen region DR (S230). That is, when a moving object exists in the gaze region GR, the display is automatically performed in a form suitable for displaying the moving object (that is, moving image display) without an instruction from the driver. For this reason, the driver can quickly and easily confirm the moving object existing in front of the vehicle by looking at the display screen region DR.
- the display area is set so that the moving object is positioned at the center of the display screen area DR (S270).
- this prevents the moving object from disappearing from the display screen region DR when the moving object moves outside the gaze region GR, and the moving object is displayed at a position suitable for displaying it (that is, the center of the display screen region DR). For this reason, the driver can quickly and easily confirm the moving object existing in front of the vehicle by looking at the display screen region DR.
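Setting the display area so that a detected object lands at the center of the display screen region DR (the S270 idea) amounts to choosing a crop of the foreground image centered on the object, clamped to the frame. The function below is a minimal sketch under that interpretation; names and pixel coordinates are ours.

```python
import numpy as np

def center_crop_on(frame, obj_xy, out_w, out_h):
    """Choose the display area in the foreground image so that the detected
    object lands at the center of DR. `obj_xy` is the object's pixel
    position; the crop is clamped to stay inside the frame."""
    h, w = frame.shape[:2]
    cx, cy = obj_xy
    x0 = int(min(max(cx - out_w // 2, 0), w - out_w))
    y0 = int(min(max(cy - out_h // 2, 0), h - out_h))
    return frame[y0:y0 + out_h, x0:x0 + out_w]
```

The same recentering idea serves the saliency-area case of the third embodiment.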
- the processes of S220 and S260 are moving object determination means, and the process of S270 is a first display area setting means.
- the driving support device 1 of the third embodiment is the same as that of the first embodiment except that the display process is changed and the display adjustment process is added.
- the display processing of the third embodiment is the same as that of the first embodiment except that the processing of S120 to S140 is omitted and the processing of S310 to S340 is added as shown in FIG.
- in S110, when the display instruction flag F1 and the gaze flag F2 are not set (S110: NO), the display process is temporarily ended.
- when the display instruction flag F1 and the gaze flag F2 are set (S110: YES), a saliency map (for example, J. Harel, C. Koch, and P. Perona, "Graph-Based Visual Saliency", NIPS 2006) is used in S310 to detect a salient region within the gaze region GR.
- here, an area whose luminance differs significantly from that of the other areas in the foreground image data is defined as a "saliency area". For example, when a road sign is present in the foreground while the vehicle is traveling, the road sign corresponds to a saliency area.
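The working definition above (an area whose luminance differs markedly from its surroundings) can be sketched with a simple statistical test. Note this toy stand-in is not Harel et al.'s graph-based saliency algorithm cited in the text; the threshold `k` and `min_pixels` are our assumptions.

```python
import numpy as np

def luminance_saliency_mask(gray, k=2.0):
    """Flag pixels whose luminance deviates from the image mean by more than
    k standard deviations -- a crude proxy for a saliency map."""
    mu, sigma = gray.mean(), gray.std()
    return np.abs(gray - mu) > k * sigma

def has_salient_region(gray, min_pixels=4, k=2.0):
    """Sketch of S310/S320: does the gaze region contain a salient area?"""
    return int(luminance_saliency_mask(gray, k).sum()) >= min_pixels
```

A bright road sign against a uniform road surface would be flagged; a uniform scene yields no salient area.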
- in S320, based on the detection result in S310, it is determined whether or not a saliency area exists in the gaze region GR.
- until a preset moving image display time elapses from the time when the driver's line of sight is directed to the display screen region DR, the foreground image data in the display area set in the display adjustment process described later is displayed as a moving image on the HUD device 7, thereby displaying the saliency area as a moving image, and the display process is temporarily terminated.
- until a preset still image display time (for example, 5 seconds) elapses from the time when the driver's line of sight is directed to the display screen region DR, the foreground image data in the gaze region GR at the time when the driver's line of sight was directed to the display screen region DR is displayed on the HUD device 7 as a still image, and the display process is temporarily terminated.
- the display adjustment process is a process repeatedly executed by the control unit 8 during the operation of the driving support device 1.
- the control unit 8 first determines in S360, based on the detection result in S310, whether or not a saliency area exists in the gaze region GR.
- the display adjustment process is temporarily ended.
- in S370, the display area is set so that the detected saliency area is positioned at the center of the display screen region DR, and the display adjustment process is temporarily terminated.
- in the driving support device 1 configured as described above, it is determined whether or not a saliency area exists in the gaze region GR using the acquired foreground image data (S360), and when it is determined that a saliency area exists in the gaze region GR (S360: YES), the display area is set so that the saliency area is positioned at the center of the display screen region DR (S370).
- as a result, the saliency area is displayed at the center of the display screen region DR when the driver looks at the display screen region DR.
- even when the gazing point GP determined by the driving support device 1 differs from the point at which the driver is actually gazing because the accuracy of line-of-sight direction detection is poor, despite the driver visually recognizing a salient region in the foreground with the central visual field while driving the vehicle, the saliency area is displayed at the center of the display screen region DR. As a result, even when the saliency area exists at a position away from the gazing point GP, the saliency area is displayed at a position suitable for displaying it (that is, the center of the display screen region DR). Therefore, the driver can quickly and easily confirm the saliency area in front of the vehicle by looking at the display screen region DR.
- the process of S360 is a saliency area determination unit
- the process of S370 is a second display area setting unit.
- the driving support device 1 of the fourth embodiment is the same as that of the first embodiment except that display adjustment processing is added.
- This display adjustment process is a process repeatedly executed by the control unit 8 during the operation of the driving support device 1.
- the control unit 8 first determines in S410 whether or not the HUD device 7 is displaying a still image in the display screen area DR, as shown in FIG. If a still image is not being displayed (S410: NO), the display adjustment process is temporarily terminated.
- in S430, as shown in FIG., the display area in the foreground image data is set so that the gazing point GP at which the driver is gazing at the end of the display screen region DR is positioned at the center of the display screen region DR (see arrow AL1), and the display adjustment process is temporarily ended.
- the still image displayed in the display screen region DR is scrolled so that the location where the driver is gazing at the end of the display screen region DR becomes the center DC of the display screen region DR.
- in the driving support device 1 configured as described above, it is determined whether the driver is gazing at the end of the display screen region DR based on the detected line-of-sight direction (S420), and when it is determined that the driver is gazing at the end of the display screen region DR (S420: YES), the display area is set so that the gazing point GP at which the driver is gazing at the end of the display screen region DR is positioned at the center of the display screen region DR (S430).
- for this reason, the area (display area) of the foreground image data displayed in the display screen region DR can be easily changed without operating operation members such as buttons and switches.
- the process of S420 is a gaze determination unit
- the process of S430 is a third display area setting unit.
- the driving support device 1 of the fifth embodiment is the same as that of the first embodiment except that the display process is changed.
- the display process of the fifth embodiment is the same as that of the first embodiment except that the processes of S110 to S140 are omitted and the processes of S510 to S520 and S600 to S720 are added.
- the image acquisition counter C1 is incremented (added by 1) in S510.
- the value of the image acquisition counter C1 is initialized and set to 0 when the operation of the driving support device 1 is started.
- an image in the gaze region GR set in S90 is extracted from the foreground image data captured when the driver's line of sight is directed outside the vehicle based on the determination result in S70.
- Data of the extracted image (hereinafter referred to as a gaze area image) is stored in association with the value of the image acquisition counter C1 (hereinafter also referred to as an image acquisition counter value), and the process proceeds to S100.
- the process proceeds to S600, and it is determined whether or not the display instruction flag F1 is set and the image acquisition counter value is greater than 1. If the display instruction flag F1 is cleared or the image acquisition counter value is 1 or less (S600: NO), the display process is temporarily ended. On the other hand, when the display instruction flag F1 is set and the image acquisition counter value is greater than 1 (S600: YES), it is determined in S610, based on the detection result of the vehicle speed sensor 9, whether the vehicle on which the driving support device 1 is mounted (hereinafter referred to as the host vehicle) is stopped.
- in S620, the value of the display counter C2 (hereinafter also referred to as the display counter value) is set to the image acquisition counter value.
- in S630, among the gaze area images saved by the process of S520, the data of the gaze area image associated with the image acquisition counter value that matches the display counter value is displayed on the HUD device 7 as a still image, and the display process is temporarily terminated. By the processes of S620 and S630, the latest of the stored gaze area images is displayed in the display screen region DR.
- in S640, based on the detection result in S30, it is determined whether the angle difference between the driver's face direction and line-of-sight direction is equal to or smaller than a preset display determination angle (for example, 10° in the present embodiment).
- if the angle difference is equal to or smaller than the display determination angle (S640: YES), the process proceeds to S630. Thereby, the gaze area image associated with the image acquisition counter value that matches the display counter value is displayed in the display screen region DR.
- if the angle difference between the face direction and the line-of-sight direction is larger than the display determination angle (S640: NO), it is determined in S660 whether or not the face direction is directed to the left of the line-of-sight direction. If the face direction is on the left side of the line-of-sight direction (S660: YES), it is determined in S670 whether the display counter value is greater than 1.
- if the display counter value is greater than 1 (S670: YES), the display counter C2 is decremented (subtracted by 1) in S680, and the process proceeds to S700.
- if the display counter value is 1 (S670: NO), the value of the display counter C2 is set to the image acquisition counter value in S690, and the process proceeds to S700.
- when the process proceeds to S700, the gaze area image data associated with the image acquisition counter value that matches the display counter value is displayed on the HUD device 7 as a still image, and the display process is temporarily terminated.
- on the other hand, if the face direction is on the right side of the line-of-sight direction (S660: NO), it is determined in S710 whether the display counter value is equal to the image acquisition counter value. If the display counter value is equal to the image acquisition counter value (S710: YES), the process proceeds to S730. On the other hand, when the display counter value is not equal to the image acquisition counter value (S710: NO), the display counter C2 is incremented (added by 1) in S720, and the process proceeds to S730. When the process proceeds to S730, the gaze area image data associated with the image acquisition counter value that matches the display counter value is displayed on the HUD device 7 as a still image, and the display process is temporarily terminated.
- in the driving support device 1 configured as described above, when the driver turns his/her line of sight toward the display screen region DR while the vehicle is stopped (S600: YES, S610: YES), first, the latest of the stored gaze area images is displayed in the display screen region DR (S620, S630). Then, when the driver turns his/her face to the left side while directing the line of sight toward the display screen region DR (S640: NO, S660: YES), the gaze area image acquired immediately before the currently displayed gaze area image is displayed in the display screen region DR (S680, S700). Conversely, when the driver turns his/her face to the right side (S660: NO), the gaze area image acquired immediately after the currently displayed gaze area image is displayed in the display screen region DR (S720, S730).
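The counter-driven browsing of S600–S730 can be sketched as a small class: stored gaze-region images are stepped through with the face direction while the eyes stay on DR, with the oldest image wrapping back to the newest as in S690. The class and method names are ours; the 1-based counters follow the text.

```python
class GazeImageBrowser:
    """Sketch of S510/S520 storage and S600-S730 browsing."""
    def __init__(self):
        self.images = []  # gaze area images, oldest first
        self.acq = 0      # image acquisition counter C1
        self.disp = 0     # display counter C2

    def store(self, img):          # S510/S520: save image with counter value
        self.images.append(img)
        self.acq = len(self.images)

    def start_display(self):       # S620: begin with the newest stored image
        self.disp = self.acq

    def shown(self):               # S630/S700/S730: image matching C2
        return self.images[self.disp - 1]

    def face_left(self):           # S660: YES
        if self.disp > 1:          # S670: YES -> previous image (S680)
            self.disp -= 1
        else:                      # S670: NO -> wrap to the newest (S690)
            self.disp = self.acq

    def face_right(self):          # S660: NO
        if self.disp != self.acq:  # S710: NO -> next image (S720)
            self.disp += 1
```

Turning the face right at the newest image leaves the display unchanged, matching the S710: YES branch.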
- that is, an image in the gaze region GR is extracted from the acquired foreground image data, the extracted gaze area image data is stored in time series (S510, S520), and the stored gaze area image data is displayed in the display screen region DR (S630, S700, S730). Further, the driver's face direction is detected using the acquired face image data (S30), and the gaze area image data to be displayed is designated based on the detected face direction (S660 to S690, S710 to S720).
- for this reason, the gaze area image displayed in the display screen region DR can be easily changed by the face direction, without operating operation members such as buttons and switches.
- the processing of S510 and S520 is the image storage means
- the processing of S630, S700 and S730 is the storage image display means
- the processing of S30 is the face direction detection means
- the processing of S660 to S690 and S710 to S720 is the display instruction means.
- the driving support apparatus 101 is mounted on a vehicle and connected to a navigation ECU 103, an air conditioner ECU 104, and a meter ECU 105 via an in-vehicle LAN 102.
- the navigation ECU 103 is configured to detect the current position of the vehicle based on a GPS signal or the like received via a GPS (Global Positioning System) antenna (not shown). Then, the navigation ECU 103 performs control for displaying the current location of the vehicle on the display screen 131 (see FIG. 14) of the navigation device arranged in the instrument panel unit 130 (see FIG. 14), and from the current location to the destination. The control for guiding the route is performed.
- the navigation ECU 103 is configured to execute control of in-vehicle audio (not shown) based on a signal from an audio operation switch group 132 (see FIG. 14) that is installed in the instrument panel unit 130 and operated by the driver.
- the audio operation switch group 132 includes switches operated when switching the device to be operated from among CD, TV, and radio, switches operated when selecting a channel or song after the device to be operated is switched, and a volume setting switch operated when adjusting the volume.
- the air conditioner ECU 104 operates a car air conditioner (not shown) based on a signal from an air conditioner operation switch group 133 (see FIG. 14), which is installed in the instrument panel unit 130 and operated by the driver, and on a detection signal from a vehicle interior/exterior temperature sensor (not shown) that detects the temperatures inside and outside the vehicle, thereby controlling the air conditioning in the passenger compartment.
- the air conditioner operation switch group 133 includes an on/off switch operated when switching the air conditioner on and off, a temperature setting switch operated when adjusting the set temperature, an air volume setting switch operated when adjusting the air volume, and an outlet position setting switch operated when adjusting the outlet position.
- the meter ECU 105 is installed in front of the driver's seat and controls the meter unit 140 (see FIG. 14) that displays various states of the vehicle.
- the meter unit 140 displays the vehicle speed, the shift range of an automatic transmission (not shown), the engine rotation speed, the remaining amount of gasoline, the status of auto cruise control (ACC), the status of lane keep assist (LKA), and the like.
- the driving support device 101 includes an IR-LED 2, a camera 3, an image capture board 4, a HUD device 7, a right side camera 111, a left side camera 112, a rear camera 113, image capture boards 114, 115, and 116, an in-vehicle LAN communication unit 117, and a control unit 118.
- the IR-LED 2, the camera 3, the image capture board 4, and the HUD device 7 are the same as those in the first embodiment, description thereof is omitted.
- the right side camera 111 is installed at the right end of the vehicle (in this embodiment, on the side mirror 151 on the right side of the vehicle (see FIG. 14)), and acquires image data by continuously photographing the situation on the right side of the vehicle.
- the left side camera 112 is installed at the left end of the vehicle (in this embodiment, on the side mirror 152 on the left side of the vehicle (see FIG. 14)), and acquires image data by continuously photographing the situation on the left side of the vehicle.
- the rear camera 113 is installed at the rear end of the vehicle (in this embodiment, the rear bumper), and acquires image data by continuously photographing the situation behind the vehicle.
- the image capture boards 114, 115, and 116 temporarily store image data acquired by the cameras 111, 112, and 113, respectively.
- the in-vehicle LAN communication unit 117 communicates with various devices (such as the navigation ECU 103) connected to the in-vehicle LAN 102 via the in-vehicle LAN 102.
- the control unit 118 is configured around a well-known microcomputer including a CPU, ROM, RAM, I/O, and a bus line connecting these components, and executes various processes based on programs stored in the ROM.
- the control unit 118 executes display processing for displaying an image on the display screen region DR provided on the windshield.
- This display process is a process repeatedly executed during the operation of the driving support apparatus 101.
- as shown in FIGS. 15A, 15B, and 16, the control unit 118 first acquires from the image capture board 4, in S810, as in S10, the face image data that was photographed by the camera 3 and stored in the image capture board 4 and that has not yet been acquired by the control unit 118.
- in S820, the driver's face direction and line-of-sight direction are detected using the face image data acquired from the image capture board 4, in the same manner as in S30.
- in S830, it is determined whether or not the driver's line of sight is directed to the display screen region DR based on the detection result in S820. If the driver's line of sight is directed to the display screen region DR (S830: YES), the display instruction flag F1 is set in S840, and the process proceeds to S1070. On the other hand, when the driver's line of sight is not directed to the display screen region DR (S830: NO), the display instruction flag F1 is cleared in S850.
- if the driver's line of sight is directed to the display screen 131 of the navigation device (S860: YES), the navigation gaze flag F11 is set in S870, the other gaze flags F12, F13, F14, F15, F16, and F17 are cleared in S880, and the process proceeds to S1070.
- Other gaze flags are a meter gaze flag F12, an audio gaze flag F13, an air conditioner gaze flag F14, a right side gaze flag F15, a left side gaze flag F16, and a rear gaze flag F17, as will be described later.
- if the line of sight is not directed to the display screen 131 of the navigation device (S860: NO), it is determined in S890, based on the detection result in S820, whether or not the driver's line of sight is directed to the meter unit 140. If the line of sight is directed to the meter unit 140 (S890: YES), the meter gaze flag F12 is set in S900, the gaze flags other than the meter gaze flag F12 are cleared in S910, and the process proceeds to S1070.
- it is determined in S950, based on the detection result in S820, whether or not the driver's line of sight is directed to the air conditioner operation switch group 133. If the line of sight is directed to the air conditioner operation switch group 133 (S950: YES), the air conditioner gaze flag F14 is set in S960, the gaze flags other than the air conditioner gaze flag F14 are cleared in S970, and the process proceeds to S1070.
- it is determined in S980, based on the detection result in S820, whether or not the driver's line of sight is directed to the side mirror 151 on the right side of the vehicle. If the line of sight is directed to the side mirror 151 (S980: YES), the right side gaze flag F15 is set in S990, the gaze flags other than the right side gaze flag F15 are cleared in S1000, and the process proceeds to S1070.
- it is determined in S1010, based on the detection result in S820, whether or not the driver's line of sight is directed to the side mirror 152 on the left side of the vehicle. If the line of sight is directed to the side mirror 152 (S1010: YES), the left side gaze flag F16 is set in S1020, the gaze flags other than the left side gaze flag F16 are cleared in S1030, and the process proceeds to S1070.
- it is determined in S1040, based on the detection result in S820, whether or not the driver's line of sight is directed to the rear view mirror 153 (see FIG. 14). If the line of sight is directed to the rear view mirror 153 (S1040: YES), the rear gaze flag F17 is set in S1050, the gaze flags other than the rear gaze flag F17 are cleared in S1060, and the process proceeds to S1070.
- in S1070, it is determined whether or not the display instruction flag F1 is set and any one of the gaze flags F11, F12, F13, F14, F15, F16, and F17 is set. If at least one of the display instruction flag F1 and the gaze flags is not set (S1070: NO), the display process is temporarily ended. On the other hand, if the display instruction flag F1 is set and any one of the gaze flags is set (S1070: YES), it is determined in S1080 whether the navigation gaze flag F11 is set.
- if the navigation gaze flag F11 is set (S1080: YES), the navigation screen display process is executed in S1090, and the process proceeds to S1210 after the navigation screen display process ends.
- specifically, in the navigation screen display process, image data indicating the image displayed on the display screen 131 of the navigation device is acquired from the navigation ECU 103 via the in-vehicle LAN 102, and the acquired image is displayed on the HUD device 7 until a preset image display time (for example, 3 seconds) elapses from the time when the driver's line of sight is directed to the display screen region DR.
- if the navigation gaze flag F11 is not set (S1080: NO), it is determined in S1100 whether the meter gaze flag F12 is set, and if it is set (S1100: YES), the meter unit display process is executed in S1110.
- specifically, in the meter unit display process, the information displayed in the specified area of the meter unit 140 is acquired from the meter ECU 105 via the in-vehicle LAN 102, and the acquired information is displayed on the HUD device 7 until a preset image display time (for example, 3 seconds) elapses from the time when the driver's line of sight is directed to the display screen region DR.
- for example, the meter unit display process acquires vehicle speed information indicating the vehicle speed from the meter ECU 105, and an image indicating the acquired vehicle speed information is displayed on the HUD device 7.
- if the audio gaze flag F13 is set (S1120: YES), the audio operation switch display process is executed in S1130, and the process proceeds to S1210 after the audio operation switch display process is completed. Specifically, in the audio operation switch display process, first, the switch in the audio operation switch group 132 to which the driver's line of sight is directed is identified based on the detection result in S820.
- then, setting status information indicating the current status set by the operation of the identified switch is acquired from the navigation ECU 103 via the in-vehicle LAN 102, and an image indicating the acquired setting status information is displayed on the HUD device 7 until a preset image display time (for example, 3 seconds) elapses from the time when the driver's line of sight is directed to the display screen region DR.
- for example, volume information indicating the current volume is acquired from the meter ECU 105, and an image indicating the acquired volume information is displayed on the HUD device 7.
- further, while the setting status information of the switch identified in the audio operation switch display process is being displayed by the HUD device 7, the navigation ECU 103 changes the setting status of the identified switch based on the operation of the steering switches 161 and 162 (see FIG. 14) provided on the steering wheel ST (see FIG. 14). For example, when the driver's line of sight faces the volume setting switch, an image indicating the volume is displayed in the display screen region DR. When the driver operates the steering switch 161, the volume output from the audio is increased, and an image in which the volume increases is displayed in the display screen region DR. Conversely, when the driver operates the steering switch 162, the volume output from the audio decreases, and an image in which the volume gradually decreases is displayed in the display screen region DR.
- if the audio gaze flag F13 is not set (S1120: NO), it is determined in S1140 whether the air conditioner gaze flag F14 is set, and if it is set (S1140: YES), the air conditioner operation switch display process is executed in S1150.
- specifically, setting status information indicating the current status set by operating the identified switch is acquired from the air conditioner ECU 104 via the in-vehicle LAN 102, and an image indicating the acquired setting status information is displayed on the HUD device 7 until a preset image display time (for example, 3 seconds) elapses from the time when the driver's line of sight is directed to the display screen region DR.
- for example, the air conditioner operation switch display process acquires, from the air conditioner ECU 104, on/off information indicating whether the air conditioner is operating or stopped, and an image indicating the acquired on/off information is displayed on the HUD device 7.
- further, while the setting status information of the switch identified in the air conditioner operation switch display process is being displayed by the HUD device 7, the air conditioner ECU 104 changes the setting status of the identified switch based on the operation of the steering switches 161 and 162 provided on the steering wheel ST. For example, when the driver's line of sight faces the on/off switch, an image indicating on/off information is displayed in the display screen region DR. When the driver operates the steering switch 161, the air conditioner is switched between operating and stopped, and an image indicating whether the air conditioner is operating or stopped is displayed in the display screen region DR.
- if the air conditioner gaze flag F14 is not set (S1140: NO), it is determined in S1160 whether the right side gaze flag F15 is set. If it is set (S1160: YES), the process of S1170 is executed.
- image data captured by the right camera 111 and stored in the image capture board 114 is acquired from the image capture board 114.
- then, the image data acquired from the image capture board 114 is displayed on the HUD device 7 until a preset image display time (for example, 3 seconds) elapses from the time when the driver's line of sight is directed to the display screen region DR.
- if the right side gaze flag F15 is not set (S1160: NO), it is determined in S1180 whether the left side gaze flag F16 is set. If it is set (S1180: YES), the image data captured by the left side camera 112 and stored in the image capture board 115 is acquired from the image capture board 115.
- the image data acquired from the image capture board 115 is displayed on the HUD device 7 until a preset image display time (for example, 3 seconds) elapses from the time when the driver's line of sight is directed to the display screen region DR.
- the rear view mirror display process is executed in S1200, and the process proceeds to S1210 after the rear view mirror display process is completed.
- image data captured by the rear camera 113 and stored in the image capture board 116 is acquired from the image capture board 116.
- the image data acquired from the image capture board 116 is displayed on the HUD device 7 until a preset image display time (for example, 3 seconds) elapses from the time when the driver's line of sight is directed to the display screen region DR.
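The S1070–S1200 dispatch above (one display process per gaze flag) can be sketched as a lookup. The handler names are ours for illustration; the flag names and step numbers follow the text.

```python
def select_display_process(flags: dict):
    """Return the display process name for the single set gaze flag, or None
    when the display instruction flag F1 or every gaze flag is clear."""
    order = [
        ("F11", "navigation_screen_display"),        # S1090
        ("F12", "meter_unit_display"),               # S1110
        ("F13", "audio_operation_switch_display"),   # S1130
        ("F14", "air_conditioner_switch_display"),   # S1150
        ("F15", "right_side_camera_display"),        # S1170
        ("F16", "left_side_camera_display"),         # S1190
        ("F17", "rear_view_mirror_display"),         # S1200
    ]
    if not flags.get("F1"):
        return None  # S1070: NO -- no gaze shift into DR
    for flag, process in order:
        if flags.get(flag):
            return process
    return None
```

Because S880, S910, S970, etc. clear all other gaze flags, at most one flag is set at a time, so a first-match scan suffices.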
- in the driving support device 101 configured as described above, the face of the driver of the vehicle is continuously photographed to acquire face image data (S810), the driver's line-of-sight direction is then detected using the acquired face image data (S820), and based on the detected line-of-sight direction, it is determined which of the display screen 131 of the navigation device, the meter unit 140, the audio operation switch group 132, the air conditioner operation switch group 133, the side mirror 151 on the right side of the vehicle, the side mirror 152 on the left side of the vehicle, and the rear view mirror 153 the driver is gazing at (S860, S890, S920, S950, S980, S1010, S1040).
- then, when the driver directs the line of sight toward the display screen region DR, display information of contents set in advance according to the region gazed at by the driver is displayed on the HUD device 7, which has the display screen region DR arranged at a position that can be visually recognized by the driver (S840, S1070, S1090, S1110, S1130, S1150, S1170, S1190, S1200).
- thus, in the driving support device 101, when the driver performs an operation of directing his/her line of sight toward the display screen region DR, display information of contents set in advance according to the gaze region at which the driver was gazing is displayed in the display screen region DR arranged at a position visible to the driver.
- that is, in order to display in the display screen region DR the display information of contents set in advance according to the gaze area at which the driver is gazing, the driver must not only gaze at a certain region but also perform an operation of directing the line of sight toward the display screen region DR after gazing. For this reason, the display information is not displayed in the display screen region DR when the driver merely gazes at a certain region, and the display of the display information according to the gaze area can be performed appropriately when the driver desires.
- further, when the display screen 131 of the navigation device is included in the gaze area (S1080: YES), the image displayed on the display screen 131 of the navigation device is displayed on the HUD device 7 (S1090). For this reason, when the driver merely gazes at the display screen 131, the image displayed on the display screen 131 is not displayed in the display screen region DR, and the display of the image in the display screen region DR can be performed appropriately when the driver desires.
- further, when the vehicle state information displayed by the meter unit 140 is included in the gaze area (S1100: YES), the vehicle state information included in the gaze area is displayed on the HUD device 7 (S1110). For this reason, when the driver merely gazes at the meter unit 140, the vehicle state information displayed by the meter unit 140 is not displayed in the display screen region DR, and the display of the vehicle state information in the display screen region DR can be performed appropriately when the driver desires.
- further, the audio operation switch group 132 and the air conditioner operation switch group 133, which are operated when changing the operating state of the in-vehicle audio and the car air conditioner mounted on the vehicle, are disposed in the instrument panel unit 130 that can be visually recognized by the driver, and when a switch of the audio operation switch group 132 or the air conditioner operation switch group 133 is included in the gaze area (S1120: YES, S1140: YES), information indicating the current operating state that can be changed by the switch included in the gaze area is displayed on the HUD device 7 (S1130, S1150). For this reason, when the driver merely gazes at the audio operation switch group 132 or the air conditioner operation switch group 133, the information indicating the operating state that can be changed by the switches of the audio operation switch group 132 or the air conditioner operation switch group 133 is not displayed in the display screen region DR, and the display of that information in the display screen region DR can be performed appropriately when the driver desires.
- The cameras 111, 112, and 113 continuously photograph the scenery reflected in the mirrors 151, 152, and 153 used for checking behind the vehicle, and thereby acquire landscape images of the area behind the vehicle.
- When one of the mirrors 151, 152, and 153 is included in the gaze area (S1160: YES, S1180: YES, S1180: NO), the landscape image acquired by the corresponding camera 111, 112, or 113 is displayed by the HUD device 7 (S1170, S1190, S1200).
- When the driver does not desire this, the landscape images acquired by the cameras 111, 112, and 113 are not shown in the display screen region DR; they are displayed there only when the driver desires it.
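The behavior above is a dispatch: the gaze area is matched against known in-cabin targets (navigation screen, meter unit, switch groups, mirrors) and the matching target decides what the HUD shows. A minimal sketch of that matching step follows; the region table, rectangle coordinates, and function names are illustrative assumptions, not the patent's implementation.

```python
# Illustrative sketch of the gaze-area dispatch described in S1080-S1200.
# The region names and rectangle coordinates are hypothetical examples.

# Named in-cabin targets, each as an axis-aligned rectangle (x0, y0, x1, y1)
# in the coordinate system of the estimated gaze point.
CABIN_REGIONS = {
    "navigation_screen": (10, 40, 30, 60),   # display screen 131
    "meter_unit":        (35, 40, 55, 60),   # meter unit 140
    "audio_switches":    (10, 20, 30, 38),   # switch group 132
    "aircon_switches":   (35, 20, 55, 38),   # switch group 133
    "right_mirror":      (80, 60, 95, 75),   # mirror 151 -> camera 111
    "left_mirror":       (0, 60, 15, 75),    # mirror 152 -> camera 112
    "rear_mirror":       (40, 70, 60, 85),   # mirror 153 -> camera 113
}

def contains(rect, point):
    """True if point (x, y) lies inside rect (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = rect
    x, y = point
    return x0 <= x <= x1 and y0 <= y <= y1

def select_hud_content(gaze_point):
    """Return the label of the in-cabin target containing the gaze point,
    or None when the gaze area contains no known target."""
    for name, rect in CABIN_REGIONS.items():
        if contains(rect, gaze_point):
            return name
    return None
```

For example, a gaze point falling inside the rectangle assigned to the navigation screen would select the navigation image for HUD display, mirroring the branch at S1080/S1090.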
- The process of S810 corresponds to the face image photographing means and the face image photographing procedure.
- The process of S820 corresponds to the gaze direction detecting means and the gaze direction detecting procedure.
- The processes of S860, S890, S920, S950, S980, S1010, and S1040 correspond to the gaze area setting means and the gaze area setting procedure.
- The process of S830 corresponds to the gaze movement determination means and the gaze movement determination procedure.
- The processes of S840, S1070, S1090, S1110, S1130, S1150, S1170, S1190, and S1200 correspond to the display start means and the display start procedure.
- The display screen 131 corresponds to the vehicle interior display screen.
- The meter unit 140 corresponds to the vehicle state display means.
- The audio operation switch group 132 and the air conditioner operation switch group 133 correspond to the operation units.
- The mirrors 151, 152, and 153 correspond to the in-vehicle mirrors.
- The cameras 111, 112, and 113 correspond to the rear landscape image photographing means.
- The driving support apparatus 101 of the seventh embodiment is the same as that of the sixth embodiment except that the display process is changed.
- The display process of the seventh embodiment is the same as that of the sixth embodiment except that the process of S835 is added, as shown in FIGS. 17A and 17B.
- The driving support device 101 configured in this way determines, using the acquired face image data, whether the driver has winked (S835). The driving support apparatus 101 causes the HUD device 7 to display the display information only when it determines both that the driver has performed the action of directing the line of sight toward the display screen region DR (S830: YES) and that the driver has winked (S835: YES).
- In the driving support device 101 configured as described above, to have the gaze area GR displayed in the display screen region DR, the driver must not only gaze at a certain area in front of the vehicle and then direct the line of sight toward the display screen region DR, but also wink. In other words, if the driver directs the line of sight toward the display screen region DR after gazing without desiring a display of the gaze area GR, the driver simply does not wink,
- and the driving support apparatus 101 then displays no display information. For this reason, the driving support apparatus 101 can determine more appropriately whether the driver desires a display of the gaze area GR.
- In the present embodiment, the process of S835 corresponds to the wink determination means.
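The wink check of S835 is not detailed in this excerpt; one common way to realize it is to track per-eye openness across consecutive face images and flag a wink when exactly one eye closes while the other stays open (a blink, with both eyes closed, must not count). The sketch below works on precomputed openness values in [0, 1]; the thresholds and frame counts are assumptions, not the patent's parameters.

```python
def detect_wink(openness_pairs, closed_thresh=0.2, open_thresh=0.6, min_frames=3):
    """Detect a wink in a sequence of (left_openness, right_openness) samples.

    A wink is reported when, for at least `min_frames` consecutive frames,
    exactly one eye is below `closed_thresh` while the other stays above
    `open_thresh`.  Returns True or False.
    """
    run = 0
    for left, right in openness_pairs:
        left_closed = left < closed_thresh and right > open_thresh
        right_closed = right < closed_thresh and left > open_thresh
        if left_closed or right_closed:
            run += 1
            if run >= min_frames:
                return True
        else:
            run = 0  # both eyes closed (a blink) or both open resets the run
    return False
```

Requiring a sustained one-eye-closed run is what separates a deliberate wink from an involuntary blink, which closes both eyes and therefore never increments the run.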
- In the above embodiments, the HUD device 7 is used to display an image on the windshield.
- Instead, a liquid crystal display device having a display screen may be installed in front of the driver in the vehicle interior, and the gaze area image may be displayed on that display screen.
- In the above embodiments, the moving object and the saliency area are displayed at the center of the display screen region DR.
- In addition, the moving object and the saliency area displayed in the display screen region DR may be highlighted, for example by surrounding them with a square, so that their presence is easier to recognize.
- In the above embodiments, the gaze area image data is stored whenever the driver gazes.
- Instead, the gaze area image data may be stored only when the driver performs the action of directing the line of sight toward the display screen region DR, that is, when the portion of the scenery in front of the vehicle around the driver's gaze point (the gaze area GR) is to be displayed in the display screen region DR arranged at a position where the driver can see it. Further, the gaze area image data may be stored when the driver presses an acquisition button while gazing.
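Storing gaze-area crops in time series, as S510/S520 describe, behaves like a bounded first-in-first-out buffer: newest frames are appended, oldest are discarded. A minimal sketch, where the class name, capacity, and method names are assumptions for illustration:

```python
from collections import deque

class GazeImageStore:
    """Keep the most recent gaze-area crops in time order (S510/S520 style).

    `capacity` bounds memory use; the oldest frames are discarded first.
    """
    def __init__(self, capacity=100):
        self.frames = deque(maxlen=capacity)

    def save(self, timestamp, crop):
        """Append one (timestamp, crop) sample; drops the oldest if full."""
        self.frames.append((timestamp, crop))

    def latest(self, n=1):
        """Return the n most recent (timestamp, crop) pairs, newest last."""
        return list(self.frames)[-n:]
```

A bounded deque keeps the stored-image display of the later claims (stored image display means) cheap: the browsing step only ever pages through a fixed window of recent crops.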
- In the above embodiments, whether the host vehicle is stopped is determined based on the detection result of the vehicle speed sensor 9. Alternatively, the host vehicle may be determined to be stopped when, for example, the shift position is parking (P) or the brake is applied.
- In the sixth embodiment, the image displayed on the display screen 131 of the navigation device is displayed by the HUD device 7.
- Similarly, an image displayed on the display screen of a smartphone may be displayed by the HUD device 7.
- In that case, the driving support device 101 may acquire the image data indicating the image displayed on the smartphone's display screen from the navigation ECU 103 via the in-vehicle LAN 102.
- In the above embodiments, the cameras 111 and 112 are installed on the side mirrors 151 and 152, but they may instead be installed on the rear part of the vehicle.
- Similarly, the rear camera 113 is installed on the rear bumper, but it may instead be installed near the rear view mirror 153.
- In the above embodiments, the images taken by the cameras 111, 112, and 113 are displayed by the HUD device 7 as captured.
- Instead of displaying the images taken by the right camera 111, the left camera 112, and the rear camera 113 directly, images corresponding to the views of the right side mirror 151, the left side mirror 152, and the rear view mirror 153 as seen from the viewpoint of the driver sitting in the driver's seat may be displayed in the display screen region DR.
- In the above embodiments, the HUD device 7 is used to display an image.
- However, the present invention is not limited to this; any device that can display an image in front of the driver (for example, a liquid crystal display) may be used.
- In the above embodiments, the image data acquired by the cameras 111, 112, and 113 is stored in the image capture boards 114, 115, and 116, respectively; the data may instead be stored in a single image capture board.
Description
The first embodiment is described below with reference to the drawings.
"Passive Driver Gaze Tracking with Active Appearance Models",
Proceedings of the 11th World Congress on Intelligent Transportation Systems,
October, 2004"). Further, the driver's eyes are extracted from the face image data by the above fitting with the face shape model, and the driver's gaze direction is detected by performing image recognition processing (for example, pattern matching) on the extracted eyes.
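The image-recognition step is left abstract here; one simple stand-in for the pattern-matching idea is to locate the darkest spot of the extracted eye patch (the pupil) and convert its offset from the patch centre into a gaze angle. The patch format, the degrees-per-pixel scaling, and the function names below are illustrative assumptions, not the method the description specifies.

```python
def pupil_center(eye_patch):
    """Return (row, col) of the darkest pixel in a 2-D list of grey levels,
    a crude stand-in for pupil localisation by pattern matching."""
    best = None
    for r, row in enumerate(eye_patch):
        for c, value in enumerate(row):
            if best is None or value < best[0]:
                best = (value, r, c)
    return best[1], best[2]

def gaze_offsets(eye_patch, deg_per_pixel=2.0):
    """Map the pupil's offset from the patch centre to (yaw, pitch) degrees;
    positive yaw means the pupil sits right of centre, positive pitch above."""
    rows, cols = len(eye_patch), len(eye_patch[0])
    r, c = pupil_center(eye_patch)
    yaw = (c - (cols - 1) / 2) * deg_per_pixel
    pitch = ((rows - 1) / 2 - r) * deg_per_pixel
    return yaw, pitch
```

A real system would instead fit the face shape model cited above and calibrate the pixel-to-angle mapping per driver; the sketch only shows how an eye patch turns into a gaze direction vector.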
The second embodiment is described below with reference to the drawings. In the second embodiment, only the parts differing from the first embodiment are described.
The third embodiment is described below with reference to the drawings. In the third embodiment, only the parts differing from the first embodiment are described.
Saliency", NIPS 2006") is used to detect the saliency region within the gaze area GR. In the present embodiment, a region whose luminance differs greatly from that of the other regions in the foreground image data is defined as a "saliency region". For example, when a road sign is present in the foreground while the vehicle is traveling, the road sign corresponds to a saliency region.
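The luminance-based definition of a saliency region given here can be sketched directly: mark pixels whose brightness deviates from the mean of the foreground patch by more than a threshold. This toy illustrates only the definition in the text, not the cited NIPS 2006 model; the deviation threshold `k` is an assumption.

```python
def salient_mask(gray, k=2.0):
    """Flag pixels whose luminance deviates from the patch mean by more than
    k standard deviations; `gray` is a 2-D list of grey levels."""
    values = [v for row in gray for v in row]
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = var ** 0.5
    # True marks a "saliency" pixel under the luminance-difference definition.
    return [[abs(v - mean) > k * std for v in row] for row in gray]
```

On a mostly uniform road scene, a bright road sign is flagged while the surrounding pixels are not, which matches the road-sign example in the text.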
The fourth embodiment is described below with reference to the drawings. In the fourth embodiment, only the parts differing from the first embodiment are described.
The fifth embodiment is described below with reference to the drawings. In the fifth embodiment, only the parts differing from the first embodiment are described.
The sixth embodiment is described below with reference to the drawings.
shown), and is configured to detect the current position of the vehicle based on GPS signals and the like received via it. The navigation ECU 103 is configured to execute control for displaying the current location of the vehicle on the display screen 131 (see FIG. 14) of the navigation device arranged in the instrument panel portion 130 (see FIG. 14), control for guiding the route from the current location to the destination, and the like.
The seventh embodiment is described below with reference to the drawings. In the seventh embodiment, only the parts differing from the sixth embodiment are described.
Claims (14)
- A driving support apparatus comprising: face image photographing means (S10, S810) for continuously photographing the face of a driver of a vehicle to acquire face images; gaze direction detecting means (S30, S820) for detecting the gaze direction of the driver using the face images acquired by the face image photographing means; gaze area setting means (S90, S860, S890, S920, S950, S980, S1010, S1040) for setting, based on the gaze direction detected by the gaze direction detecting means, a gaze area at which the driver is gazing; display means (7) having a display screen arranged at a position visible to the driver; gaze movement determination means (S40, S830) for determining, based on the detection result of the gaze direction detecting means, whether an action of the driver moving the line of sight matches a display start instruction action set in advance for instructing the display means to start displaying; and display start means (S50, S110, S840, S1070, S1090, S1110, S1130, S1150, S1170, S1190, S1200) for causing the display means to start displaying display information whose content is set in advance according to the gaze area set by the gaze area setting means, when the gaze movement determination means determines that the action of the driver moving the line of sight matches the display start instruction action.
- The driving support apparatus according to claim 1, further comprising landscape image photographing means (S20) for continuously photographing the scenery in front of the vehicle to acquire landscape images, wherein the display means displays on the display screen an image within a preset display area of the landscape image acquired by the landscape image photographing means, and the display start means sets the gaze area set by the gaze area setting means as the display area and sets the image within the display area of the landscape image acquired by the landscape image photographing means as the display information.
- The driving support apparatus according to claim 2, further comprising wink determination means (S120) for determining, using the face images acquired by the face image photographing means, whether the driver has winked, wherein the display means performs either moving-image display or still-image display of the image within the display area based on the determination result of the wink determination means.
- The driving support apparatus according to claim 2, comprising moving object determination means (S220, S260) for determining, using the landscape image acquired by the landscape image photographing means, whether a moving object is present within the gaze area set by the gaze area setting means, wherein the display means performs moving-image display of the image within the display area when the moving object determination means determines that the moving object is present within the gaze area.
- The driving support apparatus according to claim 4, comprising first display area setting means (S270) for setting the display area so that the moving object is located at the center of the display screen when the moving object determination means determines that the moving object is present within the gaze area.
- The driving support apparatus according to any one of claims 2 to 5, comprising: saliency area determination means (S360) for determining, using the landscape image acquired by the landscape image photographing means, whether a saliency area is present within the gaze area set by the gaze area setting means; and second display area setting means (S370) for setting the display area so that the saliency area is located at the center of the display screen when the saliency area determination means determines that the saliency area is present within the gaze area.
- The driving support apparatus according to any one of claims 2 to 6, comprising: gaze determination means (S420) for determining, based on the gaze direction detected by the gaze direction detecting means, whether the driver is gazing at a peripheral portion of the display screen; and third display area setting means (S430) for setting the display area so that the point on the peripheral portion of the display screen at which the driver is gazing is located at the center of the display screen when the gaze determination means determines that the driver is gazing at the peripheral portion of the display screen.
- The driving support apparatus according to any one of claims 2 to 7, further comprising: image storage means (S510, S520) for extracting images within the gaze area set by the gaze area setting means from the landscape images acquired by the landscape image photographing means and storing the extracted images in time series; stored image display means (S630, S700, S730) for displaying the images stored by the image storage means; face direction detection means (S30) for detecting the face direction of the driver using the face images acquired by the face image photographing means; and display instruction means (S660 to S690, S710 to S720) for designating, based on the face direction detected by the face direction detection means, the image to be displayed by the stored image display means.
- The driving support apparatus according to claim 1, wherein, with the display screen of an in-vehicle display device installed in the vehicle interior serving as a vehicle interior display screen (131), the display start means uses the image displayed on the vehicle interior display screen as the display information when the vehicle interior display screen is included within the gaze area.
- The driving support apparatus according to claim 1 or claim 9, further comprising vehicle state display means (140) arranged at a position visible to the driver and displaying vehicle state information indicating the state of the vehicle, wherein the display start means uses the vehicle state information included within the gaze area as the display information when the vehicle state information displayed by the vehicle state display means is included within the gaze area.
- The driving support apparatus according to any one of claim 1, claim 9, and claim 10, wherein an operation unit (132, 133) operated when changing the operating state of in-vehicle equipment mounted on the vehicle is arranged at a position visible to the driver, and the display start means uses, as the display information, information indicating, among the current operating states of the in-vehicle equipment, the operating state that can be changed by the operation unit included within the gaze area, when the operation unit is included within the gaze area.
- The driving support apparatus according to any one of claim 1 and claims 9 to 11, further comprising rear landscape image photographing means (111, 112, 113) for continuously photographing the scenery reflected in in-vehicle mirrors (151, 152, 153) for checking behind the vehicle to acquire landscape images of the area behind the vehicle, wherein the display start means uses the landscape image acquired by the rear landscape image photographing means as the display information when the in-vehicle mirror is included within the gaze area.
- The driving support apparatus according to any one of claim 1 and claims 9 to 12, comprising wink determination means (S835) for determining, using the face images acquired by the face image photographing means, whether the driver has winked, wherein the display start means causes the display means to start displaying the display information when the gaze movement determination means determines that the action of the driver moving the line of sight matches the display start instruction action and the wink determination means determines that the driver has winked.
- A driving support method comprising: continuously photographing the face of a driver of a vehicle to acquire face images (S10, S810); detecting the gaze direction of the driver using the face images (S30, S820); setting, based on the gaze direction, a gaze area at which the driver is gazing (S90, S860, S890, S920, S950, S980, S1010, S1040); determining, based on the detection result obtained when detecting the gaze direction, whether an action of the driver moving the line of sight matches a display start instruction action set in advance for instructing the start of display (S40, S830); and, when the action of the driver moving the line of sight matches the display start instruction action, displaying display information whose content is set in advance according to the gaze area on a display screen arranged at a position visible to the driver (S50, S110, S840, S1070, S1090, S1110, S1130, S1150, S1170, S1190, S1200).
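Read as software, the method claim is a per-frame pipeline: acquire a face image, estimate the gaze direction, set the gaze area, check the display-start gesture, then display. The sketch below wires those steps together; every callable passed in is a placeholder for a means the claims recite, and the function name is an assumption.

```python
def driving_support_step(frame, detect_gaze, set_gaze_area,
                         is_start_gesture, render):
    """One iteration of the claimed method, with each recited step supplied
    as a callable (all of them stand-ins for the claimed means).

    Returns whatever `render` produces, or None when no display is started.
    """
    gaze = detect_gaze(frame)            # S30/S820: gaze direction from the face image
    area = set_gaze_area(gaze)           # S90 etc.: set the gaze area
    if not is_start_gesture(gaze):       # S40/S830: display start action matched?
        return None
    return render(area)                  # S50 etc.: show the display information
```

Keeping each stage behind a callable mirrors the means-plus-function structure of the claims: any embodiment's concrete gaze detector or renderer can be slotted in without changing the control flow.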
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020147026530A KR101562877B1 (ko) | 2012-03-14 | 2013-03-08 | 운전 지원 장치 및 운전 지원 방법 |
US14/376,910 US9317759B2 (en) | 2012-03-14 | 2013-03-08 | Driving assistance device and driving assistance method |
DE112013001472.6T DE112013001472T5 (de) | 2012-03-14 | 2013-03-08 | Fahrunterstützungsvorrichtung und Fahrunterstützungsverfahren |
CN201380013976.6A CN104169993B (zh) | 2012-03-14 | 2013-03-08 | 驾驶辅助装置及驾驶辅助方法 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-057561 | 2012-03-14 | ||
JP2012057561 | 2012-03-14 | ||
JP2013041039A JP5630518B2 (ja) | 2012-03-14 | 2013-03-01 | 運転支援装置 |
JP2013-041039 | 2013-03-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013136740A1 true WO2013136740A1 (ja) | 2013-09-19 |
Family
ID=49160672
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/001487 WO2013136740A1 (ja) | 2012-03-14 | 2013-03-08 | 運転支援装置および運転支援方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US9317759B2 (ja) |
JP (1) | JP5630518B2 (ja) |
KR (1) | KR101562877B1 (ja) |
CN (1) | CN104169993B (ja) |
DE (1) | DE112013001472T5 (ja) |
WO (1) | WO2013136740A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101562877B1 (ko) | 2012-03-14 | 2015-10-23 | 가부시키가이샤 덴소 | 운전 지원 장치 및 운전 지원 방법 |
CN105522971A (zh) * | 2014-10-21 | 2016-04-27 | 现代摩比斯株式会社 | 车辆外部图像输出控制装置及方法 |
WO2018163811A1 (ja) * | 2017-03-07 | 2018-09-13 | ソニー株式会社 | 情報処理装置、情報処理方法、並びにプログラム |
US20210291853A1 (en) * | 2020-03-20 | 2021-09-23 | Alpine Electronics, Inc. | Vehicle image processing device |
Families Citing this family (55)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6115278B2 (ja) * | 2013-04-16 | 2017-04-19 | 株式会社デンソー | 車両用表示装置 |
US9513702B2 (en) * | 2013-07-15 | 2016-12-06 | Lg Electronics Inc. | Mobile terminal for vehicular display system with gaze detection |
KR20150073269A (ko) * | 2013-12-20 | 2015-07-01 | 현대자동차주식회사 | 차량용 클러스터 장치 |
US9542844B2 (en) * | 2014-02-11 | 2017-01-10 | Google Inc. | Providing navigation directions in view of device orientation relative to user |
DE102014009697A1 (de) * | 2014-06-26 | 2015-12-31 | Audi Ag | Verfahren zum Betreiben eines mobilen Virtual-Reality-Systems in einem Kraftfahrzeug und mobiles Virtual-Reality-System |
KR20160033376A (ko) * | 2014-09-18 | 2016-03-28 | (주)에프엑스기어 | 시선에 의해 제어되는 헤드 마운트형 디스플레이 장치, 이의 제어 방법 및 이의 제어를 위한 컴퓨터 프로그램 |
US9904362B2 (en) * | 2014-10-24 | 2018-02-27 | GM Global Technology Operations LLC | Systems and methods for use at a vehicle including an eye tracking device |
KR101619635B1 (ko) * | 2014-11-06 | 2016-05-10 | 현대자동차주식회사 | 시선추적을 이용한 메뉴 선택장치 |
DE202014009919U1 (de) * | 2014-12-15 | 2016-03-16 | GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) | Fahrerassistenzsystem |
US10121063B2 (en) * | 2015-01-12 | 2018-11-06 | BMT Business Meets Technology Holding AG | Wink gesture based control system |
EP3267295B1 (en) * | 2015-03-05 | 2021-12-29 | Sony Group Corporation | Information processing device, control method, and program |
EP3317755B1 (en) * | 2015-07-02 | 2021-10-20 | Volvo Truck Corporation | An information system for a vehicle |
JP6256424B2 (ja) | 2015-07-09 | 2018-01-10 | 株式会社デンソー | 車両用表示装置 |
KR101895485B1 (ko) | 2015-08-26 | 2018-09-05 | 엘지전자 주식회사 | 운전 보조 장치 및 그 제어 방법 |
KR102457262B1 (ko) * | 2015-09-25 | 2022-10-20 | 현대모비스 주식회사 | 운전자의 관심 정보 제공 장치 |
US10503989B2 (en) * | 2015-09-28 | 2019-12-10 | Kyocera Corporation | Image processing apparatus, imaging apparatus, camera monitor system, and image processing method |
US9841813B2 (en) * | 2015-12-22 | 2017-12-12 | Delphi Technologies, Inc. | Automated vehicle human-machine interface system based on glance-direction |
JP6917708B2 (ja) * | 2016-02-29 | 2021-08-11 | 株式会社デンソー | 運転者監視システム |
JP6344417B2 (ja) * | 2016-03-18 | 2018-06-20 | トヨタ自動車株式会社 | 車両用表示装置 |
EP3437949B1 (en) * | 2016-03-28 | 2023-12-27 | Nippon Seiki Co., Ltd. | Display device |
JP6313355B2 (ja) * | 2016-03-31 | 2018-04-18 | 株式会社Subaru | 車両周囲監視装置 |
WO2017195405A1 (ja) * | 2016-05-11 | 2017-11-16 | ソニー株式会社 | 画像処理装置及び画像処理方法、並びに移動体 |
IL246129A0 (en) | 2016-06-08 | 2016-08-31 | Sibony Haim | A visual display system to prevent accidents with vehicles |
KR20200011405A (ko) * | 2016-07-01 | 2020-02-03 | 아이사이트 모빌 테크놀로지 엘티디 | 운전자 모니터링을 위한 시스템 및 방법 |
FR3053480B1 (fr) * | 2016-07-04 | 2018-08-17 | Valeo Comfort & Driving Assistance | Systeme de presentation d'informations visuelles tete-haute pour conducteur a commande d'activation-desactivation de module de projection, dispositif tete-haute et procede correspondants |
CN106295542A (zh) * | 2016-08-03 | 2017-01-04 | 江苏大学 | 一种夜视红外图像中的基于显著性的道路目标提取方法 |
WO2018051734A1 (ja) * | 2016-09-16 | 2018-03-22 | 富士フイルム株式会社 | 投写型表示装置及びその制御方法 |
EP3299241B1 (en) * | 2016-09-26 | 2021-11-10 | Volvo Car Corporation | Method, system and vehicle for use of an object displaying device in a vehicle |
US10289197B2 (en) * | 2017-05-26 | 2019-05-14 | GM Global Technology Operations LLC | Apparatus and method for detecting inappropriate gear selection based on gaze information |
CN109212753B (zh) * | 2017-07-06 | 2021-01-29 | 京东方科技集团股份有限公司 | 可变显示距离的抬头显示***、抬头显示方法、驾驶设备 |
JP7180067B2 (ja) | 2017-11-06 | 2022-11-30 | 日本電気株式会社 | 運転支援装置 |
JP6777060B2 (ja) * | 2017-11-15 | 2020-10-28 | オムロン株式会社 | 脇見判定装置、運転支援システム、脇見判定方法及び脇見判定のためのプログラム |
KR102446387B1 (ko) * | 2017-11-29 | 2022-09-22 | 삼성전자주식회사 | 전자 장치 및 그의 텍스트 제공 방법 |
US10600390B2 (en) * | 2018-01-10 | 2020-03-24 | International Business Machines Corporation | Displaying a vehicle notification in a location determined based on driver eye gaze direction and other criteria |
CN110316066B (zh) * | 2018-03-30 | 2021-05-14 | 比亚迪股份有限公司 | 基于车载显示终端的防倒影方法和装置及车辆 |
CN108888487A (zh) * | 2018-05-22 | 2018-11-27 | 深圳奥比中光科技有限公司 | 一种眼球训练***及方法 |
US10970571B2 (en) | 2018-06-04 | 2021-04-06 | Shanghai Sensetime Intelligent Technology Co., Ltd. | Vehicle control method and system, vehicle-mounted intelligent system, electronic device, and medium |
US10915769B2 (en) | 2018-06-04 | 2021-02-09 | Shanghai Sensetime Intelligent Technology Co., Ltd | Driving management methods and systems, vehicle-mounted intelligent systems, electronic devices, and medium |
CN109002757A (zh) * | 2018-06-04 | 2018-12-14 | 上海商汤智能科技有限公司 | 驾驶管理方法和***、车载智能***、电子设备、介质 |
CN109109666A (zh) * | 2018-09-03 | 2019-01-01 | 王宣武 | 一种汽车前风窗视觉控制*** |
DE102018219481A1 (de) * | 2018-11-15 | 2020-05-20 | Robert Bosch Gmbh | Baugruppe für einen LiDAR-Sensor und LiDAR-Sensor |
EP3884300B1 (en) * | 2018-11-19 | 2024-06-12 | Suteng Innovation Technology Co., Ltd. | Lidar signal receiving circuits, lidar signal gain control methods, and lidars using the same |
JP7053437B2 (ja) * | 2018-11-26 | 2022-04-12 | 本田技研工業株式会社 | 運転支援装置および車両 |
EP3896963A4 (en) * | 2018-12-11 | 2022-01-19 | Sony Group Corporation | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD AND IMAGE PROCESSING SYSTEM |
US11603043B2 (en) | 2018-12-11 | 2023-03-14 | Sony Group Corporation | Image processing apparatus, image processing method, and image processing system |
CN109849788B (zh) * | 2018-12-29 | 2021-07-27 | 北京七鑫易维信息技术有限公司 | 信息提供方法、装置及*** |
JP2020112698A (ja) | 2019-01-11 | 2020-07-27 | 株式会社リコー | 表示制御装置、表示装置、表示システム、移動体、プログラム、画像生成方法 |
JP7192570B2 (ja) * | 2019-02-27 | 2022-12-20 | 株式会社Jvcケンウッド | 記録再生装置、記録再生方法およびプログラム |
CN109968979B (zh) * | 2019-03-14 | 2021-12-07 | 阿波罗智联(北京)科技有限公司 | 车载投射处理方法、装置、车载设备及存储介质 |
IL265495B (en) * | 2019-03-19 | 2022-09-01 | Rober Ohrenstein | Method for travel authorization |
DE102019206749A1 (de) | 2019-05-09 | 2020-11-12 | Volkswagen Aktiengesellschaft | Mensch-Maschine-Interaktion in einem Kraftfahrzeug |
US11042765B2 (en) * | 2019-05-14 | 2021-06-22 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for playing vehicle monitored content in a vehicle |
US10885728B1 (en) * | 2019-10-02 | 2021-01-05 | William H. Havins | Cognitively optimized user interface for motor vehicle |
US11106336B2 (en) * | 2019-10-02 | 2021-08-31 | William H. Havins | Cognitively optimized user interface for static equipment |
CN111263133B (zh) * | 2020-02-26 | 2021-10-01 | 中国联合网络通信集团有限公司 | 信息处理方法及*** |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06251287A (ja) * | 1993-02-23 | 1994-09-09 | Mitsubishi Electric Corp | 運転支援システム |
JP2001194161A (ja) * | 2000-01-11 | 2001-07-19 | Alpine Electronics Inc | 視線移動検出情報提示装置 |
JP2001330450A (ja) * | 2000-03-13 | 2001-11-30 | Alpine Electronics Inc | ナビゲーション装置 |
JP2004061259A (ja) * | 2002-07-29 | 2004-02-26 | Mazda Motor Corp | 情報提供装置、情報提供方法及び情報提供用プログラム |
JP2006023953A (ja) * | 2004-07-07 | 2006-01-26 | Fuji Photo Film Co Ltd | 情報表示システム |
JP2006090790A (ja) * | 2004-09-22 | 2006-04-06 | Toyota Motor Corp | 運転支援装置 |
JP2007230369A (ja) * | 2006-03-01 | 2007-09-13 | Toyota Motor Corp | 車載装置調整装置 |
JP2007263931A (ja) * | 2006-03-30 | 2007-10-11 | Denso It Laboratory Inc | ドライバ思考推定装置、ドライバ思考推定方法及びドライバ思考推定プログラム |
JP2008082822A (ja) * | 2006-09-27 | 2008-04-10 | Denso It Laboratory Inc | 注視対象物検出装置および注視対象物検出方法 |
JP2010127779A (ja) * | 2008-11-27 | 2010-06-10 | Denso It Laboratory Inc | 情報提供装置、情報提供方法およびプログラム |
JP2010173530A (ja) * | 2009-01-30 | 2010-08-12 | Toyota Motor Corp | 走行支援装置 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006172215A (ja) | 2004-12-16 | 2006-06-29 | Fuji Photo Film Co Ltd | 運転支援システム |
JP2007178293A (ja) | 2005-12-28 | 2007-07-12 | Suzuki Motor Corp | 車両用表示装置 |
JP4935571B2 (ja) | 2007-08-08 | 2012-05-23 | 株式会社デンソー | 運転支援装置 |
JP5078815B2 (ja) * | 2008-09-12 | 2012-11-21 | 株式会社豊田中央研究所 | 開眼度推定装置 |
JP2010179850A (ja) | 2009-02-06 | 2010-08-19 | Toyota Motor Corp | 車両用表示装置 |
US20120002028A1 (en) * | 2010-07-05 | 2012-01-05 | Honda Motor Co., Ltd. | Face image pick-up apparatus for vehicle |
US8605009B2 (en) * | 2010-12-05 | 2013-12-10 | Ford Global Technologies, Llc | In-vehicle display management system |
JP5630518B2 (ja) | 2012-03-14 | 2014-11-26 | 株式会社デンソー | 運転支援装置 |
US20140070934A1 (en) * | 2012-09-07 | 2014-03-13 | GM Global Technology Operations LLC | Methods and systems for monitoring driver object detection |
-
2013
- 2013-03-01 JP JP2013041039A patent/JP5630518B2/ja not_active Expired - Fee Related
- 2013-03-08 CN CN201380013976.6A patent/CN104169993B/zh not_active Expired - Fee Related
- 2013-03-08 KR KR1020147026530A patent/KR101562877B1/ko active IP Right Grant
- 2013-03-08 US US14/376,910 patent/US9317759B2/en active Active
- 2013-03-08 DE DE112013001472.6T patent/DE112013001472T5/de not_active Ceased
- 2013-03-08 WO PCT/JP2013/001487 patent/WO2013136740A1/ja active Application Filing
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101562877B1 (ko) | 2012-03-14 | 2015-10-23 | 가부시키가이샤 덴소 | 운전 지원 장치 및 운전 지원 방법 |
CN105522971A (zh) * | 2014-10-21 | 2016-04-27 | 现代摩比斯株式会社 | 车辆外部图像输出控制装置及方法 |
WO2018163811A1 (ja) * | 2017-03-07 | 2018-09-13 | ソニー株式会社 | 情報処理装置、情報処理方法、並びにプログラム |
JPWO2018163811A1 (ja) * | 2017-03-07 | 2020-01-09 | ソニー株式会社 | 情報処理装置、情報処理方法、並びにプログラム |
JP7092110B2 (ja) | 2017-03-07 | 2022-06-28 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、並びにプログラム |
US20210291853A1 (en) * | 2020-03-20 | 2021-09-23 | Alpine Electronics, Inc. | Vehicle image processing device |
US11661004B2 (en) * | 2020-03-20 | 2023-05-30 | Alpine Electronics, Inc. | Vehicle image processing device |
Also Published As
Publication number | Publication date |
---|---|
DE112013001472T5 (de) | 2014-12-04 |
CN104169993B (zh) | 2016-05-25 |
JP2013218671A (ja) | 2013-10-24 |
JP5630518B2 (ja) | 2014-11-26 |
CN104169993A (zh) | 2014-11-26 |
US9317759B2 (en) | 2016-04-19 |
US20150010207A1 (en) | 2015-01-08 |
KR20140126401A (ko) | 2014-10-30 |
KR101562877B1 (ko) | 2015-10-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5630518B2 (ja) | 運転支援装置 | |
CN110678371B (zh) | 车辆控制***、车辆控制方法及存储介质 | |
JP6413207B2 (ja) | 車両用表示装置 | |
CN111433067A (zh) | 平视显示装置及其显示控制方法 | |
JP4222183B2 (ja) | 車両周辺画像表示装置 | |
JP6865006B2 (ja) | 車両用表示装置 | |
JP2008258822A (ja) | 車両周辺監視装置 | |
WO2016103418A1 (ja) | 車両用情報表示装置 | |
CN116323320A (zh) | 虚像显示装置以及显示*** | |
CN110462699B (zh) | 车辆用显示控制装置以及车辆用显示单元 | |
JP2018144554A (ja) | 車両用ヘッドアップディスプレイ装置 | |
JP2009171129A (ja) | 車両用駐車支援装置および画像表示方法 | |
JP6825433B2 (ja) | 虚像表示装置及びコンピュータプログラム | |
JP2015210580A (ja) | 表示システム及びウェアラブル機器 | |
JP5885619B2 (ja) | 運転支援装置 | |
US20230001947A1 (en) | Information processing apparatus, vehicle, and information processing method | |
US11679677B2 (en) | Device control system, moving vehicle, device control method, and non-transitory storage medium | |
JP6996542B2 (ja) | 車両用表示制御装置及び車両用表示ユニット | |
JP2004177315A (ja) | 視線方向判定装置及びそれを利用した対話システムならびに運転支援システム | |
US20240020803A1 (en) | Display control apparatus, display control method, recording medium, and display system | |
US11734928B2 (en) | Vehicle controls and cabin interior devices augmented reality usage guide | |
US20230376123A1 (en) | Display system | |
JP5920528B2 (ja) | 駐車支援装置 | |
JP2022131651A (ja) | ヘッドアップディスプレイ | |
JP2022010371A (ja) | 運転支援装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13760242 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14376910 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120130014726 Country of ref document: DE Ref document number: 112013001472 Country of ref document: DE |
|
ENP | Entry into the national phase |
Ref document number: 20147026530 Country of ref document: KR Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13760242 Country of ref document: EP Kind code of ref document: A1 |