WO2013111388A1 - Information display device - Google Patents
Information display device
- Publication number
- WO2013111388A1 (PCT/JP2012/075167)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- information
- display area
- layout
- gaze
- Prior art date
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
  - G06F—ELECTRIC DIGITAL DATA PROCESSING
    - G06F3/013—Eye tracking input arrangements
    - G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] using icons
    - G06F3/0482—Interaction with lists of selectable items, e.g. menus
    - G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    - G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
- B—PERFORMING OPERATIONS; TRANSPORTING
  - B60—VEHICLES IN GENERAL
  - B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    - B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    - B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    - B60K35/22—Display screens
    - B60K35/23—Head-up displays [HUD]
    - B60K35/28—Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    - B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    - B60K35/53—Movable instruments, e.g. slidable
    - B60K35/60—Instruments characterised by their location or relative disposition in or on vehicles
    - B60K2360/149—Instrument input by detecting viewing direction not otherwise provided for
    - B60K2360/166—Navigation
    - B60K2360/186—Displaying information according to relevancy
    - B60K2360/21—Optical features of instruments using cameras
    - B60K2360/334—Projection means
    - B60K2360/771—Instrument locations other than the dashboard: on the ceiling
    - B60K2360/785—Instrument locations other than the dashboard: on or in relation to the windshield or windows
Definitions
- the present invention relates to an information display device that lays out and displays a large number of different types of information on a screen.
- Patent Document 1 discloses an easy-to-use display control apparatus that can display various types of information with a layout and timing reflecting the user's intention.
- the present invention has been made to solve the above-described problems, and an object of the present invention is to provide an information display device that can prevent display information from being lost when the layout of the display information changes.
- the present invention provides an information display device that displays on a screen display layout information having at least one piece of display information corresponding to the status of a device, the display information including at least a display area. The device comprises: gaze determination means for determining whether the user is gazing at the screen; display area change detection means for detecting that the display area of the display information has changed; display area calculation means for calculating, when the gaze determination means determines that the user is gazing at the screen and the display area change detection means detects that the display area has changed, the display area of the display information according to the degree of the change; and display information display means for displaying the display information in accordance with the display area of the display information included in the display layout information or the display area calculated by the display area calculation means.
- according to the present invention, even when the layout of the display information is changed while the user is not gazing at the screen, the change from the original display state is shown when the user next gazes at the screen, so the display transition of information before and after the layout change can be easily recognized.
- A flowchart showing the main process of the display control means in the first embodiment.
- A flowchart showing the gaze start process in the first embodiment.
- A flowchart showing the gaze end process in the first embodiment.
- A flowchart showing the layout change process in the first embodiment.
- A flowchart showing the drawing update process in the first embodiment.
- A block diagram showing the internal functions of the display control means in the first embodiment.
- A flowchart showing the animation screen creation process in the first embodiment.
- A display example of a display area calculated by the display area calculation means in the first embodiment.
- A display example of a display area calculated for information whose display disappears in the first embodiment.
- A diagram showing an example of an actual screen transition in the operation example of the first embodiment.
- A diagram showing an example of display layout information in the operation example of the first embodiment.
- A flowchart showing the drawing update process in the second embodiment.
- A flowchart showing the layout change process in the second embodiment.
- A display example of a screen in the second embodiment.
- A diagram showing a table defining priority in the third embodiment.
- A diagram showing display parameters in the fifth embodiment.
- A display example of a screen in the fifth embodiment.
- A flowchart showing the drawing update process in the sixth embodiment.
- A diagram showing a table defining the gazeable time in the sixth embodiment.
- A flowchart showing the process of calculating the animation display time and the animation frame length in the sixth embodiment.
- A flowchart showing the gaze start process in the sixth embodiment.
- A flowchart showing the gaze end process in the sixth embodiment.
- A flowchart showing the gaze end process in the seventh embodiment.
- A flowchart showing the drawing update process in the seventh embodiment.
- A flowchart showing the animation screen creation process in the eighth embodiment.
- An example of conversion for graphics having different numbers of vertices in the eighth embodiment.
- A diagram showing a table defining permitted display contents in the fourth embodiment.
- A diagram showing an animation priority table in the fourth embodiment.
- FIG. 1 is a block diagram showing the overall configuration of the in-vehicle information display device according to Embodiment 1 of the present invention.
- the information display device 10 includes: gaze determination means 12 that acquires and determines the user's gaze; information creation means 13 that creates the information to be displayed; a display 14 that shows the information created by the information creation means 13; display control means 11 that controls the display screen shown on the display 14; and storage means 15 that stores display layout information and the like.
- the gaze determination means 12 uses a line-of-sight acquisition camera 16 that captures the user's eyes to calculate, from the movement of the eyeballs, the gaze point on which the user is currently focused, determines whether the gaze point is on the display 14, and thereby determines whether the user is gazing at the display screen. If the gaze point is on the display 14, a signal indicating that the user is gazing is sent to the display control means 11; if not, a signal indicating that the user is not gazing is sent.
- the gaze determination means 12 is not limited to a camera. Any means capable of acquiring the user's gaze on the screen may be used, for example: a device that detects the line of sight by calculating eyeball movement from the myoelectric potential or brain waves of the user's head; a device that derives the user's gaze target from a camera that captures the user's retina; a device that calculates the gaze direction from head movement measured by a motion sensor tracking the position and angle of the user's head; a device that statistically predicts whether the user is gazing at the screen from information such as the steering angle, the amount of braking, the distance to the vehicle ahead, the type of road being driven, and whether sound is presented; or a button pressed by the user while gazing.
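The core test performed by the gaze determination means 12, deciding whether the calculated gaze point falls on the display 14, can be illustrated with a minimal sketch. All names, the coordinate convention, and the example display size are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch of the gaze determination of means 12: a gaze point
# (x, y) in screen coordinates is tested against the display's bounding
# rectangle to decide whether the user is gazing at the display screen.

def is_gazing(gaze_point, display_rect):
    """Return True if the gaze point lies on the display.

    gaze_point   -- (x, y) computed from eyeball movement
    display_rect -- (left, top, width, height) of display 14
    """
    x, y = gaze_point
    left, top, w, h = display_rect
    return left <= x < left + w and top <= y < top + h

# Example with an assumed 800x480 display at origin (0, 0):
print(is_gazing((120, 200), (0, 0, 800, 480)))  # True  -> "gazing" signal
print(is_gazing((900, 200), (0, 0, 800, 480)))  # False -> "not gazing" signal
```

In the device itself this boolean would be converted into the gaze start / gaze end events consumed by the display control means 11.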
- the information creation means 13 creates and provides display information to be displayed on the display 14 and its display layout.
- to the information creation means 13 are connected, for example: a sensor group 17 necessary for obtaining the current driving status (device status); a CAN (Controller Area Network) information receiving device 18 for generating the information shown on the speedometer and tachometer; a camera 19 for generating an image of the vehicle's surroundings; a DSRC (Dedicated Short Range Communication) receiver 20 for receiving traffic conditions; and a car navigation system 21 for generating route guidance information and a map showing the current position.
- the information creation means 13 also exchanges information with the storage means 15, in which an association table 32 of driving status (device status) 30 and display layout information 31 is stored.
- the display layout information 31 holds one or more display information 33.
- in the first embodiment, the information creation means 13 and the display control means 11 are described as being included in the same information display device 10, but they may be divided into separate devices; in that case, storage means 15 may be provided separately for each.
- the sensor group 17 senses the information necessary for determining the driving situation, which is used to decide what information to display and in what layout. For example, to switch the displayed information and its layout among driving situations such as "normal driving", "driving in the dark", "stopping", and "before an intersection", various sensors are provided, such as a "vehicle speed sensor", an "illuminance sensor" for detecting ambient illuminance, and "means for detecting the approach to an intersection from map information and current position information".
- the information creation means 13 determines the current driving situation from the information acquired from the sensor group 17, the CAN information receiving device 18, the camera 19, the DSRC receiver 20, the car navigation system 21, and so on. Referring to the association table 32 of driving situations 30 and display layout information 31 stored in the storage means 15, it then sends the display layout information 31 matching the current driving situation, together with the display information 33 it contains, to the display control means 11.
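The lookup in the association table 32 can be sketched as a simple mapping from driving status to layout. The statuses and layout contents below are illustrative placeholders drawn loosely from the examples in this description, not the actual table of the patent:

```python
# Hypothetical sketch of association table 32: driving status (device
# status) 30 -> display layout information 31 (here, a list of display
# content names). Keys and values are illustrative only.
ASSOCIATION_TABLE = {
    "normal driving":          ["speedometer", "fuel gauge", "clock"],
    "before the intersection": ["route guidance", "speedometer"],
    "stopping":                ["audio state", "vehicle information", "clock"],
}

def layout_for(status):
    """Return the display layout information matching the driving status."""
    # Assumed fallback: unknown statuses use the normal-driving layout.
    return ASSOCIATION_TABLE.get(status, ASSOCIATION_TABLE["normal driving"])

print(layout_for("before the intersection"))  # ['route guidance', 'speedometer']
```

In the device, the selected layout (and the display information 33 it contains) would then be sent to the display control means 11.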
- the display control unit 11 uses the information provided from the information creation unit 13 to execute display control processing for switching information displayed on the display 14 and its layout.
- FIG. 3 is a diagram illustrating the configuration of the display information 33.
- the display information 33 consists of parameters such as a display content 40 representing the content (type) of the information to be displayed, a display area 41 representing the shape of the area in which it is displayed, and a display position 42 representing the position (X coordinate, Y coordinate) of the display area.
- the display content 40 serves as an ID identifying the type of information to be displayed. Even when the actual rendering differs, the same value is set for the same kind of display. For example, even when the display changes from an analog speedometer to a digital speedometer, it is still a speedometer, so a value uniquely identifying "speedometer" is entered.
- examples of the display content 40 in the first embodiment include "audio state", "route guidance", "tachometer", "fuel gauge", "clock", and "vehicle information".
- the display area 41 is a parameter that represents the shape and size when display information is displayed on the screen, and is represented using a shape 43 and a circumscribed rectangle 44 of the shape.
- the display position 42 is represented by absolute coordinates specifying at which coordinates on the screen the display information is displayed. These three pieces of information (display content 40, display area 41, and display position 42) define which display information is displayed at which position, and in what size and shape.
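The structure of the display information 33 can be sketched as a small record type. The field names and the example values are illustrative assumptions; the patent specifies only the parameters themselves (display content 40, shape 43, circumscribed rectangle 44, display position 42):

```python
from dataclasses import dataclass

@dataclass
class DisplayInfo:
    """Hypothetical sketch of display information 33."""
    content: str        # display content 40: type ID, e.g. "speedometer"
    shape: str          # shape 43 of display area 41
    bbox: tuple         # circumscribed rectangle 44: (width, height)
    position: tuple     # display position 42: absolute (X, Y) on screen

# Illustrative example: a circular speedometer, 200x200, drawn at (50, 100).
speedometer = DisplayInfo("speedometer", "circle", (200, 200), (50, 100))
print(speedometer.content, speedometer.position)
```

Together these fields determine what is drawn, where, and in what size and shape, exactly the three roles the description assigns to parameters 40-42.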
- FIG. 4 shows a variable list 50 used for display control processing executed by the display control means 11 and a specific example thereof.
- the variable list 50 includes: a gaze flag 51 indicating whether the user is gazing at the screen; a gaze accumulation time 52 indicating how long the user has been gazing at the screen since the current display layout information was provided; previous layout information 53 storing the display layout information 31 of the screen that the user last watched sufficiently; current layout information 54 storing the display layout information used for the current display; an animation counter 55 for calculating the progress (degree of change) of the animation showing the layout change; an animation frame length 56 defining the length of the animation in frames; an animation progress ratio 57 (the degree of the layout change), which can be calculated from the animation counter 55 and the animation frame length 56; and an animation display time 58 defining how long the animation is displayed.
- the animation frame length 56 and the animation display time 58 are constants set at the time of design.
- the variable list 50 is stored in the storage unit 15.
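The only derived variable in the list is the animation progress ratio 57, computed from the animation counter 55 and the animation frame length 56. A minimal sketch (the function name and the division formula are illustrative assumptions; the patent states only that the ratio is calculable from the two variables):

```python
def animation_progress(counter, frame_length):
    """Animation progress ratio 57: fraction of the layout change shown,
    assumed here to be animation counter 55 / animation frame length 56."""
    if frame_length <= 0:
        raise ValueError("animation frame length must be positive")
    return counter / frame_length

# With an assumed animation frame length of 30 frames:
print(animation_progress(15, 30))  # -> 0.5 (halfway through the animation)
```

A ratio of 0.0 corresponds to the previous layout information 53 and 1.0 to the current layout information 54; intermediate values drive the transitional screens.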
- FIG. 5 is a flowchart showing a main flow of display control processing in the display control means 11.
- when the display control means 11 detects an event (YES in step ST100), it executes the following processing depending on the type of the event. When an end event is detected (YES in step ST101), the process is terminated.
- when a gaze start event input from the gaze determination means 12 is detected (NO in step ST101 and YES in step ST102), the gaze start process shown in FIG. 6 is executed (step ST200). When a gaze end event is detected (NO in steps ST101 and ST102 and YES in step ST103), the gaze end process shown in FIG. 7 is executed (step ST210). When neither a gaze start nor a gaze end event is detected but a change in the display layout input from the information creation means 13 is detected (NO in steps ST101 to ST103 and YES in step ST104), the layout change triggers the layout change process shown in FIG. 8 (step ST300). When steps ST101 to ST104 are all NO and a drawing update event periodically generated by a timer is detected (YES in step ST105), the drawing update process shown in FIG. 9 is executed (step ST400).
- FIG. 6 is a flowchart showing the gaze start process executed in step ST200 of FIG. 5.
- in the gaze start process, the gaze flag 51 indicating that the user is gazing at the display 14 is set to "ON" (step ST201), and this flow ends.
- FIG. 7 is a flowchart showing the gaze end process executed in step ST210 of FIG. 5.
- in the gaze end process, the gaze flag 51 is set to "OFF" (step ST211), so that the display information is not moved during drawing.
- the animation counter 55 is reset to “0” (zero) so that the animation is played back from the first frame at the start of the next gaze (step ST212).
- FIG. 8 is a flowchart showing the layout change process executed in step ST300 of FIG. 5. When the layout change process is started, the gaze accumulation time 52 and the animation counter 55 are reset to "0" (zero) (step ST301), and processing returns to the main loop.
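The three event handlers described so far (gaze start, gaze end, layout change) can be sketched together as state updates on the variable list 50. The class and method names are illustrative; only the flag/counter semantics come from the description:

```python
class DisplayControlState:
    """Hypothetical sketch of the event handling around steps ST200-ST301."""

    def __init__(self):
        self.gaze_flag = False      # gaze flag 51
        self.gaze_accum = 0         # gaze accumulation time 52 [frames]
        self.animation_counter = 0  # animation counter 55

    def on_gaze_start(self):
        # Step ST201: mark that the user is gazing at the display 14.
        self.gaze_flag = True

    def on_gaze_end(self):
        # Steps ST211-ST212: stop moving display information and reset the
        # counter so the animation replays from the first frame next time.
        self.gaze_flag = False
        self.animation_counter = 0

    def on_layout_change(self):
        # Step ST301: a new layout restarts both accumulation and animation.
        self.gaze_accum = 0
        self.animation_counter = 0
```

Note that a layout change resets the gaze accumulation time 52 but leaves the gaze flag 51 untouched, since whether the user is gazing is independent of the layout.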
- FIG. 9 is a flowchart showing the drawing update process executed in step ST400 of FIG. 5.
- FIG. 10 is a block diagram showing the internal functions of the display control means 11.
- the display control means 11 includes display area change detection means 1, display area calculation means 2, display area calculation determination means 3, and display information display means 4.
- the display area change detection means 1 detects that the display area of the display information has changed from the previous layout information 53 and the current layout information 54.
- the display area calculation means 2 calculates the display area of the display information according to the degree (ratio) of the change.
- the display area calculation determination unit 3 determines whether to execute the display area calculation unit 2 according to the accumulated gaze time 52 of the user.
- the display information display unit 4 displays the display information according to the display area of the display information included in the display layout information 31 or the display area of the display information calculated by the display area calculation unit 2.
- In the drawing update process that starts on a drawing update event (for example, at a fixed cycle of 30 times per second), it is first determined whether the gaze flag 51 is “ON” or “OFF” (step ST401). If the gaze flag 51 is “OFF” (YES in step ST401), a screen is created in accordance with the display layout information 31 stored in the current layout information 54 (step ST402). On the other hand, if the gaze flag 51 is “ON” (NO in step ST401), the display layout information 31 currently input from the information creation means 13 is written into the current layout information 54, thereby updating the current layout information (step ST410).
- the accumulated gaze time 52 for accumulating and managing the time during which the user was gazing at the screen is incremented (step ST411). Then, it is determined whether or not the gaze accumulation time 52 is within an animation display time 58 (for example, 45 [frame]) (step ST412).
- If the gaze accumulation time 52 is equal to or longer than the animation display time 58 (YES in step ST412), the display area calculation determination means 3 assumes that the user has sufficiently recognized the change in layout, and the previous layout information 53, which stores the layout that the user watched before, is updated by overwriting it with the display layout information 31 stored in the current layout information 54 (step ST413); a screen is then created according to the current layout information 54 (step ST402).
- On the other hand, if the gaze accumulation time 52 is less than the animation display time 58 (NO in step ST412), the animation counter 55 that counts animation frames is incremented (step ST420). However, when the animation counter 55 exceeds the animation frame length 56, the animation counter 55 is reset to “0” (zero), thereby realizing an animation loop. Then, a screen in transition from the previous layout information 53 to the current layout information 54 is created according to the animation counter 55 (step ST421). The animation screen creation process will be described later with reference to FIG. 11. Then, the display information display means 4 actually displays the screen created in the process of step ST402 or ST421 of FIG. 9 on the screen of the display 14 (step ST403), and the process returns to the main loop.
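The looping counter of step ST420 and the progress rate of step ST500 can be pictured with a minimal Python sketch; the function names are ours, not from the specification:

```python
def advance_animation(counter, frame_length):
    """Increment the animation counter 55; when it exceeds the animation
    frame length 56, reset it to 0 so the animation loops (step ST420)."""
    counter += 1
    if counter > frame_length:
        counter = 0
    return counter

def progress_rate(counter, frame_length):
    """Animation progress rate 57 derived from the animation counter 55
    and the animation frame length 56 (step ST500)."""
    return counter / frame_length
```

With a 45-frame animation, the counter wraps back to 0 after frame 45, so the transition animation repeats while the user keeps gazing.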
- the animation screen creation process in step ST421 in FIG. 9 will be described with reference to the flowchart shown in FIG.
- the progress of the animation to be displayed is calculated.
- the current animation progress rate 57 is calculated and stored from the current animation counter 55 and the animation frame length 56 (step ST500).
- one piece of display information is extracted from a plurality of pieces of display information that is the union of the display information 33 included in the previous layout information 53 and the current layout information 54 (step ST501).
- Then, by determining whether the extracted display information is included in both the previous layout information 53 and the current layout information 54 (step ST502), the display area change detection means 1 detects that the display area of the display information has changed.
- If the extracted display information is included in both (YES in step ST502), the display area calculation means 2 linearly interpolates, using the animation progress rate 57 calculated in step ST500, the intermediate form (intermediate coordinates) between the circumscribed rectangles 44 of the two display areas, namely the display area of the extracted display information in the previous layout information 53 and that in the current layout information 54, thereby calculating the display area in the current frame (step ST510). For example, as shown in FIG. 12, suppose that the circumscribed rectangle J800 (see FIG. 12A) of the display information “speed meter” included in the previous layout information 53 is (400, 200) (400, 400) (600, 400) (600, 200), that the circumscribed rectangle J801 (see FIG. 12B) of the “speed meter” included in the current layout information 54 is (100, 100) (100, 400) (400, 400) (400, 100), and that the progress of the animation (degree, ratio of the layout change) is 20%. Then the circumscribed rectangle J802 (see FIG. 12C) at the current animation counter 55 is calculated as (340, 180) (340, 400) (560, 400) (560, 180).
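The linear interpolation of step ST510 can be reproduced with the figures above. This is a minimal sketch; the function name is ours, and the fourth vertex (600, 200) of J800 is inferred from the other three corners of the rectangle:

```python
def lerp_rect(prev, cur, progress):
    """Linearly interpolate each vertex between two circumscribed
    rectangles (step ST510).

    prev, cur: lists of (x, y) vertices of equal length.
    progress: animation progress rate 57 in [0.0, 1.0].
    """
    return [
        (px + (cx - px) * progress, py + (cy - py) * progress)
        for (px, py), (cx, cy) in zip(prev, cur)
    ]

# Values from the specification: "speed meter" J800 -> J801 at 20%.
j800 = [(400, 200), (400, 400), (600, 400), (600, 200)]
j801 = [(100, 100), (100, 400), (400, 400), (400, 100)]
j802 = lerp_rect(j800, j801, 0.20)
# j802 == [(340.0, 180.0), (340.0, 400.0), (560.0, 400.0), (560.0, 180.0)]
```

The result matches the circumscribed rectangle J802 given in the text.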
- the display information is drawn in a rectangular area constituted by the calculated intermediate coordinates, and the target display information is drawn by clipping with the shape shown in the display area (step ST511).
- If the extracted display information is not included in both the previous layout information 53 and the current layout information 54 (NO in step ST502), it is determined whether the display information is included only in the previous layout information 53 (step ST520).
- If the display information is included only in the previous layout information 53 (YES in step ST520), the display area change detection means 1 detects that the display area of the display information has changed.
- Then, the display area calculation means 2 calculates the coordinates of the center of the extracted display information from its display area and display position (step ST521).
- Next, the direction in which the difference between the center of gravity of the display information and the top, bottom, left, and right screen edges becomes smallest, that is, the direction of the point on the screen edge closest to the center of gravity of the display information, is calculated (step ST522). For example, in the case of “weather information” J810 (see FIG. 13A), which is the display information shown in FIG. 13, the barycentric coordinates J820 are (200, 75) and the screen area is (0, 0)-(800, 400), so it can be calculated that the direction in which the difference is smallest is the negative direction of the Y axis.
- Then, the display area is moved in the calculated direction to obtain a display area J811 (see FIG. 13B) in which the entire display area disappears from the screen (step ST523).
- Since the display area J810 indicated by the previous layout information 53 is (20, 20) (20, 130) (180, 130) (180, 20), the display area J811 (see FIG. 13B) where the display disappears after moving in the negative direction of the Y axis becomes (20, -110) (20, 0) (180, 0) (180, -110).
- Then, the display area calculation means 2 calculates the display area at the current animation counter 55, using the display area where the display disappears as the destination of the movement animation (step ST510), and the display information display means 4 displays the display information in that display area (step ST511).
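Steps ST521–ST523 can be sketched as a single hypothetical helper. It assumes the rectangle's own centroid is used as the center of gravity, which reproduces the J810 → J811 example:

```python
def offscreen_exit(rect, screen_w, screen_h):
    """Find the screen edge nearest to the rectangle's centroid
    (step ST522) and return the rectangle translated just far enough
    in that direction that it disappears entirely (step ST523)."""
    xs = [x for x, _ in rect]
    ys = [y for _, y in rect]
    cx = sum(xs) / len(xs)
    cy = sum(ys) / len(ys)
    # Distance from the centroid to each of the four screen edges.
    dists = {
        (-1, 0): cx,             # left edge
        (1, 0): screen_w - cx,   # right edge
        (0, -1): cy,             # top edge (negative Y direction)
        (0, 1): screen_h - cy,   # bottom edge
    }
    dx, dy = min(dists, key=dists.get)
    # Shift far enough that the whole rectangle leaves the screen.
    shift_x = -max(xs) if dx < 0 else (screen_w - min(xs) if dx > 0 else 0)
    shift_y = -max(ys) if dy < 0 else (screen_h - min(ys) if dy > 0 else 0)
    return [(x + shift_x, y + shift_y) for x, y in rect]
```

For the 800 × 400 screen of the example, the nearest edge from (100, 75) is the top one, and the J810 rectangle (20, 20)-(180, 130) is shifted to (20, -110)-(180, 0), matching J811.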
- If the extracted display information is not included in both the previous layout information 53 and the current layout information 54 (NO in step ST502) and is not included only in the previous layout information 53 (NO in step ST520), that is, if it is included only in the current layout information 54, an animation indicating a new appearance is created instead of an animation indicating movement.
- Specifically, the transparency is calculated according to the animation progress rate 57 calculated in step ST500 (step ST530), and the display area is drawn with that transparency (step ST511), thereby realizing a blinking display of the display area.
- a screen is generated by performing the above processing for all display information included in the union of the previous layout information 53 and the current layout information 54.
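The text does not specify how the transparency of step ST530 is derived from the progress rate. As one plausible reading that yields a blink over the looping animation, a triangle-wave opacity could be used; this mapping is our assumption, not the specification's:

```python
def blink_alpha(progress):
    """Map the animation progress rate 57 (0.0..1.0) to an opacity that
    rises to full at mid-animation and falls back to zero, so a newly
    appearing item blinks as the animation loop repeats.
    (Assumed mapping; the patent text only says the transparency is
    calculated from the progress rate.)"""
    return 1.0 - abs(2.0 * progress - 1.0)
```

At progress 0 and 1 the item is invisible, at progress 0.5 it is fully opaque; repeating the loop produces the blinking described above.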
- FIG. 14 is a diagram illustrating a display state in which a plurality of display information is displayed on the display 14.
- Here, the information creation means 13 creates display information such as the speed meter J4, the tachometer J5, the fuel gauge J6, the clock J7, the vehicle information J8, the route guidance J3 to the destination, the audio state J2, the weather J1 at the destination, and the traffic jam information J9 on the route.
- FIG. 4 (operation example a) shows the variable list in the initial state, and FIG. 15 shows the display layout information currently provided from the information creation means 13.
- the drawing update process shown in FIG. 9 is executed by the main loop shown in FIG.
- In the initial state, the gaze flag 51 checked in step ST401 is “OFF”, so in step ST402 the “stopped” display layout information 70 (see FIG. 15A) stored in the current layout information 54 is used to lay out each piece of display information and create a screen. The created screen is then displayed in step ST403, for example as shown in FIG. 14A. In this state, when the gaze determination means 12 detects the user's screen gaze, a gaze start event is issued and the gaze start process shown in FIG. 6 is executed. The gaze flag 51 is then turned “ON” in the process of step ST201.
- Following the update of the current layout information in step ST410, the gaze accumulation time 52 is incremented (step ST411).
- In step ST412, the gaze accumulation time 52 and the animation display time 58 are compared. Since the gaze has only just started, the gaze accumulation time 52 is less than the animation display time 58 (for example, 45 [frames]), and the process proceeds to step ST420.
- In step ST420, the animation counter 55 is incremented, and in step ST421 the animation screen creation process (the process shown in FIG. 11) is executed.
- In step ST500, the animation progress rate 57 is calculated from the animation counter 55 and the animation frame length 56 and updated. Thereafter, the display information included in the current layout information 54 and the previous layout information 53 is extracted to create a screen during the animation. However, since the previous layout information 53 does not yet exist, the process proceeds through NO in both steps ST502 and ST520 to step ST530.
- In step ST530, the animation progress rate 57 is converted into the transparency of the display area, and the display information is drawn as a new-appearance animation. The above processing is executed for all display information; as a result, all the display information blinks.
- When, by continued gazing, the gaze accumulation time 52 becomes equal to or longer than the animation display time 58, the process proceeds to YES in step ST412; the current layout information is overwritten onto the previous layout information in step ST413, and the display settles into normal display.
- the state of the variable at this time is shown in FIG. 4 (operation example b).
- When the user's gaze ends, the gaze flag 51 is set to “OFF”, and the animation counter 55 is also reset to “0” (zero).
- Next, suppose that the vehicle starts running and the driving situation changes to “normally running”. From the relation table 32 of the driving situations 30 and the display layout information 31 stored in the storage means 15, the display layout information 80 (see FIG. 15B) corresponding to “normally running” starts to be provided. Therefore, a layout change event occurs and the layout change process shown in FIG. 8 is executed.
- the gaze accumulation time 52 and the animation counter 55 are reset to “0” (zero) in step ST301.
- Until the user gazes at the screen, the “stopped” display layout information 70 from before the driving situation change, which is stored in the current layout information 54, continues to be used to draw the screen.
- When the user then gazes at the screen, the gaze flag 51 is set to “ON” by the gaze start process, and since the gaze flag is “ON”, the animation screen creation process is executed in the drawing update process.
- In step ST501, the display information is extracted one piece at a time from all the display information included in the display layout information of “stopped” and “normally running”; in step ST502 it is determined in which display layout information each piece of display information exists; and in step ST520 it is further determined whether the piece exists only in the “stopped” display layout information 70 stored in the previous layout information 53. The display area is then calculated according to the flowchart.
- For the display information included in both layouts, such as the “tachometer” J5, the “fuel gauge” J6, the “clock” J7, and the “vehicle information” J8, the display area in the current frame is calculated from the two display areas. Since the “fuel gauge” J6, the “clock” J7, and the “vehicle information” J8 do not move, their display areas do not change as a result.
- Since the “weather at destination” J1 is included only in the “stopped” display layout information 70, it is displayed as an animation that moves outside the screen when the layout changes to “normally running”.
- As a result, the display screen changes as shown in FIGS. 14B and 14C.
- In this way, the layout does not change until the user gazes at the screen, and changes in the layout are shown as an animation from the moment the user starts gazing.
- FIG. 14B shows the state where the animation progress rate is 50%, and FIG. 14C shows the state where it is 100%.
- As described above, since the layout change is shown by an animation while the user is gazing, the user does not lose sight of the display because of an automatic layout change made while not gazing, and can easily recognize the display transition of information before and after the layout change.
- Embodiment 2.
- In the first embodiment, the display information display means 4 displays the display information in accordance with the display area of the display information included in the display layout information 31 or the display area of the display information calculated by the display area calculation means 2. However, the display information display means 4 may always display the information according to the display layout information 31 input from the information creation means 13, and a change in layout may be presented in addition to the normal display by separately providing additional information display means 5 (not shown) that displays information indicating the layout change.
- In the second embodiment, in which this additional information display means 5 is separately provided, the display control means 11 includes, as internal functions, the display area change detection means 1, the display area calculation means 2, the display area calculation determination means 3, the display information display means 4, and the additional information display means 5.
- the display information display unit 4 in the second embodiment displays the display information according to the display area of the display information included in the display layout information 31. Further, the additional information display means 5 displays a change in layout according to the display area of the display information calculated by the display area calculation means 2.
- In the second embodiment, the display information is drawn using the current layout information 54 regardless of the user's gaze state. That is, when a drawing update event issued at a fixed cycle occurs, a screen is created from the current layout information 54 regardless of the user's gaze information (step ST402 in FIG. 16), and then, if the user's gaze is detected, the process moves on to the animation screen creation process.
- Also, by replacing the layout change process shown in FIG. 8 with the layout change process shown in FIG. 17, the current layout information 54 is updated to the latest display layout information provided from the information creation means 13. That is, when a layout change event occurs, such as when the vehicle starts running and the driving situation shifts from “stopped” to “normally running”, the gaze accumulation time 52 and the animation counter 55 are reset to “0” (zero) (step ST301), and the current layout information 54 is updated to the latest display layout information (step ST410 in FIG. 17).
- In the animation screen creation process of the second embodiment, the display information display means 4 does not draw the display information itself; instead, for the area calculated by the display area calculation means 2, the additional information display means 5 draws additional information, such as an outer frame or an arrow from center to center of the areas, thereby depicting the layout change.
- FIG. 18A shows an example of a screen that displays a change in layout by displaying an outer frame of the display area calculated by the display area calculation means 2.
- Here, the outer frame of the display area is displayed, but the painted display area itself (the entire display area) may be displayed instead of the outer frame.
- Another example is a screen that displays the change in layout by displaying an arrow from the center of the display area before the layout change to the center of the display area after the layout change, based on the display areas calculated by the display area calculation means 2.
- Here, an arrow from the center of the display area before the change to the center after the change is used; however, an arrow may be displayed from an arbitrary position in the display area before the layout change to the corresponding position in the display area after the layout change, such as an arrow from the upper-left vertex of the display area before the change to the upper-left vertex of the display area after the change.
- As described above, in the second embodiment the layout of the display information does not wait for the user's gaze; instead, the layout changes when the driving situation changes, and the additional information indicating the change is displayed while the user gazes at the screen.
- Embodiment 3.
- In the embodiments described above, nothing in particular is done when display areas overlap; however, as shown in FIG. 19, a table 45 that defines a priority 46 uniquely determined for each display content 40 of the display information used in the display area calculation means 2 may be prepared, and display information with higher priority may be displayed in the foreground when animations overlap.
- The case where this table 45 is applied is, in other words, the case where the display information includes information defining the priority, there are a plurality of pieces of display information, and the display areas calculated by the display area calculation means 2 for the respective pieces of display information overlap; in this case, the display information with high priority is displayed in front.
- Specifically, when one piece of display information is taken out in step ST501 of the animation screen creation flowchart shown in FIG. 11, the table 45 shown in FIG. 19 is referred to, and the pieces of information are taken out in ascending order of display priority and displayed in step ST511. Thereby, display information with higher priority is drawn later, and display information with higher priority is therefore displayed in front.
- the table 45 is stored in the storage unit 15.
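The draw-ordering rule described above (lower priority drawn first, higher priority drawn last and therefore in front) is the painter's algorithm. A minimal sketch, assuming the convention that a larger number means a higher priority:

```python
def draw_order(display_items):
    """Return the pieces of display information sorted so that higher-
    priority items are drawn last and thus appear in the foreground.

    display_items: list of (name, priority) pairs; larger = higher
    priority (an assumed convention, not stated in the table 45 text).
    """
    return sorted(display_items, key=lambda item: item[1])

items = [("speed meter", 3), ("weather", 1), ("route guidance", 2)]
order = draw_order(items)
# Drawn in the order: weather, route guidance, speed meter,
# so the speed meter ends up in front where the areas overlap.
```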
- According to the information display device of the third embodiment, it is possible to control which information is displayed preferentially when the display areas of a plurality of pieces of display information overlap. That is, display information that must be presented to the user, such as the speed meter in a vehicle, is not hidden, and important display information with high priority can be displayed preferentially.
- Embodiment 4. In the first to third embodiments, the display information for which the animation is executed is not particularly limited; however, when the display areas calculated by the display area calculation means 2 overlap, it is also possible to refer to a table 47 that predefines the display contents for which animation display is permitted, and to execute the animation display only for the display contents prescribed in the table 47.
- the animation priority information 48 includes an animation simultaneous execution number 49 that can be executed simultaneously and an animation priority table 66.
- the animation simultaneous execution number 49 can be arbitrarily set in advance, and 3 is set here.
- The animation priority table 66 is a table that defines display contents and animation priorities. By preparing and referring to this table 66, when a plurality of animations overlap, the animation display is executed starting from the one with the highest animation priority, up to the number defined by the animation simultaneous execution number 49.
- In this way, the display information also includes the animation priority information 48, which is information regulating permission of the continuous change display; among the pieces of display information, the continuous change display (animation display) is performed only for the display information for which it is permitted, and the animation display is not performed for the other display information.
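Selecting at most the animation simultaneous execution number 49 of animations by animation priority might look like the following sketch; the function name and the larger-is-higher priority convention are our assumptions:

```python
def select_animations(candidates, max_simultaneous=3):
    """Pick at most `max_simultaneous` animations to run, taking the
    highest animation priorities first (the animation simultaneous
    execution number 49 is set to 3 in the example).

    candidates: list of (name, animation_priority) pairs.
    """
    ranked = sorted(candidates, key=lambda item: item[1], reverse=True)
    return [name for name, _priority in ranked[:max_simultaneous]]

pending = [("weather", 1), ("speed meter", 4), ("clock", 3), ("audio", 2)]
# Only the three highest-priority animations are executed.
running = select_animations(pending, 3)
```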
- According to the information display device of the fourth embodiment, it is possible to control which information is displayed preferentially when a plurality of animation displays would be executed simultaneously. That is, display information whose display position the user needs to recognize reliably, such as the speed meter in a vehicle, is animated with priority, and the screen can be prevented from becoming cluttered.
- Embodiment 5. In the first to fourth embodiments, no particular consideration is given to the color or pattern of the animation; however, the display may use, for each display content of the display information, at least one of the animation frame color 61, the frame line type 62, the paint color 63, the paint shading pattern (paint pattern) 64, and the shape 65 defined in a display parameter 60 specific to that display content.
- The display parameter 60 is stored in the storage means 15. For example, when the frame line type is fixed for each display content, a screen such as that shown in FIG. 21 is displayed. Thus, even for similar pieces of display information, for example the “speed meter” J4 and the “tachometer” J5, the user can follow the display transition of each piece of display information without confusion, thanks to the differences in the frame color 61, the frame line type 62, and so on.
- According to the information display device of the fifth embodiment, even when a plurality of pieces of display information are displayed moving together, the animation frame color, frame line type, and so on differ for each display content, so the user does not confuse the display contents, and display transitions such as the movement of display information can be prevented from becoming difficult to follow.
- Embodiment 6.
- In the first to fifth embodiments, the animation display time 58 and the animation frame length 56 used for comparison in the display area calculation determination means 3 are set to preset fixed values; however, they may be variable values that change for each driving situation (equipment situation). In the sixth embodiment, the animation display time 58 and the animation frame length 56 are variable values that change for each driving situation (equipment situation).
- Specifically, the drawing update process shown in FIG. 9 is replaced with a drawing update process in which, first, a process of calculating the animation display time 58 and the animation frame length 56 is performed (step ST600).
- the animation display time 58 and the animation frame length 56 are calculated with reference to the gaze allowance table 100 in which the gaze available time 102 is defined for each driving situation (equipment situation) 101 as shown in FIG.
- the gaze margin table 100 is stored in the storage unit 15.
- step ST600 is performed according to the flowchart shown in FIG. 24, for example.
- FIG. 24 is a flowchart showing a process for calculating the animation display time and the animation frame length.
- the gaze available time that matches the current driving situation is extracted from the gaze allowance table 100 shown in FIG. 23 (step ST601).
- Then, the shorter of the extracted gazeable time and the longest animation time for displaying the animation slowly (for example, 500 ms) is determined as the duration of one animation (step ST602).
- the animation frame length 56 is calculated using the drawing update cycle so as to satisfy one animation time (step ST603). That is, the animation frame length 56 varies (is dynamically changed) based on the gazeable time extracted in step ST601.
- Then, the animation display time 58 is calculated from the animation frame length 56 calculated in step ST603 and the number of times the animation is to be shown to the user (for example, 3 times) (step ST604), and the process returns to the drawing update flowchart. That is, the animation display time 58 also varies (is dynamically changed) based on the gazeable time extracted in step ST601.
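Steps ST602–ST604 can be sketched as follows. With the example values from the text (30 drawing updates per second, a 500 ms slowest animation, 3 repetitions), a gazeable time of 500 ms or more yields a 15-frame loop and the 45-frame animation display time used in the earlier examples; the function name and parameter names are ours:

```python
def animation_timing(gazeable_ms, update_hz=30, slowest_ms=500, repeats=3):
    """Derive the animation frame length 56 and the animation display
    time 58 (both in frames) from the gazeable time for the current
    driving situation.

    Step ST602: one animation lasts the shorter of the gazeable time
    and the slowest-allowed animation time.
    Step ST603: express that duration in drawing-update frames.
    Step ST604: the display time covers the desired repetitions.
    """
    one_animation_ms = min(gazeable_ms, slowest_ms)            # ST602
    frame_length = round(one_animation_ms * update_hz / 1000)  # ST603
    display_time = frame_length * repeats                      # ST604
    return frame_length, display_time

# A relaxed situation (1 s gazeable) gives (15, 45); a tighter 300 ms
# situation shortens the loop so the user can still see it completely.
```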
- the gaze allowance table 100 that is, the gazeable time 102 may be learned from the time when the user actually gazes at the screen.
- the gaze start process shown in FIG. 6 is replaced with the gaze start process shown in the flowchart of FIG. 25, and the gaze end process shown in FIG. 7 is replaced with the gaze end process shown in the flowchart of FIG.
- the gaze flag 51 is turned “ON” in step ST201, and then the gaze start time is recorded (step ST202).
- In the gaze end process of FIG. 26, the gaze flag 51 is set to “OFF” in step ST211, the animation counter is reset to “0” (zero) in step ST212, and then the actual gaze time is calculated by the formula: current time - gaze start time (step ST213).
- the gaze accumulation time measured in association with the current driving situation is recorded (step ST214).
- the gaze available time in the gaze allowance table 100 is updated with the measured gaze accumulation time (step ST215).
- In the update of step ST215, for example, when a gazeable time (margin time) covering 95% or more is to be set, that is, a time that 95% or more of the plural actual gaze times recorded at each gaze can satisfy, the gazeable time in the gaze allowance table 100 is overwritten with the gaze time at the 5% point counted from the shortest of the gaze times recorded in step ST214. Alternatively, assuming that the gaze times follow a normal distribution, the population of gaze times may be estimated, the 5% point gaze time calculated, and the gazeable time in the gaze allowance table 100 updated with it.
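The 5%-point update of step ST215 is an empirical percentile: a value that 95% of the recorded gaze times meet or exceed. A minimal sketch, with an index convention of our own choosing:

```python
def learn_gazeable_time(gaze_times_ms, coverage=0.95):
    """Estimate a gazeable time that `coverage` (e.g. 95%) of the
    recorded gazes meet or exceed, i.e. the 5th-percentile gaze time
    counted from the shortest (step ST215). The text also suggests
    fitting a normal distribution instead of this empirical form."""
    ordered = sorted(gaze_times_ms)
    idx = int(len(ordered) * (1.0 - coverage))
    return ordered[idx]

# 20 recorded gazes of 100..2000 ms: the 5% point is the 2nd shortest,
# so 95% of the recorded gazes were at least that long.
```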
- the gazeable time can be corrected through actual use, so that the accuracy of the margin for each predicted situation can be improved during actual use. It is possible to generate an animation that matches the user's gazeable time.
- According to the information display device of the sixth embodiment, it is possible to display an animation suited to the user's margin in each current driving situation, and the user can visually follow the animation in any situation.
- the accuracy of the margin for each situation to be predicted can be improved during actual use, and an animation that is more suitable for the user's gazeable time can be generated.
- Embodiment 7. In the first to sixth embodiments, the gaze accumulation time 52 is accumulated by incrementing it at every drawing update while the user is gazing at the screen; however, it may be a value different from the actually gazed accumulated time, for example one that counts only completed animations. In the seventh embodiment, the gaze accumulation time 52 is a value that advances in accordance with the completion of an animation.
- Specifically, for the gaze accumulation time 52 used in the comparison by which the display area calculation determination means 3 considers that the user has sufficiently recognized the layout change, the animation counter 55 is subtracted from the gaze accumulation time 52 in the gaze end process shown in FIG. 27. That is, in step ST211 of the gaze end process the gaze flag 51 is set to “OFF”, the gaze time is calculated by subtracting the animation counter 55 from the gaze accumulation time 52 (step ST216), and then the animation counter is reset to “0” (zero) in step ST212.
- Further, the drawing update process shown in FIG. 9 is replaced with a drawing update process in which the gaze time increment performed in step ST411 of FIG. 9 is moved to before the increment of the animation counter 55 in step ST420, and the animation frame length 56 is added to the gaze accumulation time 52 only when the animation counter 55 is equal to the animation frame length (step ST431).
- According to the information display device of the seventh embodiment, since the gaze accumulation time is counted only after the user has watched one animation to the end, cases where the display 14 is viewed only for a moment, extremely short gazes due to chattering of the gaze determination means 12, and recognition errors can be excluded, and the user can reliably see the animation.
- Embodiment 8.
- In the first to seventh embodiments, the display area 41 of the display information 33 included in both the previous layout information 53 and the current layout information 54 has been described as having the same shape; the eighth embodiment handles the case of different shapes by introducing a pre-process for aligning the number of vertices.
- That is, when the display area 41 of the display information 33 included in the previous layout information 53 and that in the current layout information 54 have different shapes, for example when the shape 43 of the display area 41 of the display information 33 included in the previous layout information 53 is a quadrangle and the shape 43 of the display area 41 of the same display information 33 included in the current layout information 54 is a hexagon, the display area calculation means 2 performs preprocessing to align the numbers of vertices before and after the layout change. Specifically, the animation screen creation process shown in FIG. 11 is replaced with one to which this preprocessing is added.
- In this process, the numbers of vertices of the shapes before and after the layout change are aligned to the larger number of vertices (step ST503).
- For example, when the shape of certain display information 33 changes from a quadrangle 201 to a hexagon 202, the second and fourth vertices of the quadrangle 201 are regarded as doubled (overlapping) vertices, so that the shape 201 of the display information 33 in the previous layout information 53 is also recognized as having six vertices.
- Then, the intermediate coordinates of each of the six vertices are calculated to obtain the intermediate shape 203 (step ST510), and the shape 203 calculated in this way is used as the shape 43 of the display area 41. FIG. 30 shows the shape 203 calculated when the layout change rate is 50%.
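The vertex alignment of step ST503 and the per-vertex interpolation of step ST510 can be sketched as follows. Doubling the second and fourth vertices of the quadrangle, as in the example above, is one possible doubling rule; the function names are ours:

```python
def align_vertices(verts, target_count):
    """Double alternate vertices until the polygon has target_count
    vertices (step ST503); for a quadrangle morphing to a hexagon this
    doubles the 2nd and 4th vertices, as in the example."""
    out = list(verts)
    i = 1
    while len(out) < target_count:
        out.insert(i, out[i])  # repeat this vertex in place
        i += 3                 # skip the copy and the next original vertex
        if i >= len(out):
            i = 1
    return out

def morph(prev_shape, cur_shape, progress):
    """Interpolate the vertex-aligned shapes to the in-between shape
    for the current frame (step ST510)."""
    n = max(len(prev_shape), len(cur_shape))
    a = align_vertices(prev_shape, n)
    b = align_vertices(cur_shape, n)
    return [
        (ax + (bx - ax) * progress, ay + (by - ay) * progress)
        for (ax, ay), (bx, by) in zip(a, b)
    ]
```

With a unit square and any hexagon, the square is first treated as six vertices and each vertex then moves linearly toward its counterpart.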
- In this way, even when the display information to be displayed changes not only its display position and the size of its display area but also the shape of its display area, the animation can be displayed with the shape gradually changing, which makes the continuity easier for the user to understand.
- In the above embodiments, the information display device 10 has been described as an in-vehicle information display device; however, the information display device according to the present invention is not limited to a display device in an automobile, and the present invention can be applied to various display devices that lay out and display various types of information on a screen, such as a monitor at a surgical site or a process display in a factory.
- By the display control of the information display device of the present invention, the user can follow the change from the original display state, and can therefore easily recognize the display transition of information before and after the layout change.
- Because losing sight of a display can be a serious problem, there have been few display devices in the past that automatically change their layout; the present invention is therefore thought to be able to contribute greatly to the development of this industry.
- In the present invention, the expression “gaze” is not limited to staring fixedly; it also includes the act of looking and recognizing, the act of glancing with a momentary eye movement, and the like.
- The information display device of the present invention can be applied to various display devices that lay out and display various types of information on a screen, such as an in-vehicle information display device in an automobile, a monitor at a surgical site, and a process display in a factory.
Abstract
Description
Embodiment 1.
FIG. 1 is a block diagram showing the overall configuration of the in-vehicle information display device according to Embodiment 1 of the present invention. As shown in FIG. 1, the information display device 10 of Embodiment 1 includes gaze determination means 12 that acquires and determines the user's gaze, information creation means 13 that creates the information to be displayed, a display 14 that shows the information created by the information creation means 13, display control means 11 that controls the screen shown on the display 14, and storage means 15 that stores display layout information and the like.
The storage means 15 stores an association table 32 that relates driving situations (device situations) 30 to display layout information 31, as shown in FIG. 2, for example. Each piece of display layout information 31 holds one or more pieces of display information 33. Although the information creation means 13 and the display control means 11 have been described here as being included in the same information display device 10, when they are divided between separate devices, storage means 15 may be provided separately for each.
Using the information provided by the information creation means 13, the display control means 11 executes display control processing that switches the information shown on the display 14 and its layout.
The display position 42 is expressed in absolute coordinates that specify at which coordinates on the screen the display information is shown.
These three items of information (display content 40, display area 41, and display position 42) define which display information is shown at which position and in which size and shape, and each piece of display information is shown on the display 14 accordingly.
As shown in FIG. 4, the variable list 50 contains the following variables: a gaze flag 51 indicating whether the user is gazing at the screen; a gaze accumulation time 52 indicating how long the user has gazed at the screen since the currently provided display layout information took effect; previous layout information 53 storing the display layout information 31 of the screen the user previously gazed at sufficiently; current layout information 54 storing the display layout information used for the current display; an animation counter 55 for computing the progress (degree, or ratio, of change) of the animation that shows the layout change; an animation frame length 56 defining the length of the animation in frames; an animation progress ratio 57 storing the animation progress ratio (the degree, or ratio, of the layout change) that can be computed from the animation counter 55 and the animation frame length 56; and an animation display time 58 defining the display time of the animation. The animation frame length 56 and the animation display time 58 are constants set at design time. The variable list 50 is stored in the storage means 15.
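The variable list above can be sketched as a small state object. This is a minimal illustration only: the patent does not disclose source code, so all names and the 30-frame default are assumptions; the one derived value it shows is the progress ratio 57 computed from the counter 55 and the frame length 56.

```python
from dataclasses import dataclass, field

@dataclass
class DisplayControlState:
    """Illustrative sketch of variable list 50 (names are assumptions)."""
    gaze_flag: bool = False                            # gaze flag 51
    gaze_accum_frames: int = 0                         # gaze accumulation time 52
    prev_layout: dict = field(default_factory=dict)    # previous layout info 53
    curr_layout: dict = field(default_factory=dict)    # current layout info 54
    anim_counter: int = 0                              # animation counter 55
    anim_frame_len: int = 30                           # animation frame length 56 (constant)

    @property
    def anim_progress(self) -> float:
        """Animation progress ratio 57, derived from counter 55 / frame length 56."""
        return min(self.anim_counter / self.anim_frame_len, 1.0)

state = DisplayControlState(anim_counter=15)
print(state.anim_progress)  # 0.5 when half of the 30 frames have elapsed
```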
As shown in FIG. 5, when the display control means 11 detects an event (YES in step ST100), it executes the following processing according to the type of the event; if an end event is detected (YES in step ST101), the processing is terminated.
FIG. 7 is a flowchart showing the gaze-end processing executed in step ST210 of FIG. 5. As shown in FIG. 7, when the gaze-end processing starts, the gaze flag 51 is set to OFF (step ST211) so that display information is not moved during rendering. Next, the animation counter 55 is reset to 0 (zero) so that the animation plays from the first frame the next time gazing starts (step ST212), after which this flow ends and control returns to the main loop.
FIG. 8 is a flowchart showing the layout change processing executed in step ST300 of FIG. 5. As shown in FIG. 8, when the layout change processing starts, the gaze accumulation time 52 and the animation counter 55 are reset to 0 (zero) (step ST301), after which this flow ends and control returns to the main loop.
When the gaze determination means 12 determines that the user is gazing at the screen and the display area change detection means 1 detects that a display area has changed, the display area calculation means 2 calculates the display area of the display information according to the degree (ratio) of the change in the display area.
The display area calculation determination means 3 determines whether to execute the display area calculation means 2 according to the user's gaze accumulation time 52.
The display information display means 4 displays the display information according to the display area of the display information contained in the display layout information 31 or the display area calculated by the display area calculation means 2.
The display information display means 4 then actually shows the screen created in step ST402 or ST421 of FIG. 9 on the display 14 (step ST403), after which this flow ends and control returns to the main loop.
First, the progress of the animation to be displayed is computed. Specifically, the current animation progress ratio 57 is calculated from the current animation counter 55 and the animation frame length 56 and stored (step ST500). Next, one piece of display information is taken from the set of display information that is the union of the display information 33 contained in the previous layout information 53 and in the current layout information 54 (step ST501). By then judging whether the extracted display information is contained in both the previous layout information 53 and the current layout information 54 (step ST502), the display area change detection means 1 detects that the display area of the display information has changed.
The screen is generated by performing the above processing for every piece of display information contained in the union of the previous layout information 53 and the current layout information 54.
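The per-item processing above can be sketched as follows. This is a hedged illustration, not the patent's implementation: it assumes rectangular display areas as (x, y, w, h) tuples and linear interpolation by the progress ratio; how the patent treats items present in only one layout is simplified here to showing them at their defined area.

```python
def interpolate_rect(prev, curr, progress):
    """Linearly interpolate a display area (x, y, w, h) by the animation
    progress ratio: 0.0 = previous layout, 1.0 = current layout."""
    return tuple(p + (c - p) * progress for p, c in zip(prev, curr))

def build_frame(prev_layout, curr_layout, progress):
    """Compute the display area of every item in the union of the previous
    and current layouts; items present in both layouts are animated."""
    frame = {}
    for name in prev_layout.keys() | curr_layout.keys():
        if name in prev_layout and name in curr_layout:
            # item exists in both layouts: animate its area change
            frame[name] = interpolate_rect(prev_layout[name], curr_layout[name], progress)
        elif name in curr_layout:
            frame[name] = curr_layout[name]   # newly appearing item
        else:
            frame[name] = prev_layout[name]   # disappearing item
    return frame

prev = {"speedometer": (0, 0, 100, 100), "clock": (200, 0, 50, 50)}
curr = {"speedometer": (50, 0, 200, 100), "clock": (200, 0, 50, 50)}
print(build_frame(prev, curr, 0.5)["speedometer"])  # (25.0, 0.0, 150.0, 100.0)
```

At progress 0.5, the speedometer's area sits halfway between its old and new layouts, which is what lets the user visually follow the transition.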
In this operation example, the information creation means 13 creates the following display information: a speedometer J4, a tachometer J5, a clock J7, vehicle information J8, route guidance to the destination J3, audio status J2, weather at the destination J1, and traffic congestion information on the route J9. The initial-state variable list is shown in FIG. 4 (operation example a), and the display layout information currently provided by the information creation means 13 is shown in FIG. 15.
In this state, when the gaze determination means 12 detects that the user is gazing at the screen, a gaze-start event is issued and the gaze-start processing shown in FIG. 6 is executed. The gaze flag 51 is then set to ON in step ST201.
When the user stops gazing, a gaze-end event occurs and the gaze-end processing shown in FIG. 7 is executed. In step ST211 the gaze flag 51 is set to OFF, and the animation counter 55 is also reset to 0 (zero).
When the user starts gazing, the gaze-start processing sets the gaze flag 51 to ON. Once the gaze flag 51 is ON, the animation screen creation processing is executed during the rendering update processing.
In Embodiment 1, the display information display means 4 was described as displaying the display information according to the display area contained in the display layout information 31 or the display area calculated by the display area calculation means 2. However, the display information display means 4 may always display information according to the display layout information 31 input from the information creation means 13, while additional information display means 5 (not shown) separately displays information indicating the layout change, thereby presenting the layout change in addition to the normal display. Embodiment 2 provides this additional information display means 5 separately.
The display information display means 4 in Embodiment 2 displays the display information according to the display area contained in the display layout information 31.
The additional information display means 5 displays the layout change according to the display area calculated by the display area calculation means 2.
FIG. 18(a) is an example of a screen that shows the layout change by displaying the outer frame of the display area calculated by the display area calculation means 2. Although the outer frame of the display area is displayed in the example of FIG. 18(a), the display area itself (the entire display area) may be displayed instead, for example when there is no outer frame.
Embodiments 1 and 2 gave no particular consideration to how overlapping animations stack. However, a table 45 as shown in FIG. 19, which defines a priority 46 uniquely determined for each display content 40, may be prepared in the display information used by the display area calculation means 2 so that, when animations overlap, the display information with the higher priority is displayed in front. Embodiment 3 applies this table 45: the display information also contains information defining a priority, and when there are multiple pieces of display information whose display areas calculated by the display area calculation means 2 overlap, the display information with the higher priority is displayed in front.
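The priority rule of Embodiment 3 amounts to a painter's-algorithm draw order. The sketch below is an assumption-laden illustration (the item names, priority values, and the convention that a larger number means higher priority are not from the patent): items are drawn in ascending priority so that the highest-priority item is painted last and therefore appears in front.

```python
# Hypothetical priority table corresponding to table 45 / priority 46.
priorities = {"route_guidance": 3, "speedometer": 5, "weather": 1}

def draw_order(visible_items):
    """Sort items so the highest-priority one is drawn last (in front)."""
    return sorted(visible_items, key=lambda name: priorities.get(name, 0))

print(draw_order(["weather", "speedometer", "route_guidance"]))
# ['weather', 'route_guidance', 'speedometer']
```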
In Embodiments 1 to 3, no particular restriction was placed on the display information for which an animation is executed. However, when the display areas calculated by the display area calculation means 2 overlap, a table 47 that predefines the display contents permitted to be displayed, as shown in FIG. 31, may be consulted so that only the display contents defined in the table 47 are displayed.
Embodiments 1 to 4 gave no particular consideration to animation colors, patterns, and so on. However, each display content may be displayed using at least one of the following, defined in its own display parameters 60: the animation frame color 61, frame line type 62, fill color 63, fill hatching pattern (fill pattern) 64, and shape 65. Embodiment 5 displays colors, patterns, and so on according to these specific display parameters 60.
In Embodiments 1 to 5, preset fixed values were used for the animation display time 58 and the animation frame length 56 that the display area calculation determination means 3 uses for comparison. However, the animation display time 58 and the animation frame length 56 may instead be variable values that change with the driving situation (device situation). In Embodiment 6, they are such variable values.
First, the gaze-available time that matches the current driving situation is extracted from the gaze margin table 100 shown in FIG. 23 (step ST601). The shorter of the extracted gaze-available time and the maximum animation time for the slowest animation display (for example, 500 ms) is determined as the time for one animation (step ST602). The animation frame length 56 is then calculated from the time for one animation using the rendering update period (step ST603). In other words, the animation frame length 56 varies (is dynamically changed) based on the gaze-available time extracted in step ST601.
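Steps ST601 to ST603 can be sketched as follows. The 500 ms cap comes from the example above; the 33 ms rendering update period and the use of floor division are assumptions made for illustration, not values disclosed by the patent.

```python
import math

MAX_ANIM_MS = 500        # longest (slowest) animation time, per step ST602
UPDATE_PERIOD_MS = 33    # assumed rendering update period (~30 fps)

def animation_frame_length(gaze_available_ms):
    """Take the shorter of the gaze-available time and the maximum animation
    time as one animation's duration, then convert it to frames using the
    rendering update period (steps ST601-ST603)."""
    anim_time_ms = min(gaze_available_ms, MAX_ANIM_MS)
    return max(1, math.floor(anim_time_ms / UPDATE_PERIOD_MS))

print(animation_frame_length(1000))  # capped at 500 ms -> 15 frames at 33 ms
print(animation_frame_length(200))   # 200 ms -> 6 frames
```

A situation that leaves the driver little time to look at the screen thus produces a shorter, faster animation, while a relaxed situation allows the full 500 ms transition.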
In Embodiments 1 to 6, the gaze accumulation time 52 was incremented and accumulated at every rendering update while the user gazed at the screen. However, it may instead be a varying value that differs from the actually gazed accumulated time, for example one that is counted only after one animation has been watched to the end. In Embodiment 7, the gaze accumulation time 52 is a value that changes according to the completion of the animation.
Regarding the gaze accumulation time 52 that the display area calculation determination means 3 uses in the rendering update processing for the comparison that deems the user to have sufficiently recognized the layout change, the animation counter 55 is subtracted from the gaze accumulation time 52 when the gaze-end processing shown in FIG. 27 is executed. That is, in step ST211 of the gaze-end processing the gaze flag 51 is set to OFF, the gaze time is calculated by subtracting the animation counter 55 from the gaze accumulation time 52 (step ST216), and then the animation counter is reset to 0 (zero) in step ST212.
As in the flowchart shown in FIG. 28, the gaze time increment performed in step ST411 of FIG. 9 may instead be performed before the animation counter 55 is incremented in step ST420, and the animation frame length 56 may be added to the gaze accumulation time 52 only when the animation counter 55 equals the animation frame length (step ST431).
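The FIG. 28 variant can be sketched as a per-update routine in which the accumulation time grows only in whole-animation units. This is a hedged sketch: the dict-based state and the exact ordering of the reset are illustrative assumptions, not the patent's disclosed code.

```python
def update_on_render(state):
    """One rendering update during gaze (FIG. 28 variant): add the frame
    length to the gaze accumulation only when the counter has reached the
    frame length, i.e. when one full animation has been watched."""
    if state["anim_counter"] == state["anim_frame_len"]:
        state["gaze_accum"] += state["anim_frame_len"]   # step ST431
        state["anim_counter"] = 0                        # restart the animation
    state["anim_counter"] += 1                           # step ST420

state = {"anim_counter": 0, "anim_frame_len": 3, "gaze_accum": 0}
for _ in range(7):
    update_on_render(state)
print(state["gaze_accum"])  # 6: two full 3-frame animations were completed
```

A glance too short to finish one animation therefore contributes nothing to the accumulation, matching the idea that only fully watched transitions count as "recognized".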
In Embodiments 1 to 7, the display areas 41 of the display information 33 contained in both the previous layout information 53 and the current layout information 54 were described as having the same shape, and differing shapes were not specifically considered. In Embodiment 8, when the display areas 41 of the display information 33 have different shapes in the previous layout information 53 and the current layout information 54, preprocessing that equalizes their numbers of vertices is introduced.
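One way to equalize vertex counts before interpolating between two polygonal display areas is sketched below. The midpoint-insertion strategy is an assumption made for illustration; the patent only requires that the counts match before the per-vertex interpolation, not this particular scheme.

```python
def equalize_vertices(poly_a, poly_b):
    """Insert points on the edges of the polygon with fewer vertices until
    both polygons have the same vertex count (Embodiment 8 preprocessing)."""
    a, b = list(poly_a), list(poly_b)
    small, big = (a, b) if len(a) < len(b) else (b, a)
    i = 0
    while len(small) < len(big):
        p, q = small[i], small[(i + 1) % len(small)]
        mid = ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)  # midpoint of edge i
        small.insert(i + 1, mid)
        i = (i + 2) % len(small)  # spread insertions around the polygon
    return a, b

tri = [(0, 0), (10, 0), (5, 10)]
quad = [(0, 0), (10, 0), (10, 10), (0, 10)]
a, b = equalize_vertices(tri, quad)
print(len(a), len(b))  # 4 4
```

After this step, the two outlines have matching vertex counts, so each vertex of the old shape can be interpolated toward a corresponding vertex of the new shape by the animation progress ratio.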
Because losing track of information can be a serious problem, there have been few display devices in the past that automatically change the system's layout. Since the present invention prevents the user from losing track of the display, it broadens the scope of system design and is expected to contribute greatly to the development of industry.
Claims (12)
- An information display device that displays on a screen display layout information having at least one piece of display information corresponding to a device situation, wherein
the display information contains at least information representing a display area, the information display device comprising:
gaze determination means for determining whether a user is gazing at the screen;
display area change detection means for detecting that a display area of the display information has changed;
display area calculation means for calculating, when the gaze determination means determines that the user is gazing at the screen and the display area change detection means detects that the display area has changed, the display area of the display information according to the ratio of change of the display area; and
display information display means for displaying the display information according to the display area of the display information contained in the display layout information or the display area calculated by the display area calculation means.
- The information display device according to claim 1, further comprising display area calculation determination means for determining, according to a gaze accumulation time obtained by accumulating the time during which the user gazes at the screen, the period during which the display area calculation means is executed.
- The information display device according to claim 2, wherein, each time the display area calculation means is executed during the period determined by the display area calculation determination means, the display area of the display information changes, whereby an animation display is performed in which the display information displayed by the display information display means changes continuously, and
the gaze accumulation time is accumulated according to the number of animation displays that finished while the user was gazing.
- The information display device according to claim 1, wherein the display information display means displays the display information according to the display layout information that is always provided, and comprises additional information display means for adding a display indicating a change when the change is detected by the display area change detection means.
- The information display device according to claim 4, wherein the additional information display means displays the layout change of the display information by displaying an outer frame of the display area of the display information calculated by the display area calculation means.
- The information display device according to claim 4, wherein the additional information display means displays the layout change of the display information by displaying, based on the display area of the display information calculated by the display area calculation means, an arrow from an arbitrary location in the display area before the layout change to the corresponding location in the display area after the layout change.
- The information display device according to claim 1, wherein the display information also contains information defining a priority, and
when there are a plurality of pieces of display information and the display areas calculated for them by the display area calculation means overlap, the display information with the higher priority is displayed in front.
- The information display device according to claim 1, wherein the display information also contains information defining whether continuous-change display is permitted, and
a continuous change is displayed only for display information for which continuous-change display is permitted.
- The information display device according to claim 1, wherein the display information contains parameter information on at least one of a color, a line type, a fill pattern, and a shape specific to each piece of display information, and
the display information display means displays each piece of display information using its specific parameter information.
- The information display device according to claim 2, wherein information representing a gaze-available time is defined for each device situation,
the display area calculation determination means determines to execute the display area calculation means while the gaze accumulation time is less than a predetermined value, and
the predetermined value is dynamically changed based on the gaze-available time.
- The information display device according to claim 10, wherein the gaze-available time is determined by learning from the gaze accumulation time during which the user actually gazed at the screen.
- The information display device according to claim 1, wherein, when the shapes of the display areas contained in the display information before and after a layout change differ, the display area calculation means performs preprocessing that equalizes the numbers of vertices of the display areas of the display information before and after the layout change, and then calculates the display area of the display information according to the ratio of change of the display area.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112012005729.5T DE112012005729T5 (de) | 2012-01-23 | 2012-09-28 | Informations-Anzeigevorrichtung |
JP2013555119A JP5808435B2 (ja) | 2012-01-23 | 2012-09-28 | 情報表示装置 |
CN201280058994.1A CN103959205B (zh) | 2012-01-23 | 2012-09-28 | 信息显示装置 |
US14/352,497 US9696799B2 (en) | 2012-01-23 | 2012-09-28 | Information display device that displays information on a screen |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-011181 | 2012-01-23 | ||
JP2012011181 | 2012-01-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013111388A1 true WO2013111388A1 (ja) | 2013-08-01 |
Family
ID=48873137
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/075167 WO2013111388A1 (ja) | 2012-01-23 | 2012-09-28 | 情報表示装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9696799B2 (ja) |
JP (1) | JP5808435B2 (ja) |
CN (1) | CN103959205B (ja) |
DE (1) | DE112012005729T5 (ja) |
WO (1) | WO2013111388A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016158658A (ja) * | 2015-02-26 | 2016-09-05 | 東芝メディカルシステムズ株式会社 | X線診断装置および医用画像診断装置 |
JPWO2017183129A1 (ja) * | 2016-04-20 | 2019-03-07 | 日産自動車株式会社 | 情報表示方法及び表示制御装置 |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8312123B2 (en) * | 2009-11-07 | 2012-11-13 | Harris Technology, Llc | Address sharing network |
EP2924392A4 (en) * | 2012-11-21 | 2016-08-03 | Clarion Co Ltd | INFORMATION PROCESSING DEVICE AND NAVIGATOR CONTROL METHOD |
US10216266B2 (en) * | 2013-03-14 | 2019-02-26 | Qualcomm Incorporated | Systems and methods for device interaction based on a detected gaze |
US9886087B1 (en) * | 2013-11-30 | 2018-02-06 | Allscripts Software, Llc | Dynamically optimizing user interfaces |
JP6281376B2 (ja) * | 2014-03-31 | 2018-02-21 | 株式会社デンソー | 情報表示システム |
CN104238751B (zh) | 2014-09-17 | 2017-06-27 | 联想(北京)有限公司 | 一种显示方法及电子设备 |
US9904362B2 (en) * | 2014-10-24 | 2018-02-27 | GM Global Technology Operations LLC | Systems and methods for use at a vehicle including an eye tracking device |
KR101619635B1 (ko) * | 2014-11-06 | 2016-05-10 | 현대자동차주식회사 | 시선추적을 이용한 메뉴 선택장치 |
KR20170080797A (ko) * | 2015-12-30 | 2017-07-11 | 삼성디스플레이 주식회사 | 차량용 디스플레이 시스템 |
CN106055230B (zh) * | 2016-05-25 | 2019-08-27 | 努比亚技术有限公司 | 应用评价装置、移动终端及方法 |
EP3342619A1 (de) * | 2016-12-27 | 2018-07-04 | Volkswagen Aktiengesellschaft | Anwenderschnittstellen, computerprogrammprodukt, signalfolge, fortbewegungsmittel und verfahren zum anzeigen von informationen auf einer anzeigeeinrichtung |
JP2019017800A (ja) * | 2017-07-19 | 2019-02-07 | 富士通株式会社 | コンピュータプログラム、情報処理装置及び情報処理方法 |
JP6724878B2 (ja) * | 2017-09-22 | 2020-07-15 | カシオ計算機株式会社 | 学習支援装置、学習支援方法及びプログラム |
JP2019096054A (ja) * | 2017-11-22 | 2019-06-20 | 株式会社デンソーテン | 出力処理装置及び出力処理方法 |
CN108280232A (zh) * | 2018-02-26 | 2018-07-13 | 深圳市富途网络科技有限公司 | 一种在宽列表中固定展示重要信息的方法 |
JP7127626B2 (ja) * | 2019-08-09 | 2022-08-30 | 株式会社デンソー | コンテントの表示制御装置、表示制御方法及び表示制御プログラム |
DE102020107997A1 (de) * | 2020-03-24 | 2021-09-30 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren zum Betreiben eines digitalen Assistenten eines Fahrzeugs, computerlesbares Medium, System, und Fahrzeug |
US12019747B2 (en) * | 2020-10-13 | 2024-06-25 | International Business Machines Corporation | Adversarial interpolation backdoor detection |
DE102020213770A1 (de) * | 2020-11-02 | 2022-05-05 | Continental Automotive Gmbh | Anzeigevorrichtung für ein Fahrzeug |
US11537787B2 (en) * | 2021-03-01 | 2022-12-27 | Adobe Inc. | Template-based redesign of a document based on document content |
WO2023076841A1 (en) * | 2021-10-25 | 2023-05-04 | Atieva, Inc. | Contextual vehicle control with visual representation |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003169317A (ja) * | 2001-11-29 | 2003-06-13 | Nippon Telegr & Teleph Corp <Ntt> | 環境変化伝達方法および装置、環境変化伝達プログラム並びにそのプログラムを記録した記録媒体 |
JP2009073431A (ja) * | 2007-09-24 | 2009-04-09 | Denso Corp | 車両用メータユニット |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09123848A (ja) | 1995-11-06 | 1997-05-13 | Toyota Motor Corp | 車両用情報表示装置 |
US5731805A (en) * | 1996-06-25 | 1998-03-24 | Sun Microsystems, Inc. | Method and apparatus for eyetrack-driven text enlargement |
US5850211A (en) * | 1996-06-26 | 1998-12-15 | Sun Microsystems, Inc. | Eyetrack-driven scrolling |
JPH1185452A (ja) | 1997-09-02 | 1999-03-30 | Sanyo Electric Co Ltd | スクロール制御装置 |
JP3777790B2 (ja) | 1998-04-28 | 2006-05-24 | 株式会社デンソー | 表示制御装置 |
US6577329B1 (en) * | 1999-02-25 | 2003-06-10 | International Business Machines Corporation | Method and system for relevance feedback through gaze tracking and ticker interfaces |
JP2001175992A (ja) | 1999-12-14 | 2001-06-29 | Mazda Motor Corp | 車両用表示装置 |
JP2002362186A (ja) | 2001-06-05 | 2002-12-18 | Nissan Motor Co Ltd | 車両用表示装置 |
US7663628B2 (en) * | 2002-01-22 | 2010-02-16 | Gizmoz Israel 2002 Ltd. | Apparatus and method for efficient animation of believable speaking 3D characters in real time |
JP3931338B2 (ja) * | 2003-09-30 | 2007-06-13 | マツダ株式会社 | 車両用情報提供装置 |
JP2005297662A (ja) * | 2004-04-08 | 2005-10-27 | Nissan Motor Co Ltd | 情報操作表示システム |
JP4268191B2 (ja) * | 2004-12-14 | 2009-05-27 | パナソニック株式会社 | 情報提示装置、情報提示方法、プログラム、及び記録媒体 |
EP1679577A1 (en) * | 2005-01-10 | 2006-07-12 | Tobii Technology AB | Adaptive display of eye controllable objects |
JP2007153116A (ja) | 2005-12-05 | 2007-06-21 | Nissan Motor Co Ltd | 車両用計器類表示装置及び車両用計器類表示方法 |
US8793620B2 (en) * | 2011-04-21 | 2014-07-29 | Sony Computer Entertainment Inc. | Gaze-assisted computer interface |
DE102008016527B4 (de) | 2007-04-03 | 2018-12-13 | Denso Corporation | Fahrzeug-Messgeräte-Einheit und Anzeigevorrichtung |
JP2008288767A (ja) * | 2007-05-16 | 2008-11-27 | Sony Corp | 情報処理装置および方法、並びにプログラム |
US20080309616A1 (en) * | 2007-06-13 | 2008-12-18 | Massengill R Kemp | Alertness testing method and apparatus |
JP2009082182A (ja) * | 2007-09-27 | 2009-04-23 | Fujifilm Corp | 検査作業支援装置及び方法、並びに検査作業支援システム |
EP2042969A1 (en) * | 2007-09-28 | 2009-04-01 | Alcatel Lucent | Method for determining user reaction with specific content of a displayed page. |
JP5181659B2 (ja) * | 2007-12-19 | 2013-04-10 | 富士ゼロックス株式会社 | 文書送信装置及び文書送信プログラム、文書受信装置及び文書受信プログラム、並びに文書表示システム |
US8514251B2 (en) * | 2008-06-23 | 2013-08-20 | Qualcomm Incorporated | Enhanced character input using recognized gestures |
WO2010118292A1 (en) * | 2009-04-09 | 2010-10-14 | Dynavox Systems, Llc | Calibration free, motion tolerant eye-gaze direction detector with contextually aware computer interaction and communication methods |
CN101943982B (zh) * | 2009-07-10 | 2012-12-12 | 北京大学 | 基于被跟踪的眼睛运动的图像操作 |
US8913004B1 (en) * | 2010-03-05 | 2014-12-16 | Amazon Technologies, Inc. | Action based device control |
US8947355B1 (en) * | 2010-03-25 | 2015-02-03 | Amazon Technologies, Inc. | Motion-based character selection |
US8982160B2 (en) | 2010-04-16 | 2015-03-17 | Qualcomm, Incorporated | Apparatus and methods for dynamically correlating virtual keyboard dimensions to user finger size |
US8464183B2 (en) * | 2010-06-03 | 2013-06-11 | Hewlett-Packard Development Company, L.P. | System and method for distinguishing multimodal commands directed at a machine from ambient human communications |
US9557812B2 (en) * | 2010-07-23 | 2017-01-31 | Gregory A. Maltz | Eye gaze user interface and calibration method |
US20130145304A1 (en) * | 2011-12-02 | 2013-06-06 | International Business Machines Corporation | Confirming input intent using eye tracking |
-
2012
- 2012-09-28 DE DE112012005729.5T patent/DE112012005729T5/de not_active Withdrawn
- 2012-09-28 WO PCT/JP2012/075167 patent/WO2013111388A1/ja active Application Filing
- 2012-09-28 CN CN201280058994.1A patent/CN103959205B/zh active Active
- 2012-09-28 JP JP2013555119A patent/JP5808435B2/ja active Active
- 2012-09-28 US US14/352,497 patent/US9696799B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003169317A (ja) * | 2001-11-29 | 2003-06-13 | Nippon Telegr & Teleph Corp <Ntt> | 環境変化伝達方法および装置、環境変化伝達プログラム並びにそのプログラムを記録した記録媒体 |
JP2009073431A (ja) * | 2007-09-24 | 2009-04-09 | Denso Corp | 車両用メータユニット |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016158658A (ja) * | 2015-02-26 | 2016-09-05 | 東芝メディカルシステムズ株式会社 | X線診断装置および医用画像診断装置 |
JPWO2017183129A1 (ja) * | 2016-04-20 | 2019-03-07 | 日産自動車株式会社 | 情報表示方法及び表示制御装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2013111388A1 (ja) | 2015-05-11 |
JP5808435B2 (ja) | 2015-11-10 |
DE112012005729T5 (de) | 2014-10-02 |
US9696799B2 (en) | 2017-07-04 |
CN103959205A (zh) | 2014-07-30 |
CN103959205B (zh) | 2016-10-26 |
US20140250395A1 (en) | 2014-09-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5808435B2 (ja) | 情報表示装置 | |
JP6808894B2 (ja) | 車両用警報装置 | |
US10304228B2 (en) | Vehicular display apparatus and vehicular display method | |
US7812741B2 (en) | Parking support method and parking support apparatus | |
US9434384B2 (en) | Vehicle cruise control apparatus and method | |
CN111559371B (zh) | 三维泊车的显示方法、车辆和存储介质 | |
WO2015001815A1 (ja) | 運転支援装置 | |
EP2958095B1 (en) | Display control device, display control method, display control program, and projecting device | |
US10059267B2 (en) | Rearview mirror angle setting system, method, and program | |
CN106585532B (zh) | 一种汽车内后视镜视频切换方法及装置 | |
CN113784861A (zh) | 显示控制方法及显示控制装置 | |
JP4541072B2 (ja) | 交通信号制御装置及びこれを用いた交通信号システム | |
JP6145265B2 (ja) | 表示装置 | |
EP2832589A1 (en) | Vehicle display apparatus | |
JP5136773B2 (ja) | 車両用表示装置 | |
EP3185173A1 (en) | Automated vehicle human-machine interface system based on glance-direction | |
CN113722043A (zh) | 用于avp的场景显示方法、装置、电子设备与存储介质 | |
JP6322918B2 (ja) | 車両用速度表示装置 | |
JPH07262492A (ja) | 車載用ナビゲーション装置 | |
JP2014037172A (ja) | ヘッドアップディスプレイの表示制御装置および表示制御方法 | |
JP2017083308A (ja) | 電子装置、施設特定方法および施設特定プログラム | |
JP2022154082A (ja) | 表示補正システム、表示システム、表示補正方法、及びプログラム | |
JP2019109138A (ja) | 表示装置、表示方法及び表示プログラム | |
JP2011069709A (ja) | 可動式メータおよびその表示制御方法 | |
US10852919B2 (en) | Touch input judgment device, touch panel input device, touch input judgment method, and a computer readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12866635 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013555119 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14352497 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120120057295 Country of ref document: DE Ref document number: 112012005729 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12866635 Country of ref document: EP Kind code of ref document: A1 |