WO2023284961A1 - Computer-implemented method of adapting a graphical user interface of a human machine interface of a vehicle, computer program product, human machine interface and vehicle - Google Patents

Computer-implemented method of adapting a graphical user interface of a human machine interface of a vehicle, computer program product, human machine interface and vehicle

Info

Publication number
WO2023284961A1
WO2023284961A1 (PCT/EP2021/069723)
Authority
WO
WIPO (PCT)
Prior art keywords
controller
configuration
computer
human machine interface
Prior art date
Application number
PCT/EP2021/069723
Other languages
English (en)
Inventor
Rami Zarife
Oliver Sens
Original Assignee
Lotus Tech Innovation Centre GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lotus Tech Innovation Centre GmbH
Priority to CN202180100624.9A priority Critical patent/CN117651655A/zh
Priority to EP21748816.2A priority patent/EP4370362A1/fr
Priority to PCT/EP2021/069723 priority patent/WO2023284961A1/fr
Priority to TW111126531A priority patent/TWI822186B/zh
Publication of WO2023284961A1 publication Critical patent/WO2023284961A1/fr

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/22 Display screens
    • B60K35/29 Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • B60K35/65 Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
    • B60K35/654 Instruments specially adapted for specific vehicle types or users, the user being the driver
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/11 Instrument graphical user interfaces or menu aspects
    • B60K2360/143 Touch sensitive instrument input devices
    • B60K2360/1438 Touch screens
    • B60K2360/151 Instrument output devices for configurable output
    • B60K2360/18 Information management

Definitions

  • The present disclosure relates to computer-implemented methods of adapting a graphical user interface of a human machine interface of a vehicle, computer program products, human machine interfaces, and vehicles.
  • Cognitive sciences have identified that there is an optimum amount of information that an individual driver should receive, not too little, and not too much either. If the driver receives too little information, the driver might be searching for information the driver expects. If the driver receives too much information, the driver might be overwhelmed and not identify the necessary information quickly.
  • the amount of information to be presented is very individual. It relates to cognitive abilities of the driver, age, experience, and personal preferences. For example, some drivers like to know about the engine temperature, others might not care about this data. In another example, some drivers might prefer easy access to navigation information, others to entertainment information. Also, there are very different types of users, e.g., very engaged users that like more information density versus digital novices that prefer reduced interfaces and few, major functionalities.
  • Driver assistance systems such as, for example, adaptive cruise control and automated lane change systems have been successfully deployed to the market to increase driver comfort and safety. As these driver assistance systems progress in sophistication, less driver interaction may be required. In some cases, the driver assistance systems may be fully automated for sections of a trip. Accordingly, the role of the driver has changed from that of an active driver to that of a passenger, for at least some portion of the trip. Highly automated vehicles allow the driver to hand over control to the automated vehicle and to do other tasks while driving. The requirement for information displayed to the user in an autonomous driving mode is different than in a manual driving mode.
  • EP 3240715 B1 discloses an adaptive user interface system for a vehicle with an automatic vehicle system, the adaptive user interface system including: a display; and an electronic controller, electrically coupled to the display, and configured to generate a graphical user interface indicative of operation of the automatic vehicle system, output the graphical user interface on the display, said electronic controller being characterized by being further configured to monitor an indicia of a driver's comfort level, determine, based on the monitored indicia, when the driver is not comfortable with the operation of the automatic vehicle system, and modify the graphical user interface to provide an increased level of detail in response to determining that the driver is not comfortable with the operation of the automatic vehicle system.
  • The object is solved by a computer-implemented method of adapting a graphical user interface of a human machine interface of a vehicle according to claim 1, a computer program product according to independent claim 11, a human machine interface according to independent claim 12, and a vehicle according to independent claim 13. Further embodiments are described in dependent claims.
  • Described is a computer-implemented method of adapting a graphical user interface of a human machine interface of a vehicle, wherein the graphical user interface is controlled by a controller of the human machine interface, wherein the layout of information presented on the graphical user interface is determined by a configuration, wherein the controller accesses at least two different configurations, wherein each configuration is associated with at least one clutter index or clutter index range, wherein the controller associates at least one of user profile, region, usage information, and/or user alertness with a clutter index or clutter index range and selects a configuration with a compatible clutter index or clutter index range.
  • the vehicle can be a car, a truck, a bus, or the like.
  • the graphical user interface can, inter alia, be a cockpit display, a head-up display, or a central display.
  • Using displays instead of gauges has become standard in many vehicles, as displays are more configurable than classical gauges.
  • the clutter index is an index derived from how a certain configuration of information is perceived by a human.
  • the clutter index of a given configuration can be assessed by test persons.
  • the clutter index can be calculated by a processor.
  • The clutter index can relate to the number, positioning, and density of information and ornaments, the fonts used, in some embodiments including size and type, colors used, and other criteria. The more cluttered a display appears, the higher the clutter index will be. The higher the clutter index is, the more precisely a user needs to know where to find a respective piece of information in order to find it quickly and efficiently.
  • The configurations can be pre-configured and/or user adjustable and/or dynamic. Dynamic displays can react to certain occasions, e.g., if the navigation system provides a direction notice that is to be acted upon soon, this information might be made more visible than other information that then will either be moved to other locations, not displayed, or displayed in a smaller size.
  • multiple configurations with respective similar or identical clutter indexes or overlapping clutter index ranges might exist.
  • The different configurations might be rank-ordered by the clutter index.
  • The clutter indexes might be determined in ranges, and they might overlap, so that it can be determined that more than one clutter index is suitable.
  • The controller might select one or more suitable configurations. To do so, the controller must calculate a clutter index or clutter index range from at least one of the user profile, region, usage information, and/or user alertness information provided to the controller. The controller then can compare and match this calculated clutter index or index range to the clutter indices or clutter index ranges associated with the available configurations and select one or more of the closest matching configurations (see the sketch below).
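  • A minimal Python sketch of this matching step is given below. The Configuration class, its clutter_range field, and the select_configurations function are hypothetical names introduced only to illustrate the selection logic described above; the patent does not prescribe a concrete data model or API.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Configuration:
    """A display layout together with the clutter index range it is suited for."""
    name: str
    clutter_range: Tuple[float, float]  # inclusive (low, high)


def select_configurations(configs: List[Configuration],
                          target_index: float) -> List[Configuration]:
    """Return the configuration(s) that best match the clutter index calculated
    for the current driver. Configurations whose range contains the target are
    preferred; otherwise the configuration(s) closest to the target are returned,
    so overlapping ranges may yield more than one candidate."""
    containing = [c for c in configs
                  if c.clutter_range[0] <= target_index <= c.clutter_range[1]]
    if containing:
        return containing

    def distance(c: Configuration) -> float:
        low, high = c.clutter_range
        return min(abs(target_index - low), abs(target_index - high))

    best = min(distance(c) for c in configs)
    return [c for c in configs if distance(c) == best]


# Example: three configurations of increasing information density.
configs = [
    Configuration("minimal", (0.0, 2.0)),
    Configuration("standard", (1.5, 4.0)),
    Configuration("expert", (3.5, 6.0)),
]
print([c.name for c in select_configurations(configs, 1.8)])  # ['minimal', 'standard']
```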
  • the clutter index can be calculated as a rounded sum of weighted factors.
  • the factors can be normalized and weighted afterwards or the weights can be calculated such that they lead to a normalization of the factors.
  • Those factors can include factors relating to the usage of certain functions of the infotainment system of the vehicle, e.g., a mean variance of the usage of certain functions in a given time period, e.g., last month.
  • Another factor can relate to the usage of a smartphone, e.g., the mean variance of the usage of certain functions within the smartphone in a given time period. Both of those factors relate to user preference with regards to the types of applications the user uses.
  • Another factor can be a user-selected information density on the vehicle’s human machine interface.
  • Yet another factor can be derived from monitoring eye-tracking and component usage in a given period of time.
  • Another factor can be derived from monitoring speech interaction of the user with the human machine interface.
  • The clutter index is linked to a dedicated profile of a user. It can be analyzed on a group level within a cloud application, e.g., to generate specific weights for different regions and/or population groups.
  • the user can be provided with display configurations that match the user’s needs or abilities.
  • The selected configuration is presented to the user, wherein, if the user confirms the selected configuration, the configuration is changed to the selected configuration.
  • the clutter index is derived from an information density displayed on the graphical user interface at a given time.
  • The information density can be calculated based on the number of different pieces of information displayed simultaneously on the display (a minimal counting sketch follows below). For example, if a menu is displayed, a menu with 12 menu items displayed at the same time is more cluttered than a menu with only six menu items. Another example is that a cockpit display that shows, at the same time, speed, engine revolutions, engine temperature, fuel level, outside temperature, navigation information, etc. appears generally more cluttered than a display that only shows speed and navigation information.
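  • As a toy illustration of this counting approach, the snippet below simply counts the items visible at the same time; the item lists, the assumed display capacity of 20 items, and the scaling to a 0-6 range are invented for the example and not taken from the patent.

```python
def information_density(visible_items, max_items=20):
    """Toy density measure: the share of the display's capacity that is in use,
    scaled to a 0-6 range so it is comparable with the clutter index described
    further below. The capacity of 20 items is an assumption for the example."""
    return 6 * min(len(visible_items), max_items) / max_items


sparse = ["speed", "navigation"]
dense = ["speed", "engine revolutions", "engine temperature", "fuel level",
         "outside temperature", "navigation", "media source", "range", "clock",
         "trip meter", "assist status", "tyre pressure"]
print(information_density(sparse))  # 0.6 -> appears uncluttered
print(information_density(dense))   # 3.6 -> appears noticeably more cluttered
```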
  • the controller analyses the usage of functions accessible through the human machine interface by the user, wherein the controller adjusts at least one configuration based on the usage analysis.
  • The controller can determine which information is relevant to the particular user and can prioritize information the user needs regularly. For example, if a user often wants to see navigation information, this information might be viewed with higher priority than, for example, information on the current engine conditions.
  • A driver monitoring system is connected to the controller, wherein driver information captured by the driver monitoring system is processed by the controller, wherein the controller selects a configuration based on the driver information.
  • the display configuration can be changed to a configuration with a lower clutter index.
  • The human machine interface comprises a speech recognition system, wherein information related to the usage of speech commands is analyzed by the controller, wherein the controller selects a configuration based on the usage of speech commands.
  • the information presented on the display can be matched to the commands given by the user so that the information is more relevant to the individual user.
  • a selection of a configuration is presented at a predetermined time interval.
  • The time interval can be, for example, one month, so that the user has a regular reminder that the user's needs might have changed. It also keeps the user engaged to customize the vehicle to the user's preferences, which is shown to have a beneficial effect on user satisfaction and brand appreciation.
  • A mobile device is connected to the human machine interface, wherein the controller analyses the configuration of the mobile device, wherein the controller selects a configuration based on the mobile device configuration.
  • The controller applies a machine learning algorithm for the selection of the configuration.
  • The learning mechanism can be implemented by reinforcement learning once a certain user group size, e.g., equal to or more than 60 users, in a certain region and/or population group is reached.
  • A user preference profile is stored and accessed by the controller.
  • the user preference profile can be stored in the vehicle and/or in external storage locations such as a server.
  • the server can be accessible via a remote network connection.
  • the profile can also be stored or accessible via a mobile device of the user.
  • A first independent aspect relates to a computer program product with a non-transitory computer-readable storage medium having commands embedded therein which, when executed by a processor, cause the processor to execute the method as described above.
  • Another independent aspect relates to a human machine interface of a vehicle with a controller, the controller comprising a computer program product as described above with a non-transitory computer-readable storage medium and a processor.
  • Another independent aspect relates to a vehicle with a human machine interface as described above.
  • Fig. 1 a car with a number of displays,
  • Fig. 2a-c different configurations of a cockpit display and how a selection can be made,
  • Fig. 3 a method of selecting a display configuration.
  • Fig. 1 shows a car 2 with a human machine interface (HMI) 4.
  • The human machine interface 4 comprises a cockpit display 6, a central display 8, a microphone 10, and a loudspeaker arrangement 12.
  • the central display 8 is touch sensitive.
  • The human machine interface 4 allows for interaction between a driver 14 and the various systems of the car 2. Information is provided to the driver 14 via displays 6 and 8 as well as via the loudspeaker arrangement 12.
  • The driver 14 can input commands via the microphone 10 and via the touch-sensitive central display 8 as well as through other well-known means, not shown, such as buttons and levers.
  • the human machine interface 4 comprises a controller 16 which comprises a processor 18 and a non-transitory computer-readable storage medium 20.
  • A computer program product 22 is stored on the non-transitory computer-readable storage medium 20. When the computer program product 22 is loaded and executed by the processor 18, the processor implements the method described herein.
  • The controller 16 is responsible for providing the displays 6, 8 with information to be displayed. The controller 16 also controls which information is provided as sounds via the loudspeaker arrangement 12.
  • the controller 16 is further connected to a driver monitoring system 26 with an eye tracking camera 28 for tracking eyes 30 of the driver 14.
  • the driver monitoring system 26 also utilizes the microphone 10 for registering sounds generated by the driver 14.
  • The eye tracking camera 28 can register a viewing direction 32 of the driver 14. This information, as well as the acoustic information captured via the microphone 10, can be used to determine how attentive the driver 14 is at the moment. This information can be used to adjust the configuration of the displays 6 and/or 8.
  • the controller 16 is further connected, via a wired or wireless connection, to a mobile phone 34 of the driver 14. Information related to the configuration of the mobile device 34 is utilized by the controller 16 to determine an appropriate configuration of the displays 6, 8.
  • Fig. 2a-c show different configurations of the cockpit display 6 and how a selection can be made.
  • A selection area 40 is presented that the driver 14 can use to select between different configurations like configuration 42 shown in Fig. 2a, configuration 44 shown in Fig. 2b, and configuration 46 shown in Fig. 2c.
  • The cockpit display 6 is generally split into four different zones: three upper zones, namely a left zone 50, a central zone 52, and a right zone 54, and a lower, line-like zone 56.
  • The clutter index relates to the information density provided on the cockpit display 6 and is calculated based on how much information is provided to the driver 14 at a given time. Different information is weighted differently based on the location of the information and the type of information (a possible weighting scheme is sketched below). For example, information provided in the central zone 52 has a higher impact on the clutter index than information provided in the left zone 50, the right zone 54, or the lower zone 56. On the other hand, the type of information is important, too. Information that is mandatory or retrieved often by the driver 14, like the speed of the car 2, has a lower impact on the clutter index than other, less relevant information like engine temperature.
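  • One possible way to express this zone- and type-dependent weighting is sketched below; the concrete zone weights, item weights, and item lists are made-up values for illustration only and are not specified by the patent.

```python
# Hypothetical weights: the central zone contributes most to perceived clutter,
# and mandatory or frequently retrieved information (e.g. speed) contributes
# less than optional information such as engine temperature.
ZONE_WEIGHT = {"left": 0.7, "central": 1.0, "right": 0.7, "lower": 0.5}
TYPE_WEIGHT = {"mandatory": 0.5, "optional": 1.0}


def zone_weighted_clutter(items):
    """items: iterable of (zone, info_type) tuples for everything currently shown."""
    return sum(ZONE_WEIGHT[zone] * TYPE_WEIGHT[info_type] for zone, info_type in items)


# Configuration 42: essentially only the speed in the central zone plus a thin lower line.
minimal = [("central", "mandatory"), ("lower", "mandatory")]
# Configuration 46: menu in the centre plus extra items in every zone.
expert = [("left", "mandatory"), ("central", "optional"), ("central", "optional"),
          ("right", "optional"), ("lower", "optional"), ("lower", "optional")]
print(round(zone_weighted_clutter(minimal), 2))  # 0.75 -> low clutter contribution
print(round(zone_weighted_clutter(expert), 2))   # 4.05 -> much higher contribution
```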
  • The clutter index C can range from 1 to 6 and can be calculated as a rounded sum of the weighted factors A1 to A5 described below:
  • A1 is the mean variance of function usage within the human machine interface for the last month.
  • A1 represents a daily usage of major functions of the infotainment system: navigation, air conditioning, media, driver assistance functions, speech assist function, communication function.
  • An increased variance will lead to the elevation of functions to a higher level, e.g., the top level, so that more direct quick functions are available on the main screen of the human machine interface, leading to a higher density of information.
  • W1 equals 1 by default. In a further embodiment, W1 is adapted by reinforcement learning, e.g., for specific regions or population groups.
  • A2 is the mean variance of function usage of a smartphone.
  • A2 represents a daily usage of all major functions within the smartphone: messaging, calling, media, gaming, assistant usage, smart-home applications, news, etc.
  • An increased variance will lead to the elevation of functions to a higher level, e.g., the top level, so that more direct quick functions are available on the main screen of the human machine interface, leading to a higher density of information.
  • W2 equals 1 by default. In the aforementioned further embodiment, W2 is adapted by reinforcement learning, e.g., for specific regions or population groups.
  • A3 is the information density of the selected human machine interface.
  • A3 represents the number of highly relevant components (e.g., an on/off button with label and description) divided by the maximum number of components on a high information density interface (e.g., 20 components). For example, 10 highly relevant components lead to an A3 of 0.5.
  • W3 equals 1 by default. In a further embodiment, W3 is adapted by reinforcement learning, e.g., for specific regions or population groups.
  • A4 relates to eye-tracking & component usage.
  • A4 can consist of multiple sub-factors which are based on components within the human machine interface, e.g., a slider or a button can be considered as a component.
  • Different sub-weights can be used to adapt the order, e.g., of settings, and positions within the human machine interface, e.g., menu level, alignment, etc., and to generate new human machine interface variants.
  • W4 equals 2 by default, so this factor is weighted higher than the others, as it is considered the most important factor in the given example. In the aforementioned further embodiment, W4 is adapted by reinforcement learning, e.g., for specific regions or population groups.
  • A5 relates to speech interactions.
  • A5 equals 1 minus a function relating to the number of functions that are executed only by voice but are also represented visually, with the function having a range of 0 to 1.
  • This factor A5 can be used to reduce visual elements that correspond to the functions that are executed only by voice by the user.
  • W5 equals 1 by default. In the aforementioned further embodiment, W5 is adapted by reinforcement learning, e.g., for specific regions or population groups.
  • The index is linked to the dedicated profile of a user, and only analyzed on a group level within a cloud application, e.g., to generate specific weights (W1 - W5) for different regions or population groups.
  • The sum of all weights can equal a fixed number; in the given example described above, the sum of all weights is 6. If the weights are adapted according to the method described above as a further embodiment, they are adapted interdependently in such a way that the sum of all weights remains the same (see the sketch below).
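  • Putting the factors A1 to A5 and the default weights together, the clutter index can be computed as the rounded weighted sum described above. The sketch below assumes all factors are already normalised to the range 0 to 1; the function names, the example factor values, and the renormalisation step are illustrative, not prescribed by the patent.

```python
DEFAULT_WEIGHTS = {"W1": 1.0, "W2": 1.0, "W3": 1.0, "W4": 2.0, "W5": 1.0}  # weights sum to 6


def clutter_index(factors, weights=DEFAULT_WEIGHTS):
    """C = round(W1*A1 + W2*A2 + W3*A3 + W4*A4 + W5*A5), each factor in [0, 1].
    With the weights summing to 6, the index stays within the 0-6 range."""
    return round(sum(weights[f"W{i}"] * factors[f"A{i}"] for i in range(1, 6)))


def renormalise(weights, total=6.0):
    """When individual weights are adapted (e.g. per region or population group),
    rescale them so that the sum of all weights remains the same, as described above."""
    current_sum = sum(weights.values())
    return {name: value * total / current_sum for name, value in weights.items()}


# Example: A3 = 0.5 corresponds to 10 of 20 highly relevant components, and A5 close
# to 1 means few functions are executed only by voice (values are illustrative).
factors = {"A1": 0.4, "A2": 0.6, "A3": 0.5, "A4": 0.8, "A5": 0.9}
print(clutter_index(factors))  # round(0.4 + 0.6 + 0.5 + 1.6 + 0.9) = 4
```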
  • The selection area 40 displayed on the central display 8 comprises a slider 60 that can be moved on a track 62 with multiple stop positions 64 for a number of different configurations that are sorted from left to right with increasing clutter index.
  • the driver 14 can slide the slider 60 to any of the stop positions 64 while the associated cockpit display configuration 42 - 46 is shown above the selection area 40.
  • the configuration 42 shown in Fig. 2a has a very low clutter index as all the attention is drawn towards a display of a speed of the car 2 in the central zone 52 with comparatively small lines in the left zone 50, the right zone 54 and the lower zone 56.
  • The configuration 44 shown in Fig. 2b has a higher, but still relatively low clutter index, as the attention is still drawn towards a display of a speed of the car 2 in the central zone 52 with comparatively small lines in the left zone 50, the right zone 54, and the lower zone 56. But due to additional information that is displayed in the central zone 52 and in the lower zone 56, the information density is increased compared to the configuration 42.
  • the configuration 46 shown in Fig. 2c has a relatively high clutter index and is more suitable for experienced drivers.
  • the display of the car speed is moved to the left zone 50, whereas a charge capacity is visible in the right zone 54.
  • The central zone 52 is occupied with a menu with multiple list lines. In addition to that, a number of additional pieces of information in small font or icon size are shown in all the zones 50, 52, 54, 56.
  • After choosing, the driver 14 confirms the selection and the respective layout will be used for the cockpit display 6.
  • Fig. 3 shows a method of selecting the display configuration.
  • the first information source is usage statistics derived from the driver's interactions with the human machine interface 4.
  • the second information source is attention data derived from the driver monitoring system 26.
  • The third information source is data derived from a speech recognition system utilizing the microphone 10.
  • the fourth information source is the configuration of the driver's mobile phone 34.
  • the machine learning algorithm processes the different information sources and determines a tolerable clutter index for the individual driver 14.
  • the machine learning algorithm selects and/or modifies existing configurations to meet a tolerable clutter index range for the driver 14 as well as a selection of information to be displayed that is relevant for the driver 14.
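  • Read together with Fig. 3, the overall selection flow could look roughly like the sketch below. The function names, the equal weighting of the four information sources, and the per-configuration clutter values are placeholders introduced for illustration; they are not defined by the patent, and driver confirmation is omitted.

```python
def estimate_tolerable_clutter(usage_variance, attention_score, speech_share, phone_density):
    """Toy combination of the four information sources of Fig. 3 into a single tolerable
    clutter index on a 0-6 scale. All inputs are assumed to be normalised to [0, 1];
    the equal weighting is an illustrative choice, not taken from the patent."""
    return 6 * (usage_variance + attention_score + (1 - speech_share) + phone_density) / 4


def adapt_display(usage_variance, attention_score, speech_share, phone_density, configs):
    """High-level flow of Fig. 3: derive a tolerable clutter index from the four
    information sources, then propose the configuration whose preferred clutter
    index is closest to it."""
    tolerable = estimate_tolerable_clutter(usage_variance, attention_score,
                                           speech_share, phone_density)
    name, _ = min(configs.items(), key=lambda kv: abs(kv[1] - tolerable))
    print(f"tolerable clutter index = {tolerable:.2f}, proposing layout '{name}'")


adapt_display(usage_variance=0.75, attention_score=0.5, speech_share=0.25,
              phone_density=0.5, configs={"minimal": 1.0, "standard": 3.0, "expert": 5.0})
# tolerable clutter index = 3.75, proposing layout 'standard'
```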
  • Components and systems described above can be stand-alone or used by other systems of the car.
  • Sensor data such as camera data, for example, can be provided to different systems and utilized for different purposes.
  • Systems can be implemented as functions in control units with more functionalities, e.g., a function of a driving assistance system with multiple components such as lane keeping and adaptive cruise control.

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a computer-implemented method of adapting a graphical user interface of a human machine interface of a vehicle, wherein the graphical user interface is controlled by a controller of the human machine interface, wherein the layout of information presented on the graphical user interface is determined by a configuration, wherein the controller accesses at least two different configurations, wherein each configuration is associated with at least one clutter index or clutter index range, wherein the controller associates at least one of a user profile, a region, usage information, and/or user alertness with a clutter index or clutter index range and selects a configuration with a compatible clutter index or clutter index range.
PCT/EP2021/069723 2021-07-15 2021-07-15 Procédé mis en œuvre par ordinateur d'adaptation d'une interface utilisateur graphique d'une interface homme-machine d'un véhicule, produit-programme informatique, interface homme-machine et véhicule WO2023284961A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202180100624.9A CN117651655A (zh) 2021-07-15 2021-07-15 适配车辆的人机界面的图形用户界面的计算机实施方法、计算机程序产品、人机界面和车辆
EP21748816.2A EP4370362A1 (fr) 2021-07-15 2021-07-15 Procédé mis en oeuvre par ordinateur d'adaptation d'une interface utilisateur graphique d'une interface homme-machine d'un véhicule, produit-programme informatique, interface homme-machine et véhicule
PCT/EP2021/069723 WO2023284961A1 (fr) 2021-07-15 2021-07-15 Procédé mis en œuvre par ordinateur d'adaptation d'une interface utilisateur graphique d'une interface homme-machine d'un véhicule, produit-programme informatique, interface homme-machine et véhicule
TW111126531A TWI822186B (zh) 2021-07-15 2022-07-14 調適車輛的人機介面的圖形化使用者介面的電腦實現方法、電腦程式產品、人機介面及車輛

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2021/069723 WO2023284961A1 (fr) 2021-07-15 2021-07-15 Procédé mis en œuvre par ordinateur d'adaptation d'une interface utilisateur graphique d'une interface homme-machine d'un véhicule, produit-programme informatique, interface homme-machine et véhicule

Publications (1)

Publication Number Publication Date
WO2023284961A1 true WO2023284961A1 (fr) 2023-01-19

Family

ID=77155750

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/069723 WO2023284961A1 (fr) 2021-07-15 2021-07-15 Procédé mis en œuvre par ordinateur d'adaptation d'une interface utilisateur graphique d'une interface homme-machine d'un véhicule, produit-programme informatique, interface homme-machine et véhicule

Country Status (4)

Country Link
EP (1) EP4370362A1 (fr)
CN (1) CN117651655A (fr)
TW (1) TWI822186B (fr)
WO (1) WO2023284961A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013074897A1 (fr) * 2011-11-16 2013-05-23 Flextronics Ap, Llc Console de véhicule configurable
US20160104486A1 (en) * 2011-04-22 2016-04-14 Angel A. Penilla Methods and Systems for Communicating Content to Connected Vehicle Users Based Detected Tone/Mood in Voice Input
EP3240715B1 (fr) 2014-12-30 2018-12-19 Robert Bosch GmbH Interface utilisateur adaptative pour véhicule autonome
US20180365025A1 (en) * 2017-06-16 2018-12-20 General Electric Company Systems and methods for adaptive user interfaces
EP3461672A1 (fr) * 2017-09-27 2019-04-03 Honda Motor Co., Ltd. Appareil d'affichage, appareil de commande d'affichage et véhicule
DE102019217346A1 (de) * 2019-11-11 2021-05-12 Psa Automobiles Sa Verfahren zur Darstellung von Informationen auf einer Mensch-Maschine-Schnittstelle eines Kraftfahrzeugs, Computerprogrammprodukt, Mensch-Maschine-Schnittstelle sowie Kraftfahrzeug

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10969236B2 (en) * 2018-12-13 2021-04-06 Gm Global Technology Operations, Llc Vehicle route control based on user-provided trip constraints


Also Published As

Publication number Publication date
CN117651655A (zh) 2024-03-05
TW202319261A (zh) 2023-05-16
TWI822186B (zh) 2023-11-11
EP4370362A1 (fr) 2024-05-22

Similar Documents

Publication Publication Date Title
AU2020257137B2 (en) Post-drive summary with tutorial
US11449294B2 (en) Display system in a vehicle
JP6883766B2 (ja) 運転支援方法およびそれを利用した運転支援装置、運転制御装置、車両、運転支援プログラム
US20130038437A1 (en) System for task and notification handling in a connected car
US9524514B2 (en) Method and system for selecting driver preferences
KR102479540B1 (ko) 주행 컨텍스트에 기반한 디스플레이 제어 방법 및 장치
US10053113B2 (en) Dynamic output notification management for vehicle occupant
WO2022111067A1 (fr) Procédé et appareil de réglage de paramètre d'affichage tête haute, affichage tête haute et véhicule
US20170168689A1 (en) Systems and methods for providing vehicle-related information in accord with a pre-selected information-sharing mode
CN113811851A (zh) 用户界面耦合
Riegler et al. Content presentation on 3D augmented reality windshield displays in the context of automated driving
WO2023284961A1 (fr) Procédé mis en œuvre par ordinateur d'adaptation d'une interface utilisateur graphique d'une interface homme-machine d'un véhicule, produit-programme informatique, interface homme-machine et véhicule
CN116155988A (zh) 车载信息推送方法、装置、设备及存储介质
Nakrani Smart car technologies: a comprehensive study of the state of the art with analysis and trends
CN110015309B (zh) 车辆驾驶辅助***和方法
CN118163609A (zh) 一种导航菜单显示方法、***及车辆
CN107054224B (zh) 汽车影像控制方法及其控制***
EP2634688B1 (fr) Procédé et dispositif pour la commande simple de services de communication dans un véhicule par l'utilisation de gestes tactiles sur des écrans sensibles au contact
Burnett et al. Investigating design issues for the use of touchpad technology within vehicles.
CN116700558A (zh) 交互方法、装置、显示界面、终端和车辆
NZ760269A (en) Post-drive summary with tutorial
JP2019123357A (ja) 車両制御支援装置および車両
NZ760269B2 (en) Post-drive summary with tutorial
NZ721392B2 (en) Post-drive summary with tutorial

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21748816

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180100624.9

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2021748816

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021748816

Country of ref document: EP

Effective date: 20240215