US20210311308A1 - Method and system for operating at least two display devices carried by respective vehicle occupants on the head

Publication number
US20210311308A1
Authority
US
United States
Prior art keywords
vehicle occupants
vehicle
virtual environments
respective virtual
displaying
Prior art date
Legal status
Granted
Application number
US17/267,083
Other versions
US11409104B2
Inventor
Marcus Kühne
Nils Wollny
Current Assignee
Audi AG
Original Assignee
Audi AG
Priority date: 2018-08-10 (DE 10 2018 213 556.8)
Filing date: 2019-05-21 (PCT/EP2019/063014)
Application filed by Audi AG filed Critical Audi AG
Assigned to Audi AG. Assignors: KÜHNE, Marcus; WOLLNY, Nils
Publication of US20210311308A1 publication Critical patent/US20210311308A1/en
Application granted granted Critical
Publication of US11409104B2 publication Critical patent/US11409104B2/en

Classifications

    • G02B27/017 Head-up displays; head-mounted
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • H04N13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • B60K2370/111 Graphical user interfaces or menu aspects for controlling multiple devices
    • B60K2370/177 Augmented reality
    • B60K2370/182 Distributing information between displays
    • B60K2370/1868 Displaying information according to relevancy according to driving situations
    • B60K2370/563 Remote controls using mobile devices; vehicle displaying mobile device information
    • Further codes: B60K2360/111, B60K2360/177, B60K2360/182, B60K2360/1868, B60K2360/563, B60K35/10, B60K35/28, B60K35/29, B60K35/80

Definitions

  • A further advantageous embodiment provides that respective head movements of the vehicle occupants are detected and converted into corresponding head movements of the avatars.
  • The vehicle occupants can thus, by way of corresponding head movements, cause the respective avatars in the virtual environments to pivot their heads to the left and right, precisely like the vehicle occupants do in reality. It is thus possible, for example, for the vehicle occupants to mutually look at one another and also turn away from one another within the virtual environment by way of their respective avatars.
  • In a further embodiment, inputs of a further person on a user interface are detected and the virtual environments are adapted in accordance with the detected inputs. It is thus possible, for example, for persons on the outside, without virtual reality glasses or augmented reality glasses, to influence the virtual experience of the other vehicle occupants.
  • The outsider can thus, for example, change the virtual experience for the vehicle occupants and, for example, adapt an upcoming route section recognizable to him or her, as a result of which a virtual route section is also adapted accordingly.
  • An outsider can thus externally influence the respective virtual experiences of the vehicle occupants who wear the display devices on the head.
  • The outsider is thus also incorporated in a certain form into the virtual experience, and a type of socially shared experience results.
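The conversion of detected head movements into corresponding avatar head movements can be sketched in code. The following is a purely illustrative sketch, not part of the patent disclosure; the function name, the yaw-angle representation, and the smoothing factor are assumptions:

```python
def mirror_head_movement(avatar_yaw_deg: float, detected_yaw_deg: float,
                         smoothing: float = 0.5) -> float:
    """Move the avatar's head toward the occupant's detected head yaw.

    Simple exponential smoothing suppresses sensor jitter while still
    converging to the real head orientation over repeated updates.
    """
    return avatar_yaw_deg + smoothing * (detected_yaw_deg - avatar_yaw_deg)

# The occupant turns the head from 0 to 40 degrees; over repeated update
# cycles the avatar's head converges to the same orientation.
yaw = 0.0
for _ in range(10):
    yaw = mirror_head_movement(yaw, 40.0)
```

With a smoothing factor of 0.5, ten update cycles bring the avatar's head to within a tenth of a degree of the occupant's real head orientation.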
  • A vehicle-side user interface, in particular a touchscreen or the like, can be used as the user interface.
  • Alternatively, a mobile terminal, in particular a smartphone, a tablet computer, or a smartwatch, can be used as the user interface.
  • Smart devices of the vehicle occupants can thus be used, or also an interaction element in one of the vehicles, for example an infotainment touchscreen.
  • The system for operating at least two display devices worn on the head by respective vehicle occupants is configured to carry out the method described above or an advantageous embodiment of the method.
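How an input by the further person adapts the shared virtual environments can be illustrated with a minimal sketch. The event format, the dictionary-based environment state, and all names are assumptions for illustration only; the patent does not prescribe any particular data model:

```python
def apply_outsider_input(environment: dict, event: dict) -> dict:
    """Adapt the shared virtual environment according to an input made by
    a further person (without a head-worn display) on a user interface
    such as a vehicle touchscreen or a smartphone."""
    updated = dict(environment)
    if event.get("type") == "set_route_section":
        # The outsider selects an upcoming route section; the corresponding
        # virtual route section is adapted for all wearers accordingly.
        updated["route_section"] = event["value"]
    return updated

# The outsider swaps the upcoming virtual route section via the touchscreen.
env = {"scene": "car_race", "route_section": "highway"}
env = apply_outsider_input(env, {"type": "set_route_section",
                                 "value": "mountain pass"})
```

The adapted environment state would then be transmitted to both display devices so that the change appears synchronously for all wearers.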
  • FIG. 1 is a schematic illustration of a motor vehicle in which two vehicle occupants each wearing virtual reality glasses are seated, wherein moreover a system is provided which enables a synchronization of the content of the virtual environment displayed by the virtual reality glasses;
  • FIG. 2 is a schematic illustration of two motor vehicles, wherein a vehicle occupant wearing virtual reality glasses is seated in each case in the respective motor vehicle and the system is provided, for example, in one of the motor vehicles.
  • A motor vehicle 1 is shown in a very schematic illustration in FIG. 1.
  • Two vehicle occupants 2, 3, who have put on respective virtual reality glasses 4, 5, are seated in the motor vehicle 1.
  • A system 6, which is used for operating the two pairs of virtual reality glasses 4, 5, is provided in the motor vehicle 1.
  • The system 6 can also be formed solely by the two pairs of virtual reality glasses 4, 5, which in this case may communicate wirelessly with one another and can thus exchange data with one another.
  • Respective virtual environments are displayed by the virtual reality glasses 4, 5.
  • A relative arrangement of the vehicle occupants 2, 3 with respect to one another is ascertained continuously, wherein data in this regard are transmitted to the system 6 or directly to the respective virtual reality glasses 4, 5.
  • The virtual environments displayed by the virtual reality glasses 4, 5 are displayed as a function of the detected relative arrangement of the vehicle occupants 2, 3.
  • Respective head movements of the vehicle occupants 2, 3 can be continuously detected, for example by the virtual reality glasses 4, 5, and transmitted to the respective other virtual reality glasses 4, 5 or to the system 6.
  • The respective detected head movements determine in this case a respective virtual perspective of the vehicle occupants 2, 3 on the respective virtual environments which are displayed by the virtual reality glasses 4, 5.
  • Respective avatars of the respective other vehicle occupants 2, 3 can also be displayed by the virtual reality glasses 4, 5 in accordance with the relative arrangement of the vehicle occupants 2, 3 with respect to one another within the virtual environment.
  • The two vehicle occupants 2, 3 can thus mutually see one another in the respective displayed virtual environments in the form of avatars and can interact with one another, for example.
  • Respective head movements of the vehicle occupants 2, 3 are detected and converted into corresponding head movements of the avatars. It is thus possible, for example, that the two vehicle occupants 2, 3 look at one another or also turn away from one another in the form of the respective avatars in the respective displayed virtual environments.
  • Inputs of a further vehicle occupant 7, who has not put on virtual reality glasses, are also detected, wherein the virtual environments which are displayed by the virtual reality glasses 4, 5 are adapted in accordance with the detected inputs of the further person 7.
  • The vehicle occupant 7 can make his inputs relating to the virtual environments at a user interface 8.
  • The user interface 8 can be, for example, a touchscreen in the motor vehicle 1.
  • Alternatively, the user interface 8 can be a smart device, for example in the form of a smartwatch, a smartphone, a tablet computer, or the like.
  • The further vehicle occupant 7, without himself wearing virtual reality glasses, can thus socially interact with the vehicle occupants 2, 3, who have put on the virtual reality glasses 4, 5, namely in that he performs certain inputs which influence the respective virtual experience of the vehicle occupants 2, 3.
  • The further person 7 can also be located outside the motor vehicle 1 in order to make these inputs, for example via his smartphone.
  • Two motor vehicles 1, 9 are shown in FIG. 2, wherein one vehicle occupant 2, 3, who has put on virtual reality glasses 4, 5, is seated in each of the motor vehicles 1, 9.
  • The vehicle occupants 2, 3 are thus seated in different motor vehicles 1, 9 but nonetheless jointly share a virtual experience, in that respective virtual environments are again displayed by the virtual reality glasses 4, 5, wherein data characterizing a relative arrangement of the vehicle occupants 2, 3 with respect to one another are continuously transmitted between the virtual reality glasses 4, 5 and the virtual environments are displayed as a function of these data.
  • A relative movement of the motor vehicles 1, 9 with respect to one another can be detected and transmitted as part of the data to the respective virtual reality glasses 4, 5.
  • The detected relative movement of the motor vehicles 1, 9 with respect to one another then determines a respective virtual perspective of the vehicle occupants 2, 3 on the respective virtual environments.
  • It is possible, for example, that the vehicle occupants 2, 3 seated in the different motor vehicles 1, 9 jointly play a virtual car race or the like. If the motor vehicles 1, 9 pass one another in reality, for example because they are controlled fully autonomously, it can be provided that the vehicle occupants 2, 3 see respective avatars of the respective other vehicle occupant 2, 3 in the displayed virtual environment, arranged in corresponding virtual motor vehicles. It is also conceivable that the vehicle occupants 2, 3 start a car race jointly from a virtual starting point, independently of the precise positioning of the motor vehicles 1, 9, wherein the relative movement of the motor vehicles 1, 9 with respect to one another is then also taken into consideration.
  • A further vehicle occupant 7 can influence the respective virtual environments, which are displayed by the virtual reality glasses 4, 5, via a user interface 8, as described in conjunction with FIG. 1.

Abstract

Data characterizing a relative arrangement of vehicle occupants with respect to one another are continuously transmitted to display devices worn on the heads of the vehicle occupants. Virtual environments are displayed as a function of these data.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a U.S. national stage of International Application No. PCT/EP2019/063014, filed on May 21, 2019. The International Application claims the priority benefit of German Application No. 10 2018 213 556.8 filed on Aug. 10, 2018. Both the International Application and the German Application are incorporated by reference herein in their entirety.
  • BACKGROUND
  • Described below are a method and a system for operating at least two display devices worn on the head by respective vehicle occupants.
  • In the future, route-dependent virtual reality experiences, using which vehicle occupants can experience different virtual events during travel while wearing augmented reality glasses or virtual reality glasses, will very probably become established in vehicles, at the latest with the increased use of autonomously driving vehicles.
  • The use of augmented reality glasses and also virtual reality glasses as such is already known from the related art. Thus, for example DE 10 2015 014 450 A1 discloses virtual reality glasses and a method for operating virtual reality glasses. The virtual reality glasses include a first display device for displaying a virtual environment for a wearer of the virtual reality glasses, wherein a second display device is attached to an outside of the virtual reality glasses in order to display the same virtual environment for a person standing on the outside as the wearer of the virtual reality glasses has displayed.
  • DE 10 2014 015 871 A1 discloses a display system for a motor vehicle and a method for operating such a display system. The display system may include augmented reality glasses which are worn by a driver of the motor vehicle, and virtual reality glasses which are worn by a front passenger of the motor vehicle.
  • DE 10 2012 017 700 A1 discloses a system and a method for simulating an operation of a nonmedical tool. Data glasses are used here to display a virtual environment for a wearer of the data glasses, wherein a display device is used to display an image of the wearer within the virtual environment.
  • SUMMARY
  • Described below is a method by which at least two display devices worn on the head by respective vehicle occupants can be operated particularly well adapted to one another. According to this method for operating at least two display devices worn on the head by respective vehicle occupants, respective virtual environments are displayed by the display devices. During this, data characterizing a relative arrangement of the vehicle occupants with respect to one another are continuously transmitted to the display devices and the virtual environments are displayed as a function of these data.
  • The display devices worn on the head can be, for example, augmented reality glasses, augmented reality contact lenses, or also virtual reality glasses. The method enables synchronization of the content of in-car virtual reality experiences. Since many journeys with motor vehicles will very probably also take place in community in the future, in particular in the case of autonomously driving vehicles, a supply of mechanisms for providing shared experiences is very useful. Against this background, the method enables a shared synchronous experience of virtual environments in the motor vehicle.
  • It is thus possible, for example, that the at least two vehicle occupants who wear the display device on the head jointly experience something cooperative or also competitive in the form of the respective displayed virtual environment. For this purpose, a permanent information exchange takes place between the employed display devices worn on the head, in that data characterizing a relative arrangement of the vehicle occupants with respect to one another are continuously transmitted to the respective display devices and the virtual environments are displayed as a function of these data.
  • The display devices themselves can include, for example, sensors or other detecting devices, by which an alignment and positioning of the respective display devices can be ascertained. These data can be transmitted to the respective other display device in order to continuously ascertain the respective relative arrangement of the vehicle occupants in relation to one another and to display the virtual environment as a function thereof. It is also possible, for example, that sensors installed in the respective vehicles are used to ascertain a relative positioning of the vehicle occupants with respect to one another, so that data in this regard can be continuously transmitted to the relevant display devices. It is thus possible that at least two vehicle occupants wear one of the display devices on the head, by which respective virtual environments are displayed, wherein the displayed virtual environments are synchronized with one another with respect to content, above all with regard to the relative arrangement of the vehicle occupants with respect to one another. It is fundamentally unimportant here whether the vehicle occupants are seated in the same vehicle or in different vehicles.
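One way the continuous exchange of arrangement data could look in practice is sketched below. All names and the simple JSON message format are assumptions chosen for illustration; the patent does not prescribe any particular encoding or transport:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class PoseMessage:
    """Data characterizing one occupant's arrangement, broadcast continuously."""
    occupant_id: str
    position: tuple       # occupant position in a shared reference frame (x, y, z), meters
    head_yaw_deg: float   # head orientation, e.g. from sensors in the display device

def encode(msg: PoseMessage) -> bytes:
    """Serialize a pose message for transmission to the other display device."""
    return json.dumps(asdict(msg)).encode()

def decode(raw: bytes) -> PoseMessage:
    d = json.loads(raw)
    return PoseMessage(d["occupant_id"], tuple(d["position"]), d["head_yaw_deg"])

def relative_offset(own: PoseMessage, other: PoseMessage) -> tuple:
    """Relative arrangement of the other occupant with respect to this one."""
    return tuple(b - a for a, b in zip(own.position, other.position))

# Each display device broadcasts its own pose and renders the shared virtual
# environment as a function of the received relative arrangement.
own = PoseMessage("occupant_2", (0.0, 0.0, 0.0), 10.0)
other = decode(encode(PoseMessage("occupant_3", (0.6, 0.0, 0.0), -5.0)))
offset = relative_offset(own, other)   # occupant 3 sits 0.6 m to the right
```

Whether such messages travel directly between the two display devices or via a vehicle-side system corresponds to the two variants described above.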
  • One advantageous embodiment provides that respective head movements of the vehicle occupants are continuously detected by respective detection devices and are transmitted as part of the data to the display devices, wherein the detected head movements determine a respective virtual perspective of the vehicle occupants on the respective virtual environments. Thus, if the vehicle occupants each rotate their heads, the respective virtual perspective of the relevant vehicle occupants on the respective displayed virtual environments therefore changes. The vehicle occupants can thus influence the perspective from which they wish to view the respective virtual environments in a simple manner.
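The relationship between a detected head rotation and the resulting virtual perspective can be illustrated with a minimal sketch. The yaw/pitch convention and the coordinate axes are assumptions, not taken from the disclosure:

```python
import math

def view_direction(yaw_deg: float, pitch_deg: float) -> tuple:
    """Unit view vector for a detected head pose.

    Yaw is rotation about the vertical axis, pitch about the lateral axis;
    the resulting vector determines the occupant's virtual perspective on
    the displayed environment.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

ahead = view_direction(0.0, 0.0)    # looking straight ahead: (0, 0, 1)
right = view_direction(90.0, 0.0)   # head rotated 90 degrees to the right
```

As the occupant rotates the head, the view vector (and with it the rendered perspective) rotates accordingly.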
  • A further advantageous embodiment provides that if the vehicle occupants are located in different vehicles, a relative movement of the vehicles with respect to one another is detected and transmitted as part of the data to the display devices, wherein the detected relative movement determines a respective virtual perspective of the vehicle occupants on the respective virtual environments. The vehicle occupants seated in different vehicles can thus, for example, experience a virtual car race game or a virtual space battle or the like jointly as a virtual experience. Depending on how the different vehicles move with respect to one another, the respective virtual perspective of the vehicle occupants on the respective virtual environments changes. The vehicle occupants seated in the different vehicles can thus, for example, play a particularly realistic-looking car race game and, if this also takes place with the vehicles in reality, mutually overtake one another, for example with their virtual vehicles or virtual space planes or the like, with the overtaking maneuver also appearing within the virtual environment. Alternatively, it is also possible that the vehicle occupants are located in the same vehicle, in which case only the relative movements of the vehicle occupants with respect to one another within the same vehicle are detected and taken into consideration in the display of the respective virtual environment.
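How the real vehicles' relative movement could drive the positions of the virtual vehicles might be sketched as follows. Representing each vehicle by a single longitudinal coordinate along the route is a simplifying assumption for illustration:

```python
def virtual_gap_m(own_route_pos_m: float, other_route_pos_m: float) -> float:
    """Longitudinal gap between the two real vehicles along the route,
    reused directly as the gap between the corresponding virtual vehicles."""
    return other_route_pos_m - own_route_pos_m

def opponent_status(gap_m: float) -> str:
    """Where the other occupant's virtual vehicle appears in the race."""
    if gap_m > 0.0:
        return "ahead"
    if gap_m < 0.0:
        return "behind"
    return "side by side"

# As the other vehicle overtakes in reality, the sign of the gap flips and
# the virtual opponent moves from behind to ahead in the displayed race.
before = opponent_status(virtual_gap_m(120.0, 100.0))
after = opponent_status(virtual_gap_m(120.0, 150.0))
```

A real overtaking maneuver thus translates directly into an overtaking maneuver between the virtual vehicles, as described above.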
  • One alternative advantageous embodiment provides that, if the vehicle occupants are located in different vehicles, respective relative movements of the vehicle occupants with respect to the respective vehicle interiors are detected and transmitted as part of the data to the display devices, wherein the detected relative movements determine a respective virtual perspective of the vehicle occupants on the respective virtual environments, and a relative movement of the vehicles with respect to one another remains unconsidered. It may be, for example, that the two vehicle occupants seated in different vehicles share a virtual experience for which the relative movement of the different vehicles with respect to one another would not be relevant at all, and whose inclusion would even be distracting. In this case, it is particularly advantageous if only the respective relative movements of the vehicle occupants with respect to the interiors of the vehicles in which they are presently seated are detected, so that these detected relative movements exclusively determine the respective virtual perspectives of the vehicle occupants on the respective virtual environments. For example, if one of the vehicle occupants leans to the right in the relevant vehicle, he also leans to the right within the virtual environment. How the vehicle in which this vehicle occupant is seated is presently moving in relation to the other vehicle is unimportant here.
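The interior-relative variant can be illustrated with a one-line sketch (names and the coordinate tuples are illustrative assumptions): subtracting the vehicle's own world pose from the occupant's world pose leaves only the cabin-relative movement, so any motion of the vehicle, including motion relative to the other vehicle, cancels out.

```python
def cabin_relative_offset(occupant_world: tuple, vehicle_world: tuple) -> tuple:
    """Return the occupant's head offset relative to the vehicle
    interior. Only this offset drives the virtual perspective; the
    vehicle's own world motion drops out of the difference."""
    return tuple(o - v for o, v in zip(occupant_world, vehicle_world))
```

A 0.2 m lean to the right yields the same offset whether the vehicle is parked or travelling at speed.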
  • In a further advantageous embodiment, it is provided that respective avatars of the respective other vehicle occupants are displayed as part of the virtual environments by the display devices in accordance with the relative arrangement of the vehicle occupants with respect to one another within the virtual environments. The vehicle occupants can thus see respective virtual representations of the respective other vehicle occupant in the form of the avatars within the respective virtual environment. If both vehicle occupants are seated in the same vehicle next to one another, for example, a respective avatar of the relevant vehicle occupant is thus located in the respective displayed virtual environment on the left or right, respectively, of a virtual position of the respective other vehicle occupant within the respective displayed virtual environment. The vehicle occupants can thus very easily experience something cooperative or competitive with one another within the virtual environment, for example.
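The avatar placement described above can be sketched as follows (a hypothetical 2-D seat map; the dictionary layout and names are assumptions of this sketch): each avatar is positioned at the other occupant's seat expressed relative to the viewer's own seat, so a neighbour on the real right-hand seat also appears to the right in the virtual environment.

```python
def avatar_placement(seats: dict, viewer: str) -> dict:
    """Place an avatar for every other occupant at that occupant's
    seat position relative to the viewer's seat (x = forward,
    y = left in metres)."""
    vx, vy = seats[viewer]
    return {
        name: (x - vx, y - vy)
        for name, (x, y) in seats.items()
        if name != viewer
    }
```

The same seat map, evaluated once per display device, yields mutually consistent arrangements: each occupant sees the other on the correct side.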
  • A further advantageous embodiment provides that respective head movements of the vehicle occupants are detected and converted into corresponding head movements of the avatars. The vehicle occupants can thus, by way of corresponding head movements, cause the respective avatars in the virtual environments to pivot their head to the left and right, for example precisely like the vehicle occupants in reality. It is thus possible, for example, that the vehicle occupants can mutually look at one another and also turn away from one another within the virtual environment by way of their respective avatars.
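Converting detected head movements into avatar head movements could look like the following sketch (the clamp limit, sign convention, and class names are assumptions, not part of the disclosure): the detected yaw is mirrored onto the avatar's head joint, which also makes "looking at one another" a simple test on the two yaw angles.

```python
class AvatarHead:
    """Head joint of an occupant's avatar in the shared virtual scene.
    Yaw convention: positive = turned to the occupant's right."""

    def __init__(self, limit_deg: float = 80.0):
        self.limit_deg = limit_deg
        self.yaw_deg = 0.0  # 0 = facing forward

    def apply_detected_yaw(self, yaw_deg: float) -> None:
        # Mirror the occupant's detected head rotation onto the avatar,
        # clamped to a plausible range for the avatar's neck joint.
        self.yaw_deg = max(-self.limit_deg, min(self.limit_deg, yaw_deg))


def facing_each_other(left_seat: AvatarHead, right_seat: AvatarHead,
                      threshold_deg: float = 45.0) -> bool:
    """True if the left-seat avatar looks right and the right-seat
    avatar looks left, i.e. the occupants look at one another."""
    return (left_seat.yaw_deg > threshold_deg
            and right_seat.yaw_deg < -threshold_deg)
```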
  • In a further advantageous embodiment, it is provided that inputs of a further person at a user interface are detected and the virtual environments are adapted in accordance with the detected inputs. It is thus possible, for example, for a person not wearing virtual reality glasses or augmented reality glasses to influence the virtual experience of the other vehicle occupants. This outsider can, for example, change the virtual experience for the vehicle occupants by adapting an upcoming route section recognizable to him, as a result of which a corresponding virtual route section is adapted accordingly. Very generally, an outsider can thus externally influence the respective virtual experiences of the vehicle occupants who wear the display devices on the head. The outsider is thereby also incorporated in a certain form into the virtual experience, and a type of socially shared experience results. For example, a vehicle-side user interface, in particular a touchscreen or the like, is used as the user interface. Alternatively, it is also possible that a mobile terminal, in particular a smart phone, a tablet computer, or a smart watch, is used as the user interface. Thus, smart devices of vehicle occupants can be used, or also an interaction element in one of the vehicles, for example an infotainment touchscreen.
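A minimal sketch of how such outside inputs could propagate to all display devices (an observer pattern chosen for illustration; the patent does not prescribe any mechanism, and all names here are assumptions): the user interface writes a setting, and every registered display device is notified so the virtual environments adapt consistently.

```python
class SharedEnvironment:
    """Settings shared by all display devices; a person without glasses
    changes them via a touchscreen or smart-device user interface."""

    def __init__(self):
        self.settings = {}
        self._listeners = []

    def register_display(self, callback) -> None:
        # Each display device registers a callback to be told of changes.
        self._listeners.append(callback)

    def apply_input(self, key: str, value) -> None:
        # Called by the user interface; fan the change out to all glasses.
        self.settings[key] = value
        for notify in self._listeners:
            notify(key, value)
```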
  • The system for operating at least two display devices worn on the head by respective vehicle occupants is configured to carry out the method according to the invention or an advantageous embodiment of the method.
  • Further advantages, features, and details result from the following description of exemplary embodiments and on the basis of the drawing. The features and feature combinations mentioned above in the description and the features and feature combinations mentioned hereinafter in the description of the figures and/or shown solely in the figures are usable not only in the respective specified combination, but also in other combinations or alone, without leaving the scope of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and other aspects and advantages will become more apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings of which:
  • FIG. 1 is a schematic illustration of a motor vehicle in which two vehicle occupants each wearing virtual reality glasses are seated, wherein moreover a system is provided which enables a synchronization of the content of the virtual environment displayed by the virtual reality glasses; and
  • FIG. 2 is a schematic illustration of two motor vehicles, wherein a vehicle occupant wearing virtual reality glasses is seated in each case in the respective motor vehicle and the system is provided, for example, in one of the motor vehicles.
  • DETAILED DESCRIPTION
  • Reference will now be made in detail to the exemplary embodiments which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
  • In the figures, identical or functionally identical elements have been provided with the same reference signs.
  • A motor vehicle 1 is shown in a very schematic illustration in FIG. 1. Two vehicle occupants 2, 3, who have put on respective virtual reality glasses 4, 5, are seated in the motor vehicle 1. In conjunction with this figure and the following figure, reference is always made to virtual reality glasses; the explanation hereinafter can, however, also apply, for example, to augmented reality glasses, augmented reality contact lenses, and the like. Moreover, a system 6 is provided in the motor vehicle 1, which is used for operating the two virtual reality glasses 4, 5. Contrary to the present illustration, the system 6 can also be formed solely by the two pairs of virtual reality glasses 4, 5, which in this case may communicate wirelessly with one another and can thus exchange data with one another.
  • Respective virtual environments are displayed by the virtual reality glasses 4, 5. During this, a relative arrangement of the vehicle occupants 2, 3 with respect to one another is ascertained continuously, wherein data in this regard are transmitted to the system 6 or also transmitted directly to the respective virtual reality glasses 4, 5. The virtual environments displayed by the virtual reality glasses 4, 5 are displayed as a function of the detected relative arrangement of the vehicle occupants 2, 3.
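The continuous transmission of data characterizing the relative arrangement could be sketched as a simple serialization step (JSON and the field names are assumptions of this sketch; the patent specifies no wire format): each pair of glasses encodes its occupant's current arrangement per frame, and the receiving glasses or the system 6 decode it before updating the displayed environment.

```python
import json

def encode_arrangement(occupant_id: str, seat_pos: tuple,
                       head_yaw_deg: float) -> str:
    """Serialize one occupant's current arrangement for transmission
    to the other display device (or to a central system)."""
    return json.dumps({
        "id": occupant_id,
        "seat": list(seat_pos),
        "head_yaw": head_yaw_deg,
    })

def decode_arrangement(message: str) -> tuple:
    """Inverse of encode_arrangement, for the receiving side."""
    data = json.loads(message)
    return data["id"], tuple(data["seat"]), data["head_yaw"]
```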
  • Thus, for example respective head movements of the vehicle occupants 2, 3 can be continuously detected, for example by the virtual reality glasses 4, 5, and transmitted to the respective virtual reality glasses 4, 5 or to the system 6. The respective detected head movements determine in this case a respective virtual perspective of the vehicle occupants 2, 3 on the respective virtual environments which are displayed by the virtual reality glasses 4, 5.
  • As a component of the virtual environment, respective avatars of the respective other vehicle occupants 2, 3 can also be displayed by the virtual reality glasses 4, 5 in accordance with the relative arrangement of the vehicle occupants 2, 3 with respect to one another within the virtual environment. The two vehicle occupants 2, 3 can thus mutually see one another in the respective displayed virtual environments in the form of avatars and can interact with one another, for example. In this context, it can also be provided that respective head movements of the vehicle occupants 2, 3 are detected and converted into corresponding head movements of the avatars. It is thus possible, for example, that the two vehicle occupants 2, 3 can look at one another or also turn away from one another in the form of the respective avatars in the respective displayed virtual environments.
  • Moreover, it is possible that inputs from a further vehicle occupant 7, who has not put on virtual reality glasses, are also detected, wherein the virtual environments which are displayed by the virtual reality glasses 4, 5 are adapted in accordance with the detected inputs of the further vehicle occupant 7. The vehicle occupant 7 can enter his inputs relating to the virtual environments at a user interface 8. The user interface 8 can be, for example, a touchscreen in the motor vehicle 1. Alternatively, it is also possible that the user interface 8 is a smart device, for example a smart watch, a smart phone, a tablet computer, or the like. The further vehicle occupant 7, without himself wearing virtual reality glasses, can thus socially interact with the vehicle occupants 2, 3, who have put on the virtual reality glasses 4, 5, namely by making certain inputs which influence the respective virtual experience of the vehicle occupants 2, 3. The further vehicle occupant 7 can also be arranged outside the motor vehicle 1 and make these inputs, for example, via his smart phone.
  • Two motor vehicles 1, 9 are shown in FIG. 2, wherein one vehicle occupant 2, 3, who has put on virtual reality glasses 4, 5, is seated in each of the motor vehicles 1, 9. In contrast to the exemplary embodiment shown in FIG. 1, it is provided here that the vehicle occupants 2, 3 are thus seated in different motor vehicles 1, 9 but also jointly share a virtual experience, in that respective virtual environments are again displayed by the virtual reality glasses 4, 5, wherein data characterizing a relative arrangement of the vehicle occupants 2, 3 with respect to one another are continuously transmitted during this between the virtual reality glasses 4, 5 and the virtual environments are displayed as a function of these data. If, as shown here, the vehicle occupants 2, 3 are located in different motor vehicles 1, 9, a relative movement of the motor vehicles 1, 9 with respect to one another can be detected and transmitted as part of the data to the respective virtual reality glasses 4, 5. The detected relative movement of the motor vehicles 1, 9 with respect to one another determines a respective virtual perspective of the vehicle occupants 2, 3 on the respective virtual environments in this case.
  • It is thus possible, for example, that the vehicle occupants 2, 3 seated in the different motor vehicles 1, 9 jointly play a virtual car race or the like. If the motor vehicles 1, 9 pass one another in reality, for example because they are controlled fully autonomously, it can be provided that the vehicle occupants 2, 3 see in the displayed virtual environment respective avatars of the respective other vehicle occupant 2, 3, arranged in corresponding virtual motor vehicles. It is also conceivable that the vehicle occupants 2, 3 jointly start a car race from a virtual starting point independently of the precise positioning of the motor vehicles 1, 9, wherein the relative movement of the motor vehicles 1, 9 with respect to one another is then likewise taken into consideration.
  • Alternatively, it is also possible that only respective relative movements of the vehicle occupants 2, 3 with respect to the respective vehicle interiors of the motor vehicles 1, 9 are detected and exchanged between the virtual reality glasses 4, 5. The detected relative movements determine a respective perspective of the vehicle occupants 2, 3 on the respective virtual environments which are displayed by the virtual reality glasses 4, 5; a relative movement of the motor vehicles 1, 9 with respect to one another remains unconsidered. The vehicle occupants 2, 3 seated in the different motor vehicles 1, 9 can thus also share a common virtual experience during which the relative movement of the motor vehicles 1, 9 with respect to one another plays no role. Solely decisive is how the vehicle occupants 2, 3 move in relation to the respective vehicle interiors of the motor vehicles 1, 9. For example, if the vehicle occupant 2 leans to the right and the vehicle occupant 3 leans to the left, the two vehicle occupants 2, 3 approach one another within the virtual environment, independently of the relative movement of the motor vehicles 1, 9.
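The leaning example above can be made concrete with a small sketch (the 1-D lateral model and the default seat gap of 0.7 m are assumptions for illustration): only the cabin-relative lateral offsets of the two occupants enter the computation, so their virtual distance shrinks when they lean toward one another, regardless of how the two motor vehicles move.

```python
def virtual_distance(offset_a: float, offset_b: float,
                     seat_gap: float = 0.7) -> float:
    """Lateral distance between two occupants in the shared virtual
    scene, computed only from their cabin-relative offsets (positive =
    to the right). Inter-vehicle motion is deliberately ignored."""
    pos_a = offset_a              # occupant A's virtual seat at 0
    pos_b = seat_gap + offset_b   # occupant B seated seat_gap to the right
    return abs(pos_b - pos_a)
```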
  • Contrary to the present description, it is in turn also possible that a further vehicle occupant 7 can have an influence on the respective virtual environments, which are displayed by the virtual reality glasses 4, 5, via a user interface 8—as described in conjunction with FIG. 1.
  • It becomes clear on the basis of the explained exemplary embodiments how a synchronization of the content of various in-car virtual reality experiences can take place and moreover how an influence of the content can take place by inputs of a person 7 who has not put on virtual reality glasses or augmented reality glasses.
  • A description has been provided with particular reference to exemplary embodiments thereof and examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the claims which may include the phrase “at least one of A, B and C” as an alternative expression that means one or more of A, B and C may be used, contrary to the holding in Superguide v. DIRECTV, 358 F3d 870, 69 USPQ2d 1865 (Fed. Cir. 2004).

Claims (20)

1-10. (canceled)
11. A method for operating at least two display devices respectively worn on heads of vehicle occupants, comprising:
displaying respective virtual environments by the at least two display devices; and
continuously transmitting during the displaying, data characterizing a relative arrangement of the vehicle occupants with respect to one another, to the at least two display devices, the virtual environments being displayed as a function of the data.
12. The method according to claim 11,
further comprising detecting respective head movements of the vehicle occupants by respective detection devices,
wherein the data transmitted to the at least two display devices indicate the respective head movements of the vehicle occupants, and
wherein the displaying relies on the respective head movements to determine respective virtual perspectives of the vehicle occupants in the respective virtual environments.
13. The method according to claim 12,
further comprising, when the vehicle occupants are located in different vehicles, detecting a relative movement of the different vehicles with respect to one another,
wherein the data transmitted to the at least two display devices indicate the relative movement of the different vehicles, and
wherein the displaying relies on the relative movement of the different vehicles to determine the respective virtual perspectives of the vehicle occupants in the respective virtual environments.
14. The method according to claim 12,
further comprising, when the vehicle occupants are located in different vehicles, detecting respective relative movements of the vehicle occupants with respect to respective vehicle interiors,
wherein the data transmitted to the at least two display devices indicate the respective relative movements of the vehicle occupants, and
wherein the displaying relies on the respective relative movements of the vehicle occupants to determine the respective virtual perspectives of the vehicle occupants in the respective virtual environments.
15. The method according to claim 14, wherein the displaying displays avatars of at least one of the vehicle occupants as a component of the respective virtual environments by the at least two display devices in accordance with the relative arrangement of the vehicle occupants with respect to one another within the respective virtual environments.
16. The method according to claim 15, wherein the displaying converts the respective head movements of the vehicle occupants into corresponding head movements of the avatars.
17. The method according to claim 16,
further comprising detecting inputs of a person at a user interface, and
wherein the displaying adapts the virtual environments in accordance with the inputs.
18. The method according to claim 17, wherein a vehicle-side user interface with a touchscreen is the user interface.
19. The method according to claim 17, wherein one of a smart phone, a tablet computer and a smart watch is the user interface.
20. The method according to claim 11,
further comprising, when the vehicle occupants are located in different vehicles, detecting a relative movement of the vehicles with respect to one another,
wherein the data transmitted to the at least two display devices indicate the relative movement of the vehicles, and
wherein the displaying relies on the relative movement of the vehicles to determine the respective virtual perspectives of the vehicle occupants in the respective virtual environments.
21. The method according to claim 11,
further comprising, when the vehicle occupants are located in different vehicles, detecting respective relative movements of the vehicle occupants with respect to respective vehicle interiors,
wherein the data transmitted to the at least two display devices indicate the respective relative movements of the vehicle occupants, and
wherein the displaying relies on the respective relative movements of the vehicle occupants to determine the respective virtual perspectives of the vehicle occupants in the respective virtual environments.
22. The method according to claim 11, wherein the displaying displays avatars of at least one of the vehicle occupants as a component of the respective virtual environments by the at least two display devices in accordance with the relative arrangement of the vehicle occupants with respect to one another within the respective virtual environments.
23. The method according to claim 22, wherein the displaying converts the respective head movements of the vehicle occupants into corresponding head movements of the avatars.
24. The method according to claim 11,
further comprising detecting inputs of a person at a user interface, and
wherein the displaying adapts the virtual environments in accordance with the inputs.
25. A system, comprising:
at least two displays worn on respective heads of vehicle occupants;
at least one communication interface coupled to the at least two displays; and
at least one processor, coupled to the communication interface, configured to
continuously receive data, characterizing a relative arrangement of the vehicle occupants, via the communication interface, and
cause the at least two displays to display, while the data is received, respective virtual environments as a function of the data.
26. The system according to claim 25,
wherein the at least two displays are configured to
detect respective head movements of the vehicle occupants, and
transmit, to the at least one processor via the communication interface, information representing the respective head movements of the vehicle occupants, and
wherein the at least one processor is further configured to determine respective virtual perspectives of the vehicle occupants in the respective virtual environments.
27. The system according to claim 25, wherein the at least two displays are located in different vehicles, and
wherein the at least one processor is further configured to determine the respective virtual perspectives of the vehicle occupants in the respective virtual environments based on respective relative movements of the different vehicles.
28. The system according to claim 25, wherein the at least two displays are located in different vehicles, and
wherein the at least one processor is further configured to determine the respective virtual perspectives of the vehicle occupants in the respective virtual environments based on respective relative movements of vehicle occupants in the different vehicles.
29. The system according to claim 25, wherein one of the at least two displays, at least one communication interface and at least one processor are incorporated in each of at least two modified-reality glasses.
US17/267,083 2018-08-10 2019-05-21 Method and system for operating at least two display devices carried by respective vehicle occupants on the head Active US11409104B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102018213556.8A DE102018213556A1 (en) 2018-08-10 2018-08-10 Method and system for operating at least two display devices worn on the head by respective vehicle occupants
DE102018213556.8 2018-08-10
PCT/EP2019/063014 WO2020030312A1 (en) 2018-08-10 2019-05-21 Method and system for operating at least two display devices carried by respective vehicle occupants on the head

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/063014 A-371-Of-International WO2020030312A1 (en) 2018-08-10 2019-05-21 Method and system for operating at least two display devices carried by respective vehicle occupants on the head

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/881,985 Continuation US11940622B2 (en) 2018-08-10 2022-08-05 Method and system for operating at least two display devices carried by respective vehicle occupants on the head

Publications (2)

Publication Number Publication Date
US20210311308A1 true US20210311308A1 (en) 2021-10-07
US11409104B2 US11409104B2 (en) 2022-08-09

Family

ID=66640966

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/267,083 Active US11409104B2 (en) 2018-08-10 2019-05-21 Method and system for operating at least two display devices carried by respective vehicle occupants on the head
US17/881,985 Active US11940622B2 (en) 2018-08-10 2022-08-05 Method and system for operating at least two display devices carried by respective vehicle occupants on the head

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/881,985 Active US11940622B2 (en) 2018-08-10 2022-08-05 Method and system for operating at least two display devices carried by respective vehicle occupants on the head

Country Status (5)

Country Link
US (2) US11409104B2 (en)
EP (1) EP3834058A1 (en)
CN (1) CN112567318A (en)
DE (1) DE102018213556A1 (en)
WO (1) WO2020030312A1 (en)


Also Published As

Publication number Publication date
EP3834058A1 (en) 2021-06-16
US11940622B2 (en) 2024-03-26
DE102018213556A1 (en) 2020-02-13
WO2020030312A1 (en) 2020-02-13
US20220374074A1 (en) 2022-11-24
CN112567318A (en) 2021-03-26
US11409104B2 (en) 2022-08-09

