EP4073470A1 - Method and device for displaying virtual navigation elements - Google Patents

Method and device for displaying virtual navigation elements

Info

Publication number
EP4073470A1
Authority
EP
European Patent Office
Prior art keywords
navigation
picture element
navigation system
real environment
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20821178.9A
Other languages
German (de)
English (en)
Inventor
Robert Jan Wyszka
Daniel Morales Fernández
Adrian HAAR
Michael Wittkämper
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Publication of EP4073470A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/365 Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3632 Guidance using simplified or iconic instructions, e.g. using arrows

Definitions

  • The present invention relates to a method and a device for displaying virtual navigation elements.
  • Navigation devices that assist a user in navigating from the current location to a desired destination have been known for many years.
  • Typically, the position of the navigation device is determined by means of satellite-based positioning, and the route to the desired destination is computed from map data.
  • Such navigation devices usually have a display on which a map with the position of the device and the determined route is shown.
  • In addition, acoustic or visual cues can be issued about the direction currently to be followed or an upcoming change of direction.
  • Directional arrows and the like are particularly widespread as visual direction cues.
  • Such navigation devices are available for a wide variety of modes of travel.
  • For example, navigation devices are often integrated into the infotainment system of motor vehicles.
  • Increasingly, head-up displays (HUDs) are used for displaying navigation information in motor vehicles.
  • With a HUD, the navigation information is visible to the driver without the driver having to take their eyes off the road, but the exact placement of the navigation information does not yet have any direct relation to the surroundings perceived by the driver.
  • German patent application DE 102014001710 A1 describes a device for the augmented display of a virtual image object in a real environment of a vehicle, in which two partial images of the virtual image object to be displayed are projected onto a vehicle window by means of a HUD in such a way that the user perceives a virtual depth image that appears to lie outside the vehicle, behind the window, in the real environment.
  • In this way, the virtual image object is to be inserted into the real environment with a precise fit, so that the driver has the impression that a perceived real object is being marked directly.
  • German patent application DE 102015006640 A1 describes the display of augmented navigation elements by a head-up display system which comprises contact-analog navigation elements and direction-analog navigation elements.
  • Contact-analog navigation elements are understood to mean stationarily projected elements that can be localized at components of the visible vehicle environment.
  • The contact-analog navigation elements are displayed when the corresponding object can be augmented in the context of the specific driving situation.
  • Direction-analog navigation elements, in contrast, are not assigned to a specific position in the perceptible environment, but merely indicate the next change of direction to be carried out, even if the corresponding maneuver point cannot yet be augmented.
  • German patent application DE 102014219567 A1 describes a system for three-dimensional navigation in which graphic elements are projected onto a HUD in different focal planes.
  • One such graphic element can be an avatar.
  • The graphic element can be animated as a moving avatar. When a change of direction is imminent, this makes it possible, for example, for the avatar in the user's field of vision to move from the current street into the street to be turned into, and thus to indicate the imminent change of direction more clearly than a mere directional arrow would.
  • With the known methods and devices it is possible to display virtual image objects, for example navigation instructions, in the sense of augmented reality (AR) in the field of vision of a user, such as the driver of a motor vehicle, and also to output additional information, for example warnings about objects on the road surface or road damage.
  • However, such virtual navigation elements are merely image objects whose properties result from the route data supplied by the navigation system.
  • The previously known navigation representations therefore lack a link, understandable for the user, between the picture elements, in particular their visual impression, and the real environment.
  • The user will therefore always perceive the virtual navigation elements as foreign bodies in the field of vision, whose data are supplied by a technical system that is not correlated with the user's own perception. In the case of pure navigation systems this may not yet represent a major problem, because ultimately the authority to act, in the sense of implementing the navigation instructions, remains with the driver. With the increasing degree of autonomy of future vehicle generations, however, this decoupling of the user's personal perception from the autonomous decisions of vehicle systems can lead to a trust problem.
  • The autonomy of vehicles in road traffic is commonly classified into levels, from so-called level 0 (all driving functions are performed by the driver), through level 1 (simple assistance systems such as distance control), level 2 (semi-autonomous driving with more complex assistance systems such as lane keeping, traffic jam assistant and automatic parking aid), level 3 (highly autonomous driving with an assistance system that can independently carry out overtaking maneuvers and turns with lane changes) and level 4 (fully automatic driving, with a driver still available who can take over control in an emergency), up to level 5 (fully automatic driving without a driver). From level 3 at the latest, at least part of the responsibility for controlling the vehicle is taken over, at least temporarily, by a computer-controlled system, which can lead to uncertainty for the user.
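  • As an illustration only (the code, names and helper below are not part of the application), this level classification can be restated as a small Python enumeration:

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """Autonomy levels as summarized in the preceding paragraph."""
    LEVEL_0 = 0  # all driving functions are performed by the driver
    LEVEL_1 = 1  # simple assistance systems, e.g. distance control
    LEVEL_2 = 2  # semi-autonomous: lane keeping, traffic jam assistant, parking aid
    LEVEL_3 = 3  # highly autonomous: independent overtaking and turning maneuvers
    LEVEL_4 = 4  # fully automatic, driver can still take over in an emergency
    LEVEL_5 = 5  # fully automatic without a driver

def system_holds_control(level: AutonomyLevel) -> bool:
    """From level 3 at the latest, part of the responsibility for controlling
    the vehicle is taken over, at least temporarily, by the system."""
    return level >= AutonomyLevel.LEVEL_3
```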
  • The present invention is therefore based on the technical problem of providing a method and a device for displaying virtual navigation elements that give the user a better understanding of the displays and recommendations for action of the navigation system.
  • In particular, the method according to the invention and the device according to the invention are intended to create possibilities for displaying vehicle behavior calculated from navigation data and possibly additional sensor data at an early stage and in an intuitive manner.
  • The invention therefore relates to a method for displaying virtual navigation elements in which at least one virtual navigation element is generated from data from a navigation system and displayed on a display device superimposed on a real environment. The method according to the invention is characterized in that the virtual navigation element comprises a primary picture element and a secondary picture element, the secondary picture element being displayed in a changeable manner as a function of at least one parameter of the real environment.
  • The invention is therefore based on the idea of dividing the display of a navigation element into at least two picture elements, a first, primary picture element being animated essentially on the basis of navigation data and therefore able to correspond, for example, to conventional navigation elements such as directional arrows or moving avatars.
  • The display of the primary picture element is thus primarily influenced by navigation data. For example, if the primary picture element is animated as a three-dimensional arrow, a change of direction can be displayed by rotating or tilting the arrow.
  • The primary picture element can also be animated dynamically, for example to indicate the form in which a lane change or change of direction is to be carried out.
  • At least one further, secondary picture element is assigned to the primary picture element of a navigation element, and the display of this secondary picture element is changed as a function of at least one parameter of the real environment.
  • This at least one parameter is an additional parameter that goes beyond the information used to display the primary picture element.
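  • A minimal sketch of this division in Python (all class and field names here are hypothetical, chosen only to illustrate the structure): the primary picture element is driven by route data alone, while the secondary picture element is re-rendered whenever an environment parameter changes.

```python
from dataclasses import dataclass, field

@dataclass
class PrimaryElement:
    """Driven essentially by navigation data, e.g. a directional arrow."""
    heading_deg: float = 0.0          # orientation taken from the planned route
    position_on_route_m: float = 0.0  # animated position along the route

@dataclass
class SecondaryElement:
    """Displayed in a changeable manner as a function of environment parameters."""
    length_m: float = 30.0     # can encode recommended speed
    mesh_density: float = 1.0  # can encode sensor-data quality
    color: str = "#ffffff"     # can encode hazard warnings

@dataclass
class NavigationElement:
    primary: PrimaryElement = field(default_factory=PrimaryElement)
    secondary: SecondaryElement = field(default_factory=SecondaryElement)

    def on_environment_update(self, parameters: dict) -> None:
        """Only the secondary element reacts to environment data; the primary
        element keeps following the route data (illustrative mappings)."""
        if "position_accuracy_m" in parameters:
            # denser mesh for more accurate localization
            self.secondary.mesh_density = 1.0 / max(parameters["position_accuracy_m"], 0.1)
        if parameters.get("hazard"):
            self.secondary.color = "#ff4000"  # switch to a warning color
```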
  • The parameters used to display the secondary picture element preferably come from data that describe the interaction of the navigation system with its surroundings, in particular the influence of the surroundings on the navigation system or on the vehicle in which the navigation system is installed.
  • For this purpose, sensors that communicate with the navigation system can be used. In the case of a motor vehicle, for example, any of the sensors installed in modern vehicles can be evaluated.
  • The change in the secondary picture element thus indicates, so to speak, the interaction of the primary picture element with the real environment.
  • In this way, a connection of the primary picture element, which represents the user or the vehicle, to the real environment is symbolized.
  • As a result, the user can better understand the displays and recommendations for action of the navigation system, or the autonomous or partially autonomous vehicle actions derived from them, which causes the user to place more trust in the technical system, especially with increasing autonomy of vehicles. In particular, when the system provides predictive information on upcoming vehicle actions through changes in the secondary picture element, vehicle actions that are surprising or incomprehensible to the user are reduced, which likewise increases confidence in autonomous or semi-autonomous vehicle operation.
  • The interaction of the primary picture element with the real environment, which is represented graphically by changing the secondary picture element, can be influenced by a wide variety of factors.
  • For example, the primary picture element can be a picture element that is animated to move along the navigation route.
  • The secondary picture element can then be used, for example, to link movement information with the primary picture element; for instance, the secondary picture element can be an elongated picture element attached to the primary picture element whose length conveys information on the speed of the primary picture element.
  • The user then recognizes, for example from a change in the length of the secondary picture element, that the navigation system recommends a change in vehicle speed or, in the case of an autonomous system, that the vehicle is about to carry out a corresponding change in speed.
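  • One conceivable mapping, given purely as an illustration (the function, its parameters and the clamping values are assumptions, not taken from the application): scale the length of the secondary picture element with the ratio of recommended to current speed, so that a shrinking element signals an upcoming deceleration.

```python
def secondary_length_m(current_speed_kmh: float,
                       recommended_speed_kmh: float,
                       base_length_m: float = 40.0) -> float:
    """Length of the elongated secondary picture element as a speed cue.

    The element keeps its base length when no speed change is recommended,
    shortens when the system recommends slowing down, and lengthens when a
    higher speed is possible. The ratio is clamped so the element never
    vanishes unless a full stop is recommended.
    """
    if recommended_speed_kmh <= 0:
        return 0.0  # element disappears: the vehicle should stop
    ratio = recommended_speed_kmh / max(current_speed_kmh, 1.0)
    return base_length_m * min(max(ratio, 0.2), 2.0)

# Example: driving 100 km/h while 50 km/h is recommended halves the element.
assert secondary_length_m(100, 50) == 20.0
```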
  • In addition, information about the current accuracy of the position determination of the navigation system, possible dangerous situations and the like can also flow into the representation of the secondary picture element.
  • For this purpose, additional sensor data available depending on the application, such as camera data, radar data, sonar data or LIDAR data, can be used.
  • The primary picture element is represented as a non-transparent graphic symbol, for example as a two- or three-dimensional animated arrow symbol.
  • Non-transparent means that it is preferably shown as a filled symbol, so that the real environment does not shine through it, or does so only insignificantly.
  • The secondary picture element is represented as a graphic network connected to the primary picture element, between the meshes of which the real environment can be recognized.
  • This network representation already constitutes a link between the primary picture element and the environment, since the network is to a certain extent laid over the environment and thus symbolizes a connection between the technical system (the navigation system on which the representation is based) and the influences of the environment, which increases the user's trust in the system.
  • The graphic network is constructed from triangles connected at their corner points. By changing the size of the triangles, different interactions with the environment can be symbolized, for example higher accuracy of sensor data by a denser network, or areas of differing importance in the interaction of the primary picture element with the environment by more or less dense sections of the network.
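  • The following sketch shows how such a triangle network could be generated along the route (a hypothetical helper that builds a flat strip of configurable width, with a density parameter controlling the triangle size):

```python
from typing import List, Tuple

Vertex = Tuple[float, float]     # (x, y) coordinates on the road plane
Triangle = Tuple[int, int, int]  # indices into the vertex list

def build_strip_mesh(length_m: float, width_m: float,
                     density: float) -> Tuple[List[Vertex], List[Triangle]]:
    """Build a flat triangle strip: two rows of vertices along the route,
    split into triangles connected at their corner points. A higher
    `density` yields smaller triangles, which in the representation above
    can symbolize more accurate sensor data."""
    segments = max(1, int(length_m * density))
    step = length_m / segments
    vertices: List[Vertex] = []
    for i in range(segments + 1):
        x = i * step
        vertices.append((x, -width_m / 2))  # left edge of the carpet
        vertices.append((x, +width_m / 2))  # right edge of the carpet
    triangles: List[Triangle] = []
    for i in range(segments):
        a, b, c, d = 2 * i, 2 * i + 1, 2 * i + 2, 2 * i + 3
        triangles.append((a, b, c))  # first triangle of the quad
        triangles.append((b, d, c))  # second triangle of the quad
    return vertices, triangles

# Example: a 30 m carpet at density 0.5 gives 15 segments, i.e. 30 triangles.
verts, tris = build_strip_mesh(30.0, 3.5, 0.5)
assert len(tris) == 30
```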
  • In this way, the user can be signaled, for example, that a prompt to take over the vehicle in manual operation is to be expected owing to deteriorated sensor data. If such a request is then actually triggered, for example in the context of a more complex traffic situation that can no longer be handled autonomously with the deteriorated sensor data, at least the element of surprise is reduced for the user.
  • The secondary picture element can have a variable length, for example in order to symbolize different speeds. The network can also disappear entirely if, for example, the vehicle has to stop.
  • The secondary picture element can also have a changeable color, for example to symbolize dangerous situations such as potholes, black ice, aquaplaning or objects on the roadway.
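  • Such a color change could be driven by a simple severity mapping. The hazard categories below are taken from the preceding sentence; the concrete color values are purely illustrative assumptions.

```python
from typing import Optional

# Illustrative mapping from detected road hazards to a display color for
# the secondary picture element (color values are assumptions).
HAZARD_COLORS = {
    None: "#ffffff",          # no hazard: neutral network color
    "pothole": "#ffcc00",     # caution
    "object_on_road": "#ff8800",
    "aquaplaning": "#ff4400",
    "black_ice": "#ff0000",   # highest urgency
}

def network_color(detected_hazard: Optional[str]) -> str:
    """Color of the network; unknown hazards fall back to the alarm color."""
    return HAZARD_COLORS.get(detected_hazard, "#ff0000")
```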
  • The invention also relates to a navigation system which comprises means for generating at least one virtual navigation element and a display device for displaying the virtual navigation element superimposed on a real environment, the navigation system having means for carrying out the method according to the invention.
  • The navigation system can be, for example, a navigation system integrated into a driver information system of a motor vehicle.
  • The display device is preferably designed as a head-up display (HUD).
  • The head-up display can comprise, for example, a projection device that projects the virtual navigation element into the driver's field of vision onto the windshield of the motor vehicle, or onto a dedicated, transparent but partially reflective HUD panel arranged between the driver and the windshield. Since the real environment is then in the driver's field of vision anyway, an improved augmented reality display of navigation elements is possible.
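  • Conceptually, placing a virtual element "in" the environment by means of such a HUD amounts to projecting a 3D point, given in vehicle coordinates relative to the driver's eye point, onto the 2D projection area. A minimal pinhole-style sketch follows; all geometry values and the simplified flat image plane are assumptions for illustration.

```python
from typing import Optional, Tuple

import numpy as np

def project_to_hud(point_vehicle: np.ndarray,
                   eye_vehicle: np.ndarray,
                   plane_distance_m: float = 0.8) -> Optional[Tuple[float, float]]:
    """Project a 3D point onto a virtual image plane in front of the driver's
    eye (a simplified stand-in for the windshield projection area or a HUD
    panel). Coordinates: x forward, y left, z up.

    Returns (u, v) on the plane, or None if the point lies behind the eye.
    """
    rel = point_vehicle - eye_vehicle
    if rel[0] <= 0:
        return None                    # behind the driver, not displayable
    scale = plane_distance_m / rel[0]  # similar-triangles projection
    u = -rel[1] * scale                # horizontal offset, right positive
    v = rel[2] * scale                 # vertical offset, up positive
    return (float(u), float(v))

# Example: a navigation element anchored on the road surface 20 m ahead.
eye = np.array([0.0, 0.0, 1.2])      # driver's eye about 1.2 m above the road
anchor = np.array([20.0, 0.0, 0.0])  # on the road, straight ahead
print(project_to_hud(anchor, eye))   # (0.0, -0.048): slightly below eye level
```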
  • The navigation system according to the invention can, however, also be a dedicated navigation device or an application (app) implemented in a smartphone.
  • In this case, the real environment can be recorded by means of a camera of the navigation device or a camera of the smartphone and shown together with the virtual navigation element on a display of the navigation device or on the smartphone display as the display device.
  • The display device can also be data glasses to be worn by the user, also known as augmented reality (AR) glasses.
  • The method according to the invention is not only suitable for augmented reality (AR) applications, but can also be used, for example, in virtual reality (VR) environments and mixed reality (MR) environments.
  • The method according to the invention and the navigation system according to the invention can be used in a wide variety of mobility applications, for example in drones, aircraft, trains, ships, etc.
  • The system according to the invention is particularly preferably used in motor vehicles.
  • The invention therefore also relates to a motor vehicle that has a navigation system as described above.
  • The invention also relates to a computer program which is designed to carry out the steps of the method for displaying virtual navigation elements in the manner described above when executed in a computing unit.
  • FIG. 1 shows a cockpit of a motor vehicle with a head-up display
  • FIG. 2 shows a view through the windshield of a motor vehicle with a projection area of the head-up display superimposed on a real environment
  • FIG. 3 shows an embodiment of a virtual navigation element according to the invention
  • FIG. 4 shows a variant of the illustration in FIG. 3 in a different traffic situation
  • FIG. 1 shows a cockpit of a motor vehicle 10 in which a driver information system / infotainment system 11 with a touch-sensitive screen 12 is arranged, which comprises a navigation system 12.
  • The term infotainment system refers to the combination of car radio, navigation system, hands-free system, driver assistance systems and other functions in a central operating unit.
  • The word infotainment is a blend of the words information and entertainment.
  • The touch-sensitive screen 12 ("touchscreen") is mainly used to operate the infotainment system; in particular, this screen 12 can be viewed and operated easily by a driver of the vehicle 10, but also by a passenger of the vehicle 10.
  • In addition, operating elements, for example buttons, rotary controls or combinations thereof, such as rotary push-buttons, can be arranged in an input unit 13 below the screen 12.
  • Parts of the infotainment system 11 can also be operated via a steering wheel 14 of the vehicle 10.
  • Furthermore, information such as navigation data can be projected onto a projection area 17 of a windshield 18 of the vehicle 10 by means of a projector 15 of a head-up display 16.
  • FIG. 2 shows a detail of the view of the driver of the motor vehicle 10 of FIG. 1 in the direction of travel through the windshield 18. Above the steering wheel 14, the transparent projection area 17 of the head-up display 16 can be seen, through which the real surroundings of the motor vehicle 10, for example a road section 19 lying in the direction of travel, are visible.
  • Onto the projection area 17, virtual information elements 20 are projected by means of the projector 15, which is concealed by the steering wheel 14 in FIG. 2; from the driver's point of view, these elements are superimposed on the real environment.
  • In the example shown, the information elements 20 consist of a navigation element 21 and a data element 22 that can display, for example, the current vehicle speed.
  • The navigation element 21 consists of a primary picture element 21a and a secondary picture element 21b, which are explained in more detail below with reference to FIGS. 3 and 4.
  • FIGS. 3 and 4 show the driver's view through the windshield 18 of the motor vehicle 10 of FIGS. 1 and 2, whereby, owing to the active head-up display 16, a virtual information element 20 is superimposed on a real environment, which in the example shown comprises a scenery of streets 23 and buildings 24.
  • In FIG. 3, an embodiment of a virtual navigation element according to the invention is shown in a situation-dependent first representation.
  • The virtual navigation element 21 consists of a primary picture element 21a, which in the present case is a non-transparent directional arrow, and a secondary picture element 21b, which in the example shown is designed as a graphic network 21c made up of numerous triangles 21d linked to one another at their corner points.
  • In computer graphics, such a representation is also called a mesh.
  • The network 21c symbolizes a digital carpet that links the primary picture element 21a with the real environment.
  • The width of the network 21c transversely to the road section 19 of a street 23 lying in the direction of travel, and/or the mesh size of the network, can represent the quality of the sensor data currently available for navigation and/or for autonomous driving.
  • The width of the network 21c therefore indicates how precisely the primary picture element 21a is localized in the environment.
  • On the basis of this visual information, the driver can therefore already be prepared to be prompted by the system, possibly at short notice, to take over the manual driving function.
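  • The coupling of sensor quality to the network geometry could look like the following sketch; the thresholds and scaling factors are assumptions chosen for illustration only.

```python
from dataclasses import dataclass

@dataclass
class NetworkGeometry:
    width_m: float      # lateral extent of the carpet across the road
    mesh_size_m: float  # edge length of the triangles

def geometry_from_accuracy(position_accuracy_m: float,
                           lane_width_m: float = 3.5) -> NetworkGeometry:
    """Wider network and coarser mesh for worse localization: the carpet
    visualizes how precisely the primary picture element can be placed."""
    width = min(lane_width_m + 2.0 * position_accuracy_m, 3.0 * lane_width_m)
    mesh = max(0.5, position_accuracy_m)  # never finer than 0.5 m triangles
    return NetworkGeometry(width_m=width, mesh_size_m=mesh)

def takeover_request_likely(position_accuracy_m: float,
                            threshold_m: float = 3.0) -> bool:
    """Heuristic: once localization degrades beyond a threshold, the driver
    should be visually prepared for a possible takeover request."""
    return position_accuracy_m > threshold_m
```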
  • Such mesh objects are also used in computer graphics to scan a real environment and to represent it graphically in an AR, MR or VR environment.
  • Corresponding software applications are already commercially available, for example as part of the HoloLens Kinetic project from Microsoft, for data glasses in an MR environment.
  • Corresponding applications can also be used within the scope of the present invention, for example to scan the road section 19 and to reproduce dangerous objects, such as bumps in the road, spatially by appropriately adapting the network, and/or to highlight the network as a whole or the section of the network corresponding to the location of the identified dangerous object, for example by a signal color or a local increase in intensity in the representation of the network.
  • Likewise, sensor data or weather data received via the infotainment system can be used to color the network 21c in a warning or alarm color if the data indicate, for example, actually detected or potentially possible ice on the road or the like.
  • Furthermore, the length of the network 21c can be linked to the current vehicle speed or to the vehicle speed recommended for the next few moments.
  • For example, the extended network shown in FIG. 3 can symbolize that no speed reduction is currently necessary, or even that an increase in speed is possible.
  • In FIG. 4, which essentially corresponds to the illustration in FIG. 3, it is assumed, however, that the vehicle should turn left at the next junction in accordance with the determined route planning.
  • While the primary picture element 21a is still on the current road section 19 in the traffic situation shown, the secondary picture element 21b, i.e. the network 21c, already indicates, by the shortening of its extent in the current direction of travel, that the speed should be reduced or, in the case of autonomous driving operation, that the speed is currently being reduced or will be reduced shortly.
  • In addition, the primary picture element 21a, which is designed here as a three-dimensional directional arrow, is rotated toward the new direction of travel at the level of the junction in order to also signal the impending change of direction.
  • Furthermore, a further, movable picture element 21e can be temporarily displayed, which in the example shown has a shape similar to that of the primary picture element and moves away from it in the new direction of travel in order to indicate the planned change of direction even more clearly.
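  • The described turn indication could be animated by interpolating the arrow heading as the maneuver point approaches and by spawning the additional moving element close to the junction; the following sketch uses assumed distances and angles.

```python
def arrow_heading_deg(distance_to_junction_m: float,
                      turn_angle_deg: float = -90.0,  # left turn
                      blend_start_m: float = 30.0) -> float:
    """Rotate the primary arrow smoothly from straight ahead (0 degrees)
    toward the new direction of travel as the junction comes closer."""
    if distance_to_junction_m >= blend_start_m:
        return 0.0
    t = 1.0 - max(distance_to_junction_m, 0.0) / blend_start_m  # 0 -> 1
    return t * turn_angle_deg

def show_moving_element(distance_to_junction_m: float,
                        spawn_distance_m: float = 15.0) -> bool:
    """Temporarily display the additional arrow-like element (21e in the
    figures) that detaches and moves off in the new direction of travel."""
    return distance_to_junction_m <= spawn_distance_m

# Example: 15 m before the junction the arrow is halfway rotated (-45 degrees)
# and the moving element appears.
assert arrow_heading_deg(15.0) == -45.0
assert show_moving_element(15.0)
```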
  • List of reference signs: 21 navigation element, 21a primary picture element, 21b secondary picture element, 21c network (mesh), 21d linked triangles, 21e movable picture element

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Instrument Panels (AREA)

Abstract

The invention relates to a method and a device for displaying virtual navigation elements, wherein at least one virtual navigation element (21) is generated from data of a navigation system and is displayed on a display device superimposed on a real environment, the virtual navigation element (21) comprising a primary picture element (21a) and a secondary picture element (21b), the secondary picture element (21b) being displayed in a changeable manner as a function of at least one parameter of the real environment.
EP20821178.9A 2020-01-06 2020-12-08 Method and device for displaying virtual navigation elements Pending EP4073470A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020200047.6A DE102020200047A1 (de) 2020-01-06 2020-01-06 Verfahren und Vorrichtung zur Darstellung von virtuellen Navigationselementen
PCT/EP2020/085009 WO2021139941A1 (fr) 2020-01-06 2020-12-08 Procédé et dispositif de représentation d'éléments de navigation virtuels

Publications (1)

Publication Number Publication Date
EP4073470A1 2022-10-19

Family

ID=73790094

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20821178.9A Pending EP4073470A1 (fr) 2020-01-06 2020-12-08 Procédé et dispositif de représentation d'éléments de navigation virtuels

Country Status (4)

Country Link
EP (1) EP4073470A1 (fr)
CN (1) CN114867992A (fr)
DE (1) DE102020200047A1 (fr)
WO (1) WO2021139941A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021128257A1 (de) 2021-10-29 2023-05-04 Bayerische Motoren Werke Aktiengesellschaft Display of information on board a vehicle
DE102022133776A1 (de) 2022-12-16 2024-06-27 Bayerische Motoren Werke Aktiengesellschaft Device, means of transportation and method for outputting a notice to a user of a means of transportation

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007121001A (ja) * 2005-10-26 2007-05-17 Matsushita Electric Ind Co Ltd Navigation device
DE102010052000A1 (de) * 2010-11-19 2012-05-24 Bayerische Motoren Werke Aktiengesellschaft Method for outputting navigation instructions
JP2013123970A (ja) * 2011-12-14 2013-06-24 Toshiba Corp Display device
DE102014219567A1 (de) 2013-09-30 2015-04-02 Honda Motor Co., Ltd. Three-dimensional (3-D) navigation
DE102014001710A1 (de) 2014-02-08 2014-08-14 Daimler Ag Device and method for the augmented display of a virtual image object in a real environment
JP6149824B2 (ja) * 2014-08-22 2017-06-21 Toyota Motor Corp In-vehicle device, control method for an in-vehicle device, and control program for an in-vehicle device
DE102015006640A1 (de) 2015-05-22 2016-03-10 Daimler Ag Display of augmented navigation elements by a head-up display system
DE102016203080A1 (de) * 2016-02-26 2017-08-31 Robert Bosch Gmbh Method for operating a head-up display, head-up display device
US10769452B2 (en) * 2016-11-14 2020-09-08 Lyft, Inc. Evaluating and presenting pick-up and drop-off locations in a situational-awareness view of an autonomous vehicle
CN106500716A (zh) * 2016-12-13 2017-03-15 Inventec Technology Co., Ltd. Vehicle navigation projection *** and method thereof
DE102017221488A1 (de) * 2017-11-30 2019-06-06 Volkswagen Aktiengesellschaft Method for displaying the course of a trajectory in front of a vehicle or an object by means of a display unit, device for carrying out the method, and motor vehicle and computer program
DE102018203462A1 (de) * 2018-03-07 2019-09-12 Volkswagen Aktiengesellschaft Method for calculating an overlay of additional information for a display on a display unit, device for carrying out the method, and motor vehicle and computer program
DE102018203927A1 (de) * 2018-03-15 2019-09-19 Volkswagen Aktiengesellschaft Method, device and computer-readable storage medium with instructions for controlling a display of an augmented reality display device for a motor vehicle
DE102018207440A1 (de) * 2018-05-14 2019-11-14 Volkswagen Aktiengesellschaft Method for calculating an "augmented reality" overlay for displaying a navigation route on an AR display unit, device for carrying out the method, and motor vehicle and computer program

Also Published As

Publication number Publication date
CN114867992A (zh) 2022-08-05
WO2021139941A1 (fr) 2021-07-15
DE102020200047A1 (de) 2021-07-08


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220712

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240119