WO2009062492A2 - Method for displaying image objects in a virtual three-dimensional image space - Google Patents

Method for displaying image objects in a virtual three-dimensional image space

Info

Publication number
WO2009062492A2
WO2009062492A2 (PCT/DE2008/001881)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
image
objects
viewer
viewing direction
Prior art date
Application number
PCT/DE2008/001881
Other languages
German (de)
English (en)
Other versions
WO2009062492A3 (fr)
Inventor
Thomas Schmitt
Steffen Bottcher
Wolfgang Opel
Original Assignee
Spatial View Gmbh
Priority date
Filing date
Publication date
Application filed by Spatial View Gmbh filed Critical Spatial View Gmbh
Publication of WO2009062492A2 publication Critical patent/WO2009062492A2/fr
Publication of WO2009062492A3 publication Critical patent/WO2009062492A3/fr

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • H04N13/117Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

Definitions

  • The invention relates to a method for displaying image objects in a virtual three-dimensional image space, in particular for generating a virtual reality in the sense of a simulation.
  • VR virtual reality
  • In a wide variety of areas, attempts are increasingly being made to map, carry out, or test processes in virtual realities, since specific working conditions can be simulated there in a targeted manner.
  • The term "virtual reality" (VR) refers to the representation and simultaneous perception of reality and its physical properties in an interactive virtual environment that is usually computer-generated in real time.
  • Examples include the use of VR in aircraft simulators for pilot training, in the creation of virtual prototypes in industry, in ergonomic tests, in the visualization of buildings, in medical diagnostics, in the simulation of operations, in virtual visits to places that are difficult to reach, in so-called edutainment, and the like.
  • Two views of an object are generated from slightly different positions and displayed (stereo projection).
  • The distance between the two positions should equal the distance between the viewer's eyes.
  • Each of the two views must then be fed to the corresponding eye.
  • The active methods include, for example, shutter glasses, which can be switched between transparent and dark at high speed. These are used in conjunction with a monitor that alternately displays an image for the left eye and an image for the right eye. When the glasses are synchronized with the monitor, the correct image reaches each eye.
  • Passive techniques include the anaglyph and polarization techniques, in which two slightly offset views are superimposed in a single image. Using colored or polarizing filter glasses, the two views can be separated again.
  • Autostereoscopic monitors are known in practice that give the user a perception of the spatial depth of the displayed objects without special aids such as glasses or the like.
  • An autostereoscopic monitor has a very fine image matrix, directly in front of which an optical means, usually in the form of a lenticular or parallax-barrier system, is mounted.
  • Due to the special geometry of the optical means, the light of certain pixels of the image matrix is emitted in a defined spatial direction.
  • In this way, images for the left and the right eye can be displayed simultaneously and independently of each other.
  • The quality of the three-dimensional impression is higher the better the two views can be perceived separately. This can be achieved by limiting the solid angle in which three-dimensional perception is possible.
  • Tracking systems are used that continuously detect the position of the viewer.
  • The pixels on the image matrix, or the position of the optical means, are then readjusted by slight displacement so that the spatially very narrow viewing angle follows the viewer.
  • A special feature of autostereoscopic monitors is that all image objects, regardless of their apparent spatial distance from the viewer, are displayed in the monitor plane; the viewer's eyes therefore must always remain focused on this plane, contrary to the natural accommodation to the different distances of viewed objects.
  • The naturalness of the perception of the virtual reality suffers from this restriction on accommodation.
  • A viewer wants to look as naturally as possible not only at one object of a complex scene but also at other objects that appear to lie at different distances from the viewer and from one another.
  • The present invention is therefore based on the object of designing and further developing a method of the generic type in such a way that a representation of image objects in a virtual image space that is as realistic as possible, in particular for several image objects in complex scenes, becomes possible.
  • The above object is solved by the features of claim 1.
  • The method in question is characterized in that the viewing direction of a viewer of the image objects is detected and taken into account both for the representation of the image objects and for interactions with them.
  • The teaching also covers several viewers on one or more display devices, although for the sake of simplicity only a single viewer is referred to from here on.
  • The viewing direction can also be used advantageously as the sole means of interaction.
  • This requires only suitable software and is therefore in many respects an advantageous "input device": the viewing-direction information is used directly to interact with the object.
  • Advantageously, the viewing direction of each eye is detected independently, so that a convergence point can be determined in real as well as in virtual space and taken into account accordingly.
  • This is of particular advantage, for example, in the representation of semi-transparent objects, such as tissue or fluids, or of objects with gaps or holes, as can occur in a complex scene.
  • The real and the virtual space have two different coordinate systems that must be calibrated to each other.
  • In a further advantageous manner, eye gestures, such as single or multiple eyelid closures, can also be incorporated in order to allow sole or combined interaction with the image content without further aids.
  • Possible manipulations include, for example, moving, zooming, and performing contextual actions such as texture or lighting changes.
  • If a viewer uses a tool to interact with the image object, for example in the form of a selection operation in the virtual image space, it is expected that exactly the image object being interacted with, i.e. the one on which the viewer directs both gaze and tool, is displayed sharply, while other image objects at a different apparent distance from the viewer are perceived out of focus in front of or behind it.
  • This is not necessarily the case with autostereoscopic displays.
  • The viewing direction, or a change of viewing direction, transmitted into the virtual space is used to control at least one virtual camera, where a virtual camera corresponds to a view displayed to the viewer.
  • For a stereoscopic representation, at least two virtual cameras are to be provided, each generating the view for one eye of the observer.
  • Advantageously, the imaging characteristics of the virtual cameras in the virtual space correspond to the imaging properties of the viewer's eyes.
  • For viewing-direction detection, the viewer must be captured with a camera, advantageously with a stereo camera system.
  • From this, the current viewing direction of the observer is determined and converted into a position on the display device, which in turn can be associated with an image object in the displayed scene.
  • In this way, the views of the displayed scene seen by the viewer can be determined very easily and directly.
  • The presentation is perceived as particularly realistic if the viewer's views are calculated in real time. Hard or soft real-time constraints may apply; especially with fast changes of gaze, soft real time should be sufficient, since missing intermediate images are then hardly perceived.
  • The views of the image object could be recalculated whenever the viewer's viewing direction changes.
  • The viewing-direction changes are detected, assigned to a viewed object in the virtual image space, and used to control one or more virtual cameras.
  • In this way, the views of the scene can be displayed to the viewer in a realistic manner.
  • Part of the recalculation of the views could be a spatial-frequency filtering of selected areas of the views of the scene.
  • The spatial frequency plays an essential role in the perception of sharpness in an image. Images with low spatial frequency appear blurred and flat; images with high spatial frequency are rich in detail and contrast, with accentuated outlines.
  • Corresponding algorithms for local spatial-frequency filtering, for example based on the Fourier transform, are known in practice.
  • Advantageously, the image sharpness also shows a continuous transition when the views of the image objects are calculated.
  • Information about the three-dimensional nature of the object is necessary for calculating the new image information.
  • For this purpose, a three-dimensional model of the image object could be present. This three-dimensional model could be realized in many different ways. For example, if the image object is generated as a virtual object, the three-dimensional information will most likely already be available in a simple form.
  • Known rendering filters can be used here to display the views in the desired quality.
  • However, the limits of the performance of the processors used can be reached relatively quickly.
  • It may therefore be advantageous to store, in a memory, precalculated sub-sections of the views of the image object, of the photographs, or of the video sequences, processed with different spatial-frequency filters for different accommodation conditions. These data could then be read out of the memory as a function of the current viewing direction of the viewer and displayed appropriately on the display device.
  • Intermediate images between the stored views could be calculated in a suitable manner, for example by morphing. Such calculation methods are likewise known in practice.
  • The method according to the invention is preferably used in connection with representation on an autostereoscopic display device. It is advantageous if, in addition to calculating the views as a function of the viewing direction and of the position or movement of the viewer, the viewing angle is also accurately controlled. This is done, as described above, by suitably driving the luminous dots behind the optical means of the autostereoscopic display device.
  • This adaptation can be carried out as a control loop, in parallel or sequentially to the recalculation of the views. Note that during the readjustment, pixels are only shifted within a small range; a complete re-creation of the views of the image object does not take place here.
  • The method according to the invention is not necessarily tied to display devices for three-dimensional representation. It is thus also possible to use a standard monitor and display the views of the image object monoscopically. In that case, only a single virtual camera generating one view of the image object is required.
  • The method may also be used in conjunction with a selection device that allows interaction with the image object or parts thereof.
  • This selection device is preferably freely movable in the image space. With it, the image object or parts thereof can be selected, marked, moved, edited, rotated, or otherwise influenced.
  • The selection device could be formed by any object whose three-dimensional position, and optionally orientation, is determined by means of a suitable tracking system.
  • For example, a stereoscopically operating camera system could be used to detect the object.
  • The object to be tracked could be a pen, any tool with which the viewer interacts with the image object, or the like.
  • The viewer could also use a finger as a selection device, allowing natural interaction with individual areas of the image object.
  • The displayed image object appears to float in front of the display device. If a viewer selects a point of the image object, one can assume that he is also looking at this point. When a point of the image object is selected, it can therefore be determined which image areas the observer sees lying behind the selection device. These image areas also correspond to the areas covered by the current viewing direction. As a consequence, in addition to the viewing direction and the position of the viewer, the position of the selection device can be used as information to control the recalculation of the views of the image object.
  • FIGS. 1 to 3 show exemplary arrangements for applying the method according to the invention.
  • The display device 1 is an autostereoscopic display device, in which the image object 2a appears to float at least partially in front of the display device, while the image object 2b is perceived at least partially in the background behind it.
  • A viewer, whose eyes 3 are indicated in the figures, views the image objects 2 displayed on the display device 1.
  • A position-detection system in the form of a stereoscopically operating camera system continuously determines the position of the viewer's eyes 3 in all three spatial directions, as well as his viewing direction.
  • Two views of the image objects are displayed with a corresponding offset, so that a virtual three-dimensional image space is spanned in front of the display device 1.
  • The image object 2a is displayed so as to appear at least partially in front of the display device, while the image object 2b appears at least partially behind it.
  • The viewing direction of the observer's eyes 3, determined by the position and gaze detection, is transmitted into the virtual image space. Since a representation of the image objects 2a and 2b that is as realistic as possible is to be achieved on the display device 1, this viewing direction in FIG. 1 corresponds to an accommodation of the eyes 3 on the image object 2a. This image object is displayed in sharp focus, while the seemingly farther image object 2b appears out of focus. A change of the viewing direction of the eyes 3 to the right, toward the image object 2b, as shown in FIG. 2, is recognized by the system and the views are adjusted.
  • The representation then takes place according to an accommodation on image object 2b.
  • The image object 2b is displayed sharply, while the seemingly closer image object 2a appears out of focus.
  • The views generated by the two virtual cameras are in turn converted into images suitable for the display device and shown on the display device 1.
  • FIG. 3 shows the case where an image object 2a is marked by means of a selection device 4.
  • The selection device 4 is formed here by a finger of the viewer's hand.
  • In FIG. 3, the viewer has marked an area 5, indicated by a circle, in the virtual image space. It is assumed that the viewer directs his gaze to the selection device and thus also to the marked area 5.
  • A detection unit for the position of the selection device 4 first determines its position with respect to the display device 1. Using the virtual cameras, it can then be determined which point 5 in the virtual image space is marked by the selection device 4. This marked area 5 corresponds to the area on which the observer's eyes are accommodated during the selection process. That part of the image object 2a is displayed sharply, while seemingly more distant parts of the image object 2a and the image object 2b appear out of focus. In the calculation of the views of the image objects 2a and 2b, therefore, not only the viewing direction of the observer's eyes 3 but also the selected area 5 and the position of the eyes 3 are used as information.
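The description mentions detecting the viewing direction of each eye independently so that a convergence point can be determined. Since two measured gaze rays rarely intersect exactly, a common estimate is the midpoint of the shortest segment between them; the following sketch assumes this approach (function name, coordinate conventions, and tolerance are illustrative, not taken from the patent):

```python
def convergence_point(o_l, d_l, o_r, d_r):
    """Estimate the 3D convergence point of two gaze rays.

    o_l/o_r are eye positions, d_l/d_r viewing-direction vectors.
    Measured rays rarely intersect exactly, so the midpoint of the
    shortest segment between the two lines is returned; None is
    returned for (near-)parallel gaze directions.
    """
    dot = lambda u, v: sum(p * q for p, q in zip(u, v))
    w = [p - q for p, q in zip(o_l, o_r)]
    a, b, c = dot(d_l, d_l), dot(d_l, d_r), dot(d_r, d_r)
    d, e = dot(d_l, w), dot(d_r, w)
    denom = a * c - b * b              # zero when the rays are parallel
    if abs(denom) < 1e-12:
        return None
    t = (b * e - c * d) / denom        # parameter along the left ray
    s = (a * e - b * d) / denom        # parameter along the right ray
    p_l = [o + t * u for o, u in zip(o_l, d_l)]
    p_r = [o + s * u for o, u in zip(o_r, d_r)]
    return [(x + y) / 2.0 for x, y in zip(p_l, p_r)]
```

With both eyes fixating a point half a metre in front of the head midpoint, the estimate recovers that point; parallel gaze directions yield no convergence point.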
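The conversion of a detected viewing direction into a position on the display device, to which an image object can then be assigned, can be sketched as a ray-plane intersection. The screen dimensions, resolution, and placement below are illustrative assumptions, not values from the patent:

```python
def gaze_to_display(eye, direction,
                    screen_w_m=0.52, screen_h_m=0.32,
                    res_x=1920, res_y=1200):
    """Intersect a gaze ray with the display plane (z = 0) and return
    the pixel the viewer is looking at, or None if the gaze misses
    the screen.  The screen is centred on the origin; the viewer sits
    at positive z looking toward the screen."""
    ex, ey, ez = eye
    dx, dy, dz = direction
    if dz >= 0:                      # gaze points away from the display
        return None
    t = -ez / dz                     # ray parameter at the display plane
    hit_x, hit_y = ex + t * dx, ey + t * dy
    if abs(hit_x) > screen_w_m / 2 or abs(hit_y) > screen_h_m / 2:
        return None                  # gaze falls outside the screen
    px = (hit_x / screen_w_m + 0.5) * (res_x - 1)
    py = (0.5 - hit_y / screen_h_m) * (res_y - 1)   # pixel rows grow downward
    return round(px), round(py)
```

The returned pixel position can then be looked up against the rendered scene to find the image object currently being viewed.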
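The spatial-frequency filtering via the Fourier transform that the description names as one way to render unfocused areas can be sketched as a crude low-pass filter on a greyscale image; the circular cutoff is an illustrative parameter:

```python
import numpy as np

def lowpass_blur(image, cutoff):
    """Blur a greyscale image by discarding spatial frequencies more
    than `cutoff` bins away from the DC component, a crude low-pass
    that mimics an out-of-focus image region."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    yy, xx = np.ogrid[:h, :w]
    keep = np.hypot(yy - h // 2, xx - w // 2) <= cutoff  # low-frequency mask
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * keep)))
```

Applied to a sharp vertical edge, the filtered result keeps the overall brightness (the DC component survives) but softens the transition, which matches the description of low-spatial-frequency images appearing blurred.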
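For the intermediate images between stored views, a minimal sketch of the interpolation idea is a cross-dissolve; true morphing, as mentioned in the description, would additionally warp geometry along feature correspondences, which is omitted here:

```python
def intermediate_view(view_a, view_b, alpha):
    """Cross-dissolve between two precomputed views, given as flat
    lists of pixel intensities.  alpha = 0 returns view_a, alpha = 1
    returns view_b.  Real morphing would additionally warp geometry
    along feature correspondences; only the blending is shown here."""
    alpha = min(max(alpha, 0.0), 1.0)
    return [(1.0 - alpha) * a + alpha * b for a, b in zip(view_a, view_b)]
```

Such blending lets a small number of stored views cover a continuous range of viewing directions without recomputing every frame.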
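Deciding which objects to render sharply, as in the discussion of image objects 2a and 2b, amounts to comparing each object's apparent depth with the depth the viewer accommodates on (derived from the gaze direction and/or the selection device). A per-object blur weight might be sketched as follows; the depth-of-field parameter and the object depths are invented for illustration:

```python
def blur_weights(object_depths, focus_depth, depth_of_field=0.05):
    """Map each scene object to a blur weight based on how far its
    apparent depth lies from the depth the viewer accommodates on.
    Weight 0 means render sharp; larger weights mean stronger defocus."""
    return {name: abs(depth - focus_depth) / depth_of_field
            for name, depth in object_depths.items()}
```

For the scene of FIGS. 1 and 2, accommodating on 2a leaves 2a sharp while 2b receives a nonzero blur weight, and vice versa after a gaze change.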

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a method for displaying image objects in a virtual three-dimensional image space, in particular for generating a virtual reality in the sense of a simulation. According to the invention, the viewing direction of a viewer of the image object is detected and taken into account for the representation of the image objects as well as for interactions with the image objects.
PCT/DE2008/001881 2007-11-15 2008-11-14 Procédé de représentation d'objets images dans un espace image tridimensionnel virtuel WO2009062492A2 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE102007054898 2007-11-15
DE102007054898.4 2007-11-15
DE102007057208A DE102007057208A1 (de) 2007-11-15 2007-11-26 Verfahren zum Darstellen von Bildobjekten in einem virtuellen dreidimensionalen Bildraum
DE102007057208.7 2007-11-26

Publications (2)

Publication Number Publication Date
WO2009062492A2 true WO2009062492A2 (fr) 2009-05-22
WO2009062492A3 WO2009062492A3 (fr) 2010-04-22

Family

ID=40577148

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2008/001881 WO2009062492A2 (fr) 2007-11-15 2008-11-14 Procédé de représentation d'objets images dans un espace image tridimensionnel virtuel

Country Status (2)

Country Link
DE (1) DE102007057208A1 (fr)
WO (1) WO2009062492A2 (fr)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010010001A1 (de) 2010-03-02 2011-09-08 Geuder Ag Verfahren zur Entwicklung und virtuellen Erprobung eines chirurgischen Instruments
DE102010010002A1 (de) 2010-03-02 2011-09-08 Geuder Ag Verfahren zur Durchführung einer virtuellen Operation zu Trainingszwecken
DE102013207528A1 (de) * 2013-04-25 2014-10-30 Bayerische Motoren Werke Aktiengesellschaft Verfahren zum Interagieren mit einem auf einer Datenbrille angezeigten Objekt
DE102013019200A1 (de) 2013-11-15 2015-05-21 Audi Ag Verfahren zum Betreiben eines Bediensystems, Bediensystem und Vorrichtung mit einem Bediensystem
DE102014000876B3 (de) 2014-01-23 2015-01-08 Heidelberger Druckmaschinen Ag 3D Digitaler Proof
DE102014010309B4 (de) * 2014-07-11 2017-11-23 Audi Ag Anzeigen von zusätzlichen Inhalten in einer virtuellen Szenerie
DE102016102868A1 (de) 2016-02-18 2017-08-24 Adrian Drewes System zur Darstellung von Objekten in einem virtuellen dreidimensionalen Bildraum

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4984179A (en) * 1987-01-21 1991-01-08 W. Industries Limited Method and apparatus for the perception of computer-generated imagery
GB2281838A (en) * 1993-08-04 1995-03-15 Pioneer Electronic Corp Input for a virtual reality system
US6414681B1 (en) * 1994-10-12 2002-07-02 Canon Kabushiki Kaisha Method and apparatus for stereo image display
WO2007097738A2 (fr) * 2005-01-26 2007-08-30 Wollf Robin Q Systeme de commande d'un dispositif de positionnement d'une caméra/d'une arme piloté par un dispositif de suivi des mouvements de l'œil/de la tête/d'une caméra

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5689628A (en) * 1994-04-14 1997-11-18 Xerox Corporation Coupling a display object to a viewpoint in a navigable workspace
JP2001522098A (ja) 1997-10-30 2001-11-13 ドクター・バルデヴェグ・ゲーエムベーハー 画像処理方法および装置


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10372288B2 (en) 2011-09-08 2019-08-06 Airbus Defence and Space GmbH Selection of objects in a three-dimensional virtual scene
EP2754298B1 (fr) * 2011-09-08 2022-05-18 Airbus Defence and Space GmbH Sélection des objets tridimensionnels dans un scénario virtuel
CN112258612A (zh) * 2019-08-01 2021-01-22 北京灵医灵科技有限公司 一种基于断层图像的虚拟解剖对象观察方法和***
CN112258612B (zh) * 2019-08-01 2022-04-22 北京灵医灵科技有限公司 一种基于断层图像的虚拟解剖对象观察方法和***
WO2024032137A1 (fr) * 2022-08-12 2024-02-15 腾讯科技(深圳)有限公司 Procédé et appareil de traitement de données pour une scène virtuelle, dispositif électronique, support de stockage lisible par ordinateur et produit programme d'ordinateur

Also Published As

Publication number Publication date
DE102007057208A1 (de) 2009-05-28
WO2009062492A3 (fr) 2010-04-22

Similar Documents

Publication Publication Date Title
WO2009062492A2 (fr) Procédé de représentation d'objets images dans un espace image tridimensionnel virtuel
EP2156410A1 (fr) Procédé de représentation d'objets images dans un espace image tridimensionnel virtuel
DE102020110662A1 (de) Simultane positionsbestimmung und kartenerstellung (slam) für mehrere benutzer
EP1763845B1 (fr) Procede et dispositif pour determiner des superpositions optiques d'objets virtuels
DE102009037835B4 (de) Verfahren zur Darstellung von virtueller Information in einer realen Umgebung
DE69621148T2 (de) Dreidimensionales zeichnungssystem und verfahren
DE112016005343T5 (de) Elektronische Anzeigestabilisierung unter Verwendung von Pixelgeschwindigkeiten
EP3781364A1 (fr) Procédé d'utilisation par un utilisateur d'une cinématique actionnée à plusieurs membres, de préférence d'un robot, de préférence encore d'un robot à bras articulé, au moyen d'un dispositif d'affichage mobile
EP2289061B1 (fr) Simulateur d'ophtalmoscope
DE112012001022T5 (de) Ausrichtungssteuerung in einem am Kopf zu tragenden Gerät mit erweiterter Realität
EP0836332A2 (fr) Moniteur autostéréoscopique, adaptant la position d'un observateur (PAM)
DE102017107489B3 (de) Mikroskopanordnung zur Aufnahme und Darstellung dreidimensionaler Bilder einer Probe
DE102014006732A1 (de) Bildüberlagerung von virtuellen Objekten in ein Kamerabild
EP3012712A1 (fr) Motif virtuel dans un environnement reel
DE69837165T2 (de) Verfahren und gerät für automatische animation von dreidimensionalen grafischen szenen für verbesserte 3-d visualisierung
DE102012009257B4 (de) Verfahren zur Ausführung beim Betreiben eines Mikroskops und Mikroskop
DE102018209377A1 (de) Verfahren zur Darstellung von AR-/VR-Inhalten auf einem mobilen Endgerät und mobiles Endgerät, auf dem AR-/VR-Inhalte dargestellt werden
DE112019002798T5 (de) Informationsverarbeitungsvorrichtung, informationsverabeitungsverfahren und programm
DE102013213492A1 (de) Bildanpassung für kontaktanaloge Darstellungen auf Datenbrillen
DE102020214824A1 (de) Verfahren zum Betreiben eines Visualisierungssystems bei einer chirurgischen Anwendung und Visualisierungssystem für eine chirurgische Anwendung
DE102015100680B4 (de) Verfahren und Vorrichtungen zur Umgebungsdarstellung
DE102019108999B4 (de) Verfahren zur immersiven Anzeige von stereoskopischen Bildern und Bildfolgen
DE102012108249A1 (de) Verfahren und Vorrichtung zur Verbesserung der Wiedergabe stereoskopischer Bilder
DE102013216858A1 (de) Verfahren zur Darstellung eines in einem Volumendatensatz abgebildeten Objektes auf einem Bildschirm
DE60129059T2 (de) 3D-visuelle Präsentationsmethode und Apparat für Autosimulator

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08848787

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 08848787

Country of ref document: EP

Kind code of ref document: A2