EP1683063A1 - System and method for carrying out and visualizing simulations in an augmented reality - Google Patents
System and method for carrying out and visualizing simulations in an augmented reality
- Publication number
- EP1683063A1 (application EP04804510A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- simulation
- storage medium
- real
- stored
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
Definitions
- the invention relates to a system and a method for displaying information, in particular augmented reality information, for at least one user.
- Such a system or method is used, for example, in the planning of work processes and other processes in an industrial environment.
- Augmented Reality is a new type of human-environment interaction with great potential to support industrial work processes and other processes before and during the process flows.
- the viewer's field of view is enriched with computer-generated virtual objects so that product and process information can be intuitively captured and used.
- the use of portable computers opens up AR application fields with high mobility requirements, e.g. in production halls, spatially distributed plants or large-volume conveyor systems. Augmented reality is already being developed for production and service applications.
- the invention is based on the object of specifying a system and a method which enable the visualization of simulation results in the context of real arrangements in an augmented reality system.
- planned processes or fictitious situations, such as crowds of visitors or faults in existing or planned systems, should be visualized directly in the real environment and thus made tangible for the user.
- This object is achieved by a system for displaying information, in particular augmented reality information, for at least one user, comprising:
- at least one recording unit for recording an environment and for generating corresponding environmental information which identifies a position and/or an orientation of the system in relation to the environment,
- This object is further achieved by a method for displaying information, in particular augmented reality information, for at least one user
- an environment is detected and corresponding environment information that identifies a position and / or an orientation of the system in relation to the environment is generated,
- simulation data is generated with the help of at least one simulation system
- the environment information is linked, with the aid of at least one processing unit, with image information that is stored in a first storage medium and continuously modified on the basis of the simulation data.
- the simulation results can be visualized in the context of the real environment. Simulation results are transferred to reality with the help of augmented reality and visualized there in a mixed virtual real world.
- the dynamic results of the simulation are transported into the real world in accordance with their dynamics and thus made tangible for the viewer. Simulations, which until now have been carried out only in purely virtual environments, can be represented in reality using the system or the device according to the invention, without the need for complex modeling of the real environment. Falsifications of the real environment, and the resulting incorrect conclusions, which can never be completely ruled out when reproducing reality, are avoided by the inventive combination of simulation techniques and augmented reality technology.
- the simulation can also be used as a forecasting tool.
- the simulation can be used to make a forecast that a problem will arise on a machine or conveyor in the near future. Then, for example, as a warning, the surroundings of the machine are colored yellow or red (or the virtually exploding boiler is shown).
- the system according to the invention can be used advantageously, for example, for the following tasks:
- the processing unit is designed such that, based on the image information stored in the first storage medium, it calculates concealments of virtual objects by a real arrangement present in the detection area of the system and generates a quantity of data describing the virtual objects, the areas of the virtual objects hidden by the real arrangement being masked out.
- a quantity of data is generated for the reproduction of a 3-dimensional model, the representation of which enables the user to precisely position virtual and real objects in all three spatial dimensions.
- a virtual object positioned behind a real object is also perceived by the user as such.
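The occlusion handling described above can be sketched as a per-pixel depth test: a virtual object's pixel is kept only where the virtual geometry lies closer to the viewer than the real arrangement. The function name and the list-based depth buffers below are illustrative assumptions, not part of the patent.

```python
def mask_occluded(virtual_depth, real_depth, virtual_pixels, background=None):
    """Keep a virtual pixel only where it is nearer to the viewer
    than the real geometry; smaller depth value = closer."""
    out = []
    for vd, rd, px in zip(virtual_depth, real_depth, virtual_pixels):
        out.append(px if vd < rd else background)
    return out

# Three pixels of a virtual object at depths 1.0/2.0/3.0, in front of a
# real surface (e.g. a lifting table) at constant depth 2.5: the last
# pixel lies behind the surface and is masked out.
visible = mask_occluded([1.0, 2.0, 3.0], [2.5, 2.5, 2.5], ["a", "b", "c"])
```

A real system would obtain the depth of the real arrangement from the stored 3-D model rendered from the tracked viewpoint; the comparison itself is the same.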
- the system advantageously has at least one playback unit for displaying the amount of data generated by the processing unit. Two different methods can be used when reproducing the augmented reality information.
- the reproduction unit can be designed as a head-mounted display, the objects described by the image information generated by the processing unit being displayed directly in the field of vision of the user, while the user continues to perceive directly the part of the current reality not hidden by those objects.
- This type of presentation of augmented reality information is the so-called optical see-through method.
- the playback unit is designed such that it displays both the objects described by the image information generated by the processing unit and the part of the current reality not covered by those objects; for this purpose the device has in particular at least one image capture unit, designed for example as a video camera, for capturing the current reality.
- This embodiment enables the augmented reality information to be displayed for several users.
- This type of presentation of augmented reality information is the so-called video see-through method.
- the parts of the virtual objects described by the image information and not covered by real objects are superimposed on the image captured by the video camera and displayed on one or, e.g. by means of a video splitter, several playback units.
- the playback units can be head-mounted displays and / or ordinary monitors, which in particular can also be positioned at locations distant from the detected current reality.
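The video see-through method above amounts to compositing: wherever the (already occlusion-masked) virtual layer has content, it replaces the camera pixel; elsewhere the camera image shows through. A minimal sketch, with hypothetical names and pixels modeled as list entries:

```python
def composite(camera_frame, virtual_layer):
    """Video see-through: superimpose non-empty virtual pixels
    on the camera image; None marks 'no virtual content here'."""
    return [v if v is not None else c
            for c, v in zip(camera_frame, virtual_layer)]

# A camera frame of three pixels; only the middle pixel carries a
# virtual object, so only it is replaced in the composite.
frame = composite([10, 20, 30], [None, 99, None])
```

The resulting frame can then be fanned out to any number of displays, which is why this variant supports several users at once.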
- the system has at least one application controller for controlling the simulation system and / or at least one real process.
- the application control enables the activation and control of real and virtual processes.
- it displays the available commands and the current status of the system, for example using a video card, with the aid of the playback unit.
- a process control which contains the necessary execution system and the necessary control programs to control a plant or plant components according to a predetermined process scheme (e.g. a PLC with its associated data and function blocks).
- the system comprises at least one user interface which allows the user to control the simulation system and / or at least one real process.
- the user interface can include different input devices such as a mouse, a keyboard, microphones, etc.
- the signals supplied by the input devices are converted into commands for the application control using appropriate device drivers.
- the system expediently has a second storage medium in which current status values of the real process, in particular sensor values and / or actuator values to be set, are stored.
- the simulation system is designed such that the course of a simulation can be continuously influenced by the status values stored in the second storage medium and / or the status values stored in the second storage medium can be modified by the simulation system. Due to the possibility of access to the state values stored in the second storage medium, the simulation system can react to current states of a real process and influence a running simulation accordingly. Furthermore, the simulation system can modify state values and thus influence a running real process.
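The interplay between the second storage medium and the simulation system can be sketched as a shared process image that both sides read and write: the simulation reacts to current sensor values and, in turn, sets actuator values that influence the running real process. Class and key names below are illustrative assumptions.

```python
class ProcessImage:
    """Illustrative second storage medium: current sensor values of the
    real process and actuator values to be set, shared by the real
    process and the simulation system."""
    def __init__(self):
        self.sensors = {}    # written by the real process
        self.actuators = {}  # written by the simulation, read by the process

def simulation_step(image):
    """One simulation cycle: react to the real state, then modify it."""
    height = image.sensors.get("table_height", 0.0)
    # The simulation influences the real process by writing actuator values.
    image.actuators["drive"] = "stop" if height > 0.8 else "raise"
```

In a real installation the process connection would cyclically refresh the sensor entries and transfer the actuator entries to the plant.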
- At least one process connection is provided in an advantageous embodiment which changes the state of a real process, in particular as a function of the state values stored in the second storage medium, and allows the current state of a real process to be recorded.
- a third storage medium in which data are stored, which allow a reconstruction of a process simulated by means of the simulation system.
- the third storage medium contains data recorded continuously or cyclically over a defined period of time. They are sufficiently precise to play back events recorded by the simulation system in slow motion or time lapse, both forward and backward along the time axis.
- FIG. 1 shows a schematic illustration of a system for displaying information, in particular augmented reality information, for at least one user
- FIG. 2 shows a detailed illustration of an embodiment of the system shown in FIG. 1
- FIG. 1 shows a schematic illustration of a system 1 for displaying information, in particular augmented reality information, for at least one user 2.
- a real arrangement 11 which can be, for example, a lifting table.
- the user 2 carries a detection device 3a on his head, for example a sensor, which is part of a detection unit 3. With the aid of the detection device 3a, the surroundings of the user 2 are detected and corresponding environmental information 4, which identifies the position and the viewing angle of the user 2 in relation to the real arrangement 11, is generated with a processing unit 3b.
- the system has a first storage medium 5 in which
- Image information 6 relating to real and virtual objects is stored.
- the image information 6 describes the real and virtual objects in the form of three-dimensional models.
- the system 1 also includes a simulation system 7 with which simulation data 12 are generated.
- the image information 6 stored in the storage medium 5 is continuously updated by the simulation data 12 generated by the simulation system 7.
- a processing unit 8 now links the image information 6 and the environmental information 4 to a new data set 9, which can be displayed with the aid of a playback unit 10.
- by linking the image information 6 with the environmental information 4 by means of the processing device 8, it is now possible to display the newly generated image information 9 in the user's field of vision with positional accuracy. In this case, concealments by the lifting table 11 are taken into account for the user on the basis of the geometric arrangements described by the image information 6.
- the visualization of the data 6 stored in the storage medium 5 for the description of virtual objects thus takes place in the context of the real environment of the user 2, so that the user 2 receives a correct representation of both the virtual and the real objects in his surroundings in all three spatial dimensions.
- the simulation system 7 also makes it possible to dynamize the real and virtual objects stored in the first storage medium 5.
- FIG. 2 shows a detailed illustration of an embodiment of the system shown in FIG. 1.
- a real arrangement 11 which in turn can be a lifting table.
- a detection unit 3 consisting of a detection device 3a, which is designed in particular as a sensor, and a processing unit 3b
- the position and the viewing angle of the user 2 in relation to the real arrangement 11 are detected and passed to the processing unit 8 in the form of environmental information 4, which takes the form of a matrix.
- the processing unit 3b can in particular be a tracking system.
- a first storage medium 5 contains image information 6 for describing the three-dimensional model of one or more virtual or real arrangements.
- the virtual or real arrangements are a virtual robot 18a and virtual packets 18b.
- for visualization, the virtual packages 18b are, for example, transported and/or lifted by the real lifting table 11 and sorted out by the virtual robot 18a according to one or more quality criteria.
- a simulation system 7 generates a lot of simulation data 12, on the basis of which the image information 6 stored in the storage medium 5 is continuously updated. With the aid of the simulation system 7, dynamization of the virtual objects 18a and 18b is possible.
- a simulation model is stored in a fourth storage medium 19 and contains all the data necessary to simulate, with sufficient accuracy, the physical behavior and the control behavior of both real and virtual components.
- the simulation model also describes the dependency between the objects (e.g. package lying on the lift table and to be transported according to the current conveying speed).
- the processing unit 8 combines the environmental information 4 and the image information 6 of the virtual arrangements 18a and 18b, which is continuously updated by the simulation system, to form a new data set 9.
- a fifth storage medium 20 stores calibration information in the form of matrices which describe geometric deviations between the sensor of the detection device 3a, the eye of the user 2 and the display unit 10.
- the processing unit 8 masks out those parts of the virtual geometric arrangements 18a and 18b that are covered by the lifting table 11.
- the user 2 receives a correct three-dimensional impression of the augmented reality consisting of the lifting table 11, the virtual robot 18a and the virtual packages 18b.
- the quantity of data 9 generated by the processing unit 8 is converted with the aid of a video card 21 into a signal which can be represented by the reproduction unit 10.
- the system 1 further comprises an application control 14, with the aid of which the simulation system 7 can be accessed and a real process, in this case e.g. an active lifting table 11, can be controlled.
- user 2 has a user interface 15 available, which e.g. can include a mouse, a keyboard or a microphone.
- the control commands entered by user 2 via user interface 15 are converted into a signal for application control 14 using one or more device drivers 22.
- the commands available for the user 2 can be displayed by the application controller 14 on the playback device 10 using the video card 21.
- a process control 23 integrated into the application control 14 contains the necessary execution system and the necessary control programs in order to control the lifting table 11 according to a predetermined sequence scheme.
- This can be, for example, a programmable logic controller (PLC) with its associated data and function blocks.
- PLC programmable logic controller
- the current status values of the lifting table 11, as well as actuator values currently to be set via a process interface 24, are stored in a second storage medium 13.
- the sensor and actuator values stored in the second storage medium 13 can both be read by the simulation system 7, in order to make corresponding modifications to the current simulation, and be modified by it, in order to effect a change in the current process via the process interface 24.
- the current sensor values of the lifting table 11 can be read in and current actuator values of the lifting table 11 can be set.
- the sensor values are stored in the process image located in the second storage medium 13 and the actuator values are read from the process image located in the second storage medium 13.
- the system 1 described thus enables the user 2, via the user interface 15, to access both the simulation process and the real process on the lifting table 11. Furthermore, the real process and the simulation can influence one another.
- the system 1 has a third storage medium 16 in which data generated during the simulation are continuously stored. These data, which are recorded continuously or cyclically over a defined period of time, are sufficiently precise to play back events recorded by the simulation system 7 in slow motion or time lapse. This is possible in both the forward and backward directions along the time axis.
- System 1 described here has two main modes. In the first mode, process active, the real components are controlled directly via the process controller 23 and the process connection 17. In the second mode, process passive, the real components are not addressed via the process connection 17. The modes are activated via the application control 14.
- the behavior of system 1 or the system components in the two modes is as follows.
- the real components are addressed via the process connection 17.
- the system components are set so that the calculated or recorded positions of the real components, in this case the lifting table 11, are reflected in the three-dimensional model; however, these components are not faded in by the processing system 8 but are used only for masking out the hidden parts of the virtual components 18a and 18b.
- the real components are not addressed via the process connection 17 and are in a defined idle state.
- the invention relates to a system and a method within an augmented reality (AR) system for the visualization of simulation results in a mixed virtual-real environment. The system or method enables one or more users to carry out simulation processes in the context of a real environment, in particular in the field of industrial automation technology, and to visualize their static and dynamic results in the context of the real environment. Processes running in the real environment are recorded and synchronized with the simulation. With the help of a control unit, mutual influencing of real processes and the simulation is made possible. Furthermore, the user can control the simulation process via a user interface.
- AR augmented reality
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Processing Or Creating Images (AREA)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE10352893 | 2003-11-10 | ||
DE102004016329A DE102004016329A1 (de) | 2003-11-10 | 2004-04-02 | System und Verfahren zur Durchführung und Visualisierung von Simulationen in einer erweiterten Realität |
PCT/EP2004/052783 WO2005045729A1 (de) | 2003-11-10 | 2004-11-03 | System und verfahren zur durchführung und visualisierung von simulationen in einer erweiterten realität |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1683063A1 true EP1683063A1 (de) | 2006-07-26 |
Family
ID=34575428
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP04804510A Ceased EP1683063A1 (de) | 2003-11-10 | 2004-11-03 | System und verfahren zur durchführung und visualisierung von simulationen in einer erweiterten realität |
Country Status (3)
Country | Link |
---|---|
US (1) | US7852355B2 (de) |
EP (1) | EP1683063A1 (de) |
WO (1) | WO2005045729A1 (de) |
Families Citing this family (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8195386B2 (en) * | 2004-09-28 | 2012-06-05 | National University Corporation Kumamoto University | Movable-body navigation information display method and movable-body navigation information display unit |
US20090055156A1 (en) * | 2007-08-24 | 2009-02-26 | Rockwell Automation Technologies, Inc. | Using commercial computing package models to generate motor control code |
US7899777B2 (en) * | 2007-09-27 | 2011-03-01 | Rockwell Automation Technologies, Inc. | Web-based visualization mash-ups for industrial automation |
JP5384178B2 (ja) * | 2008-04-21 | 2014-01-08 | 株式会社森精機製作所 | 加工シミュレーション方法及び加工シミュレーション装置 |
JP5465957B2 (ja) * | 2008-09-05 | 2014-04-09 | Dmg森精機株式会社 | 加工状態確認方法及び加工状態確認装置 |
DE102009029064A1 (de) * | 2008-09-05 | 2010-04-01 | Mori Seiki Co., Ltd., Yamatokoriyama-shi | Verfahren und Vorrichtung zur Bearbeitungszustandsüberwachung |
US9514654B2 (en) | 2010-07-13 | 2016-12-06 | Alive Studios, Llc | Method and system for presenting interactive, three-dimensional learning tools |
USD648390S1 (en) | 2011-01-31 | 2011-11-08 | Logical Choice Technologies, Inc. | Educational card |
USD648796S1 (en) | 2011-01-31 | 2011-11-15 | Logical Choice Technologies, Inc. | Educational card |
USD647968S1 (en) | 2011-01-31 | 2011-11-01 | Logical Choice Technologies, Inc. | Educational card |
USD654538S1 (en) | 2011-01-31 | 2012-02-21 | Logical Choice Technologies, Inc. | Educational card |
USD675648S1 (en) | 2011-01-31 | 2013-02-05 | Logical Choice Technologies, Inc. | Display screen with animated avatar |
USD648391S1 (en) | 2011-01-31 | 2011-11-08 | Logical Choice Technologies, Inc. | Educational card |
US8686871B2 (en) | 2011-05-13 | 2014-04-01 | General Electric Company | Monitoring system and methods for monitoring machines with same |
US20130343640A1 (en) | 2012-06-21 | 2013-12-26 | Rethink Robotics, Inc. | Vision-guided robots and methods of training them |
JP5983442B2 (ja) * | 2013-01-31 | 2016-08-31 | 富士通株式会社 | プログラム、演算装置および演算方法 |
WO2015160828A1 (en) | 2014-04-15 | 2015-10-22 | Huntington Ingalls Incorporated | System and method for augmented reality display of dynamic environment information |
WO2015164755A1 (en) | 2014-04-25 | 2015-10-29 | Huntington Ingalls Incorporated | Augmented reality display of dynamic target object information |
US9864909B2 (en) | 2014-04-25 | 2018-01-09 | Huntington Ingalls Incorporated | System and method for using augmented reality display in surface treatment procedures |
DE102014209367A1 (de) * | 2014-05-16 | 2015-11-19 | Siemens Aktiengesellschaft | Prozessvisualisierungsvorrichtung und Verfahren zum Anzeigen von Prozessen wenigstens einer Maschine oder Anlage |
US10504294B2 (en) * | 2014-06-09 | 2019-12-10 | Huntington Ingalls Incorporated | System and method for augmented reality discrepancy determination and reporting |
WO2015191346A1 (en) | 2014-06-09 | 2015-12-17 | Huntington Ingalls Incorporated | System and method for augmented reality display of electrical system information |
US10915754B2 (en) * | 2014-06-09 | 2021-02-09 | Huntington Ingalls Incorporated | System and method for use of augmented reality in outfitting a dynamic structural space |
WO2016011149A1 (en) * | 2014-07-16 | 2016-01-21 | Huntington Ingalls Incorporated | System and method for augmented reality display of hoisting and rigging information |
US9881422B2 (en) * | 2014-12-04 | 2018-01-30 | Htc Corporation | Virtual reality system and method for controlling operation modes of virtual reality system |
US10052170B2 (en) | 2015-12-18 | 2018-08-21 | MediLux Capitol Holdings, S.A.R.L. | Mixed reality imaging system, apparatus and surgical suite |
US11386556B2 (en) | 2015-12-18 | 2022-07-12 | Orthogrid Systems Holdings, Llc | Deformed grid based intra-operative system and method of use |
EP3260255B1 (de) * | 2016-06-24 | 2019-08-21 | Zünd Systemtechnik Ag | System zum schneiden von schneidgut |
US20180061269A1 (en) * | 2016-09-01 | 2018-03-01 | Honeywell International Inc. | Control and safety system maintenance training simulator |
US10782668B2 (en) | 2017-03-16 | 2020-09-22 | Siemens Aktiengesellschaft | Development of control applications in augmented reality environment |
JP7170736B2 (ja) * | 2017-11-01 | 2022-11-14 | ヴイアールジニアズ インコーポレイテッド | 双方向性拡張または仮想現実装置 |
TWI659279B (zh) | 2018-02-02 | 2019-05-11 | 國立清華大學 | 基於擴充實境的加工規劃設備 |
DE102018113336A1 (de) * | 2018-06-05 | 2019-12-05 | GESTALT Robotics GmbH | Verfahren zum Verwenden mit einer Maschine zum Einstellen einer Erweiterte-Realität-Anzeigeumgebung |
AT521390B1 (de) * | 2018-06-29 | 2021-11-15 | Wittmann Tech Gmbh | Verfahren zur laufenden Speicherung von internen Betriebszuständen und zur Visualisierung von zeitlich zurückliegenden Ablaufsequenzen sowie Roboter und/oder Robotsteuerung hierfür |
US11540794B2 (en) | 2018-09-12 | 2023-01-03 | Orthogrid Systesm Holdings, LLC | Artificial intelligence intra-operative surgical guidance system and method of use |
JP7466928B2 (ja) | 2018-09-12 | 2024-04-15 | オルソグリッド システムズ ホールディング,エルエルシー | 人工知能の術中外科的ガイダンスシステムと使用方法 |
USD910652S1 (en) | 2019-01-31 | 2021-02-16 | OrthoGrid Systems, Inc | Display screen or portion thereof with a graphical user interface |
USD979578S1 (en) | 2021-02-08 | 2023-02-28 | Orthogrid Systems Holdings, Llc | Display screen or portion thereof with a graphical user interface |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5997913A (en) | 1990-12-10 | 1999-12-07 | Genencor International, Inc. | Method enhancing flavor and aroma in foods by overexpression of β-glucosidase |
US5815411A (en) * | 1993-09-10 | 1998-09-29 | Criticom Corporation | Electro-optic vision system which exploits position and attitude |
US5625765A (en) * | 1993-09-03 | 1997-04-29 | Criticom Corp. | Vision systems including devices and methods for combining images for extended magnification schemes |
DE50003531D1 (de) * | 1999-03-02 | 2003-10-09 | Siemens Ag | System und verfahren zur situationsgerechten unterstützung der interaktion mit hilfe von augmented-reality-technologien |
US7324081B2 (en) * | 1999-03-02 | 2008-01-29 | Siemens Aktiengesellschaft | Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus |
US20030210228A1 (en) * | 2000-02-25 | 2003-11-13 | Ebersole John Franklin | Augmented reality situational awareness system and method |
EP1182541A3 (de) * | 2000-08-22 | 2005-11-30 | Siemens Aktiengesellschaft | System und Verfahren zum kombinierten Einsatz verschiedener Display-/Gerätetypen mit systemgesteuerter kontextabhängiger Informationsdarstellung |
JP3406965B2 (ja) * | 2000-11-24 | 2003-05-19 | キヤノン株式会社 | 複合現実感提示装置及びその制御方法 |
JP2005500096A (ja) * | 2001-06-13 | 2005-01-06 | ヴォリューム・インタラクションズ・プライヴェート・リミテッド | ガイドシステム |
US20030014212A1 (en) * | 2001-07-12 | 2003-01-16 | Ralston Stuart E. | Augmented vision system using wireless communications |
US7002551B2 (en) * | 2002-09-25 | 2006-02-21 | Hrl Laboratories, Llc | Optical see-through augmented reality modified-scale display |
US7050078B2 (en) * | 2002-12-19 | 2006-05-23 | Accenture Global Services Gmbh | Arbitrary object tracking augmented reality applications |
DE10305384A1 (de) * | 2003-02-11 | 2004-08-26 | Kuka Roboter Gmbh | Verfahren und Vorrichtung zur Visualisierung rechnergestützter Informationen |
US7394459B2 (en) * | 2004-04-29 | 2008-07-01 | Microsoft Corporation | Interaction between objects and a virtual environment display |
-
2004
- 2004-11-03 US US10/578,940 patent/US7852355B2/en active Active
- 2004-11-03 EP EP04804510A patent/EP1683063A1/de not_active Ceased
- 2004-11-03 WO PCT/EP2004/052783 patent/WO2005045729A1/de active Application Filing
Non-Patent Citations (4)
Title |
---|
DIXON K ET AL: "RAVE: a real and virtual environment for multiple mobile robot systems", INTELLIGENT ROBOTS AND SYSTEMS, 1999. IROS '99. PROCEEDINGS. 1999 IEEE /RSJ INTERNATIONAL CONFERENCE ON KYONGJU, SOUTH KOREA 17-21 OCT. 1999, PISCATAWAY, NJ, USA,IEEE, US, vol. 3, 17 October 1999 (1999-10-17), pages 1360 - 1367, XP010362379, ISBN: 978-0-7803-5184-4, DOI: 10.1109/IROS.1999.811669 * |
INNOCENTI M ET AL: "A synthetic environment for simulation and visualization of dynamic systems", AMERICAN CONTROL CONFERENCE, 1999. PROCEEDINGS OF THE 1999 SAN DIEGO, CA, USA 2-4 JUNE 1999, PISCATAWAY, NJ, USA,IEEE, US, vol. 3, 2 June 1999 (1999-06-02), pages 1769 - 1773, XP010344647, ISBN: 978-0-7803-4990-2, DOI: 10.1109/ACC.1999.786144 * |
See also references of WO2005045729A1 * |
VIJAIMUKUND RAGHAVAN ET AL: "Interactive Evaluation of Assembly Sequences Using Augmented Reality", IEEE TRANSACTIONS ON ROBOTICS AND AUTOMATION, IEEE INC, NEW YORK, US, vol. 15, no. 3, 1 June 1999 (1999-06-01), XP011053407, ISSN: 1042-296X * |
Also Published As
Publication number | Publication date |
---|---|
US20070088526A1 (en) | 2007-04-19 |
US7852355B2 (en) | 2010-12-14 |
WO2005045729A1 (de) | 2005-05-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1683063A1 (de) | System und verfahren zur durchführung und visualisierung von simulationen in einer erweiterten realität | |
EP1141799B1 (de) | System und verfahren zum bedienen und beobachten eines automatisierungssystems | |
EP1701233B1 (de) | Generierung virtueller Welten auf Basis einer realen Umgebung | |
DE60302063T2 (de) | Graphische benutzeroberfläche für einen flugsimulator basierend auf einer client-server-architektur | |
EP1163557B1 (de) | System und verfahren zur dokumentationsverarbeitung mit mehrschichtiger strukturierung von informationen, insbesondere für technische und industrielle anwendungen | |
EP1730970A1 (de) | Vorrichtung und verfahren zur gleichzeitigen darstellung virtueller und realer umgebungsinformationen | |
DE102013100698A1 (de) | Verfahren und Vorrichtung für den Einsatz industrieller Anlagensimulatoren unter Verwendung von Cloud-Computing-Technologien | |
DE102004016329A1 (de) | System und Verfahren zur Durchführung und Visualisierung von Simulationen in einer erweiterten Realität | |
WO2005101148A2 (de) | Verfahren und system zur virtuellen inbetriebsetzung einer technischen anlage mit bevorzugter verwendung | |
EP3418839A1 (de) | Verfahren zur steuerung einer automatisierungsanlage | |
DE3925275A1 (de) | Verfahren zur manipulation in unzugaenglichen arbeitsraeumen | |
DE102005050350A1 (de) | System und Verfahren zur Überwachung einer technischen Anlage sowie Datenbrille auf Basis eines solchen Systems | |
EP3686700A1 (de) | Verfahren zur überwachung einer automatisierungsanlage | |
DE102018212944A1 (de) | Verfahren zur Unterstützung der Kollaboration zwischen Mensch und Roboter mittels einer Datenbrille | |
DE19751273A1 (de) | Verfahren zum computergestützten Erstellen und Handhaben einer auf Produkt- oder Prozesslebenszyklen bezugnehmenden technischen Datenbank | |
DE102005014979B4 (de) | Verfahren und Anordnung zur Planung von Fertigungseinrichtungen | |
Chencheva et al. | Application of visualization systems based on augmented reality technology in teaching students of technical specialties | |
WO2022258343A1 (de) | Audiovisuelles assistenzsystem, verfahren und computerprogramm zur unterstützung von wartungsarbeiten, reparaturarbeiten oder installationsarbeiten in einer industriellen anlage | |
EP4287043A1 (de) | Generierung von hologrammen zur darstellung in einer augmented reality umgebung | |
DE10141521C1 (de) | Darstellung von Anwenderinformationen | |
DE102004061842A1 (de) | Tracking System für mobile Anwendungen | |
Neumeister et al. | Evaluation Of An Augmented Reality Qualification System For Manual Assembly And Maintenance | |
DE29712807U1 (de) | Vorrichtung zur visuellen Darstellung von Reinigungs-, Inspektions- und Sanierungsarbeiten | |
WO2024104560A1 (de) | Generierung von hologrammen zur darstellung in einer augmented reality umgebung | |
DE102018214727A1 (de) | System zum Durchführen von computerunterstützten Simulationen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20060508 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): DE FR GB IT |
|
DAX | Request for extension of the european patent (deleted) | ||
RBV | Designated contracting states (corrected) |
Designated state(s): DE FR GB IT |
|
17Q | First examination report despatched |
Effective date: 20090626 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: SIEMENS AKTIENGESELLSCHAFT |
|
APBK | Appeal reference recorded |
Free format text: ORIGINAL CODE: EPIDOSNREFNE |
|
APBN | Date of receipt of notice of appeal recorded |
Free format text: ORIGINAL CODE: EPIDOSNNOA2E |
|
APBR | Date of receipt of statement of grounds of appeal recorded |
Free format text: ORIGINAL CODE: EPIDOSNNOA3E |
|
APAF | Appeal reference modified |
Free format text: ORIGINAL CODE: EPIDOSCREFNE |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: SIEMENS AKTIENGESELLSCHAFT |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: SIEMENS AKTIENGESELLSCHAFT |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R003 |
|
APBT | Appeal procedure closed |
Free format text: ORIGINAL CODE: EPIDOSNNOA9E |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20181016 |