EP3384469A2 - Verfahren zur Darstellung einer Simulationsumgebung (Method for displaying a simulation environment) - Google Patents
Verfahren zur Darstellung einer Simulationsumgebung (Method for displaying a simulation environment)
- Publication number
- EP3384469A2 (application EP16826698.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- real
- simulation environment
- image
- projection
- environment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
Definitions
- The present invention relates to a method for displaying a computer-generated simulation environment simulating a real environment, using a database containing data of a real terrain and of real objects located in the terrain.
- Such methods for representing simulation environments are used in various forms, in particular, but by no means exclusively, for training and instruction purposes. The training purpose can vary widely; generic methods are used, for example, in the training and instruction of pilots and train drivers. More generally, such methods are preferably used when the activity to be performed within the simulation, i.e. the interaction of a recipient of the displayed simulation environment with that environment, would in reality entail significant health and/or financial risks.
- The training effects achieved or achievable with the depiction depend largely on a faithful representation of the real world, i.e. the real environment, in the displayed simulation environment. For example, preparation for a deployment in the military sector becomes more effective the more closely the displayed simulation environment resembles the real environment of the planned deployment.
- Conventionally, a surface model is built from the data representing the real environment, in which objects, in particular buildings, are recognized, extracted, modeled and re-integrated. Only after all these steps can the simulation environment created in this way be displayed. This requires a great deal of time; the modeling of recognized objects in particular is both time-consuming and error-prone.
- The data is obtained by evaluating images recorded during an overflight over and/or a transit through the real environment.
- The data comprises a geospecific image of the real terrain and/or the real objects and is stored as raster data in the database.
- A height grid is generated and stored in the database, assigning a height value to each raster point of the raster data; for at least a part of the surfaces of the simulation environment spanned by raster points, a color texture is determined by projecting at least one image recording onto the generated height grid.
- The method according to the invention thus enables a particularly realistic representation of a simulation environment based on a height grid, with comparatively little effort and a high degree of automation.
- A corresponding height raster can be generated quickly, precisely and automatically from the images of the real environment using well-known methods such as "structure from motion".
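As an illustration (not part of the patent text), the last step of such a pipeline, turning a reconstructed 3D point cloud into a height raster, can be sketched as a simple binning operation; the `cell_size` parameter and the keep-the-maximum rule are assumptions for this sketch:

```python
import numpy as np

def rasterize_height_grid(points, cell_size):
    """Bin a 3D point cloud (e.g. from structure-from-motion) into a
    regular height grid, keeping the maximum height per cell; cells
    without any point stay NaN."""
    points = np.asarray(points, dtype=float)
    x0, y0 = points[:, 0].min(), points[:, 1].min()
    ix = ((points[:, 0] - x0) / cell_size).astype(int)
    iy = ((points[:, 1] - y0) / cell_size).astype(int)
    grid = np.full((iy.max() + 1, ix.max() + 1), np.nan)
    for cx, cy, z in zip(ix, iy, points[:, 2]):
        if np.isnan(grid[cy, cx]) or z > grid[cy, cx]:
            grid[cy, cx] = z
    return grid, (x0, y0)

points = [(0.0, 0.0, 1.0), (0.4, 0.1, 2.0), (1.2, 0.0, 5.0), (0.1, 1.1, 3.0)]
grid, origin = rasterize_height_grid(points, cell_size=1.0)
# grid == [[2, 5], [3, nan]]: each cell holds the tallest point that fell into it
```

A production system would typically use a robust percentile instead of the raw maximum to suppress reconstruction outliers.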
- However, such a height raster poses particular difficulties for vertical, or at least steep, surfaces of the simulation environment. On such surfaces, automated yet true-to-life coloring and texturing has not been possible with known methods; texturing and/or coloring can be performed manually, but this requires a correspondingly large effort.
- The method according to the invention makes it possible to project the image recordings, or parts of them, onto the generated height grid. Such a projection can be carried out automatically and at the same time yields up to photorealistic texturing of surfaces in the simulation environment.
- Furthermore, the method does not have to be executed and completed in advance of the presentation, so that the color textures would already be present on the surfaces spanned by the raster points. Rather, the projection of the at least one image recording onto the generated height grid, to determine a color texture for a part of the surfaces spanned by raster points, can take place during the presentation of the simulation environment. The time span between acquiring the image recordings of the real environment and displaying the simulation environment can thereby be further minimized.
- The geospecific image of the real environment is transformed into the raster data of the elevation grid by an unambiguous and reversible transformation.
- A georeferencing can be provided, which assigns to each grid point a corresponding point of the real environment.
- For this, it is necessary that the spatial coordinates of the real environment can be converted into a corresponding coordinate system of the simulation environment and back.
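A minimal sketch of such a reversible georeference (purely illustrative; the origin and cell size below are invented values, and real systems would use a full map projection rather than this affine map):

```python
# Hypothetical affine georeference between real-world coordinates
# (easting/northing in metres) and raster indices of the height grid.
ORIGIN = (352000.0, 5645000.0)  # assumed lower-left corner of the raster
CELL = 0.5                      # assumed raster cell size in metres

def world_to_grid(easting, northing):
    """Real-environment coordinates -> fractional raster (row, col)."""
    return (northing - ORIGIN[1]) / CELL, (easting - ORIGIN[0]) / CELL

def grid_to_world(row, col):
    """Raster (row, col) -> real-environment coordinates (exact inverse)."""
    return ORIGIN[0] + col * CELL, ORIGIN[1] + row * CELL

# Round trip: the transformation is unambiguous and reversible.
e, n = grid_to_world(*world_to_grid(352010.0, 5645020.0))
```

The round trip returning the original coordinates is exactly the reversibility property the claim requires.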
- A projection of several image recordings onto the generated height grid is carried out, the color texture of the surface being determined as the mean value of the color textures determined by the respective projections.
- This takes advantage of the fact that, given a high frame rate of the image recordings depicting the real environment, the surfaces to be provided with a projected color texture in the simulation environment are captured several times from different recording positions. Errors and inaccuracies, both of the image recordings themselves and those introduced by the projection, which would carry through if only a single image recording were projected onto the height grid, can thus be averaged out.
- Different filters, weights and/or smoothing can be used, so that the resulting average color texture corresponds as closely as possible to reality, i.e. to the surface in the real environment.
- Whole color textures, sections of color textures, or individual pixels of color textures can be averaged.
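The averaging described above can be sketched as follows (illustrative only; the optional `weights` argument stands in for the filters and weights mentioned in the text, and is not prescribed by the patent):

```python
import numpy as np

def average_textures(candidates, weights=None):
    """Fuse several projected color-texture candidates (H x W x 3 arrays)
    for the same surface into one texture. `weights` is a hypothetical
    per-recording quality weight (e.g. based on viewing angle or distance)."""
    stack = np.stack([np.asarray(c, dtype=float) for c in candidates])
    if weights is None:
        return stack.mean(axis=0)       # plain per-pixel mean
    w = np.asarray(weights, dtype=float)
    return np.tensordot(w / w.sum(), stack, axes=1)  # weighted mean

t1 = np.full((2, 2, 3), 10.0)
t2 = np.full((2, 2, 3), 30.0)
fused = average_textures([t1, t2])           # every pixel becomes 20.0
biased = average_textures([t1, t2], [3, 1])  # every pixel becomes 15.0
```

The same function works whether the candidates are whole textures, sections, or single pixels, matching the three granularities listed above.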
- The projection of the at least one image recording onto the height grid of the simulation environment's database takes place during the representation of the simulation environment.
- A likewise advantageous embodiment of the method provides that the image recording or recordings to be projected onto a surface of the simulation environment spanned by grid points are preprocessed, in particular with respect to the resolution of the image recording.
- This takes into account the fact that the image recordings of the real environment used to generate the simulation environment's database sometimes have a resolution significantly higher than the resolution that can be conveyed to, or perceived by, a user during execution of the method. Both the user and the system used to execute the method can be the limiting factor for resolution.
- The maximum resolution perceivable by the user depends on the distance between the viewing position and the viewed subject. Accordingly, a further particularly advantageous embodiment provides that the preprocessing of the image recording prior to the projection is performed depending on a freely selectable viewing position within the simulation environment, in particular depending on the distance between that viewing position and the position of the surface spanned by grid points onto which the image recording is projected.
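One common way to realize such distance-dependent preprocessing is mipmap-style power-of-two downsampling; the sketch below is an assumption about how the selection could work, not the patent's prescription:

```python
import math

def mip_level(distance_m, texel_size_m, focal_px, max_level=12):
    """Choose a power-of-two downsampling level for an image recording so
    that no finer resolution is supplied than the viewer can perceive.
    One texel of size `texel_size_m` at distance `distance_m` covers
    roughly focal_px * texel_size_m / distance_m pixels on screen."""
    projected_px = focal_px * texel_size_m / max(distance_m, 1e-6)
    if projected_px >= 1.0:
        return 0  # full resolution is still useful at this distance
    return min(int(math.log2(1.0 / projected_px)), max_level)
```

For example, with 1 cm texels and a 1000 px focal length, a surface 10 m away still merits full resolution, while at 80 m the image can be downsampled by a factor of 2^3 = 8 without visible loss.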
- An advantageous embodiment provides that the projection of the image recording takes place depending on the recording position in the real environment and the recording position in the simulation environment resulting from it.
- The image recordings taken during an overflight over and/or a transit through the real environment are provided with a position indication representing the recording position of each image recording.
- The recording direction in the real environment can be derived, for example, from the movement of the apparatus used to record the images and the respective orientation of the recording apparatus relative to that movement, and, as described above, transferred to the simulation environment and the elevation grid present there.
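Deriving a direction from the movement can be sketched as below; the assumption that the camera looks along the direction of motion is ours for illustration, whereas a real system would additionally apply the known mounting orientation of the camera relative to the platform:

```python
import numpy as np

def recording_directions(track):
    """Approximate a recording direction for each image from the platform
    track (the successive recording positions), assuming the camera looks
    along the direction of motion."""
    track = np.asarray(track, dtype=float)
    deltas = np.diff(track, axis=0)                      # motion between fixes
    dirs = deltas / np.linalg.norm(deltas, axis=1, keepdims=True)
    return np.vstack([dirs, dirs[-1:]])  # reuse last heading for final image

# Fly east, then turn north: headings follow the track.
dirs = recording_directions([(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)])
```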
- the results of the projections for the determination of a color texture are further improved by the following embodiment of the method:
- This provides that the projection of the image recording takes place as a function of imaging properties of the recording device with which the image recording was generated.
- Such imaging properties of the recording device may be, for example, the solid angle that is detected or imaged starting from the position of the recording device.
- other imaging properties can also improve the quality of the projections to be performed.
- The angular range can, for example, be transferred into the simulation environment together with the recording position and/or the recording direction, in particular into its elevation grid, whereby the assignment of a part of an image recording to a surface spanned by raster points in the simulation environment can be further improved or specified.
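Taken together, recording position, recording direction and angular range define a camera model. A minimal sketch, assuming an ideal pinhole camera (the patent does not fix a camera model), shows how a height-grid point maps into an image recording:

```python
import numpy as np

def project_to_image(point_w, cam_pos, R, f_px, cx, cy):
    """Map a world point into pixel coordinates with an ideal pinhole
    model; R rotates world axes into camera axes (z = viewing direction),
    f_px is the focal length in pixels, (cx, cy) the principal point.
    Returns None for points behind the camera."""
    p = np.asarray(R, float) @ (np.asarray(point_w, float) - np.asarray(cam_pos, float))
    if p[2] <= 0:
        return None  # outside the detected solid angle (behind the camera)
    return cx + f_px * p[0] / p[2], cy + f_px * p[1] / p[2]

# Camera at the origin looking along +z: a point 1 m right at 10 m depth
# lands 10 px right of the 50-px principal point.
uv = project_to_image((1.0, 0.0, 10.0), (0, 0, 0), np.eye(3), 100.0, 50.0, 50.0)
```

Inverting this mapping per raster point is precisely what assigns a part of an image recording to a surface of the simulation environment.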
- The projection advantageously comprises a control method which checks whether a part of a surface of the simulation environment spanned by raster points of the raster model is visible or hidden from the currently selected, freely selectable viewing position.
- The control method is executed using a depth map of the simulation environment derived from the raster model and the freely selectable viewing position.
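The core of such a depth-map test is a single comparison: a sample is hidden if something nearer was already recorded along its viewing ray. A minimal sketch (the epsilon tolerance is an assumed guard against depth quantization):

```python
import numpy as np

def is_visible(depth_map, u, v, sample_depth, eps=1e-3):
    """A surface sample is visible from the viewing position when its
    depth does not lie behind the first surface stored in the depth map
    at its projected pixel (u, v)."""
    return sample_depth <= depth_map[int(round(v)), int(round(u))] + eps

depth = np.full((4, 4), 10.0)           # assumed: nearest hit is 10 m everywhere
front = is_visible(depth, 1, 1, 9.5)    # lies in front of the stored surface
behind = is_visible(depth, 1, 1, 12.0)  # occluded by the stored surface
```

This is the same comparison used by shadow mapping in real-time rendering, only with the light replaced by the viewing position.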
- The projection of at least one image recording is applied to those surfaces of the simulation environment spanned by raster points which image steep, in particular vertically extending, surfaces of the real terrain and/or real objects.
- The projection of image recordings to obtain color textures can, in principle, be carried out for all regions of the simulation environment. In relatively flat sections of the real environment, i.e. sections with a large horizontal and only a slight or no vertical component, however, a relatively large number of raster points per surface is available after conversion into the raster data, so that other methods of obtaining and displaying colors or color textures provide an equally realistic result with less effort.
- data are stored in the database which represent models of real objects of the real environment.
- Models of real objects can, for example, be derived from the raster data in advance of executing the method: the images of real objects are first identified in the raster data, the corresponding raster data is extracted, and on this basis models, for example polygon-reduced models, of the objects are created, which are then also provided as part of the simulation environment's database.
- the models of the real objects are stored in the database in addition to the raster data which images the real objects.
- Known texturing methods that do not involve projecting an image recording can be used for the models, for example the polygon-reduced models of the objects.
- conventional texturing methods can be used for the models, which results in a less realistic coloring and / or texturing of the models.
- The representation of the computer-generated simulation environment simulating a real environment can take place in different modes: in a view mode, the representation is based on the raster data and on color textures determined at least partially by the projection of the image recordings; in an interaction mode, the representation is at least partially based on models of real objects.
- In view mode, the display of the simulation environment is based on the raster data and on the color textures projected from the image recordings onto at least part of the raster data.
- Depending on the operating mode, the embodiment described above thus achieves either a particularly realistic view of the simulation environment or an interaction with it, at the cost of some degree of realism.
- FIG. 2 shows an exemplary detail of a simulation environment and the viewing position arranged therein;
- FIG. 3 shows an exemplary representation of a projection of image recordings on surfaces of the raster data of the simulation environment spanned by raster points;
- FIG. 4 shows an alternative exemplary representation of a projection of an image recording onto a surface spanned by raster points;
- Fig. 5 is a schematic flow diagram of the method according to the invention.
- FIG. 1 shows a real environment 1, in which real objects 2, such as, for example, buildings or plants, are arranged in a real terrain 3.
- A recording device 4, such as a satellite or an unmanned aircraft, acquires image recordings 19 during an overflight over the real environment 1.
- A field of view 8 of the recording device 4 is outlined, which extends around a recording direction 17 of the recording device.
- The extent of the field of view 8 around the recording direction 17 depends on the imaging properties of the recording device.
- The field of view 8 of the recording device 4 is bounded by an imaginary imaging plane 9.
- the imaging plane 9 illustrates the mapping of the three-dimensional real environment 1 into a two-dimensional image acquisition 19.
- From the image recordings 19, a data set is obtained on the basis of which the real environment 1 can be transferred into a simulation environment.
- An image of the real terrain 3 and/or the real objects 2 can be stored in a database, the data being stored as raster data.
- FIG. 2 shows a section from a simulation environment for illustration according to the proposed method.
- A plan view of the simulation environment 10 is chosen, so that the raster points of the raster data lie in the drawing plane of FIG. 2, whereas the height values assigned to each raster point are perpendicular to the drawing plane of FIG. 2 and are not shown.
- The section of the simulation environment 10 shows a real-imaging object 11 in the form of a house and a real-imaging terrain 12. In the lower left area, a section of the grid is also shown, in which each grid point 13 is assigned a height value, thus generating the height grid of the simulation environment 10.
- The height values of the raster points 13 thus together form raster data generated from image recordings 19, acquired for example as described with reference to FIG. 1. The grid points 13 of the shown section make apparent that, for a portion of the simulation environment 10 with a flat course, i.e. one running essentially parallel to the drawing plane of FIG. 2, such as the house roof 18 of the object 11, a relatively large number of raster points 13 per surface is available. The situation is different for the surfaces 14 spanned by the grid points which represent the side walls of the real-imaging object 11 of the simulation environment 10.
- FIG. 2 furthermore shows a viewing position 15, freely selectable by the user of the method, which forms the starting point for displaying the computer-generated simulation environment 10 simulating a real environment. The position of the viewing position 15 relative to the real-imaging object 11 makes clear that the user perceives the surfaces 14.1 and 14.2 of the object at different angles.
- The surface 14.3 is not visible from the freely selectable viewing position 15 of FIG. 2, since the surface 14.1 obscures the view of the surface 14.3 from that position.
- The basic idea of the proposed method is to project image recordings 19 onto the generated height grid of a simulation environment database, in order to provide a part of the simulation environment's surfaces spanned by raster points with a color texture.
- FIG. 3 again shows a section of a simulation environment 10 in plan view, i.e. looking onto the plane of the raster points, in which the height values assigned to the grid points are perpendicular to the drawing plane.
- a real-imaging object 11 in the form of a house is also shown.
- To illustrate the projections of image recordings 19 onto the height raster, the assumed image planes 9 of two different image recordings 19 are shown by way of example in FIG. 3.
- the procedure for obtaining the image recordings 19 described with reference to FIG. 1 permits unambiguous positioning and alignment of the imaging plane 9 in the simulation environment 10.
- The image planes 9 also have a vertical component which, in the example of FIG. 3, is at least partially perpendicular to the drawing plane and is omitted in FIG. 3 for the sake of clarity.
- FIG. 3 illustrates how, on the basis of the image recordings 19 and their image planes 9, a projection is carried out by which a color texture is assigned to at least a part of the surfaces 14 of the simulation environment 10 spanned by the raster points 13.
- a mapping rule or projection rule is sketched on the basis of the dotted lines shown in FIG. 3, on the basis of which a first projection region 16.1 is projected from the image acquisition 19 with the imaging plane 9.1 onto the first surface 14.1 of the real imaging object 11 forming a side wall.
- The mapping rules of the projections, i.e. the course of the dotted lines shown by way of example as well as the projections, not shown, of the points of the image plane 9.1 lying between them onto the surfaces 14.1 and 14.2, are determined on the one hand by the raster data or the height grid, and on the other hand by the properties present when the image recording 19 was generated, such as recording position, recording direction and the like, which are reflected in simplified form in the orientation of the imaging plane 9.1.
- The imaging plane 9.2 of a second image recording 19 can likewise be used to provide the surfaces 14.1 and 14.2 with a color texture generated from the corresponding image recording, in accordance with the projection rules shown in dashed lines in FIG. 3. It can also be provided that the corresponding projection regions 16.3 and 16.4 of the image recording with image plane 9.2 are first averaged with the corresponding projection regions 16.1 and 16.2 of the image recording with image plane 9.1.
- In this way, the quality of the color texture generated by the projections for the surfaces 14.1 and 14.2 of the real-imaging object 11 can be further improved.
- The imaging plane 9.2 also allows, at least for a part of the side wall 14.3 of the real-imaging object 11, the generation of a color texture by way of a projection: the projection region 16.5 of the image recording 19 with the imaging plane 9.2 can be projected onto a part of the surface 14.3.
- FIG. 4 likewise shows a projection of a part of an image recording 19 onto a substantially vertical surface 14.1 spanned by raster points 13.
- The representation of FIG. 4 is a strong schematization of an image recording 19, owed not least to the clarity of the figure.
- A realistic image recording 19 would also encompass a variety of other contents such as vegetation, other objects, vehicles, people and animals.
- The side view of the object 2 in the image recording 19 of FIG. 4 is a deliberately simplified representation which cannot fully convey the advantages of the method according to the invention, since it is precisely not the photorealistic representation that can be used in the method. The basic principle nevertheless becomes clear from FIG. 4.
- The surfaces of the real environment captured in the image recordings 19, in particular the surfaces of real objects 2, are used for projection in the simulation environment 10, so that the presentation of the simulation environment 10 produces a correspondingly realistic impression in the viewer.
- The assignment of the image recording 19, and of the real object 2 depicted in it, to the real-imaging object 11 of the simulation environment 10, in particular to the surface 14.1, can be made unambiguous by linking the spatial coordinates and/or spatial directions of the real environment 1 documented during acquisition of the image recording 19 to the coordinates of the simulation environment 10.
- The simulation environment 10 may preferably have such a link to the real environment 1.
- The surface 14.1 of the simulation environment 10 is formed by the side wall of a real-imaging object 11 of the simulation environment 10.
- the real-imaging object 11 is represented as a three-dimensional object from a specific perspective, which, for example, goes back to a corresponding viewing position on the object 11 in the simulation environment.
- the indicated perspective view of the real-imaging object 11 illustrates some of the challenges of the method according to the invention.
- A part of the surface 14.1 at the upper right-hand edge of the side wall is covered by a part of the house roof 18. FIG. 4 also shows that, for example, the window arranged below the gable and the surrounding framework structure, which can be seen in the image recording 19, are hidden by the house roof 18 depending on the viewing position of the simulation environment 10 and of the object 11 of FIG. 4.
- a corresponding texturing of the surface 14.1 will therefore take into account the partial covering of the surface 14.1 by the house roof 18. This can for example be accomplished by a depth map which provides information about which parts of the raster data of the height grid of the simulation environment 10 are visible from the respective viewing position.
- The projection texture generated from the image recording 19 can then be correspondingly processed, for example cropped, so that parts of the surface 14.1 that are not visible from the viewing position of FIG. 4 are not textured.
- The image recording 19 shows the real object 2 from a perspective different from the perspective of the simulation environment 10.
- The part of the image recording 19 depicting the surface 14.1 is therefore tilted and/or distorted as part of the projection or mapping rule, indicated by the dot-dashed lines in FIG. 4, such that, from the viewing position of the simulation environment 10 of FIG. 4, the part of the image recording 19 imaging the visible part of the surface 14.1 is correspondingly arranged on, i.e. projected onto, the surface 14.1.
- FIG. 5 shows a detail of a flow chart of a method for representing a computer-generated simulation environment simulating a real environment.
- the overall method for representing the simulation environment may include a plurality of further method steps not shown in FIG. 5.
- the method sequence of FIG. 5 thus mainly relates to the projection of image recordings 19 carried out with the proposed method onto a height grid.
- In method step S1, for example, the determination or identification of the current viewing position 15 of the simulation environment 10 takes place, from which the simulation environment 10 is to be displayed to the viewer.
- an analysis of the raster data can take place in method step S2.
- This analysis may be directed to determining which surfaces 14 of the simulation environment 10 spanned by the raster points 13 are particularly suitable for texturing by projection of an image recording 19. This means that particularly steep or even vertical surfaces 14 in the vicinity of the viewing position 15 are identified in the raster data.
- a respective distance or average distance to the viewing position 15 can be determined.
- a depth map generated on the basis of the viewing position and the raster data can be used to determine the visibility of the surface 14 of the simulation environment 10 identified in method step S2 starting from the viewing position 15.
- Then, image recordings 19 are identified which at least partially depict the visible surfaces 14 of the simulation environment 10 that are to be provided with a color texture from the viewing position 15.
- an inverse transformation of the data of the simulation environment into the reference system of the real environment can be used in order to be able to determine in which image recordings the surfaces of the simulation environment to be textured are mapped.
- A preprocessing of the identified image recordings follows in the parallel method steps S4.1 and S4.2, which are shown by way of example.
- The preprocessing can be such that the resolution of the image recordings 19 is adapted as a function of the distance, determined in method step S2, between the surface 14 of the raster data of the simulation environment 10 and the viewing position 15.
- other or additional preprocessing steps for preprocessing the image recordings can also be carried out in method steps S4.1 and S4.2.
- the actual projection of the image recordings 19 or at least of parts of the image recordings on the height raster and its surface 14 is subsequently carried out.
- The imaging specifications of the projection or projections used here are determined both from the height raster of the raster data and from the available data on the recording properties of the respective image recording 19 in the real environment, and from the image characteristics of the image recording 19 in the simulation environment derived from them. In simple terms, this means that the pixels of the image recordings are shifted, rotated and/or distorted such that both the outline of the corresponding part of the image recording 19 and its content match the corresponding surface in the height grid of the raster data.
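Such a combined shift, rotation and distortion of pixel coordinates is what a 3x3 projective transform (homography) expresses. A minimal sketch, assuming the transform matrix has already been determined from the recording properties and the height raster:

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 projective transform to 2D pixel coordinates, as a
    stand-in for fitting a part of an image recording onto a surface of
    the height grid (shift, rotation and perspective distortion in one)."""
    pts = np.asarray(pts, dtype=float)
    hom = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous coordinates
    mapped = hom @ np.asarray(H, dtype=float).T
    return mapped[:, :2] / mapped[:, 2:3]            # back to Cartesian

H_shift = np.array([[1.0, 0.0, 2.0],
                    [0.0, 1.0, 3.0],
                    [0.0, 0.0, 1.0]])  # pure translation, for illustration
out = warp_points(H_shift, [(0.0, 0.0), (1.0, 1.0)])
```

With a non-trivial bottom row, the same function produces the perspective foreshortening needed when the image plane and the target surface are not parallel.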
- The subsequent method step S6 is provided for the case in which the method steps S4.1, S4.2, S5.1 and S5.2 relate to projections of image recordings 19 onto one and the same surface 14 of the simulation environment 10 spanned by raster points.
- an averaging or compensation calculation of the respectively determined color textures is undertaken.
- the color textures thus generated are applied to the height raster of the raster data of the simulation environment 10 in the course of the method step S7.
- In method step S8, the thus-textured surfaces 14 of the simulation environment 10 are displayed to the user together with the remaining components of the simulation environment 10.
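The overall loop of steps S2 to S7 can be condensed into a toy sketch; every field name and data shape below is a placeholder invented for illustration, not taken from the patent:

```python
def texture_frame(surfaces, images):
    """Toy run of steps S2..S7: keep the visible steep surfaces, gather
    all image recordings covering each one, and average their projected
    contributions into one texture value per surface."""
    textures = {}
    for s in surfaces:                   # S2/S3: steep surfaces + visibility
        if not s["visible"]:
            continue
        candidates = [img["value"] for img in images
                      if s["id"] in img["covers"]]   # S3/S5: match + project
        if candidates:
            textures[s["id"]] = sum(candidates) / len(candidates)  # S6: average
    return textures                      # S7: textures to be applied

result = texture_frame(
    [{"id": "wall", "visible": True}, {"id": "rear", "visible": False}],
    [{"covers": {"wall"}, "value": 2.0}, {"covers": {"wall"}, "value": 4.0}],
)
```

The occluded "rear" surface receives no texture, matching the depth-map check of step S3, while the "wall" surface averages its two covering recordings as in step S6.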
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Remote Sensing (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102015120927.6A DE102015120927A1 (de) | 2015-12-02 | 2015-12-02 | Verfahren zur Darstellung einer Simulationsumgebung |
PCT/DE2016/100563 WO2017092734A2 (de) | 2015-12-02 | 2016-12-01 | Verfahren zur darstellung einer simulationsumgebung |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3384469A2 true EP3384469A2 (de) | 2018-10-10 |
Family
ID=57821722
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16826698.9A Withdrawn EP3384469A2 (de) | 2015-12-02 | 2016-12-01 | Verfahren zur darstellung einer simulationsumgebung |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3384469A2 (de) |
DE (1) | DE102015120927A1 (de) |
WO (1) | WO2017092734A2 (de) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102016103057A1 (de) | 2016-02-22 | 2017-08-24 | Krauss-Maffei Wegmann Gmbh & Co. Kg | Verfahren zur Ermittlung und Darstellung von Veränderungen in einer ein reales Gelände und darin befindliche reale Objekte umfassenden Realumgebung |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050195096A1 (en) * | 2004-03-05 | 2005-09-08 | Ward Derek K. | Rapid mobility analysis and vehicular route planning from overhead imagery |
US7626591B2 (en) * | 2006-01-24 | 2009-12-01 | D & S Consultants, Inc. | System and method for asynchronous continuous-level-of-detail texture mapping for large-scale terrain rendering |
US7778491B2 (en) * | 2006-04-10 | 2010-08-17 | Microsoft Corporation | Oblique image stitching |
US8422825B1 (en) * | 2008-11-05 | 2013-04-16 | Hover Inc. | Method and system for geometry extraction, 3D visualization and analysis using arbitrary oblique imagery |
US20130321400A1 (en) * | 2012-06-05 | 2013-12-05 | Apple Inc. | 3D Map Views for 3D Maps |
CA2899584A1 (en) * | 2013-01-29 | 2014-10-23 | Andrew Robert Korb | Methods for analyzing and compressing multiple images |
- 2015-12-02: DE application DE102015120927.6A (published as DE102015120927A1), status: withdrawn
- 2016-12-01: WO application PCT/DE2016/100563 (published as WO2017092734A2), status: application filing
- 2016-12-01: EP application EP16826698.9A (published as EP3384469A2), status: withdrawn
Also Published As
Publication number | Publication date |
---|---|
DE102015120927A1 (de) | 2017-06-08 |
WO2017092734A3 (de) | 2017-08-03 |
WO2017092734A2 (de) | 2017-06-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE69107267T2 (de) | | Method and device for modifying a zone of successive images |
DE102007045835B4 (de) | | Method and device for displaying a virtual object in a real environment |
DE102009041431B4 (de) | | Driving simulation apparatus, wide-angle camera video simulation apparatus, and image deforming/compositing apparatus |
DE68928181T2 (de) | | Image generation device |
DE102007045834B4 (de) | | Method and device for displaying a virtual object in a real environment |
EP3438901A1 (de) | | Test drive scenario database system for realistic virtual test drive scenarios |
DE102011115739A1 (de) | | Method for integrating virtual objects into vehicle displays |
DE102007030226A1 (de) | | Camera-based navigation system and method for its operation |
EP3762857A1 (de) | | Surround view system with adapted projection surface |
WO2009049973A2 (de) | | Method for generating and/or updating textures of background object models, video surveillance system for carrying out the method, and computer program |
WO2008074561A1 (de) | | Method for displaying a map section in a navigation system, and navigation system |
DE102019005885A1 (de) | | Environment map generation and hole filling |
DE102015120999A1 (de) | | Method for generating and displaying a computer-generated simulation environment depicting a real environment |
EP2546778A2 (de) | | Method for evaluating an object recognition device of a motor vehicle |
DE19549096A1 (de) | | Simulation device and method |
EP2381207A1 (de) | | 3D target measurement and target designation from IR data |
EP1628262A2 (de) | | Method and device for displaying a three-dimensional topography |
DE102011082881A1 (de) | | Displaying the surroundings of a motor vehicle in a particular view using spatial information |
WO2017092734A2 (de) | | Method for displaying a simulation environment |
EP3384480A1 (de) | | Method for the preparatory simulation of a military operation in an operational area |
DE102022201279B3 (de) | | Method for capturing the surroundings of a vehicle, camera device, and vehicle |
EP3754544A1 (de) | | Recognition system, working method, and training method |
DE102015120929A1 (de) | | Method for the preparatory simulation of a military operation in an operational area |
WO2017182021A1 (de) | | Method and system for displaying a simulation environment |
DE102016124989A1 (de) | | Curb representation with a three-dimensional body in a driver assistance system for a motor vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20180702 |
| AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| 17Q | First examination report despatched | Effective date: 20210419 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20211029 |