CN115880463A - Mars surface detection planning visual simulation method, device, equipment and medium - Google Patents


Publication number: CN115880463A
Authority: CN (China)
Prior art keywords: detection, mars, scene, virtual, time
Legal status: Pending
Application number: CN202211385480.9A
Other languages: Chinese (zh)
Inventors: 高兴烨, 严韦, 任鑫, 曾兴国, 左维, 李春来
Current Assignee: National Astronomical Observatories of CAS
Original Assignee: National Astronomical Observatories of CAS
Application filed by National Astronomical Observatories of CAS
Priority to CN202211385480.9A
Publication of CN115880463A

Classifications

    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02A: Technologies for adaptation to climate change
    • Y02A 90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The disclosure provides a Mars surface detection planning visual simulation method, device, equipment and medium, which can be applied to the technical fields of deep space exploration and visual simulation. The method comprises the following steps: determining a target detection point of the Mars rover according to a virtual detection scene, wherein the virtual detection scene is established according to Mars terrain data, Mars rover model data and ephemeris data, and illumination related to the position of the sun is set in the virtual detection scene; determining a detection event for presentation in the virtual detection scene based on parameter information related to the detection event, the parameter information including time information of the detection event; generating real-time illumination matched with the detection event in the virtual detection scene based on the time information of the detection event; and, under the condition that real-time illumination matched with the detection event is simulated in the virtual detection scene, determining a visual simulation result of the Mars rover at the target detection point according to the display content related to the detection event in the virtual detection scene.

Description

Mars surface detection planning visual simulation method, device, equipment and medium
Technical Field
The disclosure relates to the technical fields of deep space exploration and visual simulation, and in particular to a Mars surface detection planning visual simulation method, device, equipment and medium.
Background
Surface exploration of Mars is usually carried out by a Mars rover. However, the Earth-Mars communication link has a long delay, and ground staff cannot control the rover in real time to complete scientific detection tasks. Ground staff therefore perform scene simulation from the acquired Mars detection data in order to understand the detection environment around the rover.
In the course of implementing the disclosed concept, the inventors found at least the following problem in the related art: the visual simulation of the Mars rover is performed directly in a virtual detection scene, which easily leads to the accident that, during actual detection, a detection point lies in the rover's own shadow.
Disclosure of Invention
In view of this, the present disclosure provides a method, an apparatus, a device and a medium for Mars surface exploration planning visual simulation.
One aspect of the present disclosure provides a Mars surface detection planning visual simulation method, including:
determining a target detection point of the Mars rover according to a virtual detection scene, wherein the virtual detection scene is established according to Mars terrain data, Mars rover model data and ephemeris data, and illumination related to the position of the sun is set in the virtual detection scene;
determining a detection event for presentation in the virtual detection scene based on parameter information related to the detection event, wherein the parameter information comprises time information of the detection event;
generating real-time illumination matched with the detection event in the virtual detection scene based on the time information of the detection event; and
under the condition of simulating real-time illumination matched with the detection event in the virtual detection scene, determining a visual simulation result of the Mars rover at the target detection point according to the display content related to the detection event in the virtual detection scene.
According to an embodiment of the present disclosure, wherein generating real-time illumination in the virtual detection scene matching the detection event based on the time information of the detection event comprises:
determining the detection time of the detection event according to the time information of the detection event;
determining the sun position of a target detection point in a virtual detection scene according to the detection time;
real-time illumination is generated in the virtual detection scene that matches the detection event based on the sun position.
According to an embodiment of the present disclosure, wherein determining a sun position of a target detection point in a virtual detection scene according to a detection time comprises:
acquiring the detection time of a previous frame in a virtual detection scene;
determining the detection time of the current frame based on the detection time of the previous frame according to the preset time multiple speed and frame interval;
and determining the sun position of the target detection point in the virtual detection scene based on the detection time of the current frame.
According to an embodiment of the present disclosure, the method further includes:
recording simulation information of the detection event in a visual simulation result in a text file, wherein the simulation information comprises parameter information of the detection event;
determining modification information in the case of modifying time information of the detection event; and
and displaying the detection event after the time information is modified in the virtual detection scene based on the modification information and the text file.
According to an embodiment of the present disclosure, the method further includes:
generating a first format file related to a rover model of the Mars rover by using rover parameter information of the Mars rover;
generating a second format file related to a detection point terrain model of the target detection point by using stereo images acquired by the navigation terrain cameras of the Mars rover;
generating a third format file related to the sky environment in the virtual detection scene by using a panoramic image captured by the Mars rover in a circular sweep on the landing platform; and
and establishing a virtual detection scene by using a three-dimensional rendering engine based on the first format file, the second format file and the third format file.
According to an embodiment of the present disclosure, the scientific payload of the Mars rover comprises: at least one navigation terrain camera and a multispectral camera, which are located at the top of the mast of the Mars rover; the method further comprises the following steps:
simulating the imaging ranges of the two navigation terrain cameras and the multispectral camera by using a semitransparent pyramid-like light beam model, based on the installation positions and camera parameters of the at least one navigation terrain camera and the multispectral camera; and
establishing the at least one navigation terrain camera and the multispectral camera in the virtual detection scene by utilizing a three-dimensional rendering engine so as to simulate their imaging ranges.
According to an embodiment of the present disclosure, the scientific payload of the Mars rover further comprises: a Mars surface composition detector, a Mars subsurface detection radar, a Mars surface magnetic field detector and a Mars meteorological measuring instrument.
Another aspect of the present disclosure provides a Mars surface detection planning visual simulation apparatus, including:
a first determining module, used for determining a target detection point of the Mars rover according to a virtual detection scene, wherein the virtual detection scene is established according to Mars terrain data, Mars rover model data and ephemeris data, and illumination related to the position of the sun is set in the virtual detection scene;
a second determining module, configured to determine a detection event for presentation in the virtual detection scene based on parameter information related to the detection event, where the parameter information includes time information of the detection event;
a generating module, used for generating real-time illumination matched with the detection event in the virtual detection scene based on the time information of the detection event; and
a third determining module, used for determining a visual simulation result of the Mars rover at the target detection point according to the display content related to the detection event in the virtual detection scene, under the condition that real-time illumination matched with the detection event is simulated in the virtual detection scene.
Another aspect of the present disclosure provides an electronic device including: one or more processors; and a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the above Mars surface detection planning visual simulation method.
Another aspect of the present disclosure provides a computer-readable storage medium having stored thereon executable instructions that, when executed by a processor, cause the processor to perform the above Mars surface detection planning visual simulation method.
According to the embodiments of the present disclosure, illumination related to the sun position is set in the virtual detection scene, and real-time illumination matched with a detection event can be generated from the time information of that event. A detection event can therefore be displayed under real-time illumination in the virtual detection scene, which enhances the scientific validity and realism of the virtual detection site, reduces the occurrence of accidents in which a detection point lies in the Mars rover's own shadow during actual detection, and helps optimize the selection of target detection points. This at least partially solves the technical problem in the related art that performing the rover's visual simulation directly in a virtual scene easily leads to the accident that a detection point lies in the rover's own shadow during actual detection.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following description of embodiments of the present disclosure with reference to the accompanying drawings, in which:
FIG. 1 schematically illustrates a flowchart of a Mars surface detection planning visual simulation method according to an embodiment of the disclosure;
FIG. 2 schematically illustrates a block diagram of a Mars surface detection planning visual simulation system according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a time-driven Mars surface scientific detection simulation method according to an embodiment of the disclosure;
FIG. 4 schematically illustrates a block diagram of a Mars surface detection planning visual simulation apparatus according to an embodiment of the present disclosure; and
fig. 5 schematically shows a block diagram of an electronic device adapted to implement the above described method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
In those instances where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). Where a convention analogous to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
In the related art, the Mars rover carries out patrol detection work on the Martian surface, whose environment is complex, and exchanges information with scientists and engineering staff on Earth. To ensure that the rover completes its patrol detection work safely, scientifically and efficiently, ground staff must, within the limited working time, fully understand the environment around the rover, select target detection points that satisfy the detection constraints and have the greatest scientific detection value, complete the optimization and verification of the scientific payload detection planning scheme for the selected target detection points, and finally upload the scientific detection instructions. However, the Earth-Mars communication link has a long delay, so ground staff cannot control the rover in real time to complete a scientific detection task; moreover, the scientific detection process faces conditions such as a complex and changeable Martian surface detection environment and narrow, unique scientific detection time windows. Ground staff therefore need to reconstruct, from the acquired Mars scientific detection data, a virtual scientific detection site identical to the real detection site on the Martian surface, so as to understand the detection environment around the rover, speed up decision analysis, and efficiently formulate, verify and optimize scientific detection schemes.
At present, after a virtual detection site of the Martian surface is constructed, simulation and verification of scientific detection activities are mostly carried out directly, without considering the influence of time change and of the detector's detection events on scientific detection constraints such as shadow and illumination at the target detection point. This can lead to the accident that the selected target detection point lies in the detector's own shadow during actual detection, wasting precious scientific detection time.
In view of this, the embodiments of the present disclosure provide a Mars surface detection planning visual simulation method. The method comprises: determining a target detection point of the Mars rover according to a virtual detection scene, wherein the virtual detection scene is established according to Mars terrain data, Mars rover model data and ephemeris data, and illumination related to the position of the sun is set in the virtual detection scene; determining a detection event for presentation in the virtual detection scene based on parameter information related to the detection event, wherein the parameter information comprises time information of the detection event; generating real-time illumination matched with the detection event in the virtual detection scene based on the time information of the detection event; and, under the condition of simulating real-time illumination matched with the detection event in the virtual detection scene, determining a visual simulation result of the Mars rover at the target detection point according to the display content related to the detection event in the virtual detection scene.
Fig. 1 schematically shows a flowchart of a Mars surface detection planning visual simulation method according to an embodiment of the present disclosure.
As shown in fig. 1, the method includes operations S101 to S104.
In operation S101, a target detection point of the Mars rover is determined according to a virtual detection scene, where the virtual detection scene is established according to Mars terrain data, Mars rover model data, and ephemeris data, and illumination related to the sun position is set in the virtual detection scene.
According to an embodiment of the present disclosure, the virtual detection scene may be displayed on a display screen of a terminal. In the established virtual detection scene, clicking a screen point emits a ray extending from that point along the view direction; the ray intersects the Mars terrain in the virtual detection scene at a three-dimensional coordinate point, with the Mars rover as the coordinate origin, and the intersection point may be determined as the target detection point of the rover. The target detection point may be determined by the target detection point auxiliary selection module, which can measure, select, mark, edit and store target detection points, and can display the distance and coordinates of a target detection point relative to the rover's origin coordinates. The module also provides a measurement function: selecting two points on the terrain computes the distance between them. A minimal sketch of this picking step is given below.
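The following is a minimal Unity3D C# sketch of this ray-picking step (Unity3D is the rendering engine adopted later in this disclosure). It assumes the terrain mesh carries a MeshCollider; the class and field names are illustrative, not taken from the patent.

```csharp
using UnityEngine;

// Hypothetical picking component: casts a ray from the clicked screen point
// and reports the terrain intersection relative to the rover origin.
public class TargetPointPicker : MonoBehaviour
{
    public Transform roverOrigin;  // assumed reference to the rover's origin frame

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Ray from the clicked screen point along the view direction.
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                // The intersection with the terrain collider is the candidate
                // target detection point.
                Vector3 local = roverOrigin.InverseTransformPoint(hit.point);
                float distance = Vector3.Distance(roverOrigin.position, hit.point);
                Debug.Log($"Target point (rover frame): {local}, distance: {distance:F2} m");
            }
        }
    }
}
```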
In operation S102, a detection event for presentation in a virtual detection scene is determined based on parameter information related to the detection event, wherein the parameter information includes time information of the detection event.
According to an embodiment of the present disclosure, the parameter information related to the detection event may include the overall detection planning target, characteristic information of the labeled scientific target points, the scientific detection constraint information of each scientific payload, and the time information of the detection event; the time information of the detection event may include detection window duration information, a set detection time, and the like.
In operation S103, real-time illumination matching the detection event is generated in the virtual detection scene based on the time information of the detection event.
According to an embodiment of the present disclosure, the time information of the detection event may be a preset time at which detection is performed on Mars. As time passes, the sun's position changes continuously during actual detection on Mars, so the illumination on the Martian surface changes. Therefore, real-time illumination matched with the detection event can be generated in the virtual detection scene based on the time information of the detection event: a parallel light source can be added to the virtual detection scene, and the height and angle of the parallel light can be updated in real time according to the altitude angle and azimuth angle of the sun's position, so as to construct a real-time light and shadow effect, as sketched below.
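As a hedged illustration, a Unity3D directional ("parallel") light can be oriented from the computed solar altitude and azimuth like this; the angle conventions (degrees, azimuth measured clockwise from north) are assumptions, not specified by the patent:

```csharp
using UnityEngine;

public class SunLightController : MonoBehaviour
{
    public Light sunLight;  // a directional light in the scene (assumed)

    // altitudeDeg: solar elevation above the horizon; azimuthDeg: clockwise from north.
    public void UpdateSun(float altitudeDeg, float azimuthDeg)
    {
        // A directional light shines along its local +Z axis, so pitching down
        // by the altitude and yawing by the azimuth makes it cast shadows as
        // the sun would at that position in the sky.
        sunLight.transform.rotation = Quaternion.Euler(altitudeDeg, azimuthDeg, 0f);
    }
}
```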
In operation S104, under the condition that real-time illumination matched with the detection event is simulated in the virtual detection scene, a visual simulation result of the Mars rover at the target detection point is determined according to the display content related to the detection event in the virtual detection scene.
According to the embodiments of the present disclosure, when real-time illumination matched with a detection event is simulated in the virtual detection scene, that is, when the display content of the virtual detection scene can be regarded as the illumination conditions of actual detection work, scientific detection can be performed at the target detection point according to the parameter information of the detection event, and a visual simulation result is determined from the display content of the virtual detection scene. The visual simulation result can record the rover's scientific detection process and related parameters in the virtual detection scene.
According to the embodiments of the present disclosure, illumination related to the sun position is set in the virtual detection scene, and real-time illumination matched with a detection event can be generated from the time information of that event. A detection event can therefore be displayed under real-time illumination in the virtual detection scene, which enhances the scientific validity and realism of the virtual detection site, reduces the occurrence of accidents in which a detection point lies in the Mars rover's own shadow during actual detection, and helps optimize the selection of target detection points. This at least partially solves the technical problem in the related art that performing the rover's visual simulation directly in a virtual scene easily leads to the accident that a detection point lies in the rover's own shadow during actual detection.
According to an embodiment of the present disclosure, wherein generating real-time illumination matching the detection event in the virtual detection scene based on the time information of the detection event comprises:
determining the detection time of the detection event according to the time information of the detection event;
determining the sun position of a target detection point in a virtual detection scene according to the detection time;
real-time illumination is generated in the virtual detection scene that matches the detection event based on the sun position.
According to an embodiment of the present disclosure, the detection point real-time illumination calculation module is implemented using the CSPICE tool library, the C-language application program interface of the SPICE space information auxiliary system. SPICE is an acronym formed from the five words Spacecraft, Planet, Instrument, Camera-matrix and Events. SPICE encapsulates ephemeris data in file form and provides interfaces for the C, Fortran, Python, IDL, MATLAB and other languages, allowing users to read data such as the position, heading and velocity of planets, spacecraft and other bodies in the field of deep space exploration. In this method, the N0066 version of the CSPICE library is used together with DE421 ephemeris data: by inputting the detection time and the longitude and latitude on the Martian surface, the positions of the sun and the earth relative to the target detection point on the Martian surface are obtained. The detection time may be included in the event information of the detection event.
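For illustration, converting a sun position vector expressed in the Mars body-fixed frame (as a CSPICE spkpos_c-style query would return it through whatever native binding is used; the binding itself is an assumption, not part of the patent) into the local solar altitude and azimuth at a site of given planetocentric latitude and longitude is the standard east-north-up projection:

```csharp
using System;

public static class SolarGeometry
{
    // latRad/lonRad: site planetocentric latitude and longitude in radians.
    // sun: site-relative sun position vector in the Mars body-fixed frame.
    public static (double altitudeDeg, double azimuthDeg) AltAz(
        double latRad, double lonRad, double[] sun)
    {
        // Local east-north-up basis vectors expressed in the body-fixed frame.
        double[] east  = { -Math.Sin(lonRad), Math.Cos(lonRad), 0.0 };
        double[] north = { -Math.Sin(latRad) * Math.Cos(lonRad),
                           -Math.Sin(latRad) * Math.Sin(lonRad),
                            Math.Cos(latRad) };
        double[] up    = {  Math.Cos(latRad) * Math.Cos(lonRad),
                            Math.Cos(latRad) * Math.Sin(lonRad),
                            Math.Sin(latRad) };

        double e = Dot(sun, east), n = Dot(sun, north), u = Dot(sun, up);
        double r = Math.Sqrt(e * e + n * n + u * u);

        double altitude = Math.Asin(u / r) * 180.0 / Math.PI;
        double azimuth  = Math.Atan2(e, n) * 180.0 / Math.PI;  // clockwise from north
        if (azimuth < 0) azimuth += 360.0;
        return (altitude, azimuth);
    }

    static double Dot(double[] a, double[] b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}
```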
According to the embodiments of the present disclosure, the positions of the sun and the earth relative to the target detection point on the Martian surface can be determined from the detection time; that is, the sun position can be determined, and the illumination direction can be derived from it, so that real-time illumination matched with a detection event can be generated in the virtual detection scene.
According to the embodiments of the present disclosure, generating real-time illumination lets the virtual detection scene display a detection event more faithfully to the rover's actual detection scene on Mars, reducing accidents caused by a target detection point lying in the rover's own shadow.
According to an embodiment of the present disclosure, wherein, according to the detection time, the sun position of the target detection point is determined in the virtual detection scene, comprising:
acquiring the detection time of a previous frame in a virtual detection scene;
determining the detection time of the current frame based on the detection time of the previous frame according to the preset time multiple speed and frame interval;
and determining the sun position of the target detection point in the virtual detection scene based on the detection time of the current frame.
According to the embodiments of the present disclosure, in an actual Mars surface detection task, the constraints of the scientific detection environment change over time. The display content of the virtual detection scene can be shown as video on the display screen of the terminal device, and a time drive can be added to the virtual detection scene through the simulation time submodule of the time and detection event simulation management module, so that the virtual detection scene truly reflects the influence of time change on the Mars surface scientific detection environment. The simulation time submodule can control the detection time by fast forward, slow down, playback, skip, pause and the like, and drives the update of time-varying environmental elements such as the solar altitude angle and solar azimuth angle, as well as the detection events.
For example, in each update, let the preset time multiple speed be n, the frame interval be t, and the detection time of the previous frame be t1. The detection time of the current frame is then t1 + n·t. From the detection time of the current frame, the sun's position can be determined, and the altitude angle and azimuth angle of the sun in the virtual detection scene are updated accordingly. A sketch of this clock follows.
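A minimal sketch of this simulation clock in Unity3D, where Time.deltaTime supplies the frame interval t and the multiple speed n is a public field; the field names are illustrative:

```csharp
using System;
using UnityEngine;

public class SimulationClock : MonoBehaviour
{
    public double speedMultiplier = 60.0;  // n: preset time multiple speed
    public DateTime detectionTime;         // t1: detection time of the previous frame

    void Update()
    {
        // Current frame's detection time: t1 + n * t, with t = Time.deltaTime.
        detectionTime = detectionTime.AddSeconds(speedMultiplier * Time.deltaTime);
        // The new detection time then drives the solar altitude/azimuth update
        // (see SunLightController above).
    }
}
```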
According to the embodiment of the disclosure, the detection time in the virtual detection scene is obtained, and the position of the sun is determined, so that the position of the sun can be updated in the virtual detection scene, and the real-time illumination of the sun can be displayed in the virtual detection scene.
According to an embodiment of the present disclosure, the method further includes:
recording simulation information of the detection event in a visual simulation result in a text file, wherein the simulation information comprises parameter information of the detection event;
determining modification information in the case of modifying time information of the detection event;
and displaying the detection event after the time information is modified in the virtual detection scene based on the modification information and the text file.
A detection event editing submodule is added on the basis of the simulation time submodule in the time and detection event simulation management module. Combined with the simulation time submodule, the detection event editing submodule provides preview and playback functions, so that a user can repeatedly watch the simulated detection process and verify and optimize the scientific detection process to obtain the optimal scientific detection scheme.
According to the embodiments of the present disclosure, while the detection event is shown in the virtual detection scene, the visual simulation result can be recorded in text form. The parameter information in the text file may include the times at which the simulation started and ended, the name of the detection event, and so on.
According to the embodiments of the present disclosure, after the detection event ends, if the user needs to adjust the detection time, the time information of the detection event can be modified and the modification information determined. The detection event editing submodule can access the recorded text file in real time, compare it with the modification information, and display the modified detection event in the virtual detection scene according to the comparison result, as sketched below.
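A possible shape for this text record and its read-back is sketched here; the tab-separated line format and the file name are assumptions, since the patent only states that simulation information (event name, start and end times, etc.) is recorded in a text file:

```csharp
using System;
using System.Globalization;
using System.IO;

public static class DetectionEventLog
{
    const string LogPath = "detection_events.txt";  // assumed file name

    // Append one event per line: name, start time, end time (ISO 8601).
    public static void Record(string eventName, DateTime start, DateTime end)
    {
        File.AppendAllText(LogPath, $"{eventName}\t{start:o}\t{end:o}\n");
    }

    // Read the log back, e.g. to compare against modified time information.
    public static (string name, DateTime start, DateTime end)[] Load()
    {
        var lines = File.ReadAllLines(LogPath);
        var events = new (string, DateTime, DateTime)[lines.Length];
        for (int i = 0; i < lines.Length; i++)
        {
            var parts = lines[i].Split('\t');
            events[i] = (parts[0],
                DateTime.Parse(parts[1], null, DateTimeStyles.RoundtripKind),
                DateTime.Parse(parts[2], null, DateTimeStyles.RoundtripKind));
        }
        return events;
    }
}
```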
According to an embodiment of the present disclosure, the method further includes:
generating a first format file related to a rover model of the Mars rover by using rover parameter information of the Mars rover;
generating a second format file related to a detection point terrain model of the target detection point by using stereo images acquired by the navigation terrain cameras of the Mars rover;
generating a third format file related to the sky environment in the virtual detection scene by using a panoramic image captured by the Mars rover in a circular sweep on the landing platform; and
and establishing a virtual detection scene by using a three-dimensional rendering engine based on the first format file, the second format file and the third format file.
According to the embodiments of the present disclosure, the rover model in the virtual detection scene can be generated by the rover model making module. Based on materials such as photographs of the Zhurong rover, scientific payload drawings and payload mounting position parameters, this module constructs a three-dimensional model of the Zhurong rover and its onboard scientific payloads in 3ds Max, and also constructs a motion node for each scientific payload so that the motion of each payload model can later be controlled in the three-dimensional engine. The first format file of the rover model may be an .fbx file.
According to the embodiments of the present disclosure, the coordinate system of the generated rover model is Cartesian: the origin lies at the geometric center of the rover's structural bottom plate, the heading direction of the rover is the Y axis, upward is the Z axis, and the X, Y and Z axes form a right-handed rectangular coordinate system.
According to the embodiments of the present disclosure, the detection point terrain model in the virtual detection scene can be generated by the detection point terrain model reconstruction module. Using the stereo image pairs obtained by the left and right navigation terrain cameras of the Zhurong rover, the camera parameters, the installation parameters of the navigation terrain cameras, and the mast pitch and yaw angles of the corresponding image pairs, a digital orthophoto map (DOM) and a digital elevation model (DEM) of the navigation terrain cameras' circular shooting area at the target detection point are produced by photogrammetry, and a mesh model in .obj format, i.e. the second format file, is generated. The photogrammetric processing can be done in Photoscan. The root mean square of the reprojection errors of all stereo pairs at the tie points is better than 0.3 pixel, which reflects the accuracy of the terrain mesh model reconstructed from the navigation terrain cameras. The coordinate system of the generated detection point terrain model is a local east-north-up Cartesian coordinate system with its origin at the midpoint of the baseline connecting the left and right navigation terrain cameras.
According to the embodiments of the present disclosure, the sky environment in the virtual detection scene can be generated by the detection point sky box making module, which constructs a sky bounding box from the panorama captured by the Zhurong rover in a circular sweep on the landing platform on the fifth Martian day. The panorama is stitched and blended from 24 images taken by the navigation terrain camera at two pitch angles over a 360-degree sweep; areas the panorama does not cover are filled with black pixels, producing the third format file in .png image format. In Unity3D, the orientation of the sky bounding box is set according to the heading of the navigation terrain camera when the pictures were taken.
According to an embodiment of the present disclosure, Unity3D may be used as the three-dimensional rendering engine. Unity3D is a cross-platform 2D and 3D video game engine and a comprehensive creation tool integrating scene management, animation editing, a rendering engine, a physics engine, sound effects, artificial intelligence, virtual reality and other functions; it supports cross-platform development for Windows, Linux, iOS, Android, macOS and other platforms, and supports C# and JavaScript scripts. After the generated first, second and third format files are imported with the Unity3D scene editor, the three-dimensional scene is edited visually and developed together with scripts, which greatly reduces the development difficulty of the visual simulation program and accelerates its development, as sketched below.
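A sketch of the runtime side of this assembly, assuming the imported rover (.fbx), terrain (.obj) and skybox material have been placed under a Resources folder; the asset names and the use of Resources.Load are illustrative assumptions, not the patent's prescription:

```csharp
using UnityEngine;

public class SceneAssembler : MonoBehaviour
{
    void Start()
    {
        // First format file: rover model imported from .fbx.
        Instantiate(Resources.Load<GameObject>("ZhurongRover"));
        // Second format file: detection point terrain mesh imported from .obj.
        Instantiate(Resources.Load<GameObject>("SiteTerrainMesh"));

        // Third format file: panorama applied as the skybox material,
        // oriented to match the navcam heading at capture time.
        RenderSettings.skybox = Resources.Load<Material>("SkyboxPanorama");

        // Parallel light whose rotation is driven per frame from the
        // computed solar altitude and azimuth.
        var sun = new GameObject("Sun").AddComponent<Light>();
        sun.type = LightType.Directional;
    }
}
```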
On the one hand, based on real ephemeris data, the method obtains in real time, through the time drive, the local solar altitude and azimuth at the Martian surface detection point and simulates real-time illumination in the virtual detection scene, which greatly enhances the scientific validity and realism of the virtual detection scene, lets Mars scientists grasp the environment around the Zhurong rover more truly and comprehensively, and assists the selection of more reasonable and scientific target detection points. On the other hand, the method realizes visual simulation of rover detection planning on the Martian surface based on the constructed simulation time system, provides visual simulation preview and playback of the scientific detection process, and assists ground staff in formulating, optimizing and verifying the scientific payload detection planning scheme.
According to an embodiment of the present disclosure, the scientific payload of the Mars rover comprises: at least one navigation terrain camera and a multispectral camera, which are located at the top of the mast of the Mars rover; the method further comprises the following steps:
simulating the imaging ranges of the two navigation terrain cameras and the multispectral camera by using a semitransparent pyramid-like light beam model, based on the installation positions and camera parameters of the at least one navigation terrain camera and the multispectral camera; and
establishing the at least one navigation terrain camera and the multispectral camera in the virtual detection scene by utilizing a three-dimensional rendering engine so as to simulate their imaging ranges.
According to the embodiments of the present disclosure, the navigation terrain cameras can be arranged on the left and right of the Mars rover, and the multispectral camera is arranged at the top of the rover's mast; as the mast pitches and rotates, the navigation terrain cameras and the multispectral camera can acquire Martian surface images at different angles.
According to the embodiments of the present disclosure, based on the installation positions and camera parameters of the at least one navigation terrain camera and the multispectral camera, the imaging ranges of the two navigation terrain cameras and the multispectral camera are simulated with a semitransparent pyramid-like light beam model, and pictures within those imaging ranges can be acquired, so that the rover's detection work on Mars can be simulated more truly. The navigation terrain camera and the multispectral camera may be generated in the virtual detection scene using a render-to-texture technique. A sketch of the beam model follows.
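A sketch of building such a pyramid-like beam mesh from a camera's field of view; the four-sided pyramid with an open base is an assumption about the geometry, and the mesh would be rendered with a semitransparent material so the terrain inside the beam stays visible:

```csharp
using UnityEngine;

public static class FrustumBeam
{
    // Build a four-sided pyramid whose apex sits at the camera's optical
    // center and whose base lies `length` units along the camera's +Z axis.
    public static Mesh Build(float hFovDeg, float vFovDeg, float length)
    {
        float halfW = length * Mathf.Tan(0.5f * hFovDeg * Mathf.Deg2Rad);
        float halfH = length * Mathf.Tan(0.5f * vFovDeg * Mathf.Deg2Rad);

        Vector3[] verts =
        {
            Vector3.zero,                         // apex
            new Vector3(-halfW, -halfH, length),  // base corners
            new Vector3( halfW, -halfH, length),
            new Vector3( halfW,  halfH, length),
            new Vector3(-halfW,  halfH, length),
        };
        // Four side faces; the base is left open.
        int[] tris = { 0, 2, 1,  0, 3, 2,  0, 4, 3,  0, 1, 4 };

        var mesh = new Mesh { vertices = verts, triangles = tris };
        mesh.RecalculateNormals();
        return mesh;
    }
}
```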
According to the embodiments of the present disclosure, the scientific payload of the Mars rover further includes: a Mars surface composition detector, a Mars subsurface detection radar, a Mars surface magnetic field detector and a Mars meteorological measuring instrument.
According to the embodiments of the present disclosure, the Mars surface composition detector works as follows: a two-dimensional pointing mirror rotates to aim at the target, a laser-induced breakdown spectrometer (LIBS) emits laser that is reflected by the pointing mirror onto the target ground object, the laser ablates the target surface into a plasma that emits light, and the spectrometer collects the plasma signal to obtain the LIBS spectrum. In the simulation, the laser can be represented by a bright red line reflected by the horizontally and vertically rotating two-dimensional mirror toward the target ground object.
According to the embodiments of the present disclosure, scientific detection with the Mars subsurface detection radar, the Mars surface magnetic field detector and the Mars meteorological measuring instrument requires no motion of the rover-mounted payloads, and the visual effect can be realized with a fixed-aperture detection effect.
According to the embodiments of the present disclosure, a time-varying virtual detection scene of the Zhurong rover is constructed based on the Zhurong rover's navigation terrain camera data, DE421 ephemeris data, photographs of the Zhurong rover, the installation parameters of the onboard scientific payloads and the related drawings; a time-driven detection event system is also constructed, and the detection effect of each scientific payload's detection events is simulated in the visual view.
According to the embodiments of the present disclosure, the viewpoint control and the visual display of auxiliary information (including the detection time, a compass and the like) mentioned in the above method can be realized by the human-computer interaction and auxiliary information display module.
Fig. 2 schematically illustrates a block diagram of a Mars surface detection planning visual simulation system, in accordance with an embodiment of the present disclosure.
As shown in fig. 2, the system 200 includes:
the train model making module 201 is used for making a 'melting' train and a scientific load digital three-dimensional model carried by the train.
And the detection point terrain model module 202 is used for reconstructing a detection site three-dimensional terrain model based on the data of the stereo image acquired by the 'melting and congratulating' train navigation terrain camera.
A detection point sky box making module 203, configured to make a sky box model surrounding a virtual detection site based on an image obtained by a mars car navigation terrain camera.
The three-dimensional rendering engine module 204 is used for organizing and drawing the three-dimensional scene in real time.
The target detection point auxiliary selection module 205 is used to assist in selecting target detection points, with functions of three-dimensional measurement, target detection point selection, and labeling.
The time and detection event simulation management module 206 is used to accelerate, decelerate, pause and replay simulation time, and can edit and record the scientific detection events of the various scientific payloads based on the simulation time system.
The detection point real-time illumination calculation module 207 is used to calculate in real time, according to time, the solar altitude angle and azimuth angle at the detection point on the Martian surface.
The scientific detection effect visual simulation module 208 is used to present each scientific payload's detection effect through visual simulation according to its detection mode.
The human-computer interaction and auxiliary information display module 209 is used for viewpoint control and interface interaction, and can also display auxiliary information such as the time and a north arrow.
According to the embodiment of the present disclosure, reference may be made to the description of the method for simulating the Mars surface detection planning view in the embodiment of the present disclosure for the functions of the modules in the system 200, which is not described herein again.
On the basis of this system, the embodiments of the present disclosure provide another, time-driven Mars surface scientific detection simulation method.
Fig. 3 schematically illustrates a time-driven Mars surface scientific detection simulation method according to an embodiment of the present disclosure.
As shown in fig. 3, the method includes operations S301 to S307.
In operation S301, a virtual detection scene is established, in which the rover model, the detection point terrain model, the sky environment, illumination and the like are displayed.
In operation S302, the position of the rover model is set: the three-dimensional rover model is moved and its angle rotated to match the rover's actual position and orientation on the Martian surface.
In operation S303, the detection time is set according to the actual detection window.
In operation S304, a target detection point is determined. For example, the environment of the virtual detection scene, such as the terrain, illumination and surface features, can be considered together to select appropriate target detection points and label them.
In operation S305, a detection event is determined according to the overall detection planning target, the labeled target detection points, the detection time of the detection window, and the scientific detection constraints of each scientific payload.
In operation S306, the scientific detection is deduced by simulation: the set scientific detection process is played according to the detection time sequence. While observing the detection process, the viewing angle can be adjusted and the time fast-forwarded, slowed down or paused, so as to analyze, verify and optimize the reasonableness and scientific validity of the detection. In the case where the target detection point does not satisfy the preset condition, operation S303 is performed; in the case where the target detection point satisfies the preset condition, operation S307 is performed.
In operation S307, the visual simulation result of the Mars rover at the target detection point is determined according to the display content related to the detection event in the virtual detection scene.
According to the embodiment of the present disclosure, the operations in fig. 3 may refer to descriptions of other embodiments of the present disclosure, and are not described herein again.
Fig. 4 schematically shows a block diagram of a Mars surface detection planning visual simulation apparatus according to an embodiment of the present disclosure.
As shown in fig. 4, the Mars surface detection planning visual simulation apparatus 400 includes a first determining module 410, a second determining module 420, a generating module 430, and a third determining module 440.
The first determining module 410 is configured to determine a target detection point of the Mars rover according to a virtual detection scene, where the virtual detection scene is established according to Mars terrain data, Mars rover model data, and ephemeris data, and illumination related to the sun position is set in the virtual detection scene.
The second determining module 420 is configured to determine a detection event for presentation in the virtual detection scene based on parameter information related to the detection event, where the parameter information includes time information of the detection event.
The generating module 430 is configured to generate real-time illumination in the virtual detection scene matching the detection event based on the time information of the detection event.
The third determining module 440 is configured to determine a visual simulation result of the Mars rover at the target detection point according to the display content related to the detection event in the virtual detection scene, under the condition that real-time illumination matched with the detection event is simulated in the virtual detection scene.
According to an embodiment of the present disclosure, the generating module 430 for generating real-time illumination in the virtual detection scene matching the detection event based on the time information of the detection event includes:
the first generation unit is used for determining the detection time of the detection event according to the time information of the detection event;
the second generation unit is used for determining the sun position of the target detection point in the virtual detection scene according to the detection time;
a third generating unit for generating real-time illumination matching the detection event in the virtual detection scene based on the sun position.
According to an embodiment of the present disclosure, wherein the second generating unit for determining the sun position of the target detection point in the virtual detection scene according to the detection time comprises:
the first generation subunit is used for acquiring the detection time of the previous frame in the virtual detection scene;
the second generation subunit is used for determining the detection time of the current frame based on the detection time of the previous frame according to the preset time double speed and the frame interval;
and the third generation subunit is used for determining the sun position of the target detection point in the virtual detection scene based on the detection time of the current frame.
According to an embodiment of the present disclosure, the apparatus further includes:
the recording module is used for recording simulation information of the detection event in the visual simulation result in a text file, wherein the simulation information comprises parameter information of the detection event;
the fourth determining module is used for determining modification information under the condition of modifying the time information of the detection event; and
and the display module is used for displaying the detection event after the time information is modified in the virtual detection scene based on the modification information and the text file.
According to an embodiment of the present disclosure, the apparatus further includes:
the first format file generation module is used for generating a first format file related to a train model of a train by utilizing train parameter information of the train;
the second format file generation module is used for generating a second format file related to a detection point terrain model of the target detection point by utilizing a stereo image acquired by a navigation terrain camera of the mars vehicle;
the third format file generation module is used for generating a third format file related to the sky environment in the virtual detection scene by utilizing a panoramic image of the mars vehicle annularly shot on the landing platform;
and the establishing module is used for establishing a virtual detection scene by utilizing a three-dimensional rendering engine based on the first format file, the second format file and the third format file.
According to an embodiment of the present disclosure, the scientific payload of the Mars rover comprises: at least one navigation terrain camera and a multispectral camera, which are located at the top of the mast of the Mars rover; the above apparatus further includes:
a first simulation module, used for simulating the imaging ranges of the two navigation terrain cameras and the multispectral camera by using a semitransparent pyramid-like light beam model, based on the installation positions and camera parameters of the at least one navigation terrain camera and the multispectral camera; and
a second simulation module, used for establishing the at least one navigation terrain camera and the multispectral camera in the virtual detection scene by utilizing the three-dimensional rendering engine so as to simulate their imaging ranges.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any number of the first determining module 410, the second determining module 420, the generating module 430 and the third determining module 440 may be combined and implemented in one module/unit/sub-unit, or any one of the modules/units/sub-units may be split into a plurality of modules/units/sub-units. Alternatively, at least part of the functionality of one or more of these modules/units/sub-units may be combined with at least part of the functionality of other modules/units/sub-units and implemented in one module/unit/sub-unit. According to an embodiment of the present disclosure, at least one of the first determining module 410, the second determining module 420, the generating module 430, and the third determining module 440 may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware by any other reasonable manner of integrating or packaging a circuit, or in any one of three implementations of software, hardware, and firmware, or in any suitable combination of any of them. Alternatively, at least one of the first determining module 410, the second determining module 420, the generating module 430 and the third determining module 440 may be at least partially implemented as a computer program module, which when executed, may perform a corresponding function.
It should be noted that, in the embodiments of the present disclosure, the Mars surface detection planning visual simulation apparatus portion corresponds to the Mars surface detection planning visual simulation method portion; for details of the apparatus portion, reference may be made to the description of the method portion, which is not repeated here.
Fig. 5 schematically shows a block diagram of an electronic device adapted to implement the above described method according to an embodiment of the present disclosure. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, an electronic device 500 according to an embodiment of the present disclosure includes a processor 501 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. The processor 501 may comprise, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), among others. The processor 501 may also include on-board memory for caching purposes. Processor 501 may include a single processing unit or multiple processing units for performing different actions of a method flow according to embodiments of the disclosure.
In the RAM 503, various programs and data necessary for the operation of the electronic apparatus 500 are stored. The processor 501, the ROM502, and the RAM 503 are connected to each other by a bus 504. The processor 501 performs various operations of the method flows according to the embodiments of the present disclosure by executing programs in the ROM502 and/or the RAM 503. Note that the programs may also be stored in one or more memories other than the ROM502 and the RAM 503. The processor 501 may also perform various operations of method flows according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the present disclosure, the electronic device 500 may also include an input/output (I/O) interface 505, which is also connected to the bus 504. The electronic device 500 may also include one or more of the following components connected to the I/O interface 505: an input section 506 including a keyboard, a mouse and the like; an output section 507 including a display such as a cathode ray tube (CRT) or a liquid crystal display (LCD), and a speaker; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card or a modem. The communication section 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as needed. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the drive 510 as needed, so that a computer program read therefrom can be installed into the storage section 508 as needed.
According to embodiments of the present disclosure, method flows according to embodiments of the present disclosure may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer-readable storage medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. The computer program, when executed by the processor 501, performs the above-described functions defined in the system of the embodiments of the present disclosure. The systems, devices, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
The present disclosure also provides a computer-readable storage medium, which may be contained in the device/apparatus/system described in the above embodiments, or may exist separately without being assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the Mars surface detection planning visual simulation method according to an embodiment of the present disclosure.
According to an embodiment of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium. Examples may include, but are not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
For example, according to embodiments of the present disclosure, the computer-readable storage medium may include the ROM 502 and/or the RAM 503 and/or one or more memories other than the ROM 502 and the RAM 503 described above.
The flowchart and block diagrams in the figures illustrate the architecture, functionality and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending on the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.

Those skilled in the art will appreciate that various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or sub-combinations are not expressly recited in the present disclosure. In particular, various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or sub-combinations fall within the scope of the present disclosure.
The embodiments of the present disclosure have been described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described separately above, this does not mean that the measures in the embodiments cannot be used advantageously in combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the disclosure, and these alternatives and modifications are intended to fall within the scope of the disclosure.

Claims (10)

1. A Mars surface detection planning visual simulation method, comprising the following steps:
determining a target detection point of a Mars rover according to a virtual detection scene, wherein the virtual detection scene is established according to Mars terrain data, Mars rover model data and ephemeris data, and illumination related to the position of the sun is set in the virtual detection scene;
determining a detection event for presentation in the virtual detection scene based on parameter information related to the detection event, wherein the parameter information includes time information of the detection event;
generating real-time illumination in the virtual detection scene matching the detection event based on the time information of the detection event;
and determining, in a case where real-time illumination matching the detection event is simulated in the virtual detection scene, a visual simulation result of the Mars rover at the target detection point according to display content related to the detection event in the virtual detection scene.
2. The method of claim 1, wherein the generating real-time illumination in the virtual detection scene matching the detection event based on the time information of the detection event comprises:
determining the detection time of the detection event according to the time information of the detection event;
determining the sun position of the target detection point in the virtual detection scene according to the detection time;
generating real-time illumination in the virtual detection scene that matches the detection event based on the sun position.
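By way of illustration only, the step in claim 2 of turning a sun position into scene lighting can be sketched as follows. The azimuth/elevation inputs, the east-north-up axis convention and the function name are assumptions made for this sketch; in the disclosed method both angles would come from the ephemeris data for the detection time.

```python
import math

def sun_light_direction(azimuth_deg: float, elevation_deg: float):
    """Direction along which a directional (sun) light shines in the scene.

    Assumes azimuth measured clockwise from north and an east-north-up
    frame at the target detection point; both are illustrative choices.
    """
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    # Unit vector from the detection point toward the sun ...
    to_sun = (math.sin(az) * math.cos(el),  # east
              math.cos(az) * math.cos(el),  # north
              math.sin(el))                 # up
    # ... negated, because the light shines from the sun toward the scene.
    return tuple(-c for c in to_sun)
```

A rendering engine would reassign this vector to its directional light each frame, so that shadows at the target detection point track the simulated detection time.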
3. The method of claim 2, wherein said determining the sun location of the target detection point in the virtual detection scene according to the detection time comprises:
acquiring the detection time of a previous frame in the virtual detection scene;
determining the detection time of the current frame based on the detection time of the previous frame according to a preset time speed multiplier and a frame interval;
determining the sun position of the target detection point in the virtual detection scene based on the detection time of the current frame.
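The frame-to-frame time update in claim 3 is plain arithmetic. A minimal sketch, assuming detection time is kept in seconds and the frame interval is fixed (neither unit is prescribed by the claim):

```python
def current_frame_time(prev_time_s: float,
                       speed_multiplier: float,
                       frame_interval_s: float) -> float:
    """Detection time of the current frame: the previous frame's detection
    time advanced by the preset time speed multiplier times the frame
    interval."""
    return prev_time_s + speed_multiplier * frame_interval_s

# Example: at 60 frames per second with a 600x speed multiplier, each
# rendered frame advances the simulated detection time by 10 seconds.
t = 0.0
for _ in range(3):
    t = current_frame_time(t, speed_multiplier=600.0, frame_interval_s=1 / 60)
```

The sun position for the current frame is then determined for time `t`, which is what makes the illumination in the scene advance in step with the rendered frames.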
4. The method of claim 1, further comprising:
recording simulation information of the detection event in the visual simulation result in a text file, wherein the simulation information comprises the parameter information of the detection event;
determining modification information in the case of modifying the time information of the detection event; and
and displaying, in the virtual detection scene, the detection event with the modified time information based on the modification information and the text file.
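A minimal sketch of the record-and-replay mechanism in claim 4, assuming a JSON-lines layout for the text file; the field names `id` and `time` are illustrative, not taken from the disclosure:

```python
import json

def record_event(path: str, event: dict) -> None:
    # Append one detection event's simulation information as a JSON line.
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")

def reload_with_modified_time(path: str, event_id: str, new_time: str) -> list:
    """Re-read the recorded events and apply the modification information
    to the matching event's time field before it is displayed again."""
    with open(path, encoding="utf-8") as f:
        events = [json.loads(line) for line in f if line.strip()]
    for event in events:
        if event["id"] == event_id:
            event["time"] = new_time
    return events  # handed back to the virtual detection scene
```

Keeping the record in a plain text file is what makes the modification step cheap: the file can be edited or re-parsed without regenerating the whole simulation.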
5. The method of claim 1, further comprising:
generating a first format file related to a rover model of the Mars rover by using rover parameter information of the Mars rover;
generating a second format file related to a detection-point terrain model of the target detection point by using stereo images acquired by a navigation terrain camera of the Mars rover;
generating a third format file related to the sky environment in the virtual detection scene by using a panoramic image captured by the Mars rover in a circular sweep on the landing platform;
and establishing the virtual detection scene by using a three-dimensional rendering engine based on the first format file, the second format file and the third format file.
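The three input files of claim 5 amount to a scene description that a three-dimensional rendering engine can assemble. A sketch using a plain data structure; the concrete file formats (glTF, OBJ, HDR) and file names are illustrative assumptions, since the claim does not name formats:

```python
from dataclasses import dataclass

@dataclass
class VirtualDetectionSceneSpec:
    rover_model_path: str    # first format file: rover model from rover parameter information
    terrain_mesh_path: str   # second format file: terrain reconstructed from stereo images
    sky_panorama_path: str   # third format file: sky environment from the platform panorama

spec = VirtualDetectionSceneSpec(
    rover_model_path="rover.gltf",
    terrain_mesh_path="site_terrain.obj",
    sky_panorama_path="sky_pano.hdr",
)
# A rendering engine would load the three assets and compose them into the
# virtual detection scene: the rover on the terrain under the panoramic sky.
```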
6. The method of claim 4, wherein the scientific payload of the Mars rover comprises at least one of the navigation terrain camera and the multispectral camera, the at least one of the navigation terrain camera and the multispectral camera being located on top of a mast of the Mars rover; the method further comprises the following steps:
simulating an imaging range of the at least one of the navigation terrain camera and the multispectral camera with a semi-transparent pyramid-shaped beam model based on a mounting location and camera parameters of the at least one of the navigation terrain camera and the multispectral camera; and
establishing the at least one of the navigation terrain camera and the multispectral camera in the virtual detection scene with the three-dimensional rendering engine, so as to simulate the imaging range of the at least one of the navigation terrain camera and the multispectral camera.
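The semi-transparent pyramid beam model of claim 6 is essentially a view frustum collapsed to an apex at the camera. A sketch of its geometry in the camera's own frame; the field-of-view angles and visualization depth are illustrative inputs, and the caller would transform the vertices by the camera's mounting pose on the mast:

```python
import math

def imaging_pyramid(h_fov_deg: float, v_fov_deg: float, depth_m: float):
    """Five vertices of a pyramid approximating a camera's imaging range:
    apex at the camera mounting point (origin), rectangular base at
    depth_m along the optical axis (+z)."""
    half_w = depth_m * math.tan(math.radians(h_fov_deg) / 2)
    half_h = depth_m * math.tan(math.radians(v_fov_deg) / 2)
    apex = (0.0, 0.0, 0.0)
    base = [(-half_w, -half_h, depth_m), (half_w, -half_h, depth_m),
            (half_w, half_h, depth_m), (-half_w, half_h, depth_m)]
    return [apex] + base
```

Rendered with partial transparency, such a pyramid lets an operator see at a glance which part of the terrain a camera will image from the current mast pose.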
7. The method of claim 6, wherein the scientific payload of the Mars rover further comprises: a Mars surface composition detector, a Mars subsurface detection radar, a Mars surface magnetic field detector and a Mars meteorological measuring instrument.
8. A Mars surface detection planning visual simulation apparatus, comprising:
the system comprises a first determination module, a second determination module and a third determination module, wherein the first determination module is used for determining a target detection point of a mars vehicle according to a virtual detection scene, the virtual detection scene is established according to mars terrain data, mars vehicle model data and ephemeris data, and illumination related to the position of the sun is set in the virtual detection scene;
a second determining module, configured to determine a detection event for presentation in the virtual detection scene based on parameter information related to the detection event, wherein the parameter information includes time information of the detection event;
a generating module configured to generate real-time illumination in the virtual detection scene matching the detection event based on the time information of the detection event; and
a third determining module configured to determine, in a case where real-time illumination matching the detection event is simulated in the virtual detection scene, a visual simulation result of the Mars rover at the target detection point according to display content related to the detection event in the virtual detection scene.
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1 to 7.
10. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to carry out the method of any one of claims 1 to 7.
CN202211385480.9A 2022-11-07 2022-11-07 Mars surface detection planning visual simulation method, device, equipment and medium Pending CN115880463A (en)

Priority Applications (1)

Application Number: CN202211385480.9A
Publication: CN115880463A (en)
Priority Date: 2022-11-07
Filing Date: 2022-11-07
Title: Mars surface detection planning visual simulation method, device, equipment and medium


Publications (1)

Publication Number: CN115880463A (en)
Publication Date: 2023-03-31

Family

ID=85759479

Family Applications (1)

Application Number: CN202211385480.9A
Status: Pending
Publication: CN115880463A (en)
Title: Mars surface detection planning visual simulation method, device, equipment and medium

Country Status (1)

Country: CN (1); Link: CN115880463A (en)


Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination