CN115909858A - Flight simulation experience system based on VR image - Google Patents
- Publication number
- CN115909858A (application number CN202310213585.4A)
- Authority
- CN
- China
- Prior art keywords
- terrain
- user
- resolution
- level
- subarea
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Processing Or Creating Images (AREA)
Abstract
The invention relates to the technical field of data processing, in particular to a flight simulation experience system based on VR images. The system obtains all terrain sub-areas; obtains the visual range area and projection plane of each user according to the user's viewpoint and main viewing direction; obtains the resolution characteristic value and texture detail characteristic value of each terrain sub-area in the visual range area; obtains all the terrain sub-areas to be processed by each user and each user's terrain elevation data matrix; calculates the relative scheduling resources and relative computing resources of each user according to the terrain elevation data matrix; allocates computer hardware resources according to each user's scheduling resource proportion and computing resource proportion; and schedules each user's generation resources to generate the user's VR display picture, thereby allocating computer hardware resources more reasonably in a multi-user scene and ensuring the picture quality of each user's VR display device in the multi-user scene.
Description
Technical Field
The invention relates to the technical field of data processing, in particular to a flight simulation experience system based on VR images.
Background
The flight simulation experience system is simulation equipment for flight training and science popularization, the most important purpose of the flight simulation experience system is to train pilots, and the flight simulation experience system is widely applied to the popularization background of civil aviation in recent years due to the advantages of safety, reliability, economy and no limitation of meteorological conditions. With the continuous development of the VR technology, the immersion and the interactivity of the flight simulation system are improved by combining the VR technology with the flight simulation experience system, so that the flight scene is more real, and the performance of computer-generated graphics is required to be improved.
The flight simulation experience system needs to rapidly process huge amounts of terrain data and texture mapping data to keep flight simulation pictures continuous and avoid picture tearing. However, as the picture quality requirements of simultaneous users of flight simulation systems gradually rise, computer hardware costs rise as well; how to better coordinate computer hardware resources in a multi-user scene so as to guarantee the picture quality at each user's VR display end has become an urgent industrial difficulty to be solved. In the prior art, to save computer hardware cost, the terrain data and texture data required by the flight simulation picture are generally adjusted in real time according to changes of the flight viewpoint. In a multi-user scene, different users have different main viewing directions; the prior art saves computer hardware resources by performing different levels of storage scheduling on terrain data with a terrain LOD model algorithm, but this method cannot comprehensively schedule the terrain data for multiple user targets and cannot guarantee the picture quality of each user's VR display device in a multi-user scene. Therefore, a method is needed that can comprehensively schedule terrain data across multiple main viewing directions and thereby guarantee the picture quality of multi-user VR display devices.
Disclosure of Invention
The invention provides a flight simulation experience system based on VR images, which aims to solve the existing problems.
The invention relates to a flight simulation experience system based on VR images, which adopts the following technical scheme:
one embodiment of the present invention provides a flight simulation experience system based on VR images, the system comprising:
the terrain area blocking module is used for blocking the terrain area to obtain all terrain sub-areas;
the region resource dividing module is used for dividing the texture data packets, obtaining texture data sub-packets of each terrain subregion, setting different resolution levels and obtaining terrain elevation data blocks of all the resolution levels of each terrain subregion;
the regional characteristic acquisition module is used for acquiring a visual range region and a projection plane of a user according to a viewpoint and a main view direction of the user and acquiring a resolution characteristic value and a texture detail characteristic value of each terrain subregion in the visual range region;
the regional characteristic grading module is used for grading the resolution characteristic value and the texture detail characteristic value to obtain all resolution characteristic value grades and texture detail characteristic value grades;
the user resource characteristic acquisition module is used for acquiring all the terrain sub-areas to be processed by each user and each user's terrain elevation data matrix, and calculating the relative scheduling resources and relative computing resources of each user according to the terrain elevation data matrix;
the user resource acquisition module is used for taking the target grade terrain elevation data blocks and the target texture data sub-packages corresponding to all the terrain sub-areas to be processed of the user as the generation resources of the user;
and the hardware resource allocation module is used for calculating the scheduling resource proportion and the calculating resource proportion of the user, allocating computer hardware resources according to the scheduling resource proportion and the calculating resource proportion of the user, scheduling the generated resources of the user and generating a VR display picture of each user.
Further, the setting of different resolution levels includes the following specific steps:
carrying out multi-resolution level division from the lowest resolution level to the highest resolution level, dividing into a second preset number m of resolution levels, wherein resolution level m is the highest resolution level and resolution level 1 is the lowest resolution level; denoting by N_min the number of data points in the terrain elevation data block of the lowest resolution level and by N_max the number of data points in the terrain elevation data block of the highest resolution level, the number of data points in a terrain elevation data block of resolution level j lies in the range [N_min + (j−1)(N_max−N_min)/m, N_min + j(N_max−N_min)/m], j = 1, 2, …, m.
Further, the acquiring a visual range area and a projection plane of the user according to the viewpoint and the main viewing direction of the user includes the following specific steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints of the users are the same viewpoint, and the main viewing directions of the users are multiple;
acquiring a visual range area on a terrain area according to a viewpoint by combining with the visual angle limit of flight simulation equipment; and combining VR equipment information, and acquiring a projection plane of each user according to the main view direction of each user.
Further, the method for obtaining the resolution characteristic value and the texture detail characteristic value comprises the following steps:
acquiring the distance between the center point of each terrain subarea and a viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; acquiring the distance between the central point of each terrain subarea and the central point of the projection plane of each user, and recording as the main sight distance between each terrain subarea and each user;
and dividing the viewpoint distance of the terrain sub-area by a first value to obtain the resolution characteristic value of the terrain sub-area, and dividing the dominant viewing distance of the terrain sub-area and the user by a second value to obtain the texture detail characteristic value of the terrain sub-area and the user, wherein the second value is the maximum distance between a terrain sub-area and the center point of the projection plane of the user.
Further, the obtaining of all the resolution characteristic value levels and texture detail characteristic value levels includes the following specific steps:
the value range of the resolution characteristic value is [0,1], and all resolution characteristic value levels are obtained as follows: a resolution characteristic value in the range ((m−1)/m, 1] belongs to resolution characteristic value level 1, a resolution characteristic value in the range ((m−2)/m, (m−1)/m] belongs to resolution characteristic value level 2, and similarly, a resolution characteristic value in the range [0, 1/m] belongs to resolution characteristic value level m, where m represents the second preset number;
the value range of the texture detail characteristic value is [0,1], and all texture detail characteristic value levels are obtained as follows: a texture detail characteristic value in the range ((n−1)/n, 1] belongs to texture detail characteristic value level 1, a texture detail characteristic value in the range ((n−2)/n, (n−1)/n] belongs to texture detail characteristic value level 2, and similarly, a texture detail characteristic value in the range [0, 1/n] belongs to texture detail characteristic value level n, where n represents the third preset number.
Further, the acquiring of all the terrain sub-areas to be processed by the user and the user's terrain elevation data matrix includes the following specific steps:
for any terrain sub-area, acquiring the texture detail characteristic value levels of the terrain sub-area with respect to the plurality of users, and marking the user corresponding to the lowest of these texture detail characteristic value levels as the processing user of the terrain sub-area; acquiring the processing user corresponding to each terrain sub-area in the visual range area, and allocating each terrain sub-area in the visual range area to its corresponding processing user as a processed terrain sub-area of that processing user;
for any user, counting, among all the processed terrain sub-areas of the user, the number of processed terrain sub-areas belonging to resolution characteristic value level a and texture detail characteristic value level b, and taking this number as the element in the a-th row and b-th column of the terrain elevation data matrix of the user.
Further, the method for calculating the relative scheduling resource comprises the following steps:
G_i = − Σ_{a=1}^{m} Σ_{b=1}^{n} (X_i(a,b)/M_i) · ln(X_i(a,b)/M_i)
In the formula, G_i represents the relative scheduling resource of the i-th user, M_i represents the number of processed terrain sub-areas belonging to the i-th user, X_i(a,b) represents the element in the a-th row and b-th column of the terrain elevation data matrix of the i-th user, ln represents the natural logarithm based on the natural constant e, m represents the second preset number, and n represents the third preset number.
Further, the calculation method of the relative calculation resources is as follows:
In the formula, C_i represents the relative computing resource of the i-th user, M_i represents the number of processed terrain sub-areas belonging to the i-th user, X_i(a,b) represents the element in the a-th row and b-th column of the terrain elevation data matrix of the i-th user, ln represents the natural logarithm based on the natural constant e, m represents the second preset number, and n represents the third preset number.
Further, the method for acquiring the sub-packets of the target-level terrain elevation data block and the target texture data comprises the following steps:
according to the resolution characteristic value level of the terrain subarea processed by the userThe resolution level which corresponds to the terrain sub-area to be processed is obtained->The terrain elevation data block is recorded as a target grade terrain elevation data block for processing the terrain subarea, and the grade of the texture detail characteristic value of the processed terrain subarea of the user is based on->Setting up process terrain subregion correspondencesIs greater than or equal to>A target texture data sub-packet, marked as processing a terrain sub-region, based on the current location of the terrain>Representing a third preset number.
Further, the calculating of the scheduling resource proportion and the calculating resource proportion of the user includes the following specific steps:
P_i = G_i / Σ_{k=1}^{U} G_k ,  Q_i = C_i / Σ_{k=1}^{U} C_k
In the formulas, P_i represents the scheduling resource proportion of the i-th user, G_i represents the relative scheduling resource of the i-th user, U represents the number of users simultaneously using the flight simulation equipment, and G_k represents the relative scheduling resource of the k-th user; Q_i represents the computing resource proportion of the i-th user, C_i represents the relative computing resource of the i-th user, and C_k represents the relative computing resource of the k-th user.
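Both proportions above are the same normalization of a per-user quantity by its sum over all users, which can be sketched as one shared helper; the function name is hypothetical:

```python
def resource_proportions(relative_resources):
    # Each user's proportion is their relative resource divided by the
    # sum over all users simultaneously using the equipment; the
    # proportions therefore sum to 1.
    total = sum(relative_resources)
    return [r / total for r in relative_resources]
```

The same helper serves for both the scheduling resource proportions (from the G_i values) and the computing resource proportions (from the C_i values).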
The technical scheme of the invention has the following beneficial effects: compared with a conventional single-user flight simulation experience system, the invention obtains the visual range area and projection plane of each user according to the user's viewpoint and main viewing direction, obtains the resolution characteristic value and texture detail characteristic value of each terrain sub-area in the visual range area, calculates the relative scheduling resources and relative computing resources of each user according to the terrain elevation data matrix, allocates computer hardware resources according to each user's scheduling resource proportion and computing resource proportion, and schedules each user's generation resources to generate the user's VR display picture, thereby allocating computer hardware resources more reasonably in a multi-user scene and ensuring the picture quality of each user's VR display device in the multi-user scene.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below; it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a flow chart illustrating the steps of a VR image based flight simulation experience system in accordance with the present invention;
fig. 2 is a schematic view of the visual range area and projection plane of a user of the present invention.
Detailed Description
To further illustrate the technical means and effects of the present invention for achieving the predetermined objects, the following detailed description of a flight simulation experience system based on VR images, its specific implementation, structure, features and effects is provided in conjunction with the accompanying drawings and the preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the flight simulation experience system based on VR images, which is provided by the present invention, with reference to the accompanying drawings.
Referring to fig. 1, a flowchart illustrating steps of a VR image based flight simulation experience system according to an embodiment of the present invention is shown, where the system includes: the system comprises a terrain region partitioning module, a region resource partitioning module, a region feature acquisition module, a region feature grading module, a user resource feature acquisition module, a user resource acquisition module and a hardware resource allocation module.
The terrain area blocking module is used for blocking the terrain area to obtain all terrain sub-areas.
The method specifically comprises the following steps:
Blocking the terrain area of the flight simulation scene to obtain all terrain sub-areas includes: the whole terrain area of the flight simulation scene is a square area, and the whole terrain area is uniformly divided into a first preset number of terrain sub-areas (square areas); owing to the uniform division, the areas of all terrain sub-areas are the same and are marked as S, the unit of the area of each terrain sub-area being square meters. The first preset number in this embodiment is 10000; in other embodiments, the implementer may set the first preset number as needed.
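As an illustrative sketch (not part of the claimed system), the uniform blocking step above can be expressed in Python; the function `block_terrain`, its parameters, and the 1000 m side length in the test are hypothetical:

```python
import math

def block_terrain(side_length, first_preset_number):
    # Uniformly partition a square terrain area of the given side length
    # into a first preset number of equal square sub-areas; the preset
    # number must be a perfect square for a regular grid.
    per_side = math.isqrt(first_preset_number)
    if per_side * per_side != first_preset_number:
        raise ValueError("first preset number must be a perfect square")
    sub_side = side_length / per_side
    # Each sub-area is identified by its (row, col) grid index; all share
    # the common area S returned alongside the index list.
    sub_areas = [(r, c) for r in range(per_side) for c in range(per_side)]
    return sub_areas, sub_side * sub_side
```

With 10000 sub-areas, a 1000 m by 1000 m terrain yields a 100 by 100 grid of sub-areas of 100 square meters each.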
And the area resource dividing module is used for dividing DEM terrain elevation data and texture data packets according to all terrain sub-areas.
The method specifically comprises the following steps:
setting different resolution levels, including: the DEM terrain elevation data of the terrain area are used for drawing the terrain of the terrain area, the DEM terrain elevation data corresponding to each terrain sub-area are recorded as terrain elevation data blocks, the number of data points in the terrain elevation data blocks of different resolution levels of the terrain sub-area is different, and the number of the data points in the terrain elevation data blocks of higher resolution levels is larger. The highest resolution level of the terrain elevation data block is determined by the source of the DEM terrain elevation data, while the lowest resolution level of the terrain elevation data block is set by the user.
In an embodiment of the present invention, the number of data points in the terrain elevation data block of the highest resolution level is S/S₁, where S denotes the area of a terrain sub-area and S₁ denotes the first unit area, the first unit area being 0.25 square meters; the number of data points in the terrain elevation data block of the lowest resolution level is S/S₂, where S₂ denotes the second unit area. In this embodiment, the second unit area is 100 square meters; in other embodiments, the implementer may set the second unit area as needed.
Multi-resolution level division is carried out from the lowest resolution level to the highest resolution level, dividing into a second preset number m of resolution levels, wherein resolution level m is the highest resolution level and resolution level 1 is the lowest resolution level. Denoting by N_min the number of data points in the terrain elevation data block of the lowest resolution level and by N_max the number of data points in the terrain elevation data block of the highest resolution level, the number of data points in a terrain elevation data block of resolution level j lies in the range [N_min + (j−1)(N_max−N_min)/m, N_min + j(N_max−N_min)/m], j = 1, 2, …, m.
The second preset number m in this embodiment is 50; in other embodiments, the practitioner may set the second preset number as needed.
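Under the assumption, made explicit here, that the interval between the lowest and highest data-point counts is split into m equal sub-ranges, the per-level data-point ranges can be sketched as follows; `level_point_range` is a hypothetical name:

```python
def level_point_range(j, n_min, n_max, m):
    # Data-point count range for resolution level j (1 = lowest,
    # m = highest), assuming [n_min, n_max] is split into m equal
    # sub-ranges; returns the (lower, upper) bounds of level j.
    width = (n_max - n_min) / m
    return (n_min + (j - 1) * width, n_min + j * width)
```

For example, with counts from 100 to 40100 split into 50 levels, level 1 covers 100 to 900 data points and level 50 ends at 40100.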
Storing terrain elevation data blocks of different resolution levels of a terrain sub-region according to a terrain LOD model algorithm, wherein the storing process comprises the following steps: for any terrain sub-area, acquiring terrain elevation data blocks of all resolution levels of the terrain sub-area, and storing the terrain elevation data blocks of each resolution level of the terrain sub-area in corresponding units; the terrain elevation data blocks of all resolution levels for all terrain sub-areas are stored in corresponding cells.
The method comprises the steps that a terrain area of a flight simulation scene corresponds to a texture data packet, and the texture data packet is used for rendering the terrain area and drawing specific texture details after the terrain of the terrain area is drawn; according to the partitioning result of the terrain area, the texture data packet is divided into a plurality of texture data sub-packets, each terrain sub-area corresponds to one texture data sub-packet, the texture data sub-packets are used for rendering the terrain sub-areas, the higher the utilization rate of the texture data sub-packets is, the higher the rendering degree of the terrain sub-areas is, the more the texture details of the terrain sub-areas are, and the better the visual effect is.
The area characteristic acquisition module is used for acquiring a resolution characteristic value and a texture detail characteristic value of the terrain subarea.
The method specifically comprises the following steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints and the main viewing directions of the users are obtained through a positioning device arranged in the flight simulation equipment, and because the viewpoints of different users on the same flight simulation equipment are approximately the same and only the main viewing directions are different, the viewpoints of the users are considered to be the same viewpoint and the main viewing directions of the users are multiple.
In combination with the visual angle limitation of the flight simulation equipment, acquiring a visual range area on a terrain area according to a viewpoint, as shown in fig. 2, which is the prior art and will not be described in detail herein; in conjunction with the VR device information, a projection plane of each user is obtained according to the main viewing direction of each user, as shown in fig. 2.
Acquiring the distance between the center point of each terrain subarea and a viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; and acquiring the distance between the central point of each terrain sub-area and the central point of the projection plane of each user, and recording the distance as the main sight distance between each terrain sub-area and each user.
It should be noted that the user's requirements for the resolution level and texture detail of each terrain subarea in the visual range area are related to the viewpoint distance of each terrain subarea and the dominant viewing distance of each terrain subarea from the user: the larger the viewpoint distance of the terrain subarea is, the farther the terrain subarea is from the viewpoint is, the smaller the requirement of the user on the resolution of the terrain subarea is, that is, the larger the viewpoint distance is, the more low-resolution terrain elevation data blocks of the terrain subarea are required to be used to form the simulated terrain (in the main view direction) of the user; the smaller the viewpoint distance is, the closer the terrain subarea is to the viewpoint, the greater the requirement of the user on the resolution of the terrain elevation data blocks of the terrain subarea is, that is, the smaller the viewpoint distance is, the more high-resolution terrain elevation data blocks of the terrain subarea need to be used to form the user's simulated terrain (in the main view direction).
It should be further noted that the larger the dominant viewing distance between the terrain subregion and the user is, the farther the terrain subregion is from the projection plane of the user is, the smaller the requirement of the user on the texture details (determined by the human eye imaging characteristics) of the terrain subregion is, the lower the usage rate of the texture data sub-packet is, that is, the larger the dominant viewing distance is, the less texture details of the terrain subregion are required, and the lower the usage rate of the texture data sub-packet is; the smaller the dominant viewing distance is, the closer the terrain subregion is to the projection plane of the user, the greater the requirement of the user on texture details (determined by human eye imaging characteristics) of the terrain subregion is, the higher the utilization rate of the texture data sub-packet is, that is, the smaller the dominant viewing distance is, the more texture details of the terrain subregion are required, and the higher the utilization rate of the texture data sub-packet is.
In conclusion, representing the requirements of the user on the resolution level and the texture details of the terrain subarea by using the viewpoint distance of the terrain subarea and the dominant sight distance of the terrain subarea and the user respectively; dividing the viewpoint distance of the terrain subarea by a first numerical value to obtain a resolution characteristic value of the terrain subarea, wherein the first numerical value is a visual distance set by the flight simulation system and is determined by a set parameter of the flight simulation system; and dividing the main visual distance between the terrain subarea and the user by a second numerical value to obtain a texture detail characteristic value between the terrain subarea and the user, wherein the second numerical value is the maximum distance between the terrain subarea and the central point of the projection plane of the user. The resolution characteristic value and the texture detail characteristic value are both normalization results.
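The two normalizations above admit a short sketch; the function names and the use of Euclidean distance via `math.dist` are assumptions, not taken from the patent text:

```python
import math

def resolution_feature(sub_center, viewpoint, visual_distance):
    # Viewpoint distance of the sub-area divided by the first value
    # (the visual distance set by the flight simulation system).
    return math.dist(sub_center, viewpoint) / visual_distance

def texture_feature(sub_center, plane_center, max_center_distance):
    # Dominant viewing distance divided by the second value (the maximum
    # distance between a terrain sub-area and the projection-plane center).
    return math.dist(sub_center, plane_center) / max_center_distance
```

Both results lie in [0, 1] provided the divisors are the maxima of the respective distances, matching the text's statement that the characteristic values are normalization results.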
The region feature grading module is used for grading the resolution feature value and the texture detail feature value of the terrain subregion.
The method specifically comprises the following steps:
The larger the viewpoint distance of a terrain sub-area is, the larger the resolution characteristic value of the terrain sub-area is, and the lower the requirement of the user on the resolution of the terrain sub-area is; therefore, the larger the resolution characteristic value of the terrain sub-area, the smaller the resolution characteristic value level of the terrain sub-area, and the smaller the requirement of the user on the resolution of the terrain sub-area. The resolution characteristic value and the texture detail characteristic value are graded, the resolution characteristic value being divided into a second preset number m of resolution characteristic value levels. Since the resolution characteristic value is a normalization result, its value range is [0,1], and all resolution characteristic value levels are specifically: a resolution characteristic value in the range ((m−1)/m, 1] belongs to resolution characteristic value level 1, a resolution characteristic value in the range ((m−2)/m, (m−1)/m] belongs to resolution characteristic value level 2, and similarly a resolution characteristic value in the range [0, 1/m] belongs to resolution characteristic value level m, where m represents the second preset number; resolution characteristic value level m is the highest resolution characteristic value level, and resolution characteristic value level 1 is the lowest resolution characteristic value level.
The larger the dominant viewing distance between a terrain sub-area and the user is, the larger the texture detail characteristic value of the terrain sub-area and the user is, and the smaller the requirement of the user on the texture detail of the terrain sub-area is; therefore, the larger the texture detail characteristic value of the terrain sub-area and the user, the smaller the texture detail characteristic value level of the terrain sub-area, and the smaller the requirement of the user on the texture detail of the terrain sub-area. The texture detail characteristic values are graded into a third preset number n of texture detail characteristic value levels. Since the texture detail characteristic value is a normalization result, its value range is [0,1], and all texture detail characteristic value levels are specifically: a texture detail characteristic value in the range ((n−1)/n, 1] belongs to texture detail characteristic value level 1, a texture detail characteristic value in the range ((n−2)/n, (n−1)/n] belongs to texture detail characteristic value level 2, and similarly a texture detail characteristic value in the range [0, 1/n] belongs to texture detail characteristic value level n, where n represents the third preset number; texture detail characteristic value level n is the highest texture detail characteristic value level, and texture detail characteristic value level 1 is the lowest texture detail characteristic value level.
The third preset number n in this embodiment is 10; in other embodiments, the practitioner may set the third preset number as needed.
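Assuming the levels partition [0, 1] as described (larger characteristic values fall into smaller level numbers), the grading step can be sketched as follows; `feature_level` is a hypothetical name:

```python
import math

def feature_level(value, num_levels):
    # Map a normalized characteristic value in [0, 1] to its level:
    # level 1 covers ((num_levels-1)/num_levels, 1] and level num_levels
    # covers [0, 1/num_levels], so larger values get smaller level numbers.
    k = max(math.ceil(value * num_levels), 1)
    return num_levels - k + 1
```

The same helper grades both resolution characteristic values (with the second preset number of levels) and texture detail characteristic values (with the third preset number of levels).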
The user resource characteristic acquisition module is used for calculating the relative scheduling resources and relative computing resources of the user.
The method specifically comprises the following steps:
It should be noted that, for a user, each terrain sub-area in the user's visual range area has a resolution characteristic value level and a texture detail characteristic value level, but the terrain elevation data block corresponding to a terrain sub-area does not need to be scheduled by every user's computing resources to form each user's simulated terrain; that is, obtaining accurate scheduling and texture rendering of each terrain sub-area in the visual range area is the final goal. Based on this logic, for any terrain sub-area, the terrain sub-area has a texture detail characteristic value and a texture detail characteristic value level with respect to each user, and the terrain elevation data block corresponding to the terrain sub-area should be called by the user with the lowest texture detail characteristic value level for the terrain sub-area to form that user's simulated terrain; stated otherwise, the terrain elevation data block corresponding to the terrain sub-area is the more important for that user.
For any terrain subarea, obtain the texture detail characteristic value levels of the terrain subarea with respect to all users, and mark the user corresponding to the lowest of these levels as the processing user of the terrain subarea; each terrain subarea in the visual range area thus has a corresponding processing user, and each terrain subarea in the visual range area is allocated to its processing user as a processed terrain subarea of that processing user.
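The processing-user assignment can be sketched as follows; this is an illustrative reconstruction, and the list-of-lists layout and function name are assumptions:

```python
def assign_processing_users(levels):
    """levels[s][u] is the texture detail characteristic value level of
    terrain subarea s with respect to user u. For each subarea, return the
    index of the user with the lowest level, who becomes its processing
    user (ties go to the first such user)."""
    return [min(range(len(user_levels)), key=lambda u: user_levels[u])
            for user_levels in levels]
```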
For the ith user, count the number of the user's processed terrain subareas belonging to resolution characteristic value level a and texture detail characteristic value level b, and record this number, denoted N_i(a,b), as the element in row a and column b of the user's terrain elevation data matrix, where 1 ≤ a ≤ A and 1 ≤ b ≤ B. Since there are A resolution characteristic value levels and B texture detail characteristic value levels in total, the terrain elevation data matrix of the user has size A × B, where A represents the second preset number and B represents the third preset number.
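Building the terrain elevation data matrix is then a simple two-dimensional count; a sketch under the assumption that each processed terrain subarea is represented by its (resolution level, texture detail level) pair:

```python
import numpy as np

def terrain_elevation_matrix(subarea_levels, a_levels, b_levels):
    """subarea_levels: iterable of (a, b) pairs, one per processed terrain
    subarea of a user, with resolution characteristic value level a in
    1..a_levels and texture detail characteristic value level b in
    1..b_levels. Returns the a_levels x b_levels count matrix whose (a, b)
    element is the number of such subareas."""
    m = np.zeros((a_levels, b_levels), dtype=int)
    for a, b in subarea_levels:
        m[a - 1, b - 1] += 1  # rows and columns are 1-indexed in the text
    return m
```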
The calculation formula of the relative scheduling resource of the user is specifically as follows:

S_i = - Σ_{a=1}^{A} Σ_{b=1}^{B} ( N_i(a,b) / N_i ) · ln( N_i(a,b) / N_i )

where S_i represents the relative scheduling resource of the ith user, N_i represents the number of processed terrain subareas belonging to the ith user, N_i(a,b) represents the element in row a and column b of the terrain elevation data matrix of the ith user (namely the number of processed terrain subareas of the ith user belonging to resolution characteristic value level a and texture detail characteristic value level b; terms with N_i(a,b) = 0 are taken as 0), ln represents the natural logarithm with base e, A represents the second preset number, and B represents the third preset number. The relative scheduling resource of the ith user is represented by the entropy of the numbers of processed terrain subareas belonging to the different resolution and texture detail characteristic value levels: the larger S_i is, the more disordered the characteristic value levels of the ith user's processed terrain subareas are, and the more resources need to be scheduled when forming the simulated terrain of the ith user.
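The relative scheduling resource is described as an entropy over the cells of the terrain elevation data matrix; it can be computed directly, with zero cells contributing nothing under the usual 0·ln 0 = 0 convention. The function name is illustrative:

```python
import numpy as np

def relative_scheduling_resource(matrix):
    """Entropy of the distribution of a user's processed terrain subareas
    over (resolution level, texture detail level) cells:
    S_i = -sum p * ln p with p = N_i(a,b) / N_i."""
    n = matrix.sum()
    if n == 0:
        return 0.0
    p = matrix[matrix > 0] / n
    return float(-(p * np.log(p)).sum())
```

A user whose processed subareas spread evenly over many level cells gets a high entropy (more scheduling resources), while a user whose subareas all share one level cell gets zero.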
The calculation formula of the relative computing resource of the user is specifically as follows:

C_i = - Σ_{a=1}^{A} Σ_{b=1}^{B} b · ( N_i(a,b) / N_i ) · ln( N_i(a,b) / N_i )

where C_i represents the relative computing resource of the ith user, N_i represents the number of processed terrain subareas belonging to the ith user, N_i(a,b) represents the element in row a and column b of the terrain elevation data matrix of the ith user (namely the number of processed terrain subareas of the ith user belonging to resolution characteristic value level a and texture detail characteristic value level b), ln represents the natural logarithm with base e, A represents the second preset number, and B represents the third preset number. Here the texture detail characteristic value level b acts as a weight: the higher the texture detail characteristic value level b of a processed terrain subarea is, the higher the user's requirement on the texture detail of that subarea, the higher the utilization rate of its texture data sub-packet, and the greater the computing power required to process the corresponding terrain elevation data block; therefore the larger b is, the more computing resources are needed and the larger the relative computing resource C_i of the user is.
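The description ties computing demand to the texture detail characteristic value level b of each processed terrain subarea. One consistent sketch is a b-weighted entropy; the exact weighting in the original formula image is not recoverable from the text, so this weighting is an assumption:

```python
import numpy as np

def relative_computing_resource(matrix):
    """Texture-level-weighted entropy: each (a, b) cell contributes with
    weight b, so users whose processed subareas sit at high texture detail
    levels receive larger relative computing resources."""
    n = matrix.sum()
    if n == 0:
        return 0.0
    total = 0.0
    for a in range(matrix.shape[0]):
        for b in range(matrix.shape[1]):
            c = matrix[a, b]
            if c > 0:
                p = c / n
                total += -(b + 1) * p * np.log(p)  # column index b is 0-based, level is b + 1
    return float(total)
```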
The user resource acquisition module is used for acquiring the generated resources of the user.
The method specifically comprises the following steps:
For any user, the target-level terrain elevation data blocks and target texture data sub-packets corresponding to all the processed terrain subareas belonging to the user are the generated resources that the user needs to schedule. The target-level terrain elevation data block and target texture data sub-packet corresponding to a processed terrain subarea are obtained as follows: according to the resolution characteristic value level a of the user's processed terrain subarea, the terrain elevation data block of resolution level a corresponding to the processed terrain subarea is obtained and recorded as the target-level terrain elevation data block of the processed terrain subarea; according to the texture detail characteristic value level b of the user's processed terrain subarea, the utilization rate of the texture data sub-packet corresponding to the processed terrain subarea is set to b/B, and the sub-packet is recorded as the target texture data sub-packet of the processed terrain subarea, where B represents the third preset number.
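Selecting the generated resources for one processed terrain subarea can be sketched as follows; the storage layout `elevation_blocks[subarea_id][a]` and the b/B utilization rule are assumptions made for illustration:

```python
def target_resources(subarea_id, a_level, b_level, b_levels, elevation_blocks):
    """Pick the target-level terrain elevation data block and the texture
    data sub-packet utilization rate for one processed terrain subarea.
    elevation_blocks[subarea_id][a] is assumed to hold that subarea's
    terrain elevation data block at resolution level a."""
    block = elevation_blocks[subarea_id][a_level]  # resolution level equals the resolution characteristic value level
    utilization = b_level / b_levels               # assumed b/B utilization rule
    return block, utilization
```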
The hardware resource allocation module is used for calculating the scheduling resource proportion and the computing resource proportion of the user and allocating the computer hardware resources.
The method specifically comprises the following steps:
according to the relative scheduling resources and the relative computing resources of the users, the scheduling resource proportion and the computing resource proportion of the users are computed, and the method specifically comprises the following steps:
P_i = S_i / Σ_{k=1}^{U} S_k        Q_i = C_i / Σ_{k=1}^{U} C_k

where P_i represents the scheduling resource proportion of the ith user, S_i represents the relative scheduling resource of the ith user, U represents the number of users simultaneously using the flight simulation equipment, and S_k represents the relative scheduling resource of the kth user; Q_i represents the computing resource proportion of the ith user, C_i represents the relative computing resource of the ith user, and C_k represents the relative computing resource of the kth user.
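Each proportion is simply a user's relative resource divided by the sum over all users; a sketch (the equal-split fallback for an all-zero input is an added safeguard, not from the source):

```python
def resource_proportions(values):
    """Normalize per-user relative resources (scheduling or computing)
    into proportions that sum to 1, matching P_i = S_i / sum_k S_k."""
    total = sum(values)
    if total == 0:
        return [1.0 / len(values)] * len(values)  # assumed fallback: split evenly
    return [v / total for v in values]
```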
The larger the relative scheduling resource and the relative computing resource of a user are, the larger the user's scheduling resource proportion and computing resource proportion are; such a user needs more scheduling and computing resources, requires correspondingly greater computing power, and is therefore allocated more computer hardware resources.
And distributing corresponding computer hardware resources to each user according to the scheduling resource proportion and the computing resource proportion of all the users for scheduling the generated resources of the users and generating VR display pictures of all the users.
The system comprises a terrain area partitioning module, an area resource partitioning module, an area characteristic acquisition module, an area characteristic grading module, a user resource characteristic acquisition module, a user resource acquisition module and a hardware resource allocation module. Compared with the existing single-user flight simulation experience system, the multi-user flight simulation experience system provided by the invention has the advantages that the visual range area and the projection plane of a user are obtained according to the viewpoint and the main visual direction of the user, the resolution characteristic value and the texture detail characteristic value of each terrain subarea in the visual range area are obtained, the relative scheduling resource and the relative computing resource of the user are computed according to the terrain elevation data matrix, the computer hardware resource is distributed according to the scheduling resource proportion and the computing resource proportion of the user, the generated resource of the user is scheduled to generate the VR display picture of the user, the computer hardware resource is better and reasonably distributed in a multi-user scene, and the picture quality of VR display equipment of each user in the multi-user scene is ensured.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (10)
1. A VR image based flight simulation experience system, comprising:
the terrain area blocking module is used for blocking a terrain area to obtain all terrain sub-areas;
the regional resource dividing module is used for dividing the texture data packet to obtain texture data sub-packets of each terrain sub-region, setting different resolution levels and obtaining terrain elevation data blocks of all resolution levels of each terrain sub-region;
the regional characteristic acquisition module is used for acquiring a visual range region and a projection plane of a user according to a viewpoint and a main view direction of the user and acquiring a resolution characteristic value and a texture detail characteristic value of each terrain subarea in the visual range region;
the regional characteristic grading module is used for grading the resolution characteristic value and the texture detail characteristic value to obtain all resolution characteristic value grades and texture detail characteristic value grades;
the user resource characteristic acquisition module is used for acquiring all terrain subareas and terrain elevation data matrixes processed by the users and calculating relative scheduling resources and relative calculation resources of the users according to the terrain elevation data matrixes;
the user resource acquisition module is used for taking target grade terrain elevation data blocks and target texture data sub-packages corresponding to all terrain sub-areas processed by the user as generating resources of the user;
and the hardware resource allocation module is used for calculating the scheduling resource proportion and the calculating resource proportion of the user, allocating computer hardware resources according to the scheduling resource proportion and the calculating resource proportion of the user, scheduling the generated resources of the user and generating a VR display picture of each user.
2. The VR image based flight simulation experience system of claim 1, wherein the setting of different resolution levels comprises the following specific steps: dividing resolution into a second preset number A of resolution levels from the lowest resolution level to the highest resolution level, wherein resolution level A is the highest resolution level and resolution level 1 is the lowest resolution level; the number of data points contained in the terrain elevation data block of a terrain subarea increases with the resolution level, so that the terrain elevation data block at resolution level 1 contains the fewest data points and the terrain elevation data block at resolution level A contains the most data points, where A represents the second preset number.
3. The VR image based flight simulation experience system of claim 1, wherein the obtaining of the user's visual range area and projection plane from the user's viewpoint and main viewing direction comprises the following steps:
when a plurality of users use the flight simulation equipment at the same time, all users share the same viewpoint, while each user has a main viewing direction of his or her own;
acquiring a visual range area on a terrain area according to a viewpoint by combining with the visual angle limit of flight simulation equipment; and combining VR equipment information, and acquiring a projection plane of each user according to the main view direction of each user.
4. The VR image-based flight simulation experience system of claim 1, wherein the resolution feature values and texture detail feature values are obtained by:
acquiring the distance between the center point of each terrain subarea and a viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; acquiring the distance between the central point of each terrain subarea and the central point of the projection plane of each user, and recording as the main sight distance between each terrain subarea and each user;
and dividing the viewpoint distance of the terrain subarea by the first numerical value to obtain a resolution characteristic value of the terrain subarea, and dividing the dominant view distance of the terrain subarea and the user by the second numerical value to obtain a texture detail characteristic value of the terrain subarea and the user, wherein the second numerical value is the maximum distance between the projection plane of the user and the central point of the projection plane.
5. The VR image based flight simulation experience system of claim 1, wherein the obtaining all the resolution feature value levels and texture detail feature value levels comprises the following steps:
the value range according to the resolution characteristic value is [0,1 ]]Obtaining all resolution feature value levels, including: the characteristic value of resolution isIn range, belongs to resolution characteristic value level 1, and the resolution characteristic value is->Within range, belongs to resolution characteristic level 2, and similarly, resolution characteristic values in &>In the range, belongs to a resolution characteristic value level>,/>Representing a second preset number;
according to the value range [0,1 ] of the texture detail characteristic value]And obtaining all levels of texture detail feature values, including: the texture detail feature value isWithin the range, belongs to a texture detail feature value level 1, and the texture detail feature value is ≥ er>When the range is within, it belongs to level 2, and similarly, the texture detail value is->In range, belong to a texture detail feature value level >>,/>Representing a third preset number.
6. The VR image based flight simulation experience system of claim 1, wherein the acquiring all terrain subareas and terrain elevation data matrix of the user comprises the following steps:
for any terrain subarea, acquiring the texture detail characteristic value levels of the terrain subarea with respect to all users, and marking the user corresponding to the lowest of these levels as the processing user of the terrain subarea; acquiring the processing user corresponding to each terrain subarea in the visual range area, and allocating each terrain subarea in the visual range area to its corresponding processing user as a processed terrain subarea of that processing user;
for any user, counting the number of the user's processed terrain subareas belonging to resolution characteristic value level a and texture detail characteristic value level b, and taking the number as the element in row a and column b of the terrain elevation data matrix of the user.
7. The VR image based flight simulation experience system of claim 1, wherein the relative scheduling resources are calculated by:
S_i = - Σ_{a=1}^{A} Σ_{b=1}^{B} ( N_i(a,b) / N_i ) · ln( N_i(a,b) / N_i )

in the formula, S_i represents the relative scheduling resource of the ith user, N_i represents the number of processed terrain subareas belonging to the ith user, N_i(a,b) represents the element in row a and column b of the terrain elevation data matrix of the ith user, ln represents the natural logarithm with base e, A represents the second preset number, and B represents the third preset number.
8. The VR image based flight simulation experience system of claim 1, wherein the relative computing resources are computed by:
C_i = - Σ_{a=1}^{A} Σ_{b=1}^{B} b · ( N_i(a,b) / N_i ) · ln( N_i(a,b) / N_i )

in the formula, C_i represents the relative computing resource of the ith user, N_i represents the number of processed terrain subareas belonging to the ith user, N_i(a,b) represents the element in row a and column b of the terrain elevation data matrix of the ith user, ln represents the natural logarithm with base e, A represents the second preset number, and B represents the third preset number.
9. The VR image-based flight simulation experience system of claim 1, wherein the target-level terrain elevation data block and the target texture data sub-packet are obtained by:
according to the resolution characteristic value level a of the terrain subarea processed by the user, obtaining the terrain elevation data block of resolution level a corresponding to the processed terrain subarea and recording it as the target-level terrain elevation data block of the processed terrain subarea; according to the texture detail characteristic value level b of the processed terrain subarea of the user, setting the utilization rate of the texture data sub-packet corresponding to the processed terrain subarea to b/B and recording the sub-packet as the target texture data sub-packet of the processed terrain subarea, where B represents the third preset number.
10. The VR image based flight simulation experience system of claim 1, wherein the calculating of the scheduled resource proportion and the calculated resource proportion of the user includes the following steps:
P_i = S_i / Σ_{k=1}^{U} S_k        Q_i = C_i / Σ_{k=1}^{U} C_k

in the formula, P_i represents the scheduling resource proportion of the ith user, S_i represents the relative scheduling resource of the ith user, U represents the number of users simultaneously using the flight simulation equipment, and S_k represents the relative scheduling resource of the kth user; Q_i represents the computing resource proportion of the ith user, C_i represents the relative computing resource of the ith user, and C_k represents the relative computing resource of the kth user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310213585.4A CN115909858B (en) | 2023-03-08 | 2023-03-08 | Flight simulation experience system based on VR image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115909858A true CN115909858A (en) | 2023-04-04 |
CN115909858B CN115909858B (en) | 2023-05-09 |
Family
ID=85739227
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310213585.4A Active CN115909858B (en) | 2023-03-08 | 2023-03-08 | Flight simulation experience system based on VR image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115909858B (en) |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0100097A2 (en) * | 1982-07-30 | 1984-02-08 | Honeywell Inc. | Computer controlled imaging system |
US6346938B1 (en) * | 1999-04-27 | 2002-02-12 | Harris Corporation | Computer-resident mechanism for manipulating, navigating through and mensurating displayed image of three-dimensional geometric model |
US20060077255A1 (en) * | 2004-08-10 | 2006-04-13 | Hui Cheng | Method and system for performing adaptive image acquisition |
US20090087029A1 (en) * | 2007-08-22 | 2009-04-02 | American Gnc Corporation | 4D GIS based virtual reality for moving target prediction |
CN102074049A (en) * | 2011-03-01 | 2011-05-25 | 哈尔滨工程大学 | Wide-range terrain scheduling simplifying method based on movement of viewpoint |
CN202221566U (en) * | 2011-07-08 | 2012-05-16 | 中国民航科学技术研究院 | Flight programming system and verification platform of performance-based navigation |
CN104766366A (en) * | 2015-03-31 | 2015-07-08 | 东北林业大学 | Method for establishing three-dimensional virtual reality demonstration |
CN105139451A (en) * | 2015-08-10 | 2015-12-09 | 中国商用飞机有限责任公司北京民用飞机技术研究中心 | HUD (head-up display) based synthetic vision guiding display system |
CN106446351A (en) * | 2016-08-31 | 2017-02-22 | 郑州捷安高科股份有限公司 | Real-time drawing-oriented large-scale scene organization and scheduling technology and simulation system |
CN106530896A (en) * | 2016-11-30 | 2017-03-22 | 中国直升机设计研究所 | Virtual system for unmanned aerial vehicle flight demonstration |
CN109064546A (en) * | 2018-06-08 | 2018-12-21 | 东南大学 | A kind of landform image data fast dispatch method and its system |
CN110908510A (en) * | 2019-11-08 | 2020-03-24 | 四川大学 | Application method of oblique photography modeling data in immersive display equipment |
CN112001993A (en) * | 2020-07-14 | 2020-11-27 | 深圳市规划国土房产信息中心(深圳市空间地理信息中心) | Multi-GPU (graphics processing Unit) city simulation system for large scene |
WO2021113268A1 (en) * | 2019-12-01 | 2021-06-10 | Iven Connary | Systems and methods for generating of 3d information on a user display from processing of sensor data |
CN113506370A (en) * | 2021-07-28 | 2021-10-15 | 自然资源部国土卫星遥感应用中心 | Three-dimensional geographic scene model construction method and device based on three-dimensional remote sensing image |
CN113516769A (en) * | 2021-07-28 | 2021-10-19 | 自然资源部国土卫星遥感应用中心 | Virtual reality three-dimensional scene loading and rendering method and device and terminal equipment |
US11216663B1 (en) * | 2020-12-01 | 2022-01-04 | Pointivo, Inc. | Systems and methods for generating of 3D information on a user display from processing of sensor data for objects, components or features of interest in a scene and user navigation thereon |
US20220130145A1 (en) * | 2019-12-01 | 2022-04-28 | Pointivo Inc. | Systems and methods for generating of 3d information on a user display from processing of sensor data for objects, components or features of interest in a scene and user navigation thereon |
Also Published As
Publication number | Publication date |
---|---|
CN115909858B (en) | 2023-05-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104731894A (en) | Thermodynamic diagram display method and device | |
KR102047615B1 (en) | Processing Method and Apparatus for Particle Systems | |
CN104200506B (en) | Three-dimension GIS massive vector data rendering intent and device | |
CN102306395B (en) | Distributed drawing method and device of three-dimensional data | |
CN108267154B (en) | Map display method and device | |
CN110909759B (en) | Urban area hierarchical management system and method based on population big data | |
US9208752B2 (en) | Method for synchronous representation of a virtual reality in a distributed simulation system | |
CN104183020B (en) | Atural object mesh simplification method based on the local secondary error measure with penalty term | |
CN110827391A (en) | Image rendering method, device and equipment and storage medium | |
CN110378992A (en) | Towards large scene model web terminal dynamic rendering LOD processing method | |
CN110555085A (en) | Three-dimensional model loading method and device | |
CN107274344B (en) | Map zooming method and system based on resource distribution, memory and control equipment | |
CN115909858A (en) | Flight simulation experience system based on VR image | |
CN112055213B (en) | Method, system and medium for generating compressed image | |
CN117555426A (en) | Virtual reality interaction system based on digital twin technology | |
CN107688431A (en) | Man-machine interaction method based on radar fix | |
CN116883576A (en) | TBR+PT-based collaborative rendering method and device | |
CN109241213B (en) | Electronic map point location aggregation method and device | |
CN107992821A (en) | A kind of image-recognizing method and system | |
Boorboor et al. | Submerse: Visualizing storm surge flooding simulations in immersive display ecologies | |
CN103106687B (en) | The computer generating method of three-dimensional ocean grid and device thereof in self-adaptation FOV (Field of View) | |
CN107507256B (en) | Real-time drawing method of large-batch ship targets based on grid aggregation in VTS system | |
CN114064831B (en) | Data rasterization display method and system and storage medium | |
CN113115077B (en) | Code rate self-adaptive transmission method and system for static point cloud server | |
CN110827400B (en) | Method and device for generating model of object in three-dimensional scene and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||