CN115909858A - Flight simulation experience system based on VR image - Google Patents

Publication number: CN115909858A
Application number: CN202310213585.4A
Authority: CN (China)
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN115909858B (granted publication)
Inventors: 韩权, 刘敏, 张猛, 邱鹏, 李钊滨, 陈益民
Applicant and current assignee: Shenzhen Nantianmen Network Information Co ltd
Prior art keywords: terrain, user, resolution, level, subarea

Landscapes

  • Processing Or Creating Images (AREA)
Abstract

The invention relates to the technical field of data processing, and in particular to a flight simulation experience system based on VR images. The system obtains all terrain subareas; obtains the visual range area and projection plane of a user according to the user's viewpoint and main viewing direction; obtains the resolution characteristic value and texture detail characteristic value of each terrain subarea in the visual range area; obtains all the terrain subareas that each user processes and the user's terrain elevation data matrix; calculates the user's relative scheduling resources and relative computing resources from the terrain elevation data matrix; allocates computer hardware resources according to each user's scheduling resource proportion and computing resource proportion; and schedules the user's generation resources to generate the user's VR display picture. In this way, computer hardware resources are allocated more reasonably in a multi-user scene, and the picture quality of each user's VR display device in the multi-user scene is guaranteed.

Description

Flight simulation experience system based on VR image
Technical Field
The invention relates to the technical field of data processing, in particular to a flight simulation experience system based on VR images.
Background
The flight simulation experience system is simulation equipment for flight training and science popularization; its most important purpose is to train pilots, and thanks to its safety, reliability, economy, and freedom from meteorological conditions it has been widely applied in recent years against the background of the popularization of civil aviation. With the continuous development of VR technology, combining VR with the flight simulation experience system improves the immersion and interactivity of flight simulation and makes the flight scene more realistic, which in turn demands higher performance from computer-generated graphics.
The flight simulation experience system needs to rapidly process huge amounts of terrain data and texture mapping data to keep flight simulation pictures continuous and avoid picture tearing. However, as the picture quality requirements of simultaneous users of flight simulation systems gradually rise, computer hardware costs climb as well, and how to better coordinate computer hardware resources in a multi-user scene while guaranteeing the picture quality at each user's VR display end has become an industrial difficulty to be solved urgently. In the prior art, in order to save computer hardware cost, the terrain data and texture data required for a flight simulation picture are generally adjusted in real time according to changes of the flight viewpoint. In a multi-user scene, different users have different main viewing directions; the prior art saves computer hardware resources by performing different levels of storage scheduling on terrain data with a terrain LOD (level of detail) model algorithm, but this method cannot comprehensively schedule terrain data for multiple user targets and cannot guarantee the picture quality of each user's VR display device in a multi-user scene. A method is therefore needed that comprehensively schedules terrain data across multiple main viewing directions and thereby guarantees the picture quality of multi-user VR display devices.
Disclosure of Invention
The invention provides a flight simulation experience system based on VR images, which aims to solve the existing problems.
The invention relates to a flight simulation experience system based on VR images, which adopts the following technical scheme:
one embodiment of the present invention provides a flight simulation experience system based on VR images, the system comprising:
the terrain area blocking module is used for blocking the terrain area to obtain all terrain sub-areas;
the region resource dividing module is used for dividing the texture data packets, obtaining texture data sub-packets of each terrain subregion, setting different resolution levels and obtaining terrain elevation data blocks of all the resolution levels of each terrain subregion;
the regional characteristic acquisition module is used for acquiring a visual range region and a projection plane of a user according to a viewpoint and a main view direction of the user and acquiring a resolution characteristic value and a texture detail characteristic value of each terrain subregion in the visual range region;
the regional characteristic grading module is used for grading the resolution characteristic value and the texture detail characteristic value to obtain all resolution characteristic value grades and texture detail characteristic value grades;
the user resource characteristic acquisition module is used for acquiring all terrain subareas and terrain elevation data matrixes processed by the users and calculating relative scheduling resources and relative calculation resources of the users according to the terrain elevation data matrixes;
the user resource acquisition module is used for taking the target grade terrain elevation data blocks and the target texture data sub-packages corresponding to all the terrain sub-areas to be processed of the user as the generation resources of the user;
and the hardware resource allocation module is used for calculating the scheduling resource proportion and the calculating resource proportion of the user, allocating computer hardware resources according to the scheduling resource proportion and the calculating resource proportion of the user, scheduling the generated resources of the user and generating a VR display picture of each user.
Further, the setting of different resolution levels includes the following specific steps:
carrying out multi-resolution level division from the highest resolution level to the lowest resolution level, and dividing into a second preset number $n_1$ of resolution levels, wherein resolution level $n_1$ is the highest resolution level and resolution level 1 is the lowest resolution level; the number of data points in the terrain elevation data block at resolution level 1 is in the range $\left[d_{\min},\ d_{\min}+\frac{d_{\max}-d_{\min}}{n_1}\right)$, the number of data points in the terrain elevation data block at resolution level 2 is in the range $\left[d_{\min}+\frac{d_{\max}-d_{\min}}{n_1},\ d_{\min}+\frac{2(d_{\max}-d_{\min})}{n_1}\right)$, and similarly the number of data points in the terrain elevation data block at resolution level $j$ is in the range $\left[d_{\min}+\frac{(j-1)(d_{\max}-d_{\min})}{n_1},\ d_{\min}+\frac{j(d_{\max}-d_{\min})}{n_1}\right)$, wherein $n_1$ represents the second preset number, $d_{\min}$ represents the number of data points in the terrain elevation data block of the lowest resolution level, and $d_{\max}$ represents the number of data points in the terrain elevation data block of the highest resolution level.
Further, the acquiring a visual range area and a projection plane of the user according to the viewpoint and the main viewing direction of the user includes the following specific steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints of the users are the same viewpoint, and the main viewing directions of the users are multiple;
acquiring a visual range area on a terrain area according to a viewpoint by combining with the visual angle limit of flight simulation equipment; and combining VR equipment information, and acquiring a projection plane of each user according to the main view direction of each user.
Further, the method for obtaining the resolution characteristic value and the texture detail characteristic value comprises the following steps:
acquiring the distance between the center point of each terrain subarea and a viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; acquiring the distance between the central point of each terrain subarea and the central point of the projection plane of each user, and recording as the main sight distance between each terrain subarea and each user;
and dividing the viewpoint distance of the terrain subarea by a first numerical value to obtain the resolution characteristic value of the terrain subarea, and dividing the main sight distance between the terrain subarea and the user by a second numerical value to obtain the texture detail characteristic value of the terrain subarea and the user, wherein the second numerical value is the maximum distance to the central point of the user's projection plane.
Further, the obtaining of all the resolution characteristic value levels and texture detail characteristic value levels includes the following specific steps:
according to the value range $[0,1]$ of the resolution characteristic value, obtaining all resolution characteristic value levels, including: a resolution characteristic value in the range $\left(\frac{n_1-1}{n_1},\ 1\right]$ belongs to resolution characteristic value level 1, a resolution characteristic value in the range $\left(\frac{n_1-2}{n_1},\ \frac{n_1-1}{n_1}\right]$ belongs to resolution characteristic value level 2, and similarly a resolution characteristic value in the range $\left(\frac{n_1-j}{n_1},\ \frac{n_1-j+1}{n_1}\right]$ belongs to resolution characteristic value level $j$, wherein $n_1$ represents the second preset number;
according to the value range $[0,1]$ of the texture detail characteristic value, obtaining all texture detail characteristic value levels, including: a texture detail characteristic value in the range $\left(\frac{n_2-1}{n_2},\ 1\right]$ belongs to texture detail characteristic value level 1, a texture detail characteristic value in the range $\left(\frac{n_2-2}{n_2},\ \frac{n_2-1}{n_2}\right]$ belongs to texture detail characteristic value level 2, and similarly a texture detail characteristic value in the range $\left(\frac{n_2-m}{n_2},\ \frac{n_2-m+1}{n_2}\right]$ belongs to texture detail characteristic value level $m$, wherein $n_2$ represents the third preset number.
Further, the acquiring all terrain subareas and terrain elevation data matrix for processing of the user includes the following specific steps:
for any terrain subarea, acquiring the texture detail characteristic value levels of the terrain subarea with the plurality of users, and marking the user corresponding to the lowest of these texture detail characteristic value levels as the processing user of the terrain subarea; acquiring the processing user corresponding to each terrain subarea in the visual range area, and allocating each terrain subarea in the visual range area to its corresponding processing user as a processing terrain subarea of that processing user;
for any user, counting the number of the user's processing terrain subareas that belong to resolution characteristic value level $j$ and texture detail characteristic value level $m$, and taking this number as the element in row $j$ and column $m$ of the terrain elevation data matrix of the user.
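As an illustrative sketch (function and variable names are hypothetical, not from the patent), the matrix can be assembled by counting each processing terrain subarea's pair of levels:

```python
# Build the user's terrain elevation data matrix: entry [j-1][m-1] counts the
# user's processing terrain subareas at resolution characteristic value level j
# and texture detail characteristic value level m (levels are 1-based).
def elevation_data_matrix(subarea_levels, n1, n2):
    """subarea_levels: list of (resolution_level, texture_level) pairs."""
    matrix = [[0] * n2 for _ in range(n1)]
    for j, m in subarea_levels:
        matrix[j - 1][m - 1] += 1
    return matrix

# Three processing subareas for one user, with n1 = 4 resolution levels
# and n2 = 3 texture detail levels (small values for illustration only).
M = elevation_data_matrix([(1, 2), (1, 2), (3, 1)], n1=4, n2=3)
```

The matrix summarizes, per level pair, how much data the user's simulated terrain will demand, which is what the relative resource formulas below consume.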
Further, the method for calculating the relative scheduling resource comprises the following steps:
[Formula shown in the source only as an image placeholder]

in the formula, $R_i$ represents the relative scheduling resource of the $i$-th user, $N_i$ represents the number of processing terrain subareas belonging to the $i$-th user, $A^i_{j,m}$ represents the element in row $j$ and column $m$ of the terrain elevation data matrix of the $i$-th user, $\ln$ represents the natural logarithm with base $e$, $n_1$ represents the second preset number, and $n_2$ represents the third preset number.
Further, the calculation method of the relative calculation resources is as follows:
[Formula shown in the source only as an image placeholder]

in the formula, $C_i$ represents the relative computing resource of the $i$-th user, $N_i$ represents the number of processing terrain subareas belonging to the $i$-th user, $A^i_{j,m}$ represents the element in row $j$ and column $m$ of the terrain elevation data matrix of the $i$-th user, $\ln$ represents the natural logarithm with base $e$, $n_1$ represents the second preset number, and $n_2$ represents the third preset number.
Further, the method for acquiring the sub-packets of the target-level terrain elevation data block and the target texture data comprises the following steps:
according to the resolution characteristic value level $j$ of the user's processing terrain subarea, the terrain elevation data block of the corresponding resolution level $j$ of the processing terrain subarea is obtained and recorded as the target-level terrain elevation data block of the processing terrain subarea; according to the texture detail characteristic value level $m$ of the user's processing terrain subarea, the usage level of the texture data sub-packet corresponding to the processing terrain subarea is set to $m$, and the sub-packet is recorded as the target texture data sub-packet of the processing terrain subarea, wherein $n_2$ represents the third preset number.
Further, the calculating of the scheduling resource proportion and the calculating resource proportion of the user includes the following specific steps:
$$P_i = \frac{R_i}{\sum_{k=1}^{U} R_k}$$

$$Q_i = \frac{C_i}{\sum_{k=1}^{U} C_k}$$

in the formulas, $P_i$ represents the scheduling resource proportion of the $i$-th user, $R_i$ represents the relative scheduling resource of the $i$-th user, $U$ represents the number of users simultaneously using the flight simulation equipment, and $R_k$ represents the relative scheduling resource of the $k$-th user; $Q_i$ represents the computing resource proportion of the $i$-th user, $C_i$ represents the relative computing resource of the $i$-th user, and $C_k$ represents the relative computing resource of the $k$-th user.
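A minimal sketch of the proportion step, assuming each occupancy is the user's relative resource normalized by the sum over all simultaneous users (function and variable names are hypothetical):

```python
# Normalize each user's relative resource into a proportion of the total:
# p_i = r_i / sum_k r_k, so the proportions sum to 1 and hardware can be
# split among users in those shares.
def resource_proportions(relative_resources):
    total = sum(relative_resources)
    return [r / total for r in relative_resources]

# Example: three users' relative scheduling resources (illustrative values).
shares = resource_proportions([2.0, 3.0, 5.0])
```

The same function applies unchanged to the relative computing resources, since both proportions are plain normalizations.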
The technical scheme of the invention has the following beneficial effects. Compared with a conventional single-user flight simulation experience system, the multi-user flight simulation experience system provided by the invention obtains the visual range area and projection plane of a user according to the user's viewpoint and main viewing direction, obtains the resolution characteristic value and texture detail characteristic value of each terrain subarea in the visual range area, calculates the user's relative scheduling resources and relative computing resources from the terrain elevation data matrix, allocates computer hardware resources according to each user's scheduling resource proportion and computing resource proportion, and schedules the user's generation resources to generate the user's VR display picture. In this way, computer hardware resources are allocated more reasonably in a multi-user scene, and the picture quality of each user's VR display device in the multi-user scene is guaranteed.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a flow chart illustrating the steps of a VR image based flight simulation experience system in accordance with the present invention;
fig. 2 is a schematic view of the visual range area and projection plane of a user of the present invention.
Detailed Description
To further illustrate the technical means and effects adopted by the present invention to achieve the intended objects, a flight simulation experience system based on VR images, its specific implementation, structure, features, and effects are described in detail below in conjunction with the accompanying drawings and the preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the flight simulation experience system based on VR images, which is provided by the present invention, with reference to the accompanying drawings.
Referring to fig. 1, a flowchart illustrating steps of a VR image based flight simulation experience system according to an embodiment of the present invention is shown, where the system includes: the system comprises a terrain region partitioning module, a region resource partitioning module, a region feature acquisition module, a region feature grading module, a user resource feature acquisition module, a user resource acquisition module and a hardware resource allocation module.
The terrain area blocking module is used for blocking the terrain area to obtain all terrain sub-areas.
The method specifically comprises the following steps:
blocking the terrain area of a flight simulation scene to obtain all terrain subareas, including: the whole terrain area of the flight simulation scene is a square area, and the whole terrain area is uniformly divided into a first preset number of terrain subareas (square areas); because the division is uniform, the areas of all the terrain subareas are the same and are recorded as S, the unit of the area of each terrain subarea being square decimeters. In this embodiment the first preset number $n_0$ is 10000; in other embodiments, the implementer may set the first preset number as needed.
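A minimal sketch of the uniform blocking step, assuming the first preset number is a perfect square so the square terrain area splits into a k x k grid (names, units, and the coordinate convention are hypothetical):

```python
import math

# Uniformly block a square terrain area into a grid of equal square subareas.
# side: side length of the whole terrain area; count: the first preset number.
def block_terrain(side, count):
    k = math.isqrt(count)
    assert k * k == count, "first preset number must be a perfect square here"
    sub = side / k
    # Each subarea is identified by its (row, col) index and its center point,
    # which later supplies the viewpoint and main sight distances.
    return [((r, c), ((c + 0.5) * sub, (r + 0.5) * sub))
            for r in range(k) for c in range(k)]

subareas = block_terrain(side=100_000.0, count=10_000)  # a 100 x 100 grid
```

With the embodiment's first preset number of 10000 this yields a 100 x 100 grid, and every subarea has the same area S as the text requires.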
And the area resource dividing module is used for dividing DEM terrain elevation data and texture data packets according to all terrain sub-areas.
The method specifically comprises the following steps:
setting different resolution levels, including: the DEM terrain elevation data of the terrain area are used for drawing the terrain of the terrain area, the DEM terrain elevation data corresponding to each terrain sub-area are recorded as terrain elevation data blocks, the number of data points in the terrain elevation data blocks of different resolution levels of the terrain sub-area is different, and the number of the data points in the terrain elevation data blocks of higher resolution levels is larger. The highest resolution level of the terrain elevation data block is determined by the source of the DEM terrain elevation data, while the lowest resolution level of the terrain elevation data block is set by the user.
In an embodiment of the present invention, the number of data points in the terrain elevation data block of the highest resolution level is $d_{\max}=\frac{S}{B_1}$, wherein $S$ denotes the area of a terrain subarea and $B_1$ denotes the first unit area, which is 0.25 square meters; the number of data points in the terrain elevation data block of the lowest resolution level is $d_{\min}=\frac{S}{B_2}$, wherein $B_2$ denotes the second unit area. In this embodiment the second unit area is 100 square meters; in other embodiments, the implementer may set the second unit area as needed.
Carrying out multi-resolution level division from the highest resolution level to the lowest resolution level, and dividing into a second preset number $n_1$ of resolution levels, wherein resolution level $n_1$ is the highest resolution level and resolution level 1 is the lowest resolution level; the number of data points in the terrain elevation data block at resolution level 1 is in the range $\left[d_{\min},\ d_{\min}+\frac{d_{\max}-d_{\min}}{n_1}\right)$, the number of data points in the terrain elevation data block at resolution level 2 is in the range $\left[d_{\min}+\frac{d_{\max}-d_{\min}}{n_1},\ d_{\min}+\frac{2(d_{\max}-d_{\min})}{n_1}\right)$, and similarly the number of data points in the terrain elevation data block at resolution level $j$ is in the range $\left[d_{\min}+\frac{(j-1)(d_{\max}-d_{\min})}{n_1},\ d_{\min}+\frac{j(d_{\max}-d_{\min})}{n_1}\right)$, wherein $n_1$ represents the second preset number, $d_{\min}$ represents the number of data points in the terrain elevation data block of the lowest resolution level, and $d_{\max}$ represents the number of data points in the terrain elevation data block of the highest resolution level.
The second preset number $n_1$ in this embodiment is 50; in other embodiments, the implementer may set the second preset number as needed.
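Assuming the span between the lowest- and highest-resolution data-point counts is divided into equal sub-ranges (the natural reading of "similarly" above), a sketch of mapping a block's point count to its resolution level; the numeric values are illustrative, not asserted by the patent:

```python
# Map a terrain elevation data block's data-point count to a resolution level,
# assuming [d_min, d_max] is split into n1 equal sub-ranges and level 1 is
# the lowest resolution level.
def resolution_level(points, d_min, d_max, n1):
    if points >= d_max:
        return n1  # the top of the span belongs to the highest level
    width = (d_max - d_min) / n1
    return int((points - d_min) // width) + 1

# Hypothetical example: a subarea of 10 000 m^2 gives d_max = 10 000 / 0.25
# = 40 000 and d_min = 10 000 / 100 = 100, with n1 = 50 as in the embodiment.
level_lo = resolution_level(100, 100, 40_000, 50)
level_hi = resolution_level(40_000, 100, 40_000, 50)
level_mid = resolution_level(10_000, 100, 40_000, 50)
```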
Storing terrain elevation data blocks of different resolution levels of a terrain sub-region according to a terrain LOD model algorithm, wherein the storing process comprises the following steps: for any terrain sub-area, acquiring terrain elevation data blocks of all resolution levels of the terrain sub-area, and storing the terrain elevation data blocks of each resolution level of the terrain sub-area in corresponding units; the terrain elevation data blocks of all resolution levels for all terrain sub-areas are stored in corresponding cells.
The method comprises the steps that a terrain area of a flight simulation scene corresponds to a texture data packet, and the texture data packet is used for rendering the terrain area and drawing specific texture details after the terrain of the terrain area is drawn; according to the partitioning result of the terrain area, the texture data packet is divided into a plurality of texture data sub-packets, each terrain sub-area corresponds to one texture data sub-packet, the texture data sub-packets are used for rendering the terrain sub-areas, the higher the utilization rate of the texture data sub-packets is, the higher the rendering degree of the terrain sub-areas is, the more the texture details of the terrain sub-areas are, and the better the visual effect is.
The area characteristic acquisition module is used for acquiring a resolution characteristic value and a texture detail characteristic value of the terrain subarea.
The method specifically comprises the following steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints and the main viewing directions of the users are obtained through a positioning device arranged in the flight simulation equipment, and because the viewpoints of different users on the same flight simulation equipment are approximately the same and only the main viewing directions are different, the viewpoints of the users are considered to be the same viewpoint and the main viewing directions of the users are multiple.
In combination with the visual angle limitation of the flight simulation equipment, acquiring a visual range area on a terrain area according to a viewpoint, as shown in fig. 2, which is the prior art and will not be described in detail herein; in conjunction with the VR device information, a projection plane of each user is obtained according to the main viewing direction of each user, as shown in fig. 2.
Acquiring the distance between the center point of each terrain subarea and a viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; and acquiring the distance between the central point of each terrain sub-area and the central point of the projection plane of each user, and recording the distance as the main sight distance between each terrain sub-area and each user.
It should be noted that the user's requirements for the resolution level and texture detail of each terrain subarea in the visual range area are related to the viewpoint distance of each terrain subarea and the dominant viewing distance of each terrain subarea from the user: the larger the viewpoint distance of the terrain subarea is, the farther the terrain subarea is from the viewpoint is, the smaller the requirement of the user on the resolution of the terrain subarea is, that is, the larger the viewpoint distance is, the more low-resolution terrain elevation data blocks of the terrain subarea are required to be used to form the simulated terrain (in the main view direction) of the user; the smaller the viewpoint distance is, the closer the terrain subarea is to the viewpoint, the greater the requirement of the user on the resolution of the terrain elevation data blocks of the terrain subarea is, that is, the smaller the viewpoint distance is, the more high-resolution terrain elevation data blocks of the terrain subarea need to be used to form the user's simulated terrain (in the main view direction).
It should be further noted that the larger the dominant viewing distance between the terrain subregion and the user is, the farther the terrain subregion is from the projection plane of the user is, the smaller the requirement of the user on the texture details (determined by the human eye imaging characteristics) of the terrain subregion is, the lower the usage rate of the texture data sub-packet is, that is, the larger the dominant viewing distance is, the less texture details of the terrain subregion are required, and the lower the usage rate of the texture data sub-packet is; the smaller the dominant viewing distance is, the closer the terrain subregion is to the projection plane of the user, the greater the requirement of the user on texture details (determined by human eye imaging characteristics) of the terrain subregion is, the higher the utilization rate of the texture data sub-packet is, that is, the smaller the dominant viewing distance is, the more texture details of the terrain subregion are required, and the higher the utilization rate of the texture data sub-packet is.
In conclusion, the viewpoint distance of the terrain subarea and the main sight distance between the terrain subarea and the user are used to represent the user's requirements on the resolution level and the texture details of the terrain subarea, respectively. The viewpoint distance of the terrain subarea is divided by a first numerical value to obtain the resolution characteristic value of the terrain subarea, wherein the first numerical value is the visual distance set by the flight simulation system and is determined by its set parameters; the main sight distance between the terrain subarea and the user is divided by a second numerical value to obtain the texture detail characteristic value of the terrain subarea and the user, wherein the second numerical value is the maximum distance between a terrain subarea and the central point of the user's projection plane. The resolution characteristic value and the texture detail characteristic value are both normalization results.
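A sketch of the two normalized characteristic values with hypothetical coordinates and parameter names; the first numerical value (visual distance) and second numerical value (maximum distance to the projection-plane center) are simply passed in as given:

```python
import math

# Compute a terrain subarea's two normalized characteristic values:
# resolution value = viewpoint distance / visual distance, and
# texture value = main sight distance / maximum projection-plane distance.
def characteristic_values(center, viewpoint, plane_center,
                          visual_distance, max_plane_distance):
    res_value = math.dist(center, viewpoint) / visual_distance
    tex_value = math.dist(center, plane_center) / max_plane_distance
    return res_value, tex_value

# Illustrative geometry: subarea center 5 units from the viewpoint and
# 4 units from this user's projection-plane center.
r, t = characteristic_values(center=(3.0, 4.0, 0.0),
                             viewpoint=(0.0, 0.0, 0.0),
                             plane_center=(3.0, 0.0, 0.0),
                             visual_distance=10.0, max_plane_distance=8.0)
```

Both results land in [0, 1] by construction, which is what the grading module below relies on.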
The region feature grading module is used for grading the resolution feature value and the texture detail feature value of the terrain subregion.
The method specifically comprises the following steps:
the larger the viewpoint distance of the terrain subareas is, the larger the resolution characteristic value of the terrain subareas is, and the lower the requirement of a user on the resolution of the terrain subareas is; thus, the topographyThe greater the resolution characteristic value of the region, the smaller the resolution characteristic value level of the terrain subregion, and at this time, the smaller the requirement of the user on the resolution of the terrain subregion. Grading the resolution characteristic value and the texture detail characteristic value, wherein the resolution characteristic value is divided into a second preset number of resolution characteristic value grades, and the value range of the resolution characteristic value is [0,1 ] as the resolution characteristic value is a normalization result]Then, all resolution eigenvalue levels are specifically: the characteristic value of resolution is
Figure SMS_70
When the range is within, the resolution characteristic value belongs to the resolution characteristic value level 1
Figure SMS_71
Within range, belongs to resolution characteristic level 2, and similarly, resolution characteristic values in &>
Figure SMS_72
In the range belonging to a resolution characteristic value level>
Figure SMS_73
,/>
Figure SMS_74
Representing a second preset number, resolution characteristic level->
Figure SMS_75
The highest resolution eigenvalue level and the resolution eigenvalue level 1 the lowest resolution eigenvalue level.
The larger the dominant viewing distance between a terrain subarea and the user, the larger the texture detail characteristic value between them, and the smaller the user's requirement on the texture detail of that terrain subarea; therefore, the larger the texture detail characteristic value between a terrain subarea and the user, the smaller the texture detail characteristic value level, and the smaller the user's requirement on the texture detail of that terrain subarea. The texture detail characteristic value is graded into a third preset number $n_3$ of texture detail characteristic value levels; since the texture detail characteristic value is a normalization result, its value range is $[0,1]$, and all texture detail characteristic value levels are specifically: a texture detail characteristic value in $\left(\frac{n_3-1}{n_3}, 1\right]$ belongs to texture detail characteristic value level 1, a texture detail characteristic value in $\left(\frac{n_3-2}{n_3}, \frac{n_3-1}{n_3}\right]$ belongs to texture detail characteristic value level 2, and similarly a texture detail characteristic value in $\left[0, \frac{1}{n_3}\right]$ belongs to texture detail characteristic value level $n_3$, where $n_3$ represents the third preset number. Texture detail characteristic value level $n_3$ is the highest texture detail characteristic value level, and texture detail characteristic value level 1 is the lowest.
In this embodiment the third preset number is set to $n_3 = 10$; in other embodiments, the practitioner may set the third preset number as desired.
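The grading rule above can be sketched in a few lines. This is an illustrative sketch, not code from the patent; it assumes the interval convention that a normalized value in $\left(\frac{k-1}{n}, \frac{k}{n}\right]$ maps to level $n-k+1$, so larger values (more distant subareas) map to smaller level numbers.

```python
import math

def characteristic_level(value: float, n_levels: int) -> int:
    """Map a normalized characteristic value in [0, 1] to a level in 1..n_levels.

    Larger values (farther terrain subareas) map to smaller level numbers,
    so level n_levels is the highest (most demanding) and level 1 the lowest.
    """
    if not 0.0 <= value <= 1.0:
        raise ValueError("characteristic value must be normalized to [0, 1]")
    # value in ((k-1)/n, k/n] -> level n - k + 1; value == 0 falls into level n.
    k = math.ceil(value * n_levels)
    return n_levels - max(k, 1) + 1
```

With the embodiment's $n_3 = 10$, a subarea whose texture detail characteristic value is near 1 (very distant) lands in level 1, and one near 0 (very close) in level 10.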
The user resource characteristic acquisition module is used for calculating the relative scheduling resources and relative computing resources of the user.
The method specifically comprises the following steps:
It should be noted that, for a given user, each terrain subarea in the user's visual range area has a resolution characteristic value level and a texture detail characteristic value level, but the terrain elevation data blocks corresponding to these terrain subareas are not all scheduled by that user's computing device to form that user's simulated terrain; for each user, the final goal is accurate scheduling and texture rendering of every terrain subarea in that user's visual range area. Based on this logic, for any terrain subarea there is a texture detail characteristic value, and hence a texture detail characteristic value level, between that subarea and each user; the terrain elevation data block corresponding to the terrain subarea should be scheduled by the user with the lowest texture detail characteristic value level for that subarea to form that user's simulated terrain, because the terrain elevation data block corresponding to the terrain subarea matters most for that user.
For any terrain subarea, obtain the texture detail characteristic value levels between that terrain subarea and all users, and mark the user corresponding to the lowest of these levels as the processing user of that terrain subarea; each terrain subarea in the visual range area thus has a corresponding processing user, and each terrain subarea in the visual range area is allocated to its processing user as a processed terrain subarea of that user.
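The processing-user selection described above can be sketched as follows; the input layout (a dictionary keyed by (subarea, user) pairs) is an assumption made for illustration, and ties are broken here by user id, which the patent does not specify.

```python
def assign_processing_users(texture_level):
    """texture_level[(subarea, user)] -> texture detail characteristic value level.

    For each terrain subarea, pick the user with the lowest texture detail
    characteristic value level as its processing user (ties broken by user id).
    """
    subareas = {s for (s, _) in texture_level}
    processing = {}
    for s in subareas:
        # Gather (level, user) pairs for this subarea; min() picks the lowest
        # level, then the smallest user id on ties.
        candidates = [(lvl, u) for (s2, u), lvl in texture_level.items() if s2 == s]
        processing[s] = min(candidates)[1]
    return processing
```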
For any user, count the number $G(a,b)$ of processed terrain subareas of that user belonging to resolution characteristic value level $a$ and texture detail characteristic value level $b$, and take the number $G(a,b)$ as the element in the $a$-th row and $b$-th column of that user's terrain elevation data matrix, $a \in \{1, \dots, n_2\}$, $b \in \{1, \dots, n_3\}$. There are $n_2$ resolution characteristic value levels and $n_3$ texture detail characteristic value levels in total, so the user's terrain elevation data matrix has size $n_2 \times n_3$, where $n_2$ represents the second preset number and $n_3$ represents the third preset number.
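Building the terrain elevation data matrix from per-subarea level pairs can be sketched as below; the input layout (a list of (resolution level, texture level) pairs for one user's processed subareas) is assumed for illustration.

```python
import numpy as np

def terrain_elevation_matrix(subarea_levels, n2, n3):
    """subarea_levels: iterable of (resolution_level a, texture_level b) pairs,
    one per processed terrain subarea of a single user, with 1-based levels.

    Returns the n2 x n3 terrain elevation data matrix G, where G[a-1, b-1]
    counts the user's processed subareas at resolution level a and texture
    detail level b.
    """
    G = np.zeros((n2, n3), dtype=int)
    for a, b in subarea_levels:
        G[a - 1, b - 1] += 1
    return G
```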
The calculation formula of the relative scheduling resource of the user is specifically:

$$D_i = -\sum_{a=1}^{n_2} \sum_{b=1}^{n_3} \frac{G_i(a,b)}{M_i} \ln\!\left(\frac{G_i(a,b)}{M_i}\right)$$

where $D_i$ represents the relative scheduling resource of the $i$-th user, $M_i$ represents the number of processed terrain subareas belonging to the $i$-th user, $G_i(a,b)$ represents the element in the $a$-th row and $b$-th column of the terrain elevation data matrix of the $i$-th user (i.e. the number of processed terrain subareas of the $i$-th user belonging to resolution characteristic value level $a$ and texture detail characteristic value level $b$), $\ln$ represents the natural logarithm with base $e$, $n_2$ represents the second preset number, and $n_3$ represents the third preset number. The relative scheduling resource of the $i$-th user is thus the entropy of the numbers of processed terrain subareas belonging to the different resolution characteristic value levels and texture detail characteristic value levels: the larger $D_i$ is, the more disordered the resolution characteristic value levels of the $i$-th user's processed terrain subareas are, and the more resources need to be scheduled when forming the simulated terrain of the $i$-th user.
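The entropy above can be computed as in this sketch. It uses the usual $0 \cdot \ln 0 = 0$ convention for empty matrix cells, a convention the patent text does not state explicitly.

```python
import math

def relative_scheduling_resource(G, m_total=None):
    """Entropy of the distribution of a user's processed terrain subareas over
    the (resolution level, texture detail level) cells of the matrix G.

    G: 2-D list (or array) of counts; m_total defaults to the sum of all counts.
    Empty cells contribute 0 (the 0 * ln 0 = 0 convention).
    """
    flat = [c for row in G for c in row]
    m = m_total if m_total is not None else sum(flat)
    if m == 0:
        return 0.0
    return -sum((c / m) * math.log(c / m) for c in flat if c > 0)
```

A user whose processed subareas are spread evenly over many level combinations gets a high entropy (more scheduling resources needed); a user whose subareas all fall in one cell gets entropy 0.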
The calculation formula of the relative computing resource of the user is specifically:

$$C_i = -\sum_{a=1}^{n_2} \sum_{b=1}^{n_3} b \cdot \frac{G_i(a,b)}{M_i} \ln\!\left(\frac{G_i(a,b)}{M_i}\right)$$

where $C_i$ represents the relative computing resource of the $i$-th user, $M_i$ represents the number of processed terrain subareas belonging to the $i$-th user, $G_i(a,b)$ represents the element in the $a$-th row and $b$-th column of the terrain elevation data matrix of the $i$-th user (i.e. the number of processed terrain subareas of the $i$-th user belonging to resolution characteristic value level $a$ and texture detail characteristic value level $b$), $\ln$ represents the natural logarithm with base $e$, $n_2$ represents the second preset number, and $n_3$ represents the third preset number. $G_i(a,b)$ counts processed terrain subareas belonging to texture detail characteristic value level $b$: the higher the texture detail characteristic value level of a processed terrain subarea, the larger $b$ is, the higher the user's requirement on the texture details of that subarea, and the higher the utilization rate of the texture data sub-packet; the computing power required to process the corresponding terrain elevation data block is therefore larger, more computing resources are needed, and the relative computing resource $C_i$ of the user is larger.
The user resource acquisition module is used for acquiring the generated resources of the user.
The method specifically comprises the following steps:
For any user, the target-level terrain elevation data blocks and target texture data sub-packets corresponding to all processed terrain subareas belonging to that user are the generating resources that the user needs to schedule. The target-level terrain elevation data block and target texture data sub-packet corresponding to a processed terrain subarea are obtained as follows: according to the resolution characteristic value level $a$ of the user's processed terrain subarea, the terrain elevation data block of the corresponding resolution level $a$ of that processed terrain subarea is acquired and recorded as the target-level terrain elevation data block of that processed terrain subarea; according to the texture detail characteristic value level $b$ of the user's processed terrain subarea, the utilization rate of the texture data sub-packet corresponding to that processed terrain subarea is set to $b/n_3$, and the sub-packet is recorded as the target texture data sub-packet of that processed terrain subarea.
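A minimal sketch of the target-resource selection for one processed terrain subarea; note that the $b/n_3$ utilization rate is an assumption recovered from the surrounding text (the original formula is a lost image), and the returned dictionary layout is purely illustrative.

```python
def target_resources(resolution_level: int, texture_level: int, n3: int) -> dict:
    """For one processed terrain subarea: select the terrain elevation data
    block whose resolution level equals the subarea's resolution characteristic
    value level, and set the texture sub-packet utilization rate to b/n3
    (assumed form; higher texture detail level -> higher utilization).
    """
    return {
        "elevation_block_level": resolution_level,
        "texture_utilization": texture_level / n3,
    }
```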
The hardware resource allocation module is used for calculating the scheduling resource proportion and the computing resource proportion of the user and allocating the computer hardware resources.
The method specifically comprises the following steps:
According to the relative scheduling resources and relative computing resources of the users, the scheduling resource proportion and the computing resource proportion of each user are calculated, specifically:

$$P_i = \frac{D_i}{\sum_{k=1}^{U} D_k}, \qquad Q_i = \frac{C_i}{\sum_{k=1}^{U} C_k}$$

where $P_i$ represents the scheduling resource proportion of the $i$-th user, $D_i$ represents the relative scheduling resource of the $i$-th user, $U$ represents the number of users simultaneously using the flight simulation equipment, and $D_k$ represents the relative scheduling resource of the $k$-th user; $Q_i$ represents the computing resource proportion of the $i$-th user, $C_i$ represents the relative computing resource of the $i$-th user, and $C_k$ represents the relative computing resource of the $k$-th user.
The larger a user's relative scheduling resource and relative computing resource are, the larger that user's scheduling resource proportion and computing resource proportion are; such a user needs more scheduling and computing resources, and correspondingly more computing power, so more computer hardware resources are allocated to that user.
And distributing corresponding computer hardware resources to each user according to the scheduling resource proportion and the computing resource proportion of all the users for scheduling the generated resources of the users and generating VR display pictures of all the users.
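The proportion calculation above amounts to dividing each user's relative resource by the sum over all concurrent users, which can be sketched as below; the equal-split fallback for an all-zero input is an added assumption, not something the patent specifies.

```python
def resource_proportions(values):
    """Normalize per-user relative resources (scheduling or computing) into
    proportions that sum to 1, for dividing up computer hardware resources."""
    total = sum(values)
    if total == 0:
        # Degenerate case: no user has any relative resource; split equally
        # (assumption, not specified in the source).
        return [1.0 / len(values)] * len(values)
    return [v / total for v in values]
```

The same helper serves both $P_i$ (from the relative scheduling resources) and $Q_i$ (from the relative computing resources).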
The system comprises a terrain area partitioning module, a regional resource dividing module, a regional characteristic acquisition module, a regional characteristic grading module, a user resource characteristic acquisition module, a user resource acquisition module and a hardware resource allocation module. Compared with existing single-user flight simulation experience systems, the multi-user flight simulation experience system provided by the invention obtains the visual range area and the projection plane of each user according to the user's viewpoint and main viewing direction, and obtains the resolution characteristic value and texture detail characteristic value of each terrain subarea in the visual range area. It then calculates the relative scheduling resources and relative computing resources of each user according to the terrain elevation data matrix, allocates computer hardware resources according to each user's scheduling resource proportion and computing resource proportion, and schedules each user's generating resources to generate that user's VR display picture. In this way, computer hardware resources are allocated more reasonably in a multi-user scene, and the picture quality of each user's VR display device in the multi-user scene is ensured.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A VR image based flight simulation experience system, comprising:
the terrain area blocking module is used for blocking a terrain area to obtain all terrain sub-areas;
the regional resource dividing module is used for dividing the texture data packet to obtain texture data sub-packets of each terrain sub-region, setting different resolution levels and obtaining terrain elevation data blocks of all resolution levels of each terrain sub-region;
the regional characteristic acquisition module is used for acquiring a visual range region and a projection plane of a user according to a viewpoint and a main view direction of the user and acquiring a resolution characteristic value and a texture detail characteristic value of each terrain subarea in the visual range region;
the regional characteristic grading module is used for grading the resolution characteristic value and the texture detail characteristic value to obtain all resolution characteristic value grades and texture detail characteristic value grades;
the user resource characteristic acquisition module is used for acquiring all terrain subareas and terrain elevation data matrixes processed by the users and calculating relative scheduling resources and relative calculation resources of the users according to the terrain elevation data matrixes;
the user resource acquisition module is used for taking target grade terrain elevation data blocks and target texture data sub-packages corresponding to all terrain sub-areas processed by the user as generating resources of the user;
and the hardware resource allocation module is used for calculating the scheduling resource proportion and the calculating resource proportion of the user, allocating computer hardware resources according to the scheduling resource proportion and the calculating resource proportion of the user, scheduling the generated resources of the user and generating a VR display picture of each user.
2. The VR image based flight simulation experience system of claim 1, wherein the setting of different resolution levels includes the following specific steps: multi-resolution level division is carried out from the highest resolution level to the lowest resolution level into a second preset number $n_2$ of resolution levels, wherein resolution level $n_2$ is the highest resolution level and resolution level 1 is the lowest resolution level; the number of data points in the terrain elevation data block at resolution level 1 lies within the preset range corresponding to level 1, the number of data points in the terrain elevation data block at resolution level 2 lies within the preset range corresponding to level 2, and similarly the number of data points at resolution level $n_2$ lies within the preset range corresponding to level $n_2$, where $n_2$ represents the second preset number, the smallest range endpoint being the number of data points in the terrain elevation data block of the lowest resolution level and the largest range endpoint being the number of data points in the terrain elevation data block of the highest resolution level.
3. The VR image based flight simulation experience system of claim 1, wherein the obtaining of the user's visual range area and projection plane from the user's viewpoint and main viewing direction comprises the following steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints of the users are the same viewpoint, and the main viewing directions of the users are multiple;
acquiring a visual range area on a terrain area according to a viewpoint by combining with the visual angle limit of flight simulation equipment; and combining VR equipment information, and acquiring a projection plane of each user according to the main view direction of each user.
4. The VR image-based flight simulation experience system of claim 1, wherein the resolution feature values and texture detail feature values are obtained by:
acquiring the distance between the center point of each terrain subarea and a viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; acquiring the distance between the central point of each terrain subarea and the central point of the projection plane of each user, and recording as the main sight distance between each terrain subarea and each user;
and dividing the viewpoint distance of the terrain subarea by the first numerical value to obtain a resolution characteristic value of the terrain subarea, and dividing the dominant view distance of the terrain subarea and the user by the second numerical value to obtain a texture detail characteristic value of the terrain subarea and the user, wherein the second numerical value is the maximum distance between the projection plane of the user and the central point of the projection plane.
5. The VR image based flight simulation experience system of claim 1, wherein the obtaining of all the resolution characteristic value levels and texture detail characteristic value levels comprises the following specific steps:
all resolution characteristic value levels are obtained according to the value range $[0,1]$ of the resolution characteristic value, including: a resolution characteristic value in the range $\left(\frac{n_2-1}{n_2}, 1\right]$ belongs to resolution characteristic value level 1, a resolution characteristic value in $\left(\frac{n_2-2}{n_2}, \frac{n_2-1}{n_2}\right]$ belongs to resolution characteristic value level 2, and similarly a resolution characteristic value in $\left[0, \frac{1}{n_2}\right]$ belongs to resolution characteristic value level $n_2$, where $n_2$ represents the second preset number;
all texture detail characteristic value levels are obtained according to the value range $[0,1]$ of the texture detail characteristic value, including: a texture detail characteristic value in $\left(\frac{n_3-1}{n_3}, 1\right]$ belongs to texture detail characteristic value level 1, a texture detail characteristic value in $\left(\frac{n_3-2}{n_3}, \frac{n_3-1}{n_3}\right]$ belongs to texture detail characteristic value level 2, and similarly a texture detail characteristic value in $\left[0, \frac{1}{n_3}\right]$ belongs to texture detail characteristic value level $n_3$, where $n_3$ represents the third preset number.
6. The VR image based flight simulation experience system of claim 1, wherein the acquiring of all processed terrain subareas and the terrain elevation data matrix of the user comprises the following specific steps:
for any terrain subarea, the texture detail characteristic value levels between that terrain subarea and the plurality of users are acquired, and the user corresponding to the lowest of these levels is marked as the processing user of that terrain subarea; the processing user corresponding to each terrain subarea in the visual range area is acquired, and each terrain subarea in the visual range area is allocated to its corresponding processing user as a processed terrain subarea of that processing user;
for any user, the number of processed terrain subareas of that user belonging to resolution characteristic value level $a$ and texture detail characteristic value level $b$ is counted, and this number is taken as the element in the $a$-th row and $b$-th column of that user's terrain elevation data matrix.
7. The VR image based flight simulation experience system of claim 1, wherein the relative scheduling resources are calculated by:

$$D_i = -\sum_{a=1}^{n_2} \sum_{b=1}^{n_3} \frac{G_i(a,b)}{M_i} \ln\!\left(\frac{G_i(a,b)}{M_i}\right)$$

where $D_i$ represents the relative scheduling resource of the $i$-th user, $M_i$ represents the number of processed terrain subareas belonging to the $i$-th user, $G_i(a,b)$ represents the element in the $a$-th row and $b$-th column of the terrain elevation data matrix of the $i$-th user, $\ln$ represents the natural logarithm with base $e$, $n_2$ represents the second preset number, and $n_3$ represents the third preset number.
8. The VR image based flight simulation experience system of claim 1, wherein the relative computing resources are calculated by:

$$C_i = -\sum_{a=1}^{n_2} \sum_{b=1}^{n_3} b \cdot \frac{G_i(a,b)}{M_i} \ln\!\left(\frac{G_i(a,b)}{M_i}\right)$$

where $C_i$ represents the relative computing resource of the $i$-th user, $M_i$ represents the number of processed terrain subareas belonging to the $i$-th user, $G_i(a,b)$ represents the element in the $a$-th row and $b$-th column of the terrain elevation data matrix of the $i$-th user, $\ln$ represents the natural logarithm with base $e$, $n_2$ represents the second preset number, and $n_3$ represents the third preset number.
9. The VR image-based flight simulation experience system of claim 1, wherein the target-level terrain elevation data block and the target texture data sub-packet are obtained by: according to the resolution characteristic value level $a$ of the user's processed terrain subarea, acquiring the terrain elevation data block of resolution level $a$ corresponding to that processed terrain subarea and recording it as the target-level terrain elevation data block of that processed terrain subarea; according to the texture detail characteristic value level $b$ of the user's processed terrain subarea, setting the utilization rate of the texture data sub-packet corresponding to that processed terrain subarea to $b/n_3$ and recording the sub-packet as the target texture data sub-packet of that processed terrain subarea, where $n_3$ represents the third preset number.
10. The VR image based flight simulation experience system of claim 1, wherein the calculating of the scheduling resource proportion and the computing resource proportion of the user includes the following steps:

$$P_i = \frac{D_i}{\sum_{k=1}^{U} D_k}, \qquad Q_i = \frac{C_i}{\sum_{k=1}^{U} C_k}$$

where $P_i$ represents the scheduling resource proportion of the $i$-th user, $D_i$ represents the relative scheduling resource of the $i$-th user, $U$ represents the number of users simultaneously using the flight simulation equipment, and $D_k$ represents the relative scheduling resource of the $k$-th user; $Q_i$ represents the computing resource proportion of the $i$-th user, $C_i$ represents the relative computing resource of the $i$-th user, and $C_k$ represents the relative computing resource of the $k$-th user.
CN202310213585.4A 2023-03-08 2023-03-08 Flight simulation experience system based on VR image Active CN115909858B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310213585.4A CN115909858B (en) 2023-03-08 2023-03-08 Flight simulation experience system based on VR image


Publications (2)

Publication Number Publication Date
CN115909858A true CN115909858A (en) 2023-04-04
CN115909858B CN115909858B (en) 2023-05-09

Family

ID=85739227

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310213585.4A Active CN115909858B (en) 2023-03-08 2023-03-08 Flight simulation experience system based on VR image

Country Status (1)

Country Link
CN (1) CN115909858B (en)

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0100097A2 (en) * 1982-07-30 1984-02-08 Honeywell Inc. Computer controlled imaging system
US6346938B1 (en) * 1999-04-27 2002-02-12 Harris Corporation Computer-resident mechanism for manipulating, navigating through and mensurating displayed image of three-dimensional geometric model
US20060077255A1 (en) * 2004-08-10 2006-04-13 Hui Cheng Method and system for performing adaptive image acquisition
US20090087029A1 (en) * 2007-08-22 2009-04-02 American Gnc Corporation 4D GIS based virtual reality for moving target prediction
CN102074049A (en) * 2011-03-01 2011-05-25 哈尔滨工程大学 Wide-range terrain scheduling simplifying method based on movement of viewpoint
CN202221566U (en) * 2011-07-08 2012-05-16 中国民航科学技术研究院 Flight programming system and verification platform of performance-based navigation
CN104766366A (en) * 2015-03-31 2015-07-08 东北林业大学 Method for establishing three-dimensional virtual reality demonstration
CN105139451A (en) * 2015-08-10 2015-12-09 中国商用飞机有限责任公司北京民用飞机技术研究中心 HUD (head-up display) based synthetic vision guiding display system
CN106446351A (en) * 2016-08-31 2017-02-22 郑州捷安高科股份有限公司 Real-time drawing-oriented large-scale scene organization and scheduling technology and simulation system
CN106530896A (en) * 2016-11-30 2017-03-22 中国直升机设计研究所 Virtual system for unmanned aerial vehicle flight demonstration
CN109064546A (en) * 2018-06-08 2018-12-21 东南大学 A kind of landform image data fast dispatch method and its system
CN110908510A (en) * 2019-11-08 2020-03-24 四川大学 Application method of oblique photography modeling data in immersive display equipment
CN112001993A (en) * 2020-07-14 2020-11-27 深圳市规划国土房产信息中心(深圳市空间地理信息中心) Multi-GPU (graphics processing Unit) city simulation system for large scene
WO2021113268A1 (en) * 2019-12-01 2021-06-10 Iven Connary Systems and methods for generating of 3d information on a user display from processing of sensor data
CN113506370A (en) * 2021-07-28 2021-10-15 自然资源部国土卫星遥感应用中心 Three-dimensional geographic scene model construction method and device based on three-dimensional remote sensing image
CN113516769A (en) * 2021-07-28 2021-10-19 自然资源部国土卫星遥感应用中心 Virtual reality three-dimensional scene loading and rendering method and device and terminal equipment
US11216663B1 (en) * 2020-12-01 2022-01-04 Pointivo, Inc. Systems and methods for generating of 3D information on a user display from processing of sensor data for objects, components or features of interest in a scene and user navigation thereon
US20220130145A1 (en) * 2019-12-01 2022-04-28 Pointivo Inc. Systems and methods for generating of 3d information on a user display from processing of sensor data for objects, components or features of interest in a scene and user navigation thereon

Also Published As

Publication number Publication date
CN115909858B (en) 2023-05-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant