CN207488953U - Virtual reality scenario simulator - Google Patents
- Publication number
- CN207488953U CN207488953U CN201721660460.2U CN201721660460U CN207488953U CN 207488953 U CN207488953 U CN 207488953U CN 201721660460 U CN201721660460 U CN 201721660460U CN 207488953 U CN207488953 U CN 207488953U
- Authority
- CN
- China
- Prior art keywords
- platform
- label
- prop
- processing unit
- virtual reality
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Links
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The utility model provides a virtual reality scenario simulator comprising a wearable device, an acquisition device and a processing unit. The acquisition device is arranged on a platform, the processing unit is placed outside the platform, and the processing unit is connected to the wearable device and the acquisition device respectively, in a wired or wireless manner. Simulation props arranged on the platform are used to simulate a variety of virtual scenes; the acquisition device obtains information from the scene, and the processing unit processes that information to generate a virtual scene picture. The virtual reality scenario simulator of the utility model can synchronize the simulation props on the platform with the virtual scene picture, so that a user can move about the platform while viewing the virtual scene picture and can physically touch the simulation props on the platform, giving the user an immersive, on-the-spot experience.
Description
Technical field
The utility model relates to the technical field of virtual reality (VR), and more particularly to a virtual reality scenario simulator.
Background technology
Virtual reality is an emerging interaction technology. In recent years it has continued to develop and mature, rapidly finding wide application across many industries, and it greatly enhances people's perceptual experience. VR technology combines simulation, intelligent sensors, graphical display and other technologies to create a virtual space that gives the user the experience of a real scene. However, existing VR technology relies on vision and hearing, capturing a semi-interactive experience from head movement or gesture capture, while tactile capture remains immature, which greatly limits the degree of interactivity the user experiences.
Utility model content
The main purpose of the utility model is to provide a virtual reality scenario simulator, intended to solve the technical problem that existing virtual scene interaction equipment is largely limited to visual and auditory interaction.
To achieve the above object, the utility model provides a virtual reality scenario simulator. The virtual reality scenario simulator includes a wearable device, an acquisition device and a processing unit. The acquisition device is arranged on a platform, the processing unit is placed outside the platform, and the processing unit is connected to the wearable device and the acquisition device respectively, in a wired or wireless manner, wherein:
The platform includes spherical props, non-spherical props and a label layer. The label layer is arranged on the platform, and the spherical props and non-spherical props are placed on the label layer. The non-spherical props and spherical props are each provided with a first label. The label layer comprises a matrix-type structure formed from a plurality of first labels, each first label serving as one base unit; the base units radiate outward from the central point of the platform, taken as the origin, to form the matrix-type structure.
The wearable device includes VR glasses, a data interaction device and a second label. A second reader is provided at the VR glasses, and the data interaction device is connected to the VR glasses and to the processing unit respectively, in a wired or wireless manner.
The acquisition device includes a hub, data lines and first readers. The first readers are arranged at the corners of the platform and are electrically connected to the processing unit through the hub and the data lines.
The processing unit includes a wireless transceiver, an input-output unit, a display, a processor and a power supply. The processor is electrically connected to the power supply and the input-output unit respectively; the input-output unit is also electrically connected to the display and the wireless transceiver respectively; and the wireless transceiver is paired and connected with the wearable device.
Preferably, the platform is provided with canyon terrain and flat terrain, and the spherical props are placed on the label layer at the canyon terrain.
Preferably, each non-spherical prop is composed of multiple faces, and each face of the non-spherical prop is provided with one first label.
Preferably, the data interaction device is further provided with a battery, and the battery is used to supply electric power to the wearable device.
Preferably, the processing unit further includes a memory electrically connected to the processor, and the memory is used to store 3D images of virtual liquid objects, 3D images of virtual objects, ambient sound, and simulation algorithm schemes.
Preferably, the processing unit further includes a tag reader-writer electrically connected to the processor, and the tag reader-writer is used to write different identification information into the first labels and the second label respectively.
Compared with the prior art, the virtual reality scenario simulator of the utility model has the following advantageous effects: it approaches the experience from the three aspects of vision, hearing and touch. On the basis of existing VR, simulation props are added to simulate virtual scenes under various environments, and the simulation props on the platform are synchronized with the virtual scene picture. This addresses the current situation in which virtual scene interaction in the prior art is largely limited to vision and hearing, and improves the realism of the user's experience.
Description of the drawings
Fig. 1 is the application schematic diagram of the utility model virtual reality scenario simulator;
Fig. 2 is the schematic diagram of virtual scene picture;
Fig. 3 is the structure diagram of platform;
Fig. 4 is the structure diagram of label layer in Fig. 3;
Fig. 5 is the structure diagram of wearable device in Fig. 1;
Fig. 6 is the structure diagram of harvester in Fig. 1;
Fig. 7 is the structure diagram of processing unit in Fig. 1.
The realization, functional characteristics and advantages of the aim of the utility model will be further described in the embodiments with reference to the accompanying drawings.
Specific embodiment
It should be understood that the specific embodiments described herein are used only to explain the utility model and are not intended to limit it.
Referring to Figs. 1 and 2, Fig. 1 is the application schematic diagram of the virtual reality scenario simulator of the utility model, and Fig. 2 is the schematic diagram of the virtual scene picture. The virtual reality scenario simulator includes a wearable device 2, an acquisition device 3 and a processing unit 4. The wearable device 2 is worn on a user 5, who can move about on a platform 1; the acquisition device 3 is arranged on the platform 1, and the processing unit 4 is placed outside the platform 1. The processing unit 4 is connected to the wearable device 2 and the acquisition device 3 respectively, in a wired or wireless manner.
The platform 1 is used to simulate the virtual scene picture 6 (shown in Fig. 2). The acquisition device 3 obtains identification information within the platform 1 and transmits it, in a wired or wireless manner, to the processing unit 4. After processing, the processing unit 4 obtains the virtual scene picture 6 and transmits it back, in a wired or wireless manner, to the wearable device 2, which displays the virtual scene picture 6 to the user 5.
Referring to Figs. 3 and 4, Fig. 3 is the structure diagram of the platform, and Fig. 4 is the structure diagram of the label layer in Fig. 3. The platform 1 includes spherical props 11, non-spherical props 12 and a label layer 13. The platform 1 is provided with canyon terrain 14 and flat terrain 15. The label layer 13 is arranged on the platform 1; the spherical props 11 are placed on the label layer 13 at the canyon terrain 14, and the non-spherical props 12 are placed on the label layer 13 at the flat terrain 15. The surface of each spherical prop 11 is provided with one first label 16 (for example, a passive RFID tag of the prior art). Each non-spherical prop 12 may be composed of multiple faces, and each face of the non-spherical prop 12 is provided with one first label 16. The label layer 13 comprises a matrix-type structure formed from a plurality of first labels 16 (shown in Fig. 4), each first label 16 serving as one base unit 17. The area covered by each base unit 17 is determined by the readable range of the acquisition device 3, and the base units 17 radiate outward from the central point of the platform 1, taken as the origin, to form the matrix-type structure.
The spherical props 11 are used to simulate virtual liquid objects 61 in the virtual scene picture 6 (such as water and marsh), and the non-spherical props 12 are used to simulate virtual objects 62 in the virtual scene picture 6 (such as mountains, trees and houses). Different identification information is preset in each first label 16. The acquisition device 3 collects the identification information corresponding to the first labels 16 and transmits it to the processing unit 4; the processing unit 4 processes the collected identification information to obtain the 3D images of the virtual liquid objects 61 and virtual objects 62 in the virtual scene picture 6. The label layer 13 is used to locate the coordinates of each point on the platform 1. Because the label layer 13 adopts a matrix-type structure, it provides reference points for accurately locating the coordinates of each point on the platform 1 and forms a coordinate system with the center of the platform 1 as the origin; when the user 5 enters the platform 1, the coordinate position of the user 5 is determined according to the generated coordinate system.
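As a minimal sketch of how the user's coordinate position might be resolved from label reads: assume each read reports a power level in dBm (the patent names power level information but specifies no units or algorithm), and take the user's position to be the base unit read most strongly. All names and data shapes here are hypothetical.

```python
def locate_user(detections, layer):
    """Resolve the user's coordinate as the base unit whose label was
    read with the strongest power level.

    detections: {label_id: power_dbm} -- hypothetical reader output.
    layer: {label_id: (x, y)} -- the platform-centered coordinate map.
    """
    strongest = max(detections, key=detections.get)  # highest dBm wins
    return layer[strongest]
```

In practice a real system would likely interpolate between several reads rather than snap to one cell; this snap-to-cell version is only the simplest consistent reading of the matrix-structure description.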
Referring to Fig. 5, Fig. 5 is the structure diagram of the wearable device in Fig. 1. The wearable device 2 includes VR glasses 21 (for example, an all-in-one VR headset of the prior art), a data interaction device 22 and a second label 23 (for example, a passive RFID tag of the prior art). The VR glasses 21 are worn on the head of the user 5, the data interaction device 22 is worn on the back of the user 5, and the second label 23 is worn on a foot of the user 5 (shown in Fig. 5). The data interaction device 22 is further provided with a battery 221; the VR glasses 21 are provided with a second reader 24 (for example, an RFID tag reader with a linearly polarized antenna of the prior art); and the data interaction device 22 is connected to the VR glasses 21 and to the processing unit 4 respectively, in a wired or wireless manner.
The battery 221 supplies electric power to each device of the wearable device. The VR glasses 21 display the virtual scene picture 6, and the second label 23 worn on the user's leg is used to locate the position of the user 5. The second reader 24 on the VR glasses 21 obtains the identification information and power level information of the first labels 16 in front of the user 5 and transmits them, through the data interaction device 22, to the processing unit 4 for processing. The processing unit 4 integrates this information to obtain a viewing-angle picture 63 (the portion of the virtual scene picture 6 observable by the user 5), and transmits the viewing-angle picture 63 back, in a wired or wireless manner, to the data interaction device 22, which forwards it to the VR glasses 21 for display.
Referring to Fig. 6, Fig. 6 is the structure diagram of the acquisition device in Fig. 1. The acquisition device 3 includes a hub 33, data lines 32 and first readers 31 (for example, RFID readers with circularly polarized antennas of the prior art). The first readers 31 are arranged at the corners of the platform 1 and are electrically connected to the processing unit 4 through the hub 33 and the data lines 32.
The first readers 31 obtain the identification information corresponding to the first labels 16. The first readers 31 are also provided with a transceiving power-tracing function, which records the power level information of the signal received by a first reader 31 when it identifies a first label 16 or the second label 23. The identification information and power level information obtained by the first readers 31 are aggregated by the hub 33 and transmitted to the processing unit 4 through the data lines 32.
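The power-tracing function implies estimating range from received power. A common model for this, offered here only as an assumption since the patent specifies no ranging method, is log-distance path loss; the reference power (at 1 m) and path-loss exponent below are illustrative and would need per-site calibration in a real deployment.

```python
def rssi_to_distance(rssi_dbm, ref_power_dbm=-30.0, path_loss_exp=2.0):
    """Rough distance estimate in meters from a received power level,
    using a log-distance path-loss model:
        rssi = ref_power - 10 * n * log10(d)
    ref_power_dbm is the assumed power at 1 m; n is the exponent
    (2.0 approximates free space). Both constants are hypothetical.
    """
    return 10 ** ((ref_power_dbm - rssi_dbm) / (10 * path_loss_exp))
```

With these constants, a reading equal to the reference power corresponds to 1 m, and each additional 20 dB of loss multiplies the estimated distance by ten.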
Referring to Fig. 7, Fig. 7 is the structure diagram of the processing unit in Fig. 1. The processing unit 4 includes a wireless transceiver 41, an input-output unit 42, a display 43, a processor 44, a memory 45, a power supply 46 and a tag reader-writer 47. The processor 44 is electrically connected to the memory 45, the power supply 46 and the input-output unit 42 respectively; the input-output unit 42 is also electrically connected to the tag reader-writer 47, the display 43 and the wireless transceiver 41 respectively; and the wireless transceiver 41 is paired and connected with the wearable device 2.
The tag reader-writer 47 is used to write different identification information into the first labels 16 and the second label 23 respectively. The display 43 shows the simulation status of the virtual scene picture 6. The memory 45 stores the 3D images of the virtual liquid objects 61, the 3D images of the virtual objects 62, ambient sound, and simulation algorithm schemes. The processor 44 extracts the 3D images of the virtual liquid objects 61 and virtual objects 62 corresponding to the identification information, generates a coordinate system from the power level information of the first labels 16 obtained by the first readers 31 according to a positioning simulation algorithm scheme, and imports the 3D images of the virtual liquid objects 61 and virtual objects 62 into the coordinate system according to the simulation algorithm scheme to form the virtual scene picture 6. The wireless transceiver 41 receives and transmits data signals, the power supply 46 supplies electric power to each device, and the input-output unit 42 is used for information input and output. The virtual liquid objects 61, virtual objects 62, ambient sound and simulation algorithm schemes pre-stored in the memory 45 can be updated or upgraded through the input-output unit 42.
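The processor's pipeline, looking up a stored 3D asset for each identified label and placing it at that label's coordinate in the platform-centered system, can be sketched as follows. The dictionary shapes, asset names and the idea of a flat scene list are all hypothetical simplifications of the patent's "simulation algorithm scheme".

```python
def compose_scene(readings, asset_store, layer):
    """Assemble a minimal virtual-scene description: for each identified
    first label, place the corresponding stored 3D asset at the label's
    coordinate in the platform-centered coordinate system.

    readings: {label_id: power_dbm} from the first readers (hypothetical),
    asset_store: {label_id: asset_name} standing in for the memory's
    3D images, layer: {label_id: (x, y)} from the label layer.
    """
    scene = []
    for label_id in readings:
        if label_id in asset_store and label_id in layer:
            scene.append({"asset": asset_store[label_id],
                          "position": layer[label_id]})
    return scene
```

Labels without a stored asset or a known coordinate are simply skipped, so a partially tagged platform still yields a consistent (if sparser) scene.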
The embodiments of the utility model provide a virtual reality scenario simulator which, on the basis of existing VR technology, adds simulation props to simulate virtual scenes under various environments and synchronizes the simulation props on the platform with the virtual scene picture. The user can move about the platform while viewing the virtual scene picture, and can physically touch the simulation props on the platform, giving the user an immersive, on-the-spot experience. By combining the simulation props with the virtual scene, the utility model synchronizes the two from the three aspects of vision, hearing and touch, thereby improving the realism of the user's experience in the virtual scene.
The above are only preferred embodiments of the utility model and do not limit its patent scope. Any equivalent structural or functional transformation made on the basis of the specification and drawings of the utility model, whether used directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the utility model.
Claims (6)
1. A virtual reality scenario simulator, characterized in that the virtual reality scenario simulator includes a wearable device, an acquisition device and a processing unit; the acquisition device is arranged on a platform, the processing unit is placed outside the platform, and the processing unit is connected to the wearable device and the acquisition device respectively, in a wired or wireless manner, wherein:
the platform includes spherical props, non-spherical props and a label layer; the label layer is arranged on the platform, the spherical props and non-spherical props are placed on the label layer, and the non-spherical props and spherical props are each provided with a first label; the label layer comprises a matrix-type structure formed from a plurality of first labels, each first label serving as one base unit, the base units radiating outward from the central point of the platform, taken as the origin, to form the matrix-type structure;
the wearable device includes VR glasses, a data interaction device and a second label; a second reader is provided at the VR glasses, and the data interaction device is connected to the VR glasses and to the processing unit respectively, in a wired or wireless manner;
the acquisition device includes a hub, data lines and first readers; the first readers are arranged at the corners of the platform and are electrically connected to the processing unit through the hub and the data lines;
the processing unit includes a wireless transceiver, an input-output unit, a display, a processor and a power supply; the processor is electrically connected to the power supply and the input-output unit respectively, the input-output unit is also electrically connected to the display and the wireless transceiver respectively, and the wireless transceiver is paired and connected with the wearable device.
2. The virtual reality scenario simulator according to claim 1, characterized in that the platform is provided with canyon terrain and flat terrain, and the spherical props are placed on the label layer at the canyon terrain.
3. The virtual reality scenario simulator according to claim 1, characterized in that each non-spherical prop is composed of multiple faces, and each face of the non-spherical prop is provided with one first label.
4. The virtual reality scenario simulator according to claim 1, characterized in that the data interaction device is further provided with a battery, and the battery is used to supply electric power to the wearable device.
5. The virtual reality scenario simulator according to claim 1, characterized in that the processing unit further includes a memory electrically connected to the processor, and the memory is used to store 3D images of virtual liquid objects, 3D images of virtual objects, ambient sound, and simulation algorithm schemes.
6. The virtual reality scenario simulator according to claim 1, characterized in that the processing unit further includes a tag reader-writer electrically connected to the processor, and the tag reader-writer is used to write different identification information into the first labels and the second label respectively.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201721660460.2U CN207488953U (en) | 2017-12-04 | 2017-12-04 | Virtual reality scenario simulator |
PCT/CN2018/074245 WO2019109492A1 (en) | 2017-12-04 | 2018-01-26 | Virtual reality scene simulation device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN207488953U true CN207488953U (en) | 2018-06-12 |
Family
ID=62457554
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201721660460.2U Expired - Fee Related CN207488953U (en) | 2017-12-04 | 2017-12-04 | Virtual reality scenario simulator |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN207488953U (en) |
WO (1) | WO2019109492A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109933207A (en) * | 2019-04-02 | 2019-06-25 | 黄立新 | The tactile of reality environment generates analogy method |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113433837B (en) * | 2021-06-15 | 2023-03-21 | 浙江水利水电学院 | Indoor design method and system based on VR |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101435872B (en) * | 2008-12-23 | 2011-09-28 | 郑之敏 | RFID matrix distributed personnel position monitoring system and monitoring method thereof |
DE102010037195A1 (en) * | 2010-08-27 | 2012-03-01 | Benedikt Hieronimi | System for detecting radio frequency transceivers and their uses |
ES2656868T3 (en) * | 2011-10-05 | 2018-02-28 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Portable device, virtual reality system and method |
US9779550B2 (en) * | 2012-10-02 | 2017-10-03 | Sony Corporation | Augmented reality system |
CN105183142B (en) * | 2014-06-13 | 2018-02-09 | 中国科学院光电研究院 | A kind of digital information reproducing method of utilization space position bookbinding |
US9599821B2 (en) * | 2014-08-08 | 2017-03-21 | Greg Van Curen | Virtual reality system allowing immersion in virtual space to consist with actual movement in actual space |
CN105373224B (en) * | 2015-10-22 | 2016-06-22 | 山东大学 | A kind of mixed reality games system based on general fit calculation and method |
CN107167132A (en) * | 2016-03-07 | 2017-09-15 | 上海积杉信息科技有限公司 | Indoor locating system based on augmented reality and virtual reality |
- 2017-12-04: CN CN201721660460.2U patent/CN207488953U/en not_active Expired - Fee Related
- 2018-01-26: WO PCT/CN2018/074245 patent/WO2019109492A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2019109492A1 (en) | 2019-06-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7000555B2 (en) | Augmented reality display method, attitude information determination method and equipment | |
US9286725B2 (en) | Visually convincing depiction of object interactions in augmented reality images | |
EP2579128B1 (en) | Portable device, virtual reality system and method | |
CN105279750B (en) | It is a kind of that guide system is shown based on the equipment of IR-UWB and image moment | |
CN108540542B (en) | Mobile augmented reality system and display method | |
US11769306B2 (en) | User-exhibit distance based collaborative interaction method and system for augmented reality museum | |
CN106850528A (en) | Media system and method | |
CN105608746A (en) | Method for virtual realizing of reality | |
KR102012835B1 (en) | An augmented reality system capable of manipulating an augmented reality object using three-dimensional attitude information and recognizes handwriting of character | |
CN207488953U (en) | Virtual reality scenario simulator | |
CN109871912A (en) | Virtual reality scenario simulator and method | |
CN105488312A (en) | Game system | |
CN108615260A (en) | The method and device that shows of augmented reality digital culture content is carried out under a kind of exception actual environment | |
CN111540056A (en) | AR intelligent navigation method and AR intelligent navigation system | |
CN208077438U (en) | Virtual reality scenario dynamic analog device | |
CN109992097A (en) | Virtual reality scenario dynamic analog device and method | |
US11907434B2 (en) | Information processing apparatus, information processing system, and information processing method | |
CN110389691A (en) | Using the augmented reality equipment of three-dimensional scenic computer graphics principle | |
Lee | Interactive Game using the Augmented Reality Technique | |
Pan et al. | Overview of Augmented Reality Technology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 20180612 Termination date: 20211204 |