CN103929456A - Far-end space simulation method - Google Patents
- Publication number: CN103929456A (application CN201310014877.1A; granted as CN103929456B)
- Authority: CN (China)
- Legal status: Granted
Landscapes
- Processing Or Creating Images (AREA)
Abstract
The invention relates to a far-end (remote) space simulation method, comprising the following steps: a space information collector (30) is arranged at a remote destination, together with a central server (10) and a space simulation restoring terminal (40). The central server (10) transmits the environment information data collected by the space information collector (30) over a communication network (20) to a second memory (401) of the space simulation restoring terminal (40). A central controller (402) then decodes and restores the environment information data held in the second memory (401) by time-shared, classified, sub-channel, format-sorted processing. Compared with the prior art, the method can reproduce the spatial state of a remote destination, or a hypothetical spatial state, at the place where the user is located, satisfying the user's wish to experience vividly the various events occurring at the remote destination and giving the user the sensation of having travelled through time and space to the destination.
Description
Technical field
The present invention relates to data transmission, and in particular to a method of simulating a remote spatial state.
Background technology
At present, with scientific and technological progress, people's work and life are increasingly rich and varied, and their expectations of quality of work and life are increasingly high. One important demand is the wish to learn quickly, accurately and comprehensively of events and news occurring around the world in sport, culture, politics, nature, the humanities and other fields. Today, people learn of such events through the sound, text, pictures and images provided by media such as radio, newspapers, television and the Internet. The problem with this is that such sound, text, pictures and images can convey only partial, one-sided information about the scene; they cannot give people a vivid, overall impression of the spatial environment at the scene, so people cannot feel as if they were personally present.
Summary of the invention
The technical problem to be solved by the present invention is to overcome the above deficiencies of the prior art and to propose a method of simulating a remote spatial state. The method can reproduce the spatial state of a remote destination, or an imaginary spatial state, at the place where the user is located, satisfying the user's wish to experience vividly whatever occurs at the remote destination and giving the user the sensation of having travelled through time and space to be present in person.
The present invention solves the technical problem through the following technical solution:
A method of simulating a remote spatial state is proposed, comprising the following steps:
A. A space information collector is arranged at the remote destination, the space information collector comprising at least a digital clock, a video sensor, an audio sensor, an encoder and a first memory. Under the timing of the digital clock, each sensor cooperatively collects its corresponding category of environmental information from the space environment of the remote destination and sends it to the encoder. The environmental information comprises at least video information and audio information, the video information covering horizontal-direction images of at least a 270-degree viewing angle around the destination. The encoder applies time-shared, classified, sub-channel, cellular coding to the various information collected by the sensors and sends it to the first memory for storage.
B. A central server is provided. The central server establishes a communication link with the first memory through a communication network, and the environmental information data stored in the first memory are sent to the central server.
C. A space simulation restoring terminal is provided. The terminal is a closed room device capable of holding one or more people, and is equipped with at least a second memory, a central controller, a display screen, a combination audio system and a lighting device. The second memory and the central controller establish a communication link with the central server through the communication network.
D. The central server transmits the received environmental information data through the communication network to the second memory of the space simulation restoring terminal. The central controller then performs time-shared, classified, sub-channel, cellular decoding and restoration on the environmental information data received in the second memory: the decoded video information collected by the video sensor is sent to the display screen, and the decoded audio information collected by the audio sensor is sent to the combination audio system.
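The patent does not specify a concrete coding scheme. As a purely illustrative sketch (not part of the patent; the frame layout, channel table and function names are all assumptions), the time-shared, classified, sub-channel coding of step A can be pictured as tagging each sensor sample with a shared clock tick and a sub-channel identifier before it is stored in the first memory:

```python
# Hypothetical sketch of step A: each sensor sample is tagged with the
# shared clock tick (time-sharing) and a sub-channel id (classification),
# then framed and appended to the "first memory" in tick order.

CHANNELS = {"video": 0, "audio": 1}  # sensor class -> sub-channel id (assumed)

def encode_samples(samples):
    """samples: list of (tick, sensor_class, payload_bytes).
    Returns framed records ordered by clock tick (time-shared)."""
    frames = []
    for tick, sensor_class, payload in sorted(samples, key=lambda s: s[0]):
        channel = CHANNELS[sensor_class]
        # Assumed frame layout: 4-byte tick, 1-byte channel, 2-byte length, payload.
        header = (tick.to_bytes(4, "big") + bytes([channel])
                  + len(payload).to_bytes(2, "big"))
        frames.append(header + payload)
    return frames

first_memory = encode_samples([
    (2, "audio", b"\x10\x11"),
    (1, "video", b"\xaa\xbb\xcc"),
])
```

The earlier video sample (tick 1) is framed first, so channels sampled under the same clock can later be restored in step.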
Further, in step A the space information collector also comprises a temperature sensor, a humidity sensor, a smell sensor and a light sensor, and the environmental information also comprises temperature information, humidity information, smell information and illumination information. In step C, the space simulation restoring terminal is also provided with a temperature restoring and adjusting device, a humidity restoring and adjusting device, a smell restoring and adjusting device and a lighting device. In step D, the central controller performs time-shared, classified, sub-channel, cellular decoding and restoration on the environmental information data received in the second memory: the decoded video information collected by the video sensor is sent to the display screen, the decoded audio information collected by the audio sensor to the combination audio system, the decoded temperature information collected by the temperature sensor to the temperature restoring and adjusting device, the decoded humidity information collected by the humidity sensor to the humidity restoring and adjusting device, the decoded smell information collected by the smell sensor to the smell restoring and adjusting device, and the decoded illumination information collected by the light sensor to the lighting device.
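As an illustrative counterpart to the decode-and-route processing of step D (a sketch under an assumed frame layout, not the patent's actual implementation), the central controller can be pictured as demultiplexing the framed records and handing each payload to the restoring device registered for its sub-channel:

```python
# Hypothetical sketch of step D: parse each frame (assumed layout: 4-byte
# tick, 1-byte channel, 2-byte length, payload) and dispatch the payload
# to the restoring device registered for that sub-channel.

def decode_and_dispatch(frames, devices):
    """devices: dict mapping sub-channel id -> callable(tick, payload)."""
    for frame in frames:
        tick = int.from_bytes(frame[:4], "big")
        channel = frame[4]
        length = int.from_bytes(frame[5:7], "big")
        devices[channel](tick, frame[7:7 + length])

display_log, temp_log = [], []
devices = {
    0: lambda t, p: display_log.append((t, p)),  # video -> display screen
    2: lambda t, p: temp_log.append((t, p)),     # temperature -> adjusting device
}
video_frame = (1).to_bytes(4, "big") + bytes([0]) + (3).to_bytes(2, "big") + b"\xaa\xbb\xcc"
temp_frame = (1).to_bytes(4, "big") + bytes([2]) + (1).to_bytes(2, "big") + b"\x15"
decode_and_dispatch([video_frame, temp_frame], devices)
```

Registering one callable per channel is what lets the same decoder serve any mix of output devices (display, audio, temperature, humidity, smell, lighting).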
Further, in step A, the environmental information of the remote destination collected by the space information collector may be replaced by imaginary spatial-state information simulated by general-purpose computer simulation technology. The imaginary spatial-state information has the same composition as the environmental information, comprising video, temperature, humidity, audio, smell and illumination information. After time-shared, classified, sub-channel, cellular coding, the imaginary spatial-state information is likewise sent through the communication network to the first memory for storage.
In the present invention, the environmental information data received by the second memory may also be a comprehensive synthesis of the environmental information of the remote destination collected by the space information collector and imaginary spatial-state information simulated by computer simulation technology.
In the present invention, the central server is a single server, a server cluster or a cloud computing centre.
In the present invention, the central controller is a multimedia computer, a hand-held mobile terminal or a dedicated controller.
In the present invention, the communication network is a broadcast network, a television network, a mobile communication network, a dedicated communication network, a satellite communication network or the Internet.
In the present invention, the environmental information data of one remote destination can be sent to several space simulation restoring terminals simultaneously, and one space simulation restoring terminal can receive and restore the environmental information data of several remote destinations.
In the present invention, the imaginary spatial-state information data simulated at one place can be sent to several space simulation restoring terminals simultaneously, and one space simulation restoring terminal can receive and restore imaginary spatial-state information data simulated at several places.
Compared with the prior art, the method of simulating a remote spatial state of the present invention has the following technical effects: 1. A person in the space simulation restoring terminal can experience what occurs at the remote destination as if personally present, including live images, sound, temperature, smell and so on; for a remote sports event or large-scale theatrical performance, for example, an atmosphere much like that of the venue can be achieved without attending the venue. 2. According to people's preferences and needs, imaginary spatial states such as ancient times, foreign landscapes or a mysterious world can be simulated and presented to people in the space simulation restoring terminal, giving them the enjoyable impression of travelling through time and space and satisfying different spiritual needs.
Brief description of the drawings
Fig. 1 is a principle block diagram of embodiment one of the simulation of a remote spatial state according to the present invention;
Fig. 2 is a principle block diagram of embodiment two of the simulation of a remote spatial state according to the present invention;
Fig. 3 is a structural block diagram of the space information collector 30;
Fig. 4 is a structural block diagram of the space simulation restoring terminal 40.
Embodiments
The invention is described in further detail below in conjunction with the preferred embodiments shown in the accompanying drawings.
Embodiment one:
Embodiment one of the method of simulating a remote spatial state, shown in Fig. 1, comprises the following steps:
A. A space information collector 30 is arranged at the remote destination. The remote destination can be any place that people like and yearn for, for example a sports venue or the site of a large-scale theatrical performance. As shown in Fig. 3, the space information collector 30 comprises a digital clock 301, a video sensor 302, a temperature sensor 303, a humidity sensor 304, an audio sensor 305, a smell sensor 306, a light sensor 307, an encoder 308 and a first memory 309. Under the timing of the digital clock 301, each sensor cooperatively collects its corresponding category of environmental information from the space environment of the remote destination 106 and sends it to the encoder 308. The environmental information comprises video, temperature, humidity, audio, smell and illumination information, the video information covering horizontal-direction images of at least a 270-degree viewing angle around the destination. The encoder 308 applies time-shared, classified, sub-channel, cellular coding to the various information collected by the sensors and sends it to the first memory 309 for storage.
B. A central server 10 is provided. The central server 10 establishes a communication link with the first memory 309 through a communication network 20, and the environmental information data stored in the first memory 309 are sent to the central server 10. The central server 10 may be a single server, a server cluster or a cloud computing centre, determined by the volume of environmental information data and the service demand.
C. A space simulation restoring terminal 40 is provided; it is a closed room device capable of holding one or more people. As shown in Fig. 4, the terminal 40 is provided with a second memory 401, a central controller 402, a display screen 403, a temperature restoring and adjusting device 404, a humidity restoring and adjusting device 405, a combination audio system 406, a smell restoring and adjusting device 407 and a lighting device 408. The second memory 401 and the central controller 402 establish a communication link with the central server 10 through the communication network 20. The central controller 402 may be a multimedia computer, a hand-held mobile terminal, a dedicated controller, or other equipment capable of computation, control and data transmission.
D. The central server 10 transmits the received environmental information data through the communication network 20 to the second memory 401 of the space simulation restoring terminal 40. The central controller 402 then performs time-shared, classified, sub-channel, cellular decoding and restoration on the environmental information data received in the second memory 401: the decoded video information collected by the video sensor 302 is sent to the display screen 403, the decoded temperature information collected by the temperature sensor 303 to the temperature restoring and adjusting device 404, the decoded humidity information collected by the humidity sensor 304 to the humidity restoring and adjusting device 405, the decoded audio information collected by the audio sensor 305 to the combination audio system 406, the decoded smell information collected by the smell sensor 306 to the smell restoring and adjusting device 407, and the decoded illumination information collected by the light sensor 307 to the lighting device 408.
Because every sensor of the space information collector 30 collects information under the timing of the same digital clock, the video, temperature, humidity, audio, smell and illumination data are coded by time-shared, classified, sub-channel, cellular coding, then decoded and restored by the central controller 402 of the space simulation restoring terminal 40 in the same time-shared, classified, sub-channel, cellular manner and sent to the corresponding devices. In this way the spatial state of the remote destination where the space information collector 30 is installed is reproduced in the space simulation restoring terminal 40, so that the people in the terminal feel as if they were at the remote scene. For example, if a space information collector 30 is installed in a gymnasium, people in a space simulation restoring terminal 40 far from that gymnasium can, without entering it, effectively watch the sports events in that gymnasium as if on site.
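The synchronisation argument above, that every sensor samples under the same digital clock so the restored channels stay aligned, can be sketched roughly as follows (purely illustrative; the sensor names and `acquire` function are assumptions, not from the patent):

```python
# Hypothetical sketch: one digital clock drives every sensor, so samples
# taken in the same tick share a timestamp and can later be restored in step.

def acquire(sensors, ticks):
    """sensors: dict name -> zero-argument read function.
    Returns (tick, name, value) tuples, one sample per sensor per tick."""
    samples = []
    for tick in range(ticks):          # the shared clock advances once...
        for name, read in sensors.items():
            samples.append((tick, name, read()))  # ...and every sensor samples
    return samples

sensors = {"video": lambda: "frame", "temperature": lambda: 21.5}
samples = acquire(sensors, ticks=2)
```

Any channel can then be replayed against any other simply by matching ticks, which is what keeps image, sound, temperature and the rest coherent at the restoring terminal.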
In the present invention, the communication network 20 is a broadcast network, a television network, a mobile communication network, a dedicated communication network, a satellite communication network or the Internet, or a mixture or integrated use of these networks.
In the present invention, the environmental information data of one remote destination can be sent to several space simulation restoring terminals 40 simultaneously; as described above, the sports events in the gymnasium can be reproduced in multiple terminals 40 at the same time. Conversely, one space simulation restoring terminal 40 can receive and restore the environmental information data of several remote destinations. For example, a space information collector 30 is installed in a gymnasium at location A, and another space information collector 30 in a performance venue at location B, both connected through the central server 10 to one space simulation restoring terminal 40 at location C. People in that terminal are then effectively present at the sports event in the gymnasium at location A while it is being held, and at the theatrical performance in the venue at location B while that is being held.
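The one-to-many and many-to-one topologies just described (one destination feeding several terminals, one terminal subscribing to several destinations) can be pictured with a toy relay sketch; the class and method names are illustrative assumptions, not from the patent:

```python
# Hypothetical sketch: the central server relays each source's data to all
# subscribed terminals, and a terminal may subscribe to several sources.

class CentralServer:
    def __init__(self):
        self.subscribers = {}  # source name -> list of terminal inboxes

    def subscribe(self, source, inbox):
        self.subscribers.setdefault(source, []).append(inbox)

    def publish(self, source, data):
        for inbox in self.subscribers.get(source, []):
            inbox.append((source, data))

server = CentralServer()
terminal_1, terminal_2 = [], []             # inboxes of two restoring terminals
server.subscribe("gymnasium_A", terminal_1)
server.subscribe("gymnasium_A", terminal_2)  # one source -> many terminals
server.subscribe("venue_B", terminal_1)      # one terminal -> many sources
server.publish("gymnasium_A", b"match")
server.publish("venue_B", b"gala")
```

Terminal 1 ends up with data from both locations A and B, while terminal 2 receives only location A, matching the fan-out and fan-in arrangements described above.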
In the present invention, the collection, coding and storage of data, and the data communication and decoding/restoration between the central server, the central controller and the second memory, all use mature technologies and are not described further here.
Embodiment two:
Embodiment two is broadly the same as embodiment one, as shown in Fig. 2. The difference is that embodiment one reproduces the real spatial state of a concrete destination, whereas embodiment two reproduces a simulated imaginary spatial state. In this embodiment no space information collector 30 needs to be installed, and all environmental information data are produced by computer simulation. In the present embodiment, the imaginary spatial-state information has the same composition as the environmental information, comprising video, temperature, humidity, audio, smell and illumination information. After time-shared, classified, sub-channel, cellular coding, the imaginary spatial-state information is sent through the communication network 20 to the first memory 309 for storage.
In the present embodiment, the imaginary spatial-state information data simulated at one place can be sent to several space simulation restoring terminals 40 simultaneously; for example, a scene of ancient times is simulated at one place and then decoded and restored by multiple different terminals 40, so that the people in those terminals feel as if they had travelled back through time to the ancient world. Conversely, one space simulation restoring terminal 40 can receive and restore imaginary spatial-state information simulated at several places; for example, ancient times are simulated at a first place, foreign landscapes at a second and a mysterious world at a third, all connected to the same terminal 40, so that the people in that terminal can choose according to their own preferences, being in ancient times one moment, among foreign landscapes the next, and in a mysterious world the next.
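Switching a single terminal between several simulated spatial states, as described above, amounts to selecting which source's stream is currently handed to the decoder. A minimal sketch (class, method and source names are all assumed for illustration):

```python
# Hypothetical sketch: a terminal holds streams from several simulated
# sources and restores only the one currently selected by the user.

class RestoringTerminal:
    def __init__(self, streams):
        self.streams = streams       # source name -> environment data stream
        self.current = None

    def select(self, source):
        self.current = source
        return self.streams[source]  # data handed to the decoder for restoration

terminal = RestoringTerminal({
    "ancient_times": "ancient scene data",
    "foreign_landscape": "landscape data",
    "mysterious_world": "mystery data",
})
now_playing = terminal.select("foreign_landscape")
```

Each call to `select` simply swaps the active stream, so the occupant can jump between simulated worlds at will.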
Embodiment three:
Embodiment three is broadly the same as embodiments one and two. The difference is that in this embodiment the environmental information data received by the second memory 401 can be a comprehensive synthesis of the environmental information of the remote destination collected by the space information collector 30 and imaginary spatial-state information produced by computer simulation. That is to say, among the various environmental information in the space simulation restoring terminal 40, some information comes from the real spatial state of a concrete destination while other information is simulated imaginary spatial-state information. For example, the audio, temperature and smell information may come from a scene at a first place fitted with a space information collector 30, while the video, humidity and illumination information are produced by computer simulation at a second place, the whole then undergoing time-shared, classified, sub-channel, cellular decoding and restoration. Alternatively, the dynamic video within the video information may come from a scene at a first place fitted with a space information collector 30, while the static scenery within the video information is produced by computer simulation at a second place, the whole then being decoded and restored by the space simulation restoring terminal 40.
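The per-channel synthesis of embodiment three, with some channels real and some simulated, can be pictured as a channel-level merge performed before decoding; the channel-to-origin mapping below is illustrative only, not a scheme the patent specifies:

```python
# Hypothetical sketch of embodiment three: build the terminal's input by
# taking each environmental channel from either the real collector's data
# or the computer-simulated data, per a channel -> origin mapping.

def synthesize(real, simulated, origin):
    """real/simulated: dict channel -> data; origin: channel -> 'real' | 'sim'."""
    return {ch: (real if src == "real" else simulated)[ch]
            for ch, src in origin.items()}

real = {"audio": "stadium sound", "temperature": 18.0, "video": "live feed"}
simulated = {"video": "rendered scenery", "humidity": 0.4, "illumination": 800}
merged = synthesize(real, simulated, {
    "audio": "real", "temperature": "real",                    # from the collector
    "video": "sim", "humidity": "sim", "illumination": "sim",  # computer-simulated
})
```

The merged dictionary is what the restoring terminal would then decode channel by channel, exactly as if all of it had come from one source.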
The above is a further detailed description of the present invention in conjunction with concrete preferred technical solutions, and the concrete implementation of the invention must not be regarded as being limited to these descriptions. For an ordinary technician in the technical field of the invention, simple deductions or substitutions made without departing from the concept of the invention should all be regarded as falling within the scope of protection of the present invention.
Claims (9)
1. A method of simulating a remote spatial state, characterized in that it comprises the following steps:
A. A space information collector (30) is arranged at the remote destination, the space information collector (30) comprising at least a digital clock (301), a video sensor (302), an audio sensor (305), an encoder (308) and a first memory (309); under the timing of the digital clock (301), each sensor cooperatively collects its corresponding category of environmental information from the space environment of the remote destination (106) and sends it to the encoder (308); the environmental information comprises at least video information and audio information, the video information covering horizontal-direction images of at least a 270-degree viewing angle around the destination; the encoder (308) applies time-shared, classified, sub-channel, cellular coding to the various information collected by the sensors and sends it to the first memory (309) for storage;
B. A central server (10) is provided, the central server (10) establishing a communication link with the first memory (309) through a communication network (20), and the environmental information data stored in the first memory (309) being sent to the central server (10);
C. A space simulation restoring terminal (40) is provided, the terminal (40) being a closed room device capable of holding one or more people and being equipped with at least a second memory (401), a central controller (402), a display screen (403), a combination audio system (406) and a lighting device (408); the second memory (401) and the central controller (402) establish a communication link with the central server (10) through the communication network (20);
D. The central server (10) transmits the received environmental information data through the communication network (20) to the second memory (401) of the space simulation restoring terminal (40); the central controller (402) then performs time-shared, classified, sub-channel, cellular decoding and restoration on the environmental information data received in the second memory (401), the decoded video information collected by the video sensor (302) being sent to the display screen (403) and the decoded audio information collected by the audio sensor (305) being sent to the combination audio system (406).
2. The method of simulating a remote spatial state according to claim 1, characterized in that: in step A, the space information collector (30) also comprises a temperature sensor (303), a humidity sensor (304), a smell sensor (306) and a light sensor (307), and the environmental information also comprises temperature information, humidity information, smell information and illumination information; in step C, the space simulation restoring terminal (40) is also provided with a temperature restoring and adjusting device (404), a humidity restoring and adjusting device (405), a smell restoring and adjusting device (407) and a lighting device (408); in step D, the central controller (402) performs time-shared, classified, sub-channel, cellular decoding and restoration on the environmental information data received in the second memory (401), the decoded video information collected by the video sensor (302) being sent to the display screen (403), the decoded audio information collected by the audio sensor (305) to the combination audio system (406), the decoded temperature information collected by the temperature sensor (303) to the temperature restoring and adjusting device (404), the decoded humidity information collected by the humidity sensor (304) to the humidity restoring and adjusting device (405), the decoded smell information collected by the smell sensor (306) to the smell restoring and adjusting device (407), and the decoded illumination information collected by the light sensor (307) to the lighting device (408).
3. The method of simulating a remote spatial state according to claim 2, characterized in that: in step A, the environmental information of the remote destination collected by the space information collector (30) is replaced by imaginary spatial-state information simulated by general-purpose computer simulation technology; the imaginary spatial-state information has the same composition as the environmental information, comprising video, temperature, humidity, audio, smell and illumination information; after time-shared, classified, sub-channel, cellular coding, the imaginary spatial-state information is sent through the communication network (20) to the first memory (309) for storage.
4. The method of simulating a remote spatial state according to claim 3, characterized in that: the environmental information data received by the second memory (401) are a comprehensive synthesis of the environmental information of the remote destination collected by the space information collector (30) and imaginary spatial-state information simulated by computer simulation technology.
5. The method of simulating a remote spatial state according to any one of claims 1 to 4, characterized in that: the central server (10) is a single server, a server cluster or a cloud computing centre.
6. The method of simulating a remote spatial state according to any one of claims 1 to 4, characterized in that: the central controller (402) is a multimedia computer, a hand-held mobile terminal or a dedicated controller.
7. The method of simulating a remote spatial state according to any one of claims 1 to 4, characterized in that: the communication network (20) is a broadcast network, a television network, a mobile communication network, a dedicated communication network, a satellite communication network or the Internet.
8. The method of simulating a remote spatial state according to claim 1 or 2, characterized in that: the environmental information data of one remote destination can be sent to several space simulation restoring terminals (40) simultaneously, and one space simulation restoring terminal (40) can receive and restore the environmental information data of several remote destinations.
9. The method of simulating a remote spatial state according to claim 3, characterized in that: the imaginary spatial-state information data simulated at one place can be sent to several space simulation restoring terminals (40) simultaneously, and one space simulation restoring terminal (40) can receive and restore imaginary spatial-state information data simulated at several places.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310014877.1A CN103929456B (en) | 2013-01-16 | 2013-01-16 | The method of virtual remote spatiality |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103929456A (publication) | 2014-07-16 |
CN103929456B (grant) | 2018-01-02 |
Family
ID=51147533
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310014877.1A Active CN103929456B (en) | 2013-01-16 | 2013-01-16 | The method of virtual remote spatiality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103929456B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104618512A (en) * | 2015-02-28 | 2015-05-13 | 任学宁 | Remote transmission odor simulation system and method |
CN106813790A (en) * | 2015-11-27 | 2017-06-09 | 英业达科技有限公司 | Temperature informing device |
CN112740147A (en) * | 2018-09-28 | 2021-04-30 | 环球城市电影有限责任公司 | Special effect communication technology |
CN114020082A (en) * | 2021-11-18 | 2022-02-08 | 翟腾飞 | Experience center based on consumers |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1865630A (en) * | 2005-05-17 | 2006-11-22 | 陈进 | Multifunctional private room device and method therefor |
CN101118654A (en) * | 2007-09-19 | 2008-02-06 | 中国科学院上海微***与信息技术研究所 | Machine vision computer simulation emulation system based on sensor network |
CN101174332A (en) * | 2007-10-29 | 2008-05-07 | 张建中 | Method, device and system for interactively combining real-time scene in real world with virtual reality scene |
CN102147925A (en) * | 2010-02-08 | 2011-08-10 | 上海华博信息服务有限公司 | Virtual reality processing method based on image sensor |
Also Published As
Publication number | Publication date |
---|---|
CN103929456B (en) | 2018-01-02 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| TR01 | Transfer of patent right | Effective date of registration: 2019-01-31; patentee after: Chengdu Tianfu Topfond Intelligent Technology Co Ltd (16th Floor, 1609, Tianfu Jingrong Building, 2039 South Tianfu Avenue, Tianfu New District, Chengdu, Sichuan 610000); patentee before: Chen Jin (Apartment Building 901, No. 5 Taoyuan Road, Nanshan District, Shenzhen, Guangdong 518000) |