CN107274491A - Spatial manipulation virtual realization method for a three-dimensional scene - Google Patents

Spatial manipulation virtual realization method for a three-dimensional scene Download PDF

Info

Publication number
CN107274491A
Authority
CN
China
Prior art keywords
scenery
space
hardware platform
recognized
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610224291.1A
Other languages
Chinese (zh)
Inventor
梁超 (Liang Chao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian Qianjing Co-Founder Technology Co Ltd
Original Assignee
Dalian Qianjing Co-Founder Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian Qianjing Co-Founder Technology Co Ltd filed Critical Dalian Qianjing Co-Founder Technology Co Ltd
Priority to CN201610224291.1A priority Critical patent/CN107274491A/en
Publication of CN107274491A publication Critical patent/CN107274491A/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a spatial manipulation virtual realization method for a three-dimensional scene, comprising the following steps: S1: an image to be recognized is produced, and the system records the feature point information of the image to be recognized; S2: camera call instruction information is written; S3: locating rules are set; S4: the image to be recognized is captured by the camera of the hardware platform; S5: a 3D space coordinate system is established and locked, and the 3D scenery appears in the 3D space coordinate system after the image is recognized; S6: the user controls the dynamic actions and special effects of the 3D scenery through the control buttons of the hardware platform; S7: the displacement data of the 3D scenery in the real environment are monitored by a video capture device and a gyroscope and displayed on the screen after computation, and the real-time attitude information of the 3D scenery is sent through communication to a terminal device, which makes the corresponding dynamic effect.

Description

Spatial manipulation virtual realization method for a three-dimensional scene
Technical field
The present invention relates to the field of augmented reality, and more particularly to a spatial manipulation virtual realization method for a three-dimensional scene. The technique is referred to as Space Control Virtual Reality, abbreviated SCVR, i.e. spatially controlled virtual reality.
Background technology
Augmented reality (Augmented Reality, abbreviated AR) is a new technology that "seamlessly" integrates real-world information with virtual-world information. Entity information that would ordinarily be difficult to experience within a certain time span and spatial range of the real world (visual information, sound, taste, touch, and so on) is simulated by computers and other technology and then superimposed, so that virtual information is applied to the real world and perceived by the human senses, achieving a sensory experience that goes beyond reality. The real environment and virtual objects are superimposed in real time onto the same picture or space and exist there simultaneously. Augmented reality not only presents information from the real world but also displays virtual information at the same time; the two kinds of information complement and superimpose each other. In visualized augmented reality, the user wears a head-mounted display in which the real world and computer graphics are composited, so that the real world surrounding the user can still be seen. Augmented reality draws on new technologies and tools such as multimedia, three-dimensional modeling, real-time video display and control, multi-sensor fusion, real-time tracking and registration, and scene fusion. In general, augmented reality provides information different from what humans can directly perceive.
Unity3D is a multi-platform, comprehensive game development tool created by Unity Technologies that lets users easily create interactive content such as three-dimensional video games, architectural visualizations, and real-time three-dimensional animations; it is a fully integrated professional game engine.
In the prior art there are technical schemes in which a three-dimensional virtual image derived from an actual image is displayed in real time by certain technical means. The defect of the prior art, however, is that the virtual image that appears is not displayed comprehensively according to the exact spatial information of the image; it merely presents, from different angles, a visually beautiful or visually striking image, and its flat coloring does not give people a three-dimensional sense of the virtual image.
The content of the invention
In view of the problems existing in the prior art, the invention discloses a spatial manipulation virtual realization method for a three-dimensional scene, comprising the following steps:
S1: An image to be recognized is produced, and the system records the feature point information of the image to be recognized;
S2: Camera call instruction information corresponding to each different hardware platform system is written;
S3: The image to be recognized is placed into the real scene, the coordinate origin is set through the hardware platform, and locating rules for hemispherical space, spherical space and flat planar space are established;
S4: The image to be recognized is captured by the camera of the hardware platform, and the feature points of each image frame are calculated;
S5: The three-dimensional spatial position of the image to be recognized is calculated, the coordinate origin at which the virtual scene appears in the actual scene is calculated, a 3D space coordinate system is established accordingly and locked, and the 3D scenery appears in the 3D space coordinate system after the image is recognized;
S6: The user controls the dynamic actions and special effects of the 3D scenery through the control buttons of the hardware platform;
S7: The displacement data of the 3D scenery in the real environment are monitored by a video capture device and a gyroscope and displayed on the screen after computation, and the real-time attitude information of the 3D scenery is sent through communication to a terminal device, which makes the corresponding dynamic effect.
After the 3D space coordinate system is established in S6, when the user manipulates the 3D scenery using the hardware platform:
The camera is used to capture the change frequency per unit second of the image difference of the physical scenery, and according to this frequency the system judges which of the physical scenery established in space are moving targets and which are static targets;
When multiple users manipulate the 3D scenery using their respective hardware platforms, data exchange is established between Bluetooth devices and external devices customized for the system.
The system uses WiFi radio signals to calculate the actual distance between the hardware platform and the measured actual object and the distances between measured objects; the distance between measured objects is the distance from the coordinate point of a real-space object to the coordinate origin, i.e. the collision relationship between objects in the real space is obtained, so that a rough contour of the surrounding spatial scenery is obtained.
When multiple users manipulate the 3D scenery using their respective hardware platforms, the system exchanges the coordinate information of the different 3D spaces through the Bluetooth devices, calculates cross-check data, and generates a third virtual space coordinate system that best matches the real space; virtual coordinates corresponding to the real-scene coordinates are generated in this coordinate system.
When the action of the 3D scenery is controlled in S6: the hardware platform exchanges data with the 3D scenery; the system records this data information together with the change information of the positioning data and issues adjustment instructions; new real objects appearing in the real space are monitored in real time, their contour data are fed back and corresponding mathematical models are generated, and they are presented on the display device screen using an avoiding, entering, ignoring or interactive processing mode.
The hardware platform uses an associated mobile device such as a mobile phone or a tablet computer; the device may also be a smart wristband, Google Glass or the like, or a mobile device that has not yet appeared on the market.
By adopting the above technical solution, the spatial manipulation virtual realization method for a three-dimensional scene provided by the present invention applies augmented reality and virtual reality technology, with Unity3D as the framework, to a system manipulated by a mobile device. Through the display screen of the mobile device, the user experiences the perfect combination of the virtual scenery and the real scene, spatial positioning can be realized, and the motion trajectory and animation effects of the virtual scene are manipulated by the mobile device in the real scene.
Brief description of the drawings
In order to explain the embodiments of the present application or the technical solutions in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some of the embodiments described in the present application, and other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
Fig. 1 is a flow chart of the present invention.
Embodiment
In order to make the technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention:
As shown in Fig. 1, a spatial manipulation virtual realization method for a three-dimensional scene comprises the following steps:
S1: An image to be recognized is produced, and the system records the feature point information of the image to be recognized. The picture is produced using a prior-art picture-making process, and the produced picture information is stored in the software system on which the present invention is based; the recognition effect then appears as soon as the system scans this image through the camera, as in the sketch below.
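The patent does not name a particular feature detector for recording the "feature point information" of the image to be recognized. Purely as an illustration, a minimal sketch of how such information could be extracted and stored offline is given below, assuming Python with OpenCV and NumPy; the function and file names are hypothetical and not taken from the patent.

```python
import cv2
import numpy as np

def record_target_features(image_path, out_path):
    """Sketch of S1: extract and store feature point information of an image to be recognized."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    orb = cv2.ORB_create(nfeatures=1000)                 # illustrative detector choice
    keypoints, descriptors = orb.detectAndCompute(img, None)
    points = np.array([kp.pt for kp in keypoints], dtype=np.float32)
    np.savez(out_path, points=points, descriptors=descriptors)   # the "feature point information"
    return points, descriptors

# Hypothetical usage:
# record_target_features("target_image.png", "target_features.npz")
```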
S2: Camera call instruction information corresponding to each different hardware platform system is written, for example command information for turning the camera off or turning it on.
S3: The image to be recognized is placed into the real scene, the coordinate origin is set through the hardware platform, and locating rules for hemispherical space, spherical space and flat planar space are established.
S4: The image to be recognized is captured by the camera of the hardware platform, and the feature points of each image frame are calculated.
S5: The three-dimensional spatial position of the image to be recognized is calculated, the coordinate origin at which the virtual scene appears in the actual scene is calculated, a 3D space coordinate system is established accordingly and locked, and the 3D scenery appears in the 3D space coordinate system after the image is recognized.
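The patent does not describe how the three-dimensional spatial position of the recognized image is calculated. As one plausible illustration only, a planar image target can be located by matching its stored feature points against the current camera frame and solving a perspective-n-point problem. The sketch below assumes OpenCV, a calibrated camera matrix K, the pixel size and physical size of the printed target, and ORB features as in the earlier sketch; none of these choices are stated in the patent.

```python
import cv2
import numpy as np

def estimate_target_pose(frame_gray, target_pts, target_desc, target_px, target_size_m, K):
    """Sketch of S5: locate the image to be recognized in a frame and return the pose
    (rotation and translation) that anchors the locked 3D coordinate system on it."""
    orb = cv2.ORB_create(nfeatures=1000)
    frame_kp, frame_desc = orb.detectAndCompute(frame_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(target_desc, frame_desc)
    if len(matches) < 8:
        return None                                    # target not recognized in this frame

    px_w, px_h = target_px                             # pixel size of the stored target image
    w, h = target_size_m                               # physical size of the printed target (metres)
    # 3D coordinates of the matched target points, placed on the z = 0 plane of the target
    obj_pts = np.array([[target_pts[m.queryIdx][0] / px_w * w,
                         target_pts[m.queryIdx][1] / px_h * h,
                         0.0] for m in matches], dtype=np.float32)
    img_pts = np.array([frame_kp[m.trainIdx].pt for m in matches], dtype=np.float32)

    ok, rvec, tvec, _ = cv2.solvePnPRansac(obj_pts, img_pts, K, None)
    return (rvec, tvec) if ok else None
```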
S6: The user controls the dynamic actions and special effects of the 3D scenery through the control buttons of the hardware platform. Here the user can control activities of the 3D scenery in the virtual space, such as actions and displacement, through the function keys of the hardware platform, for example a mobile phone.
S7: The displacement data of the 3D scenery in the real environment are monitored by a video capture device and a gyroscope and displayed on the screen after computation, and the real-time attitude information of the 3D scenery is sent through communication to a terminal device, which makes the corresponding dynamic effect.
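The patent does not specify the communication channel or message format used in S7 to send the real-time attitude information to the terminal device. Purely as an illustration, the sketch below serializes a pose as JSON and pushes it over a UDP socket; the message fields, address and port are hypothetical.

```python
import json
import socket
import time

def send_attitude(sock, terminal_addr, position, rotation_euler_deg):
    """Sketch of S7: send the real-time attitude information of the 3D scenery to a terminal device."""
    msg = {
        "timestamp": time.time(),
        "position": list(position),              # displacement in the locked 3D coordinate system
        "rotation": list(rotation_euler_deg),    # attitude from camera tracking and gyroscope
    }
    sock.sendto(json.dumps(msg).encode("utf-8"), terminal_addr)

# Hypothetical usage:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_attitude(sock, ("192.168.1.50", 9000), (0.1, 0.0, 0.4), (0.0, 90.0, 0.0))
```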
Further, after the 3D space coordinate system is established in S6, when the user manipulates the 3D scenery using the hardware platform: the camera is used to capture the change frequency per unit second of the image difference of the physical scenery, and according to this frequency the system judges which of the physical scenery established in space are moving targets and which are static targets. In this process the camera captures the motion of the physical scenery, so that it can be judged whether the scenery is static or moving.
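The patent only states that moving and static targets are distinguished by the change frequency of the image difference per unit second. A minimal sketch of one way to realize such a judgment with simple frame differencing follows; the thresholds are illustrative assumptions.

```python
import cv2
import numpy as np

def is_moving(frames_one_second, diff_thresh=25, change_ratio=0.02, min_changed_frames=5):
    """Sketch: judge whether observed physical scenery is moving from one second of frames."""
    changed = 0
    prev = cv2.cvtColor(frames_one_second[0], cv2.COLOR_BGR2GRAY)
    for frame in frames_one_second[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev)                            # image difference between frames
        if np.count_nonzero(diff > diff_thresh) / diff.size > change_ratio:
            changed += 1                                          # this frame showed noticeable change
        prev = gray
    # "change frequency per unit second": number of changed frames within the one-second window
    return changed >= min_changed_frames
```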
When multiple users manipulate the 3D scenery using their respective hardware platforms, data exchange is established between Bluetooth devices and external devices customized for the system. The Bluetooth devices are divided into positioning devices and portable mobile devices. The function of a positioning device is to measure position through data exchange among multiple hardware platforms. The function of a portable mobile device is to carry out data communication with the hardware platform and measure position. A positioning device is a stationary object with Bluetooth capability, and a portable mobile device is a moving object with Bluetooth capability.
The system uses WiFi radio signals to calculate the actual distance between the hardware platform and the measured actual object and the distances between measured objects; the distance between measured objects is the distance from the coordinate point of a real-space object to the coordinate origin, i.e. the collision relationship between objects in the real space is obtained, so that a rough contour of the surrounding spatial scenery is obtained.
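The patent does not give the formula by which WiFi radio signals are turned into distances. A common model for this kind of ranging is the log-distance path-loss model, sketched below; the reference transmit power and path-loss exponent are assumptions that would need to be calibrated for the actual environment.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Sketch: estimate the distance (metres) to a measured object from a WiFi RSSI reading.
    Log-distance path-loss model: RSSI = tx_power - 10 * n * log10(d), so
    d = 10 ** ((tx_power - RSSI) / (10 * n))."""
    return 10.0 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

# With these assumed constants, an RSSI of -65 dBm gives an estimate of about 10 m:
print(rssi_to_distance(-65.0))
```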
When multiple users manipulate the 3D scenery using their respective hardware platforms, the system exchanges the coordinate information of the different 3D spaces through the Bluetooth devices, calculates cross-check data, and generates a third virtual space coordinate system that best matches the real space; virtual coordinates corresponding to the real-scene coordinates are generated in this coordinate system.
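How the "third virtual space coordinate system" is derived from the coordinates exchanged over Bluetooth is not detailed in the patent. One standard way to fuse two users' 3D coordinate systems over shared reference points is a least-squares rigid alignment (the Kabsch algorithm), sketched below purely as an illustration.

```python
import numpy as np

def rigid_align(points_a, points_b):
    """Sketch: find rotation R and translation t such that R @ a + t ~ b for corresponding
    reference points expressed in two users' 3D coordinate systems (Kabsch algorithm)."""
    points_a, points_b = np.asarray(points_a, float), np.asarray(points_b, float)
    ca, cb = points_a.mean(axis=0), points_b.mean(axis=0)
    H = (points_a - ca).T @ (points_b - cb)          # cross-covariance of the centred point sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                         # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

# Reference points seen by both users (for example the corners of the shared image target)
# can then be mapped into a common coordinate system: p_common = R @ p_user_a + t
```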
When the action of the 3D scenery is controlled in S6: the hardware platform exchanges data with the 3D scenery; the system records this data information together with the change information of the positioning data and issues adjustment instructions; new real objects appearing in the real space are monitored in real time, their contour data are fed back and corresponding mathematical models are generated, and they are presented on the display device screen using an avoiding, entering, ignoring or interactive processing mode.
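The patent lists avoiding, entering, ignoring and interactive processing modes but does not state how one of them is selected for a newly appearing real object. Purely as a hypothetical sketch, the choice could be driven by how strongly the object's contour overlaps the footprint of the 3D scenery; the function name and thresholds below are illustrative assumptions.

```python
def choose_processing_mode(overlap_ratio, object_is_interactive):
    """Hypothetical sketch: pick how a newly appeared real object is presented
    relative to the 3D scenery, based on contour overlap with the scenery."""
    if overlap_ratio < 0.05:
        return "ignore"        # the object barely touches the virtual scenery
    if object_is_interactive:
        return "interact"      # the object is registered as something the scenery reacts to
    if overlap_ratio < 0.5:
        return "avoid"         # steer the scenery's motion path around the object
    return "enter"             # large overlap: let the scenery pass through or behind it
```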
The hardware platform uses a mobile phone or a tablet computer.
With the method disclosed by the invention, the spatial positioning and displacement of the three-dimensional animated scenery seen in the real scene through a mobile display device such as a mobile phone or tablet computer can be calculated instantly over time against the real scene; the user can freely manipulate the virtual scenery through the above device while its spatial positioning and displacement in the real scene are calculated in real time, so that an interactive process between a real person and the virtual scenery can be realized.
The above is only a preferred embodiment of the present invention, but the scope of protection of the present invention is not limited thereto. Any equivalent substitution or change made, within the technical scope disclosed by the present invention, by any person skilled in the art according to the technical solution of the present invention and its inventive concept shall be covered by the scope of protection of the present invention.

Claims (6)

1. A spatial manipulation virtual realization method for a three-dimensional scene, characterized by comprising the following steps:
S1: An image to be recognized is produced, and the system records the feature point information of the image to be recognized;
S2: Camera call instruction information corresponding to each different hardware platform system is written;
S3: The image to be recognized is placed into the real scene, the coordinate origin is set through the hardware platform, and locating rules for hemispherical space, spherical space and flat planar space are established;
S4: The image to be recognized is captured by the camera of the hardware platform, and the feature points of each image frame are calculated;
S5: The three-dimensional spatial position of the image to be recognized is calculated, the coordinate origin at which the virtual scene appears in the actual scene is calculated, a 3D space coordinate system is established accordingly and locked, and the 3D scenery appears in the 3D space coordinate system after the image is recognized;
S6: The user controls the dynamic actions and special effects of the 3D scenery through the control buttons of the hardware platform;
S7: The displacement data of the 3D scenery in the real environment are monitored by a video capture device and a gyroscope and displayed on the screen after computation, and the real-time attitude information of the 3D scenery is sent through communication to a terminal device, which makes the corresponding dynamic effect.
2. The spatial manipulation virtual realization method for a three-dimensional scene according to claim 1, further characterized in that after the 3D space coordinate system is established in S6, when the user manipulates the 3D scenery using the hardware platform:
the camera is used to capture the change frequency per unit second of the image difference of the physical scenery, and according to this frequency the system judges which of the physical scenery established in space are moving targets and which are static targets;
when multiple users manipulate the 3D scenery using their respective hardware platforms, data exchange is established between Bluetooth devices and external devices customized for the system.
3. The spatial manipulation virtual realization method for a three-dimensional scene according to claim 2, further characterized in that the system uses WiFi radio signals to calculate the actual distance between the hardware platform and the measured actual object and the distances between measured objects; the distance between measured objects is the distance from the coordinate point of a real-space object to the coordinate origin, i.e. the collision relationship between objects in the real space is obtained, so that a rough contour of the surrounding spatial scenery is obtained.
4. The spatial manipulation virtual realization method for a three-dimensional scene according to claim 2, further characterized in that when multiple users manipulate the 3D scenery using their respective hardware platforms, the system exchanges the coordinate information of the different 3D spaces through the Bluetooth devices, calculates cross-check data, and generates a third virtual space coordinate system that best matches the real space; virtual coordinates corresponding to the real-scene coordinates are generated in this coordinate system.
5. The spatial manipulation virtual realization method for a three-dimensional scene according to claim 1, further characterized in that when the action of the 3D scenery is controlled in S6: the hardware platform exchanges data with the 3D scenery; the system records this data information together with the change information of the positioning data and issues adjustment instructions; new real objects appearing in the real space are monitored in real time, their contour data are fed back and corresponding mathematical models are generated, and they are presented on the display device screen using an avoiding, entering, ignoring or interactive processing mode.
6. The spatial manipulation virtual realization method for a three-dimensional scene according to claim 1, further characterized in that the hardware platform uses an associated mobile device such as a mobile phone or a tablet computer.
CN201610224291.1A 2016-04-09 2016-04-09 A kind of spatial manipulation Virtual Realization method of three-dimensional scenic Pending CN107274491A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610224291.1A CN107274491A (en) 2016-04-09 2016-04-09 A kind of spatial manipulation Virtual Realization method of three-dimensional scenic

Publications (1)

Publication Number Publication Date
CN107274491A 2017-10-20

Family

ID=60052549

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610224291.1A Pending CN107274491A (en) 2016-04-09 2016-04-09 A kind of spatial manipulation Virtual Realization method of three-dimensional scenic

Country Status (1)

Country Link
CN (1) CN107274491A (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109087399A (en) * 2018-07-17 2018-12-25 上海游七网络科技有限公司 A method of by positioning figure Fast synchronization AR space coordinates
CN109920035A (en) * 2019-01-21 2019-06-21 东南大学 A kind of dynamic special efficacy synthetic method based on mobile device augmented reality
CN110322257A (en) * 2018-03-28 2019-10-11 苏宁易购集团股份有限公司 Virtual objects distribution method and device based on 3D scene
CN110335292A (en) * 2019-07-09 2019-10-15 北京猫眼视觉科技有限公司 It is a kind of to track the method and system for realizing simulated scenario tracking based on picture
CN110490980A (en) * 2019-08-15 2019-11-22 中国建筑第二工程局有限公司西南分公司 A kind of Virtual Construction template information processing system and method based on fixation and recognition
CN110716685A (en) * 2018-07-11 2020-01-21 广东虚拟现实科技有限公司 Image display method, image display device and entity object thereof
CN110968198A (en) * 2019-12-05 2020-04-07 重庆一七科技开发有限公司 Method for simulating three-dimensional space moving positioning of virtual reality world in real space
CN111580676A (en) * 2020-05-20 2020-08-25 深圳中科盾科技有限公司 Foot gesture recognition omnidirectional control system and implementation method thereof
CN112905731A (en) * 2021-03-29 2021-06-04 中国电建集团昆明勘测设计研究院有限公司 IMU-GPS assisted linkage method for 360-degree panoramic photo and three-dimensional GIS scene
CN113282171A (en) * 2021-05-14 2021-08-20 中国海洋大学 Oracle augmented reality content interaction system, method, equipment and terminal
CN113409468A (en) * 2021-05-10 2021-09-17 北京达佳互联信息技术有限公司 Image processing method and device, electronic equipment and storage medium
CN113721475A (en) * 2020-05-26 2021-11-30 广州汽车集团股份有限公司 Data exchange method, data exchange control device and data exchange system
CN114141090A (en) * 2022-01-10 2022-03-04 中国矿业大学 Real-operation virtual measurement total station training simulation system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120162372A1 (en) * 2010-12-22 2012-06-28 Electronics And Telecommunications Research Institute Apparatus and method for converging reality and virtuality in a mobile environment
CN102903144A (en) * 2012-08-03 2013-01-30 樊晓东 Cloud computing based interactive augmented reality system implementation method
CN103226838A (en) * 2013-04-10 2013-07-31 福州林景行信息技术有限公司 Real-time spatial positioning method for mobile monitoring target in geographical scene
CN104156998A (en) * 2014-08-08 2014-11-19 深圳中科呼图信息技术有限公司 Implementation method and system based on fusion of virtual image contents and real scene
CN105468142A (en) * 2015-11-16 2016-04-06 上海璟世数字科技有限公司 Interaction method and system based on augmented reality technique, and terminal

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110322257B (en) * 2018-03-28 2022-06-07 苏宁易购集团股份有限公司 Virtual article issuing method and device based on 3D scene
CN110322257A (en) * 2018-03-28 2019-10-11 苏宁易购集团股份有限公司 Virtual objects distribution method and device based on 3D scene
CN110716685B (en) * 2018-07-11 2023-07-18 广东虚拟现实科技有限公司 Image display method, image display device, image display system and entity object of image display system
CN110716685A (en) * 2018-07-11 2020-01-21 广东虚拟现实科技有限公司 Image display method, image display device and entity object thereof
CN109087399B (en) * 2018-07-17 2024-03-01 上海游七网络科技有限公司 Method for rapidly synchronizing AR space coordinate system through positioning map
CN109087399A (en) * 2018-07-17 2018-12-25 上海游七网络科技有限公司 A method of by positioning figure Fast synchronization AR space coordinates
CN109920035B (en) * 2019-01-21 2023-04-21 东南大学 Dynamic special effect synthesis method based on mobile equipment augmented reality
CN109920035A (en) * 2019-01-21 2019-06-21 东南大学 A kind of dynamic special efficacy synthetic method based on mobile device augmented reality
CN110335292A (en) * 2019-07-09 2019-10-15 北京猫眼视觉科技有限公司 It is a kind of to track the method and system for realizing simulated scenario tracking based on picture
CN110335292B (en) * 2019-07-09 2021-04-30 北京猫眼视觉科技有限公司 Method, system and terminal for realizing simulation scene tracking based on picture tracking
CN110490980A (en) * 2019-08-15 2019-11-22 中国建筑第二工程局有限公司西南分公司 A kind of Virtual Construction template information processing system and method based on fixation and recognition
CN110968198A (en) * 2019-12-05 2020-04-07 重庆一七科技开发有限公司 Method for simulating three-dimensional space moving positioning of virtual reality world in real space
CN111580676A (en) * 2020-05-20 2020-08-25 深圳中科盾科技有限公司 Foot gesture recognition omnidirectional control system and implementation method thereof
CN113721475A (en) * 2020-05-26 2021-11-30 广州汽车集团股份有限公司 Data exchange method, data exchange control device and data exchange system
CN112905731A (en) * 2021-03-29 2021-06-04 中国电建集团昆明勘测设计研究院有限公司 IMU-GPS assisted linkage method for 360-degree panoramic photo and three-dimensional GIS scene
CN113409468A (en) * 2021-05-10 2021-09-17 北京达佳互联信息技术有限公司 Image processing method and device, electronic equipment and storage medium
CN113282171A (en) * 2021-05-14 2021-08-20 中国海洋大学 Oracle augmented reality content interaction system, method, equipment and terminal
CN113282171B (en) * 2021-05-14 2024-03-12 中国海洋大学 Oracle text augmented reality content interaction system, method, equipment and terminal
CN114141090B (en) * 2022-01-10 2023-10-13 中国矿业大学 Total station practical training simulation system for practical operation and virtual measurement
CN114141090A (en) * 2022-01-10 2022-03-04 中国矿业大学 Real-operation virtual measurement total station training simulation system

Similar Documents

Publication Publication Date Title
CN107274491A (en) A kind of spatial manipulation Virtual Realization method of three-dimensional scenic
US9654734B1 (en) Virtual conference room
US10460512B2 (en) 3D skeletonization using truncated epipolar lines
TWI659335B (en) Graphic processing method and device, virtual reality system, computer storage medium
US9595127B2 (en) Three-dimensional collaboration
US20230206531A1 (en) Avatar display device, avatar generating device, and program
US10607403B2 (en) Shadows for inserted content
KR20180100476A (en) Virtual reality-based apparatus and method to generate a three dimensional(3d) human face model using image and depth data
US20120162384A1 (en) Three-Dimensional Collaboration
CA2924156A1 (en) Method, system and apparatus for capture-based immersive telepresence in virtual environment
CN104915979A (en) System capable of realizing immersive virtual reality across mobile platforms
WO2010022351A2 (en) System and method for low bandwidth image transmission
KR20220047719A (en) Pew-Shot Composite for Talking Head
US11461942B2 (en) Generating and signaling transition between panoramic images
WO2004012141A2 (en) Virtual reality immersion system
US20230105064A1 (en) System and method for rendering virtual reality interactions
US11727645B2 (en) Device and method for sharing an immersion in a virtual environment
US20230386147A1 (en) Systems and Methods for Providing Real-Time Composite Video from Multiple Source Devices Featuring Augmented Reality Elements
Minatani et al. Face-to-face tabletop remote collaboration in mixed reality
US20230164304A1 (en) Communication terminal device, communication method, and software program
JP2020530218A (en) How to project immersive audiovisual content
Siegl et al. An augmented reality human–computer interface for object localization in a cognitive vision system
TWI839830B (en) Mixed reality interaction method, device, electronic equipment and medium
Nakamura et al. A Mutual Motion Capture System for Face-to-face Collaboration.
CA3089885A1 (en) Virtual reality system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171020