TW202420034A - The agent of metaverse go outside. - Google Patents


Info

Publication number
TW202420034A
TW202420034A (application TW111141424A)
Authority
TW
Taiwan
Prior art keywords
user, robot, camera, metaverse, control software
Application number
TW111141424A
Other languages
Chinese (zh)
Inventor
吳志銘
吳漾福
吳宣蕾
Original Assignee
吳志銘
Application filed by 吳志銘
Publication of TW202420034A

Abstract

A robot and network platform that let ordinary people play in the metaverse while connecting to the real world: controlling the agent robot feels like playing inside the metaverse.
The image of the user's metaverse character is displayed in the real world by the 3D LED fan on the agent robot, and the user can talk with people in the real world. Bystanders can see the metaverse character in the real world without special equipment such as 3D glasses or VR glasses.

Description

Metaverse Exit Agent (元宇宙出口代理人)

The technical field of this invention is metaverse-related applications.

At present, most metaverse development focuses on building and operating the portal and the virtual world, which has caused the metaverse to stagnate: building metaverse content requires constructing 3D models and designing their interactions, and wearable devices are not yet widespread, so the usable content is extremely limited. Moreover, objects built from 3D models lack the refinement and practical function of physical objects, so very little real work can be done inside the metaverse.

We instead approach the problem from the opposite direction: let people in the metaverse pass from the virtual world back into the real world, forming an exit from the metaverse. A metaverse user keeps a virtual identity and image while carrying out activities in the real world, such as visiting museums and art exhibitions, browsing boutique streets, teaching, or even performing. After entering the metaverse, a user can therefore not only join any activity in the virtual world but also pass through its various exits into different real-world spaces (like sub-universes), where, wearing their virtual image, they intelligently control a real-world agent to interact with real people and things.

Moreover, building the metaverse by modeling takes a long time, the result falls far short of the real thing, and the cost is high. Letting agents travel through the real world instead looks realistic, is deeply immersive, and is inexpensive. It also establishes an interaction channel between the metaverse and the real world: real-world people can see into the metaverse through the agent, forming a two-way channel.

Imagine visiting the British Museum or the Palace Museum inside the metaverse: modeling them would take decades, even centuries, and the 3D models would still differ greatly from the originals. With agents, that time shrinks to a few months, and metaverse users can tour a grand museum. When agents are deployed across tens of thousands of real spaces, the metaverse gains tens of thousands of exits leading to different sub-universes. If the objects and buildings in each sub-universe consist of tens of thousands of real components, that amounts to hundreds of millions of metaverse components; and if each sub-universe hosts dozens or even hundreds of participants, metaverse users can interact at any time with hundreds of thousands or even millions of real-world people, rapidly expanding participation in the metaverse.

Figure 1 is a schematic diagram of the invention. The robot and network platform consist of five main parts: the carrier chassis (1), which carries the robot's equipment together with the wheels and motors that move it; the support frame (2), which holds the display LED fan (3) and the camera gimbal (4); and the computer software (5).

Figure 2 is the structure diagram of the invention. The carrier chassis (1) carries the robot's equipment and the wheels and motors that move it. The wheels (1-2) support the chassis and let it move, and the drive motors (1-3) turn the wheels so the chassis travels. The chassis also carries a speaker (1-4) that plays the user's voice and a network controller (1-5) that connects to the network so the user can control the robot and exchange data; the computer software (5) handles transmission and control between the user's equipment (AR/VR headset, phone, computer, and so on) and the network controller (1-5). The distance sensor (1-6) measures the distance and position of surrounding people and objects so the robot avoids colliding with them, and the battery (1-7) powers everything on the chassis as well as the display LED fan (3) and camera gimbal (4).
The support frame (2) holds the display LED fan (3) and the camera gimbal (4). The display LED fan consists of an LED light strip (3-1) and a fan motor (3-2). As the motor spins, the LEDs on the strip flash with the colors and brightness in the data sent down through the network controller (1-5); through persistence of vision, bystanders see the image the user transmits as a three-dimensional display without wearing 3D glasses. The camera gimbal (4) consists of a camera (4-1), a gimbal motor (4-2), and a microphone (4-3). The camera is mounted on the gimbal motor, so when the motor rotates, the camera's viewing angle moves with it. By turning their head in VR glasses or a helmet, or by moving a handheld device or phone, the user transmits the angle the gimbal should move to, steering the camera's viewpoint to see the real-world scene and hear real-world sound through the microphone (4-3).
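The persistence-of-vision imaging described above can be illustrated with a short sketch: given a target image, compute the colors the LED light strip (3-1) should show at each rotation angle by sampling the image in polar coordinates. This is only an illustration under assumed conventions (a row-major RGB pixel list, LED 0 at the hub); the patent does not specify the display controller's interface, so these function names and parameters are hypothetical.

```python
# Persistence-of-vision sketch: map a target image to per-angle LED colors.
# Assumed conventions, not the patent's API: row-major (r, g, b) pixel list,
# LED 0 at the hub, LED num_leds-1 at the rim.

import math

def pov_column(image, width, height, num_leds, angle_deg):
    """Colors for the radial LED strip at one fan rotation angle.

    Samples the pixel under each LED along the radius pointing at
    `angle_deg` (0° = rightward, counter-clockwise positive).
    """
    cx, cy = width / 2.0, height / 2.0
    radius = min(cx, cy) - 1
    theta = math.radians(angle_deg)
    colors = []
    for i in range(num_leds):
        r = radius * i / (num_leds - 1)
        x = int(cx + r * math.cos(theta))
        y = int(cy - r * math.sin(theta))  # image y grows downward
        colors.append(image[y * width + x])
    return colors

def pov_frame(image, width, height, num_leds, steps=360):
    """Full revolution: one LED column per angular step."""
    return [pov_column(image, width, height, num_leds, s * 360.0 / steps)
            for s in range(steps)]
```

At, say, 25 revolutions per second and 360 angular steps, the fan controller would latch the next column roughly every 111 µs; a finer step count trades timing margin for angular resolution.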

Figure 3 shows how the invention is actually controlled. The agent is a robot operating somewhere in the real world. Through the network platform, the user connects to the agent with any Internet-capable device, such as a computer, phone, or VR glasses. The image of the user's virtual character in the metaverse is shown on the agent's display LED fan, while the camera and microphone on the agent send the captured images and sound back to the user through the network controller, so the user feels present in that place.
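The connection path just described (user device → network platform → agent) implies some message format for the control traffic. The patent fixes no wire format, so the JSON envelope below, including the field names `kind`, `yaw`, `pitch`, `forward`, and `turn`, is purely an assumed sketch of how user commands could be encoded on the user side and routed on the robot side.

```python
# Assumed JSON control envelope between user device and agent robot.
# Field names are illustrative; the patent does not define a wire format.

import json

def encode_head_pose(yaw_deg, pitch_deg):
    """User side: pack a headset orientation for the camera gimbal."""
    return json.dumps({"kind": "head_pose", "yaw": yaw_deg, "pitch": pitch_deg})

def encode_drive(forward, turn):
    """User side: pack a drive command (each axis in -1.0 .. 1.0)."""
    return json.dumps({"kind": "drive", "forward": forward, "turn": turn})

def dispatch(raw, on_head_pose, on_drive):
    """Robot side: route one received message to the matching handler."""
    msg = json.loads(raw)
    if msg["kind"] == "head_pose":
        on_head_pose(msg["yaw"], msg["pitch"])
    elif msg["kind"] == "drive":
        on_drive(msg["forward"], msg["turn"])
    else:
        raise ValueError("unknown message kind: " + msg["kind"])
```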

Figure 4 shows different forms of the carrier chassis (1). The chassis can be a mobile platform resembling a simple vehicle body, or a humanoid robot with arms, capable of both locomotion and hand movements; on such a chassis the display LED fan (3) need only show the user's head, not the body.

Figure 5 shows another form of the carrier chassis (1): a popular drone, which lets the agent move in three-dimensional space.

1: Carrier chassis

1-2: Wheels

1-3: Drive motor

1-4: Speaker

1-5: Network controller

1-6: Distance sensor

1-7: Battery

2: Support frame

3: Display LED fan

3-1: LED light strip

3-2: Fan motor

4: Camera gimbal

4-1: Camera

4-2: Gimbal motor

4-3: Microphone

5: Computer software

To make the above and other features, advantages, and embodiments of the invention easier to understand, the attached drawings are described as follows:

Figure 1 is a schematic diagram of the invention;

Figure 2 is the structure diagram of the invention;

Figure 3 shows how the invention is actually controlled;

Figure 4 shows a first alternative form of the carrier chassis (1);

Figure 5 shows a second alternative form of the carrier chassis (1).

The implementation of the invention is shown in Figure 2. The carrier chassis (1) carries the robot's equipment and the wheels and motors that move it. At least two wheels (1-2) support the chassis, and the motors (1-3) drive the wheels so the chassis moves; the wheels can be ordinary wheels, omnidirectional wheels, or arranged as tracks for use on uneven ground. The chassis also carries a speaker (1-4) that plays the user's voice and a network controller (1-5) that connects to the network for user control and data transfer. The controller can use WIFI or a 4G/5G/6G modem, and through the computer software (5) the agent's control signals and data travel over the Internet, link to the various metaverse platforms, and carry audio and video between the user side and the robot side.

The distance sensor (1-6) can be ultrasonic, infrared, laser, or camera based. It measures the distance and position of surrounding people and objects so the robot avoids collisions, and it can also locate the agent within the venue, enabling self-driven routes and automatic return-to-dock charging. The battery (1-7) powers everything on the chassis as well as the display LED fan (3) and the camera gimbal (4).

The support frame (2) holds the display LED fan (3) and the camera gimbal (4). The display LED fan consists of an LED light strip (3-1) and a fan motor (3-2). As the motor spins, the LEDs flash with the colors and brightness in the data sent through the network controller (1-5); through persistence of vision, bystanders see the image the user transmits as a three-dimensional display without 3D glasses. This differs from an ordinary LCD or OLED screen: a screen has a surrounding bezel, so every image appears inside a frame, and the unlit parts of the screen show as black and remain visible to the audience. In short, the viewer sees a whole screen. With a spinning LED strip there is no frame at all, and where nothing is drawn there is no color, only transparency, so the viewer sees the physical objects behind the image. The naked-eye 3D effect therefore resembles a ghostly apparition, and with a moving carrier chassis carrying that image around the venue, it is as if a cartoon character had jumped into the real world to interact with real people.

The camera gimbal (4) consists of a camera (4-1), a gimbal motor (4-2), and a microphone (4-3). The camera is mounted on the gimbal motor, so when the motor rotates, the camera's viewing angle moves with it. By turning their head in VR glasses or a helmet, or by moving a handheld device or phone, the user transmits the desired gimbal angle through the computer software (5), steering the camera's viewpoint to see the real-world scene, hearing real-world sound through the microphone (4-3), and also driving the robot forward, backward, and so on.
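Two details of the control loop just described lend themselves to a brief sketch: clamping the headset-derived angles to the gimbal's mechanical travel, and using the distance sensor (1-6) reading to veto forward motion near an obstacle. The limits used here (±170° yaw, ±60° pitch, 0.4 m stop distance) are illustrative assumptions, not values from this specification.

```python
# Robot-side safety shims, assumed limits: clamp gimbal targets to the
# mechanism's travel and gate forward drive on the distance sensor.

SAFE_DISTANCE_M = 0.4  # assumed obstacle stop distance

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def gimbal_targets(yaw_deg, pitch_deg):
    """Map a headset pose to gimbal motor targets within assumed travel."""
    return clamp(yaw_deg, -170.0, 170.0), clamp(pitch_deg, -60.0, 60.0)

def gated_drive(forward, turn, front_distance_m):
    """Zero the forward component when an obstacle is inside the safe range."""
    if forward > 0 and front_distance_m < SAFE_DISTANCE_M:
        forward = 0.0
    return forward, turn
```

Reverse motion is deliberately left ungated in this sketch; a real chassis would check rear-facing sensors the same way.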

1: Carrier chassis

2: Support frame

3: Display LED fan

4: Camera gimbal

5: Computer software

Claims (7)

1. A network agent robot device that can move and can display images and sound, comprising:
   a carrier chassis;
   a support frame, supporting the LED fan and the camera gimbal;
   a display LED fan, for showing images;
   a camera gimbal, for sending images of the venue back to the user;
   a network controller, for wireless (e.g. WIFI/4G/5G...) or wired (e.g. USB/RS232...) communication with a phone, computer, or VR glasses, and for controlling the robot.

2. The carrier chassis of claim 1, comprising:
   a battery;
   a distance sensor, for sensing distances in the environment;
   a set of wheels;
   a set of drive motors, connected to the wheels to move the chassis;
   a speaker, for playing the user's voice.

3. The display LED fan of claim 1, comprising:
   an LED light strip;
   a fan motor, connected to the strip and spinning it to display images;
   an LED display controller, linked to the network controller for displaying images.

4. The camera gimbal of claim 1, comprising:
   a camera, for sending images of the physical venue back to the user;
   a microphone, for sending the sound of the physical venue back to the user;
   a gimbal motor, for rotating the camera.

5. A set of computer software, comprising:
   computer-side control software;
   robot-side control software.

6. The computer-side control software of claim 5, comprising:
   a network communication protocol, such as HTTP or 4G/5G, for communicating with the metaverse platform and the user's device (phone/VR/AR/tablet);
   a user-action command receive-and-transmit program, which receives signals from the devices the user operates on the metaverse platform — for example head-turn and head-raise signals from a VR/AR helmet, forwarded to the robot-side control software so the camera gimbal turns or tilts accordingly and changes the camera angle, or VR/AR handle signals, forwarded so the robot moves forward, backward, left, or right;
   an audio/video receive-and-transmit program, for audio and video between the user's computer and the remote robot — for example, the user's voice is picked up by the user-side microphone, sent to the remote robot-side control software, and played at the robot, while the robot-side control software returns the audio and video captured by the robot's microphone and camera to the user's device, so the user can see and hear the robot's surroundings.

7. The robot-side control software of claim 5, comprising:
   a user-action command receiving program, which receives signals from the computer-side control software — for example VR/AR helmet head-turn signals that steer the camera gimbal and change the camera angle, or VR/AR handle signals that drive the robot forward, backward, left, or right;
   an audio/video receive-and-transmit program, which receives the user's metaverse image and voice — for example, the user's voice is sent to the remote robot and played there, and the received image data is sent to the LED fan of claim 3, where, through persistence of vision, viewers at the robot see the pattern the user transmits, such as the user's metaverse avatar, a virtual character image, or a live video feed.
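The audio/video receive-and-transmit programs of claims 6 and 7 must carry mixed audio and video chunks over one connection, which requires some framing. The length-prefixed scheme below is a minimal assumed sketch; the channel numbering (0 for audio, 1 for video) is an arbitrary choice for illustration, not part of the claims.

```python
# Assumed length-prefixed framing for mixed audio/video chunks over one
# byte stream: 1-byte channel id + 4-byte big-endian length + payload.

import struct

def pack_frame(channel, payload):
    """Prefix one media chunk with its channel id and length."""
    return struct.pack(">BI", channel, len(payload)) + payload

def unpack_frames(buffer):
    """Split a received byte buffer back into (channel, payload) chunks."""
    frames, offset = [], 0
    while offset + 5 <= len(buffer):
        channel, length = struct.unpack_from(">BI", buffer, offset)
        offset += 5
        frames.append((channel, buffer[offset:offset + length]))
        offset += length
    return frames
```

A production implementation would also handle frames split across reads and add a checksum; both are omitted to keep the sketch short.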
TW111141424A (filed 2022-10-31) — The agent of metaverse go outside — published as TW202420034A (en)

Publications (1)

Publication Number Publication Date
TW202420034A (en) 2024-05-16
