CN115390657A - Method and system for performing virtual-real interaction based on mobile phone - Google Patents

Info

Publication number
CN115390657A
CN115390657A
Authority
CN
China
Prior art keywords
mobile phone
virtual
gesture
equipment
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210380104.4A
Other languages
Chinese (zh)
Inventor
陈伟桦
Current Assignee
Shanghai Shadow Creator Information Technology Co Ltd
Original Assignee
Shanghai Shadow Creator Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Shadow Creator Information Technology Co Ltd filed Critical Shanghai Shadow Creator Information Technology Co Ltd
Priority to CN202210380104.4A priority Critical patent/CN115390657A/en
Publication of CN115390657A publication Critical patent/CN115390657A/en
Pending legal-status Critical Current

Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F9/451 Execution arrangements for user interfaces
    • G06F9/542 Event management; Broadcasting; Multicasting; Notifications
    • G06F9/544 Buffers; Shared memory; Pipes
    • H04L67/141 Setup of application sessions
    • H04M1/72415 User interfaces specially adapted for cordless or mobile telephones, with support for interfacing with external accessories for remote control of appliances
    • G06F2209/545 GUI (indexing scheme relating to G06F9/54)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a method and a system for virtual-real interaction based on a mobile phone, comprising the following steps: step S1: running a 3D system in a virtual device; step S2: establishing a connection between the mobile phone and the virtual device; step S3: based on the established connection, the mobile phone sends operation instructions to the 3D system in the virtual device; step S4: the virtual device captures the pose of the mobile phone through an RGB camera and, combined with the operation instructions, realizes the interaction. By dynamically setting the APP resolution, the invention solves the compatibility problem of operating different mobile phones.

Description

Method and system for performing virtual-real interaction based on mobile phone
Technical Field
The invention relates to the technical field of virtual reality, in particular to a method and a system for performing virtual-real interaction based on a mobile phone.
Background
Current MR/VR interaction generally employs a handle (controller) or hand gestures. Neither mode delivers a truly combined virtual-and-real operating experience. The novel approach of using a mobile phone for virtual-real interaction lets the user enjoy, in the virtual world, the same feel as operating a real mobile phone.
Patent document CN106790996A (application number: 201611047520.3) discloses a mobile phone virtual reality interaction system and method. The system comprises a head-mounted device and a mobile phone whose display screen plays VR video. The mobile phone is provided with: an image recording module for recording video in real time; an image overlapping module for superimposing the recorded video on the VR video so that both play simultaneously on the phone's display screen; a motion capture module for capturing gesture operations appearing in the recorded video, recognizing them, and converting them into operation instructions; and an image control module for executing the corresponding operation on the video played on the display screen according to the operation instructions recognized by the motion capture module.
Disclosure of Invention
Aiming at the defects in the prior art, the invention aims to provide a method and a system for performing virtual-real interaction based on a mobile phone.
The method for virtual-real interaction based on a mobile phone provided by the invention comprises the following steps:
step S1: running a 3D system in the virtual device;
step S2: establishing a connection between the mobile phone and the virtual device;
step S3: based on the established connection, the mobile phone sends operation instructions to the 3D system in the virtual device;
step S4: the virtual device captures the pose of the mobile phone through an RGB camera and, combined with the operation instructions, realizes the interaction.
Preferably, step S2 comprises: establishing a Socket connection between the mobile phone and the virtual device.
Preferably, the device information of the mobile phone is sent to the virtual device over the established Socket connection;
the device information of the mobile phone includes its resolution.
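The device-information exchange over the Socket connection could be framed as in the following Python sketch. The message layout (length-prefixed JSON carrying width and height) is an assumption for illustration, not the patent's actual wire format:

```python
import json
import struct

def encode_device_info(width, height):
    """Frame the phone's device info (here just its resolution) as a
    length-prefixed JSON message, a common Socket framing scheme."""
    payload = json.dumps({"width": width, "height": height}).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_device_info(frame):
    """Inverse of encode_device_info: strip the 4-byte big-endian length
    header and parse the JSON payload on the virtual-device side."""
    (length,) = struct.unpack(">I", frame[:4])
    payload = frame[4:4 + length]
    assert len(payload) == length, "truncated frame"
    return json.loads(payload.decode("utf-8"))

# The phone sends its resolution right after the Socket is established;
# the virtual device decodes it and can size textures accordingly.
frame = encode_device_info(1080, 1920)
info = decode_device_info(frame)
```

Any framing that delimits messages unambiguously would do; the length prefix simply avoids ambiguity when several instructions share one TCP stream.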
Preferably, step S4 comprises:
step S4.1: capturing the pose of the mobile phone through the RGB camera, the pose comprising the position and rotation of the mobile phone in space, and controlling the direction of an operation ray in the 3D system based on that pose;
step S4.2: selecting, with the ray, the icon of the mobile phone APP to be opened, and, over the connection established between the mobile phone and the virtual device, synchronously applying operations on the real mobile phone screen to the 2D APP in the virtual world, thereby realizing the virtual interaction.
Preferably, when the application is opened, its resolution is automatically set to match the mobile phone's resolution, and the captured pose of the mobile phone is used so that the application's display in the virtual world fits within the displayed mobile phone screen.
Preferably, the virtual device comprises a VR virtual device, an AR virtual device, or an MR virtual device.
The invention provides a system for virtual-real interaction based on a mobile phone, comprising:
module M1: running a 3D system in the virtual device;
module M2: establishing a connection between the mobile phone and the virtual device;
module M3: based on the established connection, the mobile phone sends operation instructions to the 3D system in the virtual device;
module M4: the virtual device captures the pose of the mobile phone through an RGB camera and, combined with the operation instructions, realizes the interaction.
Preferably, module M2 comprises: establishing a Socket connection between the mobile phone and the virtual device;
sending the device information of the mobile phone to the virtual device over the established Socket connection;
the device information of the mobile phone includes its resolution.
Preferably, module M4 comprises:
module M4.1: capturing the pose of the mobile phone through the RGB camera, the pose comprising the position and rotation of the mobile phone in space, and controlling the direction of an operation ray in the 3D system based on that pose;
module M4.2: selecting, with the ray, the icon of the mobile phone APP to be opened, and, over the connection established between the mobile phone and the virtual device, synchronously applying operations on the real mobile phone screen to the 2D APP in the virtual world, thereby realizing the virtual interaction.
Preferably, when the application is opened, its resolution is automatically set to match the mobile phone's resolution, and the captured pose of the mobile phone is used so that the application's display in the virtual world fits within the displayed mobile phone screen.
Compared with the prior art, the invention has the following beneficial effects:
1. By dynamically setting the APP resolution, it solves the compatibility problem of operating different mobile phones.
2. Compared with handle interaction, almost everyone already owns a mobile phone, so the interaction can be experienced without buying or carrying an extra handle.
3. Compared with gesture interaction, the mobile phone gives the user a sense of touch, so the pleasure of operation is experienced both visually and tactilely.
4. The invention lets the user fully experience current mobile-phone applications, in particular games, on MR/VR devices.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
fig. 1 is a flowchart of a method for performing virtual-real interaction based on a mobile phone.
Fig. 2 is a flowchart of a method for performing virtual-real interaction based on a mobile phone.
Fig. 3 is a display flowchart of the virtual device.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will help those skilled in the art to further understand the invention, but do not limit it in any way. It should be noted that various changes and modifications can be made by those skilled in the art without departing from the spirit of the invention; all of these fall within the scope of the present invention.
Example 1
As shown in fig. 1 and fig. 2, the method for virtual-real interaction based on a mobile phone according to the present invention comprises:
step S1: running a 3D system in the virtual device;
step S2: establishing a Socket connection between the mobile phone and the virtual device, and sending the device information of the mobile phone (its resolution) to the virtual device over that connection;
step S3: over the established Socket connection, the mobile phone sends operation instructions to the 3D system in the virtual device;
step S4: the virtual device captures the pose of the mobile phone through an RGB camera and, combined with the operation instructions, realizes the interaction.
Specifically, step S4 comprises:
step S4.1: capturing the pose of the mobile phone through the RGB camera, the pose comprising the position and rotation of the mobile phone in space, and controlling the direction of an operation ray in the 3D system based on that pose;
step S4.2: selecting, with the ray, the icon of the mobile phone APP to be opened (e.g. WeChat), and, over the Socket connection established between the mobile phone and the virtual device, synchronously applying operations on the real mobile phone screen to the 2D APP in the virtual world, thereby realizing the virtual interaction. For example, a TP (touch panel) click or slide event is transmitted from the mobile phone to the virtual device through the Socket; after a screen-click event arrives at the virtual device, it may be resolved as a decision to open the APP.
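How the virtual device might resolve a forwarded TP event can be sketched as follows. The event names and the dispatch rule are assumptions for illustration only, not the patent's exact protocol:

```python
def handle_touch_event(event, hovered_icon):
    """Interpret a TP (touch-panel) event forwarded from the phone.
    hovered_icon is the APP icon currently selected by the operation ray,
    or None when the ray points at nothing."""
    if event["type"] == "click" and hovered_icon is not None:
        # A click while the ray hovers an icon is resolved as opening that APP.
        return f"open:{hovered_icon}"
    if event["type"] == "slide":
        # A slide is mirrored into the 2D APP as a scroll by (dx, dy) pixels.
        return f"scroll:{event['dx']},{event['dy']}"
    return "ignore"
```

Once an APP is open, the same stream of forwarded events drives the mirrored 2D APP directly, which is what keeps the real screen and the virtual screen in sync.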
Specifically, when the application is opened, its resolution is automatically set to match the mobile phone's resolution, and the captured pose of the mobile phone is used so that the application's display in the virtual world fits within the displayed mobile phone screen.
Specifically, after the application is opened and the virtual device has received the resolution, the texture generated in the virtual world is bounded by that resolution; for example, if the mobile phone's resolution is 1920 × 1080, the texture is also set to 1920 × 1080.
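Choosing the virtual-world texture size from the received resolution could look like this sketch. The clamp to a maximum dimension is an assumption (a typical GPU texture limit), not something the patent specifies:

```python
def texture_size_for(phone_w, phone_h, max_dim=4096):
    """Pick the virtual-world texture size from the phone resolution so the
    mirrored APP fills the displayed phone screen exactly. max_dim is an
    assumed GPU texture-size limit; the aspect ratio is always preserved."""
    scale = min(1.0, max_dim / max(phone_w, phone_h))
    return (int(phone_w * scale), int(phone_h * scale))
```

With a 1920 × 1080 phone this returns 1920 × 1080 unchanged, matching the example in the text; only unusually large resolutions would be scaled down.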
In particular, the virtual device comprises a VR virtual device, an AR virtual device, or an MR virtual device.
The invention provides a system for virtual-real interaction based on a mobile phone, comprising the following components:
module M1: running a 3D system in the virtual device;
module M2: establishing a Socket connection between the mobile phone and the virtual device, and sending the device information of the mobile phone (its resolution) to the virtual device over that connection;
module M3: over the established Socket connection, the mobile phone sends operation instructions to the 3D system in the virtual device;
module M4: the virtual device captures the pose of the mobile phone through an RGB camera and, combined with the operation instructions, realizes the interaction.
The invention can further improve the accuracy with which the camera captures the mobile phone, and solve the problem of loose virtual-real registration, by adopting a neural network algorithm. Specifically, when the camera of the virtual device captures a frame, a neural network algorithm identifies whether a mobile phone is present in the frame; when one is present, the pose of the mobile phone can be computed quickly.
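The detection-then-pose pipeline just described might be structured as in this hedged sketch, where `detect` and `estimate_pose` stand in for real neural-network models (the stubs below exist only to make the control flow runnable):

```python
def update_phone_pose(frame, detect, estimate_pose):
    """Run the detector first; only when a phone is found in the frame is
    the pose estimate computed. Returning None lets the caller keep the
    last known pose or hide the operation ray."""
    box = detect(frame)              # bounding box of the phone, or None
    if box is None:
        return None
    return estimate_pose(frame, box)

# Stand-in detector and pose estimator for illustration only.
detect_stub = lambda frame: (10, 20, 200, 400) if frame.get("has_phone") else None
pose_stub = lambda frame, box: {"position": (0.0, 0.0, 0.5), "box": box}
```

Gating the (more expensive) pose computation on a positive detection is what lets the per-frame cost stay low while keeping the virtual phone tightly registered to the real one.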
Specifically, module M4 comprises:
module M4.1: capturing the pose of the mobile phone through the RGB camera, the pose comprising the position and rotation of the mobile phone in space, and controlling the direction of an operation ray in the 3D system based on that pose;
module M4.2: selecting, with the ray, the icon of the mobile phone APP to be opened (e.g. WeChat), and, over the Socket connection established between the mobile phone and the virtual device, synchronously applying operations on the real mobile phone screen to the 2D APP in the virtual world, thereby realizing the virtual interaction. For example, a TP click or slide event is transmitted from the mobile phone to the virtual device through the Socket; after a screen-click event arrives at the virtual device, it may be resolved as a decision to open the APP.
Specifically, when the application is opened, its resolution is automatically set to match the mobile phone's resolution, and the captured pose of the mobile phone is used so that the application's display in the virtual world fits within the displayed mobile phone screen.
Specifically, after the application is opened and the virtual device has received the resolution, the texture generated in the virtual world is bounded by that resolution; for example, if the mobile phone's resolution is 1920 × 1080, the texture is also set to 1920 × 1080.
In particular, the virtual device comprises a VR virtual device, an AR virtual device, or an MR virtual device.
Example 2
Example 2 is a preferred instance of example 1.
As shown in fig. 3, the method for performing virtual-real interaction based on a mobile phone comprises:
Step 1: run the 3D system in the MR/VR device. The user selects and clicks an application program through the mobile phone; via the interface StartApp, the callback method StartAppCallback in the jar package imported by the 3D system is invoked, and the package name of the application program to be started is passed to the framework layer.
Step 2: to start the application program, the framework layer's StartActivity interface notifies the service ActivityManagerService (AMS) through Binder communication; ActivityManagerService interacts with Zygote through a process socket, and Zygote forks the application program's process.
Step 3: the newly forked process creates 3 threads, namely the ActivityThread, the ApplicationThread, and the W thread, the latter two being generic Binder communication classes, described as follows:
ActivityThread: the UI thread/main thread of the application program. Its main() method is the real entry point of the application program and manages the execution of the forked process's main thread (equivalent in function to the main entry function of an ordinary Java program). Here, through the IApplicationThread interface, AMS acts as client and ActivityThread/ApplicationThread as server; AMS is responsible for scheduling and executing Activities, broadcasts, and other operations.
ApplicationThread: created before the main Activity, it is responsible for listening for requests sent by AMS to create an Activity. By interacting with AMS and using AMS's start/exit records for each application program, the ApplicationThread determines whether the application program is displayed.
W thread: after the Activity is created, if a new instruction is generated, WMS receives the instruction data and feeds it back to the W thread; the W thread passes the captured information to the DecorView; if the DecorView does not handle it, it is passed to the PhoneWindow; and if the PhoneWindow does not handle it, it is passed to the Activity, which processes the message through a Handler.
Step 4: the W thread transmits the application program data to WindowManagerService through Binder communication, and WindowManagerService, after receiving the data, transmits it to SurfaceFlinger through Binder communication.
Step 5: SurfaceFlinger resolves the position and size of the application program:
step 5.1: the spatial position of the application program is obtained by capturing the pose of the mobile phone, and a 4x4 spatial matrix covering the XYZ axes of the application program is computed; the matrix contains the application program's world-space position and rotation angle.
step 5.2: the size of the application program window is set dynamically according to the resolution transmitted by the mobile phone.
step 5.3: the two matrices and the layerId are bound together, encapsulated as data, and transmitted to the 3D system through Binder communication.
Step 6: a C# component script of the 3D system creates a mapped Texture from the application program's data buffer according to the layerId, through the interface CreateExternalTexture; the position of the generated map is then adjusted according to the position and size information carried in the two matrices passed in.
Step 7: through the Launcher.cs script, the interface SetTriangles notifies the system desktop of the generated frame map, submits the frame map to SurfaceFlinger, notifies SurfaceFlinger to draw each frame, and starts the rendering and display of the application program nested in the system desktop.
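The 4x4 spatial matrix described in step 5.1 could, under assumed conventions, be assembled as in this Python sketch. For brevity only a rotation about the Y axis is shown; a full implementation would compose rotations about all three axes:

```python
import math

def pose_matrix(position, yaw_deg):
    """Build a 4x4 world matrix from the captured phone pose: a rotation
    (here yaw about +Y only, for brevity) with the world-space position in
    the last column, using row-vector-on-the-right convention."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    x, y, z = position
    return [
        [  c, 0.0,   s,   x],
        [0.0, 1.0, 0.0,   y],
        [ -s, 0.0,   c,   z],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Example: an APP window 0.5 m in front of the user, rotated 90 degrees.
m = pose_matrix((1.0, 2.0, 3.0), 90.0)
```

Binding such a matrix together with the window size and the layerId, as step 5.3 describes, is enough for the 3D system to place and scale the mapped texture.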
As is known to those skilled in the art, besides implementing the system, apparatus, and their modules provided by the present invention in pure computer-readable program code, the method steps can equally be implemented through logic programming in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, the system, apparatus, and modules provided by the present invention may be regarded as hardware components; the modules they include for realizing various programs may also be regarded as structures within the hardware component, and modules for realizing various functions may be regarded simultaneously as software programs implementing the method and as structures within the hardware component.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.

Claims (10)

1. A method for virtual-real interaction based on a mobile phone, characterized by comprising:
step S1: running a 3D system in a virtual device;
step S2: establishing a connection between the mobile phone and the virtual device;
step S3: based on the established connection, the mobile phone sends operation instructions to the 3D system in the virtual device;
step S4: the virtual device captures the pose of the mobile phone through an RGB camera and, combined with the operation instructions, realizes the interaction.
2. The method for virtual-real interaction based on a mobile phone of claim 1, wherein step S2 comprises: establishing a Socket connection between the mobile phone and the virtual device.
3. The method for virtual-real interaction based on a mobile phone of claim 1, wherein the device information of the mobile phone is sent to the virtual device over a Socket connection established between the mobile phone and the virtual device;
the device information of the mobile phone includes its resolution.
4. The method for virtual-real interaction based on a mobile phone of claim 1, wherein step S4 comprises:
step S4.1: capturing the pose of the mobile phone through the RGB camera, the pose comprising the position and rotation of the mobile phone in space, and controlling the direction of an operation ray in the 3D system based on that pose;
step S4.2: selecting, with the ray, the icon of the mobile phone APP to be opened, and, over the connection established between the mobile phone and the virtual device, synchronously applying operations on the real mobile phone screen to the 2D APP in the virtual world, thereby realizing the virtual interaction.
5. The method for virtual-real interaction based on a mobile phone of claim 4, wherein, when the application is opened, its resolution is automatically set to match the mobile phone's resolution, and the captured pose of the mobile phone is used so that the application's display in the virtual world fits within the displayed mobile phone screen.
6. The method for virtual-real interaction based on a mobile phone of claim 1, wherein the virtual device comprises a VR virtual device, an AR virtual device, or an MR virtual device.
7. A system for virtual-real interaction based on a mobile phone, characterized by comprising:
module M1: running a 3D system in the virtual device;
module M2: establishing a connection between the mobile phone and the virtual device;
module M3: based on the established connection, the mobile phone sends operation instructions to the 3D system in the virtual device;
module M4: the virtual device captures the pose of the mobile phone through an RGB camera and, combined with the operation instructions, realizes the interaction.
8. The system for virtual-real interaction based on a mobile phone of claim 7, wherein module M2 comprises: establishing a Socket connection between the mobile phone and the virtual device;
sending the device information of the mobile phone to the virtual device over the established Socket connection;
the device information of the mobile phone including its resolution.
9. The system for virtual-real interaction based on a mobile phone of claim 7, wherein module M4 comprises:
module M4.1: capturing the pose of the mobile phone through the RGB camera, the pose comprising the position and rotation of the mobile phone in space, and controlling the direction of an operation ray in the 3D system based on that pose;
module M4.2: selecting, with the ray, the icon of the mobile phone APP to be opened, and, over the connection established between the mobile phone and the virtual device, synchronously applying operations on the real mobile phone screen to the 2D APP in the virtual world, thereby realizing the virtual interaction.
10. The system for virtual-real interaction based on a mobile phone of claim 7, wherein, when the application is opened, its resolution is automatically set to match the mobile phone's resolution, and the captured pose of the mobile phone is used so that the application's display in the virtual world fits within the displayed mobile phone screen.
CN202210380104.4A 2022-04-12 2022-04-12 Method and system for performing virtual-real interaction based on mobile phone Pending CN115390657A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210380104.4A CN115390657A (en) 2022-04-12 2022-04-12 Method and system for performing virtual-real interaction based on mobile phone

Publications (1)

Publication Number: CN115390657A; Publication Date: 2022-11-25

Family

ID=84115794



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination