CN117289783A - Interaction method and device based on virtual reality - Google Patents


Info

Publication number
CN117289783A
CN117289783A (application number CN202210686910.4A)
Authority
CN
China
Prior art keywords
user
interaction
equipment
interaction space
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210686910.4A
Other languages
Chinese (zh)
Inventor
吴雨涵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202210686910.4A priority Critical patent/CN117289783A/en
Publication of CN117289783A publication Critical patent/CN117289783A/en
Pending legal-status Critical Current

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The disclosure relates to an interaction method and device based on virtual reality, wherein the method comprises the following steps: in response to an operation performed by a first user on a VR device, displaying a picture of an interaction space through the VR device; and, in response to operations performed by the first user on objects in the picture of the interaction space through the VR device, receiving operation data of a second user on objects in the interaction space and performing virtual reality creation to obtain a VR work. When VR authoring is carried out on VR devices by the method of this disclosure, multiple users can enter the same interaction space, experience the same virtual 3D authoring environment, author immersively together, and communicate and interact with one another, breaking the limitation of current single-user authoring and effectively improving authoring efficiency.

Description

Interaction method and device based on virtual reality
Technical Field
The disclosure relates to the technical field of virtual reality, in particular to a virtual reality interaction method and device.
Background
Virtual Reality (VR) technology is a computer simulation system that can create, and let users experience, a virtual world: a computer is used to generate a simulated environment that closely approximates the real world, and the user has an immersive feel within that environment.
Currently, users are able to create, edit, and test 3D models in VR scenes through authoring-class VR applications. However, these existing VR applications have complex operating procedures and fail to meet users' increasing authoring demands, resulting in poor user experience.
Disclosure of Invention
In order to solve the technical problems, the present disclosure provides an interaction method and device based on virtual reality.
In a first aspect, the present disclosure provides an interaction method for virtual reality, including:
responding to the operation of a first user on VR equipment, and displaying a picture of an interaction space through the VR equipment;
responding to the operation of the first user on the object in the picture of the interaction space through the VR equipment, and receiving the operation data of the second user on the object in the interaction space, and performing VR creation to obtain VR works.
As a possible implementation manner, the responding to the operation of the first user on the virtual reality VR device, displaying, by the VR device, a picture of the interaction space includes:
displaying, by the VR device, a virtual picture, the virtual picture including a create button;
responding to the operation of the first user on the creation button through the VR equipment, generating a picture of an interaction space, and displaying the picture of the interaction space through the VR equipment;
and generating an identifier corresponding to the interaction space, wherein the second user can enable the VR equipment corresponding to the second user to display the picture of the interaction space through the identifier corresponding to the interaction space.
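The space-creation flow above — generating a picture of an interaction space along with an identifier that a second user's VR device can use to display the same space — can be sketched as follows. This is a minimal illustration only; the patent does not specify an implementation, and all names (`SpaceRegistry`, `create_space`, `join_space`) are hypothetical:

```python
import secrets

class SpaceRegistry:
    """Server-side registry mapping identifiers to interaction spaces."""
    def __init__(self):
        self.spaces = {}

    def create_space(self, creator_id):
        # Generate a digital identifier; the second user can enter it
        # on their own VR device to display the same interaction space.
        space_id = secrets.token_hex(4)
        self.spaces[space_id] = {"creator": creator_id, "members": {creator_id}}
        return space_id

    def join_space(self, space_id, user_id):
        space = self.spaces.get(space_id)
        if space is None:
            return False  # unknown identifier
        space["members"].add(user_id)
        return True
```

A QR-code or bar-code identifier (also mentioned in the description) would simply be an alternative encoding of the same `space_id`.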
In some possible embodiments, the responding to the operation of the first user on the object in the picture of the interaction space by the VR device, and receiving operation data of the second user on the object in the interaction space, performing VR authoring to obtain a VR work, includes:
displaying a design tool in a picture of the interaction space; wherein the design tool comprises: one or more of a painting tool, a prototype interaction tool, a modeling tool, and an industrial-scale design tool;
and responding to the operation of the first user on the design tool through the VR equipment, and performing VR authoring.
In some possible embodiments, the method further comprises:
in the authoring process, responding to triggering operation of the first user on a target virtual portal displayed in a picture of the interaction space through the VR equipment, displaying a virtual window in the interaction space, and displaying a multimedia resource corresponding to the target virtual portal through the virtual window.
In some possible embodiments, the method further comprises:
responding to an authority setting instruction input by the first user through the VR equipment, and setting the interaction authority of the second user in the interaction space;
the interaction permission is used for indicating permission of a user to edit the VR work being authored.
In some possible embodiments, the method further comprises:
and responding to a sharing instruction input by the first user through the VR equipment, and sending the VR works to a server so as to share the VR works to a client corresponding to the appointed user through the server.
In some possible embodiments, the method further comprises:
and responding to a first uploading instruction input by the first user through the VR equipment, and uploading the VR work to a target display platform so as to display the VR work in the target display platform.
In some possible embodiments, the method further comprises:
and responding to a second uploading instruction input by the first user through the VR equipment, and uploading the VR work to a template resource library in a server so that other users can download the VR work through the template resource library.
In a second aspect, the present disclosure provides a virtual reality-based interaction device, comprising:
the display module is used for responding to the operation of the first user on the VR equipment and displaying the picture of the interaction space through the VR equipment;
and the design module is used for responding to the operation of the first user on the object in the picture of the interaction space through the VR equipment, receiving the operation data of the second user on the object in the interaction space, and carrying out VR creation to obtain VR works.
In a third aspect, the present disclosure provides an electronic device comprising: a memory and a processor;
the memory is configured to store computer program instructions;
the processor is configured to execute the computer program instructions to cause the electronic device to implement the virtual reality based interaction method of any of the first aspect and the first aspect.
In a fourth aspect, the present disclosure provides a readable storage medium comprising: computer program instructions;
the computer program instructions being executable by an electronic device to cause the electronic device to implement the virtual reality based interaction method of any of the first aspect and the first aspect.
In a fifth aspect, the present disclosure provides a computer program product, which when executed by an electronic device, causes the electronic device to implement the virtual reality based interaction method of any of the first aspect and the first aspect.
The embodiment of the disclosure provides an interaction method and device based on virtual reality, wherein the method comprises the following steps: responding to the operation of the first user on the VR device, and displaying a picture of the interaction space through the VR device; responding to the operation of the first user on the object in the picture of the interaction space through the VR device, receiving the operation data of the second user on the object in the interaction space, and performing VR creation to obtain a VR work. When VR authoring is carried out in the VR device by adopting the method of the disclosure, multiple users can enter the same interaction space, experience the same virtual 3D stereoscopic authoring environment, carry out immersive authoring jointly, and communicate and interact with one another, breaking the limitation of current single-user authoring and effectively improving authoring efficiency.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the embodiments of the present disclosure or the solutions in the prior art, the drawings that are required for the description of the embodiments or the prior art will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is an application scenario schematic diagram of an interaction method based on virtual reality provided in an embodiment of the present disclosure;
fig. 2 is a schematic diagram of an overall functional architecture of a VR application provided by the present disclosure;
FIG. 3 is an overall framework diagram of interactions with the VR application shown in FIG. 2;
fig. 4 is a flowchart of an interaction method based on virtual reality provided in an embodiment of the present disclosure;
fig. 5 is a flowchart of an interaction method based on virtual reality provided in an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an interaction device based on virtual reality provided in the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device provided in the present disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, a further description of aspects of the present disclosure will be provided below. It should be noted that, without conflict, the embodiments of the present disclosure and features in the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced otherwise than as described herein; it will be apparent that the embodiments in the specification are only some, but not all, embodiments of the disclosure.
Currently, there are several VR applications, such as ShapesXR, MultiBrush, Gravity Sketch, SculptR XR, Tilt Brush, etc., which enable a user to create a 3D model in the VR application, or to import 3D materials into the application for editing and adjustment, and then transmit the edited VR content to a PC end by means of a cable or Air Link and perform secondary creation at the PC end using 3D drawing software or a game engine.
However, current VR applications support only a single-user authoring mode; in scenarios requiring multiple users to cooperatively participate in authoring, the authoring process becomes more complex, resulting in lower authoring efficiency.
In addition, when authoring in current VR applications, 3D material collections must be uploaded from the VR device to the PC end through a designated website; the user often needs to repeatedly put on and take off the VR device and switch back and forth between the VR device and the PC end, which easily interrupts the authoring flow and increases the user's authoring cost. Moreover, current VR applications cannot be combined with more scenarios, and their application scenarios are limited.
In order to solve the above problems, the present disclosure provides an interaction method and apparatus based on virtual reality, which can implement VR authoring through multi-user collaboration. During authoring, the user does not need to repeatedly put on and take off the VR device, which improves design and authoring efficiency; the method also opens up links among multiple scenarios such as multi-user collaboration, design and authoring, manufacturing and construction, exhibition, and resource integration, so as to improve the processing efficiency of the whole chain.
Fig. 1 is an application scenario schematic diagram of an interaction method based on virtual reality provided in the present disclosure. Referring to fig. 1, the scenario includes: a plurality of VR devices 101 and a server 102, the plurality of VR devices 101 may interact through the server 102.
VR device 101 may include a VR head display device (i.e., a head-mounted display, such as a VR helmet or VR glasses) and a handheld device. The user may view the virtual picture displayed by the VR head display device by wearing it, and may operate objects in the virtual picture by operating the handheld device (e.g., pressing a button on the handheld device, moving the handheld device, rotating the handheld device, etc.).
A client (i.e., a VR application) may be installed in the VR device 101, and the client may control the VR device 101 to perform corresponding operations, such as operating on objects in the virtual picture to build a 3D model, editing a created 3D model, and so on.
The server 102 may be a stand-alone server, a cloud server, a server cluster, a cloud service cluster, a service platform, etc., and the disclosure is not limited to the type of server 102. In the present disclosure, multiple VR devices 101 may interact through the server 102, e.g., upload virtual resources to the server 102, download virtual resources from the server 102, and so forth.
In this scenario, multiple VR devices 101 may each launch a respectively installed VR application and enter the same interaction space, within which multiple users may jointly participate in VR authoring. Entering the same interaction space can be understood as the multiple VR devices 101 displaying a shared 3D virtual picture: when any user operates an object in the virtual picture through a VR device, that device updates the picture based on the operation data, and the updated picture is synchronized through the server 102 to the VR devices 101 corresponding to the other users, so that the virtual pictures displayed by the multiple VR devices 101 stay synchronized. During authoring, multiple users can also conduct a multi-person session through the VR devices; the session modes may include, but are not limited to: text chat, voice call, emoticons, sharing the current screen of a user participating in the creation (such as a mobile phone end or a PC end), and so on.
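The synchronization scheme above — any user's operation data updates the shared picture, and the server relays the update to every registered device — can be sketched as a minimal relay model. This is an illustrative assumption, not the patented implementation; all names (`SyncServer`, `apply_operation`) are hypothetical:

```python
class SyncServer:
    """Relays operation data so all VR devices display the same picture."""
    def __init__(self):
        self.scene = {}     # authoritative shared 3D scene state
        self.devices = {}   # device_id -> that device's local copy

    def register(self, device_id):
        # A newly joined device receives a copy of the current scene.
        self.devices[device_id] = dict(self.scene)

    def apply_operation(self, device_id, object_id, new_state):
        # Update the shared scene from one user's operation data...
        self.scene[object_id] = new_state
        # ...then synchronize every device, including the sender's.
        for dev in self.devices:
            self.devices[dev][object_id] = new_state
```

A real system would push deltas over a network transport rather than mutating in-process dictionaries, but the relay topology is the same.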
According to the user's operation in the VR device 101, the VR device 101 may also upload the authored VR work to the template resource library of the server 102 to be published as a template resource. Other users may download template assets from the template asset library and use the template assets, such as secondary authoring of 3D models in the template assets.
According to the user's operation in the VR device 101, the VR device 101 may also share the VR work to the client corresponding to the specified user through the server 102. The client corresponding to the specified user may be a mobile terminal or a PC terminal, which is not limited in this disclosure.
For example, as shown in fig. 1, assuming that the scenario further includes a personal computer PC103 with a corresponding client installed, a user may browse VR works shared by other users through the client in the PC103, and may also edit those VR works within the permitted scope. It will be appreciated that the picture the user sees through the PC103 may be 3D, but the user cannot perceive it as an immersive 3D environment. By supporting multi-terminal sharing and collaborative authoring, the requirement that users jointly participate in VR authoring in different scenarios can be met.
In addition, according to the operation of the user in the VR device 101, the VR work can be released to other platforms for display, or be browsed and used by the user of other platforms. For example, in the embodiment shown in fig. 1, a server 104 may be further included, where the server 104 is a server corresponding to the NFT platform, and the user may upload the VR work to the server 102, and then the server 102 sends the VR work to the server 104, so as to issue the VR work to the NFT platform for display.
The method provided by the present disclosure will be described in detail below with reference to several specific embodiments in conjunction with the accompanying drawings and the scenario.
Fig. 2 is a schematic diagram of an overall functional architecture of a VR application provided by the present disclosure. Referring to fig. 2, the VR application provided in the present disclosure may at least provide the following functions: communication function, design function, linkage function, cross-platform interaction function and resource integration function.
Communication function: multiple users may collaborate to complete VR authoring, and these users may come from different roles, for example: design, product, development, test, market operation, algorithm, electronics, architecture, and so forth.
Design function: a relatively powerful design capability is realized by integrating multiple types of design tools, including: the drawing tool MultiBrush, the prototype interaction tool ShapesXR, the modeling tool Gravity Sketch, the industrial-grade tool Autodesk VRED, and so forth.
Linkage function: the system can be linked with users in different scenes, such as production and manufacturing with users of a manufacturing end and simulation test with users of a testing end, wherein the manufacturing end can be the manufacturing end of any industry, and can be, but is not limited to, the automobile industry, the home manufacturing industry, the machine production and manufacturing industry, the electronic product manufacturing industry and the like.
Cross-platform interaction function: the VR application system can be used for realizing release of VR works authored by a user in other platforms, for example, the VR applications can interact with the NFT platform, and the authored VR works are released to the NFT platform for display.
Resource integration function: supports a user in publishing an authored VR work as a template resource. After being published as template resources, VR works can be presented to users of the VR application in a variety of forms, such as by work source, creator, collection, work type, presentation space, etc., so that users can download these template resources for use in their own creation.
Fig. 3 is an overall framework diagram of interactions with the VR application shown in fig. 2. Referring to fig. 3, a VR application may provide design tools and template resources, and a user may invoke the design tools and/or the template resources to create, where a mode of creation may be a plurality of users participating in, discussing, designing, editing to produce a VR work, which may be any aspect of visual design, industrial design, indoor design, architectural design, automotive design, etc. After creation is completed, cross-platform release can be realized for display, and linkage manufacturing production, construction and the like can be realized. And the template resource can be uploaded to a server corresponding to the VR application to be released as the template resource for other users to use.
Fig. 4 is a flowchart of an interaction method based on virtual reality according to an embodiment of the disclosure. Referring to fig. 4, the method of the present embodiment includes:
s401, responding to the operation of the first user on the VR device, and displaying a picture of the interaction space through the VR device.
The interaction space is a 3D virtual space for VR creation. The first user can wear the VR equipment, starts the client side installed in the VR equipment, operates the VR equipment and enters the interaction space, namely, the first user is shown a 3D interaction space picture through the VR equipment.
The interaction space may be created by the first user, or the first user may enter an interaction space created by another user by operating the VR device to input the identifier of that interaction space; the present disclosure is not limited in this regard. The identifier of the interaction space may be, but is not limited to: a digital identifier, an identification pattern (e.g., a two-dimensional code pattern, a bar code pattern, etc.), and the like. For example, if the interaction space is created by the second user, the first user can input the digital identifier corresponding to the interaction space into the corresponding area of the virtual picture displayed by the VR head display device by operating the handheld device, so that the VR head display device displays the picture of the interaction space. The present disclosure does not limit the implementation by which the first user enters the interaction space.
The present disclosure does not limit display parameters and 3D effects of a picture of an interaction space, for example, the picture of the interaction space may take a solid color (e.g., white) background, an object included in the picture of the interaction space may be in a 3D form, and the object may include a virtual 3D icon set, a virtual control (e.g., a color setting control), a virtual portal (e.g., a portal into an associated setting panel of a 3D model), and so on, which correspond to a design tool.
S402, responding to the operation of a first user on the object in the picture of the interaction space through the VR equipment, receiving the operation data of a second user on the object in the interaction space, and performing virtual reality creation to obtain the VR work.
In a scenario of multi-person collaboration, any user in the interaction space may edit the virtual resource currently being edited, i.e., the first user and the second user may edit the VR work currently being edited in the interaction space.
The first user can input triggering operation for the object in the picture of the interaction space through operating the handheld device, so that a design tool provided by the VR application is called, and authoring is performed in the VR application. In the authoring process, the VR head display device worn by the first user continuously updates the picture of the interaction space along with the operation of the user. The operation data corresponding to the first user (where the operation data corresponding to the first user may also be understood as data of an updated picture of the interaction space obtained based on the operation of the first user) may also be sent to the VR device worn by the second user through the server, and displayed in the VR device worn by the second user.
The number of the second users can be multiple, the second users can operate the objects in the pictures of the interaction space in a similar mode, so that a design tool provided by the VR application is called to conduct VR creation, and operation data corresponding to the second users can be sent to VR equipment worn by the first users through the server and displayed in the VR equipment worn by the first users.
By the method, virtual pictures viewed by a plurality of users entering the same interaction space can be synchronous, and then the plurality of users can participate in VR creation together.
The method provided by the embodiment can realize that the user carries out VR authoring in the VR application through the VR equipment in a multi-person cooperation mode, and breaks the limit of the current VR application on authoring, thereby improving VR authoring efficiency.
Optionally, in the embodiment shown in fig. 4, the first user and the second user may also initiate a multi-person session in the process of participating in VR authoring together, where the session mode may be referred to in the foregoing, and the present disclosure is not limited to the session mode.
In some possible implementations, after the first user and the second user enter the same interaction space, a virtual button to initiate a multi-person session may be included in a screen of the interaction space, and any user in the interaction space may operate the virtual button through the VR device to initiate the multi-person session. After receiving the session request, the other users may click a corresponding button in the virtual screen, such as a "listen" button, to join the multi-person session.
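The request/accept flow in this implementation — one user operates a virtual button to initiate a multi-person session, and others click a "listen" button to join — can be sketched as follows. This is a minimal illustration under assumed names (`MultiPersonSession`, `listen`); the patent does not specify the session protocol:

```python
class MultiPersonSession:
    """Session initiated from a virtual button; others join via 'listen'."""
    def __init__(self, initiator):
        self.participants = {initiator}
        self.invited = set()

    def invite(self, users):
        # The session request is issued to other users in the space.
        self.invited.update(users)

    def listen(self, user):
        # Clicking the 'listen' button accepts the session request.
        if user in self.invited:
            self.invited.discard(user)
            self.participants.add(user)
            return True
        return False  # no pending request for this user
```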
In another possible implementation, when the first user and the second user enter the interaction space, the function of the multi-user session may be started by default, and the session frame and corresponding buttons of the session function (such as a button for initiating multi-user voice, a mute button, etc.) are displayed in a specific area of the screen of the interaction space. The user can use the session functions provided by the VR application without any action.
The VR application provides a multi-user session function, so that communication among a plurality of users participating in VR creation is facilitated, and processing efficiency is improved.
Optionally, on the basis of the embodiment shown in fig. 4, in the authoring process, sharing presentation of the multimedia resources may also be performed in the screen of the interaction space. The multimedia resources can be integrated into a network link at the PC end in advance, and the network link and the multimedia resources are uploaded to the server. The present disclosure is not limited to multimedia resources, and may be any type of image, video, text, etc.
When a user carries out VR creation through VR equipment, a virtual entry corresponding to a network link can be displayed in a picture of an interaction space, the user uploading the network link can obtain corresponding multimedia resources from a server through triggering operation of the virtual entry, and the multimedia resources are displayed in the interaction space through a virtual window. And before the multimedia resource is displayed through the virtual window, prompt information of whether to share the multimedia resource to other users in the interactive space can be displayed, after the users confirm to share the multimedia resource through the VR equipment, the multimedia resource is issued to the other users in the interactive space through the server, and the multimedia resource is displayed in the VR equipment worn by the other users.
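The portal flow above — a virtual entry bound to a pre-uploaded network link, fetched from the server on trigger and optionally shared to the other users in the space — can be sketched as a small function. All names are hypothetical and the link value is illustrative; this is not the patented implementation:

```python
def open_portal(link, resources, share, uploader, members):
    """Fetch the multimedia resource behind a virtual entry and return
    a mapping of which users get a virtual window showing it."""
    resource = resources.get(link)
    if resource is None:
        return {}  # no resource registered for this link
    # Prompt result: share to everyone in the space, or only the uploader.
    viewers = set(members) if share else {uploader}
    return {member: resource for member in viewers}
```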
The VR application meets the interaction requirement of the user when VR authoring in a multi-person collaboration mode by providing the user with the function of sharing multimedia resources in the interaction space.
Optionally, based on the embodiment shown in fig. 4, the user who creates the interaction space may also set the authoring rights of other users, where the authoring rights are used to indicate the rights of the user to edit the VR work being authored. For example, it may be set that only one user is supported for editing at the same time, so as to avoid confusion in the editing process; alternatively, it may be set that only the user who creates the interaction space can edit the virtual resource, and other users cannot edit it. Of course, the authoring rights can also be set to other modes, which are not limited by this disclosure.
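The two permission policies mentioned above — only one user may edit at a time, or only the creator of the interaction space may edit — can be sketched as follows. This is a minimal illustration with hypothetical names (`EditPermission`, `acquire`); the disclosure explicitly allows other permission modes:

```python
class EditPermission:
    """Authoring-permission policies for a VR work being authored."""
    def __init__(self, creator, policy="single_editor"):
        self.creator = creator
        self.policy = policy          # "single_editor" or "creator_only"
        self.current_editor = None

    def can_edit(self, user):
        if self.policy == "creator_only":
            return user == self.creator
        # single_editor: only one user may hold the edit lock at a time,
        # avoiding confusion in the editing process.
        return self.current_editor in (None, user)

    def acquire(self, user):
        if self.can_edit(user):
            self.current_editor = user
            return True
        return False

    def release(self, user):
        if self.current_editor == user:
            self.current_editor = None
```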
By providing users with the function of setting authoring permissions, the VR application satisfies users' personalized settings and meets their need to protect the VR works they author.
Fig. 5 is a flowchart of an interaction method based on virtual reality according to an embodiment of the disclosure. Referring to fig. 5, the method of the present embodiment includes:
s501, responding to the operation of the first user on the VR device, and displaying a picture of the interaction space through the VR device.
S502, responding to the operation of a first user on the object in the picture of the interaction space through the VR equipment, receiving the operation data of a second user on the object in the interaction space, and performing virtual reality creation to obtain the VR work.
Steps S501 to S502 in this embodiment are similar to steps S401 to S402 in the embodiment shown in fig. 4; reference may be made to the detailed description of the embodiment shown in fig. 4, which is not repeated here for the sake of brevity.
On this basis, the embodiment may further include the following steps:
s503, responding to a sharing instruction input by the first user through the VR equipment, and sending the VR works to the server so as to share the VR works to the client corresponding to the appointed user through the server.
After the VR works are authored, sharing buttons can be included in the virtual pictures displayed by the VR equipment, a first user can operate the sharing buttons in the virtual pictures through the VR equipment to generate sharing instructions, the VR equipment responds to the sharing instructions and sends the VR works and the identification information of the client corresponding to the appointed user to the server, and the server shares the VR works to the client corresponding to the appointed user. The client corresponding to the designated user may be a VR end or a PC end, which is not limited in this disclosure.
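The sharing flow above — the VR device sends the VR work together with the identification information of the specified users' clients to the server, which delivers it — can be sketched as follows. This is an illustrative assumption (`share_work` and the client identifiers are hypothetical), not the patented implementation:

```python
def share_work(server_inbox, work, target_client_ids):
    """Deliver a VR work to each specified user's client (VR or PC end)
    via the server's per-client inbox."""
    for client_id in target_client_ids:
        server_inbox.setdefault(client_id, []).append(work)
    return server_inbox
```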
The VR application can meet the sharing requirement of the user by supporting multi-terminal interaction.
S504, responding to a first uploading instruction input by a first user through the VR equipment, and uploading the VR work to a target display platform so as to display the VR work in the target display platform.
The VR application can support the user to share the created VR works to other platforms and display the created VR works in the other platforms so as to meet the requirement that the user wants to share the VR works across the platforms.
As one possible implementation manner, after the VR work is authored, an upload button may be included in a virtual screen displayed by the VR device, a first user operates the upload button through the VR device, the VR device responds to the operation of the upload button, one or more candidate platforms that can be shared may be displayed, the user selects a target display platform from the candidate platforms through the operation of the VR device, and the VR device responds to the selection operation of the user on the target display platform, so as to generate a first upload instruction. Responding to the first uploading instruction, sending information of the VR work and the target display platform to a server, and then sending the VR work to a server corresponding to the target display platform by the server so as to be released to the target display platform.
The target display platform may be, but is not limited to, an NFT platform. After the VR work is sent to the server of the NFT platform, that server may generate a non-fungible token for the VR work as its permanent identifier in the NFT platform; the VR work can then be displayed in the NFT platform for its users to browse, and so on.
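One simple way such a permanent identifier could be derived is a content hash of the work. The disclosure does not specify the token scheme, so the hashing approach below is purely an assumption:

```python
# Hypothetical sketch: the NFT platform's server derives a unique,
# permanent identifier for a received VR work by hashing its content.
# The SHA-256 scheme is an assumption; the disclosure does not specify it.
import hashlib


def mint_token(work_bytes: bytes) -> str:
    # A content hash is stable and unique per work, so it can serve as a
    # permanent identifier of the work within the platform.
    return hashlib.sha256(work_bytes).hexdigest()


token = mint_token(b"vr-work-binary-data")
```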
S505: in response to a second upload instruction input by the first user through the VR device, upload the VR work to a template resource library in the server, so that other users can download the VR work through the template resource library.
As one possible implementation, after the VR work is authored, the virtual picture displayed by the VR device may include a template release button. The first user operates the template release button through the VR device, and the VR device generates the second upload instruction in response to this operation. In response to the second upload instruction, the VR device sends the VR work, together with indication information indicating that the VR work is released as a template resource, to the server. After receiving the VR work and the indication information, the server stores the VR work in the template resource library; other users can then download the VR work when authoring through the VR application and further edit and design it within the permission range allowed by the user who released it.
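The template-repository behavior above — store the work, and constrain later edits to the publisher's allowed range — can be sketched as follows. The edit-kind names and the permission representation are hypothetical:

```python
# Hypothetical sketch of the template resource library: a work published
# as a template is stored with the publisher's allowed-edit permissions,
# and other users may edit it only within that range. All names assumed.
from dataclasses import dataclass


@dataclass(frozen=True)
class TemplateEntry:
    work_id: str
    publisher: str
    allowed_edits: frozenset  # e.g. {"recolor"}; edit kinds are illustrative


class TemplateRepository:
    def __init__(self) -> None:
        self._entries: dict[str, TemplateEntry] = {}

    def publish(self, entry: TemplateEntry) -> None:
        self._entries[entry.work_id] = entry

    def download(self, work_id: str) -> TemplateEntry:
        return self._entries[work_id]

    def may_edit(self, work_id: str, edit_kind: str) -> bool:
        # Editing is permitted only within the publisher's allowed range.
        return edit_kind in self._entries[work_id].allowed_edits


repo = TemplateRepository()
repo.publish(TemplateEntry("work-1", "userA", frozenset({"recolor"})))
```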
Exemplarily, the present disclosure further provides a virtual reality-based interaction apparatus.
Fig. 6 is a schematic structural diagram of an interaction device based on virtual reality according to an embodiment of the disclosure. Referring to fig. 6, an apparatus 600 provided in this embodiment includes:
The display module 601 is configured to display a picture of an interaction space through the VR device in response to an operation of the first user on the VR device.
The design module 602 is configured to perform VR authoring to obtain a VR work, in response to an operation of the first user, through the VR device, on an object in the picture of the interaction space, and according to received operation data of a second user on an object in the interaction space.
In some possible implementations, the display module 601 is specifically configured to: display a virtual picture through the VR device, the virtual picture including a create button; in response to an operation of the first user on the create button through the VR device, generate a picture of the interaction space and display it through the VR device; and generate an identifier corresponding to the interaction space, whereby the second user can, through this identifier, cause the VR device corresponding to the second user to display the picture of the interaction space.
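The create-and-join mechanism above — creating a space yields an identifier, and a second user who supplies that identifier enters the same space — can be sketched as follows. The random-token identifier scheme is an assumption; the disclosure does not specify how the identifier is generated:

```python
# Hypothetical sketch of the create/join flow: creating an interaction
# space generates an identifier; a second user who supplies that
# identifier joins the same space (and so sees the same picture).
import secrets


class InteractionSpaceRegistry:
    def __init__(self) -> None:
        self._spaces: dict[str, set[str]] = {}  # space id -> member users

    def create(self, creator: str) -> str:
        space_id = secrets.token_hex(4)  # identifier shown to the creator
        self._spaces[space_id] = {creator}
        return space_id

    def join(self, space_id: str, user: str) -> bool:
        # A second user enters the same space through its identifier.
        if space_id not in self._spaces:
            return False
        self._spaces[space_id].add(user)
        return True


registry = InteractionSpaceRegistry()
sid = registry.create("first_user")
joined = registry.join(sid, "second_user")
```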
In some possible implementations, the display module 601 is further configured to display a design tool in the picture of the interaction space, where the design tool includes one or more of: a drawing tool, a prototype interaction tool, a modeling tool, and an industrial design tool.
The design module 602 is specifically configured to perform VR authoring in response to an operation of the first user on the design tool through the VR device.
In some possible implementations, the display module 601 is further configured to, during the authoring process and in response to a trigger operation of the first user on a target virtual portal displayed in the picture of the interaction space through the VR device, display a virtual window in the interaction space, where the virtual window is used to display the multimedia resource corresponding to the target virtual portal.
In some possible embodiments, the apparatus 600 further includes a permission setting module 603 configured to set the interaction permission of the second user in the interaction space in response to a permission setting instruction input by the first user through the VR device. The interaction permission indicates whether a user is permitted to edit the VR work being authored.
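The permission check this module implies — the first user grants or withholds edit rights, and edit operations are accepted only when permitted — can be sketched as follows. The owner-only restriction and boolean permission model are assumptions for illustration:

```python
# Hypothetical sketch of the interaction-permission mechanism: the first
# user (assumed here to be the space owner) sets a second user's
# permission, and edits to the work being authored are gated on it.
class InteractionSpace:
    def __init__(self, owner: str) -> None:
        self.owner = owner
        self.permissions: dict[str, bool] = {}  # user -> may edit the work

    def set_permission(self, requester: str, user: str, may_edit: bool) -> None:
        # Assumption: only the space owner may set interaction permissions.
        if requester != self.owner:
            raise PermissionError("only the space owner may set permissions")
        self.permissions[user] = may_edit

    def can_edit(self, user: str) -> bool:
        # The owner can always edit; others need an explicit grant.
        return user == self.owner or self.permissions.get(user, False)


space = InteractionSpace("first_user")
space.set_permission("first_user", "second_user", True)
```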
In some possible embodiments, the apparatus 600 further includes a sending module 604 configured to send the VR work to a server in response to a sharing instruction input by the first user through the VR device, so as to share the VR work, through the server, to the client corresponding to the designated user.
In some possible implementations, the sending module 604 is further configured to upload the VR work to a target display platform in response to a first upload instruction input by the first user through the VR device, so as to display the VR work in the target display platform.
In some possible implementations, the sending module 604 is further configured to upload the VR work to a template repository in a server in response to a second upload instruction input by the first user through the VR device, so that other users can download the VR work through the template repository.
The device provided in this embodiment may be used to implement the technical solution of any of the foregoing method embodiments, and its implementation principle and technical effects are similar, and reference may be made to the detailed description of the foregoing method embodiments, which are not repeated herein for brevity.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure. Referring to fig. 7, an electronic device 700 provided in this embodiment may include: a memory 701 and a processor 702.
The memory 701 and the processor 702 may be separate physical units connected through a bus 703, or they may be integrated and implemented in hardware.
The memory 701 is used for storing program instructions, which the processor 702 invokes to perform the virtual reality based interaction method provided by any of the method embodiments above.
Alternatively, when some or all of the methods of the above embodiments are implemented in software, the electronic device 700 may include only the processor 702. In that case, the memory 701 storing the program is located outside the electronic device 700, and the processor 702 is connected to it through a circuit/wire to read and execute the stored program.
The processor 702 may be a central processing unit (central processing unit, CPU), a network processor (network processor, NP) or a combination of CPU and NP.
The processor 702 may further comprise a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (programmable logic device, PLD), or a combination thereof. The PLD may be a complex programmable logic device (complex programmable logic device, CPLD), a field-programmable gate array (field-programmable gate array, FPGA), general-purpose array logic (generic array logic, GAL), or any combination thereof.
The memory 701 may include volatile memory, such as random-access memory (RAM); it may also include non-volatile memory, such as flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or it may include a combination of the above types of memory.
The present disclosure also provides a readable storage medium comprising: computer program instructions which, when executed by at least one processor of an electronic device, cause the electronic device to implement a virtual reality based interaction method as provided by any of the method embodiments above.
The present disclosure also provides a computer program product which, when run on a computer, causes the computer to implement the virtual reality based interaction method provided by any of the method embodiments above.
It should be noted that in this document, relational terms such as "first" and "second" and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The foregoing is merely a specific embodiment of the disclosure to enable one skilled in the art to understand or practice the disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown and described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (12)

1. An interaction method based on virtual reality is characterized by comprising the following steps:
in response to an operation of a first user on a virtual reality (VR) device, displaying a picture of an interaction space through the VR device;
in response to an operation of the first user on an object in the picture of the interaction space through the VR device, and receiving operation data of a second user on an object in the interaction space, performing VR authoring to obtain a VR work.
2. The method of claim 1, wherein displaying, by the VR device, a picture of the interaction space in response to the first user operating the virtual reality VR device, comprises:
displaying, by the VR device, a virtual picture, the virtual picture including a create button;
responding to the operation of the first user on the creation button through the VR equipment, generating a picture of an interaction space, and displaying the picture of the interaction space through the VR equipment;
and generating an identifier corresponding to the interaction space, wherein the second user can enable the VR equipment corresponding to the second user to display the picture of the interaction space through the identifier corresponding to the interaction space.
3. The method of claim 1, wherein responding to the operation of the first user on the object in the picture of the interaction space by the VR device, and receiving operation data of the second user on the object in the interaction space, performing VR authoring to obtain VR works, comprises:
displaying a design tool in the picture of the interaction space, wherein the design tool comprises one or more of: a drawing tool, a prototype interaction tool, a modeling tool, and an industrial design tool;
and responding to the operation of the first user on the design tool through the VR equipment, and performing VR authoring.
4. The method according to claim 1, wherein the method further comprises:
in the authoring process, responding to triggering operation of the first user on a target virtual portal displayed in a picture of the interaction space through the VR equipment, displaying a virtual window in the interaction space, and displaying a multimedia resource corresponding to the target virtual portal through the virtual window.
5. The method according to claim 1, wherein the method further comprises:
responding to an authority setting instruction input by the first user through the VR equipment, and setting the interaction authority of the second user in the interaction space;
the interaction permission is used for indicating permission of a user to edit the VR work being authored.
6. The method according to claim 1, wherein the method further comprises:
and responding to a sharing instruction input by the first user through the VR equipment, and sending the VR works to a server so as to share the VR works to a client corresponding to the appointed user through the server.
7. The method according to claim 1, wherein the method further comprises:
and responding to a first uploading instruction input by the first user through the VR equipment, and uploading the VR work to a target display platform so as to display the VR work in the target display platform.
8. The method according to claim 1, wherein the method further comprises:
and responding to a second uploading instruction input by the first user through the VR equipment, and uploading the VR work to a template resource library in a server so that other users can download the VR work through the template resource library.
9. An interactive apparatus based on virtual reality, comprising:
the display module is used for responding to the operation of the first user on the virtual reality VR equipment and displaying the picture of the interaction space through the VR equipment;
the design module is configured to perform VR authoring to obtain a VR work, in response to an operation of the first user on an object in the picture of the interaction space through the VR device, and according to received operation data of the second user on an object in the interaction space.
10. An electronic device, comprising: a memory and a processor;
the memory is configured to store computer program instructions;
the processor is configured to execute the computer program instructions to cause the electronic device to implement the virtual reality based interaction method of any one of claims 1 to 8.
11. A readable storage medium, comprising: computer program instructions;
execution of the computer program instructions by an electronic device causes the electronic device to implement the virtual reality-based interaction method of any one of claims 1 to 8.
12. A computer program product, characterized in that the computer program product, when executed by an electronic device, causes the electronic device to implement the virtual reality based interaction method of any of claims 1 to 8.
CN202210686910.4A 2022-06-16 2022-06-16 Interaction method and device based on virtual reality Pending CN117289783A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210686910.4A CN117289783A (en) 2022-06-16 2022-06-16 Interaction method and device based on virtual reality


Publications (1)

Publication Number Publication Date
CN117289783A 2023-12-26

Family

ID=89246769

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210686910.4A Pending CN117289783A (en) 2022-06-16 2022-06-16 Interaction method and device based on virtual reality

Country Status (1)

Country Link
CN (1) CN117289783A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination