CN118105689A - Game processing method and device based on virtual reality, electronic equipment and storage medium - Google Patents


Info

Publication number
CN118105689A
CN118105689A
Authority
CN
China
Prior art keywords
game
media content
subspace
user
virtual reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211528421.2A
Other languages
Chinese (zh)
Inventor
黄栗辰
黄翔宇
冀利悦
付平非
杨帆
孙宜欣
陆离
庞鸿潇
曾诚
秦劲启
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202211528421.2A priority Critical patent/CN118105689A/en
Priority to US18/525,503 priority patent/US20240177435A1/en
Publication of CN118105689A publication Critical patent/CN118105689A/en
Pending legal-status Critical Current


Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The present disclosure provides a virtual reality-based game processing method and apparatus, an electronic device, and a storage medium. A first game subspace is displayed in a first virtual reality space used for presenting first media content, and a first game object associated with the first media content is displayed within the first game subspace for the user to play with. The user can thus play a game related to the first media content while watching that content in the virtual reality space, providing a more immersive and richer media-content viewing and gaming experience.

Description

Game processing method and device based on virtual reality, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of computers, in particular to a game processing method and device based on virtual reality, electronic equipment and a storage medium.
Background
With the development of Virtual Reality (VR) technology, more and more virtual social platforms or applications are available to users. On such a platform, a user can, through a smart terminal device such as head-mounted VR glasses, control his or her virtual character to engage in social interaction, entertainment, learning, remote work, User Generated Content (UGC) creation, and the like with virtual characters controlled by other users. However, the interaction forms provided by related virtual reality spaces are relatively limited and cannot meet users' diversified interaction needs.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, according to one or more embodiments of the present disclosure, there is provided a game processing method based on virtual reality, including:
displaying a first virtual reality space for presenting first media content to a user;
displaying a first game subspace in the first virtual reality space to enable the user to view the first media content and the first game subspace simultaneously;
displaying a first game object within the first game subspace, the first game object being associated with the first media content;
and displaying corresponding game feedback information based on the user's operation on the first game object.
In a second aspect, according to one or more embodiments of the present disclosure, there is provided a virtual reality-based game processing apparatus, comprising:
a virtual space display unit configured to display a first virtual reality space, the first virtual reality space being used to present first media content to a user;
a game space display unit, configured to display a first game subspace in the first virtual reality space so that the user can observe the first media content and the first game subspace at the same time;
a game object display unit, configured to display a first game object within the first game subspace, the first game object being associated with the first media content;
and a feedback information display unit, configured to display corresponding game feedback information based on the user's operation on the first game object.
In a third aspect, according to one or more embodiments of the present disclosure, there is provided an electronic device comprising: at least one memory and at least one processor; wherein the memory is for storing program code, and the processor is for invoking the program code stored by the memory to cause the electronic device to perform the virtual reality based game processing method provided in accordance with one or more embodiments of the present disclosure.
In a fourth aspect, according to one or more embodiments of the present disclosure, there is provided a non-transitory computer storage medium storing program code which, when executed by a computer device, causes the computer device to perform the virtual reality-based game processing method provided according to one or more embodiments of the present disclosure.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a flow chart of a game processing method based on virtual reality according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a virtual reality device according to an embodiment of the disclosure;
FIG. 3 is an alternative schematic diagram of a virtual field of view of a virtual reality device provided in accordance with an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a first virtual reality space provided in accordance with an embodiment of the present disclosure;
FIGS. 5-8 are schematic diagrams of a first virtual reality space and a first game subspace provided at a first viewing angle of a user according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of a first virtual reality space and a first game subspace at a first viewing angle of a user provided in accordance with another embodiment of the present disclosure;
FIG. 10 is a schematic diagram of a game processing device based on virtual reality according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be more thoroughly and completely understood. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the steps recited in the embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Furthermore, embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to." The term "based on" means "based at least in part on." The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments." The term "in response to" and related terms mean that one signal or event is affected to some extent by another signal or event, but not necessarily completely or directly. If event x occurs "in response to" event y, x may respond directly or indirectly to y. For example, the occurrence of y may ultimately lead to the occurrence of x, but other intermediate events and/or conditions may exist; in other cases, y may not necessarily result in the occurrence of x, and x may occur even though y has not yet occurred. Furthermore, the term "in response to" may also mean "at least partially in response to."
The term "determining" broadly encompasses a wide variety of actions, which may include obtaining, calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database, or another data structure), ascertaining, and the like; it may also include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory), and the like, as well as parsing, selecting, choosing, establishing, and the like. Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "a plurality" in this disclosure are intended to be illustrative rather than limiting; those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
For the purposes of this disclosure, the phrase "a and/or B" means (a), (B), or (a and B).
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
One or more embodiments of the present disclosure provide a virtual reality-based game processing method that employs Extended Reality (XR) technology. XR technology combines the real and the virtual through a computer and provides a virtual reality space in which users can interact. In the virtual reality space, a user may engage in social interaction, entertainment, learning, work, remote office, User Generated Content (UGC) creation, etc. through a virtual reality device such as a Head Mounted Display (HMD).
Referring to fig. 2, a user may enter a virtual reality space through a virtual reality device, such as head-mounted VR glasses, and control his or her Avatar (Avatar) in the virtual reality space to socially interact with other user-controlled avatars, entertain, learn, remotely office, etc.
In one embodiment, in the virtual reality space, a user may perform interactive operations through a controller, which may be a handle; for example, the user performs operation control through the keys of the handle. Of course, in other embodiments, gestures, voice, or multimodal control may be used instead of a controller to control the target object in the virtual reality device.
The virtual reality devices described in embodiments of the present disclosure may include, but are not limited to, the following types:
A computer-based virtual reality (PCVR) device, which uses a PC to perform the computation and data output for virtual reality functions; the external PCVR device uses the data output by the PC to achieve the virtual reality effect.
A mobile virtual reality device, which supports mounting a mobile terminal (such as a smartphone) in various ways (such as a head-mounted display with a dedicated card slot); connected to the mobile terminal in a wired or wireless manner, the mobile terminal performs the computation for virtual reality functions and outputs data to the mobile virtual reality device, for example to watch a virtual reality video through an app on the mobile terminal.
An all-in-one virtual reality device, which has its own processor for the computation related to the virtual functions and therefore has independent virtual reality input and output capabilities; it needs no connection to a PC or mobile terminal and offers a high degree of freedom of use.
Of course, implementations of the virtual reality device are not limited to these forms, and the device may be further miniaturized or enlarged as needed.
A sensor for posture detection (such as a nine-axis sensor) is arranged in the virtual reality device to detect posture changes of the device in real time. When a user wears the device and the posture of the user's head changes, the real-time head posture is transmitted to the processor, which calculates the gaze point of the user's line of sight in the virtual environment. Based on the gaze point, the image within the user's gaze range (i.e., the virtual field of view) in the three-dimensional model of the virtual environment is computed and displayed on the display screen, so that the user has an immersive experience as if watching the real environment.
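The head-pose-to-gaze computation described above can be sketched as follows. The yaw/pitch convention, axis frame, and function name are illustrative assumptions for the sketch, not details from the disclosure.

```python
import math

def gaze_direction(yaw_deg, pitch_deg):
    """Convert head yaw/pitch reported by the HMD's posture sensor into
    a unit gaze vector, which a renderer would use to select the virtual
    field of view. Assumed convention: +Z forward, +Y up, yaw about the
    Y axis, pitch about the X axis."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

print(gaze_direction(0, 0))  # looking straight ahead: (0.0, 0.0, 1.0)
```

The renderer would then intersect this gaze vector with the three-dimensional scene model to determine which region falls inside the virtual field of view.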
Fig. 3 shows an alternative schematic view of the virtual field of view of a virtual reality device according to an embodiment of this disclosure. A horizontal field-of-view angle and a vertical field-of-view angle describe the distribution range of the virtual field of view in the virtual environment: the vertical range is represented by the vertical field-of-view angle BOC and the horizontal range by the horizontal field-of-view angle AOB, and the image of the virtual field of view in the virtual environment can always be perceived by the human eye through the lens. The field-of-view angle represents the range of viewing angles the lens covers when sensing an environment. For example, the field-of-view angle of a virtual reality device indicates the range of viewing angles the human eye has when perceiving the virtual environment through the device's lens; likewise, for a mobile terminal equipped with a camera, the camera's field-of-view angle is the range of viewing angles the camera covers when sensing the real environment for shooting.
Virtual reality devices such as HMDs integrate several cameras (e.g., depth cameras, RGB cameras), whose purpose is not limited to providing a pass-through view. The camera images and the integrated Inertial Measurement Unit (IMU) provide data that can be processed by computer vision methods to automatically analyze and understand the environment. HMDs are designed to support not only passive but also active computer vision analysis. Passive computer vision methods analyze image information captured from the environment; they may be monoscopic (images from a single camera) or stereoscopic (images from two cameras) and include, but are not limited to, feature tracking, object recognition, and depth estimation. Active computer vision methods add information to the environment by projecting a pattern that is visible to the camera but not necessarily to the human visual system; such techniques include time-of-flight (ToF) cameras, laser scanning, and structured light, which simplify the stereo-matching problem. Active computer vision is used, for example, to implement scene depth reconstruction.
Referring to fig. 1, fig. 1 shows a flowchart of a game processing method 100 based on virtual reality according to an embodiment of the disclosure, where the method 100 includes steps S120-S180.
Step S120: a first virtual reality space is displayed, the first virtual reality space being for presenting first media content to a user.
The virtual reality space may be a simulated environment of the real world, a semi-simulated and semi-fictional virtual scene, or a purely fictional virtual scene. The virtual scene may be two-dimensional, 2.5-dimensional, or three-dimensional, and the embodiments of the present application do not limit its dimensionality. For example, a virtual scene may include sky, land, sea, etc.; the land may include environmental elements such as deserts and cities; and the user may control a virtual object to move in the virtual scene.
In some embodiments, the first media content is displayed in the form of a video stream or a virtual 3D object.
In one embodiment, a video stream may be acquired and video content may be presented within a preset area within the virtual reality space based on the video stream. Illustratively, the video stream may be in an H.265, H.264, MPEG-4, or the like encoding format. In a specific embodiment, the client may receive the live video stream sent by the server, and display live video images in the video image display space based on the live video stream.
In one embodiment, a media content display area (e.g., a virtual screen) is provided in the first virtual reality space for displaying the first media content.
In one embodiment, the first media content may include a sporting event, a concert, a live video broadcast, or the like. Illustratively, the virtual reality space comprises a virtual live-streaming space. In the virtual live-streaming space, a performer user can perform live as an avatar (e.g., a 3D avatar) or as a real image, and a spectator user can control an avatar to watch the performance from a viewing perspective such as the first-person or third-person perspective.
Step S140: a first game subspace is displayed in the first virtual reality space to enable the user to view the first media content and the first game subspace simultaneously.
In some embodiments, an image of the first game subspace may be displayed superimposed over an image of the first virtual reality space.
In some embodiments, the first game subspace is displayed within the first virtual reality space in response to a first operation by the user or based on a game presentation time node.
The first operation includes, but is not limited to, a somatosensory control operation, a gesture control operation, an eye-movement operation, a touch operation, a voice control instruction, or an operation on an external control device. The first operation may comprise a single operation or a group of operations.
In some embodiments, the first operation includes an operation on a first visual element displayed within the first virtual reality space. In an exemplary embodiment, one or more first visual elements are preset in the first virtual reality space; if a first visual element is triggered by the user through a first operation, an image of the first game subspace is displayed superimposed on the image of the first virtual reality space. For example, a preset game area may be set in the first virtual reality space, in which several preset game props (e.g., football models) are placed. When the user triggers a game prop (e.g., the user selects it, grabs and throws it, or controls the virtual character to approach it so that the distance between the virtual character and the game prop falls below a preset threshold), an "open XXX mini game" button control may be displayed; when the user triggers this button control, the virtual character enters the first game subspace. In another embodiment, if the game prop leaves the user's field of view, the button control is no longer presented in the user's field of view.
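The distance-threshold trigger described above (showing the mini-game entry button when the virtual character approaches a game prop) might look like the following sketch; the threshold value and the function name are illustrative assumptions, not values from the disclosure.

```python
import math

# Hypothetical distance threshold (in scene units) below which the
# "open mini game" button control is shown; the value is illustrative.
TRIGGER_DISTANCE = 1.5

def should_show_game_button(avatar_pos, prop_pos, threshold=TRIGGER_DISTANCE):
    """Return True when the avatar is close enough to a game prop
    (e.g. a football model) that the entry button should appear."""
    dx, dy, dz = (a - p for a, p in zip(avatar_pos, prop_pos))
    return math.sqrt(dx * dx + dy * dy + dz * dz) < threshold

print(should_show_game_button((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))  # True
print(should_show_game_button((0.0, 0.0, 0.0), (5.0, 0.0, 0.0)))  # False
```

The same check, evaluated against the user's field of view rather than distance, could be used to hide the button again once the prop leaves view.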
The game presentation time node is a time node at which the virtual reality space automatically loads the first game subspace. In some embodiments, the game presentation time node may be preset by the virtual reality system; for example, the first game subspace may be opened to users during one or more fixed time periods each day or each week.
In some embodiments, a game presentation time node may be determined based on first media content presented in a first virtual reality space. For example, the game presentation time node may be determined based on a time node at which the first media content starts playing or a preset time node during the playing of the first media content. For example, taking a first media content as an example of a concert, the point in time at which the concert starts or the point in time at which a particular track starts performing may be taken as the game presentation point in time.
In one embodiment, the game presentation time node may be determined based on preset information contained in a media information stream (e.g., video stream) of the first media content. For example, the preset information may take the form of supplemental enhancement information (Supplemental Enhancement Information, SEI). The supplemental enhancement information is additional information that may be included in the video stream, such as user-defined information, to increase the usability of the video, making the video more versatile. The supplemental enhancement information can be packaged and transmitted together with the video frame, so that the effect of synchronous transmission and analysis of the supplemental enhancement information and the video frame is realized. In this way, when the client decodes the media information stream, the game presentation time node may be determined by supplemental enhancement information in the media information stream.
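As a sketch of the SEI-based approach, the client could carry the game presentation time node as user-defined JSON inside the supplemental enhancement information packaged with the video frames; the payload schema and all field names below are purely illustrative assumptions.

```python
import json

def game_time_node_from_sei(sei_payload: bytes):
    """Extract a game-presentation time node from a user-defined SEI
    payload decoded alongside a video frame. The JSON schema here
    (event name, 'pts_ms' field) is an illustrative assumption."""
    try:
        info = json.loads(sei_payload.decode("utf-8"))
    except (UnicodeDecodeError, json.JSONDecodeError):
        return None  # not our payload; ignore
    if info.get("event") == "open_game_subspace":
        return info.get("pts_ms")  # presentation timestamp in ms
    return None

sei = b'{"event": "open_game_subspace", "pts_ms": 125000, "track": "song_A"}'
print(game_time_node_from_sei(sei))  # 125000
```

Because the SEI travels with the video frame, the client can open the first game subspace in sync with the media timeline without a separate signaling channel.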
In some embodiments, the first game subspace may provide the user with a timing-input game. In a timing-input game, the player needs to perform a prescribed operation at a prescribed timing, and the player's operation timing is compared with a reference timing to evaluate the operation (for example, to determine the player's game score). Timing-input games include music games, in which the player performs prescribed operation inputs at timings matching the progress of a musical composition, and the input timings are compared with reference timings to evaluate the player's operations. A music game may, for example, test proficiency with rhythm, intervals, and the like.
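The timing comparison at the heart of such a game can be sketched as follows; the judgment windows and point values are illustrative assumptions, not values from the disclosure.

```python
def judge_hit(op_time_ms, ref_time_ms):
    """Grade a single operation by its deviation from the reference
    timing. The windows (50/120/250 ms) and scores are illustrative."""
    error = abs(op_time_ms - ref_time_ms)
    if error <= 50:
        return "perfect", 100
    if error <= 120:
        return "excellent", 70
    if error <= 250:
        return "good", 40
    return "miss", 0

print(judge_hit(10030, 10000))  # ('perfect', 100)
print(judge_hit(10200, 10000))  # ('good', 40)
```

In a music game, the reference timings would be derived from the composition's beat chart, so the evaluation tracks the progress of the music.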
In some embodiments, a countdown may be displayed to provide the user with a preparation time at the beginning of the game. At the time of displaying the countdown, the game contents to be played next may be displayed simultaneously. Taking a music game as an example, the name of music to be played may be displayed while the countdown is displayed.
Step S160: a first game object is displayed within the first game subspace, the first game object being associated with the first media content.
Step S180: and displaying corresponding game feedback information based on the operation of the user on the first game object.
The first game object is an object for the user to operate, such as an animation or a model. The game system determines the user's game score according to the operations the user applies to the first game object (e.g., whether the operation timing matches the reference timing and whether the operation content meets a preset operation requirement) and displays corresponding game feedback information.
The game feedback information includes, but is not limited to, game score information, game evaluation information, and game animation special effects. In some embodiments, the game score information may include the player's overall game result (e.g., total score) and the score corresponding to a single operation; the game evaluation information may be used to measure the accuracy of the player's operation and may include, for example but not limited to, "excellent," "perfect," or an "N-hit combo," etc.
In some embodiments, the first game object includes an animated model of equipment or constituent elements used by an activity to which the first media content relates. Illustratively, if the first media content relates to a sports-like activity, the first game object includes an animated model of sports equipment used by the sports activity; or if the first media content relates to a musical class activity, the first game object includes an animated model of a musical instrument used by the musical class activity; or if the first media content relates to fitness class activities, the first game object comprises an animated model of human action. For example, if the first media content is a football game, the first game object may be an animated model of football, so that a user may play a game related to football while watching the football game in virtual reality space.
In some embodiments, the first game subspace is located at a preset position within the first virtual reality space to enable the user to view the first media content and the image of the first game subspace simultaneously. For example, if the first media content is in the form of a video stream, the first game subspace may be located in the direction that the virtual screen playing the video faces in the first virtual reality space.
In some embodiments, the first game subspace includes a first region for providing an active region for a user-controlled avatar, a second region for displaying game feedback information, and a third region for displaying the first game object.
Fig. 4 is a schematic diagram of a media content display area and a first game subspace in a virtual reality space according to an embodiment of the disclosure. The first virtual reality space 10 includes a media content display area 20, a first game subspace 30. The media content display area 20 is for displaying the first media content. The first game subspace 30 is located in the direction in which the first media content is directed (i.e. the direction of the X-axis shown in fig. 4). The first game subspace 30 comprises a first region 31 providing an active region for a user-controlled virtual character 40, a second region 32 for displaying game feedback information, and a third region 33 for displaying the first game object. The second region 32 and the third region 33 are located between the media content display area 20 and the first region 31 to enable the user to simultaneously view the first media content and the image of the first game subspace (e.g., the game feedback information displayed by the second region and the first game object displayed by the third region), for example, through the first or third perspective of the virtual character.
It should be noted that the second area and the third area may belong to the same area or be two independent areas; this is not limited here.
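The region layout of Fig. 4 could be modelled as in the following sketch; all names and coordinates along the X axis are illustrative assumptions for the sketch, not values from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    x_min: float  # near edge along the viewing (X) axis of Fig. 4
    x_max: float  # far edge

# Illustrative layout: the second (feedback) and third (game object)
# regions sit between the media content display area and the first
# (avatar activity) region, so both stay in the user's line of sight.
media_area = Region("media_content_display", 0.0, 1.0)
third_region = Region("game_objects", 1.0, 4.0)
second_region = Region("game_feedback", 1.0, 4.0)
first_region = Region("avatar_activity", 4.0, 8.0)

def lies_between(region, near, far):
    """True when `region` occupies the span between `near` and `far`."""
    return near.x_max <= region.x_min and region.x_max <= far.x_min

print(lies_between(second_region, media_area, first_region))  # True
print(lies_between(first_region, media_area, third_region))   # False
```

Note that the second and third regions share the same extent here, consistent with the remark that they may belong to a single area or be independent.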
According to one or more embodiments of the present disclosure, by displaying a first game subspace in a first virtual reality space for presenting first media content and displaying a first game object associated with the first media content within the first game subspace for a user to play, the user can play a game related to the first media content while viewing the first media content in the virtual reality space, and thus a more immersive, richer media content viewing and gaming experience can be provided to the user.
In some embodiments, the first game object moves in the first game subspace in a direction in which the first media content is directed.
Referring to fig. 4, in the first game subspace 30, the first game object 331 (i.e., the football animation model) moves in the direction that the first media content faces (i.e., the X-axis direction shown in fig. 4), creating for the user the visual effect that the football animation model flies out of the football match being played in the media content display area 20.
In some embodiments, the movement origin of the first game object may be determined based on a preset area within a media content display area in which the first media content is displayed. In some embodiments, the preset area may be an area in which a main presentation object (e.g., a stage or a host) within the media content display area is located. In some embodiments, the predetermined area may be a center area of the media content display area.
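A minimal sketch of moving a first game object from its origin (e.g., the center of the media content display area) toward the user might interpolate positions along the viewing axis; the coordinates, step count, and function name are illustrative assumptions.

```python
def object_trajectory(origin, target, steps=4):
    """Linearly interpolate positions from the movement origin (e.g. the
    centre of the media content display area) toward the user, giving
    the 'flying out of the screen' effect. Purely illustrative."""
    return [
        tuple(o + (t - o) * i / steps for o, t in zip(origin, target))
        for i in range(steps + 1)
    ]

path = object_trajectory((0.0, 2.0, 0.0), (4.0, 1.0, 0.0))
print(path[0], path[-1])  # (0.0, 2.0, 0.0) (4.0, 1.0, 0.0)
```

In practice, the origin would be sampled from the preset area (e.g., the stage or screen center) and the target from one of the preset moving tracks in front of the avatar.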
In some embodiments, the timing-input game provided by the first game subspace requires the user to perform a prescribed operation on the first game object at a prescribed timing, the user's operation timing being compared with a reference timing to evaluate the user's operation (e.g., to determine the user's game score). Timing-input games include music games, in which the user performs prescribed operation inputs at timings matching the progress of a musical composition, and the input timings are compared with reference timings to evaluate the player's operations. A music game may, for example, test proficiency with rhythm, intervals, and the like.
In some embodiments, the presentation frequency (e.g., the number of presentations and the presentation timing) of the first game object within the first game subspace may be determined based on the tempo and/or intervals of the music played in the first media content. Illustratively, the faster the tempo or the denser the intervals of the music, the higher the presentation frequency; and/or the slower the tempo or the sparser the intervals, the lower the presentation frequency. Taking a music game as an example, the music currently played in the first media content (such as a concert live stream) is also the music used by the ongoing music game in the first game subspace; during the concert live stream, the user can both enjoy the song being performed and play a music game that uses that song, so that a more immersive and richer viewing and gaming experience can be provided.
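The tempo-to-presentation-frequency mapping could be sketched as below; deriving the spawn interval directly from beats per minute (and a notes-per-beat density factor) is an illustrative assumption, not a formula from the disclosure.

```python
def spawn_interval_ms(bpm, notes_per_beat=1):
    """Interval between successive first-game-object presentations,
    derived from the music tempo: faster music -> shorter interval,
    i.e. a higher presentation frequency. Illustrative mapping."""
    return 60000 / (bpm * notes_per_beat)

print(spawn_interval_ms(120))    # 500.0 ms between spawns at 120 BPM
print(spawn_interval_ms(90, 2))  # denser chart: ~333.3 ms at 90 BPM
```

The `notes_per_beat` factor stands in for the interval density of the chart, so slow songs with busy melodies can still spawn objects frequently.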
In a specific embodiment, game schemes of the corresponding first game objects may be set in advance based on different music songs contained in the first media content, and occurrence time of each game scheme may be determined based on a presentation timeline of the first media content. Wherein the game scheme of the first game object relates to a scheme of presentation timing (e.g., frequency, time period, etc.) and/or moving path of the first game object in the first game subspace. For example, a musical composition to be played by the current first media content may be determined in real time based on preset information (supplemental enhancement information) contained in the media information stream to further determine what game scheme to employ.
Fig. 5 to 8 are schematic views of a first virtual reality space and a first game subspace at a first viewing angle of a user according to an embodiment of the present disclosure, wherein the media content currently presented in the first virtual reality space is a football game. Referring to fig. 5, in the first game subspace, first game objects (the football animation models shown in fig. 5) move toward the user along a total of 6 preset moving tracks (i.e., tracks 1-6), and the user obtains a certain score by hitting a first game object within a prescribed time (e.g., before the first game object disappears). Conversely, if the user fails to hit a first game object in time, no score is added. Additional points may be obtained if the user hits a plurality of first game objects in succession (e.g., more than 2 times). In some embodiments, the user's combo count (e.g., "combo+N") may also be displayed in real time, and when the combo count reaches a particular value, a preset animated special effect and/or vibration feedback may also be presented, but the disclosure is not limited thereto. When the combo is broken, the combo count is reset.
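The scoring behaviour described above (base points per hit, a bonus after more than 2 consecutive hits, combo reset on a miss) can be sketched minimally as follows. The concrete point values are assumptions for illustration:

```python
# Minimal combo-scoring sketch: each hit adds a base score, consecutive
# hits beyond a threshold earn bonus points, and a miss resets the
# combo count while the accumulated score is kept.

class ComboScorer:
    BASE_SCORE = 10
    BONUS_SCORE = 5
    BONUS_THRESHOLD = 2   # bonus starts after more than 2 consecutive hits

    def __init__(self) -> None:
        self.score = 0
        self.combo = 0    # displayed in real time, e.g. "combo+N"

    def hit(self) -> None:
        self.combo += 1
        self.score += self.BASE_SCORE
        if self.combo > self.BONUS_THRESHOLD:
            self.score += self.BONUS_SCORE

    def miss(self) -> None:
        self.combo = 0    # combo broken: count resets, score is kept
```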
Referring to fig. 6, the user may further be required to strike the first game object at a predetermined timing: for example, when the first game object is accompanied by a preset hint indication (for example, the square model around the football animation model shown in fig. 6), striking the first game object earns a certain score, and corresponding game evaluation information (for example, "perfect") may be displayed. In some embodiments, a preset animated special effect or vibration feedback may also be presented, but the disclosure is not limited thereto.
Referring to fig. 7 to 8, the user may also drag the first game object (the football animation model shown in fig. 7) along the hint track displayed in the first game subspace. It should be noted that the user's operations on the first game object may further include operations other than clicking and dragging, which are not limited herein.
In some embodiments, a virtual hand of the user's virtual character may be displayed in the virtual reality space, the virtual hand moving to follow the movement of the user's real hand in real space. For example, the motion state and position of the user's real hand in real space may be determined by a motion sensor built into a controller (e.g., a handle) held by the user, and the motion state and position of the virtual hand in the first virtual reality space determined on that basis. Alternatively, an image containing the user's real hand or controller may be captured by an HMD-integrated camera, and the motion state and position of the real hand or controller in real space analyzed with computer vision methods to determine the motion state and position of the virtual hand in the first virtual reality space, but the disclosure is not limited thereto.
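The final step — turning a tracked real-hand position into a virtual-hand position — can be sketched as a coordinate transform. A real system would consume controller IMU data or camera-based hand tracking; here the input position and the fixed scale/offset transform are stand-in assumptions:

```python
# Sketch of mapping a tracked real-hand position into the first virtual
# reality space via a uniform scale and an origin offset per axis.

def real_to_virtual(real_pos: tuple[float, float, float],
                    origin: tuple[float, float, float] = (0.0, 0.0, 0.0),
                    scale: float = 1.0) -> tuple[float, float, float]:
    """Transform a hand position from real-space into virtual-space coordinates."""
    return tuple(scale * p + o for p, o in zip(real_pos, origin))
```

Per-frame, the tracked position would be fed through this transform before posing the virtual hand model.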
Fig. 9 illustrates a schematic diagram of a first virtual reality space and a first game subspace under a first perspective of a user according to another embodiment of the present disclosure. The media content currently presented in the first virtual reality space is a fitness video, and the first game object is an animated model of human body actions.
In some embodiments, when the first game subspace is displayed in the first virtual space, a preset space transition animation may be displayed to prompt the user that a new space is being entered; the space transition animation may also mask the loading process of the first game subspace. Illustratively, the space transition animation may include a process of dimming and then brightening the screen display (e.g., a "close eyes, open eyes" animated special effect) to simulate the real visual experience of a user entering a new space in a real environment.
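The dim-then-brighten effect can be sketched as a brightness curve over the animation's duration. The triangular (linear) curve below is an assumption; a real effect might ease in and out:

```python
# Illustrative "close eyes / open eyes" transition: screen brightness
# is dimmed to black over the first half of the animation and restored
# over the second half, masking the loading of the first game subspace.

def transition_brightness(t: float, duration: float = 1.0) -> float:
    """Brightness in [0, 1] at time t of a dim-then-brighten transition."""
    half = duration / 2.0
    if t <= 0 or t >= duration:
        return 1.0               # fully bright outside the transition
    if t <= half:
        return 1.0 - t / half    # dimming phase
    return (t - half) / half     # brightening phase
```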
In some embodiments, the first game object is associated with the first media content. For example, if the first media content relates to a sports-class activity, the first game object includes an animated model of sports equipment used by the sports activity; or if the first media content relates to a music-class activity, the first game object includes an animated model of a musical instrument used by the music activity; or if the first media content relates to a fitness-class activity, the first game object includes an animated model of human actions.
Accordingly, referring to fig. 10, there is provided a virtual reality-based game processing apparatus 600 according to an embodiment of the present disclosure, including:
a virtual space display unit 601 for displaying a first virtual reality space for presenting first media content to a user;
A game space display unit 602 configured to display a first game subspace in the first virtual space, so that the user can observe the first media content and the first game subspace at the same time;
a game object display unit 603 for displaying a first game object in the first game subspace, the first game object being associated with the first media content;
And a feedback information display unit 604, configured to display corresponding game feedback information based on the operation of the user on the first game object.
In some embodiments, the game feedback information includes one or more of the following: game score information, game evaluation information, game animation special effects.
In some embodiments, the first media content is displayed in the form of a video stream or a virtual 3D object.
In some embodiments, the first game subspace is used to provide a user with a timing input class game.
In some embodiments, the first game subspace is located at a preset position within the first virtual reality space.
In some embodiments, the first game subspace is located in a direction towards which the first media content is directed.
In some embodiments, the first game object moves in the first game subspace in a direction in which the first media content is directed.
In some embodiments, the apparatus further comprises:
And the movement starting point determining unit is used for determining the movement starting point of the first game object based on a preset area in a media content display area for displaying the first media content.
In some embodiments, the apparatus further comprises:
And the presentation frequency determining unit is used for determining the presentation frequency of the first game object in the first game subspace based on the rhythm and/or musical interval of the currently played music in the first media content.
In some embodiments, the faster the tempo or the higher the interval of the music, the higher the presentation frequency; and/or the slower the tempo or the lower the interval, the lower the presentation frequency.
In some embodiments, the first game object includes an animated model of equipment or constituent elements used by an activity to which the first media content relates.
In some embodiments, if the first media content relates to a sports-like activity, the first game object includes an animated model of sports equipment used by the sports activity; or if the first media content relates to a musical class activity, the first game object includes an animated model of a musical instrument used by the musical class activity; or if the first media content relates to fitness class activities, the first game object comprises an animated model of human action.
In some embodiments, the game space display unit is further for displaying a first game subspace within the first virtual space in response to a first operation by the user or based on a game presentation time node.
In some embodiments, the first operation includes an operation on a first visual element displayed within the first virtual reality space, the first visual element associated with the first media content.
In some embodiments, the first game subspace includes a first region for providing an active region for a user-controlled avatar, a second region for displaying game feedback information, and a third region for displaying the first game object.
In some embodiments, the game presentation time node is determined based on the first media content.
In some embodiments, the game presentation time node is determined based on preset information contained in a media information stream of the first media content.
In some embodiments, the game space display unit is further configured to superimpose and display the image of the first game subspace on the image of the first virtual space.
For embodiments of the device, reference is made to the description of method embodiments for the relevant points, since they essentially correspond to the method embodiments. The apparatus embodiments described above are merely illustrative, wherein the modules illustrated as separate modules may or may not be separate. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
Accordingly, in accordance with one or more embodiments of the present disclosure, there is provided an electronic device comprising:
At least one memory and at least one processor;
Wherein the memory is configured to store program code, and the processor is configured to invoke the program code stored by the memory to cause the electronic device to perform a virtual reality-based game processing method provided in accordance with one or more embodiments of the present disclosure.
Accordingly, in accordance with one or more embodiments of the present disclosure, a non-transitory computer storage medium is provided, the non-transitory computer storage medium storing program code executable by a computer device to cause the computer device to perform a virtual reality-based game processing method provided in accordance with one or more embodiments of the present disclosure.
Referring now to fig. 11, a schematic diagram of an electronic device (e.g., a terminal device or server) 800 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 11 is merely an example, and should not impose any limitations on the functionality and scope of use of embodiments of the present disclosure.
As shown in fig. 11, the electronic device 800 may include a processing means (e.g., a central processor, a graphics processor, etc.) 801, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage means 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the electronic device 800 are also stored. The processing device 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
In general, the following devices may be connected to the I/O interface 805: input devices 806 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 807 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, etc.; storage 808 including, for example, magnetic tape, hard disk, etc.; communication means 809. The communication means 809 may allow the electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While fig. 11 shows an electronic device 800 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via communication device 809, or installed from storage device 808, or installed from ROM 802. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 801.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods of the present disclosure described above.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided a game processing method based on virtual reality, including: displaying a first virtual reality space for presenting first media content to a user; displaying a first game subspace in the first virtual space to enable the user to view the first media content and the first game subspace simultaneously; displaying a first game object within the first game subspace, the first game object being associated with the first media content; and displaying corresponding game feedback information based on the operation of the user on the first game object.
In accordance with one or more embodiments of the present disclosure, the game feedback information includes one or more of the following: game score information, game evaluation information, game animation special effects.
According to one or more embodiments of the present disclosure, the first media content is displayed in the form of a video stream or a virtual 3D object.
In accordance with one or more embodiments of the present disclosure, the first game subspace is used to provide a user with a timing input class game.
According to one or more embodiments of the present disclosure, the first game subspace is located at a preset position within the first virtual reality space.
In accordance with one or more embodiments of the present disclosure, the first game subspace is located in a direction in which the first media content is directed.
According to one or more embodiments of the present disclosure, the first game object moves in the first game subspace in a direction in which the first media content is directed.
According to one or more embodiments of the present disclosure, the virtual reality-based game processing method further includes: a movement start point of the first game object is determined based on a preset area in a media content display area in which the first media content is displayed.
According to one or more embodiments of the present disclosure, the virtual reality-based game processing method further includes: a frequency of presentation of the first game object within the first game subspace is determined based on a tempo and/or musical interval of music currently played in the first media content.
According to one or more embodiments of the present disclosure, the faster the tempo or the higher the interval of the music, the higher the presentation frequency; and/or the slower the tempo or the lower the interval, the lower the presentation frequency.
According to one or more embodiments of the present disclosure, the first game object includes an animated model of equipment or constituent elements used by an activity to which the first media content relates.
According to one or more embodiments of the present disclosure, if the first media content relates to a sports class activity, the first game object includes an animated model of sports equipment used by the sports activity; or if the first media content relates to a musical class activity, the first game object includes an animated model of a musical instrument used by the musical class activity; or if the first media content relates to fitness class activities, the first game object comprises an animated model of human action.
According to one or more embodiments of the present disclosure, the displaying a first game subspace within the first virtual space includes: a first game subspace is displayed within the first virtual space in response to a first operation by the user or based on a game presentation time node.
According to one or more embodiments of the present disclosure, the first operation includes an operation on a first visual element displayed within the first virtual reality space, the first visual element being associated with the first media content.
In accordance with one or more embodiments of the present disclosure, the first game subspace includes a first region for providing an active region for a user-controlled avatar, a second region for displaying game feedback information, and a third region for displaying the first game object.
In accordance with one or more embodiments of the present disclosure, the game presentation time node is determined based on the first media content.
According to one or more embodiments of the present disclosure, the game presentation time node is determined based on preset information contained in a media information stream of the first media content.
According to one or more embodiments of the present disclosure, the displaying a first game subspace within the first virtual space includes: and superposing and displaying the image of the first game subspace on the image of the first virtual space.
According to one or more embodiments of the present disclosure, there is provided a virtual reality-based game processing apparatus including: a virtual space display unit configured to display a first virtual reality space, the first virtual reality space being used to present first media content to a user; a game space display unit for displaying a first game subspace in the first virtual space so that the user can observe the first media content and the first game subspace at the same time; a game object display unit for displaying a first game object within the first game subspace, the first game object being associated with the first media content; and the feedback information display unit is used for displaying corresponding game feedback information based on the operation of the user on the first game object.
According to one or more embodiments of the present disclosure, there is provided an electronic device including: at least one memory and at least one processor; wherein the memory is for storing program code, and the processor is for invoking the program code stored by the memory to cause the electronic device to perform the virtual reality based game processing method provided in accordance with one or more embodiments of the present disclosure.
According to one or more embodiments of the present disclosure, there is provided a non-transitory computer storage medium storing program code which, when executed by a computer device, causes the computer device to perform a virtual reality-based game processing method provided according to one or more embodiments of the present disclosure.
The foregoing description is only of the preferred embodiments of the present disclosure and a description of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of features described above, but also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure — for example, technical solutions formed by replacing the above features with technical features having similar functions disclosed in the present disclosure (but not limited thereto).
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (21)

1. A virtual reality-based game processing method, comprising:
displaying a first virtual reality space for presenting first media content to a user;
displaying a first game subspace in the first virtual space to enable the user to view the first media content and the first game subspace simultaneously;
displaying a first game object within the first game subspace, the first game object being associated with the first media content;
And displaying corresponding game feedback information based on the operation of the user on the first game object.
2. The method of claim 1, wherein the game feedback information includes one or more of: game score information, game evaluation information, game animation special effects.
3. The method of claim 1, wherein the first media content is displayed in the form of a video stream or a virtual 3D object.
4. The method of claim 1, wherein the first game subspace is used to provide a user with a timing input class game.
5. The method of claim 1, wherein the first game subspace is located at a preset position within the first virtual reality space.
6. The method of claim 1, wherein the first game subspace is located in a direction towards which the first media content is directed.
7. The method of claim 6, wherein the first game object moves in the first game subspace in a direction in which the first media content is directed.
8. The method as recited in claim 6, further comprising:
a movement start point of the first game object is determined based on a preset area in a media content display area in which the first media content is displayed.
9. The method as recited in claim 1, further comprising:
a frequency of presentation of the first game object within the first game subspace is determined based on a tempo and/or musical interval of music currently played in the first media content.
10. The method of claim 9, wherein:
the faster the tempo or the higher the interval of the music, the higher the presentation frequency; and/or
the slower the tempo or the lower the interval of the music, the lower the presentation frequency.
11. The method of claim 1, wherein the first game object comprises an animated model of equipment or constituent elements used by the activity to which the first media content relates.
12. The method of claim 11, wherein:
If the first media content relates to a sports-like activity, the first game object includes an animated model of sports equipment used by the sports activity; or alternatively
If the first media content relates to a musical class activity, the first game object includes an animated model of a musical instrument used by the musical class activity; or alternatively
If the first media content relates to fitness class activities, the first game object comprises an animated model of human action.
13. The method of claim 1, wherein the displaying a first game subspace within the first virtual space comprises:
A first game subspace is displayed within the first virtual space in response to a first operation by the user or based on a game presentation time node.
14. The method of claim 13, wherein the first operation comprises an operation on a first visual element displayed within the first virtual reality space, the first visual element being associated with the first media content.
15. The method of claim 13, wherein the game presentation time node is determined based on the first media content.
16. The method of claim 15, wherein the game presentation time node is determined based on preset information contained in a media information stream of the first media content.
17. The method of claim 1, wherein the first game subspace comprises a first region for providing an active region for a user-controlled avatar, a second region for displaying game feedback information, and a third region for displaying the first game object.
18. The method of claim 1, wherein the displaying a first game subspace within the first virtual reality space comprises:
Superimposing the image of the first game subspace on the image of the first virtual reality space.
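Superimposing the subspace image on the virtual-space image, as in claim 18, is standard alpha compositing with the "over" operator. A per-pixel sketch (a real renderer would run this per fragment on the GPU; function and parameter names are illustrative):

```python
def composite_over(fg_rgba, bg_rgb):
    """Composite one pixel of the game-subspace image (foreground,
    RGBA with premultiplication applied here) over the corresponding
    pixel of the virtual-space image (background, RGB), using the
    standard 'over' operator: out = fg * a + bg * (1 - a)."""
    r, g, b, a = fg_rgba
    return tuple(c_fg * a + c_bg * (1 - a)
                 for c_fg, c_bg in zip((r, g, b), bg_rgb))

# A half-transparent red subspace pixel over a blue background
print(composite_over((1.0, 0.0, 0.0, 0.5), (0.0, 0.0, 1.0)))
# (0.5, 0.0, 0.5)
```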
19. A virtual reality-based game processing apparatus, comprising:
A virtual space display unit, configured to display a first virtual reality space, the first virtual reality space being used to present first media content to a user;
A game space display unit, configured to display a first game subspace within the first virtual reality space, so that the user can observe the first media content and the first game subspace at the same time;
A game object display unit, configured to display a first game object within the first game subspace, the first game object being associated with the first media content; and
A feedback information display unit, configured to display corresponding game feedback information based on an operation of the user on the first game object.
20. An electronic device, comprising:
At least one memory and at least one processor;
Wherein the memory is configured to store program code, and the processor is configured to invoke the program code stored in the memory to cause the electronic device to perform the method of any one of claims 1 to 18.
21. A non-transitory computer storage medium,
Wherein the non-transitory computer storage medium stores program code that, when executed by a computer device, causes the computer device to perform the method of any one of claims 1 to 18.
CN202211528421.2A 2022-11-30 2022-11-30 Game processing method and device based on virtual reality, electronic equipment and storage medium Pending CN118105689A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211528421.2A CN118105689A (en) 2022-11-30 2022-11-30 Game processing method and device based on virtual reality, electronic equipment and storage medium
US18/525,503 US20240177435A1 (en) 2022-11-30 2023-11-30 Virtual interaction methods, devices, and storage media

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211528421.2A CN118105689A (en) 2022-11-30 2022-11-30 Game processing method and device based on virtual reality, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN118105689A true CN118105689A (en) 2024-05-31

Family

ID=91212759

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211528421.2A Pending CN118105689A (en) 2022-11-30 2022-11-30 Game processing method and device based on virtual reality, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN118105689A (en)

Similar Documents

Publication Publication Date Title
RU2719454C1 (en) Systems and methods for creating, translating and viewing 3d content
US9616338B1 (en) Virtual reality session capture and replay systems and methods
US9429912B2 (en) Mixed reality holographic object development
EP2887322B1 (en) Mixed reality holographic object development
KR20150108842A (en) Mixed reality filtering
US20180169517A1 (en) Reactive animation for virtual reality
JP7222121B2 (en) Methods and Systems for Managing Emotional Compatibility of Objects in Stories
US20190122408A1 (en) Conversion of 2d diagrams to 3d rich immersive content
CN113194329B (en) Live interaction method, device, terminal and storage medium
CN118105689A (en) Game processing method and device based on virtual reality, electronic equipment and storage medium
US20240078734A1 (en) Information interaction method and apparatus, electronic device and storage medium
CN118227005A (en) Information interaction method, device, electronic equipment and storage medium
US20230377248A1 (en) Display control method and apparatus, terminal, and storage medium
US20230162448A1 (en) Projector assisted augmented reality, adjusting ar size, shape and presentation based on real world space
WO2024016880A1 (en) Information interaction method and apparatus, and electronic device and storage medium
US20240177435A1 (en) Virtual interaction methods, devices, and storage media
CN117631904A (en) Information interaction method, device, electronic equipment and storage medium
US20220237844A1 (en) Information processing system, information processing method, and computer program
CN117111723A (en) Special effect display method, device, electronic equipment and storage medium
CN117519456A (en) Information interaction method, device, electronic equipment and storage medium
CN117519457A (en) Information interaction method, device, electronic equipment and storage medium
CN117899456A (en) Display processing method, device, equipment and medium of two-dimensional assembly
CN117788759A (en) Information pushing method, device, electronic equipment and storage medium
CN117994284A (en) Collision detection method, collision detection device, electronic equipment and storage medium
CN117376591A (en) Scene switching processing method, device, equipment and medium based on virtual reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination