CN111359200B - Game interaction method and device based on augmented reality


Info

Publication number
CN111359200B
CN111359200B
Authority
CN
China
Prior art keywords
game
determining
game scenario
virtual character
live
Prior art date
Legal status
Active
Application number
CN202010120288.1A
Other languages
Chinese (zh)
Other versions
CN111359200A (en)
Inventor
沈佳波
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202010120288.1A
Publication of CN111359200A
Application granted
Publication of CN111359200B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214 Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 Input arrangements where the surface is also a display device, e.g. touch screens
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals involving aspects of the displayed game scene
    • A63F 13/525 Changing parameters of virtual cameras
    • A63F 13/5258 Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/822 Strategy games; Role-playing games


Abstract

The invention discloses a game interaction method and device based on augmented reality. The method comprises the following steps: acquiring a live-action image captured by a terminal device at its current shooting view angle; determining a first game scenario in the game that matches the live-action image; determining, in the live-action image, a target display position of a virtual character associated with a second game scenario that satisfies a condition among the first game scenarios; and, in response to a first touch operation applied to the terminal device, displaying the virtual character at the target display position and controlling the interaction of the virtual character based on at least the second game scenario. The invention solves the technical problem that existing human-computer interactive games cannot realize interaction between a virtual character and a physical scene.

Description

Game interaction method and device based on augmented reality
Technical Field
The invention relates to the field of computers, in particular to a game interaction method and device based on augmented reality.
Background
AR (Augmented Reality) technology fuses virtual information with the real world; applying AR technology to games can provide players with a strongly immersive experience.
In conventional AR games, however, the virtual character's performance is confined to the screen and cannot interact with the physical scene; the world created by the game is split from the real world in which the player is located, so the player's immersive experience is poor.
For example, an interactive movie game is a game that lets players change the protagonist's fate through their choices, striving for the best ending. Currently, the visuals of interactive movie games are produced in the following ways:
Mode one: the game is rendered through refined model performances and uses cinematic camera language, giving the player a strong sense of immersion; however, the models act only within the virtual scene on the screen and cannot interact with the real world.
Mode two: the plot is built by editing pre-shot live-action footage. Games of this type play smoothly and immerse the player, but when interaction occurs the sense of interruption is strong because the footage is limited.
Mode three: the game is realized through static picture resources, which is cheap to produce but less effective. Because picture resources are limited, characters cannot give vivid performances, relationships between characters and events can only be conveyed by a limited set of character pictures, and the repeated reuse of stock pictures easily wearies players, so the sense of participation is weak and the immersive experience is poor.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
Embodiments of the present invention provide a game interaction method and device based on augmented reality, to at least solve the technical problem that existing human-computer interactive games cannot realize interaction between a virtual character and a physical scene.
According to one aspect of an embodiment of the present invention, a game interaction method based on augmented reality is provided, comprising: acquiring a live-action image captured by a terminal device at its current shooting view angle; determining a first game scenario in the game that matches the live-action image; determining, in the live-action image, a target display position of a virtual character associated with a second game scenario that satisfies a condition among the first game scenarios; and, in response to a first touch operation applied to the terminal device, displaying the virtual character at the target display position and controlling the interaction of the virtual character based on at least the second game scenario.
Further, the game interaction method based on augmented reality further comprises the following steps: and analyzing the live-action image to obtain a first game scenario corresponding to the analysis result.
Further, the game interaction method based on augmented reality further comprises the following steps: and identifying an object in the live-action image, and acquiring a first game scenario matched with the object.
Further, the game interaction method based on augmented reality further comprises the following steps: traversing feature tags preset for the game scenarios in the game; and acquiring, as the first game scenario, a game scenario whose feature tag matches the object.
Further, the game interaction method based on augmented reality further comprises the following steps: a second game scenario is determined from the first game scenario.
Further, the game interaction method based on augmented reality further comprises the following steps: if there is one first game scenario, determining that first game scenario as the second game scenario.
Further, the game interaction method based on augmented reality further comprises the following steps: if there are multiple first game scenarios, determining the second game scenario from among them according to the priorities corresponding to the first game scenarios.
Further, the game interaction method based on augmented reality further comprises the following steps: if there are multiple first game scenarios, displaying a selection control for the first game scenarios on a display screen of the terminal device; and, in response to a touch operation acting on the selection control, determining the first game scenario corresponding to the touch operation as the second game scenario.
Further, the game interaction method based on augmented reality further comprises the following steps: acquiring character data of a virtual character associated with a second game scenario; and determining the target display position of the virtual character in the live-action image according to the character data.
Further, the character data includes at least one of: the virtual character's range of motion data, the virtual character's pose data, and the virtual character's footprint data at different poses.
Further, the game interaction method based on augmented reality further comprises the following steps: determining a display range in the live-action image according to the character data; and determining the target display position in the display range.
Further, the game interaction method based on augmented reality further comprises the following steps: acquiring shooting information of the terminal device, wherein the shooting information comprises at least one of the following: position information of the terminal device and the shooting angle of the terminal device; determining the display area of the virtual character within the display range according to the shooting information; and determining a target display position corresponding to the display area within the display range.
Further, the game interaction method based on augmented reality further comprises the following steps: determining at least one alternative display position corresponding to the display area in the display range; acquiring a plurality of live-action images under different shooting visual angles; comparing the plurality of live-action images to determine environment information corresponding to the live-action images, wherein the environment information at least comprises one of the following: environmental height, environmental depth; determining at least one alternative display position in a display range according to the environment information and the display area; and determining a target display position from the at least one alternative display position in response to the second touch operation.
Further, the game interaction method based on augmented reality further comprises the following steps: and after determining the target display position of the virtual character associated with the second game scenario information in the live-action image, displaying position prompt information, wherein the position prompt information is used for prompting the terminal equipment to adjust from the first shooting position to the second shooting position indicated by the position prompt information under the current shooting view angle.
Further, the game interaction method based on augmented reality further comprises the following steps: and under the condition that the first shooting position of the terminal equipment does not meet the preset condition, displaying position prompt information, wherein the preset condition at least comprises: the distance between the first photographing position and the target display position of the virtual character is greater than or less than a preset distance.
Further, the game interaction method based on augmented reality further comprises the following steps: under the condition that the number of the virtual characters is one, determining a second shooting position according to the target display position of the virtual characters and the position information of the terminal equipment under the current shooting view angle; and displaying the position prompt information at the second shooting position.
Further, the game interaction method based on augmented reality further comprises the following steps: under the condition that the number of the virtual characters is a plurality of, determining a second shooting position according to the interaction relation among the plurality of virtual characters and the position information of the terminal equipment under the current shooting view angle; and displaying the position prompt information at the second shooting position.
Further, the game interaction method based on augmented reality further comprises the following steps: after determining the target display position of the virtual character associated with the second game scenario information in the live-action image, angle prompt information is displayed, wherein the angle prompt information is used for prompting the terminal equipment to be adjusted from the current shooting view angle to the target shooting view angle.
Further, the game interaction method based on augmented reality further comprises the following steps: determining a target shooting visual angle according to the second game scenario information; and displaying the angle prompt information according to the target shooting visual angle and the current shooting visual angle.
Further, the game interaction method based on augmented reality further comprises the following steps: acquiring an interaction request corresponding to a second game scenario; and controlling the interaction action of the virtual character according to the operation instruction for the interaction request received by the terminal equipment.
Further, the game interaction method based on augmented reality further comprises the following steps: acquiring weather information acquired by terminal equipment; a first game scenario in the game that matches the weather information is determined.
Further, the game interaction method based on augmented reality further comprises the following steps: after controlling the interactive action of the virtual character based at least on the second game scenario, displaying an effect image corresponding to the second game scenario or the interactive action on a display screen of the terminal device, and displaying an interactive result corresponding to the effect image on the display screen in response to the third touch operation.
According to another aspect of the embodiment of the present invention, there is also provided a game interaction device based on augmented reality, including: the acquisition module is used for acquiring a live-action image of the terminal equipment under the current shooting visual angle; the first determining module is used for determining a first game scenario matched with the live-action image in the game; a second determining module for determining a target display position of a virtual character associated with a second game scenario satisfying the condition in the first game scenario in the live-action image; and the response module is used for responding to the first touch operation acted on the terminal equipment, displaying the virtual character on the target display position and controlling the interaction action of the virtual character at least based on the second game scenario.
According to another aspect of the embodiment of the present invention, there is also provided a storage medium, where the storage medium includes a stored program, and when the program runs, the device where the storage medium is controlled to execute the above-mentioned game interaction method based on augmented reality.
According to another aspect of the embodiment of the present invention, there is also provided a processor for running a program, where the program executes the above-mentioned augmented reality-based game interaction method.
In the embodiment of the application, a live-action image is matched with a game scenario, and the virtual character in the game is matched with a position in the live-action image. A live-action image of the terminal device at the current shooting view angle is obtained, and a first game scenario in the game matching the live-action image is determined; then the target display position, in the live-action image, of the virtual character associated with a second game scenario satisfying a condition among the first game scenarios is determined; finally, in response to a first touch operation acting on the terminal device, the virtual character is displayed at the target display position, and the interaction of the virtual character is controlled based on at least the second game scenario.
According to the scheme provided by the application, the live-action image can be matched with the game scenario, and the virtual character in the game can be matched with a position in the live-action image, thereby realizing interaction between the virtual character in the game and the physical scene in reality: the virtual character can act as a guest in the player's real world, enhancing the player's immersive game experience. In addition, the game scenario is matched with the live-action image, i.e. different live-action images correspond to different game scenarios, so that the game scenario better fits the real physical scene, further improving the player's immersive game experience.
Therefore, the scheme provided by the application achieves the purpose of interaction between the virtual character in the game and the physical scene, thereby realizing the technical effect of improving the game's immersive experience, and further solving the technical problem that existing human-computer interactive games cannot realize interaction between a virtual character and a physical scene.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a flowchart of a game interaction method based on augmented reality according to an embodiment of the present invention;
FIG. 2 is a display schematic diagram of an alternative terminal device according to an embodiment of the present invention;
FIG. 3 is a display schematic diagram of an alternative terminal device according to an embodiment of the present invention;
FIG. 4 is a display schematic diagram of an alternative terminal device according to an embodiment of the present invention;
FIG. 5 is a display schematic diagram of an alternative terminal device according to an embodiment of the present invention;
FIG. 6 is a display schematic diagram of an alternative terminal device according to an embodiment of the present invention;
FIG. 7 is a display schematic diagram of an alternative terminal device according to an embodiment of the present invention;
FIG. 8 is a display schematic diagram of an alternative terminal device according to an embodiment of the present invention;
FIG. 9 is a display schematic diagram of an alternative terminal device according to an embodiment of the present invention; and
FIG. 10 is a schematic diagram of a game interaction device based on augmented reality according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Clearly, the described embodiments are only some embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
According to an embodiment of the present invention, an embodiment of a game interaction method based on augmented reality is provided, and it should be noted that a terminal device may be used as an execution body of the embodiment, where the terminal device may be a device with an image capturing function, for example, a smart phone, a tablet, or the like. Optionally, the terminal device further has a display unit, so as to display a scene related to the game and display related information during man-machine interaction.
Furthermore, it should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in an order other than that illustrated herein.
Fig. 1 is a flowchart of a game interaction method based on augmented reality according to an embodiment of the present invention. The method may be applied to an AR game, for example an interactive movie AR game, and may also be applied to an RPG (Role-Playing Game); in this embodiment, an AR game is taken as the example. Specifically, as shown in fig. 1, the method comprises the following steps:
Step S102, obtaining a live-action image of the terminal equipment under the current shooting view angle.
In an alternative embodiment, the terminal device has an AR game installed therein, which may be, for example, an interactive movie AR game. When the player starts the game, the terminal device automatically turns on an image capture device (e.g., a camera) to capture a live-action image in the real scene in which the terminal device is located.
In another alternative embodiment, an interactive movie game is installed in the terminal device. After the player starts the interactive movie game, a "collect images?" dialog pops up on the display screen of the terminal device; once the player confirms image collection, the terminal device starts the image capture device and captures live-action images of the real scene in which the terminal device is located.
It should be noted that in step S102, the current shooting view angle may be the view angle of the terminal device immediately after it starts the image capture device, or the view angle obtained after the image capture device has started and the player has adjusted the capture position and view angle of the terminal device.
Step S104, determining a first game scenario matched with the live-action image in the game.
In step S104, the first game scenario is a game scenario that matches the live-action image. The first game scenario may be stored in the terminal device or in a server corresponding to the game; in the latter case, when a game scenario needs to be acquired, the terminal device acquires the game scenario corresponding to the live-action image from the server.
In an alternative embodiment, the terminal device may analyze the live-action image and obtain a first game scenario corresponding to the analysis result. Specifically, the terminal device identifies an object in the live-action image and acquires a first game scenario matching the object. Acquiring the first game scenario matching the object specifically comprises: traversing the feature tags preset for the game scenarios in the game, and acquiring a first game scenario whose feature tag matches the object. Optionally, a feature tag preset in a game scenario may represent the category to which an object in the live-action image belongs. For example, a game scenario includes a sofa whose feature tag is "object the virtual character is allowed to sit on"; the live-action image contains no sofa but does contain a stool. If the stool carries the same feature tag as the sofa, the game scenario is determined to match the stool. The virtual character's line when interacting with the player may then be, roughly: "Don't you have a sofa? Oh well, this stool is a bit hard, but it will do."
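To make the tag-matching step concrete, here is a minimal Python sketch. It is not from the patent; all names (SCENARIO_TAGS, OBJECT_TAGS, match_scenarios) and the tag vocabulary are illustrative assumptions.

```python
# Sketch of matching recognized live-action objects to game scenarios
# via preset feature tags. All names here are illustrative.

# Each scenario lists the feature tags it requires; each tag names a
# category of object, e.g. "sittable" for "object the character can sit on".
SCENARIO_TAGS = {
    "tea_at_home": {"sittable", "indoor"},
    "garden_walk": {"plant"},
}

# Recognized objects are mapped onto the same tag vocabulary, so a stool
# and a sofa both satisfy a scenario that asks for a "sittable" object.
OBJECT_TAGS = {
    "sofa": {"sittable", "indoor"},
    "stool": {"sittable", "indoor"},
    "flower": {"plant"},
}

def match_scenarios(recognized_objects):
    """Return the first game scenarios whose feature tags are all covered
    by the tags of the objects found in the live-action image."""
    available = set()
    for obj in recognized_objects:
        available |= OBJECT_TAGS.get(obj, set())
    return [name for name, tags in SCENARIO_TAGS.items() if tags <= available]

print(match_scenarios(["stool"]))  # ['tea_at_home'] -- the stool substitutes for a sofa
```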
Step S106, determining, in the live-action image, a target display position of the virtual character associated with a second game scenario that satisfies the condition among the first game scenarios.
In step S106, the second game scenario may comprise two parts: a main-line task (i.e. the main-line plot) and branch tasks (i.e. branch plots). A main-line task is a task the player must complete to progress through the game, and different main-line tasks are generally generated as the player's level rises; a branch task is a task the player may complete optionally, and whether it is completed does not affect the main line. Typically, a different branch task is triggered when the player encounters a non-player character.
Optionally, the terminal device identifies the object in the live-action image and determines, according to the recognition result, whether the second game scenario is a main-line task or a branch task. Specifically, if the terminal device detects that an object in the live-action image matches a main-line task, it releases the main-line task; if it detects that no object in the live-action image matches a main-line task, it releases a branch task. For example, when a player opens the game outdoors, the terminal device cannot detect any relevant main-line task and therefore releases a branch task, in which the virtual character asks the player to run a simple errand, such as picking flowers for the female protagonist; the player completes the branch task simply by aiming the terminal device at roadside plants and letting it recognize the objects.
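A minimal sketch of this main-line-first selection, under the assumption that tasks are plain records with a required_object field; the patent describes only the behaviour, not the data structures.

```python
def select_task(recognized_objects, mainline_tasks, branch_tasks):
    """Release a main-line task when a recognized object matches one;
    otherwise fall back to a branch task."""
    for task in mainline_tasks:
        if task["required_object"] in recognized_objects:
            return task
    # Nothing in view matches a main-line task (e.g. the game was opened
    # outdoors), so release a simple branch task instead.
    return branch_tasks[0] if branch_tasks else None

mainline = [{"name": "tea at home", "required_object": "sofa"}]
branch = [{"name": "pick flowers", "required_object": "plant"}]
print(select_task({"plant", "stone"}, mainline, branch)["name"])  # pick flowers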
Further, after determining the second game scenario satisfying the condition among the first game scenarios, the terminal device determines the target display position, in the live-action image, of the virtual character associated with the second game scenario. For example, in the second game scenario the virtual character sits on a sofa, and there is also a sofa in the live-action image. In the display schematic diagram of the terminal device shown in fig. 2, the terminal device selects a target display position where the virtual character can be placed from the sittable horizontal surface of the sofa, such as area B on the sofa in fig. 2; area A of the sofa is not a horizontal surface, so the virtual character cannot be placed there. Optionally, the target display position and non-target display positions may be displayed in different colors, for example area A in fig. 2 in red and area B in green.
Step S108, in response to a first touch operation acting on the terminal device, displaying the virtual character at the target display position, and controlling the interaction of the virtual character based on at least the second game scenario.
In step S108, the first touch operation may be the player's operation of a confirmation control on the terminal device, where the confirmation control is used to confirm the target display position of the virtual character; for example, in the display schematic diagram of the terminal device shown in fig. 3, the virtual character is displayed at the target display position. Further, after the virtual character is displayed at the target display position, the terminal device controls the virtual character to perform the interactive actions of the second game scenario, for example controlling the virtual character to speak, the spoken content being the corresponding lines of the second game scenario. The player may then react to the virtual character's interaction; for example, the player's voice is recorded through the voice recording function of the terminal device, thereby realizing interaction between the player and the virtual character.
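The step S108 behaviour might be sketched as follows; Renderer and Audio are assumed stand-ins for the device's rendering and voice facilities, not APIs from the patent.

```python
class Renderer:          # stand-in for the AR rendering layer (assumed)
    def show_character(self, character, position):
        print(f"{character} displayed at {position}")

class Audio:             # stand-in for voice playback/recording (assumed)
    def play_voice(self, line):
        print(f'character says: "{line}"')
    def record_reply(self):
        return "player's recorded reply"

def on_first_touch(target_position, scenario, renderer, audio):
    """Step S108: place the virtual character at the confirmed target
    position, then drive its interaction from the second game scenario."""
    renderer.show_character(scenario["character"], target_position)
    audio.play_voice(scenario["line"])   # line text comes from the scenario
    return audio.record_reply()          # the player's voice closes the loop

on_first_touch((2, 1), {"character": "guest", "line": "Hello!"},
               Renderer(), Audio())
```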
Based on the scheme defined in steps S102 to S108, the live-action image is matched with a game scenario, and the virtual character in the game is matched with a position in the live-action image: a live-action image of the terminal device at the current shooting view angle is obtained, a first game scenario in the game matching the live-action image is determined, the target display position in the live-action image of the virtual character associated with a second game scenario satisfying a condition among the first game scenarios is determined, and finally, in response to a first touch operation acting on the terminal device, the virtual character is displayed at the target display position and its interaction is controlled based on at least the second game scenario.
It is easy to see that the scheme provided by the application matches the live-action image with a game scenario and matches the virtual character in the game with a position in the live-action image, thereby realizing interaction between the virtual character in the game and the physical scene in reality: the virtual character can act as a guest in the player's real world, enhancing the player's immersive game experience. In addition, the game scenario is matched with the live-action image, i.e. different live-action images correspond to different game scenarios, so that the game scenario better fits the real physical scene, further improving the player's immersive game experience.
Therefore, the scheme provided by the application achieves the purpose of interaction between the virtual character in the game and the physical scene, thereby realizing the technical effect of improving the game's immersive experience, and further solving the technical problem that existing human-computer interactive games cannot realize interaction between a virtual character and a physical scene.
In an alternative embodiment, after the first game scenario matching the live-action image is obtained through step S102 and step S104, the terminal device needs to determine the second game scenario from the first game scenario. Wherein the terminal device may determine the second game scenario from the first game scenario according to the number of the first game scenarios.
Optionally, if there is one first game scenario, that first game scenario is determined as the second game scenario. If there are multiple first game scenarios, the second game scenario is determined from among them according to the priorities corresponding to the first game scenarios, the second game scenario being the first game scenario with the highest priority. For example, main-line tasks have a higher priority than branch tasks.
In another alternative embodiment, if there are multiple first game scenarios, a selection control for the first game scenarios is displayed on the display screen of the terminal device, and in response to a touch operation acting on the selection control, the first game scenario corresponding to the touch operation is determined as the second game scenario. Optionally, the selection control may be a drop-down list containing the multiple first game scenarios, from which the player may select one first game scenario as the second game scenario.
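A compact sketch of this selection logic, assuming an illustrative priority field on each scenario record; the patent does not prescribe a representation.

```python
def choose_second_scenario(first_scenarios, player_choice=None):
    """A single match is used directly; several matches are resolved by the
    player's touch on the selection control or, failing that, by priority."""
    if len(first_scenarios) == 1:
        return first_scenarios[0]
    if player_choice is not None:        # touch operation on the drop-down list
        return player_choice
    # e.g. a main-line scenario carries a higher priority than a branch one
    return max(first_scenarios, key=lambda s: s["priority"])

scenarios = [{"name": "branch", "priority": 1}, {"name": "main", "priority": 2}]
print(choose_second_scenario(scenarios)["name"])  # main
```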
Further, after determining the second game scenario, the terminal device determines a target display position of the virtual character associated with the second game scenario in the live-action image. Specifically, the terminal device first acquires character data of the virtual character associated with the second game scenario, and then determines a target display position of the virtual character in the live-action image according to the character data. Wherein the character data includes at least one of: the virtual character's range of motion data, the virtual character's pose data, and the virtual character's footprint data at different poses.
Optionally, the activity range data of the virtual character characterizes the virtual character's range of activity in the live-action image; the pose data characterizes the virtual character's pose, such as standing, walking, or sitting; and the footprint data in different poses characterizes the area the virtual character occupies, e.g. the footprint when standing differs from the footprint when sitting. The footprint information of different virtual characters in different poses may be stored in a local storage unit of the terminal device or in a server. When the player needs to place the virtual character, the camera is started to scan the real scene; the terminal device then reads the stored character data (or obtains it from the server) and generates, in the viewfinder interface, a graphic adapted to the player's distance that indicates the virtual character's placement area, angle and similar information, assisting the player in finding a reasonable placement position (i.e. the target display position) in the real scene.
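The three kinds of character data could be grouped as in the sketch below; the field names and units are assumptions for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class CharacterData:
    activity_range: str                            # e.g. "sofa"
    pose: str                                      # "standing", "walking", "sitting", ...
    footprint: dict = field(default_factory=dict)  # occupied area per pose, in m^2

    def occupied_area(self) -> float:
        """Occupied area for the current pose; sitting differs from standing."""
        return self.footprint[self.pose]

guest = CharacterData("sofa", "sitting", {"sitting": 0.4, "standing": 0.2})
print(guest.occupied_area())  # 0.4
```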
In an alternative embodiment, the character data of the virtual character may be obtained by analyzing the second game scenario. After obtaining the character data, the terminal device further determines the target display position of the virtual character in the live-action image: it first determines a display range in the live-action image according to the character data, and then determines the target display position within that display range. For example, the character data of the virtual character is: the activity range in the live-action image is the sofa, the pose is sitting, and the footprint is S. After analyzing the character data and the objects in the live-action image, the terminal device determines that the display range of the virtual character in the live-action image is the sofa (such as the sofa in the schematic diagram shown in fig. 2), and then determines the target display position within that display range.
Optionally, the terminal device first obtains shooting information of the terminal device, then determines a display area of the virtual character in a display range according to the shooting information, and finally determines a target display position corresponding to the display area in the display range. Wherein the shooting information comprises at least one of the following: position information of the terminal equipment and shooting angle of the terminal equipment. For example, in fig. 2, the display range is sofa, the occupied area of the virtual character in sitting is S, and the display area of the virtual character can be determined to be S.
After determining the display area, the terminal device further determines a target display position corresponding to the display area within the display range. Specifically, the terminal device determines at least one alternative display position corresponding to the display area in the display range, acquires a plurality of live-action images under different shooting visual angles, compares the live-action images, determines environment information corresponding to the live-action images, determines at least one alternative display position in the display range according to the environment information and the display area, and finally determines a target display position from the at least one alternative display position in response to a second touch operation. Wherein the environmental information includes at least one of: environmental height, environmental depth.
For example, in fig. 2, after determining that the display area is S and the display range is sofa, the terminal device shoots a plurality of live-action images at different shooting angles, then compares the live-action images to obtain information such as the environmental height and the environmental depth of the live-action images, and in combination with the display area of the virtual character, it can be determined that the sofa has a plurality of alternative display positions, that is, the virtual character can be placed at each alternative display position. At this time, the terminal device may select the target display position from the plurality of candidate display positions according to the second touch operation of the player, as in fig. 2, the player selects the B region on the sofa as the target display position.
In another alternative embodiment, after determining at least one alternative display position within the display range, the terminal device detects whether a second touch operation is received within a preset duration. If the second touch operation is detected, the terminal device selects the target display position from the at least one alternative display position according to the player's second touch operation; if it is not detected, the terminal device may select an optimal alternative display position as the target display position according to the second game scenario, or may select one of the alternative display positions at random as the target display position.
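A sketch of this resolution step, with an assumed wait_for_touch callback standing in for the touch-event source and a 5-second preset duration chosen arbitrarily.

```python
import random
import time

def pick_target_position(candidates, wait_for_touch, timeout_s=5.0):
    """Resolve the target display position from the alternative positions:
    honour a second touch operation within the preset duration, otherwise
    fall back to an automatic (here: random) pick, as described above."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        touched = wait_for_touch()       # returns a candidate or None
        if touched in candidates:
            return touched
        time.sleep(0.05)
    return random.choice(candidates)     # no touch detected in time

# e.g. a touch callback that immediately picks area B2 on the sofa
print(pick_target_position(["B1", "B2"], lambda: "B2"))  # B2
```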
It should be noted that, after determining the target display position of the virtual character associated with the second game scenario information in the live-action image, the photographing position of the terminal device may not be the optimal photographing position, and in order to obtain a better game experience, it is also necessary to determine the photographing position of the terminal device.
Optionally, the terminal device may further display position prompt information, which prompts that the terminal device should be adjusted, at the current shooting view angle, from the first shooting position to the second shooting position indicated by the prompt. The position prompt information is displayed when the first shooting position of the terminal device does not satisfy a preset condition, the preset condition at least comprising: the distance between the first shooting position and the target display position of the virtual character is greater than, or less than, a preset distance. For example, M1 in fig. 4 and M2 in fig. 5 are both position prompt information; their directions differ, indicating different directions in which the shooting position of the terminal device should be adjusted: the prompt M1 in fig. 4 indicates that the current shooting position of the terminal device is too close to the virtual character's target display position, while the prompt M2 in fig. 5 indicates that it is too far away.
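The distance check behind the position prompt might look like the following sketch; the 1 m / 3 m band is an assumed example, since the patent leaves the preset distance open.

```python
import math

def position_prompt(shooting_pos, target_pos, min_d=1.0, max_d=3.0):
    """Emit position prompt information when the first shooting position is
    closer or farther than the preset distances (assumed example band)."""
    d = math.dist(shooting_pos, target_pos)
    if d < min_d:
        return "too close to the target display position, move back (cf. M1)"
    if d > max_d:
        return "too far from the target display position, move closer (cf. M2)"
    return None  # the first shooting position already satisfies the condition

print(position_prompt((0.0, 0.0), (0.5, 0.0)))  # too close ...
```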
In addition, the position prompt information is related to the number of virtual characters. Optionally, when there is one virtual character, the second shooting position is determined according to the virtual character's target display position and the position information of the terminal device at the current shooting view angle, and the position prompt information is displayed at the second shooting position. When there are multiple virtual characters, the second shooting position is determined according to the interaction relationships among the virtual characters and the position information of the terminal device at the current shooting view angle, and the position prompt information is displayed at the second shooting position.
It should be noted that the player can adjust the shooting position of the terminal device according to the position prompt information to reach an optimal shooting position, so that the live-action images the player shoots are better framed and more natural, further improving the player's immersive game experience; for example, the cone Q in fig. 5 marks the player's optimal shooting position.
In addition, it should be noted that, after determining the target display position of the virtual character associated with the second game scenario information in the live-action image, the angle of the terminal device may not be the optimal photographing angle, for example, the terminal device is not parallel to the horizontal plane, and the photographing angle of the terminal device needs to be determined for better game experience.
Optionally, the terminal device may further display angle prompt information, which prompts that the terminal device should be adjusted from the current shooting view angle to the target shooting view angle. For example, N in fig. 6 is angle prompt information that prompts the player how to adjust the shooting angle of the terminal device, e.g. its horizontal angle, vertical angle, and so on.
Specifically, the terminal device first determines the target shooting view angle according to the second game scenario information, and then displays the angle prompt information according to the target shooting view angle and the current shooting view angle. For example, in fig. 6, when the two triangles of the angle prompt information are aligned horizontally, the terminal device is parallel to the ground; when the two triangles form a straight line, the terminal device is perpendicular to the ground.
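A sketch of the angle-prompt comparison; treating 0 degrees as parallel to the ground and 90 degrees as perpendicular is an assumption consistent with the fig. 6 description, not a value from the patent.

```python
def angle_prompt(current_pitch_deg, target_pitch_deg, tolerance_deg=2.0):
    """Compare the current shooting view angle against the target shooting
    view angle derived from the second game scenario (assumed convention:
    0 deg parallel to the ground, 90 deg perpendicular)."""
    delta = target_pitch_deg - current_pitch_deg
    if abs(delta) <= tolerance_deg:
        return "aligned"   # the two triangles of the indicator line up
    direction = "up" if delta > 0 else "down"
    return f"tilt the device {direction} by about {abs(delta):.0f} deg"

print(angle_prompt(10.0, 0.0))  # tilt the device down by about 10 deg
```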
Further, after the shooting position and shooting angle of the terminal device are adjusted, a start control (as shown in fig. 7) appears in a display screen of the terminal device, and after a player performs related operation (for example, clicking) on the start control, the terminal device starts a game and records the game in response to the operation of the player.
It should be noted that while recording the game, the terminal device may control the interaction of the virtual character based on at least the second game scenario, i.e. realize human-computer interaction. In this application, the graphical user interface of the terminal device may be realized through four layers: a physical-world layer, a model performance layer, a special effects layer, and a human-computer interaction layer. The physical-world layer displays the live-action image, which it can acquire from the physical scene through the image capture device. The model performance layer displays virtual objects; in this layer, besides the virtual objects, controls for guiding functions that assist the player in shooting can be displayed, for example the target display position in fig. 2, the position prompt information in figs. 4 and 5, and the angle prompt information in fig. 6. It may also include guiding controls for video recording that guide the player to interact with the appropriate virtual character in turn; for example, after virtual character A finishes a line, the player can be guided by these controls to turn the camera to virtual character B. The special effects layer displays effect models related to the plot information (such as water stains and soil stains). The human-computer interaction layer displays the controls for human-computer interaction (e.g. the "start" button in fig. 7).
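The four-layer composition could be organized as in this sketch, composited bottom-to-top each frame; the class names are illustrative, not from the patent.

```python
class PhysicalWorldLayer:        # live-action image from the camera
    def draw(self, frame): frame.append("live-action image")

class ModelPerformanceLayer:     # virtual characters plus shooting guides
    def draw(self, frame): frame.append("characters, position/angle prompts")

class EffectsLayer:              # plot-related effect models
    def draw(self, frame): frame.append("water stains, fog, ...")

class InteractionLayer:          # human-computer interaction controls
    def draw(self, frame): frame.append('controls such as the "start" button')

def compose_frame():
    frame = []
    for layer in (PhysicalWorldLayer(), ModelPerformanceLayer(),
                  EffectsLayer(), InteractionLayer()):
        layer.draw(frame)       # bottom layer first, top layer last
    return frame

print(compose_frame())
```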
In an alternative embodiment, the terminal device controls the interaction of the virtual character based on at least the second game scenario after the virtual character is displayed at the target display position. Specifically, the terminal device first obtains an interaction request corresponding to the second game scenario, and then controls the virtual character's interactive actions according to the operation instruction for the interaction request received by the terminal device. For example, in the schematic diagram shown in fig. 8, the terminal device displays, according to the second game scenario, interaction requests selectable by the player (such as the voice option "Who are you?"). After detecting the interactive voice "What are you trying to do?", the terminal device controls the virtual character to spray water at the player, producing the effect diagram shown in fig. 9, where the dashed circles represent water droplets.
In another alternative embodiment, the terminal device may further acquire weather information it collects and determine a first game scenario in the game that matches the weather information. For example, indoors, the terminal device collects the ambient temperature and detects that it is low; the terminal device then controls the virtual character to say "It's really cold in your home." Optionally, two interactive voice options may be displayed in the graphical user interface of the terminal device, the first being "Let's go out for a walk" and the second "Let me make you a cup of hot tea." After the player selects the second option, the virtual character holds a cup of hot tea while the display of the terminal device fogs up.
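A toy sketch of weather-matched scenario selection; the temperature threshold and scenario keys are assumed example values.

```python
def scenario_for_weather(ambient_temp_c, scenarios):
    """Match a first game scenario to collected weather information
    (assumed 15 deg C threshold for a "cold" scenario)."""
    if ambient_temp_c < 15.0:
        return scenarios["cold"]      # e.g. the character offers hot tea
    return scenarios["default"]

print(scenario_for_weather(8.0, {"cold": "hot tea scene",
                                 "default": "garden walk"}))  # hot tea scene
```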
Further, after controlling the interactive action of the virtual character based on at least the second game scenario, an effect image corresponding to the second game scenario or the interactive action is displayed on the display screen of the terminal device, and an interaction result corresponding to the effect image is displayed on the screen in response to a third touch operation. For example, in the schematic diagram shown in fig. 9, the player may interact through a hand-wiping action.
After the game starts, the terminal device can record it; after recording completes, the terminal device retains the player's voice, the corresponding interaction results and so on, forming a finished movie game. After the movie game is completed, the player can also share the generated work.
As can be seen from the above, the scheme provided by the application uses AR (Augmented Reality) technology to determine the placement position and area of the virtual character, so that the virtual character can act as a "guest" of the player in the real world and converse with the player face to face, enhancing the game's immersive experience. In addition, the player can place the virtual character in different scenes, and the sensors of the terminal device detect environmental factors such as light, giving the player different feedback and enhancing the playability of a single plot.
Therefore, with the scheme provided by the application the player is not only an actor in the game but also its photographer, and every participating player can share their own works, which strengthens the spreadability of those works. In addition, the screen of the terminal device serves as the player's layer of perception of the outside world and becomes the player's perceptual medium, freeing the player's perception, breaking through the limitation of the "fourth wall", and further improving the player's participation and immersive experience.
Example 2
There is further provided in accordance with an embodiment of the present application an augmented reality-based game interaction device, wherein fig. 10 is a schematic diagram of an augmented reality-based game interaction device according to an embodiment of the present application, as shown in fig. 10, the device includes: an acquisition module 1001, a first determination module 1003, a second determination module 1005, and a response module 1007.
The acquiring module 1001 is configured to acquire a live-action image of the terminal device under a current shooting view angle; a first determining module 1003, configured to determine a first game scenario in the game that matches the live-action image; a second determining module 1005 for determining a target display position of the virtual character associated with the second game scenario satisfying the condition in the first game scenario in the live-action image; and a response module 1007 for displaying the virtual character on the target display position in response to the first touch operation applied to the terminal device and controlling the interaction of the virtual character based at least on the second game scenario.
Here, it should be noted that the above-mentioned obtaining module 1001, first determining module 1003, second determining module 1005 and response module 1007 correspond to steps S102 to S108 of the above embodiment; the examples and application scenarios realized by the four modules are the same as those of the corresponding steps, but are not limited to the content disclosed in embodiment 1 above.
In an alternative embodiment, the first determining module includes: the second acquisition module is used for analyzing the live-action image and acquiring a first game scenario corresponding to the analysis result.
In an alternative embodiment, the second acquisition module includes: and the third acquisition module is used for identifying the object in the live-action image and acquiring the first game scenario matched with the object.
In an alternative embodiment, the third acquisition module includes: a traversing module and a fourth obtaining module. The traversing module is used for traversing the feature tags preset for the game scenarios in the game; and the fourth obtaining module is used for acquiring a first game scenario whose feature tag matches the object.
In an alternative embodiment, the augmented reality-based game interaction device further comprises: and the third determining module is used for determining the second game scenario from the first game scenario.
In an alternative embodiment, the third determining module includes: a fourth determining module, configured to determine, if there is one first game scenario, that first game scenario as the second game scenario.
In an alternative embodiment, the third determining module includes: a fifth determining module, configured to determine, if there are multiple first game scenarios, the second game scenario from among them according to the priorities corresponding to the first game scenarios.
In an alternative embodiment, the third determining module includes: a display module and a sixth determining module. The display module is configured to display, if there are multiple first game scenarios, a selection control for the first game scenarios on the display screen of the terminal device; and the sixth determining module is configured to determine, in response to a touch operation acting on the selection control, the first game scenario corresponding to the touch operation as the second game scenario.
In an alternative embodiment, the second determining module includes: a fifth acquisition module and a seventh determination module. The fifth acquisition module is used for acquiring the character data of the virtual character associated with the second game scenario; and a seventh determining module, configured to determine a target display position of the virtual character in the live-action image according to the character data.
Optionally, the character data includes at least one of: the virtual character's range of motion data, the virtual character's pose data, and the virtual character's footprint data at different poses.
In an alternative embodiment, the seventh determining module includes: an eighth determination module and a ninth determination module. The eighth determining module is used for determining a display range in the live-action image according to the role data; and the ninth determining module is used for determining the target display position in the display range.
In an alternative embodiment, the ninth determination module includes: a sixth acquisition module, a tenth determination module, and an eleventh determination module. The sixth acquisition module is configured to acquire shooting information of the terminal device, where the shooting information includes at least one of the following: position information of the terminal equipment and shooting angle of the terminal equipment; a tenth determining module, configured to determine a display area of the virtual character within a display range according to the shooting information; and an eleventh determining module for determining a target display position corresponding to the display area within the display range.
In an alternative embodiment, the eleventh determining module includes: a twelfth determination module, a seventh acquisition module, a thirteenth determination module, a fourteenth determination module, and a fifteenth determination module. The twelfth determining module is used for determining at least one alternative display position corresponding to the display area in the display range; a seventh acquisition module, configured to acquire a plurality of live-action images under different shooting viewing angles; the thirteenth determining module is configured to compare the plurality of live-action images, and determine environmental information corresponding to the live-action images, where the environmental information includes at least one of the following: environmental height, environmental depth; a fourteenth determining module, configured to determine at least one alternative display position within a display range according to the environmental information and the display area; and a fifteenth determining module, configured to determine a target display position from at least one candidate display position in response to the second touch operation.
In an alternative embodiment, the augmented reality-based game interaction device further includes: a first display module, configured to display position prompt information after the target display position of the virtual character associated with the second game scenario is determined in the live-action image, where the position prompt information is used for prompting that the terminal device be adjusted, at the current shooting view angle, from a first shooting position to a second shooting position indicated by the position prompt information.
In an alternative embodiment, the first display module includes: a second display module, configured to display the position prompt information when the first shooting position of the terminal device does not satisfy a preset condition, where the preset condition at least includes: the distance between the first shooting position and the target display position of the virtual character being greater than or less than a preset distance.
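A minimal sketch of this preset-condition check, assuming a comfort band between a near and a far threshold (the claim names only a preset distance exceeded in either direction, so the two bounds here are an illustrative tuning choice):

```python
import math

def needs_position_prompt(first_pos: tuple, target_pos: tuple,
                          min_d: float = 1.0, max_d: float = 5.0) -> bool:
    """Prompt when the device is closer than min_d or farther than max_d metres."""
    d = math.dist(first_pos, target_pos)  # Euclidean distance, Python 3.8+
    return d < min_d or d > max_d
```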
In an alternative embodiment, the second display module includes: a sixteenth determining module and a third display module. The sixteenth determining module is configured to determine, when the number of virtual characters is one, the second shooting position according to the target display position of the virtual character and the position information of the terminal device at the current shooting view angle; and the third display module is configured to display the position prompt information at the second shooting position.
In an alternative embodiment, the second display module includes: a seventeenth determining module and a fourth display module. The seventeenth determining module is configured to determine, when the number of virtual characters is plural, the second shooting position according to the interaction relationship among the plurality of virtual characters and the position information of the terminal device at the current shooting view angle; and the fourth display module is configured to display the position prompt information at the second shooting position.
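The sketch below covers both cases by anchoring on the single character or on the centroid of the interacting group and backing off along the current device direction; the centroid heuristic and the 2.5 m standoff are assumptions for illustration, not values from the patent:

```python
import math

def second_shooting_position(char_positions: list[tuple],
                             device_pos: tuple,
                             standoff: float = 2.5) -> tuple:
    """Place the second shooting position standoff metres from the character
    anchor (one character, or the centroid of several), toward the device."""
    n = len(char_positions)
    cx = sum(p[0] for p in char_positions) / n
    cy = sum(p[1] for p in char_positions) / n
    cz = sum(p[2] for p in char_positions) / n
    dx, dy, dz = device_pos[0] - cx, device_pos[1] - cy, device_pos[2] - cz
    norm = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0  # avoid zero division
    return (cx + dx / norm * standoff,
            cy + dy / norm * standoff,
            cz + dz / norm * standoff)
```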
In an alternative embodiment, the augmented reality-based game interaction device further includes: a fifth display module, configured to display angle prompt information after the target display position of the virtual character associated with the second game scenario is determined in the live-action image, where the angle prompt information is used for prompting that the terminal device be adjusted from the current shooting view angle to a target shooting view angle.
In an alternative embodiment, the fifth display module includes: an eighteenth determining module and a sixth display module. The eighteenth determining module is configured to determine the target shooting view angle according to the second game scenario; and the sixth display module is configured to display the angle prompt information according to the target shooting view angle and the current shooting view angle.
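One hedged way to derive the prompt, assuming the target shooting view angle is stored with the second game scenario as a yaw value (the 5-degree tolerance is an illustrative threshold):

```python
def angle_prompt(target_yaw_deg: float, current_yaw_deg: float) -> str:
    """Return the rotation the player still has to make, as prompt text."""
    # Normalize the signed difference into [-180, 180).
    delta = (target_yaw_deg - current_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(delta) < 5.0:
        return "hold steady"  # close enough to the target view angle
    side = "right" if delta > 0 else "left"
    return f"turn {abs(delta):.0f} degrees to the {side}"
```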
In an alternative embodiment, the response module includes: an eighth acquisition module and a control module. The eighth acquisition module is configured to acquire an interaction request corresponding to the second game scenario; and the control module is configured to control the interaction action of the virtual character according to an operation instruction for the interaction request received by the terminal device.
In an alternative embodiment, the augmented reality-based game interaction device further includes: a ninth acquisition module and a nineteenth determining module. The ninth acquisition module is configured to acquire weather information collected by the terminal device; and the nineteenth determining module is configured to determine a first game scenario in the game that matches the weather information.
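A small sketch of this weather matching, assuming each scenario carries weather tags analogous to the feature tags matched against objects in the live-action image (the tag scheme is hypothetical):

```python
def match_by_weather(scenarios: list[dict], weather: str) -> list[dict]:
    """Return the game scenarios whose weather tags include the current weather."""
    return [s for s in scenarios if weather in s.get("weather_tags", ())]

# Example: match_by_weather([{"id": "rain_quest", "weather_tags": ("rain",)}], "rain")
```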
In an alternative embodiment, the augmented reality-based game interaction device further includes: a seventh display module and an eighth display module. The seventh display module is configured to display, on a display screen of the terminal device, an effect image corresponding to the second game scenario or to the interaction action; and the eighth display module is configured to display, in response to a third touch operation, an interaction result corresponding to the effect image on the display screen.
Embodiment 3
According to another aspect of the embodiments of the present invention, there is also provided a storage medium including a stored program, wherein, when the program runs, a device in which the storage medium is located is controlled to execute the augmented reality-based game interaction method of Embodiment 1 described above.
Embodiment 4
According to another aspect of the embodiments of the present application, there is also provided a processor configured to run a program, wherein the program, when run, executes the augmented reality-based game interaction method of Embodiment 1.
The foregoing embodiment numbers of the present application are merely for description and do not imply any ranking of the embodiments.
In the foregoing embodiments of the present application, each embodiment is described with its own emphasis; for any part not detailed in one embodiment, reference may be made to the related descriptions of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other manners. The device embodiments described above are merely illustrative. For example, the division of the units is merely a logical function division, and there may be other divisions in actual implementation; for instance, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces, units, or modules, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make several improvements and modifications without departing from the principles of the present invention, and such improvements and modifications shall also fall within the protection scope of the present invention.

Claims (22)

1. An augmented reality-based game interaction method, the method comprising:
acquiring a live-action image captured by a terminal device at a current shooting view angle;
determining a first game scenario in a game that matches the live-action image;
determining, in the live-action image, a target display position of a virtual character associated with a second game scenario that satisfies a condition in the first game scenario;
in response to a first touch operation acting on the terminal device, displaying the virtual character at the target display position, and controlling an interaction action of the virtual character based at least on the second game scenario;
wherein determining the target display position of the virtual character associated with the second game scenario in the live-action image comprises: acquiring character data of the virtual character associated with the second game scenario; and determining the target display position of the virtual character in the live-action image according to the character data;
wherein determining the target display position of the virtual character in the live-action image according to the character data comprises: determining a display range in the live-action image according to the character data; and determining the target display position within the display range;
wherein determining the target display position within the display range comprises: acquiring shooting information of the terminal device, wherein the shooting information comprises at least one of the following: position information of the terminal device and a shooting angle of the terminal device; determining a display area of the virtual character within the display range according to the shooting information; and determining the target display position corresponding to the display area within the display range.
2. The method of claim 1, wherein determining the first game scenario in the game that matches the live-action image comprises:
analyzing the live-action image to obtain the first game scenario corresponding to an analysis result.
3. The method of claim 2, wherein analyzing the live-action image to obtain the first game scenario corresponding to the analysis result comprises:
identifying an object in the live-action image, and acquiring the first game scenario that matches the object.
4. The method of claim 3, wherein acquiring the first game scenario that matches the object comprises:
traversing feature tags preset in game scenarios of the game; and
acquiring, as the first game scenario, the game scenario whose feature tag matches the object.
5. The method of claim 1, further comprising:
determining the second game scenario from the first game scenario.
6. The method of claim 5, wherein determining the second game scenario from the first game scenario comprises:
if there is one first game scenario, determining the first game scenario as the second game scenario.
7. The method of claim 5, wherein determining the second game scenario from the first game scenario comprises:
if there are a plurality of first game scenarios, determining the second game scenario from among the first game scenarios according to the priority corresponding to each first game scenario.
8. The method of claim 5, wherein determining the second game scenario from the first game scenario comprises:
if there are a plurality of first game scenarios, displaying a selection control for the first game scenarios on a display screen of the terminal device; and
in response to a touch operation acting on the selection control, determining the first game scenario corresponding to the touch operation as the second game scenario.
9. The method of claim 1, wherein the character data comprises at least one of: movement range data of the virtual character, pose data of the virtual character, and occupied-area data of the virtual character in different poses.
10. The method of claim 1, wherein determining the target display position corresponding to the display area within the display range comprises:
determining at least one alternative display position corresponding to the display area within the display range;
acquiring a plurality of live-action images at different shooting view angles;
comparing the plurality of live-action images to determine environment information corresponding to the live-action images, wherein the environment information comprises at least one of the following: an environment height and an environment depth;
determining the at least one alternative display position within the display range according to the environment information and the display area; and
in response to a second touch operation, determining the target display position from the at least one alternative display position.
11. The method of claim 1, wherein after determining the target display position of the virtual character associated with the second game scenario in the live-action image, the method further comprises:
displaying position prompt information, wherein the position prompt information is used for prompting that the terminal device be adjusted, at the current shooting view angle, from a first shooting position to a second shooting position indicated by the position prompt information.
12. The method of claim 11, wherein displaying the position prompt information comprises:
displaying the position prompt information when the first shooting position of the terminal device does not satisfy a preset condition, wherein the preset condition at least comprises: the distance between the first shooting position and the target display position of the virtual character being greater than or less than a preset distance.
13. The method of claim 12, wherein displaying the position prompt information comprises:
when the number of virtual characters is one, determining the second shooting position according to the target display position of the virtual character and position information of the terminal device at the current shooting view angle; and
displaying the position prompt information at the second shooting position.
14. The method of claim 12, wherein displaying the position prompt information comprises:
when the number of virtual characters is plural, determining the second shooting position according to an interaction relationship among the plurality of virtual characters and position information of the terminal device at the current shooting view angle; and
displaying the position prompt information at the second shooting position.
15. The method of claim 1, wherein after determining the target display position of the virtual character associated with the second game scenario in the live-action image, the method further comprises:
displaying angle prompt information, wherein the angle prompt information is used for prompting that the terminal device be adjusted from the current shooting view angle to a target shooting view angle.
16. The method of claim 15, wherein displaying the angle prompt information comprises:
determining the target shooting view angle according to the second game scenario; and
displaying the angle prompt information according to the target shooting view angle and the current shooting view angle.
17. The method of claim 1, wherein controlling the interaction action of the virtual character based at least on the second game scenario comprises:
acquiring an interaction request corresponding to the second game scenario; and
controlling the interaction action of the virtual character according to an operation instruction for the interaction request received by the terminal device.
18. The method of claim 1, further comprising:
acquiring weather information collected by the terminal device; and
determining the first game scenario in the game that matches the weather information.
19. The method of claim 1, wherein after controlling the interaction action of the virtual character based at least on the second game scenario, the method further comprises:
displaying, on a display screen of the terminal device, an effect image corresponding to the second game scenario or to the interaction action; and
in response to a third touch operation, displaying an interaction result corresponding to the effect image on the display screen.
20. An augmented reality-based game interaction device, comprising:
an acquisition module, configured to acquire a live-action image captured by a terminal device at a current shooting view angle;
a first determining module, configured to determine a first game scenario in a game that matches the live-action image;
a second determining module, configured to determine, in the live-action image, a target display position of a virtual character associated with a second game scenario that satisfies a condition in the first game scenario;
a response module, configured to display the virtual character at the target display position in response to a first touch operation acting on the terminal device, and to control an interaction action of the virtual character based at least on the second game scenario;
wherein the second determining module comprises: a fifth acquisition module, configured to acquire character data of the virtual character associated with the second game scenario; and a seventh determining module, configured to determine the target display position of the virtual character in the live-action image according to the character data;
wherein the seventh determining module comprises: an eighth determining module, configured to determine a display range in the live-action image according to the character data; and a ninth determining module, configured to determine the target display position within the display range;
wherein the ninth determining module comprises: a sixth acquisition module, configured to acquire shooting information of the terminal device, wherein the shooting information comprises at least one of the following: position information of the terminal device and a shooting angle of the terminal device; a tenth determining module, configured to determine a display area of the virtual character within the display range according to the shooting information; and an eleventh determining module, configured to determine the target display position corresponding to the display area within the display range.
21. A storage medium comprising a stored program, wherein the program, when run, controls a device in which the storage medium is located to perform the augmented reality-based game interaction method of any one of claims 1 to 19.
22. A processor configured to run a program, wherein the program, when run, performs the augmented reality-based game interaction method of any one of claims 1 to 19.
CN202010120288.1A 2020-02-26 2020-02-26 Game interaction method and device based on augmented reality Active CN111359200B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010120288.1A CN111359200B (en) 2020-02-26 2020-02-26 Game interaction method and device based on augmented reality

Publications (2)

Publication Number Publication Date
CN111359200A CN111359200A (en) 2020-07-03
CN111359200B true CN111359200B (en) 2023-09-26

Family

ID=71201148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010120288.1A Active CN111359200B (en) 2020-02-26 2020-02-26 Game interaction method and device based on augmented reality

Country Status (1)

Country Link
CN (1) CN111359200B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112068703B 2020-09-07 2021-11-16 Beijing ByteDance Network Technology Co., Ltd. Target object control method and device, electronic device and storage medium
CN112148197A * 2020-09-23 2020-12-29 Beijing SenseTime Technology Development Co., Ltd. Augmented reality AR interaction method and device, electronic equipment and storage medium
CN112843715B * 2020-12-31 2023-07-04 Shanghai miHoYo Tianming Technology Co., Ltd. Shooting view angle determining method, device, equipment and storage medium
CN113274718A * 2021-06-03 2021-08-20 Beijing Aiqi Technology Co., Ltd. Augmented-reality-based gun and implementation method of shooting-type augmented reality
CN114089829B * 2021-10-13 2023-03-21 Shenzhen ZQGame Interactive Network Co., Ltd. Metaverse system for virtual reality
CN114612637B * 2022-03-15 2024-07-02 Beijing Zitiao Network Technology Co., Ltd. Scene picture display method and device, computer equipment and storage medium
CN116943191A * 2022-04-18 2023-10-27 Tencent Technology (Shenzhen) Co., Ltd. Human-computer interaction method, device, equipment and medium based on story scene

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108156523A * 2017-11-24 2018-06-12 Huying Technology (Beijing) Co., Ltd. Interaction method and device for interactive video playback
CN109865289A * 2019-01-15 2019-06-11 Terminus (Beijing) Technology Co., Ltd. Live-action environment entertainment system and method based on augmented reality
CN109985382A * 2019-04-03 2019-07-09 Tencent Technology (Shenzhen) Co., Ltd. Script execution method, device, equipment and storage medium for plot nodes
CN110180168A * 2019-05-31 2019-08-30 NetEase (Hangzhou) Network Co., Ltd. Display method and device, storage medium and processor for game pictures
CN110384924A * 2019-08-21 2019-10-29 NetEase (Hangzhou) Network Co., Ltd. Display control method, device, medium and equipment for virtual objects in a game scene
CN110465097A * 2019-09-09 2019-11-19 NetEase (Hangzhou) Network Co., Ltd. Display method and device for in-game character standing art, electronic device, and storage medium

Also Published As

Publication number Publication date
CN111359200A (en) 2020-07-03

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant