CN111632377A - Shooting trajectory display method and device, electronic device and storage medium


Info

Publication number
CN111632377A
Authority
CN
China
Prior art keywords
virtual aircraft
bullet
data
pose
target
Prior art date
Legal status
Pending
Application number
CN202010513301.XA
Other languages
Chinese (zh)
Inventor
揭志伟
李炳泽
武明飞
符修源
陈凯彬
Current Assignee
Zhejiang Shangtang Technology Development Co Ltd
Zhejiang Sensetime Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd filed Critical Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202010513301.XA
Publication of CN111632377A

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 - Special adaptations for executing a specific game genre or game mode
    • A63F13/803 - Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 - Special adaptations for executing a specific game genre or game mode
    • A63F13/837 - Shooting of targets
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/006 - Mixed reality
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8017 - Driving on land or water; Flying
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 - Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The disclosure provides a shooting trajectory display method and apparatus, an electronic device, and a storage medium. The display method includes the following steps: acquiring, in real time, pose data of each AR device in an associated AR device group in a target real scene, together with pose control data and bullet firing control data of each AR device for its matched virtual aircraft; determining trajectory information of the bullets fired by each virtual aircraft based on the pose control data and the bullet firing control data for that virtual aircraft; and, for each AR device, generating fused trajectory display data of at least one bullet for the AR device based on the pose data of the AR device and the trajectory information of the bullets fired by each virtual aircraft, and sending the fused trajectory display data to the AR device for display.

Description

Shooting trajectory display method and device, electronic device and storage medium
Technical Field
The present disclosure relates to the field of AR technologies, and in particular, to a method and an apparatus for displaying a shooting trajectory, an electronic device, and a storage medium.
Background
Augmented Reality (AR) technology fuses virtual information with the real world, and is currently applied in many fields, such as film and television production, animation, games, and education.
At present, AR technology is rarely applied to multiplayer shooting games, and how to make AR content more vivid when applied to such games is a problem worth studying.
Disclosure of Invention
Embodiments of the present disclosure provide at least a scheme for displaying a shooting trajectory.
In a first aspect, an embodiment of the present disclosure provides a method for displaying a shooting trajectory, including:
acquiring, in real time, pose data of each AR device in the associated AR device group in a target real scene, together with pose control data and bullet firing control data of each AR device for its matched virtual aircraft;
determining trajectory information of the bullets fired by each virtual aircraft based on the pose control data and the bullet firing control data for that virtual aircraft;
and, for each AR device, generating fused trajectory display data of at least one bullet for the AR device based on the pose data of the AR device and the trajectory information of the bullets fired by each virtual aircraft, and sending the fused trajectory display data to the AR device for display.
In the embodiments of the present disclosure, fused trajectory display data of at least one bullet can be generated for each AR device from the pose data of that AR device and the trajectory information of the bullets fired by each virtual aircraft. That is, multiple users can each see the trajectories of the bullets fired by the virtual aircraft through the AR devices they carry, which applies AR technology to multiplayer games; users at different positions see the bullet trajectories from different viewing angles, making the displayed content more vivid.
In one possible embodiment, determining the trajectory information of the bullets fired by each virtual aircraft based on the pose control data and the bullet firing control data for that virtual aircraft comprises:
determining pose data of each virtual aircraft based on the pose control data for that virtual aircraft;
determining an initial position, a firing direction and a shooting strength of the bullet fired by the virtual aircraft, based on the pose data of the virtual aircraft and the bullet firing control data for the virtual aircraft;
and determining the trajectory information of the bullet fired by the virtual aircraft based on the initial position, the firing direction and the shooting strength.
In the embodiments of the present disclosure, the trajectory information of a bullet fired by a virtual aircraft in the target real scene can be determined by combining the pose data of the virtual aircraft with the bullet firing control data for it, so that the trajectory of the bullet fired by the virtual aircraft can be conveniently and intuitively displayed to a user in the AR scene, making the displayed content more vivid.
In one possible embodiment, the generating, for each AR device, fused trajectory presentation data for at least one bullet of the AR device based on the pose data of the AR device and trajectory information of bullets fired by the respective virtual aircraft includes:
determining fused track special effect data of the at least one bullet based on track information of bullets fired by each virtual aircraft;
and for each AR device, determining fused track display data of at least one bullet aiming at the AR device based on the pose data of the AR device and the fused track special effect data corresponding to the at least one bullet.
In a possible implementation, the display method further includes:
determining whether there is at least one target virtual aircraft hit by a fired bullet, based on the pose control data and the bullet firing control data for each virtual aircraft;
upon determining that there is at least one target virtual aircraft hit by a bullet, determining special effect data for when the at least one target virtual aircraft was hit by the bullet;
and, for each AR device, generating fused shooting display data for the AR device based on the pose data of the AR device and the special effect data of the at least one target virtual aircraft when hit by the bullet, and sending the fused shooting display data to the AR device for display.
In one possible embodiment, the determining whether there is at least one target virtual aircraft hit by a fired bullet based on the pose control data and the bullet firing control data for each virtual aircraft comprises:
determining a moving track of each virtual aircraft within a preset time length based on the current pose data of the virtual aircraft and pose control data aiming at the virtual aircraft within the preset time length;
determining whether the moving track of the target virtual aircraft intersects with the flight track of any bullet or not based on the track information of each bullet in the preset time length and the moving track of each virtual aircraft in the preset time length;
upon determining that there is an intersection of the trajectory of movement of the target virtual aircraft with the flight trajectory of any of the bullets, determining that there is at least one target virtual aircraft that was hit by a bullet.
In a possible implementation, the display method further includes:
in response to the fact that the target virtual aircraft is hit by a bullet, updating the pose data of the target virtual aircraft to obtain new pose data corresponding to the target virtual aircraft;
and generating pose change display data of the target virtual aircraft for the AR equipment based on the pose data of the AR equipment and the new pose data corresponding to the target virtual aircraft for each AR equipment, and sending the pose change display data to the AR equipment for display.
In one possible implementation, the obtaining, in real time, pose data of each AR device in the associated AR device group in the target real scene includes:
real scene images shot by each AR device in the associated AR device group aiming at the target real scene are obtained in real time;
and determining the pose data of the AR equipment in the target real scene based on the real scene image shot by each AR equipment and a pre-established three-dimensional scene model used for representing the target real scene.
In a second aspect, an embodiment of the present disclosure provides a display device for a shooting track, including:
the acquisition module is used for acquiring the pose data of each AR device in the associated AR device group in a target real scene in real time, and the pose control data and bullet shooting control data of each AR device aiming at the matched virtual aircraft;
a determination module for determining trajectory information of bullets fired by the virtual aircraft based on the pose control data for each of the virtual aircraft and the bullet firing control data for the virtual aircraft;
and the generating module is configured to generate, for each AR device, fused trajectory display data of at least one bullet for the AR device based on the pose data of the AR device and the trajectory information of the bullets fired by each virtual aircraft, and to send the fused trajectory display data to the AR device for display.
In one possible embodiment, the determining module, when configured to determine the trajectory information of the bullet fired by the virtual aircraft based on the pose control data for each of the virtual aircraft and the bullet firing control data for the virtual aircraft, comprises:
determining pose data for each virtual aircraft based on the pose control data for that virtual aircraft;
determining an initial position, a launching direction and a shooting strength of the virtual aircraft for launching the bullet based on the pose data of the virtual aircraft and the bullet launching control data aiming at the virtual aircraft;
and determining the track information of the bullet launched by the virtual aircraft based on the initial position, the launching direction and the shooting strength of the bullet launched by the virtual aircraft.
In one possible embodiment, the generating module, when configured to generate, for each AR device, fused trajectory presentation data for at least one bullet of the AR device based on the pose data of the AR device and trajectory information of bullets fired by the respective virtual aircraft, includes:
determining fused track special effect data of the at least one bullet based on track information of bullets fired by each virtual aircraft;
and for each AR device, determining fused track display data of at least one bullet aiming at the AR device based on the pose data of the AR device and the fused track special effect data corresponding to the at least one bullet.
In a possible implementation, the determining module is further configured to:
determining whether there is at least one target virtual aircraft hit by a fired bullet, based on the pose control data and the bullet firing control data for each virtual aircraft;
upon determining that there is at least one target virtual aircraft hit by a bullet, determining special effect data for when the at least one target virtual aircraft was hit by the bullet;
the generating module is further configured to generate, for each AR device, fused shooting display data for the AR device based on the pose data of the AR device and the special effect data of the at least one target virtual aircraft when being hit by the bullet, and send the fused shooting display data to the AR device for display.
In one possible embodiment, the determining module, when configured to determine whether there is at least one target virtual aircraft hit by a fired bullet based on the pose control data and the bullet firing control data for each virtual aircraft, comprises:
determining a moving track of each virtual aircraft within a preset time length based on the current pose data of the virtual aircraft and pose control data aiming at the virtual aircraft within the preset time length;
determining whether the moving track of the target virtual aircraft intersects with the flight track of any bullet or not based on the track information of each bullet in the preset time length and the moving track of each virtual aircraft in the preset time length;
upon determining that there is an intersection of the trajectory of movement of the target virtual aircraft with the flight trajectory of any of the bullets, determining that there is at least one target virtual aircraft that was hit by a bullet.
In a possible implementation, the generating module is further configured to:
in response to the fact that the target virtual aircraft is hit by a bullet, updating the pose data of the target virtual aircraft to obtain new pose data corresponding to the target virtual aircraft;
and generating pose change display data of the target virtual aircraft for the AR equipment based on the pose data of the AR equipment and the new pose data corresponding to the target virtual aircraft for each AR equipment, and sending the pose change display data to the AR equipment for display.
In one possible implementation, the acquiring module, when configured to acquire pose data of each AR device in the associated AR device group in the target real scene in real time, includes:
real scene images shot by each AR device in the associated AR device group aiming at the target real scene are obtained in real time;
and determining the pose data of the AR equipment in the target real scene based on the real scene image shot by each AR equipment and a pre-established three-dimensional scene model used for representing the target real scene.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate via the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the display method according to the first aspect.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the display method according to the first aspect.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required in the embodiments are briefly described below. The drawings here are incorporated in and form a part of the specification; they illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those of ordinary skill in the art can derive other related drawings from them without inventive effort.
Fig. 1 shows a flowchart of a method for displaying a shooting trajectory provided by an embodiment of the present disclosure;
FIG. 2 illustrates a flowchart of a method for determining pose data of an AR device provided by an embodiment of the present disclosure;
fig. 3 shows a flowchart of a method for determining trajectory information of a bullet according to an embodiment of the present disclosure;
fig. 4 illustrates a flowchart of a method for determining fused trajectory presentation data for at least one bullet for each AR device provided by an embodiment of the present disclosure;
FIG. 5 illustrates a flowchart of a method of determining fused shot presentation data for each AR device provided by an embodiment of the present disclosure;
FIG. 6 shows a schematic diagram of a shooting trajectory display apparatus provided by an embodiment of the present disclosure;
fig. 7 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure clearer, the technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. The components of the embodiments of the present disclosure, as generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Therefore, the following detailed description of the embodiments of the present disclosure is not intended to limit the scope of the claimed disclosure, but merely represents selected embodiments of the disclosure. All other embodiments obtained by those skilled in the art based on the embodiments of the present disclosure without creative effort shall fall within the protection scope of the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The term "and/or" herein merely describes an associative relationship, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
At present, AR technology is rarely applied to multiplayer shooting games. Consider, for example, a spaceship battle game in which multiple players participate and each user controls a matched virtual spaceship through a carried AR device; in such application scenarios, how to make the game more vivid through AR technology is the technical problem discussed in the present disclosure.
Based on this research, the present disclosure provides a shooting trajectory display method. By acquiring, in real time, the pose data of each AR device in a target real scene together with the pose control data and bullet firing control data of the AR device for its matched virtual aircraft, the trajectory information of the bullets fired by each virtual aircraft can be determined. Then, for each AR device, fused trajectory display data of at least one bullet can be generated from the pose data of that AR device and the trajectory information of the bullets fired by each virtual aircraft. In this way, multiple users can each see the trajectories of the bullets fired by the virtual aircraft through their own AR devices, AR technology is applied to multiplayer games, and users at different positions see the bullet trajectories from different viewing angles, making the display more realistic.
To facilitate understanding of the present embodiments, the shooting trajectory display method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the method is generally a computer device with certain computing capability, such as a server or another processing device. In some possible implementations, the display method may be implemented by a processor calling computer-readable instructions stored in a memory.
Referring to fig. 1, a flowchart of a method for displaying a shooting trajectory according to an embodiment of the present disclosure is shown, where the method includes the following steps S101 to S103:
S101, acquiring, in real time, pose data of each AR device in the associated AR device group in a target real scene, together with pose control data and bullet firing control data of each AR device for its matched virtual aircraft.
For example, each AR device in the associated AR device group may experience the same AR scene content together. In an application scenario, the AR devices may select the same AR experience package to enter a multi-player interactive AR scene. For a game-type AR scene, for instance, each AR device may log in to an AR game account and enter the same AR room to experience the scene, so the AR devices entering the same AR room can serve as the associated AR device group.
For example, the pose data of an AR device in the target real scene may include the position and/or display angle of the display component used to display the virtual operation object while the user holds or wears the AR device.
The display component of an AR device refers specifically to the component used to display the virtual operation object. For example, when the AR device is a mobile phone or a tablet, the display component may be the display screen; when the AR device is a pair of AR glasses, the display component may be the lenses used to display the virtual aircraft.
For example, the embodiment of the present disclosure may determine the pose data of each AR device through a real scene image of the AR device taken for a target real scene, which will be explained later.
For example, the virtual operation object matched with each AR device may be a virtual aircraft selected for control after the AR device logs in to an AR account. For instance, if the AR scene is an airship battle scene, each AR device may, after logging in to the AR account, select a virtual battle airship to control and the number of the matched virtual battle airship, so that during the AR experience the AR device can send pose control data and bullet firing control data for the virtual battle airship with the matched number to a server.
The pose control data may comprise position control data and/or attitude control data for the virtual aircraft, wherein the position control data may determine a target position to which the virtual aircraft is flying, and the attitude control data may determine a flight attitude of the virtual aircraft.
Illustratively, the bullet firing control data includes, but is not limited to, one or more of the following:
data indicating whether the virtual aircraft fires a bullet, data on the type of bullet fired, and shooting strength data corresponding to the fired bullet.
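As an illustration only (the disclosure does not prescribe a concrete data format), the control data described above might be organized as follows; every field name in this sketch is hypothetical:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class PoseControlData:
    # Hypothetical structure: the disclosure only requires position
    # control data and/or attitude control data for the virtual aircraft.
    target_position: Optional[Tuple[float, float, float]] = None  # where to fly
    target_attitude: Optional[Tuple[float, float, float]] = None  # roll, pitch, yaw

@dataclass
class BulletFiringControlData:
    # Hypothetical structure mirroring the fields listed above.
    fire: bool = False              # whether the virtual aircraft fires a bullet
    bullet_type: str = "basic"      # type of bullet fired
    shooting_strength: float = 0.0  # firing strength selected by the user
```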
S102, determining trajectory information of the bullets fired by each virtual aircraft based on the pose control data and the bullet firing control data for that virtual aircraft.
For each virtual aircraft, its pose data can be determined from the pose control data for it; then, from the bullet firing control data for it, whether the virtual aircraft has fired a bullet, the type of bullet fired, the shooting strength, and so on can be determined.
By combining the pose data of the virtual aircraft with the data about the bullets it fires, it can be determined at which pose the virtual aircraft fired a bullet, as well as the type and shooting strength of that bullet, from which the trajectory information of the bullet can be determined.
For example, if a virtual aircraft fires a bullet eastward with a range of 1 meter, the trajectory information of that bullet may be a 1-meter eastward flight path starting from the virtual aircraft. Of course, to make the display more realistic, the influence of gravity on the bullet can also be considered, so that the bullet's trajectory appears parabolic.
S103, for each AR device, generating fused trajectory display data of at least one bullet for the AR device based on the pose data of the AR device and the trajectory information of the bullets fired by each virtual aircraft, and sending the fused trajectory display data to the AR device for display.
Because the pose data of each AR device in the target real scene differ, the viewing field of each AR device on the target real scene also differs, so the presentation effect of a bullet's trajectory seen by each user through their own AR device may differ.
Illustratively, the associated AR device group may include a first AR device and a second AR device, carried by user A and user B respectively. User A and user B can see the same bullet through their respective AR devices, for example a bullet fired by virtual aircraft 001; the difference is that, when the pose data of the first AR device and the second AR device differ, user A and user B see the bullet fired by virtual aircraft 001 from different viewing angles. For instance, user A may see the right side of the flying bullet while user B sees its left side.
In the embodiments of the present disclosure, fused trajectory display data of at least one bullet can be generated for each AR device from the pose data of that AR device and the trajectory information of the bullets fired by each virtual aircraft. That is, multiple users can each see the trajectories of the bullets fired by the virtual aircraft through the AR devices they carry, which applies AR technology to multiplayer games; users at different positions see the bullet trajectories from different viewing angles, making the displayed content more vivid.
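The view-dependent step can be pictured with a minimal sketch: the same world-space bullet trajectory is mapped into each device's camera frame using that device's pose. Representing the pose as a 4x4 camera-to-world matrix is an assumption made for illustration, not something the disclosure specifies:

```python
import numpy as np

def world_to_device(trajectory_pts: np.ndarray, device_pose: np.ndarray) -> np.ndarray:
    """Map world-space trajectory points into one AR device's camera frame.

    trajectory_pts: (N, 3) bullet trajectory sampled in the target real scene.
    device_pose: (4, 4) camera-to-world transform of the AR device (an assumed
    representation; the disclosure only says the pose data include a position
    and/or display angle).
    """
    world_to_cam = np.linalg.inv(device_pose)
    homogeneous = np.hstack([trajectory_pts, np.ones((len(trajectory_pts), 1))])
    # Each device sees the same world-space trajectory from its own viewpoint.
    return (world_to_cam @ homogeneous.T).T[:, :3]
```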
The above-mentioned S101 to S103 will be explained with reference to specific embodiments.
For the above S101, when acquiring pose data of each AR device in the associated AR device group in the target real scene in real time, as shown in fig. 2, the following S1011 to S1012 are specifically included:
s1011, real scene images shot by each AR device in the associated AR device group aiming at the target real scene are obtained in real time;
and S1012, determining the pose data of the AR equipment in the target real scene based on the real scene image shot by each AR equipment and a pre-established three-dimensional scene model for representing the target real scene.
For example, the three-dimensional scene model may be constructed from a plurality of sample images of the target real scene captured in advance. Specifically, during construction, feature points are extracted from each sample image to build the model. After the three-dimensional scene model is generated, the sample image corresponding to each feature point in the model, together with the capture pose of that sample image in the model, can be saved. Then, after a real scene image of the target real scene captured by an AR device is obtained, feature points can be extracted from that image, a matching sample image can be found based on the extracted feature points, and the pose data of the AR device in the three-dimensional scene model can finally be obtained.
Because the three-dimensional scene model represents the target real scene, the pose data of the AR device in the three-dimensional scene model can be used as its pose data in the target real scene.
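One plausible realization of S1011 to S1012 is classical feature matching against the saved sample images, followed by a PnP solve. The retrieval scheme, the `sample_db` layout and the use of ORB descriptors are all assumptions for illustration, not the disclosure's prescribed algorithm:

```python
import cv2
import numpy as np

def estimate_device_pose(scene_img, sample_db, K, dist_coeffs):
    """Sketch of S1011-S1012: localize an AR device against a pre-built
    three-dimensional scene model. `sample_db` is a hypothetical list of
    (descriptors, keypoints_3d) pairs saved when the model was constructed,
    with descriptors and model points aligned index by index.
    """
    orb = cv2.ORB_create()
    kp, des = orb.detectAndCompute(scene_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    # Pick the sample image with the most feature matches to the live image.
    best = max(sample_db, key=lambda s: len(matcher.match(des, s[0])))
    matches = matcher.match(des, best[0])

    # 2D points from the live image, 3D points from the scene model.
    img_pts = np.float32([kp[m.queryIdx].pt for m in matches])
    obj_pts = np.float32([best[1][m.trainIdx] for m in matches])

    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist_coeffs)
    return rvec, tvec  # device pose in the three-dimensional scene model
```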
In addition, the pose data of an AR device may also be determined by a pose sensor mounted on the AR device.
Alternatively, in another embodiment, the real scene image captured by the AR device may be input into a pre-stored neural network model for positioning, so as to determine the pose data of the AR device that captured the image.
Specifically, the neural network may be trained on a plurality of sample images of the real scene captured in advance, together with the pose data corresponding to each sample image.
The three-dimensional scene model can be constructed based on a plurality of real scene images captured in advance for the target real scene; after construction, the model can be corrected with a real two-dimensional map corresponding to the target real scene, yielding a three-dimensional scene model that represents the target real scene with high accuracy.
Regarding the above S102, when determining trajectory information of a bullet fired by each virtual aircraft based on the pose control data for the virtual aircraft and the bullet firing control data for the virtual aircraft, as shown in fig. 3, the following S1021 to S1023 may be included:
S1021, determining pose data of each virtual aircraft based on the pose control data for that virtual aircraft;
S1022, determining the initial position, firing direction and shooting strength of the bullet fired by the virtual aircraft, based on the pose data of the virtual aircraft and the bullet firing control data for the virtual aircraft;
and S1023, determining the trajectory information of the bullet fired by the virtual aircraft based on the initial position, firing direction and shooting strength.
For example, suppose the initial position of a bullet fired by a virtual aircraft is position A in the target real scene, the firing direction is due east in the target real scene, and the shooting strength is a selected preset level, say level three, which may be the maximum strength of that virtual aircraft. The parabolic shape of the bullet's path can then be determined from the shooting strength and gravity, and, combined with the initial position and firing direction, the trajectory information of the bullet in the target real scene can be determined.
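A sketch of S1021 to S1023 under stated assumptions: the mapping from a shooting-strength level to an initial bullet speed (`speed_per_level`) is hypothetical, since the disclosure only says the trajectory follows from the initial position, firing direction, shooting strength and gravity:

```python
import numpy as np

GRAVITY = np.array([0.0, -9.8, 0.0])  # y-up world frame (assumed)

def bullet_trajectory(initial_pos, firing_dir, shooting_strength,
                      speed_per_level=10.0, dt=0.02, steps=100):
    """Sketch of S1021-S1023: sample the parabolic trajectory of a bullet.

    `speed_per_level` (strength level -> initial speed in m/s) is a
    hypothetical mapping chosen for illustration.
    """
    d = np.asarray(firing_dir, dtype=float)
    velocity = shooting_strength * speed_per_level * d / np.linalg.norm(d)
    pos = np.asarray(initial_pos, dtype=float)
    points = []
    for _ in range(steps):
        points.append(pos.copy())
        velocity = velocity + GRAVITY * dt  # gravity bends the path into a parabola
        pos = pos + velocity * dt
    return np.array(points)  # (steps, 3) trajectory in the target real scene
```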
In the embodiments of the present disclosure, the trajectory information of a bullet fired by a virtual aircraft in the target real scene can be determined by combining the pose data of the virtual aircraft with the bullet firing control data for it, so that the trajectory of the bullet fired by the virtual aircraft can be conveniently and intuitively displayed to a user in the AR scene, making the displayed content more vivid.
For the above S103, when generating, for each AR device, fused trajectory presentation data of at least one bullet for the AR device based on the pose data of the AR device and trajectory information of bullets fired by the respective virtual aircraft, as shown in fig. 4, the following S1031 to S1032 may be included:
s1031, determining fusion track special effect data of at least one bullet based on track information of bullets launched by each virtual aircraft;
s1032, for each AR device, determining fused track showing data of at least one bullet for the AR device based on the pose data of the AR device and the fused track special effect data corresponding to the at least one bullet.
In an application scenario, multiple virtual aircraft may fire bullets at the same time. In this case, the trajectory information of the bullets fired by each virtual aircraft within the same time period can be fused to obtain fused trajectory special effect data of at least one bullet. For example, the fused data may simultaneously include the trajectory information of bullets fired by two virtual aircraft, where one bullet's trajectory runs from position A toward the east and the other's runs from position B toward the west; fusing the trajectory information of the two bullets means that the fused trajectory special effect data includes the trajectory information of both.
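A minimal sketch of the fusion in S1031, assuming fusion simply means collecting the trajectories of all bullets fired within the same time window into one special-effect record; the container layout is hypothetical:

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class FusedTrajectoryEffect:
    # Hypothetical container: the disclosure only requires that trajectory
    # information of bullets fired in the same period be fused together.
    window_start: float
    window_end: float
    trajectories: List[np.ndarray] = field(default_factory=list)

def fuse_trajectories(bullets, window_start, window_end):
    """bullets: list of (fire_time, trajectory_points) across all virtual aircraft."""
    fused = FusedTrajectoryEffect(window_start, window_end)
    for fire_time, points in bullets:
        if window_start <= fire_time < window_end:
            fused.trajectories.append(points)
    return fused
```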
In an implementation manner, as shown in fig. 5, the display method provided in the embodiment of the present disclosure further includes the following steps S501 to S503:
S501, determining whether there is at least one target virtual aircraft hit by a fired bullet, based on the pose control data and the bullet firing control data for each virtual aircraft.
Specifically, determining whether there is at least one target virtual aircraft hit by a fired bullet based on the pose control data and the bullet firing control data for each virtual aircraft may include:
(1) determining the movement trajectory of each virtual aircraft within a preset duration, based on the current pose data of the virtual aircraft and the pose control data for the virtual aircraft within the preset duration;
(2) determining whether the movement trajectory of a target virtual aircraft intersects the flight trajectory of any bullet, based on the trajectory information of each bullet within the preset duration and the movement trajectory of each virtual aircraft within the preset duration;
(3) upon determining that the movement trajectory of the target virtual aircraft intersects the flight trajectory of any bullet, determining that there is at least one target virtual aircraft hit by a bullet.
For example, the preset duration here may be set to the time a bullet needs to fall to the ground after being fired; within the preset duration, the movement trajectory of each virtual aircraft can be determined from its current pose data and the pose control data for it.
In addition, the position at which the target virtual aircraft is hit by a bullet can be determined from the position of the intersection point between the target virtual aircraft's movement trajectory and the bullet's flight trajectory; for example, the head, the tail or the middle of the target virtual aircraft may be hit.
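Steps (1) to (3) amount to a trajectory intersection test. The sketch below assumes both trajectories are sampled at the same instants over the preset duration and treats them as intersecting once they come within an assumed hit radius; the returned sample index and point also indicate where on the aircraft the hit occurred:

```python
import numpy as np

def find_hit(aircraft_track, bullet_track, hit_radius=0.5):
    """Sketch of the hit test in steps (1)-(3).

    aircraft_track, bullet_track: (T, 3) arrays sampled at the same instants
    over the preset duration. `hit_radius`, the distance under which the two
    trajectories are treated as intersecting, is an assumed tolerance.
    """
    distances = np.linalg.norm(aircraft_track - bullet_track, axis=1)
    hit_steps = np.nonzero(distances < hit_radius)[0]
    if hit_steps.size == 0:
        return None  # no intersection within the preset duration
    t = int(hit_steps[0])
    # The intersection point also tells where the aircraft was hit
    # (e.g. head, tail or middle), as described above.
    return t, bullet_track[t]
```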
S502, upon determining that at least one target virtual aircraft is hit by a bullet, determining special effect data for when the at least one target virtual aircraft is hit by the bullet.
Illustratively, the special effect data may include data on the position at which the aircraft is hit by the bullet, explosion effect data, and the like.
S503, for each AR device, generating fused shooting display data for the AR device based on the pose data of the AR device and the special effect data of the at least one target virtual aircraft when hit by a bullet, and sending the fused shooting display data to the AR device for display.
Likewise, because each AR device has different pose data in the target real scene, its viewing field on the target real scene also differs, and the special effect of the target virtual aircraft being hit by a bullet, as seen by each user through their own AR device, may differ as well. It is therefore necessary to generate, for each AR device, fused shooting display data for that AR device based on its pose data and the special effect data of the at least one target virtual aircraft when hit by a bullet.
Illustratively, through the fused shooting display data for an AR device, the user on the AR device side can view, within their own field of view, the picture of the target virtual aircraft being hit.
In a possible implementation manner, the display method provided by the embodiment of the present disclosure further includes:
(1) in response to the fact that the target virtual aircraft is hit by the bullet, updating the pose data of the target virtual aircraft to obtain new pose data corresponding to the target virtual aircraft;
(2) and generating pose change display data of the target virtual aircraft for the AR equipment based on the pose data of the AR equipment and the new pose data corresponding to the target virtual aircraft for each AR equipment, and sending the pose change display data to the AR equipment for display.
Specifically, the new pose data corresponding to the target virtual aircraft can be determined based on the position at which it was hit and the shooting strength of the bullet; the new pose data can represent the position change and/or attitude change of the target virtual aircraft after being hit.
Likewise, because the pose data of each AR device in the target real scene differ, the viewing field of each AR device on the target real scene also differs, and the pose change of the target virtual aircraft after being hit by a bullet, as seen by each user through their own AR device, differs as well; therefore, the pose change display data for each AR device need to be generated based on the pose data of that AR device and the new pose data of the target virtual aircraft after being hit.
Illustratively, through the pose change display data for an AR device, the user on the AR device side can view, within their own field of view, the picture of the target virtual aircraft's pose change after being hit.
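A sketch of the pose update described above, assuming a simple knockback model in which the target virtual aircraft is displaced away from the hit point in proportion to the shooting strength. The `knockback_per_level` coefficient is hypothetical; the disclosure only states that the new pose data follow from the hit position and the shooting strength:

```python
import numpy as np

def update_pose_after_hit(aircraft_pos, hit_point, shooting_strength,
                          knockback_per_level=0.2):
    """Compute new position data for a target virtual aircraft after a hit.

    aircraft_pos, hit_point: (3,) arrays in the target real scene.
    `knockback_per_level` (strength level -> displacement in meters) is an
    assumed coefficient used only for illustration.
    """
    away = aircraft_pos - hit_point  # push the aircraft away from the impact
    norm = np.linalg.norm(away)
    if norm == 0:
        return aircraft_pos.copy()
    return aircraft_pos + (shooting_strength * knockback_per_level) * away / norm
```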
It will be understood by those skilled in the art that, in the methods of the present disclosure, the order in which the steps are written does not imply a strict execution order or impose any limitation on the implementation; the specific execution order of the steps should be determined by their functions and possible inherent logic.
Based on the same technical concept, the embodiment of the present disclosure further provides a display apparatus of a shooting track corresponding to the display method of the shooting track, and since the principle of the apparatus in the embodiment of the present disclosure for solving the problem is similar to the display method in the embodiment of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
Referring to fig. 6, a schematic diagram of a shooting trajectory display apparatus 600 provided by an embodiment of the present disclosure is shown. The display apparatus includes:
the acquisition module 601 is configured to acquire pose data of each AR device in the associated AR device group in a target real scene, and pose control data and bullet shooting control data of each AR device for the matched virtual aircraft in real time;
a determining module 602 for determining trajectory information of bullets fired by each virtual aircraft based on the pose control data for the virtual aircraft and the bullet firing control data for the virtual aircraft;
the generating module 603 is configured to generate, for each AR device, fused trajectory display data for at least one bullet of the AR device based on the pose data of the AR device and trajectory information of the bullets fired by each virtual aircraft, and send the fused trajectory display data to the AR device for display.
In one possible implementation, the determining module 602, when configured to determine the trajectory information of the bullets fired by each virtual aircraft based on the pose control data and the bullet firing control data for that virtual aircraft, includes:
determining pose data for each virtual aircraft based on the pose control data for that virtual aircraft;
determining an initial position, a launching direction and a shooting strength of the virtual aircraft for launching the bullet based on the pose data of the virtual aircraft and the bullet launching control data aiming at the virtual aircraft;
and determining the track information of the bullet launched by the virtual aircraft based on the initial position, the launching direction and the shooting strength of the bullet launched by the virtual aircraft.
In one possible implementation, the generating module 603, when configured to generate, for each AR device, fused trajectory presentation data for at least one bullet of the AR device based on the pose data of the AR device and trajectory information of bullets fired by the respective virtual aircraft, includes:
determining fusion track special effect data of at least one bullet based on track information of the bullets launched by each virtual aircraft;
and for each AR device, determining fused track display data of at least one bullet aiming at the AR device based on the pose data of the AR device and fused track special effect data corresponding to the at least one bullet.
In a possible implementation, the determining module 602 is further configured to:
determining whether there is at least one target virtual aircraft hit by a fired bullet, based on the pose control data and the bullet firing control data for each virtual aircraft;
determining special effect data when the at least one target virtual aircraft is hit by the bullet after determining that the at least one target virtual aircraft is hit by the bullet;
the generating module is further configured to generate, for each AR device, fused shooting display data for the AR device based on the pose data of the AR device and the special effect data of the at least one target virtual aircraft when hit by a bullet, and to send the fused shooting display data to the AR device for display.
In one possible implementation, the determining module 602, when configured to determine whether there is at least one target virtual aircraft hit by a fired bullet based on the pose control data and the bullet firing control data for each virtual aircraft, includes:
determining a moving track of each virtual aircraft within a preset time length based on the current pose data of the virtual aircraft and pose control data aiming at the virtual aircraft within the preset time length;
determining whether the moving track of the target virtual aircraft intersects with the flight track of any bullet or not based on the track information of each bullet in the preset time length and the moving track of each virtual aircraft in the preset time length;
upon determining that there is an intersection of the trajectory of movement of the target virtual aircraft with the flight trajectory of any of the bullets, determining that there is at least one target virtual aircraft that was hit by a bullet.
In a possible implementation, the generating module 603 is further configured to:
in response to the fact that the target virtual aircraft is hit by the bullet, updating the pose data of the target virtual aircraft to obtain new pose data corresponding to the target virtual aircraft;
and generating pose change display data of the target virtual aircraft for the AR equipment based on the pose data of the AR equipment and the new pose data corresponding to the target virtual aircraft for each AR equipment, and sending the pose change display data to the AR equipment for display.
In one possible implementation, the obtaining module 601, when configured to obtain, in real time, pose data of each AR device in the associated AR device group in the target real scene, includes:
real scene images shot by each AR device in the associated AR device group aiming at a target real scene are obtained in real time;
and determining the pose data of the AR equipment in the target real scene based on the real scene image shot by each AR equipment and a pre-established three-dimensional scene model for representing the target real scene.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
Corresponding to the shooting trajectory display method in fig. 1, an embodiment of the present disclosure further provides an electronic device 700. As shown in fig. 7, a schematic structural diagram of the electronic device 700 provided in the embodiment of the present disclosure includes:
a processor 71, a memory 72, and a bus 73. The memory 72 is used to store execution instructions and includes an internal memory 721 and an external memory 722. The internal memory 721 is used to temporarily store operation data in the processor 71 and data exchanged with the external memory 722, such as a hard disk; the processor 71 exchanges data with the external memory 722 through the internal memory 721. When the electronic device 700 runs, the processor 71 communicates with the memory 72 through the bus 73, causing the processor 71 to execute the following instructions: acquiring, in real time, pose data of each AR device in the associated AR device group in a target real scene, together with pose control data and bullet firing control data of the AR devices for their matched virtual aircraft; determining trajectory information of the bullets fired by each virtual aircraft based on the pose control data and the bullet firing control data for that virtual aircraft; and, for each AR device, generating fused trajectory display data of at least one bullet for the AR device based on the pose data of the AR device and the trajectory information of the bullets fired by each virtual aircraft, and sending the fused trajectory display data to the AR device for display.
The embodiment of the present disclosure further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the method for displaying a shooting trajectory in the above-mentioned method embodiment. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the method for displaying a shooting trajectory provided by the embodiment of the present disclosure includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the steps of the method for displaying a shooting trajectory in the above method embodiment, which may be referred to in the above method embodiment specifically, and are not described herein again.
The embodiments of the present disclosure also provide a computer program which, when executed by a processor, implements any one of the methods of the foregoing embodiments. The computer program product may be embodied in hardware, software or a combination thereof. In an optional embodiment, the computer program product is embodied as a computer storage medium; in another optional embodiment, the computer program product is embodied as a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above-mentioned embodiments are merely specific embodiments of the present disclosure, which are used for illustrating the technical solutions of the present disclosure and not for limiting the same, and the scope of the present disclosure is not limited thereto, and although the present disclosure is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive of the technical solutions described in the foregoing embodiments or equivalent technical features thereof within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure, and should be construed as being included therein. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A shooting trajectory display method, characterized by comprising:
acquiring, in real time, pose data of each AR device in an associated AR device group in a target real scene, together with pose control data and bullet firing control data of each AR device for its matched virtual aircraft;
determining trajectory information of the bullets fired by each said virtual aircraft based on the pose control data and the bullet firing control data for that virtual aircraft;
and, for each AR device, generating fused trajectory display data of at least one bullet for the AR device based on the pose data of the AR device and the trajectory information of the bullets fired by each virtual aircraft, and sending the fused trajectory display data to the AR device for display.
2. The display method according to claim 1, wherein determining the trajectory information of the bullets fired by each said virtual aircraft based on the pose control data and the bullet firing control data for that virtual aircraft comprises:
determining pose data for each virtual aircraft based on the pose control data for that virtual aircraft;
determining an initial position, a firing direction and a firing strength of a bullet fired by the virtual aircraft based on the pose data of the virtual aircraft and the bullet firing control data for the virtual aircraft;
and determining the trajectory information of the bullet fired by the virtual aircraft based on the initial position, the firing direction and the firing strength.
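As one concrete, non-authoritative reading of this computation, the firing strength can be treated as a muzzle speed and the bullet integrated under gravity; the constants and names below are assumptions, not values from the patent.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.8, 0.0])  # assumed scene-space gravity, m/s^2

def bullet_trajectory(initial_pos, fire_dir, strength, steps=60, dt=1 / 30):
    """Sample a bullet's flight path from its initial position, firing
    direction and firing strength (treated here as muzzle speed)."""
    direction = np.asarray(fire_dir, dtype=float)
    direction /= np.linalg.norm(direction)   # normalize the firing direction
    velocity = strength * direction          # initial velocity vector
    pos = np.asarray(initial_pos, dtype=float)
    points = [pos.copy()]
    for _ in range(steps):
        velocity = velocity + GRAVITY * dt   # gravity bends the path downward
        pos = pos + velocity * dt
        points.append(pos.copy())
    return np.stack(points)                  # (steps + 1, 3) sampled trajectory

# Example: a bullet fired forward and slightly upward from the aircraft's nose.
track = bullet_trajectory(initial_pos=(0, 1.5, 0), fire_dir=(0, 0.2, 1), strength=20.0)
```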
3. The display method according to claim 1 or 2, wherein the generating, for each AR device, fused track display data of at least one bullet based on the pose data of the AR device and the trajectory information of bullets fired by each virtual aircraft comprises:
determining fused track special effect data of the at least one bullet based on the trajectory information of the bullets fired by each virtual aircraft;
and, for each AR device, determining the fused track display data of the at least one bullet based on the pose data of the AR device and the fused track special effect data corresponding to the at least one bullet.
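One way to picture this fusion step is as a world-to-camera transform followed by a pinhole projection, so that a bullet's world-space track lands on the correct pixels of a given AR device's view. The pose convention and intrinsics below are assumptions for illustration, not part of the claims.

```python
import numpy as np

def world_to_device(points_w, R_dw, t_dw):
    """Map Nx3 world-space points into the device frame, given the device
    pose as a world-to-device rotation R_dw and translation t_dw."""
    return points_w @ R_dw.T + t_dw

def project(points_d, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Project device-frame points to pixel coordinates with a pinhole model
    (assumed intrinsics; points must lie in front of the camera, z > 0)."""
    z = points_d[:, 2:3]
    uv = points_d[:, :2] / z
    return uv * np.array([fx, fy]) + np.array([cx, cy])

# Example: overlay two sampled track points on a device looking down +z.
track_points = np.array([[0.0, 0.0, 5.0], [0.0, 0.1, 6.0]])
pixels = project(world_to_device(track_points, np.eye(3), np.zeros(3)))
```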
4. The display method according to any one of claims 1 to 3, further comprising:
determining whether there is at least one target virtual aircraft hit by a fired bullet, based on the pose control data for each virtual aircraft and the bullet firing control data for that virtual aircraft;
upon determining that there is at least one target virtual aircraft hit by a bullet, determining special effect data for the at least one target virtual aircraft when hit by the bullet;
and, for each AR device, generating fused shooting display data based on the pose data of the AR device and the special effect data of the at least one target virtual aircraft when hit by the bullet, and sending the fused shooting display data to the AR device for display.
5. The display method of claim 4, wherein the determining whether there is at least one target virtual aircraft hit by a fired bullet based on the pose control data for each virtual aircraft and the bullet firing control data for that virtual aircraft comprises:
determining a movement track of each virtual aircraft within a preset time period based on the current pose data of the virtual aircraft and the pose control data for the virtual aircraft within the preset time period;
determining whether the movement track of a target virtual aircraft intersects the flight trajectory of any bullet based on the trajectory information of each bullet within the preset time period and the movement track of each virtual aircraft within the preset time period;
and upon determining that the movement track of the target virtual aircraft intersects the flight trajectory of any bullet, determining that there is at least one target virtual aircraft hit by a bullet.
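A minimal sketch of this hit test, assuming the movement track and each bullet trajectory are sampled at the same timestamps; the hit radius is an illustrative tolerance, not a value from the patent.

```python
import numpy as np

def hit_detected(aircraft_track, bullet_tracks, hit_radius=0.3):
    """Return True if any bullet passes within hit_radius of the aircraft at
    the same sampled instant inside the preset time period."""
    a = np.asarray(aircraft_track, dtype=float)
    for bullet in bullet_tracks:
        b = np.asarray(bullet, dtype=float)
        n = min(len(a), len(b))                       # compare overlapping samples only
        dists = np.linalg.norm(a[:n] - b[:n], axis=1)
        if np.any(dists < hit_radius):                # tracks "intersect" in space-time
            return True
    return False
```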
6. The display method according to claim 4 or 5, further comprising:
in response to determining that a target virtual aircraft is hit by a bullet, updating the pose data of the target virtual aircraft to obtain new pose data corresponding to the target virtual aircraft;
and, for each AR device, generating pose change display data of the target virtual aircraft based on the pose data of the AR device and the new pose data corresponding to the target virtual aircraft, and sending the pose change display data to the AR device for display.
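As a hypothetical illustration of the pose update, the hit aircraft could simply be displaced along the bullet's direction of travel; the knockback model below is an assumption, since the claim does not specify how the new pose is derived.

```python
import numpy as np

def updated_pose_on_hit(pose_xyz, bullet_dir, knockback=0.5):
    """Nudge the hit aircraft's position along the bullet's travel direction."""
    d = np.asarray(bullet_dir, dtype=float)
    d /= np.linalg.norm(d)
    return np.asarray(pose_xyz, dtype=float) + knockback * d
```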
7. The display method according to any one of claims 1 to 6, wherein the acquiring, in real time, pose data of each AR device in the associated AR device group in the target real scene comprises:
acquiring, in real time, the real scene images that each AR device in the associated AR device group captures of the target real scene;
and determining the pose data of each AR device in the target real scene based on the real scene image captured by that AR device and a pre-established three-dimensional scene model representing the target real scene.
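This localization step could plausibly be realized by matching 2D features in the captured image to 3D points of the pre-built scene model and solving a PnP problem. The sketch below assumes the 2D-3D correspondences and camera intrinsics are already available and uses OpenCV's solver; it is one possible implementation, not the one the patent prescribes.

```python
import numpy as np
import cv2

def device_pose_from_image(model_points_3d, image_points_2d, camera_matrix):
    """Estimate the AR device pose in the target real scene from 2D-3D
    correspondences against the pre-built scene model via PnP."""
    dist_coeffs = np.zeros(5)  # assume an undistorted (pre-rectified) image
    ok, rvec, tvec = cv2.solvePnP(
        model_points_3d.astype(np.float32),
        image_points_2d.astype(np.float32),
        camera_matrix,
        dist_coeffs,
        flags=cv2.SOLVEPNP_ITERATIVE,
    )
    R, _ = cv2.Rodrigues(rvec)  # scene(world)-to-camera rotation matrix
    return ok, R, tvec
```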
8. A shooting track display apparatus, comprising:
an acquisition module configured to acquire, in real time, pose data of each AR device in an associated AR device group in a target real scene, as well as pose control data and bullet firing control data of each AR device for its matched virtual aircraft;
a determination module configured to determine, for each virtual aircraft, trajectory information of bullets fired by the virtual aircraft based on the pose control data and the bullet firing control data for that virtual aircraft;
and a generation module configured to, for each AR device, generate fused track display data of at least one bullet based on the pose data of the AR device and the trajectory information of the bullets fired by each virtual aircraft, and send the fused track display data to the AR device for display.
9. An electronic device, comprising a processor, a memory and a bus, wherein the memory stores machine-readable instructions executable by the processor, the processor and the memory communicate via the bus when the electronic device runs, and the machine-readable instructions, when executed by the processor, perform the steps of the display method according to any one of claims 1 to 7.
10. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the display method according to any one of claims 1 to 7.
CN202010513301.XA 2020-06-08 2020-06-08 Shooting track display method and device, electronic equipment and storage medium Pending CN111632377A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010513301.XA CN111632377A (en) 2020-06-08 2020-06-08 Shooting track display method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111632377A true CN111632377A (en) 2020-09-08

Family

ID=72322818

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010513301.XA Pending CN111632377A (en) 2020-06-08 2020-06-08 Shooting track display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111632377A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105031919A (en) * 2010-03-30 2015-11-11 索尼电脑娱乐美国公司 Method for an augmented reality character to maintain and exhibit awareness of an observer
CN107526443A (en) * 2017-09-29 2017-12-29 北京金山安全软件有限公司 Augmented reality method, device, system, electronic equipment and storage medium
CN110507990A (en) * 2019-09-19 2019-11-29 腾讯科技(深圳)有限公司 Interactive approach, device, terminal and storage medium based on virtual aircraft
CN110718094A (en) * 2019-10-29 2020-01-21 北京汽车集团有限公司 Collision early warning method, device and equipment

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114584681A (en) * 2020-11-30 2022-06-03 北京市商汤科技开发有限公司 Target object motion display method and device, electronic equipment and storage medium
CN112619163A (en) * 2020-12-22 2021-04-09 上海米哈游天命科技有限公司 Flight path control method and device, electronic equipment and storage medium
CN113750532A (en) * 2021-09-24 2021-12-07 腾讯科技(深圳)有限公司 Track display method and device, storage medium and electronic equipment
CN113750532B (en) * 2021-09-24 2023-07-14 腾讯科技(深圳)有限公司 Track display method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN111638793B (en) Display method and device of aircraft, electronic equipment and storage medium
CN109939438B (en) Track display method and device, storage medium and electronic device
CN111617471A (en) Virtual shooting display method and device, electronic equipment and storage medium
JP5563709B2 (en) System and method for facilitating interaction with virtual space via a touch-sensitive surface
CN111632377A (en) Shooting track display method and device, electronic equipment and storage medium
KR102629359B1 (en) Virtual object attack prompt method and device, and terminal and storage medium
CN112148189A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN111651057A (en) Data display method and device, electronic equipment and storage medium
CN110876849B (en) Virtual vehicle control method, device, equipment and storage medium
CN112148187A (en) Interaction method and device for augmented reality scene, electronic equipment and storage medium
CN112076473A (en) Control method and device of virtual prop, electronic equipment and storage medium
CN111569414B (en) Flight display method and device of virtual aircraft, electronic equipment and storage medium
CN113398601A (en) Information transmission method, information transmission device, computer-readable medium, and apparatus
JP2024522972A (en) Method, device, equipment and computer program for rendering video frames
CN113633975A (en) Virtual environment picture display method, device, terminal and storage medium
CN113256710B (en) Method and device for displaying foresight in game, computer equipment and storage medium
CN113599822B (en) Virtual prop control method and device, storage medium and electronic equipment
CN111650953B (en) Aircraft obstacle avoidance processing method and device, electronic equipment and storage medium
CN111651048B (en) Multi-virtual object arrangement display method and device, electronic equipment and storage medium
CN112156472A (en) Control method, device and equipment of virtual prop and computer readable storage medium
Yavuz et al. Desktop Artillery Simulation Using Augmented Reality
Quek et al. Obscura: A mobile game with camera based mechanics
EP4394721A1 (en) Training system, method and apparatus using extended reality contents
CN113617030B (en) Virtual object control method, device, terminal and storage medium
CN113730909B (en) Aiming position display method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination