CN112933601A - Virtual throwing object operation method, device, equipment and medium - Google Patents


Info

Publication number
CN112933601A
Authority
CN
China
Prior art keywords
virtual
throwing
obstacle
environment
virtual environment
Legal status (assumption, not a legal conclusion)
Granted
Application number
CN202110227863.2A
Other languages
Chinese (zh)
Other versions
CN112933601B (en)
Inventor
刘智洪
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110227863.2A priority Critical patent/CN112933601B/en
Publication of CN112933601A publication Critical patent/CN112933601A/en
Application granted granted Critical
Publication of CN112933601B publication Critical patent/CN112933601B/en
Legal status: Active

Classifications

    • A63F13/573: Simulating properties, behaviour or motion of objects in the game world using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/5372: Controlling the output signals involving additional visual information provided to the game scene, using indicators for tagging characters, objects or locations in the game scene
    • A63F13/837: Special adaptations for executing a specific game genre or game mode: shooting of targets
    • A63F2300/8076: Features of games using an electronically generated display specially adapted for executing a specific type of game: shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a method, apparatus, device, and medium for operating a virtual throwing object, and relates to the field of virtual environments. The method includes: displaying a virtual object holding the virtual throwing object, the virtual object being in a virtual environment; receiving a pre-throwing operation used to aim the throw of the virtual throwing object in the virtual environment; displaying, based on the pre-throwing operation, trajectory indication information that indicates the flight trajectory of the virtual throwing object in the virtual environment; in response to receiving a throwing operation, throwing the virtual throwing object to a target position in the virtual environment; and displaying a virtual obstacle at the target position, the virtual obstacle blocking virtual injury to the virtual object from a second side when the virtual object is located on a first side of the obstacle. Because the virtual throwing object places a virtual obstacle in the virtual environment to protect the virtual object, the diversity of operations available for virtual throwing objects is improved.

Description

Virtual throwing object operation method, device, equipment and medium
Technical Field
The present application relates to the field of virtual environments, and in particular, to a method, an apparatus, a device, and a medium for operating a virtual projectile.
Background
In applications that include a virtual environment, the user typically controls a virtual object to carry out activities in that environment, such as walking, driving, swimming, fighting, and picking up items. The virtual object may use virtual props to implement particular functions; for example, it may throw a carried virtual grenade, virtual flash bomb, or virtual smoke bomb to achieve different effects.
In the related art, virtual throwing props include combat props and tactical props. A combat prop, such as a virtual grenade, can cause virtual damage to an enemy virtual object; a tactical prop, such as a virtual flash bomb, cannot cause damage but produces a certain tactical effect.
However, both types of virtual throwing props are limited in how they can be used and in the effects they achieve.
Disclosure of Invention
Embodiments of the present application provide a method, apparatus, device, and medium for operating a virtual throwing object, which can improve the diversity of operations available for a virtual throwing object. The technical scheme is as follows:
in one aspect, there is provided a method of operating a virtual projectile, the method comprising:
displaying a virtual object holding a virtual projectile, the virtual object being in a virtual environment;
receiving a pre-throwing operation for aiming the throw of the virtual projectile in the virtual environment;
displaying trajectory indication information of the virtual projectile in the virtual environment based on the pre-throwing operation, the trajectory indication information indicating the flight trajectory of the virtual projectile in the virtual environment;
in response to receiving a throwing operation, throwing the virtual projectile to a target position in the virtual environment; and
displaying a virtual obstacle at the target position, the virtual obstacle being configured to block virtual injury to the virtual object from a second side when the virtual object is located on a first side of the virtual obstacle, the first side and the second side being opposite.
In another aspect, there is provided an apparatus for operating a virtual projectile, the apparatus comprising:
a display module for displaying a virtual object of a handheld virtual projectile, the virtual object being in a virtual environment;
a receiving module for receiving a pre-throwing operation for aiming the throw of the virtual throwing object in the virtual environment;
the display module is further configured to display track indication information of the virtual throwing object in the virtual environment based on the pre-throwing operation, where the track indication information is used to indicate a flight track of the virtual throwing object in the virtual environment;
the display module is further configured to throw the virtual throwing object to a target position in the virtual environment in response to receiving a throwing operation;
the display module is further configured to display a virtual obstacle at the target location, the virtual obstacle being configured to block virtual injury to the virtual object from a second side in response to the virtual object being located on a first side of the virtual obstacle, wherein the first side and the second side are opposite.
In another aspect, there is provided a computer device comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method of operating a virtual projectile as described in any of the embodiments of the present application.
In another aspect, there is provided a computer readable storage medium having at least one program code stored therein, the program code being loaded and executed by a processor to implement the method of operating a virtual projectile as described in any of the embodiments of the present application.
In another aspect, a computer program product or computer program is provided, comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the storage medium and executes them, causing the computer device to perform the method of operating a virtual projectile as described in any of the above embodiments.
The technical solutions provided by the present application include at least the following beneficial effects:
In a virtual environment containing a virtual object, the virtual object holding the virtual throwing object is controlled to throw it to a target position, and a virtual obstacle is displayed at that position. The obstacle protects the virtual object on its first side from virtual injury arriving from the opposite second side. This improves the operational diversity of the virtual throwing object and enriches its functions.
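The side-based blocking rule described above can be sketched with a plane test: treat the displayed obstacle as a plane whose normal points toward its first side, and block injury only when the protected object and the injury source fall on opposite sides. This is an illustrative model, not the patent's implementation; `Obstacle`, `side_of`, and `blocks_injury` are invented names.

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

def dot(a: Vec3, b: Vec3) -> float:
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def sub(a: Vec3, b: Vec3) -> Vec3:
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

@dataclass
class Obstacle:
    position: Vec3   # target position where the obstacle was displayed
    normal: Vec3     # unit normal pointing toward the "first" side

def side_of(obstacle: Obstacle, point: Vec3) -> float:
    """> 0: point is on the first side; < 0: second side; 0: on the plane."""
    return dot(obstacle.normal, sub(point, obstacle.position))

def blocks_injury(obstacle: Obstacle, protected: Vec3, source: Vec3) -> bool:
    """Injury is blocked only when the protected object is on the first side
    and the injury source is on the opposite (second) side."""
    return side_of(obstacle, protected) > 0 and side_of(obstacle, source) < 0
```

A fuller implementation would also check that the injury path actually intersects the finite extent of the obstacle; the plane test captures only the opposite-sides condition stated in the claims.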
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Evidently, the drawings described below show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a block diagram of an electronic device provided in an exemplary embodiment of the present application;
FIG. 2 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a method of operation of a virtual projectile provided by an exemplary embodiment of the present application;
FIG. 4 is a schematic illustration of an equipment interface provided by an exemplary embodiment of the present application;
FIG. 5 is a flow chart of a method of operation of a virtual projectile provided by another exemplary embodiment of the present application;
FIG. 6 is a schematic view of a first preset range of a virtual obstacle provided by an exemplary embodiment of the present application;
FIG. 7 is a flow chart of a method of operation of a virtual projectile provided by another exemplary embodiment of the present application;
FIG. 8 is a schematic illustration of a throwing trajectory provided by an exemplary embodiment of the present application;
FIG. 9 is a schematic view of a virtual obstacle provided by an exemplary embodiment of the present application;
FIG. 10 is a schematic illustration of virtual obstacle orientation determination provided by an exemplary embodiment of the present application;
FIG. 11 is a flowchart corresponding to a method of operating a virtual projectile in accordance with an exemplary embodiment of the present application;
FIG. 12 is a block diagram of an apparatus for operating a virtual projectile according to an exemplary embodiment of the present application;
FIG. 13 is a block diagram of an apparatus for operating a virtual projectile according to another exemplary embodiment of the present application;
FIG. 14 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are briefly described:
Virtual environment: the virtual environment displayed (or provided) when an application runs on the terminal. The virtual environment may be a simulation of the real world, a semi-simulated semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. It may be two-dimensional, 2.5-dimensional, or three-dimensional; the following embodiments use a three-dimensional virtual environment as an example, but are not limited thereto. Optionally, the virtual environment is also used for battle between at least two virtual characters, for battle between at least two virtual characters using virtual firearms, or for firearm battle within a target area that becomes smaller over time in the virtual environment.
Virtual object: a movable object in the virtual environment. The movable object may be a virtual character, a virtual animal, an animation character, and the like, such as a character, animal, plant, oil drum, wall, or stone displayed in a three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of its space.
Virtual throwing object: the method refers to a prop for triggering a target function by throwing a virtual object in a virtual environment. Illustratively, the division by function is made with virtual projectiles including combat props and tactical props. Wherein, the combat props are throwing props that can cause virtual injury to the virtual object, for example: virtual grenades, virtual bums, virtual sticky bombs, and the like. Tactical props are throwing props which cannot cause virtual damage to virtual objects but can cause functional influence, for example: virtual smoke bombs, virtual shatter bombs, and the like. Optionally, the throwing prop may trigger the target function when the thrown duration reaches a preset duration, or may trigger the target function when thrown and there is a collision condition. The virtual grenade in the virtual environment is taken as a fighting prop, and the virtual flash bomb is taken as a tactical prop for illustration, wherein the virtual grenade is a prop which triggers a detonation function when the thrown duration in the virtual environment reaches the preset duration, a player controls a target virtual object to throw the virtual grenade, and when the thrown duration of the virtual grenade reaches the preset duration, the virtual grenade detonates in the virtual environment and injures the virtual object positioned in the preset distance range of the detonation point; the virtual flash bomb is a prop which is thrown in a virtual environment and triggers a flash action when a collision event exists, a player controls a target virtual object to throw the virtual flash bomb, when the virtual flash bomb falls on the ground in the virtual environment, a flash function is triggered, and the sight line of the virtual object located in a flash range is blocked.
The method provided in the present application may be applied to a virtual reality application, a three-dimensional map program, a military simulation program, a first-person shooter (FPS) game, a third-person shooter (TPS) game, a multiplayer online battle arena (MOBA) game, and the like. The following embodiments use games as examples.
A game based on a virtual environment typically consists of one or more maps of the game world. The virtual environment in the game simulates real-world scenes, and the user can control a virtual object to walk, run, jump, shoot, fight, drive, switch virtual weapons, and attack other virtual objects with a virtual weapon in the virtual environment, giving strong interactivity; multiple users can also team up online for a competitive match. When the user controls a virtual object to attack a target virtual object with a virtual weapon, the user selects a suitable weapon according to the target's position or the user's operating habits. Virtual weapons include at least one of mechanical weapons, melee weapons, and throwing weapons: mechanical weapons include rifles, sniper rifles, pistols, and shotguns; melee weapons include at least one of daggers, knives, axes, swords, sticks, and pans; throwing weapons include virtual grenades, virtual sticky grenades, virtual flash bombs, virtual smoke bombs, and the like.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. An application supporting a virtual environment, such as a three-dimensional virtual environment, is installed and run on the terminal. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, and a MOBA game, and may be a stand-alone application, such as a stand-alone 3D game program, or a networked online application.
Fig. 1 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 100 includes: an operating system 120 and application programs 122.
Operating system 120 is the base software that provides applications 122 with secure access to computer hardware.
Application 122 is an application that supports a virtual environment. Optionally, application 122 is an application that supports a three-dimensional virtual environment. The application 122 may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, an MOBA game, and a multi-player gunfight type live game. The application 122 may be a stand-alone application, such as a stand-alone 3D game program.
Fig. 2 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 200 includes: a terminal 210, a server 220 and a communication network 230.
The terminal 210 has installed and runs an application supporting a virtual environment. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The user uses the terminal 210 to control a virtual object located in the virtual environment to carry out activities, including but not limited to: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the virtual object is a virtual character, such as a simulated character or an animated character.
The terminal 210 is connected to the server 220 through the communication network 230. The communication network 230 includes a wireless network or a wired network. The device types corresponding to the terminal 210 include: at least one of a game console, a desktop computer, a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer.
The server 220 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. Server 220 is used to provide background services for applications that support a three-dimensional virtual environment. Optionally, the server 220 undertakes primary computing work and the terminal 210 undertakes secondary computing work; alternatively, the server 220 undertakes the secondary computing work and the terminal 210 undertakes the primary computing work; alternatively, the server 220 and the terminal 210 perform cooperative computing by using a distributed computing architecture.
The user logs in to the application corresponding to the virtual environment through the terminal 210, and the terminal 210 establishes a long connection with the server 220. The server 220 authenticates each request sent by the terminal 210; if a request is legitimate, the server 220 processes it and returns the processing result to the terminal 210. Illustratively, when the user controls a virtual object to use a virtual throwing object, the terminal 210 generates a corresponding use request and sends it to the server 220; the server 220 computes the drop point of the virtual throwing object according to the request and returns the logical processing result to the terminal 210, which then displays the target function of the virtual throwing object accordingly.
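The request/response cycle between terminal and server can be sketched as follows. The JSON message layout, the field names, and the flat-ground ballistic formula standing in for the server's drop-point computation are all assumptions for illustration, not details from the patent.

```python
import json
import math

SPEED = 12.0    # assumed initial throw speed
GRAVITY = 9.8

def compute_drop_point(direction, angle_deg):
    """Flat-ground ballistic range along a 2D heading; a stand-in for the
    server's actual drop-point computation."""
    a = math.radians(angle_deg)
    rng = SPEED ** 2 * math.sin(2 * a) / GRAVITY
    return [round(direction[0] * rng, 3), round(direction[1] * rng, 3)]

def build_use_request(player_id, prop_id, direction, angle_deg):
    # hypothetical message built by the terminal; field names are assumptions
    return json.dumps({"type": "use_projectile", "player": player_id,
                       "prop": prop_id, "direction": direction,
                       "angle": angle_deg})

def server_handle(request_json, authenticate):
    """Authenticate the request; if legitimate, compute the drop point and
    return it to the terminal, mirroring the flow described above."""
    req = json.loads(request_json)
    if not authenticate(req["player"]):
        return json.dumps({"ok": False, "error": "unauthorized"})
    drop = compute_drop_point(req["direction"], req["angle"])
    return json.dumps({"ok": True, "drop_point": drop})
```

On receiving an `ok` response, the terminal would render the target function of the projectile at the returned drop point.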
With the above terms and the implementation environment in mind, a method for operating a virtual throwing object according to an embodiment of the present application is now described. FIG. 3 shows a flowchart of the method according to an exemplary embodiment; the method is described as applied to a terminal and includes the following steps:
step 301, displaying a virtual object of the handheld virtual throwing object.
The terminal displays a virtual environment interface, which includes a picture of the virtual environment observed from the perspective of the virtual object. Optionally, the picture observes the virtual environment from the first-person perspective of the virtual object, or from a third-person perspective of the virtual object. The user can control the virtual object through the terminal to carry out activities in the virtual environment, including but not limited to: adjusting body posture, crawling, walking, running, riding, jumping, driving, using skills, and attacking. Illustratively, the virtual object may be a simulated character or an animated character.
The virtual object is in the virtual environment and possesses a virtual projectile. Illustratively, the virtual projectile may be picked up by the virtual object in the virtual environment, or equipped on the virtual object in a preset interface before entering the virtual environment, in which case the virtual object carries the projectile when it enters. In one example, as shown in FIG. 4, before starting a virtual match the user enters an equipment interface 400 in which a virtual item list 410 is displayed. The list 410 contains multiple virtual items that the virtual object can equip, including the virtual throwing object 411. The user can select the throwing object 411 from the list 410 and, through an equipment control 420, equip it on the virtual object corresponding to the user account; when the user then controls the virtual object to enter a virtual match, the virtual object carries the throwing object 411.
The virtual throwing object is a prop that displays a virtual obstacle in the virtual environment after being thrown. The obstacle may be displayed when the thrown duration reaches a preset duration, or when a collision event occurs in the virtual environment after the throw. In the embodiment of the present application, this throwing prop can achieve both a tactical effect and a combat effect.
Optionally, the virtual throwing object may be thrown onto the ground in the virtual environment, to a designated location, or to any other location. Illustratively, it may land on a virtual floor, a virtual desktop, the surface of a virtual object, and the like.
In the embodiment of the present application, the terminal displays the virtual object holding the virtual throwing object. Illustratively, when the virtual throwing object is in a pre-equipped state (displayed at the waist of the virtual object), it is switched to the equipped state through a projectile equipment control or shortcut key; that is, the virtual environment picture then shows the virtual object holding the projectile. When the virtual throwing object is in the virtual backpack, the user can perform a use operation on it in the backpack interface corresponding to the virtual backpack, after which the virtual object holding the projectile is displayed; alternatively, the projectile is switched to the equipped state through a projectile switching control on the virtual environment interface.
Optionally, the virtual throwing object triggers its target function either when its thrown duration reaches the preset duration or when a collision occurs after the throw; the duration-based trigger is described here as an example. Optionally, the thrown duration is divided into two phases: a pre-throwing phase, in which the virtual object holds the projectile and determines the throwing direction, and a throwing phase, in which the projectile, once thrown along that direction, travels to the corresponding position and triggers the target function. The thrown duration may be timed from the start of either phase.
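The two phases and the two possible timing start points can be captured in a small state machine. `Phase` and `ProjectileTimer` are illustrative names; the behavior is a sketch of the description above, not the patent's implementation.

```python
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()
    PRE_THROW = auto()   # aiming: projectile held, direction being chosen
    THROWN = auto()      # flying toward the drop point

class ProjectileTimer:
    """Accumulates the thrown duration, timed from either the pre-throwing
    phase or the throwing phase, as the embodiment allows."""

    def __init__(self, time_from: Phase = Phase.THROWN):
        self.time_from = time_from
        self.phase = Phase.IDLE
        self.elapsed = 0.0

    def start_aiming(self):
        self.phase = Phase.PRE_THROW

    def release(self):
        self.phase = Phase.THROWN

    def tick(self, dt: float):
        # Timing from PRE_THROW accumulates through both phases;
        # timing from THROWN accumulates only after release.
        if self.phase == self.time_from or (
            self.time_from == Phase.PRE_THROW and self.phase == Phase.THROWN
        ):
            self.elapsed += dt
```

Comparing `elapsed` against the preset duration each frame would decide when the target function (e.g. displaying the virtual obstacle) fires.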
Step 302, receiving a pre-throwing operation.
The pre-throwing operation is used to aim the throw of the virtual throwing object in the virtual environment. It corresponds to the pre-throwing phase of the thrown duration: once the user confirms the current throwing direction through the pre-throwing operation, the virtual throwing object has a drop point in the virtual environment, which may be on virtual ground, a virtual wall, the surface of a virtual object, and so on. The drop point and the position where the projectile's function is triggered may be the same or different; the user can anticipate the target position of the triggered function from the drop point.
Illustratively, after the pre-throwing operation is received, the user can also cancel the throwing aim through a cancel-throw operation, in which case the virtual throwing object is not consumed.
Both the pre-throwing operation and the cancel-throw operation may be implemented through a preset control or a preset shortcut key; this is not limited here.
Step 303, displaying the trajectory indication information of the virtual throwing object in the virtual environment based on the pre-throwing operation.
The trajectory indication information indicates the flight trajectory of the virtual projectile in the virtual environment. Illustratively, it is determined by the throwing direction and throwing angle of the virtual projectile under the pre-throwing operation. In one example, the trajectory indication information is displayed as a virtual parabola of preset transparency in the virtual environment.
Illustratively, the drop point indication information of the virtual throwing object is determined through the track indication information, and the drop point indication information is used for indicating the drop point condition of the virtual throwing object in the virtual environment. In one example, a position of a drop point is determined according to a collision point of a virtual parabola corresponding to the trajectory indication information and a virtual object in the virtual environment, and the drop point indication information is displayed in the virtual environment based on the position of the drop point, for example, when the virtual parabola corresponding to a virtual throwing object in the pre-throwing operation generates a collision point with a virtual wall, the drop point position information is displayed at the collision point.
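As a concrete illustration (not part of the patent text), the parabola-and-drop-point logic above can be sketched in Python. The gravity constant, time step, throw speed, and the single horizontal ground plane are all illustrative assumptions:

```python
import math

GRAVITY = 9.8          # assumed gravity constant for this sketch
TIME_STEP = 0.02       # sampling interval along the virtual parabola

def trajectory_points(origin, yaw_deg, pitch_deg, speed, ground_y=0.0):
    """Sample the virtual parabola of a thrown object until it meets the
    virtual ground (modeled as a horizontal plane at ground_y)."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    vx = speed * math.cos(pitch) * math.cos(yaw)
    vz = speed * math.cos(pitch) * math.sin(yaw)
    vy = speed * math.sin(pitch)
    x, y, z = origin
    points = []
    t = 0.0
    while y >= ground_y and t < 10.0:
        points.append((x, y, z))
        t += TIME_STEP
        x += vx * TIME_STEP
        z += vz * TIME_STEP
        vy -= GRAVITY * TIME_STEP
        y += vy * TIME_STEP
    points.append((x, ground_y, z))   # clamp the last sample to the ground
    return points

# The last sampled point approximates the drop point indication position.
path = trajectory_points((0.0, 1.5, 0.0), yaw_deg=0.0, pitch_deg=45.0, speed=12.0)
drop_point = path[-1]
```

In a real engine the sampled points would be tested against scene geometry (walls, object surfaces) rather than a single ground plane, and the first collision would give the drop point.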
Step 304, in response to receiving a throwing operation, throwing the virtual throwing object to a target position in the virtual environment.
The throwing operation is used to instruct the virtual object to throw the virtual throwing object to a target position in the virtual environment.
Illustratively, the virtual environment interface includes a throwing control, and the user can control the virtual object to use the virtual throwing object through the throwing control; the user may also control the virtual object to use the virtual throwing object through a preset shortcut key, which is not limited herein. Taking control through the throwing control as an example, the terminal displays the throwing control superimposed on the virtual environment picture, and in response to receiving a trigger operation on the throwing control, controls the virtual object to throw the virtual throwing object. The terminal receives the throwing operation through the throwing control, generates a throwing signal according to the throwing operation, and sends the throwing signal to the server; the server determines the final drop point of the virtual throwing object according to the throwing signal, thereby determining the target position, and returns feedback information including the target position to the terminal; the terminal then displays the corresponding virtual environment picture according to the feedback information, namely the virtual object throwing the virtual throwing object to the target position in the virtual environment.
Step 305, displaying a virtual obstacle at the target location.
Schematically, the virtual throwing object corresponds to a target trigger duration: after the virtual throwing object is thrown into the virtual environment, the virtual obstacle is displayed at the target position only when the thrown duration reaches the target trigger duration, that is, the virtual obstacle is displayed at the target position in response to the virtual throwing object being thrown to the target position in the virtual environment and the thrown duration reaching the target trigger duration. Alternatively, the virtual obstacle may be displayed when the virtual throwing object is thrown and a collision occurs; or when a virtual object approaches within a certain range of the virtual throwing object that has been thrown to the target position in the virtual environment; or when the virtual throwing object has been thrown into the virtual environment and an obstacle display operation is received (for example, a trigger operation on an obstacle display control). This is not limited herein.
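The alternative trigger conditions enumerated above can be collected into one dispatch function. This is a minimal sketch; the mode names and default thresholds are assumptions, not values from the patent:

```python
def should_display_obstacle(mode, *, thrown_duration=0.0, trigger_duration=1.5,
                            collided=False, nearest_object_distance=None,
                            proximity_range=3.0, display_operation=False):
    """Decide whether the virtual obstacle should appear under one of the
    four trigger modes described in the text."""
    if mode == "duration":
        # Thrown duration reaches the target trigger duration.
        return thrown_duration >= trigger_duration
    if mode == "collision":
        # The thrown object collided with something.
        return collided
    if mode == "proximity":
        # A virtual object entered the preset range around the drop point.
        return (nearest_object_distance is not None
                and nearest_object_distance <= proximity_range)
    if mode == "operation":
        # An obstacle display operation (e.g. control trigger) was received.
        return display_operation
    raise ValueError(f"unknown trigger mode: {mode}")
```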
The function of the virtual obstacle displayed by the virtual throwing object in the virtual environment includes an attack blocking function: if the virtual object is located on a first side of the virtual obstacle, the virtual obstacle is used to block virtual injury to the virtual object coming from a second side, the first side and the second side being opposite. Illustratively, the virtual obstacle is a virtual object with a preset shape; it may be a virtual wall, a virtual steel plate, a virtual explosion-proof shield, and the like, and the preset shape may be a rectangular parallelepiped, a cube, a sphere, or another irregular shape, which is not limited herein.
When the virtual object is on the first side of the virtual obstacle, the virtual obstacle can protect the virtual object from virtual injury coming from the second side, that is, the virtual obstacle can block attacks from a preset direction. Optionally, the virtual obstacle blocks attack operations of hostile virtual objects but not those of the virtual object itself or of teammate virtual objects, where a hostile virtual object belongs to an opposing camp relative to the virtual object that threw the virtual throwing object, and a teammate virtual object belongs to the same camp. In one example, a virtual bullet fired by a hostile virtual object using a virtual firearm cannot penetrate the virtual obstacle, while a virtual bullet fired by a teammate virtual object can. Alternatively, the virtual obstacle blocks all attack operations in the virtual environment; in one example, neither hostile nor teammate virtual objects can penetrate the virtual obstacle with virtual bullets fired from virtual firearms. Or the virtual obstacle allows attack operations in one direction but blocks them in the opposite direction; in one example, the virtual obstacle includes an attack face and a defence face, and a virtual bullet can pass through the virtual obstacle from the attack face but not from the defence face.
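The three penetration policies described above can be sketched as follows; the policy names, camp labels, and face labels are illustrative assumptions:

```python
def bullet_passes(policy, shooter_camp, owner_camp, hit_face="defence"):
    """Return True if a virtual bullet passes through the virtual obstacle.

    policy: 'block_enemy' (only hostile fire is blocked),
            'block_all'   (all fire is blocked), or
            'one_way'     (passes from the attack face only).
    owner_camp is the camp of the virtual object that threw the obstacle.
    """
    if policy == "block_enemy":
        return shooter_camp == owner_camp   # teammates' bullets pass through
    if policy == "block_all":
        return False                        # nobody's bullets pass through
    if policy == "one_way":
        return hit_face == "attack"         # passes only from the attack face
    raise ValueError(f"unknown policy: {policy}")
```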
The virtual obstacle also corresponds to a virtual life value. In one example, an attack operation of a hostile virtual object on the virtual obstacle causes damage to it, that is, reduces its virtual life value; when the virtual life value drops to 0, the virtual obstacle disappears from the virtual environment. Illustratively, the virtual obstacle may simply disappear from the virtual environment, or an explosion effect may be triggered when the virtual life value is cleared, the explosion effect causing virtual injury to virtual objects within a certain range. That is, an attack operation of a hostile virtual object on the virtual obstacle is received and the virtual life value of the virtual obstacle is reduced; in response to the virtual life value being cleared, the virtual obstacle is controlled to disappear from the virtual environment.
The virtual obstacle also corresponds to a target display duration. In one example, if the virtual life value of the virtual obstacle has not been cleared before the target display duration ends, the virtual obstacle continues to be displayed in the virtual environment; if the virtual life value is cleared, the virtual obstacle disappears from the virtual environment even if its display duration has not reached the target display duration. Alternatively, the virtual obstacle corresponds only to the target display duration, that is, the virtual obstacle is displayed in the virtual environment until its display duration reaches the target display duration, whereupon it is controlled to disappear. Or, in response to the display duration of the virtual obstacle reaching the target display duration, an attack function of the virtual obstacle is triggered in the virtual environment, the attack function causing virtual injury to virtual objects within a first preset range, where such a virtual object may be a hostile virtual object, a teammate virtual object, or the virtual object that threw the virtual throwing object.
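The interplay of virtual life value and target display duration can be sketched as a small state object; the numeric defaults and attribute names here are assumptions for illustration:

```python
class VirtualObstacle:
    """Minimal lifecycle sketch: the obstacle disappears when its virtual
    life value is cleared or when its display duration expires."""

    def __init__(self, life=100, target_display_duration=30.0,
                 explode_on_expiry=False):
        self.life = life
        self.target_display_duration = target_display_duration
        self.explode_on_expiry = explode_on_expiry
        self.displayed_for = 0.0
        self.visible = True
        self.exploded = False

    def take_damage(self, amount):
        """Hostile attack operation: reduce the virtual life value."""
        if not self.visible:
            return
        self.life = max(0, self.life - amount)
        if self.life == 0:
            self.visible = False        # life value cleared -> disappear

    def tick(self, dt):
        """Advance the display duration by dt seconds."""
        if not self.visible:
            return
        self.displayed_for += dt
        if self.displayed_for >= self.target_display_duration:
            self.visible = False        # duration reached -> disappear
            if self.explode_on_expiry:
                self.exploded = True    # optional attack function on expiry
```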
The virtual obstacle is also provided with a corresponding cancel control: when the virtual obstacle is successfully displayed at the target position in the virtual environment, the cancel control is displayed on the virtual environment interface, and when the cancel control receives a trigger signal, the virtual obstacle is controlled to disappear from the virtual environment. That is, after displaying the virtual obstacle through the virtual throwing object, the user can cancel it manually.
The function of the virtual obstacle in the virtual environment also includes an attack function. After the virtual obstacle is displayed in the virtual environment, an attack trigger signal for the virtual obstacle is received, and the attack function of the virtual obstacle is triggered in the virtual environment based on the attack trigger signal, the attack function being used to indicate that virtual injury is caused to virtual objects within a first preset range of the virtual obstacle, where such a virtual object may be a hostile virtual object, a teammate virtual object, or the virtual object that threw the virtual throwing object. Illustratively, the attack trigger signal is sent by triggering through a control, or triggered in response to the disappearance of the virtual obstacle, or triggered when a virtual object approaches within a certain preset range of the virtual obstacle.
Schematically, the attack function of the virtual obstacle is triggered through a control: the terminal displays the control in the virtual environment interface, the control being superimposed on the virtual environment picture in which the virtual environment is displayed and being used to trigger the attack function of the virtual obstacle; a trigger operation on the control is received; and an attack trigger signal for the virtual obstacle is generated based on the trigger operation, so as to realize the attack function of the virtual obstacle in the virtual environment. Optionally, the control may be displayed in the virtual environment interface when the virtual object is close to the virtual obstacle, that is, the control is displayed when the virtual object moves within a second preset range of the virtual obstacle; or the control may be displayed on the virtual environment interface after the virtual object throws the virtual throwing object and the virtual obstacle is successfully displayed in the virtual environment, that is, the control is displayed in response to the virtual obstacle being displayed at the target position.
Illustratively, when the display duration of the virtual obstacle in the virtual environment reaches the target display duration, the attack function of the virtual obstacle is automatically triggered. Namely, in response to the display duration of the virtual obstacle reaching the target display duration, an attack trigger signal for the virtual obstacle is generated to realize an attack function of the virtual obstacle in the virtual environment.
Illustratively, when a virtual obstacle located in the virtual environment is approached by a virtual object, the attack function of the virtual obstacle is automatically triggered. That is, in response to the virtual object existing within the third preset range of the virtual obstacle, an attack trigger signal for the virtual obstacle is generated to realize an attack function of the virtual obstacle in the virtual environment, and the virtual object may be an enemy virtual object, a teammate virtual object, or a virtual object for throwing the virtual throwing object.
In summary, in the method for operating a virtual throwing object provided by this embodiment of the present application, in a virtual environment containing a virtual object, the virtual object holding a virtual throwing object is controlled to throw the virtual throwing object to a target position in the virtual environment, so that a virtual obstacle is displayed at the target position; the virtual obstacle protects the virtual object on the first side from virtual injury coming from the second side, the first side being opposite to the second side, thereby improving the operational diversity of the virtual throwing object and enriching its functions.
Referring to fig. 5, which shows a method for operating a virtual throwing object provided by another exemplary embodiment of the present application. In this embodiment, the case where the virtual throwing object can also implement an attack function is taken as an example; the method includes:
Step 501, displaying a virtual object holding a virtual throwing object.
The virtual object is in a virtual environment and holds a virtual throwing object. After being thrown, the virtual throwing object can display a virtual obstacle, and the functions of the virtual obstacle in the virtual environment include an attack blocking function and an attack function.
Step 502, receiving a pre-throwing operation.
The pre-throwing operation is used for aiming the throw of the virtual throwing object in the virtual environment. Through the pre-throwing operation, the user can determine the drop point position of the virtual throwing object in the virtual environment when it is thrown in the current throwing direction, and based on the drop point position can anticipate the target position at which the virtual throwing object triggers its function.
Step 503, displaying the track indication information of the virtual throwing object in the virtual environment based on the pre-throwing operation.
The trajectory indication information is used to indicate the flight trajectory of the virtual throwing object in the virtual environment. Illustratively, the trajectory indication information is determined by the throwing direction and throwing angle of the virtual throwing object under the pre-throwing operation.
Step 504, in response to receiving a throwing operation, throwing the virtual throwing object to a target position in the virtual environment.
The throwing operation is used to instruct the virtual object to throw the virtual throwing object to a target position in the virtual environment. Illustratively, the terminal determines that a throwing operation of the user has been received through a trigger signal of the throwing control, generates a corresponding throwing signal according to the throwing direction and throwing angle determined in the pre-throwing stage, and transmits the throwing signal to the server; the server determines the final drop point of the virtual throwing object, namely the target position, according to the throwing direction and throwing angle in the throwing signal, and returns feedback information corresponding to the target position to the terminal; the terminal displays the corresponding virtual environment picture according to the feedback information, namely the virtual object throwing the virtual throwing object to the target position in the virtual environment.
Step 505, displaying a virtual obstacle at the target position.
Illustratively, the virtual obstacle is a virtual object with a preset shape, the virtual object may be a virtual wall, a virtual steel plate, a virtual explosion-proof shield, and the like, and the preset shape may be a rectangular parallelepiped, a cube, a sphere, or another irregular-shaped object, which is not limited herein. The functions of the virtual barrier include an attack blocking function and an attack function.
Steps 506 to 508 (step 506 includes steps 5061 and 5062, and step 507 includes steps 5071 and 5072) describe three functional effects that the virtual obstacle can achieve in the virtual environment.
Step 5061, receiving an attack trigger signal for the virtual obstacle.
The attack trigger signal is sent by triggering through a control, or triggered in response to the disappearance of the virtual obstacle, or triggered when a virtual object approaches within a certain preset range of the virtual obstacle. After receiving the attack trigger signal for the virtual obstacle, the terminal correspondingly generates an attack trigger request and sends it to the server; the server determines, according to the attack trigger request, the virtual objects in the virtual environment that suffer virtual injury from the attack function of the virtual obstacle, generates effect feedback information, and feeds it back to the terminal; the terminal displays the corresponding virtual environment picture according to the effect feedback information.
Step 5062, triggering the attack function of the virtual obstacle in the virtual environment based on the attack trigger signal.
The attack function is used to indicate that virtual injury is caused to virtual objects within a first preset range of the virtual obstacle. The first preset range corresponds to the size of the virtual obstacle. In one example, as shown in fig. 6, a virtual obstacle 610 is displayed in the virtual environment picture 600; the virtual obstacle 610 is a rectangular wall surface, and the first preset range is determined according to the length of the virtual obstacle 610, that is, a circle is drawn with the length of the virtual obstacle 610 as its diameter, yielding a circular range 620 as the first preset range. When the attack function of the virtual obstacle is triggered, the virtual obstacle produces an explosion effect, and hostile virtual objects within the first preset range suffer virtual injury of a preset value.
Illustratively, the magnitude of the virtual injury is related to the distance between the hostile virtual object and the virtual obstacle. In one example, the closer the hostile virtual object is to the virtual obstacle, the higher the virtual injury it receives; the farther away it is, the lower the virtual injury.
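Taking the circular first preset range from fig. 6 together with the distance-based injury just described, a top-down 2D sketch might look like the following; the linear falloff and the parameter names are assumptions, not values from the patent:

```python
import math

def explosion_damage(obstacle_length, base_damage, obstacle_pos, target_pos):
    """Virtual injury from the obstacle's attack function.

    The first preset range is a circle whose diameter equals the obstacle
    length; injury falls off linearly with distance from the obstacle.
    Positions are 2D (x, z) top-down coordinates.
    """
    radius = obstacle_length / 2.0
    dx = target_pos[0] - obstacle_pos[0]
    dz = target_pos[1] - obstacle_pos[1]
    distance = math.hypot(dx, dz)
    if distance > radius:
        return 0                        # outside the first preset range
    return round(base_damage * (1.0 - distance / radius))
```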
Step 5071, in response to receiving an attack operation on the virtual obstacle by the virtual object in the virtual environment, reducing a virtual life value of the virtual obstacle.
The virtual barrier also corresponds to a virtual life value. Illustratively, when the virtual object in the virtual environment includes a hostile virtual object, the hostile virtual object's attack operation on the virtual obstacle may cause damage to the virtual obstacle, i.e. reduce the corresponding virtual life value of the virtual obstacle, and when the virtual life value is reduced to 0, the virtual obstacle disappears from the virtual environment.
In one example, the virtual damage caused to the virtual obstacle by an attack operation of a hostile virtual object is related to the virtual prop used by that virtual object: the higher the attack power of the virtual prop, the higher the virtual damage to the virtual obstacle. For example, if the hostile virtual object holds a virtual pistol, the corresponding virtual bullet causes low damage to the virtual obstacle; if the hostile virtual object holds a virtual mortar, the corresponding virtual shell causes high damage to the virtual obstacle. If the virtual damage exceeds the total or remaining virtual life value of the virtual obstacle, the virtual obstacle is destroyed outright and can no longer provide the attack blocking function to the virtual object.
Step 5072, in response to the virtual life value being cleared, controlling the virtual obstacle to disappear from the virtual environment.
When the virtual life value of the virtual obstacle is cleared, the virtual obstacle disappears from the virtual environment and can no longer provide the attack blocking function to the virtual object. Illustratively, the virtual obstacle may disappear directly from the virtual environment, or an explosion effect may be triggered when the virtual life value is cleared, the explosion effect causing virtual injury to virtual objects within a certain range.
Step 508, in response to the display duration of the virtual obstacle reaching the target display duration, controlling the virtual obstacle to disappear from the virtual environment.
The virtual obstacle also corresponds to a target display duration. Illustratively, if the virtual life value of the virtual obstacle has not been cleared before the target display duration ends, the virtual obstacle continues to be displayed in the virtual environment; if the virtual life value is cleared, the virtual obstacle disappears from the virtual environment even if its display duration has not reached the target display duration. Alternatively, the virtual obstacle corresponds only to the target display duration, that is, the virtual obstacle is displayed in the virtual environment until its display duration reaches the target display duration, whereupon it is controlled to disappear. Schematically, the attack function of the virtual obstacle may instead be triggered after the display duration reaches the target display duration, that is, the virtual obstacle produces an explosion effect after being displayed for the target display duration in the virtual environment, causing virtual injury to virtual objects within a second preset range.
Illustratively, before the display duration of the virtual obstacle reaches the target display duration, the user may control the virtual obstacle to disappear through the cancel control, that is, when the cancel control receives a trigger signal, the virtual obstacle disappears directly.
In summary, in the method for operating a virtual throwing object provided by this embodiment of the present application, a virtual obstacle is displayed through a virtual throwing object in a virtual environment, and the functions of the virtual obstacle in the virtual environment include an attack blocking function and an attack function, so that both combat and tactical effects can be achieved, the operational diversity of the virtual throwing object is improved, and its functions are enriched.
Referring to fig. 7, which shows a method for operating a virtual throwing object provided by another exemplary embodiment of the present application. In this embodiment, the throwing process of the virtual throwing object is described; the method includes:
Step 701, displaying a virtual object holding a virtual throwing object.
The user can control the virtual object in the virtual environment through the terminal, so that various activities are carried out in the virtual environment through the virtual object, these activities including the virtual object holding the virtual throwing object in hand.
Step 702, in response to receiving the pre-throwing operation, displaying trajectory indication information of the virtual throwing object in the virtual environment.
Illustratively, the throwing trajectory of the virtual throwing object in the virtual environment is determined in response to receiving a usage signal of a target account for the virtual throwing object. Specifically, the terminal generates the usage signal according to the received pre-throwing operation, determines the throwing trajectory of the throwing object in the virtual environment according to the usage signal, and displays the trajectory indication information on the virtual environment picture according to the throwing trajectory. The target account corresponds to the virtual object, that is, the user logs into the target account in an application program on the terminal, the application program provides the virtual environment, and the user controls the virtual object in the virtual environment through the terminal.
In this embodiment of the present application, referring to fig. 8, the virtual environment interface 800 displayed by the terminal includes a throwing control 810, and the user throws the virtual throwing object by clicking or pressing the throwing control 810. Illustratively, when the user clicks the throwing control 810, the virtual object performs a throwing motion, the terminal displays the trajectory indication information 820 corresponding to the current throwing direction and throwing angle in the virtual environment picture, and the user can roughly judge the drop point of the virtual throwing object from the trajectory indication information 820.
Illustratively, the throwing trajectory may be determined by either the terminal or the server, which is not limited herein. Taking determination by the server as an example, the terminal generates a corresponding throwing signal according to the throwing direction and throwing angle and transmits it to the server, and the server performs logic processing according to the throwing signal.
Step 703, displaying drop point indication information of the virtual throwing object in the virtual environment.
The drop point indication information of the virtual throwing object is determined from the trajectory indication information and is used to indicate where the virtual throwing object lands in the virtual environment. Illustratively, the drop point indication information may mark the drop point position of the first collision in the virtual environment when the virtual throwing object is thrown at the throwing angle and throwing direction corresponding to the current pre-throwing operation; in one example, if the position of the first collision is on a virtual wall, the drop point indication information is displayed on the virtual wall. Alternatively, the drop point indication information may mark the final drop point, that is, the target position in the virtual environment when the virtual throwing object is thrown at the throwing angle and throwing direction corresponding to the current pre-throwing operation. In one example, the virtual throwing object first collides with a virtual wall, rebounds off the virtual wall, falls onto the virtual ground for a second collision, and finally comes to rest at the point of the second collision; the drop point indication information is then displayed at the point of the second collision.
Step 704, in response to receiving a throwing operation, throwing the virtual throwing object to a target position in the virtual environment.
The virtual throwing object comes to rest at the target position after flying and colliding in the virtual environment. Schematically, the target position is determined through logic processing by the server: the server determines the collision point of the virtual throwing object in the virtual environment according to the throwing signal; if the collision point lies on a trigger plane, the collision point is determined to be the target position; if the collision point lies on a non-trigger plane, the virtual throwing object rebounds off the non-trigger plane, continues flying, and collides again, until it lands on a trigger plane, at which point the server determines the target position. A trigger plane is a virtual plane displayed horizontally in the virtual environment, such as the virtual ground or a virtual desktop; a non-trigger plane is a virtual plane displayed vertically or obliquely, such as a virtual wall surface. The server generates a feedback signal for the determined target position and returns it to the terminal, and the terminal displays the corresponding virtual environment picture according to the feedback signal, namely the throwing trajectory of the virtual throwing object in the virtual environment and the target position corresponding to the drop point.
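The trigger-plane / non-trigger-plane resolution above can be sketched as a walk over a precomputed collision sequence. Classifying a plane by the upward component of its surface normal, and the bounce limit, are assumptions of this sketch:

```python
def resolve_target_position(collision_points, max_bounces=5):
    """Walk the collision sequence of a thrown object.

    A horizontal surface (virtual ground, virtual desktop) is a trigger
    plane and fixes the target position; a vertical or inclined surface is
    a non-trigger plane and the object bounces on. Each entry in
    collision_points is (position, surface_normal).
    """
    for i, (position, normal) in enumerate(collision_points):
        # A surface whose normal points (almost) straight up is horizontal.
        if normal[1] > 0.99:
            return position             # trigger plane -> target position
        if i + 1 >= max_bounces:
            break                       # give up after too many rebounds
    return collision_points[-1][0]      # fall back to the last collision

# Example: first hit is a virtual wall (sideways normal), the rebound then
# lands on the virtual ground (upward normal).
hits = [((4.0, 1.2, 0.0), (-1.0, 0.0, 0.0)),
        ((3.1, 0.0, 0.4), (0.0, 1.0, 0.0))]
target = resolve_target_position(hits)
```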
Step 705, displaying a virtual obstacle at the target position. When the virtual throwing object has been thrown into the virtual environment and the corresponding thrown duration reaches the target trigger duration, the virtual obstacle is displayed at the target position.
As shown in fig. 9, a virtual obstacle 910 is displayed at the target position in the virtual environment interface 900, where the virtual obstacle 910 corresponds to a virtual wall whose placement direction is perpendicular to the current facing direction of the virtual object. That is, as shown in fig. 10, the facing direction 1020 of the virtual object is parallel to, or collinear with, the normal direction 1030 of the virtual obstacle 1010 in the virtual environment 1000. In other words, the drop point of the virtual throwing object determines the target position at which the virtual obstacle is generated, while the orientation of the virtual object determines the orientation of the virtual obstacle.
Illustratively, the size of the virtual obstacle is determined by the surrounding environment. In one example, since the space above in the virtual environment is usually open, the maximum height is generally taken for the height; if there is another obstacle above, such as a roof, the height from that obstacle to the ground is taken as the height of the virtual obstacle. The lengths on the two sides then need to be determined: schematically, rays are emitted from the target position to the left and to the right, and after the rays collide with other obstacles, the terminal obtains the longest transverse length. If this length is greater than the preset length of the virtual obstacle, the virtual obstacle is displayed directly at the preset length; if it is smaller than the preset length, this length is taken as the length of the virtual obstacle. Finally, the virtual obstacle is displayed with the determined height and length.
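The sizing rule above (cap the height at any roof, cap the length at the lateral clearance found by the left and right rays) can be sketched as follows; the parameter names stand in for actual raycast results and are assumptions of this sketch:

```python
def obstacle_dimensions(preset_height, preset_length,
                        ceiling_height=None,
                        left_clearance=10.0, right_clearance=10.0):
    """Fit the virtual obstacle into its surroundings.

    ceiling_height: distance from the ground to a roof above the target
                    position, or None if the space above is open.
    left_clearance / right_clearance: distances found by rays cast from the
                    target position until they hit other obstacles.
    """
    height = preset_height
    if ceiling_height is not None:
        # A roof above: take the height from the roof to the ground.
        height = min(preset_height, ceiling_height)
    # The longest transverse length available at the target position.
    lateral_space = left_clearance + right_clearance
    length = min(preset_length, lateral_space)
    return height, length
```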
In one example, referring to fig. 11, which shows a flow chart corresponding to the method for operating a virtual throwing object, the flow includes: controlling the virtual object to equip the virtual throwing object 1101; determining whether the throwing control is triggered 1102; if so, throwing the virtual throwing object into the virtual environment 1103; determining whether it collides with a non-trigger plane 1104; if so, rebounding and continuing to fly 1105; if not, falling at the target position, exploding, and generating a virtual obstacle 1106; determining whether the virtual obstacle is broken 1107; if so, the virtual obstacle disappears 1108; if not, continuing to resist attack operations of hostile virtual objects 1109; determining whether the attack function of the virtual obstacle is triggered 1110; if so, the virtual obstacle produces an explosion effect and causes virtual injury to hostile virtual objects within the first preset range 1111.
To sum up, in the method for operating a virtual throwing object provided by this embodiment of the present application, in a virtual environment containing a virtual object, the trajectory indication information is displayed through the pre-throwing operation, the virtual throwing object is thrown to the target position in the virtual environment through the throwing operation, and the virtual obstacle is displayed at the target position. The virtual obstacle provides an attack blocking function for the virtual object, that is, if the virtual object is located on the first side of the virtual obstacle, the virtual obstacle blocks virtual injury to the virtual object coming from the second side, thereby improving the operational diversity of the virtual throwing object and enriching its functions.
Referring to fig. 12, a block diagram of an operation device for a virtual throwing object according to an exemplary embodiment of the present application is shown. The device includes:
a display module 1210 for displaying a virtual object of a handheld virtual projectile, the virtual object being in a virtual environment;
a receiving module 1220, configured to receive a pre-throwing operation, where the pre-throwing operation is used for throwing aiming of the virtual throwing object in the virtual environment;
the display module 1210 is further configured to display, in the virtual environment, trajectory indication information of the virtual projectile based on the pre-throwing operation, where the trajectory indication information is used to indicate a flight trajectory of the virtual projectile in the virtual environment;
the display module 1210 is further configured to throw the virtual throwing object to a target location in the virtual environment in response to receiving a throwing operation;
the display module 1210 is further configured to display a virtual obstacle at the target location, the virtual obstacle being configured to block virtual injury to the virtual object from a second side in response to the virtual object being located on a first side of the virtual obstacle, wherein the first side and the second side are opposite.
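The first-side/second-side blocking rule can be illustrated with a simple opposite-sides check. The 1-D coordinate model below is an assumption for illustration only.

```python
# Hedged sketch: damage is blocked when the protected object and the attack
# source sit on opposite sides of the obstacle (first side vs. second side).

def blocks_damage(obstacle_x, protected_x, attacker_x):
    """True if the obstacle stands between the object and the attacker."""
    return (protected_x < obstacle_x) != (attacker_x < obstacle_x)
```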
In an alternative embodiment, the virtual throwing object corresponds to a target trigger duration;

the display module 1210 is further configured to display the virtual obstacle at the target location in response to the thrown duration of the virtual throwing object reaching the target trigger duration.
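The timed display trigger amounts to a simple elapsed-time check. The 3-second value below is an assumption for illustration.

```python
# Sketch of the target-trigger-duration check (duration value is assumed).

def should_display_obstacle(thrown_at, now, target_trigger_duration=3.0):
    """Display the obstacle once the thrown duration reaches the trigger duration."""
    return (now - thrown_at) >= target_trigger_duration
```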
In an alternative embodiment, the function of the virtual barrier in the virtual environment comprises an attack function;
the receiving module 1220 is further configured to receive an attack trigger signal for the virtual obstacle;
referring to fig. 13, the apparatus further includes:
a triggering module 1230, configured to trigger the attack function of the virtual obstacle in the virtual environment based on the attack trigger signal, where the attack function is used to instruct virtual damage to a virtual object within a first preset range of the virtual obstacle.
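The attack function's range check can be sketched as a distance test against every virtual object. The range and damage values, and the dict keys, are assumptions.

```python
# Sketch of the attack function: apply virtual damage to every virtual
# object within the first preset range of the obstacle.
import math

def trigger_attack(obstacle_pos, objects, first_preset_range=5.0, damage=50):
    for obj in objects:
        if math.dist(obstacle_pos, obj["pos"]) <= first_preset_range:
            obj["hp"] -= damage  # virtual damage within the first preset range
    return objects
```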
In an optional embodiment, the display module 1210 is further configured to display a control in a virtual environment interface, where the control is configured to trigger the attack function of the virtual obstacle;
the receiving module 1220 is further configured to receive a trigger operation on the control;
the device further comprises:
a generating module 1340 configured to generate the attack trigger signal for the virtual obstacle based on the triggering operation.
In an optional embodiment, the display module 1210 is further configured to display the control in response to the virtual object moving to be within a second preset range of the virtual obstacle.
In an optional embodiment, the virtual obstacle corresponds to a virtual life value;
the device further comprises:
a control module 1250, configured to reduce the virtual life value of the virtual obstacle in response to receiving an attack operation of a virtual object in the virtual environment on the virtual obstacle;
the control module 1250 is further configured to control the virtual obstacle to disappear from the virtual environment in response to the virtual life value being reduced to zero.
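The life-value behavior described above can be sketched as a small class. The initial life value and method names are illustrative assumptions.

```python
# Minimal sketch of the obstacle's virtual life value (values assumed).

class VirtualObstacle:
    def __init__(self, life=100):
        self.life = life
        self.in_environment = True

    def take_attack(self, damage):
        # Each received attack operation reduces the virtual life value.
        self.life = max(0, self.life - damage)
        if self.life == 0:
            # Life value exhausted: the obstacle disappears from the environment.
            self.in_environment = False
```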
In an optional embodiment, the virtual obstacle corresponds to a target display duration;
the control module 1250 is further configured to control the virtual obstacle to disappear from the virtual environment in response to the display duration of the virtual obstacle reaching the target display duration;
or, the triggering module 1230 is further configured to trigger the attack function of the virtual obstacle in the virtual environment in response to the display duration of the virtual obstacle reaching the target display duration.
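The two alternative timeout behaviors (disappear, or trigger the attack function) can be sketched together. The duration value and dict keys are assumptions.

```python
# Sketch of the target-display-duration timeout with its two alternative
# outcomes, as described in the text above.

def on_display_tick(obstacle, now, target_display_duration=30.0, mode="disappear"):
    if now - obstacle["shown_at"] >= target_display_duration:
        if mode == "disappear":
            obstacle["in_environment"] = False
        else:  # mode == "attack": trigger the attack function instead
            obstacle["attack_triggered"] = True
    return obstacle
```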
In an alternative embodiment, the display module 1210 further comprises:
a determining unit 1211, configured to determine drop point indication information of the virtual throwing object based on the trajectory indication information, where the drop point indication information is used to indicate the drop point condition of the virtual throwing object in the virtual environment;
the display module 1210 is further configured to display the drop point indication information of the virtual throwing object in the virtual environment.
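One way to derive a drop point from the flight trajectory is simple projectile ballistics; the patent does not specify the model, so the formula below is an illustrative assumption (ground-level throw, no drag).

```python
# Hedged sketch: horizontal landing coordinate for a throw from ground level,
# from launch speed and angle, assuming ideal parabolic flight.
import math

def drop_point(origin_x, speed, angle_deg, g=9.8):
    angle = math.radians(angle_deg)
    t_land = 2 * speed * math.sin(angle) / g   # time to return to launch height
    return origin_x + speed * math.cos(angle) * t_land
```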
In summary, the operation device for a virtual throwing object provided in the embodiment of the present application controls, in a virtual environment including a virtual object, the virtual object holding the virtual throwing object to throw it to a target position in the virtual environment, so as to display a virtual obstacle at the target position. The virtual obstacle can protect the virtual object on a first side from virtual damage coming from a second side, where the first side is opposite to the second side, thereby improving the operation diversity of the virtual throwing object and enriching its function.
It should be noted that the operation device for the virtual throwing object provided in the above embodiment is illustrated only by the division of the above functional modules. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the operation device of the virtual throwing object provided in the above embodiment belongs to the same concept as the embodiments of the operation method of the virtual throwing object; the specific implementation process is described in the method embodiments and is not repeated here.
Fig. 14 shows a block diagram of a terminal 1400 according to an exemplary embodiment of the present application. The terminal 1400 may be: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. Terminal 1400 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or by other names.
In general, terminal 1400 includes: a processor 1401, and a memory 1402.
Processor 1401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 1401 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). Processor 1401 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1401 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1402 may include one or more computer-readable storage media, which may be non-transitory. Memory 1402 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1402 is used to store at least one instruction for execution by processor 1401 to implement the operation method of the virtual throwing object provided by the method embodiments of the present application.
In some embodiments, terminal 1400 may further optionally include: a peripheral device interface 1403 and at least one peripheral device. The processor 1401, the memory 1402, and the peripheral device interface 1403 may be connected by buses or signal lines. Each peripheral device may be connected to the peripheral device interface 1403 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1404, a display 1405, a camera assembly 1406, audio circuitry 1407, a positioning assembly 1408, and a power supply 1409.
The peripheral device interface 1403 can be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1401 and the memory 1402. In some embodiments, the processor 1401, memory 1402, and peripheral interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or both of the processor 1401, the memory 1402, and the peripheral device interface 1403 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1404 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1404 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 1404 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1404 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1404 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1405 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1405 is a touch display screen, the display screen 1405 also has the ability to capture touch signals at or above the surface of the display screen 1405. The touch signal may be input to the processor 1401 for processing as a control signal. At this point, the display 1405 may also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards. In some embodiments, the display 1405 may be one, providing the front panel of the terminal 1400; in other embodiments, display 1405 may be at least two, respectively disposed on different surfaces of terminal 1400 or in a folded design; in still other embodiments, display 1405 may be a flexible display disposed on a curved surface or on a folded surface of terminal 1400. Even further, the display 1405 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display 1405 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 1406 is used to capture images or video. Optionally, camera assembly 1406 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1406 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1407 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1401 for processing or inputting the electric signals to the radio frequency circuit 1404 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of terminal 1400. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is then used to convert electrical signals from the processor 1401 or the radio frequency circuit 1404 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1407 may also include a headphone jack.
The positioning component 1408 serves to locate the current geographic position of the terminal 1400 for navigation or LBS (Location Based Service). The positioning component 1408 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
Power supply 1409 is used to power the various components of terminal 1400. The power source 1409 may be alternating current, direct current, disposable or rechargeable. When the power source 1409 comprises a rechargeable battery, the rechargeable battery can be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1400 also includes one or more sensors 1410. The one or more sensors 1410 include, but are not limited to: acceleration sensor 1411, gyroscope sensor 1412, pressure sensor 1413, fingerprint sensor 1414, optical sensor 1415, and proximity sensor 1416.
The acceleration sensor 1411 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal 1400. For example, the acceleration sensor 1411 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1401 can control the touch display 1405 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1411. The acceleration sensor 1411 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 1412 may detect a body direction and a rotation angle of the terminal 1400, and the gyro sensor 1412 and the acceleration sensor 1411 may cooperate to collect a 3D motion of the user on the terminal 1400. The processor 1401 can realize the following functions according to the data collected by the gyro sensor 1412: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 1413 may be disposed on the side bezel of terminal 1400 and/or underlying touch display 1405. When the pressure sensor 1413 is disposed on the side frame of the terminal 1400, the user's holding signal of the terminal 1400 can be detected, and the processor 1401 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1413. When the pressure sensor 1413 is disposed at the lower layer of the touch display 1405, the processor 1401 controls the operability control on the UI interface according to the pressure operation of the user on the touch display 1405. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1414 is used to collect a user's fingerprint, and the processor 1401 identifies the user from the fingerprint collected by the fingerprint sensor 1414, or the fingerprint sensor 1414 itself identifies the user from the collected fingerprint. Upon recognizing the user's identity as trusted, the processor 1401 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1414 may be disposed on the front, back, or side of the terminal 1400. When a physical button or vendor logo is provided on the terminal 1400, the fingerprint sensor 1414 may be integrated with the physical button or vendor logo.
The optical sensor 1415 is used to collect ambient light intensity. In one embodiment, processor 1401 can control the display brightness of touch display 1405 based on the ambient light intensity collected by optical sensor 1415. Specifically, when the ambient light intensity is high, the display luminance of the touch display 1405 is increased; when the ambient light intensity is low, the display brightness of the touch display 1405 is turned down. In another embodiment, the processor 1401 can also dynamically adjust the shooting parameters of the camera assembly 1406 according to the intensity of the ambient light collected by the optical sensor 1415.
Proximity sensor 1416, also known as a distance sensor, is typically disposed on the front panel of terminal 1400. The proximity sensor 1416 is used to collect the distance between the user and the front surface of the terminal 1400. In one embodiment, when proximity sensor 1416 detects that the distance between the user and the front face of terminal 1400 is gradually decreased, processor 1401 controls touch display 1405 to switch from a bright screen state to a dark screen state; when proximity sensor 1416 detects that the distance between the user and the front face of terminal 1400 is gradually increasing, processor 1401 controls touch display 1405 to switch from a breath-screen state to a bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 14 is not intended to be limiting with respect to terminal 1400 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by hardware executing program instructions; the program may be stored in a computer-readable storage medium, which may be the computer-readable storage medium contained in the memory of the above embodiments, or a separate computer-readable storage medium not incorporated in the terminal. The computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the operation method of the virtual throwing object described in any of the above embodiments.
Optionally, the computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc. The Random Access Memory may include a resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM). The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (14)

1. A method of operating a virtual projectile, the method comprising:
displaying a virtual object of a handheld virtual projectile, the virtual object being in a virtual environment;
receiving a pre-throwing operation, the pre-throwing operation being used for throwing aiming of the virtual throwing object in the virtual environment;
displaying track indication information of the virtual throwing object in the virtual environment based on the pre-throwing operation, wherein the track indication information is used for indicating the flight track of the virtual throwing object in the virtual environment;
in response to receiving a throwing operation, throwing the virtual throwing object to a target location in the virtual environment;
displaying a virtual obstacle at the target location, the virtual obstacle to block virtual injury to the virtual object from a second side in response to the virtual object being located on a first side of the virtual obstacle, wherein the first side and the second side are opposite.
2. The method of claim 1, wherein the virtual projectile corresponds to a target trigger duration;
the displaying a virtual obstacle at the target location, comprising:
displaying the virtual obstacle at the target location in response to the thrown duration of the virtual throwing object reaching the target trigger duration.
3. The method of claim 1, wherein the function of the virtual obstacle in the virtual environment comprises an attack function;
after the displaying the virtual obstacle at the target position, further comprising:
receiving an attack trigger signal for the virtual obstacle;
triggering the attack function of the virtual obstacle in the virtual environment based on the attack trigger signal, wherein the attack function is used for indicating that virtual damage is caused to a virtual object within a first preset range of the virtual obstacle.
4. The method of claim 3, wherein the receiving an attack trigger signal for the virtual obstacle comprises:
displaying a control in a virtual environment interface, wherein the control is used for triggering the attack function of the virtual barrier;
receiving a trigger operation on the control;
generating the attack trigger signal for the virtual obstacle based on the trigger operation.
6. The method of claim 4, wherein the displaying a control in the virtual environment interface comprises:
displaying the control in response to the virtual object moving to within a second preset range of the virtual obstacle.
6. The method according to any one of claims 1 to 5, wherein the virtual obstacle corresponds to a virtual life value;
after the displaying of the virtual obstacle at the target location in the virtual environment, further comprising:
in response to receiving an attack operation on the virtual obstacle by a virtual object in the virtual environment, reducing the virtual life value of the virtual obstacle;
controlling the virtual obstacle to disappear from the virtual environment in response to the virtual life value being reduced to zero.
7. The method of claim 3, wherein the virtual obstacle corresponds to a targeted display duration;
after the displaying the virtual obstacle at the target position, further comprising:
in response to the display duration of the virtual obstacle reaching the target display duration, controlling the virtual obstacle to disappear from the virtual environment;
or, in response to the display duration of the virtual obstacle reaching the target display duration, triggering the attack function of the virtual obstacle in the virtual environment.
8. The method of claim 1, wherein after the displaying the trajectory indication information of the virtual throwing object in the virtual environment based on the pre-throwing operation, the method further comprises:
determining the drop point indication information of the virtual throwing object based on the track indication information, wherein the drop point indication information is used for indicating the drop point condition of the virtual throwing object in the virtual environment;
displaying drop point indication information of the virtual throwing object in the virtual environment.
9. An apparatus for operating a virtual projectile, the apparatus comprising:
a display module for displaying a virtual object of a handheld virtual projectile, the virtual object being in a virtual environment;
a receiving module for receiving a pre-throwing operation, the pre-throwing operation being used for throwing aiming of the virtual throwing object in the virtual environment;
the display module is further configured to display track indication information of the virtual throwing object in the virtual environment based on the pre-throwing operation, where the track indication information is used to indicate a flight track of the virtual throwing object in the virtual environment;
the display module is further used for responding to the received throwing operation, throwing the virtual throwing object to a target position in the virtual environment;
the display module is further configured to display a virtual obstacle at the target location, the virtual obstacle being configured to block virtual injury to the virtual object from a second side in response to the virtual object being located on a first side of the virtual obstacle, wherein the first side and the second side are opposite.
10. The apparatus of claim 9, wherein the virtual projectile corresponds to a target trigger duration;
the display module is further configured to display the virtual obstacle at the target location in response to the thrown duration of the virtual throwing object reaching the target trigger duration.
11. The apparatus of claim 9, wherein the function of the virtual obstacle in the virtual environment comprises an attack function;
the receiving module is further configured to receive an attack trigger signal for the virtual obstacle;
the device further comprises:
the triggering module is used for triggering the attack function of the virtual obstacle in the virtual environment based on the attack triggering signal, and the attack function is used for indicating that virtual damage is caused to a virtual object in a first preset range of the virtual obstacle.
12. The apparatus of claim 11,
the display module is further configured to display a control in a virtual environment interface, where the control is used to trigger the attack function of the virtual obstacle;
the receiving module is further configured to receive a trigger operation on the control;
the device further comprises:
a generating module for generating the attack trigger signal for the virtual obstacle based on the trigger operation.
13. A computer device comprising a processor and a memory, said memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, which is loaded and executed by said processor to implement a method of operating a virtual projectile in accordance with any one of claims 1 to 8.
14. A computer readable storage medium having stored therein at least one program code, the program code being loaded and executed by a processor to implement a method of operating a virtual projectile in accordance with any one of claims 1 to 8.
CN202110227863.2A 2021-03-01 2021-03-01 Virtual throwing object operation method, device, equipment and medium Active CN112933601B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110227863.2A CN112933601B (en) 2021-03-01 2021-03-01 Virtual throwing object operation method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN112933601A true CN112933601A (en) 2021-06-11
CN112933601B CN112933601B (en) 2023-05-16

Family

ID=76247043

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110227863.2A Active CN112933601B (en) 2021-03-01 2021-03-01 Virtual throwing object operation method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN112933601B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113546424A (en) * 2021-08-04 2021-10-26 网易(杭州)网络有限公司 Virtual resource use control method and device, computer equipment and storage medium
CN113750532A (en) * 2021-09-24 2021-12-07 腾讯科技(深圳)有限公司 Track display method and device, storage medium and electronic equipment
CN114939275A (en) * 2022-05-24 2022-08-26 北京字跳网络技术有限公司 Object interaction method, device, equipment and storage medium
WO2023273605A1 (en) * 2021-06-30 2023-01-05 北京字跳网络技术有限公司 Virtual prop control method and apparatus, and device and computer-readable storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111443857A (en) * 2020-03-12 2020-07-24 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic equipment

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023273605A1 (en) * 2021-06-30 2023-01-05 北京字跳网络技术有限公司 Virtual prop control method and apparatus, and device and computer-readable storage medium
CN113546424A (en) * 2021-08-04 2021-10-26 网易(杭州)网络有限公司 Virtual resource use control method and device, computer equipment and storage medium
CN113546424B (en) * 2021-08-04 2024-07-02 网易(杭州)网络有限公司 Virtual resource use control method, device, computer equipment and storage medium
CN113750532A (en) * 2021-09-24 2021-12-07 腾讯科技(深圳)有限公司 Track display method and device, storage medium and electronic equipment
CN113750532B (en) * 2021-09-24 2023-07-14 腾讯科技(深圳)有限公司 Track display method and device, storage medium and electronic equipment
CN114939275A (en) * 2022-05-24 2022-08-26 北京字跳网络技术有限公司 Object interaction method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN112933601B (en) 2023-05-16

Similar Documents

Publication Publication Date Title
CN110427111B (en) Operation method, device, equipment and storage medium of virtual prop in virtual environment
CN111282275B (en) Method, device, equipment and storage medium for displaying collision traces in virtual scene
CN110721468B (en) Interactive property control method, device, terminal and storage medium
CN110917619B (en) Interactive property control method, device, terminal and storage medium
CN111475573B (en) Data synchronization method and device, electronic equipment and storage medium
CN111589149B (en) Using method, device, equipment and storage medium of virtual prop
CN112933601B (en) Virtual throwing object operation method, device, equipment and medium
CN111228809A (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN111001159B (en) Virtual item control method, device, equipment and storage medium in virtual scene
CN111265873A (en) Using method, device, equipment and storage medium of virtual prop
CN111475029B (en) Operation method, device, equipment and storage medium of virtual prop
CN112870715B (en) Virtual item putting method, device, terminal and storage medium
WO2021147496A1 (en) Method and apparatus for using virtual prop, and device and storage medium
CN111589150A (en) Control method and device of virtual prop, electronic equipment and storage medium
CN112057857B (en) Interactive property processing method, device, terminal and storage medium
CN112402964B (en) Using method, device, equipment and storage medium of virtual prop
CN111760284A (en) Virtual item control method, device, equipment and storage medium
CN112138384A (en) Using method, device, terminal and storage medium of virtual throwing prop
CN112717410B (en) Virtual object control method and device, computer equipment and storage medium
CN113041622A (en) Virtual throwing object throwing method in virtual environment, terminal and storage medium
CN113713382A (en) Virtual prop control method and device, computer equipment and storage medium
CN111249726B (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN113713383A (en) Throwing prop control method and device, computer equipment and storage medium
CN111921190A (en) Method, device, terminal and storage medium for equipping props of virtual objects
CN112316430B (en) Prop using method, device, equipment and medium based on virtual environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40045962
Country of ref document: HK

GR01 Patent grant