CN115920389A - Virtual object action processing method and device and computer equipment - Google Patents

Virtual object action processing method and device and computer equipment

Info

Publication number
CN115920389A
Authority
CN
China
Prior art keywords
animation
action
weapon
motion
virtual object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211717665.5A
Other languages
Chinese (zh)
Inventor
王子宜
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202211717665.5A
Publication of CN115920389A
Legal status: Pending


Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The application provides a virtual object action processing method and device and a computer device, belonging to the technical field of electronic games. The method comprises: in response to a target virtual object equipping a first virtual weapon, determining a first weapon type corresponding to the first virtual weapon; determining, based on the first virtual weapon and the first weapon type, a first gesture action and a first generic animation required to control the target virtual object to perform a first game behavior, from a pre-configured preset gesture action library and preset generic animation library; and generating a first action animation corresponding to the first game behavior based on the first gesture action and the first generic animation. The method and device reduce resource consumption when a virtual character equipped with different weapons needs to exhibit different shooting actions.

Description

Virtual object action processing method and device and computer equipment
Technical Field
The application relates to the technical field of electronic games, and in particular to a virtual object action processing method and device and a computer device.
Background
With the development of computer technology, electronic games have become an increasingly common part of daily life, for example third-person shooter (TPS) games. Typically, a player can control a virtual object in a TPS game to perform various actions, and the virtual object can be equipped with various virtual ranged weapons.
In the related art, in TPS games currently online, when a player controls a virtual object to shoot, the virtual object performs a corresponding shooting action. Generally, in response to a shooting instruction input by the player, the virtual object repeatedly plays a fixed shooting animation according to parameters of the currently held weapon, such as rate of fire, shot start time, shot reset time, and shot interval, to display the shooting action of the virtual character on the graphical user interface.
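As a minimal illustration of this related-art behaviour, the fixed shooting clip is simply replayed on a timer driven by the held weapon's parameters. The class, function, and parameter names below are hypothetical, chosen only to mirror the parameters named above.

```python
from dataclasses import dataclass

@dataclass
class WeaponParams:
    start_time: float     # delay before the first shot is played
    shot_interval: float  # seconds between replays of the fixed clip

def shot_times(params: WeaponParams, shots: int) -> list:
    """Times at which the fixed shooting animation is (re)played."""
    return [params.start_time + i * params.shot_interval for i in range(shots)]
```

Whatever weapon is held, the same fixed clip is replayed; only the timing changes, which is why a distinct clip must be authored per weapon to vary the action itself.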
However, a TPS game typically contains a great many weapon models. If a different shooting action is to be displayed for each weapon a virtual character may equip, a separate shooting action must be produced for every weapon. The related-art scheme therefore consumes substantial resources when virtual characters equipped with different weapons need to exhibit different shooting actions.
Disclosure of Invention
The application aims to provide a virtual object action processing method and device and a computer device, which can reduce resource consumption when a virtual character equipped with different weapons needs to exhibit different shooting actions.
The embodiment of the application is realized as follows:
in a first aspect of the embodiments of the present application, a method for processing a virtual object action is provided, where the method includes:
in response to a target virtual object equipping a first virtual weapon, determining a first weapon type corresponding to the first virtual weapon;
determining, based on the first virtual weapon and the first weapon type, a first gesture action and a first generic animation required to control the target virtual object to perform a first game behavior, from a pre-configured preset gesture action library and preset generic animation library; wherein the first gesture action is determined by the first virtual weapon, the first generic animation is determined by the first weapon type, and the first game behavior comprises: an aiming behavior, a single shooting behavior, and a continuous shooting behavior;
generating a first action animation corresponding to the first game behavior based on the first gesture action and the first generic animation; wherein the first action animation is used to show the action effect of the target virtual object when it performs the first game behavior with the first virtual weapon, and the first action animation comprises: an aiming behavior animation, a single shooting behavior animation, and a continuous shooting behavior animation.
In a second aspect of the embodiments of the present application, there is provided a virtual object action processing apparatus, including:
a determining module, configured to determine, in response to a target virtual object equipping a first virtual weapon, a first weapon type corresponding to the first virtual weapon;
an acquisition determining module, configured to determine, based on the first virtual weapon and the first weapon type, a first gesture action and a first generic animation required to control the target virtual object to perform a first game behavior, from a pre-configured preset gesture action library and preset generic animation library; wherein the first gesture action is determined by the first virtual weapon, the first generic animation is determined by the first weapon type, and the first game behavior comprises: an aiming behavior, a single shooting behavior, and a continuous shooting behavior;
a generating module, configured to generate a first action animation corresponding to the first game behavior based on the first gesture action and the first generic animation; wherein the first action animation is used to show the action effect of the target virtual object when it performs the first game behavior with the first virtual weapon, and the first action animation comprises: an aiming behavior animation, a single shooting behavior animation, and a continuous shooting behavior animation.
In a third aspect of the embodiments of the present application, there is provided a computer device, where the computer device includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the computer program, when executed by the processor, implements the virtual object action processing method according to the first aspect.
In a fourth aspect of the embodiments of the present application, a computer-readable storage medium is provided, on which a computer program is stored; when executed by a processor, the computer program implements the virtual object action processing method according to the first aspect.
The beneficial effects of the embodiment of the application include:
According to the virtual object action processing method and device and the computer device provided by the embodiments of the application, in response to a target virtual object equipping a first virtual weapon, a first weapon type corresponding to the first virtual weapon is determined. Based on the first virtual weapon and the first weapon type, a first gesture action and a first generic animation required to control the target virtual object to perform the first game behavior are determined from the pre-configured preset gesture action library and preset generic animation library.
Each first generic animation is shared by one type of virtual weapon: only one set of generic animations needs to be designed per weapon type, rather than a different set for each weapon model. Consequently, when the target virtual object's actions are subsequently processed, only the generic animations corresponding to each type need to be stored and called, which greatly reduces the resources occupied by action processing.
Based on the first gesture action and the first generic animation, a first action animation corresponding to the first game behavior is generated.
Because each first generic animation corresponds to a weapon type while each preset gesture action corresponds to a specific weapon model, every first action animation obtained by adjusting a first generic animation according to the first gesture action of the equipped weapon is an animation specific to that weapon model.
That is, if the target virtual object equips different first virtual weapons, the first generic animations or the first gesture actions used to determine its first action animations differ, which ensures that the resulting first action animations also differ. A distinct first action animation can therefore be obtained for each weapon using only the generic animation of the weapon's type and the gesture action of the weapon's model, without configuring a separate generic animation for every model.
It will be appreciated that, because each first generic animation corresponds to all virtual weapons of one type, the application does not need different generic animations for different models within the same type; instead, the first action animation of a first virtual weapon can be determined from a first generic animation shared by multiple models of that type.
This improves the reuse rate of each first generic animation: from one first generic animation applicable to multiple weapon models, different first action animations are still obtained when the virtual character equips first virtual weapons of different models within the same type, thereby reducing resource consumption when the virtual character equips different weapons and needs to exhibit different shooting actions.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly described below. The following drawings illustrate only some embodiments of the present application and should therefore not be considered limiting of the scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a flowchart of a first virtual object action processing method according to an embodiment of the present application;
fig. 2 is a flowchart of a second virtual object action processing method according to an embodiment of the present application;
fig. 3 is a flowchart of a third virtual object action processing method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a first shooting effect provided by an embodiment of the present application;
fig. 5 is a flowchart of a fourth virtual object action processing method according to the embodiment of the present application;
FIG. 6 is a schematic diagram of a second shooting effect provided by an embodiment of the present application;
fig. 7 is a schematic structural diagram of a virtual object action processing apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of another virtual object action processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. The described embodiments are some, but not all, embodiments of the present application. The components of the embodiments, as generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application but merely represents selected embodiments of the application. All other embodiments derived by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined or explained in subsequent figures.
In the description of the present application, it should be noted that the terms "first", "second", "third", etc. are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
In the related art, in TPS games currently online, when a player controls a virtual object to shoot, the virtual object performs a corresponding shooting action. Generally, in response to a shooting instruction input by the player, the virtual object repeatedly plays a fixed shooting animation according to parameters of the currently held weapon, such as rate of fire, shot start time, shot reset time, and shot interval, to display the shooting action of the virtual character on the graphical user interface. However, a TPS game typically contains a great many weapon models, and if a different shooting action is to be displayed for each weapon a virtual character may equip, a separate shooting action must be produced for every weapon. The related-art scheme therefore consumes substantial resources when virtual characters equipped with different weapons need to exhibit different shooting actions.
Therefore, an embodiment of the present application provides a virtual object action processing method: in response to the virtual object equipping a target virtual weapon, at least one first generic animation corresponding to the weapon type of the target virtual weapon is obtained, together with a preset gesture action matching the target virtual weapon; candidate animations of the virtual object are determined from each first generic animation and the preset gesture action; and, in response to an action control instruction for the virtual object, a target action animation of the virtual object is determined from the candidate animations and the instruction and then executed. This reduces resource consumption when a virtual character equipped with different weapons needs to exhibit different shooting actions.
The virtual object action processing method in the embodiments of the present application may be executed on a terminal device or a server. The terminal device may be a local terminal device. When the method runs on a server, it can be implemented and executed based on a cloud interaction system, which comprises the server and a client device.
In an optional embodiment, various cloud applications, such as cloud games, may run under the cloud interaction system. Taking cloud games as an example, a cloud game is a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program is separated from the body that presents the game picture: storage and execution of the game are completed on a cloud game server, while the client device receives and sends data and presents the game picture. For example, the client device may be a display device with data transmission functions close to the user side, such as a mobile terminal, a television, a computer, or a handheld computer, while the terminal device performing the information processing is the cloud game server. During play, the player operates the client device to send operation instructions to the cloud game server; the server runs the game according to those instructions, encodes and compresses data such as the game pictures, and returns the data over the network to the client device, which decodes it and outputs the game pictures.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and presents the game screen. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on an electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways: for example, the interface may be rendered on the display screen of the terminal or provided through holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game screen, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
In a possible implementation, an embodiment of the present invention provides a virtual object action processing method in which a graphical user interface is provided through a terminal device, where the terminal device may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system.
The embodiments of the present application are described below taking as an example a virtual object action processing method applied to a terminal game. The embodiments are not, however, limited to virtual object action processing in terminal games.
The virtual object action processing method provided in the embodiments of the present application is explained in detail below.
Fig. 1 is a flowchart of a virtual object action processing method provided in the present application, where the method may be applied to a computer device, and the computer device may be the foregoing terminal device or server. Referring to fig. 1, an embodiment of the present application provides a virtual object action processing method, including:
step 1001: in response to a target virtual object equipping a first virtual weapon, a first weapon type corresponding to the first virtual weapon is determined.
Optionally, the target virtual object may be a virtual object manipulated by the player in the virtual scene or the electronic game.
The first virtual weapon may be any virtual ranged weapon in the electronic game, such as a virtual gun.
The first weapon type corresponding to the first virtual weapon may include: rifle, submachine gun, light machine gun, shotgun, sniper rifle, pistol, and the like.
If the first virtual weapon is a virtual gun, the model of the first virtual weapon may include various models such as "AK47", "SCAR", "M16A4", "M1911", which is not limited in this application.
The firearm models "AK47", "SCAR", and "M16A4" are all rifle-type virtual weapons, while the firearm model "M1911" is a pistol-type virtual weapon.
Notably, when the target virtual object equips the first virtual weapon, the weapon type of the currently equipped first virtual weapon can be determined to facilitate the subsequent operations.
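Step 1001 amounts to a model-to-type lookup. The sketch below uses the firearm models named above; the table layout and function name are illustrative assumptions, not part of the application.

```python
# Hypothetical model-to-type table; the models are those named in the
# description, the dictionary itself is an illustration.
WEAPON_TYPES = {
    "AK47": "rifle",
    "SCAR": "rifle",
    "M16A4": "rifle",
    "M1911": "pistol",
}

def first_weapon_type(weapon_model: str) -> str:
    """Determine the first weapon type corresponding to an equipped weapon."""
    return WEAPON_TYPES[weapon_model]
```

For example, equipping an "AK47" resolves to the rifle type, while an "M1911" resolves to the pistol type, and all subsequent generic-animation lookups are keyed on that type.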
Step 1002: and determining a first gesture action and a first general animation required for controlling the target virtual object to execute the first game behavior from a preset gesture action library and a preset general animation library which are configured in advance based on the first virtual weapon and the first weapon type.
Optionally, the preset gesture action library and the preset generic animation library may be configured in advance by the relevant technical personnel.
The preset gesture action library may store gesture actions corresponding to virtual weapons of various models. A gesture action is a single-frame POSE, and virtual weapons of different models correspond to different gesture actions.
The preset generic animation library may store generic animations corresponding to the various weapon types; virtual weapons of different types correspond to different generic animations.
Optionally, the first game behavior may be any of various game behaviors that the virtual character can trigger or perform while equipped with a virtual weapon. The first game behavior includes at least one of: an aiming behavior, a single shooting behavior, and a continuous shooting behavior.
Generally, when the target virtual object executes the first game behavior, a corresponding effect is triggered in the game.
For example, if the first game behavior is the single shooting behavior, then when the target virtual object performs it, the effect of the first virtual weapon equipped by the target virtual object firing one round of ammunition is triggered in the game. As another example, if the first game behavior is the continuous shooting behavior, performing it triggers the effect of the equipped first virtual weapon continuously firing multiple rounds. As a further example, if the first game behavior is the aiming behavior, performing it triggers the effect of the equipped first virtual weapon entering the aiming state.
The first gesture action is determined by the first virtual weapon: it is the gesture action stored in the preset gesture action library that corresponds to the model of the first virtual weapon.
Optionally, the first gesture action may be a single-frame action, i.e. a single-frame POSE, designed by the relevant technical personnel for the specific model of the first virtual weapon. In general, the first gesture action may be one frame of the static posture held by the target virtual object when aiming, but not firing, the equipped first virtual weapon.
In addition, the relevant technical personnel may design a gesture action in advance for each weapon model in the electronic game and store it in the preset gesture action library. The first gesture action may specify the holding posture with which the target virtual object grips the first virtual weapon and the whole-body standing posture of the target virtual object while the weapon is equipped.
The first generic animation is determined by the first weapon type: it is the generic animation stored in the preset generic animation library that corresponds to the type of the first virtual weapon.
Optionally, each first generic animation may be an animation designed in advance by a person skilled in the art for the corresponding weapon type. Each first generic animation may include an action sequence, which may contain multiple frames of the same or different actions. Specifically, the first generic animation for any weapon type may be designed by selecting one firearm model of that type as the virtual weapon equipped by the target virtual object.
Each first generic animation may include: an aiming behavior animation, a single shooting behavior animation, and a continuous shooting behavior animation.
The aiming behavior animation may refer to the animation played when the target virtual object aims an equipped virtual weapon without firing. The single shooting behavior animation may refer to the animation played when the target virtual object fires once, or fires a single shot, with an equipped virtual weapon. The continuous shooting behavior animation may refer to the animation played when the target virtual object fires multiple times, or fires continuously, with an equipped virtual weapon.
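The two libraries described above could be laid out as follows. This is a sketch under the assumption that poses are keyed by weapon model and generic animations by weapon type; every key and value is a placeholder, not content from the application.

```python
# Pose library: one single-frame POSE per weapon model (illustrative keys).
PRESET_POSE_LIBRARY = {
    "AK47": "pose_AK47",
    "M1911": "pose_M1911",
}

# Generic animation library: one set of behaviour animations per weapon type.
PRESET_GENERIC_ANIMATION_LIBRARY = {
    "rifle": {"aim": "rifle_aim", "single_shot": "rifle_single",
              "continuous_shot": "rifle_burst"},
    "pistol": {"aim": "pistol_aim", "single_shot": "pistol_single",
               "continuous_shot": "pistol_burst"},
}

def look_up(weapon_model: str, weapon_type: str):
    """Step 1002: fetch the first gesture action and first generic animations."""
    pose = PRESET_POSE_LIBRARY[weapon_model]
    animations = PRESET_GENERIC_ANIMATION_LIBRARY[weapon_type]
    return pose, animations
```

Note the asymmetry that carries the resource saving: the pose library grows with the number of weapon models, while the animation library grows only with the number of weapon types.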
It should be noted that each first generic animation is shared by one type of virtual weapon: only one set of generic animations needs to be designed per weapon type, rather than a different set for each weapon model. Consequently, when the target virtual object's actions are subsequently processed, only the generic animations corresponding to each type need to be stored and called, which greatly reduces the resources occupied by action processing.
Step 1003: based on the first gesture motion and the first generic animation, a first motion animation corresponding to the first game behavior is generated.
Optionally, each first action animation may refer to an animation that the target virtual object needs to exhibit when performing the corresponding game behavior while equipped with the first virtual weapon.
The first action animation is used specifically to show the action effect of the target virtual object when it performs the first game behavior with the first virtual weapon.
Each first action animation includes an aiming behavior animation, a single shooting behavior animation, and/or a continuous shooting behavior animation.
Generally, when the target virtual object performs the aiming behavior, i.e. when the aiming behavior animation needs to be shown, the target virtual object raises the first virtual weapon into the aiming state while performing a corresponding idle action, such as a breathing motion during aiming.
When the target virtual object performs the single shooting behavior, i.e. when the single shooting behavior animation needs to be shown, the target virtual object performs one shooting action with the first virtual weapon. When it performs the continuous shooting behavior, i.e. when the continuous shooting behavior animation needs to be shown, the target virtual object performs multiple consecutive shooting actions with the first virtual weapon.
It is noted that each first action animation may be a set of dynamic actions adjusted according to the first gesture action. Each first action animation may take the form of an animation or an action sequence, or any other possible form, which is not limited in the embodiments of the present application.
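One way to realise this adjustment can be sketched as follows. This is only an illustration, assuming joint rotations are plain numbers and the single-frame pose is applied as a per-joint offset to every frame of the generic animation; a real engine would blend per-bone quaternions rather than add scalars.

```python
def generate_action_animation(pose: dict, generic_frames: list) -> list:
    """Adjust each frame of a type-level generic animation by the
    model-specific single-frame pose, yielding a model-specific animation."""
    adjusted = []
    for frame in generic_frames:
        joints = set(frame) | set(pose)
        adjusted.append({j: frame.get(j, 0.0) + pose.get(j, 0.0)
                         for j in joints})
    return adjusted
```

Because the pose is the only model-specific input, two rifles sharing the same generic frames still produce different first action animations whenever their poses differ.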
It is worth mentioning that each first generic animation corresponds to a weapon type, while each preset gesture action corresponds to a specific weapon model. Every first action animation obtained by adjusting a first generic animation according to the first gesture action of the equipped first virtual weapon is therefore an animation specific to that weapon model.
That is, if the target virtual object equips different first virtual weapons, the first generic animations or the first gesture actions used to determine its first action animations differ, which ensures that the resulting first action animations also differ. A distinct first action animation can therefore be obtained for each weapon using only the generic animation of the weapon's type and the gesture action of the weapon's model, without configuring a separate generic animation for every model.
In addition, the first game behavior can refer to any of various game behaviors that the virtual character may trigger or perform when equipped with a virtual weapon, and each first generic animation likewise includes the animations that the virtual character may need to exhibit while so equipped.
With this scheme, different generic animations need not be configured for virtual weapons of different models; instead, first generic animations shared across models are configured per weapon type, so the first action animations can be generated or produced with fewer resources.
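A back-of-envelope illustration of the saving, with hypothetical counts: per-model clips require models × behaviors full animations, whereas the scheme above requires types × behaviors generic clips plus only one single-frame pose per model.

```python
def assets_per_model(models: int, behaviors: int) -> int:
    """Related art: one full animation per weapon model per behavior."""
    return models * behaviors

def assets_with_generic_library(models: int, types: int, behaviors: int) -> int:
    """This scheme: generic clips per weapon type plus one pose per model."""
    return types * behaviors + models
```

With, say, 100 weapon models, 6 weapon types, and 3 behaviors, the related art needs 300 full animations while this scheme needs 18 generic clips plus 100 single-frame poses, and a single-frame pose is far cheaper to author and store than a full clip.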
In an embodiment of the present application, in response to a target virtual object being equipped with a first virtual weapon, a first weapon type corresponding to the first virtual weapon is determined. And determining a first gesture action and a first general animation required for controlling the target virtual object to execute the first game behavior from a preset gesture action library and a preset general animation library which are configured in advance based on the first virtual weapon and the first weapon type.
Since each first universal animation is a universal animation corresponding to one type of virtual weapon, that is, only one set of first universal animation needs to be designed for the same type of virtual weapon, and a different set of first universal animation does not need to be designed for each type of virtual weapon. Therefore, when the action processing of the target virtual object is subsequently performed, only the first general animation corresponding to each type needs to be stored and called, so that the resources occupied by the action processing of the target virtual object can be greatly reduced.
Based on the first gesture motion and the first generic animation, a first motion animation corresponding to the first game behavior is generated.
Since each first generic animation is a generic animation corresponding to one type of virtual weapon, and the preset gesture motion is a single-frame motion corresponding to each model of virtual weapon, each first action animation obtained by adjusting a first generic animation according to the first gesture motion corresponding to the target virtual weapon is an animation corresponding to the model of the target virtual weapon.
That is, if the first virtual weapons equipped by the target virtual object differ, the first generic animations or the first gesture motions used to determine the first action animations differ, which ensures that the resulting first action animations differ. Consequently, a first action animation corresponding to the model of the first virtual weapon can be obtained using only the first generic animations corresponding to the type of the first virtual weapon and the first gesture motion corresponding to the model of the first virtual weapon, without configuring a separate generic animation for each weapon model.
It will be appreciated that since each first generic animation may be an animation shared by all virtual weapons of the same type, the present application does not need to design different generic animations for different models within one type; rather, the first action animation of a first virtual weapon can be determined from a first generic animation that is common to a plurality of different models of the same type.

Therefore, the reuse rate of each first generic animation can be improved: different first action animations are obtained when the virtual character is equipped with first virtual weapons of different models within the same type, achieving the effect of reducing resource consumption when the virtual character is equipped with different weapons and needs to present different shooting actions.
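The type/model factoring described above can be sketched as follows. All names, data structures, and the simple per-joint offset combination are illustrative assumptions, not taken from the patent: generic animations are authored once per weapon type as per-frame joint offsets, a single-frame gesture pose is authored per weapon model, and their combination yields a per-model action animation.

```python
# One generic animation set per weapon *type*: each animation is a list of
# per-frame joint offsets applied on top of the gesture pose (hypothetical data).
GENERIC_ANIMATIONS = {
    "rifle": {
        "single_shot": [{"arm": 0.0}, {"arm": 5.0}, {"arm": 2.0}],
        "aim":         [{"arm": 1.0}],
    },
}

# One single-frame gesture pose per weapon *model* (joint -> angle).
GESTURE_POSES = {
    "rifle_a": {"arm": 30.0},
    "rifle_b": {"arm": 40.0},
}

def build_action_animations(model, weapon_type):
    """Combine the model's gesture pose with the type's shared generic animations."""
    gesture = GESTURE_POSES[model]
    result = {}
    for behavior, frames in GENERIC_ANIMATIONS[weapon_type].items():
        result[behavior] = [
            {joint: gesture[joint] + offset.get(joint, 0.0) for joint in gesture}
            for offset in frames
        ]
    return result
```

Because only the one-frame gesture pose varies per model, two models of the same type share the generic library yet still produce distinct action animations.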
In one possible implementation, referring to fig. 2, the method further comprises:
step 1004: and receiving an action control instruction aiming at the target virtual object, and determining a target game behavior corresponding to the action control instruction and a target virtual weapon equipped by the target virtual object.
Alternatively, the motion control instruction may be an instruction input by a user through a corresponding input device for controlling a target game behavior executed by the target virtual object.
For example, the motion control command may include a shooting command, an aiming direction adjustment command, and the like. If the motion control command is a shooting command, the target game behavior may be a single shooting behavior or a continuous shooting behavior; if it is an aiming command, the target game behavior may be an aiming behavior. This is not limited in the embodiments of the present application.

The target game behavior is one of the first game behaviors.
The target virtual weapon is a virtual weapon equipped with the target virtual object in response to the motion control command.
It is noted that since the motion control command is a command for controlling the target virtual object to execute the target game behavior, the target game behavior that the target virtual object currently needs to execute can be accurately determined from the command. Therefore, the action animation to be displayed when the target virtual object executes the target game behavior can subsequently be determined accurately from the target game behavior indicated by the motion control command and the target virtual weapon currently equipped by the target virtual object.
Step 1005: and in response to the target virtual weapon being the first virtual weapon and the target game behavior being the first game behavior, controlling the target virtual object to execute the first game behavior, and rendering and displaying a first action animation corresponding to the first game behavior.
Optionally, in response to the target virtual weapon being the first virtual weapon and the target gaming activity being the first gaming activity, then a first action animation needs to be determined that the target virtual object needs to exhibit when armed with a first virtual weapon and executing the first gaming activity. In this way, it can be ensured that the motion animation controlling the presentation of the target virtual object matches the motion control instructions and the currently equipped virtual weapon.
The first action animation corresponds to the target game behavior. Illustratively, if the target game behavior is a single shooting behavior among the first game behaviors, the corresponding first action animation is a single shooting action animation; if it is a continuous shooting behavior, the corresponding first action animation is a continuous shooting action animation; and if it is an aiming behavior, the corresponding first action animation is an aiming action animation.
And, the first action animation corresponding to the first game behavior that is finally rendered for presentation is determined according to the first virtual weapon that the target virtual object is currently equipped with.
When the first action animation is displayed in a rendering mode, the target virtual object can be controlled to adjust attributes such as the posture, the action and the orientation of the target virtual object according to the first action animation, and therefore the purposes of responding to the action control instruction and displaying the corresponding animation can be achieved.
Notably, since the first game behavior is determined according to the motion control command, the first action animation that is finally rendered is determined according to both the first game behavior and the first virtual weapon. Thus, when the target virtual object is equipped with the first virtual weapon, which of the first game behaviors the target virtual object currently needs to execute can be accurately determined from the motion control command, so the target virtual object can accurately execute the target game behavior while displaying the corresponding target action animation. This ensures that the executed first game behavior matches the displayed first action animation, avoiding the problem of the executed behavior and the displayed animation being unsynchronized or mismatched.

It is worth noting that since each first action animation corresponds to the model of the first virtual weapon, the first action animation finally presented for the first game behavior corresponds to the model of the target virtual weapon. That is, when the models of the target virtual weapons equipped by the target virtual object differ, a different action animation is rendered even if the motion control command input by the user is the same.
In this way, it is ensured that the target virtual object can perform different target motion animations according to the motion control commands when the target virtual object is equipped with different models of virtual weapons.
In one possible implementation, referring to fig. 3, generating a first action animation corresponding to the first game behavior based on the first gesture action and the first generic animation includes:
step 1006: dynamic gesture information between each first generic animation and the first gesture action is determined.
Optionally, the dynamic gesture information may be used to characterize difference information between each first generic animation and the first gesture motion. The dynamic pose information may specifically further include parameters for adjusting each first generic animation to an animation corresponding to the model of the first virtual weapon.
It is noted that since the first gesture motion is a single-frame motion corresponding to the model of the first virtual weapon, whereas each first generic animation may comprise multiple frames of different motions and corresponds to the type of the first virtual weapon, it is necessary to determine the difference information between the first gesture motion and each first generic animation, so that the parts of each frame of each first generic animation that differ from the first gesture motion can be determined.
Step 1007: each first motion animation is determined according to each dynamic gesture information and the first gesture motion.
By determining the dynamic gesture information between the first gesture motion corresponding to the model of the first virtual weapon and each first generic animation corresponding to the type of the first virtual weapon, the first gesture motion can be adjusted according to each piece of dynamic gesture information, thereby obtaining the first action animations applicable to the model of the first virtual weapon. For example, the first gesture motion may be adjusted using one piece of dynamic gesture information at a time, each adjustment yielding one first action animation. That is, if there are 10 pieces of dynamic gesture information, adjusting the first gesture motion with each of them in turn yields 10 first action animations.

In this way, because the first gesture motion differs per weapon model while the first generic animations are shared across the type, different first action animations are obtained when the virtual character is equipped with virtual weapons of different models of the same type.
In one possible implementation, determining dynamic gesture information between each first generic animation and the first gesture action includes:
and comparing each frame of motion in each first general animation with the first posture motion to obtain the motion posture of each first general animation.
Alternatively, each frame of action in each first general animation may refer to an action included in each frame of animation obtained by decomposing each first general animation frame by frame.
Optionally, the motion gesture is a partial animation of each frame motion of the first general animation, which is different from the first gesture motion. That is, the motion gesture may be a different dynamic motion from the first gesture motion among the frame motions of the first general animation.
For example, since multiple frame motions may be included in each first general animation, each frame motion in each first general animation may be compared with the first gesture motion, and then the motion gesture between each frame motion in each first general animation and the first gesture motion may be obtained.
Therefore, the part of each frame of motion in each first general animation, which is different from the first gesture motion, can be accurately determined, and further the part of the animation, which is different from the first gesture motion, of each first general animation can be determined.
Each motion posture is used as the dynamic posture information.
In this way, dynamic gesture information representing information of a difference between each first generic animation and the first gesture motion may be obtained.
In one possible way, any one of the first generic animations may be denoted as B and the preset gesture motion as A; the motion gesture of each first generic animation may then be determined by calling a corresponding comparison algorithm to compute B - A.
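A minimal numeric sketch of the B - A comparison described above, with poses modeled as per-joint angle dictionaries (the representation is an assumption for illustration): subtracting the gesture pose A from each frame B of a generic animation yields the per-frame motion gesture, i.e. the dynamic gesture information.

```python
def motion_gestures(generic_animation, gesture_pose):
    """Per-frame difference (B - A) between a generic animation's frames
    and the single-frame gesture pose; each pose is a joint -> angle dict."""
    return [
        {joint: frame[joint] - gesture_pose[joint] for joint in frame}
        for frame in generic_animation
    ]
```

The resulting list of offset poses is exactly what is later superimposed back onto a different model's gesture pose to produce that model's action animation.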
In one possible implementation, determining each first motion animation according to each dynamic gesture information and the first gesture motion includes:
and performing superposition processing on each dynamic posture information and the first posture motion to obtain each initial motion animation.
Alternatively, each initial motion animation may refer to a dynamic motion animation obtained by superimposing each piece of dynamic posture information on the first posture motion.
It is to be noted that, since the first gesture motion is a single-frame motion corresponding to the model of the first virtual weapon, each initial action animation obtained by superimposing a piece of dynamic gesture information on the first gesture motion is a dynamic action animation corresponding to the model of the first virtual weapon.
And acquiring preset control parameters according to the model of the first virtual weapon, and adjusting each initial action animation based on the preset control parameters.
Optionally, the preset control parameter includes at least one of: a play rate variable, a start time variable, an animation reset time variable, and an animation amplitude variable.
The preset control parameters may be set by the relevant technician based on the actual performance of the different models of virtual weapons. That is, the preset control parameters for different models of virtual weapons are different.
Illustratively, the preset control parameter may be a floating point type variable. The playing rate variable can be used for controlling the playing rate of each initial motion animation, the starting time variable can be used for controlling the playing starting time of each initial motion animation, the animation resetting time variable can be used for controlling the resetting time of each initial motion animation, and the animation amplitude variable can be used for controlling the maximum motion amplitude during playing of each initial motion animation. The embodiment of the present application does not limit this.
And taking each adjusted initial motion animation as each first motion animation.
As each initial action animation is adjusted through the preset control parameters, the action animations corresponding to virtual weapons of different models become distinct, so that different first action animations can be obtained when the virtual character is equipped with different virtual weapons.
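A hedged sketch of applying the per-model preset control parameters to an initial action animation. The parameter names, the simple frame-stepping resampling, and the omission of the animation reset-time variable are assumptions made for brevity, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class ControlParams:
    play_rate: float = 1.0   # >1 steps through frames faster (shorter playback)
    start_frame: int = 0     # frames skipped at the start of playback
    amplitude: float = 1.0   # scales the maximum motion amplitude

def adjust_animation(frames, params):
    """frames: list of per-joint angle dicts; returns the adjusted animation."""
    out = []
    i = float(params.start_frame)
    while i < len(frames):
        frame = frames[int(i)]
        # Scale every joint angle by the amplitude variable.
        out.append({joint: angle * params.amplitude for joint, angle in frame.items()})
        i += params.play_rate
    return out
```

With different `ControlParams` per weapon model, the same initial animation plays back faster, later, or with a smaller amplitude, producing the per-model differences described above.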
In one possible implementation, receiving a motion control command for the target virtual object and determining the target game behavior corresponding to the motion control command and the target virtual weapon equipped by the target virtual object includes:
the model number of the target virtual weapon is determined, and it is determined whether the model number of the target virtual weapon is the same as the first virtual weapon.
If yes, determining that the target virtual object is currently equipped with the first virtual weapon.
The motion control command is parsed, and the game behavior among the first game behaviors that is the same as the behavior indicated by the motion control command is taken as the target game behavior.
Therefore, whether the target virtual weapon equipped by the target virtual object is the first virtual weapon or not can be accurately determined when the action control instruction is received, and the target game behavior indicated by the action control instruction can be accurately determined.
If the motion control command is a command for controlling the target virtual object to execute a shooting action, and the first game behavior is a single shooting behavior, then in one possible implementation, controlling the target virtual object to execute the first game behavior and rendering and displaying the first action animation corresponding to the first game behavior includes:
and controlling the target virtual object to execute the single shooting action.
The single shooting behavior motion animation is taken as the first motion animation to be presented.
In this way, the single shooting action animation can be presented while the single shooting action is being performed to ensure that the game action performed by the target virtual object matches and is synchronized with the presented action animation.
A first recoil parameter is determined according to the scatter spread parameter of the first virtual weapon, and the shooting lean-back angle of the single shooting action animation is taken as a second recoil parameter.

Optionally, the scatter spread parameter may be a scatter spread attribute at the time of shooting, set by a person skilled in the art according to the model of the first virtual weapon. It may be determined according to the scatter distribution of the corresponding weapon in the real world, or set randomly; this is not limited in the embodiments of the present application.
Optionally, the first recoil parameter may refer to a horizontal recoil parameter of the target virtual object when armed with the first virtual weapon for a single shot. The first recoil parameter may particularly indicate a magnitude and an angle of oscillation of the orientation of the target virtual object and/or a direction in which the first virtual weapon is pointed in a horizontal direction when a single shot is taken.
The second recoil parameter may refer to a vertical recoil parameter of the target virtual object when armed with the first virtual weapon for a single shot. The second recoil parameter may particularly indicate a magnitude and direction of a change in vertical direction of the direction in which the first virtual weapon is pointed at when a single shot is taken.
Alternatively, the shooting lean angle may refer to an angle at which the target virtual object holds the part of the first virtual weapon or the direction in which the first virtual weapon is pointed leans back when the target virtual object is equipped with the first virtual weapon for a single shot.
The first recoil parameter and the second recoil parameter are superimposed onto the first action animation, and the superimposed first action animation is displayed.
It should be noted that, when the target virtual object performs the superimposed first motion animation, the target virtual object may swing horizontally and/or vertically according to the first recoil parameter and the second recoil parameter, so that the effect of recoil generated by the target virtual weapon after shooting in the real world can be simulated when the target virtual object shows the target motion animation.
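As a rough sketch of how the two recoil parameters described above might be derived and superimposed: a horizontal sway drawn from the weapon's scatter spread and a fixed vertical lean-back from the animation. The uniform random draw and the simple additive offsets are assumptions for illustration, not the patented formulas.

```python
import random

def single_shot_recoil(scatter_spread, lean_back_angle, rng=None):
    """Return (horizontal_sway_deg, vertical_kick_deg) for one shot."""
    rng = rng or random.Random()
    # First recoil parameter: random horizontal sway within the scatter spread.
    horizontal = rng.uniform(-scatter_spread, scatter_spread)
    # Second recoil parameter: the animation's shooting lean-back angle.
    vertical = lean_back_angle
    return horizontal, vertical

def apply_recoil(aim_yaw, aim_pitch, horizontal, vertical):
    """Superimpose the recoil parameters onto the weapon/character orientation."""
    return aim_yaw + horizontal, aim_pitch + vertical
```

Superimposing the pair onto the played animation makes the character swing horizontally and tilt back vertically, simulating real-world recoil after a shot.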
To better explain the target virtual object motion processing method provided by the embodiment of the present application, fig. 4 is a schematic diagram of a shooting effect provided by the embodiment of the present application. Exemplarily, (a) in fig. 4 shows a graphical user interface a including a target virtual object X, a target virtual weapon W, and a virtual target B. It can be seen that the target virtual object X holds the target virtual weapon W and makes a targeting action towards the virtual target B, but at this time the user has not input the above-mentioned shooting instruction, and the target virtual object X has not performed the shooting action. Also, the direction in which the target virtual object X holds the target virtual weapon W is parallel to the ground.
Assuming that the user inputs a shooting instruction for a single shot, the target virtual object X performs a shooting action toward the virtual target B by the target virtual weapon W, and at this time, the graphical user interface a is changed to obtain the graphical user interface B shown in (B) of fig. 4.
Referring to fig. 4 (B), a target virtual object X, a target virtual weapon W, a virtual target B, and a shooting hole K1 are included in the graphical user interface B. It can be seen that the direction in which the target virtual object X holds the target virtual weapon W has changed: it is now tilted backwards and no longer parallel to the ground. A shooting hole K1 appears on the virtual target B, so the picture shown on the graphical user interface B is the picture after the target virtual object X, holding the target virtual weapon W, has completed one shooting action.
If the motion control command is a command for controlling the target virtual object to execute a plurality of shooting actions, and the first game behavior is a continuous shooting behavior, then in one possible implementation, referring to fig. 5, controlling the target virtual object to execute the first game behavior and rendering and displaying the first action animation corresponding to the first game behavior includes:
step 1008: and controlling the target virtual object to execute the continuous shooting action.
Step 1009: and taking the continuous shooting behavior motion animation as the first motion animation to be displayed.
In this way, the continuous shooting action animation can be presented while the continuous shooting behavior is being executed, to ensure that the game behavior performed by the target virtual object matches and is synchronized with the presented action animation.

Step 1010: a third recoil parameter is determined according to the scatter spread parameter of the first virtual weapon, and a fourth recoil parameter is determined according to the shooting lean-back angle of the single shooting action animation and the number of times the shooting action has been executed.
Optionally, the third recoil parameter may refer to a horizontal recoil parameter of the target virtual object when armed with the first virtual weapon for continuous shooting. The third recoil parameter may particularly indicate a magnitude and an angle of the horizontal swing of the orientation of the target virtual object and/or the direction in which the first virtual weapon is pointed when shooting continuously.
The fourth recoil parameter may refer to a vertical recoil parameter of the target virtual object when equipped with the first virtual weapon for continuous shooting. The fourth recoil parameter may in particular indicate a magnitude and a direction of a change in a vertical direction of a direction in which the first virtual weapon is pointed when consecutive shots are taken.
Step 1011: the third recoil parameter and the fourth recoil parameter are superimposed onto the first action animation, and the superimposed first action animation is displayed.
It should be noted that, when the target virtual object performs the superimposed continuous shooting animation, the target virtual object may swing horizontally and/or vertically according to the third recoil parameter and the fourth recoil parameter, so that the recoil effect of the first virtual weapon after shooting in the real world can be simulated when the target virtual object shows the target motion animation.
To better explain the target virtual object motion processing method provided in the embodiment of the present application, fig. 6 is another schematic diagram of a shooting effect provided in the embodiment of the present application.
Assuming that the user inputs a continuous shooting instruction after inputting the above single shooting instruction, the graphical user interface C shown in fig. 6 can be obtained. The graphical user interface C includes the target virtual object X, the target virtual weapon W, the virtual target B, and shooting holes K1, K2, and K3. It can be seen that the direction in which target virtual object X holds target virtual weapon W has changed further compared with (b) of fig. 4 and is now tilted back by a large angle. New shooting holes K2 and K3 appear on the virtual target B, so the picture shown on the graphical user interface C is the picture after the target virtual object X, holding the target virtual weapon W, has completed the continuous shooting action.
It will be appreciated that a target action animation capable of simulating a real-world recoil effect may be played or executed while the target virtual object is responding to the firing instructions.
In one possible implementation, determining the fourth recoil parameter according to the shooting lean-back angle of the single shooting action animation and the number of times the shooting action has been executed includes:

The product of the shooting lean-back angle and the number of times the shooting action has been executed is determined.
Alternatively, the number of times of execution of the shooting action may refer to the number of times that the target virtual object has completed the shooting action by the target virtual weapon in the present response to the shooting instruction. For example, if the target virtual weapon is a virtual firearm, the number of times the firing action has been performed may be increased by 1 for each round fired by the target virtual weapon.
The number of times the shooting action has been performed at the very beginning of the response to the shooting instruction is 0.
In addition, the product characterizes the accumulated lean-back angle required over consecutive shots.

The product is compared with a preset lean-back angle threshold; if the product is greater than or equal to the preset lean-back angle threshold, the preset lean-back angle threshold is taken as the fourth recoil parameter.
Optionally, the preset lean-back angle threshold may be the maximum angle by which the target virtual object may lean back, set by a skilled person. This is not limited in the embodiments of the present application.

It should be noted that if the product is greater than or equal to the preset lean-back angle threshold, the number of consecutive shots is large and the target virtual object would need to lean back by so large an angle that its pose would deform abnormally; capping the value at the threshold avoids this.

If the product is smaller than the preset lean-back angle threshold, the product itself is taken as the fourth recoil parameter.

It should be noted that, no matter how many shooting actions are executed or how much recoil is accumulated, the target virtual object does not lean back by more than the preset lean-back angle threshold. This avoids the problem of the target virtual object's pose deforming abnormally because a large number of consecutive shots produces an excessive lean-back angle, and thereby improves the effect of the action processing of the target virtual object.
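The clamping rule above can be sketched in a few lines (the function and parameter names are illustrative): the fourth recoil parameter is the per-shot lean-back angle multiplied by the number of shots already fired, capped at the preset lean-back angle threshold so that long bursts cannot deform the character's pose.

```python
def fourth_recoil_parameter(lean_back_angle, shots_fired, max_lean_back):
    """Accumulated lean-back angle for continuous fire, clamped at the
    preset lean-back angle threshold."""
    accumulated = lean_back_angle * shots_fired
    return min(accumulated, max_lean_back)
```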
In one possible implementation, after determining the product of the shooting lean-back angle and the number of times the shooting action has been executed, the method further comprises:

The duration for which the shooting instruction has been input is determined, and if that duration is greater than or equal to a preset duration, the preset lean-back angle threshold is taken as the fourth recoil parameter.
Alternatively, the duration of the input of the shooting instruction may refer to a duration in which the target virtual object needs to continuously perform the shooting action.
Alternatively, the preset time period may be any value greater than 0 set by the skilled person.
It should be noted that if the duration of the shooting instruction is greater than or equal to the preset duration, the number of consecutive shots is large and the target virtual object would need to lean back by a large angle, which could deform it abnormally. Taking the preset lean-back angle threshold as the fourth recoil parameter in this case avoids the abnormal pose deformation and also improves the flexibility of determining the fourth recoil parameter.
In a possible implementation manner, if the motion control command is specifically a shooting command, the number of times that the motion control command controls the target virtual object to perform the shooting motion may be determined as follows:
the number of times of shooting actions indicated by the shooting instruction may be determined based on an input interval time between two consecutive inputs of the shooting instruction by the user and a shooting interval time of the first virtual weapon.
The firing interval may be a duration that begins at the time the first virtual weapon completes any firing and ends at the time the first virtual weapon begins the next firing.
For example, if the input interval time is greater than the shooting interval time, it may be determined that the number of times of the shooting action indicated by the shooting instruction is a single time. If the input interval time is less than or equal to the shooting interval time, the number of times of shooting actions indicated by the shooting instruction can be confirmed to be multiple times.
If the number of shooting actions indicated by the shooting instruction is one, the shooting action is the single shooting behavior, and the motion control command is a command for controlling the target virtual object to execute one shooting action. If the number of shooting actions indicated is multiple, the shooting action is the continuous shooting behavior, and the motion control command is a command for controlling the target virtual object to execute a plurality of consecutive shooting actions.
It is noted that if the number of times of the shooting motion is a single time, it may be determined that the target motion animation corresponding to the single shooting animation needs to be displayed, and if the number of times of the shooting motion is a plurality of times, it may be determined that the target motion animation corresponding to the continuous shooting animation needs to be played. Therefore, the target virtual object can be controlled to execute different actions or play different animations according to different shooting instructions input by the user.
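The interval test described above can be sketched as follows (the exact comparison semantics are an assumption): if the player re-presses fire within the weapon's firing interval, the input is treated as continuous fire, otherwise as a single shot.

```python
def shooting_mode(input_interval, firing_interval):
    """Classify the shooting command: 'single' if the player's inputs are
    slower than the weapon's firing cycle, 'continuous' otherwise."""
    return "single" if input_interval > firing_interval else "continuous"
```

The returned mode then selects which action animation to play: the single shooting action animation or the continuous shooting action animation.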
In a possible manner, when the first recoil parameter and the third recoil parameter are superimposed, the spinal bones of the target virtual object at all levels may be selected as aiming bones based on Aim IK, and the position pointed at by the target virtual weapon is acquired as an aiming reference point.
And taking a certain position on the extension line of the virtual sight on the graphical user interface as a target position, and enabling the position pointed by the target virtual weapon to be continuously aligned with the virtual sight.
The first recoil parameter and the third recoil parameter are then applied to the aiming bones by means of Aim IK, so that the aiming bones rotate to change the position and/or direction at which the target virtual weapon points.
In a possible implementation manner, rendering and displaying a first action animation corresponding to the first game behavior further includes:
and determining the offset animation corresponding to the rotating direction according to the rotating speed and the rotating direction of the target virtual object.
Optionally, the rotation speed of the target virtual object may be determined according to the speed at which the user inputs the above-mentioned aiming direction adjustment instruction, and the rotation direction of the target virtual object may be determined according to the direction in which the user inputs the aiming direction adjustment instruction.
Optionally, the offset animation may refer to a gesture animation designed based on the preset gesture action, used to indicate that the preset gesture action is rotated and offset by a certain angle in one of four directions: up, down, left, or right.
In particular, the offset animation may indicate a tilt angle or a rotation angle at which the target virtual object holds the target virtual weapon. The animation length of the offset animation is 1 frame.
Illustratively, a two-dimensional blend space (Blend Space 2D) may be established based on the movement of the virtual pointer or virtual cursor of the input device in a two-dimensional coordinate system on the screen when the user inputs the action control instruction. The horizontal axis of the two-dimensional blend space responds to the horizontal movement speed of the input device, the vertical axis responds to the vertical movement speed of the input device, and both the horizontal and vertical coordinates of the two-dimensional blend space range over (-1, 1).
The preset gesture action corresponds to the coordinate (0, 0) of the two-dimensional blend space, and the different offset animations correspond to the coordinates (0, 1), (0, -1), (-1, 0), and (1, 0).
And adjusting the target action animation according to the offset animation and the rotating speed to obtain and display the adjusted target action animation.
Therefore, the action executed by the target virtual object and/or the target action animation played for it can be adjusted according to the aiming direction adjustment instruction input by the user, accurately simulating how the posture of a held weapon tilts or rotates when, in a real scene, the weapon is rotated rapidly or its direction is adjusted rapidly. In this way, the realism of the target action animation executed by the target virtual object can be improved, and the effect of the action processing of the target virtual object can be improved.
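A minimal sketch of sampling such a two-dimensional blend space, assuming the five sample points described above (the preset gesture action at the centre and one offset animation per direction) and input speeds already normalized to [-1, 1]:

```python
def blend_space_weights(x: float, y: float):
    """Blend weights for the five sample points of the 2D blend space:
    the preset gesture action at (0, 0) and the four offset animations at
    (1, 0), (-1, 0), (0, 1), (0, -1). x and y are the normalized
    horizontal/vertical input speeds. Weights always sum to 1."""
    wx_right = max(x, 0.0)
    wx_left = max(-x, 0.0)
    wy_up = max(y, 0.0)
    wy_down = max(-y, 0.0)
    directional = wx_right + wx_left + wy_up + wy_down
    # Whatever weight is not taken by the offsets goes to the centre
    # (rest) pose; normalize in case the directional weights exceed 1.
    w_center = max(0.0, 1.0 - directional)
    total = w_center + directional
    return {"center": w_center / total, "right": wx_right / total,
            "left": wx_left / total, "up": wy_up / total,
            "down": wy_down / total}
```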
In a possible implementation manner, adjusting the first motion animation according to the offset animation and the rotation speed to obtain and display the adjusted first motion animation includes:
and determining and adjusting the weight value of the first action animation according to the rotating speed.
Optionally, the weight value is used to indicate a magnitude of the adjustment to the first motion animation.
Notably, the weight value may also be used to indicate whether a new first action animation is more biased towards the first action animation or the offset animation. In general, the greater the weight value, the more biased the new first action animation may be toward the offset animation. The embodiment of the present application does not limit this.
And mixing the first action animation and the offset animation based on the weight value to obtain a new first action animation.
Optionally, the blending process may be an operation of fusing the first action animation and the offset animation according to the weight value.
And displaying the new first action animation.
In this way, the new first motion animation can be accurately determined and displayed so as to better simulate the tilting or rotating effect of the posture of the held weapon when the held weapon is rapidly rotated or rapidly adjusted in direction in a real scene.
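Assuming single-frame poses represented as joint-name-to-angle mappings (a simplification of real skeletal data; names are illustrative), the weighted blending step described above could be sketched as:

```python
def blend_poses(base_pose, offset_pose, weight):
    """Linearly blend two single-frame poses (joint name -> angle in
    degrees) by the given weight: weight = 0 keeps the first action
    animation's pose, weight = 1 fully adopts the offset animation."""
    return {joint: (1.0 - weight) * base_pose[joint] + weight * offset_pose[joint]
            for joint in base_pose}
```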
In one possible implementation, the action control instruction includes an aiming instruction, the aiming instruction corresponds to the aiming behavior, and the first game behavior is then the aiming behavior. Controlling the target virtual object to execute the first game behavior and rendering the first action animation showing the first game behavior then includes:
And controlling the target virtual object to execute the aiming behavior.
In this way, the target virtual object can be accurately controlled to aim, in response to the aiming instruction, while equipped with the first virtual weapon.
And taking the aiming behavior action animation corresponding to the aiming behavior as the first action animation, and displaying the first action animation.
In this way, the animation aimed at by the target virtual object can be accurately presented to ensure matching with the game action currently being performed by the target virtual object.
In one possible implementation, when rendering and displaying the first action animation corresponding to the first game behavior, the method further includes:
an overlap length between the distance field of the first virtual weapon and the capsule body of each virtual object in the virtual scene is determined.
Optionally, the distance field of the first virtual weapon may be a sensing volume for the first virtual weapon configured by a person skilled in the relevant art.
The distance field of the first virtual weapon is used to determine whether the first virtual weapon overlaps or collides with the capsule body of each virtual object.
The overlap length may be the maximum diameter of the region where the distance field of the first virtual weapon overlaps the capsule body of any virtual object, or the depth by which the distance field of the first virtual weapon penetrates into the capsule body of any virtual object.
And if the overlapping length is smaller than the preset distance, acquiring the gesture adjusting action corresponding to the first virtual weapon.
Alternatively, the preset distance may be any value set by a person skilled in the relevant art.
Optionally, the adjustment gesture action may be an action in which the target virtual object holds the first virtual weapon close to its body. If the target virtual object holds the first virtual weapon according to the adjustment gesture action, the first virtual weapon can be brought as close to the body of the target virtual object as possible.
And adjusting the first action animation according to the overlapping length and the adjusting posture action.
Therefore, the posture in which the target virtual object holds the first virtual weapon can be changed so as to reduce the overlap length between the distance field of the first virtual weapon and the capsule body of each virtual object in the virtual scene, thereby avoiding the clipping artifact in which the first virtual weapon penetrates the model of another virtual object.
In one possible implementation, the adjusting the first motion animation according to the overlap length and the adjustment gesture motion includes:
an adjustment weight is determined based on the overlap length and the range of the distance field of the first virtual weapon.
Alternatively, the range of the distance field of the first virtual weapon can refer to the maximum length of the distance field of the first virtual weapon.
Illustratively, the adjustment weight may be determined by calculating a ratio of the overlap length to a range of the distance field of the first virtual weapon and using the ratio as the adjustment weight.
And superposing the adjusted gesture motion on the first motion animation according to the adjusted weight.
In general, the greater the adjustment weight, the greater the degree to which the adjustment gesture motion is superimposed.
In addition, the degree to which the adjustment gesture action is superimposed on the first action animation can be tuned through the above-described animation amplitude variable. For example, if the value range of the animation amplitude variable is set to [0, 1], then the larger the adjustment weight, the closer the value of the animation amplitude variable is to 1.
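A sketch of the adjustment-weight computation described above, assuming the weight is the ratio of the overlap length to the distance field's range, clamped to [0, 1] so it can be used directly as the animation amplitude variable:

```python
def adjustment_weight(overlap_length: float, distance_field_range: float) -> float:
    """Ratio of the overlap length to the range (maximum length) of the
    weapon's distance field, clamped to [0, 1]. Larger values mean the
    adjustment gesture action is superimposed more strongly."""
    if distance_field_range <= 0:
        raise ValueError("distance field range must be positive")
    return min(max(overlap_length / distance_field_range, 0.0), 1.0)
```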
In one possible implementation, determining, from a preset gesture action library and a preset general animation library configured in advance, a first gesture action and a first general animation required for controlling the target virtual object to perform the first game behavior based on the first virtual weapon and the first weapon type includes:
the model number of the first virtual weapon is determined.
And determining the first gesture action required when the target virtual object executes the first game behavior from the preset gesture action library according to the model of the first virtual weapon.
Optionally, the first gesture action is a single frame gesture.
Specifically, a gesture action corresponding to the model of the first virtual weapon in the preset gesture action library may be found as the first gesture action by traversing the preset gesture action library established in advance according to the model of the first virtual weapon.
The preset gesture action library comprises single-frame gesture actions designed by related technicians for various types of virtual weapons.
And determining, from the preset general animation library according to the weapon type of the first virtual weapon, each first general animation required for the target virtual object to execute the first game behavior.
Optionally, each first general animation includes an aiming animation, a single shooting animation, and/or a continuous shooting animation.
Optionally, the set of animations corresponding to the weapon type of the first virtual weapon in the preset general animation library may be found as the first general animations by traversing the pre-established preset general animation library according to the weapon type of the first virtual weapon.
In this way, each of the first generic animation and the first gesture motion can be accurately determined.
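As an illustrative sketch of the two library lookups (weapon models, weapon types, and animation identifiers are all hypothetical), keyed by model for the single-frame gesture and by type for the generic animations:

```python
# Hypothetical libraries: one single-frame gesture per weapon model, and
# one set of generic animations per weapon type.
GESTURE_LIBRARY = {"AK-101": "pose_ak101", "M4A1": "pose_m4a1"}
GENERIC_ANIMATION_LIBRARY = {
    "rifle": {"aim": "rifle_aim", "single_shot": "rifle_single",
              "continuous_shot": "rifle_burst"},
}

def lookup_animation_inputs(weapon_model: str, weapon_type: str):
    """Look up the first gesture action by weapon model and the first
    generic animations by weapon type, as described above."""
    return GESTURE_LIBRARY[weapon_model], GENERIC_ANIMATION_LIBRARY[weapon_type]
```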
In one possible approach, after equipping the first virtual weapon and completing each shooting action, the target virtual object may also be controlled to re-perform the first gesture action without the above-mentioned action control command input.
In addition, a homing animation may be set when controlling the target virtual object to re-perform a preset gesture motion. And controlling the target virtual object to play, trigger or execute the homing animation under the condition that the target virtual object meets the preset condition.
Optionally, the preset condition may be that the recline angle of the target virtual object gradually decreases; when the recline angle of the target virtual object equals a certain threshold, the target virtual object starts to be controlled to play, trigger, or execute the homing animation.
Therefore, the situation that the action of the target virtual object changes suddenly or jumps when the preset gesture action is executed again can be avoided, and the smoothness of the re-execution of the preset gesture action can be improved.
In a possible manner, after generating the first action animation corresponding to the first game behavior based on the first gesture action and the first general animation, the method further includes:
And storing each first action animation into a preset candidate animation library.
Optionally, the preset candidate animation library may be a database set up by a person skilled in the relevant art for storing each determined first action animation.
For example, after the first action animations are generated, they can be retained persistently, so that the first action animations corresponding to virtual weapons of various models do not need to be repeatedly generated when the game is reopened or a new game round is started, reducing the resources occupied by the action processing of the target virtual object.
Alternatively, when the game is closed or the current game round ends, all the first action animations stored in the preset candidate animation library can be deleted to release the storage resources they occupy.
In this way, the flexibility of the action processing of the target virtual object can be improved.
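A minimal sketch of such a candidate animation library, assuming a simple in-memory cache keyed by weapon model (class and method names are hypothetical):

```python
class ActionAnimationCache:
    """Candidate animation library: keeps generated first action
    animations keyed by weapon model so each set is built at most once
    per game session, and can be cleared when the session ends."""

    def __init__(self):
        self._store = {}

    def get_or_generate(self, model, generate):
        """Return cached animations for a model, generating them on the
        first request only."""
        if model not in self._store:
            self._store[model] = generate(model)
        return self._store[model]

    def clear(self):
        # Called when the game is closed to release storage resources.
        self._store.clear()
```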
In a possible manner, after determining the model of the first virtual weapon, the method further includes:
Determining the model of each second virtual weapon equipped by the target virtual object in the current game session.
Optionally, the current game session may refer to the period from the time the game starts running to the current time, or to the currently ongoing game round, which is not limited in the embodiments of the present application.
The second virtual weapons may refer to all virtual weapons equipped by the target virtual object in the current game session. A second virtual weapon may be the same as the first virtual weapon or different from it.
Optionally, the number of second virtual weapons may be any integer greater than or equal to 0. The embodiments of the present application do not limit this.
It is determined whether the model of the second virtual weapon is the same as the model of the first virtual weapon.
It should be noted that if the number of second virtual weapons is 0, it can be directly determined that the model of the second virtual weapon differs from the model of the first virtual weapon.
If the number of second virtual weapons is greater than 1, and the models of all second virtual weapons differ from the model of the first virtual weapon, it may be determined that the model of the second virtual weapon differs from the model of the first virtual weapon.
If not, the first gesture action is determined from the preset gesture action library according to the model of the first virtual weapon, and each first general animation is determined from the preset general animation library according to the weapon type of the first virtual weapon.
It should be noted that if the model of the second virtual weapon differs from the model of the first virtual weapon, no virtual weapon of the same model as the first virtual weapon has been equipped in the current game session; that is, the model of the first virtual weapon is being equipped for the first time in the current session, and the first action animations corresponding to the first virtual weapon have not yet been generated. In that case, after the first general animation corresponding to the weapon type of the first virtual weapon and the first gesture action are acquired, the corresponding steps (such as step 1003 above) are executed to determine each first action animation.
If so, a second action animation corresponding to the model of the first virtual weapon is acquired from the preset candidate animation library.
The second action animation is generated based on the second gesture action and the second general animation of a second virtual weapon of the same model as the first virtual weapon, by the same steps as those used to generate the first action animation.
Since the first virtual weapon and the second virtual weapon have the same model, their weapon types are also the same. Therefore, the second gesture action and the second general animation corresponding to the second virtual weapon can be reused as the first gesture action and the first general animation of the first virtual weapon.
It should be noted that if no virtual weapon of the same model as the first virtual weapon has been equipped in the current game session, the target virtual object is equipping a weapon of this model for the first time in the session; that is, the first action animations corresponding to the first virtual weapon have not yet been generated in the current session.
In this way, when the target virtual object is repeatedly equipped with a virtual weapon of the same model, each first action animation corresponding to that model does not need to be re-determined based on the first general animation and the first gesture action, reducing the resources occupied by the action processing of the target virtual object.
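Combining the model check with the candidate animation library, the reuse logic described above could be sketched as follows (all names are hypothetical; `build_animations` stands in for the gesture/generic-library generation steps):

```python
def on_equip_weapon(model, equipped_models, candidate_library, build_animations):
    """When the target virtual object equips a weapon, reuse the action
    animations already generated for the same model in the current game
    session; otherwise build them once and record them."""
    if model in equipped_models and model in candidate_library:
        return candidate_library[model]      # same model equipped before: reuse
    animations = build_animations(model)     # first time this session: generate
    candidate_library[model] = animations
    equipped_models.add(model)
    return animations
```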
The following describes a device, an apparatus, and a computer-readable storage medium for executing the virtual object action processing method provided by the present application; for their specific implementation processes and technical effects, reference is made to the above, and they are not described again below.
Fig. 7 is a schematic structural diagram of a target virtual object action processing apparatus according to an embodiment of the present application, and referring to fig. 7, the apparatus includes:
a determination module 201 is configured to determine a first weapon type corresponding to a first virtual weapon in response to a target virtual object equipping the first virtual weapon.
And the acquisition determining module 202 is configured to determine, based on the first virtual weapon and the first weapon type, a first gesture action and a first general animation required for controlling the target virtual object to execute the first game behavior from a preset gesture action library and a preset general animation library configured in advance.
Wherein the first gesture action is determined by the first virtual weapon, the first generic animation is determined by the first weapon type, the first game action comprises: aiming behavior, single shot behavior, continuous shot behavior.
A generating module 203, configured to generate a first action animation corresponding to the first game behavior based on the first gesture action and the first general animation.
Wherein the first action animation is used for showing the action effect of the target virtual object when the first virtual weapon is used for executing the first game behavior, and the first action animation comprises: aiming behavior animation, single shooting behavior animation, and continuous shooting behavior animation.
Optionally, referring to fig. 8, the apparatus further comprises:
the receiving and determining module 204 is configured to receive an action control instruction for the target virtual object, and determine a target game behavior corresponding to the action control instruction and a target virtual weapon equipped with the target virtual object.
And the execution module 205 is configured to, in response to the target virtual weapon being the first virtual weapon and the target game behavior being the first game behavior, control the target virtual object to execute the first game behavior and render and display the first action animation corresponding to the first game behavior.
Optionally, the generating module 203 is further configured to determine dynamic gesture information between each first generic animation and the first gesture action. Each first motion animation is determined according to each dynamic gesture information and the first gesture motion.
The generating module 203 is further configured to compare each frame of motion in each first general animation with the first gesture motion, respectively, to obtain a motion gesture of each first general animation. Each motion posture is used as the dynamic posture information.
The generating module 203 is further configured to perform an overlay process on each piece of dynamic posture information and the first posture motion, so as to obtain each initial motion animation. And acquiring preset control parameters according to the model of the first virtual weapon, and adjusting each initial action animation based on the preset control parameters. And taking each adjusted initial motion animation as each first motion animation.
Acquisition determination module 202 is also configured to determine the model number of the target virtual weapon and determine whether the model number of the target virtual weapon is the same as the first virtual weapon. If yes, determining that the target virtual object is currently equipped with the first virtual weapon. And analyzing the action control command, and taking one game behavior in the first game behavior, which is the same as the game behavior indicated by the action control command, as the target game behavior.
The execution module 205 is further configured to control the target virtual object to execute the single shooting behavior. The single shooting behavior action animation is presented as the first action animation. The first recoil parameter is determined according to the scatter dispersion parameter of the first virtual weapon, and the shooting recline angle of the single shooting behavior action animation is taken as the second recoil parameter. The first recoil parameter and the second recoil parameter are superimposed on the first action animation, and the superimposed first action animation is displayed.
The execution module 205 is further configured to control the target virtual object to execute the continuous shooting behavior. The continuous shooting behavior action animation is presented as the first action animation. The third recoil parameter is determined according to the scatter dispersion parameter of the first virtual weapon, and the fourth recoil parameter is determined according to the shooting recline angle of the single shooting behavior action animation and the number of times the shooting action has been executed. The third recoil parameter and the fourth recoil parameter are superimposed on the first action animation, and the superimposed first action animation is displayed.
The execution module 205 is further configured to determine the product of the shooting recline angle and the number of times the shooting action has been executed. The product is compared with a preset recline angle threshold; if the product is greater than or equal to the preset recline angle threshold, the preset recline angle threshold is taken as the fourth recoil parameter. If the product is smaller than the preset recline angle threshold, the product is taken as the fourth recoil parameter.
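The clamped fourth recoil parameter described above can be sketched as follows (the per-shot recline angle and threshold values in the test are illustrative, not from the source):

```python
def fourth_recoil_parameter(recline_per_shot: float, shots_fired: int,
                            recline_threshold: float) -> float:
    """Fourth recoil parameter for continuous fire: the per-shot shooting
    recline angle accumulates with the number of shots already executed,
    capped at the preset recline angle threshold."""
    return min(recline_per_shot * shots_fired, recline_threshold)
```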
The execution module 205 is further configured to determine the duration of the input of the shooting instruction, and if the duration of the shooting instruction is greater than or equal to a preset duration, take the preset recline angle threshold as the fourth recoil parameter.
The execution module 205 is further configured to determine the number of shooting actions indicated by the shooting instruction based on the input interval between two consecutive inputs of the shooting instruction by the user and the firing interval of the first virtual weapon.
The execution module 205 is further configured to determine an offset animation corresponding to the rotation direction according to the rotation speed and the rotation direction of the target virtual object. And adjusting the target action animation according to the offset animation and the rotating speed to obtain and display the adjusted target action animation.
The executing module 205 is further configured to determine to adjust the weight value of the first motion animation according to the rotation speed. And mixing the first action animation and the offset animation based on the weight value to obtain a new first action animation. And displaying the new first action animation.
The execution module 205 is also configured to control the target virtual object to execute the aiming behavior. And taking the aiming behavior action animation corresponding to the aiming behavior as the first action animation, and displaying the first action animation.
The execution module 205 is also operable to determine a length of overlap between the distance field of the first virtual weapon and the capsule of virtual objects in the virtual scene. And if the overlapping length is smaller than the preset distance, acquiring the gesture adjusting action corresponding to the first virtual weapon. And adjusting the first action animation according to the overlapping length and the adjusting posture action.
The execution module 205 is further operable to determine an adjustment weight based on the overlap length and the range of the distance field of the first virtual weapon. And superposing the adjusted gesture motion on the first motion animation according to the adjusted weight.
The determination module 201 is also used to determine the model number of the first virtual weapon.
And determining the first gesture action required by the target virtual object to execute the first game behavior from the preset gesture action library according to the model of the first virtual weapon. And determining each first general animation required by the target virtual object to execute the first game behavior from the preset general animation library according to the weapon type of the target virtual weapon.
The generating module 203 is further configured to store each first action animation into the preset candidate animation library.
The acquisition determination module 202 is further configured to determine the model of each second virtual weapon equipped by the target virtual object during the current game session, and to determine whether the model of the second virtual weapon is the same as the model of the first virtual weapon. If not, the first gesture action is determined from the preset gesture action library according to the model of the first virtual weapon, and each first general animation is determined from the preset general animation library according to the weapon type of the first virtual weapon. If so, the second action animation corresponding to the model of the first virtual weapon is acquired from the preset candidate animation library.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
These modules may be one or more integrated circuits configured to implement the above methods, for example: one or more Application Specific Integrated Circuits (ASICs), one or more microprocessors, or one or more Field Programmable Gate Arrays (FPGAs). As another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. These modules may also be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 9 is a schematic structural diagram of a computer device according to an embodiment of the present application. Referring to fig. 9, the computer apparatus includes: a memory 301 and a processor 302, wherein the memory 301 stores a computer program operable on the processor 302, and the processor 302 executes the computer program to implement the steps of any of the above-mentioned method embodiments.
Processor 302 is configured to determine a first weapon type corresponding to a first virtual weapon in response to a target virtual object arming the first virtual weapon.
A processor 302, configured to determine, based on the first virtual weapon and the first weapon type, a first gesture action and a first general animation required for controlling the target virtual object to perform the first game behavior from a pre-configured pre-set gesture action library and a pre-set general animation library.
A processor 302 for generating a first action animation corresponding to the first game behavior based on the first gesture action and the first generic animation.
Optionally, the processor 302 is configured to receive a motion control instruction for the target virtual object, and determine a target game behavior corresponding to the motion control instruction and a target virtual weapon equipped with the target virtual object.
A processor 302, configured to, in response to the target virtual weapon being the first virtual weapon and the target game behavior being the first game behavior, control the target virtual object to execute the first game behavior, and render a first action animation showing a correspondence of the first game behavior.
Optionally, the processor 302 is further configured to determine dynamic gesture information between each first generic animation and the first gesture action. Each first motion animation is determined according to each dynamic gesture information and the first gesture motion.
The processor 302 is further configured to compare each frame of motion in each first general animation with the first gesture motion, respectively, to obtain a motion gesture of each first general animation. Each motion posture is used as the dynamic posture information.
The processor 302 is further configured to perform an overlay process on each piece of dynamic gesture information and the first gesture motion, so as to obtain each initial motion animation. And acquiring preset control parameters according to the model of the first virtual weapon, and adjusting each initial action animation based on the preset control parameters. And taking each adjusted initial motion animation as each first motion animation.
Processor 302 is also configured to determine the model number of the target virtual weapon and determine whether the model number of the target virtual weapon is the same as the first virtual weapon. If yes, determining that the target virtual object is currently equipped with the first virtual weapon. And analyzing the action control command, and taking one game behavior in the first game behavior, which is the same as the game behavior indicated by the action control command, as the target game behavior.
The processor 302 is also configured to control the target virtual object to execute the single shooting behavior. The single shooting behavior action animation is presented as the first action animation. The first recoil parameter is determined according to the scatter dispersion parameter of the first virtual weapon, and the shooting recline angle of the single shooting behavior action animation is taken as the second recoil parameter. The first recoil parameter and the second recoil parameter are superimposed on the first action animation, and the superimposed first action animation is displayed.
The processor 302 is also configured to control the target virtual object to execute the continuous shooting behavior. The continuous shooting behavior action animation is presented as the first action animation. The third recoil parameter is determined according to the scatter dispersion parameter of the first virtual weapon, and the fourth recoil parameter is determined according to the shooting recline angle of the single shooting behavior action animation and the number of times the shooting action has been executed. The third recoil parameter and the fourth recoil parameter are superimposed on the first action animation, and the superimposed first action animation is displayed.
The processor 302 is also configured to determine the product of the shooting recline angle and the number of times the shooting action has been executed. The product is compared with a preset recline angle threshold; if the product is greater than or equal to the preset recline angle threshold, the preset recline angle threshold is taken as the fourth recoil parameter. If the product is smaller than the preset recline angle threshold, the product is taken as the fourth recoil parameter.
The processor 302 is further configured to determine the duration of the input of the shooting instruction, and if the duration of the shooting instruction is greater than or equal to a preset duration, take the preset recline angle threshold as the fourth recoil parameter.
The processor 302 is further configured to determine the number of firing actions indicated by the firing order based on an input interval between two consecutive inputs of the firing order by the user and a firing interval of the first virtual weapon.
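The fourth-recoil computation described above (the recline angle accumulates per shot and is clamped at the preset recline angle threshold, with a sufficiently long press saturating it immediately) can be sketched as follows. This is an illustrative sketch only; every identifier is an assumption, not a name taken from the application.

```python
def fourth_recoil(recline_angle, shots_fired, recline_threshold,
                  hold_time=0.0, preset_duration=float("inf")):
    """Fourth recoil parameter: the recline angle accumulates per shot
    and is clamped at the preset recline angle threshold; a long
    continuous press saturates the parameter immediately."""
    if hold_time >= preset_duration:          # long-press override
        return recline_threshold
    accumulated = recline_angle * shots_fired
    return min(accumulated, recline_threshold)


def shots_from_interval(input_interval, firing_interval):
    """Number of shooting actions implied by the gap between two
    consecutive instruction inputs and the weapon's firing interval."""
    return max(1, int(input_interval // firing_interval))
```

A weapon with a 2-degree per-shot recline and a 5-degree threshold, for example, saturates after the third shot.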
The processor 302 is further configured to determine an offset animation corresponding to the rotation direction according to the rotation speed and rotation direction of the target virtual object, and to adjust the first action animation according to the offset animation and the rotation speed to obtain and display the adjusted first action animation.
The processor 302 is further configured to determine, according to the rotation speed, a weight value for adjusting the first action animation, blend the first action animation and the offset animation based on the weight value to obtain a new first action animation, and display the new first action animation.
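The speed-weighted blend described above can be sketched as a per-joint linear interpolation; the linear weight mapping and the `max_speed` normalizer are illustrative assumptions, as the application does not specify how the weight is derived from the rotation speed.

```python
def rotation_blend(first_frame, offset_frame, rotation_speed,
                   max_speed=360.0):
    """Blend a frame of the first action animation toward the offset
    animation by a weight derived from rotation speed (clamped to [0, 1]).
    Frames are flat lists of joint values for simplicity."""
    w = min(abs(rotation_speed) / max_speed, 1.0)
    return [a + w * (b - a) for a, b in zip(first_frame, offset_frame)]
```

At half the maximum speed the result sits halfway between the two animations; beyond the maximum the offset animation fully dominates.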
The processor 302 is also configured to control the target virtual object to perform the aiming behavior, take the aiming behavior action animation corresponding to the aiming behavior as the first action animation, and display the first action animation.
The processor 302 is also configured to determine the overlap length between the distance field of the first virtual weapon and the capsule body of each virtual object in the virtual scene; if the overlap length is smaller than a preset distance, the posture adjustment action corresponding to the first virtual weapon is acquired, and the first action animation is adjusted according to the overlap length and the posture adjustment action.
The processor 302 is also configured to determine an adjustment weight based on the overlap length and the range of the distance field of the first virtual weapon, and to superimpose the posture adjustment action onto the first action animation according to the adjustment weight.
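A minimal sketch of the overlap-driven adjustment, assuming the weight grows linearly with how deeply a capsule sits inside the weapon's distance field; that linear mapping, and all names below, are assumptions rather than details stated in the application.

```python
def adjustment_weight(overlap_length, field_range):
    """Deeper overlap into the weapon's distance field -> larger weight,
    clamped to [0, 1]."""
    return min(max(overlap_length / field_range, 0.0), 1.0)


def superpose_adjustment(frame, adjustment_pose, weight):
    """Superimpose the posture adjustment action onto one animation
    frame, scaled by the adjustment weight."""
    return [p + weight * q for p, q in zip(frame, adjustment_pose)]
```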
The processor 302 is also configured to determine the model of the first virtual weapon, determine the first gesture action required for the target virtual object to execute the first game behavior from the preset gesture action library according to the model of the first virtual weapon, and determine each first general animation required for the target virtual object to execute the first game behavior from the preset general animation library according to the weapon type of the first virtual weapon.
The processor 302 is further configured to store each first action animation in a preset candidate animation library.
The processor 302 is also configured to determine the model of a second virtual weapon equipped by the target virtual object during the current game session, and to determine whether the model of the second virtual weapon is the same as the model of the first virtual weapon. If not, the first gesture action is determined from the preset gesture action library according to the model of the first virtual weapon, and each first general animation is determined from the preset general animation library according to the weapon type of the first virtual weapon. If so, a second action animation corresponding to the model of the first virtual weapon is acquired from the preset candidate animation library.
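The reuse path above (regenerate animations for a new weapon model, fetch from the candidate animation library when the model repeats) is essentially a per-model cache. A minimal sketch, with all names assumed for illustration:

```python
class CandidateAnimationCache:
    """Reuse action animations already generated for a weapon model, so
    equipping the same model again skips the gesture/animation blend.
    `build` is a callable mapping a model id to its action animations."""

    def __init__(self, build):
        self._build = build
        self._by_model = {}

    def get(self, model):
        if model not in self._by_model:        # new model: generate once
            self._by_model[model] = self._build(model)
        return self._by_model[model]           # repeated model: cached
```

This mirrors the stated goal of the application: avoiding regenerating per-weapon shooting animations, and so reducing resource consumption.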
The embodiments of the present application also provide a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the steps in the above method embodiments are implemented.
The processor is configured to determine, in response to a target virtual object equipping a first virtual weapon, a first weapon type corresponding to the first virtual weapon.
The processor is configured to determine, based on the first virtual weapon and the first weapon type, a first gesture action and a first general animation required for controlling the target virtual object to execute a first game behavior from a pre-configured preset gesture action library and preset general animation library.
The processor is configured to generate a first action animation corresponding to the first game behavior based on the first gesture action and the first general animation.
Optionally, the processor is configured to receive an action control instruction for the target virtual object, and determine a target game behavior corresponding to the action control instruction and a target virtual weapon equipped by the target virtual object.
The processor is configured to, in response to the target virtual weapon being the first virtual weapon and the target game behavior being the first game behavior, control the target virtual object to execute the first game behavior, and render and display the first action animation corresponding to the first game behavior.
Optionally, the processor is further configured to determine dynamic gesture information between each first general animation and the first gesture action, and to determine each first action animation according to each piece of dynamic gesture information and the first gesture action.
The processor is further configured to compare each frame of motion in each first general animation with the first gesture action respectively to obtain the action gesture of each first general animation, and to take each action gesture as the dynamic gesture information.
The processor is further configured to superimpose each piece of dynamic gesture information onto the first gesture action respectively to obtain each initial action animation, acquire preset control parameters according to the model of the first virtual weapon, adjust each initial action animation based on the preset control parameters, and take each adjusted initial action animation as each first action animation.
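The additive construction above can be sketched as follows: the dynamic gesture information is the per-frame delta between a general animation and the single-frame gesture action, and superposing each delta back onto the gesture rebuilds the action frames. The `amplitude` argument stands in for the preset control parameters (an assumed name; the application lists play rate, start time, reset time, and amplitude variables).

```python
def dynamic_gesture_info(generic_frames, gesture):
    """Per-frame difference between a general animation and the
    single-frame gesture action (frames as flat joint-value lists)."""
    return [[g - p for g, p in zip(frame, gesture)]
            for frame in generic_frames]


def build_action_animation(gesture, deltas, amplitude=1.0):
    """Superpose each dynamic-gesture delta back onto the gesture,
    scaling the delta by an amplitude control parameter."""
    return [[p + amplitude * d for p, d in zip(gesture, delta)]
            for delta in deltas]
```

With `amplitude=1.0` the original general-animation frames are reproduced exactly; smaller values attenuate the motion around the weapon-specific gesture.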
The processor is also configured to determine the model of the target virtual weapon and determine whether it is the same as the model of the first virtual weapon. If so, it is determined that the target virtual object is currently equipped with the first virtual weapon; the action control instruction is then parsed, and the game behavior among the first game behaviors that matches the behavior indicated by the action control instruction is taken as the target game behavior.
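The dispatch check above can be sketched as a small pure function; the signature and the model/behavior strings are illustrative assumptions, not identifiers from the application.

```python
def resolve_target_behavior(equipped_model, first_model,
                            commanded_behavior, first_behaviors):
    """Return the behavior whose cached first action animation should
    play, or None when the equipped weapon's model does not match or
    the commanded behavior is not one of the first game behaviors."""
    if equipped_model != first_model:
        return None
    return commanded_behavior if commanded_behavior in first_behaviors else None
```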
The processor is also configured to control the target virtual object to perform the single shooting behavior, take the single shooting behavior action animation as the first action animation to be displayed, determine a first recoil parameter according to the scatter dispersion parameter of the first virtual weapon, take the shooting recline angle of the single shooting behavior action animation as a second recoil parameter, superimpose the first recoil parameter and the second recoil parameter onto the first action animation, and display the superimposed first action animation.
The processor is also configured to control the target virtual object to perform the continuous shooting behavior, take the continuous shooting behavior action animation as the first action animation to be displayed, determine a third recoil parameter according to the scatter dispersion parameter of the first virtual weapon, determine a fourth recoil parameter according to the shooting recline angle of the single shooting behavior action animation and the number of times the shooting action has been performed, superimpose the third recoil parameter and the fourth recoil parameter onto the first action animation, and display the superimposed first action animation.
The processor is also configured to determine the product of the shooting recline angle and the number of times the shooting action has been performed, and compare the product with a preset recline angle threshold: if the product is greater than or equal to the preset recline angle threshold, the preset recline angle threshold is taken as the fourth recoil parameter; if the product is smaller than the preset recline angle threshold, the product is taken as the fourth recoil parameter.
The processor is further configured to determine the duration for which the shooting instruction has been input; if that duration is greater than or equal to a preset duration, the preset recline angle threshold is taken as the fourth recoil parameter.
The processor is further configured to determine the number of shooting actions indicated by the shooting instruction based on the input interval between two consecutive inputs of the shooting instruction by the user and the firing interval of the first virtual weapon.
The processor is further configured to determine an offset animation corresponding to the rotation direction according to the rotation speed and rotation direction of the target virtual object, and to adjust the first action animation according to the offset animation and the rotation speed to obtain and display the adjusted first action animation.
The processor is further configured to determine, according to the rotation speed, a weight value for adjusting the first action animation, blend the first action animation and the offset animation based on the weight value to obtain a new first action animation, and display the new first action animation.
The processor is also configured to control the target virtual object to perform the aiming behavior, take the aiming behavior action animation corresponding to the aiming behavior as the first action animation, and display the first action animation.
The processor is further configured to determine the overlap length between the distance field of the first virtual weapon and the capsule body of each virtual object in the virtual scene; if the overlap length is smaller than a preset distance, the posture adjustment action corresponding to the first virtual weapon is acquired, and the first action animation is adjusted according to the overlap length and the posture adjustment action.
The processor is also configured to determine an adjustment weight based on the overlap length and the range of the distance field of the first virtual weapon, and to superimpose the posture adjustment action onto the first action animation according to the adjustment weight.
The processor is also configured to determine the model of the first virtual weapon, determine the first gesture action required for the target virtual object to execute the first game behavior from the preset gesture action library according to the model of the first virtual weapon, and determine each first general animation required for the target virtual object to execute the first game behavior from the preset general animation library according to the weapon type of the first virtual weapon.
The processor is further configured to store each first action animation in a preset candidate animation library.
The processor is also configured to determine the model of a second virtual weapon equipped by the target virtual object during the current game session, and to determine whether the model of the second virtual weapon is the same as the model of the first virtual weapon. If not, the first gesture action is determined from the preset gesture action library according to the model of the first virtual weapon, and each first general animation is determined from the preset general animation library according to the weapon type of the first virtual weapon. If so, a second action animation corresponding to the model of the first virtual weapon is acquired from the preset candidate animation library.
Optionally, the present application further provides a program product, such as a computer-readable storage medium, comprising a program which, when executed by a processor, performs any of the above virtual object action processing method embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only a logical division, and an actual implementation may divide them differently; multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or units, and may be electrical, mechanical, or in another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The integrated unit, if implemented in the form of a software functional unit, may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions that enable a computer device (which may be a personal computer, a server, or a network device) or a processor to execute some of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes media capable of storing program code, such as a USB flash drive, a portable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto: any change or substitution that a person skilled in the art could readily conceive of within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
The above description is only a preferred embodiment of the present application and is not intended to limit it; various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall be included in its protection scope.

Claims (19)

1. A virtual object action processing method, characterized in that the method comprises:
responsive to a target virtual object equipping a first virtual weapon, determining a first weapon type to which the first virtual weapon corresponds;
determining, based on the first virtual weapon and the first weapon type, a first gesture action and a first general animation required for controlling the target virtual object to execute a first game behavior from a pre-configured preset gesture action library and preset general animation library; wherein the first gesture action is determined by the first virtual weapon, the first general animation is determined by the first weapon type, and the first game behavior comprises: an aiming behavior, a single shooting behavior, and a continuous shooting behavior;
generating a first action animation corresponding to the first game behavior based on the first gesture action and the first general animation; wherein the first action animation is used for showing the action effect of the target virtual object when the first game behavior is executed with the first virtual weapon, and the first action animation comprises: an aiming behavior action animation, a single shooting behavior action animation, and a continuous shooting behavior action animation.
2. The virtual object action processing method of claim 1, wherein the method further comprises:
receiving an action control instruction for the target virtual object, and determining a target game behavior corresponding to the action control instruction and a target virtual weapon equipped by the target virtual object;
in response to the target virtual weapon being the first virtual weapon and the target game behavior being the first game behavior, controlling the target virtual object to execute the first game behavior, and rendering and displaying the first action animation corresponding to the first game behavior.
3. The virtual object action processing method according to claim 1, wherein the determining, based on the first virtual weapon and the first weapon type, a first gesture action and a first general animation required for controlling the target virtual object to execute a first game behavior from the pre-configured preset gesture action library and preset general animation library comprises:
determining a model number of the first virtual weapon;
determining the first gesture action required by the target virtual object to execute the first game behavior from the preset gesture action library according to the model of the first virtual weapon, wherein the first gesture action is a single-frame gesture;
determining each first general animation required for the target virtual object to execute the first game behavior from the preset general animation library according to the weapon type of the first virtual weapon.
4. The virtual object action processing method of claim 3, wherein after said determining the model number of the first virtual weapon, the method further comprises:
determining the model number of a second virtual weapon equipped by the target virtual object in the current game process;
determining whether the model of the second virtual weapon is the same as the model of the first virtual weapon;
if not, determining the first gesture action from the preset gesture action library according to the model of the first virtual weapon, and determining each first general animation from the preset general animation library according to the weapon type of the first virtual weapon;
if so, acquiring a second action animation corresponding to the model of the first virtual weapon from a preset candidate animation library, wherein the second action animation is generated based on a second gesture action and a second general animation of a second virtual weapon of the same model as the first virtual weapon.
5. The virtual object action processing method according to claim 2, wherein the generating a first action animation corresponding to the first game behavior based on the first gesture action and the first general animation comprises:
determining dynamic gesture information between each of the first general animations and the first gesture action;
determining each of the first action animations according to each piece of the dynamic gesture information and the first gesture action.
6. The virtual object action processing method according to claim 5, wherein the determining dynamic gesture information between each of the first general animations and the first gesture action comprises:
comparing each frame of motion in each first general animation with the first gesture action respectively to obtain the action gesture of each first general animation, wherein the action gesture is the part of each frame of motion in the first general animation that differs from the first gesture action;
taking each of the action gestures as the dynamic gesture information.
7. The virtual object action processing method according to claim 5, wherein the determining each of the first action animations according to each piece of the dynamic gesture information and the first gesture action comprises:
performing superposition processing on each piece of the dynamic gesture information and the first gesture action respectively to obtain each initial action animation;
acquiring preset control parameters according to the model of the first virtual weapon, and adjusting each initial action animation based on the preset control parameters;
respectively taking each adjusted initial motion animation as each first motion animation;
wherein the preset control parameters include at least one of: a play rate variable, a start time variable, an animation reset time variable, and an animation amplitude variable.
8. The virtual object action processing method according to claim 2, wherein the first game behavior is the single shooting behavior, and the controlling the target virtual object to execute the first game behavior and rendering and displaying a first action animation corresponding to the first game behavior comprises:
controlling the target virtual object to perform the single shooting behavior;
taking the single shooting behavior action animation as the first action animation to be displayed;
determining a first recoil parameter according to the scatter dispersion parameter of the first virtual weapon, and taking the shooting recline angle of the single shooting behavior action animation as a second recoil parameter;
superimposing the first recoil parameter and the second recoil parameter onto the first action animation, and displaying the superimposed first action animation.
9. The virtual object action processing method according to claim 2, wherein the first game behavior is the continuous shooting behavior, and the controlling the target virtual object to execute the first game behavior and rendering and displaying a first action animation corresponding to the first game behavior comprises:
controlling the target virtual object to perform the continuous shooting behavior;
taking the continuous shooting behavior action animation as the first action animation to be displayed;
determining a third recoil parameter according to the scatter dispersion parameter of the first virtual weapon, and determining a fourth recoil parameter according to the shooting recline angle of the single shooting behavior action animation and the number of times the shooting action has been performed;
superimposing the third recoil parameter and the fourth recoil parameter onto the first action animation, and displaying the superimposed first action animation.
10. The virtual object action processing method according to claim 9, wherein the determining a fourth recoil parameter according to the shooting recline angle of the single shooting behavior action animation and the number of times the shooting action has been performed comprises:
determining the product of the shooting recline angle and the number of times the shooting action has been performed;
comparing the product with a preset recline angle threshold, and if the product is greater than or equal to the preset recline angle threshold, taking the preset recline angle threshold as the fourth recoil parameter;
if the product is smaller than the preset recline angle threshold, taking the product as the fourth recoil parameter;
wherein after the determining the product of the shooting recline angle and the number of times the shooting action has been performed, the method further comprises:
determining the duration for which the shooting instruction has been input, and if that duration is greater than or equal to a preset duration, taking the preset recline angle threshold as the fourth recoil parameter.
11. The virtual object action processing method according to claim 2, wherein the rendering and displaying a first action animation corresponding to the first game behavior further comprises:
determining an offset animation corresponding to the rotation direction according to the rotation speed and the rotation direction of the target virtual object;
and adjusting the first action animation according to the offset animation and the rotation speed to obtain and display the adjusted first action animation.
12. The virtual object action processing method according to claim 11, wherein the adjusting the first action animation according to the offset animation and the rotation speed to obtain and display the adjusted first action animation comprises:
determining, according to the rotation speed, a weight value for adjusting the first action animation, wherein the weight value indicates the magnitude of the adjustment applied to the first action animation;
mixing the first action animation and the offset animation based on the weight value to obtain a new first action animation;
and displaying the new first action animation.
13. The virtual object action processing method according to claim 2, wherein the first game behavior is the aiming behavior, and the controlling the target virtual object to execute the first game behavior and rendering and displaying a first action animation corresponding to the first game behavior comprises:
controlling the target virtual object to perform the aiming behavior;
and taking the aiming behavior action animation corresponding to the aiming behavior as the first action animation, and displaying the first action animation.
14. The virtual object action processing method according to claim 2, wherein the rendering and displaying a first action animation corresponding to the first game behavior further comprises:
determining an overlap length between a distance field of the first virtual weapon and a capsule body of each virtual object in a virtual scene;
if the overlap length is smaller than a preset distance, acquiring a posture adjustment action corresponding to the first virtual weapon;
adjusting the first action animation according to the overlap length and the posture adjustment action.
15. The virtual object action processing method according to claim 14, wherein the adjusting the first action animation according to the overlap length and the posture adjustment action comprises:
determining an adjustment weight based on the overlap length and the range of the distance field of the first virtual weapon;
superimposing the posture adjustment action onto the first action animation according to the adjustment weight.
16. A virtual object action processing apparatus, comprising:
a determination module, configured to determine, in response to a target virtual object equipping a first virtual weapon, a first weapon type corresponding to the first virtual weapon;
an acquisition module, configured to determine, based on the first virtual weapon and the first weapon type, a first gesture action and a first general animation required for controlling the target virtual object to execute a first game behavior from a pre-configured preset gesture action library and preset general animation library; wherein the first gesture action is determined by the first virtual weapon, the first general animation is determined by the first weapon type, and the first game behavior comprises: an aiming behavior, a single shooting behavior, and a continuous shooting behavior;
a generating module, configured to generate a first action animation corresponding to the first game behavior based on the first gesture action and the first general animation; wherein the first action animation is used for showing the action effect of the target virtual object when the first game behavior is executed with the first virtual weapon, and the first action animation comprises: an aiming behavior action animation, a single shooting behavior action animation, and a continuous shooting behavior action animation.
17. The virtual object action processing apparatus according to claim 16, wherein the apparatus further comprises:
a receiving and determining module, configured to receive an action control instruction for the target virtual object, and determine a target game behavior corresponding to the action control instruction and a target virtual weapon equipped by the target virtual object;
an execution module, configured to, in response to the target virtual weapon being the first virtual weapon and the target game behavior being the first game behavior, control the target virtual object to execute the first game behavior and render and display the first action animation corresponding to the first game behavior.
18. A computer device, comprising a memory and a processor, wherein the memory stores a computer program executable on the processor, and the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 15.
19. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 15.
CN202211717665.5A 2022-12-29 2022-12-29 Virtual object action processing method and device and computer equipment Pending CN115920389A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211717665.5A CN115920389A (en) 2022-12-29 2022-12-29 Virtual object action processing method and device and computer equipment


Publications (1)

Publication Number Publication Date
CN115920389A true CN115920389A (en) 2023-04-07

Family

ID=86655967

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211717665.5A Pending CN115920389A (en) 2022-12-29 2022-12-29 Virtual object action processing method and device and computer equipment

Country Status (1)

Country Link
CN (1) CN115920389A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination