CN113034651B - Playing method, device, equipment and storage medium of interactive animation

Playing method, device, equipment and storage medium of interactive animation

Info

Publication number
CN113034651B
Authority
CN
China
Prior art keywords
model
bone
file
data
node
Prior art date
Legal status
Active
Application number
CN202110292800.5A
Other languages
Chinese (zh)
Other versions
CN113034651A (en)
Inventor
孙广东 (Sun Guangdong)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202110292800.5A
Publication of CN113034651A
Application granted
Publication of CN113034651B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00: Animation
    • G06T13/20: 3D [Three Dimensional] animation
    • G06T13/40: 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiments of the present application provide a playing method, apparatus, and device for an interactive animation, and a storage medium, and relate to the technical field of animation. The method comprises the following steps: loading a first model and a second model which participate in a first interactive animation; acquiring first node hanging configuration information related to the first interactive animation; extracting first data to be migrated from a model file of the second model; determining a first bone hanging point in a model file of the first model, and adding the first data to be migrated below the first bone hanging point to obtain an updated model file of the first model; and driving the first model and the second model to move based on the updated model file of the first model and the animation file of the first interactive animation, so as to play the first interactive animation. The technical solutions provided by the embodiments of the present application can improve the synchronization and coordination of the motion among multiple models in an interactive animation.

Description

Playing method, device, equipment and storage medium of interactive animation
Technical Field
The embodiment of the application relates to the technical field of animation, in particular to a playing method, device and equipment of interactive animation and a storage medium.
Background
As animation technology advances, it is often necessary to play an animation by driving a previously built model of a character or article.
In the related art, an interactive animation involving two models requires two animation files to drive the two models separately, and two animation playing components to play the animation of each model; the two resulting animations are then combined to obtain the interactive animation of the two models.
However, in the above related art, because two animation files and two animation playing components are used, the motions of the two models easily become unsynchronized in the interactive animation.
Disclosure of Invention
The embodiments of the present application provide a playing method, apparatus, and device for an interactive animation, and a storage medium, which can improve the synchronization and coordination of motion among multiple models in an interactive animation. The technical solutions are as follows:
according to an aspect of the embodiments of the present application, there is provided a playing method of an interactive animation, where the method includes:
loading a first model and a second model which participate in a first interactive animation;
acquiring first node hanging configuration information related to the first interactive animation, wherein the first node hanging configuration information is used for indicating a first bone hanging point of the first model and a first bone node to be hung of the second model in the first interactive animation;
extracting first data to be migrated from a model file of the second model, wherein the first data to be migrated comprises bone data of the first bone node to be hung;
determining the first bone hanging point in a model file of the first model, and adding the first data to be migrated to the position below the first bone hanging point to obtain an updated model file of the first model;
and driving the first model and the second model to move based on the updated model file of the first model and the animation file of the first interactive animation so as to play the first interactive animation.
According to an aspect of the embodiments of the present application, there is provided an interactive animation playing device, including:
the model loading module is used for loading a first model and a second model which participate in the first interactive animation;
the information acquisition module is used for acquiring first node hanging configuration information related to the first interactive animation, wherein the first node hanging configuration information is used for indicating a first bone hanging point of the first model and a first bone node to be hung of the second model;
the data extraction module is used for extracting first data to be migrated from the model file of the second model, wherein the first data to be migrated comprises bone data of the first bone node to be hung;
the data adding module is used for determining the first bone hanging point in a model file of the first model, and adding the first data to be migrated to the position below the first bone hanging point to obtain an updated model file of the first model;
and the model driving module is used for driving the first model and the second model to move based on the updated model file of the first model and the animation file of the first interactive animation so as to play the first interactive animation.
According to an aspect of the embodiments of the present application, there is provided a terminal, including a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the above playing method of an interactive animation.
According to an aspect of the embodiments of the present application, there is provided a computer readable storage medium having at least one instruction, at least one program, a code set, or an instruction set stored therein, where the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the method for playing an interactive animation as described above.
According to an aspect of embodiments of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the terminal reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the terminal executes the playing method of the interactive animation.
The technical scheme provided by the embodiment of the application can comprise the following beneficial effects:
The first bone node to be hung of the second model's file is hung below the first bone hanging point of the first model's file, so that one model file (namely, the updated model file of the first model) can contain the relevant data of all bones that need to be driven in the first model and the second model. Driving the first model and the second model based on this single model file avoids the situation in which the motions of the multiple models in the first interactive animation are out of sync, ensures that the bones which need to be driven synchronously are indeed driven synchronously, and thereby improves the synchronization and coordination of the motions among the multiple models in the first interactive animation.
In addition, in the embodiments of the present application, the two model files of the two models are merged into one model file; correspondingly, only one animation file needs to be configured to drive the two models, and only one animation playing component is needed to play the first interactive animation, which reduces the operational complexity of playing the first interactive animation and saves processing overhead of the terminal.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic illustration of an implementation environment provided by one embodiment of the present application;
FIG. 2 is a flow chart of a method for playing an interactive animation provided in one embodiment of the present application;
FIG. 3 is a schematic illustration of an interactive animation provided in one embodiment of the present application;
FIG. 4 is a hierarchical schematic view of a skeletal node provided in one embodiment of the present application;
FIG. 5 is a migration schematic of a skeletal node provided in one embodiment of the present application;
FIG. 6 is a flow chart of a method for playing an interactive animation according to another embodiment of the present application;
FIG. 7 is a flow chart of model merging provided in one embodiment of the present application;
FIG. 8 is a block diagram of a playback device for interactive animation according to one embodiment of the present application;
FIG. 9 is a block diagram of a playback device for interactive animation according to another embodiment of the present application;
fig. 10 is a block diagram of a terminal provided in one embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of methods that are consistent with some aspects of the present application as detailed in the accompanying claims.
Referring to fig. 1, a schematic diagram of an implementation environment provided in an embodiment of the present application is shown, where the implementation environment may be implemented as a playing system of an interactive animation. As shown in fig. 1, the system 10 may include: a terminal 11.
The terminal 11 has a target application program, such as a client of the target application program, installed and running in it. Optionally, a user account is logged in to the client. The terminal is an electronic device with data computing, processing, and storage capabilities. The terminal may be a smart phone, a tablet computer, a PC (Personal Computer), a wearable device, or the like, which is not limited in the embodiments of the present application. The target application program may be a game application program, such as a shooting game application, a multiplayer gunfight survival game application, a battle-royale survival game application, an LBS (Location Based Services) game application, a MOBA (Multiplayer Online Battle Arena) game application, and the like, to which the embodiments of the present application are not limited. The target application program may also be any application program with an interactive animation playing function, such as a social application, a payment application, a video application, a music application, a shopping application, a news application, etc. In the method provided in the embodiments of the present application, the execution body of each step may be the terminal 11, such as the client running in the terminal 11.
In some embodiments, the system 10 further includes a server 12, where the server 12 establishes a communication connection (e.g., a network connection) with the terminal 11, and the server 12 is configured to provide background services for the target application. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing cloud computing service.
The playing method of an interactive animation provided by the embodiments of the present application can be applied to scenes in which an animation is played through a display screen, and can also be applied to animation playing scenes such as AR (Augmented Reality) and VR (Virtual Reality); the embodiments of the present application do not specifically limit this.
The following describes the technical scheme of the application through several embodiments.
Referring to fig. 2, a flowchart of a method for playing an interactive animation according to an embodiment of the present application is shown. In this embodiment, the method is applied to the client described above for illustration. The method may comprise the following steps (201-205):
step 201, loading a first model and a second model which participate in a first interactive animation.
In some embodiments, the first interactive animation includes a plurality of models, such as the first model and the second model, and an interactive relationship exists between the first model and the second model. A model participating in the first interactive animation may be a two-dimensional model, or may be a three-dimensional model, such as a model built with 3D (three-dimensional) modeling software, imported into a 3D game engine, and then rendered using 3D graphics rendering techniques. Optionally, a model participating in the first interactive animation may be a character model representing a virtual character, an article model representing a virtual article, or the like. The virtual character may be a virtual human character, a virtual hero character, a virtual animal character, a virtual monster character, etc., which is not specifically limited in the embodiments of the present application. The virtual character may be a character that corresponds to a user account owned by a user and is controlled by the user, also referred to as a player character in the client of a game application; the virtual character may also be a character that is not under the control of a user, also referred to as an NPC (Non-Player Character) in the client of a game application. The first model and the second model may be models of the same kind, for example both character models; the first model and the second model may also be models of different kinds, for example the first model is a character model and the second model is an article model.
In one example, the first character model "pulls" the hand of a second character model, so the first interactive animation includes an interactive relationship between the hand of the first character model and the hand of the second character model. In another example, as shown in FIG. 3, a third character model 31 "picks up" a pawn model 32, so the first interactive animation includes an interactive relationship between the pawn model 32 and the hand 33 of the third character model 31.
Step 202, obtaining first node hanging configuration information related to the first interactive animation.
Optionally, the first node hanging configuration information is used to indicate the first bone hanging point of the first model and the first bone node to be hung of the second model in the first interactive animation. For example, the first node hanging configuration information includes the name or other identification information of the first bone hanging point and the name or other identification information of the first bone node to be hung. In some embodiments, the first interactive animation is a skeletal animation. A skeletal animation is a kind of model animation in which a model includes a skeletal structure of interconnected "bones", and an animation can be generated by changing the positions and postures of the bones. In the embodiments of the present application, the first interactive animation is obtained by changing the positions and postures of the bones of the first model and of the second model respectively, so as to generate interactive motion between the first model and the second model.
In some embodiments, the first bone hanging point is the bone in the first model that corresponds to the part of the first model that interacts with the second model, and the first bone node to be hung is the bone in the second model that corresponds to the part of the second model that interacts with the first bone hanging point. In other embodiments, the first bone hanging point may be any bone node in the first model.
Optionally, the skeletal structure of the second model includes multiple levels of bone nodes, where a bone node represents a specified bone or group of bones. For example, for a character model, the skeletal structure may include multiple levels of bone nodes as shown in FIG. 4; from the highest level to the lowest level, the bone nodes are: vertebra 41, neck 42, left collarbone 43, left upper arm 44, left forearm 45, left hand 46, and first finger of the left hand 47.
In some embodiments, the first to-be-hung bone node is a root skeletal node of the second model. The root skeletal node refers to the highest level skeletal node among the skeletal nodes of the second model.
It should be noted that the root skeletal node of the second model is not necessarily the root node of the second model. Optionally, as shown in FIG. 5, the root node of the second model is node 51, and the root skeletal node 52 of the second model is a child node of that root node.
In other embodiments, the first bone node to be hung is the bone node of a bone that needs to be driven in the second model; that is, the first bone node to be hung may be the root skeletal node or a child node at any level below the root skeletal node. In general, a model has only one root skeletal node, and the other bone nodes expand from the root skeletal node in a chain or a tree to form a skeletal structure comprising multiple levels. In one example, the left collarbone node of a character model is the root skeletal node, its child node is the left upper arm node, the child node of the left upper arm is the left forearm, the child node of the left forearm is the left hand, and the child nodes of the left hand are the five fingers of the left hand.
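For readers less familiar with skeletal animation, the multi-level hierarchy described above can be pictured as an ordinary tree of named nodes. The following Python sketch is purely illustrative (the patent does not prescribe any data structure, language, or naming; every identifier here is a hypothetical choice) and models the chain of FIG. 4 from the vertebra down to a finger of the left hand:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class BoneNode:
    name: str                                   # bone name; must match the animation file
    children: List["BoneNode"] = field(default_factory=list)
    parent: Optional["BoneNode"] = field(default=None, repr=False, compare=False)

    def add_child(self, child: "BoneNode") -> "BoneNode":
        child.parent = self
        self.children.append(child)
        return child

# The chain of FIG. 4, from the highest level to the lowest:
# vertebra -> neck -> left collarbone -> left upper arm -> left forearm
# -> left hand -> first finger of the left hand.
vertebra = BoneNode("vertebra")
node = vertebra
for name in ["neck", "left_collarbone", "left_upper_arm",
             "left_forearm", "left_hand", "left_hand_finger_1"]:
    node = node.add_child(BoneNode(name))
```

In such a representation, the root skeletal node is simply the highest bone node of the tree; as noted above, it need not coincide with the root node of the model itself.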
Step 203, extracting first data to be migrated from the model file of the second model, wherein the first data to be migrated includes bone data of the first bone node to be suspended.
In some embodiments, the first model and the second model that participate in the first interactive animation each correspond to a respective model file, which describes the skeletal structure of the model, such as the bone nodes included in the skeletal structure, the bone data of each bone node, and the hierarchical relationships between the bone nodes. By extracting the first data to be migrated from the model file of the second model, the first bone node to be hung can be migrated out of the model file of the second model.
Step 204, determining a first bone hanging point in the model file of the first model, and adding the first data to be migrated to the position under the first bone hanging point to obtain an updated model file of the first model.
In some embodiments, the position of the first bone hanging point is determined in the model file of the first model, and the first data to be migrated is added below the first bone hanging point, thereby migrating and hanging the first bone node to be hung. As a result, the updated model file of the first model includes both the bone data originally in the model file of the first model and the bone data of the second model that is needed to generate the interaction with the first model.
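Steps 203 and 204 together amount to detaching the subtree rooted at the first bone node to be hung from the second model's skeleton and re-parenting it under the first bone hanging point of the first model. The sketch below (building on the hypothetical BoneNode representation above; function and field names are assumptions, not the patent's) shows the "move" variant of this migration, in which the data is removed from the second model's file; the "copy" variant discussed next differs only in copying the subtree instead of detaching it:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class BoneNode:
    name: str
    children: List["BoneNode"] = field(default_factory=list)
    parent: Optional["BoneNode"] = field(default=None, repr=False, compare=False)

@dataclass
class ModelFile:
    root: BoneNode                              # root node of the model (not necessarily a bone)

    def find(self, name: str) -> Optional[BoneNode]:
        # Depth-first search for a bone node by name.
        stack = [self.root]
        while stack:
            node = stack.pop()
            if node.name == name:
                return node
            stack.extend(node.children)
        return None

def extract_subtree(model: ModelFile, bone_name: str) -> BoneNode:
    """Step 203 (sketch): take the bone node to be hung, together with all of its
    child nodes and their hierarchy, out of the source model file."""
    node = model.find(bone_name)
    if node is None:
        raise KeyError(f"bone '{bone_name}' not found")
    if node.parent is not None:
        node.parent.children.remove(node)
        node.parent = None
    return node

def hang_under(model: ModelFile, hanging_point_name: str, subtree: BoneNode) -> ModelFile:
    """Step 204 (sketch): add the data to be migrated below the bone hanging point,
    so that the migrated bone node becomes a child of the hanging point."""
    hanging_point = model.find(hanging_point_name)
    if hanging_point is None:
        raise KeyError(f"hanging point '{hanging_point_name}' not found")
    subtree.parent = hanging_point
    hanging_point.children.append(subtree)
    return model                                # the "updated model file" of the first model
```

Mirroring FIG. 5, a call such as hang_under(first_model_file, 'left_hand', extract_subtree(second_model_file, 'second_model_root_bone')) would hang the second model's bone subtree under a hand bone of the first model; both bone names here are hypothetical.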
In some embodiments, the first data to be migrated is moved out of the model file of the second model, so that the model file of the second model no longer contains that data. In the case where the second model only needs to interact with the first model, the total amount of bone data in the model files of the first model and the second model remains unchanged; that is, the data is only migrated and no new data is added, which avoids an increase in the amount of data and hence additional consumption of storage resources.
In other embodiments, the first data to be migrated is copied from the model file of the second model, and the copied first data to be migrated is then added below the first bone hanging point, while the model file of the second model is kept intact and the data to be migrated is not deleted. This effectively backs up the second model: if the copied first data to be migrated is erroneous or lost, or an error occurs during the process of hanging the bone node, the first data to be migrated can be re-obtained from the intact model file of the second model, which reduces the probability that such an abnormal situation prevents the first interactive animation from being played in time.
Step 205, driving the first model and the second model to move based on the updated model file of the first model and the animation file of the first interactive animation, so as to play the first interactive animation.
In some embodiments, the animation file is used to configure the motion information (also referred to as bone driving information) of the first model and the second model in the first interactive animation, such as the motion trajectory, direction, duration, and speed of each moving bone. Optionally, the first interactive animation corresponds to only one animation file, and this animation file drives the bones of the first model together with the bones of the second model that are now attached to the first model, so that interactive motion occurs between the first model and the second model and the first interactive animation is played.
Optionally, the first model and the second model can be driven to move through one animation file, so that it is not necessary to use two animation playing components to play the animations of the first model and the second model separately and then synthesize the two animations into the first interactive animation; only one animation playing component is needed to play the first interactive animation, which saves the processing resources required for playing the first interactive animation. Optionally, the animation playing component may be an Animator component, but is not limited thereto; other components having an animation playing function may also be used.
Optionally, the name of a bone node in the model file is the same as its name in the animation file. Because the bone nodes in the model file (i.e., the updated model file of the first model) need to be driven to move based on the animation file, the bone names in the animation file and in the model file need to match, so that the client can find the corresponding bone in the model file according to the names of the bone nodes contained in the animation file.
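The name matching described above can be implemented as a simple index from bone name to bone node, built once over the updated model file. The sketch below reuses the hypothetical BoneNode structure from the earlier sketches; it is an illustration of the idea only, not the patent's required procedure:

```python
def build_bone_index(root):
    """Index every bone node reachable from `root` by its name, so that the bone
    names referenced in the animation file can be resolved in the model file."""
    index = {}
    stack = [root]
    while stack:
        node = stack.pop()
        if node.name in index:
            # Two bones with the same name would make name-based driving ambiguous.
            raise ValueError(f"duplicate bone name: {node.name}")
        index[node.name] = node
        stack.extend(node.children)
    return index

def resolve_driven_bones(animation_bone_names, bone_index):
    """Map each bone name used by the animation file to the matching bone node in
    the updated model file; a missing name means the two files do not match."""
    missing = [name for name in animation_bone_names if name not in bone_index]
    if missing:
        raise KeyError(f"animation file references unknown bones: {missing}")
    return [bone_index[name] for name in animation_bone_names]
```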
In summary, in the technical solutions provided by the embodiments of the present application, the first bone node to be hung of the second model's file is hung under the first bone hanging point of the first model's file, so that one model file (i.e., the updated model file of the first model) contains the relevant data of all bones that need to be driven in the first model and the second model. Driving the first model and the second model based on this single model file avoids the situation in which the motions of the multiple models in the first interactive animation are out of sync, ensures that the bones which need to be driven synchronously are indeed driven synchronously, and thereby improves the synchronization and coordination of the motions among the multiple models in the first interactive animation.
In addition, in the embodiments of the present application, the two model files of the two models are merged into one model file; correspondingly, only one animation file needs to be configured to drive the two models, and only one animation playing component is needed to play the first interactive animation, which reduces the operational complexity of playing the first interactive animation and saves processing overhead of the terminal.
Referring to fig. 6, a flowchart of a method for playing an interactive animation according to another embodiment of the present application is shown. In this embodiment, the method is applied to the client described above for illustration. The method may comprise the following steps (601-610):
step 601, loading a first model and a second model which participate in a first interactive animation.
This step 601 is the same as or similar to the step 201 in the embodiment of fig. 2, and will not be described here again.
Step 602, reading first node configuration information from a configuration table of a first model, wherein the first node configuration information is used for indicating a first bone hanging point.
That is, from the configuration table of the first model, it can be determined which bone node in the first model is the first bone hanging point.
Step 603, reading second node configuration information from the configuration table of the second model, where the second node configuration information is used to indicate the first bone node to be hung.
That is, from the configuration table of the second model, it can be determined which skeletal node in the second model is the first to-be-hung skeletal node.
Step 604, extracting bone data of the first bone node to be hung and bone data of child nodes of the first bone node to be hung from a model file of the second model to obtain first data to be migrated.
Wherein, the hierarchical relation between the first bone node to be hung and the child nodes thereof is kept unchanged before and after migration.
In some embodiments, after the first data to be migrated is determined based on the second node configuration information, the relevant data is extracted from the model file of the second model. Because the child nodes of the first bone node to be hung must move along with it when the first bone node to be hung moves, the first data to be migrated needs to contain, in addition to the bone data of the first bone node to be hung, the bone data of its child nodes, so that the child nodes can move along with the first bone node to be hung.
Step 605, determining a first bone hanging point in the model file of the first model, and adding the first data to be migrated to the position under the first bone hanging point to obtain an updated model file of the first model.
In some embodiments, based on the first node configuration information, a target migration position of the first data to be migrated (i.e., under the first bone hanging point) can be determined, so that the first bone node to be hung becomes a child node of the first bone hanging point, thereby ensuring that the first bone node to be hung can move synchronously with the first bone hanging point.
The contents of steps 604 and 605 described above are described below in connection with one example:
as shown in fig. 5, from a model file 53 of the second model, bone data of the bone node to be hung 52 and bone data of child nodes of the bone node to be hung 52 are extracted, so as to obtain migration data 54; the migration data 54 is then migrated below the bone hanging point 55, resulting in an updated model file 56 for the first model.
Step 606, running a predefined program to set the position and posture of the first bone node to be hung.
Optionally, the position and posture of the first bone node to be hung are used to define the relative positional relationship between the first bone node to be hung and the first bone hanging point before the first interactive animation is played. Before the first interactive animation is played, a predefined program sets the initial relative positional relationship between the first bone node to be hung of the second model and the first bone hanging point of the first model, as well as the initial posture of the first bone node to be hung. The animation file can then drive the second model to perform interactive motion with the first model based on the position and posture of the first bone node to be hung. In this embodiment, after the bone data is migrated, the relative positional relationship between the first bone node to be hung and the first bone hanging point is preset, so that this relative positional relationship is controllable and appears more natural and coordinated.
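The "predefined program" of step 606 essentially assigns the hung bone node a local transform relative to its new parent, the first bone hanging point. A minimal sketch of that idea follows (the Transform layout and all names are assumptions made for illustration; the patent does not specify how the position and posture are encoded):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Transform:
    position: Tuple[float, float, float]   # local position relative to the parent bone
    rotation: Tuple[float, float, float]   # local rotation relative to the parent bone (Euler angles)

def set_initial_pose(hung_bone, position, rotation):
    """Step 606 (sketch): fix the relative positional relationship between the first
    bone node to be hung and the first bone hanging point before the first
    interactive animation starts playing."""
    hung_bone.local_transform = Transform(position, rotation)

# Hypothetical example: place the hung bone slightly above its new parent bone,
# with no initial rotation.
# set_initial_pose(hung_bone, position=(0.0, 0.02, 0.0), rotation=(0.0, 0.0, 0.0))
```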
In step 607, bone driving data is obtained from the animation file of the first interactive animation, where the bone driving data is used to determine the target bone to be driven and the corresponding driving mode.
Optionally, the target bone includes bones to be driven in the first model and bones to be driven in the second model. The target bone may be the first bone hanging point of the first model or the first bone node to be hung of the second model, or may be a bone node in the first model other than a bone hanging point.
In some embodiments, the bone drive data comprises: the magnitude, direction and point of action of the virtual driving force exerted on the target bone. In other embodiments, the bone drive data comprises: the direction, speed, pose, location, etc. of movement of the target bone at various time nodes or time periods.
Step 608, based on the bone driving data, reads the bone data of the target bone from the updated model file of the first model.
In some embodiments, a target bone in the updated model file of the first model that needs to be driven is determined based on bone drive data.
Step 609, driving the target skeleton to move based on the skeleton data and driving mode of the target skeleton, so as to play the first interactive animation.
In some embodiments, the target bone is driven according to its driving mode so that the target bone moves, and the first interactive animation is thereby generated and played.
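Because both models' bones now live in a single updated model file, steps 607 to 609 can be pictured as one per-frame loop that reads the bone driving data and poses every target bone in the same pass, which is exactly where the synchronization benefit comes from. The keyframe layout below is an assumption made for illustration only; it is not the patent's (or any particular engine's) animation file format:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class BoneKeyframe:
    time: float                                  # time of this key within the animation
    position: Tuple[float, float, float]
    rotation: Tuple[float, float, float]

# Hypothetical animation file layout: bone name -> list of keyframes sorted by time.
AnimationFile = Dict[str, List[BoneKeyframe]]

def drive_frame(bone_index: Dict[str, object], animation: AnimationFile, t: float) -> None:
    """Steps 607-609 (sketch): for each target bone named in the animation file,
    read the corresponding bone from the updated model file and apply the pose
    for time t."""
    for bone_name, keys in animation.items():
        bone = bone_index[bone_name]                     # step 608: look up the target bone
        key = max((k for k in keys if k.time <= t),      # latest key at or before time t
                  key=lambda k: k.time, default=keys[0])
        bone.position = key.position                     # step 609: drive the target bone
        bone.rotation = key.rotation
```

A real engine would interpolate between neighbouring keys rather than snap to the latest one; the snapping here only keeps the sketch short.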
And step 610, after the playing of the first interactive animation is completed, moving the first data to be migrated back to the model file of the second model, and recovering to obtain the model file of the second model.
In some embodiments, if the first interactive animation is not played again within a preset duration after it has been played, the updated model file of the first model is considered to be temporarily not needed, and the first data to be migrated is moved back to the model file of the second model to restore that model file. The preset duration may be 10 seconds, 15 seconds, 30 seconds, 45 seconds, 1 minute, 3 minutes, 10 minutes, 30 minutes, or the like. Optionally, the specific value of the preset duration is set by a technician according to the actual situation, which is not specifically limited in the embodiments of the present application.
Through step 610, after the playing of the first interactive animation is completed, if the first data to be migrated contained in the updated model file of the first model is not needed for the time being, it is moved back to the model file of the second model so as to restore that model file. The restored model file of the second model can then be reused, for example to play other interactive animations.
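Step 610 is the inverse of the migration in steps 203 and 204. Reusing the hypothetical extract_subtree and hang_under helpers sketched earlier, the restore operation might look like this (note that the client must remember where the subtree originally came from; the parameter for that is an assumption here):

```python
def move_back(first_model, second_model, hung_bone_name, original_parent_name):
    """Step 610 (sketch): after the first interactive animation has finished and no
    replay is expected within the preset duration, detach the migrated bone subtree
    from the first model's updated file and re-attach it at its original place in
    the second model's file, restoring that file to its pre-migration state."""
    subtree = extract_subtree(first_model, hung_bone_name)    # detach from the bone hanging point
    hang_under(second_model, original_parent_name, subtree)   # restore the original hierarchy
    return second_model
```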
In some embodiments, if a model needs to participate in a plurality of interactive animations (such as two animations), a plurality of bone hanging points may be configured in the configuration table of the model, so that migration data of different models can be added below them; a plurality of bone nodes to be hung may also be configured in the configuration table of the model, for migrating the bone data of the model into the model files corresponding to different models; the configuration table of the model may also be configured with bone hanging points and bone nodes to be hung at the same time.
In summary, in the technical solutions provided by the embodiments of the present application, because the hierarchical relationship between the first bone node to be hung and its child nodes remains unchanged before and after migration, display abnormalities of the second model are avoided, and abnormalities in playing the first interactive animation are thereby avoided.
In some embodiments, following the step 605, the following steps are further included:
1. extracting the remaining data except the first data to be migrated from the model file of the second model;
2. adding the remaining data to the model file of the first model, wherein the remaining data is added under the first bone node to be hung;
3. Destroying the model file of the second model.
In this embodiment, the remaining data in the model file of the second model, other than the first data to be migrated, is added to the model file of the first model, and the model file of the second model is then destroyed, so that the model file of the first model contains all the relevant data of both the first model and the second model, which reduces the number of model files. In addition, the model file of the first model can be used to play the first interactive animation repeatedly; since data migration does not need to be performed again, the processing resources required for repeated playing are reduced and the playing efficiency of the first interactive animation is improved.
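In this alternative flow, instead of restoring the second model's file, everything left in it is also folded into the first model's file, and the second file is then discarded. A sketch under the same hypothetical structures as before (the placement of the remaining data under the hung node follows the description above):

```python
def merge_and_destroy(first_model, second_model, hung_bone_name):
    """Alternative flow (sketch): after the first bone node to be hung has already
    been migrated, move the remaining data of the second model's file under that
    node in the first model's file, then discard the second model's file."""
    hung_bone = first_model.find(hung_bone_name)
    remaining = second_model.root               # everything still left in the second model's file
    remaining.parent = hung_bone
    hung_bone.children.append(remaining)        # the remaining data is added under the hung node
    return first_model                          # the second ModelFile object can now be destroyed
```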
The method of merging the first model and the second model referred to in the above embodiment is summarized in the following with reference to fig. 7. As shown in fig. 7, the method may include the following steps (701-705):
step 701, loading a first model and a second model which participate in a first interactive animation;
step 702, determining a first bone hanging point of a first model and a first bone node to be hung of a second model;
step 703, hanging a first bone node to be hung and related bone data under the first bone hanging point;
Step 704, setting the position and the posture of a first bone joint to be hung;
step 705, adding the remaining data in the model file of the second model, other than the first data to be migrated, to the model file of the first model, thereby completing the merge.
In other possible implementations, a third model also participates in the first interactive animation, and the playing method of the interactive animation further includes the following steps:
1. loading a third model;
2. acquiring second node hanging configuration information related to the first interactive animation, wherein the second node hanging configuration information is used for indicating a second bone hanging point of the first model and a second bone node to be hung of the third model in the first interactive animation;
3. extracting second data to be migrated from a model file of the third model, wherein the second data to be migrated comprises bone data of a second bone node to be hung;
4. determining the second bone hanging point in a model file of the first model, and adding the second data to be migrated to the position below the second bone hanging point to obtain an updated model file of the first model;
5. and driving the first model, the second model and the third model to move based on the updated model file of the first model and the animation file of the first interactive animation so as to play the first interactive animation.
A part of the explanation of this implementation may refer to the content of the foregoing embodiment of fig. 2 and the embodiment of fig. 6, which are not repeated here.
In some embodiments, the models that participate in the first interactive animation include the first model, the second model, and the third model, and an interactive relationship exists between the first model and the second model and between the first model and the third model. The second model and the third model each have their own bone node to be hung: the bone node to be hung of the second model is the first bone node to be hung, and the bone node to be hung of the third model is the second bone node to be hung. The first bone node to be hung and the second bone node to be hung are both hung onto the same model (namely, the first model); for example, the first bone node to be hung is hung under the first bone hanging point of the first model, and the second bone node to be hung is hung under the second bone hanging point of the first model. The first bone hanging point and the second bone hanging point may be the same bone hanging point, in which case the first bone node to be hung and the second bone node to be hung are hung under the same bone hanging point; the first bone hanging point and the second bone hanging point may also be two different bone hanging points, in which case the first bone node to be hung and the second bone node to be hung are hung under different bone hanging points, which is not specifically limited in the embodiments of the present application. When the first bone node to be hung and the second bone node to be hung are hung under the same bone hanging point, they are at the same level in the model file of the first model, namely the level immediately below that bone hanging point.
In a specific embodiment, the first interactive animation shows the third character model "picking up" a wineglass model with its left hand and "waving" a fan model with its right hand, so that the models participating in the first interactive animation include a first model (i.e., the third character model), a second model (i.e., the wineglass model), and a third model (i.e., the fan model). Two interactive relationships exist in the first interactive animation: an interaction between the left hand of the third character model and the wineglass model, and an interaction between the right hand of the third character model and the fan model. In one example, the third character model includes two different bone hanging points, namely bone hanging point 1 corresponding to the left hand of the third character model and bone hanging point 2 corresponding to the right hand of the third character model; the bone node to be hung of the wineglass model may be hung under bone hanging point 1, and the bone node to be hung of the fan model may be hung under bone hanging point 2. In another example, the third character model includes a bone hanging point 3, and both the bone node to be hung of the wineglass model and the bone node to be hung of the fan model are hung under bone hanging point 3.
Optionally, when 4, 5, or more models are included in the first interactive animation, one of the models may be used as the main model, and the bone nodes to be hung of all the other models may be hung into that main model, so as to play the first interactive animation.
In the implementation manner, the skeleton data of the second model and the third model are hung in the model file of the first model, when the first interactive animation comprises more than two models, one of the models can be selected, and all the bone nodes to be hung of the other models are hung in the one model, so that the first interactive animation comprising the multiple models can be played through one animation file and one animation playing component, the operation complexity of playing the first interactive animation is reduced, and the processing cost of the terminal is saved.
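When more than two models participate, the migration of steps 203 and 204 is simply repeated once per secondary model against the chosen main model. A sketch, again reusing the hypothetical extract_subtree and hang_under helpers and purely illustrative names:

```python
def hang_all(main_model, attachments):
    """Sketch: hang every secondary model under the main model. Each entry of
    `attachments` is (secondary model file, its bone node to be hung, the bone
    hanging point of the main model), e.g. the wineglass model under the left hand
    and the fan model under the right hand of the third character model; two
    entries may also name the same hanging point."""
    for secondary_model, hung_bone_name, hanging_point_name in attachments:
        subtree = extract_subtree(secondary_model, hung_bone_name)
        hang_under(main_model, hanging_point_name, subtree)
    return main_model
```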
In other possible implementations, a fourth model also participates in the first interactive animation, and the playing method of the interactive animation further includes the following steps:
1. loading a fourth model;
2. acquiring third node hanging configuration information related to the first interactive animation, wherein the third node hanging configuration information is used for indicating a third bone hanging point of the second model and a third bone node to be hung of the fourth model in the first interactive animation;
3. Extracting third data to be migrated from a model file of the fourth model, wherein the third data to be migrated comprises bone data of a third bone node to be hung;
4. determining a third bone hanging point in the updated model file of the first model, and adding third data to be migrated to the position below the third bone hanging point to obtain a second updated model file of the first model;
5. and driving the first model, the second model and the fourth model to move based on the model file after the second updating of the first model and the animation file of the first interactive animation so as to play the first interactive animation.
A part of the explanation of this implementation may refer to the content of the foregoing embodiment of fig. 2 and the embodiment of fig. 6, which are not repeated here.
In some embodiments, the models that participate in the first interactive animation include the first model, the second model, and the fourth model, and an interactive relationship exists between the first model and the second model and between the second model and the fourth model. The first bone node to be hung of the second model is hung under the first bone hanging point of the first model, the first data to be migrated includes the bone data of the third bone hanging point, and the third bone node to be hung of the fourth model is hung under the third bone hanging point, so as to obtain the model file after the second updating of the first model. In the model file after the second updating of the first model, the hierarchical relationship of the relevant bone nodes, from high to low, is: the first bone hanging point > the first bone node to be hung > the third bone hanging point > the third bone node to be hung.
Optionally, the first bone node to be hung and the third bone hanging point of the second model may be the same bone node. In this case, in the model file after the second updating of the first model, the hierarchical relationship of the relevant bone nodes, from high to low, is: the level of the first bone hanging point > the level of the first bone node to be hung = the level of the third bone hanging point > the level of the third bone node to be hung.
In this implementation, the bone data of the second model and of the fourth model are hung into the model file of the first model in a chained manner. When the first interactive animation includes more than two models, one of the models can be selected, and the bone nodes to be hung of the other models can be hung into that model's file in a chain, so that the first interactive animation including multiple models can be played through one animation file and one animation playing component, which reduces the operational complexity of playing the first interactive animation and saves processing overhead of the terminal.
In other possible implementations, the method for playing the interactive animation further includes the following steps:
1. acquiring fourth node hanging configuration information related to a second interactive animation, wherein the models participating in the second interactive animation include the first model and the second model, and the fourth node hanging configuration information is used for indicating a fourth bone hanging point of the first model in the second interactive animation;
2. determining the fourth bone hanging point in the model file of the first model, and migrating the first data to be migrated to the position below the fourth bone hanging point to obtain a re-updated model file of the first model, wherein the fourth bone hanging point and the first bone hanging point are two different bone nodes of the first model;
3. and driving the first model and the second model to move based on the re-updated model file of the first model and the animation file of the second interactive animation so as to play the second interactive animation.
A part of the explanation of this implementation may refer to the content of the foregoing embodiment of fig. 2 and the embodiment of fig. 6, which are not repeated here.
In the above implementation, when the models included in the first interactive animation and the second interactive animation are the same (for example, both include the first model and the second model) but the bone hanging points used on the first model are different, the first data to be migrated that is hung under the first bone hanging point can be directly migrated to below the fourth bone hanging point so as to play the second interactive animation. This avoids re-obtaining the model files, simplifies the processing steps, and saves processing overhead of the terminal.
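Because the first data to be migrated is already inside the first model's updated file, switching from the first interactive animation to the second one only requires re-parenting it under the fourth bone hanging point, not re-loading any model file. A sketch with the same hypothetical helpers as above:

```python
def rehang(first_model, hung_bone_name, new_hanging_point_name):
    """Sketch: reuse the already-migrated bone data for the second interactive
    animation by moving it from its current hanging point (e.g. the first bone
    hanging point) to below the fourth bone hanging point of the first model."""
    subtree = extract_subtree(first_model, hung_bone_name)
    return hang_under(first_model, new_hanging_point_name, subtree)
```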
The following are device embodiments of the present application, which may be used to perform method embodiments of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method embodiments of the present application.
Referring to fig. 8, a block diagram of a playback device for interactive animation according to an embodiment of the present application is shown. The device has the function of realizing the method example of playing the interactive animation, and the function can be realized by hardware or by executing corresponding software by hardware. The device may be the terminal 11 described above, or may be provided on the terminal 11. The apparatus 800 may include: a model loading module 810, an information acquisition module 820, a data extraction module 830, a data addition module 840, and a model driving module 850.
The model loading module 810 is configured to load a first model and a second model that participate in a first interactive animation.
The information obtaining module 820 is configured to obtain first node hanging configuration information related to the first interactive animation, where the first node hanging configuration information is used to indicate a first bone hanging point of the first model and a first bone node to be hung of the second model.
The data extraction module 830 is configured to extract first data to be migrated from a model file of the second model, where the first data to be migrated includes bone data of the first bone node to be suspended.
The data adding module 840 is configured to determine the first bone hanging point in a model file of the first model, and add the first data to be migrated to the position below the first bone hanging point, so as to obtain an updated model file of the first model.
The model driving module 850 is configured to drive the first model and the second model to move based on the updated model file of the first model and the animation file of the first interactive animation, so as to play the first interactive animation.
In summary, in the technical solutions provided by the embodiments of the present application, the first bone node to be hung of the second model's file is hung under the first bone hanging point of the first model's file, so that one model file (i.e., the updated model file of the first model) can contain the relevant data of all bones that need to be driven in the first model and the second model. Driving the first model and the second model based on this single model file avoids the situation in which the motions of the multiple models in the first interactive animation are out of sync, ensures that the bones which need to be driven synchronously are indeed driven synchronously, and thereby improves the synchronization and coordination of the motions among the multiple models in the first interactive animation.
In an exemplary embodiment, the data extraction module 830 is configured to:
extracting bone data of the first bone node to be hung and bone data of child nodes of the first bone node to be hung from a model file of the second model to obtain the first data to be migrated; wherein, the hierarchical relation between the first bone node to be hung and the child nodes thereof is kept unchanged before and after migration.
In an exemplary embodiment, the model driver module 850 is configured to:
acquiring skeleton driving data from an animation file of the first interactive animation, wherein the skeleton driving data are used for determining a target skeleton to be driven and a corresponding driving mode; wherein the target bone comprises a bone to be driven in the first model and a bone to be driven in the second model;
based on the bone driving data, reading bone data of the target bone from an updated model file of the first model;
and driving the target skeleton to move based on the skeleton data of the target skeleton and the driving mode so as to play the first interactive animation.
In an exemplary embodiment, as shown in fig. 9, the apparatus 800 further includes: program execution module 860.
The program running module 860 is configured to run a predefined program and set the position and posture of the first bone node to be hung; the position and posture of the first bone node to be hung are used to define the relative positional relationship between the first bone node to be hung and the first bone hanging point of the first model before the first interactive animation is played.
In an exemplary embodiment, as shown in fig. 9, the apparatus 800 further includes: the model destruction module 870.
The data extraction module 830 is further configured to extract remaining data except the first data to be migrated from a model file of the second model.
The data adding module 840 is further configured to add the remaining data to a model file of the first model; wherein the remaining data is added under the first to-be-hung bone node.
The model destruction module 870 is configured to destroy the model file of the second model.
In an exemplary embodiment, as shown in fig. 9, the apparatus 800 further includes: the data is moved back to block 880.
The data moving back module 880 is configured to move the first data to be migrated back to the model file of the second model after the playing of the first interactive animation is completed, so as to obtain a model file after the second model is restored; and the model file after the second model is restored is the same as the model file of the second model.
In an exemplary embodiment, the information obtaining module 820 is configured to:
reading first node configuration information from a configuration table of the first model, wherein the first node configuration information is used for indicating the first bone hanging point;
and reading second node configuration information from a configuration table of the second model, wherein the second node configuration information is used for indicating the first bone node to be hung.
In an exemplary embodiment, the first interactive animation further comprises a third model of participation.
The model loading module 810 is further configured to load the third model.
The information obtaining module 820 is further configured to obtain second node hanging configuration information related to the first interactive animation, where the second node hanging configuration information is used to indicate a second bone hanging point of the first model and a second bone node to be hung of the third model in the first interactive animation.
The data extraction module 830 is further configured to extract second data to be migrated from the model file of the third model, where the second data to be migrated includes skeletal data of the second bone node to be hung.
The data adding module 840 is further configured to determine the second bone hanging point in the model file of the first model, and add the second data to be migrated to the position below the second bone hanging point, so as to obtain an updated model file of the first model.
The model driving module 850 is further configured to drive the first model, the second model, and the third model to move based on the updated model file of the first model and the animation file of the first interactive animation, so as to play the first interactive animation.
In an exemplary embodiment, the first interactive animation further comprises a fourth model of participation.
The model loading module 810 is further configured to load the fourth model.
The information obtaining module 820 is further configured to obtain third node hanging configuration information related to the first interactive animation, where the third node hanging configuration information is used to indicate a third bone hanging point of the second model and a third bone node to be hung of the fourth model in the first interactive animation.
The data extraction module 830 is further configured to extract third data to be migrated from the model file of the fourth model, where the third data to be migrated includes skeletal data of the third bone node to be hung.
The data adding module 840 is further configured to determine the third bone hanging point in the updated model file of the first model, and add the third data to be migrated to the position below the third bone hanging point, so as to obtain a second updated model file of the first model.
The model driving module 850 is further configured to drive the first model, the second model, and the fourth model to move based on the second updated model file of the first model and the animation file of the first interactive animation, so as to play the first interactive animation.
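The notable detail in this case is that the third bone hanging point belongs to the second model but is located by searching the updated model file of the first model, because the second model's skeleton was already merged into that file. A small sketch, with an assumed dictionary-based node layout:

```python
# Illustrative only: the fourth model's subtree is attached below a bone of the
# second model, which itself was merged into the first model's file earlier.
def find_node(node, name):
    if node["name"] == name:
        return node
    for child in node["children"]:
        found = find_node(child, name)
        if found:
            return found
    return None

def attach(merged_skeleton, hanging_point_name, subtree):
    hanging_point = find_node(merged_skeleton, hanging_point_name)
    hanging_point["children"].append(subtree)

# First model's file after the second model was hung under "hand_R".
merged = {"name": "root", "children": [
    {"name": "hand_R", "children": [
        {"name": "second_model_root", "children": [
            {"name": "second_model_hand", "children": []}]}]}]}

# "second_model_hand" plays the role of the third bone hanging point; the
# fourth model's subtree ends up nested inside the same merged file.
attach(merged, "second_model_hand", {"name": "fourth_model_root", "children": []})
print(merged)
```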
In an exemplary embodiment, the information obtaining module 820 is further configured to obtain fourth node hooking configuration information related to a second interactive animation, where the models participating in the second interactive animation include the first model and the second model, and the fourth node hooking configuration information is used to indicate a fourth bone hanging point of the first model in the second interactive animation.
The data adding module 840 is further configured to determine the fourth bone hanging point in the model file of the first model, and migrate the first data to be migrated to the position below the fourth bone hanging point of the first model, so as to obtain a re-updated model file of the first model, where the fourth bone hanging point and the first bone hanging point are two different bone nodes of the first model.
The model driving module 850 is further configured to drive the first model and the second model to move based on the re-updated model file of the first model and the animation file of the second interactive animation, so as to play the second interactive animation.
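A sketch of the re-hanging step for the second interactive animation: since the first data to be migrated already resides in the first model's file, switching animations only moves the subtree from the first bone hanging point to the fourth one. All node names below are assumptions for illustration.

```python
# Illustrative only: move an already-migrated subtree from one bone hanging
# point of the first model to a different one, without touching the second
# model's original file again.
def rehang(subtree, old_hanging_point, new_hanging_point):
    old_hanging_point["children"].remove(subtree)
    new_hanging_point["children"].append(subtree)

subtree = {"name": "second_model_root", "children": []}
hand_r = {"name": "hand_R", "children": [subtree]}     # first bone hanging point
spine = {"name": "spine_01", "children": []}           # fourth bone hanging point
root = {"name": "root", "children": [hand_r, spine]}   # first model's skeleton

rehang(subtree, hand_r, spine)
print([child["name"] for child in spine["children"]])   # ['second_model_root']
```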
In an exemplary embodiment, the first bone node to be hung is a root bone node of the second model; or, the first bone node to be hung is a bone node of the second model whose bone needs to be driven in the first interactive animation.
In an exemplary embodiment, the first interactive animation is played using an animation play component.
It should be noted that, when the apparatus provided in the foregoing embodiments implements its functions, the division into the functional modules described above is merely used as an example; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus embodiments and the method embodiments provided above belong to the same concept; for the specific implementation process of the apparatus embodiments, refer to the method embodiments, and details are not repeated here.
Referring to fig. 10, a block diagram of a terminal 1000 according to an embodiment of the present application is shown. The terminal 1000 may be an electronic device such as a mobile phone, a tablet computer, a game console, an e-book reader, a multimedia playing device, a wearable device, or a PC. The terminal is configured to implement the playing method of the interactive animation provided in the foregoing embodiments. The terminal may be the terminal 11 in the implementation environment shown in fig. 1.
In general, terminal 1000 can include: a processor 1001 and a memory 1002.
The processor 1001 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1001 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor; the main processor is a processor for processing data in an awake state, and is also referred to as a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1001 may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1001 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1002 may include one or more computer-readable storage media, which may be non-transitory. The memory 1002 may also include a high-speed random access memory and a non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1002 is configured to store at least one instruction, at least one program, a code set, or an instruction set, which is configured to be executed by one or more processors to implement the above-described method for playing an interactive animation.
In some embodiments, the terminal 1000 may optionally further include a peripheral interface 1003 and at least one peripheral. The processor 1001, the memory 1002, and the peripheral interface 1003 may be connected by a bus or a signal line. Each peripheral may be connected to the peripheral interface 1003 via a bus, a signal line, or a circuit board. Specifically, the peripherals include at least one of a radio frequency circuit 1004, a display screen 1005, a camera assembly 1006, an audio circuit 1007, a positioning assembly 1008, and a power supply 1009.
Those skilled in the art will appreciate that the structure shown in fig. 10 is not limiting and that terminal 1000 can include more or fewer components than shown, or certain components can be combined, or a different arrangement of components can be employed.
In an exemplary embodiment, a computer readable storage medium is also provided, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored, where the at least one instruction, the at least one program, the set of codes, or the set of instructions, when executed by a processor, implement a method for playing an interactive animation as described above.
Alternatively, the computer-readable storage medium may include: a ROM (Read-Only Memory), a RAM (Random Access Memory), an SSD (Solid State Drive), an optical disc, or the like. The random access memory may include a ReRAM (Resistive Random Access Memory) and a DRAM (Dynamic Random Access Memory).
In an exemplary embodiment, a computer program product or a computer program is also provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the terminal reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the terminal executes the playing method of the interactive animation.
It should be understood that "a plurality" mentioned herein means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, that A and B exist together, or that B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
The foregoing description is merely of exemplary embodiments of the present application and is not intended to limit the present application to the particular embodiments disclosed; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present application.

Claims (15)

1. A method for playing an interactive animation, the method comprising:
loading a first model and a second model which participate in a first interactive animation;
acquiring first node hooking configuration information related to the first interactive animation, wherein the first node hooking configuration information is used for indicating a first bone hanging point of the first model and a first bone node to be hung of the second model in the first interactive animation;
extracting first data to be migrated from a model file of the second model, wherein the first data to be migrated comprises bone data of the first bone node to be hung;
determining the first bone hanging point in a model file of the first model, and adding the first data to be migrated to the position below the first bone hanging point to obtain an updated model file of the first model;
and driving the first model and the second model to move based on the updated model file of the first model and the animation file of the first interactive animation so as to play the first interactive animation.
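Read only as an informal orientation aid and not as the claimed method or any particular engine's API, the five steps of claim 1 could be wired together roughly as follows; every function and field name is an assumption, the extraction is simplified to taking the second model's whole skeleton, and the per-frame driving step is left as a stub (a fuller sketch of it appears after claim 3).

```python
# Illustrative sketch only: the overall flow of claim 1 on dictionary-based
# model files; model loading and the animation loop are reduced to stubs.
def find_node(node, name):
    if node["name"] == name:
        return node
    for child in node["children"]:
        found = find_node(child, name)
        if found:
            return found
    return None

def drive(model_file, frame):
    pass   # placeholder for the per-frame bone driving step (see the claim 3 sketch)

def play_interactive_animation(first_model_file, second_model_file, config, animation):
    # Steps 1-2: the hooking configuration names the hanging point and the node to hang.
    hanging_point_name = config["first_bone_hanging_point"]
    node_to_hang_name = config["first_bone_node_to_hang"]

    # Step 3: extract the first data to be migrated (here: the whole skeleton subtree).
    subtree = second_model_file["skeleton"]
    assert subtree["name"] == node_to_hang_name

    # Step 4: add it below the first bone hanging point inside the first model's file.
    hanging_point = find_node(first_model_file["skeleton"], hanging_point_name)
    hanging_point["children"].append(subtree)

    # Step 5: both models are now driven from the single updated model file.
    for frame in animation["frames"]:
        drive(first_model_file, frame)

first = {"skeleton": {"name": "root", "children": [{"name": "hand_R", "children": []}]}}
second = {"skeleton": {"name": "second_model_root", "children": []}}
cfg = {"first_bone_hanging_point": "hand_R", "first_bone_node_to_hang": "second_model_root"}
play_interactive_animation(first, second, cfg, {"frames": []})
```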
2. The method of claim 1, wherein extracting the first data to be migrated from the model file of the second model comprises:
extracting bone data of the first bone node to be hung and bone data of child nodes of the first bone node to be hung from a model file of the second model to obtain the first data to be migrated;
wherein the hierarchical relationship between the first bone node to be hung and the child nodes thereof remains unchanged before and after the migration.
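A small sketch of this extraction under an assumed dictionary-based node layout: because the node is detached together with its entire child list, the parent-child relationships inside the subtree are carried over unchanged.

```python
# Illustrative only: detach the bone node to be hung, together with all of its
# descendants, from the second model's skeleton. The returned subtree keeps its
# internal parent-child structure, so the hierarchy is unchanged after migration.
def extract_subtree(parent, node_name):
    for child in parent["children"]:
        if child["name"] == node_name:
            parent["children"].remove(child)
            return child                      # whole subtree, hierarchy intact
        found = extract_subtree(child, node_name)
        if found is not None:
            return found
    return None

second_skeleton = {"name": "pelvis", "children": [
    {"name": "spine", "children": [
        {"name": "hand_L", "children": [{"name": "finger_L", "children": []}]}]}]}

migrated = extract_subtree(second_skeleton, "hand_L")
print(migrated)   # {'name': 'hand_L', 'children': [{'name': 'finger_L', 'children': []}]}
```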
3. The method of claim 1, wherein the driving the first model and the second model to move based on the updated model file of the first model and the animation file of the first interactive animation to play the first interactive animation comprises:
acquiring bone driving data from an animation file of the first interactive animation, wherein the bone driving data is used for determining a target bone to be driven and a corresponding driving mode, and the target bone comprises a bone to be driven in the first model and a bone to be driven in the second model;
reading bone data of the target bone from the updated model file of the first model based on the bone driving data;
and driving the target bone to move based on the bone data of the target bone and the driving mode, so as to play the first interactive animation.
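Sketched informally under assumed data layouts (the per-frame keyframe table and the idea of writing a local position and rotation onto each target bone are illustrative simplifications, not the claimed driving mode), these three sub-steps could look like:

```python
# Illustrative only: read per-frame bone driving data from the animation file,
# look up each target bone inside the single merged model file (which now holds
# bones of both models), and apply the driven transform to it.
def index_bones(node, table=None):
    """Flatten the merged skeleton into a name -> node lookup table."""
    table = {} if table is None else table
    table[node["name"]] = node
    for child in node["children"]:
        index_bones(child, table)
    return table

def play(merged_skeleton, animation_file):
    bones = index_bones(merged_skeleton)
    for frame in animation_file["frames"]:
        for bone_name, transform in frame.items():   # bone driving data
            bone = bones[bone_name]                   # bone data of the target bone
            bone["local_position"] = transform["position"]
            bone["local_rotation"] = transform["rotation"]

merged = {"name": "root", "children": [
    {"name": "hand_R", "children": [{"name": "second_model_root", "children": []}]}]}
animation = {"frames": [
    {"hand_R": {"position": (0, 1.2, 0), "rotation": (0, 0, 45)},
     "second_model_root": {"position": (0, 0, 0.1), "rotation": (0, 90, 0)}}]}

play(merged, animation)
print(merged["children"][0]["local_rotation"])   # (0, 0, 45)
```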
4. The method of claim 1, wherein after the determining the first bone hanging point in the model file of the first model and adding the first data to be migrated to the position below the first bone hanging point to obtain the updated model file of the first model, the method further comprises:
running a predefined program, and setting a position and a posture of the first bone node to be hung;
wherein the position and the posture of the first bone node to be hung are used for defining the relative positional relationship between the first bone node to be hung and the first bone hanging point before the first interactive animation is played.
5. The method of claim 1, wherein after the determining the first bone hanging point in the model file of the first model and adding the first data to be migrated to the position below the first bone hanging point to obtain the updated model file of the first model, the method further comprises:
extracting the remaining data, other than the first data to be migrated, from the model file of the second model;
adding the remaining data to the model file of the first model, wherein the remaining data is added under the first bone node to be hung;
destroying the model file of the second model.
6. The method of claim 1, wherein after the driving the first model and the second model to move based on the updated model file of the first model and the animation file of the first interactive animation to play the first interactive animation, the method further comprises:
after the playing of the first interactive animation is completed, moving the first data to be migrated back to the model file of the second model, so as to restore the model file of the second model.
7. The method of claim 1, wherein the obtaining the first node hooking configuration information associated with the first interactive animation comprises:
reading first node configuration information from a configuration table of the first model, wherein the first node configuration information is used for indicating the first bone hanging point;
and reading second node configuration information from a configuration table of the second model, wherein the second node configuration information is used for indicating the first bone node to be hung.
8. The method of claim 1, wherein a third model further participates in the first interactive animation, and the method further comprises:
loading the third model;
acquiring second node hooking configuration information related to the first interactive animation, wherein the second node hooking configuration information is used for indicating a second bone hanging point of the first model and a second bone node to be hung of the third model in the first interactive animation;
extracting second data to be migrated from a model file of the third model, wherein the second data to be migrated comprises bone data of the second bone node to be hung;
determining the second bone hanging point in the model file of the first model, and adding the second data to be migrated to the position below the second bone hanging point to obtain an updated model file of the first model;
and driving the first model, the second model and the third model to move based on the updated model file of the first model and the animation file of the first interactive animation so as to play the first interactive animation.
9. The method of claim 1, wherein a fourth model further participates in the first interactive animation, and the method further comprises:
loading the fourth model;
acquiring third node hooking configuration information related to the first interactive animation, wherein the third node hooking configuration information is used for indicating a third bone hanging point of the second model and a third bone node to be hung of the fourth model in the first interactive animation;
extracting third data to be migrated from a model file of the fourth model, wherein the third data to be migrated comprises bone data of the third bone node to be hung;
determining the third bone hanging point in the updated model file of the first model, and adding the third data to be migrated to the position below the third bone hanging point to obtain a second updated model file of the first model;
and driving the first model, the second model and the fourth model to move based on the second updated model file of the first model and the animation file of the first interactive animation, so as to play the first interactive animation.
10. The method according to claim 1, wherein the method further comprises:
acquiring fourth node hooking configuration information related to a second interactive animation, wherein the models participating in the second interactive animation include the first model and the second model, and the fourth node hooking configuration information is used for indicating a fourth bone hanging point of the first model in the second interactive animation;
after the driving the first model and the second model to move based on the updated model file of the first model and the animation file of the first interactive animation to play the first interactive animation, the method further comprises:
determining the fourth bone hanging point in the model file of the first model, and migrating the first data to be migrated to the position below the fourth bone hanging point to obtain a re-updated model file of the first model, wherein the fourth bone hanging point and the first bone hanging point are two different bone nodes of the first model;
and driving the first model and the second model to move based on the re-updated model file of the first model and the animation file of the second interactive animation, so as to play the second interactive animation.
11. The method according to any one of claims 1 to 10, wherein
the first bone node to be hung is a root bone node of the second model;
or,
the first bone node to be hung is a bone node of the second model whose bone needs to be driven in the first interactive animation.
12. The method of any one of claims 1 to 10, wherein the first interactive animation is played using an animation playing component.
13. A playback device for interactive animation, the device comprising:
the model loading module is used for loading a first model and a second model which participate in the first interactive animation;
the information acquisition module is used for acquiring first node hooking configuration information related to the first interactive animation, wherein the first node hooking configuration information is used for indicating a first bone hanging point of the first model and a first bone node to be hung of the second model in the first interactive animation;
the data extraction module is used for extracting first data to be migrated from the model file of the second model, wherein the first data to be migrated comprises bone data of the first bone node to be hung;
the data adding module is used for determining the first bone hanging point in the model file of the first model, and adding the first data to be migrated to the position below the first bone hanging point to obtain an updated model file of the first model;
and the model driving module is used for driving the first model and the second model to move based on the updated model file of the first model and the animation file of the first interactive animation so as to play the first interactive animation.
14. A terminal comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the method of playing an interactive animation as claimed in any one of claims 1 to 12.
15. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes or a set of instructions, the at least one instruction, the at least one program, the set of codes or the set of instructions being loaded and executed by a processor to implement the method of playing an interactive animation as claimed in any one of claims 1 to 12.
CN202110292800.5A 2021-03-18 2021-03-18 Playing method, device, equipment and storage medium of interactive animation Active CN113034651B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110292800.5A CN113034651B (en) 2021-03-18 2021-03-18 Playing method, device, equipment and storage medium of interactive animation


Publications (2)

Publication Number Publication Date
CN113034651A CN113034651A (en) 2021-06-25
CN113034651B (en) 2023-05-23

Family

ID=76471484


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113546415B (en) * 2021-08-11 2024-03-29 北京字跳网络技术有限公司 Scenario animation playing method, scenario animation generating method, terminal, device and equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6700586B1 (en) * 2000-08-23 2004-03-02 Nintendo Co., Ltd. Low cost graphics with stitching processing hardware support for skeletal animation
CN106075909A (en) * 2016-07-15 2016-11-09 珠海金山网络游戏科技有限公司 A kind of system and method that changes the outfit of playing
CN109885163A (en) * 2019-02-18 2019-06-14 广州卓远虚拟现实科技有限公司 A kind of more people's interactive cooperation method and systems of virtual reality
CN111383309A (en) * 2020-03-06 2020-07-07 腾讯科技(深圳)有限公司 Skeleton animation driving method, device and storage medium
CN111462287A (en) * 2020-03-31 2020-07-28 网易(杭州)网络有限公司 Data processing method and device for skeleton animation in game and electronic equipment
CN112489172A (en) * 2020-11-12 2021-03-12 杭州电魂网络科技股份有限公司 Method, system, electronic device and storage medium for producing skeleton animation

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6697869B1 (en) * 1998-08-24 2004-02-24 Koninklijke Philips Electronics N.V. Emulation of streaming over the internet in a broadcast application
US9811937B2 (en) * 2015-09-29 2017-11-07 Disney Enterprises, Inc. Coordinated gesture and locomotion for virtual pedestrians
WO2018095273A1 (en) * 2016-11-24 2018-05-31 腾讯科技(深圳)有限公司 Image synthesis method and device, and matching implementation method and device
CN110827376A (en) * 2018-08-09 2020-02-21 北京微播视界科技有限公司 Augmented reality multi-plane model animation interaction method, device, equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Attached skeletalmesh don’t sync with animation in game;rit;《https://forums.unrealengine.com/t/attached-skeletalmesh-dont-sync-with-animation-in-game/312511》;全文 *
Spine(骨骼动画)Skeleton组件参考;Cocos;《https://www.bookstack.cn/read/cocos-creator-3.0-zh/b668e1a78f0825c4.md》;全文 *
基于BVH驱动的OGRE骨骼动画;郭力;何明耘;陈雷霆;;计算机应用研究(09);全文 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code; ref country code: HK; ref legal event code: DE; ref document number: 40047809; country of ref document: HK
GR01 Patent grant