CN117315104A - Animation generation method and device, storage medium and electronic device - Google Patents

Animation generation method and device, storage medium and electronic device

Info

Publication number: CN117315104A
Authority: CN (China)
Prior art keywords: virtual, curve, model, binding, muscle
Legal status: Pending
Application number: CN202311300265.9A
Other languages: Chinese (zh)
Inventor: 满溢芳
Current Assignee: Netease Hangzhou Network Co Ltd
Original Assignee: Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202311300265.9A
Publication of CN117315104A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses an animation generation method, an animation generation device, a storage medium and an electronic device. The method comprises the following steps: creating a plurality of virtual curves according to a plurality of muscle areas of a virtual model; binding curve points of the plurality of virtual curves to at least some model vertices in the plurality of muscle areas by position to obtain a curve binding result; constructing curve controllers for the plurality of virtual curves based on the curve binding result and target skeleton binding data corresponding to the virtual model, wherein the curve controllers are used for controlling the curve points to change position following the virtual skeleton corresponding to the virtual model; and generating a muscle animation of the virtual model by using action parameters of the virtual skeleton and the curve controllers, wherein the muscle animation displays at least morphological changes of the plurality of muscle areas. The method solves the technical problems in the related art that producing muscle animation effects through manual sculpting and correction yields only a single kind of effect and has low production efficiency.

Description

Animation generation method and device, storage medium and electronic device
Technical Field
The present invention relates to the field of computer technologies, and in particular, to an animation generation method, an animation generation device, a storage medium, and an electronic device.
Background
In traditional animation, the creation of muscle bindings and muscle animation effects is generally neglected because of the complex performance requirements and technical difficulty of a muscle system. To save cost, when it is necessary to avoid muscle twisting and stretching artifacts, or to simulate a dynamic muscle effect in certain shots during the animation stage, the prior art generally drives the muscles by manual sculpting and correction. However, this approach has the following drawbacks: only a single muscle can be driven, and only linearly, so squeezing effects between muscles are difficult to achieve; it is often necessary to sculpt corrections frame by frame or to create skin clusters for the muscles, and for some animations key frames must also be set manually, which is time-consuming and inefficient.
In view of the above problems, no effective solution has been proposed at present.
It should be noted that the information disclosed in the foregoing background section is only intended to enhance understanding of the background of the present application, and therefore may include information that does not constitute prior art already known to a person of ordinary skill in the art.
Disclosure of Invention
At least some embodiments of the present application provide an animation generation method, an animation generation device, a storage medium, and an electronic device, so as to at least solve the technical problems in the related art that producing muscle animation effects through manual sculpting and correction yields only a single kind of effect and has low production efficiency.
According to one embodiment of the present application, there is provided an animation generation method, including: creating a plurality of virtual curves according to a plurality of muscle areas of a virtual model; binding curve points of the plurality of virtual curves to at least some model vertices in the plurality of muscle areas by position to obtain a curve binding result; constructing curve controllers for the plurality of virtual curves based on the curve binding result and target skeleton binding data corresponding to the virtual model, wherein the curve controllers are used for controlling the curve points to change position following the virtual skeleton corresponding to the virtual model; and generating a muscle animation of the virtual model by using action parameters of the virtual skeleton and the curve controllers, wherein the muscle animation displays at least morphological changes of the plurality of muscle areas.
According to one embodiment of the present application, there is also provided an animation generating apparatus, including: a creation module, configured to create a plurality of virtual curves according to a plurality of muscle areas of a virtual model; a binding module, configured to bind curve points of the plurality of virtual curves to at least some model vertices in the plurality of muscle areas by position to obtain a curve binding result; a building module, configured to build curve controllers for the plurality of virtual curves based on the curve binding result and target skeleton binding data corresponding to the virtual model, wherein the curve controllers are used for controlling the curve points to change position following the virtual skeleton corresponding to the virtual model; and a generation module, configured to generate a muscle animation of the virtual model by using action parameters of the virtual skeleton and the curve controllers, wherein the muscle animation displays at least morphological changes of the plurality of muscle areas.
According to one embodiment of the present application, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to perform the animation generation method of any one of the above when run.
According to one embodiment of the present application, there is also provided an electronic device, including a memory in which a computer program is stored and a processor arranged to run the computer program to perform any one of the animation generation methods described above.
In at least some embodiments of the present application, a plurality of virtual curves are created according to a plurality of muscle areas of a virtual model, and curve points of the plurality of virtual curves are bound by position to at least some model vertices in the plurality of muscle areas to obtain a curve binding result. Curve controllers for the plurality of virtual curves are then constructed based on the curve binding result and target bone binding data corresponding to the virtual model, where the curve controllers control the curve points to change position following the virtual bones corresponding to the virtual model. A muscle animation of the virtual model is generated using action parameters of the virtual bones and the curve controllers, where the muscle animation displays at least morphological changes of the plurality of muscle areas. In this way, the purpose of realizing a dynamic muscle-cluster effect through virtual curves is achieved, the technical effects of enriching muscle animation effects and improving production efficiency are obtained, and the technical problems in the related art that producing muscle animation effects through manual sculpting and correction yields only a single kind of effect and has low production efficiency are solved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a block diagram of the hardware architecture of a mobile terminal for an animation generation method according to one embodiment of the present application;
FIG. 2 is a flow chart of an animation generation method according to one embodiment of the present application;
FIG. 3 is a schematic diagram of skeletal binding of an alternative virtual character model according to one embodiment of the present application;
FIG. 4 is a schematic illustration of an alternative virtual curve according to one embodiment of the present application;
FIG. 5 is a schematic illustration of an alternative curve binding result according to one embodiment of the present application;
FIG. 6 is a schematic diagram of an alternative curve controller according to one embodiment of the present application;
FIG. 7 is a schematic diagram of another alternative curve controller according to one embodiment of the present application;
FIG. 8 is a schematic illustration of an alternative muscle dynamic compression effect according to one embodiment of the present application;
FIG. 9 is a schematic diagram of an alternative muscle automated squeezing effect implementation process according to one embodiment of the present application;
FIG. 10 is a block diagram of an animation generation device according to one embodiment of the present application;
FIG. 11 is a block diagram of an alternative animation generation device, according to one embodiment of the present application;
FIG. 12 is a schematic diagram of an electronic device according to one embodiment of the present application.
Detailed Description
To enable those skilled in the art to better understand the solution of the present application, the following describes it in detail with reference to the accompanying drawings of the embodiments of the present application. It is apparent that the described embodiments are only some embodiments of the present application, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of protection of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In the description of the present application, the term "for example" is used to mean "serving as an example, illustration, or description". Any embodiment described herein as "for example" is not necessarily to be construed as preferred or advantageous over other embodiments. The following description is presented to enable any person skilled in the art to make and use the application. In the following description, details are set forth for purposes of explanation. It will be apparent to one of ordinary skill in the art that the present application may be practiced without these specific details. In other instances, well-known structures and processes have not been shown in detail to avoid obscuring the description of the present application with unnecessary detail. Thus, the present application is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
In describing embodiments of the present application, partial terms or expressions that appear are applicable to the following explanation.
Bone (Joint): a virtual skeleton for controlling the morphology and motion of the virtual model.
Bone binding (Rigging): the process of creating a skeleton, also known as a joint hierarchy. A rig typically includes bones, joints, and controls. Bone binding defines the bones within a mesh and their motion relationships to one another.
Binding (Binding): a technique for creating a driving relationship between virtual bones and the mesh points of a virtual model, so that the mesh points move with the movement of the bones. In human-modeling scenarios, binding connects muscles or other soft tissues with bones, so that when the bones move, the soft tissues follow, producing a more realistic effect.
Skinning (Skin): an animation technique that associates the mesh of a virtual model with one or more bones. Skinning makes the skeleton-driven model produce plausible motion by defining which portions of the model mesh move when a given bone is animated.
Skin Weight (Skin Weight): values that control how strongly the vertices of a virtual model are affected by different bones during animation. A single bone of the virtual model can control multiple vertices, and a single vertex can be controlled by multiple bones; in an application scenario, the control weights of the bones over the vertices are assigned by weight painting. In three-dimensional animation software, a common tool for configuring weights is a brush-type tool, so the process of configuring weights is also called painting weights.
Vertex skin weight (Vertex Skin Weight): the weight with which each bone (Bone) influences each vertex (Vertex) of the virtual model.
Muscle Dynamic effects (Dynamic): in muscle group simulation, muscle groups are made to exhibit dynamic effects such as bending, deformation, deflection, etc., by driving contraction and stretching of muscles.
Deformation: morphological changes that occur during the motion of a virtual model. In muscle group simulation, the actions performed by the virtual skeleton may contract and stretch the muscles, which in turn causes morphological changes as multiple muscle areas squeeze or pull on one another.
Controller (Controller): a virtual controller for controlling animation operations performed by the skeleton in the skeleton. In an application scenario, the controller is usually a geometric figure, and the pose and shape of the virtual skeleton are changed by performing adjustment such as rotation, translation, scaling and the like on the controller.
Muscle Elasticity (Elasticity): used to simulate the rebound of muscles after stretching and contracting under external force, so that muscle clusters exhibit a more realistic effect. Muscle elasticity is related to the material and density of the virtual model.
Real-time Calculation (Real-time Calculation): immediately processing and computing data acquired in real time in the application scene. In muscle group simulation, real-time calculation allows muscle groups to exhibit smoother and more realistic effects.
Muscle Dynamics (Muscle Dynamics) modeling: the virtual model is modeled according to the human muscle structure, the muscle strength and the muscle mechanics, so that the movement of the virtual model accords with the muscle control rule and the kinematics rule.
Bezier curve (Bezier Curve): a parametric curve whose shape is defined by control points; a mathematical curve model associated with the French engineer Pierre Bézier, who popularized it in the mid-20th century.
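For reference (not part of the original disclosure), the standard degree-n Bézier curve over control points P_0, ..., P_n can be written as:

$$B(t) = \sum_{i=0}^{n} \binom{n}{i} (1-t)^{n-i}\, t^{i}\, P_i, \qquad t \in [0, 1]$$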
In one possible implementation of the present application, the inventor carried out careful practice and study to solve the technical problems that, in the field of computer animation, the conventional sculpt-and-correct approach to setting up muscle animation effects produces only a single kind of dynamic muscle effect and has low production efficiency.
In addition, the related art also provides the following two ways to make the muscle dynamic effect:
first, muscle plug-in production based on three-dimensional animation software. For example, digital human muscle simulation is performed based on Ziva muscle plug-ins in Maya software, and the muscle dynamic effect of the virtual model is achieved. The Ziva muscle plug-in comprises a self-definable muscle model library, supports the functions of a Maya animation system, cloth simulation and the like, can quickly create, edit and adjust a muscle model, supports various output formats, and facilitates exporting a muscle simulation effect to other software for post-production and rendering.
However, the first approach has the following disadvantages: because the muscle plug-in provides very rich functions and customization options, it is difficult to learn and has a steep learning curve; the muscle plug-in is expensive and not suitable for individual users or small studios; and when the muscle plug-in is used for complex muscle simulation, the demands on computer configuration are high, requiring powerful hardware to ensure smooth operation and dynamic display.
Second, creating a musculoskeletal skinning setup. Based on the character's skeleton, multiple muscle tissues are added and each local muscle model body is skinned to the corresponding bones, making the character's motion smoother and more natural. In the creation process, the character's bones and muscles are first drawn and then bound to the character model; the muscles and bones act on the mesh of the character model surface, producing the motion effect of each local muscle tissue of the character.
However, the second approach has the following disadvantages: the skinning process requires connecting each bone and muscle to vertices on the character model, which is time-consuming and costly; the skinning effect is limited by the topology of the model, so the muscles and bones of some parts may not match completely, making the expected skinning effect hard to achieve; and the realism of the motion effect is determined by the design of the muscles and bones, so it suffers greatly when they are not designed well enough.
In this regard, the embodiments of the present application propose an animation production method with the following technical conception: a plurality of virtual curves are created according to a plurality of muscle areas of a virtual model; curve points of the plurality of virtual curves are bound to at least some model vertices in the plurality of muscle areas to obtain a curve binding result; curve controllers for the plurality of virtual curves are constructed based on the curve binding result and target skeleton binding data corresponding to the virtual model; and a muscle animation of the virtual model is then generated using the action parameters of the virtual skeleton and the curve controllers. This achieves the purpose of realizing a dynamic muscle-cluster effect through virtual curves, enriches the muscle animation effect, improves production efficiency, and thereby solves the above technical problems.
The above-described method embodiments referred to in the present application may be performed in a terminal device (e.g., a mobile terminal, a computer terminal, or similar computing device). Taking the mobile terminal as an example, the mobile terminal can be a terminal device such as a smart phone, a tablet computer, a palm computer, a mobile internet device, a game machine and the like.
Fig. 1 is a block diagram of the hardware structure of a mobile terminal for an animation generation method according to one embodiment of the present application. As shown in fig. 1, a mobile terminal may include one or more (only one shown in fig. 1) processors 102, memory 104, transmission devices 106, input-output devices 108, and display devices 110. Taking the case where the animation generation method is applied to an electronic game scene through the mobile terminal, the processor 102 invokes and runs the computer program stored in the memory 104 to execute the animation generation method, and the generated muscle animation of the virtual model in the electronic game scene is transmitted to the input-output device 108 and/or the display device 110 through the transmission device 106, so that the muscle animation of the virtual model is presented to the player.
As also shown in fig. 1, the processor 102 may include, but is not limited to: a central processor (Central Processing Unit, CPU), a graphics processor (Graphics Processing Unit, GPU), a digital signal processing (Digital Signal Processing, DSP) chip, a microprocessor (Microcontroller Unit, MCU), a programmable logic device (Field Programmable Gate Array, FPGA), a Neural network processor (Neural-Network Processing Unit, NPU), a tensor processor (Tensor Processing Unit, TPU), an artificial intelligence (Artificial Intelligence, AI) type processor, and the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely illustrative and not limiting of the structure of the mobile terminal described above. For example, the mobile terminal may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.
In some optional embodiments based on game scenes, the terminal device may further provide a human-machine interaction interface with a touch-sensitive surface, where the human-machine interaction interface may sense finger contacts and/or gestures to interact with a graphical user interface (Graphical User Interface, GUI), where the human-machine interaction functions may include the following interactions: executable instructions for performing the above-described human-machine interaction functions, such as creating web pages, drawing, word processing, making electronic documents, games, video conferencing, instant messaging, sending and receiving electronic mail, talking interfaces, playing digital video, playing digital music, and/or web browsing, are configured/stored in a computer program product or readable storage medium executable by one or more processors.
The above method embodiments related to the present application may also be executed in a server. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content distribution network (Content Delivery Network, CDN), basic cloud computing services such as big data and an artificial intelligent platform. Taking an example in which an animation generation method is applied to an electronic game scene through an electronic game server, the electronic game server may generate a muscle animation of a virtual model in the electronic game scene based on the animation generation method and provide the muscle animation of the virtual model to a player (for example, may be rendered for display on a display screen of a player terminal, or provided to the player through holographic projection, or the like).
According to one embodiment of the present application, an embodiment of an animation generation method is provided, it being noted that the steps shown in the flowchart of the figures may be performed in a computer system, such as a set of computer-executable instructions, and, although a logical order is shown in the flowchart, in some cases, the steps shown or described may be performed in an order other than that shown or described herein.
In this embodiment, there is provided an animation generation method operating on the mobile terminal, and fig. 2 is a flowchart of an animation generation method according to one embodiment of the present application, as shown in fig. 2, and the method includes the following steps:
step S21, creating a plurality of virtual curves according to a plurality of muscle areas of the virtual model.
The application scenario of the embodiment of the present application may include, but is not limited to: role modeling scenes, scene modeling scenes, special effect modeling scenes and the like in the fields of games and animations; building modeling scenes, civil engineering modeling scenes, mechanical function modeling scenes and the like in the engineering field; modeling scenes of industrial products in the manufacturing industry field, etc.; biological modeling scenes in the medical field, medical instrument design modeling scenes, and the like; virtual classroom modeling, virtual laboratory modeling scenarios in educational fields, and the like.
In particular, in the field of games and animations, the virtual model is a virtual character model, a virtual animal model, a virtual prop model, or the like in a virtual game scene. The game types corresponding to the virtual game scene may be: action classes (e.g., first or third person shooter games, two-or three-dimensional combat games, war action games, sports action games, etc.), adventure classes (e.g., adventure games, collection games, puzzle games, etc.), simulation classes (e.g., simulated sand table games, simulated foster games, strategy simulation games, city building simulation games, business simulation games, etc.), role playing classes and leisure classes (e.g., chess and card game games, recreation game games, music rhythm games, trade foster games, etc.), etc.
The plurality of muscle areas of the virtual model are areas divided in advance according to the model structure during production of the virtual model. The division granularity of the muscle areas can be determined flexibly according to the application scene. For example, when a virtual human model is produced, the virtual human model is divided into a plurality of muscle areas according to human anatomy, with each muscle area corresponding to at least one muscle. For example, each muscle is divided into one muscle area, or the plurality of muscles of each body part (e.g., head, thigh, forearm, back, etc.) are divided into one muscle area.
In an application scenario, at least one virtual curve is created for each of the plurality of muscle areas. To simplify model animation, one virtual curve may be created per muscle area. The plurality of virtual curves are used for curve binding the plurality of muscle areas of the virtual model. Curve binding refers to a technique that realizes dynamic simulation of muscle clusters by binding curves to the model, so that the muscle clusters can exhibit more realistic movement effects.
And S22, performing position binding on curve points of the virtual curves and at least part of model vertexes in the muscle areas to obtain a curve binding result.
In the application scene, a Wire frame (Wire) binding technology is adopted to bind curve points of each virtual curve in a plurality of virtual curves with model vertexes in corresponding areas in a plurality of muscle areas, and model data after the curve points are bound with the model vertexes is used as a curve binding result. Through a wire frame binding technology, a connection is established between the virtual curve and the virtual model surface, so that the virtual model surface is driven and controlled by the virtual curve, and deformation is generated along with the movement of curve points.
It should be noted that, in the binding process in step S22, the curve point of each virtual curve may be one or more specific curve points on the virtual curve, where the specific curve points are used to determine the length, shape, trend, and the like of the virtual curve.
Step S23, based on the curve binding result and the target skeleton binding data corresponding to the virtual model, constructing curve controllers of a plurality of virtual curves, wherein the curve controllers are used for controlling the curve points to follow the virtual skeleton corresponding to the virtual model to change in position.
The target skeleton binding data corresponding to the virtual model is used for determining a binding relationship between the virtual model and the corresponding virtual skeleton. A curve controller is constructed for each virtual curve of the plurality of virtual curves based on a drive control relationship (characterized by curve binding results) between the virtual model surface and the virtual curve and a motion association relationship (characterized by target bone binding data) between the virtual model surface and the virtual bone, thereby constructing a position constraint relationship between the virtual curve and the virtual bone.
The content of the target bone binding data record includes at least one of: bone hierarchy data for determining parent-child connection relationships between virtual bones; bone position data; bone direction data; weight data for determining binding weights between model vertices and virtual bones; animation data for determining animation information (e.g., key frame position, rotation, scaling, etc. attribute information) of a virtual skeleton; binding gesture data for determining an initial gesture and a transition gesture of the virtual model, wherein the transition gesture is a specific gesture in the process of transforming the virtual model from the initial gesture to the target gesture; bone constraint data for determining constraint relationships (e.g., rotation limits, reverse dynamics constraints, forward dynamics constraints, etc.) between virtual bones; and the basic attribute data is used for determining basic attributes such as identification, name, size, color and the like of the virtual skeleton.
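As a rough illustration only (the field names and values below are assumptions, not the patent's schema), the target bone binding data described above could be organized along these lines:

```python
# Illustrative shape of the "target bone binding data"; every key and value
# here is a hypothetical example, not taken from the patent.
target_bone_binding_data = {
    "hierarchy":  {"l_elbow_jnt": "l_shoulder_jnt"},          # child -> parent
    "positions":  {"l_elbow_jnt": (2.0, 14.0, 0.0)},
    "directions": {"l_elbow_jnt": (1.0, 0.0, 0.0)},
    "weights":    {"body_mesh.vtx[1250]": {"l_shoulder_jnt": 0.7, "l_elbow_jnt": 0.3}},
    "animation":  {"l_elbow_jnt": {"frame_10": {"rotateZ": 30.0}}},
    "bind_pose":  "T-pose",
    "constraints": {"l_elbow_jnt": {"rotate_limit_z": (0.0, 150.0)}},
    "attributes": {"l_elbow_jnt": {"id": 12, "color": "blue", "radius": 0.5}},
}
```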
And step S24, generating a muscle animation of the virtual model by using the action parameters of the virtual skeleton and the curve controller, wherein the muscle animation at least displays morphological changes of a plurality of muscle areas.
The action parameters of the virtual skeleton are taken as the input of the curve controllers; the curve controllers control the position offsets of the curve points of the plurality of virtual curves, which in turn automatically drives the corresponding offsets of the model vertices in the plurality of muscle areas on the virtual model surface, realizing the dynamic muscle effect of the virtual model and generating the muscle animation that contains this effect. The dynamic muscle effect includes at least morphological changes of the plurality of muscle areas, for example, deformation caused by squeezing between the plurality of muscle areas of the virtual model.
In particular, the above method may further include other method steps, which are described further below with reference to embodiments of the present application and are not described herein.
It is easy to see that the animation generation method provided by the embodiments of the present application simulates the animation effect of muscle clusters by computing the squeezing amplitude between the multiple muscle areas of the virtual model via virtual curves. It simulates a vivid muscle effect with straightforward logic and without relying on plug-ins, supports automatic calculation and real-time display of the muscle animation, and thus overcomes the shortcomings of prior-art approaches to producing dynamic muscle effects; compared with the prior art, the method reduces cost and improves efficiency.
In the embodiments of the present application, a plurality of virtual curves are created according to a plurality of muscle areas of a virtual model, and curve points of the plurality of virtual curves are bound by position to at least some model vertices in the plurality of muscle areas to obtain a curve binding result. Curve controllers for the plurality of virtual curves are then constructed based on the curve binding result and target skeleton binding data corresponding to the virtual model, where the curve controllers control the curve points to change position following the virtual skeletons corresponding to the virtual model, and a muscle animation of the virtual model is generated using the action parameters of the virtual skeletons and the curve controllers, where the muscle animation displays at least morphological changes of the plurality of muscle areas. In this way, the purpose of realizing a dynamic muscle-cluster effect through virtual curves is achieved, the technical effects of enriching muscle animation effects and improving production efficiency are obtained, and the technical problems in the related art that producing muscle animation effects through manual sculpting and correction yields only a single kind of effect and has low production efficiency are solved.
The above-described methods of embodiments of the present application are further described below.
Optionally, in step S21, creating a plurality of virtual curves according to a plurality of muscle areas of the virtual model may include the following steps:
step S211, creating a plurality of virtual curves according to the distribution information of the plurality of muscle areas on the virtual model, wherein the curve types of the plurality of virtual curves are at least one of the following: control vertex curves, edit point curves, and bezier curves.
In the application scenario, the distribution information of the plurality of muscle areas on the virtual model at least includes: the location of each muscle region on the virtual model surface, the number of model vertices within each muscle region. And determining the number, type, position, vertex number and the like of the virtual curves to be created according to the distribution information, and further creating the plurality of virtual curves.
It should be noted that, according to the application scene requirement and the attributes of the multiple muscle areas, virtual curves of the same category or different types may be created for the multiple muscle areas. The curve types of the plurality of virtual curves include, but are not limited to: control Vertex (CV) curves, edit Point (EP) curves, and Bezier (Bezier) curves.
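As a minimal sketch of this step (Maya Python, maya.cmds; the sampled positions and names are illustrative assumptions, not values from the patent), one CV curve could be created along a muscle area as follows:

```python
import maya.cmds as cmds

# Sampled positions running along the muscle belly (illustrative values).
muscle_points = [(0.0, 10.0, 0.0), (0.5, 10.8, 0.2), (1.0, 11.5, 0.3), (1.5, 12.0, 0.2)]

# Create one CV (control-vertex) curve for this muscle area.
biceps_curve = cmds.curve(point=muscle_points, degree=3, name="biceps_muscle_curve")
```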
Optionally, the animation generation method may further include the following steps:
step S251, creating a virtual skeleton based on the pre-imported virtual model;
step S252, carrying out weighted binding on model vertexes of the virtual model and the virtual skeleton by using the first weight to obtain initial skeleton binding data;
step S253, according to a preset model action optimization index, the weight adjustment is carried out on the initial bone binding data, and target bone binding data are obtained.
In an application scenario for producing a muscle animation of a virtual character model, a skeleton binding diagram of the virtual character model is shown in fig. 3. A pre-made virtual character model is obtained and imported into animation software (e.g., maya), and a virtual skeleton system is created based on the virtual character model, the virtual skeleton system including at least a plurality of virtual skeletons. Further, binding a plurality of virtual bones to model vertexes of the virtual character model, and giving a preset first weight to each model vertex during binding, so that the model vertexes of the virtual character model are driven by the virtual bones according to the first weights to obtain initial bone binding data, wherein the initial bone binding data can comprise model data and model bone data after weighted binding. As shown in FIG. 3, each muscle region of the virtual character model corresponds to a bone.
It should be noted that, before weight binding is performed on each model vertex, the skeleton constraint may be applied to a controller corresponding to the virtual skeleton, so that the preset skeleton constraint is associated with the skeleton system of the virtual model, and further a function of performing different skeleton adjustment according to the effect requirement can be supported. After the weight is bound, binding data adjustment can be performed according to scene requirements, so that the actions of the virtual character model are more natural and smooth, wherein the scene requirements can be represented by preset model action optimization indexes. Further, the adjusted bone binding data is output into a format (such as FBX) which can be used for animation, and target bone binding data is obtained.
The model action optimization index may be determined by visual performance parameters of the virtual character model, for example, in an application scenario, the visual performance parameters include: deformation degree parameters are measured by the size of displacement vectors of the vertexes of each model on the surface of the model; the conformality parameter is used for representing the degree of maintaining the original shape of the model in the deformation process; the smoothness parameter is used for describing the smoothness of the deformed model surface; the volume conservation parameter is used for describing the change degree of the deformed model volume; detail keeping parameters used for describing the degree of keeping original detail characteristics of the model in the deformation process; and the grid quality parameter is used for describing the change degree of the grid quality of the model in the deformation process, wherein the grid quality comprises the rule degree of the grid shape, the uniformity of grid division and the like. The parameter index value of the visual performance parameter can be preset in the application scene according to the scene requirement.
Through the above steps, basic skeleton binding is performed on the virtual character model, and the skeleton system of the virtual character model is bound to the model shape, so that the virtual character model moves and deforms naturally and smoothly in the animation, while also making it easier to animate the virtual character model in subsequent stages.
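A minimal sketch of this basic binding in Maya Python is given below; the joint positions, names, and settings are illustrative assumptions only, and weight refinement (step S253) would still be done afterwards, e.g. by weight painting:

```python
import maya.cmds as cmds

# Create a small joint chain for the left arm (positions are illustrative).
cmds.select(clear=True)
shoulder = cmds.joint(position=(0, 14, 0), name="l_shoulder_jnt")
elbow = cmds.joint(position=(2, 14, 0), name="l_elbow_jnt")
wrist = cmds.joint(position=(4, 14, 0), name="l_wrist_jnt")

# Bind the character mesh to the joint chain; Maya assigns initial weights,
# which are then adjusted according to the model action optimization indexes.
skin = cmds.skinCluster(shoulder, elbow, wrist, "body_mesh", toSelectedBones=True,
                        maximumInfluences=3, name="body_skinCluster")[0]
```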
Optionally, in step S22, performing position binding on curve points of the plurality of virtual curves and at least part of model vertices in the plurality of muscle areas to obtain a curve binding result, which may include the following steps:
step S221, selecting at least partial model vertexes corresponding to a plurality of muscle areas on the virtual model;
step S222, according to preset binding attribute information, position binding is carried out on curve points of a plurality of virtual curves and at least part of model vertexes, and a curve binding result is obtained.
Still taking the application scenario of producing a muscle animation of a virtual character model as an example, after a virtual curve is created for each muscle area, at least some of the model vertices in each muscle area are selected as the vertices to be bound to that area's virtual curve.
It should be noted that the operation of selecting at least some model vertices can be implemented with the wire-frame (Wire) technique, which connects a curve to a model area: a relationship is established between the curve and the model surface so that the model surface can be controlled by the curve points and deforms following the curve. The wire-frame tool provides a selection function and an attribute-setting function: the selection function is used to select the virtual curve and the muscle area to be associated, the attribute-setting function is then used to set the binding attributes, and the wire-frame tool is automatically triggered to bind the curve points of the virtual curve by position to at least some of the model vertices of the muscle area, obtaining the curve binding result.
In the above application scenario, as shown in fig. 4, a left shoulder muscle area of the virtual character model is selected as a muscle area to be bound by a wire frame tool, a virtual curve corresponding to the selected left shoulder muscle area is a virtual curve to be bound, and binding of at least part of model vertices in the left shoulder muscle area (all model vertices in the selected muscle area are model vertices to be bound in this example) and the virtual curve is triggered according to binding attribute information.
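A minimal sketch of this wire binding step in Maya Python follows (mesh name, curve name, and vertex range are illustrative assumptions, not from the patent):

```python
import maya.cmds as cmds

# Vertices of the left-shoulder muscle area to be driven by the curve.
shoulder_verts = "body_mesh.vtx[1200:1299]"

# Create a wire deformer so the selected vertices follow the curve's points.
wire_node = cmds.wire(shoulder_verts, wire="shoulder_muscle_curve",
                      dropoffDistance=[(0, 5.0)], name="shoulder_wire")[0]
```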
Optionally, in step S222, according to the binding attribute information, the curve points and at least part of the model vertices are bound to obtain a curve binding result, which may include the following steps:
step S2221, determining a binding mode and a second weight according to the binding attribute information;
step S2222, using the second weight to bind the curve points and at least part of the model vertices according to the binding mode, so as to obtain the curve binding result.
The binding mode may be used to determine the number correspondence between curve points and model vertices, for example, by determining 10 curve points on a virtual curve through the binding mode to control 100 model vertices in a muscle area, each curve point controlling 10 model vertices with consecutive numbers. The second weight is used for determining the binding strength between the model vertex and the virtual curve, and the higher the binding strength is, the larger the following offset amplitude of the model vertex to the curve point of the virtual curve is. In the application scene, after the binding mode and the second weight are determined according to the binding attribute information, the weighted position binding between the curve point and at least part of the model vertexes is automatically triggered, and a curve binding result is obtained.
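Restating the numeric correspondence mentioned above as a tiny sketch (purely illustrative; the 10-curve-point / 100-vertex split is the example from the text):

```python
# Each of 10 curve points controls 10 consecutively numbered model vertices.
binding_map = {cv_index: list(range(cv_index * 10, cv_index * 10 + 10))
               for cv_index in range(10)}
# e.g. curve point 3 controls model vertices 30..39
```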
In addition, in an application scenario the binding mode can specify the particular way in which curve points are bound by position to model vertices. The binding mode determines at least one of the following ways of binding curve points to model vertices: unidirectional binding, bidirectional binding, delayed binding, automatic binding, custom binding, implicit binding, and explicit binding. For example, the binding mode determined from the binding attribute information may be: a first part of the curve points are bound by position to their corresponding model vertices in a unidirectional manner, and a second part of the curve points are bound by position to their corresponding model vertices in a bidirectional manner, where the first part of the curve points unidirectionally control the positions of the model vertices (that is, the positions of these curve points are not affected by the positions of the model vertices), and the second part of the curve points have a bidirectional position-constraint relationship with the model vertices. In this way, the binding mode allows the virtual curve's control authority over the model vertices to be controlled more flexibly.
Still taking an application scenario of producing a muscle animation of a virtual character model as an example, after a left shoulder muscle area of the virtual character model and a corresponding virtual curve are subjected to curve binding, the obtained curve binding result is shown in fig. 5, and after an association relationship is established between the virtual curve and a model vertex in the left shoulder muscle area, a curve point of the virtual curve generates position deviation to drive a model surface of the virtual character model to deform. As shown in fig. 5, the curve points of the virtual curve are adjusted to the upper right of the virtual character model, and accordingly, the surface of the left shoulder muscle region of the virtual character model is deformed to the upper right.
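A minimal sketch of assigning the second weight (binding strength) on the wire deformer in Maya Python is shown below; the node name, vertex ranges, and values are illustrative assumptions:

```python
import maya.cmds as cmds

# Stronger binding near the muscle belly, weaker toward the edges of the area.
cmds.percent("shoulder_wire", "body_mesh.vtx[1240:1259]", value=1.0)
cmds.percent("shoulder_wire", "body_mesh.vtx[1200:1239]", value=0.4)
cmds.percent("shoulder_wire", "body_mesh.vtx[1260:1299]", value=0.4)
```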
Optionally, in step S23, constructing a curve controller of a plurality of virtual curves based on the curve binding result and the target bone binding data corresponding to the virtual model may include the following steps:
step S231, determining a numerical calculation relation of position change of curve points along with the virtual skeleton based on the curve binding result and the target skeleton binding data;
step S232, constructing a curve controller of a plurality of virtual curves by utilizing the numerical calculation relation.
Still taking an application scenario of producing a muscle animation of a virtual character model as an example, determining a position association relationship between a curve point and a virtual skeleton according to a curve binding result determined by a plurality of virtual curves and at least partial model vertexes of a plurality of muscle areas on the surface of the virtual model and target skeleton binding data obtained by binding the virtual model and the virtual skeleton.
It should be noted that, in an application scenario, a muscle dynamics rule is generally considered, and a numerical calculation relationship that a curve point follows a virtual skeleton to change a position is determined by combining a curve binding result and target skeleton binding data, so that a curve controller constructed according to the numerical calculation relationship can control a virtual curve in a manner more conforming to a reality rule, and a muscle dynamic effect is more natural and smooth.
For example, in the application scenario, curve controllers are added to a plurality of virtual curves corresponding to a plurality of muscle areas on the left arm of the virtual character model, as shown in fig. 6, at least one curve controller may be added to each virtual curve, and each curve controller is used to control the offset motion of at least one curve point. The curve controller shown in fig. 6 is an arc controller for controlling the corresponding model vertices to be shifted along the arc according to the calculation result, for example, controlling a certain curve point to be shifted along the arc by 2 units of length in a predetermined positive direction.
As also shown in fig. 6, there may be curve points on each virtual curve that do not correspond to the curve controller, and these curve points may be adaptively shifted following the curve points that are shifted by the curve controller to enable the morphological changes of the virtual model surface to be smoothed.
For another example, a curve controller may be added to a plurality of curve points, as shown in fig. 7, and two curve points may be controlled by using the curve controller (as shown in the block of fig. 7), where the curve controller is displayed on the midpoint of the line connecting the two curve points. When the input of the curve controller changes, the curve controller simultaneously controls two curve points to perform offset motion.
In the environment of three-dimensional animation software (such as Maya), a curve controller is added to each of the plurality of virtual curves to control each curve point of the virtual curve, and the curve points can control a plurality of muscle areas of the virtual model, that is, the effect of controlling morphological changes of the plurality of muscle areas on the surface of the virtual model can be finally achieved through the curve controller. Further, constraint relations are added between curve controllers corresponding to the muscle areas and the skeleton system of the virtual model, so that actions of the skeleton system of the virtual model can synchronously drive the curve controllers.
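As a minimal sketch of adding such a constraint relationship in Maya Python (controller shape, names, and values are illustrative assumptions):

```python
import maya.cmds as cmds

# A simple NURBS-circle controller for one muscle curve.
ctrl = cmds.circle(name="biceps_curve_ctrl", normal=(1, 0, 0), radius=0.5)[0]

# Constrain the controller to the skeleton so bone motion drives it synchronously.
cmds.parentConstraint("l_shoulder_jnt", ctrl, maintainOffset=True)
```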
In the environment of three-dimensional animation software (such as Maya), according to the curve binding result and the target skeleton binding data, a numerical rule between the offset of the curve point and the offset of the virtual skeleton can be determined, and further a numerical calculation relation capable of fitting the numerical rule is determined, for example, in the Maya environment, a plurality of calculation nodes are built to reflect the position change numerical calculation relation of the curve point following the virtual skeleton, and according to the requirements of an application scene, the plurality of calculation nodes include but are not limited to: multiply-divide calculation node, add-subtract calculation node, absolute value node, mean value node, interpolation node, mapping node, random number node, etc.
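A minimal sketch of such a node network in Maya Python follows; the multiplier n, offset m, and all node, joint, and controller names are illustrative assumptions rather than values prescribed by the patent:

```python
import maya.cmds as cmds

n, m = 0.2, 0.0  # multiplier and offset from the fitted numerical relation (assumed values)

mult = cmds.createNode("multiplyDivide", name="forearm_muscle_mult")
add = cmds.createNode("plusMinusAverage", name="forearm_muscle_add")

cmds.setAttr(mult + ".input2X", n)
cmds.setAttr(add + ".input1D[1]", m)

# forearm joint rotation -> * n -> + m -> curve-controller translation
cmds.connectAttr("l_elbow_jnt.rotateZ", mult + ".input1X")
cmds.connectAttr(mult + ".outputX", add + ".input1D[0]")
cmds.connectAttr(add + ".output1D", "forearm_muscle_ctrl.translateY")
```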
Optionally, in step S24, generating a muscle animation of the virtual model using the motion parameters of the virtual skeleton and the curve controller may include the following steps:
step S241, collecting action parameters of the virtual skeleton when executing the target action, wherein the action parameters include at least one of the following: translation amount, rotation amount, deformation amount, scaling amount;
step S242, calculating the action parameters as the input of the curve controller to obtain the following change data of the curve points;
and step S243, controlling at least part of model vertexes of the virtual model to move in vertex positions according to the following change data, and generating the muscle animation of the virtual model.
After constraint relations are added between curve controllers corresponding to the muscle areas and a skeleton system of the virtual model, action parameters of the virtual skeleton can be used as input of the corresponding curve controllers, and curve points of the corresponding virtual curves are driven through the curve controllers.
Still taking the application scenario of producing a muscle animation of a virtual character model as an example, when the virtual character model performs a target action (in this example, folding an arm), the action parameters of the virtual skeleton corresponding to the arm area include: the position translation of the virtual skeleton, the rotation angle of the upper arm, the rotation angle of the forearm, the offsets of the model vertices of the upper-arm surface (representing the deformation amount of the upper arm), the offsets of the model vertices of the forearm surface (representing the deformation amount of the forearm), and so on. In other application scenarios, scaling changes of the virtual skeleton may also occur when other target actions are performed.
In the environment of three-dimensional animation software (such as Maya), the motion parameters are used as the input of a curve controller, and the following change data of the curve points of the virtual curve controlled by the curve controller is output through the calculation of a plurality of calculation nodes corresponding to the curve controller, wherein the following change data at least comprises the offset of each curve point of the virtual curve.
Further, at least some of the model vertices of the virtual model are controlled to move according to the following-change data, generating the muscle animation of the virtual model. Visually, as the virtual character model performs a target action, the motion of the virtual skeleton causes squeezing or stretching between the muscle areas in the muscle clusters, and as the target action proceeds, the surface morphology of the virtual character model changes in real time. For example, as shown in fig. 8, when the arm is folded, the inner side of the upper arm and the inner side of the forearm press against each other, causing the model surface near the inner side of the elbow to bulge and deform (possibly accompanied by shaking), while the outer side of the upper arm and the inner side of the forearm pull and stretch away from each other, causing the model surface near the outer side of the elbow to dent inward.
For example, a curve controller is added for the forearm skeleton of the virtual character model: a curve controller of the virtual curve corresponding to the forearm muscle area is constructed according to the corresponding curve binding result and the target skeleton binding data. The rotation value of the forearm skeleton is taken as the input variable, multiplied by a factor n through a multiply-divide calculation node, and an offset value m is added through an add-subtract calculation node to obtain the calculation result, which is taken as the output value of the curve controller. The output value controls the movement offset and rotation offset of the vertices of the forearm surface model, and thereby controls the change in surface morphology of the forearm model. The factor n of the multiply-divide node and the offset value m of the add-subtract node are determined by the numerical calculation relationship derived from the curve binding result and the target bone binding data.
Specifically, suppose the offset value m for the forearm skeleton of the virtual character model is determined to be 0. When the rotation value of the forearm is 0, the movement or rotation value of the muscle curve controller is 0, and there is no squeezing effect on the arm surface; when the forearm rotation value is 30°, the movement or rotation value of the muscle curve controller is 30n, so if n is determined to be 0.2, the movement or rotation value of the muscle curve controller is 6, and the movement effect of the muscle area is 6 units of movement or rotation.
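A trivial restatement of the worked numbers above in Python (values taken directly from the example: m = 0, n = 0.2):

```python
def controller_output(rotation_deg, n=0.2, m=0.0):
    # output value of the muscle curve controller = n * bone rotation + m
    return n * rotation_deg + m

print(controller_output(0))   # 0.0 -> no squeezing on the arm surface
print(controller_output(30))  # 6.0 -> 6 units of movement/rotation of the muscle curve controller
```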
It is easy to understand that in the above technical solution provided in the embodiments of the present application, two sets of subdivision binding systems are involved in the process of producing a dynamic muscle effect, one set is a direct binding effect between a muscle curve and a model point, and the other set is a binding effect between a muscle curve and a bone created for the muscle curve, i.e. a curve bone controller controls a curve bone, the curve bone further controls the muscle curve, and the muscle curve controls a model region point.
In the above-described technical solution, when creating the virtual curve, the muscle form of the virtual character should be considered, and the form of the created virtual curve matches the muscle form.
The animation generation method provided by the embodiments of the present application can be applied to a scene in which the squeezing effect of muscle clusters is automatically calculated for a game character model in an electronic game. The application flow is shown in fig. 9: the basic binding flow of the character skeleton and the binding flow of the muscle curves are combined into the implementation flow of the automatic muscle-squeezing effect. In the basic binding flow of the character skeleton, body binding is performed first (mainly binding the limbs and trunk of the game character model to the corresponding skeleton), then a controller system is built for the character model, where the controllers are model-vertex controllers, and skin weights are then assigned to each model vertex. In the muscle-curve binding flow, curves are created according to the muscle-direction partitions of the game character model, then muscle-curve skinning and controllers are built, where the controllers are curve controllers, and an algorithm node cluster is further built in the Maya environment based on the curve controllers; the algorithm node cluster controls the interaction between the curves and each local muscle model, so that the curves control the effect of the model's muscle groups. Real-time calculation is performed in the electronic game scene: the squeezing effect of the muscle clusters is automatically calculated from the limb movements of the game character model and displayed in real time. Integrating the animation generation method into the game back end allows the dynamic muscle effect of the game character model to be generated and displayed in real time according to the actions the player controls the game character model to perform, with good visual performance.
According to the method provided by the embodiment of the application, for a plurality of muscle areas of the virtual model, the stretching forms of the curves at the bone points and on the muscle model body are automatically calculated through the curve binding technology, and the extrusion deformation effect and the dynamic shaking effect of the muscle clusters are displayed in real time. This overcomes the shortcomings of prior-art muscle animation production methods, reduces the production difficulty and cost of muscle animation, and improves the richness of the dynamic effects in the muscle animation.
In summary, the above technical solution provided in the embodiments of the present application can achieve the following beneficial effects: the muscle extrusion amplitude of each region can be simulated, achieving a more natural and vivid muscle effect than the traditional production method; by automatically calculating the muscle extrusion amplitude of each region, manual frame-by-frame operation is avoided, labor cost is reduced, and production efficiency is improved; the muscle effect can be displayed in real time, so that technicians can observe muscle changes in real time and make modifications and adjustments accordingly, without reworking the muscle animation during adjustment; the method can be applied to various types of animation production (such as traditional animation, virtual reality, and game production) and has strong expandability; and customization according to scene needs is supported, such as adjusting parameters, modifying the muscle distribution, and changing the calculation nodes, which offers high flexibility and better meets actual needs.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general-purpose hardware platform, or by means of hardware, but in many cases the former is the preferred embodiment. Based on such understanding, the technical solution of the present application, in essence or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., a magnetic disk or an optical disc) and including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the method described in the embodiments of the present application.
The present embodiment also provides an animation generation device, which is used to implement the foregoing embodiments and preferred implementations; what has already been described will not be repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Fig. 10 is a block diagram of an animation generation device according to one embodiment of the present application. As shown in fig. 10, the device includes: a creation module 1001, configured to create a plurality of virtual curves according to a plurality of muscle areas of the virtual model; a binding module 1002, configured to perform position binding on the curve points of the plurality of virtual curves and at least some model vertices in the plurality of muscle areas to obtain a curve binding result; a building module 1003, configured to construct curve controllers of the plurality of virtual curves based on the curve binding result and target bone binding data corresponding to the virtual model, where the curve controllers are configured to control the curve points to change position following the virtual bones corresponding to the virtual model; and a generation module 1004, configured to generate a muscle animation of the virtual model by using the motion parameters of the virtual bones and the curve controllers, where the muscle animation displays at least morphological changes of the plurality of muscle areas.
Optionally, the creation module 1001 is further configured to: create a plurality of virtual curves according to the distribution information of the plurality of muscle areas on the virtual model, where the curve type of the plurality of virtual curves is at least one of the following: control vertex curves, edit point curves, and bezier curves.
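As a concrete illustration of the creation module, the snippet below creates one control vertex curve per muscle region in Maya. The per-region control points are placeholder values; in practice they would follow the distribution information of the muscle areas on the virtual model, and edit point or bezier curves could be created instead.

    import maya.cmds as cmds

    # assumed control points laid along the muscle trend of each region
    MUSCLE_REGION_POINTS = {
        'bicep_l':   [(0.0, 10.0, 1.0), (0.5, 10.5, 1.4), (1.0, 11.0, 1.2), (1.5, 11.5, 1.0)],
        'forearm_l': [(2.0, 9.0, 1.0), (2.5, 9.3, 1.2), (3.0, 9.6, 1.1), (3.5, 10.0, 1.0)],
    }

    def create_muscle_curves():
        curves = {}
        for region, points in MUSCLE_REGION_POINTS.items():
            # one degree-3 control vertex curve per muscle region
            curves[region] = cmds.curve(point=points, degree=3,
                                        name=region + '_muscle_crv')
        return curves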
Optionally, fig. 11 is a block diagram of an alternative animation generation device according to an embodiment of the present application. As shown in fig. 11, in addition to all the modules shown in fig. 10, the device includes: an adjustment module 1005, configured to create a virtual skeleton based on the pre-imported virtual model; perform weighted binding on the model vertices of the virtual model and the virtual skeleton by using a first weight to obtain initial bone binding data; and perform weight adjustment on the initial bone binding data according to a preset model action optimization index to obtain the target bone binding data.
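A minimal sketch of what the adjustment module does is shown below, assuming a Maya scene: joints are created for the imported model, the model vertices are skin-bound to them with an initial weight configuration, and individual weights are then adjusted. The joint positions, mesh name, vertex index, and adjusted weight values are illustrative assumptions only.

    import maya.cmds as cmds

    def build_and_adjust_skeleton(mesh='body_mesh'):
        # create a simple virtual skeleton for the pre-imported virtual model
        shoulder = cmds.joint(position=(0.0, 14.0, 0.0), name='shoulder_joint')
        elbow = cmds.joint(position=(2.0, 12.0, 0.0), name='elbow_joint')

        # initial bone binding data: weighted binding of model vertices and bones
        skin = cmds.skinCluster([shoulder, elbow], mesh,
                                toSelectedBones=True, maximumInfluences=4)[0]

        # weight adjustment towards the model action optimization target,
        # shown here for a single vertex with hand-picked values
        cmds.skinPercent(skin, mesh + '.vtx[120]',
                         transformValue=[(shoulder, 0.6), (elbow, 0.4)])
        return skin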
Optionally, the binding module 1002 is further configured to: select at least some model vertices corresponding to the plurality of muscle areas on the virtual model; and perform position binding on the curve points of the plurality of virtual curves and the at least some model vertices according to preset binding attribute information to obtain the curve binding result.
Optionally, the binding module 1002 is further configured to: determine a binding mode and a second weight according to the binding attribute information; and perform weighted position binding on the curve points and the at least some model vertices in the binding mode by using the second weight to obtain the curve binding result.
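The weighted position binding can be understood through the small numeric sketch below: each bound model vertex follows the displacement of its curve point, scaled by the second weight. The binding mode shown here is a simple offset-follow mode assumed for illustration; other binding modes determined from the binding attribute information would change how the displacement is applied.

    def apply_weighted_binding(vertex_positions, curve_point_offsets, second_weight=0.8):
        """vertex_positions: {vertex_id: (x, y, z)}; curve_point_offsets: {vertex_id: (dx, dy, dz)}."""
        bound = {}
        for vid, (x, y, z) in vertex_positions.items():
            dx, dy, dz = curve_point_offsets.get(vid, (0.0, 0.0, 0.0))
            # weighted position binding: the vertex follows its curve point by the second weight
            bound[vid] = (x + second_weight * dx,
                          y + second_weight * dy,
                          z + second_weight * dz)
        return bound

    print(apply_weighted_binding({7: (1.0, 2.0, 0.5)}, {7: (0.0, 6.0, 0.0)}))
    # {7: (1.0, 6.8, 0.5)} with the default second weight of 0.8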
Optionally, the building module 1003 is further configured to: determine, based on the curve binding result and the target bone binding data, a numerical calculation relationship by which the curve points change position following the virtual bones; and construct the curve controllers of the plurality of virtual curves by using the numerical calculation relationship.
Optionally, the generation module 1004 is further configured to: collect the action parameters of the virtual skeleton when the virtual skeleton executes a target action, where the action parameters include at least one of the following: translation amount, rotation amount, deformation amount, and scaling amount; calculate the action parameters as input of the curve controllers to obtain follow-up change data of the curve points; and control at least some model vertices of the virtual model to move in vertex position according to the follow-up change data, so as to generate the muscle animation of the virtual model.
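To make the last step of the generation module concrete, the sketch below takes already-computed follow-up change data (a per-curve-point offset for the current frame) and moves the bound model vertices in a Maya scene accordingly. The mesh name, vertex indices, and per-vertex weights are assumptions for illustration; in the embodiment these come from the curve binding result.

    import maya.cmds as cmds

    def move_bound_vertices(follow_change, mesh='body_mesh', bound_vertices=None):
        """follow_change: (dx, dy, dz) offset of the curve point for this frame."""
        if bound_vertices is None:
            # assumed vertex-id -> binding-weight map taken from the curve binding result
            bound_vertices = {120: 1.0, 121: 0.7, 122: 0.4}
        dx, dy, dz = follow_change
        for vertex_id, weight in bound_vertices.items():
            # move each bound vertex by its binding weight times the curve offset
            cmds.move(weight * dx, weight * dy, weight * dz,
                      '{}.vtx[{}]'.format(mesh, vertex_id), relative=True)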
It should be noted that each of the above modules may be implemented by software or hardware. In the latter case, the modules may be implemented in, but not limited to, the following manner: the above modules are all located in the same processor; or the above modules are located, in any combination, in different processors.
Embodiments of the present application also provide a computer readable storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above computer-readable storage medium may include, but is not limited to: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disc, or any other medium that can store a computer program.
Alternatively, in this embodiment, the above-mentioned computer-readable storage medium may be located in any one of the computer terminals in the computer terminal group in the computer network, or in any one of the mobile terminals in the mobile terminal group.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may be configured to store a computer program for performing the steps of:
S1, creating a plurality of virtual curves according to a plurality of muscle areas of a virtual model;
S2, performing position binding on curve points of the plurality of virtual curves and at least some model vertices in the plurality of muscle areas to obtain a curve binding result;
S3, constructing curve controllers of the plurality of virtual curves based on the curve binding result and target skeleton binding data corresponding to the virtual model, wherein the curve controllers are used for controlling the curve points to change position following the virtual skeleton corresponding to the virtual model;
S4, generating a muscle animation of the virtual model by using the action parameters of the virtual skeleton and the curve controllers, wherein the muscle animation displays at least morphological changes of the plurality of muscle areas.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: creating a plurality of virtual curves according to the distribution information of the plurality of muscle areas on the virtual model, wherein the curve types of the plurality of virtual curves are at least one of the following: control vertex curves, edit point curves, and bezier curves.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: creating a virtual skeleton based on the pre-imported virtual model; weighting and binding model vertexes of the virtual model and the virtual skeleton by using a first weight to obtain initial skeleton binding data; and carrying out weight adjustment on the initial bone binding data according to a preset model action optimization index to obtain target bone binding data.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: selecting at least part of model vertices corresponding to the plurality of muscle areas on the virtual model; and carrying out position binding on curve points of a plurality of virtual curves and at least part of model vertexes according to preset binding attribute information to obtain a curve binding result.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: determining a binding mode and a second weight according to the binding attribute information; and carrying out weighted position binding on the curve points and at least part of the model vertexes according to a binding mode by using a second weight to obtain a curve binding result.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: determining a numerical calculation relation of the curve points along with the virtual bones for position change based on the curve binding result and the target bone binding data; and constructing a curve controller of a plurality of virtual curves by utilizing the numerical calculation relation.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: collecting action parameters of the virtual skeleton when the virtual skeleton executes target actions, wherein the action parameters comprise at least one of the following: translation amount, rotation amount, deformation amount, scaling amount; calculating the action parameters as input of a curve controller to obtain follow-up change data of curve points; and controlling at least part of model vertexes of the virtual model to move in vertex positions according to the following change data, and generating the muscle animation of the virtual model.
In the computer-readable storage medium of the above embodiment, a technical solution for implementing an animation generation method is provided: a plurality of virtual curves are created according to a plurality of muscle areas of the virtual model; position binding is performed between curve points of the plurality of virtual curves and at least some model vertices in the plurality of muscle areas to obtain a curve binding result; curve controllers of the plurality of virtual curves are then constructed based on the curve binding result and target skeleton binding data corresponding to the virtual model, where the curve controllers are used to control the curve points to change position following the virtual skeleton corresponding to the virtual model; and a muscle animation of the virtual model is generated by using the action parameters of the virtual skeleton and the curve controllers, where the muscle animation displays at least the morphological changes of the plurality of muscle areas. In this way, the purpose of realizing the dynamic effect of muscle clusters through virtual curves is achieved, the technical effects of enriching the muscle animation effect and improving the production efficiency are obtained, and the technical problems in the related art that producing the muscle animation effect by engraving and repairing yields a single effect and has low production efficiency are solved.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software combined with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a computer-readable storage medium (which may be a CD-ROM, a USB flash disk, a removable hard disk, etc.) or on a network, and includes several instructions to cause a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) to perform the method according to the embodiments of the present application.
In an exemplary embodiment of the present application, a computer-readable storage medium stores thereon a program product capable of implementing the method described above in the present embodiment. In some possible implementations, the various aspects of the embodiments of the present application may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the present application as described in the "exemplary methods" section of the embodiments, when the program product is run on the terminal device.
A program product for implementing the above method according to an embodiment of the present application may employ a portable compact disc read-only memory (CD-ROM) and comprise program code and may be run on a terminal device, such as a personal computer. However, the program product of the embodiments of the present application is not limited thereto, and in the embodiments of the present application, the computer-readable storage medium may be any tangible medium that can contain, or store the program for use by or in connection with the instruction execution system, apparatus, or device.
Any combination of one or more computer readable media may be employed by the program product described above. The computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that the program code embodied on the computer readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Embodiments of the present application also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
S1, creating a plurality of virtual curves according to a plurality of muscle areas of a virtual model;
S2, performing position binding on curve points of the plurality of virtual curves and at least some model vertices in the plurality of muscle areas to obtain a curve binding result;
S3, constructing curve controllers of the plurality of virtual curves based on the curve binding result and target skeleton binding data corresponding to the virtual model, wherein the curve controllers are used for controlling the curve points to change position following the virtual skeleton corresponding to the virtual model;
S4, generating a muscle animation of the virtual model by using the action parameters of the virtual skeleton and the curve controllers, wherein the muscle animation displays at least morphological changes of the plurality of muscle areas.
Optionally, the above processor may be further configured to perform the following steps by a computer program: creating a plurality of virtual curves according to the distribution information of the plurality of muscle areas on the virtual model, wherein the curve types of the plurality of virtual curves are at least one of the following: control vertex curves, edit point curves, and bezier curves.
Optionally, the above processor may be further configured to perform the following steps by a computer program: creating a virtual skeleton based on the pre-imported virtual model; weighting and binding model vertexes of the virtual model and the virtual skeleton by using a first weight to obtain initial skeleton binding data; and carrying out weight adjustment on the initial bone binding data according to a preset model action optimization index to obtain target bone binding data.
Optionally, the above processor may be further configured to perform the following steps by a computer program: selecting at least part of model vertices corresponding to the plurality of muscle areas on the virtual model; and carrying out position binding on curve points of a plurality of virtual curves and at least part of model vertexes according to preset binding attribute information to obtain a curve binding result.
Optionally, the above processor may be further configured to perform the following steps by a computer program: determining a binding mode and a second weight according to the binding attribute information; and carrying out weighted position binding on the curve points and at least part of the model vertexes according to a binding mode by using a second weight to obtain a curve binding result.
Optionally, the above processor may be further configured to perform the following steps by a computer program: determining a numerical calculation relation of the curve points along with the virtual bones for position change based on the curve binding result and the target bone binding data; and constructing a curve controller of a plurality of virtual curves by utilizing the numerical calculation relation.
Optionally, the above processor may be further configured to perform the following steps by a computer program: collecting action parameters of the virtual skeleton when the virtual skeleton executes target actions, wherein the action parameters comprise at least one of the following: translation amount, rotation amount, deformation amount, scaling amount; calculating the action parameters as input of a curve controller to obtain follow-up change data of curve points; and controlling at least part of model vertexes of the virtual model to move in vertex positions according to the following change data, and generating the muscle animation of the virtual model.
In the electronic device of the above embodiment, a technical solution for implementing an animation generation method is provided: a plurality of virtual curves are created according to a plurality of muscle areas of the virtual model; position binding is performed between curve points of the plurality of virtual curves and at least some model vertices in the plurality of muscle areas to obtain a curve binding result; curve controllers of the plurality of virtual curves are then constructed based on the curve binding result and target skeleton binding data corresponding to the virtual model, where the curve controllers are used to control the curve points to change position following the virtual skeleton corresponding to the virtual model; and a muscle animation of the virtual model is generated by using the action parameters of the virtual skeleton and the curve controllers, where the muscle animation displays at least the morphological changes of the plurality of muscle areas. In this way, the purpose of realizing the dynamic effect of muscle clusters through virtual curves is achieved, the technical effects of enriching the muscle animation effect and improving the production efficiency are obtained, and the technical problems in the related art that producing the muscle animation effect by engraving and repairing yields a single effect and has low production efficiency are solved.
Fig. 12 is a schematic diagram of an electronic device according to one embodiment of the present application. As shown in fig. 12, the electronic device 1200 is merely an example, and should not impose any limitation on the functionality and scope of use of the embodiments of the present application.
As shown in fig. 12, the electronic apparatus 1200 is in the form of a general purpose computing device. The components of the electronic device 1200 may include, but are not limited to: the at least one processor 1210, the at least one memory 1220, a bus 1230 connecting the various system components (including the memory 1220 and the processor 1210), and a display 1240.
Wherein the memory 1220 stores program code that can be executed by the processor 1210, such that the processor 1210 performs the steps according to various exemplary implementations of the present application described in the above method section of the embodiments of the present application.
Memory 1220 may include readable media in the form of volatile memory units, such as Random Access Memory (RAM) 12201 and/or cache memory 12202, and may further include Read Only Memory (ROM) 12203, and may include nonvolatile memory, such as one or more magnetic storage devices, flash memory, or other nonvolatile solid state memory.
In some examples, memory 1220 may also include a program/utility 12204 having a set (at least one) of program modules 12205, such program modules 12205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Memory 1220 may further include memory located remotely from processor 1210, which may be connected to electronic device 1200 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Bus 1230 may be one or more of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processor 1210, or a local bus using any of a variety of bus architectures.
The display 1240 may be, for example, a touch screen type liquid crystal display (Liquid Crystal Display, LCD) that may enable a user to interact with a user interface of the electronic device 1200.
Optionally, the electronic device 1200 may also communicate with one or more external devices 1300 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1200, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 1200 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1250. Moreover, the electronic device 1200 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 1260. As shown in fig. 12, the network adapter 1260 communicates with other modules of the electronic device 1200 over the bus 1230. It should be appreciated that, although not shown in fig. 12, other hardware and/or software modules may be used in connection with the electronic device 1200, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Array of Independent Disks (RAID) systems, tape drives, data backup storage systems, and the like.
The electronic device 1200 may further include: a keyboard, a cursor control device (e.g., a mouse), an input/output interface (I/O interface), a network interface, a power supply, and/or a camera.
It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 12 is merely illustrative and is not intended to limit the configuration of the electronic device described above. For example, the electronic device 1200 may also include more or fewer components than shown in fig. 12, or have a different configuration than shown in fig. 12. The memory 1220 may be used to store a computer program and corresponding data, such as a computer program and corresponding data corresponding to the animation generation method in the embodiment of the present application. The processor 1210 executes various functional applications and data processing by executing a computer program stored in the memory 1220, that is, implements the animation generation method described above.
The serial numbers of the foregoing embodiments of the present application are merely for description and do not represent the superiority or inferiority of the embodiments.
In the foregoing embodiments of the present application, the description of each embodiment has its own emphasis; for any part not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely exemplary; for example, the division of the units may be a division by logical function, and there may be other divisions in actual implementation, for instance, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the coupling, direct coupling, or communication connection shown or discussed between components may be through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
The foregoing is merely a preferred embodiment of the present application. It should be noted that several improvements and modifications may be made by those skilled in the art without departing from the principles of the present application, and these improvements and modifications shall also be regarded as falling within the protection scope of the present application.

Claims (10)

1. An animation generation method, characterized in that the animation generation method comprises:
creating a plurality of virtual curves according to a plurality of muscle areas of a virtual model;
performing position binding on curve points of the plurality of virtual curves and at least part of model vertices in the plurality of muscle areas to obtain a curve binding result;
constructing curve controllers of the plurality of virtual curves based on the curve binding result and target skeleton binding data corresponding to the virtual model, wherein the curve controllers are used for controlling the curve points to change in position along with the virtual skeleton corresponding to the virtual model;
and generating a muscle animation of the virtual model by using the action parameters of the virtual skeleton and the curve controller, wherein the muscle animation at least displays morphological changes of the plurality of muscle areas.
2. The animation generation method of claim 1, wherein creating the plurality of virtual curves according to the plurality of muscle areas of the virtual model comprises:
creating the plurality of virtual curves according to distribution information of the plurality of muscle areas on the virtual model, wherein a curve type of the plurality of virtual curves is at least one of the following: control vertex curves, edit point curves, and bezier curves.
3. The animation generation method of claim 1, further comprising:
creating a virtual skeleton based on the pre-imported virtual model;
performing weighted binding on model vertices of the virtual model and the virtual skeleton by using a first weight to obtain initial bone binding data;
and carrying out weight adjustment on the initial bone binding data according to a preset model action optimization index to obtain the target bone binding data.
4. The animation generation method of claim 1, wherein performing position binding on the curve points of the plurality of virtual curves and the at least part of model vertices in the plurality of muscle areas to obtain the curve binding result comprises:
selecting the at least part of model vertices corresponding to the plurality of muscle areas on the virtual model;
and performing position binding on the curve points of the plurality of virtual curves and the at least part of model vertices according to preset binding attribute information to obtain the curve binding result.
5. The animation generation method of claim 4, wherein performing position binding on the curve points and the at least part of model vertices according to the binding attribute information to obtain the curve binding result comprises:
determining a binding mode and a second weight according to the binding attribute information;
and performing weighted position binding on the curve points and the at least part of model vertices in the binding mode by using the second weight to obtain the curve binding result.
6. The animation generation method of claim 1, wherein constructing the curve controller of the plurality of virtual curves based on the curve binding result and the target bone binding data corresponding to the virtual model comprises:
determining, based on the curve binding result and the target skeleton binding data, a numerical calculation relation by which the curve points change in position along with the virtual skeleton;
and constructing the curve controllers of the virtual curves by using the numerical calculation relation.
7. The animation generation method of claim 1, wherein generating the muscle animation of the virtual model using the motion parameters of the virtual skeleton and the curve controller comprises:
collecting the action parameters of the virtual skeleton when the virtual skeleton executes a target action, wherein the action parameters comprise at least one of the following: translation amount, rotation amount, deformation amount, and scaling amount;
calculating the action parameters as input of the curve controller to obtain follow-up change data of the curve points;
and controlling the at least part of model vertices of the virtual model to move in vertex position according to the follow-up change data, so as to generate the muscle animation of the virtual model.
8. An animation generation device, characterized in that the animation generation device comprises:
the creation module is used for creating a plurality of virtual curves according to a plurality of muscle areas of the virtual model;
the binding module is used for carrying out position binding on curve points of the virtual curves and at least part of model vertexes in the muscle areas to obtain curve binding results;
the building module is used for building curve controllers of the plurality of virtual curves based on the curve binding result and target skeleton binding data corresponding to the virtual model, wherein the curve controllers are used for controlling the curve points to follow the virtual skeleton corresponding to the virtual model to change in position;
And the generation module is used for generating a muscle animation of the virtual model by utilizing the action parameters of the virtual skeleton and the curve controller, wherein the muscle animation at least displays morphological changes of the plurality of muscle areas.
9. A computer readable storage medium, characterized in that a computer program is stored in the computer readable storage medium, wherein the computer program is arranged to perform the animation generation method of any of claims 1 to 7 when run by a processor.
10. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the animation generation method of any of claims 1 to 7.