CN115471592A - Dynamic image processing method and system - Google Patents

Dynamic image processing method and system

Info

Publication number
CN115471592A
Authority
CN
China
Prior art keywords
information
characteristic
type
forming
vertex
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211153869.0A
Other languages
Chinese (zh)
Inventor
陈逸飏
Current Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Original Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Mihoyo Tianming Technology Co Ltd
Priority to CN202211153869.0A
Publication of CN115471592A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 — Animation
    • G06T 13/20 — 3D [Three Dimensional] animation
    • G06T 13/60 — 3D [Three Dimensional] animation of natural phenomena, e.g. rain, snow, water or plants

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to the technical field of image processing, and in particular to a dynamic image processing method and system. The dynamic image processing method comprises the following steps: forming characteristic parameters matched with a target model once the target model is completed; tiling and recording the characteristic parameters into vertex UV information to form UV data information; establishing a mapping relation between the UV data information and the target model; baking a picture from the characteristic parameters, wherein the RGB channels in the picture store the position information corresponding to the target model; and forming a vertex image and combining the vertex image with the picture to form dynamic image information matched with the target model.

Description

Dynamic image processing method and system
Technical Field
The invention relates to the technical field of image processing, in particular to a method and a system for processing a dynamic image.
Background
A growth animation is generally produced by building a contour model, adding or binding bone information to the contour model (also called bone skinning), coloring the contour model with the bound bones, and obtaining a basic model once coloring is complete. The bones are then driven to generate the corresponding special effect. The disadvantages of this approach are: because the skeleton and the model are independent of each other, considerable computing power is needed during animation playback to realize the corresponding animation effect, relatively many resources are occupied, and in particular the method cannot run on low-end platforms with insufficient computing power.
Disclosure of Invention
In view of the defects of the prior art, the dynamic image processing method and system provided by the present application are specifically as follows:
In one aspect, the present application provides a dynamic image processing method, comprising:
forming characteristic parameters matched with a target model once the target model is completed;
tiling and recording the characteristic parameters into vertex UV information to form UV data information;
establishing a mapping relation between the UV data information and the target model;
baking a picture from the characteristic parameters, wherein the RGB channels in the picture store position information corresponding to the target model;
and forming a vertex image, and combining the vertex image with the picture to form dynamic image information matched with the target model.
Preferably, in the above dynamic image processing method, the characteristic parameters comprise a first type curve model or a second type curve model, and tiling and recording the characteristic parameters to form the UV data information specifically includes:
reading the first type curve model, and obtaining the start characteristic information and the end characteristic information of the first type curve model to form first type vertex UV information matched with the first type curve model; or reading the second type curve model, and obtaining the start characteristic information and the end characteristic information of the second type curve model to form second type vertex UV information matched with the second type curve model;
and forming the UV data information according to the first type vertex UV information and/or the second type vertex UV information.
Preferably, in the above dynamic image processing method, forming the vertex image and combining the vertex image with the picture to form dynamic image information matched with the target model specifically includes:
reading length data of the first type curve model and the second type curve model;
forming control endpoint data matched with the movement moment according to the length data;
forming dynamic control parameters of the characteristic parameters according to the control endpoint data;
and forming the dynamic image information according to the control parameters and the picture in combination with the movement moment.
Preferably, in the above dynamic image processing method, the second type curve model comprises a first characteristic value, a second characteristic value, a third characteristic value and a fourth characteristic value, and reading the second type curve model and obtaining its start and end characteristic information to form the second type vertex UV information specifically includes:
forming first start characteristic information and first end characteristic information according to the first characteristic value and the third characteristic value;
forming second start characteristic information and second end characteristic information according to the second characteristic value and the fourth characteristic value, wherein the second characteristic value lies between the first characteristic value and the third characteristic value;
and forming the second type vertex UV information according to the first start characteristic information, the first end characteristic information, the second start characteristic information and the second end characteristic information.
In another aspect, the present invention further provides a dynamic image processing system, comprising:
the characteristic parameter forming module is used for forming characteristic parameters matched with the target model according to the target model in a state that the target model is finished;
the UV data information forming module is used for tiling and recording the characteristic parameters to vertex UV information to form UV data information;
the mapping relation forming module is used for establishing a mapping relation between the UV data information and a target model;
the picture forming module is used for baking a picture from the characteristic parameters, wherein the RGB channels in the picture store the position information corresponding to the target model;
and the dynamic image information forming module is used for forming a vertex image which is combined with the picture to form dynamic image information matched with the target model.
Preferably, in the above dynamic image processing system, the characteristic parameters comprise a first type curve model or a second type curve model, and the UV data information forming module specifically comprises:
the curve information reading unit is used for reading the first type curve model and obtaining the start characteristic information and the end characteristic information of the first type curve model to form first type vertex UV information matched with the first type curve model; or for reading the second type curve model and obtaining the start characteristic information and the end characteristic information of the second type curve model to form second type vertex UV information matched with the second type curve model;
and the UV data information forming unit is used for forming the UV data information according to the first type vertex UV information and/or the second type vertex UV information.
Preferably, in the above dynamic image processing system, the dynamic image information forming module specifically includes:
the length data reading unit is used for reading the length data of the first type curve model and the second type curve model;
the control unit is used for forming control endpoint data matched with the movement moment according to the length data; forming dynamic control parameters of the characteristic parameters according to the control endpoint data; and forming the dynamic image information according to the control parameters and the picture in combination with the movement moment.
In another aspect, the present invention further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable by the processor, wherein the processor implements the method for processing a dynamic image according to any one of the above descriptions when executing the computer program.
Finally, the present application further provides a computer program product comprising computer readable code, or a readable storage medium carrying computer readable code; when the computer readable code runs in a processor of an electronic device, the processor executes the dynamic image processing method described in any one of the above.
Compared with the prior art, the invention has the following beneficial effects:
in the technical scheme of the application, the key data of the special effect are stored in the form of a picture, and the special effect can be displayed in combination with corresponding control instructions. On the one hand, the occupied storage resources are relatively small; on the other hand, the control instructions require little computing power, so a good special effect display can be obtained even on a low-end platform.
Drawings
FIG. 1 is a flow chart illustrating a method for processing a dynamic image according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a method for processing a dynamic image according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a method for processing a dynamic image according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating a method for processing a dynamic image according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating a method for processing a dynamic image according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed description of the preferred embodiments
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
As shown in fig. 1, in one aspect, the present invention provides a dynamic image processing method, comprising:
step S110, forming characteristic parameters matched with a target model according to the target model in a state that the target model is finished; taking a vine model or a branch model as an example, the vine model or the branch model is a three-dimensional vine model or a three-dimensional branch model after being completed, the vine model is calculated to form a curve model matched with the vine model, the branch model is calculated to form a curve model matched with the branch model, and the curve model can be understood as a characteristic parameter. At this point the conversion from "face" to "point" is complete. And converting the stereo image into a linear graph.
Step S120, tiling and recording the characteristic parameters into vertex UV information to form UV data information. Specifically, the characteristic parameters comprise a first type curve model or a second type curve model, and tiling and recording the characteristic parameters to form the UV data information specifically includes:
step S1201, reading the first type curve model, and obtaining the initial characteristic information and the end characteristic information of the first type curve model to form first type vertex UV information matched with the first type curve model; schematically, the first type of curve model may be a single-path curve model, for example, as shown in fig. 2, fig. 2A is a target model, and fig. 2B is a curve model, where the curve model includes 5 curves, which are respectively curve 1, curve 2, curve 3, curve 4, and curve 5; the curve 1, the curve 2, the curve 3, the curve 4 and the curve 5 are all single-path curves, and the single-path curves can be understood as curves without bifurcation.
Or step S1202, reading the second type of curve model, and obtaining start characteristic information and end characteristic information of the second type of curve model to form second type of vertex UV information matched with the second type of curve model; the second type of curve may be a multi-path curve. For example, as shown in fig. 3, curve 6 is a multi-path curve, and it is understood that the multi-path curve is a curve with a bifurcation. The second type of curve comprises a first characteristic value 61, a second characteristic value 62, a third characteristic value 63 and a fourth characteristic value 64;
step S12021, forming first starting characteristic information and first end point characteristic information according to the first characteristic value and the third characteristic value;
step S12022, forming second start feature information and second end feature information according to the second feature value and the fourth feature value; wherein the second characteristic value is located in the middle between the first characteristic value and the third characteristic value, the second characteristic value being understood as a bifurcation point.
Step S12023, forming the second type vertex UV information according to the first start feature information and the first end point feature information, the second start feature information and the second end point feature information.
Step S1203, forming the UV data information according to the first type of vertex UV information and/or the second type of vertex UV information.
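The tiling of steps S1201–S1203 can be sketched as follows. This is a hedged illustration, not the patent's implementation: `tile_curves_to_uv` and the one-column-per-curve layout are assumptions, with each curve given a distinct U column and its start/end characteristic information expressed as a normalized arc-length V.

```python
# Illustrative sketch: tile curve parameters into a second vertex-UV channel.
# `curves` is a list of per-vertex arc lengths, one inner list per curve.

def tile_curves_to_uv(curves):
    """Assign each curve one U column; V runs 0 at the start characteristic
    vertex to 1 at the end characteristic vertex (normalized arc length)."""
    uv_data = []
    n = len(curves)
    for col, lengths in enumerate(curves):
        total = lengths[-1] if lengths[-1] != 0 else 1.0
        u = (col + 0.5) / n          # center of this curve's texture column
        for s in lengths:
            v = s / total            # 0.0 at curve start, 1.0 at curve end
            uv_data.append((u, v))
    return uv_data

# Two single-path curves of lengths 10 and 4, sampled at three vertices each.
uvs = tile_curves_to_uv([[0.0, 5.0, 10.0], [0.0, 2.0, 4.0]])
```

Reading `uvs[i]` back in a shader then identifies both which curve a vertex belongs to (U) and where along that curve it lies (V).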
Step S130, establishing a mapping relation between the UV data information and the target model. Specifically, the UV data are mapped onto the target model through a spatial relationship.
Step S140, baking a picture from the characteristic parameters, wherein, as shown in fig. 4, the RGB channels in the picture store the position information corresponding to the target model. In the prior art, the RGB channels store a parameter value for the R color, a parameter value for the G color and a parameter value for the B color. In the present application, the R value can hold the coordinate X-axis value, the G value the coordinate Y-axis value, and the B value the coordinate Z-axis value; in other words, the target model can be stored in the form of a two-dimensional picture.
Note that for a multi-path curve, such as the curve in fig. 3, the bifurcation point (the second characteristic value) has a separate line for recording its XYZ-axis information. The picture can be understood as follows: its length represents the total duration of the special effect, and its width covers all of the UV data information. The length direction is divided into a number of ticks by moment, and the width direction is likewise divided into ticks according to the amount of UV information, forming a grid of small pixels, each of which stores the corresponding XYZ-axis information.
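The baking of step S140 can be sketched roughly as follows, with illustrative names only: `position_at` stands in for evaluating the animated target model, the picture's length axis is time, its width axis is the UV entries, and each pixel holds one XYZ position.

```python
# Hedged sketch of step S140: bake per-vertex positions over time into a
# "picture" grid; picture[t][i] is the (x, y, z) of UV entry i at moment t.

def bake_position_picture(num_moments, num_uv, position_at):
    """Build the position picture: rows are moments (picture length = effect
    duration), columns are UV entries (picture width = all UV data)."""
    return [[position_at(t, i) for i in range(num_uv)]
            for t in range(num_moments)]

# Toy stand-in model: each vertex rises along the Z axis over time.
picture = bake_position_picture(3, 2, lambda t, i: (float(i), 0.0, float(t)))
```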
As shown in fig. 5, in step S150, a vertex image is formed, and the vertex image is combined with the picture to form dynamic image information matched with the target model. For example, the vertex image is produced by a shader. Further:
step S1501, reading length data of the first type curve model and the second type curve model; the length data may be understood as length information of a single-path curve or a multi-path curve, and may also be understood as data between two end points of the curve in the single-path curve and the multi-path curve. In practical use, the two understanding modes have corresponding use modes.
For example, the length data is understood as length information of a single-path curve or a multi-path curve, and the special effects of 'growing' or 'vine extension' can be realized;
the length data is understood to be the data between the two end points of the curve in the single-path curve and the multi-path curve. Thus realizing the special effect of 'dissolving'.
Step S1502, forming control endpoint data matched with the movement moment according to the length data. The control endpoint data may be custom-formed by the user, and the movement moment can be understood as the elapsed duration of the special effect.
For example, when a special effect has run for 1 second, the content that the "growth" or "dissolution" effect must display is determined, so this 1 second is understood as the movement moment. Because the growth or dissolution effect is related to time, control endpoint data matched with the movement moment are set.
A specific example:
the length of a vine curve is 10; the "grown" length of the vine is 1 when the "growth" special effect has lasted 1 second, 8 when it has lasted 5 seconds, and 10 when it has lasted 6 seconds. Control endpoints are therefore placed at the grown lengths of 1, 8 and 10. When the special effect has lasted 1 second, the displayed vine curve has length 1, and the part of the vine beyond length 1 is controlled to be in a non-display state; similarly, when the effect has lasted 5 seconds, the displayed vine curve has length 8 and the part beyond length 8 is hidden; and when the effect has lasted 6 seconds, the whole vine is fully displayed.
Step S1503, forming dynamic control parameters of the characteristic parameters according to the control endpoint data;
And step S1504, forming the dynamic image information according to the control parameters and the picture in combination with the movement moment. Specifically, the control parameters are obtained and, combined with the position information in the picture, the displayed vine curve is scaled through linear interpolation, achieving the effect of the vine appearing from nothing and gradually thickening over time. For example, the linear interpolation may be implemented with a clamp instruction.
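The clamp-based scaling can be sketched as follows. This is a hedged example, not the patent's shader code: `vertex_scale` and the `feather` ramp width are assumptions. Each vertex compares its own arc length (from the UV data) with the currently grown length, and the clamped result scales the vertex between hidden (0) and fully displayed (1).

```python
# Illustrative clamp-based visibility scale for one vertex.

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def vertex_scale(vertex_len, grown_len, feather=1.0):
    """1.0 for vertices well inside the grown part, 0.0 beyond it, with a
    linear ramp of width `feather` at the growing tip (the "thickening")."""
    return clamp((grown_len - vertex_len) / feather, 0.0, 1.0)
```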
In the present application, the pixel (color) information of the target model is also stored separately in the form of a map, and is read simultaneously when the motion animation is played.
The present application can also realize a reverse growth or reverse dissolution special effect with only a simple adjustment: all information of the target model is displayed at the start of the special effect, and all information of the target model is in a hidden state at the end of the special effect.
It should be noted that curves 4 and 5 likewise form, over time, the motion change between "point" and "face".
Through the above technical scheme, the key data of the special effect are stored in the form of a picture, and the special effect can be displayed in combination with corresponding control instructions. On the one hand, the occupied storage resources are relatively small; on the other hand, the control instructions require little computing power, so a good special effect display can be obtained even on a low-end platform.
Example two
The present invention further provides a dynamic image processing system, comprising:
the characteristic parameter forming module is used for forming characteristic parameters matched with the target model according to the target model in a state that the target model is finished;
the UV data information forming module is used for tiling and recording the characteristic parameters to vertex UV information to form UV data information;
the mapping relation forming module is used for establishing a mapping relation between the UV data information and a target model;
the picture forming module is used for baking a picture from the characteristic parameters, wherein the RGB channels in the picture store the position information corresponding to the target model;
and the dynamic image information forming module is used for forming a vertex image which is combined with the picture to form dynamic image information matched with the target model.
Preferably, in the above dynamic image processing system, the characteristic parameters comprise a first type curve model or a second type curve model, and the UV data information forming module specifically comprises:
the curve information reading unit is used for reading the first type curve model and obtaining the start characteristic information and the end characteristic information of the first type curve model to form first type vertex UV information matched with the first type curve model; or for reading the second type curve model and obtaining the start characteristic information and the end characteristic information of the second type curve model to form second type vertex UV information matched with the second type curve model;
and the UV data information forming unit forms the UV data information according to the first type of vertex UV information and/or the second type of vertex UV information.
Preferably, in the above dynamic image processing system, the dynamic image information forming module specifically includes:
the length data reading unit is used for reading the length data of the first type curve model and the second type curve model;
the control unit is used for forming control endpoint data matched with the movement moment according to the length data; forming dynamic control parameters of the characteristic parameters according to the control endpoint data; and forming the dynamic image information according to the control parameters and the picture in combination with the movement moment.
The working principle and the technical effect of the above-mentioned processing system for dynamic images are the same as those of the first embodiment, and are not described herein again.
EXAMPLE III
The embodiment of the application provides an electronic device, and the dynamic image processing system provided by the embodiment of the application can be integrated in the electronic device. Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 6, this embodiment provides an electronic device 400, comprising: one or more processors 420; and a storage device 410 for storing one or more programs that, when executed by the one or more processors 420, cause the one or more processors 420 to implement:
forming characteristic parameters matched with the target model according to the target model in a state that the target model is finished;
tiling and recording the characteristic parameters to vertex UV information to form UV data information;
establishing a mapping relation between the UV data information and a target model;
baking to form a picture according to the characteristic parameters, wherein an RGB channel in the picture is used for storing position information corresponding to the target model;
and forming a vertex image, and combining the vertex image with the picture to form dynamic image information matched with the target model.
As shown in fig. 6, the electronic device 400 includes a processor 420, a storage device 410, an input device 430, and an output device 440; the number of the processors 420 in the electronic device may be one or more, and one processor 420 is taken as an example in fig. 6; the processor 420, the storage device 410, the input device 430, and the output device 440 in the electronic apparatus may be connected by a bus or other means, and are exemplified by being connected by a bus 450 in fig. 6.
The storage device 410 is a computer-readable storage medium that can be used to store software programs, computer executable programs and module units, such as program instructions corresponding to the dynamic image processing method in the embodiment of the present application.
The storage device 410 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the storage 410 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, storage 410 may further include memory located remotely from processor 420, which may be connected via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input means 430 may be used to receive input numbers, character information, or voice information, and to generate key signal inputs related to user settings and function control of the electronic device. The output device 440 may include a display screen, speakers, etc.
Example four
In some embodiments, the methods described above may be implemented as a computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for carrying out various aspects of the present disclosure. Specifically:
the computer executable instructions, when executed by a computer processor, are for performing:
forming characteristic parameters matched with the target model according to the target model in a state that the target model is finished;
tiling and recording the characteristic parameters to vertex UV information to form UV data information;
establishing a mapping relation between the UV data information and a target model;
baking to form a picture according to the characteristic parameters, wherein an RGB channel in the picture is used for storing position information corresponding to the target model;
and forming a vertex image, and combining the vertex image with the picture to form dynamic image information matched with the target model.
The computer-readable storage medium described above may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembly instructions, instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source code or object code written in any combination of one or more programming languages, including an object oriented programming language, as well as conventional procedural programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, the electronic circuitry that can execute the computer-readable program instructions implements aspects of the present disclosure by utilizing the state information of the computer-readable program instructions to personalize the electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA).
These computer-readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, it is noted that the foregoing description is intended to be exemplary, not exhaustive, and is not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or technical improvements over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (9)

1. A method for processing a dynamic image, comprising:
forming characteristic parameters matched with a target model according to the target model, in a state in which the target model has been completed;
tiling and recording the characteristic parameters into vertex UV information to form UV data information;
establishing a mapping relation between the UV data information and the target model;
baking a picture according to the characteristic parameters, wherein the RGB channels of the picture are used for storing position information corresponding to the target model; and
forming a vertex image, and combining the vertex image with the picture to form dynamic image information matched with the target model.
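The baking step of claim 1, which stores per-vertex position information in the RGB channels of a picture, can be sketched as follows. This is a minimal illustration and not the patent's implementation: the function name, the 8-bit quantization, and the bounds-based normalization scheme are all assumptions added here.

```python
def bake_positions_to_rgb(positions, bounds_min, bounds_max):
    """Quantize 3D positions into 8-bit RGB triples, one pixel per vertex.

    positions: list of (x, y, z) tuples in model space.
    bounds_min / bounds_max: axis-aligned bounds used to normalize each axis.
    Returns a flat list of (r, g, b) pixels; a real pipeline would write these
    into a texture whose UV coordinates index the corresponding vertices.
    """
    pixels = []
    for x, y, z in positions:
        rgb = []
        for v, lo, hi in zip((x, y, z), bounds_min, bounds_max):
            t = 0.0 if hi == lo else (v - lo) / (hi - lo)  # normalize to [0, 1]
            rgb.append(round(t * 255))                      # quantize to 8 bits
        pixels.append(tuple(rgb))
    return pixels
```

A shader would later reverse the normalization (multiplying by the stored bounds) to recover the positions at run time.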
2. The method for processing a dynamic image according to claim 1, wherein the characteristic parameters comprise a first type curve model or a second type curve model, and tiling and recording the characteristic parameters to form the UV data information specifically comprises:
reading the first type curve model, and obtaining start characteristic information and end characteristic information of the first type curve model to form first type vertex UV information matched with the first type curve model; or reading the second type curve model, and obtaining start characteristic information and end characteristic information of the second type curve model to form second type vertex UV information matched with the second type curve model; and
forming the UV data information according to the first type vertex UV information and/or the second type vertex UV information.
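The tiling record of claim 2, which packs each curve's start and end characteristic information into vertex UV information, might look like the sketch below. It assumes one texture column per curve and uses V to distinguish the start vertex from the end vertex; every name and the layout itself are illustrative assumptions, not details taken from the patent.

```python
def pack_curve_endpoints_to_uv(curves, tex_width):
    """Record each curve's start/end characteristic info as vertex UVs.

    curves: list of (start_info, end_info) pairs, stand-ins for the patent's
    start and end characteristic information.
    tex_width: width in texels of the baked texture the UVs will index.
    Curve i is assigned the texel-centre column U = (i + 0.5) / tex_width,
    and V encodes whether the vertex sits at the start (0.0) or end (1.0).
    Returns a list of ((u, v), info) entries, one per curve endpoint.
    """
    uv_data = []
    for i, (start_info, end_info) in enumerate(curves):
        u = (i + 0.5) / tex_width              # column addressing this curve
        uv_data.append(((u, 0.0), start_info))  # start-vertex UV record
        uv_data.append(((u, 1.0), end_info))    # end-vertex UV record
    return uv_data
```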
3. The method for processing a dynamic image according to claim 1, wherein forming a vertex image, and combining the vertex image with the picture to form dynamic image information matched with the target model specifically comprises:
reading length data of the first type curve model and the second type curve model;
forming control endpoint data matched with the motion moment according to the length data;
forming dynamic control parameters of the characteristic parameters according to the control endpoint data; and
forming the dynamic image information according to the dynamic control parameters and the picture in combination with the motion moment.
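The control-endpoint step of claim 3, which derives control endpoint data matched with the motion moment from a curve's length data, could be sketched as a sweep along the curve's arc length. The ping-pong sweep pattern and the `speed` parameter are assumptions for illustration; the claim does not specify how the endpoint moves over time.

```python
def control_endpoint_at(length, t, speed=1.0):
    """Arc-length position of a moving control endpoint at motion moment t.

    length: total arc length of the curve (the claim's length data).
    t: motion moment in seconds; speed: assumed sweep rate in units/second.
    The endpoint sweeps back and forth along the curve (ping-pong), so the
    returned value always stays within [0, length].
    """
    s = (t * speed) % (2.0 * length)        # position within one full cycle
    return s if s <= length else 2.0 * length - s  # reflect on the way back
```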
4. The method for processing a dynamic image according to claim 2, wherein the second type curve model comprises a first characteristic value to a fourth characteristic value, and reading the second type curve model to obtain the start characteristic information and the end characteristic information of the second type curve model to form the second type vertex UV information matched with the second type curve model specifically comprises:
forming first start characteristic information and first end characteristic information according to the first characteristic value and the third characteristic value;
forming second start characteristic information and second end characteristic information according to the second characteristic value and the fourth characteristic value, wherein the second characteristic value is located between the first characteristic value and the third characteristic value; and
forming the second type vertex UV information according to the first start characteristic information, the first end characteristic information, the second start characteristic information and the second end characteristic information.
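The first to fourth characteristic values of claim 4 read like the control data of a cubic curve. A standard cubic Bezier evaluation is shown below purely for illustration; the claim does not state that the second type curve model is a Bezier curve, so treating the four characteristic values as Bezier control points is an assumption.

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1].

    p0..p3: the four control values (scalars here for simplicity; the same
    formula applies componentwise to 2D/3D control points).
    """
    u = 1.0 - t
    # Bernstein-basis form of the cubic Bezier curve.
    return (u ** 3) * p0 + 3 * (u ** 2) * t * p1 + 3 * u * (t ** 2) * p2 + (t ** 3) * p3
```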
5. A system for processing a dynamic image, comprising:
a characteristic parameter forming module, configured to form characteristic parameters matched with a target model according to the target model, in a state in which the target model has been completed;
a UV data information forming module, configured to tile and record the characteristic parameters into vertex UV information to form UV data information;
a mapping relation forming module, configured to establish a mapping relation between the UV data information and the target model;
a picture forming module, configured to bake a picture according to the characteristic parameters, wherein the RGB channels of the picture are used for storing position information corresponding to the target model; and
a dynamic image information forming module, configured to form a vertex image and combine the vertex image with the picture to form dynamic image information matched with the target model.
6. The system for processing a dynamic image according to claim 5, wherein the characteristic parameters comprise a first type curve model or a second type curve model, and the UV data information forming module specifically comprises:
a curve information reading unit, configured to read the first type curve model and obtain start characteristic information and end characteristic information of the first type curve model to form first type vertex UV information matched with the first type curve model, or to read the second type curve model and obtain start characteristic information and end characteristic information of the second type curve model to form second type vertex UV information matched with the second type curve model; and
a UV data information forming unit, configured to form the UV data information according to the first type vertex UV information and/or the second type vertex UV information.
7. The system for processing a dynamic image according to claim 5, wherein the dynamic image information forming module specifically comprises:
a length data reading unit, configured to read length data of the first type curve model and the second type curve model; and
a control unit, configured to form control endpoint data matched with the motion moment according to the length data, form dynamic control parameters of the characteristic parameters according to the control endpoint data, and form the dynamic image information according to the dynamic control parameters and the picture in combination with the motion moment.
8. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for processing a dynamic image according to any one of claims 1 to 4.
9. A computer program product comprising computer-readable code, or a readable storage medium carrying computer-readable code, wherein when the computer-readable code runs in a processor of an electronic device, the processor performs the method for processing a dynamic image according to any one of claims 1 to 4.
CN202211153869.0A 2022-09-21 2022-09-21 Dynamic image processing method and system Pending CN115471592A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211153869.0A CN115471592A (en) 2022-09-21 2022-09-21 Dynamic image processing method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211153869.0A CN115471592A (en) 2022-09-21 2022-09-21 Dynamic image processing method and system

Publications (1)

Publication Number Publication Date
CN115471592A true CN115471592A (en) 2022-12-13

Family

ID=84334481

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211153869.0A Pending CN115471592A (en) 2022-09-21 2022-09-21 Dynamic image processing method and system

Country Status (1)

Country Link
CN (1) CN115471592A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115761249A (en) * 2022-12-28 2023-03-07 北京曼恒数字技术有限公司 Image processing method, system, electronic equipment and computer program product
CN115761249B (en) * 2022-12-28 2024-02-23 北京曼恒数字技术有限公司 Image processing method, system, electronic equipment and computer program product

Similar Documents

Publication Publication Date Title
CN111145326B (en) Processing method of three-dimensional virtual cloud model, storage medium, processor and electronic device
CN110058685B (en) Virtual object display method and device, electronic equipment and computer-readable storage medium
CN110989878B (en) Animation display method and device in applet, electronic equipment and storage medium
CN112102437B (en) Canvas-based radar map generation method and device, storage medium and terminal
CN112237739A (en) Game role rendering method and device, electronic equipment and computer readable medium
CN111161392A (en) Video generation method and device and computer system
CN113313802B (en) Image rendering method, device and equipment and storage medium
CN113827965B (en) Rendering method, device and equipment of sample lines in game scene
CN112929627A (en) Virtual reality scene implementation method and device, storage medium and electronic equipment
CN111803952A (en) Topographic map editing method and device, electronic equipment and computer readable medium
CN112714357A (en) Video playing method, video playing device, electronic equipment and storage medium
CN115471592A (en) Dynamic image processing method and system
CN110221689B (en) Space drawing method based on augmented reality
CN110636331B (en) Method and apparatus for processing video
CN111068314B (en) NGUI resource rendering processing method and device based on Unity
CN112017261B (en) Label paper generation method, apparatus, electronic device and computer readable storage medium
CN110288523B (en) Image generation method and device
CN115487495A (en) Data rendering method and device
CN113160379B (en) Material rendering method and device, storage medium and electronic equipment
CN111145358A (en) Image processing method, device and hardware device
CN115908687A (en) Method and device for training rendering network, method and device for rendering network, and electronic equipment
CN116228956A (en) Shadow rendering method, device, equipment and medium
CN114307143A (en) Image processing method and device, storage medium and computer equipment
CN114693885A (en) Three-dimensional virtual object generation method, apparatus, device, medium, and program product
CN114307144A (en) Image processing method and device, storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination