CN118115633A - Animation processing method, device, electronic equipment and computer storage medium - Google Patents


Info

Publication number
CN118115633A
CN118115633A (application number CN202211457347.XA)
Authority
CN
China
Prior art keywords
animation
target
frame
component
file
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211457347.XA
Other languages
Chinese (zh)
Inventor
李木子
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Bilibili Technology Co Ltd
Original Assignee
Shanghai Bilibili Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Bilibili Technology Co Ltd filed Critical Shanghai Bilibili Technology Co Ltd
Priority to CN202211457347.XA priority Critical patent/CN118115633A/en
Publication of CN118115633A publication Critical patent/CN118115633A/en
Pending legal-status Critical Current

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention discloses an animation processing method and device, an electronic device, and a computer storage medium. The method comprises the following steps: acquiring a first animation file in SVGA format previewed in an animation preview interface, the animation preview interface comprising a material providing entrance; acquiring a target material provided through the material providing entrance, and identifying the target component identifier corresponding to the target material; replacing the original material corresponding to the target component identifier in the first animation file with the target material to generate a second animation file in SVGA format; and playing the second animation file in the animation preview interface. With this scheme, a first animation file in SVGA format can be modified, improving SVGA animation production efficiency; and the second animation file is previewed promptly after it is generated, so that the user can check the modified animation effect in time, further improving animation production efficiency and user experience.

Description

Animation processing method, device, electronic equipment and computer storage medium
Technical Field
The embodiment of the invention relates to the technical field of data processing, in particular to an animation processing method, an animation processing device, electronic equipment and a computer storage medium.
Background
SVGA (Scalable Vector Graphics Animation) is a cross-platform open-source animation format. Owing to its strong compatibility, ease of use, and good display quality, SVGA is widely used in many scenarios; for example, SVGA animations are often used for gift effects in the live streaming field.
However, the inventors found in practice that the prior art has the following drawback: existing SVGA preview tools only support playing SVGA animations and cannot modify them. Therefore, when the playing effect of an SVGA animation does not meet requirements, the animation has to be re-produced in animation production software, making SVGA animation production inefficient.
Disclosure of Invention
In view of the technical problems in the prior art that SVGA animations cannot be modified and that SVGA animation production efficiency is low, embodiments of the present invention provide an animation processing method, an apparatus, an electronic device, and a computer storage medium that overcome or at least partially solve the above problems.
According to a first aspect of an embodiment of the present invention, there is provided an animation processing method, including:
acquiring a first animation file in SVGA format previewed in an animation preview interface, the animation preview interface comprising a material providing entrance;
acquiring a target material provided through the material providing entrance, and identifying a target component identifier corresponding to the target material;
replacing the original material corresponding to the target component identifier in the first animation file with the target material, to generate a second animation file in SVGA format; and
playing the second animation file in the animation preview interface.
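The replacement step above can be sketched in code. The following is a minimal Python sketch, assuming the SVGA 1.x container layout (a zip archive holding a `movie.spec` description plus one image entry per animation component); SVGA 2.0 is instead a zlib-compressed protobuf and would need a protobuf round-trip. The function name and archive layout here are illustrative assumptions, not the patent's implementation.

```python
import io
import zipfile

def replace_material(svga_bytes: bytes, target_key: str, new_material: bytes) -> bytes:
    """Replace the original material stored under `target_key` with
    `new_material`, producing the bytes of a second animation file."""
    src = zipfile.ZipFile(io.BytesIO(svga_bytes))
    if target_key not in src.namelist():
        raise KeyError(f"no animation component named {target_key!r}")
    out = io.BytesIO()
    with zipfile.ZipFile(out, "w") as dst:
        for name in src.namelist():
            # Copy every entry unchanged except the targeted material.
            data = new_material if name == target_key else src.read(name)
            dst.writestr(name, data)
    return out.getvalue()
```

Because only the targeted entry changes, all timing and placement information in the animation description is preserved, which is why the second file plays identically except for the swapped material.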
In an optional implementation manner, the acquiring the target material provided through the material providing entrance and identifying the target component identifier corresponding to the target material further includes:
acquiring the target material provided through the material providing entrance and the configuration information of the target material, and taking the animation component identifier contained in the configuration information as the target component identifier corresponding to the target material.
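A small sketch of this identifier resolution, with an assumed fallback: the `imageKey` field name and the file-stem fallback (image keys often mirror material file names in SVGA workflows) are illustrative conventions, not mandated by the text.

```python
from pathlib import Path
from typing import Optional

def target_component_id(material_path: str, config: Optional[dict] = None) -> str:
    """Resolve the target component identifier for a provided material:
    prefer an explicit animation component identifier in the material's
    configuration information, else fall back to the file name stem."""
    if config and "imageKey" in config:
        return config["imageKey"]
    return Path(material_path).stem
```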
In an alternative embodiment, the method further comprises:
acquiring component information of at least one animation component contained in the first animation file;
And displaying the component information in the animation preview interface.
In an alternative embodiment, the displaying the component information in the animation preview interface further comprises: displaying the component identifier of the at least one animation component in the animation preview interface;
the method further comprises: detecting a first trigger operation targeting a displayed component identifier, and identifying the animation component identifier targeted by the first trigger operation; and displaying, in the animation preview interface, the original material corresponding to the animation component identifier targeted by the first trigger operation.
In an optional implementation manner, the acquiring the target material provided through the material providing entrance and identifying the target component identifier corresponding to the target material further includes:
detecting a second trigger operation targeting the displayed component information, and identifying the animation component identifier targeted by the second trigger operation;
and acquiring the target material provided through the material providing entrance, and taking the animation component identifier targeted by the second trigger operation as the target component identifier.
In an alternative embodiment, the method further comprises:
and generating a frame image of a preset animation frame in the first animation file, and displaying and/or exporting the frame image.
In an alternative embodiment, the preset animation frame is determined by:
displaying the frame identifiers of the animation frames in the first animation file, detecting a third trigger operation targeting a displayed frame identifier, identifying the frame identifier targeted by the third trigger operation, and taking the animation frame corresponding to that frame identifier as the preset animation frame;
and/or displaying a frame identifier input entrance, acquiring a frame identifier input through the frame identifier input entrance, and taking the animation frame corresponding to the input frame identifier as the preset animation frame;
and/or monitoring a preview pause operation on the first animation file, identifying the animation frame of the first animation file currently displayed in the animation preview interface, and taking that animation frame as the preset animation frame.
In an optional implementation manner, the generating the frame image of the preset animation frame in the first animation file further includes:
determining the animation components corresponding to the preset animation frame;
and rendering, in a preset layer, the original material of each corresponding animation component according to its position and/or pose in the preset animation frame, so as to generate the frame image of the preset animation frame.
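The per-layer rendering can be illustrated without an actual rasterizer by emitting an ordered list of draw operations, one per visible component, each carrying the material plus its per-frame position and pose. This sketch, with assumed field names, stands in for the compositing a real renderer would perform.

```python
def render_preset_frame(placements, materials):
    """Composite one preset animation frame in a blank layer: for each
    animation component placement (bottom layer first), emit a draw op
    carrying its original material and its per-frame position and pose."""
    ops = []
    for p in placements:
        material = materials.get(p["imageKey"])
        if material is None or p.get("alpha", 1.0) <= 0:
            continue  # missing material or fully transparent: nothing to draw
        ops.append({
            "material": material,
            "xy": (p["x"], p["y"]),
            "scale": p.get("scale", 1.0),
            "rotation": p.get("rotation", 0.0),
            "alpha": p.get("alpha", 1.0),
        })
    return ops
```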
In an optional implementation manner, the acquiring the target material provided through the material providing entrance and identifying the target component identifier corresponding to the target material further includes:
detecting a preset touch operation targeting a displayed frame image, and identifying the touch position of the preset touch operation in the frame image;
acquiring the position information, in the preset animation frame, of the animation components corresponding to the preset animation frame;
comparing the position information of the animation components in the preset animation frame with the touch position, and determining the animation component whose position information matches the touch position;
and acquiring the target material provided through the material providing entrance, and taking the component identifier of the animation component whose position information matches the touch position as the target component identifier.
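The position-matching step amounts to hit testing the touch position against each component's placement in the frame. A minimal sketch, assuming axis-aligned bounding boxes and letting the topmost (last-drawn) component win when boxes overlap; a real implementation might instead test against transformed or non-rectangular regions.

```python
def component_at(touch_xy, frame_boxes):
    """Return the identifier of the animation component whose bounding box
    in the preset frame contains the touch position; boxes are given
    bottom-to-top, so the last (topmost) hit wins. None if no match."""
    tx, ty = touch_xy
    hit = None
    for image_key, (x, y, w, h) in frame_boxes:
        if x <= tx <= x + w and y <= ty <= y + h:
            hit = image_key
    return hit
```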
In an alternative embodiment, after the determining the animation component whose position information matches the touch position, the method further includes:
displaying the original material of the determined animation component whose position information matches the touch position.
In an alternative embodiment, the method further comprises:
generating frame images of all animation frames in the first animation file;
and generating an animation file in a preset format according to the frame images of the animation frames.
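Generating a preset-format file from per-frame images is essentially an encoding plan: file names for the frame images plus the per-frame duration an encoder (e.g. a GIF writer) needs. This stdlib-only sketch computes that plan; the naming scheme and default format are assumptions for illustration.

```python
def export_plan(frame_count: int, fps: int, fmt: str = "gif"):
    """Plan a preset-format export: per-frame image file names, the
    per-frame duration in milliseconds, and the output file name."""
    if fps <= 0:
        raise ValueError("fps must be positive")
    duration_ms = round(1000 / fps)  # display time of each animation frame
    frames = [f"frame_{i:04d}.png" for i in range(frame_count)]
    return frames, duration_ms, f"animation.{fmt}"
```

An encoder would then consume each frame image in order, holding it for `duration_ms`, to reproduce the original timing in the new format.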
In an alternative embodiment, the target material comprises a plurality of target materials, and the plurality of target materials correspond to the same target component identifier;
the replacing the original material corresponding to the target component identifier in the first animation file with the target material to generate a second animation file in SVGA format further comprises:
detecting a selection operation on the target materials, determining the target material corresponding to the selection operation, and replacing the original material corresponding to the target component identifier in the first animation file with the target material corresponding to the selection operation, to generate a second animation file in SVGA format.
In an alternative embodiment, the target material comprises a plurality of target materials, and the plurality of target materials correspond to the same target component identifier;
the replacing the original material corresponding to the target component identifier in the first animation file with the target material to generate a second animation file in SVGA format, and the playing the second animation file in the animation preview interface, further comprise:
for each of the plurality of target materials, replacing the original material corresponding to the target component identifier in the first animation file with that target material to generate a second animation file corresponding to that target material, and respectively playing the second animation files corresponding to the target materials in the animation preview interface.
According to a second aspect of an embodiment of the present invention, there is provided an animation processing apparatus including:
a first acquisition module, configured to acquire a first animation file in SVGA format previewed in an animation preview interface, the animation preview interface comprising a material providing entrance;
a second acquisition module, configured to acquire a target material provided through the material providing entrance and identify a target component identifier corresponding to the target material;
a replacing module, configured to replace the original material corresponding to the target component identifier in the first animation file with the target material, so as to generate a second animation file in SVGA format;
and a display module, configured to play the second animation file in the animation preview interface.
In an alternative embodiment, the second acquisition module is configured to: acquire the target material provided through the material providing entrance and the configuration information of the target material, and take the animation component identifier contained in the configuration information as the target component identifier corresponding to the target material.
In an alternative embodiment, the apparatus further comprises: a third acquisition module, configured to acquire component information of at least one animation component included in the first animation file;
the display module is configured to: display the component information in the animation preview interface.
In an alternative embodiment, the display module is configured to: display the component identifier of the at least one animation component in the animation preview interface;
detect a first trigger operation targeting a displayed component identifier, identify the animation component identifier targeted by the first trigger operation, and display, in the animation preview interface, the original material corresponding to that animation component identifier.
In an alternative embodiment, the second acquisition module is configured to: detect a second trigger operation targeting the displayed component information, and identify the animation component identifier targeted by the second trigger operation;
and acquire the target material provided through the material providing entrance, and take the animation component identifier targeted by the second trigger operation as the target component identifier.
In an alternative embodiment, the apparatus further comprises: a generation module, configured to generate a frame image of a preset animation frame in the first animation file;
the display module is configured to: display the frame image;
and an export module, configured to export the frame image.
In an alternative embodiment, the generation module is configured to: display the frame identifiers of the animation frames in the first animation file, detect a third trigger operation targeting a displayed frame identifier, identify the frame identifier targeted by the third trigger operation, and take the animation frame corresponding to that frame identifier as the preset animation frame;
and/or display a frame identifier input entrance, acquire a frame identifier input through the frame identifier input entrance, and take the animation frame corresponding to the input frame identifier as the preset animation frame;
and/or monitor a preview pause operation on the first animation file, identify the animation frame of the first animation file currently displayed in the animation preview interface, and take that animation frame as the preset animation frame.
In an alternative embodiment, the generation module is configured to: determine the animation components corresponding to the preset animation frame;
and render, in a preset layer, the original material of each corresponding animation component according to its position and/or pose in the preset animation frame, so as to generate the frame image of the preset animation frame.
In an alternative embodiment, the second acquisition module is configured to: detect a preset touch operation targeting a displayed frame image, and identify the touch position of the preset touch operation in the frame image;
acquire the position information, in the preset animation frame, of the animation components corresponding to the preset animation frame;
compare the position information of the animation components in the preset animation frame with the touch position, and determine the animation component whose position information matches the touch position;
and acquire the target material provided through the material providing entrance, and take the component identifier of the animation component whose position information matches the touch position as the target component identifier.
In an alternative embodiment, the display module is configured to: display the original material of the determined animation component whose position information matches the touch position.
In an alternative embodiment, the generation module is configured to: generate frame images of all animation frames in the first animation file;
and generate an animation file in a preset format according to the frame images of the animation frames.
In an alternative embodiment, the target material comprises a plurality of target materials, and the plurality of target materials correspond to the same target component identifier;
the replacing module is configured to: detect a selection operation on the target materials, determine the target material corresponding to the selection operation, and replace the original material corresponding to the target component identifier in the first animation file with the target material corresponding to the selection operation, to generate a second animation file in SVGA format.
In an alternative embodiment, the target material comprises a plurality of target materials, and the plurality of target materials correspond to the same target component identifier;
the replacing module is configured to: for each of the plurality of target materials, replace the original material corresponding to the target component identifier in the first animation file with that target material to generate a second animation file corresponding to that target material;
the display module is configured to: respectively play the second animation files corresponding to the target materials in the animation preview interface.
According to a third aspect of the embodiments of the present invention, there is provided an electronic device, comprising: a processor, a memory, a communication interface, and a communication bus, wherein the processor, the memory, and the communication interface communicate with each other through the communication bus;
the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform operations corresponding to the above-described animation processing method.
According to a fourth aspect of the embodiments of the present invention, there is provided a computer storage medium having at least one executable instruction stored therein, the executable instruction causing a processor to perform operations corresponding to the above-described animation processing method.
According to the embodiment of the invention, a material providing entrance is displayed in the animation preview interface of the first animation file in SVGA format, so that a user can provide a target material through the material providing entrance and replace the original material of the corresponding component in the first animation file with the target material to generate a second animation file, thereby enabling modification of the first animation file in SVGA format and improving SVGA animation production efficiency; and the second animation file is previewed promptly after it is generated, so that the user can check the modified animation effect in time, further improving animation production efficiency and user experience.
According to the embodiment of the invention, the target component identifier corresponding to the target material is determined according to the configuration information of the target material provided through the material providing entrance, which improves the accuracy of target material provision.
The embodiment of the invention also displays the component information of the animation components of the first animation file in the animation preview interface, thereby facilitating the accurate determination of the animation components needing material replacement by a user.
The embodiment of the invention detects a first trigger operation targeting a displayed component identifier, identifies the animation component identifier targeted by the first trigger operation, and displays the corresponding original material in the animation preview interface, which makes it convenient for the user to view the original material of an animation component and thus to accurately determine the animation component whose material needs replacement.
The embodiment of the invention detects a second trigger operation targeting the displayed component information, identifies the animation component identifier targeted by the second trigger operation, acquires the target material provided through the material providing entrance, and takes the animation component identifier targeted by the second trigger operation as the target component identifier, so that the user can first select an animation component and then provide its target material, improving the efficiency of target material provision.
The embodiment of the invention generates a frame image of a preset animation frame in the first animation file, and displays and/or exports the frame image, thereby meeting the user's need to view and export frame images.
In the embodiment of the invention, the preset animation frame can be selected by the user, or determined from a frame identifier input by the user, so that the corresponding frame image is generated according to the user's needs; the animation frame displayed at the moment of a pause can also be taken as the preset animation frame and its frame image generated, meeting users' individual needs.
The embodiment of the invention determines the animation components corresponding to the preset animation frame and renders them in a preset layer according to the original material of each corresponding animation component and its position and/or pose in the preset animation frame, so as to generate the frame image of the preset animation frame, thereby improving the accuracy of frame image generation.
The embodiment of the invention can also detect a preset touch operation targeting a displayed frame image and take the animation component whose position information in the preset animation frame matches the touch position as the component whose material is to be replaced, so that the user can select that component directly from a visual image, improving material replacement efficiency and user experience.
The embodiment of the invention also displays the original material of the animation component whose position information matches the touch position, making it convenient for the user to view the original material of the selected component.
The embodiment of the invention generates the frame images of each animation frame in the first animation file, and generates the animation file with the preset format according to the frame images of each animation frame, thereby meeting the requirements of users for animation files with different formats.
According to the embodiment of the invention, under the condition that a plurality of target materials correspond to the same target component identifier, the second animation file corresponding to the selected target materials can be generated according to the selection of the user, so that the user can conveniently and sequentially check the replacement effect of different target materials.
According to the embodiment of the invention, under the condition that the plurality of target materials correspond to the same target component identifier, the second animation files corresponding to the target materials can be generated at the same time, and the plurality of second animation files are played, so that a user can conveniently compare the replacement effects of the plurality of target materials.
The foregoing description is only an overview of the technical solutions of the embodiments of the present invention, and may be implemented according to the content of the specification, so that the technical means of the embodiments of the present invention can be more clearly understood, and the following specific implementation of the embodiments of the present invention will be more apparent.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
fig. 1 is a schematic flow chart of an animation processing method according to an embodiment of the present invention;
FIG. 2 is a flow chart of another animation processing method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an animation preview interface according to an embodiment of the present invention;
FIG. 4 is a flow chart of another animation processing method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of another animation preview interface provided by an embodiment of the present invention;
FIG. 6 is a flowchart of still another animation processing method according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of yet another animation preview interface provided by an embodiment of the present invention;
Fig. 8 is a schematic diagram showing a functional configuration of an animation processing device according to an embodiment of the present invention;
Fig. 9 shows a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that embodiments of the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the embodiments to those skilled in the art.
Fig. 1 shows a flow chart of an animation processing method according to an embodiment of the present invention. The animation processing method provided in the embodiment may be executed by an SVGA tool in the electronic device. The flowcharts in the embodiments of the present invention are not intended to limit the order in which the steps are performed. Some of the steps in the flow chart may be added or subtracted as desired.
Specifically, as shown in fig. 1, the method includes the steps of:
Step S110, a first animation file in a SVGA format previewed in an animation preview interface is obtained; the animation preview interface comprises a material providing inlet.
The first animation file is an SVGA format animation file previewed by using the SVGA tool in the embodiment of the present invention, and is also an SVGA format animation file currently presented by an animation preview interface of the SVGA tool.
In an alternative embodiment, the preview of the first animation file may be implemented by, but is not limited to, one or more of the following:
Preview mode one: previewing is achieved in a drag manner. For example, the animation preview interface includes a response hot zone, and when it is detected that a user drags a file icon or the like of an animation file in SVGA format to the response hot zone, the animation file in SVGA format is determined as a first animation file, and a preview of the first animation file is started. By adopting the preview mode, the preview of the animation file in the local SVGA format of the electronic equipment can be realized, and the preview efficiency is high.
Preview mode two: previewing via address input. For example, the animation preview interface includes an address input entrance, into which the user may input the storage address of an animation file; the SVGA format animation file corresponding to the storage address is taken as the first animation file. The storage address may be a local storage address of the electronic device or a remote network storage address. This preview mode supports previewing both local and non-local SVGA format animation files, widening the applicability of the method.
Preview mode three: previewing via file selection. For example, the animation preview interface includes a file selection entrance; after triggering it, the SVGA format animation files stored locally on the current electronic device are displayed, and the user may select one of them as the first animation file by clicking or another operation. In this way, previewing of a local SVGA format animation file can be realized quickly.
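The three preview modes differ mainly in where the file comes from. A small sketch of the source classification a preview tool might perform, assuming dragged or selected files arrive as local paths and address input may name either a local path or an http(s) address:

```python
from urllib.parse import urlparse

def classify_preview_source(spec: str) -> str:
    """Decide how a previewed SVGA file should be fetched: http(s)
    addresses are loaded over the network, everything else is read as a
    local path (covering drag-and-drop and file selection)."""
    scheme = urlparse(spec).scheme
    return "network" if scheme in ("http", "https") else "local"
```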
When the first animation file is previewed in the animation preview interface, the first animation file can be played in the animation preview interface, so that a user can conveniently and intuitively check the production effect of the first animation file. The embodiment of the invention is not limited to a specific playing form, and for example, a one-time playing form, a multi-time playing form, a cyclic playing form and/or the like can be adopted to realize the playing of the first animation file. The playing form can be selected by the user, so that the personalized requirement of the user is met; the playing form can also be automatically set, so that the preview efficiency of the animation file is improved.
Unlike prior-art SVGA preview tools, the animation preview interface in the embodiment of the present invention further includes a material providing entrance. After previewing the first animation file, if the user determines that some material in the first animation file does not meet requirements, the user may provide animation material through the material providing entrance, facilitating material replacement in the first animation file. The embodiment of the invention does not limit the display position, display style, or the like of the material providing entrance.
Step S120, acquiring the target material provided through the material providing entrance, and identifying the target component identifier corresponding to the target material.
An animation is composed of a series of animation frames arranged in time; each animation frame corresponds to a time point, and the animation frames are rendered and displayed in chronological order to present the animation effect. In an SVGA format animation, each animation frame may be composed of one or more animation components, and an animation component may recur across multiple animation frames. When an animation component recurs across multiple frames, its display form is either identical in all of them, or only its position and/or pose changes from frame to frame. The position of an animation component in an animation frame is specifically its rendering position in that frame, and its pose is specifically its scaling size, rotation angle and/or transparency in that frame. Animation components in SVGA format animations may also be referred to as display units, imageKeys, etc. Each animation component has a corresponding display material, which is the visual representation of the component. For example, if the animation component is a flower, its display material is specifically a picture of a flower.
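The frame/component structure described above can be captured in a small in-memory model: materials are stored once per component, and each frame only records per-component placements (position and pose). The class and field names below are illustrative assumptions, not the SVGA specification.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class FramePlacement:
    """Per-frame appearance of one animation component: only position and
    pose vary between frames; the display material itself is shared."""
    image_key: str
    x: float = 0.0
    y: float = 0.0
    scale: float = 1.0      # pose: scaling size
    rotation: float = 0.0   # pose: rotation angle
    alpha: float = 1.0      # pose: transparency

@dataclass
class SvgaMovie:
    materials: Dict[str, bytes]         # imageKey -> display material bytes
    frames: List[List[FramePlacement]]  # time-ordered animation frames

    def components_in_frame(self, index: int) -> List[str]:
        return [p.image_key for p in self.frames[index]]
```

Replacing a material in this model is a single dictionary update, which is why one swap changes the component's appearance in every frame where it recurs.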
In the embodiment of the present invention, if the user needs to modify the display material of a certain animation component in the first animation file, the user can provide another display material through the material providing entry in the animation preview interface; the display material provided through the material providing entry is the target material. The embodiment of the present invention does not limit the specific type of the target material; for example, the display material may be a bitmap, a vector diagram, and the like. In addition, the number of target materials is not limited in the embodiment of the present invention; one or more target materials may be provided.
In an alternative embodiment, the target material may be provided in one or more of the following ways, though the provision is not limited to these.
The first providing mode: uploading local material. Specifically, the user may drag a display material stored locally on the electronic device to the material providing entry, thereby realizing drag-and-drop uploading of local material; or, after the user triggers the material providing entry, a list of locally stored display materials is displayed, and the user selects the corresponding display material to upload; or, the user may input the local storage address of a display material, and the local material corresponding to that address is uploaded. This providing mode makes it convenient for the user to take local material as the target material, so that the user can locally produce the corresponding material to replace material in the first animation file, improving the accuracy of material provision.
The second providing mode: entering a remote address. Specifically, the user may input a remote address at which the display material is stored, and the corresponding display material is obtained from the remote address over the network. This mode makes it convenient for the user to provide the target material remotely; on the one hand it widens the application range of material provision, and on the other hand it facilitates cross-platform provision of material.
The third providing mode: selecting from a material library. Specifically, target materials provided by the user in the past may be stored in a material library, and material samples acquired or automatically generated by the system may also be stored in the material library, so that at least one display material is stored in the material library. In this mode, the display materials in the material library are presented to the user, and the user selects the corresponding display material from the material library as the target material provided this time, thereby saving material transmission resources.
As can be seen from the above description, each animation component has a corresponding display material, which is the visual representation of the animation component, so the display material has a correspondence with the animation component. After the target material is obtained, the animation component corresponding to the target material is identified; the animation component corresponding to the target material is the target animation component, and the component identifier of the target animation component is the target component identifier. Thus, for any target material, the component identifier of the animation component corresponding to that material is the target component identifier corresponding to that material. Component identifiers have local uniqueness, that is, the component identifiers of the animation components in the same animation file are all different; a component identifier may be a component name, a component ID, and the like.
In yet another alternative embodiment, some animation files have corresponding material format requirements; for example, some animation files can only display PNG or JPG materials. To ensure the modification effect on the first animation file, in this embodiment the material format configuration file of the first animation file is further obtained, and the material requirement of the first animation file is parsed from the material format configuration file. Then, after the target material provided through the material providing entry is obtained, the material format of the target material is checked: if the material format of the target material meets the material requirement of the first animation file, the target material is retained; if it does not, the target material is discarded, and prompt information indicating that the material format does not meet the requirement is fed back to the user. Further optionally, prompt information about the material requirement of the first animation file may be displayed near or inside the material providing entry, so that the user can provide accurate target material.
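One minimal way to implement the material format check described above is to sniff the magic bytes of the uploaded material and compare them against the formats allowed by the configuration. The signature table and function name are illustrative assumptions, not part of the patent's method.

```python
# Magic-byte signatures of the two formats mentioned in the text.
ALLOWED_SIGNATURES = {
    "PNG": b"\x89PNG\r\n\x1a\n",
    "JPG": b"\xff\xd8\xff",
}

def check_material_format(material: bytes, allowed_formats: set) -> bool:
    """Return True if the material's byte signature matches an allowed format."""
    for fmt in allowed_formats:
        sig = ALLOWED_SIGNATURES.get(fmt)
        if sig and material.startswith(sig):
            return True
    return False
```

A material that fails this check would be discarded and a format prompt fed back to the user, as the embodiment describes.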
Step S130, replacing the original material corresponding to the target component identifier in the first animation file with the target material, so as to generate a second animation file in the SVGA format.
The display material of each animation component in the first animation file is called original material. After any target material is obtained, the original material corresponding to the target component identifier, that is, the original material of the animation component corresponding to the target material, is replaced with the target material; the animation file after replacement is the second animation file. Because the embodiment of the present invention only replaces the display material without changing other contents of the animation file, the generated second animation file is still an animation file in the SVGA format. In the replacing process, the original material corresponding to the target component identifier in the first animation file is deleted, and the target material is added at the position of the original material in the first animation file, so that the original material is replaced with the target material; other configuration information of the animation component corresponding to the target component identifier, such as the animation frame information corresponding to the component and the position and posture information in the corresponding animation frames, remains unchanged.
In an alternative embodiment, the first animation file is specifically copied to generate a copy of the first animation file, and the original material corresponding to the target component identifier in the copy is replaced with the target material, so as to generate the second animation file in the SVGA format while retaining the original first animation file.
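The copy-then-replace step can be sketched as follows. The dictionary layout (an `"images"` map from imageKey to material bytes alongside untouched sprite configuration) is a hypothetical in-memory form of the animation file; only the material bytes under the target identifier change, and the first animation file itself is left intact.

```python
import copy

def replace_material(animation: dict, target_id: str, target_material: bytes) -> dict:
    """Generate the second animation file by swapping one material.
    `animation` is a hypothetical in-memory form of the SVGA file:
    {"images": {imageKey: material_bytes}, "sprites": [...]}.
    Sprite frame/pose configuration is deliberately not touched."""
    second = copy.deepcopy(animation)   # keep the first animation file intact
    if target_id in second["images"]:
        second["images"][target_id] = target_material
    return second
```

Working on a deep copy is what preserves the original file for the side-by-side comparison mentioned later in this embodiment.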
Step S140, playing the second animation file in the animation preview interface.
In the process of generating the second animation file, the second animation file can be played promptly, so as to realize real-time or near-real-time preview of the second animation file, making it convenient for the user to quickly check the animation effect after material replacement and improving animation production efficiency.
In an optional implementation, in order to facilitate the user quickly determining the difference in animation effect before and after material replacement, the embodiment of the present invention may play both the first animation file and the second animation file in the animation preview interface, so that the user can conveniently compare the effects of the two. The display areas of the first animation file and the second animation file in the animation preview interface are different, so as to avoid interference between different animation files.
It can be seen that the embodiment of the present invention displays the material providing entry in the animation preview interface of the first animation file in the SVGA format, so that the user can provide the target material through the material providing entry and replace the original material of the corresponding component in the first animation file with the target material to generate the second animation file, thereby realizing modification of the first animation file in the SVGA format and improving the production efficiency of SVGA animation. In addition, the second animation file is previewed promptly after being generated, so that the user can check the modified animation effect in time, further improving animation production efficiency and user experience.
Fig. 2 is a flow chart illustrating another animation processing method according to an embodiment of the present invention. The animation processing method provided in this embodiment may be executed by an SVGA tool in an electronic device. The flowcharts in the embodiments of the present invention are not intended to limit the order in which the steps are performed, and steps may be added to or removed from the flowchart as required.
Specifically, as shown in fig. 2, the method includes the steps of:
Step S210, obtaining a first animation file in the SVGA format previewed in an animation preview interface.
Step S220, obtaining the component information of at least one animation component contained in the first animation file, and displaying the component information in the animation preview interface.
In order to improve the accuracy of material replacement, the embodiment of the present invention displays the component information of the animation components of the first animation file in the animation preview interface. The component information may be the component identifier and/or the original material of an animation component, and the like.
In an alternative embodiment, the component identifier of at least one animation component is displayed in the animation preview interface; a first trigger operation for a displayed component identifier is detected, the animation component identifier targeted by the first trigger operation is identified, and the original material corresponding to that animation component identifier is displayed in the animation preview interface. In this embodiment, the user can select the animation component to be viewed through the first trigger operation, and the original material of that animation component in the first animation file is then presented, so that the user can accurately determine the animation component whose material needs to be replaced. The first trigger operation may be a click operation, a voice control operation, a shortcut key operation, and the like.
Step S230, obtaining the target material provided through the material providing entry in the animation preview interface and the configuration information of the target material, and taking the animation component identifier contained in the configuration information as the target component identifier corresponding to the target material.
The animation preview interface is provided with a material providing entry, through which the user can provide the target material and the configuration information of the target material. The configuration information contains the target component identifier of the target animation component corresponding to the target material, so the animation component identifier contained in the configuration information is taken as the target component identifier corresponding to the target material. For example, the user may determine the animation component whose material needs to be replaced from the displayed original material of the animation components, and then write the component identifier of that animation component into the configuration information of the provided target material. This implementation facilitates accurate replacement of materials.
In an optional implementation, the configuration information of the target material may be the material name, where the material name contains the target component identifier of the target animation component corresponding to the target material, so that the target component identifier corresponding to the target material can be determined from the material name of the target material.
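Carrying the component identifier in the material name implies some naming convention. The convention below (`<componentId>__<description>.<ext>`, e.g. `22-01__flower.png`) is purely an assumed example; the patent does not specify one.

```python
def component_id_from_name(material_name: str):
    """Extract the component identifier embedded in the material name,
    or None if the name does not follow the assumed convention
    '<componentId>__<description>.<ext>'."""
    stem = material_name.rsplit(".", 1)[0]   # drop the file extension
    if "__" not in stem:
        return None
    return stem.split("__", 1)[0]
```

A name that yields `None` here would fall through to the explicit configuration-information path described above.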
In yet another alternative embodiment, after the target material and its configuration information are received, material format verification may be performed on the target material (the specific verification process may be as described in the embodiment of Fig. 1), and/or identifier verification may be performed on the configuration information. When identifier verification is performed on the configuration information, specifically, the component identifiers of the animation components in the first animation file are compared with the configuration information; if the configuration information is consistent with the component identifier of a certain animation component, the verification passes; otherwise, the verification fails, and prompt information indicating abnormal configuration information is fed back to the user.
Further optionally, in case the identifier verification of the configuration information fails, in order to help the user quickly provide accurate configuration information, after the component identifiers of the animation components in the first animation file are compared with the configuration information, a component identifier with high similarity to the configuration information may be determined; when the prompt information indicating abnormal configuration information is fed back to the user, that highly similar component identifier may also be provided, so that the user can select it to quickly correct the configuration information.
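The verify-then-suggest flow can be sketched with the standard library's fuzzy matcher. The function name, return shape, and the similarity cutoff of 0.6 are illustrative choices, not values taken from the patent.

```python
import difflib

def verify_component_id(config_id: str, component_ids: list):
    """Identifier verification with correction hints: returns (passed, suggestions).
    If the configured identifier matches no component in the first animation
    file, suggest the most similar identifiers so the user can quickly
    correct the configuration information."""
    if config_id in component_ids:
        return True, []
    suggestions = difflib.get_close_matches(config_id, component_ids, n=3, cutoff=0.6)
    return False, suggestions
```

An empty suggestion list would correspond to feeding back only the plain "abnormal configuration information" prompt.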
Step S240, detecting a second trigger operation for the displayed component information, identifying the animation component identifier targeted by the second trigger operation, obtaining the target material provided through the material providing entry, and taking the animation component identifier targeted by the second trigger operation as the target component identifier.
In this step, the user can start the providing process of the target material of a certain animation component through the second trigger operation. The second trigger operation may be a double-click operation, a long-press operation, a voice control operation, a shortcut key operation, and the like. After the second trigger operation for the displayed component information is detected, the animation component identifier targeted by the second trigger operation is identified; the target material subsequently provided through the material providing entry is then the replacement material of the animation component corresponding to that identifier. This implementation does not require configuration information to be attached to the target material, which improves the efficiency of material provision.
Step S230 and step S240 are two different ways of providing the target material; they may be performed alternatively or sequentially.
Step S250, replacing the original material corresponding to the target component identifier in the first animation file with the target material to generate a second animation file in the SVGA format, and playing the second animation file in the animation preview interface.
Specifically, this step may be performed after a replacement preview operation is detected. The replacement preview operation may be a trigger operation on a preset control, a shortcut key operation, a voice control operation, and the like.
Taking Fig. 3 as an example, the animation preview interface includes a display field 31, a display field 32, a display field 33, and a display field 34. The display field 31 can preview the animation file; the display field 32 displays a component list, which shows the component identifiers of the animation components contained in the first animation file; the display field 33 displays the original material of the currently selected animation component. In the initial state, the display field 33 displays the original material of a default animation component; the user can trigger a corresponding component identifier in the component list displayed in the display field 32 through the first trigger operation, and the display field 33 then displays the original material corresponding to the triggered component identifier. Therefore, the user can preview the first animation file through the animation preview interface, and can also view the original material of each animation component in the first animation file through the same interface.
The display field 34 displays a "click upload material" control, which is the material providing entry. After clicking the "click upload material" control, the user can upload the target material and the configuration information of the target material; the component identifier contained in the configuration information of the target material is the target component identifier corresponding to the target material.
The user may also trigger a certain component identifier through the second trigger operation, and then click the "click upload material" control to upload the target material. The triggered component identifier is then the target component identifier corresponding to the uploaded target material.
After the target material is uploaded, the "replace preview" control in the display field 34 may be triggered through a third trigger operation, so that the original material corresponding to the target component identifier in the first animation file is replaced with the target material to generate a second animation file in the SVGA format, and the generated second animation file is played in the display field 31, allowing the user to check the animation effect after material replacement.
It can be seen that the embodiment of the present invention displays the component information of the animation components contained in the first animation file in the animation preview interface, making it convenient for the user to view the information of each animation component in the first animation file and accurately determine the animation component whose material needs to be replaced. In the embodiment of the present invention, the user can provide the target material together with its configuration information through the material providing entry, and the animation component identifier contained in the configuration information is used as the target component identifier corresponding to the target material, which further improves the accuracy of material replacement; the user can also first select a certain component identifier and then provide the target material corresponding to it, which improves the efficiency of providing the target material.
Fig. 4 is a schematic flow chart of another animation processing method according to an embodiment of the present invention. The animation processing method provided in this embodiment may be executed by an SVGA tool in an electronic device. The flowcharts in the embodiments of the present invention are not intended to limit the order in which the steps are performed, and steps may be added to or removed from the flowchart as required.
Specifically, as shown in fig. 4, the method includes the steps of:
Step S410, obtaining a first animation file in the SVGA format previewed in an animation preview interface.
Step S420, generating a frame image of a preset animation frame in the first animation file, and displaying the frame image.
The first animation file corresponds to a plurality of animation frames. In the embodiment of the present invention, a preset animation frame is determined from the plurality of animation frames, a frame image of the preset animation frame is generated, and the frame image can be displayed in the animation preview interface. The frame image can be in a picture format such as PNG or JPG.
In an alternative embodiment, the preset animation frame may be determined in one or more of the following ways:
The first determination mode: displaying the frame identifiers of the animation frames in the first animation file, detecting a third trigger operation for a displayed frame identifier, identifying the frame identifier targeted by the third trigger operation, and taking the animation frame corresponding to that frame identifier as the preset animation frame. In this determination mode, the frame identifier of each animation frame in the first animation file may be displayed to the user, and the user selects the corresponding frame identifier through the third trigger operation, which may specifically be a click operation, a preset shortcut key operation, a voice control operation, and so on. The animation frame corresponding to the frame identifier targeted by the third trigger operation is the preset animation frame, so that the user can select the animation frame whose frame image is to be viewed, improving the execution efficiency of the method.
The second determination mode: displaying a frame identifier input entry, obtaining the frame identifier input through the frame identifier input entry, and taking the animation frame corresponding to the input frame identifier as the preset animation frame. In this determination mode, the user can directly input a frame identifier to determine the animation frame whose frame image is to be viewed, meeting the user's requirement and improving user experience.
The third determination mode: monitoring a preview pause operation on the first animation file, identifying the animation frame of the first animation file currently displayed in the animation preview interface, and taking that animation frame as the preset animation frame. In this determination mode, the user can pause the playing of the first animation file through the preview pause operation; after the playing is paused, the currently presented animation frame can be taken as the preset animation frame, so that the frame image of the animation frame displayed at the time of pausing is generated, meeting the personalized requirement of the user.
In yet another alternative embodiment, the frame image of the preset animation frame may be determined specifically as follows: the animation components corresponding to the preset animation frame are determined, and rendering is performed in a preset layer according to the original material of each corresponding animation component and its position and/or posture in the preset animation frame, so as to generate the frame image of the preset animation frame. Specifically, for any animation component corresponding to the preset animation frame, the rendering position, rotation angle, scaling ratio and/or transparency of the original material of the animation component are determined according to the position and/or posture of the animation component in the preset animation frame; the original material is then rendered according to the rendering position, rotation angle, scaling ratio and/or transparency, and a frame image in a picture format is generated.
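The pose-to-pixels step above can be illustrated with a minimal affine-transform helper. This is a deliberate simplification: a real renderer would apply the sprite's per-frame transform matrix to the whole bitmap and composite it into the layer with the frame's transparency, rather than transforming one point at a time.

```python
import math

def pose_transform(x, y, px, py, scale=1.0, rotation_deg=0.0):
    """Map a point (x, y) of a component's material into layer coordinates,
    applying the per-frame pose: scale first, then rotation, then
    translation to the rendering position (px, py)."""
    s = math.sin(math.radians(rotation_deg))
    c = math.cos(math.radians(rotation_deg))
    sx, sy = x * scale, y * scale          # scaling ratio
    return (px + sx * c - sy * s,          # rotated, then translated
            py + sx * s + sy * c)
```

Transforming the four corners of a material with this helper also yields the component's rendered rectangle in the frame, which is what the touch-position matching in the next embodiment compares against.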
In yet another alternative embodiment, after the frame image of the preset animation frame is generated, the frame image may be exported. The user can input a corresponding export address, and the frame image is exported to that address for storage.
Step S430, detecting a preset touch operation for the displayed frame image, identifying the touch position of the preset touch operation in the frame image, obtaining the position information of the animation components corresponding to the preset animation frame within the preset animation frame, comparing the position information of the animation components in the preset animation frame with the touch position, determining the animation component whose position information matches the touch position, obtaining the target material provided through the material providing entry, and taking the component identifier of the animation component whose position information matches the touch position as the target component identifier.
In this embodiment, the user may select the animation component whose material needs to be replaced by touching the corresponding portion of the frame image, and then provide the target material of that animation component.
Specifically, after the preset touch operation for the displayed frame image is detected, the touch position of the preset touch operation in the frame image is identified. For example, after it is detected that the user clicks on the displayed frame image, the click position may be determined; the click position is the touch position of the preset touch operation in the frame image.
Further, the position information of the animation components corresponding to the preset animation frame within the preset animation frame is obtained from the first animation file, and the position information of each animation component in the preset animation frame is compared with the touch position. If the position of a certain animation component in the preset animation frame includes the touch position, it is determined that the position information of that animation component matches the touch position, and that animation component is the one for which the user wants to replace the material.
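The position matching described here is a hit test. A minimal sketch, assuming each component's position in the preset frame is summarized as an axis-aligned rectangle (real components with rotation would need a finer-grained test):

```python
def hit_test(touch, components):
    """Return the identifier of the animation component whose rendering
    rectangle contains the touch position, or None if no component matches.
    `components` maps component identifiers to (x, y, width, height)
    rectangles in the preset animation frame."""
    tx, ty = touch
    for component_id, (x, y, w, h) in components.items():
        if x <= tx <= x + w and y <= ty <= y + h:
            return component_id
    return None
```

When components overlap, iteration order decides which one wins; a practical implementation might prefer the topmost layer at the touch position.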
In an alternative embodiment, after the animation component whose position information matches the touch position is determined, the original material of that animation component may further be displayed, so that the user can intuitively view the original material of the touched animation component. Further optionally, the component identifier of that animation component may also be highlighted, so that the user knows the component identifier of the animation component.
Further, the target material provided through the material providing entry is obtained, and the component identifier of the animation component whose position information matches the touch position is taken as the target component identifier.
In an alternative embodiment, frame images of each animation frame in the first animation file may be generated, and an animation file in a preset format may be generated from the frame images of each animation frame. The preset format may be the MP4 format, the GIF format, and the like, so as to satisfy the user's animation file format conversion requirements. More specifically, when the user exports the frame images, the frame images may be exported to a corresponding folder, and this embodiment then generates the animation file in the preset format directly from the frame images in that folder.
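Before the exported frame images can be fed to a GIF or MP4 encoder, they must be collected from the folder in playback order. The naming scheme `frame_<index>.png` below is an assumed convention for illustration; a plain lexicographic sort would put `frame_10` before `frame_2`, so the sort key is the numeric frame index.

```python
import re
from pathlib import Path

def ordered_frame_images(folder: str, pattern: str = r"frame_(\d+)\.png"):
    """Collect exported frame images from a folder and sort them by their
    numeric frame index, ready to be handed to an encoder that assembles
    the animation file in the preset format (e.g. GIF or MP4)."""
    rx = re.compile(pattern)
    frames = []
    for p in Path(folder).iterdir():
        m = rx.fullmatch(p.name)
        if m:                                   # skip files that are not frames
            frames.append((int(m.group(1)), p))
    return [p for _, p in sorted(frames)]
```

The actual encoding step is library-specific (e.g. an imaging library's GIF writer or an ffmpeg invocation) and is omitted here.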
Step S440, replacing the original material corresponding to the target component identifier in the first animation file with the target material to generate a second animation file in the SVGA format, and playing the second animation file in the animation preview interface.
Taking Fig. 5 as an example, the animation preview interface includes a display field 51, a display field 52, a display field 53, and a display field 54. The display field 51 can preview the animation file and includes a "pause/play" control; if the first animation file is currently in the playing state, after the user clicks the "pause/play" control (that is, performs a preview pause operation on the first animation file), the frame image of the animation frame displayed at the time of pausing is generated and displayed. The display field 51 further includes an "input view animation frame" control. After the user clicks this control, the user may input a frame identifier, so that the frame image of the animation frame corresponding to the input identifier is generated and displayed; alternatively, after the user clicks this control, the frame identifiers of the animation frames contained in the first animation file may be displayed, and the user selects a corresponding frame identifier, so that the frame image of the animation frame corresponding to the selected identifier is generated and displayed. In this way, the user can view the frame image of an animation frame according to his or her own needs, which makes it convenient to grasp the static presentation effect of the animation frame.
The display field 51 may also display the frame image of the preset animation frame; the user may click a corresponding position in the frame image, and if the touch position matches the position of a certain animation component in the preset animation frame, the original material of that animation component may be displayed in the display field 53, and correspondingly, the component identifier of that animation component is highlighted in the display field 52, making it convenient for the user to learn the component identifier of the animation component.
If the user further performs a corresponding operation, the providing process of the target material for the animation component is entered. For example, when the user clicks a corresponding position in the frame image and the animation component 22-01 matching that position is determined, the user may further long-press the touch position, and then click the "click upload material" control in the display field 54 to upload the target material; the animation component identifier corresponding to the uploaded target material is 22-01. In addition, the user may also adopt step S230 and/or step S240 in the embodiment of Fig. 2 to provide the target material and determine the target component identifier.
It can be seen that the embodiment of the present invention generates the frame image of the preset animation frame in the first animation file and displays the frame image, so that the user can view the static display effect of the animation frame, accurately evaluate the effect of the first animation file, and accurately determine the animation component whose material needs to be replaced, thereby improving animation production efficiency.
Fig. 6 shows a flowchart of an animation processing method according to an embodiment of the present invention. The animation processing method provided in this embodiment may be executed by an SVGA tool in an electronic device. The flowcharts in the embodiments of the present invention are not intended to limit the order in which the steps are performed, and steps may be added to or removed from the flowchart as required.
Specifically, as shown in fig. 6, the method includes the steps of:
Step S610, obtaining a first animation file in the SVGA format previewed in an animation preview interface, where the animation preview interface includes a material providing entry.
Step S620, obtaining a plurality of target materials provided through the material providing entry, and identifying the target component identifier corresponding to each target material, where the plurality of target materials correspond to the same target component identifier.
In the embodiment of the present invention, a plurality of target materials can be provided in batches through the material providing entry; the target materials provided in batches may correspond to different target component identifiers or to the same target component identifier. The embodiment of the present invention mainly optimizes the scenario in which a plurality of target materials correspond to the same target component identifier.
Step S630, a selection operation on the target materials is detected, the target material corresponding to the selection operation is determined, and the original material corresponding to the target component identifier in the first animation file is replaced with the target material corresponding to the selection operation, to generate a second animation file in the SVGA format.
After a user selects a certain target component identifier, if that identifier corresponds to a plurality of target materials, the plurality of target materials are displayed. The user can choose a target material through a selection operation, and the original material corresponding to the target component identifier in the first animation file is replaced with the selected target material. After uploading a plurality of target materials, the user can thus select them for replacement in turn and check the animation effect after each replacement, improving the accuracy of material replacement.
Step S640, for each target material among the plurality of target materials, the original material corresponding to the target component identifier in the first animation file is replaced with that target material to generate a second animation file corresponding to it, and the second animation file corresponding to each target material is played in the animation preview interface.
In this step, when the target component identifier corresponds to a plurality of target materials, the original material corresponding to the target component identifier in the first animation file is replaced with each target material in turn, generating a second animation file corresponding to that target material. The second animation files corresponding to the target materials can then be played respectively, making it convenient for the user to compare the replacement effects of different target materials.
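A minimal sketch of step S640, using an assumed in-memory mapping from component identifiers to material payloads rather than the real SVGA binary container, generates one second animation file per candidate target material:

```python
def replace_material(animation, component_id, material):
    """Return a copy of the first animation file with one component's
    original material replaced by the given target material."""
    second = dict(animation)  # shallow copy; the first animation file stays intact
    second[component_id] = material
    return second

first = {"22-01": "circle", "22-02": "square"}
candidates = ["pentagon", "hexagon"]  # several target materials, same component id
# One second animation file per candidate, for side-by-side preview.
seconds = [replace_material(first, "22-01", m) for m in candidates]
assert first["22-01"] == "circle"  # the first animation file is unchanged
```

Keeping the first animation file untouched is what allows the interface to play second animation files X and Y simultaneously, as described for fig. 7 below.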
Step S630 and step S640 are two ways of generating the second animation file; in practice they may be executed alternatively or simultaneously. When executed simultaneously, the user can select a plurality of target materials, second animation files corresponding to the selected target materials are generated, and those second animation files are displayed.
Taking fig. 7 as an example, the user clicks animation component identifier 22-01 in display field 72, and the original material of animation component identifier 22-01 is then shown in display field 73 (corresponding to the circle in display field 73 in fig. 7). Two target materials corresponding to animation component identifier 22-01 are shown in display field 74 (corresponding to the pentagonal and hexagonal materials in display field 74 in fig. 7). After clicking the pentagon target material, the user can replace the original material with it, generate a second animation file X, and play the second animation file X in display field 71. The user may likewise click the hexagon target material, replace the original material with it, generate a corresponding second animation file Y, and play the second animation file Y in display field 71. Alternatively, without requiring a selection, both the second animation file X and the second animation file Y can be generated and played simultaneously in display field 71, so that the user can visually compare their effects.
Therefore, the embodiment of the invention can accept target materials in batches, improving the efficiency of providing materials. When a plurality of target materials correspond to the same target component identifier, the user can select one target material and generate the corresponding second animation file for playback, conveniently checking that material's replacement effect; second animation files corresponding to all the target materials can also be generated and played, conveniently comparing the replacement effects of different target materials.
Fig. 8 is a schematic diagram showing a functional structure of an animation processing device according to an embodiment of the present invention. As shown in fig. 8, the apparatus 800 includes: a first acquisition module 810, a second acquisition module 820, a replacement module 830, and a presentation module 840.
A first obtaining module 810, configured to obtain a first animation file in SVGA format previewed in an animation preview interface; the animation preview interface comprises a material providing inlet;
a second obtaining module 820, configured to obtain a target material provided through the material providing inlet, and identify a target component identifier corresponding to the target material;
a replacing module 830, configured to replace original material corresponding to the target component identifier in the first animation file with the target material, so as to generate a second animation file in SVGA format;
and a display module 840, configured to play the second animation file in the animation preview interface.
In an alternative embodiment, the second acquisition module is configured to: and acquiring the target material provided by the material providing inlet and the configuration information of the target material, and taking the animation component identifier contained in the configuration information as a target component identifier corresponding to the target material.
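The configuration-information route can be sketched as follows; the key names and metadata shape are assumptions for illustration, not the patent's actual configuration format:

```python
def target_component_id(config_info):
    """Use the animation component identifier carried in the material's
    configuration information as the target component identifier."""
    return config_info["component_id"]

# Hypothetical configuration information accompanying an uploaded material.
config_info = {"component_id": "22-01", "format": "png", "size": [64, 64]}
assert target_component_id(config_info) == "22-01"
```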
In an alternative embodiment, the apparatus further comprises:
A third obtaining module, configured to obtain component information of at least one animation component included in the first animation file;
And the display module is used for displaying the component information in the animation preview interface.
In an alternative embodiment, the display module is configured to: displaying the component identification of the at least one animation component in the animation preview interface;
detecting a first trigger operation aiming at the displayed component identifier, and identifying an animation component identifier aiming at the first trigger operation; and displaying the original material corresponding to the animation component identification aimed at by the first triggering operation in an animation preview interface.
In an alternative embodiment, the second acquisition module is configured to: detect a second triggering operation on the displayed component information, and identify the animation component identifier targeted by the second triggering operation;
and acquire the target material provided through the material providing entrance, and use the animation component identifier targeted by the second triggering operation as the target component identifier.
In an alternative embodiment, the apparatus further comprises: the generation module is used for generating a frame image of a preset animation frame in the first animation file;
The display module is used for displaying the frame image;
and the export module is used for exporting the frame image.
In an alternative embodiment, the generating module is configured to: displaying the frame identification of the animation frame in the first animation file, detecting a third triggering operation aiming at the displayed frame identification, identifying the frame identification aiming at the third triggering operation, and taking the animation frame corresponding to the frame identification aiming at the third triggering operation as a preset animation frame;
And/or displaying a frame identification input inlet, acquiring a frame identification input through the frame identification input inlet, and taking an animation frame corresponding to the input frame identification as a preset animation frame;
And/or monitoring preview pause operation of the first animation file, identifying an animation frame in the first animation file currently displayed in the animation preview interface, and taking the animation frame in the first animation file currently displayed in the animation preview interface as a preset animation frame.
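The three alternatives above can be sketched as a single resolver; the parameter names are assumptions for illustration:

```python
def resolve_preset_frame(clicked_frame_id=None, typed_frame_id=None, paused_frame_id=None):
    """Return the preset animation frame id from whichever channel supplied one:
    a click on a displayed frame identifier, a frame identifier entered through
    the input entrance, or the frame showing when the preview was paused."""
    for candidate in (clicked_frame_id, typed_frame_id, paused_frame_id):
        if candidate is not None:
            return candidate
    raise ValueError("no preset animation frame was selected")

assert resolve_preset_frame(typed_frame_id=12) == 12
```

Since the patent joins the three options with "and/or", a real tool might instead let the most recent user action win rather than using a fixed precedence; this sketch simply takes the first channel that supplied a value.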
In an alternative embodiment, the generating module is configured to: determining an animation part corresponding to the preset animation frame;
And rendering in a preset layer according to the original material of the corresponding animation part and the position and/or the gesture in the preset animation frame so as to generate a frame image of the preset animation frame.
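A toy sketch of this per-frame compositing, with a character grid standing in for a real render layer (the data shapes are assumptions for illustration):

```python
def render_frame(components, width, height, blank="."):
    """Composite each component's material glyph onto one preset layer at its
    per-frame position; components is a list of (glyph, x, y) tuples."""
    layer = [[blank] * width for _ in range(height)]
    for glyph, x, y in components:
        if 0 <= x < width and 0 <= y < height:
            layer[y][x] = glyph
    return ["".join(row) for row in layer]

rows = render_frame([("A", 1, 0), ("B", 2, 1)], width=4, height=2)
assert rows == [".A..", "..B."]
```

A real implementation would draw bitmap materials with the per-frame transform (position and pose) onto a canvas layer; the grid only illustrates the "one layer, one placement per component" structure.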
In an alternative embodiment, the second acquisition module is configured to: detecting a preset touch operation aiming at a displayed frame image, and identifying a touch position of the preset touch operation in the frame image;
Acquiring the position information of an animation part corresponding to the preset animation frame in the preset animation frame;
Comparing the position information of the animation part in the preset animation frame with the touch position, and determining the animation part with the position information matched with the touch position;
And acquiring target materials provided by the material providing entrance, and taking the component identification of the animation component with the position information matched with the touch position as the target component identification.
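The position-matching step can be sketched as a simple bounding-box hit test; representing each component's position information as an axis-aligned box is an assumption for illustration:

```python
def component_at(touch, boxes):
    """Return the identifier of the animation component whose bounding box
    contains the touch position, or None if no component matches.
    boxes maps component_id -> (x, y, width, height)."""
    tx, ty = touch
    for component_id, (x, y, w, h) in boxes.items():
        if x <= tx < x + w and y <= ty < y + h:
            return component_id
    return None

boxes = {"22-01": (0, 0, 50, 50), "22-02": (60, 0, 40, 40)}
assert component_at((10, 10), boxes) == "22-01"
```

Overlapping components would need a tie-break (e.g. topmost layer wins); this sketch returns the first match in iteration order.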
In an alternative embodiment, the display module is configured to: and displaying the original material of the animation component with the determined position information matched with the touch position.
In an alternative embodiment, the generating module is configured to: generating frame images of all animation frames in the first animation file;
And generating an animation file in a preset format according to the frame images of the animation frames.
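A minimal sketch of exporting every animation frame into an animation file of a preset format; the dict container and the render callback are stand-ins (a real tool might emit a GIF or an image sequence):

```python
def export_animation(frame_count, render_frame_image, fmt="gif"):
    """Generate a frame image for every animation frame in the first animation
    file, then bundle the images into an animation file of a preset format."""
    frames = [render_frame_image(i) for i in range(frame_count)]
    return {"format": fmt, "frame_count": frame_count, "frames": frames}

exported = export_animation(3, render_frame_image=lambda i: f"frame-{i}.png")
assert exported["frames"] == ["frame-0.png", "frame-1.png", "frame-2.png"]
```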
In an alternative embodiment, the target material is a plurality of target materials, and the plurality of target materials correspond to the same target component identifier;
The replacement module is used for: detecting a selection operation of target materials, determining the target materials corresponding to the selection operation, and replacing original materials corresponding to target component identifiers in the first animation file with the target materials corresponding to the selection operation to generate a second animation file in an SVGA format.
In an alternative embodiment, the target material is a plurality of target materials, and the plurality of target materials correspond to the same target component identifier;
The replacement module is used for: for each target material in the target materials, replacing original materials corresponding to the target component identifiers in the first animation file with the target material to generate a second animation file corresponding to the target material;
the display module is used for: and respectively playing the second animation files corresponding to the target materials in the animation preview interface.
According to the embodiment of the invention, a material providing inlet is displayed in the animation preview interface of the first animation file in the SVGA format, so that a user can provide target material through the material providing inlet and replace the original material of the corresponding component in the first animation file with the target material to generate a second animation file. This enables modification of the first animation file in the SVGA format and improves SVGA animation production efficiency. The second animation file is also previewed promptly after it is generated, so that the user can check the modified animation effect in time, further improving animation production efficiency and user experience.
Fig. 9 shows a schematic structural diagram of an electronic device according to an embodiment of the present invention. The specific embodiment of the present invention does not limit the specific implementation of the electronic device.
As shown in fig. 9, the electronic device may include: a processor 902, a communication interface (Communications Interface) 904, a memory 906, and a communication bus 908.
The processor 902, the communication interface 904, and the memory 906 communicate with each other via the communication bus 908. The communication interface 904 is used for communicating with network elements of other devices, such as clients or other electronic devices. The processor 902 is configured to execute the program 910, and may specifically perform the relevant steps in the embodiments of the animation processing method described above.
In particular, the program 910 may include program code including computer-operating instructions.
The processor 902 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The one or more processors included in the electronic device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
The memory 906 is used for storing the program 910. The memory 906 may comprise high-speed RAM memory and may also include non-volatile memory, such as at least one disk memory. The program 910 may be specifically configured to cause the processor 902 to perform the method in any of the method embodiments described above.
An embodiment of the present invention provides a non-volatile computer storage medium storing at least one executable instruction that can perform the animation processing method of any of the above-described method embodiments.
The embodiments of the present invention are based on the same inventive concept, and different embodiments may refer to each other; details are not repeated here.
The algorithms or displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the teachings herein. The required structure for a construction of such a system is apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It will be appreciated that the teachings of embodiments of the present invention described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided for disclosure of enablement and best mode of the embodiments of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the embodiments of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be construed as reflecting the intention that: i.e., an embodiment of the invention that is claimed, requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of embodiments of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
The various component embodiments of the present invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functionality of some or all of the components according to embodiments of the present invention may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). Embodiments of the present invention may also be implemented as a device or apparatus program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the embodiments of the present invention may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. Embodiments of the invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specifically stated.

Claims (16)

1. An animation processing method, comprising:
acquiring a first animation file in an SVGA format previewed in an animation preview interface; the animation preview interface comprises a material providing inlet;
Acquiring target materials provided by the material providing inlet, and identifying a target component identifier corresponding to the target materials;
Replacing original materials corresponding to the target component identification in the first animation file with the target materials to generate a second animation file in an SVGA format;
and playing the second animation file in the animation preview interface.
2. The method of claim 1, wherein the obtaining the target material provided through the material providing portal and identifying the target component identifier corresponding to the target material further comprises:
And acquiring the target material provided by the material providing inlet and the configuration information of the target material, and taking the animation component identifier contained in the configuration information as a target component identifier corresponding to the target material.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
acquiring component information of at least one animation component contained in the first animation file;
And displaying the component information in the animation preview interface.
4. The method of claim 3, wherein the presenting the component information in the animation preview interface further comprises: displaying the component identification of the at least one animation component in the animation preview interface;
The method further comprises the steps of: detecting a first trigger operation aiming at the displayed component identifier, and identifying an animation component identifier aiming at the first trigger operation; and displaying the original material corresponding to the animation component identification aimed at by the first triggering operation in an animation preview interface.
5. The method according to claim 3 or 4, wherein the acquiring the target material provided through the material providing portal and identifying the target component identifier corresponding to the target material further comprises:
Detecting a second triggering operation aiming at the displayed component information, and identifying an animation component identifier aiming at the second triggering operation;
and acquiring target materials provided by the material providing entrance, and taking the animation component identification aimed at by the second triggering operation as the target component identification.
6. The method according to any one of claims 1-5, further comprising:
and generating a frame image of a preset animation frame in the first animation file, displaying the frame image and/or deriving the frame image.
7. The method of claim 6, wherein the preset animation frame is determined by:
Displaying the frame identification of the animation frame in the first animation file, detecting a third triggering operation aiming at the displayed frame identification, identifying the frame identification aiming at the third triggering operation, and taking the animation frame corresponding to the frame identification aiming at the third triggering operation as a preset animation frame;
And/or displaying a frame identification input inlet, acquiring a frame identification input through the frame identification input inlet, and taking an animation frame corresponding to the input frame identification as a preset animation frame;
And/or monitoring preview pause operation of the first animation file, identifying an animation frame in the first animation file currently displayed in the animation preview interface, and taking the animation frame in the first animation file currently displayed in the animation preview interface as a preset animation frame.
8. The method according to claim 6 or 7, wherein generating the frame image of the preset animation frame in the first animation file further comprises:
Determining an animation component corresponding to the preset animation frame;
And rendering in a preset layer according to the original material of the corresponding animation component and its position and/or pose in the preset animation frame, so as to generate a frame image of the preset animation frame.
9. The method according to any one of claims 6-8, wherein the acquiring the target material provided through the material providing portal and identifying the target component identifier corresponding to the target material further comprises:
Detecting a preset touch operation aiming at a displayed frame image, and identifying a touch position of the preset touch operation in the frame image;
Acquiring the position information of an animation part corresponding to the preset animation frame in the preset animation frame;
Comparing the position information of the animation part in the preset animation frame with the touch position, and determining the animation part with the position information matched with the touch position;
And acquiring target materials provided by the material providing entrance, and taking the component identification of the animation component with the position information matched with the touch position as the target component identification.
10. The method of claim 9, wherein after determining an animation component whose position information matches the touch position, the method further comprises:
and displaying the original material of the animation component with the determined position information matched with the touch position.
11. The method according to any one of claims 1-10, further comprising:
Generating frame images of all animation frames in the first animation file;
And generating an animation file in a preset format according to the frame images of the animation frames.
12. The method of any one of claims 1-11, wherein the target material is a plurality of target materials, and wherein the plurality of target materials correspond to a same target component identifier;
The replacing the original material corresponding to the target component identifier in the first animation file with the target material to generate a second animation file in SVGA format further comprises:
Detecting a selection operation of target materials, determining the target materials corresponding to the selection operation, and replacing original materials corresponding to target component identifiers in the first animation file with the target materials corresponding to the selection operation to generate a second animation file in an SVGA format.
13. The method of any one of claims 1-11, wherein the target material is a plurality of target materials, and wherein the plurality of target materials correspond to a same target component identifier;
replacing original materials corresponding to the target component identifiers in the first animation file with the target materials to generate a second animation file in an SVGA format; playing the second animation file in the animation preview interface further comprises:
And replacing original materials corresponding to the target component identifiers in the first animation file with the target materials aiming at each target material in the target materials to generate second animation files corresponding to the target materials, and respectively playing the second animation files corresponding to the target materials in the animation preview interface.
14. An animation processing device, comprising:
the first acquisition module is used for acquiring a first animation file in a SVGA format previewed in the animation preview interface; the animation preview interface comprises a material providing inlet;
the second acquisition module is used for acquiring target materials provided by the material providing inlet and identifying target component identifiers corresponding to the target materials;
the replacing module is used for replacing original materials corresponding to the target component identifiers in the first animation file with the target materials so as to generate a second animation file in an SVGA format;
and the display module is used for playing the second animation file in the animation preview interface.
15. An electronic device, comprising: the device comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete communication with each other through the communication bus;
the memory is configured to store at least one executable instruction, where the executable instruction causes the processor to perform operations corresponding to the animation processing method according to any one of claims 1-13.
16. A computer storage medium having stored therein at least one executable instruction for causing a processor to perform operations corresponding to the animation processing method of any one of claims 1-13.
CN202211457347.XA 2022-11-17 2022-11-17 Animation processing method, device, electronic equipment and computer storage medium Pending CN118115633A (en)

Priority Applications (1)

CN202211457347.XA — priority date 2022-11-17, filing date 2022-11-17 — Animation processing method, device, electronic equipment and computer storage medium

Applications Claiming Priority (1)

CN202211457347.XA — priority date 2022-11-17, filing date 2022-11-17 — Animation processing method, device, electronic equipment and computer storage medium

Publications (1)

CN118115633A — publication date 2024-05-31

Family

ID=91216385

Family Applications (1)

CN202211457347.XA — Animation processing method, device, electronic equipment and computer storage medium

Country Status (1)

CN — CN118115633A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination