CN113538633A - Animation playing method and device, electronic equipment and computer readable storage medium

Animation playing method and device, electronic equipment and computer readable storage medium

Info

Publication number
CN113538633A
Authority
CN
China
Prior art keywords
image element
animation
target
display object
layout
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110836317.9A
Other languages
Chinese (zh)
Other versions
CN113538633B (en)
Inventor
郝华栋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202110836317.9A priority Critical patent/CN113538633B/en
Publication of CN113538633A publication Critical patent/CN113538633A/en
Application granted granted Critical
Publication of CN113538633B publication Critical patent/CN113538633B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/451 - Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The disclosure relates to an animation playing method and device, an electronic device and a computer-readable storage medium, and belongs to the technical field of multimedia. In the method, when an element replacement instruction is received, a first image element to be replaced in a target display object is determined. Layout information indicating the layout of a second image element to be generated in the target display object and service information indicating the service data to be included in the second image element are then obtained, and the second image element that replaces the first image element is generated based on the layout information and the service information. Rendering is then performed based on the second image element and an attribute description file of the target animation, so that the target animation can be played in the target display object. Because developers do not need to write code manually, human-computer interaction efficiency is improved and animation production efficiency is improved.

Description

Animation playing method and device, electronic equipment and computer readable storage medium
Technical Field
The present disclosure relates to the field of multimedia technologies, and in particular, to an animation playing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
Animations often involve actual service data; for example, an animation of opening a red envelope displays the amount of money in the red envelope. At present, an animation effect that involves actual service data can only be implemented by developers writing code manually, which makes human-computer interaction inefficient and therefore makes animation production inefficient.
Disclosure of Invention
The present disclosure provides an animation playing method, device, electronic device and computer-readable storage medium to improve human-computer interaction efficiency in an animation production process, thereby improving animation production efficiency. The technical scheme of the disclosure is as follows:
according to a first aspect of the embodiments of the present disclosure, there is provided an animation playing method, including:
in response to an element replacement instruction, determining a first image element to be displayed in a target display object, wherein the element replacement instruction is used for indicating replacement of the first image element;
obtaining layout information and service information, wherein the layout information is used for indicating the layout of a second image element to be generated in the target display object, and the service information is used for indicating service data included by the second image element;
generating a second image element based on the layout information and the service information, and replacing the first image element with the second image element;
and rendering based on the attribute description file of the target animation and the second image element so as to play the target animation in the target display object, wherein the attribute description file comprises attribute information corresponding to the animation effect included in the target animation.
According to the scheme provided by the disclosure, when an element replacement instruction is received, a first image element to be replaced in a target display object is determined. Layout information indicating the layout of a second image element to be generated in the target display object and service information indicating the service data to be included in the second image element are then obtained, and the second image element that replaces the first image element is generated based on the layout information and the service information. Rendering is then performed based on the second image element and the attribute description file of the target animation, so that the target animation can be played in the target display object. Developers are not required to write code manually, so human-computer interaction efficiency and animation production efficiency are improved.
In some embodiments, generating the second image element based on the layout information and the business information comprises:
determining a layout of a second image element to be generated in the target display object based on the layout information;
and adding the service data indicated by the service information into the layout of the second image element to be generated to obtain the second image element.
Determining the layout of the second image element to be generated in the target display object based on the layout information determines the part of the second image element that does not change with the service data; the service data indicated by the service information is then added to that layout to obtain the complete second image element. The second image element is thus generated automatically: when the service data changes, developers do not need to write code to generate a second image element for the new data, which improves human-computer interaction efficiency and the generation efficiency of the second image element, and can therefore improve animation production efficiency.
In some embodiments, rendering based on the property description file of the target animation and the second image element to play the target animation in the target display object includes:
and rendering the second image element based on the attribute description file of the target animation so as to play the target animation based on the second image element in the target display object.
After the first image element is replaced with the second image element, the second image element is rendered based on the attribute description file of the target animation, so that the target animation can be played in the target display object without manual operation by developers, which improves human-computer interaction efficiency and animation playing efficiency.
In some embodiments, the method further comprises:
and in the case that the first image element does not exist, rendering is carried out based on the attribute description file so as to play the target animation in the target display object.
When there is no first image element to be replaced, the corresponding image elements are rendered directly based on the attribute description file, so that the target animation is played in the target display object without manual operation by developers, which improves human-computer interaction efficiency, improves animation production efficiency, and can also improve animation playing efficiency.
In some embodiments, the element replacement instruction is triggered by a display instruction of the target display object.
Because the element replacement instruction is triggered by the display instruction of the target display object, the electronic device can respond to the element replacement instruction and automatically replace the first image element to be replaced, which enables rendering of a target animation that involves service data without manual operation by developers, improves human-computer interaction efficiency, and automates the animation production process.
In some embodiments, the target display object is a resource issuance prompt box, and the resource issuance prompt box is used for issuing virtual resources; or, the target display object is a function control.
By providing two possible target display objects and displaying the target animation in both of them through the scheme provided by the disclosure, the variety of positions at which the target animation can be displayed is increased.
According to a second aspect of the embodiments of the present disclosure, there is provided an animation playback device including:
a determination unit configured to perform determining a first image element to be displayed in the target display object in response to an element replacement instruction, the element replacement instruction being for instructing replacement of the first image element;
an obtaining unit configured to perform obtaining layout information and business information, the layout information indicating a layout of a second image element to be generated in the target display object, the business information indicating business data included in the second image element;
a generating unit configured to perform generating a second image element based on the layout information and the service information;
a replacement unit configured to perform replacement of the first image element with the second image element;
and the rendering unit is configured to perform rendering based on an attribute description file of the target animation and the second image element so as to play the target animation in the target display object, wherein the attribute description file comprises attribute information corresponding to an animation effect included in the target animation.
In some embodiments, the generating unit is configured to perform determining a layout of the second image element to be generated in the target display object based on the layout information; and adding the service data indicated by the service information into the layout of the second image element to be generated to obtain the second image element.
In some embodiments, the rendering unit is configured to perform rendering the second image element based on the property description file of the target animation to play the target animation based on the second image element in the target display object.
In some embodiments, the rendering unit is further configured to perform rendering based on the property description file in the absence of the first image element to play the target animation in the target display object.
In some embodiments, the element replacement instruction is triggered by a display instruction of the target display object.
In some embodiments, the target display object is a resource issuance prompt box, and the resource issuance prompt box is used for issuing virtual resources; or, the target display object is a function control.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the animation playing method according to any one of the first aspect and the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium, wherein instructions of the computer-readable storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the animation playing method according to any one of the first aspect and the first aspect.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product, including a computer program, which when executed by a processor, implements the animation playback method according to any one of the first aspect and the first aspect.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a flow diagram illustrating a method of playing an animation according to an exemplary embodiment.
FIG. 2 is a flow diagram illustrating a method of playing an animation according to an example embodiment.
Fig. 3 is a schematic diagram illustrating a second image element according to an example embodiment.
Fig. 4 is a flowchart illustrating an implementation of an animation playing method according to an exemplary embodiment.
FIG. 5 is a schematic diagram illustrating an image element to be displayed in accordance with an exemplary embodiment.
Fig. 6 is a block diagram illustrating an animation playback device according to an example embodiment.
Fig. 7 is a block diagram illustrating an electronic device 700 in accordance with an example embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
In addition, the data and information referred to in the present disclosure are data and information authorized by a user or sufficiently authorized by each party.
The scheme provided by the disclosure can be applied to animation scenes. In an animation scene, Lottie is a complete cross-platform animation solution: an animation designed by an animation designer in AE (Adobe After Effects, video editing software) can be exported to a JSON (JavaScript Object Notation) file, and the exported file can then be loaded directly on various operating systems, platforms or frameworks to play the animation. That is, after the animation designer designs a target animation in AE, the designer exports it as an attribute description file in JSON format through the Bodymovin plug-in (an AE animation export plug-in) and sends the exported attribute description file to a developer. The developer's electronic device receives the attribute description file and, with Lottie installed and running, can play the target animation corresponding to the attribute description file directly based on that file.
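As a concrete illustration of this workflow, the following minimal Kotlin sketch loads a Bodymovin-exported attribute description file with the lottie-android library and plays it. The file name target_animation.json, the layout activity_animation and the view id animation_view are hypothetical names used only for the example; this is a sketch of the general Lottie workflow, not the implementation claimed by the patent.

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import com.airbnb.lottie.LottieAnimationView

class AnimationActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_animation)

        // Load the attribute description file exported by the Bodymovin plug-in
        // (assumed to be bundled in the app's assets) and play the target animation.
        val animationView = findViewById<LottieAnimationView>(R.id.animation_view)
        animationView.setAnimation("target_animation.json")
        animationView.playAnimation()
    }
}
```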
Fig. 1 is a flow chart illustrating a method of playing an animation according to an exemplary embodiment, as shown in fig. 1, the method including the following steps.
In step S101, the electronic device determines a first image element to be displayed in the target display object in response to an element replacement instruction, the element replacement instruction being for instructing replacement of the first image element.
In step S102, the electronic device obtains layout information and service information, where the layout information is used to indicate a layout of a second image element to be generated in the target display object, and the service information is used to indicate service data included in the second image element.
In step S103, the electronic device generates a second image element based on the layout information and the service information, and replaces the first image element with the second image element.
In step S104, the electronic device renders based on the attribute description file of the target animation and the second image element to play the target animation in the target display object, where the attribute description file includes attribute information corresponding to an animation effect included in the target animation.
According to the scheme provided by the embodiment of the disclosure, when an element replacement instruction is received, a first image element to be replaced in a target display object is determined. Layout information indicating the layout of a second image element to be generated in the target display object and service information indicating the service data to be included in the second image element are obtained, and the second image element that replaces the first image element is generated based on the layout information and the service information. Rendering is then performed based on the second image element and the attribute description file of the target animation, so that the target animation that can be played in the target display object is obtained.
In some embodiments, generating the second image element based on the layout information and the business information comprises:
determining a layout of a second image element to be generated in the target display object based on the layout information;
and adding the service data indicated by the service information into the layout of the second image element to be generated to obtain the second image element.
In some embodiments, rendering based on the property description file of the target animation and the second image element to play the target animation in the target display object includes:
and rendering the second image element based on the attribute description file of the target animation so as to play the target animation based on the second image element in the target display object.
In some embodiments, the method further comprises:
and in the case that the first image element does not exist, rendering is carried out based on the attribute description file so as to play the target animation in the target display object.
In some embodiments, the element replacement instruction is triggered by a display instruction of the target display object.
In some embodiments, the target display object is a resource issuance prompt box, and the resource issuance prompt box is used for issuing virtual resources; or, the target display object is a function control.
The process shown in fig. 1 is only a basic flow of the present disclosure, and the scheme provided by the present disclosure is further described below based on a specific implementation process. Fig. 2 is a flow chart illustrating a method of playing an animation according to an exemplary embodiment, as shown in fig. 2, the method including the following steps.
In step S201, the electronic device determines a first image element to be displayed in the target display object in response to an element replacement instruction, the element replacement instruction being for instructing replacement of the first image element.
The element replacement instruction is triggered by a display instruction of the target display object. In some embodiments, the electronic device obtains an attribute description file of the target animation; after obtaining it, the electronic device can trigger a display instruction for the target display object based on the attribute description file through the Lottie software running on the electronic device, thereby triggering the element replacement instruction. The attribute description file is a JSON file and includes attribute information corresponding to the animation effects included in the target animation.
The electronic device acquires the attribute description file of the target animation as follows: after the animation designer exports the attribute description file through the Bodymovin plug-in of the AE software, the file is stored on the electronic device used by the animation designer, who then transmits it to the electronic device used by the developer, and the developer's electronic device thereby acquires the attribute description file.
After the attribute description file is obtained, the electronic device can trigger a playing instruction of the target animation through the Lottie software. The playing instruction instructs the electronic device to play the target animation based on the attribute description file, and it triggers a display instruction of the target display object, which in turn triggers the element replacement instruction.
Because the element replacement instruction is triggered by the display instruction of the target display object, the electronic device can respond to the element replacement instruction and automatically replace the first image element to be replaced, which enables rendering of a target animation that involves service data without manual operation by developers, improves human-computer interaction efficiency, and automates the animation production process.
In some embodiments, in response to the element replacement instruction, the electronic device invokes an element filtering function, and determines whether the image element to be displayed in the target display object is the first image element to be replaced by the element filtering function, so that when the image element to be displayed in the target display object is the first image element to be replaced, the subsequent steps S202 to S205 are performed.
The element screening function is Bitmap fetchBitmap(LottieImageAsset asset); it provides an element replacement entry, and the first image element to be replaced can be determined through this function based on the element identifier.
That is, in response to the element replacement instruction, the electronic device calls the element screening function, compares the element identifier of the image element currently to be displayed with the element identifier of the first image element to be replaced, and, if the two are consistent, determines that the image element currently to be displayed is the first image element to be replaced. This is repeated to identify every first image element to be replaced among the plurality of image elements corresponding to the target animation.
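A minimal Kotlin sketch of this screening step is given below, using lottie-android's ImageAssetDelegate, whose single method has the fetchBitmap(LottieImageAsset) signature named above. PLACEHOLDER_ID is a hypothetical element identifier agreed with the designer, and generateSecondImageElement() stands in for the generation logic of steps S202-S203; neither name comes from the patent.

```kotlin
import android.graphics.Bitmap
import com.airbnb.lottie.ImageAssetDelegate
import com.airbnb.lottie.LottieImageAsset

// Hypothetical identifier of the first image element (the placeholder) in the exported file.
const val PLACEHOLDER_ID = "image_0"

val elementScreeningDelegate = ImageAssetDelegate { asset: LottieImageAsset ->
    if (asset.id == PLACEHOLDER_ID) {
        // The element to be displayed is the first image element: replace it.
        generateSecondImageElement(asset)
    } else {
        // Not a placeholder: let Lottie resolve the image element itself.
        null
    }
}

// Stand-in for the second image element generation sketched under steps S202-S203.
fun generateSecondImageElement(asset: LottieImageAsset): Bitmap =
    Bitmap.createBitmap(asset.width, asset.height, Bitmap.Config.ARGB_8888)
```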
The first image element is a placeholder image, or the first image element is of another type, which is not limited in the embodiments of the present disclosure. The target display object is a resource issuance prompt box, and the resource issuance prompt box is used for issuing virtual resources; or, the target display object is a function control; alternatively, the target display object is another type of object, which is not limited in the embodiments of the present disclosure.
For example, the resource issuance prompt box is a red envelope pop-up window, and the function control is a function button; or the resource issuance prompt box and the function control are of other types, which is not limited in the embodiments of the present disclosure.
By providing two possible target display objects and displaying the target animation in both of them through the scheme provided by the disclosure, the variety of positions at which the target animation can be displayed is increased.
In step S202, the electronic device obtains layout information and service information, where the layout information is used to indicate a layout of a second image element to be generated in the target display object, and the service information is used to indicate service data included in the second image element.
The layout of the second image element in the target display object is a portion of the second image element that does not change with the change of the service data. The layout information includes display positions of respective areas in the target display object, display forms of the respective areas, and the like.
For example, referring to fig. 3, a schematic diagram illustrating a second image element according to an exemplary embodiment, in the interface shown in fig. 3 the positions of the areas 301, 302 and 303, the display forms of the three icons in the area 302 (the coin icon 3021, the like icon 3022 and the love icon 3023) and the characters displayed in the area 303 are all parts that do not change with the service data. The electronic device can therefore directly acquire the layout information and, based on it, determine the positions of the areas 301, 302 and 303 as well as the display forms of the three icons in the area 302 and the characters in the area 303.
In the second image element shown in fig. 3, the coupon amount, available condition, available range and expiration date displayed in the area 301, together with the coin count at the upper right corner of the coin icon 3021, the like count at the upper right corner of the like icon 3022 and the love count at the upper right corner of the love icon 3023 in the area 302, are service data, and this service data changes with the account logged in on the electronic device.
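The two inputs of step S202 could be modeled as below; this is a hypothetical Kotlin sketch, and the field names are illustrative rather than taken from the patent. LayoutInfo captures the parts of the second image element that do not change with the service data (the positions of areas 301-303 and the fixed icons and characters in fig. 3), while ServiceInfo carries the data that varies per logged-in account.

```kotlin
import android.graphics.Rect

// Fixed layout of the second image element within the target display object.
data class LayoutInfo(
    val couponArea: Rect,    // area 301: coupon amount, conditions, validity period
    val countersArea: Rect,  // area 302: coin / like / love icons with their counts
    val captionArea: Rect    // area 303: fixed characters
)

// Service data indicated by the service information.
data class ServiceInfo(
    val couponAmount: String,
    val availableCondition: String,
    val availableRange: String,
    val expirationDate: String,
    val coinCount: Int,
    val likeCount: Int,
    val loveCount: Int
)
```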
In step S203, the electronic device generates a second image element based on the layout information and the service information.
In some embodiments, the electronic device determines the layout of the second image element to be generated in the target display object based on the layout information, and adds the service data indicated by the service information to that layout to obtain the second image element. The second image element is a static image, or the second image element is of another type, which is not limited in the embodiments of the present disclosure.
Still taking the second image element shown in fig. 3 as an example, after the electronic device acquires the corresponding layout information and service data through step S202, it adds the acquired coupon amount, available condition, available range and validity period to the corresponding positions in the area 301, adds the acquired coin count to the upper right corner of the coin icon 3021, adds the acquired like count to the upper right corner of the like icon 3022, and adds the acquired love count to the upper right corner of the love icon 3023, thereby generating the second image element, as sketched below.
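A minimal Kotlin sketch of step S203 under the hypothetical LayoutInfo and ServiceInfo structures from the earlier sketch: the layout supplies the fixed background and positions, and the service data is drawn on top with a Canvas to produce the second image element as a static image. The drawing positions are illustrative only.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint

fun buildSecondImageElement(
    background: Bitmap,   // fixed part rendered from the layout information
    layout: LayoutInfo,
    service: ServiceInfo
): Bitmap {
    val result = background.copy(Bitmap.Config.ARGB_8888, /* isMutable = */ true)
    val canvas = Canvas(result)
    val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        color = Color.BLACK
        textSize = 28f
    }
    // Area 301: coupon amount (other coupon fields would be drawn the same way).
    canvas.drawText(service.couponAmount, layout.couponArea.left.toFloat(),
        layout.couponArea.top.toFloat(), paint)
    // Area 302: counts drawn near the upper-right corner of each icon.
    canvas.drawText(service.coinCount.toString(), layout.countersArea.left.toFloat(),
        layout.countersArea.top.toFloat(), paint)
    canvas.drawText(service.likeCount.toString(), layout.countersArea.centerX().toFloat(),
        layout.countersArea.top.toFloat(), paint)
    canvas.drawText(service.loveCount.toString(), layout.countersArea.right.toFloat(),
        layout.countersArea.top.toFloat(), paint)
    return result
}
```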
Determining the layout of the second image element to be generated in the target display object based on the layout information determines the part of the second image element that does not change with the service data; the service data indicated by the service information is then added to that layout to obtain the complete second image element. The second image element is thus generated automatically: when the service data changes, developers do not need to write code to generate a second image element for the new data, which improves human-computer interaction efficiency and the generation efficiency of the second image element, and can therefore improve animation production efficiency.
The above steps S202 and S203 are implemented by a first element obtaining function, which is used to generate a second image element related to the service data. In some embodiments, the electronic device calls the first element obtaining function, and through the first element obtaining function, obtaining of the layout information and the service data can be achieved, so as to obtain the second image element generated by fusing the service data.
In step S204, the electronic device replaces the first image element with the second image element.
In step S205, the electronic device renders the second image element based on the attribute description file of the target animation, so as to play the target animation based on the second image element in the target display object, where the attribute description file includes attribute information corresponding to an animation effect included in the target animation.
After the first image element is replaced with the second image element, the second image element is rendered based on the attribute description file of the target animation, so that the target animation can be played in the target display object without manual operation by developers, which improves human-computer interaction efficiency and animation playing efficiency.
In some embodiments, the electronic device calls an animation playing function, and the target animation is played based on the attribute description file through the animation playing function.
The animation playing function is LottieAnimationView.
The process of steps S201 to S205 is shown in fig. 4, a flowchart illustrating an implementation of an animation playing method according to an exemplary embodiment: after the animation designer designs the animation, the attribute description file is exported through the Bodymovin plug-in; the electronic device obtains the layout information and service data of the second image element to be displayed through the scheme provided by the disclosure, generates the second image element based on them, and replaces the first image element with the second image element, so that the target animation is played based on the second image element and the attribute description file. A sketch that wires these pieces together is given below.
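The following Kotlin sketch wires the earlier pieces together in the order of fig. 4, under the same assumptions as the previous sketches (the hypothetical target_animation.json file and the elementScreeningDelegate defined above).

```kotlin
import com.airbnb.lottie.LottieAnimationView

fun playTargetAnimation(animationView: LottieAnimationView) {
    animationView.setAnimation("target_animation.json")            // attribute description file
    animationView.setImageAssetDelegate(elementScreeningDelegate)  // replace placeholder image elements
    animationView.playAnimation()                                  // render and play in the target display object
}
```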
In other embodiments, in the case that the first image element does not exist, the electronic device performs rendering based on the attribute description file to play the target animation in the target display object.
That is, under the condition that there is no first image element to be replaced, the electronic device directly obtains the image element to be displayed, and then renders the image element to be displayed based on the attribute description file, thereby implementing the playing of the target animation, that is, the playing of the target animation can be implemented without replacing the image element.
The process of acquiring the image element to be displayed when no first image element needs to be replaced is implemented by a second element obtaining function, which is used to acquire the image element to be displayed. In some embodiments, the electronic device calls the second element obtaining function to obtain the image element to be displayed. The second element obtaining function is Bitmap getBitmap(LottieImageAsset asset).
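A minimal Kotlin sketch of such a second element obtaining function, under the assumption that the images exported with the animation are bundled in a hypothetical assets/lottie_images/ directory and keyed by the file name recorded in the attribute description file:

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import com.airbnb.lottie.LottieImageAsset

// Loads the original image element to be displayed when no replacement is needed.
fun getBitmap(context: Context, asset: LottieImageAsset): Bitmap? =
    context.assets.open("lottie_images/${asset.fileName}").use { stream ->
        BitmapFactory.decodeStream(stream)
    }
```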
When there is no first image element to be replaced, the corresponding image elements are rendered directly based on the attribute description file, so that the target animation is played in the target display object without manual operation by developers, which improves human-computer interaction efficiency, improves animation production efficiency, and can also improve animation playing efficiency.
According to the scheme provided by the embodiment of the disclosure, when an element replacement instruction is received, a first image element to be replaced in a target display object is determined. Layout information indicating the layout of a second image element to be generated in the target display object and service information indicating the service data to be included in the second image element are obtained, and the second image element that replaces the first image element is generated based on the layout information and the service information. Rendering is then performed based on the second image element and the attribute description file of the target animation, so that the target animation that can be played in the target display object is obtained.
In addition, the target animation produced with the scheme provided by the embodiment of the disclosure is substantially consistent with the animation designed by the animation designer in AE; that is, the scheme greatly improves the fidelity of the animation. After the animation designer finishes the design in the AE software, the attribute description file of the designed target animation can be exported automatically through the Bodymovin plug-in, and the target animation is then produced and played automatically through the scheme provided by the embodiment of the disclosure. The whole process involves no manual coding by developers, which achieves low-code animation production and improves development efficiency; compared with implementing the playing of the target animation by manually writing code, the scheme provided by the embodiment of the disclosure improves development efficiency by more than 95%. Moreover, because no manual coding is involved, developers do not need to modify code later in order to change the display effect of the target animation, which saves labor cost.
The effect of the present disclosure is further explained with a practical display example shown in fig. 5, a schematic diagram illustrating an image element to be displayed according to an exemplary embodiment. When the target animation is displayed based on the image element shown in fig. 5 using Lottie alone, the electronic device displays the red envelope opening effect based on that image element and, after the effect finishes, displays the same image element again. Combined with the scheme provided by the disclosure, the electronic device can obtain the layout information and service data shown in fig. 3 by itself, generate the second image element shown in fig. 3 based on the acquired layout information and service data, and display that second image element after the red envelope opening effect finishes, thereby fusing Lottie with the service data.
Generating the second image element by combining the layout information with the service data and using it to replace the first image element fuses the service data to be displayed into the Lottie software and extends Lottie's capability: animations that involve service data can be produced by combining Lottie with the scheme provided by the disclosure, the production of such animations is automated, and developers no longer need to write code by hand for each set of actual service data.
Fig. 6 is a block diagram illustrating an animation playback device according to an example embodiment. Referring to fig. 6, the apparatus includes:
a determining unit 601 configured to perform determining a first image element to be displayed in a target display object in response to an element replacement instruction, the element replacement instruction being for instructing replacement of the first image element;
an obtaining unit 602 configured to perform obtaining layout information and service information, the layout information being used for indicating a layout of a second image element to be generated in the target display object, the service information being used for indicating service data included in the second image element;
a generating unit 603 configured to perform generating a second image element based on the layout information and the service information;
a replacement unit 604 configured to perform replacement of the first image element with the second image element;
and a rendering unit 605 configured to perform rendering based on an attribute description file of the target animation and the second image element to play the target animation in the target display object, wherein the attribute description file includes attribute information corresponding to an animation effect included in the target animation.
According to the device provided by the embodiment of the disclosure, when an element replacement instruction is received, a first image element to be replaced in a target display object is determined. Layout information indicating the layout of a second image element to be generated in the target display object and service information indicating the service data to be included in the second image element are obtained, and the second image element that replaces the first image element is generated based on the layout information and the service information. Rendering is then performed based on the second image element and the attribute description file of the target animation, so that the target animation that can be played in the target display object is obtained.
In some embodiments, the generating unit 603 is configured to perform determining a layout of the second image element to be generated in the target display object based on the layout information; and adding the service data indicated by the service information into the layout of the second image element to be generated to obtain the second image element.
In some embodiments, the rendering unit 605 is configured to perform rendering the second image element based on the property description file of the target animation, so as to play the target animation based on the second image element in the target display object.
In some embodiments, the rendering unit 605 is further configured to perform rendering based on the property description file to play the target animation in the target display object in the absence of the first image element.
In some embodiments, the element replacement instruction is triggered by a display instruction of the target display object.
In some embodiments, the target display object is a resource issuance prompt box, and the resource issuance prompt box is used for issuing virtual resources; or, the target display object is a function control.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 7 is a block diagram illustrating an electronic device 700 according to an example embodiment. The electronic device 700 may be: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. The electronic device 700 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and so forth.
In general, the electronic device 700 includes: one or more processors 701 and one or more memories 702.
The processor 701 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 701 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 701 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 701 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 701 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 702 may include one or more computer-readable storage media, which may be non-transitory. Memory 702 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 702 is used to store at least one program code for execution by the processor 701 to implement the animation playback method provided by the method embodiments of the present disclosure.
In some embodiments, the electronic device 700 may further optionally include: a peripheral interface 703 and at least one peripheral. The processor 701, the memory 702, and the peripheral interface 703 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 703 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 704, display 705, camera 706, audio circuitry 707, positioning components 708, and power source 709.
The peripheral interface 703 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 701 and the memory 702. In some embodiments, processor 701, memory 702, and peripheral interface 703 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 701, the memory 702, and the peripheral interface 703 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 704 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 704 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 704 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 704 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 704 may communicate with other electronic devices via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 704 may also include NFC (Near Field Communication) related circuits, which are not limited by this disclosure.
The display screen 705 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 705 is a touch display screen, the display screen 705 also has the ability to capture touch signals on or over the surface of the display screen 705. The touch signal may be input to the processor 701 as a control signal for processing. At this point, the display 705 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 705 may be one, providing the front panel of the electronic device 700; in other embodiments, the number of the display screens 705 may be at least two, and the at least two display screens are respectively disposed on different surfaces of the electronic device 700 or are in a folding design; in still other embodiments, the display 705 may be a flexible display disposed on a curved surface or on a folded surface of the electronic device 700. Even more, the display 705 may be arranged in a non-rectangular irregular pattern, i.e. a shaped screen. The Display 705 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.
The camera assembly 706 is used to capture images or video. Optionally, camera assembly 706 includes a front camera and a rear camera. Generally, a front camera is disposed on a front panel of an electronic apparatus, and a rear camera is disposed on a rear surface of the electronic apparatus. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 706 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 707 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 701 for processing or inputting the electric signals to the radio frequency circuit 704 to realize voice communication. For stereo capture or noise reduction purposes, the microphones may be multiple and disposed at different locations of the electronic device 700. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 701 or the radio frequency circuit 704 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 707 may also include a headphone jack.
The positioning component 708 is used to locate the current geographic location of the electronic device 700 to implement navigation or LBS (Location Based Service). The positioning component 708 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 709 is used to supply power to various components in the electronic device 700. The power source 709 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When power source 709 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the electronic device 700 also includes one or more sensors 710. The one or more sensors 710 include, but are not limited to: acceleration sensor 711, gyro sensor 712, pressure sensor 713, fingerprint sensor 714, optical sensor 715, and proximity sensor 716.
The acceleration sensor 711 may detect the magnitude of acceleration in three coordinate axes of a coordinate system established with the electronic device 700. For example, the acceleration sensor 711 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 701 may control the display screen 705 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 711. The acceleration sensor 711 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 712 may detect a body direction and a rotation angle of the electronic device 700, and the gyro sensor 712 may cooperate with the acceleration sensor 711 to acquire a 3D motion of the user with respect to the electronic device 700. From the data collected by the gyro sensor 712, the processor 701 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 713 may be disposed on a side bezel of electronic device 700 and/or underlying display screen 705. When the pressure sensor 713 is disposed on a side frame of the electronic device 700, a user holding signal of the electronic device 700 may be detected, and the processor 701 may perform left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 713. When the pressure sensor 713 is disposed at a lower layer of the display screen 705, the processor 701 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 705. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 714 is used for collecting a fingerprint of a user, and the processor 701 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 714, or the fingerprint sensor 714 identifies the identity of the user according to the collected fingerprint. When the user identity is identified as a trusted identity, the processor 701 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 714 may be disposed on the front, back, or side of the electronic device 700. When a physical button or vendor Logo is provided on the electronic device 700, the fingerprint sensor 714 may be integrated with the physical button or vendor Logo.
The optical sensor 715 is used to collect the ambient light intensity. In one embodiment, the processor 701 may control the display brightness of the display screen 705 based on the ambient light intensity collected by the optical sensor 715. Specifically, when the ambient light intensity is high, the display brightness of the display screen 705 is increased; when the ambient light intensity is low, the display brightness of the display screen 705 is adjusted down. In another embodiment, processor 701 may also dynamically adjust the shooting parameters of camera assembly 706 based on the ambient light intensity collected by optical sensor 715.
A proximity sensor 716, also referred to as a distance sensor, is typically disposed on the front panel of the electronic device 700. The proximity sensor 716 is used to capture the distance between the user and the front of the electronic device 700. In one embodiment, the processor 701 controls the display screen 705 to switch from the bright screen state to the dark screen state when the proximity sensor 716 detects that the distance between the user and the front surface of the electronic device 700 is gradually decreased; when the proximity sensor 716 detects that the distance between the user and the front surface of the electronic device 700 is gradually increased, the processor 701 controls the display screen 705 to switch from the breath screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 7 does not constitute a limitation of the electronic device 700 and may include more or fewer components than those shown, or combine certain components, or employ a different arrangement of components.
In an exemplary embodiment, a computer-readable storage medium comprising instructions, such as the memory 702 comprising instructions, executable by the processor 701 of the electronic device 700 to perform the animation playback method described above is also provided. Alternatively, the computer-readable storage medium is a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, there is also provided a computer program product comprising computer program instructions, which are executed by a processor of an electronic device, to implement the animation playback method described above.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This disclosure is intended to cover any variations, uses, or adaptations that follow the general principles of the disclosure, including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An animation playing method, characterized in that the method comprises:
determining a first image element to be displayed in a target display object in response to an element replacement instruction, wherein the element replacement instruction is used for indicating replacement of the first image element;
obtaining layout information and service information, wherein the layout information is used for indicating the layout of a second image element to be generated in the target display object, and the service information is used for indicating service data to be included in the second image element;
generating a second image element based on the layout information and the service information, and replacing the first image element with the second image element;
and rendering based on an attribute description file of the target animation and the second image element so as to play the target animation in the target display object, wherein the attribute description file comprises attribute information corresponding to an animation effect included in the target animation.
2. The method of claim 1, wherein the generating a second image element based on the layout information and the service information comprises:
determining a layout of a second image element to be generated in the target display object based on the layout information;
and adding the service data indicated by the service information into the layout of the second image element to be generated to obtain the second image element.
3. The method of claim 1, wherein the rendering based on an attribute description file of the target animation and the second image element so as to play the target animation in the target display object comprises:
and rendering the second image element based on the attribute description file of the target animation so as to play the target animation based on the second image element in the target display object.
4. The method of claim 1, further comprising:
in the absence of the first image element, rendering based on the attribute description file so as to play the target animation in the target display object.
5. The method of claim 1, wherein the element replacement instruction is triggered by a display instruction of the target display object.
6. The method according to claim 1, wherein the target display object is a resource issuance prompt box, and the resource issuance prompt box is used for issuing virtual resources; or, the target display object is a function control.
7. An animation playing apparatus, comprising:
a determination unit configured to determine, in response to an element replacement instruction, a first image element to be displayed in a target display object, the element replacement instruction being used for indicating replacement of the first image element;
an obtaining unit configured to obtain layout information and service information, the layout information being used for indicating a layout of a second image element to be generated in the target display object, and the service information being used for indicating service data to be included in the second image element;
a generating unit configured to generate a second image element based on the layout information and the service information;
a replacement unit configured to replace the first image element with the second image element;
and a rendering unit configured to render based on an attribute description file of the target animation and the second image element so as to play the target animation in the target display object, wherein the attribute description file comprises attribute information corresponding to an animation effect included in the target animation.
8. The apparatus according to claim 7, wherein the generating unit is configured to determine, based on the layout information, a layout of the second image element to be generated in the target display object, and to add the service data indicated by the service information into the layout of the second image element to be generated to obtain the second image element.
9. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the animation playing method of any one of claims 1 to 6.
10. A computer-readable storage medium, wherein instructions in the computer-readable storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the animation playing method of any one of claims 1 to 6.
CN202110836317.9A 2021-07-23 2021-07-23 Animation playing method and device, electronic equipment and computer readable storage medium Active CN113538633B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110836317.9A CN113538633B (en) 2021-07-23 2021-07-23 Animation playing method and device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN113538633A 2021-10-22
CN113538633B CN113538633B (en) 2024-05-14

Family

ID=78120680

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110836317.9A Active CN113538633B (en) 2021-07-23 2021-07-23 Animation playing method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113538633B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003102720A2 (en) * 2002-05-31 2003-12-11 Isioux B.V. Method and system for producing an animation
CN112308947A (en) * 2019-07-25 2021-02-02 腾讯科技(深圳)有限公司 Animation generation method and device and storage medium
CN110708596A (en) * 2019-09-29 2020-01-17 北京达佳互联信息技术有限公司 Method and device for generating video, electronic equipment and readable storage medium
CN112052416A (en) * 2020-08-26 2020-12-08 腾讯科技(上海)有限公司 Method and device for displaying image elements
CN112135161A (en) * 2020-09-25 2020-12-25 广州华多网络科技有限公司 Dynamic effect display method and device of virtual gift, storage medium and electronic equipment

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115272536A (en) * 2022-09-26 2022-11-01 深圳乐娱游网络科技有限公司 Animation playing method and device and electronic equipment

Also Published As

Publication number Publication date
CN113538633B (en) 2024-05-14

Similar Documents

Publication Publication Date Title
CN108401124B (en) Video recording method and device
CN110971930A (en) Live virtual image broadcasting method, device, terminal and storage medium
CN109359262B (en) Animation playing method, device, terminal and storage medium
CN109327608B (en) Song sharing method, terminal, server and system
CN108897597B (en) Method and device for guiding configuration of live broadcast template
CN108965757B (en) Video recording method, device, terminal and storage medium
CN109144346B (en) Song sharing method and device and storage medium
CN109192218B (en) Method and apparatus for audio processing
CN108717365B (en) Method and device for executing function in application program
CN110321126B (en) Method and device for generating page code
CN111083526B (en) Video transition method and device, computer equipment and storage medium
CN113409427B (en) Animation playing method and device, electronic equipment and computer readable storage medium
CN108900925B (en) Method and device for setting live broadcast template
CN111880888B (en) Preview cover generation method and device, electronic equipment and storage medium
CN112965683A (en) Volume adjusting method and device, electronic equipment and medium
CN110288689B (en) Method and device for rendering electronic map
CN111752666A (en) Window display method and device and terminal
CN112257006A (en) Page information configuration method, device, equipment and computer readable storage medium
CN109660876B (en) Method and device for displaying list
CN111083554A (en) Method and device for displaying live gift
CN111437600A (en) Plot showing method, plot showing device, plot showing equipment and storage medium
CN110933454B (en) Method, device, equipment and storage medium for processing live broadcast budding gift
CN113538633B (en) Animation playing method and device, electronic equipment and computer readable storage medium
CN111443858A (en) Application interface display method and device, terminal and storage medium
CN112637624B (en) Live stream processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant