CN108234903B - Interactive special effect video processing method, medium and terminal equipment - Google Patents

Interactive special effect video processing method, medium and terminal equipment

Info

Publication number
CN108234903B
CN108234903B (application CN201810089957.6A)
Authority
CN
China
Prior art keywords
special effect
effect
video
special
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810089957.6A
Other languages
Chinese (zh)
Other versions
CN108234903A (en)
Inventor
袁少龙 (Yuan Shaolong)
周宇涛 (Zhou Yutao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bigo Technology Pte Ltd
Original Assignee
Guangzhou Baiguoyuan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Baiguoyuan Information Technology Co Ltd
Priority to CN201810089957.6A (CN108234903B)
Publication of CN108234903A
Priority to EP18903360.8A (EP3748954A4)
Priority to US16/965,454 (US11533442B2)
Priority to PCT/CN2018/123236 (WO2019149000A1)
Priority to RU2020128552A (RU2758910C1)
Application granted
Publication of CN108234903B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/80 2D [Two Dimensional] animation, e.g. using sprites
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4318 Generation of visual interfaces for content selection or interaction; Content or additional data rendering by altering the content in the rendering process, e.g. blanking, blurring or masking an image region

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Studio Devices (AREA)
  • Studio Circuits (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The invention provides an interactive special effect video processing method, medium and terminal device. The processing method comprises: acquiring a reference video containing a first special effect; acquiring a second special effect that interacts with the first special effect; and processing the images of the current video according to the second special effect to obtain a video containing the second special effect. The invention simplifies the steps by which a user produces multiple special effects, and the interaction between the second special effect and the first special effect makes video creation more entertaining for the user.

Description

Interactive special effect video processing method, medium and terminal equipment
Technical Field
The invention relates to information processing technology, and in particular to an interactive special effect video processing method, medium and terminal device.
Background
Existing video special effects are generally added by compositing in post-processing after the user has recorded a video, and for videos containing multiple special effects, each effect is typically produced one by one. This compositing workflow is very time-consuming, and when multiple effects must stand in a particular interactive relationship, it places high demands on the user's professional skill and precision. As a result, ordinary users find it difficult to produce videos with multiple special effects, and especially effect videos with interactive relationships: the production threshold is raised and the video entertainment options of ordinary users are limited.
Disclosure of Invention
The invention aims to remedy at least one of the technical defects above, and in particular to reduce the difficulty for users of producing interactive video special effects.
The invention provides a method for processing an interactive special effect video, which comprises the following steps:
acquiring a reference video containing a first special effect;
acquiring a second special effect interacted with the first special effect;
and processing the image of the current video according to the second special effect to obtain the video containing the second special effect.
Preferably, the reference video carries content information of the first special effect;
the obtaining a second effect that interacts with the first effect includes:
acquiring the first special effect from the content information;
and acquiring the second special effect interacted with the first special effect from the interactive special effect corresponding table.
Preferably, the reference video carries content information of the second effect interacting with the first effect;
the obtaining the second effect that interacts with the first effect includes:
and acquiring a second special effect interacted with the first special effect from the content information.
Preferably, the obtaining a second effect interacting with the first effect comprises:
identifying features of the first effect in the reference video;
and acquiring the second special effect interacted with the first special effect from an interactive special effect corresponding table according to the characteristics.
Preferably, the obtaining the second effect interacting with the first effect from the interactive effect corresponding table includes:
acquiring a special effect group interacted with the first special effect from an interactive special effect corresponding table of the terminal or the special effect server; wherein the special effect group comprises more than two second special effects, and each second special effect has a color attribute and a special effect score fed back by a user;
judging whether a second special effect selection instruction input by a user is received;
if the second special effect selection instruction is received, acquiring a second special effect corresponding to the second special effect selection instruction from the special effect group;
if the second special effect selection instruction is not received, judging whether special effect color adaptation is set; if so, calculating the color average value of the frames of the current video and acquiring a second special effect whose color attribute corresponds to that color average value; if not, acquiring the second special effect with the highest special effect score.
Preferably, the first effect and the second effect are effects with the same content, opposite content, or similar content.
Preferably, the first effect and the second effect are interactive effects on a time axis with a play start time as a reference start point.
Preferably, the interactive special effect on the time axis with the play-starting time as the reference starting point includes: on the time axis with the play start time as a reference starting point, the interactive time points of the first special effect and the second special effect are the same;
processing an image of a current video according to the second effect, comprising:
acquiring a time point of the first special effect in the reference video on the time axis with the play-starting time as a reference starting point;
and processing the image of the current video corresponding to the time point by using the second special effect.
Preferably, the interactive special effect on the time axis with the play-starting time as the reference starting point includes: on the time axis with the play start time as a reference starting point, the interactive time points of the first special effect and the second special effect are in a sequential arrangement relationship;
processing an image of a current video according to the second effect, comprising:
acquiring a time point of the first special effect in the reference video on a time axis with the play starting time as a reference starting point;
obtaining a time point of the second special effect in the current video according to the sequence relation;
and processing the image of the current video corresponding to the time point by using the second special effect.
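The timing relationship just described can be sketched as follows. This is a minimal illustration, not part of the claims: the helper names, the frames-as-strings model, and the fixed-offset reading of the "sequential arrangement relationship" are all assumptions.

```python
# Time axis whose reference start point is the play-start time.

def second_effect_time(first_effect_time, offset=0.0):
    # The second effect's time point follows the first effect's by a
    # fixed offset; offset=0 gives the "same time point" case of the
    # preceding claim.
    return first_effect_time + offset

def apply_at(frames, fps, effect, time_point):
    # Process the image of the current video nearest the given time point.
    idx = round(time_point * fps)
    return [f"{f}+{effect}" if i == idx else f for i, f in enumerate(frames)]

frames = ["f0", "f1", "f2", "f3"]
t = second_effect_time(first_effect_time=1.0, offset=1.0)
print(apply_at(frames, fps=1, effect="snow", time_point=t))
# ['f0', 'f1', 'f2+snow', 'f3']
```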
Preferably, after obtaining the video including the second special effect, the method further includes:
and combining the video containing the second special effect and the reference video containing the first special effect into one video.
The invention further provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the interactive special effect video processing method according to any one of the preceding claims.
The invention also provides a terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the steps of any one of the interactive special effect video processing methods above are implemented.
The invention has the following beneficial effects:
1. The method can obtain the second special effect from the first special effect of the reference video and thus automatically generate a video containing the second special effect. This simplifies the steps by which a user produces multiple special effects and reduces the difficulty of making multiple video effects; moreover, the various interactions between the second special effect and the first make effect production more entertaining for the user.
2. The reference video can carry content information of the first special effect and/or the second special effect, so each effect can be edited independently without editing the whole reference video. This reduces the memory footprint on the terminal and also makes replacing the first and/or second special effect simpler.
3. The special effect group corresponding to the first special effect can be determined through the interactive special effect correspondence table, and the second special effect can be determined by user selection, system settings, or effect score. This increases the ways the first and second special effects can be combined and enriches the interactions among effects; the interaction timing between the first and second special effects can further enhance the entertainment value of the effect video.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart of a first embodiment of the processing method of the present invention;
FIG. 2 is a schematic flow chart of a second embodiment of the processing method of the present invention;
fig. 3 is a schematic diagram of an embodiment of the terminal device according to the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative only and should not be construed as limiting the invention.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The invention provides a method for processing an interactive special effect video, which comprises the following steps:
step S10: acquiring a reference video containing a first special effect;
step S20: acquiring a second special effect interacted with the first special effect;
step S30: and processing the image of the current video according to the second special effect to obtain a video containing the second special effect.
Wherein each step is as follows:
step S10: acquiring a reference video containing a first special effect;
the reference video can be a video recorded immediately or a video prestored in the terminal; the first effect included in the reference video may be a visual effect or a sound effect; when the first special effect is a visual special effect, the first special effect can be displayed in a picture of the reference video or not displayed in the picture and only stored in a reference video related file; meanwhile, the first special effect and the reference video can be the same video stream file, and can also be different files from the reference video, and only when the first special effect and the reference video are played, the first special effect and the reference video are synthesized and output in the picture or sound of the same video according to the corresponding information or the matching information. The first effect may be an effect associated with limb movement in a video picture, such as: when the limb action of a ringing finger appears in the video picture, matching a first special effect of user scene switching for the limb action; the special effect of the user scene switching can be displayed in the video picture of the reference video, the first special effect can be hidden according to the user requirement, and the first special effect is displayed only when another action for triggering the display of the special effect occurs.
Step S20: acquiring a second special effect interacted with the first special effect;
the second special effect can be a visual special effect or a sound special effect; the second effect of the first effect interaction means that the first effect and the second effect have a special associated effect, for example: when the first special effect is a spark special effect appearing in a video picture, a second special effect that snowflakes appear in the video picture can be simultaneously or delayed; when the first special effect is that one transfer gate appears on the left side of a character in a video picture, a second special effect of another transfer gate can also appear on the right side of the character in the video picture simultaneously or in a delayed manner; when the first special effect is a visual special effect that explosion appears in a video picture, a second special effect that an explosion sound effect appears in the video can be output simultaneously or in a delayed mode. The second special effect can be obtained from the terminal or the special effect server.
Step S30: and processing the image of the current video according to the second special effect to obtain the video containing the second special effect.
Processing the image of the current video can take various forms, for example: outputting the first and second special effects simultaneously in the image or sound of the current video; displaying them in sequence at certain times in the picture of the reference video; displaying them in the reference video along an interactive motion trajectory; or presetting trigger logic for the two effects to facilitate later interactive output. When at least one of the first and second special effects is a visual effect, a video containing that visual effect can be formed; when both are sound effects, the image of the current video can be processed according to the sound of the second effect to form a video with an image effect.
According to this embodiment, the second special effect can be obtained from the first special effect of the reference video, so that a video containing the second special effect is generated automatically. This simplifies the process by which a user produces multiple special effects and lowers the difficulty of making multiple video effects; and because the second special effect interacts with the first, making effect videos becomes more entertaining for the user.
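Steps S10 to S30 of this embodiment can be sketched as follows. This is a minimal illustration only: the helper names, the dict layout, and the modeling of an "effect" as a string tag applied to frame placeholders are all assumptions, not the patent's implementation.

```python
def get_reference_video():
    # S10: a reference video is modeled as a dict carrying its frames and
    # the first special effect it contains (illustrative assumption).
    return {
        "frames": ["frame0", "frame1"],
        "first_effect": "sparks",
    }

# Hypothetical "interactive special effect corresponding table" mapping a
# first effect to the second effect that interacts with it.
INTERACTIVE_EFFECT_TABLE = {"sparks": "snowflakes"}

def get_second_effect(first_effect):
    # S20: look up the second special effect that interacts with the first.
    return INTERACTIVE_EFFECT_TABLE[first_effect]

def apply_effect(frames, effect):
    # S30: process each image of the current video with the second effect.
    return [f"{frame}+{effect}" for frame in frames]

reference = get_reference_video()
second = get_second_effect(reference["first_effect"])
print(apply_effect(["cur0", "cur1"], second))
# ['cur0+snowflakes', 'cur1+snowflakes']
```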
Based on the first embodiment, the present invention also proposes another embodiment: the reference video carries content information of the first special effect;
the obtaining a second effect that interacts with the first effect includes:
acquiring the first special effect from the content information;
and acquiring a second special effect of the first special effect interaction from the interactive special effect corresponding table.
In this embodiment, the content information may directly contain the display effect or sound effect of the first special effect, or it may contain a local or network address from which the first special effect can be acquired; it may additionally include the video author of the first special effect, the effect duration, the decoding mode of the reference video, and so on. Acquiring the first special effect from the content information therefore means either reading it directly from the content information or fetching it from the local or network address given there. Before the second special effect is obtained, the interactive special effect correspondence table can be preset so that the second special effect can be determined from the first. Because the first special effect is carried in the content information rather than composited directly into the reference video, it can be modified or edited independently: there is no need to edit the reference video itself to change the first special effect, which saves the memory space such editing would occupy and shortens the time needed to modify or replace the first special effect.
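Reading the first special effect out of the content information can be sketched as below. The dict keys and the fetch callback are illustrative assumptions; the patent only specifies that the content information may embed the effect directly or carry a local or network address for it.

```python
def resolve_first_effect(content_info, fetch=None):
    # The content information may embed the effect directly...
    if "effect" in content_info:
        return content_info["effect"]
    # ...or carry only a local/network address to fetch it from.
    if "address" in content_info:
        return fetch(content_info["address"])
    raise KeyError("content information carries no first special effect")

# Embedded case:
print(resolve_first_effect({"effect": "sparks"}))  # sparks

# Address case, with a stand-in fetch function backed by a dict:
store = {"local://fx/7": "sparks"}
print(resolve_first_effect({"address": "local://fx/7"}, store.get))  # sparks
```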
Based on the above embodiment, the present invention provides another embodiment: the reference video carries content information of the second effect interacting with the first effect;
the obtaining the second effect that interacts with the first effect includes:
and acquiring the second special effect interacted with the first special effect from the content information.
In this embodiment, the content information carries not only information about the first special effect but also information about the second. The content information may directly contain the display effect or sound effect of the second special effect, may contain a local or network address from which it can be acquired, or may contain the interactive special effect correspondence table in which the first and second special effects are paired; it may additionally include the video author of the second special effect, the effect duration, and so on. Acquiring the second special effect from the content information therefore means either reading it directly or fetching it from the address given there. Because the content information of the second special effect is carried in the reference video, the second special effect can be changed simply by modifying or replacing that content information, without editing the second special effect inside the reference video, which simplifies the modification operation.
Based on the first embodiment, the present invention proposes yet another embodiment: the obtaining a second effect that interacts with the first effect includes:
identifying features of the first effect in the reference video;
and acquiring the second special effect interacted with the first special effect from an interactive special effect corresponding table according to the characteristics.
When the first special effect and the reference video are the same video stream file, this embodiment can identify the features of the first special effect from the reference video and then obtain the second special effect from the interactive special effect correspondence table. The table can be pre-stored on an effect server or on the terminal to fix the correspondence between first and second special effects; when the second special effect corresponding to a first special effect needs to be changed, it can be modified directly in the table rather than in the reference video, which simplifies the modification step.
When the first special effect and the reference video are separate files, the reference video may also carry the content information, or carry information recording the correspondence between the first special effect and the reference video, so that the features of the first special effect can be identified. The features of the first special effect may be contained directly in the reference video or located at a corresponding local or network address. Once the features are identified, the second special effect that interacts with the first is obtained from the interactive special effect correspondence table according to those features. The second special effect may likewise be contained directly in the reference video or located at a local or network address. When it is contained directly in the reference video, it need not be displayed or output immediately: the effect picture is displayed, or the effect audio output, only after an instruction triggering it is received. Carrying the second special effect in the reference video saves the time needed to fetch it and avoids delayed or failed acquisition.
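The patent does not fix a recognition algorithm for identifying the first special effect's features, so the sketch below uses a deliberately simple stand-in: each known effect has a hypothetical "fingerprint" (a set of marker features), and a frame matches when it contains all of them.

```python
# Hypothetical fingerprints; real recognition would operate on pixels/audio.
EFFECT_FINGERPRINTS = {
    "sparks": {"bright_flash", "orange_particles"},
    "portal": {"ring_glow", "swirl"},
}

def identify_first_effect(frame_features):
    # Return the first known effect whose marker features are all present
    # in the frame's extracted feature set; None if nothing matches.
    for effect, fingerprint in EFFECT_FINGERPRINTS.items():
        if fingerprint <= frame_features:
            return effect
    return None

print(identify_first_effect({"tree", "bright_flash", "orange_particles"}))  # sparks
print(identify_first_effect({"tree"}))  # None
```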
Based on the above embodiment, the present invention further provides another embodiment:
the obtaining a reference video including a first effect and obtaining a second effect interacting with the first effect includes:
judging whether the reference video carries content information of the second special effect interacting with the first special effect;
if the content information of the second special effect is carried, acquiring the second special effect interacted with the first special effect from the content information;
if the reference video does not carry the content information of the second special effect, judging whether the reference video carries the content information of the first special effect;
if the content information carrying the first special effect is carried, acquiring the first special effect from the content information; acquiring the second special effect interacted with the first special effect from the interactive special effect corresponding table;
if the content information carrying the first special effect does not exist, identifying the characteristic of the first special effect in the reference video; and acquiring the second special effect interacted with the first special effect from the interactive special effect corresponding table according to the characteristics.
This embodiment preferentially acquires the second special effect directly from the content information, which is the fastest path. If the content information of the second special effect is absent, it acquires the first special effect from the content information of the first special effect and looks up the corresponding second special effect in the interactive special effect correspondence table. If the content information of the first special effect is also absent, it identifies the features of the first special effect from the reference video and then obtains the second special effect based on those features. With several ways of acquiring the second special effect available, the fastest is tried first to speed up acquisition, while the other ways serve as fallbacks that ensure the second special effect can still be obtained.
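The three-way fallback above, fastest source first, can be sketched as follows. The dict layout, the table, and the `identify` callback are illustrative assumptions.

```python
def obtain_second_effect(reference_video, table, identify):
    info = reference_video.get("content_info", {})
    # 1. Fastest: the second effect is carried directly in the content info.
    if "second_effect" in info:
        return info["second_effect"]
    # 2. Next: the first effect is carried; map it through the
    #    interactive special effect corresponding table.
    if "first_effect" in info:
        return table[info["first_effect"]]
    # 3. Fallback: identify the first effect's features in the video itself.
    return table[identify(reference_video)]

table = {"sparks": "snowflakes"}
identify = lambda video: "sparks"   # stand-in feature recognizer

print(obtain_second_effect({"content_info": {"second_effect": "snowflakes"}}, table, identify))
print(obtain_second_effect({"content_info": {"first_effect": "sparks"}}, table, identify))
print(obtain_second_effect({}, table, identify))
# all three print: snowflakes
```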
Based on the above embodiment, the present invention provides another embodiment: the obtaining, from the interactive effect correspondence table, the second effect that interacts with the first effect includes:
acquiring a special effect group interacted with the first special effect from an interactive special effect corresponding table of the terminal or the special effect server; wherein the special effect group comprises more than two second special effects, and each second special effect has a color attribute and a special effect score fed back by a user;
judging whether a second special effect selection instruction input by a user is received;
if the second special effect selection instruction is received, acquiring a second special effect corresponding to the second special effect selection instruction from the special effect group;
if the second special effect selection instruction is not received, judging whether special effect color adaptation is set; if so, calculating the color average value of the frames of the current video and acquiring a second special effect whose color attribute corresponds to that color average value; if not, acquiring the second special effect with the highest special effect score.
This embodiment provides a specific color-based implementation of the special effect. For example, the first special effect is a sound, and the second special effects that interact with it are fruit visual effects in different colors. When the first special effect triggers the second, the user may choose which of the differently colored fruit effects in the special effect group to apply; if the user makes no selection, the method selects one automatically, either according to the average color of the current video's frame pictures or according to the special effect scores. The color-based interaction in this embodiment lets the user determine the second special effect, which raises user participation and enriches the creative forms of the second special effect; it also provides a function for adding the second special effect automatically, which simplifies operation and improves the user experience.
The second special effects in the special effect group may be provided to users by a third-party server, and users may score each second special effect in the group, which further improves interactivity.
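The selection logic above (explicit user choice, then color adaptation over the mean frame color, then highest user score) can be sketched as follows. The function name, the frame representation (lists of RGB tuples), and the squared-distance "nearest color" metric are illustrative assumptions, not part of the patent text.

```python
# Hypothetical sketch of second-effect selection from a special effect group.

def select_second_effect(effect_group, user_choice=None,
                         color_adaptive=False, current_frames=None):
    # 1. A user selection instruction takes precedence over everything else.
    if user_choice is not None:
        return effect_group[user_choice]

    # 2. Color adaptation: match the effect's color attribute to the mean
    #    color of the current video's frame pictures.
    if color_adaptive and current_frames:
        n = sum(len(f) for f in current_frames)
        mean = tuple(sum(px[c] for f in current_frames for px in f) / n
                     for c in range(3))
        return min(effect_group,
                   key=lambda e: sum((a - b) ** 2
                                     for a, b in zip(e["color"], mean)))

    # 3. Otherwise fall back to the effect with the highest user-feedback score.
    return max(effect_group, key=lambda e: e["score"])
```

With a group of a red and a blue effect, predominantly red frames steer the color-adaptive branch toward the red effect, while disabling color adaptation falls back to whichever effect users scored highest.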
In the present invention, the first special effect and the second special effect may have the same, opposite, or similar content. Having the same content means the picture, sound, or motion of the two special effects is identical; for example, when the first special effect adds a blush to character A, the second special effect adds the same blush to character B. Opposite content may include a mirror image of a picture action or an opposite change of the picture; for example, the first special effect enlarges character A while the second special effect shrinks character B. Similar content means adding related special effects; for example, the first special effect adds a dizziness effect above character A's head while the second special effect adds a different dizziness effect above character B's head. Through different combinations of the first and second special effects, the invention can produce exaggerated, contrasting, and other composite effects, providing users with richer forms of entertainment.
The present invention also proposes a second embodiment, in which the first special effect and the second special effect may be interactive special effects on a time axis whose reference starting point is the play-starting time. In this embodiment, step S20 of the first embodiment may be:
step S21: acquiring a second special effect that interacts with the first special effect, wherein the first special effect and the second special effect interact on a time axis whose reference starting point is the play-starting time.
The interaction or trigger times of the first and second special effects take the time axis as their reference: the two special effects (displaying pictures or playing sounds) may be triggered simultaneously, one of them may be delayed, or the two may be staggered along the time axis.
Based on the foregoing second embodiment, the present invention further proposes the following embodiment: on the time axis whose reference starting point is the play-starting time, the interaction time points of the first special effect and the second special effect are the same;
processing the image of the current video according to the second special effect then comprises:
acquiring the time point of the first special effect in the reference video on the time axis whose reference starting point is the play-starting time;
and processing, with the second special effect, the image of the current video corresponding to that time point.
In this embodiment, the interaction time points of the first and second special effects are the same. For example, the first and second special effects are both flame effects and are displayed simultaneously at different positions of the video picture five seconds after the play-starting time; or the first special effect is a flame visual effect and the second is a flame sound effect, both appearing five seconds after the play-starting time to reinforce the burning effect.
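The simultaneous case can be sketched as follows: both effects are mapped to the same frame index computed from the shared time point relative to playback start. The function names and the frame-list representation are assumptions for illustration.

```python
# Hypothetical sketch of triggering both effects at the same time point
# (e.g., five seconds after playback starts, as in the example above).

def apply_simultaneous(frames, fps, time_point, first_effect, second_effect):
    """Apply both effects to the frame at `time_point` seconds from start."""
    index = int(round(time_point * fps))
    # Both effects land on the same frame, e.g., at different picture positions.
    frames[index] = second_effect(first_effect(frames[index]))
    return frames
```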
Based on the foregoing second embodiment, the present invention further proposes the following embodiment: on the time axis whose reference starting point is the play-starting time, the interaction time points of the first special effect and the second special effect are arranged in sequence;
processing the image of the current video according to the second special effect then comprises:
acquiring the time point of the first special effect in the reference video on the time axis whose reference starting point is the play-starting time;
obtaining the time point of the second special effect in the current video according to the sequential relationship;
and processing, with the second special effect, the image of the current video corresponding to that time point.
In this embodiment, the interaction time points of the first and second special effects are staggered. For example, the first special effect is rain and the second is an opening umbrella; the occurrence time of the second special effect can be delayed relative to the acquired time point of the first special effect in the reference video, which expresses the logical relationship between the two effects, or achieves more varied entertainment effects through the time difference between them.
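The staggered case can be sketched as follows: the second effect's time point is derived from the first effect's time point in the reference video plus an offset. The 2-second delay, the function names, and the frame-list representation are assumed values for illustration only.

```python
# Hypothetical sketch of the sequential (staggered) arrangement on a time
# axis whose origin is the play-starting time.

def second_effect_time(first_effect_time, delay=2.0):
    """Time point of the second effect: the first effect's time point in the
    reference video, shifted by the sequential relationship's delay."""
    return first_effect_time + delay

def apply_effect_at(frames, fps, time_point, effect):
    """Apply `effect` to the current video's frame at `time_point` seconds."""
    index = int(round(time_point * fps))
    frames[index] = effect(frames[index])
    return frames
```

With the rain/umbrella example, a first effect at 5.0 s and a 2-second delay places the umbrella effect at 7.0 s.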
Based on the above embodiments, after the video containing the second special effect is obtained, the method may further include:
and combining the video containing the second special effect and the reference video containing the first special effect into one video.
The video containing the second special effect may or may not itself contain the first special effect. To enhance the entertainment effect, the video containing the second special effect and the reference video containing the first special effect can be recombined into one video, forming two contrasting video pictures within it. For example, the reference video shows food to a user; the first special effect is an exaggerated expression of delight from a user who sees the food, and the second special effect is an exaggerated expression of cold rejection from a user who sees it. The first video, in which the user delights in the food, and the second video, in which the user coldly rejects it, can be combined into one video to achieve a contrast effect.
There are various ways of combining the videos. For example, the first and second videos can be placed left and right or top and bottom within the combined video, or one video can be played first with the other played after it finishes; the playing times of the first and second videos can also be staggered by a preset time and/or offset by a preset distance. These multiple forms of combination give users more diverse composite effects and richer modes of entertainment.
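Two of the combination layouts above (side-by-side placement and sequential playback) can be sketched as follows, modeling each video as a list of NumPy frame arrays. The function names are illustrative assumptions; a real implementation would also handle audio tracks and differing resolutions.

```python
# Hypothetical sketch of two ways to combine the video containing the second
# special effect with the reference video containing the first special effect.
import numpy as np

def compose_side_by_side(video_a, video_b):
    """Place the two videos left and right within each combined frame."""
    return [np.hstack([fa, fb]) for fa, fb in zip(video_a, video_b)]

def compose_sequential(video_a, video_b):
    """Play one video first, then the other after it finishes."""
    return list(video_a) + list(video_b)
```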
The invention further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the steps of any of the interactive special effect video processing methods described above.
The invention also provides a terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the steps of any of the interactive special effect video processing methods described above are implemented.
Fig. 3 is a block diagram of part of the terminal device according to the present invention; for convenience of description, only the parts related to the embodiment of the present invention are shown. The terminal device may be a mobile phone, tablet computer, notebook computer, desktop computer, or other terminal device capable of playing video programs. The operation of the terminal device according to the present invention is described below taking a mobile phone as an example.
Referring to fig. 3, the mobile phone includes a processor, a memory, an input unit, a display unit, and the like. Those skilled in the art will appreciate that the configuration shown in fig. 3 does not limit all handsets; a handset may include more or fewer components than those shown, or combine certain components. The memory stores the computer program and the functional modules, and the processor executes the various functional applications and data processing of the mobile phone by running the computer program stored in the memory. The memory may mainly include a program storage area and a data storage area: the program storage area may store the operating system and the application programs required by at least one function (such as video playback), while the data storage area may store data created through use of the phone (such as video data). Further, the memory may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit may be used to receive search keywords input by the user and to generate signal inputs related to user settings and function control of the phone. Specifically, the input unit may include a touch panel and other input devices. The touch panel can collect the user's touch operations on or near it (for example, operations performed with a finger, stylus, or any other suitable object or accessory) and drive the corresponding connected device according to a preset program; other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (such as playback control keys and switch keys), a trackball, a mouse, and a joystick. The display unit may be used to display information input by or provided to the user, as well as the various menus of the phone, and may take the form of a liquid crystal display, organic light-emitting diodes, or the like. The processor is the control center of the phone: it connects the parts of the whole device through various interfaces and lines, and performs the phone's functions and processes its data by running or executing the software programs and/or modules stored in the memory and calling the data stored in the memory.
In the embodiment of the present invention, the processor included in the terminal device further has the following functions:
acquiring a reference video containing a first special effect;
acquiring a second special effect interacted with the first special effect;
and processing the image of the current video according to the second special effect to obtain the video containing the second special effect.
In addition, the modules in the embodiments of the present invention may be integrated into one processing module, may each exist separately and physically, or may be combined two or more into one module. An integrated module may be implemented in hardware or as a software functional module. If implemented as a software functional module and sold or used as an independent product, the integrated module may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.
The foregoing describes only some embodiments of the present invention. It should be noted that those skilled in the art can make various improvements and refinements without departing from the principle of the present invention, and such improvements and refinements shall also fall within the protection scope of the present invention.

Claims (11)

1. A method for processing an interactive special effect video, comprising:
acquiring a reference video containing a first special effect; wherein the reference video carries content information of the first special effect;
acquiring a second special effect that interacts with the first special effect, comprising: acquiring the first special effect from the content information; and acquiring, from an interactive special effect correspondence table, the second special effect that interacts with the first special effect;
and processing the image of the current video according to the second special effect to obtain the video containing the second special effect.
2. The processing method according to claim 1, characterized in that: the reference video carries content information of the second effect interacting with the first effect;
the obtaining the second effect that interacts with the first effect includes:
and acquiring a second special effect interacted with the first special effect from the content information.
3. The processing method according to claim 1, characterized in that: the obtaining a second effect that interacts with the first effect includes:
identifying features of the first effect in the reference video;
and acquiring, according to the features, the second special effect that interacts with the first special effect from an interactive special effect correspondence table.
4. The processing method according to claim 1 or 3, characterized in that obtaining, from the interactive special effect correspondence table, the second special effect that interacts with the first special effect comprises:
acquiring, from an interactive special effect correspondence table on the terminal or on the special effect server, a special effect group that interacts with the first special effect; wherein the special effect group comprises two or more second special effects, each having a color attribute and a special effect score fed back by users;
judging whether a second special effect selection instruction input by the user has been received;
if the second special effect selection instruction has been received, acquiring the second special effect corresponding to the instruction from the special effect group;
if the second special effect selection instruction has not been received, judging whether special effect color adaptation is enabled; if it is, calculating the average color of the frame pictures of the current video and acquiring the second special effect whose color attribute corresponds to that average; if it is not, acquiring the second special effect with the highest special effect score.
5. The processing method according to claim 1, characterized in that: the first effect and the second effect are effects with the same content, opposite content or similar content.
6. The processing method according to claim 1, characterized in that: the first effect and the second effect are interactive effects on a time axis with the play start time as a reference starting point.
7. The processing method according to claim 6, characterized in that: the interactive special effect on the time axis with the play-starting time as the reference starting point comprises the following steps: on the time axis with the play start time as a reference starting point, the interactive time points of the first special effect and the second special effect are the same;
processing an image of a current video according to the second effect, comprising:
acquiring a time point of the first special effect in the reference video on the time axis with the play-starting time as a reference starting point;
and processing the image of the current video corresponding to the time point by using the second special effect.
8. The processing method according to claim 6, characterized in that: the interactive special effect on the time axis with the play-starting time as the reference starting point comprises the following steps: on the time axis with the play start time as a reference starting point, the interactive time points of the first special effect and the second special effect are in a sequential arrangement relationship;
processing an image of a current video according to the second effect, comprising:
acquiring a time point of the first special effect in the reference video on a time axis with the play starting time as a reference starting point;
obtaining a time point of the second special effect in the current video according to the sequence relation;
and processing the image of the current video corresponding to the time point by using the second special effect.
9. The processing method according to claim 1, characterized in that: after the obtaining of the video including the second special effect, the method further includes:
and combining the video containing the second special effect and the reference video containing the first special effect into one video.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the method for processing an interactive special effects video according to any one of claims 1 to 9.
11. A terminal device comprising a memory, a processor and a computer program stored on the memory and executable on the processor; characterized in that the processor implements the steps of the method for processing interactive special effects video according to any one of claims 1 to 9 when executing the computer program.
CN201810089957.6A 2018-01-30 2018-01-30 Interactive special effect video processing method, medium and terminal equipment Active CN108234903B (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN201810089957.6A CN108234903B (en) 2018-01-30 2018-01-30 Interactive special effect video processing method, medium and terminal equipment
EP18903360.8A EP3748954A4 (en) 2018-01-30 2018-12-24 Processing method for achieving interactive special effects for video, medium, and terminal apparatus
US16/965,454 US11533442B2 (en) 2018-01-30 2018-12-24 Method for processing video with special effects, storage medium, and terminal device thereof
PCT/CN2018/123236 WO2019149000A1 (en) 2018-01-30 2018-12-24 Processing method for achieving interactive special effects for video, medium, and terminal apparatus
RU2020128552A RU2758910C1 (en) 2018-01-30 2018-12-24 Method for processing interconnected special effects for video, data storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810089957.6A CN108234903B (en) 2018-01-30 2018-01-30 Interactive special effect video processing method, medium and terminal equipment

Publications (2)

Publication Number Publication Date
CN108234903A CN108234903A (en) 2018-06-29
CN108234903B true CN108234903B (en) 2020-05-19

Family

ID=62669780

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810089957.6A Active CN108234903B (en) 2018-01-30 2018-01-30 Interactive special effect video processing method, medium and terminal equipment

Country Status (5)

Country Link
US (1) US11533442B2 (en)
EP (1) EP3748954A4 (en)
CN (1) CN108234903B (en)
RU (1) RU2758910C1 (en)
WO (1) WO2019149000A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108234903B (en) * 2018-01-30 2020-05-19 广州市百果园信息技术有限公司 Interactive special effect video processing method, medium and terminal equipment
CN109104586B (en) * 2018-10-08 2021-05-07 北京小鱼在家科技有限公司 Special effect adding method and device, video call equipment and storage medium
CN109529329B (en) * 2018-11-21 2022-04-12 北京像素软件科技股份有限公司 Game special effect processing method and device
CN109710255B (en) * 2018-12-24 2022-07-12 网易(杭州)网络有限公司 Special effect processing method, special effect processing device, electronic device and storage medium
WO2022062788A1 (en) * 2020-09-28 2022-03-31 北京达佳互联信息技术有限公司 Interactive special effect display method and terminal
CN112906553B (en) * 2021-02-09 2022-05-17 北京字跳网络技术有限公司 Image processing method, apparatus, device and medium
CN114885201B (en) * 2022-05-06 2024-04-02 林间 Video comparison viewing method, device, equipment and storage medium
CN115941841A (en) * 2022-12-06 2023-04-07 北京字跳网络技术有限公司 Associated information display method, device, equipment, storage medium and program product

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102779028A (en) * 2011-05-09 2012-11-14 腾讯科技(深圳)有限公司 Implementation method and device for special effect synthesizing engine of client side
CN105959725A (en) * 2016-05-30 2016-09-21 徐文波 Method and device for loading media special effects in video
CN106385591A (en) * 2016-10-17 2017-02-08 腾讯科技(上海)有限公司 Video processing method and video processing device
CN107168606A (en) * 2017-05-12 2017-09-15 武汉斗鱼网络科技有限公司 Dialog control display methods, device and user terminal

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030052909A1 (en) * 2001-06-25 2003-03-20 Arcsoft, Inc. Real-time rendering of edited video stream
US7102643B2 (en) * 2001-11-09 2006-09-05 Vibe Solutions Group, Inc. Method and apparatus for controlling the visual presentation of data
JP4066162B2 (en) * 2002-09-27 2008-03-26 富士フイルム株式会社 Image editing apparatus, image editing program, and image editing method
JP4820136B2 (en) 2005-09-22 2011-11-24 パナソニック株式会社 Video / audio recording apparatus and video / audio recording method
US20150339010A1 (en) * 2012-07-23 2015-11-26 Sudheer Kumar Pamuru System and method for producing videos with overlays
KR101580237B1 (en) 2013-05-15 2015-12-28 씨제이포디플렉스 주식회사 Method and System for Providing 4D Content Production Service, Content Production Apparatus Therefor
CN103389855B (en) * 2013-07-11 2017-02-08 广东欧珀移动通信有限公司 Mobile terminal interacting method and device
US20160173960A1 (en) 2014-01-31 2016-06-16 EyeGroove, Inc. Methods and systems for generating audiovisual media items
CN103905885B (en) * 2014-03-25 2018-08-31 广州华多网络科技有限公司 Net cast method and device
WO2016030879A1 (en) 2014-08-26 2016-03-03 Mobli Technologies 2010 Ltd. Distribution of visual content editing function
CN104394331A (en) * 2014-12-05 2015-03-04 厦门美图之家科技有限公司 Video processing method for adding matching sound effect in video picture
US9667885B2 (en) 2014-12-12 2017-05-30 Futurewei Technologies, Inc. Systems and methods to achieve interactive special effects
CN104618797B (en) * 2015-02-06 2018-02-13 腾讯科技(北京)有限公司 Information processing method, device and client
CN104703043A (en) * 2015-03-26 2015-06-10 努比亚技术有限公司 Video special effect adding method and device
CN104780458A (en) * 2015-04-16 2015-07-15 美国掌赢信息科技有限公司 Method and electronic equipment for loading effects in instant video
CN104954848A (en) * 2015-05-12 2015-09-30 乐视致新电子科技(天津)有限公司 Intelligent terminal display graphic user interface control method and device
CN105491441B (en) * 2015-11-26 2019-06-25 广州华多网络科技有限公司 A kind of special efficacy management control method and device
CN107195310A (en) * 2017-03-05 2017-09-22 杭州趣维科技有限公司 A kind of method for processing video frequency of sound driver particle effect
CN108234903B (en) 2018-01-30 2020-05-19 广州市百果园信息技术有限公司 Interactive special effect video processing method, medium and terminal equipment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102779028A (en) * 2011-05-09 2012-11-14 腾讯科技(深圳)有限公司 Implementation method and device for special effect synthesizing engine of client side
CN105959725A (en) * 2016-05-30 2016-09-21 徐文波 Method and device for loading media special effects in video
CN106385591A (en) * 2016-10-17 2017-02-08 腾讯科技(上海)有限公司 Video processing method and video processing device
CN107168606A (en) * 2017-05-12 2017-09-15 武汉斗鱼网络科技有限公司 Dialog control display methods, device and user terminal

Also Published As

Publication number Publication date
WO2019149000A1 (en) 2019-08-08
US20210058564A1 (en) 2021-02-25
US11533442B2 (en) 2022-12-20
EP3748954A1 (en) 2020-12-09
EP3748954A4 (en) 2021-03-24
RU2758910C1 (en) 2021-11-03
CN108234903A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108234903B (en) Interactive special effect video processing method, medium and terminal equipment
US9743145B2 (en) Second screen dilemma function
US9576334B2 (en) Second screen recipes function
US9583147B2 (en) Second screen shopping function
US10448081B2 (en) Multimedia information processing method, terminal, and computer storage medium for interactive user screen
US9641898B2 (en) Methods and systems for in-video library
US9967621B2 (en) Dynamic personalized program content
WO2015074469A1 (en) Method for realizing picture-in-picture playing and picture-in-picture playing device
CN109068081A (en) Video generation method, device, electronic equipment and storage medium
US9578370B2 (en) Second screen locations function
CN103686200A (en) Intelligent television video resource searching method and system
WO2017181652A1 (en) Method and system for generating dynamic program list
CN114697721A (en) Bullet screen display method and electronic equipment
CN116668759A (en) Playing method for mobile terminal application and related equipment
CN109348241B (en) Video playing method and device in multi-user video live broadcasting room and computer equipment
US20170155943A1 (en) Method and electronic device for customizing and playing personalized programme
CN115237314B (en) Information recommendation method and device and electronic equipment
CN111079051B (en) Method and device for playing display content
CN114666648B (en) Video playing method and electronic equipment
CN108024147A (en) Scene playback method, smart television and computer-readable recording medium
JP6941723B1 (en) Image display device and program
WO2022183866A1 (en) Method and apparatus for generating interactive video
US20240163518A1 (en) Interaction method and electronic device
WO2023165364A1 (en) Virtual reality-based video playback method and apparatus, and electronic device
CN115866352A (en) Interaction method, interaction device, electronic equipment, readable storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220523

Address after: 31a, 15th floor, building 30, maple commercial city, bangrang Road, Brazil

Patentee after: Baiguoyuan Technology (Singapore) Co.,Ltd.

Address before: Building B-1, North District, Wanda Commercial Plaza, Wanbo business district, No. 79, Wanbo 2nd Road, Nancun Town, Panyu District, Guangzhou City, Guangdong Province

Patentee before: GUANGZHOU BAIGUOYUAN INFORMATION TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right