CN111031395A - Video playing method, device, terminal and storage medium - Google Patents

Video playing method, device, terminal and storage medium

Info

Publication number
CN111031395A
Authority
CN
China
Prior art keywords
video
target
interactive
event
time
Prior art date
Legal status
Pending
Application number
CN201911320942.7A
Other languages
Chinese (zh)
Inventor
刘晓丹
杨子斌
Current Assignee
Beijing QIYI Century Science and Technology Co Ltd
Original Assignee
Beijing QIYI Century Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing QIYI Century Science and Technology Co Ltd filed Critical Beijing QIYI Century Science and Technology Co Ltd
Priority to CN201911320942.7A
Publication of CN111031395A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44222Analytics of user selections, e.g. selection of programs or purchase activity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Television Signal Processing For Recording (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Embodiments of the invention provide a video playing method, apparatus, terminal and storage medium, applied to interactive videos. An interactive video comprises a plurality of video segments, and each video segment comprises at least one interactive event. The method includes: obtaining a target interactive event corresponding to the currently played video segment, where the target interactive event corresponds to a plurality of options and each option is associated with one video segment to be played; loading the video segments to be played; receiving the user's response operation to the target interactive event; determining, from the loaded video segments to be played, the target video segment corresponding to the response operation; and playing the target video segment. Because the video segments to be played are loaded in advance, before the user's response operation is received, they play smoothly; the user decides how the plot develops, which enriches the video content; and while the user watches the video, the user's initiative is engaged, the viewing experience is improved, and the viewing value of the video is increased.

Description

Video playing method, device, terminal and storage medium
Technical Field
The present invention relates to the field of video processing technologies, and in particular, to a video playing method, an apparatus, a terminal, and a storage medium.
Background
With the continuous development of computer technology, electronic devices have come into ever wider use, and users often watch videos on them.
In the traditional video playing process, a user can only passively receive video content and cannot decide how the plot develops or choose what is played according to his or her own preferences. Interactive video is a new kind of video that integrates an interactive experience into linearly played video so as to enhance the interactivity between the user and the video. In the prior art, however, interactive videos often play unsmoothly and cannot be produced at scale, which hinders their application and development and leaves users with a poor viewing experience.
Disclosure of Invention
In view of the above problems, the present invention provides a video playing method, device, terminal and storage medium, so as to alleviate, at least in part, the problems that interactive videos play unsmoothly and cannot be produced at scale.
According to a first aspect of the present invention, there is provided a video playing method applied to an interactive video, the interactive video including a plurality of video segments, the video segments including at least one interactive event, the method including:
acquiring a target interaction event corresponding to a currently played video clip, wherein the target interaction event corresponds to a plurality of options, and each option is associated with a video clip to be played;
loading the video clip to be played;
receiving response operation of a user to the target interaction event;
determining a target video clip corresponding to the response operation from the loaded video clips to be played;
and playing the target video clip.
Optionally, before obtaining the target interaction event corresponding to the currently played video segment, the method further includes:
acquiring a description file of the interactive video, wherein the description file comprises interactive event information corresponding to interactive events of video clips of the interactive video;
the obtaining of the target interaction event corresponding to the currently played video clip includes:
and acquiring a target interactive event corresponding to the currently played video clip from the interactive events corresponding to the interactive event information.
Optionally, the step of obtaining a target interactive event corresponding to the currently played video segment from the interactive events corresponding to the interactive event information includes:
acquiring a current video identifier corresponding to the currently played video clip;
searching the interactive event information for candidate interactive event information corresponding to the current video identifier;
acquiring the starting display time of the candidate interactive event corresponding to the candidate interactive event information and the current playing time of the currently played video clip;
and determining, as the target interactive event, the candidate interactive event whose starting display time is closest to and after the current playing time.
Optionally, the step of receiving a response operation of the user to the target interaction event includes:
and when the current playing time is equal to the starting display time of the target interaction event, creating a target display interface of the target interaction event, and displaying the target display interface.
Optionally, the target interaction event further includes an end presentation time, and the step of receiving a response operation of the user to the target interaction event further includes:
and receiving response operation of a user to the target display interface within the starting display time and the ending display time of the target interaction event.
Optionally, the step of playing the target video segment includes:
acquiring the jumping time of the currently played video clip and the starting time of the target video clip;
and when the current playing time is equal to the jumping time, playing the target video clip from the starting time of the target video clip.
According to a second aspect of the present invention, there is provided a video playing apparatus applied to an interactive video, the interactive video including a plurality of video segments, the video segments including at least one interactive event, the apparatus including:
the device comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring a target interaction event corresponding to a currently played video clip, the target interaction event corresponds to a plurality of options, and each option is associated with one video clip to be played;
the loading module is used for loading the video clip to be played;
the receiving module is used for receiving response operation of a user to the target interaction event;
a determining module, configured to determine, from the loaded video segments to be played, a target video segment corresponding to the response operation;
and the playing module is used for playing the target video clip.
Optionally, the apparatus further comprises:
the description file acquisition module is used for acquiring a description file of the interactive video, wherein the description file comprises interactive event information corresponding to interactive events of video clips of the interactive video;
the obtaining module is configured to obtain a target interactive event corresponding to the currently played video segment from the interactive events corresponding to the interactive event information.
Optionally, the obtaining module includes:
the first obtaining submodule is used for obtaining a current video identifier corresponding to the currently played video clip;
the first searching submodule is used for searching the interactive event information for candidate interactive event information corresponding to the current video identifier;
a second obtaining submodule, configured to obtain the starting display time of the candidate interactive event corresponding to the candidate interactive event information and the current playing time of the currently played video segment;
and a first determining submodule, configured to determine, as the target interactive event, the candidate interactive event whose starting display time is closest to and after the current playing time.
Optionally, the apparatus further comprises:
and the display module is used for creating a target display interface of the target interaction event and displaying the target display interface when the current playing time is equal to the starting display time of the target interaction event.
Optionally, the target interaction event further includes an end presentation time, and the receiving module is configured to:
and receiving response operation of a user to the target display interface within the starting display time and the ending display time of the target interaction event.
Optionally, the playing module includes:
a third obtaining sub-module, configured to obtain a skip time of the currently played video segment and a start time of the target video segment;
and the second determining submodule is used for playing the target video clip from the starting time of the target video clip when the current playing time is equal to the jumping time.
According to a third aspect of the present invention, there is provided a terminal comprising: a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing a video playback method as claimed in any one of the first aspects.
According to a fourth aspect of the present invention, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the video playback method as defined in any one of the first aspects.
Compared with the background art, the embodiment of the invention has the following advantages:
In the embodiment of the invention, the interactive video comprises a plurality of video segments, each of which comprises at least one interactive event. The target interactive event corresponding to the currently played video segment is obtained; it corresponds to a plurality of options, each associated with one video segment to be played. The video segments to be played are then loaded, which ensures that whichever is chosen plays smoothly. Next, the user's response operation to the target interactive event is received, which strengthens the interaction between the user and the video; the corresponding target video segment is determined from the loaded video segments according to that response; and finally the target video segment is played, so that the user decides how the plot develops and the video content is enriched. While the user watches the video, the user's initiative is engaged, the viewing experience is improved, and the viewing value of the video is increased.
Drawings
Fig. 1 is a schematic flowchart illustrating steps of a video playing method according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating steps of another video playing method according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating steps of a specific implementation of step 202 shown in FIG. 2;
FIG. 4 is a flowchart illustrating steps of a specific implementation of step 207 shown in FIG. 2;
fig. 5 is a block diagram of a design of a video playback method according to an embodiment of the present invention;
fig. 6 is a block diagram of a video playback device according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Referring to fig. 1, a flowchart of the steps of a video playing method according to an embodiment of the present invention is shown. The method is applied to an interactive video that includes a plurality of video clips, each of which may include at least one interactive event, so that while an electronic device plays the interactive video, the user can interact with the video content by operating the device, deciding how the story develops or playing more of the content that interests him or her. Electronic devices include, but are not limited to, smart phones, tablets, laptops, palmtop computers, smart televisions, and the like. The method may specifically include the following steps:
step 101, acquiring a target interaction event corresponding to a currently played video clip, where the target interaction event corresponds to a plurality of options, and each option is associated with a video clip to be played.
In the embodiment of the invention, an interactive event is an event that allows the user to make an autonomous choice while a video clip of the interactive video is playing; one video clip may contain one or more interactive events. When a video clip contains several interactive events, they are displayed one after another in chronological order. An interactive event is a channel for exchanging information between a person and the electronic device: the device presents information to the user through the interactive event for the user to read, analyse and judge, and the user feeds information back to the device through the interactive event, and the device then performs the corresponding operation. The target interactive event is the interactive event that is about to occur in the currently played video clip. It corresponds to a plurality of options from which the user selects. Specifically, several selection boxes of the target interactive event may be displayed on the screen for the user to pick one of them; alternatively, a prompt box and an input box may be displayed, and the user enters the corresponding selection in the input box according to the information in the prompt box. Each option is associated with one video clip to be played. It can be understood that different users may select different selection boxes or enter different results, and the video clips associated with those choices may differ, so that the plot develops according to the user's selection, or the order and content of playback are determined by it.
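For illustration, the relationship between video clips, interactive events, options and the clips to be played can be captured in a small data model. The sketch below is only one possible representation; the interface and field names are assumptions made here, not terms defined by the application:

```typescript
// Illustrative data model for an interactive video (all names are assumptions of this sketch).
interface InteractionOption {
  id: string;                     // option identifier
  label: string;                  // text rendered in the selection box
  nextSegmentId: string;          // identifier of the video clip to be played if this option is chosen
}

interface InteractionEvent {
  id: string;
  segmentId: string;              // identifier of the clip this event belongs to
  startDisplayTime: number;       // seconds into the clip at which the event appears on screen
  endDisplayTime: number;         // seconds into the clip at which the event disappears
  prompt: string;                 // information shown in the prompt box
  options: InteractionOption[];   // each option maps to exactly one clip to be played
}

interface VideoClip {
  id: string;
  url: string;
  events: InteractionEvent[];     // one or more interactive events, shown in time order
}
```

Such a model is enough to express that every option points at exactly one clip to be played, which is what the preloading in step 102 relies on.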
Step 102, loading the video clips to be played.
In the embodiment of the invention, after the target interaction event is determined, the identifier of the video clip associated with each of its options can be determined from the options, all the video clips associated with the target interaction event can then be obtained through those identifiers, and all of them are loaded in advance. In the subsequent steps, whichever option the user selects, the associated video clip is therefore ready to start playing, avoiding problems such as stuttering or a rough transition between the two video clips.
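As a rough sketch of this preloading step, assuming the illustrative data model above, a fetch-based download and an in-memory cache (all of which are assumptions of the sketch rather than details of the claimed method):

```typescript
// A minimal preloading sketch: load every clip reachable from the event's options in advance.
const segmentCache = new Map<string, Blob>();

async function preloadSegments(
  event: { options: { nextSegmentId: string }[] },
  urlOf: (segmentId: string) => string          // resolves a clip identifier to its download URL
): Promise<void> {
  // Whichever option the user eventually picks, its clip is then already available locally.
  await Promise.all(event.options.map(async (option) => {
    if (segmentCache.has(option.nextSegmentId)) return;           // already loaded
    const response = await fetch(urlOf(option.nextSegmentId));    // download the clip
    segmentCache.set(option.nextSegmentId, await response.blob());
  }));
}
```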
Step 103, receiving the user's response operation to the target interaction event.
In the embodiment of the present invention, the response operation is the response the user makes to the target interaction event. Since the target interaction event corresponds to a plurality of options, the response operation can be regarded as the operation of selecting one option from those options. Specifically, the response operation includes a selection operation and an input operation. When the target interactive event of the currently played video clip is displayed on the screen of the electronic device, the user performs the corresponding operation according to the information the event provides. For example, suppose the video shows three closed doors numbered 1, 2 and 3, and the target interaction event displayed on the screen offers three selection boxes: the first reads "open door 1", the second "open door 2" and the third "open door 3". The user may select one of the boxes by a touch operation on the device's touch screen, or move a highlight between the boxes with the left and right direction keys and confirm the highlighted box with the confirmation key. As another example, with the same three doors, the target interaction event may instead consist of a prompt box and an input box, the prompt box reading "please enter the number of the door you want to open"; the user enters 1, 2 or 3 in the input box to determine how the plot develops next, for instance by pressing the corresponding numeric key on the device, or by writing the number on the device's handwriting screen to indicate which door is to be opened.
Step 104, determining, from the loaded video clips to be played, the target video clip corresponding to the response operation.
In the embodiment of the invention, the target video clip is determined according to the user's response operation to the target interaction event. The target interaction event corresponds to a plurality of options, each of which is associated with one video clip to be played; the target option selected by the user can be determined from the response operation, and the video clip to be played that is associated with the target option is then determined to be the target video clip.
Because all the video clips to be played that are associated with the target interactive event are loaded before the user's response operation is received, the target video clip is among the loaded video clips to be played.
For example, suppose the currently played video clip lasts 10 minutes in total, the display time of its target interaction event is the 8th minute of the clip, and the event has three options: the first is associated with video clip a, the second with video clip b, and the third with video clip c. When the current clip starts playing, clips a, b and c are all loaded into memory; when playback reaches the 8th minute, the target interaction event is displayed on the screen and the user makes a selection. If the user chooses the second option, the corresponding target video clip is clip b among the three loaded clips.
If no response operation of the user to the target interaction event is received, the target video clip is determined according to a default rule. Specifically, the default rule may be that, when no response operation is received, the video clip associated with one specific option of the target interaction event becomes the target video clip. Equivalently, if the received response operation is null, the user's response is taken by default to be the selection of that specific option, and the video clip associated with it is determined to be the target video clip. It should be noted that the specific option may be any one of the options of the target interaction event, chosen by the video producer when the script of the description file is written. In another example, if no response operation to the target interaction event is received, playback of the current video clip is paused.
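Step 104 and the default rule can be sketched together as a small lookup; the default option identifier and the field names are assumptions used only for illustration:

```typescript
// Resolving the target video clip from the response operation (illustrative; names are assumptions).
interface Option {
  id: string;             // option identifier
  nextSegmentId: string;  // video clip to be played if this option is chosen
}

function resolveTargetSegment(
  options: Option[],
  selectedOptionId: string | null,  // null when no response operation was received in time
  defaultOptionId: string           // producer-defined specific option from the description file
): string {
  const chosenId = selectedOptionId ?? defaultOptionId;               // the default rule
  const option = options.find((o) => o.id === chosenId) ?? options[0]; // fall back to the first option
  return option.nextSegmentId;      // this clip is already among the preloaded clips
}
```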
Step 105, playing the target video clip.
In the embodiment of the invention, the target video clip is determined according to the user's response operation to the target interaction event, and once determined it is played. Specifically, the target video clip may be played immediately after the response operation is received; or a jump node may be set in the currently played video clip, and the target video clip is played when the current clip reaches that node; or a jump time node may be set in the target video clip, and the target video clip is played from that node onward. When the target video clip starts playing, it becomes the currently played video clip, and the method returns to step 101.
In summary, the video playing method provided by the embodiment of the present invention is applied to an interactive video comprising a plurality of video segments. The target interaction event corresponding to the currently played video segment is obtained; it corresponds to a plurality of options, each associated with one video segment to be played. The video segments to be played are then loaded, which ensures that whichever is chosen plays smoothly. Next, the user's response operation to the target interaction event is received, which strengthens the interaction between the user and the video; the corresponding target video segment is determined from the loaded video segments according to that response; and finally the target video segment is played, so that the user decides how the plot develops and the video content is enriched. While the user watches the video, the user's initiative is engaged, the viewing experience is improved, and the viewing value of the video is increased.
Referring to fig. 2, a schematic flowchart illustrating steps of a video playing method according to another embodiment of the present invention is shown, where the method may include:
step 201, obtaining a description file of the interactive video, where the description file includes interactive event information corresponding to interactive events of all video segments.
In the embodiment of the invention, the interactive video comes with a description file that contains the interactive event information corresponding to all interactive events in the interactive video. Specifically, when the interactive video starts playing, the player obtains the description file corresponding to it from a background server. The file format of the description file may be json or xml: json (JavaScript Object Notation) is a lightweight data-interchange format, and xml, a subset of the Standard Generalized Markup Language, is a markup language for structuring electronic documents. Both mark up data in a structured way, which simplifies the writing of the description-file script and smooths over differences in data structures and schemas between platforms and systems, improving the applicability of the description file. When an interactive video goes online, only the corresponding description file needs to be edited, which reduces the cost of developing and deploying interactive videos and makes large-scale production possible.
By parsing the description file, a video information list can be obtained, and the list or the entire description file is stored locally, in memory or on disk, so that it can later be read directly from local storage; this speeds up access and removes the dependence on the network when the information is needed again. The video information list contains descriptions of all interactive events, that is, the interactive event information corresponding to every interactive event. For each interactive event it records the identifier of the video to which the event belongs, which may be a video ID, a video name and so on; this identifier determines in which video segment the event occurs. For example, if an interactive event belongs to video ID 1 and the segment with video ID 1 is the first segment, the event may occur while the first segment is playing. The list also records the start display time and end display time of each interactive event, the options available for the user to select in each event, the identifier of the video clip associated with each option, the presentation form of each event, and so on. The start display time can be understood as the moment the interactive event appears on the device screen, and the end display time as the moment it disappears. Whichever video clip is currently playing, the interactive event information corresponding to all of its interactive events can therefore be obtained by acquiring the description file of the interactive video.
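For illustration only, a json description file along these lines might look as follows. The key names are assumptions of this example, not the actual schema used by the application; times are given in seconds, so the event below appears at the 5th minute (300 s), disappears at 5 minutes 10 seconds (310 s), and the jump out of the current segment happens at the 8th minute (480 s):

```json
{
  "videoList": [
    {
      "videoId": "VideoA",
      "url": "https://example.com/segments/videoA.mp4",
      "events": [
        {
          "eventId": "Event2",
          "startDisplayTime": 300,
          "endDisplayTime": 310,
          "jumpTime": 480,
          "presentationForm": "selectionBox",
          "prompt": "Which door do you open?",
          "defaultOption": "door1",
          "options": [
            { "optionId": "door1", "label": "Open door 1", "nextVideoId": "VideoB", "startTime": 0 },
            { "optionId": "door2", "label": "Open door 2", "nextVideoId": "VideoC", "startTime": 180 }
          ]
        }
      ]
    }
  ]
}
```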
Step 202, obtaining a target interactive event corresponding to the currently played video clip from the interactive events corresponding to the interactive event information.
The target interactive event is an upcoming interactive event in the current playing process of the currently played video clip; all the interactive events can be obtained from all the interactive event information of the interactive video, and then the target interactive event is obtained from all the interactive events. The target interaction event corresponds to a plurality of options, and each option is associated with a video clip to be played.
Specifically, referring to fig. 3, step 202 can be implemented by the following steps:
step 2021, obtain the current video identifier corresponding to the currently played video clip.
In this step, each video clip has a unique corresponding video identifier, which may be a video ID or a video name. The current video identifier can be obtained from the currently played video clip, and conversely the corresponding video clip can be uniquely determined from a video identifier.
Step 2022, look up the candidate interactive event information corresponding to the current video identifier in the interactive event information.
In this step, the obtained interactive event information covers the interactive events of all video segments; it therefore contains both interactive event information whose video identifier is the same as the current video identifier of the currently played video clip and interactive event information whose video identifier is different from it. Only the interactive events whose information carries the same video identifier as the current one can be displayed in the currently played video clip. Looking up the interactive event information corresponding to the current video identifier therefore means finding the entries whose video identifier equals the current video identifier, and those entries are determined to be the candidate interactive event information. It should be noted that there may be one or more pieces of candidate interactive event information.
Step 2023, obtaining the start display time of the candidate interactive event corresponding to the candidate interactive event information and the current playing time of the currently played video clip.
In this step, each piece of candidate interactive event information corresponds to one candidate interactive event, and the start display time of a candidate interactive event is the point in time at which it begins to be shown on the screen of the electronic device; the current playing time of the currently played video clip is the playback progress of that clip. For example, suppose the currently played video clip is VideoA and the candidate interactive events are Event1 and Event2. VideoA lasts 10 minutes and has been playing for 4 minutes, so the current playing time is the 4th minute of VideoA; Event1 begins to be displayed at the 3rd minute of VideoA, meaning it can be shown when VideoA reaches its 3rd minute, and Event2 begins at the 5th minute, meaning it can be shown when VideoA reaches its 5th minute. Obtaining the start display time of the candidate interactive events and the current playing time of the currently played video clip is the precondition for determining the target interactive event.
Step 2024, determining, as the target interactive event, the candidate interactive event whose start display time is closest to and after the current playing time.
In this step, taking the current playing time of the currently played video clip as the starting point, the target interactive event is determined from among the candidate interactive events; it can be regarded as the next interactive event that will be shown on the screen of the electronic device. Continuing the example, the currently played video clip is VideoA, the candidate interactive events are Event1 and Event2, and the current playing time is the 4th minute of VideoA; Event1 starts to be displayed at the 3rd minute of VideoA and Event2 at the 5th minute. Of the two, the candidate interactive event whose start display time is after the current playing time is Event2, so Event2 is the target interactive event for the current playing time of VideoA.
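Steps 2021 to 2024 amount to filtering the interactive event information by the current video identifier and keeping the earliest event whose start display time has not yet been reached. A minimal sketch of that selection is given below; the field and function names are assumptions of the sketch, not terms from the application:

```typescript
// Selecting the target interactive event (illustrative; the field names are assumptions).
interface EventInfo {
  videoId: string;            // identifier of the clip the event belongs to
  startDisplayTime: number;   // seconds into that clip at which the event appears
}

function findTargetEvent<T extends EventInfo>(
  allEvents: T[],             // interactive event information parsed from the description file
  currentVideoId: string,     // step 2021: identifier of the currently played clip
  currentPlayTime: number     // current playing time, in seconds
): T | undefined {
  return allEvents
    .filter((e) => e.videoId === currentVideoId)             // step 2022: candidate events
    .filter((e) => e.startDisplayTime >= currentPlayTime)    // step 2024: not yet shown
    .sort((a, b) => a.startDisplayTime - b.startDisplayTime)[0]; // "closest" = earliest upcoming
}
```

In the VideoA example, calling this with a current playing time of 240 seconds would return Event2, matching the reasoning above.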
Further, after the target interactive event is determined, a target display interface corresponding to it can be created according to the event's description information in the description file. The target display interface reflects how the target interactive event is presented on the screen of the electronic device, for example whether it takes the form of selection boxes or of a prompt box plus an input box, as well as the background pattern, font, font size, colours and so on used when it is displayed. The target display interface is kept hidden until the start display time of the target interactive event.
Step 203, loading the video clip to be played.
The implementation manner of this step may refer to step 102, which is not described herein again in this embodiment of the present invention.
Step 204, when the current playing time is equal to the start display time of the target interaction event, creating a target display interface of the target interaction event, and displaying the target display interface.
In the embodiment of the invention, the current playing time is real-time and changes as the interactive video plays. When the current playing time equals the start display time of the target interaction event, the target display interface of the event is created and displayed. Continuing the example, the currently played clip is VideoA, lasting 10 minutes; the candidate interactive events are Event1 and Event2, the current playing time is the 4th minute of VideoA, Event1 starts to be displayed at the 3rd minute and Event2 at the 5th minute, so the target interactive event is Event2. When VideoA reaches its 5th minute, the current playing time equals the start display time of Event2, and the display interface of the target interaction event is created. The interface may comprise a prompt area and a response area: the prompt area provides information for the user to read, analyse and judge, and the response area receives the user's reply to that information, which may be entered text or figures, or a single or multiple selection among several options in the response area. The target display interface may be shown as a dialog box or a floating window on the screen of the electronic device.
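A minimal sketch of such a target display interface as a floating window, assuming a browser DOM; the element structure and names are assumptions of the sketch, not the claimed interface:

```typescript
// Illustrative target display interface: a prompt area plus one button per option.
interface DisplayOption { id: string; label: string; }

function showTargetInterface(
  prompt: string,                       // information for the prompt area
  options: DisplayOption[],             // one entry per selection box in the response area
  onSelect: (optionId: string) => void  // called with the user's choice
): HTMLElement {
  const overlay = document.createElement("div");   // floating window over the player
  const promptArea = document.createElement("p");  // prompt area: read, analyse, judge
  promptArea.textContent = prompt;
  overlay.appendChild(promptArea);
  for (const option of options) {                  // response area: one button per option
    const button = document.createElement("button");
    button.textContent = option.label;
    button.onclick = () => onSelect(option.id);
    overlay.appendChild(button);
  }
  document.body.appendChild(overlay);
  return overlay;                                  // the caller removes it at the end display time
}
```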
Step 205, the target interaction event further includes an end display time, and a response operation of the user to the target display interface is received between the start display time and the end display time of the target interaction event.
Specifically, the corresponding target interaction event information is determined from the target interaction event, and the end display time of the event is obtained from the description file according to that information. The display duration of the target interaction event can be enforced with a timer whose timing starts at the start display time and ends at the end display time. For example, if the start display time of the target interaction Event2 is the 5th minute of VideoA and the end display time is 5 minutes 10 seconds of VideoA, the display duration of Event2 is 10 seconds; once the current playing time passes 5 minutes 10 seconds of VideoA, that is, once the timer expires, for instance at 5 minutes 11 seconds of VideoA, the target display interface of Event2 disappears from, or is hidden on, the screen of the electronic device.
In the embodiment of the invention, the user's response operation to the target interaction event is received between the start display time and the end display time of the event; in other words, the response is received while the target interaction event is shown on the screen of the electronic device. The response operation includes a selection operation and an input operation: when the target interactive event of the currently played video clip is displayed, the user performs the corresponding operation according to the information it provides, thereby interacting with the currently played video clip.
If no response operation of the user to the target interaction event is received between its start display time and end display time, the received response operation is regarded as null.
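Treating the display window as a timeout on the user's response, with a null result when the timer expires first, can be sketched as follows; this is an assumption-laden illustration, not the claimed mechanism:

```typescript
// Waiting for the response operation within the display window (illustrative sketch).
function waitForResponse(
  displayDurationMs: number,                                   // (end display time - start display time) in ms
  onUserSelect: (handler: (optionId: string) => void) => void  // wires the handler to the display interface
): Promise<string | null> {
  return new Promise<string | null>((resolve) => {
    const timer = setTimeout(() => resolve(null), displayDurationMs); // window closed: null response
    onUserSelect((optionId) => {
      clearTimeout(timer);                                     // the user answered before the window closed
      resolve(optionId);
    });
  });
}
```

Because a promise settles only once, an answer arriving after the timer has fired is ignored, which matches treating a late answer as no answer at all.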
Step 206, determining a target video clip corresponding to the response operation from the loaded video clips to be played.
The implementation manner of this step may refer to step 104, which is not described herein again in this embodiment of the present invention.
Step 207, playing the target video clip.
Specifically, referring to fig. 4, step 207 can be implemented by the following steps:
step 2071, obtaining the jump time of the currently played video segment and the start time of the target video segment.
In this step, the jump time is set when the video is produced and refers to the first connection time point between the currently played video clip and the target video clip; it may be the end time of the currently played clip or a specific time node within it. For example, if the currently played clip is VideoA and lasts 10 minutes, the jump time may be the end of VideoA, i.e. its 10th minute, so that playback of VideoA ends when it reaches the 10th minute; or the jump time may be a specific node such as the 8th minute of VideoA, so that playback of VideoA ends at the 8th minute and its remaining 2 minutes are not played. It should be noted that the jump time of the currently played video clip comes after the end display time of the target interactive event.
The start time is likewise set during video production and refers to the second connection time point between the target video clip and the currently played clip; it may be the beginning of the target clip or a specific time node within it. For example, if the target video clip is VideoB and lasts 10 minutes, the start time may be the beginning of VideoB, i.e. its 0th minute, so that VideoB plays from the start; or it may be a specific node such as the 3rd minute of VideoB, so that playback begins from the 3rd minute and the part of VideoB before the 3rd minute is not played.
Step 2072, when the current playing time is equal to the jumping time, playing the target video clip from the starting time of the target video clip.
In this step, the jump time corresponds to a node on the progress bar of the currently played video clip, the progress bar indicating where the current playing time lies within the clip's total duration. When the current playing time equals the jump time, that is, when the position of the current playing time on the progress bar coincides with the node corresponding to the jump time, playback of the current clip ends and, at the same time, the target video clip starts playing from its start time. In this way the picture stays smooth and the story line stays continuous when jumping between the two clips.
For example, suppose the currently played clip is VideoA and its content shows a character walking forward towards a fork with two roads, one to the left and one to the right. The target display interface of the target interaction event is shown on the screen with two selection boxes: one reads "left", and the associated clip VideoB shows the character walking down the left-hand road; the other reads "right", and the associated clip VideoC shows the character walking down the right-hand road. The user may make the selection at any point between the start display time and the end display time of the event, possibly before the character has reached the fork. If the target clip VideoB or VideoC were played the instant the user chose, then after the cut the character would appear already past the fork, the turn already made, walking along the chosen road; to the user the video would look badly disjointed. In the embodiment of the invention, the jump time of the currently played clip and the start time of the target clip are set instead: in this example the jump time corresponds to the moment the character in VideoA reaches the fork, and the start time to the moment the character in the target clip starts walking from the turn. After the currently played clip is joined to the target clip, the character is therefore seen to walk up to the fork and then turn, the plot is coherent, and the user has a good viewing experience.
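A sketch of step 2072 under the assumption of an HTML5 video element, with the target segment already preloaded (for instance as an object URL of a cached blob); the function and parameter names are illustrative only:

```typescript
// Switching clips at the jump time (illustrative; assumes an HTML5 <video> element).
function scheduleJump(
  player: HTMLVideoElement,
  jumpTime: number,          // second of the current clip at which to cut away
  targetUrl: string,         // URL of the preloaded target clip
  targetStartTime: number    // second of the target clip at which to start
): void {
  const onTimeUpdate = () => {
    if (player.currentTime < jumpTime) return;        // keep playing the current clip
    player.removeEventListener("timeupdate", onTimeUpdate);
    player.src = targetUrl;                           // switch to the target clip
    player.addEventListener("loadedmetadata", () => {
      player.currentTime = targetStartTime;           // skip the part before the start time
      void player.play();                             // play() returns a promise; ignored here
    }, { once: true });
  };
  player.addEventListener("timeupdate", onTimeUpdate);
}
```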
Referring to fig. 5, a block diagram of a design scheme of a video playing method according to an embodiment of the present invention is shown.
When the interactive video starts playing, the player is initialised, the description file of the interactive video is obtained and parsed to get the interactive event information corresponding to all interactive events, and the current video segment is played. Whether the current video segment has an interactive event is determined from the description file. If it has none, playback continues according to the default rule, that is, in the playing order the producer fixed when the interactive video was made. If it has one, the target interaction event is confirmed and all video segments associated with it are loaded, so that when the corresponding segment is to be played the video file can be read directly from local storage; playback is then unaffected by network conditions and remains smooth. The player then monitors whether the display condition of the target interaction event is met: when it is, that is, when the current playing time equals the start display time of the event, the target interaction event is shown on the screen through the target display interface; when it is not, monitoring continues. Next, the user's response operation to the target interaction event is received within the event's display duration, and when the current playing time equals the end display time of the event, the target video segment corresponding to the response operation, or otherwise the default video segment, is played. When the target or default video segment starts playing, the flow returns to the step of playing the current video segment. The interactive video playing method provided by the embodiment of the invention enriches the video content, engages the user's initiative while the user watches the video, improves the viewing experience, and increases the viewing value of the video.
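Putting the flow of fig. 5 together, the control loop might be reconstructed roughly as below. Every name here is an assumption of the sketch, and the helpers are injected so the sketch stays self-contained; it is an illustration of the described flow, not the patented player:

```typescript
// Illustrative reconstruction of the fig. 5 flow; all dependencies are injected.
interface LoopEvent {
  startDisplayTime: number;     // seconds into the current segment
  endDisplayTime: number;
  jumpTime: number;             // second of the current segment at which to cut away
  defaultOptionId: string;      // producer-defined default option
  options: { id: string; nextSegmentId: string }[];
}

interface PlayerDeps {
  nextEvent(segmentId: string): LoopEvent | undefined;         // from the parsed description file
  preload(segmentIds: string[]): Promise<void>;                // load candidate segments up front
  playUntil(segmentId: string, time: number | "end"): Promise<void>;
  askUser(options: { id: string }[], timeoutMs: number): Promise<string | null>;
  defaultNext(segmentId: string): string | undefined;          // producer-defined playing order
}

async function runInteractiveVideo(firstSegmentId: string, deps: PlayerDeps): Promise<void> {
  let segmentId: string | undefined = firstSegmentId;
  while (segmentId !== undefined) {
    const event = deps.nextEvent(segmentId);
    if (event === undefined) {                                 // no interactive event: default rule
      await deps.playUntil(segmentId, "end");
      segmentId = deps.defaultNext(segmentId);
      continue;
    }
    await deps.preload(event.options.map((o) => o.nextSegmentId));   // all candidates, in advance
    await deps.playUntil(segmentId, event.startDisplayTime);         // display condition met
    const chosen = await deps.askUser(
      event.options,
      (event.endDisplayTime - event.startDisplayTime) * 1000);       // response window
    const option =
      event.options.find((o) => o.id === chosen) ??
      event.options.find((o) => o.id === event.defaultOptionId) ??
      event.options[0];                                              // null response: default option
    await deps.playUntil(segmentId, event.jumpTime);                 // finish the current segment at its jump time
    segmentId = option.nextSegmentId;                                // the target segment becomes current
    // A fuller player would also start the target segment from its own start time.
  }
}
```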
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 6, a schematic block diagram of a video playback apparatus according to an embodiment of the present invention is shown. The video playback apparatus is applied to an interactive video, the interactive video includes a plurality of video segments, each video segment includes at least one interactive event, and the apparatus includes:
an obtaining module 601, configured to obtain a target interaction event corresponding to a currently played video segment, where the target interaction event corresponds to a plurality of options, and each option is associated with a video segment to be played;
a loading module 602, configured to load the video segment to be played;
a receiving module 603, configured to receive a response operation of the user to the target interaction event;
a determining module 604, configured to determine, from the loaded video segments to be played, a target video segment corresponding to the response operation;
a playing module 605, configured to play the target video segment.
In a preferred embodiment of the present invention, the apparatus further comprises:
the description file acquisition module is used for acquiring a description file of the interactive video, wherein the description file comprises interactive event information corresponding to interactive events of video clips of the interactive video;
the obtaining module 601 is configured to obtain a target interactive event corresponding to the currently played video segment from the interactive events corresponding to the interactive event information.
In a preferred embodiment of the present invention, the obtaining module 601 includes:
the first obtaining submodule is used for obtaining a current video identifier corresponding to the currently played video clip;
the first searching submodule is used for searching the interactive event information for candidate interactive event information corresponding to the current video identifier;
a second obtaining submodule, configured to obtain the starting display time of the candidate interactive event corresponding to the candidate interactive event information and the current playing time of the currently played video segment;
and a first determining submodule, configured to determine, as the target interactive event, the candidate interactive event whose starting display time is closest to and after the current playing time.
In a preferred embodiment of the present invention, the apparatus further comprises:
and the display module is used for creating a target display interface of the target interaction event and displaying the target display interface when the current playing time is equal to the starting display time of the target interaction event.
In a preferred embodiment of the present invention, the target interaction event further includes an end presentation time, and the receiving module 603 is further configured to:
and receiving response operation of a user to the target display interface within the starting display time and the ending display time of the target interaction event.
In a preferred embodiment of the present invention, the playing module 605 includes:
a third obtaining sub-module, configured to obtain a skip time of the currently played video segment and a start time of the target video segment;
and the second determining submodule is used for playing the target video clip from the starting time of the target video clip when the current playing time is equal to the jumping time.
In a preferred embodiment of the present invention, the file format of the description file is json format or xml format.
The video playing device provided by the embodiment of the invention obtains a description file of the interactive video, the description file containing the interactive event information corresponding to the interactive events of the video segments of the interactive video; because the description file uses the json or xml format, the difficulty of writing its script and the cost of development and deployment are reduced. The target interactive event corresponding to the currently played video segment is looked up among the interactive events corresponding to the interactive event information of the description file; it corresponds to a plurality of options, each associated with one video segment to be played. The video segments to be played are then loaded, ensuring that whichever is chosen plays smoothly. Next, the user's response operation to the target interaction event is received, strengthening the interaction between the user and the video, and the corresponding target video segment is determined from the loaded video segments according to that response. Finally the target video segment is played; at that point the jump time of the currently played segment and the start time of the target segment are used to make the junction between the two segments smooth and natural, so that the user decides how the plot develops and the video content is enriched. While the user watches the video, the user's initiative is engaged, the viewing experience is improved, and the viewing value of the video is increased.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The video playing method and the video playing apparatus provided by the present invention are described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (14)

1. A video playing method applied to an interactive video, wherein the interactive video comprises a plurality of video segments, and the video segments comprise at least one interactive event, the method comprising:
acquiring a target interaction event corresponding to a currently played video segment, wherein the target interaction event corresponds to a plurality of options, and each option is associated with a video segment to be played;
loading the video segment to be played;
receiving a response operation of a user on the target interaction event;
determining, from the loaded video segments to be played, a target video segment corresponding to the response operation; and
playing the target video segment.
2. The method of claim 1, further comprising, before the acquiring of the target interaction event corresponding to the currently played video segment:
acquiring a description file of the interactive video, wherein the description file comprises interactive event information corresponding to interactive events of video segments of the interactive video;
wherein the acquiring of the target interaction event corresponding to the currently played video segment comprises:
acquiring the target interaction event corresponding to the currently played video segment from the interactive events corresponding to the interactive event information.
3. The method according to claim 2, wherein the step of acquiring the target interaction event corresponding to the currently played video segment from the interactive events corresponding to the interactive event information comprises:
acquiring a current video identifier corresponding to the currently played video segment;
searching the interactive event information for prepared interactive event information corresponding to the current video identifier;
acquiring the start display time of a prepared interactive event corresponding to the prepared interactive event information and the current playing time of the currently played video segment; and
determining, as the target interaction event, the prepared interactive event whose start display time is closest to and after the current playing time.
4. The method of claim 3, wherein the step of receiving the response operation of the user on the target interaction event comprises:
when the current playing time is equal to the start display time of the target interaction event, creating a target display interface of the target interaction event, and displaying the target display interface.
5. The method of claim 4, wherein the target interaction event further comprises an end display time, and the step of receiving the response operation of the user on the target interaction event further comprises:
receiving a response operation of the user on the target display interface between the start display time and the end display time of the target interaction event.
6. The method of claim 5, wherein the step of playing the target video segment comprises:
acquiring the jump time of the currently played video segment and the start time of the target video segment; and
when the current playing time is equal to the jump time, playing the target video segment from the start time of the target video segment.
7. A video playback apparatus applied to an interactive video, the interactive video comprising a plurality of video segments, the video segments comprising at least one interactive event, the apparatus comprising:
an obtaining module, configured to obtain a target interaction event corresponding to a currently played video segment, wherein the target interaction event corresponds to a plurality of options, and each option is associated with one video segment to be played;
a loading module, configured to load the video segment to be played;
a receiving module, configured to receive a response operation of a user on the target interaction event;
a determining module, configured to determine, from the loaded video segments to be played, a target video segment corresponding to the response operation; and
a playing module, configured to play the target video segment.
8. The apparatus of claim 7, further comprising:
a description file acquisition module, configured to acquire a description file of the interactive video, wherein the description file comprises interactive event information corresponding to interactive events of video segments of the interactive video;
wherein the obtaining module is configured to obtain the target interaction event corresponding to the currently played video segment from the interactive events corresponding to the interactive event information.
9. The apparatus of claim 8, wherein the obtaining module comprises:
a first obtaining sub-module, configured to obtain a current video identifier corresponding to the currently played video segment;
a first searching sub-module, configured to search the interactive event information for prepared interactive event information corresponding to the current video identifier;
a second obtaining sub-module, configured to obtain the start display time of a prepared interactive event corresponding to the prepared interactive event information and the current playing time of the currently played video segment; and
a first determining sub-module, configured to determine, as the target interaction event, the prepared interactive event whose start display time is closest to and after the current playing time.
10. The apparatus of claim 9, further comprising:
a display module, configured to create a target display interface of the target interaction event and display the target display interface when the current playing time is equal to the start display time of the target interaction event.
11. The apparatus of claim 10, wherein the target interaction event further comprises an end display time, and wherein the receiving module is configured to:
receive a response operation of the user on the target display interface between the start display time and the end display time of the target interaction event.
12. The apparatus of claim 11, wherein the playing module comprises:
a third obtaining sub-module, configured to obtain a jump time of the currently played video segment and a start time of the target video segment; and
a second determining sub-module, configured to play the target video segment from the start time of the target video segment when the current playing time is equal to the jump time.
13. A terminal, comprising: a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the video playing method according to any one of claims 1 to 6.
14. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the video playing method according to any one of claims 1 to 6.
CN201911320942.7A 2019-12-19 2019-12-19 Video playing method, device, terminal and storage medium Pending CN111031395A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911320942.7A CN111031395A (en) 2019-12-19 2019-12-19 Video playing method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911320942.7A CN111031395A (en) 2019-12-19 2019-12-19 Video playing method, device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN111031395A true CN111031395A (en) 2020-04-17

Family

ID=70212201

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911320942.7A Pending CN111031395A (en) 2019-12-19 2019-12-19 Video playing method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111031395A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110126106A1 (en) * 2008-04-07 2011-05-26 Nitzan Ben Shaul System for generating an interactive or non-interactive branching movie segment by segment and methods useful in conjunction therewith
CN105472456A (en) * 2015-11-27 2016-04-06 北京奇艺世纪科技有限公司 Video playing method and device
CN106998486A (en) * 2016-01-22 2017-08-01 百度在线网络技术(北京)有限公司 Video broadcasting method and device
CN106210836A (en) * 2016-07-28 2016-12-07 广东小天才科技有限公司 Interactive learning method and device in a kind of video display process, terminal unit

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111669626A (en) * 2020-06-10 2020-09-15 北京奇艺世纪科技有限公司 Method and device for determining default play relationship of videos and electronic equipment
CN111726694A (en) * 2020-06-30 2020-09-29 北京奇艺世纪科技有限公司 Interactive video recovery playing method and device, electronic equipment and storage medium
CN111726694B (en) * 2020-06-30 2022-06-03 北京奇艺世纪科技有限公司 Interactive video recovery playing method and device, electronic equipment and storage medium
CN111935548A (en) * 2020-08-11 2020-11-13 深圳市前海手绘科技文化有限公司 Interactive hand-drawn video production method
WO2022033129A1 (en) * 2020-08-11 2022-02-17 深圳市前海手绘科技文化有限公司 Interactive hand-drawn video production method
WO2022183866A1 (en) * 2021-03-04 2022-09-09 上海哔哩哔哩科技有限公司 Method and apparatus for generating interactive video
CN113315996A (en) * 2021-05-17 2021-08-27 游艺星际(北京)科技有限公司 Method and device for controlling video playing and electronic equipment
CN113315996B (en) * 2021-05-17 2023-08-18 游艺星际(北京)科技有限公司 Method and device for controlling video playing and electronic equipment
WO2023093451A1 (en) * 2021-11-26 2023-06-01 北京字跳网络技术有限公司 Live-streaming interaction method and apparatus in game, and computer device and storage medium
CN114979782A (en) * 2022-06-28 2022-08-30 北京爱奇艺科技有限公司 Video playing method and device, electronic equipment and storage medium
CN115460468A (en) * 2022-08-10 2022-12-09 北京爱奇艺科技有限公司 Interactive video file creating method and interactive video playing method and device
CN115460468B (en) * 2022-08-10 2023-09-15 北京爱奇艺科技有限公司 Interactive video file creation method, interactive video playing method, device, electronic equipment and medium

Similar Documents

Publication Publication Date Title
CN111031395A (en) Video playing method, device, terminal and storage medium
CN108650555B (en) Video interface display method, interactive information generation method, player and server
US20180126279A1 (en) Apparatus and methods for multimedia games
CN111031379B (en) Video playing method, device, terminal and storage medium
TWI538498B (en) Methods and apparatus for keyword-based, non-linear navigation of video streams and other content
US20140019865A1 (en) Visual story engine
US8151179B1 (en) Method and system for providing linked video and slides from a presentation
EP1919216A1 (en) Personalised media presentation
WO2017206748A1 (en) Video playing control method and apparatus, and video playing system
JP2006514322A (en) Video-based language learning system
CN104703055A (en) Locating method and device of video playing
CN103488661A (en) Audio/video file annotation system
CN111565330A (en) Synchronous subtitle adding method and device, electronic equipment and storage medium
CN111711861B (en) Video processing method and device, electronic equipment and readable storage medium
CN111654754A (en) Video playing method and device, electronic equipment and readable storage medium
CN114339285B (en) Knowledge point processing method, video processing method, device and electronic equipment
CN110958470A (en) Multimedia content processing method, device, medium and electronic equipment
US20110016396A1 (en) Content media reproduction device and content media
Sadallah et al. CHM: an annotation- and component-based hypervideo model for the Web
CN112104908A (en) Audio and video file playing method and device, computer equipment and readable storage medium
TWI575457B (en) System and method for online editing and exchanging interactive three dimension multimedia, and computer-readable medium thereof
KR101703321B1 (en) Method and apparatus for providing contents complex
Renz et al. Optimizing the video experience in moocs
CN113613056A (en) Animation special effect display method and device, electronic equipment and medium
CN104883614A (en) WEB video playing method based on Adobe FlashPlayer and Jquery frame

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200417