CN111225225B - Live broadcast playback method, device, terminal and storage medium

Live broadcast playback method, device, terminal and storage medium

Info

Publication number
CN111225225B
CN111225225B (application CN201811428963.6A)
Authority
CN
China
Prior art keywords
video stream
live
terminal
interactive item
live video
Prior art date
Legal status
Active
Application number
CN201811428963.6A
Other languages
Chinese (zh)
Other versions
CN111225225A (en)
Inventor
汤进伟
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201811428963.6A priority Critical patent/CN111225225B/en
Publication of CN111225225A publication Critical patent/CN111225225A/en
Application granted granted Critical
Publication of CN111225225B publication Critical patent/CN111225225B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2387Stream processing in response to a playback request from an end-user, e.g. for trick-play
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Embodiments of the present invention disclose a live broadcast playback method, apparatus, terminal, and storage medium. The live broadcast playback method includes the following steps: receiving a live viewing request sent by a first terminal and sending a live video stream to the first terminal; acquiring an interactive item initiated by an anchor; merging the interactive item into the live video stream and sending the live video stream merged with the interactive item to the first terminal; recording the time correspondence between the interactive item and the live video stream; receiving a video playback request sent by a second terminal and sending the live video stream to the second terminal; and simulating, according to the time correspondence, the anchor initiating the interactive item at the corresponding moment to obtain the live video stream merged with the interactive item, and sending the live video stream merged with the interactive item to the second terminal. Embodiments of the present invention enable viewers watching a playback to have the same interactive experience as viewers watching the live broadcast.

Description

Live broadcast playback method, device, terminal and storage medium
Technical Field
Embodiments of the present invention relate to the technical field of live broadcasting, and in particular to a live broadcast playback method, apparatus, terminal, and storage medium.
Background
With the development of computer and network technology, webcasting has become widely popular. An anchor can perform a live broadcast by logging into a live broadcast application, and viewers can log into the application and enter a live room of interest to watch the live content. A viewer who did not have time to watch the live broadcast may also watch a playback of it.
In the prior art, viewers can interact with the anchor during the live broadcast: for example, viewers can post comments on the live content, which float over the live screen as bullet-screen comments, or give virtual gifts to the anchor. That is, the interaction modes provided by the prior art apply mainly to the live broadcasting process; for the playback process, an interaction implementation scheme is lacking.
Disclosure of Invention
In view of this, embodiments of the present invention provide a live broadcast playback method, apparatus, terminal, and storage medium that provide an interaction scheme during playback, so that a viewer watching a playback has the same interactive experience as a viewer watching the live broadcast.
The live broadcast playback method provided by the embodiments of the present invention includes the following steps:
receiving a live viewing request sent by a first terminal, and sending a live video stream to the first terminal;
acquiring an interactive item initiated by an anchor;
merging the interactive item into the live video stream, and sending the live video stream merged with the interactive item to the first terminal;
recording the time correspondence between the interactive item and the live video stream;
receiving a video playback request sent by a second terminal, and sending the live video stream to the second terminal;
and simulating, according to the time correspondence, the anchor initiating the interactive item at the corresponding moment to obtain the live video stream merged with the interactive item, and sending the live video stream merged with the interactive item to the second terminal.
The live playback apparatus provided by the embodiments of the present invention includes:
a first transceiving unit, configured to receive a live viewing request sent by a first terminal and send a live video stream to the first terminal;
a first acquiring unit, configured to acquire an interactive item initiated by an anchor;
a merging unit, configured to merge the interactive item into the live video stream and send the live video stream merged with the interactive item to the first terminal;
a recording unit, configured to record the time correspondence between the interactive item and the live video stream;
a second transceiving unit, configured to receive a video playback request sent by a second terminal and send the live video stream to the second terminal;
a simulation unit, configured to simulate, according to the time correspondence, the anchor initiating the interactive item at the corresponding moment to obtain a live video stream merged with the interactive item;
the second transceiving unit being further configured to send the live video stream merged with the interactive item to the second terminal.
An embodiment of the present invention further provides a terminal including a memory and a processor, where the memory stores a computer program, and when the computer program is executed by the processor, the processor performs the live broadcast playback method provided by the embodiments of the present invention.
An embodiment of the present invention further provides a storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor to perform the live broadcast playback method provided by the embodiments of the present invention.
In the embodiments of the present invention, during a live broadcast in which the anchor initiates an interactive item and the viewers watching the live broadcast interact with it, the anchor terminal records the time correspondence between the interactive item initiated by the anchor and the live video stream. During playback, the anchor terminal can simulate the anchor initiating the interactive item at the corresponding moment according to the recorded time correspondence, so that viewers watching the playback can also experience the interactive item. Viewers watching the playback thus have the same interactive experience as viewers watching the live broadcast, which meets the need for interaction during playback.
Drawings
Fig. 1 is a scene schematic diagram of a live playback system provided in an embodiment of the present invention.
Fig. 2a is a schematic flow chart of a live playback method according to an embodiment of the present invention.
Fig. 2b is a schematic view of a live interface according to an embodiment of the present invention.
Fig. 2c is a schematic diagram of an interactive item acquisition interface according to an embodiment of the present invention.
Fig. 2d is a schematic diagram of an interactive item display interface according to an embodiment of the present invention.
Fig. 3a is another schematic flow chart of a live playback method according to an embodiment of the present invention.
Fig. 3b is a schematic diagram of a question display interface according to an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of a live playback apparatus according to an embodiment of the present invention.
Fig. 5 is another schematic structural diagram of a live playback apparatus according to an embodiment of the present invention.
Fig. 6 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment.
In the description that follows, specific embodiments of the present application are described with reference to steps and symbols executed by one or more computers, unless otherwise indicated. These steps and operations are therefore at times described as being performed by a computer, meaning that a processing unit of the computer manipulates electronic signals representing data in a structured form. This manipulation transforms the data or maintains it at locations in the computer's memory system, which may reconfigure or otherwise alter the operation of the computer in a manner well known to those skilled in the art. The data are maintained in data structures, that is, physical locations of the memory having particular characteristics defined by the data format. However, while the principles of the application are described in the foregoing terms, this is not intended as a limitation, and those of ordinary skill in the art will recognize that various of the steps and operations described below may also be implemented in hardware.
The term module, as used herein, may be considered a software object executing on the computing system. The various components, modules, engines, and services described herein may be viewed as objects implemented on the computing system. The apparatus and method described herein may be implemented in software, but may also be implemented in hardware, and are within the scope of the present application.
The terms "first", "second", and "third", etc. in this application are used to distinguish between different objects and not to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to only those steps or modules listed, but rather, some embodiments may include other steps or modules not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Because the prior art lacks an interaction scheme for the playback of a live video, a viewer watching a playback cannot obtain the same interactive experience as a viewer watching the live broadcast. Embodiments of the present invention therefore provide a live playback method that gives viewers watching a playback the same interactive experience as viewers watching the live broadcast, meeting the need for interaction during playback.
The live playback method provided by the embodiments of the present invention may be implemented in a live playback apparatus, and the live playback apparatus may be integrated in a terminal or other device having a storage unit, a microprocessor, and computing capability, where the terminal includes a mobile phone, a tablet computer, a personal computer, and the like.
Taking a live playback apparatus integrated in a terminal as an example, please refer to fig. 1, which is a scene schematic diagram of a live playback system according to an embodiment of the present invention. The live playback system may include a terminal of the anchor (i.e., an anchor terminal on which a live application is installed), a terminal of a viewer watching the live broadcast (i.e., a first terminal on which the live application is installed), and a terminal of a viewer watching the playback (i.e., a second terminal on which the live application is also installed).
In a specific implementation, the anchor can enter a live room through the live application installed on the anchor terminal to perform a live broadcast, such as a talent show, a teaching session, or a game. A viewer who wants to watch the live broadcast can send a live viewing request to the anchor terminal through the first terminal; after receiving the live viewing request, the anchor terminal obtains the live video stream of the anchor and sends it to the first terminal. During the live broadcast, when the anchor wants to interact with the viewers, the anchor terminal obtains the interactive item initiated by the anchor, such as answering questions, playing a game, telling a story, or taking a call, merges the obtained interactive item into the live video stream, and sends the live video stream merged with the interactive item to the first terminal, thereby realizing interaction with the viewers watching the live broadcast. Meanwhile, the anchor terminal records the time correspondence between the interactive item initiated by the anchor and the live video stream. When a viewer wants to watch a playback of the live broadcast, the viewer can send a video playback request to the anchor terminal through the second terminal; after receiving the video playback request, the anchor terminal sends the live video stream to the second terminal, and during playback it can simulate, according to the recorded time correspondence, the anchor initiating the interactive item at the corresponding moment, obtain the live video stream merged with the interactive item, and send it to the second terminal, thereby realizing interaction with viewers watching the playback.
For example, if the anchor poses a question to the viewers at the third minute of the live broadcast and the viewers watching the live broadcast answer it, the anchor terminal records the time correspondence between the question and the live video stream. When a viewer later watches the playback and the playback reaches the third minute, the anchor terminal can simulate the anchor posing the question according to the record, so that the viewer watching the playback can answer it and thus has the same answering experience as the viewers who watched the live broadcast.
In the embodiments of the present invention, the terminal of a viewer watching the live broadcast and the terminal of a viewer watching the playback may be the same terminal (i.e., a viewer can watch both the live broadcast and the playback through the same terminal, and the viewer watching the live broadcast and the viewer watching the playback may be the same person or different people); of course, they may also be different terminals, which is not specifically limited herein.
In addition, the live broadcast playback system of the embodiment of the invention can also comprise a server, the server can be a background server of the live broadcast application program, and data interaction between the anchor terminal and the first terminal and data interaction between the anchor terminal and the second terminal can be realized through the server.
It should be noted that the scene schematic diagram of the live playback system shown in fig. 1 is only an example, and the live playback system and the scene described in the embodiment of the present invention are for more clearly illustrating the technical solution of the embodiment of the present invention, and do not form a limitation on the technical solution provided in the embodiment of the present invention.
The following are detailed below.
In this embodiment, the live playback method according to an embodiment of the present invention is described from the perspective of a live playback apparatus, which may be integrated in a terminal, namely the anchor terminal. As shown in fig. 2a, the live playback method of this embodiment includes the following steps:
step S201, receiving a live broadcast watching request sent by a first terminal.
In a specific implementation, the anchor can enter a live room through the live application installed on the anchor terminal to perform a live broadcast, such as a talent show, a teaching session, or a game; correspondingly, the anchor may be a performer, a teacher, a game player, and so on. When a viewer wants to watch the anchor's live broadcast, the viewer can enter the live room through the first terminal and send a live viewing request to the anchor terminal, and the anchor terminal receives the live viewing request sent by the first terminal. In this embodiment, the first terminal refers to the terminal of a viewer watching the live broadcast.
Step S202, sending the live video stream to the first terminal.
Specifically, the anchor terminal may obtain the live video stream of the anchor and send it, through the server, to the first terminal of the viewer who has entered the live room to watch the live broadcast.
Specifically, the anchor terminal may acquire the live video stream of the anchor in either of the following ways:
First, acquiring the video stream captured by a local camera of the anchor terminal and using it as the live video stream of the anchor. This is camera-based live broadcasting: the camera captures the anchor's real-time activity, and the captured scene serves as the live video stream. Live scenes to which this approach applies include, for example, teaching or lecture live broadcasts and talent-show live broadcasts, where the obtained live video stream is a lecture video stream or a talent-show video stream.
Second, acquiring the video stream of a designated screen area from the local video memory of the anchor terminal and using it as the live video stream of the anchor. This is screen-recording live broadcasting and applies to live scenes such as game live broadcasts, where the obtained live video stream is the video stream of the game the player is running.
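As a rough illustration of the two acquisition modes above, the following TypeScript sketch assumes a browser-based anchor client and uses the standard getUserMedia / getDisplayMedia Web APIs; the patent itself does not prescribe any particular API, so the function names are illustrative only.
```typescript
// Sketch only: acquiring a live video stream at the anchor terminal, assuming a
// browser-based anchor client. The function names are illustrative, not from the patent.

// Camera live broadcast: capture the anchor's real-time activity scene.
async function captureCameraStream(): Promise<MediaStream> {
  return navigator.mediaDevices.getUserMedia({ video: true, audio: true });
}

// Screen-recording live broadcast: capture a screen or window (e.g. a running game).
// Browsers let the user pick the surface; cropping to an exact region would need
// extra processing of the returned video track.
async function captureScreenStream(): Promise<MediaStream> {
  return navigator.mediaDevices.getDisplayMedia({ video: true });
}
```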
Step S203, acquiring the interactive item initiated by the anchor.
During the live broadcast, the anchor may interact with the viewers through certain items, and the anchor terminal obtains the interactive item initiated by the anchor. For example, the anchor may take calls with viewers, pose questions for viewers to answer, play games with viewers, tell stories, and so on, and the anchor terminal obtains the corresponding interactive item.
In a specific embodiment, the method for acquiring the interactive item initiated by the anchor may be as follows:
receiving an interaction request initiated by the anchor triggering an interaction control;
displaying an interactive item creation page in response to the interactive request;
and acquiring, from the interactive item creation page, the interactive item entered by the anchor in real time or the interactive item selected by the anchor from a material library.
For example, as shown in fig. 2b, an interaction control may be provided for the anchor in the live application, and the anchor may trigger the interaction control (e.g., "I want to interact" in fig. 2b) by finger click, voice input, or the like to initiate an interaction request; the anchor terminal displays an interactive item creation page in response to the interaction request. In this embodiment, the interactive item creation page may be displayed as a pop-up window, a list, a drop-down box, or the like. In a specific embodiment, the interactive item creation page may be as shown in fig. 2c and may provide the anchor with two ways of creating an interactive item: first, real-time entry; second, selection from a material library.
With real-time entry, the anchor enters the interactive item at the corresponding position on the page according to the prompts of the entry boxes, and the anchor terminal acquires the entered interactive item. With selection from a material library, the material library is established in advance and may be stored locally on the anchor terminal or remotely; the anchor can open the material library according to the page prompt and select the required interactive item.
Step S204, merging the interactive item into the live video stream to obtain the live video stream merged with the interactive item.
The specific merging method may include the following steps:
generating a web page layer, and displaying the interactive item in the web page layer;
and merging the web page layer into the live video stream to obtain the live video stream merged with the interactive item.
In this embodiment, the web page layer for displaying an interactive item may be generated using web technology: HyperText Markup Language (HTML) is used to construct the page elements (e.g., text, links, forms), Cascading Style Sheets (CSS) are used for page layout (e.g., typesetting style, font size), and JavaScript is used to control page interaction (e.g., button responses, mouse following, content input), thereby generating the web page layer.
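The following is a minimal sketch, in TypeScript, of generating such a web page layer for an interactive item. The InteractiveItem type, class names, and element structure are assumptions made for illustration; the patent only requires HTML for the page elements, CSS for the layout, and JavaScript for the interaction.
```typescript
// Sketch only: building a web page layer that displays an interactive item.
interface InteractiveItem {
  kind: "quiz" | "game" | "story";
  text: string; // question text, game rules, story content, ...
}

function buildItemLayer(item: InteractiveItem): HTMLElement {
  const layer = document.createElement("div"); // page element (HTML)
  layer.className = "interactive-layer";
  // page layout (CSS): a translucent panel positioned over the video
  layer.style.cssText =
    "position:absolute;top:0;left:0;padding:16px;background:rgba(0,0,0,0.6);color:#fff;";
  layer.innerHTML = `<p class="item-text"></p><button class="submit">Submit</button>`;
  layer.querySelector(".item-text")!.textContent = item.text;
  // page interaction (JavaScript): button response
  layer.querySelector(".submit")!.addEventListener("click", () => {
    console.log("viewer responded to", item.kind);
  });
  return layer;
}
```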
When web technology is used to generate the web page layer, the language used is the HTML of the World Wide Web Consortium (W3C) standard, and the generated web page layer can run on the network and in standard browsers, so the interactive item displayed in the web page layer can be presented across platforms. For example, the interactive item displayed on the web page layer can be shown on a terminal running iOS or Android; the terminal can display the web page layer directly in a browser, or call a web view (webview) inside the live application to display it, without having to deal with complex low-level adaptation or cross-platform development languages.
In addition, in this embodiment, when the interactive item contains a formula, for example when the interactive item is a mathematical question posed by the anchor, a formula renderer (for example, a LaTeX renderer) may be used to render the formula into a picture embedded in the web page layer, which optimizes the display of the formula: the color, overall size, and so on of the formula can be adjusted conveniently.
In a specific implementation, when the web page layer is merged into the live video stream, the size of the web page layer may be adjusted according to the display size of the live video stream's window. For example, the web page layer may be resized to match the window size of the live video stream and then aligned with it, so that the layer covers the whole display window, yielding the live video stream merged with the interactive item; alternatively, the web page layer may be resized to be smaller than the window size of the live video stream and then used to cover part of the display window, likewise yielding the live video stream merged with the interactive item. In a specific embodiment, the display effect of the web page layer carrying the interactive item merged with the live video stream may be as shown in fig. 2d.
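The sizing and alignment step can be sketched as follows; the element handles, the partial-coverage ratio, and the decision to append the layer above the video element are assumptions, not details fixed by the patent.
```typescript
// Sketch only: size the web page layer against the live video display window,
// covering either the whole window or part of it.
function overlayLayerOnVideo(
  layer: HTMLElement,
  videoWindow: HTMLElement,
  fullCover = true,
): void {
  const rect = videoWindow.getBoundingClientRect();
  const scale = fullCover ? 1 : 0.5;    // full-window or partial coverage
  layer.style.width = `${rect.width * scale}px`;
  layer.style.height = `${rect.height * scale}px`;
  layer.style.left = `${rect.left}px`;  // align with the video window
  layer.style.top = `${rect.top}px`;
  document.body.appendChild(layer);     // the layer sits above the video element
}
```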
Step S205, sending the live video stream merged with the interactive item to the first terminal.
The first terminal can present the live video stream merged with the interactive item to its user, i.e., a viewer watching the live broadcast, so that the viewer can interact with the anchor. For example, when the interactive item is question answering, the web page layer merged into the live video stream displays the question posed by the anchor, and viewers can answer it on the web page layer; when the interactive item is story telling, the web page layer displays the content of the anchor's story, and viewers can read it on the web page layer; when the interactive item is a game, the web page layer displays the game rules or game instructions, and viewers can view them on the web page layer.
In addition, after the live video stream merged with the interactive item is obtained, the anchor terminal can automatically delete or hide the web page layer in the live video stream to end the interaction and continue the live broadcast. For example, after obtaining the live video stream merged with the interactive item, the anchor terminal may start a timer; when the timed length reaches a preset duration, the anchor terminal deletes or hides the web page layer merged in the live video stream to obtain the anchor's live video stream again and continues pushing it to the first terminal, so that the viewers of the first terminal continue watching the anchor's live broadcast. In this embodiment, the preset duration is the interaction time reserved for the viewers; its value may be, for example, 30 seconds or 1 minute and can be set according to the actual requirements of the interactive item. For example, timing may start after the live video stream merged with the interactive item is obtained, and when 30 seconds have elapsed the anchor terminal automatically deletes or hides the web page layer. In a specific implementation, the timing may be performed as a countdown, which is not specifically limited herein.
Alternatively, after the live video stream merged with the interactive item is obtained, the anchor terminal may delete or hide the web page layer according to an operation of the anchor to end the interaction and continue the live broadcast. For example, a stop control (e.g., "stop interaction" in fig. 2b) may be provided for the anchor in the live application; after the live video stream merged with the interactive item is obtained, the anchor may trigger the stop control by finger click, voice input, or the like to issue a stop instruction, and the anchor terminal, on receiving the stop instruction, deletes or hides the web page layer in the live video stream, obtains the anchor's live video stream again, and continues pushing it to the first terminal, so that the viewers of the first terminal continue watching the anchor's live broadcast.
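Both ways of ending the interaction can be sketched together: an automatic timer with a preset duration, plus a manual stop hook for the anchor's stop control. The 30-second default and the function names are illustrative assumptions.
```typescript
// Sketch only: end the interaction automatically after a preset duration or
// manually when the anchor triggers the stop control.
function scheduleLayerRemoval(layer: HTMLElement, presetMs = 30_000): () => void {
  const timer = setTimeout(() => layer.remove(), presetMs); // automatic dismissal
  const stopNow = () => {                                   // manual "stop interaction"
    clearTimeout(timer);
    layer.remove();
  };
  return stopNow; // wire this to the anchor's stop control, e.g. a button click
}
```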
Step S206, recording the time correspondence between the interactive item and the live video stream.
Specifically, the initiation time of the interactive item relative to the live video stream may be recorded; in addition, the duration and the stop time of the interactive item relative to the live video stream may also be recorded.
In a specific embodiment, the recorded time correspondence may be, for example, as shown in Table 1:
Interactive item | Initiation time | Duration | Stop time
Question answering | 3 min 00 s | 30 s | 3 min 30 s
Story telling | 20 min 01 s | 1 min | 21 min 01 s
Game playing | 28 min 05 s | 15 s | 28 min 20 s
Table 1
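A minimal sketch of how the time correspondence of Table 1 might be represented on the anchor terminal follows; the field names and the use of seconds relative to the start of the stream are assumptions.
```typescript
// Sketch only: the recorded time correspondence between interactive items and the
// live video stream (initiation time, duration, stop time relative to the stream).
interface ItemTimeRecord {
  item: string;      // e.g. "question answering", "story telling", "game playing"
  startSec: number;  // initiation time, in seconds from the start of the stream
  durationSec: number;
  stopSec: number;
}

const timeCorrespondence: ItemTimeRecord[] = [
  { item: "question answering", startSec: 180,  durationSec: 30, stopSec: 210  },
  { item: "story telling",      startSec: 1201, durationSec: 60, stopSec: 1261 },
  { item: "game playing",       startSec: 1685, durationSec: 15, stopSec: 1700 },
];
```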
Step S207, receiving a video playback request sent by the second terminal.
The second terminal is the terminal of a viewer who wants to watch a playback of the live video. Such a viewer can send a video playback request to the anchor terminal through the second terminal, and the anchor terminal receives the video playback request sent by the second terminal.
Step S208, sending the live video stream to the second terminal.
Step S209, simulating, according to the time correspondence, the anchor initiating the interactive item at the corresponding moment to obtain the live video stream merged with the interactive item.
Step S210, sending the live video stream merged with the interactive item to the second terminal.
Specifically, the anchor terminal may, according to the recorded time correspondence, simulate the anchor initiating the corresponding interactive item at the corresponding moment, display the initiated interactive item in a web page layer, and finally merge the web page layer displaying the interactive item into the live video stream and send it to the second terminal.
For example, when the recorded time correspondence is as shown in Table 1, then during playback of the live video: when the playback reaches 3 min 00 s, the anchor terminal simulates initiating the question-answering item, displays the question in a web page layer, merges the web page layer displaying the question into the live video stream, and sends it to the second terminal, so that viewers at the second terminal can answer the question while watching the playback; when the playback reaches 20 min 01 s, the anchor terminal simulates initiating the story-telling item, displays the story content in the web page layer, and merges it into the live video stream sent to the second terminal, so that viewers at the second terminal can see the story the anchor told during the live broadcast; when the playback reaches 28 min 05 s, the anchor terminal simulates initiating the game item, displays the game rules or instructions in the web page layer, and merges it into the live video stream sent to the second terminal, so that viewers at the second terminal can take part in the game the anchor initiated.
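A sketch of the playback-side simulation described above: as the playback position passes each recorded initiation time the corresponding item is shown, and it is removed again at the recorded stop time. The record shape mirrors the earlier sketch, and showItem/hideItem stand in for building, merging, and removing the web page layer; all names are assumptions.
```typescript
// Sketch only: re-initiating recorded interactive items during playback.
type PlaybackItemRecord = { item: string; startSec: number; stopSec: number };

function replayInteractiveItems(
  video: HTMLVideoElement,
  records: PlaybackItemRecord[],
  showItem: (item: string) => void,
  hideItem: (item: string) => void,
): void {
  const shown = new Set<string>();
  video.addEventListener("timeupdate", () => {
    const t = video.currentTime; // current playback position in seconds
    for (const r of records) {
      if (t >= r.startSec && t < r.stopSec && !shown.has(r.item)) {
        shown.add(r.item);
        showItem(r.item);   // simulate the anchor initiating the item
      } else if (t >= r.stopSec && shown.has(r.item)) {
        shown.delete(r.item);
        hideItem(r.item);   // remove the item at its recorded stop time
      }
    }
  });
}
```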
Further, the anchor terminal may delete the interactive item merged into the live video stream according to the recorded time correspondence and continue sending the resulting live video stream to the second terminal, so that viewers at the second terminal can continue watching the playback of the live video.
In this embodiment, during a live broadcast in which the anchor initiates an interactive item and the viewers watching the live broadcast interact with it, the anchor terminal records the time correspondence between the interactive item and the live video stream. During playback, the anchor terminal can simulate the anchor initiating the interactive item at the corresponding moment according to the recorded time correspondence, so that viewers watching the playback can also experience the interactive item, have the same interactive experience as viewers watching the live broadcast, and the need for interaction during playback is met.
The following describes the method in a live teaching scene. Here the anchor is a teacher giving an online class, and the terminal the teacher uses for live broadcasting is called the teacher terminal. The viewers include students learning through the live broadcast and students learning by watching a playback of the live video; the terminal used by the former is called the first student terminal and the terminal used by the latter the second student terminal. As shown in fig. 3a, the live playback method in this scene includes:
step S301, receiving a live broadcast watching request sent by a first student terminal.
Step S302, sending the live video stream to a first student terminal.
For example, the teacher can enter a live room through the live application installed on the teacher terminal. After the teacher enters the live room, the teacher terminal turns on the camera, acquires the video stream of the teacher's lecture, and pushes this lecture video stream, as the live video stream, through the server to the terminals of the students who want to watch the live broadcast, i.e., the first student terminals.
Step S303, obtaining the question posed by the teacher.
In a specific implementation, an interaction control may be provided in the live application and triggered by finger click, voice input, or the like. When the teacher needs to pose a question, the teacher triggers the interaction control; the teacher terminal then receives the interaction request initiated by triggering the control, displays an interactive item creation page in response to the request, and obtains the question posed by the teacher from that page.
For example, during the live broadcast, after explaining a chapter, the teacher may want to know how well the students have mastered its content and can pose a question for the students to answer. The teacher may trigger the interaction control in the live application to initiate an interaction request, then enter a question in real time on the interactive item creation page or select a question from a material library, and the teacher terminal obtains the question posed by the teacher.
For real-time entry of questions, the interactive item creation page can provide the teacher with question creation templates for various question types, so that the teacher can select a template of the required type and input the question data into it to finish composing the question. The question types may include, but are not limited to, multiple-choice questions, fill-in-the-blank questions, short-answer questions, true-or-false questions, and matching questions. The teacher terminal obtains the question data entered in the template and thereby obtains the question posed by the teacher. Providing question creation templates standardizes question composition and improves its efficiency.
For selecting questions from a material library, the material library can be a question bank stored locally on the teacher terminal or remotely. The questions in the bank can be classified by type, such as multiple-choice, fill-in-the-blank, short-answer, true-or-false, and matching questions, and the teacher can select a question of the required type from the bank; the teacher terminal obtains the selected question and thereby the question posed by the teacher. Providing a question bank reduces the time spent composing questions and further improves efficiency.
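A minimal sketch of a question bank organized by question type, assuming the types listed above; the type and field names are illustrative, and the bank could equally live on a remote server, as the text notes.
```typescript
// Sketch only: a question bank classified by question type.
type QuestionType =
  | "multiple-choice"
  | "fill-in-the-blank"
  | "short-answer"
  | "true-or-false"
  | "matching";

interface Question {
  type: QuestionType;
  stem: string;
  options?: string[]; // only for multiple-choice / matching questions
  answer: string;
}

// The bank may be stored locally on the teacher terminal or on a remote server.
const questionBank: Record<QuestionType, Question[]> = {
  "multiple-choice":   [{ type: "multiple-choice", stem: "2 + 2 = ?", options: ["3", "4", "5"], answer: "4" }],
  "fill-in-the-blank": [],
  "short-answer":      [],
  "true-or-false":     [],
  "matching":          [],
};

function pickQuestion(type: QuestionType, index = 0): Question | undefined {
  return questionBank[type][index]; // the teacher selects a question of the required type
}
```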
Step S304, generating a web page layer and displaying the question in the web page layer.
In this embodiment, web technology may be used to generate the web page layer that displays the question. Using a web page layer to carry the question allows many question types to be carried, such as multiple-choice, fill-in-the-blank, short-answer, true-or-false, matching, and material-analysis questions, enriching the question types. In addition, a web page layer makes the presentation of the question more flexible: besides the traditional presentation of text and formulas, the question can be presented with pictures and HTML rich text (bold, underline, strikethrough).
In addition, in this embodiment, when the posed question contains a formula, a formula renderer (for example, a LaTeX renderer) may be used to render the formula into a picture embedded in the web page layer, which optimizes the display of the formula: the color, overall size, and so on of the formula can be adjusted conveniently.
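As a hedged example of rendering a formula into a picture for the web page layer, the sketch below uses MathJax 3 (whose tex2svg call returns SVG output) and converts that output to an image; the patent only mentions "a formula renderer (for example, a LaTeX renderer)", so the choice of MathJax and the helper name are assumptions.
```typescript
// Sketch only: render a LaTeX formula into an image embedded in the web page layer.
// Assumes the MathJax 3 script is loaded on the page.
declare const MathJax: { tex2svg(tex: string): HTMLElement };

function embedFormulaAsImage(layer: HTMLElement, tex: string, color = "#333", scale = 1.2): void {
  const svg = MathJax.tex2svg(tex).querySelector("svg")!;
  svg.style.color = color;            // MathJax SVG output inherits currentColor
  const img = document.createElement("img");
  img.src = "data:image/svg+xml;utf8," + encodeURIComponent(svg.outerHTML);
  img.style.transform = `scale(${scale})`; // adjust the overall size
  layer.appendChild(img);             // picture embedded in the web page layer
}
```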
Step S305, merging the web page layer into the live video stream to obtain the live video stream merged with the question.
In a specific embodiment, the rendering effect of merging the web page layer carrying the question into the live video stream may be as shown in fig. 3b.
Step S306, sending the live video stream merged with the question to the first student terminal.
The students at the first student terminal can answer the question merged into the live video stream, and the first student terminal feeds the students' answer information back to the teacher terminal. The teacher terminal can return a review result to the first student terminal in real time according to the answer information, and can further adjust the teaching strategy according to the students' answers.
For example, after obtaining a student's answer information, the teacher terminal may compare it with the correct answer to obtain a review result, feed the review result back to the first student terminal in real time, and use the review results to compute the answer accuracy for the question posed by the teacher. If the accuracy is low, the teacher is prompted to repeat or reinforce the course content related to the question in the live broadcast, and the teacher adjusts the teaching content according to the prompt.
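The review flow can be sketched as follows: compare each answer with the correct answer, return per-student results, and compute the class accuracy so the teacher can be prompted when it is low. The data shapes and the 60% threshold are illustrative assumptions.
```typescript
// Sketch only: reviewing answers on the teacher terminal.
interface AnswerRecord {
  studentId: string;
  answer: string;
}

function reviewAnswers(
  records: AnswerRecord[],
  correctAnswer: string,
  lowAccuracyThreshold = 0.6,
) {
  // Per-student review result, fed back to each first student terminal in real time.
  const results = records.map(r => ({
    studentId: r.studentId,
    correct: r.answer.trim() === correctAnswer.trim(),
  }));
  const accuracy =
    results.filter(r => r.correct).length / Math.max(results.length, 1);
  return {
    results,
    accuracy,
    // Suggest repeating or reinforcing the related course content when accuracy is low.
    promptTeacher: accuracy < lowAccuracyThreshold,
  };
}
```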
Step S307, deleting the web page layer from the live video stream and continuing to send the live video stream to the first student terminal.
After the live video stream merged with the question is obtained, the teacher terminal can automatically delete or hide the web page layer displaying the question to end the answering interaction and continue the live broadcast. For example, the teacher terminal may start a timer after obtaining the live video stream merged with the question; when the timed length reaches a preset duration, it deletes or hides the web page layer, obtains the live video stream again, and continues pushing it to the first student terminal to end the answering, so that the students continue watching the teacher's live lecture. In this embodiment, the preset duration is the answering time reserved for the students; its value may be, for example, 30 seconds or 1 minute and can be set according to actual needs. For example, as shown in fig. 3b, a 30-second countdown may start after the live video stream merged with the question is obtained, and when the countdown reaches 0 the teacher terminal automatically deletes or hides the web page layer.
Alternatively, after the live video stream merged with the question is obtained, the teacher terminal may delete or hide the web page layer according to an operation of the teacher to end the answering and continue the live broadcast. For example, a stop control may be provided for the teacher in the live application; the teacher may trigger it by finger click, voice input, or the like to issue a stop instruction, and the teacher terminal, on receiving the stop instruction, deletes or hides the web page layer, obtains the live video stream again, and pushes it to the first student terminal, so that the students continue watching the teacher's live lecture.
Step S308, recording the time correspondence between the questions and the live video stream.
Specifically, the initiation time of each question relative to the live video stream may be recorded, as well as its duration and stop time relative to the live video stream. In a specific embodiment, the recorded time correspondence may be as shown in Table 2 below:
Question | Initiation time | Duration | Stop time
Multiple-choice question | 10 min 00 s | 30 s | 10 min 30 s
Fill-in-the-blank question | 20 min 00 s | 30 s | 20 min 30 s
Short-answer question | 30 min 00 s | 1 min | 31 min 00 s
Table 2
Step S309, receiving a video playback request sent by the second student terminal.
The second student terminal is the terminal of a student who wants to watch a playback of the live video. Such a student can send a video playback request to the teacher terminal through the second student terminal, and the teacher terminal receives the request. For example, a student who missed the live session can send a video playback request through the second student terminal to request a playback of the live video.
Step S310, sending the live video stream to the second student terminal.
Step S311, simulating the teacher posing the questions according to the recorded time correspondence to obtain the live video stream merged with the questions.
Specifically, the teacher terminal may, according to the recorded time correspondence, simulate the teacher posing the corresponding question at the corresponding moment, display the question in a web page layer, and finally merge the web page layer displaying the question into the live video stream.
For example, when the recorded time correspondence is as shown in Table 2, then during playback of the live video: when the playback reaches 10 min 00 s, the teacher terminal simulates the teacher posing the multiple-choice question (the same question the teacher posed during the live broadcast), displays it in a web page layer, merges the layer into the live video stream, and sends it to the second student terminal, so that students watching the playback can answer the multiple-choice question; when the playback reaches 20 min 00 s, the teacher terminal simulates the teacher posing the fill-in-the-blank question (the same as during the live broadcast) and likewise merges the layer displaying it into the live video stream sent to the second student terminal, so that students watching the playback can answer it; when the playback reaches 30 min 00 s, the teacher terminal simulates the teacher posing the short-answer question (the same as during the live broadcast) and merges the layer displaying it into the live video stream sent to the second student terminal, so that students watching the playback can answer it.
Step S312, sending the live video stream merged with the question to the second student terminal.
The students at the second student terminal can also answer the question merged into the played-back live video stream, and the second student terminal can feed the answer information back to the teacher terminal. However, since this is a playback and the teacher is not necessarily online, the teacher terminal does not necessarily return the review result to the second student terminal in time.
Step S313, deleting the web page layer from the live video stream according to the recorded time correspondence and continuing to send the live video stream to the second student terminal.
For example, when the recorded time correspondence is as shown in Table 2, the teacher terminal simulates the teacher posing the multiple-choice question when the video is played back to 10 min 00 s and deletes it from the live video stream at 10 min 30 s; it simulates posing the fill-in-the-blank question at 20 min 00 s and deletes it at 20 min 30 s; and it simulates posing the short-answer question at 30 min 00 s and deletes it at 31 min 00 s. In this way, students watching the playback have the same answering experience as students who watched the live broadcast.
In this embodiment, the teacher terminal composites the questions into the live video stream in the form of web page layers, thereby providing an answering page to the student terminals. The server pushes this answering page to virtually all students watching the teacher's live broadcast at the same time, so the students see the question at the same moment; and because the question is presented as a web page layer, the answering processes of different students are isolated from one another, which ensures the independence of the answers. Question answering between teacher and students is thus realized during the live broadcast, improving the efficiency of teacher-student interaction.
Further, by recording the time correspondence between the questions posed by the teacher and the live video stream during the live broadcast, the teacher terminal can, during playback, simulate the teacher posing the questions at the corresponding moments according to the recorded correspondence, so that students watching the playback can also answer the questions the teacher posed during the live broadcast and have the same answering experience as students who watched it live.
In order to better implement the live playback method provided by the embodiments of the present invention, an embodiment of the present invention further provides a live playback apparatus. As shown in fig. 4, the live playback apparatus of this embodiment includes a first transceiving unit 401, a first obtaining unit 402, a merging unit 403, a recording unit 404, a second transceiving unit 405, and a simulation unit 406, as follows:
a first transceiving unit 401, configured to receive a live viewing request sent by a first terminal, and send a live video stream to the first terminal;
a first obtaining unit 402, configured to obtain an interactive item initiated by an anchor;
a merging unit 403, configured to merge the interactive item into the live video stream, and send the live video stream merged with the interactive item to the first terminal;
a recording unit 404, configured to record a time correspondence between the interactive item and the live video stream;
a second transceiving unit 405, configured to receive a video playback request sent by a second terminal, and send the live video stream to the second terminal;
a simulation unit 406, configured to simulate, at a corresponding time, the anchor to initiate the interactive item according to the time correspondence relationship, so as to obtain a live video stream merged with the interactive item;
the second transceiver unit 405 is further configured to send the live video stream combined with the interactive item to the second terminal.
In an embodiment, as shown in fig. 5, the first obtaining unit 402 includes:
a receiving subunit 4021, configured to receive an interaction request initiated by the anchor triggering an interaction control;
a display subunit 4022, configured to display an interaction item creation page in response to the interaction request;
the obtaining subunit 4023 is configured to obtain the interaction item entered by the anchor in real time from the interaction item creation page, or obtain the interaction item selected by the anchor from a material library from the interaction item creation page.
In an embodiment, as shown in fig. 5, the merging unit 403 includes:
the generating subunit 4031 is configured to generate a web page layer, and display the interactive item in the web page layer;
a merging subunit 4032, configured to merge the webpage layers in the live video stream to obtain a live video stream merged with the interactive item.
In an embodiment, the generating subunit 4031 is further configured to, when the interaction item includes a formula, render the formula as a picture by using a formula renderer and embed the picture in the webpage layer.
In one embodiment, as shown in fig. 5, the apparatus further comprises:
a timing unit 407, configured to start a timer to perform timing;
a first stopping unit 408, configured to delete or hide the interactive item in the live video stream incorporating the interactive item when the counted length reaches a preset length.
In one embodiment, as shown in fig. 5, the apparatus further comprises:
a receiving unit 409, configured to receive a stop request initiated by the anchor triggering a stop control;
a second stopping unit 410, configured to delete or hide the interactive item in the live video stream incorporating the interactive item.
In an embodiment, the recording unit 404 is specifically configured to record an initiation time, a duration and a stop time of the interactive item with respect to the live video stream.
In one embodiment, as shown in fig. 5, the apparatus further comprises:
a second obtaining unit 411, configured to obtain the video stream captured by a local camera of the anchor terminal and use it as the live video stream;
or to obtain the video stream of a designated screen area from the local video memory of the anchor terminal and use it as the live video stream.
It should be noted that, when the live playback apparatus provided in the foregoing embodiment implements live playback, the division of the functional modules is merely used as an example, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the functions described above. In addition, the live playback device and the live playback method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.
With the apparatus of this embodiment, when the anchor initiates an interactive item during the live broadcast to interact with the viewers watching live, the time correspondence between the initiated interactive item and the live video stream is recorded. During playback, the anchor's initiation of the interactive item can be simulated at the corresponding moment according to the recorded time correspondence, so that viewers watching the playback can also experience the interactive item and have the same interactive experience as the viewers who watched the live broadcast, which meets the need for interaction during playback.
An embodiment of the present invention further provides a terminal, that is, an anchor terminal, as shown in fig. 6, which shows a schematic structural diagram of the terminal according to the embodiment of the present invention, specifically:
the terminal may include components such as a processor 501 with one or more processing cores, a memory 502 with one or more computer-readable storage media, Radio Frequency (RF) circuitry 503, a power supply 504, an input unit 505, and a display unit 506. Those skilled in the art will appreciate that the terminal structure shown in fig. 6 does not constitute a limitation of the terminal, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components. Wherein:
the processor 501 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by running or executing software programs and/or modules stored in the memory 502 and calling data stored in the memory 502, thereby performing overall monitoring of the terminal. Optionally, processor 501 may include one or more processing cores; preferably, the processor 501 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 501.
The memory 502 may be used to store software programs and modules, and the processor 501 executes various functional applications and data processing by running the software programs and modules stored in the memory 502. The memory 502 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the terminal, and the like. Further, the memory 502 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory 502 may also include a memory controller to provide the processor 501 with access to the memory 502.
The RF circuit 503 may be used to receive and transmit signals during information transmission and reception; in particular, it receives downlink information from a base station and delivers it to the one or more processors 501 for processing, and transmits uplink data to the base station. In general, the RF circuitry 503 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 503 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The terminal also includes a power supply 504 (e.g., a battery) for powering the various components. Preferably, the power supply 504 is logically coupled to the processor 501 via a power management system, so that functions such as charging, discharging, and power consumption management are handled by the power management system. The power supply 504 may further include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
The terminal may further include an input unit 505, and the input unit 505 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
The terminal may further include a display unit 506, and the display unit 506 may be used to display information input by the user or information provided to the user, as well as various graphical user interfaces of the terminal, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 506 may include a display panel, and optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. Further, a touch-sensitive surface may overlay the display panel; when a touch operation is detected on or near the touch-sensitive surface, it is transmitted to the processor 501 to determine the type of the touch event, and the processor 501 then provides a corresponding visual output on the display panel according to the type of the touch event. Although in fig. 6 the touch-sensitive surface and the display panel are shown as two separate components to implement input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement input and output functions.
Although not shown, the terminal may further include a camera, a bluetooth module, and the like, which will not be described herein. Specifically, in this embodiment, the processor 501 in the terminal loads the executable file corresponding to the process of one or more application programs into the memory 502 according to the following instructions, and the processor 501 runs the application programs stored in the memory 502, so as to implement various functions as follows:
receiving a live broadcast watching request sent by a first terminal, and sending a live broadcast video stream to the first terminal;
acquiring an interactive item initiated by the anchor;
merging the interactive item into the live video stream, and sending the live video stream merged with the interactive item to the first terminal;
recording the time correspondence between the interactive item and the live video stream;
receiving a video playback request sent by a second terminal, and sending the live video stream to the second terminal;
and simulating, according to the time correspondence, the anchor initiating the interactive item at the corresponding moment to obtain the live video stream combined with the interactive item, and sending the live video stream combined with the interactive item to the second terminal.
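Purely as an orchestration sketch of the six steps listed above, and not the patented implementation, the anchor-terminal logic might be wired together as follows; all names and callback signatures are assumptions for the example.

```python
from typing import Callable, List, Tuple

class LivePlaybackFlow:
    """Illustrative orchestration of the live, merge, record and playback steps."""

    def __init__(self,
                 send_to_terminal: Callable[[str, bytes], None],
                 merge_item: Callable[[bytes, dict], bytes],
                 stream_offset_s: Callable[[], float]):
        self.send_to_terminal = send_to_terminal      # pushes stream data to a terminal id
        self.merge_item = merge_item                  # e.g. webpage-layer compositing
        self.stream_offset_s = stream_offset_s        # current offset into the live stream
        self.recorded: List[Tuple[float, dict]] = []  # (offset, item) time correspondence

    def on_live_watch_request(self, first_terminal: str, live_chunk: bytes) -> None:
        self.send_to_terminal(first_terminal, live_chunk)

    def on_item_initiated(self, first_terminal: str, live_chunk: bytes, item: dict) -> None:
        self.recorded.append((self.stream_offset_s(), item))     # record the correspondence
        self.send_to_terminal(first_terminal, self.merge_item(live_chunk, item))

    def on_playback_request(self, second_terminal: str, recorded_chunks: List[bytes]) -> None:
        for chunk in recorded_chunks:                             # send the recorded stream
            self.send_to_terminal(second_terminal, chunk)
        # a scheduler such as the one sketched earlier would replay self.recorded
        # against the playback clock so the items reappear at the recorded offsets
```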
In some embodiments, when acquiring an interactive item initiated by the anchor, the processor 501 is specifically configured to perform the following steps:
receiving an interaction request initiated by the anchor triggering an interaction control;
displaying an interactive item creation page in response to the interaction request;
and acquiring, from the interactive item creation page, the interactive item entered by the anchor in real time, or acquiring, from the interactive item creation page, the interactive item selected by the anchor from a material library.
In some embodiments, when the interactive item is incorporated in the live video stream, the processor 501 is specifically configured to perform the following steps:
generating a webpage layer, and displaying the interactive item in the webpage layer;
and merging the webpage layer into the live video stream to obtain the live video stream merged with the interactive item.
In some embodiments, processor 501 is further configured to perform the following steps:
and when the interactive item includes a formula, rendering the formula as a picture by using a formula renderer and embedding the picture into the webpage layer.
In some embodiments, after obtaining the live video stream incorporating the interactive item, the processor 501 is further configured to perform the following steps:
starting a timer to time;
and when the timed duration reaches a preset duration, deleting or hiding the interactive item in the live video stream combined with the interactive item.
In some embodiments, after obtaining the live video stream incorporating the interactive item, the processor 501 is further configured to perform the following steps:
receiving a stop request initiated by the anchor triggering a stop control;
deleting or hiding the interactive item in the live video stream incorporating the interactive item.
In some embodiments, when recording the time correspondence between the interactive item and the live video stream, the processor 501 is specifically configured to execute the following steps:
recording the initiation time, duration and stop time of the interactive item relative to the live video stream.
In some embodiments, before sending the live video stream to the first terminal, the processor 501 is further configured to perform the following steps:
acquiring a video stream captured by a local camera of the anchor end, and taking the acquired video stream as the live video stream;
or acquiring a video stream of a specified screen area from the local video memory of the anchor end, and taking the acquired video stream as the live video stream.
With the terminal of this embodiment, when the anchor initiates an interactive item during the live broadcast to interact with the viewers watching live, the time correspondence between the initiated interactive item and the live video stream is recorded. During playback, the anchor's initiation of the interactive item can be simulated at the corresponding moment according to the recorded time correspondence, so that viewers watching the playback can also experience the interactive item and have the same interactive experience as the viewers who watched the live broadcast, which meets the need for interaction during playback.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present invention provide a storage medium, in which a plurality of instructions are stored, where the instructions can be loaded by a processor to execute steps in any live playback method provided by embodiments of the present invention. For example, the instructions may perform the steps of:
receiving a live broadcast watching request sent by a first terminal, and sending a live video stream to the first terminal; acquiring an interactive item initiated by the anchor; merging the interactive item into the live video stream, and sending the live video stream merged with the interactive item to the first terminal; recording the time correspondence between the interactive item and the live video stream; receiving a video playback request sent by a second terminal, and sending the live video stream to the second terminal; and simulating, according to the time correspondence, the anchor initiating the interactive item at the corresponding moment to obtain the live video stream combined with the interactive item, and sending the live video stream combined with the interactive item to the second terminal.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any live playback method provided in the embodiments of the present invention, beneficial effects that can be achieved by any live playback method provided in the embodiments of the present invention can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.

Claims (13)

1. A live playback method, comprising:
receiving a live broadcast watching request sent by a first terminal, and sending a live broadcast video stream to the first terminal;
acquiring an interactive item initiated by an anchor;
generating a webpage layer, and displaying the interactive item in the webpage layer;
merging the webpage layer into the live video stream according to the window display size of the live video stream to obtain the live video stream merged with the interactive item, and sending the live video stream merged with the interactive item to the first terminal;
recording the time correspondence between the interactive item and the live video stream;
receiving a video playback request sent by a second terminal, and sending the live video stream to the second terminal;
and simulating, according to the time correspondence, the anchor initiating the interactive item at the corresponding moment to obtain the live video stream combined with the interactive item, and sending the live video stream combined with the interactive item to the second terminal.
2. The live playback method of claim 1, wherein the obtaining of the anchor-initiated interactive item comprises:
receiving an interaction request initiated by the anchor triggering an interaction control;
displaying an interactive item creation page in response to the interaction request;
and acquiring, from the interactive item creation page, the interactive item entered by the anchor in real time, or acquiring, from the interactive item creation page, the interactive item selected by the anchor from a material library.
3. The live playback method according to claim 1, wherein when a formula is included in the interactive item, the formula is rendered as a picture by using a formula renderer, and the picture is embedded in the webpage layer.
4. The live playback method of claim 1, further comprising, after obtaining the live video stream incorporating the interactive item:
starting a timer to time;
and when the timed duration reaches a preset duration, deleting or hiding the interactive item in the live video stream combined with the interactive item.
5. The live playback method of claim 1, further comprising, after obtaining the live video stream incorporating the interactive item:
receiving a stop request initiated by the anchor triggering a stop control;
deleting or hiding the interactive item in the live video stream incorporating the interactive item.
6. The live playback method according to claim 4 or 5, wherein the recording of the time correspondence between the interactive item and the live video stream comprises:
recording the initiation time, duration and stop time of the interactive item relative to the live video stream.
7. The live playback method of claim 1, further comprising, prior to sending the live video stream to the first terminal:
acquiring a video stream captured by a local camera of the anchor end, and taking the acquired video stream as the live video stream;
or acquiring a video stream of a specified screen area from the local video memory of the anchor end, and taking the acquired video stream as the live video stream.
8. A live playback apparatus, comprising:
the first receiving and sending unit is used for receiving a live broadcast watching request sent by a first terminal and sending a live broadcast video stream to the first terminal;
the first acquisition unit is used for acquiring an interactive item initiated by an anchor;
the generating subunit is used for generating a webpage layer and displaying the interactive item in the webpage layer;
the merging subunit is configured to merge the webpage layer into the live video stream to obtain a live video stream merged with the interactive item, and send the live video stream merged with the interactive item to the first terminal;
the recording unit is used for recording the time correspondence between the interactive item and the live video stream;
the second transceiving unit is used for receiving a video playback request sent by a second terminal and sending the live video stream to the second terminal;
the simulation unit is used for simulating, according to the time correspondence, the anchor initiating the interactive item at the corresponding moment to obtain a live video stream combined with the interactive item;
the second transceiving unit is further configured to send the live video stream combined with the interactive item to the second terminal.
9. The live playback apparatus according to claim 8, wherein the first acquisition unit includes:
the receiving subunit is used for receiving an interaction request initiated by the anchor triggering the interaction control;
the display subunit is used for responding to the interaction request and displaying an interaction item creation page;
and the acquisition subunit is used for acquiring, from the interactive item creation page, the interactive item entered by the anchor in real time, or acquiring, from the interactive item creation page, the interactive item selected by the anchor from a material library.
10. The live playback device of claim 8,
the generating subunit is further configured to render the formula as a picture by using a formula renderer when the interactive item includes the formula, and embed the picture in the webpage layer.
11. The live playback device of claim 8,
the recording unit is specifically configured to record an initiation time, a duration, and a stop time of the interactive item with respect to the live video stream.
12. A terminal comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 7.
13. A storage medium for storing a plurality of instructions adapted to be loaded by a processor and to perform the steps of the method according to any one of claims 1 to 7.
CN201811428963.6A 2018-11-27 2018-11-27 Live broadcast playback method, device, terminal and storage medium Active CN111225225B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811428963.6A CN111225225B (en) 2018-11-27 2018-11-27 Live broadcast playback method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111225225A CN111225225A (en) 2020-06-02
CN111225225B true CN111225225B (en) 2021-08-31

Family

ID=70828929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811428963.6A Active CN111225225B (en) 2018-11-27 2018-11-27 Live broadcast playback method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111225225B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112216311B (en) * 2020-09-03 2022-06-24 杭州壹百分教育科技有限公司 Online classroom reappearing method, system, equipment and medium
CN112218127B (en) * 2020-10-16 2022-03-01 广州方硅信息技术有限公司 Virtual live broadcast method, device, equipment and storage medium
CN112836469A (en) * 2021-01-27 2021-05-25 北京百家科技集团有限公司 Information rendering method and device
CN113422976B (en) * 2021-06-22 2022-09-20 读书郎教育科技有限公司 System and method for realizing online course learning competition
CN113596489B (en) * 2021-07-05 2023-07-04 咪咕互动娱乐有限公司 Live broadcast teaching method, device, equipment and computer readable storage medium
CN113596492A (en) * 2021-07-26 2021-11-02 上海哔哩哔哩科技有限公司 Gift display method and system in network live broadcast
CN114339354A (en) * 2021-12-31 2022-04-12 广州趣丸网络科技有限公司 Live broadcast rebroadcasting method, device, equipment and readable storage medium
CN114554234A (en) * 2022-01-18 2022-05-27 阿里巴巴(中国)有限公司 Method, device, storage medium, processor and system for generating live broadcast record

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105763824A (en) * 2007-11-01 2016-07-13 斯灵媒体公司 Method Used For Recording Media Programs In Remote Device And System Used For Recording Media Program To Play Back
CN106341695A (en) * 2016-08-31 2017-01-18 腾讯数码(天津)有限公司 Interaction method, device and system of live streaming room
CN106406998A (en) * 2016-09-28 2017-02-15 北京奇虎科技有限公司 Method and device for processing user interface
CN106488301A (en) * 2015-08-25 2017-03-08 北京新唐思创教育科技有限公司 A kind of record screen method and apparatus and video broadcasting method and device
CN106888398A (en) * 2016-12-31 2017-06-23 天脉聚源(北京)科技有限公司 The interactive method and apparatus of guess
CN106911967A (en) * 2017-02-27 2017-06-30 北京小米移动软件有限公司 Direct playing and playback method and device
CA2990456A1 (en) * 2016-12-29 2018-06-29 Dressbot Inc. System and method for multi-user digital interactive experience

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104065982B (en) * 2014-06-19 2015-12-30 腾讯科技(深圳)有限公司 The method and apparatus of live streaming media

Similar Documents

Publication Publication Date Title
CN111225225B (en) Live broadcast playback method, device, terminal and storage medium
US11450350B2 (en) Video recording method and apparatus, video playing method and apparatus, device, and storage medium
US11151889B2 (en) Video presentation, digital compositing, and streaming techniques implemented via a computer network
CN110570698B (en) Online teaching control method and device, storage medium and terminal
CN106227335B (en) Interactive learning method for preview lecture and video course and application learning client
US20160203645A1 (en) System and method for delivering augmented reality to printed books
US9861895B2 (en) Apparatus and methods for multimedia games
US20090263777A1 (en) Immersive interactive environment for asynchronous learning and entertainment
CN112601100A (en) Live broadcast interaction method, device, equipment and medium
US11320895B2 (en) Method and apparatus to compose a story for a user depending on an attribute of the user
CN112437353A (en) Video processing method, video processing apparatus, electronic device, and readable storage medium
CN112672219B (en) Comment information interaction method and device and electronic equipment
CN106921724A (en) Game promotion content processing method and device
CN111462561A (en) Cloud computing-based dual-teacher classroom management method and platform
WO2023134419A1 (en) Information interaction method and apparatus, and device and storage medium
CN114205635A (en) Live comment display method, device, equipment, program product and medium
JP2022080645A (en) Program, information processing method, information processing device and system
CN106886349A (en) The interactive method and apparatus of answer
CN112218130A (en) Control method and device for interactive video, storage medium and terminal
US20120063743A1 (en) System and method for remote presentation provision
KR102036639B1 (en) Mobile terminal of playing moving picture lecture and method of displaying related moving picture
CN114422843B (en) video color egg playing method and device, electronic equipment and medium
JP7212721B2 (en) Program, information processing method, information processing apparatus, and system
CN114846808A (en) Content distribution system, content distribution method, and content distribution program
CN114253501A (en) Information processing method, information processing apparatus, and computer-readable storage medium

Legal Events

Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40025266; Country of ref document: HK)
GR01 Patent grant