CN106911967B - Live broadcast playback method and device - Google Patents

Live broadcast playback method and device

Info

Publication number
CN106911967B
CN106911967B
Authority
CN
China
Prior art keywords
video stream
information
terminal device
live
terminal equipment
Prior art date
Legal status
Active
Application number
CN201710109418.XA
Other languages
Chinese (zh)
Other versions
CN106911967A (en)
Inventor
梁秋实
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201710109418.XA priority Critical patent/CN106911967B/en
Publication of CN106911967A publication Critical patent/CN106911967A/en
Application granted granted Critical
Publication of CN106911967B publication Critical patent/CN106911967B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/75 Media network packet handling
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The disclosure relates to a live broadcast playback method and device. The method includes: a first terminal device generates a live video stream; the first terminal device sends the live video stream to at least one other terminal device; the first terminal device determines tag information corresponding to the live video stream at different times; the first terminal device establishes a correspondence between the live video stream and the tag information; the first terminal device receives a playback request message sent by a second terminal device; and the first terminal device sends the video stream carrying the tag information to the second terminal device according to the playback request message. User experience is thereby improved.

Description

Live broadcast playback method and device
Technical Field
The present disclosure relates to video processing methods, and in particular, to a live playback method and apparatus.
Background
Live video broadcasting is currently very popular on the Internet. A broadcaster can interact with the audience in real time during a live broadcast, and viewers can send gifts, likes (praise), barrage comments, and the like to the broadcaster.
However, some viewers may not have time to watch the live broadcast, or may want to watch the video again afterwards. In that case they need to watch a playback of the video. In the related art, the viewer can only watch the playback from the beginning, either at normal speed or by fast-forwarding.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a live playback method and apparatus. The technical solution is as follows:
according to a first aspect of the embodiments of the present disclosure, there is provided a live playback method, including:
a first terminal device generates a live video stream;
the first terminal device sends the live video stream to at least one other terminal device;
the first terminal device determines tag information corresponding to the live video stream at different times according to interaction information from the at least one other terminal device;
the first terminal device establishes a correspondence between the live video stream and the tag information;
the first terminal device receives a playback request message sent by a second terminal device;
and the first terminal device sends the video stream carrying the tag information to the second terminal device according to the playback request message.
The technical solution provided by the embodiments of the present disclosure can have the following beneficial effects: with this method, a user watching the playback of the video stream can directly see the tag information corresponding to different times, and can therefore follow the interaction between the anchor and the audience during the live broadcast. User experience is thereby improved.
Optionally, the method further includes: the first terminal device determines profile information of the live video stream according to the tag information of the live video stream at different times; the video stream sent by the first terminal device to the second terminal device also carries the profile information. User experience is thereby improved.
Optionally, the tag information corresponding to any given time includes at least one of the following, counted between that time and the preceding time: the number of newly added followers, the number of newly added praises, the number of newly added gift senders, the number of newly added barrage messages, and newly added key content information.
Optionally, the method further includes: the first terminal device obtains keywords of the video segment between any given time and the preceding time, and/or expression information in the video segment; and the first terminal device determines the newly added key content information according to the keywords of the video segment and/or the expression information in the video segment.
The newly added key content information can be determined in this way; the terminal can use the newly added key content information at a given time as the tag information for that time and mark it at the corresponding time of the video. The tag information is displayed when the video is played back. User experience is thereby improved.
A live playback method on the viewer side is described below. It corresponds to the method above, has the same technical effects for the corresponding content, and those details are not repeated here.
According to a second aspect of the embodiments of the present disclosure, there is provided a live playback method, including:
the second terminal device sends a playback request message to the first terminal device;
the second terminal device receives a video stream carrying tag information sent by the first terminal device, the tag information being determined by the first terminal device according to interaction information from at least one other terminal device regarding the live video stream sent by the first terminal device;
and the second terminal device plays back the video stream carrying the tag information.
Optionally, the playing back, by the second terminal device, the video stream carrying the tag information includes:
the second terminal device acquires a selection operation performed by the user according to the tag information, where the selection operation is used to select a jump position on a progress bar of the video stream;
the second terminal device jumps to the jump position according to the selection operation and plays back the video stream from the jump position.
Optionally, the video stream carrying the tag information further includes: profile information of the video stream; the profile information is information determined by the first terminal device according to the tag information of the live video stream at different times.
Optionally, the tag information corresponding to any given time includes at least one of the following, counted between that time and the preceding time: the number of newly added followers, the number of newly added praises, the number of newly added gift senders, the number of newly added barrage messages, and newly added key content information.
A live playback apparatus is described below. The apparatus corresponds to the method provided in the first aspect and its optional implementations, has the same technical effects for the corresponding content, and those details are not repeated here.
According to a third aspect of the embodiments of the present disclosure, there is provided a live playback apparatus including:
a generation module configured to generate a live video stream;
the first sending module is configured to send the live video stream to at least one other terminal device;
the first determining module is configured to determine tag information corresponding to the live video stream at different moments according to the interaction information of at least one other terminal device;
an establishing module configured to establish a correspondence between the live video stream and the tag information;
a receiving module configured to receive a playback request message sent by a second terminal device;
and a second sending module configured to send the video stream carrying the tag information to the second terminal device according to the playback request message.
Optionally, the apparatus further includes: a second determining module configured to determine profile information of the live video stream according to the tag information of the live video stream at different times;
the video stream sent by the apparatus to the second terminal device also carries the profile information.
Optionally, the tag information corresponding to any given time includes at least one of the following, counted between that time and the preceding time: the number of newly added followers, the number of newly added praises, the number of newly added gift senders, the number of newly added barrage messages, and newly added key content information.
Optionally, the apparatus further includes: an acquiring module configured to acquire keywords of the video segment between any given time and the preceding time, and/or expression information in the video segment;
and a third determining module configured to determine the newly added key content information according to the keywords of the video segment and/or the expression information in the video segment.
A live playback apparatus is described below. The apparatus corresponds to the method provided in the second aspect and its optional implementations, has the same technical effects for the corresponding content, and those details are not repeated here.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a live playback apparatus including:
a transmission module configured to transmit a playback request message to the first terminal device;
a receiving module configured to receive a video stream carrying tag information sent by a first terminal device, the tag information being determined by the first terminal device according to interaction information from at least one other terminal device regarding the live video stream sent by the first terminal device;
and the playback module is configured to play back the video stream carrying the label information.
Optionally, the playback module includes: an acquiring sub-module configured to acquire a selection operation performed by a user according to the tag information, where the selection operation is used to select a jump position on a progress bar of the video stream;
and the playback sub-module is configured to jump to the jump position according to the selection operation and play back the video stream from the jump position.
Optionally, the video stream carrying the tag information further includes: profile information of the video stream; the profile information is information determined by the first terminal device according to the tag information of the live video stream at different times.
Optionally, the tag information corresponding to any given time includes at least one of the following, counted between that time and the preceding time: the number of newly added followers, the number of newly added praises, the number of newly added gift senders, the number of newly added barrage messages, and newly added key content information.
According to a fifth aspect of embodiments of the present disclosure, there is provided a live playback apparatus, the apparatus including:
a processor, a transmitter and a receiver;
a memory for storing executable instructions of the processor;
a processor configured to generate a live video stream;
a transmitter configured to transmit a live video stream to at least one other terminal device;
the processor is further configured to determine, according to interaction information from the at least one other terminal device, tag information corresponding to the live video stream at different times;
the processor is further configured to establish a correspondence between the live video stream and the tag information;
a receiver configured to receive a playback request message sent by a second terminal device;
and the transmitter is further configured to send the video stream carrying the tag information to the second terminal device according to the playback request message.
According to a sixth aspect of embodiments of the present disclosure, there is provided a live playback apparatus, the apparatus including:
a processor, a transmitter and a receiver;
a memory for storing executable instructions of the processor;
a transmitter configured to transmit a playback request message to the first terminal device;
the receiver is configured to receive a video stream carrying tag information sent by the first terminal device, the tag information being determined by the first terminal device according to interaction information from at least one other terminal device regarding the live video stream sent by the first terminal device;
a processor configured to play back a video stream carrying tag information.
The technical solution provided by the embodiments of the present disclosure can have the following beneficial effects. The present disclosure provides a live playback method and apparatus, the method including: a first terminal device generates a live video stream; the first terminal device sends the live video stream to at least one other terminal device; the first terminal device determines tag information corresponding to the live video stream at different times; the first terminal device establishes a correspondence between the live video stream and the tag information; the first terminal device receives a playback request message sent by a second terminal device; and the first terminal device sends the video stream carrying the tag information to the second terminal device according to the playback request message. With this method, a user watching the playback of the video stream can directly see the tag information corresponding to different times, and can therefore follow the interaction between the anchor and the audience during the live broadcast. User experience is thereby improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is an interactive flow diagram illustrating a method of live playback in accordance with an exemplary embodiment;
FIG. 1A is a schematic illustration of an interface in practice of an exemplary embodiment;
FIG. 2 is an interactive flow diagram illustrating a method of live playback in accordance with another exemplary embodiment;
FIG. 2A is a schematic illustration of an interface in an exemplary embodiment when implemented;
FIG. 3 is a flow diagram illustrating a method of live playback in accordance with yet another illustrative embodiment;
FIG. 4 is a block diagram illustrating a live playback device in accordance with an exemplary embodiment;
FIG. 5 is a block diagram of a live playback device shown in accordance with another exemplary embodiment;
FIG. 6 is a block diagram illustrating a live playback device in accordance with yet another exemplary embodiment;
FIG. 7 is a block diagram illustrating a live playback device in accordance with an exemplary embodiment;
FIG. 8 is a block diagram of a live playback device shown in accordance with another exemplary embodiment;
FIG. 9 is a block diagram illustrating a live playback device in accordance with an exemplary embodiment;
FIG. 10 is a block diagram of a live playback device shown in accordance with another exemplary embodiment;
fig. 11 is a block diagram illustrating a live playback device 1100 in accordance with an exemplary embodiment;
fig. 12 is a block diagram illustrating a live playback device 1200 according to an example embodiment.
With the foregoing drawings in mind, certain embodiments of the disclosure have been shown and described in more detail below. These drawings and written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the concepts of the disclosure to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is an interactive flowchart illustrating a live playback method according to an exemplary embodiment, which is illustrated in the present embodiment as being applied to a terminal including a display screen. The live playback method can comprise the following steps:
in step S101: a first terminal device generates a live video stream;
specifically, the first terminal device may send an authentication request message to the server. The authentication request message may carry the user's account information and password information. After receiving the authentication request message, the server authenticates the user identity according to the account information and the password information. If the authentication succeeds, the server sends an authentication response message to the first terminal device; the authentication response message grants the user permission to broadcast video live. After obtaining this permission, the user can record video and send the video stream to the server.
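As a rough illustration of this exchange (the endpoint URL, the JSON field names, and the request_live_permission helper below are assumptions made for the sketch, not details given in the disclosure), the authentication request could be issued as follows:

```python
import json
import urllib.request

SERVER_URL = "https://example-live-server.invalid/auth"  # placeholder endpoint, not from the disclosure


def request_live_permission(account: str, password: str) -> bool:
    """Send an authentication request carrying account and password information.

    Returns True if the server's authentication response grants live-broadcast permission.
    """
    payload = json.dumps({"account": account, "password": password}).encode("utf-8")
    req = urllib.request.Request(
        SERVER_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    # Hypothetical response field: whether live-broadcast permission was granted.
    return bool(body.get("live_permission_granted", False))
```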
In step S102: the first terminal equipment sends the live video stream to at least one other terminal equipment;
namely, the first terminal device sends the live video stream to at least one other terminal device through the server. These other terminal devices must satisfy two conditions: first, they have installed the application software corresponding to the live video stream; second, they have sent a message to the server requesting the live video stream. In other words, a terminal device sends such a request message to the server when its user taps to watch the live broadcast.
In step S103: the first terminal equipment determines corresponding label information of the live video stream at different moments according to the interactive information of at least one other terminal equipment;
the interaction information of the at least one other terminal device may include at least one of the following actions performed by those devices on the live video stream: praising (liking), following, sending gifts, and sending barrage messages. The tag information corresponding to any given time includes at least one of the following: the number of followers newly added between that time and the preceding time, the number of praises newly added between that time and the preceding time, the number of gift senders newly added between that time and the preceding time, the number of barrage messages newly added between that time and the preceding time, and the key content information newly added between that time and the preceding time. The key content information may be determined according to keywords in the video segment between the given time and the preceding time and/or expression information in the video segment. The preceding time is the time that lies a preset interval before the given time. For example, if the preset interval is 1 minute and the given time is 10:00, the preceding time is 9:59.
For example, the tag information corresponding to a first time may include: 1 million followers newly added between the preceding time and the first time, 5000 praises newly added between the preceding time and the first time, 4000 gift senders newly added between the preceding time and the first time, 8000 barrage messages newly added between the preceding time and the first time, and key content information newly added between the preceding time and the first time such as "user A smiles" and "dragon gifts are sent".
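A minimal sketch of how such per-interval tag information could be accumulated from interaction events is shown below; the event tuple format, the one-minute preset interval, and the TagInfo field names are assumptions of the sketch rather than anything specified in the disclosure.

```python
from dataclasses import dataclass, field
from typing import List

INTERVAL_SECONDS = 60  # assumed preset interval between consecutive "moments"


@dataclass
class TagInfo:
    new_followers: int = 0
    new_praises: int = 0          # likes
    new_gift_senders: int = 0
    new_barrages: int = 0
    key_content: List[str] = field(default_factory=list)


def build_tag_info(events, interval_start: float, interval_end: float) -> TagInfo:
    """Aggregate interaction events falling in (interval_start, interval_end]."""
    tag = TagInfo()
    for ts, kind, detail in events:          # e.g. (timestamp, "praise", None)
        if not (interval_start < ts <= interval_end):
            continue
        if kind == "follow":
            tag.new_followers += 1
        elif kind == "praise":
            tag.new_praises += 1
        elif kind == "gift":
            tag.new_gift_senders += 1
        elif kind == "barrage":
            tag.new_barrages += 1
        elif kind == "key_content":
            tag.key_content.append(detail)   # e.g. "user A smiles"
    return tag
```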
In step S104: the method comprises the steps that a first terminal device establishes a corresponding relation between a live video stream and tag information;
and the first terminal device establishes the correspondence between the live video stream and the tag information according to the time information of the live video stream and the time information of the tag information.
For example, the correspondence between the live video stream and the tag information established by the first terminal device is as follows:
[Table mapping times of the live video stream to the corresponding tag information; not reproduced in the text.]
song skewering, smiling to show mouth skin, dancing to show flowers, and interacting with fans can be understood as the key content information.
After establishing the correspondence between the live video stream and the tag information, the first terminal device may store the live video stream together with the tag information corresponding to it at different times according to the correspondence. The first terminal device may store the correspondence in the form of a table.
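One possible way to hold this correspondence in memory, reusing the hypothetical TagInfo structure from the earlier sketch, is a small table keyed by the time offset of each moment in the stream; the class and method names below are invented for illustration only.

```python
from typing import Dict


class TaggedLiveStream:
    """Associates a recorded live video stream with tag information per moment."""

    def __init__(self, video_path: str):
        self.video_path = video_path
        # time offset in seconds from the start of the stream -> TagInfo
        self.tags: Dict[int, "TagInfo"] = {}

    def add_tag(self, offset_seconds: int, tag: "TagInfo") -> None:
        self.tags[offset_seconds] = tag

    def as_table(self):
        """Return (offset, tag) rows sorted by time, e.g. for persisting as a table."""
        return sorted(self.tags.items())
```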
In step S105: the first terminal equipment receives a playback request message sent by the second terminal equipment;
in step S106: and the first terminal equipment sends the video stream carrying the label information to the second terminal equipment according to the playback request message.
Steps S105 and S106 are described together below. The first terminal device receives the playback request message sent by the second terminal device, i.e. the user taps a play button on the second terminal device to play back the video stream. The first terminal device may then send the video stream carrying the tag information to the second terminal device. The second terminal device may buffer the video stream in order to play it. While the second terminal device plays the video stream, it may display a progress bar of the video stream, on which the tag information corresponding to several times is marked. FIG. 1A is a schematic diagram of an interface in an exemplary embodiment; as shown in FIG. 1A, the tag information marked on the progress bar is:
[Figure 1A: progress bar with tag information marked at the corresponding times; not reproduced in the text.]
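On the playback side, the second terminal device could translate the received correspondence into progress-bar markers roughly as follows; the marker format and the duration_seconds parameter are assumptions of this sketch, and the tags argument is the hypothetical {offset: TagInfo} mapping used above.

```python
def progress_bar_markers(tags, duration_seconds: int):
    """Convert {offset_seconds: TagInfo} into (position_fraction, label) markers."""
    markers = []
    for offset, tag in sorted(tags.items()):
        label_parts = []
        if tag.new_followers:
            label_parts.append(f"+{tag.new_followers} followers")
        if tag.new_praises:
            label_parts.append(f"+{tag.new_praises} praises")
        if tag.key_content:
            label_parts.append("; ".join(tag.key_content))
        # Position of the marker as a fraction of the progress bar's length.
        markers.append((offset / duration_seconds, ", ".join(label_parts)))
    return markers
```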
the embodiment of the present disclosure provides a live broadcast playback method, including: a first terminal device generates a live video stream; the first terminal device sends the live video stream to at least one other terminal device; the first terminal device determines tag information corresponding to the live video stream at different times; the first terminal device establishes a correspondence between the live video stream and the tag information; the first terminal device receives a playback request message sent by a second terminal device; and the first terminal device sends the video stream carrying the tag information to the second terminal device according to the playback request message. With this method, a user watching the playback of the video stream can directly see the tag information corresponding to different times, and can therefore follow the interaction between the anchor and the audience during the live broadcast. User experience is thereby improved.
Building on the previous embodiment, fig. 2 is an interactive flowchart illustrating a live playback method according to another exemplary embodiment, again illustrated as applied to a terminal including a display screen. The live playback method may include the following steps:
in step S201: a first terminal device generates a live video stream;
in step S202: the first terminal device sends the live video stream to at least one other terminal device;
in step S203: the first terminal device determines tag information corresponding to the live video stream at different times according to interaction information from the at least one other terminal device;
in step S204: the first terminal device establishes a correspondence between the live video stream and the tag information;
in step S205: the first terminal device receives a playback request message sent by a second terminal device;
in step S206: the first terminal device sends the video stream carrying the tag information to the second terminal device according to the playback request message;
step S201 to step S206 are the same as step S101 to step S106, and are not described herein again.
In step S207: the second terminal equipment acquires selection operation performed by the user according to the label information;
in step S208: the second terminal device jumps to the jump position according to the selection operation and plays back the video stream from the jump position.
Steps S207 and S208 are described together below. The selection operation selects a jump position on the progress bar of the video stream. For example, FIG. 2A is a schematic diagram of an interface in an exemplary embodiment; as shown in FIG. 2A, the tag information corresponding to different times is:
[Figure 2A: progress bar with tag information at different times and a preview window at the selected jump position; not reproduced in the text.]
if the user chooses to jump to 10:54, playback jumps directly to 10:54 and the video starts playing from 10:54. That is, the viewer can directly watch the segment tagged with telling jokes and bantering.
Optionally, as shown in fig. 2A, after the user selects the jump position, a small preview window corresponding to the jump position may be displayed on the progress bar, and the user can preview the video in it. If it is the video the user wants to watch, the user taps the preview window and it is enlarged so that the user can watch the video conveniently.
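A rough sketch of handling the selection operation on the viewer side follows; seek_to and show_preview_window stand for whatever player and UI calls the client actually exposes and are purely hypothetical names, not an API defined by the disclosure.

```python
def on_marker_selected(player, offset_seconds: int, preview_enabled: bool = True) -> None:
    """Jump the playback position to the selected tag and optionally show a preview.

    `player` is assumed to expose seek_to() and show_preview_window(); these
    names are placeholders invented for this sketch.
    """
    if preview_enabled:
        # Show a small preview window at the jump position; tapping it enlarges the view.
        player.show_preview_window(offset_seconds)
    player.seek_to(offset_seconds)   # playback resumes from the jump position
```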
The embodiment of the present disclosure provides a live broadcast playback method, including: the second terminal device acquires a selection operation performed by the user according to the tag information; the second terminal device jumps to the jump position according to the selection operation and plays back the video stream from the jump position. That is, the user can select a favorite video segment according to the tag information and play that segment directly. User experience is thereby improved.
Based on the embodiment shown in fig. 1, before step S106 the method may further include: the first terminal device determines profile information of the live video stream according to the tag information of the live video stream at different times. Step S106 then specifically includes: the first terminal device sends the video stream carrying the tag information and the profile information to the second terminal device according to the playback request message.
For example, suppose the key content information in the tag information at a first time is "Qinghai", the key content information at a second time is "scenery", the number of praises in the tag information at a third time is 5000, and the key content information at a later time is "horse riding". Combining these, the profile information of the live video determined from the tag information at the different times is: "the anchor goes to Qinghai Lake for a trip, rides a horse on the live broadcast, and the scenery is superb". The profile information of a video may be displayed at the beginning of the video.
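One naive way to assemble such profile information from the key content of the tags is sketched below; the selection logic (taking the first few key-content entries) is an assumption, since the disclosure does not specify how the profile is composed.

```python
def build_profile(tags, max_items: int = 3) -> str:
    """Join the first few key-content entries into a short profile string."""
    highlights = []
    for _offset, tag in sorted(tags.items()):
        highlights.extend(tag.key_content)
        if len(highlights) >= max_items:
            break
    return ", ".join(highlights[:max_items])

# e.g. tags whose key content is ["Qinghai Lake trip"], ["great scenery"], ["horse riding"]
# would yield: "Qinghai Lake trip, great scenery, horse riding"
```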
The embodiment of the present disclosure provides a live broadcast playback method, including: determining the profile information of the live video according to the tag information of the live video at different times; and displaying the profile information of the video when the video is played back. User experience is thereby improved.
Based on any of the above embodiments, further, fig. 3 is a flowchart illustrating a live playback method according to still another exemplary embodiment, which is exemplified by applying the live playback method to a terminal including a display screen. The method further comprises the following steps:
in step S301: the first terminal device obtains keywords of the video segment between any given time and the preceding time, and/or expression information in the video segment;
in step S302: the first terminal device determines the newly added key content information according to the keywords of the video segment and/or the expression information in the video segment.
For example, if the keyword of the video segment between the first time and its preceding time is "Qinghai" and the expression information is "smile", the key content information is determined, from the keyword and the expression information, to be "having fun at Qinghai Lake". The newly added key content information at the first time can thus be determined; the terminal can use the newly added key content information at the first time as the tag information for that time and mark it at the first time of the video. The tag information is displayed when the video is played back. User experience is thereby improved.
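A toy illustration of combining a segment keyword with detected expression information into key content is shown below; the mapping rules are invented for the sketch, as the disclosure does not define how the combination is performed.

```python
from typing import Optional


def derive_key_content(keyword: Optional[str], expression: Optional[str]) -> Optional[str]:
    """Combine a video-segment keyword and an expression into key content text."""
    if keyword and expression == "smile":
        return f"having fun at {keyword}"      # e.g. keyword "Qinghai Lake"
    if keyword:
        return keyword
    if expression:
        return expression
    return None

# derive_key_content("Qinghai Lake", "smile") -> "having fun at Qinghai Lake"
```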
FIG. 4 is a block diagram illustrating a live playback device in accordance with an exemplary embodiment; the live playback device may be implemented as part or all of a terminal that includes a display, through software, hardware, or a combination of both. As shown in fig. 4, the live playback apparatus may include:
a generating module 41 configured to generate a live video stream;
a first sending module 42 configured to send the live video stream generated by the generating module 41 to at least one other terminal device;
a first determining module 43, configured to determine, according to the interaction information of the at least one other terminal device, tag information corresponding to the live video stream at different times;
an establishing module 44 configured to establish a corresponding relationship between the live video stream and the tag information determined by the first determining module 43;
a receiving module 45 configured to receive the playback request message sent by the second terminal device;
a second sending module 46, configured to send the video stream carrying the tag information to the second terminal device according to the playback request message.
Optionally, the tag information corresponding to any given time includes at least one of the following, counted between that time and the preceding time: the number of newly added followers, the number of newly added praises, the number of newly added gift senders, the number of newly added barrage messages, and newly added key content information.
The live playback apparatus provided in the embodiment of the present disclosure may be configured to execute the method steps corresponding to the first terminal device, and specific content and effect are not described herein again.
Building on the previous embodiment, fig. 5 is a block diagram of a live playback apparatus according to another exemplary embodiment; the live playback apparatus may be implemented as part or all of a terminal that includes a display, through software, hardware, or a combination of both. As shown in fig. 5, the live playback apparatus further includes:
a second determining module 47 configured to determine profile information of the live video stream according to the tag information of the live video stream at different times;
the video stream sent by the live playback apparatus to the second terminal device also carries profile information.
The live playback apparatus provided in the embodiment of the present disclosure may be configured to execute the method steps corresponding to the first terminal device, and specific content and effect are not described herein again.
Based on any of the embodiments described above, fig. 6 is a block diagram of a live playback apparatus shown according to yet another exemplary embodiment; the live playback device may be implemented as part or all of a terminal that includes a display, through software, hardware, or a combination of both. As shown in fig. 6, the live playback apparatus further includes:
an obtaining module 48 configured to obtain keywords of the video segment between any given time and the preceding time, and/or expression information in the video segment;
and a third determining module 49 configured to determine newly added key content information according to the keywords of the video segment and/or the expression information in the video segment.
The live playback apparatus provided in the embodiment of the present disclosure may be configured to execute the method steps corresponding to the first terminal device, and specific content and effect are not described herein again.
FIG. 7 is a block diagram illustrating a live playback device in accordance with an exemplary embodiment; the live playback device may be implemented as part or all of a terminal that includes a display, through software, hardware, or a combination of both. As shown in fig. 7, the live playback apparatus includes:
a sending module 71 configured to send a playback request message to the first terminal device;
a receiving module 72 configured to receive a video stream carrying tag information sent by a first terminal device; the tag information is determined by the first terminal device according to interaction information from at least one other terminal device regarding the live video stream sent by the first terminal device;
a playback module 73 configured to play back the video stream carrying the tag information.
The live playback apparatus provided in the embodiment of the present disclosure may be configured to execute the method steps corresponding to the second terminal device, and specific content and effect are not described herein again.
Building on the previous embodiment, fig. 8 is a block diagram of a live playback apparatus according to another exemplary embodiment; the live playback apparatus may be implemented as part or all of a terminal that includes a display, through software, hardware, or a combination of both. As shown in fig. 8, the playback module 73 includes:
an obtaining sub-module 731, configured to obtain a selection operation performed by a user according to the tag information, where the selection operation is used to select a skip position on a progress bar of the video stream;
a playback sub-module 732 configured to jump to the jump position according to the selection operation and play back the video stream from the jump position.
The live playback apparatus provided in the embodiment of the present disclosure may be configured to execute the method steps corresponding to the second terminal device, and specific content and effect are not described herein again.
FIG. 9 is a block diagram illustrating a live playback device in accordance with an exemplary embodiment; the live playback device may be implemented as part or all of a terminal that includes a display, through software, hardware, or a combination of both. As shown in fig. 9, the live playback apparatus includes: a processor 91, a transmitter 92 and a receiver 93; a memory 94 for storing executable instructions of the processor 91;
the processor 91 configured to generate a live video stream;
the transmitter 92 configured to transmit the live video stream to at least one other terminal device;
the processor 91 is further configured to determine, according to the interaction information of the at least one other terminal device, tag information corresponding to the live video stream at different times;
the processor 91 is further configured to establish a corresponding relationship between the live video stream and the tag information;
the receiver 93 is configured to receive a playback request message sent by a second terminal device;
the transmitter 92 is further configured to transmit the video stream carrying the tag information to the second terminal device according to the playback request message.
The live playback apparatus provided in the embodiment of the present disclosure may be configured to execute the method steps corresponding to the first terminal device, and specific content and effect are not described herein again.
FIG. 10 is a block diagram of a live playback device shown in accordance with another exemplary embodiment; the live playback device may be implemented as part or all of a terminal that includes a display, through software, hardware, or a combination of both. As shown in fig. 10, the live playback apparatus includes: a processor 1001, a transmitter 1002, and a receiver 1003; a memory 1004 for storing executable instructions of the processor 1001;
the transmitter 1002 configured to transmit a playback request message to the first terminal device;
the receiver 1003 is configured to receive a video stream carrying tag information sent by the first terminal device; the tag information is determined by the first terminal device according to interaction information from at least one other terminal device regarding the live video stream sent by the first terminal device;
the processor 1001 is configured to play back a video stream carrying the tag information.
The live playback apparatus provided in the embodiment of the present disclosure may be configured to execute the method steps corresponding to the second terminal device, and specific content and effect are not described herein again.
Fig. 11 is a block diagram illustrating a live playback device 1100 according to an example embodiment. For example, the apparatus 1100 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 11, apparatus 1100 may include one or more of the following components: processing component 1102, memory 1104, power component 1106, multimedia component 1108, audio component 1110, input/output (I/O) interface 1112, sensor component 1114, and communications component 1116.
The processing component 1102 generally controls the overall operation of the device 1100, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 1102 may include one or more processors 1120 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 1102 may include one or more modules that facilitate interaction between the processing component 1102 and other components. For example, the processing component 1102 may include a multimedia module to facilitate interaction between the multimedia component 1108 and the processing component 1102.
The memory 1104 is configured to store various types of data to support operations at the apparatus 1100. Examples of such data include instructions for any application or method operating on device 1100, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1104 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A power component 1106 provides power to the various components of the device 1100. The power components 1106 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 1100.
The multimedia component 1108 includes a touch-sensitive display screen that provides an output interface between the device 1100 and a user. In some embodiments, the touch display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1108 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 1100 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 1110 is configured to output and/or input audio signals. For example, the audio component 1110 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 1100 is in operating modes, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1104 or transmitted via the communication component 1116. In some embodiments, the audio assembly 1110 further includes a speaker for outputting audio signals.
The I/O interface 1112 provides an interface between the processing component 1102 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1114 includes one or more sensors for providing various aspects of state assessment for the apparatus 1100. For example, the sensor assembly 1114 may detect an open/closed state of the apparatus 1100, the relative positioning of components, such as a display and keypad of the apparatus 1100, the sensor assembly 1114 may also detect a change in position of the apparatus 1100 or a component of the apparatus 1100, the presence or absence of user contact with the apparatus 1100, orientation or acceleration/deceleration of the apparatus 1100, and a change in temperature of the apparatus 1100. The sensor assembly 1114 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1114 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1114 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1116 is configured to facilitate wired or wireless communication between the apparatus 1100 and other devices. The apparatus 1100 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1116 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1116 also includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1100 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing some of the steps of the live playback method described above.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as memory 1104 comprising instructions, executable by processor 1120 of apparatus 1100 to perform some of the steps of the method described above, is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium having instructions therein that, when executed by a processor of apparatus 1100, enable apparatus 1100 to perform the steps of a live playback method.
Wherein the processor 1120 is configured to generate a live video stream; a transmitter in the I/O interface 1112 is configured to transmit the live video stream to at least one other terminal device;
the processor 1120 is further configured to determine, according to the interaction information of the at least one other terminal device, tag information corresponding to the live video stream at different times; the processor 1120 is further configured to establish a corresponding relationship between the live video stream and the tag information;
a receiver in the I/O interface 1112 configured to receive a playback request message transmitted by the second terminal device; a transmitter in the I/O interface 1112, further configured to transmit the video stream carrying the tag information to the second terminal device according to the playback request message.
Fig. 12 is a block diagram illustrating a live playback device 1200 according to an example embodiment. For example, the apparatus 1200 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 12, the apparatus 1200 may include one or more of the following components: processing component 1202, memory 1204, power component 1206, multimedia component 1208, audio component 1210, input/output (I/O) interface 1212, sensor component 1214, and communications component 1216.
The processing component 1202 generally controls overall operation of the apparatus 1200, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 1202 may include one or more processors 1220 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 1202 can include one or more modules that facilitate interaction between the processing component 1202 and other components. For example, the processing component 1202 can include a multimedia module to facilitate interaction between the multimedia component 1208 and the processing component 1202.
The memory 1204 is configured to store various types of data to support operation at the apparatus 1200. Examples of such data include instructions for any application or method operating on the device 1200, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1204 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A power supply component 1206 provides power to the various components of the device 1200. The power components 1206 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 1200.
The multimedia component 1208 comprises a touch-sensitive display screen that provides an output interface between the device 1200 and a user. In some embodiments, the touch display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1208 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 1200 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
Audio component 1210 is configured to output and/or input audio signals. For example, audio component 1210 includes a Microphone (MIC) configured to receive external audio signals when apparatus 1200 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1204 or transmitted via the communication component 1216. In some embodiments, audio assembly 1210 further includes a speaker for outputting audio signals.
The I/O interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1214 includes one or more sensors for providing various aspects of state assessment for the apparatus 1200. For example, the sensor assembly 1214 may detect an open/closed state of the apparatus 1200, the relative positioning of the components, such as a display and keypad of the apparatus 1200, the sensor assembly 1214 may also detect a change in the position of the apparatus 1200 or a component of the apparatus 1200, the presence or absence of user contact with the apparatus 1200, orientation or acceleration/deceleration of the apparatus 1200, and a change in the temperature of the apparatus 1200. The sensor assembly 1214 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 1214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communications component 1216 is configured to facilitate communications between the apparatus 1200 and other devices in a wired or wireless manner. The apparatus 1200 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1216 receives the broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1216 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1200 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing some of the steps of the live playback method described above.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 1204 comprising instructions, is also provided, the instructions being executable by the processor 1220 of the apparatus 1200 to perform the steps of the method described above. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium has instructions stored therein which, when executed by the processor of the apparatus 1200, enable the apparatus 1200 to perform the steps of the live broadcast playback method.
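To illustrate the tag-determination step that such instructions carry out, the following is a minimal sketch in Kotlin. It uses hypothetical names that do not appear in the patent (InteractionSnapshot, TagInfo, determineTags), and it simply computes, for each moment of the live video stream, the deltas listed as tag information in the claims: newly added followers, likes, gift-sending users, and bullet comments.

    // Hypothetical sketch: tag determination on the first terminal device (the anchor side).
    // Cumulative interaction counters observed at one moment of the live video stream.
    data class InteractionSnapshot(
        val positionMs: Long,      // position in the live video stream
        val followers: Int,        // followers accumulated so far
        val likes: Int,            // likes accumulated so far
        val giftSenders: Int,      // users who have sent gifts so far
        val bulletComments: Int    // bullet comments (danmu) posted so far
    )

    // Tag information bound to one moment: deltas relative to the previous moment.
    data class TagInfo(
        val positionMs: Long,
        val newFollowers: Int,
        val newLikes: Int,
        val newGiftSenders: Int,
        val newBulletComments: Int
    )

    // Determine tag information at different moments from consecutive snapshots and
    // establish the correspondence between stream positions and their tags.
    fun determineTags(snapshots: List<InteractionSnapshot>): Map<Long, TagInfo> =
        snapshots.zipWithNext { prev, cur ->
            TagInfo(
                positionMs = cur.positionMs,
                newFollowers = cur.followers - prev.followers,
                newLikes = cur.likes - prev.likes,
                newGiftSenders = cur.giftSenders - prev.giftSenders,
                newBulletComments = cur.bulletComments - prev.bulletComments
            )
        }.associateBy { it.positionMs }

A playback side could then, for example, rank these entries by newGiftSenders or newBulletComments to suggest jump points on the progress bar.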
A transmitter in the I/O interface 1212 is configured to transmit a playback request message to the first terminal device;
a receiver in the I/O interface 1212 is configured to receive a video stream carrying tag information sent by the first terminal device, wherein the tag information is determined by the first terminal device according to interaction information on the live video stream sent to the first terminal device by at least one other terminal device;
and the processor 1220 is configured to play back the video stream carrying the tag information.
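The playback behaviour of this transmitter, receiver and processor can likewise be outlined with a short, self-contained Kotlin sketch. The names (TagEntry, TaggedStream, jumpPositionFor, playbackFrom) and the example stream URL are hypothetical, and the player is reduced to a print statement, so this is an illustrative outline rather than the device's actual implementation.

    // Hypothetical sketch: playback on the second terminal device (the viewer side).
    // One tag shown to the user, keyed by its position on the progress bar.
    data class TagEntry(val positionMs: Long, val description: String)

    // A video stream received for playback together with its tag information.
    data class TaggedStream(val streamUrl: String, val tags: List<TagEntry>)

    // Resolve the user's selection of a tag to a jump position on the progress bar;
    // returns null if the selected tag does not belong to this stream.
    fun jumpPositionFor(stream: TaggedStream, selected: TagEntry): Long? =
        stream.tags.firstOrNull { it.positionMs == selected.positionMs }?.positionMs

    // Stand-in for a real player: play back the stream from the jump position.
    fun playbackFrom(stream: TaggedStream, positionMs: Long) =
        println("Playing back ${stream.streamUrl} from $positionMs ms")

    fun main() {
        val tag = TagEntry(95_000, "12 new followers, 340 new likes, 88 new bullet comments")
        val stream = TaggedStream("rtmp://example.invalid/replay/123", listOf(tag))
        jumpPositionFor(stream, tag)?.let { playbackFrom(stream, it) }  // plays back from 95 s
    }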
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. A live broadcast playback method, comprising:
a first terminal device generates a live video stream;
the first terminal device sends the live video stream to at least one other terminal device;
the first terminal device determines, according to interaction information of the at least one other terminal device, tag information corresponding to the live video stream at different moments;
the first terminal device establishes a correspondence between the live video stream and the tag information;
the first terminal device receives a playback request message sent by a second terminal device;
the first terminal device sends the video stream carrying the tag information to the second terminal device according to the playback request message;
the second terminal device acquires a selection operation performed by a user according to the tag information, wherein the selection operation is used for selecting a jump position on a progress bar of the video stream;
the second terminal device jumps to the jump position according to the selection operation and plays back the video stream from the jump position;
wherein the tag information corresponding to any moment comprises at least one of the following items between the any moment and a previous moment: a number of newly added followers, a number of newly added likes, a number of newly added gift-sending users, a number of newly added bullet comments, and newly added key content information;
and the tag information is used for representing the interaction process between the anchor and the audience during the live broadcast.
2. The method of claim 1, further comprising:
the first terminal device determines profile information of the live video stream according to the tag information of the live video stream at different moments;
wherein the video stream sent by the first terminal device to the second terminal device further carries the profile information.
3. The method of claim 1, further comprising:
the first terminal device acquires keywords of a video clip between the any moment and a moment previous to the any moment and/or expression information in the video clip;
and the first terminal device determines the newly added key content information according to the keywords of the video clip and/or the expression information in the video clip.
4. A live broadcast playback method, comprising:
a second terminal device sends a playback request message to a first terminal device;
the second terminal device receives a video stream carrying tag information sent by the first terminal device, wherein the tag information is determined by the first terminal device according to interaction information on a live video stream sent to the first terminal device by at least one other terminal device;
the second terminal device plays back the video stream carrying the tag information, which comprises: the second terminal device acquires a selection operation performed by a user according to the tag information, wherein the selection operation is used for selecting a jump position on a progress bar of the video stream;
and the second terminal device jumps to the jump position according to the selection operation and plays back the video stream from the jump position;
wherein the tag information corresponding to any moment comprises at least one of the following items between the any moment and a previous moment: a number of newly added followers, a number of newly added likes, a number of newly added gift-sending users, a number of newly added bullet comments, and newly added key content information;
and the tag information is used for representing the interaction process between the anchor and the audience during the live broadcast.
5. The method of claim 4, wherein the video stream carrying the tag information further carries profile information of the video stream, the profile information being determined by the first terminal device according to the tag information of the live video stream at different moments.
6. A live broadcast playback apparatus, comprising:
a generation module configured to generate a live video stream;
a first sending module configured to send the live video stream to at least one other terminal device;
a first determining module configured to determine, according to interaction information of the at least one other terminal device, tag information corresponding to the live video stream at different moments;
an establishing module configured to establish a correspondence between the live video stream and the tag information;
a receiving module configured to receive a playback request message sent by a second terminal device;
and a second sending module configured to send the video stream carrying the tag information to the second terminal device according to the playback request message;
wherein the second terminal device acquires a selection operation performed by a user according to the tag information, the selection operation being used for selecting a jump position on a progress bar of the video stream;
the second terminal device jumps to the jump position according to the selection operation and plays back the video stream from the jump position;
the tag information corresponding to any moment comprises at least one of the following items between the any moment and a previous moment: a number of newly added followers, a number of newly added likes, a number of newly added gift-sending users, a number of newly added bullet comments, and newly added key content information;
and the tag information is used for representing the interaction process between the anchor and the audience during the live broadcast.
7. The apparatus of claim 6, further comprising:
a second determining module configured to determine profile information of the live video stream according to the tag information of the live video stream at different moments;
wherein the video stream sent by the apparatus to the second terminal device further carries the profile information.
8. The apparatus of claim 6, further comprising:
an acquisition module configured to acquire keywords of a video clip between the any moment and a moment previous to the any moment and/or expression information in the video clip;
and a third determining module configured to determine the newly added key content information according to the keywords of the video clip and/or the expression information in the video clip.
9. A live broadcast playback apparatus, comprising:
a sending module configured to send a playback request message to a first terminal device;
a receiving module configured to receive a video stream carrying tag information sent by the first terminal device, wherein the tag information is determined by the first terminal device according to interaction information on a live video stream sent to the first terminal device by at least one other terminal device;
and a playback module configured to play back the video stream carrying the tag information, the playback module comprising:
an acquisition sub-module configured to acquire a selection operation performed by a user according to the tag information, the selection operation being used for selecting a jump position on a progress bar of the video stream;
and a playback sub-module configured to jump to the jump position according to the selection operation and play back the video stream from the jump position;
wherein the tag information corresponding to any moment comprises at least one of the following items between the any moment and a previous moment: a number of newly added followers, a number of newly added likes, a number of newly added gift-sending users, a number of newly added bullet comments, and newly added key content information;
and the tag information is used for representing the interaction process between the anchor and the audience during the live broadcast.
10. The apparatus of claim 9, wherein the video stream carrying the tag information further carries profile information of the video stream, the profile information being determined by the first terminal device according to the tag information of the live video stream at different moments.
11. A live broadcast playback apparatus, characterized in that the apparatus comprises:
a processor, a transmitter and a receiver;
and a memory for storing instructions executable by the processor;
wherein the processor is configured to generate a live video stream;
the transmitter is configured to transmit the live video stream to at least one other terminal device;
the processor is further configured to determine, according to interaction information of the at least one other terminal device, tag information corresponding to the live video stream at different moments;
the processor is further configured to establish a correspondence between the live video stream and the tag information;
the receiver is configured to receive a playback request message sent by a second terminal device;
the transmitter is further configured to transmit the video stream carrying the tag information to the second terminal device according to the playback request message;
the second terminal device acquires a selection operation performed by a user according to the tag information, the selection operation being used for selecting a jump position on a progress bar of the video stream;
the second terminal device jumps to the jump position according to the selection operation and plays back the video stream from the jump position;
the tag information corresponding to any moment comprises at least one of the following items between the any moment and a previous moment: a number of newly added followers, a number of newly added likes, a number of newly added gift-sending users, a number of newly added bullet comments, and newly added key content information;
and the tag information is used for representing the interaction process between the anchor and the audience during the live broadcast.
12. A live broadcast playback apparatus, characterized in that the apparatus comprises:
a processor, a transmitter and a receiver;
and a memory for storing instructions executable by the processor;
wherein the transmitter is configured to transmit a playback request message to a first terminal device;
the receiver is configured to receive a video stream carrying tag information sent by the first terminal device, wherein the tag information is determined by the first terminal device according to interaction information on a live video stream sent to the first terminal device by at least one other terminal device;
the processor is configured to play back the video stream carrying the tag information;
the processor is further configured to acquire a selection operation performed by a user according to the tag information, the selection operation being used for selecting a jump position on a progress bar of the video stream, to jump to the jump position according to the selection operation, and to play back the video stream from the jump position;
the tag information corresponding to any moment comprises at least one of the following items between the any moment and a previous moment: a number of newly added followers, a number of newly added likes, a number of newly added gift-sending users, a number of newly added bullet comments, and newly added key content information;
and the tag information is used for representing the interaction process between the anchor and the audience during the live broadcast.
CN201710109418.XA 2017-02-27 2017-02-27 Live broadcast playback method and device Active CN106911967B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710109418.XA CN106911967B (en) 2017-02-27 2017-02-27 Live broadcast playback method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710109418.XA CN106911967B (en) 2017-02-27 2017-02-27 Live broadcast playback method and device

Publications (2)

Publication Number Publication Date
CN106911967A CN106911967A (en) 2017-06-30
CN106911967B true CN106911967B (en) 2022-04-15

Family

ID=59209131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710109418.XA Active CN106911967B (en) 2017-02-27 2017-02-27 Live broadcast playback method and device

Country Status (1)

Country Link
CN (1) CN106911967B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109922375A (en) * 2017-12-13 2019-06-21 上海聚力传媒技术有限公司 Event methods of exhibiting, playback terminal, video system and storage medium in live streaming
CN108668163B (en) * 2018-05-03 2020-03-20 广州虎牙信息科技有限公司 Live broadcast method and device, computer readable storage medium and computer equipment
CN108833946A (en) * 2018-06-22 2018-11-16 青岛海信传媒网络技术有限公司 Video broadcasting method and device
CN109348239B (en) * 2018-10-18 2020-05-19 北京达佳互联信息技术有限公司 Live broadcast fragment processing method and device, electronic equipment and storage medium
CN111225225B (en) * 2018-11-27 2021-08-31 腾讯科技(深圳)有限公司 Live broadcast playback method, device, terminal and storage medium
CN109947993B (en) * 2019-03-14 2022-10-21 阿波罗智联(北京)科技有限公司 Plot skipping method and device based on voice recognition and computer equipment
CN110418157B (en) * 2019-08-28 2020-11-27 广州华多网络科技有限公司 Live video playback method and device, storage medium and electronic equipment
CN112182288B (en) * 2020-09-30 2023-10-03 北京达佳互联信息技术有限公司 Data storage method and device, electronic equipment and storage medium
CN113422976B (en) * 2021-06-22 2022-09-20 读书郎教育科技有限公司 System and method for realizing online course learning competition
CN114466216B (en) * 2022-02-15 2023-11-03 上海哔哩哔哩科技有限公司 Live broadcast room display method, server side and live broadcast client side
CN115103213B (en) * 2022-06-10 2023-10-17 咪咕视讯科技有限公司 Information processing method, apparatus, device and computer readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104980790A (en) * 2015-06-30 2015-10-14 北京奇艺世纪科技有限公司 Voice subtitle generating method and apparatus, and playing method and apparatus
CN106131593A (en) * 2016-06-28 2016-11-16 乐视控股(北京)有限公司 Content processing method and device
CN106293410A (en) * 2016-08-22 2017-01-04 维沃移动通信有限公司 A kind of video progress control method and mobile terminal

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8429696B2 (en) * 2003-10-31 2013-04-23 Microsoft Corporation Multimedia presentation resumption within an environment of multiple presentation systems
CN102290082B (en) * 2011-07-05 2014-03-26 央视国际网络有限公司 Method and device for processing brilliant video replay clip
US8977104B2 (en) * 2012-09-05 2015-03-10 Verizon Patent And Licensing Inc. Tagging video content
CN103945241A (en) * 2014-05-12 2014-07-23 腾讯科技(深圳)有限公司 Streaming data statistical method, system and related device
CN104038834A (en) * 2014-05-19 2014-09-10 乐视网信息技术(北京)股份有限公司 Video positioning method and device
CN105491456A (en) * 2014-10-11 2016-04-13 中兴通讯股份有限公司 Video content recommendation method and device as well as video content evaluation method and device
CN106162230A (en) * 2016-07-28 2016-11-23 北京小米移动软件有限公司 The processing method of live information, device, anchor terminal, server and system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104980790A (en) * 2015-06-30 2015-10-14 北京奇艺世纪科技有限公司 Voice subtitle generating method and apparatus, and playing method and apparatus
CN106131593A (en) * 2016-06-28 2016-11-16 乐视控股(北京)有限公司 Content processing method and device
CN106293410A (en) * 2016-08-22 2017-01-04 维沃移动通信有限公司 A kind of video progress control method and mobile terminal

Also Published As

Publication number Publication date
CN106911967A (en) 2017-06-30

Similar Documents

Publication Publication Date Title
CN106911967B (en) Live broadcast playback method and device
CN106791893B (en) Video live broadcasting method and device
CN106941624B (en) Processing method and device for network video trial viewing
CN111970533A (en) Interaction method and device for live broadcast room and electronic equipment
CN106162230A (en) The processing method of live information, device, anchor terminal, server and system
KR20180026745A (en) Game live broadcasting method and apparatus
CN107743244B (en) Video live broadcasting method and device
WO2022028234A1 (en) Live broadcast room sharing method and apparatus
US20150341698A1 (en) Method and device for providing selection of video
CN106792173B (en) Video playing method and device and non-transitory computer readable storage medium
KR20160022286A (en) Method and apparatus for sharing video information
CN109151565B (en) Method and device for playing voice, electronic equipment and storage medium
CN109039872B (en) Real-time voice information interaction method and device, electronic equipment and storage medium
CN109614470B (en) Method and device for processing answer information, terminal and readable storage medium
CN109451341B (en) Video playing method, video playing device, electronic equipment and storage medium
CN110719530A (en) Video playing method and device, electronic equipment and storage medium
CN104639977A (en) Program playing method and device
US20220078221A1 (en) Interactive method and apparatus for multimedia service
CN111031332A (en) Data interaction method, device, server and storage medium
CN108845749B (en) Page display method and device
CN113111220A (en) Video processing method, device, equipment, server and storage medium
CN112188230A (en) Virtual resource processing method and device, terminal equipment and server
CN110992920B (en) Live broadcasting chorus method and device, electronic equipment and storage medium
CN112532931A (en) Video processing method and device and electronic equipment
CN111182328A (en) Video editing method, device, server, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240626

Address after: Room 001, 5th Floor, Building A, No. 37 Yongchu Road, Jianye District, Nanjing City, Jiangsu Province, 210092

Patentee after: Beijing Xiaomi Mobile Software Co.,Ltd. Nanjing Branch

Country or region after: China

Patentee after: BEIJING XIAOMI MOBILE SOFTWARE Co.,Ltd.

Address before: 100085 Huarun Qingcai Street 68, Haidian District, Beijing, two stage, 9 floor, 01 rooms.

Patentee before: BEIJING XIAOMI MOBILE SOFTWARE Co.,Ltd.

Country or region before: China