CN111918140B - Video playing control method and device, computer equipment and storage medium - Google Patents

Video playing control method and device, computer equipment and storage medium

Info

Publication number
CN111918140B
CN111918140B (application CN202010783516.3A)
Authority
CN
China
Prior art keywords
video
parameter
interactive
interaction
segment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010783516.3A
Other languages
Chinese (zh)
Other versions
CN111918140A (en)
Inventor
郑钿彬
孟庆春
刘里
尹金钢
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010783516.3A priority Critical patent/CN111918140B/en
Publication of CN111918140A publication Critical patent/CN111918140A/en
Application granted granted Critical
Publication of CN111918140B publication Critical patent/CN111918140B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a video playing control method and apparatus, a computer device, and a storage medium, in the technical field of video playing. The method comprises the following steps: acquiring a video control request, the video control request being triggered when a target terminal plays a first video segment, the first video segment being one of at least two video segments contained in an interactive video; acquiring a first parameter corresponding to the interactive video, the first parameter being generated based on the interaction records of group users on the interactive video, the group users being the users who have watched or are watching the interactive video; and generating, based on the first parameter, video control information used to control the target terminal to play a second video segment corresponding to the first parameter. The scheme realizes group interaction on interactive videos, expands the ways in which users can interact with an interactive video, and improves the user interaction effect during interactive video playback.

Description

Video playing control method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of video playing, and in particular, to a video playing control method and apparatus, a computer device, and a storage medium.
Background
With the popularization of mobile terminal applications and the development of Internet technologies, viewers increasingly expect to select and play different plot lines, according to their personal preferences, by interacting with the terminal while watching a video.
In the related art, to meet the user's interaction requirement while watching a video, interaction options are added during video playback to form an interactive video. Through these options, the user influences the playing order of the subsequent video segments, and thereby the subsequent plot development, achieving the effect of selecting the played video segments according to the user's wishes.
However, in the related art, the video content played to a user browsing an interactive video is affected only by that single user's interaction behavior, so the interaction effect is limited.
Disclosure of Invention
The embodiment of the application provides a video playing control method and apparatus, a computer device, and a storage medium, which can realize group interaction on interactive videos and improve the user interaction effect during video playback. The technical scheme is as follows:
in one aspect, a video playing control method is provided, where the method includes:
acquiring a video control request, wherein the video control request is triggered when a target terminal plays a first video clip; the first video segment is one of at least two video segments contained in the interactive video;
acquiring a first parameter corresponding to the interactive video, wherein the first parameter is generated based on the interactive record of group users on the interactive video; the group of users are the users who have viewed or are viewing the interactive video;
and generating video control information based on the first parameter, wherein the video control information is used for controlling the target terminal to play a second video segment corresponding to the first parameter.
In another aspect, a video playback control apparatus is provided, the apparatus including:
the control request acquisition module is used for acquiring a video control request, wherein the video control request is triggered when a target terminal plays a first video segment; the first video segment is one of at least two video segments contained in the interactive video;
the first parameter acquisition module is used for acquiring a first parameter corresponding to the interactive video, and the first parameter is generated based on the interactive record of the group users on the interactive video; the group of users are the users who have viewed or are viewing the interactive video;
and the control information generating module is used for generating video control information based on the first parameter, wherein the video control information is used for controlling the target terminal to play a second video segment corresponding to the first parameter.
In one possible implementation, the apparatus further includes:
the interaction record acquisition module is used for acquiring a target interaction record, wherein the target interaction record is an interaction record corresponding to interaction operation executed on a target video clip by the group of users, and the target video clip is at least one video clip appointed in the interaction video;
and the first parameter generation module is used for generating the first parameter based on the target interaction record.
In a possible implementation manner, the first parameter generation module is further configured to,
in response to receiving a first interaction record, updating the first parameter based on the first interaction record; the first interaction record is any one of the target interaction records.
In one possible implementation manner, the group users are users who have watched or are watching the interactive video, and each user has a specified social relationship with the first user; the first user is a user corresponding to the target terminal.
In one possible implementation manner, the control information generating module includes:
a second video segment determining unit, configured to determine the second video segment according to the first parameter;
a control information generating unit configured to generate the video control information based on the determined second video segment.
In a possible implementation manner, the video control information is used to instruct the target terminal to display an interaction option of the second video segment in a video picture of the first video segment;
or,
the video control information is used for instructing the target terminal to display, in the video picture of the first video clip, guide information corresponding to the interaction option of the second video clip, wherein the guide information is used for guiding the user to trigger the interaction option of the second video clip;
or,
the video control information is used for triggering the target terminal to directly play the second video clip when the first video clip is played.
In a possible implementation manner, the video control information includes at least one of an identifier of the second video segment and a network address of the second video segment.
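As an illustrative sketch of the possibilities above (field names and mode labels are hypothetical, not from the patent), the video control information could carry the segment identifier and/or its network address plus an instruction mode:

```python
import json

def build_video_control_info(segment_id, segment_url=None, mode="show_option"):
    """Assemble video control information for the target terminal.

    mode is one of (hypothetical labels for the three cases above):
      "show_option" - display an interaction option for the second segment
      "show_guide"  - display guide information for that option
      "auto_play"   - play the second segment directly after the first
    """
    info = {"mode": mode, "segment_id": segment_id}
    if segment_url is not None:
        info["segment_url"] = segment_url  # network address of the second segment
    return json.dumps(info)
```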
In a possible implementation, the second video segment determining unit is configured to,
determining the second video segment based on the first parameter and a second parameter; the second parameter is generated based on the interactive record of the interactive video in the process that the first user watches the interactive video at this time; the first user is a user corresponding to the target terminal.
In one possible implementation manner, the second video segment determining unit includes:
the parameter weighting subunit is used for weighting the first parameter and the second parameter to obtain a weighted parameter;
a second video segment determination subunit to determine the second video segment based on the weighting parameter.
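The weighting step above can be sketched as follows (an editorial illustration; the weight and threshold values are hypothetical, since the patent does not specify them):

```python
def weighted_parameter(first_param: float, second_param: float,
                       group_weight: float = 0.6) -> float:
    """Blend the group's first parameter with the viewer's own second parameter.
    group_weight (hypothetical value) controls how strongly the group's
    interaction history outweighs the individual viewer's behavior."""
    return group_weight * first_param + (1.0 - group_weight) * second_param

def pick_segment(weighted: float, threshold: float,
                 high_seg: str, low_seg: str) -> str:
    """Determine the second video segment by comparing the weighted
    parameter against a preset threshold."""
    return high_seg if weighted > threshold else low_seg
```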
In a possible implementation, the second video determining unit is configured to,
determining at least two candidate video segments based on the first parameter, the candidate video segments being video segments played subsequent to the first video segment;
determining the second video segment from the at least two candidate video segments based on the second parameter.
In another possible implementation manner, the second video determining unit is further configured to,
determining at least two candidate video segments based on the second parameter, the candidate video segments being video segments played subsequent to the first video segment;
determining the second video segment from the at least two candidate video segments based on the first parameter.
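The two-stage selection above (one parameter narrows the candidates, the other picks among them) might be sketched like this; the segment fields and selection rule are illustrative assumptions, not the patent's method:

```python
def candidates_from_first_parameter(segments, first_param):
    """Stage 1: the group's first parameter narrows the follow-up segments
    to candidate video segments (here via a hypothetical minimum threshold)."""
    return [s for s in segments if s["min_group"] <= first_param]

def second_segment(candidates, second_param):
    """Stage 2: the viewer's own second parameter selects one segment,
    here the candidate whose personal value is closest to second_param."""
    return min(candidates, key=lambda s: abs(s["personal"] - second_param))["id"]
```

The symmetric variant described above simply swaps which parameter drives each stage.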
In a possible implementation, the second video determination unit is further configured to,
determining the second video segment based on the first parameter and a third parameter; the third parameter is generated based on the barrage content sent by the group of users in the process of watching the first video segment.
In yet another aspect, a computer device is provided, which includes a processor and a memory, where at least one instruction, at least one program, a code set, or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the above-mentioned video playing control method.
In yet another aspect, a computer-readable storage medium is provided, in which at least one instruction, at least one program, a set of codes, or a set of instructions is stored, which is loaded and executed by a processor to implement the above-mentioned video playback control method.
In yet another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device may read the computer instructions from the computer-readable storage medium, and execute the computer instructions, so that the computer device implements the video playback control method.
The technical scheme provided by the application can comprise the following beneficial effects:
when the terminal plays a video segment of the interactive video, a group parameter corresponding to the interaction records of the group users on the interactive video is obtained, and the next video segment played by the terminal is influenced by that group parameter. The plot development of the interactive video watched by one user is therefore affected by other users' interaction records on the same video; correspondingly, this user's own interaction records can in turn affect the plot development seen by other users in the future. This realizes group interaction on the interactive video, expands the ways in which users can interact with it, and improves the user interaction effect during interactive video playback.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic diagram illustrating a video playback control system provided by an exemplary embodiment of the present application;
fig. 2 is a schematic structural diagram of a video playback terminal according to an exemplary embodiment of the present application;
fig. 3 is a flowchart illustrating a video playback control method according to an exemplary embodiment of the present application;
fig. 4 is a flowchart illustrating a method of controlling video playback according to an exemplary embodiment of the present application;
FIG. 5 is a flow chart illustrating a video playback control scenario according to the embodiment shown in FIG. 4;
fig. 6 shows a system flowchart of a video playback control method according to an exemplary embodiment of the present application;
FIG. 7 is a schematic interface diagram of an interactive video interaction according to the embodiment shown in FIG. 6;
FIG. 8 is a schematic interface diagram of an interactive video community interaction according to the embodiment shown in FIG. 6;
fig. 9 is a block diagram of a video playback control apparatus according to an exemplary embodiment of the present application;
FIG. 10 is a block diagram illustrating the structure of a computer device according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
Before describing the various embodiments shown herein, several concepts related to the present application will be described:
1) interactive Video (Interactive Video, IV)
An interactive video is a new type of video that aims to bring audiences a richer viewing experience through enhanced somatosensory feedback, plot participation, content exploration, and other means.
Interaction and information interaction with a user are generally realized through an interaction component in the interactive video.
In the embodiment of the present application, an interactive video may include a plurality of video nodes, each video node corresponding to a video clip. Except for end nodes, after each video node in the interactive video finishes playing, interaction options can be provided to the user, and the corresponding next video node is determined according to the user's selection among them; different interaction options can correspond to different video nodes. The interaction options can be preset by the developer of the interactive video according to the different plot directions, and during playback the user can influence the plot direction of the video by operating the interaction options, thereby realizing interaction between the user and the video plot.
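Purely as an illustrative sketch (not part of the patent disclosure; all names and clip ids here are hypothetical), such a node graph could be modeled as follows:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class VideoNode:
    """One video node of an interactive video; each node corresponds to a video clip."""
    clip_id: str
    # Maps an interaction option label (e.g. "A", "B") to the next node's clip id.
    options: Dict[str, str] = field(default_factory=dict)

def next_node(nodes: Dict[str, VideoNode], current: str, chosen: str) -> Optional[str]:
    """Return the clip id the chosen option leads to, or None at an end node."""
    return nodes[current].options.get(chosen)

# Example graph: clip "201" branches to "202" (option A) or "203" (option B).
nodes = {
    "201": VideoNode("201", {"A": "202", "B": "203"}),
    "202": VideoNode("202", {"A": "204", "B": "205"}),
    "206": VideoNode("206"),  # an end node: no options
}
```

Different options map to different successor nodes, matching the description above; an end node simply has an empty option map.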
2) Numerical variables
A numerical variable is a user-defined variable, belonging to a user, that is calculated from the user's interaction behavior during viewing. In the following embodiments of the present application, the numerical variable can be regarded as a quantized feature value describing the user's interaction behavior; it is a parameter that affects the development of the interactive video's plot, i.e., a user watching the interactive video may see different plot developments based on different numerical variables.
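As an illustrative sketch only (the behavior names and weights below are hypothetical, not from the patent), a numerical variable could be computed by accumulating weighted interaction events from the viewing session:

```python
# Hypothetical weights: each interaction event during viewing
# contributes to the user's numerical variable.
BEHAVIOR_WEIGHTS = {"choose_brave": 2, "choose_cautious": -1, "send_barrage": 1}

def numerical_variable(interaction_log):
    """Quantize a user's interaction behavior into a single feature value.
    Unknown events contribute nothing."""
    return sum(BEHAVIOR_WEIGHTS.get(event, 0) for event in interaction_log)
```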
Referring to fig. 1, a schematic diagram of a video playback control system provided in an exemplary embodiment of the present application is shown, and as shown in fig. 1, the video playback control system 100 includes a server 110 and a terminal 120.
The server 110 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a CDN (Content Delivery Network), a big data and artificial intelligence platform, and the like.
The terminal 120 is installed with an application program corresponding to video playing control, and each user can access a server of the application program through the terminal 120. The terminal may be a terminal device having a network connection function and an interface display function; for example, the terminal 120 may be a smart phone, a tablet computer, an e-book reader, smart glasses, a smart watch, a smart television, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, a desktop computer, and the like.
The video playback control system 100 may include a plurality of terminals 120, where the number of terminals 120 in the video playback control system 100 is not limited in the embodiment of the present application.
The video playback control system 100 may include one or more servers 110, where the number of servers 110 in the video playback control system 100 is not limited in the embodiment of the present application.
In one possible implementation manner, the terminal 120 includes a main board, an external input/output device, a memory, an external interface, a capacitive touch system, and a power source.
Wherein, the mainboard is integrated with processing elements such as a processor, a controller and the like.
The external input/output devices may include a display component (e.g., a display screen), a sound playing component (e.g., a speaker), a sound collecting component (e.g., a microphone), and various keys.
The memory has stored therein program code and data.
The external interface may include a headset interface, a charging interface, a data interface, and the like.
The capacitive touch system may be integrated in a display component or a key of the external input/output device, and the capacitive touch system is used for detecting a touch operation performed by a user on the display component or the key.
The power supply is used to power the various other components in the terminal.
The terminal is connected with the server through a communication network. Optionally, the communication network is a wired network or a wireless network.
Optionally, the system may also include a database 130.
The database 130 may be a Redis database, or may be another type of database. The database 130 is used to store various data, such as user information of each user, each interactive video, and numerical variables (parameters) of the interactive video.
Optionally, the system may further include a management device (not shown in fig. 1), which is connected to the server 110 through a communication network. Optionally, the communication network is a wired network or a wireless network.
Optionally, the wireless network or wired network described above uses standard communication techniques and/or protocols. The Network is typically the Internet, but may be any Network including, but not limited to, a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile, wireline or wireless Network, a private Network, or any combination of virtual private networks. In some embodiments, data exchanged over a network is represented using techniques and/or formats including Hypertext Mark-up Language (HTML), Extensible Markup Language (XML), and the like. All or some of the links may also be encrypted using conventional encryption techniques such as Secure Socket Layer (SSL), Transport Layer Security (TLS), Virtual Private Network (VPN), Internet Protocol Security (IPsec). In other embodiments, custom and/or dedicated data communication techniques may also be used in place of, or in addition to, the data communication techniques described above.
In the embodiment of the application, the interactive video comprises a plurality of plot branches composed of video segments. For example, please refer to fig. 2, which shows a segment relation diagram of an interactive video according to an embodiment of the present application. As shown in fig. 2, the interactive video includes video segments 201 to 206 and first plays the video segment 201, which has two subordinate branches. A segment corresponding to the user's interaction behavior with the video segment 201 is selected as the next segment played by the terminal: in response to the user selecting option A in the video segment 201, the terminal takes the video segment 202 as the next segment to play; then, in response to the user selecting option A in the video segment 202, the terminal takes the video segment 204 as the next segment, and in response to the user selecting option B in the video segment 202, the terminal takes the video segment 205 as the next segment. When the user selects option B in the video segment 201, the terminal takes the video segment 203 as the next segment and at that moment reads a numerical variable from the server: in response to the numerical variable being greater than N (N being a preset value), the terminal jumps directly to play the video segment 206; when the numerical variable is not greater than N, it jumps instead to the video segment 205 as the next segment to play.
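The branching walkthrough above can be sketched in code. This is an editorial illustration of the fig. 2 example only; the threshold value of N is hypothetical, since the patent only calls it "a preset value":

```python
# Hypothetical threshold; the text only specifies "N (a preset value)".
N = 5

def clip_after_option(current: str, option: str) -> str:
    """Resolve the next clip for an explicit option choice (the fig. 2 graph)."""
    graph = {
        ("201", "A"): "202", ("201", "B"): "203",
        ("202", "A"): "204", ("202", "B"): "205",
    }
    return graph[(current, option)]

def clip_after_203(numerical_variable: int) -> str:
    """After clip 203, the server reads the numerical variable:
    jump to clip 206 if it exceeds N, otherwise fall back to clip 205."""
    return "206" if numerical_variable > N else "205"
```

Note how the branch after segment 203 is decided by the stored variable rather than by a direct user choice, which is what lets group interaction history steer the plot.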
Please refer to fig. 3, which is a flowchart illustrating a video playback control method according to an exemplary embodiment. The method may be performed by a computer device, which may be the server 110 in the system shown in fig. 1, or the computer device may be the terminal 120 in the system shown in fig. 1, or the computer device may include the server 110 and the terminal 120 in the system shown in fig. 1. As shown in fig. 3, the flow of the video playing control may include the following steps:
step 31, acquiring a video control request, wherein the video control request is triggered when the target terminal plays the first video segment; the first video segment is one of at least two video segments contained in the interactive video.
In one possible implementation manner, a video interaction control is superimposed on the first video segment, and a user performs an interaction operation with the interactive video through the video interaction control.
The video interaction control may be a video selection control: by clicking it, the user selects to play the next video clip corresponding to that control's interaction operation.
Alternatively, the video interaction control is a barrage (bullet-screen comment) sending control: the user inputs a message in the control and performs a specified operation on it to send the barrage; in response to the specified operation, the target terminal sends information including the barrage content to the server, where it is stored in the server's barrage storage.
In another possible implementation, the video interaction control may not be present on the first video segment. The target terminal receives interaction through other operations besides the video interaction control, for example, the target terminal receives user interaction on the interactive video by detecting at least one of a sliding touch operation, a shaking operation and a voice control operation.
Step 32, acquiring a first parameter corresponding to the interactive video, wherein the first parameter is generated based on the interactive records of the group users on the interactive video; the group of users are the individual users who have viewed or are viewing the interactive video.
Wherein, the group of users are all or part of users who have watched or are watching the interactive video.
The first parameter is generated based on the interaction record of the group users on the interaction video, that is, the group users interact with the interaction video to jointly influence the first parameter, or the group users share the first parameter corresponding to the interaction video.
For example, when a user of the group of users performs an interactive operation on the interactive video to generate a new interactive record, the first parameter is updated along with the new interactive record.
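A minimal sketch of such a shared, incrementally updated group parameter is shown below (an editorial illustration, not the patent's implementation; an in-memory counter is used here, though a Redis counter, as the database mentioned later in this document, could serve the same role):

```python
import threading

class GroupParameter:
    """Shared first parameter for one interactive video, updated whenever
    any user in the group generates a new interaction record."""

    def __init__(self):
        self._value = 0
        self._lock = threading.Lock()

    def on_interaction_record(self, delta: int) -> int:
        """Fold a new interaction record into the shared parameter and
        return the updated value. In a deployment this could instead be
        an atomic increment in a shared store (e.g. Redis INCRBY)."""
        with self._lock:
            self._value += delta
            return self._value

    @property
    def value(self) -> int:
        return self._value
```

The lock (or an atomic store operation) matters because many group users may generate interaction records concurrently.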
And step 33, generating video control information based on the first parameter, where the video control information is used to control the target terminal to play a second video segment corresponding to the first parameter.
In the embodiment of the application, when the target terminal plays the first video segment in the process of playing the interactive video, the first parameter generated by the interactive record of the group users on the interactive video can be obtained, and the target terminal is controlled to play the subsequent corresponding second video segment based on the first parameter, that is, the plot development after the first video segment in the interactive video is influenced by the interactive record of the group users on the interactive video.
In summary, in the scheme shown in the embodiment of the present application, when the terminal plays a video segment of an interactive video, the server obtains a group parameter from the interaction records of the group users on that interactive video and uses it to influence the next video segment played by the terminal. The plot development seen by one user is thus affected by other users' interaction records; correspondingly, that user's own interaction records can affect the plot development that other users see in the future. This realizes group interaction on the interactive video, expands the ways in which users can interact with it, and improves the effect of user interaction during interactive video playback.
Please refer to fig. 4, which is a flowchart illustrating a method of controlling video playback according to an exemplary embodiment. The method may be performed by a computer device, for example, the computer device may be the server 110 in the system shown in fig. 1, or the computer device may be the terminal 120 in the system shown in fig. 1, or the computer device may include the server 110 and the terminal 120 in the system shown in fig. 1. Taking the computer device as a server as an example, as shown in fig. 4, the flow of the video playing control may include the following steps:
step 410, obtaining a target interaction record, where the target interaction record is an interaction record corresponding to an interaction operation performed on a target video segment by the group of users, and the target video segment is at least one video segment specified in the interaction video.
In one possible implementation, the target interaction record is an interaction record corresponding to an interaction operation performed on the target video segment by the group of users within a certain time period.
In the interaction process of the interactive video, the interaction trends of the group of users with the interactive video may differ across time periods. In this embodiment of the application, to ensure that the target interaction record is an effective interaction record, the server takes the interaction records of interaction operations performed by the group of users on the target video segment within a certain time period as the target interaction record. For example, the certain time period may be a time period of a specified length before the current time, so that older interaction records are discarded and the timeliness of the target interaction record is improved.
In one possible implementation manner, in response to the interactive operation being a selection operation, the target interactive record is a record of selection of each option in the selection control of the target video clip by the group of users.
When a user operates a selection control contained in the target video clip, the terminal takes the user's selection record on the selection control as a target interaction record. For example, the target video clip contains two options, 'A' and 'B'; when the user selects option 'A', the terminal sends the record of the user selecting option 'A' to the server as a target interaction record, and when the user selects option 'B', the terminal sends the record of the user selecting option 'B' to the server, where it is stored as a target interaction record.
In another possible implementation manner, in response to the interactive operation being a bullet screen sending operation, the target interactive record is a bullet screen sending record of the group users in the target video segment.
When a user performs a bullet screen sending operation in the target video clip, the terminal records the bullet screen sending record of that operation, where the bullet screen sending record includes the bullet screen sending time, the ID of the user sending the bullet screen, the bullet screen content, and the like, and sends the information containing the bullet screen sending record to the server for storage.
In one possible implementation, the target video segment is at least one of the preceding video segments; the front video segment includes the first video segment and video segments preceding the first video segment.
In this embodiment of the application, the target video segment is the first video segment or one of the segments before the first video segment, and the interaction record generated by the group of users performing an interaction operation on the target video segment is the interaction record of the group of users on the first video segment or each segment before the first video segment.
Step 420, generating the first parameter based on the target interaction record.
In one possible implementation, in response to receiving a first interaction record, the server updates the first parameter based on the first interaction record; the first interaction record is any one of the target interaction records.
In the embodiment of the application, each time the server receives one of the target interaction records, the server updates the first parameter based on that record. That is, when any user in the group of users interacts with the interactive video while watching it, the resulting interaction operation is fed back to the server, and the server updates the first parameter based on the corresponding interaction record; in other words, the server can update the first parameter in real time.
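As an illustration only (not part of the patent's disclosure; all names and the update rule are hypothetical), the real-time update of the shared first parameter described above can be sketched in Python, here using a simple "count selections of option A" rule:

```python
class GroupParameter:
    """Shared (group) parameter for one interactive video.

    Every arriving interaction record from any user in the group
    updates this single value, so all users share it.
    """

    def __init__(self):
        self.value = 0

    def on_interaction_record(self, record):
        # Update the shared value as soon as a record arrives; as an
        # example rule, count records that selected option "A".
        if record.get("option") == "A":
            self.value += 1
        return self.value


param = GroupParameter()
param.on_interaction_record({"user": "u1", "option": "A"})
param.on_interaction_record({"user": "u2", "option": "B"})
result = param.on_interaction_record({"user": "u3", "option": "A"})
print(result)  # 2
```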
In one possible implementation, the group of users may be all users who have viewed the interactive video all over the network or are viewing the interactive video.
In the embodiment of the application, the server can count all users who have watched or are watching the interactive video in the whole network, and update the first parameter according to the target interaction records of all the users, wherein the updated first parameter accords with the interaction results of most users in the whole network.
In one possible implementation, the group of users are users who have viewed or are viewing the interactive video, and each user has a specified social relationship with the first user; the first user is a user corresponding to the target terminal.
In the embodiment of the application, the group of users may be the users in a certain friend circle, or the users in a group obtained by grouping through other social relations. The server groups users through their social relations, and the users in a group share the same group parameter; the updated first parameter therefore has social properties and better fits the interaction results of the users in the social group on the interactive video.
In one possible implementation, the first parameter is determined according to the number of specified operations in the target interaction record.
For example, in response to the interactive operation being a selection operation, the target interaction record is a record of the selection of a certain option in the selection control of the target video clip by the group of users. When a user operates the selection control contained in the target video clip, the terminal takes the user's selection record on the selection control as a target interaction record. For example, the target video clip contains two options, 'A' and 'B'; when the user selects option 'A', the terminal sends the record of the user selecting option 'A' to the server for storage, and when the user selects option 'B', the terminal sends the record of the user selecting option 'B' to the server for storage. In this case, the first parameter may be determined according to the number of times the specified option is selected in the target interaction record; for example, when option 'A' is selected 14 times in the target interaction record, the first parameter may be 14.
In another possible implementation manner, the first parameter is determined according to a ratio of the number of the specified operations in the target interaction record to the total number of times of the target interaction record.
For example, the first parameter may be determined according to the proportion of users who selected option 'A' in the target interaction record. When option 'A' is selected 39 times in the target interaction record and the target interaction record contains a total of one hundred selection records, the server may use the 39% proportion of selections of option 'A' as the first parameter.
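A minimal sketch of the ratio-based calculation described above, assuming (hypothetically) that the target interaction record is simply a list of selected options:

```python
def first_parameter_ratio(records, specified_option):
    """Ratio of records matching the specified operation to all records."""
    if not records:
        return 0.0
    hits = sum(1 for r in records if r == specified_option)
    return hits / len(records)


# One hundred selection records, 39 of which chose option "A".
records = ["A"] * 39 + ["B"] * 61
print(first_parameter_ratio(records, "A"))  # 0.39
```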
Step 430, receiving a video control request, where the video control request is sent by a target terminal when playing a first video segment; the first video segment is one of at least two video segments contained in the interactive video.
When the target terminal plays the first video clip, the target terminal sends the video control request to the server, wherein the video control request is used for acquiring the next video clip played by the terminal from the server, or the video control request is used for acquiring the indication information of the next video clip played by the terminal from the server.
In a possible implementation manner, when each video segment in the interactive video is played, the target terminal sends a video control request to the server respectively.
In the embodiment of the application, when the target terminal plays each video segment of the interactive video, it can send a video control request to the server, and the server judges from the video control request whether the second video segment needs to be determined according to a numerical variable. For example, the video control request may include a video segment identifier; when the server detects a specific video segment identifier, the server reads the corresponding numerical variable and determines the second video segment according to that variable. When the server does not detect a specific video segment identifier in the video control request, the next video segment after that segment does not need to be controlled by a numerical variable, and the server may not respond to the request.
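The gating by a specific video segment identifier might look like the following sketch; the set of controlled identifiers, the stored variables, and the branching rule are all hypothetical placeholders standing in for the developer-preset data described in the text:

```python
# Segment identifiers a developer registered as "controlled by a
# numerical variable", together with the current group variables.
CONTROLLED_SEGMENTS = {"seg_01", "seg_07"}
GROUP_VARIABLES = {"seg_01": 14, "seg_07": 3}


def handle_video_control_request(segment_id):
    """Resolve a video control request by segment identifier.

    Only requests carrying a registered identifier are resolved via the
    numerical variable; otherwise the server does not respond (None).
    """
    if segment_id not in CONTROLLED_SEGMENTS:
        return None
    variable = GROUP_VARIABLES[segment_id]
    # Example rule: the variable picks one of two follow-up segments.
    return "seg_A" if variable > 10 else "seg_B"


print(handle_video_control_request("seg_01"))  # seg_A
print(handle_video_control_request("seg_99"))  # None
```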
In a possible implementation manner, the specific video clip identifier may be preset by a developer of the interactive video and stored in the server.
In another possible implementation manner, the target terminal sends a video control request when playing to the specified video clip.
In a possible implementation manner of the embodiment of the application, a video segment that needs to trigger a video control request contains a trigger mark. When the target terminal plays a video segment and does not detect the trigger mark, the target terminal does not send a video control request to the server; when the target terminal detects the trigger mark in the process of playing the first video clip, the target terminal sends a video control request to the server.
In a possible implementation manner, the trigger flag may be preset by a developer of the interactive video in a video segment that needs to trigger the video control request.
Step 440, obtaining a first parameter corresponding to the interactive video, where the first parameter is generated based on the interactive records of the group users on the interactive video; the group of users are the individual users who have viewed or are viewing the interactive video.
The first parameter is also called a group parameter, that is, a numerical variable corresponding to the group of users.
In one possible implementation, the first parameters corresponding to different video segments may be different.
In the embodiment of the application, the server controls the next video clip played by the terminal through different group parameters corresponding to the different video clips played by the target terminal. For example, the video control request sent to the server when the terminal plays the first video clip includes an identifier of the first video clip; the server obtains the group parameter corresponding to the first video clip (i.e., the first parameter) according to that identifier and determines the second video clip according to the group parameter, where the second video clip is the video clip played next after the terminal plays the first video clip. When the terminal plays a third video clip, the video control request sent to the server contains an identifier of the third video clip; the server obtains the group parameter corresponding to the third video clip according to that identifier and determines a fourth video clip according to that group parameter, where the fourth video clip is the video clip played next after the terminal plays the third video clip.
The correspondence between the identifier of the first video segment and the first parameter, and the correspondence between the identifier of the third video segment and the corresponding group parameter may be preset by a developer of the interactive video.
After the server acquires the first parameter, video control information may be generated based on the first parameter, where the video control information is used to control the target terminal to play a second video segment corresponding to the first parameter. The process may refer to subsequent steps 450 and 460.
Step 450, determining a second video segment according to the first parameter.
In one possible implementation, the second video segment is determined based on the first parameter and the second parameter; the second parameter is generated based on the interactive record of the interactive video in the process that the first user watches the interactive video; the first user is a user corresponding to the target terminal.
The second parameter is formed during the current viewing of the interactive video by the user of the target terminal; that is, the second parameter is generated based on the first user's interaction records on the preceding video segments, where the preceding video segments include the first video segment and each video segment before the first video segment.
The second parameter is only affected by the interactive operation between the user corresponding to the target terminal and the interactive video, that is, the second parameter is the personal parameter of the user corresponding to the target terminal, and the interactive operation of other users does not affect the second parameter. Correspondingly, the second parameter does not show the trend of the plot when other users watch the interactive video.
In an exemplary aspect of the embodiment of the present application, the second video segment is determined according to the first parameter and the second parameter, that is, the second video segment is affected by both the personal parameter and the group parameter.
In a possible implementation manner, the server performs weighting processing on the first parameter and the second parameter to obtain a weighted parameter; and determining the second video segment based on the weighting parameter.
Here, weighting the first parameter and the second parameter means controlling their weights so as to control the proportions in which the group interaction records and the personal interaction records influence the interactive video. When the weight of the first parameter is larger, the group interaction records have a greater influence on the interactive video: the group interaction records play the primary role in the selection of the second video segment, and the personal interaction records play a secondary role. When the weight of the second parameter is larger, the personal interaction records have a greater influence on the interactive video: the personal interaction records play the primary role in the selection of the second video segment, and the group interaction records play a secondary role. The weights of the first parameter and the second parameter can be preset by a developer of the interactive video.
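The weighting scheme above can be sketched as follows, assuming (hypothetically) that both parameters have been normalized to the range [0, 1] and that the developer-preset weight and decision threshold take the values shown:

```python
def weighted_parameter(group_param, personal_param, group_weight=0.6):
    """Weight the group (first) and personal (second) parameters.

    group_weight is preset by the interactive video's developer; a larger
    group_weight makes the group interaction records dominate the choice,
    a smaller one makes the personal records dominate.
    """
    personal_weight = 1.0 - group_weight
    return group_weight * group_param + personal_weight * personal_param


def choose_segment(weighted, threshold=0.5):
    """Map the weighted parameter to one of two follow-up segments."""
    return "segment_A" if weighted >= threshold else "segment_B"


w = weighted_parameter(0.8, 0.2, group_weight=0.6)  # about 0.56
print(choose_segment(w))  # segment_A
```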
In one possible implementation, at least two candidate video segments are determined based on the first parameter, the candidate video segments being video segments played after the first video segment; the second video segment is determined from the at least two candidate video segments based on the second parameter.
In an exemplary scenario of the embodiment of the present application, the server determines at least two candidate video segments according to the first parameter, and then determines the second video segment directly from the at least two candidate video segments according to the second parameter. That is, the server may select the candidate video segments through the interaction records of the group, and then determine the second video segment to be finally played among the at least two candidate video segments according to the interaction records of the individual.
In one possible implementation, at least two candidate video segments are determined based on the second parameter, the candidate video segments being video segments played after the first video segment; the second video segment is determined from the at least two candidate video segments based on the first parameter.
In another exemplary scheme of the embodiment of the present application, the server may also select the candidate video segments through an individual interaction record, and then determine a second video segment to be finally played among the at least two candidate video segments according to a group interaction record.
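The two-stage selection described above (one parameter narrows the candidates, the other picks the final segment) might be sketched as follows; the branch table, the threshold, and the modulo rule are illustrative assumptions, not the patent's concrete values:

```python
def candidates_from_group(group_param, branch_table):
    """Stage 1: the group parameter narrows the follow-ups to >= 2 candidates."""
    return branch_table["high" if group_param > 10 else "low"]


def pick_with_personal(candidates, personal_param):
    """Stage 2: the personal parameter selects the final second segment."""
    return candidates[personal_param % len(candidates)]


branch_table = {
    "high": ["seg_2a", "seg_2b"],
    "low":  ["seg_2c", "seg_2d"],
}
cands = candidates_from_group(14, branch_table)  # ['seg_2a', 'seg_2b']
print(pick_with_personal(cands, 1))              # seg_2b
```

Swapping the roles of the two parameters, as in the alternative scheme, only requires calling the two stages with the arguments exchanged.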
In one possible implementation, the second video segment is determined based on the first parameter and the third parameter; the third parameter is generated based on the barrage content sent by the group of users in the process of watching the first video segment.
In the playing process of the interactive video, the bullet screen is an important means of enhancing group interaction, and users usually express their own ideas or selection tendencies by sending bullet screens while interacting with a certain video segment of the interactive video.
In a possible implementation manner of the embodiment of the application, the third parameter may be obtained through NLP (Natural Language Processing) technology. For example, keywords of the first video segment are preset; the bullet screens sent by the group of users in the process of watching the first video segment are read and processed with natural language processing to obtain the semantic similarity between each bullet screen and the keywords. In response to the semantic similarity between a bullet screen and a keyword being greater than a certain threshold, the value of the third parameter is incremented; after all the bullet screens of the group of users have been processed, the value of the third parameter is obtained.
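A rough sketch of the bullet-screen-based third parameter. Here `difflib.SequenceMatcher` is only a crude stand-in for the semantic-similarity model the text implies; a production system would use a real NLP model, and the keyword, threshold, and comments below are hypothetical:

```python
from difflib import SequenceMatcher


def third_parameter(danmaku_list, keyword, threshold=0.5):
    """Count bullet screens whose similarity to the keyword exceeds the threshold.

    SequenceMatcher.ratio() gives string similarity in [0, 1]; it merely
    approximates the semantic similarity described in the text.
    """
    value = 0
    for comment in danmaku_list:
        similarity = SequenceMatcher(None, comment, keyword).ratio()
        if similarity > threshold:
            value += 1
    return value


comments = ["save the hero", "save hero now", "lol"]
print(third_parameter(comments, "save the hero"))  # 2
```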
In one possible implementation manner, the second video segment is determined based on the first parameter and a fourth parameter, the fourth parameter is generated based on an interaction record of the first user in a front interactive video, and the front interactive video is an interactive video of the same type as the interactive video.
In the embodiment of the application, the server obtains the fourth parameter through the first user's interaction records in other interactive videos of the same type, and determines the second video clip according to the fourth parameter and the first parameter. That is, by obtaining the first user's previous selection records, the selection tendency of the first user is considered, and the selection tendency of the group where the first user is located is also considered, thereby strengthening the influence of the user's interaction with the interactive video on subsequent video clip playing.
In the embodiment of the present application, the second video segment may be a single video segment, or may include a plurality of video segments.
Step 460, sending video control information to the target terminal based on the determined second video segment.
Wherein, the server may generate video control information based on the determined second video segment and transmit the video control information to the target terminal.
Please refer to fig. 5, which illustrates a flowchart of a video playback control scene according to an embodiment of the present application. As shown in fig. 5, during the process of playing an interactive video, the terminal 510 corresponding to each group user sends an interactive record corresponding to the interactive video to the server 530, the server 530 stores the interactive record in a record memory in the server 530, and the server 530 generates a first parameter (i.e., a group variable) according to the interactive record. In the process of playing the interactive video, in response to the interactive operation between the first user and the target terminal 520, the target terminal 520 sends first interactive information 521 generated by the interactive operation to the server 530, the server 530 updates a first parameter generated in advance according to the first interactive information 521 to obtain an updated first parameter, and the server 530 further generates a second parameter (personal parameter) according to the first interactive information or updates the second parameter, wherein the second parameter is generated only in relation to the interactive record between the first user and the interactive video. When the target terminal 520 plays the first video segment, the target terminal 520 sends a video acquisition request 522 to the server 530, the server 530 responds to the video acquisition request 522, determines a second video segment played after the first video segment according to the first parameter and the second parameter, generates video control information 523 according to the determined second video segment, and sends the video control information 523 to the target terminal 520, and the target terminal 520 receives the video control information 523 and plays the corresponding second video segment according to the video control information 523.
In one possible implementation, the video control information includes information of a plurality of candidate second video segments.
The video control information sent by the server to the terminal may include information of the plurality of candidate second video segments; that is, the server determines a plurality of candidate second video segments according to the first parameter and sends video control information containing the plurality of second video segments to the terminal, and the terminal selects one of the plurality of second video segments as the next video segment to be played.
In one possible implementation manner, the terminal displays the selection controls corresponding to the plurality of second video segments on the first video segment according to the video control information.
The server determines the second video clips corresponding to the first parameter according to the first parameter, and there may be a plurality of second video clips. Video control information corresponding to the plurality of second video clips is sent to the terminal, and the terminal displays selection controls corresponding to the second video clips on the first video clip according to the video control information; by performing a selection operation on a selection control, the user selects one of the plurality of second video clips as the video clip to be played next.
In a possible implementation manner, the video control information is used to instruct the target terminal to show an interactive option of the second video segment in a video picture of the first video segment.
The video control information sent by the server to the terminal comprises interactive information, the terminal displays the interactive option of the second video clip according to the interactive information, and the user selects the interactive option to influence the video clip played subsequently by the terminal.
In another possible implementation manner, the video control information is used to instruct the target terminal to indicate, in a video picture of the first video segment, guidance information corresponding to the interaction option of the second video segment, where the guidance information is used to guide a user to trigger the interaction option of the second video segment.
The video control information includes, in addition to the interactive information, guidance information, and the terminal displays the interactive option of the second video segment according to the interactive information, and then displays, according to the guidance information corresponding to the interactive option, a guidance statement corresponding to the guidance information on the interactive option, such as "there is a surprise here", and the like.
In another possible implementation manner, the video control information is used to trigger the target terminal to directly play the second video segment when the playing of the first video segment is finished.
In this embodiment of the application, the second video segment may be a single video segment, and the terminal directly determines, according to the video control information, that the second video segment is a next video segment played by the terminal, and directly skips to play the second video segment after the first video segment is played.
In a possible implementation manner, the video control information includes at least one of an identifier of the second video segment and a network address of the second video segment.
For example, when the terminal uses a wireless local area network or a wired network, the terminal directly caches all the video segments contained in the interactive video when it opens the interactive video; according to the identifier of the second video segment in the video control information, the terminal directly reads the second video segment corresponding to that identifier from the cached video segments as the next video segment to be played.
For example, in response to the terminal using the operator network, the terminal acquires the network address of the second video segment included in the video control information, and accesses the network address to acquire the second video segment as the next video segment played by the terminal.
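The two delivery variants just described (resolving a cached segment by identifier versus fetching by network address) can be sketched as below; all field names, identifiers, and URLs are hypothetical:

```python
# Hypothetical shapes of the video control information.
control_info_cached = {
    "second_segment_id": "seg_2a",  # terminal reads it from its cache
}
control_info_streamed = {
    "second_segment_id": "seg_2a",
    "second_segment_url": "https://example.com/videos/seg_2a.mp4",
}


def resolve_segment(control_info, cache):
    """Prefer the locally cached segment; fall back to the network address."""
    seg_id = control_info["second_segment_id"]
    if seg_id in cache:
        return cache[seg_id]
    return "fetch:" + control_info["second_segment_url"]


cache = {"seg_2a": "<local bytes>"}
print(resolve_segment(control_info_cached, cache))  # <local bytes>
print(resolve_segment(control_info_streamed, {}))   # fetch:https://example.com/videos/seg_2a.mp4
```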
In summary, in the scheme shown in this embodiment of the present application, when the terminal plays a video segment of an interactive video, the server obtains a corresponding group parameter according to the interaction records of the group of users on the interactive video, and uses the group parameter to influence the next video segment played by the terminal. In this way, the scenario development of the interactive video watched by a user is affected by the interaction records of other users on the interactive video, and correspondingly, the user's own interaction records on the interactive video may also affect the scenario development seen by other users who watch the interactive video later, thereby implementing group interaction for the interactive video, expanding the interaction manners between users and the interactive video, and improving the effect of user interaction during playback of the interactive video.
Based on the scheme described in the embodiment shown in fig. 4, taking the scheme as an example for controlling the subsequent video segment of the interactive video by combining the first parameter and the second parameter, please refer to fig. 6, which shows a system flowchart of a video playing control method provided in an exemplary embodiment of the present application.
As shown in fig. 6, the terminal 610 plays a first video clip, which is one of at least two video clips included in the interactive video.
That is, a user watches the interactive video on the terminal through a client. When the interactive video skips to playing the first video clip and an interactive control is superimposed on the first video clip, the user operates the interactive control to generate an interaction behavior (such as different selections, sliding, and the like); the terminal reports information containing the interaction behavior to the server 620, and the server 620 receives the information of the interaction behavior and stores it in the behavior record memory.
The server 620 reads the behavior record memory and performs quantization processing on the information containing the interaction behavior. For example, a selection control is superimposed on the first video segment, and the user can perform a selection operation on the selection control, and when the user selects option a, the information containing option a selected by the user is quantized to "1", and when the user selects option B, the information containing option B selected by the user is quantized to "0".
In one possible implementation manner, the value change generated by the operation of the user on the interactive control is distinguished and processed.
The server calculates the quantized interaction behavior information to obtain the parameters corresponding to it. In this embodiment, the calculation may be based on a statistical method: the interaction behavior information is counted to calculate and update the individual variable and the group variable. That is, the operation performed by the user on the interactive control may generate two variables, the group variable (the first parameter) and the individual variable (the second parameter). The individual variable is a per-user variable stored separately in the server; each user has an individual variable, and when a user's current interaction behavior updates the individual variable, only that user's subsequent interactive video is affected. The group variable is a variable shared by a plurality of users: the whole server may store only one group variable, or the users in a certain group (for example, a certain circle of friends) may share a preset group variable. After the interaction behavior of a user in the group updates the group variable, the group variable read by all users in the group is the updated one.
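The quantization and the dual update of individual and group variables described above might be sketched as follows; the update rules are illustrative, following the "A quantizes to 1, B quantizes to 0" example:

```python
def quantize(option):
    """Quantize a selection: option A -> 1, option B -> 0 (from the example)."""
    return 1 if option == "A" else 0


def update_variables(option, personal, group):
    """One interaction refreshes both the user's personal variable and the
    shared group variable (here, running tallies per option)."""
    q = quantize(option)
    personal = q
    group[option] = group.get(option, 0) + 1
    return personal, group


personal, group = update_variables("A", 0, {})
personal, group = update_variables("B", personal, group)
print(personal, group)  # 0 {'A': 1, 'B': 1}
```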
The updated variables can be fed back to the user to influence the plot development, and both the individual variables and the group variables can influence the plot development; that is, different values lead to different plot directions (for example, a certain variable being greater than 10 leads to plot A, while a value less than or equal to 10 leads to plot B).
The group variable can affect not only the second video segment but also other video segments in the interactive video, for example, the first video segment can also be affected by the group variable.
When the user interaction behavior leads to personal variable updating, the numerical value change generated by the interaction only influences the video playing of the interactive video currently watched by the user, the server generates video control information containing a second video clip according to the personal variable of the user and sends the video control information to the terminal, and the terminal plays the corresponding second video clip according to the video control information, namely only the interactive video of the user can be influenced by the numerical value.
When the user interaction behavior results in a group variable update, the numerical variable generated by the interaction influences the playing of video clips of the interactive video watched currently or later by all the users who share the group variable; that is, two influences can be caused: 1) for the user causing the group variable update, the value can influence the playing of subsequent video clips of the interactive video, including the first video clip; 2) for all other users sharing the group variable, the value influences the playing of video clips when they watch the interactive video currently or later.
And the server determines the next video clip to be played by the terminal, namely a second video clip according to the personal variable and the group variable, and sends video control information for controlling the terminal to play the video to the terminal.
In one possible implementation, the second video segment to be played by the terminal may be determined by a vote reflected in the individual variable and the group variables.
In the voting process, the vote count is essentially a quantification of the group voting behavior of users. Please refer to fig. 7, which illustrates an interface diagram of an interactive video interaction according to an embodiment of the present application. As shown in fig. 7, a video interaction control is superimposed on the first video segment. The video interaction control is a video selection control with two options A and B and a discard option C; the user can operate the video selection control, that is, select option A, select option B, or select the discard option C, and the individual variable and the group variables are updated according to the option selected by the user.
In one possible implementation, there may be multiple group variables simultaneously within a social group.
For example, in the first video segment of the interactive video shown in fig. 7, the group variable a represents the number of people in the social group who select option a, and the group variable B represents the number of people in the social group who select option B; the personal variable s may be any one of 0, 1 and 2, a personal variable s of 0 represents that the user has forgone selection, a personal variable s of 1 represents that the user has selected option a, and a personal variable s of 2 represents that the user has selected option B.
That is, the interaction behavior of selecting A can be represented by an interaction function f([a, b, s]) = [a+1, b, 1]; when a user selects option A, the group variable a is increased by one, the value of the group variable b is unchanged, and the value of the personal variable s is set to 1. The interaction behavior of selecting B can be represented by an interaction function g([a, b, s]) = [a, b+1, 2]; when a user selects option B, the value of the group variable a is unchanged, the group variable b is increased by one, and the value of the personal variable s is set to 2.
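The two interaction functions above can be written out directly. This is a sketch of the state transitions f and g as described, over the state [a, b, s]:

```python
# Interaction functions over the state [a, b, s]:
# a, b are group variables (counts of users choosing A and B),
# s is the personal variable (0 = abstained, 1 = chose A, 2 = chose B).

def f(state):
    """Selecting option A: a increases by one, b unchanged, s set to 1."""
    a, b, s = state
    return [a + 1, b, 1]

def g(state):
    """Selecting option B: a unchanged, b increases by one, s set to 2."""
    a, b, s = state
    return [a, b + 1, 2]

print(f([0, 0, 0]))  # [1, 0, 1]
print(g([1, 0, 0]))  # [1, 1, 2]
```

Note that s is overwritten rather than incremented: it records only the current user's own choice, while a and b accumulate across the whole group.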
When the effect of the group variables is not considered, two subsequent video segment branches are preset for the first video segment. If the user selects option A, that is, the personal variable value s is 1, the server sends the video control information of the second video segment corresponding to s = 1 to the terminal, and the terminal plays that second video segment; if the user selects option B, that is, s is 2, the server sends the video control information of the second video segment corresponding to s = 2 to the terminal, and the terminal plays that second video segment.
Please refer to fig. 8, which illustrates an interface diagram of an interactive video group interaction according to an embodiment of the present application. When the group variables take effect, in the first video segment, if user 1 is the first user in a certain social group to participate in the interactive video, the preset personal variable and group variables are all initialized to 0. If this user selects option A in the first video segment, the changed values of the personal variable and the group variables are as shown in table 1:
TABLE 1
Variable    a    b    s
Initial     0    0    0
Select A    1    0    1
The group variable a is updated from the initial value 0 to 1, the group variable b retains its value, and the personal variable s takes the value 1. At this point, the server sends the video control information of the second video segment corresponding to s = 1 to the terminal, and the terminal plays that second video segment.
When the second user participating in the interactive video in the social group selects option B in the second video clip, the variation values of the personal variable and the group variable at this time are shown in table 2:
TABLE 2
Variable    a    b    s
Initial     1    0    0
Select B    1    1    2
The group variable a keeps its value unchanged, the group variable b is updated from 0 to 1, and the personal variable s takes the value 2. The server sends the video control information of the second video segment corresponding to s = 2 to the terminal, and the terminal plays that second video segment.
When a user participating in the interactive video in the social group selects the discard option C in the second video segment, the changed values of the personal variable and the group variables are as shown in table 3:
TABLE 3
Variable    a    b    s
Initial     1    1    0
Select C    1    1    0
That is, at this time, the group variable a and the group variable b both remain unchanged, the personal variable s still takes the value 0, and the terminal directly jumps to the second video segment corresponding to the discard option.
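The walkthrough of Tables 1 to 3 can be replayed as a short simulation. This sketch assumes a shared group-state dictionary and the A/B/C semantics described above; the function name interact is illustrative.

```python
# Replaying Tables 1-3: three users in a social group interact in turn.
# Group variables a and b are shared; the personal variable s is per user.

def interact(group, choice):
    """Apply one user's choice ('A', 'B', or 'C' to abstain) to the shared
    group state and return that user's personal variable s."""
    if choice == 'A':
        group['a'] += 1
        return 1
    if choice == 'B':
        group['b'] += 1
        return 2
    return 0  # discard: group variables unchanged, s stays 0

group = {'a': 0, 'b': 0}      # initialized by the first participant
s1 = interact(group, 'A')      # Table 1: a=1, b=0, s=1
s2 = interact(group, 'B')      # Table 2: a=1, b=1, s=2
s3 = interact(group, 'C')      # Table 3: a=1, b=1, s=0
print(group, s1, s2, s3)
```

Each user's s determines which branch segment that user plays next, while the accumulated a and b persist for everyone in the group.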
As shown in fig. 8, when the difference between the group variable a and the group variable b is greater than a certain threshold (for example, the threshold is 10, the group variable a is 20, and the group variable b is 40), then in response to the difference between the group variable b and the group variable a being greater than the threshold, the server sends video control information containing guidance information for option A to the target terminal. On receiving the guidance information for option A, the terminal displays a guidance box reading "there is a surprise" on option A, guiding the user to select option A.
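The threshold check driving that guidance can be sketched as below. The rule shown (guide users toward whichever option is trailing by more than the threshold) is an assumption consistent with the example, where b - a = 20 > 10 triggers guidance for option A; the function name guidance is illustrative.

```python
# Sketch: when the gap between the two group variables exceeds a threshold,
# the server attaches guidance information for the trailing option.

def guidance(a, b, threshold=10):
    """Return the option to highlight with a guide box, or None."""
    if b - a > threshold:
        return 'A'   # option A trails option B: guide users toward A
    if a - b > threshold:
        return 'B'   # option B trails option A: guide users toward B
    return None      # gap within threshold: no guidance

print(guidance(20, 40))  # 'A', as in the example in the text
```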
Through the group variable value system, the server collects the interaction behavior information generated by users in the process of watching the interactive video, computes the interaction behaviors of the network-wide user group into several quantified characteristic values through a predefined calculation formula, and feeds these characteristic values back into the plot of the interactive video to influence the playing of subsequent video segments. That is, a user's interaction behavior influences the playing of video segments of the interactive video watched by others, and the personal interaction behavior of a user is expanded to influence the video playing of a group. In other words, the personal interaction behavior of a user can exert interactive influence within a group, which improves the user's degree of participation in the interactive video, allows an individual to contribute to the development of the interactive video, and provides a larger creative space for interactive video creators.
In summary, in the scheme shown in the embodiments of the present application, when the terminal plays a video segment of an interactive video, the server obtains a corresponding group parameter according to the interaction records of group users on the interactive video, and influences the next video segment played by the terminal according to the group parameter. As a result, the plot development of the interactive video watched by a user is affected by the interaction records of other users on the interactive video, and correspondingly, the user's own interaction records may also affect the plot development seen by other users who watch the interactive video later. This realizes group interaction on the interactive video, expands the manner of interaction between users and the interactive video, and improves the effect of user interaction during playing of the interactive video.
Referring to fig. 9, a block diagram of a video playback control apparatus according to an exemplary embodiment of the present application is shown, where as shown in fig. 9, the video playback control apparatus includes:
a control request obtaining module 901, configured to obtain a video control request, where the video control request is triggered when a target terminal plays a first video segment; the first video segment is one of at least two video segments contained in the interactive video;
a first parameter obtaining module 902, configured to obtain a first parameter corresponding to the interactive video, where the first parameter is generated based on an interaction record of group users on the interactive video; the group of users are the users who have viewed or are viewing the interactive video;
a control information generating module 903, configured to generate video control information based on the first parameter, where the video control information is used to control the target terminal to play a second video segment corresponding to the first parameter.
In one possible implementation, the apparatus further includes:
the interaction record acquisition module is used for acquiring a target interaction record, wherein the target interaction record is an interaction record corresponding to interaction operation executed on a target video clip by the group of users, and the target video clip is at least one video clip appointed in the interaction video;
and the first parameter generation module is used for generating the first parameter based on the target interaction record.
In one possible implementation manner, the first parameter generating module is configured to,
in response to receiving a first interaction record, updating the first parameter based on the first interaction record; the first interaction record is any one of the target interaction records.
In one possible implementation manner, the group users are users who have watched or are watching the interactive video, and each user has a specified social relationship with the first user; the first user is a user corresponding to the target terminal.
In a possible implementation manner, the control information generating module 903 includes:
a second video segment determining unit, configured to determine the second video segment according to the first parameter;
a control information generating unit configured to generate the video control information based on the determined second video segment.
In a possible implementation manner, the video control information is used to instruct the target terminal to display an interaction option of the second video segment in a video picture of the first video segment;
or,
the video control information is used for instructing the target terminal to display, in the video picture of the first video segment, guidance information corresponding to the interaction option of the second video segment, and the guidance information is used for guiding the user to trigger the interaction option of the second video segment;
or,
the video control information is used for triggering the target terminal to directly play the second video clip when the first video clip is played.
In a possible implementation manner, the video control information includes at least one of an identifier of the second video segment and a network address of the second video segment.
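The contents of the video control information can be sketched as a simple message. This is a minimal illustration assuming a dict-based payload; the field names (segment_id, segment_url, guidance, autoplay) are assumptions for illustration, not defined by the patent.

```python
# Sketch of the video control information sent from the server to the
# target terminal, covering the three delivery forms described above.

def make_control_info(segment_id, segment_url=None, guidance=None,
                      autoplay=False):
    info = {'segment_id': segment_id}     # identifier of the second segment
    if segment_url is not None:
        info['segment_url'] = segment_url  # network address of the segment
    if guidance is not None:
        info['guidance'] = guidance        # guide text shown on an option
    info['autoplay'] = autoplay            # jump directly when current ends
    return info

msg = make_control_info('seg-2', segment_url='https://example.com/seg-2')
print(msg['segment_id'])
```

Setting guidance corresponds to the guidance-information form, and autoplay=True corresponds to the direct-jump form; the identifier and network address mirror the two fields named in the text.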
In a possible implementation, the second video segment determining unit is configured to,
determining the second video segment based on the first parameter and a second parameter; the second parameter is generated based on the interactive record of the interactive video in the process that the first user watches the interactive video at this time; the first user is a user corresponding to the target terminal.
In one possible implementation manner, the second video segment determining unit includes:
the parameter weighting subunit is used for weighting the first parameter and the second parameter to obtain a weighted parameter;
a second video segment determination subunit to determine the second video segment based on the weighting parameter.
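The weighting step above can be sketched as follows. The 0.5/0.5 weights, the threshold, and the segment names are illustrative assumptions; the patent only specifies that the first (group) parameter and second (personal) parameter are weighted into one value that determines the second video segment.

```python
# Sketch: weight the group parameter and the per-viewing personal parameter
# into one value, then map the weighted value to a branch segment.

def weighted_parameter(first, second, w_first=0.5, w_second=0.5):
    """Combine the group parameter (first) and personal parameter (second)."""
    return w_first * first + w_second * second

def choose_segment(weighted, threshold=10):
    """Different weighted values map to different branch segments."""
    return 'segment_A' if weighted > threshold else 'segment_B'

w = weighted_parameter(first=30, second=4)   # 0.5*30 + 0.5*4 = 17.0
print(choose_segment(w))                     # 'segment_A'
```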
In a possible implementation, the second video determining unit is configured to,
determining at least two candidate video segments based on the first parameter, the candidate video segments being video segments played subsequent to the first video segment;
determining the second video segment from the at least two candidate video segments based on the second parameter.
In another possible implementation manner, the second video determining unit is further configured to,
determining at least two candidate video segments based on the second parameter, the candidate video segments being video segments played subsequent to the first video segment;
determining the second video segment from the at least two candidate video segments based on the first parameter.
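The two-stage selection described in the two implementations above can be sketched as follows, using the first order (the first parameter narrows the candidate set, the second parameter picks the final segment); the alternative implementation simply swaps the roles of the two parameters. The branch data, field names, and the min_votes gating rule are illustrative assumptions.

```python
# Sketch of two-stage branch selection: the group parameter filters the
# candidate segments, then the personal parameter picks the final one.

def candidates_by_first(first_param, branches):
    """Group parameter narrows the candidates (assumed vote-gate rule)."""
    return [b for b in branches if b['min_votes'] <= first_param]

def pick_by_second(second_param, candidates):
    """Personal parameter selects the final second video segment."""
    for c in candidates:
        if c['choice'] == second_param:
            return c['name']
    return candidates[0]['name'] if candidates else None

branches = [
    {'name': 'seg_A', 'choice': 1, 'min_votes': 0},
    {'name': 'seg_B', 'choice': 2, 'min_votes': 0},
    {'name': 'seg_secret', 'choice': 1, 'min_votes': 50},
]
cands = candidates_by_first(20, branches)   # seg_secret needs 50+ votes
print(pick_by_second(2, cands))             # 'seg_B'
```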
In a possible implementation, the second video determination unit is further configured to,
determining the second video segment based on the first parameter and a third parameter; the third parameter is generated based on the barrage content sent by the group of users in the process of watching the first video segment.
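One way the barrage-derived third parameter could be produced is sketched below. The keyword-counting rule is purely an assumption for illustration; the patent does not specify how the barrage content is quantified.

```python
# Hypothetical sketch: deriving the third parameter from barrage
# (bullet-screen) comments sent by group users during the first segment,
# by counting comments that mention each option.

def barrage_parameter(comments):
    """Count barrage comments leaning toward option A vs option B."""
    a = sum(1 for c in comments if 'A' in c)
    b = sum(1 for c in comments if 'B' in c)
    return {'a_mentions': a, 'b_mentions': b}

p = barrage_parameter(['go A!', 'choose B', 'A is better'])
print(p)  # {'a_mentions': 2, 'b_mentions': 1}
```

The resulting counts would then be combined with the first (group) parameter to determine the second video segment, analogously to the weighting above.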
In summary, in the scheme shown in the embodiments of the present application, when the terminal plays a video segment of an interactive video, the server obtains a corresponding group parameter according to the interaction records of group users on the interactive video, and influences the next video segment played by the terminal according to the group parameter. As a result, the plot development of the interactive video watched by a user is affected by the interaction records of other users on the interactive video, and correspondingly, the user's own interaction records may also affect the plot development seen by other users who watch the interactive video later. This realizes group interaction on the interactive video, expands the manner of interaction between users and the interactive video, and improves the effect of user interaction during playing of the interactive video.
Fig. 10 is a block diagram illustrating the structure of a computer device 1000 according to an example embodiment. The computer device may be implemented as a server in the above-mentioned aspects of the present application. The computer apparatus 1000 includes a Central Processing Unit (CPU) 1001, a system Memory 1004 including a Random Access Memory (RAM) 1002 and a Read-Only Memory (ROM) 1003, and a system bus 1005 connecting the system Memory 1004 and the Central Processing Unit 1001. The computer device 1000 also includes a mass storage device 1006 for storing an operating system 1009, application programs 1010, and other program modules 1011.
The mass storage device 1006 is connected to the central processing unit 1001 through a mass storage controller (not shown) connected to the system bus 1005. The mass storage device 1006 and its associated computer-readable storage media provide non-volatile storage for the computer device 1000. That is, the mass storage device 1006 may include a computer-readable storage medium (not shown) such as a hard disk or Compact Disc-Only Memory (CD-ROM) drive.
Without loss of generality, the computer-readable storage media may include computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash Memory or other solid state Memory technology, CD-ROM, Digital Versatile Disks (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. Of course, those skilled in the art will appreciate that the computer storage media is not limited to the foregoing. The system memory 1004 and mass storage device 1006 described above may be collectively referred to as memory.
The computer device 1000 may also operate by being connected, through a network such as the Internet, to a remote computer on the network, in accordance with various embodiments of the present disclosure. That is, the computer device 1000 may be connected to the network 1008 through the network interface unit 1007 connected to the system bus 1005, or may be connected to other types of networks or remote computer systems (not shown) using the network interface unit 1007.
The memory further includes at least one instruction, at least one program, a code set, or a set of instructions, which is stored in the memory, and the central processing unit 1001 implements all or part of the steps in the video playing control method shown in each of the above embodiments by executing the at least one instruction, the at least one program, the code set, or the set of instructions.
In an exemplary embodiment, a computer program product or computer program is also provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the methods shown in the various embodiments described above.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (15)

1. A video playback control method, the method comprising:
acquiring a video control request, wherein the video control request is triggered when a target terminal plays a first video clip; the first video segment is one of at least two video segments contained in the interactive video;
acquiring a first parameter corresponding to the interactive video, wherein the first parameter is generated based on the interactive record of group users on the interactive video; the group of users are the users who have viewed or are viewing the interactive video;
generating video control information based on the first parameter, wherein the video control information is used for controlling the target terminal to play a second video segment corresponding to the first parameter; the second video segment is a video segment played after the first video segment.
2. The method of claim 1, wherein before the obtaining the first parameter corresponding to the interactive video, the method further comprises:
acquiring a target interaction record, wherein the target interaction record is an interaction record corresponding to an interaction operation executed on a target video clip by the group of users, and the target video clip is at least one video clip appointed in the interaction video;
and generating the first parameter based on the target interaction record.
3. The method of claim 2, wherein generating the first parameter based on the target interaction record comprises:
in response to receiving a first interaction record, updating the first parameter based on the first interaction record; the first interaction record is any one of the target interaction records.
4. The method of claim 2, wherein the group of users are users who have viewed or are viewing the interactive video, each user having a specified social relationship with the first user; the first user is a user corresponding to the target terminal.
5. The method of claim 1, wherein generating video control information based on the first parameter comprises:
determining the second video segment according to the first parameter;
generating the video control information based on the determined second video segment.
6. The method of claim 5,
the video control information is used for indicating the target terminal to display the interaction option of the second video clip in the video picture of the first video clip;
or,
the video control information is used for instructing the target terminal to display, in the video picture of the first video clip, guide information corresponding to the interaction option of the second video clip, and the guide information is used for guiding the user to trigger the interaction option of the second video clip;
or,
the video control information is used for triggering the target terminal to directly play the second video clip when the first video clip is played.
7. The method according to claim 5 or 6, wherein the video control information comprises at least one of an identification of the second video segment and a network address of the second video segment.
8. The method of claim 5, wherein determining the second video segment according to the first parameter comprises:
determining the second video segment based on the first parameter and a second parameter; the second parameter is generated based on the interactive record of the interactive video in the process that the first user watches the interactive video at this time; the first user is a user corresponding to the target terminal.
9. The method of claim 8, wherein determining the second video segment based on the first parameter and the second parameter comprises:
weighting the first parameter and the second parameter to obtain a weighted parameter;
determining the second video segment based on the weighting parameter.
10. The method of claim 8, wherein determining the second video segment based on the first parameter and the second parameter comprises:
determining at least two candidate video segments based on the first parameter, the candidate video segments being video segments played subsequent to the first video segment;
determining the second video segment from the at least two candidate video segments based on the second parameter.
11. The method of claim 8, wherein determining the second video segment based on the first parameter and the second parameter comprises:
determining at least two candidate video segments based on the second parameter, the candidate video segments being video segments played subsequent to the first video segment;
determining the second video segment from the at least two candidate video segments based on the first parameter.
12. The method of claim 5, wherein determining the second video segment according to the first parameter comprises:
determining the second video segment based on the first parameter and a third parameter; the third parameter is generated based on the barrage content sent by the group of users in the process of watching the first video segment.
13. A video playback control apparatus, characterized in that the apparatus comprises:
the control request acquisition module is used for acquiring a video control request, and the video control request is triggered when the target terminal plays the first video segment; the first video segment is one of at least two video segments contained in the interactive video;
the first parameter acquisition module is used for acquiring a first parameter corresponding to the interactive video, and the first parameter is generated based on the interactive record of the group users on the interactive video; the group of users are the users who have viewed or are viewing the interactive video;
a control information generation module, configured to generate video control information based on the first parameter, where the video control information is used to control the target terminal to play a second video segment corresponding to the first parameter; the second video segment is a video segment played after the first video segment.
14. A computer device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the video playback control method of any of claims 1 to 12.
15. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the video playback control method of any of claims 1 to 12.
CN202010783516.3A 2020-08-06 2020-08-06 Video playing control method and device, computer equipment and storage medium Active CN111918140B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010783516.3A CN111918140B (en) 2020-08-06 2020-08-06 Video playing control method and device, computer equipment and storage medium


Publications (2)

Publication Number Publication Date
CN111918140A CN111918140A (en) 2020-11-10
CN111918140B (en) 2021-08-03

Family

ID=73287939

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010783516.3A Active CN111918140B (en) 2020-08-06 2020-08-06 Video playing control method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111918140B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114003761A (en) * 2021-10-29 2022-02-01 深圳市兆驰股份有限公司 Video interaction method, device, system, equipment and computer readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8898705B2 (en) * 2012-03-28 2014-11-25 United Video Properties, Inc. System and methods for modifying improperly formatted metadata
CN109167950A (en) * 2018-10-25 2019-01-08 腾讯科技(深圳)有限公司 Video recording method, video broadcasting method, device, equipment and storage medium
CN110191358A (en) * 2019-07-19 2019-08-30 北京奇艺世纪科技有限公司 Video generation method and device
CN110677707A (en) * 2019-09-26 2020-01-10 林云帆 Interactive video generation method, generation device, equipment and readable medium
CN110809175A (en) * 2019-09-27 2020-02-18 腾讯科技(深圳)有限公司 Video recommendation method and device
CN111031379A (en) * 2019-12-19 2020-04-17 北京奇艺世纪科技有限公司 Video playing method, device, terminal and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050166230A1 (en) * 2003-03-18 2005-07-28 Gaydou Danny R. Systems and methods for providing transport control
CN110784753B (en) * 2019-10-15 2023-01-17 腾讯科技(深圳)有限公司 Interactive video playing method and device, storage medium and electronic equipment
CN110933473A (en) * 2019-12-10 2020-03-27 北京爱奇艺科技有限公司 Video playing heat determining method and device
CN111447239B (en) * 2020-04-13 2023-07-04 抖音视界有限公司 Video stream playing control method, device and storage medium


Also Published As

Publication number Publication date
CN111918140A (en) 2020-11-10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant