CN114422866A - Video processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114422866A
Authority
CN
China
Prior art keywords
video
playing
played
change rate
video frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210048030.4A
Other languages
Chinese (zh)
Other versions
CN114422866B (en)
Inventor
吕华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen TCL New Technology Co Ltd
Original Assignee
Shenzhen TCL New Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen TCL New Technology Co Ltd
Priority to CN202210048030.4A
Publication of CN114422866A
Application granted
Publication of CN114422866B
Status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/647Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless
    • H04N21/64723Monitoring of network processes or resources, e.g. monitoring of network load
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/80Responding to QoS
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643Communication protocols
    • H04N21/6437Real-time Transport Protocol [RTP]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiments of the present application disclose a video processing method and apparatus, an electronic device, and a storage medium. The video processing method can receive the sequence identifiers of the video frames in a played video; perform video frame loss detection on the played video based on the sequence identifiers; identify the picture change rate of the played video when frame loss is detected; and perform play adjustment processing on the played video based on the picture change rate, thereby improving the playback quality of the played video and the user's experience of watching it.

Description

Video processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to a video processing method and apparatus, an electronic device, and a storage medium.
Background
With the development of information and communication technologies, watching video has become part of everyday life. In the prior art, the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP) is generally adopted for the real-time transmission of played video. However, when video frame loss occurs during TCP or UDP transmission, the handling of the loss is not combined with the specific video playing scene, which degrades the playback quality of the played video and the user's experience of watching it.
Disclosure of Invention
The embodiments of the present application provide a video processing method and apparatus, an electronic device, and a storage medium, which can improve the playback quality of a played video and thereby improve the user's experience of watching it.
The embodiment of the application provides a video processing method, which comprises the following steps:
receiving a sequence identifier of a video frame in a playing video;
based on the sequence identification, performing video frame loss detection on the playing video;
when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video;
and carrying out playing adjustment processing on the playing video based on the picture change rate.
Correspondingly, an embodiment of the present application further provides a video processing apparatus, including:
the receiving unit is used for receiving the sequence identification of the video frames in the playing video;
a loss detection unit, configured to perform video frame loss detection on the played video based on the sequence identifier;
the identification unit is used for identifying the picture change rate of the playing video when the playing video is detected to have video frame loss;
and the playing adjustment unit is used for carrying out playing adjustment processing on the playing video based on the picture change rate.
In one embodiment, the loss detection unit includes:
the identifier matching subunit is used for matching the sequence identifier with a preset reference sequence identifier;
the detection subunit is configured to detect whether a video frame corresponding to the preset reference sequence identifier is included in a preset data pool or not when the sequence identifier matches the preset reference sequence identifier;
and the overtime judgment subunit is configured to judge whether the acquisition time of the video frame corresponding to the preset reference sequence identifier is overtime when the preset data pool does not include the video frame corresponding to the preset reference sequence identifier.
In an embodiment, the timeout determining subunit includes:
the extraction module is used for extracting the acquisition time of the video frame corresponding to the preset reference sequence identifier when the preset data pool does not comprise the video frame corresponding to the preset reference sequence identifier;
the time matching module is used for matching the acquisition time with a preset time threshold;
and the overtime module is used for determining, when the acquisition time does not match the preset time threshold, that acquisition of the video frame corresponding to the preset reference sequence identifier has timed out and that the video frame of the played video has been lost.
In an embodiment, the loss detection unit further includes:
the updating subunit is configured to update the preset reference sequence identifier when the preset data pool includes a video frame corresponding to the sequence identifier, so as to obtain an updated reference sequence identifier;
and the receiving subunit is used for receiving the residual video frames in the playing video based on the updated reference sequence identifier.
In one embodiment, the identification unit includes:
the time calculating subunit is used for calculating the time information of the video frame of the playing video;
the statistical processing subunit is used for performing statistical processing on the time information to obtain the counted time information of the video frame;
and the logical operation subunit is used for carrying out logical operation processing on the counted time information to obtain the picture change rate of the played video.
In an embodiment, the play adjusting unit includes:
the comparison subunit is used for comparing the picture change rate with a preset picture change rate to obtain a comparison result;
and the playing adjustment subunit is used for performing playing adjustment processing on the playing video by adopting a corresponding adjustment mode based on the comparison result.
In an embodiment, the play adjustment subunit includes:
the first playing adjustment module is used for performing playing adjustment processing on the playing video in a first adjustment mode when the picture change rate accords with the preset picture change rate;
and the second playing adjustment module is used for performing playing adjustment processing on the playing video in a second adjustment mode when the picture change rate does not accord with the preset picture change rate.
Correspondingly, the embodiment of the application also provides an electronic device, which comprises a memory and a processor; the memory stores a computer program, and the processor is configured to run the computer program in the memory to execute the video processing method provided by any one of the embodiments of the present application.
Correspondingly, an embodiment of the present application further provides a storage medium, where the storage medium stores a computer program, and the computer program, when executed by a processor, implements the video processing method provided in any embodiment of the present application.
The method and the apparatus can receive the sequence identifiers of the video frames in a played video; perform video frame loss detection on the played video based on the sequence identifiers; identify the picture change rate of the played video when frame loss is detected; and perform play adjustment processing on the played video based on the picture change rate, thereby improving the playback quality of the played video and the user's experience of watching it.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present application, and that those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a scene schematic diagram of a video processing method provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of a video processing method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a video processing method according to an embodiment of the present application;
fig. 4 is a schematic view of another scene of a video processing method provided in an embodiment of the present application;
fig. 5 is a schematic flowchart of a video processing method according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a video processing method according to an embodiment of the present application;
fig. 7 is a schematic flowchart of a video processing method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a video processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, however, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a video processing method, which can be executed by a video processing device, and the video processing device can be integrated in an electronic device. The electronic device may include at least one of a terminal and a server. That is, the video processing method may be executed by the terminal or may be executed by the server.
The terminal may include a personal computer, a tablet computer, a smart television, a smartphone, a smart home device, a wearable electronic device, a VR/AR device, an in-vehicle computer, and the like.
The server may be an interworking server or a background server among a plurality of heterogeneous systems, an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, big data, and artificial intelligence platforms.
In an embodiment, as shown in fig. 1, the video processing apparatus may be integrated on an electronic device such as a terminal or a server to implement the video processing method provided in the embodiment of the present application. Specifically, the electronic device may receive a sequence identifier of a video frame in a playing video; based on the sequence identification, performing video frame loss detection on the played video; when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video; and performing playing adjustment processing on the playing video based on the picture change rate.
The following are detailed below, and it should be noted that the order of description of the following examples is not intended to limit the preferred order of the examples.
In the embodiments of the present application, the video processing method proposed in the embodiments of the present application will be described from the perspective of integrating a video processing apparatus in an electronic device.
As shown in fig. 2, a video processing method is provided, and the specific flow includes:
101. sequence identification of video frames in a playing video is received.
In one embodiment, the real-time transmission of the played video is generally performed using the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP).
For example, when network congestion occurs while the played video is transmitted in real time using the TCP protocol, the sending end discards old data that has not yet been sent, and the transmission network does not discard data. In this case, the user at the receiving end sees the displayed picture of the played video skip frames (frame skipping), but no screen corruption (also called mosaic) occurs.
For another example, when network congestion occurs while the played video is transmitted in real time using UDP, the sending end does not discard data, but the transmission network does. In this case, the user at the receiving end sees partial picture loss and mosaic artifacts in the played video.
In the prior art, when video frame loss occurs in the played video, the choice between frame skipping and mosaic does not take the user's usage scenario into account. For example, when the picture of the played video changes little, a mosaic is easily perceived by the user; in that case, frame skipping yields a better playback result. Conversely, when the picture of the played video changes rapidly, a slight mosaic is hard for the user to perceive, while frame skipping is easy to perceive.
Therefore, the embodiment of the application provides a video processing method, which can receive the sequence identifiers of video frames in a played video; perform video frame loss detection on the played video based on the sequence identifiers; identify the picture change rate of the played video when frame loss is detected; and perform play adjustment processing on the played video based on the picture change rate. By identifying the picture change rate of the played video, different adjustment modes can be applied for different picture change rates, thereby improving the playback quality of the played video.
The playing video may include a video being played in real time.
A video frame is a unit that constitutes the played video; that is, the played video is composed of successive video frames, one frame after another.
The sequence identifier indicates which frame of the played video a given video frame is. Through the sequence identifiers, it can be determined whether any video frames of the played video have been lost.
In an embodiment, in order to ensure that the played video can be correctly recombined at the receiving end, when the transmitting end transmits the played video, the played video is often split into a plurality of video frames, and then each video frame is transmitted to the receiving end. Then, after receiving the video frame of the playing video, the receiving end can recombine the video frame, thereby obtaining the playing video. In order to enable the receiving end to correctly recombine the video frames of the played video, the transmitting end can add corresponding sequence identifiers to each video frame of the played video before sending the played video, so that the receiving end can correctly recombine the video frames according to the sequence identifiers of the video frames.
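The sender-side tagging and receiver-side reassembly described above can be sketched as follows. This is an illustrative sketch only; the function names and the list-based framing are hypothetical and not taken from the patent:

```python
# Hypothetical sketch of sequence tagging and reassembly; names are
# illustrative and not part of the patent.

def tag_frames(frames):
    """Sender side: attach a consecutive sequence identifier to each
    video frame before transmission."""
    return [(seq, frame) for seq, frame in enumerate(frames)]

def reassemble(tagged_frames):
    """Receiver side: reorder received (seq, frame) pairs back into
    playback order using their sequence identifiers."""
    return [frame for _, frame in sorted(tagged_frames, key=lambda p: p[0])]
```

Even if the pairs arrive shuffled by the network, sorting on the identifier restores the original frame order.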
In an embodiment, before receiving the sequence identifier of the video frame in the played video, a communication connection must be established between the sending end and the receiving end, so that the played video can be transmitted between the sending end and the receiving end through the communication connection.
In an embodiment, a video compression mode and a video transmission format may be negotiated between a transmitting end and a receiving end. Then, a communication connection may be established based on the negotiated video compression mode and video transmission format, and video may be transmitted based on the communication connection.
For example, the sending end may send the video compression mode and the video transmission format that can be supported by the sending end to the receiving end, and then the receiving end may screen out the target video compression mode and the target video transmission format between the receiving end and the sending end from the video compression mode and the video transmission format that can be supported by the sending end in combination with the video compression mode and the video transmission format that can be supported by the receiving end. Then, the receiving end and the sending end can establish communication connection based on the target video compression mode and the target video transmission format, and transmit and play video in real time through the communication connection.
For example, the transmitting end and the receiving end may establish a video transmission channel and a control channel. The video transmission channel is used for transmitting the played video. For example, the video transmission channel may transmit the played video based on the Real-time Transport Protocol (RTP). The RTP protocol is a real-time transport protocol built on the UDP protocol. When the video transmission channel transmits a played video based on the RTP protocol, the played video is split into a plurality of RTP packets, and each RTP packet can be regarded as a video frame.
The control channel may be used by the two transmission parties to negotiate a video compression mode and a video transmission format. In addition, when video frame loss exists in the played video, the control channel can also be used to carry the frame loss information sent by the receiving end to the sending end, so that the sending end can resend the lost video frames as soon as possible and normal playback of the played video can be restored.
102. And based on the sequence identification, carrying out video frame loss detection on the played video.
In one embodiment, when there is video frame loss in the played video, the video processing apparatus needs to adjust how the played video is displayed. For example, the display page of the played video may be made to skip a frame, or made to display a mosaic. Therefore, the video processing apparatus performs video frame loss detection on the played video; when frame loss is detected, it identifies the picture change rate of the played video and performs play adjustment processing on the played video based on that picture change rate.
103. And when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video.
In an embodiment, when the video processing apparatus detects that there is a video frame loss in the played video, the video processing apparatus may identify a picture change rate of the played video, and perform play adjustment processing on the played video based on the picture change rate.
The picture change rate refers to the rate at which the displayed picture changes while the played video is playing.
In one embodiment, by identifying the picture change rate of the played video, the user's sensitivity to changes in the played video can be judged. For example, when the picture of the played video changes little, a mosaic is easily perceived by the user, while frame skipping is not. Conversely, when the picture changes a great deal, a slight mosaic is hard for the user to perceive, while frame skipping is easy to perceive. Therefore, the video processing apparatus can identify the picture change rate of the played video and adjust the played video based on that rate.
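The patent derives the picture change rate from the time information of the video frames, and under a variable frame rate a frame is sent only when the picture changes, so one plausible estimate is the number of frames received per unit time. The following is a hedged sketch under that assumption; the function name, window parameter, and estimator are hypothetical, not the patent's exact computation:

```python
def picture_change_rate(frame_times, window=1.0):
    """Hypothetical estimator: count frames whose arrival times fall
    within the most recent `window` seconds. Under a variable frame
    rate, frames arrive only when the picture changes, so arrival
    density tracks the picture change rate."""
    if not frame_times:
        return 0.0
    latest = max(frame_times)
    recent = [t for t in frame_times if latest - t <= window]
    return len(recent) / window
```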
104. And performing playing adjustment processing on the playing video based on the picture change rate.
In an embodiment, after the video processing apparatus identifies the picture change rate of the playing video, the playing video may be subjected to a playing adjustment process based on the picture change rate.
For example, when occasional congestion or errors in the transmission network cause frame loss while the picture change rate is low, a frame skipping strategy is adopted when displaying the played video, so that the user is unlikely to notice that playback is not smooth. Conversely, when the picture change rate is high, a mosaic playing strategy may be adopted when displaying the played video, so that the user is again unlikely to notice that playback is not smooth.
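The two strategies above reduce to a threshold decision on the picture change rate. The following is a minimal sketch; the threshold value and the strategy names are illustrative assumptions, not specified by the patent:

```python
def choose_strategy(change_rate, threshold=5.0):
    """Pick a play-adjustment strategy after frame loss (hypothetical
    threshold): skip the lost frame when the picture changes slowly,
    since a mosaic would be easy to notice; otherwise render with the
    partial data, since a brief mosaic is hard to notice while the
    picture changes rapidly."""
    return "skip_frame" if change_rate < threshold else "render_mosaic"
```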
The embodiment of the application provides a video processing method, which comprises the following steps: receiving a sequence identifier of a video frame in a playing video; based on the sequence identification, performing video frame loss detection on the played video; when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video; and performing playing adjustment processing on the playing video based on the picture change rate. By the method provided by the embodiment of the application, different playing adjustment processing can be performed on the played video according to different picture change rates of the played video, so that a user cannot feel the situation that video frames are lost in the played video, and the playing effect of the played video and the experience of watching the played video by the user are improved.
The method described in the above examples is further illustrated in detail below by way of example.
The method of the embodiment of the present application will be described by taking an example that a video processing method is integrated on a receiving end (where the receiving end may be a terminal). Specifically, as shown in fig. 3, a flow of a video processing method provided in an embodiment of the present application may include:
201. The receiving end receives the sequence identifiers of the video frames in the played video.
In an embodiment, before the receiving end receives the sequence identifier of the video frame in the playing video, the receiving end may establish a communication connection with the transmitting end.
For example, as shown in fig. 4, the sending end and the receiving end may negotiate a video compression method, a video transmission format, and whether a variable frame rate is supported, and establish a communication connection based on the negotiated parameters.
The video compression method may be H.264. A video is the result of continuously playing a sequence of pictures, and adjacent pictures differ only slightly, so an intermediate picture only needs to carry the difference relative to the previous picture. For example, suppose a video is composed of 100 consecutive pictures, each called a frame. Simple compression keeps the 1st picture in full, keeps only the part of the 2nd picture that differs from the 1st, and so on. The retained portion of each picture is then compressed again, forming a series of compressed data that constitutes the compressed video stream. In H.264, a full-picture frame is called an I frame (the first frame in the example), and a frame carrying only the difference is called a P frame. Actual compression is much more complicated than this description; there is also a bidirectional difference frame, the B frame, which is not used for real-time transmission. Therefore, the receiving end must receive an I frame before it can display any picture; without it, nothing can be displayed. When a P frame is lost, only the difference portion fails to display, and when the picture change is small, the user can barely perceive it.
The frame rate is the number of pictures contained in one second; the larger the number, the better the continuity, but the higher the transmission requirement. Current video encoding and decoding support a variable frame rate, i.e., a picture is delivered only when it changes, and nothing is delivered when it does not change. Fixed-frame-rate transmission is no longer used; if the picture at the sending end does not change, the sending end sends no data at all.
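The variable-frame-rate behavior — transmit a picture only when it differs from the previous one — can be sketched as follows. This is illustrative only: simple equality stands in for real change detection, and the names are hypothetical:

```python
def frames_to_send(pictures):
    """Under a variable frame rate, transmit a picture only when it
    differs from the previously sent one; unchanged pictures produce
    no data on the wire."""
    sent, prev = [], None
    for pic in pictures:
        if pic != prev:   # picture changed, so it must be delivered
            sent.append(pic)
            prev = pic
    return sent
```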
In one embodiment, after the communication connection is established, the receiving end and the transmitting end may transmit the playing video based on the communication connection. For example, when the receiving end is a television and the transmitting end is a computer, the television and the computer can transmit and play video based on the established communication connection.
In an embodiment, as shown in fig. 4, during the process of transmitting the playing video between the sending end and the receiving end, the receiving end may continuously perform video frame loss detection on the playing video. When the receiving end detects that video frame loss exists in the played video, the receiving end can request the sending end for picture synchronization.
202. And the receiving end carries out video frame loss detection on the played video based on the sequence identification.
In an embodiment, the played video may be transmitted between the receiving end and the sending end based on the RTP protocol. The RTP protocol is a real-time transport protocol built on the UDP protocol. When the video transmission channel transmits a played video based on the RTP protocol, the played video is split into a plurality of RTP packets, and each RTP packet can be regarded as a video frame. Each RTP packet carries a sequence identifier. Normally, the sequence identifiers of consecutive RTP packets are consecutive; if they are not, video frames have been lost. Therefore, the receiving end can perform video frame loss detection on the played video based on the sequence identifiers.
In an embodiment, after the receiving end receives the sequence identifier of the video frame of the playing video, it may be determined whether the video frame is the first frame based on the sequence identifier of the video frame.
When the video frame is the first frame, its sequence identifier may be used as the preset reference sequence identifier. The preset reference sequence identifier serves as the basis for judging whether a video frame has been lost; it corresponds to the video frame the receiving end expects to receive next. For example, when the preset reference sequence identifier is 12, the receiving end expects to receive the video frame with sequence identifier 12.
And when the video frame is not the first frame, the sequence identifier of the video frame can be matched with the preset parameter sequence identifier. When the sequence identifier is matched with the preset reference sequence identifier, detecting whether a video frame corresponding to the preset reference sequence identifier is included in the preset data pool. And when the preset data pool does not comprise the video frame corresponding to the preset reference sequence identifier, judging whether the acquisition time of the video frame corresponding to the preset reference sequence identifier is overtime.
When the acquisition of the video frame corresponding to the preset reference sequence identifier has timed out, that video frame is considered lost.
For example, as shown in fig. 5, the receiving end acquires a new video frame (corresponding to Fragment) of the playing video, where the sequence identifier of the Fragment is SEQ. It can then be determined whether the Fragment is the first frame, and if so, the preset reference sequence identifier (corresponding to NextFragmentSEQ) is set to SEQ. When the Fragment is not the first frame, it can be judged whether SEQ is greater than or equal to NextFragmentSEQ. When SEQ is smaller than NextFragmentSEQ, the receiving end has already received this video frame, so it can be discarded and a new video frame accepted. When SEQ is greater than or equal to NextFragmentSEQ, the Fragment can be added to the preset data pool. The receiving end can then determine whether the data pool includes the video frame whose sequence identifier equals NextFragmentSEQ. When the preset data pool does not include the video frame corresponding to NextFragmentSEQ, the receiving end has not received the expected video frame, and it can judge whether the acquisition of the video frame corresponding to NextFragmentSEQ has timed out.
The preset data pool may be used to store video frames that have been received by the receiving end.
In one embodiment, the video frames of the playing video may not arrive at the receiving end in order, which is determined by the characteristics of the IP network. Therefore, when the receiving end receives the video frames, it needs to sort them, and the sorting has a time limit, such as 100 ms. For example, the receiving end currently needs the video frame with NextFragmentSEQ of 11, but the video frame with SEQ of 13 arrives first, so the video frame with SEQ of 13 is saved first. If after more than 100 ms the video frame with SEQ of 11 still has not arrived, the receiving end declares that the video frame with SEQ of 11 has been lost and instead waits for the video frame with SEQ of 12, updating NextFragmentSEQ to 12.
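The detection flow described above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the names (`next_seq` for NextFragmentSEQ, `pool` for the preset data pool) and the 100 ms window are assumptions drawn from the text.

```python
class FrameLossDetector:
    """Sketch of sequence-identifier-based frame loss detection.
    next_seq plays the role of the preset reference sequence identifier
    (NextFragmentSEQ); pool is the preset data pool."""

    def __init__(self, timeout_s=0.1):   # 100 ms reordering window
        self.next_seq = None
        self.pool = {}                   # seq -> fragment
        self.wait_start = None
        self.timeout_s = timeout_s

    def on_fragment(self, seq, fragment, now):
        """Feed one RTP fragment; return True if a frame loss is declared."""
        if self.next_seq is None:        # first frame: adopt its SEQ
            self.next_seq, self.wait_start = seq, now
        if seq < self.next_seq:          # already received; drop duplicate
            return False
        self.pool[seq] = fragment        # store for reordering
        while self.next_seq in self.pool:  # expected frame(s) arrived
            self.pool.pop(self.next_seq)
            self.next_seq += 1
            self.wait_start = now
        if now - self.wait_start > self.timeout_s:
            self.next_seq += 1           # timed out: declare the frame lost
            self.wait_start = now
            return True
        return False
```

Feeding the SEQ 11/13 example above through this sketch reproduces the described behavior: frame 13 is pooled while frame 11 is awaited, and once the 100 ms window elapses, frame 11 is declared lost and the reference identifier advances.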
In an embodiment, the step "when the preset data pool does not include the video frame corresponding to the sequence identifier, determining whether the acquisition time of the video frame corresponding to the preset reference sequence identifier is overtime" may include:
when the preset data pool does not comprise the video frame corresponding to the sequence identifier, extracting the acquisition time of the video frame corresponding to the preset reference sequence identifier;
matching the acquisition time with a preset time threshold;
and when the acquisition time does not match the preset time threshold, the acquisition of the video frame corresponding to the preset reference sequence identifier has timed out, and a video frame has been lost from the played video.
The acquisition time of the video frame may refer to the time for which the receiving end has been trying to acquire the video frame. If the receiving end has not yet received the video frame, the acquisition time keeps accumulating.
The preset time threshold may be a basis for determining whether the video frame has been acquired overtime.
When the acquisition time of a video frame exceeds the preset time threshold, the receiving end has not received the video frame with the preset reference sequence identifier within the specified time; the acquisition of that video frame has therefore timed out, and a video frame has been lost from the played video.
For example, the preset time threshold may be set to 100 ms. When the acquisition time of the video frame exceeds 100 ms, the receiving end has not received the video frame whose sequence identifier equals the preset reference sequence identifier within the specified time, so the acquisition of that video frame has timed out and a video frame has been lost from the played video.
In an embodiment, the video processing method provided in the embodiment of the present application further includes:
when the preset data pool comprises the video frames corresponding to the preset reference sequence identification, updating the preset reference sequence identification to obtain an updated reference sequence identification;
receiving remaining video frames in the played video based on the updated reference sequence identification.
For example, as shown in fig. 5, when the preset data pool includes the video frame corresponding to NextFragmentSEQ, the video frames in the preset data pool are consecutive; at this time, the newly received Fragment can be removed from the preset data pool, and NextFragmentSEQ is incremented by 1. Then, the remaining video frames in the played video are received based on the updated reference sequence identifier, and whether a video frame has been lost from the played video is detected based on the updated reference sequence identifier.
In addition, as shown in fig. 5, after the preset reference sequence identifier is updated, the receiving end may further determine whether the received video frames can be reassembled into the played video. If the played video cannot yet be reassembled, the receiving end can continue to receive the remaining video frames in the played video based on the updated reference sequence identifier.
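This pool-draining step can be sketched as a small helper; the function name and return shape are hypothetical, mirroring the fig. 5 flow of popping the expected frame and incrementing NextFragmentSEQ:

```python
def drain_pool(pool, next_seq):
    """Pop consecutive frames from the preset data pool starting at the
    reference sequence identifier; return the in-order frames together
    with the updated reference sequence identifier."""
    frames = []
    while next_seq in pool:      # the pool holds the expected frame
        frames.append(pool.pop(next_seq))
        next_seq += 1            # updated reference sequence identifier
    return frames, next_seq
```

Draining stops at the first gap, so any frame stored out of order (e.g. SEQ 15 while 14 is missing) stays in the pool until its predecessors arrive or are declared lost.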
In an embodiment, when the video frame corresponding to the preset reference sequence identifier is not received within the specified time, the preset reference sequence identifier may also be updated to obtain an updated reference sequence identifier. The receiving end then informs the sending end that a video frame has been lost from the played video and that the picture needs to be synchronized. A synchronization picture may refer to a complete picture, for example an IDR frame as defined by the H.264 protocol, which the sending end needs to send.
203. When detecting that a video frame has been lost from the played video, the receiving end identifies the picture change rate of the played video.
In an embodiment, when the receiving end detects that there is a video frame loss in the playing video, the receiving end can identify the frame change rate of the playing video. Specifically, the step "identifying the picture change rate of the playing video when detecting that there is a video frame loss in the playing video" may include:
calculating time information of video frames of the played video;
carrying out statistical processing on the time information to obtain the statistical time information of the video frames;
and performing logical operation processing on the counted time information to obtain the picture change rate of the played video.
Wherein the time information of the video frame may refer to a relationship between an actual reception time and an expected reception time of the video frame.
In an embodiment, the time information of the video frame may refer to a difference between an actual reception time and an expected reception time of the video frame, and the like.
The receiving end can obtain the actual receiving time of the video frame, subtract the expected receiving time of the video frame from the actual receiving time of the video frame, and use the obtained time difference as the time information of the video frame.
For example, as shown in fig. 6, the time information of the video frame may be equal to the difference between the actual reception time and the expected reception time of the video frame.
When a plurality of video frames of a playing video are received, time information of each video frame can be calculated respectively. For example, when 10 video frames of a playing video are received, time information of the 10 video frames may be calculated, respectively.
In an embodiment, the time information may be subjected to statistical processing to obtain the statistical time information of the video frame.
The statistical processing may be processing the time information by using a mathematical statistical method. For example, statistical processing may include averaging, calculating a variance or calculating a standard deviation, and so forth.
For example, after the time information of each video frame in the playing video is calculated, the time information of a plurality of video frames may be averaged, and the average may be used as the counted time information.
For example, as shown in fig. 6, the average time difference over a plurality of video frames may be calculated. For example, for 10 video frames, the 10 time differences can be added and then divided by 10 to obtain the statistical time information.
In an embodiment, after the statistical time information is obtained, the statistical time information may be subjected to a logical operation process, so as to obtain a picture change rate of the played video.
For example, as shown in fig. 6, the picture change rate may be equal to the inverse of the counted time information.
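A sketch of the fig. 6 computation under the stated assumptions: average the per-frame differences between actual and expected receive time, then take the inverse. The function name and time units (seconds) are illustrative, not from the patent.

```python
def picture_change_rate(actual_times, expected_times):
    """Compute the picture change rate as the inverse of the averaged
    time information (actual receive time minus expected receive time)."""
    diffs = [a - e for a, e in zip(actual_times, expected_times)]
    mean_diff = sum(diffs) / len(diffs)  # statistical time information
    return 1.0 / mean_diff               # logical operation: take inverse
```

Intuitively, frames of a fast-changing picture are larger and arrive later than expected, enlarging the mean difference and lowering the computed rate only if the difference is read as delay per frame; here the inverse simply maps small average differences to a high change rate, as in fig. 6.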
In an embodiment, when it is detected that there is a video frame loss in the played video, the receiving end may identify a picture change rate of the played video, so that the played video may be adjusted and processed based on the picture change rate, thereby improving a playing effect of the played video and an experience of a user watching the played video.
204. And the receiving end carries out playing adjustment processing on the played video based on the picture change rate.
In an embodiment, after the receiving end identifies the picture change rate, it may perform playing adjustment processing on the playing video based on the picture change rate. The receiving end can judge whether the picture change rate of the played video is high or low, since it performs different playing adjustment processing on the played video depending on the picture change rate. Specifically, the step of performing playing adjustment processing on the played video based on the picture change rate may include:
comparing the picture change rate with a preset picture change rate to obtain a comparison result;
and performing playing adjustment processing on the playing video by adopting a corresponding adjustment mode based on the comparison result.
The preset picture change rate may be a picture change rate preset by a developer. By comparing the picture change rate of the played video with the preset picture change rate, it can be determined whether the picture change rate of the played video is a high change rate or a low change rate.
In an embodiment, the step "performing playing adjustment processing on the playing video in a corresponding adjustment manner based on the comparison result" may include:
when the picture change rate accords with the preset picture change rate, performing playing adjustment processing on the played video by adopting a first adjustment mode;
and when the picture change rate does not accord with the preset picture change rate, performing playing adjustment processing on the played video by adopting a second adjustment mode.
In an embodiment, the picture change rate conforming to the preset picture change rate may mean that the picture change rate is greater than the preset picture change rate. In this case, the picture change rate of the played video is a high change rate, and the playing adjustment processing may be performed on the played video in the first adjustment mode.
The first adjustment mode may refer to performing playing adjustment processing on the played video in a mosaic mode, that is, a mosaic is displayed when the played video has lost a video frame, cannot be displayed normally, and has a high picture change rate.
In an embodiment, the picture change rate not conforming to the preset picture change rate may mean that the picture change rate is less than or equal to the preset picture change rate. In this case, the picture change rate of the played video is a low change rate, and the playing adjustment processing may be performed on the played video in the second adjustment mode.
The second adjustment mode may refer to performing playing adjustment processing on the played video in a frame skipping mode, that is, the picture skips frames when the played video has lost a video frame, cannot be displayed normally, and has a low picture change rate.
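The two-branch decision can be sketched as follows; the mode names and the strict-greater-than comparison are illustrative assumptions consistent with the description above (greater than the preset rate means high, less than or equal means low):

```python
def choose_adjustment(change_rate, preset_change_rate):
    """Select the playing adjustment mode after a video frame loss:
    mosaic for a high picture change rate (first adjustment mode),
    frame skipping for a low one (second adjustment mode)."""
    if change_rate > preset_change_rate:  # conforms to the preset rate
        return "mosaic"
    return "frame_skip"
```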
In one embodiment, as shown in fig. 7, after identifying the picture change rate of the playing video, the receiving end may determine whether the playing video is paused. When the playing video is not paused and the picture change rate is high, the mosaic can be displayed. When the playing video is not paused and the picture change rate is low, frame skipping can be displayed.
When the playing video is paused, this indicates an abnormal condition of the played video, for example, video frames have already been lost from the played video. It can then be determined whether the latest received video frame is a picture synchronization frame (i.e., an IDR frame). If the latest video frame is the picture synchronization frame, the picture of the playing video is recovered. If the latest video frame is not the picture synchronization frame, it can be discarded and the playing video remains paused.
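The pause-recovery branch of fig. 7 can be sketched as follows; `is_idr` is an assumed flag that would in practice be derived from the H.264 NAL unit type of the received frame:

```python
def handle_frame_while_paused(is_idr):
    """After a loss pauses playback, only a picture synchronization frame
    (an IDR frame) recovers the picture; other frames are discarded and
    the played video stays paused."""
    return "resume" if is_idr else "discard"
```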
The embodiment of the application provides a video processing method, which comprises the following steps: receiving a sequence identifier of a video frame in a playing video; performing video frame loss detection on the played video based on the sequence identifier; when detecting that a video frame has been lost from the played video, identifying the picture change rate of the played video; and performing playing adjustment processing on the played video based on the picture change rate. With the method provided by the embodiment of the application, different playing adjustment processing can be performed on the played video according to its picture change rate, so that the user does not perceive the lost video frames, improving the playing effect of the played video and the user's viewing experience.
With the video processing method provided by the embodiment of the application, when the network condition is poor and data is lost during transmission, one of the mosaic mode and the frame skipping mode can be selected according to the picture change rate. Frame skipping is selected when the picture change rate is low, and the mosaic is selected when the picture change rate is high, improving user experience.
The video processing method provided by the embodiment of the application can be applied to a screen-mirroring scenario in which a mobile phone or computer projects its screen to a television; the projected content is usually a playing video or a basically static picture, such as a mirrored computer desktop or a PPT presentation. When the picture of the playing video is basically unchanged, the frame skipping mode can be adopted for display, ensuring that the television shows a clear picture. When data is lost, the television stops updating the picture until a picture synchronization frame is received; meanwhile, the television, as the receiving end, immediately requests the computer to send a picture synchronization frame (an I frame of the H.264 protocol), and at current network transmission speeds the picture can be recovered within 100 ms. If the computer is playing a video, the television continues to play to ensure fluency when data is lost; mosaics of varying severity may appear, and the computer is likewise immediately requested to send an I frame.
In order to better implement the video processing method provided by the embodiment of the present application, in an embodiment, a video processing apparatus is further provided, and the video processing apparatus may be integrated in an electronic device. The terms are the same as those in the video processing method, and details of implementation can be referred to the description in the method embodiment.
In an embodiment, there is provided a video processing apparatus, which may be specifically integrated in an electronic device, as shown in fig. 8, the video processing apparatus includes: the receiving unit 301, the loss detecting unit 302, the identifying unit 303, and the playing adjusting unit 304 are as follows:
a receiving unit 301, configured to receive a sequence identifier of a video frame in a playing video;
a loss detection unit 302, configured to perform video frame loss detection on the played video based on the sequence identifier;
an identifying unit 303, configured to identify a picture change rate of the playing video when it is detected that there is a video frame loss in the playing video;
a playing adjustment unit 304, configured to perform playing adjustment processing on the playing video based on the picture change rate.
In an embodiment, the loss detection unit 302 includes:
the identifier matching subunit is used for matching the sequence identifier with a preset reference sequence identifier;
the detection subunit is configured to detect whether a video frame corresponding to the preset reference sequence identifier is included in a preset data pool or not when the sequence identifier matches the preset reference sequence identifier;
and the overtime judgment subunit is configured to judge whether the acquisition time of the video frame corresponding to the preset reference sequence identifier is overtime when the preset data pool does not include the video frame corresponding to the preset reference sequence identifier.
In an embodiment, the timeout determining subunit includes:
the extraction module is used for extracting the acquisition time of the video frame corresponding to the preset reference sequence identifier when the preset data pool does not comprise the video frame corresponding to the preset reference sequence identifier;
the time matching module is used for matching the acquisition time with a preset time threshold;
and the timeout module is configured to determine, when the acquisition time does not match the preset time threshold, that the acquisition of the video frame corresponding to the preset reference sequence identifier has timed out and that a video frame has been lost from the played video.
In an embodiment, the loss detection unit 302 further includes:
the updating subunit is configured to update the preset reference sequence identifier when the preset data pool includes a video frame corresponding to the sequence identifier, so as to obtain an updated reference sequence identifier;
and the receiving subunit is used for receiving the residual video frames in the playing video based on the updated reference sequence identifier.
In an embodiment, the identifying unit 303 includes:
the time calculating subunit is used for calculating the time information of the video frame of the playing video;
the statistical processing subunit is used for performing statistical processing on the time information to obtain the counted time information of the video frame;
and the logical operation processing subunit is used for performing logical operation processing on the counted time information to obtain the picture change rate of the played video.
In an embodiment, the playing adjusting unit 304 includes:
the comparison subunit is used for comparing the picture change rate with a preset picture change rate to obtain a comparison result;
and the playing adjustment subunit is used for performing playing adjustment processing on the playing video by adopting a corresponding adjustment mode based on the comparison result.
In an embodiment, the play adjustment subunit includes:
the first playing adjustment module is used for performing playing adjustment processing on the playing video in a first adjustment mode when the picture change rate accords with the preset picture change rate;
and the second playing adjustment module is used for performing playing adjustment processing on the playing video in a second adjustment mode when the picture change rate does not accord with the preset picture change rate.
In a specific implementation, the above units may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and the specific implementation of the above units may refer to the foregoing method embodiments, which are not described herein again.
The video processing device can improve the reliability of video processing.
The embodiment of the application also provides an electronic device, which may comprise a terminal or a server; for example, the electronic device may be a server, such as a video processing server. Fig. 9 shows a schematic structural diagram of a terminal according to an embodiment of the present application. Specifically:
the electronic device may include components such as a processor 401 of one or more processing cores, memory 402 of one or more computer-readable storage media, a power supply 403, and an input unit 404. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 9 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the processor 401 is a control center of the electronic device, connects various parts of the whole electronic device by various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby performing overall monitoring of the electronic device. Optionally, processor 401 may include one or more processing cores; preferably, the processor 401 may integrate an application processor and a modem processor, wherein the application processor mainly handles operating systems, user pages, application programs, and the like, and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 401.
The memory 402 may be used to store software programs and modules, and the processor 401 executes various functional applications and data processing by operating the software programs and modules stored in the memory 402. The memory 402 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data created according to use of the computer device, and the like. Further, the memory 402 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 402 may also include a memory controller to provide the processor 401 access to the memory 402.
The electronic device further comprises a power supply 403 for supplying power to the various components, and preferably, the power supply 403 is logically connected to the processor 401 through a power management system, so that functions of managing charging, discharging, and power consumption are realized through the power management system. The power supply 403 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
The electronic device may further include an input unit 404, and the input unit 404 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
Although not shown, the electronic device may further include a display unit and the like, which are not described in detail herein. Specifically, in this embodiment, the processor 401 in the electronic device loads the executable file corresponding to the process of one or more application programs into the memory 402 according to the following instructions, and the processor 401 runs the application program stored in the memory 402, thereby implementing various functions as follows:
receiving a sequence identifier of a video frame in a playing video;
based on the sequence identification, performing video frame loss detection on the playing video;
when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video;
and carrying out playing adjustment processing on the playing video based on the picture change rate.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
According to an aspect of the application, there is provided a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided in the various alternative implementations of the above embodiments.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by a computer program, which may be stored in a computer-readable storage medium and loaded and executed by a processor, or by related hardware controlled by the computer program.
To this end, embodiments of the present application further provide a storage medium, in which a computer program is stored, where the computer program can be loaded by a processor to execute the steps in any one of the video processing methods provided in the embodiments of the present application. For example, the computer program may perform the steps of:
receiving a sequence identifier of a video frame in a playing video;
based on the sequence identification, performing video frame loss detection on the playing video;
when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video;
and carrying out playing adjustment processing on the playing video based on the picture change rate.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Since the computer program stored in the storage medium can execute the steps in any video processing method provided in the embodiments of the present application, beneficial effects that can be achieved by any video processing method provided in the embodiments of the present application can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
The foregoing has described in detail a video processing method, apparatus, electronic device, and storage medium provided by the embodiments of the present application. Specific examples have been used herein to explain the principles and implementations of the present application; the descriptions of the above embodiments are only intended to help understand the method and core idea of the present application. Meanwhile, for those skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A video processing method, comprising:
receiving a sequence identifier of a video frame in a playing video;
based on the sequence identification, performing video frame loss detection on the playing video;
when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video;
and carrying out playing adjustment processing on the playing video based on the picture change rate.
2. The method of claim 1, wherein the performing video frame loss detection on the played video based on the sequence identifier comprises:
matching the sequence identification with a preset reference sequence identification;
when the sequence identification is matched with the preset reference sequence identification, detecting whether a preset data pool comprises a video frame corresponding to the preset reference sequence identification;
and when the preset data pool does not comprise the video frame corresponding to the preset reference sequence identifier, judging whether the acquisition time of the video frame corresponding to the preset reference sequence identifier is overtime.
3. The method according to claim 2, wherein when the preset data pool does not include the video frame corresponding to the preset reference sequence identifier, the determining whether the acquisition time of the video frame corresponding to the preset reference sequence identifier is over time comprises:
when the preset data pool does not comprise the video frame corresponding to the preset reference sequence identifier, extracting the acquisition time of the video frame corresponding to the preset reference sequence identifier;
matching the acquisition time with a preset time threshold;
and when the acquisition time is not matched with the preset time threshold, determining that the acquisition of the video frame corresponding to the preset reference sequence identifier has timed out and that a video frame has been lost from the played video.
4. The method of claim 2, further comprising:
when the preset data pool comprises the video frame corresponding to the preset reference sequence identifier, updating the preset reference sequence identifier to obtain an updated reference sequence identifier;
receiving remaining video frames in the played video based on the updated reference sequence identification.
5. The method of claim 1, wherein the identifying the frame rate of change of the playing video when detecting that there is a video frame loss in the playing video comprises:
calculating time information of video frames of the playing video;
performing statistical processing on the time information to obtain the counted time information of the video frame;
and performing logical operation processing on the counted time information to obtain the picture change rate of the played video.
6. The method according to claim 1, wherein the performing a play adjustment process on the played video based on the picture change rate comprises:
comparing the picture change rate with a preset picture change rate to obtain a comparison result;
and performing playing adjustment processing on the playing video by adopting a corresponding adjustment mode based on the comparison result.
7. The method according to claim 6, wherein performing playback adjustment processing on the playback video in a corresponding adjustment manner based on the comparison result includes:
when the picture change rate accords with the preset picture change rate, performing playing adjustment processing on the played video in a first adjustment mode;
and when the picture change rate does not accord with the preset picture change rate, performing playing adjustment processing on the played video by adopting a second adjustment mode.
8. A video processing apparatus, comprising:
a receiving unit, configured to receive sequence identifiers of video frames in a played video;
a loss detection unit, configured to perform video frame loss detection on the played video based on the sequence identifiers;
an identification unit, configured to identify a picture change rate of the played video when a video frame loss is detected in the played video;
and a playing adjustment unit, configured to perform playing adjustment processing on the played video based on the picture change rate.
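The four units of the claim-8 apparatus map naturally onto one object with one method per unit. This is a structural sketch only; the internal logic of each method is an assumption made to keep the example runnable.

```python
class VideoProcessingApparatus:
    """Sketch of the claim-8 apparatus: one method per claimed unit."""

    def __init__(self, preset_rate=24.0):
        self.last_id = -1             # last sequence identifier seen
        self.preset_rate = preset_rate

    def receiving_unit(self, frame):
        # Receive the sequence identifier of a video frame.
        return frame["seq_id"]

    def loss_detection_unit(self, seq_id):
        # A gap in consecutive identifiers indicates a frame loss.
        lost = seq_id != self.last_id + 1
        self.last_id = seq_id
        return lost

    def identification_unit(self, timestamps_ms):
        # Picture change rate from frame time information (illustrative).
        if len(timestamps_ms) < 2:
            return 0.0
        span = timestamps_ms[-1] - timestamps_ms[0]
        return 1000.0 * (len(timestamps_ms) - 1) / span if span else 0.0

    def play_adjustment_unit(self, change_rate):
        # First vs. second adjustment manner, chosen by the change rate.
        return "skip" if change_rate >= self.preset_rate else "stall"
```

The electronic-device claim (claim 9) would then amount to running this object inside a player process.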
9. An electronic device comprising a memory and a processor; the memory stores a computer program, and the processor is configured to execute the computer program in the memory to perform the steps of the video processing method according to any one of claims 1 to 7.
10. A storage medium storing a plurality of computer programs adapted to be loaded by a processor to perform the steps of the video processing method according to any one of claims 1 to 7.
CN202210048030.4A 2022-01-17 2022-01-17 Video processing method and device, electronic equipment and storage medium Active CN114422866B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210048030.4A CN114422866B (en) 2022-01-17 2022-01-17 Video processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114422866A true CN114422866A (en) 2022-04-29
CN114422866B CN114422866B (en) 2023-07-25

Family

ID=81273486

Country Status (1)

Country Link
CN (1) CN114422866B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109640168A (en) * 2018-11-27 2019-04-16 Oppo广东移动通信有限公司 Video processing method and device, electronic equipment and computer-readable medium
CN110072123A (en) * 2018-01-24 2019-07-30 中兴通讯股份有限公司 Video recovery playback method, video playing terminal and server
CN110572695A (en) * 2019-08-07 2019-12-13 苏州科达科技股份有限公司 Media data encoding and decoding methods and electronic equipment
CN111491201A (en) * 2020-04-08 2020-08-04 深圳市昊一源科技有限公司 Method for adjusting video code stream and video frame loss processing method
CN112073823A (en) * 2020-09-02 2020-12-11 深圳创维数字技术有限公司 Frame loss processing method, video playing terminal and computer readable storage medium
CN113099272A (en) * 2021-04-12 2021-07-09 上海商汤智能科技有限公司 Video processing method and device, electronic equipment and storage medium
US20210258644A1 (en) * 2020-03-31 2021-08-19 Baidu Online Network Technology (Beijing) Co., Ltd. Video playing method, apparatus, electronic device and storage medium
WO2021238940A1 (en) * 2020-05-26 2021-12-02 维沃移动通信有限公司 Video data processing method and apparatus, and electronic device
WO2021244440A1 (en) * 2020-06-04 2021-12-09 深圳市万普拉斯科技有限公司 Method, apparatus, and system for adjusting image quality of television, and television set

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116055790A (en) * 2022-07-29 2023-05-02 荣耀终端有限公司 Video playing method and system and electronic equipment
CN116055790B (en) * 2022-07-29 2024-03-19 荣耀终端有限公司 Video playing method and system and electronic equipment

Similar Documents

Publication Publication Date Title
CA2806284C (en) Transmission apparatus, transmission method, reception apparatus, reception method, program, and broadcasting system
CN109089130B (en) Method and device for adjusting timestamp of live video
EP2328349A1 (en) Information processing system and information processing device
US20050204046A1 (en) Data transmitting apparatus, data receiving apparatus, data transmitting manner, and data receiving manner
EP2466911A1 (en) Method and device for fast pushing unicast stream in fast channel change
CN111093094A (en) Video transcoding method, device and system, electronic equipment and readable storage medium
US20190245945A1 (en) Rapid optimization of media stream bitrate
CN109040830B (en) Live broadcast pause prediction method, switching method and device
CN109714622A (en) Video data processing method, device and electronic equipment
CN109257618A (en) Method, apparatus and server for merging co-streaming (Lianmai) streams in live broadcast
US20230066899A1 (en) Video data processing method and apparatus, and electronic device
US20140092997A1 (en) Error resilient transmission of random access frames and global coding parameters
CN114422866B (en) Video processing method and device, electronic equipment and storage medium
US20130262554A1 (en) Method and apparatus for managing content distribution over multiple terminal devices in collaborative media system
CN113010135B (en) Data processing method and device, display terminal and storage medium
US7769035B1 (en) Facilitating a channel change between multiple multimedia data streams
CN114025389A (en) Data transmission method and device, computer equipment and storage medium
CN113852866B (en) Media stream processing method, device and system
US8824566B2 (en) Method and apparatus for receiving images having undergone losses during transmission
CN110351577A (en) Live information processing method and processing device, storage medium, electronic equipment
CN116112620A (en) Processing method and system for improving video stream multipath merging stability
CN115514980A (en) Push stream live broadcast management method and device, computer and readable storage medium
CN112153413B (en) Method and server for processing screen splash in one-screen broadcast
CN108024121B (en) Voice barrage synchronization method and system
CN114416013A (en) Data transmission method, data transmission device, electronic equipment and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant