CN112866746A - Multi-path streaming cloud game control method, device, equipment and storage medium - Google Patents

Multi-path streaming cloud game control method, device, equipment and storage medium

Info

Publication number
CN112866746A
CN112866746A
Authority
CN
China
Prior art keywords
frame
frames
video
user terminal
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011641796.0A
Other languages
Chinese (zh)
Inventor
王叶群
蔡强
罗光辉
陈涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Wulian Technology Co ltd
Original Assignee
Hangzhou Wulian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Wulian Technology Co ltd filed Critical Hangzhou Wulian Technology Co ltd
Priority to CN202011641796.0A priority Critical patent/CN112866746A/en
Publication of CN112866746A publication Critical patent/CN112866746A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23418Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234309Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 or from Quicktime to Realvideo
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234381Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the temporal resolution, e.g. decreasing the frame rate by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/24Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests
    • H04N21/2402Monitoring of the downstream path of the transmission network, e.g. bandwidth available
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N21/2662Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440218Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44227Monitoring of local network, e.g. connection or bandwidth variations; Detecting new devices in the local network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4621Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781Games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention discloses a multi-channel streaming cloud game control method that provides a new video coding mode and a corresponding network transmission mode. The coding reference relationship is modified during video coding to realize a hierarchical coding function, generating reference P frames and non-reference P frames; because a non-reference P frame is not used for reference prediction by other frames, losing it does not affect the decoding of other frames. During network transmission, a matched code stream is transmitted adaptively according to the conditions of each streaming client without affecting the streaming experience of the other clients, and adaptive hierarchical transmission is performed according to the actual network conditions and terminal performance of each client, so that every client in a multi-channel streaming system can achieve optimal video image quality and frame rate performance. The invention also discloses a multi-path streaming cloud game control device, equipment and a readable storage medium, which have corresponding technical effects.

Description

Multi-path streaming cloud game control method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of cloud computing, in particular to a multi-path streaming cloud game control method, a multi-path streaming cloud game control device, multi-path streaming cloud game control equipment and a readable storage medium.
Background
A cloud game differs from a traditional game: it is a game implementation based on cloud computing, cloud control, rendering and network transmission, in which the game actually runs on a host at a cloud or edge computing node rather than on the user's local client, so the available computing power can be increased. The host captures and encodes the screen and then transmits the data to the client over the network; the client displays the game sound and pictures locally by receiving the audio and video data sent by the host, while the user sends game operation control data to the host to control the game.
In order to enhance the interest, playability and social attributes of cloud games, promote value-added services and attract more users, multi-path streaming cloud game systems now exist that allow multiple users to access the system at the same time. In such systems, however, users who access simultaneously under different application scenarios have different experiences on different clients; some users' pictures stutter and are not smooth, which results in poor user experience.
In summary, how to satisfy the game experience of different user terminals when multiple users access the cloud game at the same time is a technical problem that needs to be solved urgently by those skilled in the art at present.
Disclosure of Invention
The invention aims to provide a multi-path streaming cloud game control method, device, equipment and readable storage medium, which can satisfy the game experience of different user terminals when multiple users access a cloud game simultaneously.
In order to solve the technical problems, the invention provides the following technical scheme:
a multi-path streaming cloud game control method is characterized by comprising the following steps:
the method comprises the steps that a host end determines each user terminal connected with streaming in a cloud game system;
non-adjacent P frames in the video frames obtained by screen capture are used as reference P frames, and the rest P frames are used as non-reference P frames; wherein the video frame comprises a P frame and an I frame;
performing predictive coding on the reference P frame and the non-reference P frame according to adjacent reference P frames or adjacent I frames, and taking the coded video frame as a video frame to be transmitted;
and deleting the corresponding number of the non-reference P frames from the video frames to be transmitted according to the network state of the user terminal, and then transmitting the video frames.
Optionally, taking nonadjacent P frames in the video frames obtained by screen capture as reference P frames, and taking the remaining P frames as non-reference P frames, including:
if the video frame currently to be coded among the video frames obtained by screen capture is a P frame, judging whether the remainder of dividing by 2 the frame-number difference between the current P frame to be coded and the adjacent I frame is 0;
if yes, marking the current P frame to be coded as a reference frame;
if not, marking the current P frame to be coded as a non-reference frame.
Optionally, the predictive coding of the reference P frame and the non-reference P frame according to an adjacent reference P frame or an adjacent I frame includes:
if the current P frame to be coded is the reference frame, deleting the previous P frame from a decoded picture buffer and a reference frame queue, and performing predictive coding according to a P frame or an I frame that is separated from the current P frame to be coded by one frame;
and if the current P frame to be coded is the non-reference frame, performing predictive coding according to the P frame adjacent to the current P frame to be coded.
Optionally, after deleting a corresponding number of the non-reference P frames from the video frames to be transmitted according to the network state of the user terminal, the transmitting of the video frames includes:
determining a current video frame to be transmitted from the video frames to be transmitted, and judging whether the current video frame to be transmitted belongs to a discardable coding frame; wherein the discardable encoded frame comprises: the non-reference P frame;
if not, transmitting the current video frame to be transmitted to the user terminal;
if yes, judging whether the network state of the user terminal reaches a threshold value;
if not, deleting the current video frame to be transmitted, and executing the step of determining the current video frame to be transmitted from the video frames to be transmitted;
and if so, executing the step of transmitting the current video frame to be transmitted to the user terminal.
Optionally, the determining whether the network state of the user terminal reaches a threshold includes:
judging whether the frame transmission delay of the user terminal is greater than a delay threshold value or not;
judging whether the frame transmission packet loss rate of the user terminal is greater than a packet loss threshold value or not;
if the frame transmission delay of the user terminal is greater than the delay threshold or the frame transmission packet loss rate is greater than the packet loss threshold, judging that the network state of the user terminal does not reach the threshold;
and if the frame transmission delay of the user terminal is not greater than the delay threshold and the frame transmission packet loss rate is not greater than the packet loss threshold, determining that the network state of the user terminal reaches the threshold.
Optionally, the determining whether the frame transmission delay of the user terminal is greater than a delay threshold includes:
judging whether the difference between the frame number of the current video frame to be transmitted and the frame number of the latest frame decoded and displayed by the user terminal is greater than a difference threshold;
if yes, judging that the frame transmission delay is greater than a delay threshold;
if not, the frame transmission delay is judged to be not greater than the delay threshold.
Optionally, before the non-adjacent P frames in the video frames obtained by screen capture are used as reference P frames and the remaining P frames are used as non-reference P frames, the method further includes:
determining a network bandwidth between the user terminal and the host;
judging whether the host has the video decision right for the streaming with the user terminal;
if yes, setting the current coding rate as the network bandwidth, and executing the step of taking nonadjacent P frames in the video frames obtained by screen capture as reference P frames and taking the rest P frames as non-reference P frames;
if not, setting the current coding rate as the minimum value of the network bandwidth of all the user terminals accessing the streaming.
A multi-stream cloud game control apparatus includes:
the terminal determining unit is used for determining each user terminal connected with streaming in the cloud game system by the host side;
the frame dividing unit is used for taking nonadjacent P frames in the video frames obtained by screen capture as reference P frames and taking the rest P frames as non-reference P frames; wherein the video frame comprises a P frame and an I frame;
the hierarchical coding unit is used for carrying out predictive coding on the reference P frame and the non-reference P frame according to an adjacent reference P frame or an adjacent I frame and taking a video frame which is coded completely as a video frame to be transmitted;
and the self-adaptive transmission unit is used for transmitting the video frames after deleting the corresponding number of the non-reference P frames from the video frames to be transmitted according to the network state of the user terminal.
A multi-stream cloud game control device comprising:
a memory for storing a computer program;
and the processor is used for realizing the steps of the multi-path streaming cloud game control method when executing the computer program.
A readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the above-described multi-stream cloud game control method.
The method provided by the embodiment of the invention provides a new video coding mode and a corresponding network transmission mode: the coding reference relationship is modified during video coding to realize the hierarchical coding function, generating reference P frames and non-reference P frames; because a non-reference P frame is not used for reference prediction by other frames, the decoding of other frames is not affected even if the non-reference P frame is lost.
Accordingly, embodiments of the present invention further provide a multi-path streaming cloud game control apparatus, a device and a readable storage medium corresponding to the multi-path streaming cloud game control method, which have the above technical effects and are not described herein again.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the related art, the drawings used in the description of the embodiments or the related art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a block diagram of a multi-channel streaming system according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating an embodiment of a method for controlling a multi-channel streaming cloud game;
FIG. 3 is a diagram of a conventional video coding reference relationship structure;
FIG. 4 is a diagram of a hierarchical video coding reference relationship structure according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating an adaptive transmission scheme according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a hierarchical encoding according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a multi-channel streaming cloud game control device according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a multi-stream cloud game control device according to an embodiment of the present invention.
Detailed Description
The core of the invention is to provide a multi-path streaming cloud game control method, which can meet the game experience of different user terminals when multiple users access a cloud game simultaneously.
In order that those skilled in the art will better understand the disclosure, the invention will be described in further detail with reference to the accompanying drawings and specific embodiments. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to enhance the interest, playability and social attributes of a cloud game, promote value-added services and attract more users, a multi-path streaming cloud game system is available at present, and multiple users are allowed to access the system at the same time.
As shown in fig. 1, which is a block diagram of a multi-stream system, multiple clients can access one host at the same time to implement multi-path streaming. In practical applications, different users have different network bandwidths and terminal device conditions; among multiple clients there are different networks and different user terminals, and the requirements for video quality differ across these conditions. For example, when video data is transmitted over a network, the network bandwidth limits data transmission, so when the bandwidth is small the video frame rate or bit rate can be reduced to reduce the amount of video data, and the frame rate or bit rate can be adjusted dynamically according to the actual network conditions. If the host transmits the same code stream to all clients at the same time, different clients will have different experiences. A client with a good network, strong decoding capability and a large screen will have a better and smoother cloud game experience; conversely, a client with a poor network and weak decoding capability will feel the game stutter, and the cloud game experience will be poor, which hinders the popularization of multi-path streaming cloud games and fails to attract more customers.
Against this background, in other fields Scalable Video Coding (SVC) is generally used to generate video compressed code streams with different frame rates and resolutions in a single encoding pass, but this method has great limitations: the complexity of SVC coding is high, hardware manufacturers supporting SVC coding are not widespread, and many hardware encoders and decoders do not support it. In cloud game scenarios, a hardware-encoding and hardware-decoding scheme is adopted to keep the delay low, so standard SVC coding cannot be used directly.
In order to solve the output problem when multiple users access a multi-path streaming cloud game system simultaneously, the invention provides a multi-path streaming cloud game control method that adaptively transmits a matched code stream according to the condition of each streaming client, so that each client achieves its optimal video image quality and frame rate performance, thereby improving user experience.
Referring to fig. 2, fig. 2 is a flowchart illustrating a method for controlling a multi-stream cloud game according to an embodiment of the present invention, the method including the following steps:
s101, determining each user terminal connected with streaming in a cloud game system by a host terminal;
the cloud game system may access several user terminals by streaming, as shown in fig. 1. In the multiple streams, all the ues accessing the streams are determined, and the present embodiment aims to perform adaptive code rate adjustment for each ue accessing the streams according to its own network environment.
S102, non-adjacent P frames in the video frames obtained by screen capture are used as reference P frames, and the rest P frames are used as non-reference P frames;
in order to adapt to network transmission states of different clients, implement transmission based on data that the clients can carry, and avoid situations such as jamming or resource waste, the present embodiment aims to generate video code streams with different frame rates by one-time encoding. In order to achieve the above purpose, the present embodiment implements hierarchical coding by modifying the reference relationship of the reference frame on the basis of the existing video coding standards h.264 and h.265, and meanwhile, the code streams after coding are all standard h.264 or h.265 code streams, and there is no compatibility problem.
Generally, a cloud game system performs video coding by using a standard IPPP video coding structure, the reference relationship of coding is as shown in fig. 3, each P frame refers to a previous I frame or P frame, and on the basis of the coding structure, if a frame is missing in the middle, the following frame cannot be decoded normally, and thus, the hierarchical coding function cannot be realized.
This embodiment provides a hierarchical coding mode. It should be noted that, to reduce the time delay, the coding in this embodiment abandons B frames and uses only I frames and P frames; a 2-layer frame rate classification is mainly implemented. Non-adjacent P frames in the video frames obtained by screen capture are used as reference P frames, and the remaining P frames are used as non-reference P frames. The reference P frames are used for reference prediction and cannot be discarded; the non-reference P frames are not used for reference prediction of other frames and may be discarded. Through hierarchical coding, a certain number of non-reference P frames can be selected and discarded, and different numbers of discarded frames correspond to code streams with different frame rates, so code streams with different frame rates can be obtained from a single encoding pass, suiting the network transmission quality of different clients.
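As an illustration only (not part of the claimed embodiments; the capture frame rate and drop ratios are assumed values), the following sketch shows how discarding T1 frames changes the delivered frame rate under the 2-layer scheme, where roughly every other frame is a discardable non-reference frame:

```cpp
#include <cstdio>

int main() {
    const double capturedFps = 60.0;               // assumed capture frame rate
    for (double dropRatio : {0.0, 0.5, 1.0}) {     // fraction of T1 frames dropped
        // T1 frames make up about half of the stream in the 2-layer scheme,
        // so dropping all of them roughly halves the frame rate.
        double deliveredFps = capturedFps * (1.0 - 0.5 * dropRatio);
        std::printf("drop %.0f%% of T1 frames -> ~%.0f fps\n",
                    dropRatio * 100.0, deliveredFps);
    }
    return 0;  // prints 60, 45 and 30 fps
}
```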
Specifically, this embodiment does not limit how the first reference frame is determined: a P frame adjacent to the I frame may be used as the first reference frame, a P frame spaced one frame from the I frame may be used as the first reference frame, or a certain frame to be encoded may be arbitrarily designated as the first reference frame. Since an I frame is the starting frame of a coding segment and cannot be discarded, a P frame spaced one frame from the I frame can be used as the first reference frame. One implementation of using non-adjacent P frames in the video frames obtained by screen capture as reference P frames and the remaining P frames as non-reference P frames is as follows:
if the video frame currently to be coded among the video frames obtained by screen capture is a P frame, judging whether the remainder of dividing by 2 the frame-number difference between the current P frame to be coded and the adjacent I frame is 0;
if yes, marking the current P frame to be coded as a reference frame;
if not, marking the current P frame to be coded as a non-reference frame.
A unique sequence number n is marked for each frame in coding order. If the current frame is an I frame, the I frame is intra-coded and does not need to reference other frames, so its reference relationship does not need to be changed; the sequence number of the most recent I frame is recorded, I(n) = n. If the current frame is a P frame with frame number P(n), and the remainder of dividing the difference between P(n) and I(n) by 2 equals 0, i.e. (P(n) - I(n)) % 2 == 0, the current frame is marked as a reference frame T0; if the remainder is not 0, i.e. (P(n) - I(n)) % 2 != 0, the current frame is marked as a non-reference frame T1.
For example, if encoding starts from 0, frame 0 is an I frame; frame 1 is a non-reference P frame, frame 2 is a reference P frame (referenced by frames 3 and 4), frame 3 is a non-reference P frame, frame 4 is a reference P frame (referenced by frames 5 and 6), and so on.
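For illustration only, the following sketch (an assumed helper, not part of the claimed embodiments) applies the marking rule above to the first few frames of a sequence:

```cpp
#include <cstdio>

enum class Layer { I, T0, T1 };

// last_i holds the sequence number of the most recent I frame, i.e. I(n).
Layer markLayer(int n, bool isIFrame, int &last_i) {
    if (isIFrame) {              // I frames are intra-coded; no reference change needed
        last_i = n;              // I(n) = n
        return Layer::I;
    }
    // P frame: (P(n) - I(n)) % 2 == 0 -> reference frame T0, otherwise non-reference T1
    return ((n - last_i) % 2 == 0) ? Layer::T0 : Layer::T1;
}

int main() {
    int last_i = 0;
    for (int n = 0; n < 8; ++n) {
        Layer l = markLayer(n, /*isIFrame=*/n == 0, last_i);
        std::printf("frame %d -> %s\n", n,
                    l == Layer::I ? "I" : (l == Layer::T0 ? "T0" : "T1"));
    }
    return 0;  // frames 0..7: I, T1, T0, T1, T0, T1, T0, T1
}
```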
S103, performing predictive coding on a reference P frame and a non-reference P frame according to an adjacent reference P frame or an adjacent I frame, and taking a video frame subjected to coding as a video frame to be transmitted;
the reference P frame is used as a prediction reference frame for the next non-reference P frame and the reference P frame, the I frame is also used as a prediction reference frame for the prediction coding of the adjacent P frame and the next P frame, and the implementation of a hierarchical prediction coding based on the above coding mode is as follows:
if the current P frame to be coded is a reference frame, deleting the previous P frame from the decoded picture buffer, and performing predictive coding according to the P frame or I frame that is separated from the current P frame to be coded by one frame;
and if the current P frame to be coded is a non-reference frame, performing predictive coding according to the P frame adjacent to the current P frame to be coded.
A structure diagram of the reference relationship of hierarchical video coding based on the above coding method is shown in fig. 4. Coding starts from 0, and frame 0 is an I frame; frame 1 is a P frame that references frame 0; when frame 2 is encoded, frame 1 is removed from the DPB (Decoded Picture Buffer) and frame 0 is referenced; frame 3 is a P frame that references frame 2; when frame 4 is encoded, frame 3 is removed from the DPB and the earlier frame 2 is referenced, and so on. The even frames can thus be labeled T0 (temporal layer 0) and the odd frames T1 (temporal layer 1). A T0 frame cannot be lost because other frames reference it; otherwise the decoding of the following frames would be affected. A T1 frame is not referenced by other frames, so even if it is lost, the decoding of subsequent frames is not affected.
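A minimal sketch of this reference management follows (the encoder interface and DPB handling are illustrative assumptions, not an actual codec API; a real encoder would also bound the DPB size and evict old references):

```cpp
#include <deque>
#include <cstdio>

struct Frame { int n; bool isI; bool isT0; };

// Returns the frame number used as prediction reference, or -1 for an intra frame.
int chooseReference(const Frame &f, std::deque<int> &dpb) {
    if (f.isI) {                   // intra coded, no reference needed
        dpb.clear();
        dpb.push_back(f.n);
        return -1;
    }
    if (f.isT0 && dpb.size() >= 2) {
        dpb.pop_back();            // drop the preceding non-reference frame P(n-1)
    }
    int ref = dpb.back();          // T0 -> P(n-2) or the I frame; T1 -> P(n-1)
    dpb.push_back(f.n);
    return ref;
}

int main() {
    std::deque<int> dpb;
    for (int n = 0; n < 7; ++n) {
        bool isI = (n == 0);
        bool isT0 = !isI && (n % 2 == 0);   // even P frames are T0 in this example GOP
        int ref = chooseReference({n, isI, isT0}, dpb);
        std::printf("frame %d references %d\n", n, ref);  // matches fig. 4: 1->0, 2->0, 3->2, 4->2, ...
    }
    return 0;
}
```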
And S104, deleting the corresponding number of non-reference P frames from the video frames to be transmitted according to the network state of the user terminal, and then transmitting the video frames.
After coding, the video data needs to be transmitted to the clients over the network. To guarantee the experience of each client, adaptive transmission must be performed according to the actual network conditions of each client, and whether to adjust the coding rate is decided according to the client's bandwidth. Specifically, a corresponding number of non-reference P frames is deleted from the video frames to be transmitted according to the network state of the user terminal: for example, if the network state remains poor, more non-reference P frames are deleted; if the network state is good, few or no non-reference P frames are deleted.
Specifically, the deletion rule for deleting a corresponding number of non-reference P frames from a video frame to be transmitted according to the network state of the user terminal is not limited in this embodiment, and one implementation manner is as follows:
(1) determining a current video frame to be transmitted from video frames to be transmitted, and judging whether the current video frame to be transmitted belongs to a discardable coding frame; wherein the discardable encoded frame comprises: a non-reference P frame;
(2) if not, transmitting the current video frame to be transmitted to the user terminal;
(3) if yes, judging whether the network state of the user terminal reaches a threshold value;
(4) if not, deleting the current video frame to be transmitted, and executing the step of determining the current video frame to be transmitted from the video frames to be transmitted;
(5) and if so, executing the step of transmitting the current video frame to be transmitted to the user terminal.
After a client accesses the host for streaming, the real-time network conditions may change at any time due to network fluctuations, and the client continuously feeds real-time network data, such as the network delay rtt and the packet loss rate, back to the host. Because the clients differ in condition, their actual decoding and display performance also differs: some delays are large, some are small. The client can feed back to the host in real time the frame number it has decoded and displayed, and the host makes the hierarchical transmission decision based on the real-time network feedback data, dynamically adjusting the frame rate in real time.
This embodiment does not limit the criterion for judging whether the network state of the user terminal reaches the threshold; it may be measured in two respects, such as network delay and transmission quality. One implementation of judging whether the network state of the user terminal reaches the threshold is as follows:
(1) judging whether the frame transmission delay of the user terminal is greater than a delay threshold value or not;
the method for determining frame transmission delay in this embodiment is not limited, and the determination may be performed according to a time length between when a video frame is successfully encoded and output to a client, or may be performed by calculating an average number of frames played in a period of time, and the like, where one implementation manner is as follows: whether the difference value between the frame number of the current video frame to be transmitted and the frame number of the decoded upper screen of the user terminal is greater than a difference value threshold value or not can be judged; if yes, judging that the frame transmission delay is greater than a delay threshold; if not, the frame transmission delay is judged to be not greater than the delay threshold. The method compares the sequence number difference value of the playing frame and the current coding frame, and has simple realization mode and high comparison speed.
(2) Judging whether the frame transmission packet loss rate of the user terminal is greater than a packet loss threshold value or not;
(3) if the frame transmission delay of the user terminal is greater than a delay threshold or the frame transmission packet loss rate is greater than a packet loss threshold, judging that the network state of the user terminal does not reach the threshold;
(4) and if the frame transmission delay of the user terminal is not greater than the delay threshold and the frame transmission packet loss rate is not greater than the packet loss threshold, judging that the network state of the user terminal reaches the threshold.
For example, the frame number of the current frame is marked F(n), the feedback frame number of the latest frame decoded and displayed by the client is ACK(n'), and the current network packet loss rate is P. If (F(n) - ACK(n')) > a || P > b is satisfied, that is, the difference between the current frame number and the frame number decoded and displayed by the user terminal is greater than the difference threshold a (a may take an empirical value of 4), or the packet loss rate is greater than the packet loss threshold b (b may take an empirical value of 5%), the network transmission quality is poor (network delay, stuttering) and the video frame is not sent; otherwise, the video frame data is sent. It should be noted that this embodiment does not limit the specific numerical setting of the thresholds, which may be set according to actual empirical values.
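For illustration only, the following sketch (function and structure names are assumptions; the thresholds reuse the empirical values a = 4 and b = 5% mentioned above) shows the per-frame transmission decision for one client:

```cpp
#include <cstdio>

struct ClientFeedback {
    int lastDisplayedFrame;     // ACK(n'): latest decoded-and-displayed frame number
    double packetLossRate;      // P: current network packet loss rate, 0.0 .. 1.0
};

// Returns true if the frame should be sent to this client.
bool shouldSendFrame(int frameNumber, bool isNonReference,
                     const ClientFeedback &fb,
                     int lagThreshold = 4,           // a, empirical value 4
                     double lossThreshold = 0.05) {  // b, empirical value 5%
    if (!isNonReference) return true;                // I and T0 frames are always sent
    bool networkBad = (frameNumber - fb.lastDisplayedFrame) > lagThreshold
                      || fb.packetLossRate > lossThreshold;
    return !networkBad;                              // drop the T1 frame when the network is bad
}

int main() {
    ClientFeedback fb{/*lastDisplayedFrame=*/10, /*packetLossRate=*/0.02};
    std::printf("%d\n", shouldSendFrame(13, true, fb));  // lag 3, loss 2% -> send (1)
    std::printf("%d\n", shouldSendFrame(16, true, fb));  // lag 6 > 4      -> drop (0)
    return 0;
}
```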
Further, before non-adjacent P frames in the video frames obtained by screen capture are used as reference P frames and the remaining P frames are used as non-reference P frames, the following steps may be further performed:
determining a network bandwidth between a user terminal and a host terminal;
judging whether the host has the video decision right for the streaming with the user terminal;
if yes, setting the current coding rate as a network bandwidth, and executing a step of taking nonadjacent P frames in the video frames obtained by screen capture as reference P frames and taking the rest P frames as non-reference P frames;
if not, setting the current coding rate as the minimum value of the network bandwidth of all the user terminals accessing the streaming.
If the host does not have the video decision right, then to guarantee smooth video playback the current coding rate is set to the minimum network bandwidth among all user terminals accessing the streaming, and video frames are transmitted at that minimum bandwidth; if the host has the video decision right, the current coding rate is set to the currently detected network bandwidth, and the rate is then adjusted by real-time adaptive hierarchical coding.
This method ensures adaptive code rate adjustment to the network environment when the host has the video decision right, and ensures the fluency and stability of video frame transmission when the host does not.
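For illustration only, the following sketch (structure and function names are assumptions, not part of the claimed embodiments) captures this bitrate decision:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

struct StreamClient {
    double bandwidthBps;          // B: detected network bandwidth to this client
    bool hasVideoDecisionRight;   // whether the host holds the video decision right for this path
};

double chooseEncodingBitrate(const StreamClient &current,
                             const std::vector<StreamClient> &allClients) {
    if (current.hasVideoDecisionRight) {
        return current.bandwidthBps;            // encode at this path's bandwidth B
    }
    double bmin = current.bandwidthBps;         // otherwise fall back to Bmin over all clients
    for (const auto &c : allClients) {
        bmin = std::min(bmin, c.bandwidthBps);
    }
    return bmin;
}

int main() {
    std::vector<StreamClient> clients = {{8e6, false}, {3e6, false}, {12e6, true}};
    std::printf("%.0f\n", chooseEncodingBitrate(clients[2], clients));  // 12000000 (decision right)
    std::printf("%.0f\n", chooseEncodingBitrate(clients[0], clients));  // 3000000  (Bmin)
    return 0;
}
```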
For further understanding, an adaptive transmission method is introduced here. Fig. 5 is a schematic diagram of an adaptive transmission method, which may be executed according to the following steps:
1. When each client accesses the host, network bandwidth detection is performed, and the network bandwidth between the client and the host is detected and recorded as B. If the host has the video decision right for the current streaming path, the current coding rate is set to the bandwidth B; otherwise, the current coding rate is set to the minimum bandwidth Bmin of all clients accessing the streaming.
2. All video frames captured from the screen are then hierarchically encoded; fig. 6 shows a schematic diagram of hierarchical coding. A unique sequence number n is marked for each frame. If the current frame is an I frame, the sequence number of the most recent I frame is recorded as the sequence number of the current frame, I(n) = n. If the current frame is a P frame with frame number P(n), and the remainder of dividing the difference between P(n) and I(n) by 2 equals 0, i.e. (P(n) - I(n)) % 2 == 0, the current frame is marked T0; the previous frame P(n-1) is removed from the DPB queue and the reference frame queue, the reference relationship of the current frame is modified, and the current frame performs predictive coding with the P(n-2) frame as its reference frame. If the current frame is a P frame with frame number P(n), and the remainder of dividing the difference between P(n) and I(n) by 2 is not 0, i.e. (P(n) - I(n)) % 2 != 0, the current frame is marked T1, and it performs predictive coding with the P(n-1) frame as its reference frame. The frame number of the next frame is marked P(n+1) = P(n) + 1, it is taken as the current frame, and the coding cycle of the next frame begins.
The code stream can be divided into two levels of T0 and T1 after being coded, wherein the code stream of the T1 layer can be discarded without influencing the decoding of other frames.
3. After decoding and displaying a frame, the client feeds its frame number back to the host, recorded as ACK(n), and also feeds back the network packet loss rate information in real time.
4. If the frame to be transmitted currently belongs to T0, the video frame data is directly transmitted.
5. If the current frame belongs to T1, the frame number of the current frame is marked F(n), the feedback frame number of the latest frame decoded and displayed by the client is ACK(n'), and the current network packet loss rate is P. If (F(n) - ACK(n')) > a || P > b is satisfied, that is, the difference between the current frame number and the frame number decoded and displayed by the client is greater than the threshold a (a takes an empirical value of 4), or the packet loss rate is greater than the threshold b (b takes an empirical value of 5%), the video frame is not sent; otherwise, the video frame data is sent.
The encoded code stream conforms to the standard H.264 or H.265 video coding syntax, but contains two levels, T0 and T1; frame data of the T1 level can be discarded without affecting the decoding of subsequent frames.
The technical solution provided by the embodiment of the invention provides a new video coding mode and a corresponding network transmission mode: the coding reference relationship is modified during video coding to realize a hierarchical coding function, generating reference P frames and non-reference P frames; because a non-reference P frame is not used for reference prediction by other frames, the decoding of other frames is not affected even if the non-reference P frame is lost.
Corresponding to the above method embodiments, the present invention further provides a multi-stream cloud game control device, and the multi-stream cloud game control device and the multi-stream cloud game control method described above may be referred to in correspondence.
Referring to fig. 7, the apparatus includes the following modules:
the terminal determining unit 110 is mainly used for determining each user terminal connected to streaming in the cloud game system at the host end;
the frame dividing unit 120 is mainly configured to use nonadjacent P frames in the video frames obtained by screen capture as reference P frames, and use the remaining P frames as non-reference P frames; the video frame comprises a P frame and an I frame;
the hierarchical encoding unit 130 is mainly configured to perform predictive encoding on a reference P frame and a non-reference P frame according to an adjacent reference P frame or an adjacent I frame, and use a video frame that is encoded as a video frame to be transmitted;
the adaptive transmission unit 140 is mainly configured to delete a corresponding number of non-reference P frames from the video frame to be transmitted according to the network status of the user terminal, and then transmit the video frame.
Corresponding to the above method embodiments, the present invention further provides a multi-stream cloud game control device, and the multi-stream cloud game control device described below and the multi-stream cloud game control method described above may be referred to in correspondence.
The multi-stream cloud game control apparatus includes:
a memory for storing a computer program;
and the processor is used for realizing the steps of the multi-path streaming cloud game control method of the embodiment of the method when executing the computer program.
Specifically, referring to fig. 8, which shows the structure of the multi-stream cloud game control device provided by this embodiment, the device may vary considerably with configuration or performance and may include one or more central processing units (CPU) 322 (e.g., one or more processors) and a memory 332, the memory 332 storing one or more computer application programs 342 or data 344. The memory 332 may be transient or persistent storage. The program stored in the memory 332 may include one or more modules (not shown), and each module may include a series of instruction operations on the data processing device. Further, the central processing unit 322 may be configured to communicate with the memory 332 and execute, on the multi-stream cloud game control device 301, the series of instruction operations in the memory 332.
The multi-stream cloud game control apparatus 301 may also include one or more power supplies 326, one or more wired or wireless network interfaces 350, one or more input-output interfaces 358, and/or one or more operating systems 341.
The steps in the multi-stream cloud game control method described above may be implemented by the structure of a multi-stream cloud game control apparatus.
Corresponding to the above method embodiments, the present invention further provides a readable storage medium, and a readable storage medium described below and a multi-stream cloud game control method described above may be referred to in correspondence.
A readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the multi-stream cloud game control method of the above method embodiment.
The readable storage medium may be a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and various other readable storage media capable of storing program codes.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.

Claims (10)

1. A multi-path streaming cloud game control method is characterized by comprising the following steps:
the method comprises the steps that a host end determines each user terminal connected with streaming in a cloud game system;
non-adjacent P frames in the video frames obtained by screen capture are used as reference P frames, and the rest P frames are used as non-reference P frames; wherein the video frame comprises a P frame and an I frame;
performing predictive coding on the reference P frame and the non-reference P frame according to adjacent reference P frames or adjacent I frames, and taking the coded video frame as a video frame to be transmitted;
and deleting the corresponding number of the non-reference P frames from the video frames to be transmitted according to the network state of the user terminal, and then transmitting the video frames.
2. The method of claim 1, wherein taking non-adjacent P frames in the video frames obtained by screen capture as reference P frames and taking the remaining P frames as non-reference P frames comprises:
if the video frame currently to be coded among the video frames obtained by screen capture is a P frame, judging whether the remainder of dividing by 2 the frame-number difference between the current P frame to be coded and the adjacent I frame is 0;
if yes, marking the current P frame to be coded as a reference frame;
if not, marking the current P frame to be coded as a non-reference frame.
3. The method of claim 2, wherein the predictively encoding the reference P-frame and the non-reference P-frame from neighboring reference P-frames or neighboring I-frames comprises:
if the current P frame to be coded is the reference frame, deleting the previous P frame from a decoded picture buffer and a reference frame queue, and performing predictive coding according to a P frame or an I frame that is separated from the current P frame to be coded by one frame;
and if the current P frame to be coded is the non-reference frame, performing predictive coding according to the P frame adjacent to the current P frame to be coded.
4. The method according to claim 1, wherein the transmitting video frames after deleting a corresponding number of non-reference P frames from the video frames to be transmitted according to the network status of the user terminal comprises:
determining a current video frame to be transmitted from the video frames to be transmitted, and judging whether the current video frame to be transmitted belongs to a discardable coding frame; wherein the discardable encoded frame comprises: the non-reference P frame;
if not, transmitting the current video frame to be transmitted to the user terminal;
if yes, judging whether the network state of the user terminal reaches a threshold value;
if not, deleting the current video frame to be transmitted, and executing the step of determining the current video frame to be transmitted from the video frames to be transmitted;
and if so, executing the step of transmitting the current video frame to be transmitted to the user terminal.
5. The method of claim 4, wherein determining whether the network status of the user terminal reaches a threshold comprises:
judging whether the frame transmission delay of the user terminal is greater than a delay threshold value or not;
judging whether the frame transmission packet loss rate of the user terminal is greater than a packet loss threshold value or not;
if the frame transmission delay of the user terminal is greater than the delay threshold or the frame transmission packet loss rate is greater than the packet loss threshold, judging that the network state of the user terminal does not reach the threshold;
and if the frame transmission delay of the user terminal is not greater than the delay threshold and the frame transmission packet loss rate is not greater than the packet loss threshold, determining that the network state of the user terminal reaches the threshold.
6. The method of claim 5, wherein determining whether the frame transmission delay of the user terminal is greater than a delay threshold comprises:
judging whether the difference between the frame number of the current video frame to be transmitted and the frame number of the latest frame decoded and displayed by the user terminal is greater than a difference threshold;
if yes, judging that the frame transmission delay is greater than a delay threshold;
if not, the frame transmission delay is judged to be not greater than the delay threshold.
7. The method of claim 1, wherein before taking non-adjacent P frames in the video frames obtained by screen capture as reference P frames and the remaining P frames as non-reference P frames, the method further comprises:
determining the network bandwidth between the user terminal and the host;
judging whether the user terminal connected for streaming is the terminal holding the video decision right;
if yes, setting the current encoding bitrate to that network bandwidth, and executing the step of taking non-adjacent P frames in the video frames obtained by screen capture as reference P frames and the remaining P frames as non-reference P frames;
and if not, setting the current encoding bitrate to the minimum of the network bandwidths of all user terminals accessing the stream.
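A sketch of this bitrate decision; the terminal records and the has_decision_right flag are assumptions made for the example.

```python
def choose_encoding_bitrate(terminals):
    """terminals: list of dicts like {'bandwidth': 8_000_000, 'has_decision_right': False},
    one entry per user terminal accessing the stream."""
    for t in terminals:
        if t["has_decision_right"]:
            # the terminal holding the video decision right gets its full bandwidth
            return t["bandwidth"]
    # otherwise encode for the weakest link among all streaming terminals
    return min(t["bandwidth"] for t in terminals)

# Example: with terminals at 8, 5 and 12 Mbps and none holding the decision
# right, the encoder would target 5 Mbps.
print(choose_encoding_bitrate([
    {"bandwidth": 8_000_000, "has_decision_right": False},
    {"bandwidth": 5_000_000, "has_decision_right": False},
    {"bandwidth": 12_000_000, "has_decision_right": False},
]))
```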
8. A multi-path streaming cloud game control apparatus, comprising:
a terminal determining unit, configured to determine, at the host side, each user terminal connected via streaming in the cloud game system;
a frame dividing unit, configured to take non-adjacent P frames in the video frames obtained by screen capture as reference P frames and the remaining P frames as non-reference P frames, wherein the video frames comprise P frames and I frames;
a hierarchical coding unit, configured to predictively encode the reference P frames and the non-reference P frames according to an adjacent reference P frame or an adjacent I frame, and to take the encoded video frames as the video frames to be transmitted;
and an adaptive transmission unit, configured to transmit the video frames after deleting a corresponding number of non-reference P frames from the video frames to be transmitted according to the network state of the user terminal.
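Purely as a structural illustration of how the four units could be composed on the host side; every class and method name here is hypothetical and not taken from the patent.

```python
class MultiPathStreamingController:
    """Composes the terminal determining, frame dividing, hierarchical coding
    and adaptive transmission units of claim 8 (all collaborators injected)."""

    def __init__(self, terminal_determiner, frame_divider, hierarchical_coder, adaptive_sender):
        self.terminal_determiner = terminal_determiner   # lists streaming-connected terminals
        self.frame_divider = frame_divider               # marks reference / non-reference P frames
        self.hierarchical_coder = hierarchical_coder     # predictive coding per the marks
        self.adaptive_sender = adaptive_sender           # per-terminal non-reference frame dropping

    def push_captured_frames(self, captured_frames):
        terminals = self.terminal_determiner.list_terminals()
        marks = self.frame_divider.mark(captured_frames)
        encoded = self.hierarchical_coder.encode(captured_frames, marks)
        for terminal in terminals:
            self.adaptive_sender.send(encoded, terminal)
```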
9. A multi-path streaming cloud game control device, comprising:
a memory for storing a computer program;
and a processor for implementing the steps of the multi-path streaming cloud game control method according to any one of claims 1 to 7 when executing the computer program.
10. A readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the multi-path streaming cloud game control method according to any one of claims 1 to 7.
CN202011641796.0A 2020-12-31 2020-12-31 Multi-path streaming cloud game control method, device, equipment and storage medium Pending CN112866746A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011641796.0A CN112866746A (en) 2020-12-31 2020-12-31 Multi-path streaming cloud game control method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011641796.0A CN112866746A (en) 2020-12-31 2020-12-31 Multi-path streaming cloud game control method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112866746A (en) 2021-05-28

Family

ID=76000833

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011641796.0A Pending CN112866746A (en) 2020-12-31 2020-12-31 Multi-path streaming cloud game control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112866746A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070248100A1 (en) * 2006-04-25 2007-10-25 Microsoft Corporation Quality of service support for A/V streams
CN101621688A (en) * 2009-04-30 2010-01-06 武汉大学 Codec method for realizing AVS video standard time domain classification
CN103701634A (en) * 2013-12-10 2014-04-02 广州华多网络科技有限公司 Method and device for transmitting multimedia data
CN105812097A (en) * 2016-03-16 2016-07-27 北京邮电大学 Self-adaptive AMR code rate adjusting method based on network states
CN108174234A (en) * 2018-01-12 2018-06-15 珠海全志科技股份有限公司 A kind of flow-medium transmission method and system
CN111615006A (en) * 2020-05-29 2020-09-01 高小翎 Video code conversion transmission control system based on network state self-evaluation

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114025233A (en) * 2021-10-27 2022-02-08 网易(杭州)网络有限公司 Data processing method and device, electronic equipment and storage medium
CN114025233B (en) * 2021-10-27 2023-07-14 网易(杭州)网络有限公司 Data processing method and device, electronic equipment and storage medium
CN114007137A (en) * 2021-10-29 2022-02-01 杭州雾联科技有限公司 ROI-based video hierarchical coding method, device and medium
WO2023246936A1 (en) * 2022-06-24 2023-12-28 杭州海康威视数字技术股份有限公司 Image processing method and apparatus, and device

Similar Documents

Publication Publication Date Title
KR102039778B1 (en) Method and apparatus for adaptively providing multiple bit rate stream media in server
CN112866746A (en) Multi-path streaming cloud game control method, device, equipment and storage medium
US20170347158A1 (en) Frame Dropping Method for Video Frame and Video Sending Apparatus
US8621532B2 (en) Method of transmitting layered video-coded information
CN109660879B (en) Live broadcast frame loss method, system, computer equipment and storage medium
CN110024409B (en) Method and apparatus for key frame de-emphasis of video stream with multiple receivers
CN109168083B (en) Streaming media real-time playing method and device
US20170142029A1 (en) Method for data rate adaption in online media services, electronic device, and non-transitory computer-readable storage medium
CN113038128B (en) Data transmission method and device, electronic equipment and storage medium
US20200296470A1 (en) Video playback method, terminal apparatus, and storage medium
WO2017084277A1 (en) Code stream self-adaption method and system for online media service
WO2023142716A1 (en) Encoding method and apparatus, real-time communication method and apparatus, device, and storage medium
US20110067072A1 (en) Method and apparatus for performing MPEG video streaming over bandwidth constrained networks
CN113068001A (en) Data processing method, device, equipment and medium based on cascade camera
WO2023010992A1 (en) Video coding method and apparatus, computer readable medium, and electronic device
CN111970565A (en) Video data processing method and device, electronic equipment and storage medium
US20220408097A1 (en) Adaptively encoding video frames using content and network analysis
KR19980081099A (en) Image transfer device and image transfer method
CN116962179A (en) Network transmission optimization method and device, computer readable medium and electronic equipment
CN115514960A (en) Video coding method and device, electronic equipment and storage medium
CN112468818B (en) Video communication realization method and device, medium and electronic equipment
CN115767146A (en) Data flow control method, system, device, electronic equipment and storage medium
CN112565670B (en) Method for rapidly and smoothly drawing multi-layer video of cloud conference
CN105306970B (en) A kind of control method and device of live streaming media transmission speed
CN114007137A (en) ROI-based video hierarchical coding method, device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination