WO2020155956A1 - Method and apparatus for balancing the first frame and limiting traffic, computer device, and readable storage medium - Google Patents

Method and apparatus for balancing the first frame and limiting traffic, computer device, and readable storage medium

Info

Publication number
WO2020155956A1
WO2020155956A1 PCT/CN2019/128418 CN2019128418W
Authority
WO
WIPO (PCT)
Prior art keywords
audio
video
capacity
buffer area
code stream
Prior art date
Application number
PCT/CN2019/128418
Other languages
English (en)
French (fr)
Inventor
郑翰超
吴志强
陈辉
Original Assignee
上海哔哩哔哩科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海哔哩哔哩科技有限公司 filed Critical 上海哔哩哔哩科技有限公司
Priority to EP19912992.5A priority Critical patent/EP3863293A4/en
Publication of WO2020155956A1 publication Critical patent/WO2020155956A1/zh
Priority to US17/329,716 priority patent/US11463494B2/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4331Caching operations, e.g. of an advertisement for later insertion during playback
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/61Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L65/612Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for unicast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/61Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L65/613Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for the control of the source by the destination
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/65Network streaming protocols, e.g. real-time transport protocol [RTP] or real-time control protocol [RTCP]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/75Media network packet handling
    • H04L65/765Media network packet handling intermediate
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439Processing of audio elementary streams
    • H04N21/4392Processing of audio elementary streams involving audio buffer management
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44004Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving video buffer management, e.g. video decoder buffer or video display buffer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8106Monomedia components thereof involving special audio data, e.g. different tracks for different languages
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments

Definitions

  • This application relates to the technical field of audio and video playback, and in particular to a method and apparatus for balancing the first frame and limiting traffic, a computer device, and a readable storage medium.
  • With the continuous development of the Internet and smart terminals, more and more users choose to play streaming media such as audio and video on smart terminals (for example, mobile phones and computers); users can obtain video content and audio content from network servers through a smart terminal and play them on the smart terminal.
  • The inventors realized that, in order to achieve a better playback experience, how to quickly obtain the first frame of content and quickly start playback when playing audio, video, and other streaming media has become an urgent technical problem to be solved.
  • The purpose of this application is to provide a method and apparatus for balancing the first frame and limiting traffic, a computer device, and a readable storage medium, which can increase the download speed of video content and achieve a fast start of playback, thereby improving the user experience.
  • According to one aspect of this application, a method for balancing the first frame and limiting traffic is provided, which includes the following steps: when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server; obtaining a video bitrate (code stream) from the video data file and an audio bitrate from the audio data file; and determining the capacity of a video buffer and the capacity of an audio buffer according to the video bitrate and the audio bitrate.
  • Optionally, determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate further includes: calculating, according to the video bitrate, a first capacity required to play a first set number of video frames, and setting the first capacity as the capacity of the video buffer; and calculating, according to the audio bitrate, a second capacity required to play the first set number of audio frames, and setting the second capacity as the capacity of the audio buffer.
  • Optionally, determining the capacity of the video buffer and the capacity of the audio buffer further includes: determining an audio-video bitrate ratio according to the video bitrate and the audio bitrate, and dividing a preset total initial buffer capacity into the capacity of the video buffer and the capacity of the audio buffer according to that ratio.
  • Optionally, after the capacity of the video buffer and the capacity of the audio buffer are determined, the method further includes: during playback of the streaming media content, each time a second set number of video frames has been played, doubling the capacity of the video buffer until it reaches half of a preset maximum total cache capacity; and each time the second set number of audio frames has been played, doubling the capacity of the audio buffer until it reaches half of the preset maximum total cache capacity.
  • Optionally, after the capacity of the video buffer and the capacity of the audio buffer are determined, the method further includes: obtaining the first video frame of the streaming media content and caching it in the video buffer; and obtaining the first audio frame of the streaming media content and caching it in the audio buffer.
  • In order to achieve the above purpose, this application also provides an apparatus for balancing the first frame and limiting traffic, which includes the following components:
  • a first obtaining module, configured to obtain, when streaming media content is played using DASH, a video data file and an audio data file of the streaming media content to be played from a server;
  • a second obtaining module, configured to obtain a video bitrate from the video data file and an audio bitrate from the audio data file; and
  • a capacity determination module, configured to determine the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate.
  • In order to achieve the above purpose, this application also provides a computer device, which includes a memory, a processor, and computer-readable instructions stored on the memory and executable on the processor; when the processor executes the computer-readable instructions, the following steps are performed: when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server; obtaining a video bitrate from the video data file and an audio bitrate from the audio data file; and determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate.
  • In order to achieve the above purpose, this application also provides a computer-readable storage medium on which computer-readable instructions are stored; when the computer-readable instructions are executed by a processor, the same steps are performed.
  • The method, apparatus, computer device, and readable storage medium provided in this application configure the capacity of the video buffer and the capacity of the audio buffer according to the ratio between the video bitrate and the audio bitrate, so that the capacity of the video buffer is greater than the capacity of the audio buffer. As a result, the video content occupies more bandwidth during caching, the download speed of the video content increases, and the same number of video frames and audio frames can be downloaded in the same amount of time.
  • In addition, to output the first frame of the streaming media content quickly, the initial buffer capacities are kept small, so that the initial video buffer and the initial audio buffer only need to hold the first frame of video content and the first frame of audio content.
  • During audio and video playback, the capacities of the audio buffer and the video buffer are expanded dynamically. Expanding the buffer capacities reduces the number of caching requests and avoids network congestion.
  • FIG. 1 is a schematic flowchart of an optional method for balancing the first frame and limiting traffic provided in the first embodiment;
  • FIG. 2 is a schematic diagram of optional program modules of the apparatus for balancing the first frame and limiting traffic provided in the second embodiment;
  • FIG. 3 is a schematic diagram of an optional hardware architecture of the computer device provided in the third embodiment.
  • FIG. 1 is a schematic flowchart of an optional method for balancing the first frame and limiting traffic according to this application. The method is applied to a DASH client. As shown in FIG. 1, the method may include the following steps.
  • Step S101: when streaming media content is played using DASH, obtain a video data file and an audio data file of the streaming media content to be played from the server.
  • DASH (Dynamic Adaptive Streaming over HTTP) is an adaptive bitrate streaming technology that enables high-quality streaming media to be delivered over the Internet from conventional HTTP web servers.
  • In traditional audio and video playback technology, the audio content and the video content are contained in a single streaming media file; the client obtains the streaming media file from the server and parses it to play the audio and video. In DASH, the audio content and the video content are separated, so the client needs to obtain a video data file and an audio data file from the server and parse both files to play the audio and video.
  • In addition, DASH decomposes a piece of streaming media content into multiple segments, each containing a certain length (for example, 10 seconds) of playable content, and each segment is available at multiple bitrates; the client can choose to download and play the segments of a specified bitrate according to the current network conditions.
  • Compared with traditional playback technology, DASH allows switching to streaming media content of a different bitrate at any time during playback.
  • Specifically, step S101 includes: sending a request for the video data file and the audio data file to the server, and receiving the video data file and the audio data file returned by the server.
  • Step S102: obtain a video bitrate from the video data file, and obtain an audio bitrate from the audio data file.
  • Specifically, step S102 includes: parsing the video data file to obtain the video bitrate information in its file header area, and parsing the audio data file to obtain the audio bitrate information in its file header area.
  • In practical applications, the video bitrate of video content is about ten times the audio bitrate of audio content; for example, the video bitrate is 2M/s while the audio bitrate is 200K/s.
  • Step S103: determine the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate.
  • The video buffer is a preset area of memory for caching video content, and the audio buffer is a preset area of memory for caching audio content; in this embodiment, the capacities of the video buffer and the audio buffer can be adjusted dynamically.
  • Specifically, in step S103, the capacity of the video buffer and the capacity of the audio buffer can be determined in one of the following two ways.
  • Way 1: according to the video bitrate, calculate the first capacity required to play a first set number of video frames, and set the first capacity as the capacity of the video buffer; according to the audio bitrate, calculate the second capacity required to play the first set number of audio frames, and set the second capacity as the capacity of the audio buffer.
  • Preferably, the capacity required to play the first two frames of video content is calculated from the video bitrate and the capacity required to play the first two frames of audio content is calculated from the audio bitrate, and these are used as the capacity of the video buffer and the capacity of the audio buffer, respectively.
  • When DASH is used to play audio and video, playback proceeds with the frame as the smallest unit; that is, once one frame of video content has been cached, that frame is played. Therefore, in this embodiment, to achieve a fast start of playback, the capacities required to play the first two frames of video content and the first two frames of audio content are used as the initial capacities of the video buffer and the audio buffer, respectively.
  • It should be noted that, to achieve a fast start, the initial video buffer and audio buffer are set to their minimum sizes; and because the video bitrate is much larger than the audio bitrate, the resulting capacity of the video buffer is much larger than that of the audio buffer. The video content therefore occupies more bandwidth during caching, and the same number of video frames and audio frames can be downloaded in the same amount of time.
  • By contrast, the capacity of the video buffer and the capacity of the audio buffer are usually set to be the same, for example 2M each; the video content and the audio content then occupy the same bandwidth during caching, so that within the same download time the cached audio content far exceeds the cached video content in playable frames. In addition, in the prior art the capacities of the video buffer and the audio buffer are fixed.
  • Way 2: determine the audio-video bitrate ratio according to the video bitrate and the audio bitrate, and divide a preset total initial buffer capacity into the capacity of the video buffer and the capacity of the audio buffer according to that ratio. For example, if the ratio of the video bitrate to the audio bitrate is 10:1, the preset total initial buffer capacity is divided into two parts at a ratio of 10:1, the capacity of one part being ten times that of the other.
  • In addition, to output the first frame of the streaming media content quickly, the total initial buffer capacity is limited, so that the initial video buffer and the initial audio buffer only need to hold the first frame of video content and the first frame of audio content.
  • In this embodiment, whichever way is used, the capacity of the video buffer and the capacity of the audio buffer are configured according to the ratio between the video bitrate and the audio bitrate, so that the capacity of the video buffer is greater than the capacity of the audio buffer. Since bandwidth cannot be allocated directly, the effect of allocating more bandwidth to caching video content is achieved through the difference in buffer capacity.
  • Further, after the capacity of the video buffer and the capacity of the audio buffer are determined, the method further includes: during playback of the streaming media content, each time a second set number of video frames has been played, doubling the capacity of the video buffer until it reaches half of a preset maximum total cache capacity; and each time the second set number of audio frames has been played, doubling the capacity of the audio buffer until it reaches half of the preset maximum total cache capacity.
  • Preferably, every time 10 frames of video content have been played, the capacity of the video buffer is doubled until it reaches 2M; and every time 10 frames of audio content have been played, the capacity of the audio buffer is doubled until it reaches 2M.
  • In this embodiment, the capacities of the audio buffer and the video buffer are expanded dynamically during playback; expanding the buffer capacities reduces the number of caching requests and avoids network congestion.
  • Furthermore, after the capacity of the video buffer and the capacity of the audio buffer are determined, the method further includes: obtaining the first video frame of the streaming media content and caching it in the video buffer; and obtaining the first audio frame of the streaming media content and caching it in the audio buffer.
  • Once the first frame of audio content has been cached in the audio buffer and the first frame of video content has been cached in the video buffer, the first audio frame and the first video frame can be played synchronously to start audio and video playback. Because only the first audio frame and the first video frame need to be played synchronously at start-up, the capacity of the audio buffer that caches the first audio frame and the capacity of the video buffer that caches the first video frame do not need to be large; during playback, the capacities of the audio buffer and the video buffer are then expanded gradually.
  • Based on the method of the first embodiment, this embodiment provides an apparatus for balancing the first frame and limiting traffic. FIG. 2 shows an optional structural block diagram of the apparatus. The apparatus is divided into one or more program modules, which are stored in a storage medium and executed by one or more processors to implement this application. A program module in this application refers to a series of computer-readable instruction segments capable of performing a specific function; the following description introduces the function of each program module in this embodiment.
  • As shown in FIG. 2, the apparatus for balancing the first frame and limiting traffic, applied to a DASH client, includes the following components:
  • a first obtaining module 201, configured to obtain, when streaming media content is played using DASH, a video data file and an audio data file of the streaming media content to be played from the server. Specifically, the first obtaining module 201 is configured to send a request for the video data file and the audio data file to the server and to receive the files returned by the server;
  • a second obtaining module 202, configured to obtain a video bitrate from the video data file and an audio bitrate from the audio data file. Specifically, the second obtaining module 202 is configured to parse the video data file to obtain the video bitrate information in its file header area, and to parse the audio data file to obtain the audio bitrate information in its file header area;
  • a capacity determination module 203, configured to determine the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate. The video buffer is a preset area of memory for caching video content, the audio buffer is a preset area of memory for caching audio content, and in this embodiment their capacities can be adjusted dynamically. Specifically, the capacity determination module 203 is configured to: calculate, according to the video bitrate, the first capacity required to play a first set number of video frames and set it as the capacity of the video buffer, and calculate, according to the audio bitrate, the second capacity required to play the first set number of audio frames and set it as the capacity of the audio buffer; or determine the audio-video bitrate ratio and divide a preset total initial buffer capacity into the capacity of the video buffer and the capacity of the audio buffer according to that ratio.
  • Further, the apparatus also includes a dynamic adjustment module, configured to, after the capacities of the video buffer and the audio buffer are determined and during playback of the streaming media content, double the capacity of the video buffer each time a second set number of video frames has been played, until it reaches half of the preset maximum total cache capacity, and double the capacity of the audio buffer each time the second set number of audio frames has been played, until it reaches half of the preset maximum total cache capacity.
  • Furthermore, the apparatus also includes an obtaining and caching module, configured to, after the capacities of the video buffer and the audio buffer are determined, obtain the first video frame of the streaming media content and cache it in the video buffer, and obtain the first audio frame of the streaming media content and cache it in the audio buffer.
  • This embodiment also provides a computer device, such as a smartphone, a tablet computer, a notebook computer, a desktop computer, a rack server, a blade server, a tower server, or a cabinet server (including an independent server or a server cluster composed of multiple servers) capable of executing programs.
  • As shown in FIG. 3, the computer device 30 of this embodiment at least includes, but is not limited to, a memory 301 and a processor 302 that can be communicatively connected to each other through a system bus. It should be pointed out that FIG. 3 only shows the computer device 30 with components 301 and 302, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead.
  • The memory 301 (i.e., a readable storage medium) includes flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the memory 301 may be an internal storage unit of the computer device 30, such as its hard disk or internal memory. In other embodiments, the memory 301 may also be an external storage device of the computer device 30, for example a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card. Of course, the memory 301 may also include both the internal storage unit of the computer device 30 and its external storage device. In this embodiment, the memory 301 is generally used to store the operating system and the various application software installed on the computer device 30, such as the program code of the apparatus for balancing the first frame and limiting traffic of the second embodiment. In addition, the memory 301 may also be used to temporarily store various types of data that have been output or are to be output.
  • The processor 302 may in some embodiments be a central processing unit (CPU), a controller, a microcontroller, a microprocessor, or another data processing chip. The processor 302 is generally used to control the overall operation of the computer device 30.
  • Specifically, the processor 302 is configured to execute the program of the method for balancing the first frame and limiting traffic stored in the memory 301; when the program is executed, the following steps are implemented: when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from the server; obtaining a video bitrate from the video data file and an audio bitrate from the audio data file; and determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate.
  • This embodiment also provides a computer-readable storage medium (volatile or non-volatile), such as flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disk, a server, an application store, and the like, on which computer-readable instructions are stored; when the computer-readable instructions are executed by a processor, the same method steps are implemented.
  • In summary, the capacity of the video buffer and the capacity of the audio buffer are configured according to the ratio between the video bitrate and the audio bitrate, so that the capacity of the video buffer is greater than the capacity of the audio buffer. The video content therefore occupies more bandwidth during caching, the download speed of the video content increases, and the same number of video frames and audio frames can be downloaded in the same amount of time.
  • In addition, to output the first frame of the streaming media content quickly, the initial buffer capacities are limited, so that the initial video buffer and the initial audio buffer only need to hold the first frame of video content and the first frame of audio content.
  • During audio and video playback, the capacities of the audio buffer and the video buffer are expanded dynamically, which reduces the number of caching requests and avoids network congestion.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

This application discloses a method and apparatus for balancing the first frame and limiting traffic, a computer device, and a readable storage medium. The method includes: when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server; obtaining a video bitrate from the video data file and an audio bitrate from the audio data file; and determining the capacity of a video buffer and the capacity of an audio buffer according to the video bitrate and the audio bitrate. This application can increase the download speed of video content and achieve a fast start of playback, thereby improving the user experience.

Description

Method and apparatus for balancing the first frame and limiting traffic, computer device, and readable storage medium
This application claims priority to Chinese patent application No. 201910092901.0, filed on January 30, 2019 and entitled "Method and apparatus for balancing the first frame and limiting traffic, computer device, and readable storage medium", the entire content of which is incorporated herein by reference.
Technical Field
This application relates to the technical field of audio and video playback, and in particular to a method and apparatus for balancing the first frame and limiting traffic, a computer device, and a readable storage medium.
Background
With the continuous development of the Internet and smart terminals, more and more users choose to use smart terminals (for example, mobile phones and computers) to play streaming media such as audio and video; a user can obtain video content and audio content from a network server through a smart terminal and play them on the smart terminal. The inventors realized that, in order to achieve a better playback experience, how to quickly obtain the first frame of content and quickly start playback when playing audio, video, and other streaming media has become an urgent technical problem to be solved.
Summary
The purpose of this application is to provide a method and apparatus for balancing the first frame and limiting traffic, a computer device, and a readable storage medium, which can increase the download speed of video content and achieve a fast start of playback, thereby improving the user experience.
According to one aspect of this application, a method for balancing the first frame and limiting traffic is provided, which includes the following steps:
when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server;
obtaining a video bitrate (code stream) from the video data file, and obtaining an audio bitrate from the audio data file; and
determining the capacity of a video buffer and the capacity of an audio buffer according to the video bitrate and the audio bitrate.
Optionally, determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate further includes:
calculating, according to the video bitrate, a first capacity required to play a first set number of video frames, and setting the first capacity as the capacity of the video buffer; and
calculating, according to the audio bitrate, a second capacity required to play the first set number of audio frames, and setting the second capacity as the capacity of the audio buffer.
Optionally, determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate further includes:
determining an audio-video bitrate ratio according to the video bitrate and the audio bitrate, and dividing a preset total initial buffer capacity into the capacity of the video buffer and the capacity of the audio buffer according to the ratio.
Optionally, after the capacity of the video buffer and the capacity of the audio buffer are determined, the method further includes:
during playback of the streaming media content, each time a second set number of video frames has been played, doubling the capacity of the video buffer until the capacity of the video buffer reaches half of a preset maximum total cache capacity; and each time the second set number of audio frames has been played, doubling the capacity of the audio buffer until the capacity of the audio buffer reaches half of the preset maximum total cache capacity.
Optionally, after the capacity of the video buffer and the capacity of the audio buffer are determined, the method further includes:
obtaining the first video frame of the streaming media content, and caching the first video frame in the video buffer; and
obtaining the first audio frame of the streaming media content, and caching the first audio frame in the audio buffer.
In order to achieve the above purpose, this application also provides an apparatus for balancing the first frame and limiting traffic, which includes the following components:
a first obtaining module, configured to obtain, when streaming media content is played using DASH, a video data file and an audio data file of the streaming media content to be played from a server;
a second obtaining module, configured to obtain a video bitrate from the video data file and an audio bitrate from the audio data file; and
a capacity determination module, configured to determine the capacity of a video buffer and the capacity of an audio buffer according to the video bitrate and the audio bitrate.
In order to achieve the above purpose, this application also provides a computer device, which includes a memory, a processor, and computer-readable instructions stored on the memory and executable on the processor; the processor, when executing the computer-readable instructions, performs the following steps:
when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server;
obtaining a video bitrate from the video data file, and obtaining an audio bitrate from the audio data file; and
determining the capacity of a video buffer and the capacity of an audio buffer according to the video bitrate and the audio bitrate.
In order to achieve the above purpose, this application also provides a computer-readable storage medium on which computer-readable instructions are stored; the computer-readable instructions, when executed by a processor, perform the following steps:
when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server;
obtaining a video bitrate from the video data file, and obtaining an audio bitrate from the audio data file; and
determining the capacity of a video buffer and the capacity of an audio buffer according to the video bitrate and the audio bitrate.
The method, apparatus, computer device, and readable storage medium for balancing the first frame and limiting traffic provided in this application configure the capacity of the video buffer and the capacity of the audio buffer according to the ratio between the video bitrate and the audio bitrate, so that the capacity of the video buffer is greater than the capacity of the audio buffer. As a result, the video content can occupy more bandwidth during caching, the download speed of the video content increases, and the same number of video frames and audio frames can be downloaded in the same amount of time. In addition, in order to output the first frame of the streaming media content quickly, the initial buffer capacities are limited, so that the initial video buffer and the initial audio buffer only need to hold the first frame of video content and the first frame of audio content. During audio and video playback, the capacities of the audio buffer and the video buffer are expanded dynamically; expanding the buffer capacities reduces the number of caching requests and avoids network congestion.
Brief Description of the Drawings
By reading the following detailed description of the preferred embodiments, various other advantages and benefits will become clear to those of ordinary skill in the art. The drawings are only for the purpose of illustrating the preferred embodiments and are not to be regarded as limiting this application. Throughout the drawings, the same reference numerals denote the same components. In the drawings:
FIG. 1 is a schematic flowchart of an optional method for balancing the first frame and limiting traffic provided in Embodiment 1;
FIG. 2 is a schematic diagram of optional program modules of the apparatus for balancing the first frame and limiting traffic provided in Embodiment 2;
FIG. 3 is a schematic diagram of an optional hardware architecture of the computer device provided in Embodiment 3.
Detailed Description
In order to make the purpose, technical solutions, and advantages of this application clearer, this application is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain this application and are not intended to limit it. Based on the embodiments in this application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of this application.
Embodiment 1
The method for balancing the first frame and limiting traffic provided in this application is described below with reference to the drawings.
FIG. 1 is a schematic flowchart of an optional method for balancing the first frame and limiting traffic according to this application. The method is applied to a DASH client. As shown in FIG. 1, the method may include the following steps.
Step S101: when streaming media content is played using DASH, obtain a video data file and an audio data file of the streaming media content to be played from the server.
DASH (Dynamic Adaptive Streaming over HTTP) is an adaptive bitrate streaming technology that enables high-quality streaming media to be delivered over the Internet from conventional HTTP web servers. In traditional audio and video playback technology, the audio content and the video content are contained in a single streaming media file; the client obtains the streaming media file from the server and parses it to play the audio and video. In DASH, the audio content and the video content are separated, so the client needs to obtain a video data file and an audio data file from the server and parse both files in order to play the audio and video. In addition, DASH decomposes a piece of streaming media content into multiple segments, each containing a certain length (for example, 10 seconds) of playable content, and each segment is available at multiple bitrates; the client can choose to download and play the segments of a specified bitrate according to the current network conditions. Compared with traditional playback technology, DASH allows switching to streaming media content of a different bitrate at any time during playback (a minimal selection sketch follows below).
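The per-segment bitrate selection described above is standard DASH client behaviour rather than part of the claimed method. A minimal sketch is given below; the Representation shape, the 0.8 headroom factor, and the lowest-bitrate fallback are purely illustrative assumptions.

```typescript
// Illustrative sketch: pick the highest-bitrate representation that fits the
// measured bandwidth with some headroom; otherwise fall back to the lowest one.
interface Representation {
  id: string;
  bandwidthBps: number; // advertised bitrate in bits per second
}

function pickRepresentation(reps: Representation[], measuredBps: number): Representation {
  const sorted = [...reps].sort((a, b) => b.bandwidthBps - a.bandwidthBps); // highest first
  const affordable = sorted.find((r) => r.bandwidthBps <= measuredBps * 0.8); // keep headroom
  return affordable ?? sorted[sorted.length - 1]; // lowest bitrate as a last resort
}

console.log(
  pickRepresentation(
    [
      { id: "video-6m", bandwidthBps: 6_000_000 },
      { id: "video-2m", bandwidthBps: 2_000_000 },
      { id: "video-800k", bandwidthBps: 800_000 },
    ],
    3_000_000, // measured bandwidth of roughly 3 Mb/s
  ).id, // "video-2m"
);
```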
Specifically, step S101 includes:
sending a request for the video data file and the audio data file to the server, and receiving the video data file and the audio data file returned by the server.
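A minimal sketch of step S101 follows, assuming the client already knows the URLs of the two data files; the parameter names and error handling are illustrative, not part of the claims.

```typescript
// Sketch of step S101: request the video data file and the audio data file
// from the server concurrently and return their raw bytes.
async function fetchMediaFiles(
  videoUrl: string,
  audioUrl: string,
): Promise<{ video: ArrayBuffer; audio: ArrayBuffer }> {
  const [videoRes, audioRes] = await Promise.all([fetch(videoUrl), fetch(audioUrl)]);
  if (!videoRes.ok || !audioRes.ok) {
    throw new Error("failed to fetch the DASH video/audio data files");
  }
  return {
    video: await videoRes.arrayBuffer(),
    audio: await audioRes.arrayBuffer(),
  };
}
```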
Step S102: obtain a video bitrate from the video data file, and obtain an audio bitrate from the audio data file.
Specifically, step S102 includes:
parsing the video data file to obtain the video bitrate information in its file header area, and parsing the audio data file to obtain the audio bitrate information in its file header area.
In practical applications, the video bitrate of video content is about ten times the audio bitrate of audio content; for example, the video bitrate of the video content is 2M/s and the audio bitrate of the audio content is 200K/s.
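Step S102 obtains the bitrates by parsing the header area of each data file. The sketch below is a hedged stand-in that instead reads the standard bandwidth attribute of each Representation in the DASH MPD, which many clients already hold; using the MPD as the bitrate source is an assumption of this illustration, not the parsing described here.

```typescript
// Illustrative stand-in for step S102: read video/audio bitrates (bits per second)
// from Representation@bandwidth in the DASH MPD. In real manifests the mimeType
// may sit on the Representation rather than the AdaptationSet.
function readBitratesFromMpd(mpdXml: string): { videoBps: number; audioBps: number } {
  const doc = new DOMParser().parseFromString(mpdXml, "application/xml");
  let videoBps = 0;
  let audioBps = 0;
  for (const set of Array.from(doc.getElementsByTagNameNS("*", "AdaptationSet"))) {
    const mime = set.getAttribute("mimeType") ?? "";
    const rep = set.getElementsByTagNameNS("*", "Representation")[0];
    const bandwidth = Number(rep?.getAttribute("bandwidth") ?? 0);
    if (mime.startsWith("video/")) videoBps = bandwidth; // e.g. around 2,000,000 (the "2M/s" above)
    if (mime.startsWith("audio/")) audioBps = bandwidth; // e.g. around 200,000 (the "200K/s" above)
  }
  return { videoBps, audioBps };
}
```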
Step S103: determine the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate.
The video buffer is a preset area of memory used for caching video content, and the audio buffer is a preset area of memory used for caching audio content; in this embodiment, the capacities of the video buffer and the audio buffer can be adjusted dynamically.
Specifically, in step S103, the capacity of the video buffer and the capacity of the audio buffer can be determined in one of the following two ways.
Way 1:
calculating, according to the video bitrate, the first capacity required to play a first set number of video frames, and setting the first capacity as the capacity of the video buffer;
calculating, according to the audio bitrate, the second capacity required to play the first set number of audio frames, and setting the second capacity as the capacity of the audio buffer.
Preferably, the capacity required to play the first two frames of video content is calculated from the video bitrate and the capacity required to play the first two frames of audio content is calculated from the audio bitrate, and these are used as the capacity of the video buffer and the capacity of the audio buffer, respectively.
When DASH is used to play audio and video, playback proceeds with the frame as the smallest unit; that is, once a frame of video content has been cached, that frame is played. Therefore, in this embodiment, to achieve a fast start of playback, the capacities required to play the first two frames of video content and the first two frames of audio content are used as the initial capacities of the video buffer and the audio buffer, respectively. It should be noted that, to achieve a fast start, the initial video buffer and audio buffer are set to their minimum sizes; and because the video bitrate is much larger than the audio bitrate, the resulting capacity of the video buffer is much larger than that of the audio buffer, so that the video content can occupy more bandwidth during caching and the same number of video frames and audio frames can be downloaded in the same amount of time.
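A minimal sketch of way 1 follows. The 25 fps video frame rate and the 1024-sample, 48 kHz audio frames are illustrative assumptions; only the principle, sizing each buffer for the first set number of frames from its bitrate, comes from this embodiment.

```typescript
// Way 1 sketch: initial buffer capacity = bytes needed for the first N frames.
function initialBufferBytes(bitrateBps: number, frameDurationSec: number, frames: number): number {
  const bytesPerFrame = (bitrateBps / 8) * frameDurationSec;
  return Math.ceil(bytesPerFrame * frames);
}

// With the figures from the text (2M/s video, 200K/s audio) and the first two frames:
const videoBufferBytes = initialBufferBytes(2_000_000, 1 / 25, 2);      // = 20,000 bytes
const audioBufferBytes = initialBufferBytes(200_000, 1024 / 48_000, 2); // = 1,067 bytes
console.log(videoBufferBytes, audioBufferBytes); // the video buffer ends up much larger than the audio buffer
```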
Way 2:
determining the audio-video bitrate ratio according to the video bitrate and the audio bitrate, and dividing a preset total initial buffer capacity into the capacity of the video buffer and the capacity of the audio buffer according to that ratio.
It should be noted that, as the inventors have observed, the capacity of the video buffer and the capacity of the audio buffer are usually set to be the same, for example 2M each; the video content and the audio content then occupy the same bandwidth during caching, so that within the same download time the cached audio content far exceeds the cached video content in playable frames. Moreover, in the prior art the capacities of the video buffer and the audio buffer are fixed.
In this embodiment, the preset total initial buffer capacity is divided according to the ratio of the video bitrate to the audio bitrate; for example, if the ratio of the video bitrate to the audio bitrate is 10:1, the preset total initial buffer capacity is divided into two parts at a ratio of 10:1, the capacity of one part being ten times that of the other. In addition, to output the first frame of the streaming media content quickly, the total initial buffer capacity is limited, so that the initial video buffer and the initial audio buffer only need to hold the first frame of video content and the first frame of audio content.
In this embodiment, whichever of the above ways is used, the capacity of the video buffer and the capacity of the audio buffer are configured according to the ratio between the video bitrate and the audio bitrate, so that the capacity of the video buffer is greater than the capacity of the audio buffer. Since bandwidth cannot be allocated directly, the effect of allocating more bandwidth to caching video content is achieved through the difference in buffer capacity. In addition, to output the first frame of the streaming media content quickly, the initially configured capacities of the video buffer and the audio buffer are kept small.
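A minimal sketch of way 2 is shown below; the 22,000-byte total initial budget is an arbitrary illustrative figure, while the 10:1 split mirrors the 2M/s to 200K/s example above.

```typescript
// Way 2 sketch: split a preset total initial buffer capacity in the
// video : audio bitrate ratio, so the video buffer gets the larger share.
function splitInitialCapacity(
  totalBytes: number,
  videoBps: number,
  audioBps: number,
): { videoBytes: number; audioBytes: number } {
  const videoShare = videoBps / (videoBps + audioBps);
  const videoBytes = Math.round(totalBytes * videoShare);
  return { videoBytes, audioBytes: totalBytes - videoBytes };
}

// A 10:1 bitrate ratio and a 22,000-byte budget give a 20,000 / 2,000 split.
console.log(splitInitialCapacity(22_000, 2_000_000, 200_000)); // { videoBytes: 20000, audioBytes: 2000 }
```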
Further, after the capacity of the video buffer and the capacity of the audio buffer are determined, the method further includes:
during playback of the streaming media content, each time a second set number of video frames has been played, doubling the capacity of the video buffer until the capacity of the video buffer reaches half of a preset maximum total cache capacity; and each time the second set number of audio frames has been played, doubling the capacity of the audio buffer until the capacity of the audio buffer reaches half of the preset maximum total cache capacity.
Preferably, every time 10 frames of video content have been played, the capacity of the video buffer is doubled until it reaches 2M; and every time 10 frames of audio content have been played, the capacity of the audio buffer is doubled until it reaches 2M.
In this embodiment, the capacities of the audio buffer and the video buffer are expanded dynamically during audio and video playback. Expanding the capacities of the audio buffer and the video buffer reduces the number of caching requests and avoids network congestion.
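The expansion rule can be sketched as follows, assuming a preset maximum total cache of 4 MB so that half of it corresponds to the 2M cap mentioned above; the class shape and the method names are illustrative only.

```typescript
// Sketch of the dynamic expansion rule: after every batch of played frames,
// double the buffer capacity, capped at half of the preset maximum total cache.
class GrowableBuffer {
  private framesPlayedSinceGrowth = 0;

  constructor(
    public capacityBytes: number,                 // initial capacity from step S103
    private readonly framesPerGrowth: number,     // "second set number", e.g. 10
    private readonly maxTotalCacheBytes: number,  // preset maximum total cache size
  ) {}

  onFramePlayed(): void {
    this.framesPlayedSinceGrowth += 1;
    if (this.framesPlayedSinceGrowth >= this.framesPerGrowth) {
      this.framesPlayedSinceGrowth = 0;
      const cap = this.maxTotalCacheBytes / 2;
      this.capacityBytes = Math.min(this.capacityBytes * 2, cap);
    }
  }
}

// Example: a 20,000-byte video buffer doubling every 10 frames, capped at 2 MB.
const videoBuffer = new GrowableBuffer(20_000, 10, 4 * 1024 * 1024);
for (let i = 0; i < 30; i++) videoBuffer.onFramePlayed();
console.log(videoBuffer.capacityBytes); // 160,000 after three growth steps
```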
Furthermore, after the capacity of the video buffer and the capacity of the audio buffer are determined, the method further includes:
obtaining the first video frame of the streaming media content, and caching the first video frame in the video buffer;
obtaining the first audio frame of the streaming media content, and caching the first audio frame in the audio buffer.
In this embodiment, once the first frame of audio content has been cached in the audio buffer and the first frame of video content has been cached in the video buffer, the first audio frame and the first video frame can be played synchronously to start audio and video playback. Because only the first audio frame and the first video frame need to be played synchronously at start-up, the capacity of the audio buffer that caches the first audio frame and the capacity of the video buffer that caches the first video frame do not need to be large; during playback, the capacities of the audio buffer and the video buffer are then expanded gradually.
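A hedged sketch of this start-up path is given below; fetchFirstFrame and play are placeholders standing in for whatever downloader and player interfaces the client actually uses, and are not part of the patent.

```typescript
// Sketch of the fast-start path: cache only the first video frame and the first
// audio frame in their (small) initial buffers, then start them together.
async function startPlayback(
  fetchFirstFrame: (kind: "video" | "audio") => Promise<ArrayBuffer>,
  play: (firstVideoFrame: ArrayBuffer, firstAudioFrame: ArrayBuffer) => void,
): Promise<void> {
  const [firstVideoFrame, firstAudioFrame] = await Promise.all([
    fetchFirstFrame("video"),
    fetchFirstFrame("audio"),
  ]);
  // Both first frames are now buffered: begin synchronized playback immediately.
  play(firstVideoFrame, firstAudioFrame);
}
```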
Embodiment 2
Based on the method for balancing the first frame and limiting traffic provided in Embodiment 1, this embodiment provides an apparatus for balancing the first frame and limiting traffic. Specifically, FIG. 2 shows an optional structural block diagram of the apparatus. The apparatus is divided into one or more program modules, which are stored in a storage medium and executed by one or more processors to implement this application. A program module in this application refers to a series of computer-readable instruction segments capable of performing a specific function; the following description introduces the function of each program module in this embodiment.
As shown in FIG. 2, the apparatus for balancing the first frame and limiting traffic, applied to a DASH client, includes the following components:
a first obtaining module 201, configured to obtain, when streaming media content is played using DASH, a video data file and an audio data file of the streaming media content to be played from the server.
Specifically, the first obtaining module 201 is configured to:
send a request for the video data file and the audio data file to the server, and receive the video data file and the audio data file returned by the server.
A second obtaining module 202, configured to obtain a video bitrate from the video data file and an audio bitrate from the audio data file.
Specifically, the second obtaining module 202 is configured to:
parse the video data file to obtain the video bitrate information in its file header area, and parse the audio data file to obtain the audio bitrate information in its file header area.
A capacity determination module 203, configured to determine the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate.
The video buffer is a preset area of memory used for caching video content, and the audio buffer is a preset area of memory used for caching audio content; in this embodiment, the capacities of the video buffer and the audio buffer can be adjusted dynamically.
Specifically, the capacity determination module 203 is configured to:
calculate, according to the video bitrate, the first capacity required to play a first set number of video frames and set the first capacity as the capacity of the video buffer, and calculate, according to the audio bitrate, the second capacity required to play the first set number of audio frames and set the second capacity as the capacity of the audio buffer; or,
determine the audio-video bitrate ratio according to the video bitrate and the audio bitrate, and divide a preset total initial buffer capacity into the capacity of the video buffer and the capacity of the audio buffer according to that ratio.
Further, the apparatus also includes:
a dynamic adjustment module, configured to, after the capacity of the video buffer and the capacity of the audio buffer are determined and during playback of the streaming media content, double the capacity of the video buffer each time a second set number of video frames has been played, until the capacity of the video buffer reaches half of a preset maximum total cache capacity, and double the capacity of the audio buffer each time the second set number of audio frames has been played, until the capacity of the audio buffer reaches half of the preset maximum total cache capacity.
Furthermore, the apparatus also includes:
an obtaining and caching module, configured to, after the capacity of the video buffer and the capacity of the audio buffer are determined, obtain the first video frame of the streaming media content and cache the first video frame in the video buffer, and obtain the first audio frame of the streaming media content and cache the first audio frame in the audio buffer.
Embodiment 3
This embodiment also provides a computer device, such as a smartphone, a tablet computer, a notebook computer, a desktop computer, a rack server, a blade server, a tower server, or a cabinet server (including an independent server or a server cluster composed of multiple servers) that can execute programs. As shown in FIG. 3, the computer device 30 of this embodiment at least includes, but is not limited to, a memory 301 and a processor 302 that can be communicatively connected to each other through a system bus. It should be pointed out that FIG. 3 only shows the computer device 30 with components 301 and 302, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead.
In this embodiment, the memory 301 (i.e., a readable storage medium) includes flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the memory 301 may be an internal storage unit of the computer device 30, such as the hard disk or the internal memory of the computer device 30. In other embodiments, the memory 301 may also be an external storage device of the computer device 30, for example a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the computer device 30. Of course, the memory 301 may also include both the internal storage unit of the computer device 30 and its external storage device. In this embodiment, the memory 301 is generally used to store the operating system and the various application software installed on the computer device 30, for example the program code of the apparatus for balancing the first frame and limiting traffic of Embodiment 2. In addition, the memory 301 may also be used to temporarily store various types of data that have been output or are to be output.
In some embodiments, the processor 302 may be a central processing unit (CPU), a controller, a microcontroller, a microprocessor, or another data processing chip. The processor 302 is generally used to control the overall operation of the computer device 30.
Specifically, in this embodiment, the processor 302 is configured to execute the program of the method for balancing the first frame and limiting traffic stored in the memory 301; when the program of the method is executed, the following steps are implemented:
when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server;
obtaining a video bitrate from the video data file, and obtaining an audio bitrate from the audio data file; and
determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate.
For the specific implementation of the above method steps, reference may be made to Embodiment 1; the details are not repeated here.
Embodiment 4
This embodiment also provides a computer-readable storage medium (volatile or non-volatile), such as flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disk, a server, an application store, and the like, on which computer-readable instructions are stored; when the computer-readable instructions are executed by a processor, the following method steps are implemented:
when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server;
obtaining a video bitrate from the video data file, and obtaining an audio bitrate from the audio data file; and
determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate.
For the specific implementation of the above method steps, reference may be made to Embodiment 1; the details are not repeated here.
The method, apparatus, computer device, and readable storage medium for balancing the first frame and limiting traffic provided in this embodiment configure the capacity of the video buffer and the capacity of the audio buffer according to the ratio between the video bitrate and the audio bitrate, so that the capacity of the video buffer is greater than the capacity of the audio buffer. As a result, the video content can occupy more bandwidth during caching, the download speed of the video content increases, and the same number of video frames and audio frames can be downloaded in the same amount of time. In addition, in order to output the first frame of the streaming media content quickly, the initial buffer capacities are limited, so that the initial video buffer and the initial audio buffer only need to hold the first frame of video content and the first frame of audio content. During audio and video playback, the capacities of the audio buffer and the video buffer are expanded dynamically; expanding the buffer capacities reduces the number of caching requests and avoids network congestion.
It should be noted that, in this document, the terms "comprise", "include", or any other variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes that element.
The serial numbers of the above embodiments of this application are for description only and do not indicate that one embodiment is better than another.
Through the description of the above implementations, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus the necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation.
The above are only preferred embodiments of this application and do not thereby limit the patent scope of this application. Any equivalent structural or process transformation made using the content of the specification and drawings of this application, or any direct or indirect application in other related technical fields, is likewise included within the patent protection scope of this application.

Claims (20)

  1. A method for balancing the first frame and limiting traffic, the method comprising:
    when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server;
    obtaining a video bitrate from the video data file, and obtaining an audio bitrate from the audio data file; and
    determining a capacity of a video buffer and a capacity of an audio buffer according to the video bitrate and the audio bitrate.
  2. The method for balancing the first frame and limiting traffic according to claim 1, wherein determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate further comprises:
    calculating, according to the video bitrate, a first capacity required to play a first set number of video frames, and setting the first capacity as the capacity of the video buffer; and
    calculating, according to the audio bitrate, a second capacity required to play the first set number of audio frames, and setting the second capacity as the capacity of the audio buffer.
  3. The method for balancing the first frame and limiting traffic according to claim 2, wherein after the capacity of the video buffer and the capacity of the audio buffer are determined, the method further comprises:
    during playback of the streaming media content, each time a second set number of video frames has been played, doubling the capacity of the video buffer until the capacity of the video buffer reaches half of a preset maximum total cache capacity; and each time the second set number of audio frames has been played, doubling the capacity of the audio buffer until the capacity of the audio buffer reaches half of the preset maximum total cache capacity.
  4. The method for balancing the first frame and limiting traffic according to claim 1, wherein determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate further comprises:
    determining an audio-video bitrate ratio according to the video bitrate and the audio bitrate, and dividing a preset total initial buffer capacity into the capacity of the video buffer and the capacity of the audio buffer according to the audio-video bitrate ratio.
  5. The method for balancing the first frame and limiting traffic according to claim 1, wherein after the capacity of the video buffer and the capacity of the audio buffer are determined, the method further comprises:
    obtaining a first video frame of the streaming media content, and caching the first video frame in the video buffer; and
    obtaining a first audio frame of the streaming media content, and caching the first audio frame in the audio buffer.
  6. An apparatus for balancing the first frame and limiting traffic, the apparatus comprising:
    a first obtaining module, configured to obtain, when streaming media content is played using DASH, a video data file and an audio data file of the streaming media content to be played from a server;
    a second obtaining module, configured to obtain a video bitrate from the video data file and an audio bitrate from the audio data file; and
    a capacity determination module, configured to determine a capacity of a video buffer and a capacity of an audio buffer according to the video bitrate and the audio bitrate.
  7. The apparatus for balancing the first frame and limiting traffic according to claim 6, wherein the capacity determination module is further configured to:
    calculate, according to the video bitrate, a first capacity required to play a first set number of video frames, and set the first capacity as the capacity of the video buffer; and
    calculate, according to the audio bitrate, a second capacity required to play the first set number of audio frames, and set the second capacity as the capacity of the audio buffer.
  8. The apparatus for balancing the first frame and limiting traffic according to claim 7, further comprising:
    a dynamic adjustment module, configured to, after the capacity of the video buffer and the capacity of the audio buffer are determined and during playback of the streaming media content, double the capacity of the video buffer each time a second set number of video frames has been played, until the capacity of the video buffer reaches half of a preset maximum total cache capacity, and double the capacity of the audio buffer each time the second set number of audio frames has been played, until the capacity of the audio buffer reaches half of the preset maximum total cache capacity.
  9. The apparatus for balancing the first frame and limiting traffic according to claim 6, wherein the capacity determination module is further configured to:
    determine an audio-video bitrate ratio according to the video bitrate and the audio bitrate, and divide a preset total initial buffer capacity into the capacity of the video buffer and the capacity of the audio buffer according to the audio-video bitrate ratio.
  10. The apparatus for balancing the first frame and limiting traffic according to claim 6, further comprising an obtaining and caching module configured to:
    obtain a first video frame of the streaming media content, and cache the first video frame in the video buffer; and
    obtain a first audio frame of the streaming media content, and cache the first audio frame in the audio buffer.
  11. A computer device, comprising a memory, a processor, and computer-readable instructions stored on the memory and executable on the processor, wherein the processor, when executing the computer-readable instructions, implements the following steps:
    when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server;
    obtaining a video bitrate from the video data file, and obtaining an audio bitrate from the audio data file; and
    determining a capacity of a video buffer and a capacity of an audio buffer according to the video bitrate and the audio bitrate.
  12. The computer device according to claim 11, wherein determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate further comprises:
    calculating, according to the video bitrate, a first capacity required to play a first set number of video frames, and setting the first capacity as the capacity of the video buffer; and
    calculating, according to the audio bitrate, a second capacity required to play the first set number of audio frames, and setting the second capacity as the capacity of the audio buffer.
  13. The computer device according to claim 11, wherein determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate further comprises:
    determining an audio-video bitrate ratio according to the video bitrate and the audio bitrate, and dividing a preset total initial buffer capacity into the capacity of the video buffer and the capacity of the audio buffer according to the audio-video bitrate ratio.
  14. The computer device according to claim 13, wherein after the capacity of the video buffer and the capacity of the audio buffer are determined, the steps further comprise:
    during playback of the streaming media content, each time a second set number of video frames has been played, doubling the capacity of the video buffer until the capacity of the video buffer reaches half of a preset maximum total cache capacity; and each time the second set number of audio frames has been played, doubling the capacity of the audio buffer until the capacity of the audio buffer reaches half of the preset maximum total cache capacity.
  15. The computer device according to claim 11, wherein after the capacity of the video buffer and the capacity of the audio buffer are determined, the steps further comprise:
    obtaining a first video frame of the streaming media content, and caching the first video frame in the video buffer; and
    obtaining a first audio frame of the streaming media content, and caching the first audio frame in the audio buffer.
  16. A computer-readable storage medium on which computer-readable instructions are stored, wherein the computer-readable instructions, when executed by a processor, implement the following steps:
    when streaming media content is played using DASH, obtaining a video data file and an audio data file of the streaming media content to be played from a server;
    obtaining a video bitrate from the video data file, and obtaining an audio bitrate from the audio data file; and
    determining a capacity of a video buffer and a capacity of an audio buffer according to the video bitrate and the audio bitrate.
  17. The computer-readable storage medium according to claim 16, wherein determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate further comprises:
    calculating, according to the video bitrate, a first capacity required to play a first set number of video frames, and setting the first capacity as the capacity of the video buffer; and
    calculating, according to the audio bitrate, a second capacity required to play the first set number of audio frames, and setting the second capacity as the capacity of the audio buffer.
  18. The computer-readable storage medium according to claim 16, wherein determining the capacity of the video buffer and the capacity of the audio buffer according to the video bitrate and the audio bitrate further comprises:
    determining an audio-video bitrate ratio according to the video bitrate and the audio bitrate, and dividing a preset total initial buffer capacity into the capacity of the video buffer and the capacity of the audio buffer according to the audio-video bitrate ratio.
  19. The computer-readable storage medium according to claim 18, wherein after the capacity of the video buffer and the capacity of the audio buffer are determined, the steps further comprise:
    during playback of the streaming media content, each time a second set number of video frames has been played, doubling the capacity of the video buffer until the capacity of the video buffer reaches half of a preset maximum total cache capacity; and each time the second set number of audio frames has been played, doubling the capacity of the audio buffer until the capacity of the audio buffer reaches half of the preset maximum total cache capacity.
  20. The computer-readable storage medium according to claim 16, wherein after the capacity of the video buffer and the capacity of the audio buffer are determined, the steps further comprise:
    obtaining a first video frame of the streaming media content, and caching the first video frame in the video buffer; and
    obtaining a first audio frame of the streaming media content, and caching the first audio frame in the audio buffer.
PCT/CN2019/128418 2019-01-30 2019-12-25 Method and apparatus for balancing the first frame and limiting traffic, computer device, and readable storage medium WO2020155956A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19912992.5A EP3863293A4 (en) 2019-01-30 2019-12-25 FIRST FRAME CURRENT LIMITING METHOD AND DEVICE, COMPUTER DEVICE AND READABLE STORAGE MEDIA
US17/329,716 US11463494B2 (en) 2019-01-30 2021-05-25 Balance of initial frame and limitation of traffic

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910092901.0 2019-01-30
CN201910092901.0A CN111510761B (zh) 2019-01-30 2019-01-30 首帧均衡限流方法、装置、计算机设备及可读存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/329,716 Continuation US11463494B2 (en) 2019-01-30 2021-05-25 Balance of initial frame and limitation of traffic

Publications (1)

Publication Number Publication Date
WO2020155956A1 true WO2020155956A1 (zh) 2020-08-06

Family

ID=71840701

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/128418 WO2020155956A1 (zh) 2019-01-30 2019-12-25 首帧均衡限流方法、装置、计算机设备及可读存储介质

Country Status (4)

Country Link
US (1) US11463494B2 (zh)
EP (1) EP3863293A4 (zh)
CN (1) CN111510761B (zh)
WO (1) WO2020155956A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112616088A (zh) * 2020-11-26 2021-04-06 北京乐学帮网络技术有限公司 Rendering method and apparatus, electronic device, and computer-readable storage medium
CN113179377B (zh) * 2021-03-17 2022-11-08 青岛小鸟看看科技有限公司 Signal switching method for a VR device, and VR device
CN114866814B (zh) * 2022-06-09 2024-04-30 上海哔哩哔哩科技有限公司 Network bandwidth allocation method and apparatus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011031853A1 (en) * 2009-09-09 2011-03-17 Netflix, Inc. Accelerated playback of streaming media
CN102724584A (zh) * 2012-06-18 2012-10-10 Tcl集团股份有限公司 Online network video playback method, online video playback apparatus, and smart television
CN104780422A (zh) * 2014-01-13 2015-07-15 北京兆维电子(集团)有限责任公司 Streaming media playback method and streaming media player
CN107438192A (zh) * 2017-07-26 2017-12-05 武汉烽火众智数字技术有限责任公司 Method for synchronizing audio and video playback, related system, and multimedia playback terminal
CN107517400A (zh) * 2016-06-15 2017-12-26 成都鼎桥通信技术有限公司 Streaming media playback method and streaming media player

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070195735A1 (en) * 2006-02-22 2007-08-23 Rosen Eric C Method of buffering to reduce media latency in group communications on a wireless communication network
US7379653B2 (en) * 2002-02-20 2008-05-27 The Directv Group, Inc. Audio-video synchronization for digital systems
US20050071881A1 (en) * 2003-09-30 2005-03-31 Deshpande Sachin G. Systems and methods for playlist creation and playback
US9197684B2 (en) * 2010-05-27 2015-11-24 Ineoquest Technologies, Inc. Streaming media delivery composite
US9615126B2 (en) * 2011-06-24 2017-04-04 Google Technology Holdings LLC Intelligent buffering of media streams delivered over internet
US9276989B2 (en) * 2012-03-30 2016-03-01 Adobe Systems Incorporated Buffering in HTTP streaming client
US9402114B2 (en) * 2012-07-18 2016-07-26 Cisco Technology, Inc. System and method for providing randomization in adaptive bitrate streaming environments
CN106303562B (zh) * 2016-09-20 2019-03-01 天津大学 PI-control-based adaptive transmission control algorithm for multi-view video
CN107371061B (zh) * 2017-08-25 2021-03-19 普联技术有限公司 Video stream playback method, apparatus, and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011031853A1 (en) * 2009-09-09 2011-03-17 Netflix, Inc. Accelerated playback of streaming media
CN102724584A (zh) * 2012-06-18 2012-10-10 Tcl集团股份有限公司 Online network video playback method, online video playback apparatus, and smart television
CN104780422A (zh) * 2014-01-13 2015-07-15 北京兆维电子(集团)有限责任公司 Streaming media playback method and streaming media player
CN107517400A (zh) * 2016-06-15 2017-12-26 成都鼎桥通信技术有限公司 Streaming media playback method and streaming media player
CN107438192A (zh) * 2017-07-26 2017-12-05 武汉烽火众智数字技术有限责任公司 Method for synchronizing audio and video playback, related system, and multimedia playback terminal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3863293A4 *

Also Published As

Publication number Publication date
CN111510761B (zh) 2021-06-04
US11463494B2 (en) 2022-10-04
EP3863293A4 (en) 2022-07-27
EP3863293A1 (en) 2021-08-11
US20210281624A1 (en) 2021-09-09
CN111510761A (zh) 2020-08-07

Similar Documents

Publication Publication Date Title
US9060207B2 (en) Adaptive video streaming over a content delivery network
WO2020155956A1 (zh) Method and apparatus for balancing the first frame and limiting traffic, computer device, and readable storage medium
US11490173B2 (en) Switch of audio and video
WO2020155959A1 (zh) Method and apparatus for switching definition, computer device, and readable storage medium
JP6314252B2 (ja) Network video playback method and apparatus
US11374843B2 (en) Method for measuring network speed, computing device, and computer-program product
WO2019128800A1 (zh) 一种内容服务的实现方法、装置及内容分发网络节点
EP3902266A1 (en) Processing method for dragging video data and proxy server
US20180013813A1 (en) Method and apparatus for cloud streaming service
US20220385989A1 (en) Video playing control method and system
WO2020155960A1 (zh) Video playback method and system, computer device, and computer-readable storage medium
US11496536B2 (en) Method of requesting video, computing device, and computer-program product
CN112241419B (zh) 服务数据处理方法、装置、计算机设备和存储介质
US20150268808A1 (en) Method, Device and System for Multi-Speed Playing
US20140149539A1 (en) Streaming content over a network
US20170163555A1 (en) Video file buffering method and system
WO2017096836A1 (zh) Video file caching method and system
CN114040245A (zh) Video playback method and apparatus, computer storage medium, and electronic device
CN112243136A (zh) Content playback method, video storage method, and device
US20230388590A1 (en) Playback optimization method and system
CN112600760B (zh) Application-layer traffic rate limiting method, terminal device, and storage medium
WO2024125249A1 (zh) Bitrate adaptation method and apparatus, and server
WO2020155957A1 (zh) Method and apparatus for playing audio and video, computer device, and readable storage medium
JP2021114092A (ja) Client device, content correction system, and control method
CN116017040A (zh) Multimedia content playback method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19912992

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019912992

Country of ref document: EP

Effective date: 20210504

NENP Non-entry into the national phase

Ref country code: DE