WO2016010229A1 - Method of operating a client and a server for a streaming service - Google Patents
Method of operating a client and a server for a streaming service
- Publication number
- WO2016010229A1 (PCT/KR2015/003230)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- file
- client
- data
- parameter
- key frame
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6125—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/75—Media network packet handling
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/70—Media network packetisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/75—Media network packet handling
- H04L65/756—Media network packet handling adapting media to device capabilities
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/02—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L69/00—Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
- H04L69/22—Parsing or analysis of headers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/637—Control signals issued by the client directed to the server or network components
- H04N21/6377—Control signals issued by the client directed to the server or network components directed to server
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/643—Communication protocols
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
- H04N21/6587—Control parameters, e.g. trick play commands, viewpoint selection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8455—Structuring of content, e.g. decomposing content into time segments involving pointers to the content, e.g. pointers to the I-frames of the video stream
Definitions
- the embodiments below relate to a method of operating a client and a server for a streaming service.
- adaptive bitrate streaming is an HTTP-based streaming technique that transmits content at a quality the available bandwidth can sustain, based on network conditions or transmission speeds.
- Adaptive bitrate streaming divides a file by time and transmits the pieces.
- adaptive bitrate streaming may provide a streaming service using chunks that each store 2 to 10 seconds of content.
- each piece to be transmitted is required to start with a predetermined key frame, such as an I-frame.
- a process of transcoding a video file into a file for streaming is therefore essential in adaptive bitrate streaming. In this conversion process, the image quality of the video file is degraded, and additional time and computing power are consumed by the conversion.
- adaptive bitrate streaming also requires a manifest file.
- the manifest file is a file that stores information about each piece of the file for streaming.
- the manifest file is generated when the video file is converted into a file for streaming.
- According to adaptive bitrate streaming, a client must refer to a manifest file in order to receive a streaming service.
- Embodiments to be described below provide an HTTP-based streaming technology using the video file format without using the file format for streaming.
- Embodiments provide a technique of transmitting a video file by dividing it into capacity (byte-size) units.
- embodiments may provide a streaming service that transmits portions of a video file in byte-address units without actually dividing the file.
- the pieces to be transmitted are not required to start with a key frame, and the process of converting a video file into a file for streaming may be omitted.
- Embodiments may omit the process of converting a video file into a file for streaming, thereby preventing the image-quality degradation caused by the conversion and reducing the load on the server. Further, according to embodiments, since no time is needed to convert the video file into a file for streaming after the video file is uploaded to the server, a faster streaming service may be provided. In addition, according to embodiments, since a file for streaming is not separately generated, a manifest file for storing fragment information of the file for streaming is not required.
- a method of operating a client for a streaming service includes: transmitting a first request packet including a file URL and a first parameter indicating a request for reproduction information; Receiving a first response packet including reproduction information of a file corresponding to the file URL; Sending a second request packet comprising the file URL and a second parameter indicating a data request; And receiving a second response packet comprising data of an address range corresponding to the second parameter in the file.
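The request/response exchange in the claim above can be sketched as follows. The '?' separator and the indicator 'r' mirror the concrete examples given later in this document; `build_request_url` and the example host are hypothetical names introduced only for illustration.

```python
def build_request_url(file_url: str, parameter: str) -> str:
    """Join a file URL and a request parameter with the '?' separator
    used throughout this document."""
    return f"{file_url}?{parameter}"

# First request: reproduction (playback) information, using the indicator 'r'.
first_request = build_request_url("http://example.com/video.mp4", "r")

# Second request: data of the fragment at index 0.
second_request = build_request_url("http://example.com/video.mp4", "0")
```

The client would send each URL over HTTP and read the playback information or fragment data from the corresponding response body.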
- the operating method of the client may include setting the first parameter to a first predetermined indicator; And extracting, from the first response packet, the size of the fragments into which the file is divided, the number of fragments, the resolution of the content stored in the file, the URL of a second file storing content of a second resolution different from that resolution, and the second resolution.
- the method of operation of the client may include setting the second parameter to an initial index; And inputting data of the second response packet into a buffer.
- the method of operating the client may include checking a remaining amount of a buffer; Setting the second parameter to an index of a next piece to be reproduced when the remaining amount of the buffer is less than or equal to a threshold; And inputting data of the second response packet into the buffer.
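The buffer-threshold check in the claim above can be sketched minimally as follows; the function name and the integer units are hypothetical, and the policy is only what the claim states (request the next piece when the remaining buffered amount falls to or below a threshold).

```python
def next_fragment_to_request(buffer_remaining: int, threshold: int, last_index: int):
    """Return the index of the next piece to request when the buffer runs low,
    or None when the remaining buffered amount is still above the threshold."""
    if buffer_remaining <= threshold:
        return last_index + 1
    return None
```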
- the method of operation of the client may include setting the second parameter to an initial index; And extracting an offset address for key frame information of the file from the second response packet.
- the operation method of the client may further include extracting the key frame information from the second response packet when the data of the offset address is included in the second response packet.
- the method of operating the client may include setting a third parameter indicating a data request to the offset address when data of the offset address is not included in the second response packet; Sending a third request packet comprising the file URL and the third parameter; Receiving a third response packet comprising data of an address range corresponding to the third parameter in the file; And extracting the key frame information from the third response packet.
- the method of operating the client may include inputting data of the second response packet into a buffer; Extracting key frames of the file while the data is playing; And generating key frame information of the file based on the key frames.
- the operating method of the client may include receiving a resolution change input; Setting the file URL to a URL of a second file corresponding to a new resolution included in the resolution change input; Setting the first parameter to a second predetermined indicator; And extracting a size of the fragment dividing the second file and the number of the fragment from the first response packet.
- the operating method of the client may include receiving a resolution change input; Setting the file URL to a URL of a second file corresponding to a new resolution included in the resolution change input; Detecting a key frame corresponding to a current playback time based on the key frame information of the second file; Setting the second parameter to the address of the detected key frame; And inputting data of the second response packet into a buffer.
- the operating method of the client may include receiving a resolution change input; Setting the file URL to a URL of a second file corresponding to a new resolution included in the resolution change input; Estimating a new index corresponding to the current reproduction time when the time range of the second response packet does not include the current reproduction time; Setting a third parameter indicating a data request to the new index; Sending a third request packet comprising the file URL and the third parameter; Receiving a third response packet comprising data of an address range corresponding to the third parameter in the file; And detecting a key frame closest to the current reproduction time from the data of the third response packet.
- the operating method of the client may include receiving a search time; Detecting a key frame corresponding to the search time by using key frame information of the file; Setting the second parameter to the address of the detected key frame; And inputting data of the second response packet into a buffer.
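Detecting the key frame for a given time, as in the claim above, can be sketched as a lookup over the key frame information. Representing that information as a sorted list of key-frame timestamps is an assumption for illustration; the document itself only says key frame information is used.

```python
import bisect

def keyframe_at_or_before(keyframe_times, target_time):
    """Return the latest key frame at or before target_time, given a sorted
    list of key-frame timestamps."""
    pos = bisect.bisect_right(keyframe_times, target_time) - 1
    return keyframe_times[max(pos, 0)]
```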
- the operating method of the client may include receiving a search time; Estimating a piece corresponding to the search time based on the search time and the total reproduction time of the file; Setting the second parameter to the index of the estimated piece; If the time range of the second response packet includes the search time, detecting a key frame closest to the search time from data of the second response packet; And inputting data after the detected key frame into a buffer.
- the method of operating the client may include estimating a new index corresponding to the search time when the time range of the second response packet does not include the search time; Setting a third parameter indicating a data request to the new index; Sending a third request packet comprising the file URL and the third parameter; Receiving a third response packet comprising data of an address range corresponding to the third parameter in the file; And detecting a key frame closest to the search time from the data of the third response packet.
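The index estimation in the claims above (a piece estimated from the search time and the total reproduction time) could be implemented as a simple linear estimate, assuming a roughly uniform bitrate so that byte position scales with time. That linearity assumption, and the function name, are not stated in the document.

```python
def estimate_seek_index(search_time: float, total_time: float, fragment_count: int) -> int:
    """Linearly estimate which fragment covers search_time, clamped to the
    last valid index."""
    index = int(search_time / total_time * fragment_count)
    return min(index, fragment_count - 1)
```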
- a method of operating a server for a streaming service includes: receiving a request packet including a file URL and a parameter; If the parameter indicates a data request, responding with data of an address range corresponding to the parameter in a file corresponding to the file URL; And if the parameter indicates a playback information request, responding with the playback information of the file.
- the responding with the data may include: responding with data of an address range corresponding to the index in the file when the parameter includes an index; And responding with data of an address range corresponding to the address in the file when the parameter includes an address.
- Responding with data of an address range corresponding to the address may include: responding with data between the two addresses in the file when the parameter includes two addresses; And responding with data between the one address in the file and the end of the file when the parameter includes one address.
- Responding with the playback information may include: when the parameter includes a first predetermined indicator, responding with the size of the fragments into which the file is divided, the number of fragments, the resolution of the content stored in the file, the URL of a second file storing content of a second resolution different from that resolution, and the second resolution; And when the parameter includes a second predetermined indicator, responding with the size of the fragments and the number of fragments.
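The server-side dispatch described in the claims above can be sketched as a classification of the request parameter. The concrete formats ('r'/'i' as playback-information indicators, a bare integer as a fragment index, and address strings containing 'r') follow the examples given later in this document; the function name is hypothetical.

```python
def classify_parameter(parameter: str) -> str:
    """Classify a request parameter as in the server claims above."""
    if parameter in ("r", "i"):        # predetermined playback-information indicators
        return "playback_info"
    if parameter.isdigit():            # fragment index -> data request by index
        return "data_by_index"
    return "data_by_address"           # byte-address form, e.g. '1024r' or '1024r2048'
```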
- FIG. 1 is a diagram for explaining a method of operating a client according to an embodiment.
- FIGS. 2 to 7 are diagrams for describing a method of operating a server according to an exemplary embodiment.
- FIG. 8 illustrates a basic operation of a client according to an embodiment.
- FIGS. 9 to 11 are diagrams for describing an operation of obtaining key frame information according to an exemplary embodiment.
- FIGS. 12 to 14 are diagrams for describing a resolution changing operation according to an exemplary embodiment.
- FIGS. 15 to 17 are diagrams for describing a search operation according to an exemplary embodiment.
- the client is a computing device provided with a streaming service, and may include, for example, a personal computer, a laptop computer, a tablet computer, a PDA, a smart phone, and the like.
- the client may have a client application installed, such as an SWF player, to communicate with the server using the HTTP protocol.
- the server is a computing device that provides a streaming service, and may include, for example, a web server.
- the server may be configured not only as a server dedicated computing device but also as a personal computer, a laptop computer, a tablet computer, a PDA, a smart phone, and the like.
- the server may be installed with a server application for communicating with the client using the http protocol, for example, an Apache server.
- the client 110 transmits a first request packet to the server 120.
- the first request packet may include a first parameter indicating a file URL and a reproduction information request.
- the file URL may be information that uniquely indicates a resource on a computer network.
- the file URL may be information uniquely indicating a video file that is a target of a streaming service.
- the first parameter indicative of the reproduction information request may be a predetermined character or string, for example, 'i' or 'r'.
- the client 110 may request reproduction information of a file corresponding to the file URL by transmitting the first request packet to the server 120.
- the reproduction information of the file is information for reproducing the file through the streaming service and may include, for example, information about how the file is divided for transmission.
- Client 110 receives a first response packet from server 120.
- the first response packet may include reproduction information of a file corresponding to the file URL.
- embodiments may provide a streaming technique for transmitting a file by dividing it by capacity.
- individual capacity units for dividing a file may be referred to as fragments.
- the size of the fragment for dividing the file may be determined by the server 120.
- the server 120 may calculate the number of fragments required to divide the file according to the size of the fragment to divide the file.
- the reproduction information of the first response packet may include the size of the fragment and the number of fragments.
- the server 120 may include the size of the fragment and the total size of the file in the first response packet.
- the client 110 may calculate the number of fragments required to divide the file based on the fragment size and the total size of the file.
- the size of the fragment for dividing the file may be determined by the client 110.
- the client 110 may include the size of the fragment in the first parameter and transmit it to the server 120.
- the server 120 may calculate the number of fragments required to divide the file based on the received fragment size.
- the reproduction information of the first response packet may include the number of fragments.
- the server 120 may include the total size of the file in the first response packet. In this case, since the client 110 already knows the size of the fragment, the client 110 may calculate the number of fragments required to divide the file based on the total size of the file.
- the client 110 transmits the second request packet to the server 120.
- the second request packet may include a file URL and a second parameter indicating a data request.
- the second parameter indicating the data request is a number or a string indicating an address in the file, for example, 'address r' (a single address followed by the indicator 'r') or 'address1 r address2' (two addresses separated by 'r').
- the address in the file may be a byte address.
- Client 110 receives a second response packet from server 120.
- the second response packet may include data of an address range corresponding to the second parameter in the file corresponding to the file URL.
- the client 110 may input data of the second response packet into the buffer.
- the data input to the buffer may be reproduced through a demultiplexer (demux) and/or a decoder.
- FIGS. 2 to 7 are diagrams illustrating a method of operating a server, according to an exemplary embodiment.
- the server 220 receives a request packet from the client 210.
- the request packet may include a file URL and parameters.
- the file URL is information that uniquely indicates a resource on the computer network and may be, for example, information indicating a file stored in the server 220.
- the server 220 may determine whether the parameter indicates a data request or a reproduction information request. For example, the server 220 may determine whether the parameter included in the request packet is the first parameter indicating the reproduction information request or the second parameter indicating the data request.
- the first parameter indicating the request for playback information and the second parameter indicating the data request may be predetermined, and the server 220 and the client 210 may share the predetermined first parameter and second parameter.
- the server 220 may respond with the playback information of the file.
- the server 320 may receive a request packet including a file1 URL and a first indicator from the client 310.
- the first indicator may be a predetermined character or string, for example 'r'.
- the file1 URL and the first indicator may be separated by a predetermined code, for example, '?'.
- the server 320 may determine whether the parameter included in the request packet includes the first indicator.
- if the parameter includes the first indicator, the server 320 may respond with the size of the fragments (chunk size) into which the first file corresponding to the file1 URL is divided, the number of fragments (chunk number), and a file list.
- the file list may include the resolution of the first file and the URL and resolution of at least one file that stores content having a resolution different from the first resolution of the content stored in the first file.
- the file list may include the first resolution of the content stored in the first file (file1 resolution), the URL of a second file that stores content having a second resolution different from the first resolution (file2URL), the second resolution (file2 resolution), and so on.
- the file list may include URLs and resolutions of the plurality of files.
- the server 320 may search, in the directory in which the first file corresponding to the file1 URL is stored, for at least one file that stores content having a resolution different from the first resolution. For example, if the files shown in Table 1 are stored in the directory where the first file is stored, and the first file is 'music video.mp4', the server 320 may find 'music video_720p.mp4', 'music video_480p.mp4', and 'music video_360p.mp4'.
- the server 320 may generate a response packet as shown in Table 2 and transmit the response packet to the client 310.
- the first element of the response packet is the chunk size
- the second element is the chunk number
- the third element is the file1 resolution
- the fourth element is the file2 URL
- the fifth element is the file2 resolution
- the sixth element is the file3 URL
- the seventh element is the file3 resolution
- the eighth element is the file4 URL
- the ninth element is the file4 resolution.
- the chunk size may be in bytes.
- the file list may include URLs and resolutions of all files corresponding to the same content.
- the file list may include a URL of the first file, a resolution of the first file, and a URL and a resolution of at least one file that stores content having a resolution different from that of the first file.
- the server 320 may generate a response packet as shown in Table 3 and transmit the response packet to the client 310.
- the first element of the response packet is the chunk size
- the second element is the chunk number
- the third element is the file1 URL
- the fourth element is the file1 resolution
- the fifth element is the file2 URL
- the sixth element is the file2 resolution
- the seventh element is the file3 URL
- the eighth element is the file3 resolution
- the ninth element is the file4 URL
- the tenth element is the file4 resolution.
- the chunk size may be in bytes.
- a predetermined character or string, for example 'a', may be added to the end of the file1 resolution.
- Each file can include a resolution in the file name.
- the server 320 may encode 'music video.mp4' with a new resolution.
- the server 320 may determine the file name of the new resolution by adding a new resolution to the file name of the original video file. In this case, the server 320 may obtain the resolution of the content stored in the corresponding file based on the file name.
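Obtaining the resolution from the file name, as described above, might look like the following sketch. The '<name>_<height>p.<ext>' pattern is assumed from the examples; the function name is hypothetical.

```python
import re

def resolution_from_filename(filename: str):
    """Extract a resolution suffix such as '720p' from a file name,
    or None when the name carries no resolution suffix."""
    match = re.search(r"_(\d+p)\.[^.]+$", filename)
    return match.group(1) if match else None
```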
- the server 420 may receive a request packet including a file URL and a second indicator from the client 410.
- the second indicator may be a predetermined character or string, for example 'i'.
- the file URL and the second indicator in the request packet may be separated by a predetermined code, for example '?'.
- the server 420 may determine whether the parameter included in the request packet includes the second indicator. If the parameter included in the request packet includes the second indicator, the server 420 may respond with the size (chunk size) and the number (chunk number) of the fragments splitting the file corresponding to the file URL.
- the server 420 may generate a response packet as shown in Table 4 and transmit the response packet to the client 410.
- the first element of the response packet is chunk size
- the second element is the number of chunks.
- the chunk size may be in bytes. Referring to Table 1, the chunk sizes for dividing the respective files may be set differently. However, embodiments are not limited thereto; for example, the chunk sizes for dividing the respective files may be set equal to each other.
- the server 220 may respond with data in the address range corresponding to the parameter in the file corresponding to the file URL.
- the server 520 may receive a request packet including a file URL and an index from the client 510.
- the index may be an integer of 0 or more.
- the file URL and the index in the request packet may be separated by a predetermined code, for example, '?'.
- the server 520 may determine whether the parameter included in the request packet includes an index. If the parameter included in the request packet includes an index, the server 520 may reply with data of the address range corresponding to the index in the file corresponding to the file URL.
- the server 520 may obtain the size of a fragment for dividing the file corresponding to the file URL.
- the server 520 may transmit data shown in Table 5 to the client 510.
- the data in Table 5 may be referred to as chunk-n.
- [first byte address, second byte address] is data from the first byte address to the second byte address.
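Mapping a fragment index to its byte range, as implied above, can be sketched as follows. The inclusive [start, end] form follows the bracket notation just described; clamping the final fragment to the end of the file is an assumption, since the last fragment may be shorter than the chunk size.

```python
def chunk_byte_range(index: int, chunk_size: int, file_size: int):
    """Inclusive [start, end] byte range of the fragment at `index`."""
    start = index * chunk_size
    end = min((index + 1) * chunk_size - 1, file_size - 1)
    return start, end
```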
- the request packet including the file URL and the index may include a header.
- the header may include an address range parameter.
- the header may be an http header.
- the server 520 may transmit some data of the fragment corresponding to the index to the client 510 using the address range parameter included in the header of the request packet.
- the chunk size for the file corresponding to the file URL included in the request packet is 2097152 bytes
- the index included in the request packet is n
- the address range parameter of the request packet is (start range, end range).
- the server 520 may transmit data shown in Table 6 to the client 510.
- Server 520 may include a cache server.
- the cache server may cache fragment data based on the file URL and index. In this case, even if the address range parameters included in the header of the request packet are different, if the file URL and the index included in the request packet are the same, fragment data cached in the cache server may be utilized.
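The caching behavior described above can be sketched as a map keyed by (file URL, index): requests that differ only in their header address-range parameter reuse the same cached fragment. The class and the loader callback are hypothetical names for illustration.

```python
class FragmentCache:
    """Cache fragments by (file URL, index), fetching each fragment at most
    once via a loader callback."""

    def __init__(self):
        self._store = {}

    def get_fragment(self, file_url: str, index: int, loader):
        key = (file_url, index)
        if key not in self._store:
            self._store[key] = loader(file_url, index)  # fetch on first use only
        return self._store[key]
```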
- the server 620 may receive a request packet including a file URL and an address from the client 610.
- the address may be a byte address.
- the file URL and address in the request packet may be separated by a predetermined code, for example, '?'.
- a predetermined character or string may be added after the address, for example, 'r'.
- the server 620 may determine whether the parameter included in the request packet includes an address.
- the server 620 may reply with data of an address range corresponding to an address in a file corresponding to the file URL.
- the server 620 may transmit data as shown in Table 7 to the client 610.
- [address, end of file] is data from the address to the end of the file.
- the server 720 may receive a request packet including a file URL and a plurality of addresses, for example, a first address and a second address, from the client 710.
- the plurality of addresses may each be a byte address.
- the file URL and the plurality of addresses in the request packet may be separated by a predetermined code, for example, '?'.
- a predetermined character or string, for example 'r', may be added between the plurality of addresses.
- the server 720 may determine whether the parameter included in the request packet includes a plurality of addresses. If the parameter included in the request packet includes a plurality of addresses, the server 720 may reply with data of an address range corresponding to the plurality of addresses in the file corresponding to the file URL.
- the server 720 may transmit data shown in Table 8 to the client 710.
- [address 1, address 2] is data from the first address to the second address.
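The request forms above ('URL? index', 'URL? address r', 'URL? address1 r address2') can be parsed into byte ranges as follows. This is a sketch under the assumption that '?' separates the URL from the parameter and whitespace separates tokens, as in the examples; the reproduction-information form 'URL? r' is included for completeness:

```python
def parse_request(packet, chunk_size, file_size):
    """Parse 'fileURL?param' into (url, byte_range) -- a sketch of the
    request forms used in the examples (token layout is an assumption).

    'URL? r'        -> reproduction information request
    'URL? n'        -> fragment n, bytes [c*n, c*(n+1)-1]
    'URL? a r'      -> open range, bytes [a, file end]
    'URL? a1 r a2'  -> explicit range, bytes [a1, a2]
    """
    url, param = packet.split("?", 1)
    tokens = param.split()
    if tokens == ["r"]:                        # reproduction info request
        return url, "reproduction-info"
    if len(tokens) == 1:                       # plain fragment index
        n = int(tokens[0])
        return url, (chunk_size * n,
                     min(chunk_size * (n + 1) - 1, file_size - 1))
    if len(tokens) == 2 and tokens[1] == "r":  # address followed by 'r'
        return url, (int(tokens[0]), file_size - 1)
    if len(tokens) == 3 and tokens[1] == "r":  # two addresses around 'r'
        return url, (int(tokens[0]), int(tokens[2]))
    raise ValueError("unrecognized parameter: " + param)

print(parse_request("file1URL? r", 2097152, 10_000_000))
print(parse_request("file1URL? 2", 2097152, 10_000_000))
print(parse_request("file1URL? 5000 r", 2097152, 10_000_000))
print(parse_request("file1URL? 5000 r 6000", 2097152, 10_000_000))
```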
- the client 810 may obtain a URL of a first file to be played through a streaming service, for example, 'file1URL'.
- the client 810 may transmit a request packet for requesting the first fragment of the first file, for example, 'file1URL? 0' to the server 820.
- the client 810 may transmit a request packet requesting reproduction information of the first file, for example 'file1URL? r', to the server 820.
- the client 810 may receive a response packet including the reproduction information of the first file from the server 820.
- the reproduction information of the first file may include a chunk size, a chunk number, and a file list.
- the file list may include file 1 resolution, file 2 URL, file 2 resolution, and the like.
- the client 810 may extract the reproduction information of the first file from the response packet.
- the client 810 may receive a response packet including the initial fragment of the first file, e.g., 'chunk-0', from the server 820.
- the client 810 may play 'chunk-0'.
- the client 810 may input 'chunk-0' included in the response packet into the buffer.
- Data input to the buffer can be played back through the demux and/or decoder.
- the client 810 may check the remaining amount of the buffer. For example, it may be determined whether the remaining amount of the buffer is equal to or less than a predetermined threshold value.
- the predetermined threshold value may be in bytes or in units of time. When the predetermined threshold value is a time unit, the time unit threshold value may be converted into a byte unit threshold value based on the resolution of the content being played.
- the client 810 may transmit a request packet requesting the next fragment of the first file to the server 820. For example, the client 810 can calculate the index of the next fragment by adding 1 to the index of the fragment currently playing. If the index of the currently playing fragment is 0, the client 810 may transmit 'file1URL? 1' to the server 820.
- the client 810 may receive a response packet including the next fragment, for example 'chunk-1', from the server 820.
- the client 810 may play 'chunk-1'.
- the client 810 may input 'chunk-1' included in the response packet into the buffer.
- Data input to the buffer can be played back through the demux and/or decoder.
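The buffer check driving this loop can be sketched as follows; a time-unit threshold is converted to bytes using the byte rate of the currently playing content (the function names, the byte-rate parameter, and the sample numbers are illustrative assumptions):

```python
def should_request_next(buffer_remaining_bytes, threshold, unit, byte_rate=None):
    """True when the buffer remainder falls to or below the threshold.
    A threshold in seconds is converted to bytes via the content's byte
    rate, which depends on the resolution being played (assumed input)."""
    if unit == "seconds":
        threshold = threshold * byte_rate
    return buffer_remaining_bytes <= threshold

def next_request(url, current_index):
    """Request for the next fragment: current index + 1."""
    return "%s? %d" % (url, current_index + 1)

print(should_request_next(400_000, 2, "seconds", byte_rate=250_000))  # True
print(next_request("file1URL", 0))  # file1URL? 1
```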
- the client may acquire key frame information of the first file to be played through the streaming service.
- the content stored in the first file is composed of a plurality of frames.
- the plurality of frames may be classified into frames that fully contain the information of their screen and frames that refer to other frames.
- a frame that completely contains the information of its screen may be referred to as a key frame. A frame that refers to other frames does not completely contain the information of its own screen and therefore typically has a smaller size than a key frame.
- the key frame information of the first file may include information about key frames among a plurality of frames constituting the content stored in the first file.
- the key frame information may include an index, a byte address, a time stamp, and the like of each of the key frames.
- the first file may store key frame information.
- the client may obtain the key frame information by receiving data of the address range in which the key frame information is stored from the server.
- the client 910 may transmit a request packet for requesting the first fragment of the first file, for example, 'file1URL? 0' to the server 920.
- the client 910 may receive a response packet including the initial fragment of the first file, e.g., 'chunk-0', from the server 920.
- the client 910 may extract the offset address of the key frame information from the first piece of the first file.
- the client 910 may determine whether the offset address is included in the initial fragment by checking whether the offset address falls within the address range of the initial fragment. If so, the client 910 may obtain the key frame information from the initial fragment it has already received.
- the client 910 may transmit a request packet requesting data after the offset address, for example 'file1URL? offset-address r', to the server 920.
- the client 910 may receive a response packet including data after the offset address, for example, [offset address, file1 end].
- the client 910 may extract key frame information from the response packet.
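The decision above (reuse the already-received initial fragment, or request [offset address, file end]) can be sketched as follows; the request format follows the 'address r' form of the examples, and the helper name is an assumption:

```python
def keyframe_info_request(offset_address, chunk_size, url):
    """Return None when the offset lies inside the already-received
    initial fragment [0, chunk_size - 1] (so chunk-0 can be reused),
    otherwise a request for the data from the offset to the file end."""
    if offset_address < chunk_size:
        return None
    return "%s? %d r" % (url, offset_address)

print(keyframe_info_request(1024, 2097152, "file1URL"))       # None
print(keyframe_info_request(9_000_000, 2097152, "file1URL"))  # file1URL? 9000000 r
```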
- the first file may not store key frame information.
- the client may generate key frame information while playing the first file.
- the client 1010 may transmit a request packet requesting the first fragment of the first file, for example 'file1URL? 0', to the server 1020.
- the client 1010 may receive a response packet including the first fragment of the first file, e.g., 'chunk-0', from the server 1020.
- the client 1010 may play 'chunk-0'.
- the client 1010 may input 'chunk-0' included in the response packet into the buffer. Data input to the buffer can be played back through the demux and/or decoder.
- the client 1010 may extract key frames included in the 'chunk-0' while the 'chunk-0' is played.
- the client 1010 may generate key frame information for the first file based on the extracted key frames. For example, referring to FIG. 11, the client 1010 may generate a lookup table 1100 including key frame information.
- the client 1010 may generate key frame information even while playing pieces other than the first piece. For example, the client 1010 may continuously update the lookup table 1100 while content stored in the first file is played.
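The lookup table of FIG. 11 could be maintained incrementally like this; the (index, address, timestamp, is_key) frame tuples and the column names are illustrative assumptions about the table's layout:

```python
def update_keyframe_table(table, frames, chunk_size):
    """Append entries for the key frames observed during playback.
    Each entry records the index, byte address, timestamp, and the
    fragment the key frame falls in."""
    for index, address, timestamp, is_key in frames:
        if is_key:
            table.append({"index": index, "address": address,
                          "timestamp": timestamp,
                          "chunk": address // chunk_size})
    return table

table = []
chunk0_frames = [(0, 0, 0.00, True),
                 (1, 4000, 0.04, False),
                 (2, 9000, 0.08, True)]
update_keyframe_table(table, chunk0_frames, 2097152)
print(len(table), table[1]["timestamp"])  # 2 0.08
```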
- the client 1210 may obtain a URL of a first file to be played through a streaming service, for example, 'file1URL'.
- the client 1210 may transmit a request packet for requesting the first fragment of the first file, for example, 'file1URL? 0' to the server 1220.
- the client 1210 may transmit a request packet requesting reproduction information of the first file, for example 'file1URL? r', to the server 1220.
- the client 1210 may receive a response packet including reproduction information of the first file from the server 1220.
- the reproduction information of the first file may include a chunk size, a chunk number, and a file list.
- the file list may include file 1 resolution, file 2 URL, file 2 resolution, and the like.
- the client 1210 may extract the reproduction information of the first file from the response packet.
- the client 1210 may receive a response packet including the first fragment of the first file, e.g., 'file1 chunk-0', from the server 1220.
- the client 1210 may play 'file1 chunk-0'.
- the client 1210 may input 'file1 chunk-0' included in the response packet into the buffer. Data input to the buffer can be played back through the demux and/or decoder.
- the client 1210 may receive a resolution change input.
- the client 1210 may provide a user with a resolution and / or a changeable resolution of content currently being played through a predetermined interface.
- the client 1210 may receive a resolution change input through an interface.
- the client 1210 may automatically determine whether to change the resolution. For example, the client 1210 may automatically determine whether to change the resolution based on the communication status, buffering, communication cost, and the like.
- the client 1210 may transmit a request packet requesting reproduction information of the second file corresponding to the second resolution, for example 'file2URL? i', to the server 1220. Since the client 1210 already has the resolution list, the reproduction information of the second file may be requested using the second indicator, for example 'i'.
- the client 1210 may receive a response packet including reproduction information of the second file from the server 1220.
- the reproduction information of the second file may include a chunk size and a chunk number.
- the client 1210 may transmit a request packet for requesting the first fragment of the second file, for example, 'file2URL? 0' to the server 1220.
- the client 1210 may receive a response packet including the first fragment of the second file, e.g., 'file2 chunk-0', from the server 1220.
- the client 1210 may obtain key frame information of the second file.
- the client 1210 may obtain the key frame information of the second file according to the embodiment described above with reference to FIG. 9.
- the client 1210 may detect a key frame corresponding to the current playback time based on the key frame information of the second file. For example, the client 1210 may detect the key frame closest to the frame a predetermined time, for example 1 second, after the current playback time. As another example, the client 1210 may detect the key frame based on the buffer information of the first file: the client 1210 may detect the key frame corresponding to the playback time that will be reached once the amount of the first file already input to the buffer has been played.
- the client 1210 may transmit a request packet requesting data after the address of the detected key frame, for example 'file2URL? key-frame-address r corresponding-chunk-end-address', to the server 1220.
- the corresponding chunk end address is the end address of the chunk to which the detected key frame belongs, and the client 1210 may calculate the end address of the chunk to which the detected key frame belongs using the reproduction information of the second file.
- alternatively, the client 1210 may transmit a request packet requesting the fragment to which the detected key frame belongs, for example 'file2URL? corresponding-chunk-index', to the server 1220, and may request data of [key frame address, corresponding chunk end address] through the address range parameter included in the header of the request packet.
- alternatively, the client 1210 may request data of [key frame address, file end] by transmitting 'file2URL? key-frame-address r' to the server 1220.
- the client 1210 may receive a response packet including data after the key frame address, for example, [key frame address, corresponding chunk end address].
- the client 1210 may play the second file having the new resolution by inputting data after the key frame address into the buffer.
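The switch described above reduces to: find the first key frame at or after (current time + lookahead), then request from its address to the end of its fragment. A sketch under assumed data shapes (sorted (timestamp, address) pairs; the 1-second lookahead mirrors the example above):

```python
import bisect

def switch_request(keyframes, now_s, lookahead_s, chunk_size, url):
    """Build a '[key frame address, corresponding chunk end]' request in
    the 'address1 r address2' form for the first key frame at or after
    now_s + lookahead_s. Returns None if no later key frame is known."""
    times = [t for t, _ in keyframes]
    i = bisect.bisect_left(times, now_s + lookahead_s)
    if i == len(keyframes):
        return None
    _, addr = keyframes[i]
    chunk_end = (addr // chunk_size + 1) * chunk_size - 1  # end of its chunk
    return "%s? %d r %d" % (url, addr, chunk_end)

kf = [(0.0, 0), (2.0, 3_000_000), (4.0, 6_500_000)]
print(switch_request(kf, 1.2, 1.0, 2097152, "file2URL"))
# file2URL? 6500000 r 8388607
```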
- a client 1310 may receive a resolution change input while playing a first file.
- the client 1310 may provide a user with a resolution and / or a changeable resolution of content currently being played through a predetermined interface.
- the client 1310 may receive a resolution change input through an interface.
- the client 1310 may automatically determine whether to change the resolution.
- the client 1310 may automatically determine whether to change the resolution based on a communication state, buffering, communication cost, and the like.
- the client 1310 may transmit a request packet requesting reproduction information of the second file, for example 'file2URL? i', to the server 1320. Since the client 1310 already has the resolution list, the reproduction information of the second file may be requested using the second indicator, for example 'i'. The client 1310 may receive a response packet including the reproduction information of the second file from the server 1320.
- the reproduction information of the second file may include a chunk size and a chunk number.
- the client 1310 may transmit a request packet for requesting the first fragment of the second file, for example, 'file2URL? 0' to the server 1320.
- the client 1310 may receive a response packet including the first fragment of the second file, e.g., 'file2 chunk-0', from the server 1320.
- the second file may not store the key frame information.
- the client 1310 may estimate which fragment of the second file corresponds to the current playback time, based on the current playback time and the total playback time. For example, the client 1310 may estimate the fragment using the ratio of the current playback time to the total playback time.
- the client 1310 may transmit a request packet requesting the estimated fragment's data, for example 'file2URL? m', to the server 1320, where 'm' is the index of the estimated fragment.
- the client 1310 may receive a response packet including the estimated fragment's data, e.g., 'file2 chunk-m'.
- the client 1310 may determine whether the estimation succeeded using the estimated fragment's data. For example, the client 1310 may compare the current playback time with the time of the first frame in the estimated fragment.
- if the estimation is determined to have failed, the client 1310 may estimate a new fragment. If the estimation is determined to have succeeded, the client 1310 may extract the key frames in the estimated fragment, detect the key frame closest to the current playback time, and input the data after the detected key frame into the buffer.
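The ratio-based guess and one plausible reading of its verification can be sketched as follows (function names are illustrative):

```python
def estimate_fragment(target_s, total_s, num_chunks):
    """First guess of the fragment covering `target_s`, from the ratio of
    the target time to the total playback time."""
    index = int(num_chunks * target_s / total_s)
    return min(index, num_chunks - 1)

def estimation_succeeded(target_s, first_frame_s, last_frame_s):
    """The guess succeeded when the target time lies within the frame
    times actually found in the received fragment."""
    return first_frame_s <= target_s <= last_frame_s

print(estimate_fragment(30.0, 120.0, 97))      # 24
print(estimation_succeeded(30.0, 29.1, 31.6))  # True
```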
- a client 1410 may receive fragments from the server 1420 using various bit rates.
- the fragments are not required to begin with a key frame, and furthermore a fragment need not contain a key frame at all. Accordingly, the server 1420 does not need to encode the content to match fragment boundaries, and may simply divide general video files such as mp4 and flv by capacity and transmit them. Since the fragments exchanged between the server 1420 and the client 1410 are not required to begin with, or even to contain, a key frame, the embodiments can support not only closed GOPs but also open GOPs.
- the client 1410 may download fragments in advance up to the size of the buffer. By storing fragments that have already been played, the client 1410 may avoid downloading the same fragments again when frames belonging to the stored fragments are replayed.
- the client 1510 may transmit a request packet requesting the nth fragment of the first file, for example 'file1URL? n', to the server 1520.
- the client 1510 may receive a response packet including the nth fragment of the first file, e.g., 'chunk-n', from the server 1520.
- the client 1510 may input 'chunk-n' into the buffer. Data input to the buffer can be played back through the demux and/or decoder.
- the client 1510 may receive a search input while 'chunk-n' is playing.
- the client 1510 may receive a search input through a predetermined interface.
- the search input may include a search time.
- the client 1510 may detect a key frame corresponding to the search time based on the key frame information of the first file. For example, the client 1510 may detect the key frame closest to the frame at the search time.
- the client 1510 may transmit a request packet requesting data after the address of the detected key frame, for example 'file1URL? key-frame-address r corresponding-chunk-end-address', to the server 1520.
- the corresponding chunk end address is the end address of the chunk to which the detected key frame belongs, and the client 1510 may calculate the end address of the chunk to which the detected key frame belongs using the reproduction information of the first file.
- alternatively, the client 1510 may transmit a request packet requesting the fragment to which the detected key frame belongs, for example 'file1URL? corresponding-chunk-index', to the server 1520, and may request data of [key frame address, corresponding chunk end address] through the address range parameter included in the header of the request packet.
- the client 1510 may receive a response packet including data after the key frame address, for example, [key frame address, corresponding chunk end address].
- the client 1510 may search for and reproduce the first file by inputting data after the key frame address into the buffer.
- the client 1610 may transmit 'file1URL? n' to the server 1620.
- the client 1610 may receive a response packet including 'chunk-n' from the server 1620.
- the client 1610 may play 'chunk-n'.
- the client 1610 may receive a search input while 'chunk-n' is playing.
- the client 1610 may detect a key frame corresponding to the search time based on the key frame information of the first file. For example, the client 1610 may detect the key frame closest to the frame at the search time.
- the client 1610 may transmit a request packet requesting the data of the fragment to which the detected key frame belongs, for example 'file1URL? k', to the server 1620, where 'k' is the index of the fragment to which the detected key frame belongs.
- the client 1610 may receive a response packet including data of a fragment to which the detected key frame belongs, for example, 'file1 chunk-k'.
- the client 1610 may search for and reproduce the first file by inputting data after the key frame address into the buffer.
- the search operation of FIG. 16 may increase the efficiency of the cache server compared to the search operation of FIG. 15, and the search operation of FIG. 15 may reduce the amount of data actually transmitted compared to the search operation of FIG. 16.
- Embodiments may operate according to either the search operation of FIG. 15 or the search operation of FIG. 16 in view of a trade-off relationship between the efficiency of the cache server and the amount of data actually transmitted.
- a client 1710 may receive a search input while playing 'chunk-n'.
- the search input may include a search time.
- the first file currently being played may not store key frame information.
- the client 1710 may generate key frame information according to the embodiment described above with reference to FIGS. 10 and 11. However, the key frame corresponding to the search time may not be included in the key frame information yet.
- the client 1710 may estimate which fragment of the first file corresponds to the search time, based on the search time and the total playback time. For example, the client 1710 may estimate the fragment using the ratio of the search time to the total playback time.
- the client 1710 may transmit a request packet requesting the estimated fragment's data, for example 'file1URL? m', to the server 1720, where 'm' is the index of the estimated fragment.
- the client 1710 may receive a response packet including the estimated fragment's data, e.g., 'file1 chunk-m'.
- the client 1710 may determine whether the estimation succeeded using the estimated fragment's data. For example, the client 1710 may compare the search time with the time of the first frame in the estimated fragment.
- if the estimation is determined to have failed, the client 1710 may estimate a new fragment. If the estimation is determined to have succeeded, the client 1710 may extract the key frames in the estimated fragment, detect the key frame closest to the search time, and input the data after the detected key frame into the buffer.
- FIGS. 18 and 19 are diagrams for describing an operation of a client using a plurality of url streams.
- a client may perform a resolution change operation and/or a search operation using a plurality of url streams.
- a case in which a client uses two url streams will be described.
- Each of the first url stream and the second url stream may generate a request packet, transmit the request packet to the server, and process a response packet received from the server.
- Data of the first url stream may be reproduced through the first demux and the first decoder.
- Data of the second url stream may be reproduced through the second demux and the second decoder.
- the first demux and the second demux may be implemented through a single device.
- the first demux and the second demux may be implemented in two threads using the same demux device.
- the first decoder and the second decoder may also be implemented through a single device.
- the first decoder and the second decoder may be implemented in the form of two threads using the same decoder device.
- the client may perform a resolution change operation and/or a search operation by using one demux and one decoder.
- a B-frame may be a frame compressed with reference to previous frame information and subsequent frame information.
- when the first file contains B-frames and the buffer of the second file is concatenated after the buffer of the first file, a B-frame included in the first file may be generated with reference to the second file instead of the first file.
- the buffer of the first file and the buffer of the second file may be concatenated so that the buffer input lines to the decoder need not be distinguished.
- the client may perform a resolution change operation and/or a search operation by using one decoder.
- the client may select an operation mode.
- the client may control the data flow of the first url stream and the data flow of the second url stream using at least one of the signals controlling the first demux (EN_DEMUX_1), the first decoder (EN_DECODER_1), the second demux (EN_DEMUX_2), the second decoder (EN_DECODER_2), and the output mux (MUX_OUT).
- the client may sequentially play from the first fragment of the first file by using the first url stream. At this time, the client may receive the reproduction information of the first file by using the second url stream.
- the client may detect the key frame corresponding to the current playback time among the key frames of the second file using the second url stream while playing the first file using the first url stream. To change the resolution, the client may control the data flow of the first url stream and the data flow of the second url stream at the time point when the detected key frame is played.
- the client may switch off the data flow of the first url stream and switch on the data flow of the second url stream.
- the client may control the data flow of the first url stream and the data flow of the second url stream so that both are switched on together for a predetermined time, for example 30 ms, in consideration of the delay of the demux and/or the decoder.
- the client may play data after the detected key frame using the second url stream.
- the client may clear the buffer for the first url stream.
- the client may process the additional resolution change input by changing only the roles of the first url stream and the second url stream in the above-described operation.
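The switch-over described above (switch on stream 2, keep both paths enabled briefly, then switch off stream 1) can be sketched with the enable signals named above; representing them as a dict of booleans is an illustrative assumption:

```python
def switch_over(signals, overlap_ms=30):
    """Return (overlap_state, final_state, overlap_ms): first both paths
    are enabled for `overlap_ms` to absorb demux/decoder delay, then the
    first url stream's path is switched off."""
    on2 = dict(signals, EN_DEMUX_2=True, EN_DECODER_2=True)   # switch on 2
    final = dict(on2, EN_DEMUX_1=False, EN_DECODER_1=False)   # switch off 1
    return on2, final, overlap_ms

before = {"EN_DEMUX_1": True, "EN_DECODER_1": True,
          "EN_DEMUX_2": False, "EN_DECODER_2": False}
overlap, after, ms = switch_over(before)
print(overlap["EN_DEMUX_1"], after["EN_DEMUX_1"], after["EN_DEMUX_2"], ms)
# True False True 30
```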
- when a resolution change is input while the client is playing the first file using the first url stream, the client may use the second url stream for the second file and detect, among the key frames of the second file, the key frame corresponding to the current buffer amount.
- the current buffer amount may be an amount in which the first file is input to the current buffer.
- the client may detect a key frame of the second file corresponding to a time after the current buffer amount. In other words, the client may detect the key frame of the second file corresponding to the playback time that will be reached once the amount of the first file already input to the decoder's buffer has been played.
- the client may input the data of the first file into the buffer only up to the playback time of the detected key frame of the second file, and may input the data of the second file from that key frame onward into the same buffer, so that the buffer of the second file follows the buffer of the first file.
- the resolution change and/or search may be performed at the reproduction time corresponding to the key frame of the second file even if the client does not perform a separate operation.
- the client may not perform a separate switch on / off operation.
- the client need not clear the buffer.
- the first url stream may generate the buffer of the first file corresponding to the time before the key frame of the second file, and when the buffer input of the first file is completed, the second url stream may generate the buffer of the second file so that it follows the buffer of the first file.
- the embodiments described above may be implemented as hardware components, software components, and/or combinations of hardware and software components.
- the devices, methods, and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
- the processing device may execute an operating system (OS) and one or more software applications running on the operating system.
- the processing device may also access, store, manipulate, process, and generate data in response to the execution of the software.
- the processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
- the processing device may include a plurality of processors or one processor and one controller.
- other processing configurations are possible, such as parallel processors.
- the software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or may command the processing device, independently or collectively.
- Software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device.
- the software may be distributed over networked computer systems so that it may be stored or executed in a distributed manner.
- Software and data may be stored on one or more computer readable recording media.
- the method according to the embodiment may be embodied in the form of program instructions that can be executed by various computer means and recorded in a computer readable medium.
- the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
- the program instructions recorded on the media may be those specially designed and constructed for the purposes of the embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts.
- Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; and magneto-optical media such as floptical disks.
- Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
- the hardware device described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
Description
MusicVideo.mp4 |
MusicVideo_720p.mp4 |
MusicVideo_480p.mp4 |
MusicVideo_360p.mp4 |
2097152 | 97 | 1080 | URL2 | 720 | URL3 | 480 | URL4 | 360 |
2097152 | 97 | URL1 | 1080a | URL2 | 720 | URL3 | 480 | URL4 | 360 |
1048576 | 86 |
[(2097152)*n, (2097152)*(n+1)-1] |
[(2097152)*n + start range, (2097152)*n + end range] |
[address, end of file] |
[address 1, address 2] |
Claims (36)
- 스트리밍 서비스를 위한 클라이언트의 동작 방법에 있어서,파일 URL 및 재생 정보 요청을 지시하는 제1 매개변수를 포함하는 제1 요청 패킷을 전송하는 단계;상기 파일 URL에 대응하는 파일의 재생 정보를 포함하는 제1 응답 패킷을 수신하는 단계;상기 파일 URL 및 데이터 요청을 지시하는 제2 매개변수를 포함하는 제2 요청 패킷을 전송하는 단계; 및상기 파일 내 상기 제2 매개변수에 대응하는 주소 범위의 데이터를 포함하는 제2 응답 패킷을 수신하는 단계를 포함하는, 클라이언트의 동작 방법.
- 제1항에 있어서,상기 재생 정보는상기 파일을 분할하는 조각의 크기, 및 상기 조각의 개수 중 적어도 하나를 포함하는, 클라이언트의 동작 방법.
- 제1항에 있어서,상기 제1 매개변수를 미리 정해진 제1 지시자로 설정하는 단계; 및상기 제1 응답 패킷으로부터 상기 파일을 분할하는 조각의 크기, 상기 조각의 개수, 상기 파일에 저장된 컨텐츠의 해상도, 상기 해상도와 다른 제2 해상도의 컨텐츠를 저장하는 제2 파일의 URL, 및 상기 제2 해상도를 추출하는 단계를 더 포함하는, 클라이언트의 동작 방법.
- 제1항에 있어서,상기 제2 매개변수를 최초 인덱스로 설정하는 단계; 및상기 제2 응답 패킷의 데이터를 버퍼에 입력하는 단계를 더 포함하는, 클라이언트의 동작 방법.
- 제1항에 있어서,버퍼의 잔량을 체크하는 단계;상기 버퍼의 잔량이 임계 값 이하가 되면, 상기 제2 매개변수를 다음 재생할 조각의 인덱스로 설정하는 단계; 및상기 제2 응답 패킷의 데이터를 상기 버퍼에 입력하는 단계를 더 포함하는, 클라이언트의 동작 방법.
- 제1항에 있어서,상기 제2 매개변수를 최초 인덱스로 설정하는 단계; 및상기 제2 응답 패킷으로부터 상기 파일의 키 프레임 정보를 위한 오프셋 주소를 추출하는 단계를 더 포함하는, 클라이언트의 동작 방법.
- 제6항에 있어서,상기 오프셋 주소의 데이터가 상기 제2 응답 패킷에 포함된 경우, 상기 제2 응답 패킷으로부터 상기 키 프레임 정보를 추출하는 단계를 더 포함하는, 클라이언트의 동작 방법.
- 제6항에 있어서,상기 오프셋 주소의 데이터가 상기 제2 응답 패킷에 포함되지 않는 경우, 데이터 요청을 지시하는 제3 매개변수를 상기 오프셋 주소로 설정하는 단계;상기 파일 URL 및 상기 제3 매개변수를 포함하는 제3 요청 패킷을 전송하는 단계;상기 파일 내 상기 제3 매개변수에 대응하는 주소 범위의 데이터를 포함하는 제3 응답 패킷을 수신하는 단계; 및상기 제3 응답 패킷으로부터 상기 키 프레임 정보를 추출하는 단계를 더 포함하는, 클라이언트의 동작 방법.
- 제1항에 있어서,상기 제2 응답 패킷의 데이터를 버퍼에 입력하는 단계;상기 데이터가 재생되는 동안 상기 파일의 키 프레임들을 추출하는 단계; 및상기 키 프레임들에 기초하여 상기 파일의 키 프레임 정보를 생성하는 단계를 더 포함하는, 클라이언트의 동작 방법.
- The method of claim 1, further comprising: receiving a resolution change input; setting the file URL to a URL of a second file corresponding to a new resolution included in the resolution change input; setting the first parameter to a predetermined second indicator; and extracting, from the first response packet, a size of fragments dividing the second file and a number of the fragments.
- The method of claim 1, further comprising: receiving a resolution change input; setting the file URL to a URL of a second file corresponding to a new resolution included in the resolution change input; detecting a key frame corresponding to a current playback time based on key frame information of the second file; setting the second parameter to an address of the detected key frame; and inputting data of the second response packet into a buffer.
- The method of claim 11, wherein the second parameter includes the address of the detected key frame and an end address of a fragment to which the detected key frame belongs.
- The method of claim 11, wherein the buffer is a second buffer distinct from a buffer currently in use, and data of the second buffer is played from a time of the detected key frame onward.
- The method of claim 1, further comprising: receiving a resolution change input; setting the file URL to a URL of a second file corresponding to a new resolution included in the resolution change input; estimating a fragment corresponding to a current playback time based on the current playback time and a total playback time; setting the second parameter to an index of the estimated fragment; detecting, when a time range of the second response packet includes the current playback time, a key frame closest to the current playback time from data of the second response packet; and inputting data subsequent to the detected key frame into the buffer.
- The method of claim 14, further comprising: estimating, when the time range of the second response packet does not include the current playback time, a new index corresponding to the current playback time; setting a third parameter indicating a data request to the new index; transmitting a third request packet including the file URL and the third parameter; receiving a third response packet including data of an address range in the file corresponding to the third parameter; and detecting a key frame closest to the current playback time from data of the third response packet.
- The method of claim 14, wherein the buffer is a second buffer distinct from a buffer currently in use, and data of the second buffer is played from a time of the detected key frame onward.
- The method of claim 1, further comprising: receiving a seek time; detecting a key frame corresponding to the seek time using key frame information of the file; setting the second parameter to an address of the detected key frame; and inputting data of the second response packet into a buffer.
- The method of claim 17, wherein the second parameter includes the address of the detected key frame and an end address of a fragment to which the detected key frame belongs.
- The method of claim 17, wherein the buffer is a second buffer distinct from a buffer currently in use, and data of the second buffer is played from a time of the detected key frame onward.
- The method of claim 1, further comprising: receiving a seek time; estimating a fragment corresponding to the seek time based on the seek time and a total playback time of the file; setting the second parameter to an index of the estimated fragment; detecting, when a time range of the second response packet includes the seek time, a key frame closest to the seek time from data of the second response packet; and inputting data subsequent to the detected key frame into a buffer.
- The method of claim 20, further comprising: estimating, when the time range of the second response packet does not include the seek time, a new index corresponding to the seek time; setting a third parameter indicating a data request to the new index; transmitting a third request packet including the file URL and the third parameter; receiving a third response packet including data of an address range in the file corresponding to the third parameter; and detecting a key frame closest to the seek time from data of the third response packet.
- The method of claim 20, wherein the buffer is a second buffer distinct from a buffer currently in use, and data of the second buffer is played from a time of the detected key frame onward.
- The method of claim 1, wherein the client uses an HTTP protocol for communication with a server.
- The method of claim 1, wherein the first parameter includes a size of fragments dividing the file, and the playback information includes at least one of a number of the fragments and a size of the file.
- An operating method of a server for a streaming service, the method comprising: receiving a request packet including a file URL and a parameter; responding, when the parameter indicates a data request, with data of an address range corresponding to the parameter within a file corresponding to the file URL; and responding, when the parameter indicates a playback information request, with playback information of the file.
- The method of claim 25, wherein the playback information includes at least one of a size of fragments dividing the file and a number of the fragments.
- The method of claim 25, wherein the responding with the data includes determining whether the parameter includes an index or an address.
- The method of claim 25, wherein the responding with the data includes at least one of: responding, when the parameter includes an index, with data of an address range in the file corresponding to the index; and responding, when the parameter includes an address, with data of an address range in the file corresponding to the address.
- The method of claim 25, wherein the address range corresponding to the index is determined based on a size of fragments dividing the file.
- The method of claim 25, wherein the responding with the data of the address range corresponding to the address includes at least one of: responding, when the parameter includes two addresses, with data between the two addresses in the file; and responding, when the parameter includes one address, with data between the one address and an end of the file.
- The method of claim 25, wherein the responding with the playback information includes at least one of: responding, when the parameter includes a predetermined first indicator, with a size of fragments dividing the file, a number of the fragments, a resolution of content stored in the file, a URL of a second file storing content of a second resolution different from the resolution, and the second resolution; and responding, when the parameter includes a predetermined second indicator, with the size of the fragments and the number of the fragments.
- The method of claim 31, wherein the second file is retrieved based on at least one of a file name of the file and a directory in which the file is stored.
- The method of claim 31, wherein the second resolution is obtained based on a file name of the second file.
- The method of claim 25, wherein the server uses an HTTP protocol for communication with a client.
- The method of claim 25, further comprising receiving a size of fragments dividing the file, wherein the playback information includes at least one of a number of the fragments and a size of the file.
- A computer program stored in a medium, combined with hardware, to execute the method of any one of claims 1 to 35.
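The client-side buffering steps in the claims above (check the remaining buffer, and once it drops to or below a threshold set the second parameter to the index of the next fragment to play) can be sketched as a small decision function. All names here (`buffer_remaining`, `threshold`, the `{"index": ...}` parameter shape) are illustrative assumptions, not terms fixed by the claims.

```python
def next_request_params(buffer_remaining: int, threshold: int,
                        last_played_index: int, num_fragments: int):
    """Return the second parameter (the index of the next fragment)
    for the next data request, or None if the buffer still holds
    enough data or the file is exhausted."""
    if buffer_remaining > threshold:
        return None  # enough buffered data; no request yet
    next_index = last_played_index + 1
    if next_index >= num_fragments:
        return None  # no further fragments to fetch
    return {"index": next_index}  # second parameter set to next index
```

For example, with 1 KB left in the buffer against a 4 KB threshold, the client would request the fragment after the one currently playing.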
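The key-frame-information claims distinguish two cases: the offset address either falls inside the data already received in the second response packet, or a third request must be issued for that address. A minimal sketch of that case split, with hypothetical names and an address-keyed parameter:

```python
def locate_key_frame_info(offset_addr: int, resp_start: int, resp_end: int):
    """Decide how to obtain key frame information given the offset
    address and the byte range [resp_start, resp_end) already received.
    Returns either a local extraction offset or a third-request parameter."""
    if resp_start <= offset_addr < resp_end:
        # data at the offset is already in the second response packet
        return ("extract", offset_addr - resp_start)
    # otherwise set the third parameter to the offset address
    return ("request", {"addr": offset_addr})
```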
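Several claims estimate which fragment covers a given playback or seek time from the total playback time. One plausible reading, assuming fragments span roughly equal playback durations (an assumption; the claims themselves only fix a fragment size), is:

```python
def estimate_fragment_index(seek_time: float, total_time: float,
                            num_fragments: int) -> int:
    """Estimate the index of the fragment covering seek_time,
    assuming fragments of roughly equal playback duration."""
    if total_time <= 0:
        raise ValueError("total playback time must be positive")
    index = int(seek_time / total_time * num_fragments)
    # clamp to the valid index range
    return min(max(index, 0), num_fragments - 1)
```

If the returned fragment's time range turns out not to contain the target time, the claims fall back to estimating a new index and issuing a third request.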
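The server-side claims dispatch on a single request parameter: an index or an address signals a data request, and anything else a playback-information request. A toy sketch of that dispatch with the HTTP layer omitted; the dict keys and in-memory file are assumptions for illustration only:

```python
def handle_request(params: dict, playback_info: dict,
                   file_data: bytes, fragment_size: int):
    """Answer a request packet's parameter: index -> one fragment,
    address -> data from that address to end of file,
    otherwise -> playback information."""
    if "index" in params:
        # address range of an index is derived from the fragment size
        start = params["index"] * fragment_size
        return file_data[start:start + fragment_size]
    if "addr" in params:
        # single address: respond with data up to the end of the file
        return file_data[params["addr"]:]
    return playback_info  # playback-information request
```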
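The address-range claims resolve the address parameter to a byte range: two addresses bound the range directly, while a single address runs to the end of the file. A hedged sketch of that rule (the list-of-addresses shape is an assumption):

```python
def resolve_range(addresses: list, file_size: int):
    """Map one or two addresses to a byte range [start, end):
    two addresses -> the span between them; one address -> from that
    address to the end of the file."""
    if len(addresses) == 2:
        start, end = addresses
    elif len(addresses) == 1:
        start, end = addresses[0], file_size
    else:
        raise ValueError("expected one or two addresses")
    return start, min(end, file_size)
```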
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15821394.2A EP3171604A4 (en) | 2014-07-16 | 2015-04-01 | Operating method of client and server for streaming service |
US15/318,679 US20170134463A1 (en) | 2014-07-16 | 2015-04-01 | Operating method of client and server for streaming service |
CN201580038102.5A CN106537924A (zh) | 2014-07-16 | 2015-04-01 | Operating method of client and server for streaming service |
JP2017502886A JP2017529726A (ja) | 2014-07-16 | 2015-04-01 | Operating method of client and server for streaming service |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140089679A KR101600469B1 (ko) | 2014-07-16 | 2014-07-16 | Operating method of client and server for streaming service |
KR10-2014-0089679 | 2014-07-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016010229A1 true WO2016010229A1 (ko) | 2016-01-21 |
Family
ID=55078691
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2015/003230 WO2016010229A1 (ko) | 2014-07-16 | 2015-04-01 | Operating method of client and server for streaming service |
Country Status (6)
Country | Link |
---|---|
US (1) | US20170134463A1 (ko) |
EP (1) | EP3171604A4 (ko) |
JP (1) | JP2017529726A (ko) |
KR (1) | KR101600469B1 (ko) |
CN (1) | CN106537924A (ko) |
WO (1) | WO2016010229A1 (ko) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018021616A1 (en) * | 2016-07-29 | 2018-02-01 | Airbroad Inc. | Operating method of client for streaming service |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8942543B1 (en) | 2010-10-06 | 2015-01-27 | Verint Video Solutions Inc. | Systems, methods, and software for improved video data recovery effectiveness |
US10033794B2 (en) * | 2015-07-17 | 2018-07-24 | Bio-Rad Laboratories, Inc. | Network transfer of large files in unstable network environments |
CN105828149A (zh) * | 2016-04-28 | 2016-08-03 | 合智能科技(深圳)有限公司 | 播放优化方法和装置 |
US10873775B2 (en) | 2017-06-12 | 2020-12-22 | Netflix, Inc. | Staggered key frame video encoding |
US11354164B1 (en) * | 2018-04-20 | 2022-06-07 | Automation Anywhere, Inc. | Robotic process automation system with quality of service based automation |
US10908950B1 (en) | 2018-04-20 | 2021-02-02 | Automation Anywhere, Inc. | Robotic process automation system with queue orchestration and task prioritization |
KR102500145B1 (ko) * | 2018-07-13 | 2023-02-16 | 삼성전자주식회사 | 전자 장치 및 전자 장치의 컨텐트 전송 방법 |
CN110545492B (zh) * | 2018-09-05 | 2020-07-31 | 北京开广信息技术有限公司 | 媒体流的实时递送方法及服务器 |
CN110881018B (zh) * | 2018-09-05 | 2020-11-03 | 北京开广信息技术有限公司 | 媒体流的实时接收方法及客户端 |
CN110072128B (zh) * | 2019-04-22 | 2021-01-15 | 北京开广信息技术有限公司 | 媒体流的实时推送方法及服务器 |
CN110113655B (zh) * | 2019-05-05 | 2021-09-21 | 北京奇艺世纪科技有限公司 | 一种视频播放的方法、装置及用户终端 |
CN111314434B (zh) * | 2020-01-20 | 2022-08-19 | 浪潮云信息技术股份公司 | 请求处理方法及服务端 |
KR102655215B1 (ko) * | 2022-09-27 | 2024-04-05 | (주)이노시뮬레이션 | 원격 유지보수를 위한 선박 및 지상 담당자 사이의 네트워크 연결 방법 및 이를 실행하는 시스템 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002215516A (ja) * | 2001-01-22 | 2002-08-02 | Sony Corp | Information terminal device, download control method, and computer program |
KR20100055296A (ko) * | 2008-11-17 | 2010-05-26 | SK Telecom Co., Ltd. | Sequential multimedia streaming system and method using redirected URLs of distributed stored content |
KR101066872B1 (ko) * | 2008-10-30 | 2011-09-26 | SK Telecom Co., Ltd. | Content delivery system and method using a cache server, and the cache server |
KR20120036901A (ko) * | 2009-11-09 | 2012-04-18 | Huawei Technologies Co., Ltd. | Method, system, and network equipment for implementing an HTTP-based streaming media service |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001242876A (ja) * | 1999-12-20 | 2001-09-07 | Matsushita Electric Ind Co Ltd | Data reception and playback method, data reception and playback device, data transmission method, and data transmission device |
JP2005260283A (ja) * | 2004-02-13 | 2005-09-22 | Matsushita Electric Ind Co Ltd | Network playback method for AV content |
CN102055773B (zh) * | 2009-11-09 | 2013-10-09 | 华为技术有限公司 | 实现基于http的流媒体业务的方法、***和网络设备 |
KR101624013B1 (ko) * | 2010-02-19 | 2016-05-24 | Telefonaktiebolaget LM Ericsson (publ) | Method and apparatus for adaptation in HTTP streaming |
JP5497919B2 (ja) * | 2010-03-05 | 2014-05-21 | Samsung Electronics Co., Ltd. | File-format-based adaptive stream generation and playback method and apparatus, and recording medium therefor |
EP2589222B1 (en) * | 2010-06-29 | 2021-02-24 | Qualcomm Incorporated | Signaling video samples for trick mode video representations |
US9485546B2 (en) * | 2010-06-29 | 2016-11-01 | Qualcomm Incorporated | Signaling video samples for trick mode video representations |
US9185439B2 (en) * | 2010-07-15 | 2015-11-10 | Qualcomm Incorporated | Signaling data for multiplexing video components |
US9462024B2 (en) * | 2011-06-08 | 2016-10-04 | Futurewei Technologies, Inc. | System and method of media content streaming with a multiplexed representation |
EP2547062B1 (en) * | 2011-07-14 | 2016-03-16 | Nxp B.V. | Media streaming with adaptation |
US8935425B2 (en) * | 2011-10-05 | 2015-01-13 | Qualcomm Incorporated | Switching between representations during network streaming of coded multimedia data |
- 2014-07-16 KR KR1020140089679A patent/KR101600469B1/ko active IP Right Grant
- 2015-04-01 US US15/318,679 patent/US20170134463A1/en not_active Abandoned
- 2015-04-01 EP EP15821394.2A patent/EP3171604A4/en not_active Withdrawn
- 2015-04-01 JP JP2017502886A patent/JP2017529726A/ja not_active Ceased
- 2015-04-01 CN CN201580038102.5A patent/CN106537924A/zh active Pending
- 2015-04-01 WO PCT/KR2015/003230 patent/WO2016010229A1/ko active Application Filing
Non-Patent Citations (1)
Title |
---|
See also references of EP3171604A4 * |
Also Published As
Publication number | Publication date |
---|---|
KR20160009322A (ko) | 2016-01-26 |
EP3171604A1 (en) | 2017-05-24 |
JP2017529726A (ja) | 2017-10-05 |
US20170134463A1 (en) | 2017-05-11 |
CN106537924A (zh) | 2017-03-22 |
KR101600469B1 (ko) | 2016-03-07 |
EP3171604A4 (en) | 2018-04-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016010229A1 (ko) | Operating method of client and server for streaming service | |
WO2014129747A1 (en) | Method and apparatus for streaming multimedia content of server by using cache | |
WO2012138183A2 (en) | Apparatus and method for providing content using a network condition-based adaptive data streaming service | |
US9356985B2 (en) | Streaming video to cellular phones | |
WO2015060638A1 (ko) | Adaptive real-time transcoding method and streaming server therefor | |
WO2015190893A1 (ko) | Method and apparatus for managing multimedia data | |
EP3937434B1 (en) | Data distribution method and network device | |
EP3843412A2 (en) | Method and apparatus for managing redundant segmented streams | |
WO2016129966A1 (ko) | Recording medium and apparatus storing a program for providing low-latency live broadcast content | |
KR101942269B1 (ko) | Apparatus and method for playing back and seeking media in a web browser | |
WO2016056804A1 (en) | Content processing apparatus and content processing method thereof | |
WO2012176979A1 (ko) | Method and system for high-definition video streaming service | |
WO2014077459A1 (ko) | Method and apparatus for providing content by selecting a data acceleration algorithm | |
WO2020138567A1 (ko) | Content streaming apparatus, system, and method | |
WO2013154364A1 (ko) | Streaming playback method and computing device using the same | |
WO2018021616A1 (en) | Operating method of client for streaming service | |
WO2023027399A1 (en) | Method and device for downloading streaming media file | |
EP3461134A1 (en) | Low latency adaptive bitrate linear video delivery system | |
WO2023033300A1 (en) | Encoding and decoding video data | |
WO2016099183A1 (ko) | Hybrid transport protocol | |
WO2020138568A1 (ko) | Content encoding apparatus and method | |
WO2019209008A1 (ko) | Video quality improvement system using a changed-macroblock extraction technique | |
WO2014208972A1 (en) | Method and apparatus for converting content in multimedia system | |
JP6357188B2 (ja) | Surveillance camera system and surveillance camera data storage method | |
US10484725B2 (en) | Information processing apparatus and information processing method for reproducing media based on edit file |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15821394; Country of ref document: EP; Kind code of ref document: A1 |
REEP | Request for entry into the european phase | Ref document number: 2015821394; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 15318679; Country of ref document: US; Ref document number: 2015821394; Country of ref document: EP |
ENP | Entry into the national phase | Ref document number: 2017502886; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |