US20200029130A1 - Method and apparatus for configuring content in a broadcast system - Google Patents

Method and apparatus for configuring content in a broadcast system

Info

Publication number
US20200029130A1
US20200029130A1 (U.S. Application US16/588,417)
Authority
US
United States
Prior art keywords
mpu
data
layer
aus
fragmentation unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/588,417
Inventor
Sung-ryeul Rhyu
Jae-Yeon Song
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US16/588,417
Publication of US20200029130A1

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N 21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
              • H04N 21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
                • H04N 21/643 Communication protocols
            • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
              • H04N 21/23 Processing of content or additional data; Elementary server operations; Server middleware
                • H04N 21/236 Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
                  • H04N 21/23605 Creation or processing of packetized elementary streams [PES]
                  • H04N 21/23608 Remultiplexing multiplex streams, e.g. involving modifying time stamps or remapping the packet identifiers
                  • H04N 21/23614 Multiplexing of additional data and video streams
                  • H04N 21/2362 Generation or processing of Service Information [SI]
                • H04N 21/238 Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
                  • H04N 21/2381 Adapting the multiplex stream to a specific network, e.g. an Internet Protocol [IP] network
          • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
            • H04N 19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
              • H04N 19/169 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
                • H04N 19/177 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding, the unit being a group of pictures [GOP]
        • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
          • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
            • H04L 65/60 Network streaming of media packets
              • H04L 65/61 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
                • H04L 65/611 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio, for multicast or broadcast
              • H04L 65/65 Network streaming protocols, e.g. real-time transport protocol [RTP] or real-time control protocol [RTCP]
              • H04L 65/70 Media network packetisation
          • H04L 65/4076
          • H04L 65/607
          • H04L 65/608
      • H03 ELECTRONIC CIRCUITRY
        • H03M CODING; DECODING; CODE CONVERSION IN GENERAL
          • H03M 13/00 Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
            • H03M 13/35 Unequal or adaptive error protection, e.g. by providing a different level of protection according to significance of source information or by adapting the coding according to the change of transmission channel characteristics

Definitions

  • the present invention relates generally to a method and an apparatus for configuring content in a broadcast system, and more particularly, to a method and an apparatus for configuring a data unit of content in a broadcast system supporting multimedia services based on an Internet Protocol (IP).
  • a conventional broadcast network generally uses the Moving Picture Experts Group-2 Transport Stream (MPEG-2 TS) for transmission of multimedia content.
  • MPEG-2 TS is a representative transmission technique that allows a plurality of broadcast programs (a plurality of encoded video bit streams) to transmit multiplexed bit streams in a transmission environment having errors.
  • the MPEG-2 TS is appropriately used in digital TeleVision (TV) broadcasting, etc.
  • FIG. 1 illustrates a layer structure supporting a conventional MPEG-2 TS.
  • the conventional MPEG-2 TS layer includes a media coding layer 110 , a sync (synchronization) layer 120 , a delivery layer 130 , a network layer 140 , a data link layer 150 , and a physical layer 160 .
  • the media coding layer 110 and the sync layer 120 configure media data to a format usable for recording or transmission.
  • the delivery layer 130 , the network layer 140 , the data link layer 150 , and the physical layer 160 configure a multimedia frame for recording or transmitting a data block having the format configured by the sync layer 120 in/to a separate recording medium.
  • the configured multimedia frame is transmitted to a subscriber terminal, etc., through a predetermined network.
  • the sync layer 120 includes a fragment block 122 and an access unit 124
  • the delivery layer 130 includes an MPEG-2 TS/MPEG-4 (MP4) Real-time Transport Protocol (RTP) Payload Format/File delivery over unidirectional transport (FLUTE) 132 block, an RTP/HyperText Transfer Protocol (HTTP) block 134 , and a User Datagram Protocol (UDP)/Transmission Control Protocol (TCP) block 136 .
  • the MPEG-2 TS has several limitations in supporting multimedia services. Specifically, the MPEG-2 TS suffers from inefficient transmission due to unidirectional communication and a fixed frame size, and from unnecessary overhead due to the usage of a transport protocol and an IP specialized for audio/video data, etc.
  • the MPEG Media Transport (MMT) standard has been proposed by MPEG in order to overcome the above-described limitations of the MPEG-2 TS.
  • the MMT standard may be applied for the efficient transmission of complex content through heterogeneous networks.
  • the complex content includes a set of content having multimedia factors by a video/audio application, etc.
  • the heterogeneous networks include networks in which a broadcast network and a communication network coexist.
  • the MMT standard attempts to define a transmission technique that is friendlier to IP, which is a basic technique in transmission networks for multimedia services.
  • Accordingly, the MMT standard attempts to provide efficient MPEG transmission techniques in a multimedia service environment that changes based on IP, and in this respect, standardization and continuous research on the MMT standard have progressed.
  • FIG. 2 illustrates a conventional layer structure of an MMT system for transmission of a multimedia frame according to multi-service/content through heterogeneous networks.
  • an MMT system for configuring and transmitting a multimedia frame includes a media coding layer 210 , an encapsulation layer (Layer E) 220 , delivery layers (Layer D) 230 and 290 , a network layer 240 , a data link layer 250 , a physical layer 260 , and control layers (Layer C) 270 and 280 .
  • the layers include three technique areas, Layer E 220 , Layers D 230 and 290 , and Layers C 270 and 280 .
  • Layer E 220 controls complex content generation
  • Layers D 230 and 290 control the transmission of the generated complex content through the heterogeneous network
  • Layers C 270 and 280 control consumption management and the transmission management of the complex content.
  • Layer E 220 includes three layers, i.e., MMT E. 3 222 , MMT E. 2 224 , and MMT E. 1 226 .
  • the MMT E. 3 222 generates a fragment, which is a basic unit for the MMT service, based on coded multimedia data provided from the media coding layer 210 .
  • the MMT E. 2 224 generates an Access Unit (AU) for the MMT service by using the fragment generated by the MMT E. 3 222 .
  • the AU is the smallest data unit having a unique presentation time.
  • the MMT E. 1 226 combines or divides the AUs provided by the MMT E. 2 224 to generate a format for generation, storage, and transmission of the complex content.
  • Layer D includes three layers, i.e., MMT D. 1 232 , MMT D. 2 234 , and MMT D. 3 290 .
  • the MMT D. 1 232 operates with an Application Protocol (AP) functioning similarly to the RTP or the HTTP
  • the MMT D. 2 234 operates with a network layer protocol functioning similarly to the UDP or the TCP
  • the MMT D. 3 290 controls optimization between the layers included in Layer E 220 and the layers included in Layer D 230 .
  • Layer C includes two layers, i.e., MMT C. 1 270 and MMT C. 2 280 .
  • the MMT C. 1 270 provides information related to the generation and the consumption of the complex content
  • the MMT C. 2 280 provides information related to the transmission of the complex content.
  • FIG. 3 illustrates a conventional data transmission layer for a broadcast system.
  • Layer E in a transmission side stores elements of the content, such as video and audio, encoded into Network Abstraction Layer (NAL) units, fragment units, etc., by a codec encoder, such as an Advanced Video Codec (AVC) or a Scalable Video Codec (SVC) encoder, in units of AUs in Layer E 3 , which is the top-level layer, and transmits the stored elements in units of AUs to Layer E 2 , which is a lower layer.
  • Layer E 2 structuralizes a plurality of AUs, encapsulates the structuralized AUs based on Layer E 2 units, stores the encapsulated AUs in the unit of Elementary Streams (ES), and transmits the stored AUs to Layer E 1 , which is a next lower layer.
  • Layer E 1 instructs a relation and a construction of the elements of the content, such as the video and audio, encapsulates the elements together with the ES, and transmits the encapsulated elements to Layer D 1 in units of packages.
  • Layer D 1 divides a received package into a form suitable for transmission and passes the divided packets to a lower layer, which then transmits the packets to a next lower layer.
  • Layer D in a reception side collects the packets transmitted from the transmission side to configure the collected packets to the package of Layer E 1 .
  • a receiver recognizes elements of the content within the package, a relation between the elements of the content, and information on construction of the elements of the content, to transfer the recognized information to a content element relation/construction processor and a content element processor.
  • the content relation/construction processor transfers the respective elements for the proper reproduction of the entire content to the content element processor, and the content element processor controls elements to be reproduced at a set time and displayed at a set position on a screen.
  • a conventional Layer E 2 technique provides only the AU itself or information on a processing time for the AU reproduction, e.g., a Decoding Time Stamp (DTS) or a Composition Time Stamp (CTS) and a Random Access Point (RAP). Accordingly, the utilization of the conventional Layer E 2 technique is limited.
  • the present invention is designed to address at least the above-described problems and/or disadvantages occurring in the prior art, and to provide at least the advantages described below.
  • An aspect of the present invention is to provide a method of configuring AUs to a data unit for efficient reproduction of the AUs in Layer E 2 .
  • According to an aspect of the present invention, a method is provided for receiving a media processing unit (MPU) including a data part and a control part, the MPU being processed independently, wherein the data part includes media data and the control part includes parameters related to the media data; and processing the received MPU, wherein the MPU comprises at least one fragmentation unit, wherein the parameters comprise a first parameter indicating a sequence number of the MPU, and wherein the sequence number of the MPU is unique to where the MPU belongs.
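  • The receive-and-process flow above can be sketched as a simple data model. This is a hypothetical illustration only, not the standard's actual syntax; the names `ControlPart`, `MPU`, and `process_mpu` are invented for this sketch:

```python
from dataclasses import dataclass, field

@dataclass
class ControlPart:
    # The first parameter is the MPU sequence number, which is unique
    # within whatever scope (e.g., stream or package) the MPU belongs to.
    sequence_number: int
    other_params: dict = field(default_factory=dict)

@dataclass
class MPU:
    control: ControlPart        # parameters related to the media data
    data: bytes                 # the media data itself
    fragmentation_units: tuple  # an MPU comprises at least one fragmentation unit

def process_mpu(mpu: MPU) -> int:
    """Process one independently processable MPU; return its sequence number."""
    if not mpu.fragmentation_units:
        raise ValueError("an MPU must comprise at least one fragmentation unit")
    # ... decode mpu.data according to the parameters in mpu.control ...
    return mpu.control.sequence_number
```

  • The sketch only captures the structural claim: a data part, a control part carrying the sequence-number parameter, and at least one fragmentation unit per MPU.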
  • FIG. 1 is a block diagram illustrating a layer structure for a conventional MPEG-2 TS
  • FIG. 2 is a block diagram illustrating an MMT service by a broadcast system based on a conventional MMT standard
  • FIG. 3 illustrates a conventional data transmission layer diagram in a broadcast system
  • FIG. 4 illustrates a conventional reproduction flow of a DU configured through encapsulation of AUs one by one
  • FIG. 5 illustrates a conventional process of receiving and reproducing a Data Unit (DU);
  • FIG. 6 illustrates a process of receiving and reproducing a DU according to an embodiment of the present invention
  • FIG. 7A illustrates a construction of conventional AUs
  • FIG. 7B illustrates a construction of AUs according to an embodiment of the present invention
  • FIGS. 8A and 8B are diagrams illustrating a comparison of a temporal scalability according to a construction of AUs within a DU;
  • FIGS. 9A and 9B are diagrams illustrating a comparison of an Application Layer-Forward Error Correction (AL-FEC) according to a construction of AUs within a DU; and
  • FIG. 10 illustrates a construction of a DU according to an embodiment of the present invention.
  • a method for configuring DUs by grouping a plurality of AUs.
  • the DUs are continuously concatenated to become Elementary Streams (ES), which become data transmitted from Layer E 2 to Layer E 1 .
  • a DU is configured by encapsulating the AUs one by one, a DTS and a CTS are granted to each AU, and a picture type (Intra (I)-picture, Bidirectionally Predictive (B)-picture, or Predictive (P)-picture) of a corresponding AU is expressed in each AU or whether a corresponding AU is a RAP is displayed.
  • FIGS. 4 and 5 illustrate a reproduction flow of a conventional DU configured by encapsulating the AUs one by one, and
  • FIG. 6 illustrates a reproduction flow of a DU configured with a plurality of AUs according to an embodiment of the present invention.
  • a receiver searches for the RAP, i.e., a DU in a type of I-picture, by continuously examining subsequent concatenated DUs ( 402 ), such that it is possible to initiate the reproduction of the DU ( 403 ).
  • According to an embodiment of the present invention, a DU is provided by grouping a plurality of AUs, e.g., by configuring the DU in units of Groups Of Pictures (GOPs), instead of generating a DU for each individual AU.
  • all DUs may be independently reproduced, without having to wait until a next DU is decoded, eliminating a complex buffer control requirement.
  • the reproduction of the DU from a time ( 601 ) instructed in Layer E 1 does not require an inverse-directional search of the DUs ( 602 through 604 ).
  • a DU may be configured with a plurality of GOP units.
  • the I-pictures, the P-pictures, and the B-pictures are separately grouped, and the respective data may be stored separately in three different places.
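  • The grouping described above can be illustrated with a short sketch. The function name and the (picture type, payload) representation of an AU are invented for illustration:

```python
from collections import OrderedDict

def group_aus_by_picture_type(aus):
    """Split a sequence of (picture_type, payload) AUs into separate
    I-, P-, and B-picture groups, as when a DU stores each picture
    type in its own region (e.g. an IPBIPB stream rearranged so that
    all I-pictures, all P-pictures, and all B-pictures sit together)."""
    groups = OrderedDict((t, []) for t in ("I", "P", "B"))
    for picture_type, payload in aus:
        groups[picture_type].append(payload)
    return groups

# Two GOPs' worth of AUs in decoding order.
gop = [("I", b"au0"), ("P", b"au1"), ("B", b"au2"), ("B", b"au3"),
       ("I", b"au4"), ("P", b"au5"), ("B", b"au6"), ("B", b"au7")]
grouped = group_aus_by_picture_type(gop)
# grouped["I"] now holds both I-picture AUs, grouped["B"] all four B-picture AUs.
```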
  • FIG. 7A illustrates a construction of a conventional AU
  • FIG. 7B illustrates a construction of an AU according to an embodiment of the present invention.
  • In a transmission system utilizing an error correction method such as the AL-FEC, grouping the AUs according to a property and storing them in the DU is also helpful for reducing the transmission overhead due to the AL-FEC.
  • FIGS. 8A and 8B are diagrams illustrating a comparison of a temporal scalability according to a construction of AUs within a DU between a conventional art and an embodiment of the present invention.
  • Referring to FIGS. 8A and 8B , when the transmission of the DU is interrupted or an error is generated during the transmission of the DU, in FIG. 8A , it is impossible to view content after 8 seconds. However, in FIG. 8B , it is possible to view content for up to 14 seconds, although with a lower temporal scalability.
  • FIGS. 9A and 9B are diagrams illustrating a comparison of an AL-FEC according to a construction of AUs within a DU between the conventional art and an embodiment of the present invention.
  • When the AUs are arranged according to picture type, because the AUs of the I-pictures and P-pictures affect the picture quality, it is sufficient to apply the AL-FEC only to the AUs of the I-pictures and P-pictures, as indicated by a thick line in FIG. 9B . Accordingly, the overhead of the AL-FEC is decreased over the remaining durations, i.e., the AUs of the B-pictures.
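  • The saving can be sketched numerically. The AU sizes and the 25% repair rate below are invented illustrative figures, not values from the specification:

```python
def fec_overhead_bytes(au_sizes, protect, rate=0.25):
    """Approximate AL-FEC repair overhead when only AUs whose picture
    type is in `protect` receive repair symbols, at an assumed rate."""
    return sum(int(size * rate) for ptype, size in au_sizes if ptype in protect)

# One grouped DU: I- and P-picture AUs first, then B-picture AUs.
aus = [("I", 4000), ("P", 2000), ("P", 2000),
       ("B", 1000), ("B", 1000), ("B", 1000)]

full = fec_overhead_bytes(aus, {"I", "P", "B"})   # protect everything
selective = fec_overhead_bytes(aus, {"I", "P"})   # protect only I/P, as in FIG. 9B
# Because the grouped arrangement keeps the I/P AUs contiguous, FEC can be
# applied to one contiguous span, and the B-picture AUs carry no FEC overhead.
```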
  • FIG. 10 illustrates a construction of a DU according to an embodiment of the present invention.
  • the DU includes a header 1001 and a set of AUs 1002 included in a GOP or a plurality of GOPs.
  • the header 1001 includes a DU description 1010 , which includes information on the DU, an AU structure description 1020 , which includes information on a construction of the AUs 1002 , and AU information 1030 , which includes information on each AU.
  • the DU description 1010 may include the following information.
  • Length 1011 This information represents a size of the DU, i.e., a value obtained by adding the size of the remaining DU header after the corresponding field and the size of the payload.
  • Length 1011 may be represented in units of bytes.
  • Sequence Number 1012 This information represents a sequence of a corresponding DU within the ES. Omission or duplicate reception between a plurality of continuous DUs may be identified using the sequence number 1012 . When an increase of sequence numbers between a previous DU and a continuously received DU exceeds “1”, this indicates that an error is generated in the transmission of the DU.
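  • The omission/duplication check based on the Sequence Number 1012 might look like the following sketch (the function name is invented, and sequence-number wrap-around is ignored for brevity):

```python
def check_sequence(prev_seq, curr_seq):
    """Classify a newly received DU relative to the previously received
    DU using the Sequence Number field of the DU description."""
    delta = curr_seq - prev_seq
    if delta == 1:
        return "in-order"              # normal continuation of the ES
    if delta == 0:
        return "duplicate"             # duplicate reception of the same DU
    if delta > 1:
        # An increase exceeding 1 indicates an error in transmission.
        return f"lost {delta - 1} DU(s)"
    return "reordered"                 # sequence number went backwards
```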
  • Type of AU 1013 This information represents a type of AU included in the DU.
  • the AU may be generally classified into “timed data” or “non-timed data”, expressed with “0” or “1”, respectively.
  • the non-timed data corresponds to general data, such as a picture or a file.
  • Decoding Time of DU 1014 This information represents a time to start decoding a first AU of the DU, as a representative value.
  • Duration of DU 1015 This information represents a temporal length of the DU. A value obtained by adding a duration to the CTS of the first AU of the DU is the same as the time of termination of the reproduction of the finally decoded AU of the DU.
  • Error Correction Code of DU 1016 For example, a Cyclic Redundancy Check (CRC), a parity bit, etc., may be used as a code for error correction.
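  • A minimal pack/parse sketch of the DU description 1010 fields, assuming illustrative field widths (the patent does not fix them) and using a CRC-32 over the payload as the Error Correction Code of DU 1016:

```python
import struct
import zlib

# Assumed, illustrative layout (big-endian): Length u32, Sequence Number u32,
# Type of AU u8 (0 = timed data, 1 = non-timed data), Decoding Time of DU u32,
# Duration of DU u32, Error Correction Code (CRC-32 of the payload) u32.
DU_DESC = struct.Struct(">IIBII I".replace(" ", ""))

def build_du(seq, au_type, decoding_time, duration, payload):
    crc = zlib.crc32(payload)
    # Length counts the remaining header after the Length field plus the payload.
    length = (DU_DESC.size - 4) + len(payload)
    return DU_DESC.pack(length, seq, au_type, decoding_time, duration, crc) + payload

def parse_du(buf):
    length, seq, au_type, dts, dur, crc = DU_DESC.unpack_from(buf)
    payload = buf[DU_DESC.size:]
    if zlib.crc32(payload) != crc:
        raise ValueError("transmission error detected by error correction code")
    return {"length": length, "seq": seq, "au_type": au_type,
            "decoding_time": dts, "duration": dur, "payload": payload}
```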
  • an AU structure description 1020 may include the following information.
  • Number of AUs 1021 This information represents the number of AUs within the DU.
  • Pattern of AUs 1022 This information represents a structure and an arrangement pattern of AUs.
  • the Pattern of AUs 1022 may be indicated with values 0: open GOP, 1: closed GOP, 2: IPBIPB, 4: IIPPBB, 6: Unknown, or 8: reserved.
  • Each bit value is combined with the others through an OR calculation for use.
  • For example, the value for an IPBIPB construction of a closed GOP is 1|2=3.
  • Open GOP, represented by "0", indicates that the GOP is an open GOP.
  • Closed GOP, represented by "1", indicates that the GOP is a closed GOP. Definitions of the open GOP and the closed GOP are the same as those of the conventional art.
  • IPBIPB, represented by "2", represents when I-pictures, P-pictures, and B-pictures are collected based on each group and repeated at least two times within the DU, e.g., IPBBIPBB or IPPBBBBIPPBBBB.
  • IIPPBB, represented by "4", represents when I-pictures, P-pictures, and B-pictures are collected based on each group and repeated only one time within the DU, e.g., IIPPBBBB or IIPPPPBBBBBBBB.
  • Reserved represents a value reserved for later use.
  • Size of Patterns 1023 This information represents a size of each duration of a repeated pattern. For example, when pattern IPBIPB is actually configured as IPPBBBBIPPBBBB, lengths of duration I, duration PP, and duration BBBB are added to be represented as three values in units of bytes.
  • the size of the pattern may be expressed as a sequence of byte-length values, one value per duration of the repeated pattern.
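  • For a single repetition of the grouped pattern (e.g. IPPBBBB), the duration byte-sizes could be computed as in this sketch; the AU sizes and names are invented for illustration:

```python
from itertools import groupby

def pattern_duration_sizes(aus):
    """Given AUs already arranged in grouped pattern order (e.g.
    I, P, P, B, B, B, B for pattern IPB), return the total byte length
    of each run of same-type AUs -- one 'Size of Patterns' value per
    duration of the pattern."""
    return [sum(len(payload) for _, payload in run)
            for _, run in groupby(aus, key=lambda au: au[0])]

# Pattern IPB actually configured as IPPBBBB: durations I, PP, BBBB.
aus = [("I", b"\x00" * 500),
       ("P", b"\x00" * 200), ("P", b"\x00" * 300),
       ("B", b"\x00" * 100), ("B", b"\x00" * 100),
       ("B", b"\x00" * 100), ("B", b"\x00" * 100)]
sizes = pattern_duration_sizes(aus)  # three values in units of bytes
```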
  • the AU information 1030 may include the following information.
  • a value of the Independent and Disposable AUs 1036 may be "1", "2", or "4".

Abstract

A method and an apparatus are provided for configuring content in a broadcast system. The method includes receiving a media processing unit (MPU) including a data part and a control part, the MPU being processed independently, wherein the data part includes media data and the control part includes parameters related to the media data; and processing the received MPU, wherein the MPU comprises at least one fragmentation unit, wherein the parameters comprise a first parameter indicating a sequence number of the MPU, and wherein the sequence number of the MPU is unique to where the MPU belongs.

Description

    PRIORITY
  • This application is a Continuation Application of U.S. patent application Ser. No. 13/421,375, filed on Mar. 15, 2012, and claims priority under 35 U.S.C. § 119(a) to Korean Patent Application Serial No. 10-2011-0023578, which was filed in the Korean Industrial Property Office on Mar. 16, 2011, the entire content of which is incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates generally to a method and an apparatus for configuring content in a broadcast system, and more particularly, to a method and an apparatus for configuring a data unit of content in a broadcast system supporting multimedia services based on an Internet Protocol (IP).
  • 2. Description of the Related Art
  • A conventional broadcast network generally uses the Moving Picture Experts Group-2 Transport Stream (MPEG-2 TS) for transmission of multimedia content. The MPEG-2 TS is a representative transmission technique that allows a plurality of broadcast programs (a plurality of encoded video bit streams) to transmit multiplexed bit streams in a transmission environment having errors. For example, the MPEG-2 TS is appropriately used in digital TeleVision (TV) broadcasting, etc.
  • FIG. 1 illustrates a layer structure supporting a conventional MPEG-2 TS.
  • Referring to FIG. 1, the conventional MPEG-2 TS layer includes a media coding layer 110, a sync (synchronization) layer 120, a delivery layer 130, a network layer 140, a data link layer 150, and a physical layer 160. The media coding layer 110 and the sync layer 120 configure media data to a format usable for recording or transmission. The delivery layer 130, the network layer 140, the data link layer 150, and the physical layer 160 configure a multimedia frame for recording or transmitting a data block having the format configured by the sync layer 120 in/to a separate recording medium. The configured multimedia frame is transmitted to a subscriber terminal, etc., through a predetermined network.
  • Accordingly, the sync layer 120 includes a fragment block 122 and an access unit 124, and the delivery layer 130 includes an MPEG-2 TS/MPEG-4 (MP4) Real-time Transport Protocol (RTP) Payload Format/File delivery over unidirectional transport (FLUTE) 132 block, an RTP/HyperText Transfer Protocol (HTTP) block 134, and a User Datagram Protocol (UDP)/Transmission Control Protocol (TCP) block 136.
  • However, the MPEG-2 TS has several limitations in supporting multimedia services. Specifically, the MPEG-2 TS suffers from inefficient transmission due to unidirectional communication and a fixed frame size, and from unnecessary overhead due to the usage of a transport protocol and an IP specialized for audio/video data, etc.
  • Accordingly, the MPEG Media Transport (MMT) standard has been proposed by MPEG in order to overcome the above-described limitations of the MPEG-2 TS.
  • For example, the MMT standard may be applied for the efficient transmission of complex content through heterogeneous networks. Here, the complex content includes a set of content having multimedia factors by a video/audio application, etc. The heterogeneous networks include networks in which a broadcast network and a communication network coexist.
  • In addition, the MMT standard attempts to define a transmission technique that is friendlier to IP, which is a basic technique in transmission networks for the multimedia services.
  • Accordingly, the MMT standard attempts to provide efficient MPEG transmission techniques in a multimedia service environment that changes based on IP, and in this respect, standardization and continuous research on the MMT standard have progressed.
  • FIG. 2 illustrates a conventional layer structure of an MMT system for transmission of a multimedia frame according to multi-service/content through heterogeneous networks.
  • Referring to FIG. 2, an MMT system for configuring and transmitting a multimedia frame includes a media coding layer 210, an encapsulation layer (Layer E) 220, delivery layers (Layer D) 230 and 290, a network layer 240, a data link layer 250, a physical layer 260, and control layers (Layer C) 270 and 280. The layers include three technique areas, Layer E 220, Layers D 230 and 290, and Layers C 270 and 280. Layer E 220 controls complex content generation, Layers D 230 and 290 control the transmission of the generated complex content through the heterogeneous network, and Layers C 270 and 280 control consumption management and the transmission management of the complex content.
  • Layer E 220 includes three layers, i.e., MMT E.3 222, MMT E.2 224, and MMT E.1 226. The MMT E.3 222 generates a fragment, which is a basic unit for the MMT service, based on coded multimedia data provided from the media coding layer 210. The MMT E.2 224 generates an Access Unit (AU) for the MMT service by using the fragment generated by the MMT E.3 222. The AU is the smallest data unit having a unique presentation time. The MMT E.1 226 combines or divides the AUs provided by the MMT E.2 224 to generate a format for generation, storage, and transmission of the complex content.
  • Layer D includes three layers, i.e., MMT D.1 232, MMT D.2 234, and MMT D.3 290. The MMT D.1 232 operates with an Application Protocol (AP) functioning similarly to the RTP or the HTTP, the MMT D.2 234 operates with a network layer protocol functioning similarly to the UDP or the TCP, and the MMT D.3 290 controls optimization between the layers included in Layer E 220 and the layers included in Layer D 230.
  • Layer C includes two layers, i.e., MMT C.1 270 and MMT C.2 280. The MMT C.1 270 provides information related to the generation and the consumption of the complex content, and the MMT C.2 280 provides information related to the transmission of the complex content.
  • FIG. 3 illustrates a conventional data transmission layer for a broadcast system.
  • Referring to FIG. 3, Layer E on the transmission side stores elements of the content, such as video and audio, encoded into Network Abstraction Layer (NAL) units, fragment units, etc., by a codec encoder, such as Advanced Video Coding (AVC) or Scalable Video Coding (SVC), in units of AUs in Layer E3, which is the top-level layer, and transmits the stored elements in units of AUs to Layer E2, which is a lower layer.
  • In the conventional technique, a definition and a construction of the AU transmitted from Layer E3 to Layer E2 depend on a codec.
  • Layer E2 structures a plurality of AUs, encapsulates the structured AUs in Layer E2 units, stores the encapsulated AUs in units of Elementary Streams (ES), and transmits the stored AUs to Layer E1, which is the next lower layer. Layer E1 describes the relation and construction of the elements of the content, such as the video and audio, encapsulates this description together with the ES, and transmits the encapsulated elements to Layer D1 in units of packages.
  • Layer D1 divides a received package into a form suitable for transmission and passes the resulting packets to a lower layer, and each lower layer then transmits the packets to the next lower layer.
  • Layer D on the reception side collects the packets transmitted from the transmission side and reassembles them into the package of Layer E1. A receiver recognizes the elements of the content within the package, the relation between the elements of the content, and information on the construction of the elements of the content, and transfers the recognized information to a content element relation/construction processor and a content element processor. The content element relation/construction processor transfers the respective elements for proper reproduction of the entire content to the content element processor, and the content element processor controls the elements so that they are reproduced at a set time and displayed at a set position on a screen.
  • However, a conventional Layer E2 technique provides only the AU itself or information on a processing time for the AU reproduction, e.g., a Decoding Time Stamp (DTS) or a Composition Time Stamp (CTS) and a Random Access Point (RAP). Accordingly, the utilization of the conventional Layer E2 technique is limited.
  • SUMMARY OF THE INVENTION
  • Accordingly, the present invention is designed to address at least the above-described problems and/or disadvantages occurring in the prior art, and to provide at least the advantages described below.
  • An aspect of the present invention is to provide a method of configuring AUs to a data unit for efficient reproduction of the AUs in Layer E2.
  • In accordance with an aspect of the present invention, a method is provided that includes receiving a media processing unit (MPU) including a data part and a control part, the MPU being processed independently, wherein the data part includes media data and the control part includes parameters related to the media data; and processing the received MPU, wherein the MPU comprises at least one fragmentation unit, wherein the parameters comprise a first parameter indicating a sequence number of the MPU, and wherein the sequence number of the MPU is unique to where the MPU belongs.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects, features, and advantages of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a block diagram illustrating a layer structure for a conventional MPEG-2 TS;
  • FIG. 2 is a block diagram illustrating an MMT service by a broadcast system based on a conventional MMT standard;
  • FIG. 3 illustrates a conventional data transmission layer diagram in a broadcast system;
  • FIG. 4 illustrates a conventional reproduction flow of a DU configured through encapsulation of AUs one by one;
  • FIG. 5 illustrates a conventional process of receiving and reproducing a Data Unit (DU);
  • FIG. 6 illustrates a process of receiving and reproducing a DU according to an embodiment of the present invention;
  • FIG. 7A illustrates a construction of conventional AUs;
  • FIG. 7B illustrates a construction of AUs according to an embodiment of the present invention;
  • FIGS. 8A and 8B are diagrams illustrating a comparison of a temporal scalability according to a construction of AUs within a DU;
  • FIGS. 9A and 9B are diagrams illustrating a comparison of Application Layer Forward Error Correction (AL-FEC) according to a construction of AUs within a DU; and
  • FIG. 10 illustrates a construction of a DU according to an embodiment of the present invention.
  • DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
  • Hereinafter, various embodiments of the present invention will be described with reference to the accompanying drawings in detail. In the following description, a detailed explanation of known related functions and constitutions may be omitted to avoid unnecessarily obscuring the subject matter of the present invention. Further, the terms used in the description are defined considering the functions of the present invention and may vary depending on the intention or usual practice of a user or operator. Therefore, the definitions should be made based on the entire content of the description.
  • In accordance with an embodiment of the present invention, a method is proposed for configuring DUs by grouping a plurality of AUs. The DUs are continuously concatenated to become Elementary Streams (ES), which become data transmitted from Layer E2 to Layer E1.
  • Conventionally, a DU is configured by encapsulating the AUs one by one, a DTS and a CTS are assigned to each AU, and each AU indicates its picture type (Intra (I)-picture, Bidirectionally Predictive (B)-picture, or Predictive (P)-picture) or whether the corresponding AU is a RAP.
  • FIGS. 4 and 5 illustrate a reproduction flow of a conventional DU configured by encapsulating the AUs one by one, and FIG. 6 illustrates a reproduction flow of a DU configured with a plurality of AUs according to an embodiment of the present invention.
  • Referring to FIG. 4, when data begins to be received from the middle of a DU string (401), there is a probability that the corresponding DU is not a RAP, i.e., an I-picture. A receiver therefore searches for a RAP, i.e., a DU of the I-picture type, by continuously examining the subsequently concatenated DUs (402), after which it can initiate reproduction of the DU (403).
  • In accordance with an embodiment of the present invention, instead of generating a DU for each respective AU, a DU is provided by grouping a plurality of AUs, e.g., configuring the DU in units of Groups Of Pictures (GOPs). When the DU is configured in GOP units, every DU may be independently reproduced, without having to wait until a next DU is decoded, eliminating a complex buffer control requirement.
  • Further, as illustrated in FIG. 5, when Layer E1 (501) instructs reproduction while limiting a part of an ES, if the DU merely includes one AU, there is no guarantee that the DU corresponding to the instructed CTS is the I-picture. Therefore, it is necessary for the receiver to search for DUs prior to the corresponding DU in an inverse direction (502), decode the DUs from the I-picture (503), and reproduce the DU (504), in order to reproduce the DU from an instructed time point.
  • However, in accordance with an embodiment of the present invention, as illustrated in FIG. 6, when the DU is configured in a unit of a GOP (as indicated by a dashed line), the reproduction of the DU from a time (601) instructed in Layer E1 does not require an inverse-directional search of the DUs (602 through 604).
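The difference between FIGS. 5 and 6 can be sketched as follows. This is a minimal illustration with hypothetical field and function names (not from the standard): with per-AU DUs the receiver must search backward to the nearest I-picture before it can decode, whereas with GOP-sized DUs the instructed DU itself is a RAP.

```python
# Hypothetical sketch contrasting random access into a stream of per-AU DUs
# (FIG. 5) with a stream of GOP-sized DUs (FIG. 6).

def start_index_per_au(dus, target):
    """Per-AU DUs: search backward from the instructed DU for the nearest
    I-picture (RAP), since decoding must begin there (502, 503)."""
    i = target
    while i > 0 and dus[i]["picture_type"] != "I":
        i -= 1  # inverse-directional search
    return i

def start_index_per_gop(dus, target):
    """GOP-sized DUs: every DU begins at a RAP, so reproduction starts at
    the instructed DU without any backward search (602 through 604)."""
    return target

stream = [{"picture_type": t} for t in "IPBBIPBB"]
assert start_index_per_au(stream, 6) == 4   # back up to the I-picture at index 4
assert start_index_per_gop(stream, 6) == 6  # start immediately
```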
  • In accordance with an embodiment of the present invention, a DU may also be configured with a plurality of GOP units. When the DU is configured with a plurality of GOP units, the I-pictures, the P-pictures, and the B-pictures are separately grouped, and the respective data may be stored separately in three places.
  • FIG. 7A illustrates a construction of a conventional AU, and FIG. 7B illustrates a construction of an AU according to an embodiment of the present invention.
  • As illustrated in FIG. 7B, when the AUs are grouped according to a property and stored in the DU, even if a part of the DU fails to be transmitted, it is possible to realize temporal scalability through a frame drop, etc. Further, because a transmission system utilizing an error correction method, such as AL-FEC, may narrow the recoverable scope by dividing it into a part including the collected I-pictures, a part including the collected P- and B-pictures, etc., grouping the AUs according to a property and storing them in the DU is also helpful for reducing the transmission overhead due to the AL-FEC.
  • FIGS. 8A and 8B are diagrams illustrating a comparison of a temporal scalability according to a construction of AUs within a DU between a conventional art and an embodiment of the present invention.
  • Referring to FIGS. 8A and 8B, when the transmission of the DU is interrupted or an error is generated during the transmission of the DU, in FIG. 8A, it is impossible to view any content after 8 seconds. However, in FIG. 8B, it is possible to view content up to 14 seconds, albeit at a lower temporal resolution.
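The comparison above can be sketched numerically. In this hypothetical example (illustrative presentation times and counts, not the figures' exact values), the same eight AUs are laid out in display order versus grouped by picture type; when transmission is cut off after the first four AUs, the display-order layout loses everything past its cut point, while the type-grouped layout still holds I- and P-pictures that span most of the DU, so playback continues with dropped B-frames.

```python
# Hypothetical sketch of FIGS. 8A/8B. Each tuple is (picture type,
# presentation time in seconds); times are illustrative.

def viewable_until(aus, received):
    """Latest presentation time among the AUs that actually arrived;
    missing B-pictures are simply dropped (a frame drop)."""
    return max(t for _, t in aus[:received])

display_order = [("I", 0), ("B", 2), ("B", 4), ("P", 6),
                 ("B", 8), ("B", 10), ("P", 12), ("B", 14)]
type_grouped = [("I", 0), ("P", 6), ("P", 12), ("B", 2),
                ("B", 4), ("B", 8), ("B", 10), ("B", 14)]

# Transmission interrupted after 4 of the 8 AUs:
assert viewable_until(display_order, 4) == 6   # nothing viewable afterward
assert viewable_until(type_grouped, 4) == 12   # lower frame rate, longer reach
```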
  • FIGS. 9A and 9B are diagrams illustrating a comparison of an AL-FEC according to a construction of AUs within a DU between the conventional art and an embodiment of the present invention.
  • As illustrated in FIG. 9A, when the I-pictures, the P-pictures, and the B-pictures are arranged without any consideration of picture type, it is impossible to identify the construction of the AUs within the DU. Consequently, AL-FEC must be applied over the entire duration.
  • However, in accordance with an embodiment of the present invention, when the AUs are arranged according to picture type, because the AUs of the I-picture and P-picture affect a picture quality, it is sufficient to apply AL-FEC only to the AUs in the I-picture and P-picture, as indicated by a thick line of FIG. 9B. Accordingly, the overhead of the AL-FEC is decreased over the remaining durations, i.e., AUs of the B-pictures.
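The AL-FEC saving can be sketched as below. The sizes and repair rate are hypothetical, chosen only to illustrate the point: with AUs grouped by picture type, repair data need only cover the I- and P-picture durations, because lost B-pictures can be dropped without breaking decoding.

```python
# Sketch of the FIG. 9A vs. FIG. 9B overhead comparison (assumed values).

FEC_RATE = 0.25  # repair overhead as a fraction of protected bytes (assumed)

def fec_overhead(aus, protected_types):
    """Bytes of AL-FEC repair data needed for the protected picture types."""
    protected = sum(size for ptype, size in aus if ptype in protected_types)
    return int(protected * FEC_RATE)

du = [("I", 40_000), ("P", 12_000), ("P", 11_000),
      ("B", 4_000), ("B", 4_000), ("B", 4_000)]

whole_du = fec_overhead(du, {"I", "P", "B"})  # FIG. 9A: protect every duration
selective = fec_overhead(du, {"I", "P"})      # FIG. 9B: thick-line part only
assert selective < whole_du
```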
  • As described above, there are several advantages in configuring the DU in units of one GOP or a plurality of GOPs.
  • FIG. 10 illustrates a construction of a DU according to an embodiment of the present invention.
  • Referring to FIG. 10, the DU includes a header 1001 and a set of AUs 1002 included in a GOP or a plurality of GOPs.
  • The header 1001 includes a DU description 1010, which includes information on the DU, an AU structure description 1020, which includes information on a construction of the AUs 1002, and AU information 1030, which includes information on each AU.
  • For example, the DU description 1010 may include the following information.
  • 1) Length 1011: This information represents the size of the DU, i.e., the sum of the size of the remainder of the DU header and the size of the payload following this field. For example, the Length 1011 may be represented in units of bytes.
  • 2) Sequence Number 1012: This information represents the sequence of the corresponding DU within the ES. Omission or duplicate reception among a plurality of continuous DUs may be identified using the Sequence Number 1012. When the increase in sequence number between a previously received DU and the next received DU exceeds "1", this indicates that an error was generated in the transmission of a DU.
  • 3) Type of AU 1013: This information represents the type of AU included in the DU. For example, an AU may generally be classified as "timed data" or "non-timed data", expressed with "0" or "1", respectively. Timed data, represented by "0", includes the CTS and/or the DTS and corresponds to multimedia elements, such as video data and audio data. Non-timed data, represented by "1", includes no CTS or DTS, and corresponds to general data, such as a picture or a file.
  • 4) Decoding Time of DU 1014: This information represents a time to start decoding a first AU of the DU, as a representative value.
  • 5) Duration of DU 1015: This information represents the temporal length of the DU. The value obtained by adding this duration to the CTS of the first AU of the DU equals the time at which reproduction of the last decoded AU of the DU ends.
  • 6) Error Correction Code of DU 1016: For example, a Cyclic Redundancy Check (CRC), a parity bit, etc., may be used as a code for error correction.
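Of the fields above, the Sequence Number 1012 lends itself to a simple receiver-side check. The sketch below (illustrative function name, not from the standard) flags a jump greater than 1 between consecutively received DUs as a lost DU and a repeated number as duplicate reception:

```python
def check_sequence(prev_seq, seq):
    """Classify the transition between two consecutively received DUs."""
    if seq == prev_seq:
        return "duplicate"   # duplicate reception of the same DU
    if seq - prev_seq > 1:
        return "loss"        # an error occurred: at least one DU was omitted
    return "ok"

assert check_sequence(7, 8) == "ok"
assert check_sequence(7, 7) == "duplicate"
assert check_sequence(7, 10) == "loss"
```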
  • Further, an AU structure description 1020 may include the following information.
  • 1) Number of AUs 1021: This information represents the number of AUs within the DU.
  • 2) Pattern of AUs 1022: This information represents the structure and arrangement pattern of the AUs. For example, the Pattern of AUs 1022 may be indicated with the values 0: open GOP, 1: closed GOP, 2: IPBIPB, 4: IIPPBB, 6: Unknown, or 8: reserved.
  • Each bit value is combined through a bitwise OR operation. For example, the IPBIPB construction within a closed GOP is expressed as 1|2=3.
  • Open GOP, represented by “0”, represents when the GOP is the open GOP. Closed GOP, represented by “1”, represents when the GOP is the closed GOP. Definitions of the open GOP and closed GOP are the same as that of the conventional art.
  • IPBIPB, represented by "2", indicates that the I-pictures, P-pictures, and B-pictures are collected into groups that repeat at least two times within the DU, e.g., IPBBIPBB or IPPBBBBIPPBBBB. IIPPBB, represented by "4", indicates that the I-pictures, P-pictures, and B-pictures are collected into groups that appear only one time within the DU, e.g., IIPPBBBB or IIPPPPBBBBBBBB. Unknown, represented by "6", indicates a failure to identify a pattern, and is used when the order of the AUs is not changed.
  • Reserved, represented by "8", represents a value reserved for later use.
  • 3) Size of Patterns 1023: This information represents the size of each duration of the repeated pattern. For example, when pattern IPBIPB is actually configured as IPPBBBBIPPBBBB, the lengths of duration I, duration PP, and duration BBBB are included as three values in units of bytes.
  • The size of the pattern may be expressed as:
      • for(i=0;i<number_of_patterns;i++){Size of pattern;}
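The Pattern of AUs bit values and their bitwise-OR combination can be sketched as follows. The constant and function names are illustrative; the numeric values follow the list above.

```python
# Sketch of the Pattern of AUs 1022 bit values and their OR combination.

OPEN_GOP, CLOSED_GOP, IPBIPB, IIPPBB, UNKNOWN, RESERVED = 0, 1, 2, 4, 6, 8

def describe_pattern(pattern):
    """Decode a combined Pattern of AUs value into its named parts."""
    parts = ["closed GOP" if pattern & CLOSED_GOP else "open GOP"]
    if pattern & IPBIPB:
        parts.append("IPBIPB")
    if pattern & IIPPBB:
        parts.append("IIPPBB")
    return ", ".join(parts)

# The example from the text: the IPBIPB construction within a closed GOP.
assert CLOSED_GOP | IPBIPB == 3
assert describe_pattern(3) == "closed GOP, IPBIPB"
```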
  • Further, the AU information 1030 may include the following information.
  • 1) DTS of AUs 1031: This information represents the DTS of the AU, and may be expressed as “for(i=0;i<number_of_AUs;i++){Decoding timestamp of AU;}”.
  • 2) CTS of AUs 1032: This information represents the CTS of the AU, and may be expressed as “for(i=0;i<number_of_AUs;i++){Composition timestamp of AU;}”.
  • 3) Size of AUs 1033: This information represents a size of the AU in the unit of bytes, and may be expressed as “for(i=0;i<number_of_AUs;i++){Size of AU;}”.
  • 4) Duration of AUs 1034: This information represents a temporal length of the AU, and may be expressed as “for(i=0;i<number_of_AUs;i++){Duration of AU;}”.
  • 5) AU num of RAP 1035: This information represents the number of each AU that is a RAP, and may be expressed as "for(i=0;i<number_of_RAPs;i++){AU number;}".
  • 6) Independent and disposable AUs 1036: This information represents a relationship between a corresponding AU and a different AU, and may be expressed as “for(i=0;i<number_of_AUs;i++){Independent and disposable value of AU;}”.
  • More specifically, when the corresponding AU is dependent on a different AU, the value of the Independent and Disposable AUs 1036 is "1"; when a different AU refers to the corresponding AU, the value is "2"; and when the corresponding AU and a different AU have duplicated information, the value is "4".
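The per-AU loops above each carry one value per AU (or per RAP), which can be modeled as parallel arrays. The sketch below is a hypothetical model for illustration only; the class and field names are not normative syntax from the standard.

```python
# Hypothetical model of the AU information 1030 part of the DU header.

from dataclasses import dataclass, field
from typing import List

@dataclass
class AUInformation:
    dts: List[int] = field(default_factory=list)        # DTS of AUs 1031
    cts: List[int] = field(default_factory=list)        # CTS of AUs 1032
    sizes: List[int] = field(default_factory=list)      # Size of AUs 1033 (bytes)
    durations: List[int] = field(default_factory=list)  # Duration of AUs 1034
    rap_au_numbers: List[int] = field(default_factory=list)  # AU num of RAP 1035
    ind_disp: List[int] = field(default_factory=list)   # 1: dependent, 2: referenced, 4: duplicated

info = AUInformation(
    dts=[0, 33, 66], cts=[0, 66, 33],
    sizes=[40_000, 4_000, 12_000], durations=[33, 33, 33],
    rap_au_numbers=[0],   # only the first AU is a RAP
    ind_disp=[2, 1, 2],
)

# Consistency with Duration of DU 1015: the CTS of the first AU plus the DU
# duration equals the end of reproduction of the last decoded AU.
assert info.cts[0] + sum(info.durations) == 99
```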
  • While the present invention has been shown and described with reference to certain embodiments and drawings thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (7)

What is claimed is:
1. A method for receiving media data in a broadcast system, the method comprising:
receiving a media processing unit (MPU) including a data part and a control part, the MPU being processed independently, wherein the data part includes media data and the control part includes parameters related to the media data; and
processing the received MPU,
wherein the MPU comprises at least one fragmentation unit,
wherein the parameters comprise a first parameter indicating a sequence number of the MPU, and
wherein the sequence number of the MPU is unique to where the MPU belongs.
2. The method of claim 1, wherein the parameters further comprise a second parameter, based on the second parameter having a first value, the second parameter indicates that the at least one fragmentation unit comprises timed data including timeline information for decoding or presentation of content of the timed data, and based on the second parameter having a second value, the second parameter indicates that the at least one fragmentation unit comprises non-timed data, which does not include the timeline information for decoding or the presentation of the content of the non-timed data.
3. The method of claim 1, wherein at least one packet received through the MPU includes information indicating if the at least one packet includes at least one random access point (RAP).
4. The method of claim 1, wherein, according to the at least one fragmentation unit comprising the timed data, a first positioned fragmentation unit in the MPU is a start position enabling a playback of the media data to be started.
5. The method of claim 1, wherein the media data in the data part corresponds to at least one group of pictures.
6. The method of claim 1, wherein the control part includes information on a decoding order of each of the at least one fragmentation unit.
7. The method of claim 1, wherein if the at least one fragmentation unit comprises timed data, the control part comprises information on a presentation duration of each of the at least one fragmentation unit and a presentation order of each of at least one fragmentation unit.
US16/588,417 2011-03-16 2019-09-30 Method and apparatus for configuring content in a broadcast system Abandoned US20200029130A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/588,417 US20200029130A1 (en) 2011-03-16 2019-09-30 Method and apparatus for configuring content in a broadcast system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR1020110023578A KR101803970B1 (en) 2011-03-16 2011-03-16 Method and apparatus for composing content
KR10-2011-0023578 2011-03-16
US13/421,375 US10433024B2 (en) 2011-03-16 2012-03-15 Method and apparatus for configuring content in a broadcast system
US16/588,417 US20200029130A1 (en) 2011-03-16 2019-09-30 Method and apparatus for configuring content in a broadcast system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/421,375 Continuation US10433024B2 (en) 2011-03-16 2012-03-15 Method and apparatus for configuring content in a broadcast system

Publications (1)

Publication Number Publication Date
US20200029130A1 true US20200029130A1 (en) 2020-01-23

Family

ID=46829540

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/421,375 Active US10433024B2 (en) 2011-03-16 2012-03-15 Method and apparatus for configuring content in a broadcast system
US16/588,417 Abandoned US20200029130A1 (en) 2011-03-16 2019-09-30 Method and apparatus for configuring content in a broadcast system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/421,375 Active US10433024B2 (en) 2011-03-16 2012-03-15 Method and apparatus for configuring content in a broadcast system

Country Status (4)

Country Link
US (2) US10433024B2 (en)
EP (1) EP2687013A4 (en)
KR (1) KR101803970B1 (en)
WO (1) WO2012125001A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190075334A1 (en) * 2014-01-17 2019-03-07 Saturn Licensing Llc Communication apparatus, communication data generation method, and communication data processing method

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101903443B1 (en) 2012-02-02 2018-10-02 삼성전자주식회사 Apparatus and method for transmitting/receiving scene composition information
KR20130090824A (en) * 2012-02-06 2013-08-14 한국전자통신연구원 Mmt asset structures, structing methods and structing apparatuses supporting random access for a system transporting coded media data in heterogeneous ip network
US9071853B2 (en) * 2012-08-31 2015-06-30 Google Technology Holdings LLC Broadcast content to HTTP client conversion
KR102163338B1 (en) * 2012-10-11 2020-10-08 삼성전자주식회사 Apparatus and method for transmitting and receiving packet in a broadcasting and communication system
ES2726350T3 (en) 2012-10-11 2019-10-03 Samsung Electronics Co Ltd Media data reception procedure
US10015486B2 (en) * 2012-10-26 2018-07-03 Intel Corporation Enhanced video decoding with application layer forward error correction
US11290510B2 (en) * 2012-11-29 2022-03-29 Samsung Electronics Co., Ltd. Method and apparatus for encapsulation of motion picture experts group media transport assets in international organization for standardization base media files
KR101484843B1 (en) 2013-04-19 2015-01-20 삼성전자주식회사 A method and apparatus for transmitting a media transport packet in a multimedia transport system
JP5788622B2 (en) * 2013-06-05 2015-10-07 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America REPRODUCTION METHOD, REPRODUCTION DEVICE AND GENERATION METHOD, GENERATION DEVICE
JP2015015706A (en) * 2013-07-03 2015-01-22 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Data transmission method, data reproduction method, data transmitter and data reproducer
EP3036900B1 (en) 2013-08-19 2023-06-28 LG Electronics Inc. Apparatus for receiving broadcast signals, method for transmitting broadcast signals and method for receiving broadcast signals
US9537779B2 (en) 2013-10-11 2017-01-03 Huawei Technologies Co., Ltd. System and method for real-time traffic delivery
JP6652320B2 (en) * 2013-12-16 2020-02-19 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Transmission method, reception method, transmission device, and reception device
EP3703379B1 (en) 2013-12-16 2022-06-22 Panasonic Intellectual Property Corporation of America Transmission method, reception method, transmitting device, and receiving device
US9218848B1 (en) * 2014-07-01 2015-12-22 Amazon Technologies, Inc. Restructuring video streams to support random access playback
CA2968855C (en) * 2014-11-25 2021-08-24 Arris Enterprises Llc Filler detection during trickplay
US11051026B2 (en) * 2015-08-31 2021-06-29 Intel Corporation Method and system of frame re-ordering for video coding
US10142707B2 (en) * 2016-02-25 2018-11-27 Cyberlink Corp. Systems and methods for video streaming based on conversion of a target key frame

Family Cites Families (95)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6434319B1 (en) * 1994-01-19 2002-08-13 Thomson Licensing S.A. Digital video tape recorder for digital HDTV
US5809201A (en) * 1994-06-24 1998-09-15 Mitsubishi Denki Kabushiki Kaisha Specially formatted optical disk and method of playback
US6009236A (en) * 1994-09-26 1999-12-28 Mitsubishi Denki Kabushiki Kaisha Digital video signal record and playback device and method for giving priority to a center of an I frame
US6064794A (en) * 1995-03-30 2000-05-16 Thomson Licensing S.A. Trick-play control for pre-encoded video
US6138147A (en) * 1995-07-14 2000-10-24 Oracle Corporation Method and apparatus for implementing seamless playback of continuous media feeds
US5926610A (en) * 1995-11-15 1999-07-20 Sony Corporation Video data processing method, video data processing apparatus and video data recording and reproducing apparatus
EP0966823B1 (en) * 1997-10-17 2006-03-29 Koninklijke Philips Electronics N.V. Method of encapsulation of data into transport packets of constant size
CA2265089C (en) * 1998-03-10 2007-07-10 Sony Corporation Transcoding system using encoding history information
US6483543B1 (en) * 1998-07-27 2002-11-19 Cisco Technology, Inc. System and method for transcoding multiple channels of compressed video streams using a self-contained data unit
US8290351B2 (en) * 2001-04-03 2012-10-16 Prime Research Alliance E., Inc. Alternative advertising in prerecorded media
KR100420740B1 (en) * 1999-02-05 2004-03-02 소니 가부시끼 가이샤 Encoding device, encoding method, decoding device, decoding method, coding system and coding method
US7096487B1 (en) * 1999-10-27 2006-08-22 Sedna Patent Services, Llc Apparatus and method for combining realtime and non-realtime encoded content
JP4362914B2 (en) * 1999-12-22 2009-11-11 ソニー株式会社 Information providing apparatus, information using apparatus, information providing system, information providing method, information using method, and recording medium
KR20020020957A (en) * 2000-06-06 2002-03-16 요트.게.아. 롤페즈 Interactive processing system
JP4361674B2 (en) * 2000-06-26 2009-11-11 パナソニック株式会社 Playback apparatus and computer-readable recording medium
KR100640921B1 (en) 2000-06-29 2006-11-02 엘지전자 주식회사 Method for Generating and Transmitting Protocol Data Unit
US6871006B1 (en) * 2000-06-30 2005-03-22 Emc Corporation Processing of MPEG encoded video for trick mode operation
US6816194B2 (en) * 2000-07-11 2004-11-09 Microsoft Corporation Systems and methods with error resilience in enhancement layer bitstream of scalable video coding
US7110452B2 (en) * 2001-03-05 2006-09-19 Intervideo, Inc. Systems and methods for detecting scene changes in a video data stream
US20030046429A1 (en) * 2001-08-30 2003-03-06 Sonksen Bradley Stephen Static data item processing
US20030185299A1 (en) * 2001-11-30 2003-10-02 Taro Takita Program, recording medium, and image encoding apparatus and method
US20090282444A1 (en) * 2001-12-04 2009-11-12 Vixs Systems, Inc. System and method for managing the presentation of video
FI114527B (en) * 2002-01-23 2004-10-29 Nokia Corp Grouping of picture frames in video encoding
KR100931915B1 (en) * 2002-01-23 2009-12-15 노키아 코포레이션 Grouping of Image Frames in Video Coding
JP2005527138A (en) * 2002-03-08 2005-09-08 フランス テレコム Dependent data stream transmission method
JP4281309B2 (en) * 2002-08-23 2009-06-17 ソニー株式会社 Image processing apparatus, image processing method, image frame data storage medium, and computer program
AU2003264414A1 (en) * 2002-09-12 2004-04-30 Matsushita Electric Industrial Co., Ltd. Recording medium, reproduction device, program, reproduction method, and recording method
KR100488804B1 (en) * 2002-10-07 2005-05-12 한국전자통신연구원 System for data processing of 2-view 3dimention moving picture being based on MPEG-4 and method thereof
US7409702B2 (en) * 2003-03-20 2008-08-05 Sony Corporation Auxiliary program association table
US7313236B2 (en) * 2003-04-09 2007-12-25 International Business Machines Corporation Methods and apparatus for secure and adaptive delivery of multimedia content
US7567584B2 (en) * 2004-01-15 2009-07-28 Panasonic Corporation Multiplex scheme conversion apparatus
WO2005071970A1 (en) * 2004-01-16 2005-08-04 General Instrument Corporation Method and apparatus for determining timing information from a bit stream
US7586924B2 (en) * 2004-02-27 2009-09-08 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Apparatus and method for coding an information signal into a data stream, converting the data stream and decoding the data stream
JP2005277591A (en) * 2004-03-23 2005-10-06 Toshiba Corp Electronic camera apparatus and imaging signal generating method
JP2005285209A (en) * 2004-03-29 2005-10-13 Toshiba Corp Metadata of moving image
WO2005101828A1 (en) * 2004-04-16 2005-10-27 Matsushita Electric Industrial Co., Ltd. Recording medium, reproduction device, program
PL2207183T3 (en) * 2004-04-28 2012-09-28 Panasonic Corp Moving picture stream generation apparatus, moving picture coding apparatus, moving picture multiplexing apparatus and moving picture decoding apparatus
CN101677382B (en) * 2004-04-28 2013-01-09 松下电器产业株式会社 Stream generation apparatus, stream generation method, coding apparatus, coding method, recording medium and program thereof
TW200952462A (en) * 2004-06-02 2009-12-16 Panasonic Corp Seamless switching between random access units multiplexed in a multi angle view multimedia stream
KR101134220B1 (en) * 2004-06-02 2012-04-09 파나소닉 주식회사 Picture coding apparatus and picture decoding apparatus
JP4608953B2 (en) * 2004-06-07 2011-01-12 ソニー株式会社 Data recording apparatus, method and program, data reproducing apparatus, method and program, and recording medium
JP4575129B2 (en) * 2004-12-02 2010-11-04 ソニー株式会社 DATA PROCESSING DEVICE, DATA PROCESSING METHOD, PROGRAM, AND PROGRAM RECORDING MEDIUM
KR100665102B1 (en) * 2004-12-03 2007-01-04 한국전자통신연구원 Method for controlling video encoding bit rate considering transport packet length, and video coding Apparatus using it
KR100651486B1 (en) 2004-12-07 2006-11-29 삼성전자주식회사 Apparatus and Method for transporting MPEG contents through Internet Protocol Network
JP4769717B2 (en) * 2005-01-17 2011-09-07 パナソニック株式会社 Image decoding method
US7848408B2 (en) 2005-01-28 2010-12-07 Broadcom Corporation Method and system for parameter generation for digital noise reduction based on bitstream properties
JP4261508B2 (en) * 2005-04-11 2009-04-30 株式会社東芝 Video decoding device
JP4374548B2 (en) * 2005-04-15 2009-12-02 ソニー株式会社 Decoding device and method, recording medium, and program
WO2006115060A1 (en) * 2005-04-22 2006-11-02 Sony Corporation Recording device, recording method, reproducing device, reproducing method, program, and recording medium
KR20070122577A (en) * 2005-04-26 2007-12-31 Koninklijke Philips Electronics N.V. A device for and method of processing a data stream having a sequence of packets and timing information related to the packets
EP1725036A1 (en) 2005-05-20 2006-11-22 Thomson Licensing A method and a video server for embedding audiovisual packets in an IP packet
US8055783B2 (en) 2005-08-22 2011-11-08 Utc Fire & Security Americas Corporation, Inc. Systems and methods for media stream processing
US8712169B2 (en) * 2005-08-26 2014-04-29 Thomson Licensing Transcoded images for improved trick play
FR2898754B1 (en) 2006-03-17 2008-06-13 Thales Sa METHOD FOR PROTECTING MULTIMEDIA DATA USING ADDITIONAL NETWORK ABSTRACTION LAYERS (NAL)
EP1845685B1 (en) 2006-04-11 2012-06-27 Alcatel Lucent Optimised transmission of content IP packets by adding to the IP packets content-related information
US9432433B2 (en) * 2006-06-09 2016-08-30 Qualcomm Incorporated Enhanced block-request streaming system using signaling or block creation
JP4207981B2 (en) * 2006-06-13 2009-01-14 ソニー株式会社 Information processing apparatus, information processing method, program, and recording medium
US7746882B2 (en) * 2006-08-22 2010-06-29 Nokia Corporation Method and device for assembling forward error correction frames in multimedia streaming
US8392595B2 (en) * 2006-09-15 2013-03-05 France Telecom Method and device for adapting a scalable data stream, corresponding computer program product and network element
CN101528633A (en) * 2006-11-01 2009-09-09 日立金属株式会社 Semiconductor ceramic composition and process for producing the same
US20080141091A1 (en) * 2006-12-06 2008-06-12 General Instrument Corporation Method and Apparatus for Recovering From Errors in Transmission of Encoded Video Over a Local Area Network
KR101072341B1 (en) * 2007-01-18 2011-10-11 노키아 코포레이션 Carriage of SEI messages in RTP payload format
US7890556B2 (en) * 2007-04-04 2011-02-15 Sony Corporation Content recording apparatus, content playback apparatus, content playback system, image capturing apparatus, processing method for the content recording apparatus, the content playback apparatus, the content playback system, and the image capturing apparatus, and program
DE112008000552B4 (en) * 2007-05-14 2020-04-23 Samsung Electronics Co., Ltd. Method and device for receiving radio
US20090106807A1 (en) * 2007-10-19 2009-04-23 Hitachi, Ltd. Video Distribution System for Switching Video Streams
JP2009135686A (en) * 2007-11-29 2009-06-18 Mitsubishi Electric Corp Stereoscopic video recording method, stereoscopic video recording medium, stereoscopic video reproducing method, stereoscopic video recording apparatus, and stereoscopic video reproducing apparatus
EP2071850A1 (en) 2007-12-10 2009-06-17 Alcatel Lucent Intelligent wrapping of video content to lighten downstream processing of video streams
JP2009163643A (en) * 2008-01-09 2009-07-23 Sony Corp Video retrieval device, editing device, video retrieval method and program
US8973028B2 (en) * 2008-01-29 2015-03-03 Samsung Electronics Co., Ltd. Information storage medium storing metadata and method of providing additional contents, and digital broadcast reception apparatus
JP2009253675A (en) * 2008-04-07 2009-10-29 Canon Inc Reproducing apparatus and method, and program
US20100049865A1 (en) * 2008-04-16 2010-02-25 Nokia Corporation Decoding Order Recovery in Session Multiplexing
EP2129028B1 (en) * 2008-05-06 2012-10-17 Alcatel Lucent Recovery of transmission errors
US20100008419A1 (en) * 2008-07-10 2010-01-14 Apple Inc. Hierarchical Bi-Directional P Frames
WO2010042650A2 (en) * 2008-10-07 2010-04-15 Motorola, Inc. System and method of optimized bit extraction for scalable video coding
US8301974B2 (en) 2008-10-22 2012-10-30 Samsung Electronics Co., Ltd. System and method for low complexity raptor codes for multimedia broadcast/multicast service
EP2392138A4 (en) * 2009-01-28 2012-08-29 Nokia Corp Method and apparatus for video coding and decoding
US9281847B2 (en) * 2009-02-27 2016-03-08 Qualcomm Incorporated Mobile reception of digital video broadcasting—terrestrial services
US20100254453A1 (en) * 2009-04-02 2010-10-07 Qualcomm Incorporated Inverse telecine techniques
JP4993224B2 (en) * 2009-04-08 2012-08-08 ソニー株式会社 Playback apparatus and playback method
EP2265026A1 (en) * 2009-06-16 2010-12-22 Canon Kabushiki Kaisha Method and device for deblocking filtering of SVC type video streams during decoding
US8310947B2 (en) * 2009-06-24 2012-11-13 Empire Technology Development Llc Wireless network access using an adaptive antenna array
US20110019693A1 (en) * 2009-07-23 2011-01-27 Sanyo North America Corporation Adaptive network system with online learning and autonomous cross-layer optimization for delay-sensitive applications
PT3104600T (en) * 2009-09-09 2018-06-19 Fraunhofer Ges Forschung Transmission concept for a stream comprising access units
JP5540969B2 (en) * 2009-09-11 2014-07-02 ソニー株式会社 Nonvolatile memory device, memory controller, and memory system
US8731053B2 (en) * 2009-11-18 2014-05-20 Tektronix, Inc. Method of multiplexing H.264 elementary streams without timing information coded
US9185335B2 (en) * 2009-12-28 2015-11-10 Thomson Licensing Method and device for reception of video contents and services broadcast with prior transmission of data
KR101777348B1 (en) * 2010-02-23 2017-09-11 삼성전자주식회사 Method and apparatus for transmitting and receiving of data
US9223643B2 (en) * 2010-03-04 2015-12-29 Microsoft Technology Licensing, Llc Content interruptions
EP2561664B1 (en) * 2010-04-20 2019-03-06 Samsung Electronics Co., Ltd Interface apparatus for transmitting and receiving media data
US20110293021A1 (en) * 2010-05-28 2011-12-01 Jayant Kotalwar Prevent audio loss in the spliced content generated by the packet level video splicer
US9485546B2 (en) * 2010-06-29 2016-11-01 Qualcomm Incorporated Signaling video samples for trick mode video representations
US8918533B2 (en) * 2010-07-13 2014-12-23 Qualcomm Incorporated Video switching for streaming video data
US8806050B2 (en) * 2010-08-10 2014-08-12 Qualcomm Incorporated Manifest file updates for network streaming of coded multimedia data
BR112013020175A2 (en) * 2011-02-10 2016-11-08 Panasonic Corp Data Creation Device and Video Stream Video Image Playback Device
US9565476B2 (en) * 2011-12-02 2017-02-07 Netzyn, Inc. Video providing textual content system and method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190075334A1 (en) * 2014-01-17 2019-03-07 Saturn Licensing Llc Communication apparatus, communication data generation method, and communication data processing method
US10820024B2 (en) * 2014-01-17 2020-10-27 Saturn Licensing Llc Communication apparatus, communication data generation method, and communication data processing method
US11284135B2 (en) * 2014-01-17 2022-03-22 Saturn Licensing Llc Communication apparatus, communication data generation method, and communication data processing method

Also Published As

Publication number Publication date
US20120240174A1 (en) 2012-09-20
US10433024B2 (en) 2019-10-01
WO2012125001A3 (en) 2012-12-27
KR101803970B1 (en) 2017-12-28
EP2687013A4 (en) 2014-09-10
WO2012125001A2 (en) 2012-09-20
KR20120105875A (en) 2012-09-26
EP2687013A2 (en) 2014-01-22

Similar Documents

Publication Publication Date Title
US20200029130A1 (en) Method and apparatus for configuring content in a broadcast system
US20200280747A1 (en) Apparatus and method for transmitting multimedia frame in broadcast system
US11196786B2 (en) Interface apparatus and method for transmitting and receiving media data
US11895357B2 (en) Broadcasting signal transmission device, broadcasting signal reception device, broadcasting signal transmission method, and broadcasting signal reception method
KR101972951B1 (en) Method of delivering media data based on packet with header minimizing delivery overhead
JP6422527B2 (en) Data receiving method and apparatus in multimedia system
US8301982B2 (en) RTP-based loss recovery and quality monitoring for non-IP and raw-IP MPEG transport flows
US20160105259A1 (en) Apparatus and method of transmitting/receiving broadcast data
US20170272691A1 (en) Broadcast signal transmission device, broadcast signal reception device, broadcast signal transmission method, and broadcast signal reception method
US20140334504A1 (en) Method for hybrid delivery of mmt package and content and method for receiving content
EP2667625B1 (en) Apparatus and method for transmitting multimedia data in a broadcast system
KR20050052531A (en) System and method for transmitting scalable coded video over ip network
KR20130040090A (en) Apparatus and method for delivering multimedia data in hybrid network
KR20140084142A (en) Network streaming of media data
WO2007045140A1 (en) A real-time method for transporting multimedia data
MacAulay et al. Whitepaper: IP streaming of MPEG-4: Native RTP vs MPEG-2 transport stream
Paulsen et al. MPEG-4/AVC versus MPEG-2 in IPTV.
KR20130058539A (en) Methods of synchronization in hybrid delivery

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION