VIDEO DATA STORAGE AND RETRIEVAL SYSTEM AND METHOD WITH RESOLUTION CONVERSION
Cross-Reference to Related Applications
This application claims priority to U.S. Provisional Application Serial No. 60/460,649, filed on April 4, 2003. Priority to the prior application is expressly claimed, and the disclosure of the application is hereby incorporated by reference in its entirety.
Field of the Invention
The present invention relates to the field of data storage and retrieval. In particular, but not exclusively, the invention relates to the storage and retrieval of media data.
Background of the Invention
As the amount of media data collected and transmitted around the world has increased, problems associated with efficiently receiving, storing and managing the data have also increased.
In particular, it is difficult to process and store all of the incoming data in a way that enables the data to be easily accessed and retrieved. For example, news broadcasters may receive satellite or other external feeds from a plurality of sources 24 hrs a day. All of the data received is usually stored, since it is not possible for the receiver to determine which parts of the data will subsequently be useful to editors and programme makers.
The data is received in an external format, typically a transmission format at a transmission resolution. In the case of an incoming satellite feed, the resolution of the received data is typically less than about 6Mb/s and may be about 3-5 Megabits per second (Mb/s). The rate may be lower for other feeds (e.g. over a broadband link or satellite phone). However,
to process the data using editing systems and to prepare the data for broadcast, the data typically requires pre-processing to increase the resolution of the data or to conform it to an internal format. For a normal definition editing system, the resolution is likely to be increased. About 50Mb/s I-frame MPEG or 270Mb/s SDI uncompressed are typical for normal definition; this may be significantly greater for high definition, but may be decreased with alternative compression technologies (e.g. MPEG-4).
As described in more detail below, prior art systems receive raw data from transmission sources, for example from satellite transmitters, convert the transmission-quality data to internal format, typically higher quality edit or broadcast quality data and store the converted data for later access and editing by editing systems. However, as outlined above, a large amount of data must be converted at least to the editing resolution and stored in the system.
Brief Summary of the Invention
According to one aspect, there is provided a method of managing media data comprising: receiving input media data from a media data source at a first resolution; storing the media data in a first archive at the first resolution; converting the input media data to a second resolution; providing at least a portion of the converted media data to at least one editing system; receiving output media data from the at least one editing system at the second resolution; checking whether portions of the output media data correspond directly to portions of stored input media data; storing the output media data at the second resolution, wherein at least some portions of the output media data which correspond directly to stored input media data are not stored at the second resolution.
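The selective-storage logic of this aspect can be illustrated in outline. The sketch below is purely illustrative; the function and its arguments are hypothetical stand-ins and not part of any disclosed implementation.

```python
# Illustrative sketch only: archives are modelled as simple lists and the
# derivability test is supplied by the caller (e.g. via metadata analysis).

def manage_media(input_portions, edited_portions, derivable):
    """Store input at the first resolution; store edited output at the
    second resolution only where it is not directly derivable from the
    stored input. Returns (first_archive, second_archive)."""
    first_archive = list(input_portions)          # stored as received
    second_archive = [p for p in edited_portions  # selective storage
                      if not derivable(p, first_archive)]
    return first_archive, second_archive

# Example: a straight cut ("clip_a") is derivable; a captioned portion is not.
first, second = manage_media(
    ["clip_a", "clip_b"],
    ["clip_a", "clip_b_captioned"],
    lambda p, arch: p in arch)
# first  == ["clip_a", "clip_b"]
# second == ["clip_b_captioned"]
```

The key point of the sketch is that the first archive always retains the complete input, while the second archive holds only the non-derivable residue.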
Hence the input data may be stored at the resolution at which it is received, without requiring the system to convert all of the input data to a second resolution. This may reduce multiple transcoding problems with the broadcast data, since at least portions of the broadcast data may always be recreated from the original stored input media data.
Data that is accessed by an editing system and hence converted to a second resolution for editing is received by the system at the second resolution. Advantageously, the system does not store the data received at the second resolution if the data is directly derivable from the data stored at the first resolution. This may reduce the amount of data that needs to be stored by the system. Although the present method may reduce the data storage requirements of the system, the quality of the final data output is not reduced, since the required data may be upconverted to a higher resolution when required to yield the same result as if a higher resolution had been stored.
Data that corresponds directly to stored input data preferably comprises data that is directly derivable from the input data. For example, if clips have been taken from the input data, but the clips themselves have not been modified, the edited clips may be recreated directly from the stored data at the first resolution, so data at the second resolution is preferably not stored by the system. Similarly, if the data has been edited so that captions are dynamically applied to the media data on playout, it may not be necessary to store the edited media data, but the edited data may be regenerated directly from the data stored at the first resolution.
If the output media data at the second resolution is not directly derivable from the input data, the output media data may be stored by the system. For example, if frames of the data have been modified or if captions have been overlaid on the frames, it may not be possible to generate the data from the input data, hence the output data may be stored at the second resolution.
Converting the data may advantageously mean that the system and method described herein may interact with an editing system as if it were a prior art storage system. This may allow prior art editing systems to be used in conjunction with the present system without modification.
In a preferred embodiment, checking whether portions of the output media data correspond directly to portions of stored input media data comprises analysing metadata associated with the input and/or output media data. In an alternative embodiment, the data itself may be compared, but analysing metadata may advantageously be faster and require less processing than analysing the data itself. The metadata may be stored with the media data itself or may be stored in a separate metadata database.
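One way the metadata-based check might work is sketched below; the field names ("source_id", "in_point", "out_point", "modified") are assumptions for illustration, not disclosed field names.

```python
# Hypothetical metadata check: decide derivability without touching essence.

def corresponds_to_input(output_meta, input_index):
    """An output portion corresponds directly to stored input if its
    metadata references an ingested source item and flags no frame-level
    modification (plain cuts and dynamically applied captions remain
    recreatable from the first-resolution archive)."""
    src = input_index.get(output_meta.get("source_id"))
    if src is None or output_meta.get("modified", False):
        return False
    # The referenced range must lie within the stored input item.
    return (output_meta["in_point"] >= src["in_point"]
            and output_meta["out_point"] <= src["out_point"])

index = {"feed42": {"in_point": 0, "out_point": 3600}}
assert corresponds_to_input(
    {"source_id": "feed42", "in_point": 100, "out_point": 200}, index)
assert not corresponds_to_input(
    {"source_id": "feed42", "in_point": 100, "out_point": 200,
     "modified": True}, index)
```

Because only small metadata records are consulted, such a check avoids decoding or comparing the media essence itself.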
Preferably, the second resolution may be higher than the first resolution. Hence the majority of the data may be stored at the lower, first resolution, which may allow the system to be implemented with a smaller data storage capacity. Only data required by the editing system needs to be converted to the second, higher resolution and the system may only store data at the second, higher, resolution if the output data does not correspond directly to the data already stored at the first, lower, resolution.
The method may further comprise subsequently enabling an editing system to access the media data at a selected one of the first or the second resolutions. Hence the data may be converted to the second resolution only on request by the editing system.
In one embodiment, checking whether portions of the output media data correspond directly to portions of stored input media data comprises consulting a database comprising metadata associated with the input and/or output media data.
In one embodiment, only the portion of the input media data that is provided to the at least one editing system is converted from storage in the archive to the
second resolution. Hence data that is not requested by an editing system is not converted from the first resolution. This may reduce the storage and processing requirements of the system. The conversion may be performed by either the storage system or the editing system.
The method may further comprise maintaining a database comprising metadata associated with the input and/or output media data.
In one embodiment, the media data at the second resolution may be stored in a separate archive to the media data at the first resolution. The two archives may be closely linked or interconnected, or may be two sections of the same archive.
In a preferred embodiment, the input media data comprises transmission-stream quality data. Preferably, the input media data is compressed and has a bit rate of less than about 8Mb/s, more preferably about 3Mb/s to about 5Mb/s.
The bit rate of the transmission-stream quality data may vary depending on the transmission stream. For example, for a satellite feed the bit rate may be about 3-5Mb/s. However, data received over a data transmission line into a port of the editing system may be received at a higher bit rate, possibly at a bit rate greater than 10Mb/s. For example, for HDTV signals, the transmission bit rate may be about 20Mb/s or greater.
In one embodiment, the media data source comprises a satellite feed. In one embodiment, the input media data may comprise a satellite feed at about 3-5Mb/s.
The input media data preferably comprises compressed data, for example using an MPEG compression standard.
In alternative embodiments, other compression techniques may be used to compress and regenerate the data between different resolutions and it will be obvious to one skilled in the art that a wide range of compression ratios may be generated.
In a preferred embodiment, the output media data comprises broadcast quality data. Hence the data may be received at the archive in the format in which it is edited.
Preferably, the output media data is compressed and has a bit rate of greater than about 10Mb/s, preferably about 50Mb/s. This may allow the data to be edited by user editing systems provided on user workstations. The output bit rate is preferably increased if the user editing system can work with data at a higher bit rate.
In an alternative embodiment, the output media data is uncompressed and has a bit rate of greater than about 100Mb/s, preferably about 270Mb/s; for example, the output media data may be SDI format data. Preferably, the broadcast quality data has as high a bit rate as possible. In some systems, for example an HDTV (High Definition Television) system, the broadcast quality data may have a bit rate of the order of 1Gb/s or even greater. High bit rate output data may provide a higher-quality signal to the viewer of the media data.
According to a further aspect, there is provided a method of obtaining media data from an archive comprising: receiving portions of media data at at least a first and a second resolution; dynamically converting the portions of the media data at the first resolution to media data at the second resolution; combining the converted portions of the media data with the portions of the media data at the second resolution.
This may allow data to be reconstructed for further editing or playout from data at two different resolutions, hence a smaller amount of data may be stored in the archive.
Preferably, the method further comprises determining whether the portions of the media data are received at the first or at the second resolution. The resolution of the data may be determined based on metadata associated with the media data.
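This retrieval aspect can be sketched as follows; the dictionary representation of portions and the upconvert placeholder are assumptions for illustration, standing in for a real transcoder.

```python
# Sketch of the retrieval method: portions arrive at either resolution and
# the low-resolution ones are dynamically upconverted before combining.

FIRST_RES, SECOND_RES = "3Mb/s", "50Mb/s"

def upconvert(portion):
    # Placeholder for a real transcoder: mark the portion as converted.
    return {**portion, "res": SECOND_RES, "upconverted": True}

def reconstruct(portions):
    out = []
    for p in portions:
        # The resolution of each portion may be determined from metadata
        # associated with the media data, as described above.
        out.append(upconvert(p) if p["res"] == FIRST_RES else p)
    return out

seq = reconstruct([{"id": "a", "res": "3Mb/s"},
                   {"id": "b", "res": "50Mb/s"}])
# All portions in seq now share the second resolution.
```

The combined sequence is uniform at the second resolution even though only part of it was stored at that resolution.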
According to a further aspect, there is provided apparatus for managing media data comprising: means for receiving input media data from a media data source at a first resolution (for example an input interface); means for storing the media data in a first archive at the first resolution (for example a memory, such as a digital or analogue storage system, for example tape, disk or flash memory); means for converting the input media data to a second resolution (for example a transcoder); means for providing at least a portion of the converted media data to at least one editing system (for example, an interface with an editing system); means for receiving output media data from the at least one editing system at the second resolution (for example an interface with an editing system); means for checking whether portions of the output media data correspond directly to portions of stored input media data (for example a processor and associated memory); means for storing the output media data at the second resolution (for example a memory), wherein at least some portions of the output media data which correspond directly to stored input media data are not stored at the second resolution.
Preferably, the means for checking whether portions of the output media data correspond directly to portions of stored input media data comprises means
for analysing metadata associated with the input and/or output media data (for example, a processor).
In a highly preferred embodiment, the second resolution is higher than the first resolution. For most embodiments, this is likely to be the case, since the first, input resolution, from an external source is likely to be lower than the internal resolution at which the data is edited and broadcast. However, the system may be advantageous even if the first and second resolutions are similar, or if the first resolution is higher than the second resolution, since storing the data in the format in which it is received may reduce the likelihood of multiple transcoding errors in the data when it is broadcast or reconstructed from the archive.
Preferably, only the portion of the input media data that is provided to the at least one editing system is converted to the second resolution.
Preferably, the apparatus further comprises means for maintaining a database comprising metadata associated with the input and/or output media data (for example a processor and associated memory means).
Preferably, the apparatus further comprises a second archive for storing the media data at the second resolution.
Preferably, the input media data comprises transmission-stream quality data. More preferably, the input media data has a bit rate of less than about 10Mb/s.
In a preferred embodiment, the media data source comprises a satellite feed.
Preferably, the output media data comprises broadcast quality data. In one embodiment, the output media data has a bit rate of greater than about 20Mb/s, preferably about 50Mb/s. In an alternative embodiment, the output
media data has a bit rate of greater than about 170Mb/s, preferably about 270Mb/s.
According to a further aspect, there is provided apparatus for obtaining media data from an archive, comprising: means for obtaining portions of media data at at least a first and a second resolution (for example a processor and associated memory means or an input interface for receiving the media data); means for dynamically converting the portions of the media data at the first resolution to the second resolution (for example a processor); means for combining the converted portions of the media data with the portions of the media data at the second resolution (for example a combiner, such as a combiner in an editing or playout system).
Preferably, the apparatus further comprises means for determining whether the portions of the media data are obtained at the first or at the second resolution (for example a processor or analyser).
According to a further aspect, there is provided a method of distributed editing of media data comprising: storing the data at a first resolution at first and second locations; editing the media data at the first location to produce an edit decision list for the media data; transmitting the edit decision list to a further editing or playout system at the second location; reconstructing the edited data at the second location based on the stored data and the edit decision list.
Producing an edit decision list may allow the edited media data to be recreated from the original data. Hence, using two or more archives containing the same original data, the same edited data may be produced using the edit decision list. This may be more efficient than centrally storing or
transmitting the edited data and may allow data to be edited over a distributed system.
In a preferred embodiment, the method further comprises converting the data to a second resolution before editing the data at the first location. Hence the data may be edited at a higher editing or broadcast resolution. As described above, the first resolution may comprise a transmission-quality resolution, for example as received from a satellite feed, and the second resolution may comprise a broadcast quality resolution or an editing quality resolution.
Preferably, reconstructing the edited data comprises converting the data from the first resolution to a second resolution at the second location.
The method may further comprise transmitting at least one portion of edited data that does not correspond directly to data stored at the first resolution from the first location to the second location. For example, if a portion of data has been edited to add captions or images to the frames, it may not be possible to derive the higher-resolution data directly from the data stored at the first resolution. This portion of edited data may be transmitted, for example with the EDL, from the first location to the second location.
In a preferred embodiment, reconstructing the edited data comprises combining edited data received from the first location with data stored at the first resolution at the second location.
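The distributed-editing aspect might be sketched as below. The EDL event shapes and names ("cut", "rendered") are hypothetical; real EDLs (e.g. CMX-style) differ in detail.

```python
# Hedged sketch of EDL-based distributed editing: both locations hold the
# same source material, so only the (small) edit decision list travels
# between them, plus any portions not derivable from the shared sources.

def apply_edl(edl, local_store, transmitted_portions):
    """Reconstruct the edited sequence at the second location."""
    sequence = []
    for event in edl:
        if event["type"] == "cut":
            # Recreatable from locally stored first-resolution data.
            src = local_store[event["source_id"]]
            sequence.append(src[event["in"]:event["out"]])
        else:
            # e.g. burned-in captions: use the portion transmitted
            # alongside the EDL from the first location.
            sequence.append(transmitted_portions[event["portion_id"]])
    return sequence

store = {"feed1": list(range(10))}   # frames modelled as integers
edl = [{"type": "cut", "source_id": "feed1", "in": 2, "out": 5},
       {"type": "rendered", "portion_id": "cap1"}]
result = apply_edl(edl, store, {"cap1": ["captioned_frame"]})
# result == [[2, 3, 4], ["captioned_frame"]]
```

Only the EDL and the single rendered portion cross the link; the bulk of the material is reconstructed from the second location's own archive.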
According to a further aspect, there is provided apparatus for distributed editing of media data comprising: means for storing the data at a first resolution at first and second locations (for example a memory, such as a tape, disk or flash-based memory); means for editing the media data (for example, an editing system) at the first location to produce an edit decision list for the media data; means for transmitting the edit decision list to a further editing or playout system at the second location (for example a transmission or output system);
means for reconstructing the edited data at the second location based on the stored data and the edit decision list (for example, an editing system).
According to a further aspect, there is provided a computer program or computer program product (or a computer readable medium) comprising instructions for carrying out a method of managing media data, the method comprising: receiving input media data from a media data source at a first resolution; storing the media data in a first archive at the first resolution; converting the input media data to a second resolution; providing at least a portion of the converted media data to at least one editing system; receiving output media data from the at least one editing system at the second resolution; checking whether portions of the output media data correspond directly to portions of stored input media data; storing the output media data at the second resolution, wherein at least some portions of the output media data which correspond directly to stored input media data are not stored at the second resolution.
According to a further aspect, there is provided a computer program or computer program product (or a computer readable medium) comprising instructions for carrying out a method of obtaining media data from an archive, the method comprising: receiving portions of media data at at least a first and a second resolution; dynamically converting the portions of the media data at the first resolution to media data at the second resolution; combining the converted portions of the media data with the portions of the media data at the second resolution.
According to a further aspect, there is provided a computer program or computer program product (or a computer readable medium) comprising
instructions for carrying out a method of distributed editing of media data, the method comprising: storing the data at a first resolution at first and second locations; editing the media data at the first location to produce an edit decision list for the media data; transmitting the edit decision list to a further editing or playout system at the second location; reconstructing the edited data at the second location based on the stored data and the edit decision list.
The invention also provides a computer program and a computer program product for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein, and a computer readable medium having stored thereon a program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein.
The invention also provides a signal embodying a computer program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein, a method of transmitting such a signal, and a computer product having an operating system which supports a computer program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein.
The invention extends to methods and/or apparatus substantially as herein described with reference to the accompanying drawings.
Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, method aspects may be applied to apparatus aspects, and vice versa.
Furthermore, features implemented in hardware may generally be implemented in software, and vice versa. Any reference to software and hardware features herein should be construed accordingly.
The methods and apparatus described herein may be implemented in conjunction with media input, editing and transmission systems, aspects of which are described in the applicant's co-pending patent applications. In particular, aspects of a system for managing data for transmission are described in the applicant's co-pending patent application entitled "System and Method for Media Management", Attorney Reference No. IK/26522WO, filed on 5 April 2004, the disclosure of which is hereby incorporated by reference in its entirety. Aspects of a system for the storage of data, in particular controlling media storage devices remotely, are described in the applicant's co-pending patent application entitled "Media Storage Control", Attorney Reference No. IK/26520WO, filed on 5 April 2004, the disclosure of which is hereby incorporated by reference in its entirety. A resource allocation system, which may be implemented as part of a media editing system, is described in the applicant's co-pending patent application entitled "A Method and Apparatus for Dynamically Controlling a Broadcast Media Production System", Attorney Reference No. IK/26271 WO, the disclosure of which is hereby incorporated by reference in its entirety. Aspects of a media editing system, in particular the control of media and related metadata in a non-linear media editing system, are further discussed in the applicant's co-pending patent application entitled "System and Method for Media Management", Attorney Reference No. IK/26521 WO, filed on 5 April 2004, the disclosure of which is hereby incorporated by reference in its entirety. Further aspects of a media processing system are described in the applicant's co-pending patent application entitled "Media Processor", Attorney Reference No. IK/26519WO, filed on 5 April 2004, the disclosure of which is hereby incorporated by reference in its entirety.
A production management system, which may also be implemented in conjunction with the system described herein is described in the applicant's co-pending patent application entitled "System and Method for
Processing Multimedia Content", Attorney Reference No. 13214.4001, filed on
5 April 2004.
Brief Description of the Drawings
Preferred features of the present invention will now be described, purely by way of example, with reference to the accompanying drawings, in which:
Fig. 1 is a schematic diagram of an embodiment of a prior art data storage and retrieval system;
Fig. 2 is a schematic diagram of a data retrieval and storage system according to one embodiment;
Fig. 3 is a schematic diagram of an embodiment of a system, which may be provided in conjunction with the systems and methods described herein;
Fig. 4 is a schematic diagram of media item structure;
Fig. 5 is a schematic diagram of component metadata;
Fig. 6 is a schematic overview diagram of a system in conjunction with which the methods and systems described herein may be implemented.
Detailed Description
In the prior art system, illustrated in Fig. 1, data is received from a satellite feed 110 at a transmission rate of 3 Megabits per second (Mb/s). This data is decoded at an Integrated Receiver Decoder (IRD) 112. The resolution of the data is then transformed 116 from 3Mb/s to a higher resolution for editing, in the present embodiment, a resolution of 50Mb/s. The higher resolution data is then transmitted to a data archive 120 and stored at the 50Mb/s resolution. Data items, comprising sections of the data, may then be accessed and read from the data store into editing systems 124, for example data items may be incorporated into clips 122 or referenced in Edit Decision Lists (EDLs). One function of the editing system may be to increase the resolution of the data further to broadcast-quality data, for example to 270Mb/s.
If the data received from the satellite transmission is intended for immediate
(for example on the same day or in the same week) playout or editing, a copy of the data may be sent directly to an editing or playout system, such as a video server 118 via a Serial Digital Interface, which, in the present embodiment, converts the data to a resolution of 270Mb/s. In one embodiment, all of the data received may be upconverted to broadcast quality data and may be stored temporarily in the video server 118, for example for a day or for a week, to allow multiple users to access recently-received data readily. When the data is no longer required for immediate access, a copy of the data may be stored in the archive 120.
In one embodiment, data may be received and converted into data elements, which may be called media item objects. Fig. 4 illustrates an exemplary structure for a media item that might be handled in a media management system in accordance with an embodiment of the present invention. The media item 400 is represented along a time axis extending horizontally across the page. The media item comprises three separate media objects or tracks, 402, 404 and 406. In this example track 402 is video, and tracks 404 and 406 are audio. The tracks can be referred to as media essence.
Each media object or track can be divided in time into segments or portions. It should be noted that segments may have boundaries aligned across all tracks, such as segment 408, however a single segment in one track may span two or more segments in another track, such as segments 410 and 412.
Media items additionally comprise metadata, which describes attributes associated with a media item and which is used to assist in processing of the media essence within the system, e.g. storing, tracking, editing, archiving etc. These attributes may apply to the whole media item, e.g. item duration, or may be specific to segments within the media item, e.g. copyright holder (this will be discussed in greater detail below). A media item can therefore be said to be made up of media essence (tracks) and associated metadata (attributes).
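The media item structure of Fig. 4 might be modelled as below; the class layout is an assumption for illustration, not a disclosed implementation.

```python
# Illustrative data structure: a media item holds tracks (the essence),
# each track carries independently segmented timeline portions, and
# metadata attaches at both item and segment level.

from dataclasses import dataclass, field

@dataclass
class Segment:
    start: float          # seconds from item start
    end: float
    metadata: dict = field(default_factory=dict)   # e.g. copyright holder

@dataclass
class Track:
    kind: str             # "video" or "audio"
    segments: list

@dataclass
class MediaItem:
    tracks: list          # the media essence
    metadata: dict        # item-wide attributes, e.g. duration

item = MediaItem(
    tracks=[Track("video", [Segment(0, 10), Segment(10, 30)]),
            Track("audio", [Segment(0, 30)])],   # one segment spans two
    metadata={"duration": 30})
```

As in Fig. 4, segment boundaries need not align across tracks: here a single audio segment spans both video segments.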
The table below describes a number of metadata attributes that can be associated with a media item:
While media and metadata are associated they can be used and stored separately and independently in the system of the present invention to advantageous effect.
In one embodiment of the invention each attribute or field associated with a media item can take varying forms. A metadata field can comprise only a single value associated with a media item e.g. duration.
A metadata field can take the form of 'components'. Figure 5 shows a schematic representation of a media item 502 with the time axis extending horizontally across the page. Component metadata has an associated timeline illustrated as 504. In other words it is information that relates to a particular time segment of a media item. The metadata value for a particular attribute for a particular media item can therefore vary in time.
In addition to the timeline components themselves, component fields may also have a default value 510, which is used to provide a value for a component field at all time instances along a media item for which no other value has been assigned. In Figure 5, component values have been assigned for time segments 506 and 508, and the remainder of the component timeline takes the default value.
Certain component fields also have a summary value 512 and optionally an override value 514. The summary value, like the default value, is an overall value for the whole media item, and is derived from the component values along the media item. This can, for example, be performed using an arithmetic or Boolean operation on the individual component values. The summary value may optionally be replaced by a user-input override value. For certain metadata fields it will be advantageous to impose restrictions on users who can override a summary value, and rules defining the manner in which the summary value may be overridden (e.g. the summary value may only be made greater, or more restrictive). If the override is removed, or if the summary value changes such that the override no longer satisfies the rules, then the summary will be shown again.
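The component-field behaviour described above might be sketched as follows; the functions and the choice of `max` as the combining rule are illustrative assumptions (the rule here means the value "may only be made greater").

```python
# Sketch of component-field resolution: per-segment values over a timeline,
# a default for unassigned spans, and a derived summary that an override
# may replace only while it satisfies the rule.

def value_at(components, t, default):
    """Component value at time t, falling back to the default value."""
    for start, end, value in components:
        if start <= t < end:
            return value
    return default

def summary(components, default, override=None, rule=max):
    """Summary derived from all component values (here via `rule`); an
    override is honoured only while it satisfies the rule against the
    derived summary, otherwise the derived summary is shown again."""
    derived = rule([v for _, _, v in components] + [default])
    if override is not None and rule(override, derived) == override:
        return override
    return derived

comps = [(0, 10, 2), (20, 30, 5)]       # cf. segments 506 and 508
assert value_at(comps, 15, default=1) == 1   # the gap takes the default
assert summary(comps, default=1) == 5
assert summary(comps, default=1, override=7) == 7
assert summary(comps, default=1, override=3) == 5  # fails the rule
```

The last line mirrors the behaviour described above: when an override no longer satisfies the rule, the derived summary is shown again.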
Metadata fields may exhibit propagation properties. In a media system according to one embodiment of the present invention metadata propagation is supported both from parent to child media items and also from child to parent media items.
Inheritance propagation refers to metadata from an item being automatically associated also with one or more child items. Bi-directional propagation refers to metadata from an item being automatically associated with both parent item and child item(s).
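A minimal sketch of the two propagation modes is given below; the per-field direction assignments and dictionary representation are assumptions for illustration.

```python
# Inheritance propagation: parent -> child only.
# Bi-directional propagation: child values also reach the parent.

def propagate(parent, children, inherited_fields, bidirectional_fields):
    for child in children:
        # Parent metadata is automatically associated with child items
        # that do not define their own value for the field.
        for f in inherited_fields + bidirectional_fields:
            if f in parent:
                child.setdefault(f, parent[f])
        # For bi-directional fields, child values also flow upward.
        for f in bidirectional_fields:
            if f in child and f not in parent:
                parent[f] = child[f]

p = {"copyright": "BBC"}
c = [{"flag": "archived"}]
propagate(p, c, ["copyright"], ["flag"])
# c[0]["copyright"] == "BBC"; p["flag"] == "archived"
```

Existing values are never overwritten in this sketch; propagation only fills fields that are absent.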
In the prior art system, media item essence, or media data, may further be converted by the system to other data qualities (e.g. desktop, broadcast etc). The essence may be created by the system automatically on ingest. Other
qualities may include web quality, desktop quality and keyframe quality, and therefore, in a prior art system, four Media Item Instances may exist on the system for a Media Item. The non-broadcast quality Media Item Instances may be used for viewing and editing purposes within the system only.
Final archiving of a Media Item may mean that a broadcast quality copy of the
Media Item essence is made on an offline storage item (e.g. tape or disc). All essence may then be deleted from the online and offline archive stores with the exception of the web quality essence and keyframe essence. The metadata for the Media Item, and the metadata for the associated keyframes, Components and Bookmarks may also be kept online. All operations from the system to Offline Archive are copy actions. A move action is performed by first copying the item to the relevant store and then later by the system deleting the essence from the current online store.
By keeping both the metadata and low quality essence online, users can continue to search and view archived items in the same way as online items are searched and viewed, as described above. These items will have the archive flag set to indicate that the broadcast quality material is archived. The metadata will additionally include a tape ID to facilitate a user requesting that items on tape be brought back online.
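The final-archiving behaviour described above might be sketched as follows; the quality names and the tape-ID scheme are illustrative assumptions.

```python
# Sketch of final archiving: broadcast-quality essence moves to offline
# storage as a copy followed by deletion from the online store, while
# web and keyframe essence and all metadata remain online and searchable.

KEEP_ONLINE = {"web", "keyframe"}

def archive_item(item, offline_store):
    # All operations to the offline archive are copy actions; the "move"
    # is completed by deleting the essence from the online store.
    offline_store.append(item["essence"]["broadcast"])
    item["essence"] = {q: e for q, e in item["essence"].items()
                       if q in KEEP_ONLINE}
    # The archive flag and tape ID let users find and request the
    # offline broadcast-quality material later.
    item["metadata"]["archived"] = True
    item["metadata"]["tape_id"] = len(offline_store) - 1  # illustrative ID
    return item

offline = []
item = {"essence": {"broadcast": "B", "desktop": "D",
                    "web": "W", "keyframe": "K"},
        "metadata": {}}
archive_item(item, offline)
# Only web and keyframe essence remain online; metadata stays searchable.
```

The retained low-quality essence and metadata are what allow archived items to be searched and viewed in the same way as online items.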
A web host may be provided to cater for users on a low bandwidth network with a requirement to access media. For example, users may have 56Kb or ISDN connections. Such low bandwidth availability precludes working with 1.5Mb/s Desktop media. However, they will still be able to access and update metadata, whilst also being able to view Web media.
Fig. 2 illustrates one embodiment of the system described herein. As in the prior art system, data is received from a satellite feed 210 and is decoded at the IRD 212. The data is then transferred directly to an input archive 224 in an archive system 220 where it is stored at the input or transmission resolution, in this embodiment at 3Mb/s.
As in the prior art system, the received data may also be upconverted to broadcast quality and copied to a video server for editing or playout 216, 222 via a Serial Digital Interface 214. However, a low-resolution copy of the data is preferably also transferred to the archive 224. The broadcast-quality data may be stored on the video server 216 for a short time period, for example for a day or a week, and may allow portions of the media data to be played out directly from the video server. This may further allow fast access for users to media data that is likely to be required frequently.
Media data, or media item objects, may be transferred to the archive system 220 after the time period. On transfer of the media data, the archive system 220 preferably determines whether the data transferred from the video server 216 or the editing system 230 corresponds directly to data stored at the input resolution. In the present embodiment, the archive interface 226 determines whether the data corresponds to data stored at the input resolution based on the metadata associated with the media data, for example using the metadata system described above.
If the data does not correspond to data stored at the input resolution, that is if the output data is not directly derivable from the input data, the output data is stored at the second, edited resolution. In this embodiment, the output data is stored in a separate but interconnected part of the archive system 220, the editing resolution archive 228. Subsequent users of the same data may then retrieve the higher-resolution data from the editing resolution archive 228 and use and edit the data further in the high-resolution format. In the present embodiment, the original, unedited data is also retained in the transmission resolution archive 224.
If the edited output data corresponds directly to data stored at the input resolution, the output data is not stored at the second resolution. Rather, the data is discarded by the archive interface 226. The output data is
reconstructed from the metadata associated with the output data and the input data if the data is subsequently requested by an editing or a playout system.
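The store-or-discard decision made by the archive interface can be sketched as follows. All of the names here (`MediaItem`, `ArchiveInterface`, the `source_id` field) are illustrative assumptions rather than part of the disclosed system, and the derivability test is reduced to a simple lookup.

```python
from dataclasses import dataclass, field

@dataclass
class MediaItem:
    item_id: str
    source_id: str            # identifies the original input-resolution item
    resolution_mbps: float
    edit_history: list = field(default_factory=list)

@dataclass
class ArchiveInterface:
    # Input-resolution (transmission) and edited-resolution stores,
    # modelled here as plain dictionaries keyed by item id.
    transmission_archive: dict = field(default_factory=dict)
    editing_archive: dict = field(default_factory=dict)

    def is_directly_derivable(self, item: MediaItem) -> bool:
        # In this sketch an item is "directly derivable" when its source
        # is still held at the input resolution; a real system would also
        # inspect the edit metadata for reproducibility.
        return item.source_id in self.transmission_archive

    def on_transfer(self, item: MediaItem) -> str:
        if self.is_directly_derivable(item):
            # Discard the essence; it can be reconstructed on request
            # from the input data plus the item's edit metadata.
            return "discarded"
        # Otherwise keep the essence at the second, edited resolution.
        self.editing_archive[item.item_id] = item
        return "stored"
```

Under this scheme only material that cannot be regenerated from the transmission-resolution archive consumes space in the editing-resolution archive.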
If the data in the archive system 220 is subsequently required for further editing or playout, the data may be requested from the archive system 220 by users via the editing system 230. The archive system determines whether the requested data is available at the input resolution or at the converted resolution and converts the data as necessary, for example at the archive interface 226, to an editing resolution before transmitting the data to the editing system 230. For example the resolution may be increased to an editing resolution of 50Mb/s.
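This retrieval path might be sketched as below. The stores are plain dictionaries and the conversion itself is elided; the function name and dictionary keys are invented for the example.

```python
def serve_request(item_id, editing_archive, transmission_archive,
                  target_mbps=50.0):
    """Return media for an editing request, upconverting the
    transmission-resolution original when no edited copy exists."""
    if item_id in editing_archive:
        # Already held at the editing resolution; return it directly.
        return editing_archive[item_id]
    # Fall back to the input-resolution original and upconvert it,
    # e.g. from a 3Mb/s satellite feed to 50Mb/s for editing.
    original = transmission_archive[item_id]
    return {"item_id": item_id,
            "resolution_mbps": target_mbps,
            "derived_from_mbps": original["resolution_mbps"]}
```

The caller (the editing system 230 in the embodiment) need not know which archive actually held the item.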
The data may be edited by the user and the editing system 230 may amend the metadata associated with the data to indicate that the data has been edited and preferably to indicate the nature of the editing performed. Edited data may then be transferred back to the archive system 220.
As for the data transferred from the video server, the archive interface 226 may determine whether the output data is directly derivable from, or corresponds to, data already stored in the archive system 220 at the first resolution. If so, the output data is not stored. If the edited data does not correspond to data already stored, however, the edited data is stored in the archive system 220, preferably in the second archive 228.
It will be appreciated that modifications to the system described above may be provided. For example, conversion of the data and determination of whether to store the output data may be performed by means associated with the archive system 220, such as the archive interface described above, or may be performed by means associated with the editing system 230.
On playout of the media data to broadcast, the final broadcast data is preferably copied back into the archive system 220. In the present embodiment, as described above for the edited data, the archive interface
determines whether the output data corresponds to input data and stores only the output data that is not directly derivable from input data. As described above, the final broadcast data may be archived to a persistent store, for example to tape or to disc, and only the lower-quality data may be maintained in the archive system 220. Optionally, lower quality data, which may allow the broadcast data to be recreated, may be output to the persistent store by the archive system 220.
Alternatively or additionally to using metadata associated with the data to determine whether the data is directly derivable, a database associated with the archive may be provided to maintain a record of which items or sections of data have been accessed by editing systems and edited. Editing systems accessing data may then consult the database to determine whether the requested data is available in editing resolution format or only in transmission resolution format.
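One way such an edit-tracking database might look is sketched below with an in-memory SQLite table; the schema, column names and item ids are invented for illustration.

```python
import sqlite3

# Hypothetical record of which items have been edited, consulted by
# editing systems before requesting data from the archive.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE edit_log (
    item_id    TEXT PRIMARY KEY,
    edited     INTEGER NOT NULL,   -- 1 if an edited copy exists
    resolution TEXT NOT NULL       -- 'editing' or 'transmission'
)""")
conn.execute("INSERT INTO edit_log VALUES ('feed-001', 1, 'editing')")
conn.execute("INSERT INTO edit_log VALUES ('feed-002', 0, 'transmission')")

def available_resolution(item_id):
    """Report the best resolution at which an item is held; unknown
    items are assumed to exist only at the transmission resolution."""
    row = conn.execute(
        "SELECT resolution FROM edit_log WHERE item_id = ?",
        (item_id,)).fetchone()
    return row[0] if row else "transmission"
```

An editing system can thus decide in advance whether a request will incur an upconversion.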
A further embodiment of a system, which may be provided in conjunction with the embodiments described above, is illustrated in Fig. 3. In many media data systems, one or more satellites may simultaneously broadcast the same media data to a plurality of receiving stations 312, each of which may store the media data in an archive 314, 316. For example, a satellite system may broadcast the same data to archives and editing systems on the East and on the West coasts of America.
In the present embodiment, the media data is stored at the transmission resolution without being converted to an editing resolution. An editing system 318 operated by a user may request data from an archive 314 and the data may be supplied at the editing resolution. The editing system may then edit the data and produce an edit decision list (EDL) which may detail all of the editing operations performed on the original data to produce the edited data.
If the edited data is subsequently required at a second location, one option may be to transmit the edited data from the first archive 314 to a second
archive 316 to allow the second editing system 320 to access the data.
However, in the present method, the EDL is transmitted to the second editing system 320. The second editing system 320 may then access the unedited data stored in the second archive 316 and use the EDL to recreate the edited data.
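As a minimal illustration of this method, the sketch below treats an EDL entry as a simple `(start, end)` frame range over the unedited source; real edit decision lists (e.g. the CMX 3600 format) additionally carry source reels, timecodes and transitions.

```python
def apply_edl(source_frames, edl):
    """Recreate the edited sequence at the second site by concatenating
    the listed ranges of the locally archived, unedited source."""
    output = []
    for start, end in edl:
        output.extend(source_frames[start:end])
    return output

# The first editing system sends only this small EDL, not the media:
edl = [(10, 12), (0, 3)]
source = list(range(20))            # the second archive's identical copy
edited = apply_edl(source, edl)     # -> [10, 11, 0, 1, 2]
```

Because both archives hold the same broadcast data, transmitting the EDL alone is sufficient to reproduce the edit, at a fraction of the bandwidth of transmitting the edited media itself.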
It will be appreciated by one skilled in the art that the invention is not limited to the embodiments described herein. Data may be received in different formats and at different resolutions to the 3Mb/s satellite feed described above and the broadcast and editing resolutions may be different to the 270Mb/s and the 50Mb/s resolutions described. However, it will be appreciated that the principles described herein may be applied to any editing system with an input resolution that is different to the output resolution. The systems and methods described may be particularly advantageous in systems which receive large quantities of data but in which not all of the data is likely to be accessed and used.
A generalised media management system will now be described at a high level with reference to Figure 6 in order to illustrate various aspects of a system in conjunction with which embodiments may be provided. The various features shown will be described in more detail below, in relation to specific examples.
As shown in Fig. 6, a metacore 1200 is at the centre of the system shown and comprises a metadata store 1201 and a media store 1202. Media intake, for example from video feeds, agencies or newsgathering teams, can be received via an edit matrix 1206 which is controlled by a network control system 1208. In order to manage the incoming media effectively, the media may be assigned metadata values which are stored in the metadata store. Media intake can also be received from viewing and editing services 1210 and the archive service 1212. The metadata values may be imported with the incoming media, may be assigned by a system operator or may be given default values. The associated media is then stored in the media store 1202.
Users of the system can use viewing and editing services 1210 to view and edit media managed by the system, and can search the system by metadata attributes to find relevant media. Once the relevant metadata describing the desired media has been found, the system can retrieve the associated media from the media store (if it exists there) for use by the user. Users can create new media items from existing essence, but with new metadata (which may be derived from existing metadata as will be explained below) to be input into the metacore.
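A metadata-attribute search of the kind described might be sketched as follows; the attribute names and records are invented for the example.

```python
# A toy metadata store: one record per Media Item (fields assumed).
metadata_store = [
    {"item_id": "a1", "title": "Election night feed", "source": "satellite"},
    {"item_id": "a2", "title": "Weather graphics", "source": "agency"},
]

def search(store, **criteria):
    """Return metadata records matching every given attribute, so the
    associated essence can then be fetched from the media store."""
    return [rec for rec in store
            if all(rec.get(key) == value for key, value in criteria.items())]

hits = search(metadata_store, source="satellite")
```

Because the search runs against metadata only, it works equally for items whose broadcast-quality essence has already been archived offline.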
The media store is an online store, and media held within it can be accessed and manipulated directly via devices networked to the metacore. As explained above, the practical constraints of media storage dictate that only a certain volume of media can be maintained online in this way, and as new media is constantly fed into the system, existing media must be removed. This is particularly true of the media essence, and less so of the metadata. If it is determined that the media is important and cannot simply be deleted, it must be stored offline, or archived. Both the process of selecting material to be archived and the process of archiving it require considerable resources.
An archive service 1212 is therefore linked to the metacore and the embodiments of the systems and method described herein may be used to provide the archive service. The archive service is in turn linked to one or more persistent analogue or digital data stores, for example VTRs 1214, discs or flash memory. The archive service identifies media, via its metadata, to be taken from the media store and archived (offline). The archive service can also act to re-ingest data into the (online) media store, for example according to the systems and methods described herein.
The metacore is connected to transmission servers 1216. These transmission servers can accept media items, which are ready to be broadcast on transmission system 1218.
The system also supports web-based output, and the metacore is further linked to a post processor 1220, which in turn feeds a web host 1222.
It will be understood that the present invention has been described above purely by way of example, and modification of detail can be made within the scope of the invention.
Each feature disclosed in the description, and (where appropriate) the claims and drawings may be provided independently or in any appropriate combination.