CA2823742A1 - Logging events in media files - Google Patents
- Publication number
- CA2823742A1
- Authority
- CA
- Canada
- Prior art keywords
- events
- video
- media file
- file
- logger
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
Abstract
Logging events in a media file, including: providing a logger tool to allow a user to view media in multiple ways and to capture and validate key events within the media file; and tracking and logging events in the media file by adding information to the media file including locations of bars and tone, slates, content, logos, commercial blacks, quality control issues, subtitles, and captions.
Description
LOGGING EVENTS IN MEDIA FILES
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of priority of co-pending U.S. Provisional Patent Application No. 61/429,720, filed January 4, 2011, entitled "Tech Logger," U.S. Utility Patent Application No. 13/026,134, filed February 11, 2011, entitled "LOGGING EVENTS IN MEDIA FILES," and U.S.
Provisional Patent Application No. 61/534,275, filed September 13, 2011, entitled "Tech Logger." The disclosures of the above-referenced applications are incorporated herein by reference.
BACKGROUND
Field of the Invention
The present invention relates to logging events, and more specifically, to displaying and logging events associated with media files.
Background
Creating lists of events for a video file by hand is tedious and prone to error. Reviewing a tape or video file in one tool while manually entering time codes in another can lead to mistakes and inconsistency. These types of problems can make it more difficult to consistently handle video files in a library.
SUMMARY
Embodiments of the present invention provide for displaying audio and video from data files and attaching metadata to the files.
In one implementation, a method of logging events in a media file is disclosed. The method includes: providing a logger tool to allow a user to view media in multiple ways and to capture and validate key events within the media file;
and tracking and logging events in the media file by adding information to the media file including locations of bars and tone, slates, content, logos, commercial blacks, quality control issues, subtitles, and captions.
In another implementation, a logger tool to log events in video is disclosed. The logger tool includes: an adjustable filmstrip of thumbnails for at least a part of the video; at least one audio waveform for the video; timing information for the video; a plurality of events associated with the video and locations of the events in the video; at least one interface to display and play back the video and the at least one audio waveform; at least one interface to create, edit, and delete events for the video; at least one interface to create re-usable clips from the video; and at least one interface to edit, import, and copy events or groups of events within a file or across files.
In yet another implementation, a non-transitory tangible storage medium storing a computer program for logging events in a media file is disclosed. The computer program includes executable instructions that cause a computer to: enable a user to view media in multiple ways and to capture and validate key events within the media file; and track and log events in the media file by adding information to the media file including locations of bars and tone, slates, content, logos, commercial blacks, quality control issues, subtitles, and captions.
Other features and advantages of the present invention will become more readily apparent to those of ordinary skill in the art after reviewing the following detailed description and accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 shows a screen shot of a queue page of a logger in accordance with one implementation of the present invention.
FIG. 2 shows a snapshot of a video page of the logger reached by clicking a title including a media file name.
FIG. 3A shows a snapshot of a stack view in the video page of the logger in accordance with one implementation of the present invention.
FIG. 3B shows a screenshot of a list of filters displayed when the filter tab is selected.
FIG. 3C shows video information displayed when the video info tab is selected in the tabs area.
FIG. 3D shows logos information displayed when Logos is selected in the tabs area.
FIG. 4A illustrates a representation of a computer system and a user.
FIG. 4B is a functional block diagram illustrating the computer system hosting a logger.
FIG. 5 is a flowchart illustrating a method of logging events in a media file in accordance with one implementation of the present invention.
FIGS. 6-17 are illustrations of implementations of user interfaces for a logger, such as for presenting, selecting, conforming, matching, and logging audio and video elements (e.g., frames, tracks, segments, clips, waveforms, filmstrips, events).
DETAILED DESCRIPTION
Certain implementations as disclosed herein provide for displaying audio and video from data files and attaching metadata to the files. After reading this description it will become apparent how to implement the invention in various alternative implementations and alternative applications. However, although various implementations of the present invention will be described herein, it is understood that these implementations are presented by way of example only, and not limitation. As such, this detailed description of various alternative implementations should not be construed to limit the scope or breadth of the present invention.
In one implementation, a software tool referred to as a logger is used to log events in a media file, such as a movie. The logger tool provides a user interface allowing a user to view the video in multiple ways and add information to the file to track and log events in the file including the locations of bars and tone, slates, content, logos, commercial blacks, quality control issues, subtitles, and captions. The logger tool allows the user to capture and validate key events within the media file required to enable downstream automated post production processes and workflows.
In one implementation, the user interface provides access to the media file and also provides an interface to create, track, and edit events for that media file. The user interface allows automatic presentation and association of events with the media file at their proper location, which can improve throughput and quality of the data. Events can be generated manually by the user within the logger tool and also generated by importing lists or tables of events created externally. The events can then be associated with the media file within the logger tool. For example, a user can import a quality control report into the logger tool and the logger tool is used to create events for the file matching the quality control entries. In another implementation, the logger tool can also present information and views on frame matching and/or differentiation based on imported matched and/or differentiated data.
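The import-and-match behavior described above can be sketched in code. This is purely an illustrative sketch, not the patented implementation: the names (`Event`, `MediaFileLog`, `import_qc_report`) and the row format `(frame, duration, note)` are assumptions introduced here for clarity.

```python
# Hypothetical sketch: importing an externally created quality-control
# report and creating one matching event per report entry in the log.
from dataclasses import dataclass, field

@dataclass
class Event:
    start_frame: int
    end_frame: int
    category: str
    note: str = ""

@dataclass
class MediaFileLog:
    name: str
    events: list = field(default_factory=list)

    def add_event(self, event: Event) -> None:
        # Keep events ordered by their location in the file.
        self.events.append(event)
        self.events.sort(key=lambda e: e.start_frame)

def import_qc_report(log: MediaFileLog, report_rows: list) -> int:
    """Create a QC-issue event for each (frame, duration, note) row."""
    for frame, duration, note in report_rows:
        log.add_event(Event(frame, frame + duration - 1, "QC Issue", note))
    return len(report_rows)
```

A real logger would also validate the report's timecodes against the file's duration before associating the events.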
FIG. 1 shows a screen shot of a queue page 100 of a logger in accordance with one implementation of the present invention. Queues shown on the queue page 100 are designed to track the progress of media files through each status of a logging process.
In the illustrated implementation of FIG. 1, the queue page 100 of the logger includes the following items/fields:
status bar 110, item counter 112, 'sort by drop-down' 114, search field 116, 'expand/collapse all' 118, title 120, identifiers 130, expand 122, thumbnail 124, collapse 126, file specs 128, 'add movie' field 132, and logout. The status bar 110 is clicked to display the files in the selected status, which includes All, Loading, Ready for Logging, User working, Ready for Review, Completed, and Rejected. The item counter 112 displays the number of files showing for the chosen status. The 'sort by drop-down' item 114 is clicked to select an identifier (e.g., Title, Status, Task Id, Added date, Feature, User assigned, and Kit Id) in which the files will be arranged and viewed. The search field 116 displays the files that meet the entered keyword criteria. The 'expand/collapse all' item 118 is clicked to expand or collapse additional file information (i.e., file specs) for all files in the current status. The title 120 includes a file name that is clicked to proceed to a video page of the logger. The identifiers field 130 shows file specific identifying information. The expand icon 122 is clicked to display additional file information. The thumbnail 124 shows a single frame selected to visually represent the file. The collapse icon 126 is clicked to hide additional file information. The file specs 128 show additional technical file information. The 'add movie' field 132 is used to insert a selected file not currently in the logger tool into a loading status.
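The status, search, and sort behavior of the queue page can be sketched as a small filtering function. The function name and the dictionary-based file records are assumptions for illustration; only the status names and identifiers come from the description above.

```python
# Hypothetical sketch of the queue-page behavior: show files in the
# selected status, matching the search keyword, sorted by an identifier.
STATUSES = ["All", "Loading", "Ready for Logging", "User working",
            "Ready for Review", "Completed", "Rejected"]

def filter_queue(files, status="All", keyword="", sort_by="title"):
    """Return the queue rows for one status, keyword, and sort order."""
    rows = [f for f in files
            if (status == "All" or f["status"] == status)
            and keyword.lower() in f["title"].lower()]
    return sorted(rows, key=lambda f: f[sort_by])
```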
FIG. 2 shows a snapshot of a video page 200 of the logger reached by clicking a title including a media file name (e.g., 120 in FIG. 1). In one implementation, the video page 200 of the logger includes sections, controls, and commands that are used to view, verify, and capture events.
For example, the video page 200 of the logger provides/displays the following: an adjustable filmstrip of thumbnails for all or part of a video file; audio waveforms for the video; the video with timing information (e.g., time code, tape time code, frame number); events associated with the video and their location in the file (e.g., by time code); interfaces to display and play back video and audio waveforms; interfaces to create, edit, and delete events for the video file; interfaces to create re-usable clips from a video file (e.g., creating new logos); interfaces for editing, importing, and copying events or groups of events within a file or across files; and interfaces to a user through a web browser.
In the illustrated implementation of FIG. 2, the video page 200 includes the following sections, controls, and commands:
page selector 210, event overview 212, master strip 214, looking glass 216, event strip 218, event indicator 220, anchor 222, audio waveform 224, audio magnification 226, standard timecode 228, tape timecode 230, frame number 232, player controls 234, magnification slider 236, volume slider 238, player pane 242, and stack view 240. The page selector 210 is used to choose which page to view (e.g., Queue, Video, or Audio). The event overview 212 represents sections of the file containing events. In one case, known events and unknown events are represented by different colors.
The master strip 214 represents the entire file timeline from start to end. The looking glass 216 is located in the
master strip 214 and magnifies the surrounded section of the file in the event strip 218. The default position of the looking glass 216 upon opening a new file contains the entire file. The event strip 218 is a magnified section located inside the looking glass 216 on the master strip 214 that can split the file into events. The event indicator 220 is a stroke that outlines each individual event. For example, a first thumbnail within the event indicator 220 is the first frame of the event, and a last thumbnail within the event indicator 220 is the last frame of the event. The anchor 222 is represented by a vertical line that crosses the event strip 218 and audio waveform which represents the location in the file. This file location will display in the player pane 242. The player controls 234 are buttons that control basic playback tasks such as playing, pausing, fast forwarding, and rewinding. The magnification slider 236 adjusts the size of the looking glass 216 which can increase or decrease the amount of the master strip 214 that is displayed in the event strip 218. The player pane 242 displays the frame located to the right side of the anchor 222. The stack view section 240 is the action center of the logger video page 200.
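The video page shows the same position as a frame number and as a timecode. A minimal sketch of that conversion is below, assuming a constant integer frame rate and non-drop-frame timecode; real tools must also handle drop-frame rates such as 29.97 fps, which this sketch deliberately omits.

```python
# Frame number <-> non-drop-frame SMPTE-style timecode (HH:MM:SS:FF),
# assuming an integer frames-per-second rate.
def frame_to_timecode(frame: int, fps: int = 24) -> str:
    seconds, ff = divmod(frame, fps)
    minutes, ss = divmod(seconds, 60)
    hh, mm = divmod(minutes, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def timecode_to_frame(tc: str, fps: int = 24) -> int:
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff
```

For example, at 24 fps, frame 86400 displays as "01:00:00:00".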
In one implementation, the video page 200 of the logger can be navigated using the sections, controls, and commands described above. For example, the master strip 214 can be navigated by clicking and dragging the looking glass 216 to the right or left to view different sections of the file in the event strip 218. The size of the looking glass 216 can be adjusted by moving the magnification slider 236 toward minutes to increase the size of the looking glass 216 and toward frames to decrease the size of the looking glass 216.
In another example, the event strip 218 can be navigated by clicking and dragging the anchor 222 to the right or left along the event strip 218. The event strip 218 can be dragged to the right or left while the anchor 222 remains in the same location. Dragging the event strip 218 also moves the looking glass 216 in the master strip 214. When the desired event on the event strip 218 is clicked, the event strip 218 will move to place the anchor 222 before the first frame of the selected event. Either the Enter key can be pressed or the event on the event strip 218 can be clicked to also expand the event in the center of the strip 218. The up or down arrow key can be used to move to the next or previous event. In yet another example, when an event in the stack view 240 is selected, the event strip 218 will move to place the anchor 222 before the first frame of the selected event, and expand the event in the center of the event strip 218.
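The looking-glass arithmetic implied above can be sketched as a clamped window over the file's frames: the magnification slider sets how many frames are magnified into the event strip, and dragging moves the window without leaving the file bounds. The patent does not specify this arithmetic; the function below is an illustrative assumption.

```python
# Hypothetical looking-glass window math: `span` comes from the
# magnification slider (toward minutes = larger, toward frames =
# smaller), `start` from dragging the glass along the master strip.
def looking_glass_window(total_frames: int, start: int, span: int):
    """Return the (first, last) frame shown in the event strip."""
    span = max(1, min(span, total_frames))           # clamp slider range
    start = max(0, min(start, total_frames - span))  # clamp the drag
    return start, start + span - 1
```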
FIG. 3A shows a snapshot of a stack view 300 in the video page 200 of the logger in accordance with one implementation of the present invention. The stack view 300 shows the tasks being completed as well as filter tools and other information. In the illustrated implementation of FIG. 3A, the stack view pane 300 includes track information 310 (including a track drop down button 312 and an add track button 314), tabs 330 for showing filters 332 (see FIG. 3B), video information 334 (see FIG. 3C), and logos 336 (see FIG. 3D), and an event row 320. As described above, known events and unknown events can be represented by different colors 322.
The stack view pane 300 further includes an 'All Notes Expander' 316 and a 'Notes Expander' 318. The track information 310 section provides options to: import a quality control report, captions, subtitles, or script alignment; copy from a selected title; or create an unknown default event that represents the entire file.
FIG. 3B shows a screenshot of a list of filters 332 displayed when the filter tab is selected. Selecting one or more filters from the list allows viewing of the events contained in an individual track by category. Thus, a filter can be selected to show in the track only the events in that filter category. More than one filter can be turned on at a time by pressing multiple filter buttons, allowing viewing of the events in all of the selected filter categories.
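The multi-filter behavior can be sketched as a set-membership test: with several filter buttons pressed, the track shows the union of events whose category matches any selected filter, and with none pressed all events show. The event representation here is a hypothetical simplification.

```python
# Hypothetical sketch of track filtering by event category.
def apply_filters(events, active_filters):
    """Return events whose category is in any active filter (all if none)."""
    if not active_filters:
        return list(events)
    return [e for e in events if e["category"] in active_filters]
```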
FIG. 3C shows video information 334 displayed when the video info tab is selected in the tabs area 330. The video information 334 provides information such as frame rate, language, and other pertinent video information.
FIG. 3D shows logos information 336 displayed when Logos is selected in the tabs area 330. To view logos in the logos window of the stack view 300, click the logos button under the track name. To search logos, click to place the cursor in the search field with the logo window open. To create a new logo, execute the following steps: create an event that represents the logo from start to end; click on the 'edit mode' icon in the stack view for the event that contains the logo; choose 'Logo' in the event category menu and the corresponding logo type (e.g., Logo, Production Company Logo, Distribution Logo, or Production Logo); place the anchor on the frame that most accurately represents the logo in the event strip; click the 'OK' button or double-click the correct event type in the event category menu; type in the desired logo name in the search field when the logo window appears over the stack view; click the 'Create New' button; and, when the new logo appears in the stack view, click the 'submit' button to assign the newly created logo to the event.
Returning to FIG. 3A, each event row 320 will display the event type it has been assigned, the event description, the duration, as well as the start and end. The measurement of the duration and the start and end information will display based on the highlighted measurement field. Each event type is represented by a different color 322 in the 'event type' column in the stack view 300. Table 1 shown below broadly defines the available event types.
Type: Audio
- Program Audio Start
- Audio Fade Out
- Audio "Two-Pop"
- Audio Sync Point: a hard effect that can be used to sync the audio track with the visual cue.

Type: Bars And Tone
- Bars And Tone: SMPTE color bars together with a continuous 1000 Hz audio tone at the beginning of the file to calibrate playback equipment.

Type: Blacks
- Fade to Black
- Commercial Black: periods of black picture over MOS placed where commercial breaks would be inserted.
- Black Roll-up / Pre-roll: periods of black picture over MOS typically preceding bars and tone.

Type: Caption
- Caption: verifying that the caption is correct and in sync with the video.

Type: Credits
- End Credit Start
- End Credit End: usually located at the end of the program; credits contain information regarding the making of the program.
- Credits out of Safe Action
- Credit Change
- Scrolling end credits start
- Foreign Credit / Dub Card: credits that have been fully localized / a white-on-black card that states the dub talent.

Type: Cropping
- Cropping

Type: Dialogue
- Dialogue
- Foreign Dialogue

Type: Foreign Texted (by Language)
- Foreign Texted Start
- Foreign Texted End: dialogue that is in a language other than the stated OV of the file.
- Foreign Texted Slate

Type: Graphics
- Graphics / Text
- Text Over Picture
- Text In Picture
- Graphics Overlay

Type: Insert
- Insert Start
- Insert End: a texted video clip that is meant to be inserted in the program to take the place of texted OV material.
- Insert Slate

Type: Language
- Language

Type: Logo
- Logo
- Production Company Logo: graphic and audio that represents the entity that produced the material.
- Distribution Logo: graphic and audio that represents the line of business that is distributing the material.
- Production Logo (customized to title): a production company logo that has been integrated into the program in such a fashion that it is not a standard production company logo.

Type: Mains
- Main Title
- Main Title Start
- Main Title End
- First Hard Cut after Mains
- Mains Over Picture Out of Safe Title Within Safe Action

Type: Mastering
- Mastering Note

Type: Music
- Music

Type: Program
- Program Start
- Program End

Type: QC Issue
- QC - Picture issue
- QC - Audio issue

Type: Slates
- Slate
- Insert Slate
- Program Slate: an information card that displays tape metadata relevant to the file, such as feature title, aspect ratio, part timecode, runtime, audio configuration, date, PO#/vendor facility, textless material, source material, etc.
- Trailers Slate
- Textless Slate

Type: Speaker
- Speaker

Type: Gender
- Gender

Type: Subtitles
- Subtitle (in picture): textual versions of the dialog in films and television programs, usually displayed at the bottom of the screen. They can either be a form of written translation of a dialog in a foreign language, or a written rendering of the dialog in the same language, with or without added information to help viewers who are deaf and hard-of-hearing to follow the dialog.
- Subtitle (component validation)

Type: Tape
- Start of Reel / Part
- End of Reel
- MultiPart Join Parts

Type: Textless
- Textless Start
- Textless End: non-texted portions of the program located at the end of the file. Some titles do not have textless material available.
- Textless Slate

Type: Trailer
- Trailer (English)
- Foreign Language Trailer (by language)

Type: Transitions
- Last Hard Cut

Table 1

Each track includes at least one event that represents the entire file from beginning to end, or many imported or copied events that combined include the entire file. Each new event is a portion of an existing event. Thus, to create a new event, place the anchor on or directly in front of the first frame of the event to be created in the event strip.
This will display the first frame of the event in the player pane. Then select the command to split the current event into two events.
The frame to the right of the anchor now represents the first frame of the new event and the frame to the left of the anchor represents the last frame of the previous event. The event will automatically be categorized as Unknown.
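The split operation just described can be sketched as pure data manipulation: the event containing the anchor is divided in two at the anchor frame, and the new right-hand event starts out categorized as Unknown. The `(start, end, category)` tuple representation is a hypothetical simplification.

```python
# Hypothetical sketch of splitting an event at the anchor frame.
def split_event(events, anchor_frame):
    """Split the event containing anchor_frame; return the new event list.

    The frame left of the anchor ends the old event; the frame right of
    the anchor begins the new event, categorized as "Unknown". An anchor
    on an existing boundary leaves the list unchanged.
    """
    out = []
    for start, end, category in events:
        if start < anchor_frame <= end:
            out.append((start, anchor_frame - 1, category))  # left piece
            out.append((anchor_frame, end, "Unknown"))       # new event
        else:
            out.append((start, end, category))
    return out
```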
FIG. 4A illustrates a representation of a computer system 400 and a user 402. The user 402 uses the computer system 400 to log events in a media file, such as a movie.
The computer system 400 stores and executes a logger 490.
FIG. 4B is a functional block diagram illustrating the computer system 400 hosting the logger 490. The controller 410 is a programmable processor and controls the operation of
the computer system 400 and its components. The controller 410 loads instructions (e.g., in the form of a computer program) from the memory 420 or an embedded controller memory (not shown) and executes these instructions to control the system. In its execution, the controller 410 provides the logger 490 as a software system, such as to enable logging of events in a media file. Alternatively, this service can be implemented as separate hardware components in the controller 410 or the computer system 400.
Memory 420 stores data temporarily for use by the other components of the computer system 400. In one implementation, memory 420 is implemented as RAM. In one implementation, memory 420 also includes long-term or permanent memory, such as flash memory and/or ROM.
Storage 430 stores data temporarily or long term for use by other components of the computer system 400, such as for storing data used by the logger 490. In one implementation, storage 430 is a hard disk drive.
The media device 440 receives removable media and reads and/or writes data to the inserted media. In one implementation, for example, the media device 440 is an optical disc drive.
The user interface 450 includes components for accepting user input from the user of the computer system 400 and presenting information to the user. In one implementation, the user interface 450 includes a keyboard, a mouse, audio speakers, and a display. The controller 410 uses input from the user to adjust the operation of the computer system 400.
The I/O interface 460 includes one or more I/O ports to connect to corresponding I/O devices, such as external storage or supplemental devices (e.g., a printer or a PDA).
In one implementation, the ports of the I/O interface 460 include ports such as: USB ports, PCMCIA ports, serial ports, and/or parallel ports. In another implementation, the I/O
interface 460 includes a wireless interface for communicating with external devices.
The network interface 470 includes a wired and/or wireless network connection, such as an RJ-45 or "Wi-Fi"
interface (including, but not limited to 802.11) supporting an Ethernet connection.
The computer system 400 includes additional hardware and software typical of computer systems (e.g., power, cooling, operating system), though these components are not specifically shown in FIG. 4B for simplicity. In other implementations, different configurations of the computer system can be used (e.g., different bus or storage configurations or a multi-processor configuration).
FIG. 5 is a flowchart illustrating a method 500 of logging events in a media file in accordance with one implementation of the present invention. In the illustrated implementation, the method comprises configuring a logger tool, at box 510, to allow a user to view media in multiple ways (box 512). The user also captures and validates key events within the media file, at box 514. Events in the media file are tracked and logged, at box 520, by adding information to the media file, at box 522, including locations of bars and tone, slates, content, logos, commercial blacks, quality control issues, subtitles, and captions.
Various implementations can include, but are not limited to, one or more of the following items:
= Providing an adjustable filmstrip of thumbnails for all or part of a video file = Displaying audio waveforms for the video = Displaying the video with timing information (e.g., time code, tape time code, frame number) = Displaying events associated with the video and their location in the file (e.g., by time code) = Providing a UI (user interface) to control the display and playback of video and audio waveforms = Providing a UI to create, edit, and delete events for the video file = Providing a UI to create re-usable clips from a video file (e.g., creating new logos) = Providing a UI for editing, importing, and copying events or groups of events within a file or across files = Automatically creating selected events by analyzing a file (e.g., commercial blacks or slates) = Providing a UI and operations for frame matching, allowing a user to match frames within or across files = Providing a UI and operations for audio conforms, finding similarities and differences in audio waveforms = Providing a UI and operations for audio component creation = Providing a UI and operations for AVID export - ability to export the frame match data to an Avid AAF (Advanced Authoring Format) / EDL (Edit Decision List) or Quicktime reference movie = Providing a UI and operations for on screen annotation in the video UI (i.e. hand draw on the frame) = Providing a UI and operations for QC report generation = Providing a UI and operations for auto text in picture detection = Providing a UI and operations for speech to text processing and results display with editing capability = Providing a UI and operations for manual transcription tools = Providing the interface to a user through a web browser = Providing the audio and video through the logger using streaming from a server, instead of or in addition to using download and local copies of files In one implementation, the logger includes components to .
support a set of features that allow users to match frames from the same, or different movie files. This frame matching feature's underlying algorithm centers around a basic concept known as the law of absolute difference, and compares positive and negative frames to determine relevance of the match and then returns the results based on a defined threshold. This functionality provides users with the ability to create textless masters and foreign texted masters by matching the inserts to the original program. In one example, a first file contains the original movie and a second file contains inserts, groups of frames to replace corresponding groups of frame's in the original (e.g., frames with localized text for a particular language). Using frame matching, the user can identify the original frames that match the insert frames and then indicate which frames in the original movie file to replace with which frames from the insert file (manually and/or automatically). The logger can then output a new version using the original frames with selected frames replaced with the selected insert frames. Alternatively, the logger can create a file (e.g., a table of references) that guides playback between the original file and the insert file.
A user can then create another file for different language using a different insert file.
The UI provides the users with the ability to see the results play side by side in a player window and includes default storyboard mode as well as a view in "precise" mode where users can fix and adjust if there are insert/original frame mismatches of inconsistencies. In addition, users can preview their versions "real-time" and toggle between languages. This allows for a preview of the foreign language master "virtual edit" before it's rendered into an actual file. During playback in the preview area, inserts will be added to the movie on the fly by keying off of the EDL
created during the matching process. Users can also select audio and text elements to render as part of the preview.
In one implementation, the logger includes components to support a set of features for audio conforming and audio analysis that allow users to compare wave forms to each other and find similarities and/or differences. This technology is part of the logger feature set and is part of the audio UI.
In another implementation, the audio analysis and/or audio UI
can be implemented in a separate program or component. The following figures illustrate aspects of the audio UI. Users can select a "gold" reference channel - this is a channel that all of the other channels will be conformed to (e.g., offset or shifted to synchronize). Once the results come back the audio channels will lock into place and offsets will be recorded. .Users then validate that the conform results are accurate and lock the component. As new components are ingested for that title and this auto conform process runs, only unlocked components will be analyzed.
In another implementation, the logger includes components to support a set of features for creating audio components. For example, a user can combine multi-part audio components to one part, e.g., 6 reels of audio rendered into 1 longplay file. In one implementation, this audio component creation is also part of the logger audio UI. The audio UI
provides the users a feature that allows users to ingest multi-part audio components, conform them and then render a new component. The resulting components could then be used in a distribution system or other post-production workflows. One implementation of audio component creation also provides these features: sample rate conversion, sync pop removal, basic envelopes and real-time preview:
The above description of the disclosed implementations is provided to enable any person skilled in the art to make or use the invention. Various modifications to these implementations will be readily apparent to those skilled in the art, and the generic principles described herein can be applied to other implementations without departing from the spirit or scope of the invention. Accordingly, additional implementations and variations are also within the scope of the invention. For example, the examples focus on displaying and logging for movies, but a logger can be specialized for other video, such as television shows, internet video, or user generated content, or for audio, such as radio or podcasts, or other content, such as games or text, or combinations thereof (e.g., matching and conforming video, audio, and text, such as for screenplay matching and tracking). All features of each example are not necessarily required in a particular logger implementation. Further, it is to be understood that the description and drawings presented herein are representative of the subject matter which is broadly contemplated by the present invention. It is further understood that the scope of the present invention fully encompasses other implementations that may become obvious to those skilled in the art and that the scope of the present invention is accordingly limited by nothing other than the appended claims.
FIG. 3C shows video information displayed when the video info tab is selected in the tabs area.
FIG. 3D shows logos information displayed when Logos is selected in the tabs area.
FIG. 4A illustrates a representation of a computer system and a user.
FIG. 4B is a functional block diagram illustrating the computer system hosting a logger.
FIG. 5 is a flowchart illustrating a method of logging events in a media file in accordance with one implementation of the present invention.
FIGS. 6-17 are illustrations of implementations of user interfaces for a logger, such as for presenting, selecting, conforming, matching, and logging audio and video elements (e.g., frames, tracks, segments, clips, waveforms, filmstrips, events).
DETAILED DESCRIPTION
Certain implementations as disclosed herein provide for displaying audio and video from data files and attaching metadata to the files. After reading this description it will become apparent how to implement the invention in various alternative implementations and alternative applications. However, although various implementations of the present invention will be described herein, it is understood that these implementations are presented by way of example only, and not limitation. As such, this detailed description of various alternative implementations should not be construed to limit the scope or breadth of the present invention.
In one implementation, a software tool referred to as a logger is used to log events in a media file, such as a movie. The logger tool provides a user interface allowing a user to view the video in multiple ways and add information to the file to track and log events in the file including the locations of bars and tone, slates, content, logos, commercial blacks, quality control issues, subtitles, and captions. The logger tool allows the user to capture and validate key events within the media file required to enable downstream automated post production processes and workflows.
In one implementation, the user interface provides access to the media file and also provides an interface to create, track, and edit events for that media file. The user interface allows automatic presentation and association of events with the media file at their proper location, which can improve throughput and quality of the data. Events can be generated manually by the user within the logger tool and also generated by importing lists or tables of events created externally. The events can then be associated with the media file within the logger tool. For example, a user can import a quality control report into the logger tool and the logger tool is used to create events for the file matching the quality control entries. In another implementation, the logger tool can also present information and views on frame matching and/or differentiation based on imported matched and/or differentiated data.
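As an illustrative sketch of how an imported quality control report might become events in the logger, assuming a simple row format with field names (tc_in, tc_out, issue) that are not specified in the text:

```python
from dataclasses import dataclass

@dataclass
class Event:
    """A logged event tied to a location in the media file."""
    category: str      # e.g. "QC Issue", "Logo", "Commercial Black"
    start_tc: str      # SMPTE timecode of the first frame
    end_tc: str        # SMPTE timecode of the last frame
    note: str = ""

def events_from_qc_report(rows):
    """Turn externally created QC entries into logger events.

    Each row is assumed to be a dict like {"tc_in": ..., "tc_out": ...,
    "issue": ...}; the real report format is not specified in the text.
    """
    return [
        Event(category="QC Issue",
              start_tc=row["tc_in"],
              end_tc=row["tc_out"],
              note=row.get("issue", ""))
        for row in rows
    ]

report = [{"tc_in": "01:02:03:04", "tc_out": "01:02:05:00", "issue": "dropout"}]
events = events_from_qc_report(report)
```

The events can then be associated with the media file at the locations given by their timecodes, matching the quality control entries as described.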
FIG. 1 shows a screen shot of a queue page 100 of a logger in accordance with one implementation of the present invention. Queues shown on the queue page 100 are designed to track the progress of media files through each status of a logging process.
In the illustrated implementation of FIG. 1, the queue page 100 of the logger includes the following items/fields:
status bar 110, item counter 112, 'sort by drop-down' 114, search field 116, 'expand/collapse all' 118, title 120, identifiers 130, expand 122, thumbnail 124, collapse 126, file specs 128, 'add movie' field 132, and logout. The status bar 110 is clicked to display the files in the selected status, which includes All, Loading, Ready for Logging, User working, Ready for Review, Completed, and Rejected. The item counter 112 displays the number of files showing for the chosen status. The 'sort by drop-down' item 114 is clicked to select an identifier (e.g., Title, Status, Task Id, Added date, Feature, User assigned, and Kit Id) in which the files will be arranged and viewed. The search field 116 displays the files that meet the entered keyword criteria. The 'expand/collapse all' item 118 is clicked to expand or collapse additional file information (i.e., file specs) for all files in the current status. The title 120 includes a file name that is clicked to proceed to a video page of the logger. The identifiers field 130 shows file specific identifying information. The expand icon 122 is clicked to display additional file information. The thumbnail 124 shows a single frame selected to visually represent the file. The collapse icon 126 is clicked to hide additional file information. The file specs 128 show additional technical file information. The 'add movie' field 132 is used to insert a selected file not currently in the logger tool into a loading status.
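The status filtering behavior of the status bar 110 can be sketched as follows; the file records and their field names are assumed for illustration:

```python
# Statuses named on the status bar 110 ("All" plus the logging states).
STATUSES = ["Loading", "Ready for Logging", "User working",
            "Ready for Review", "Completed", "Rejected"]

def filter_queue(files, status):
    """Return the files shown when a status bar entry is clicked.

    'All' shows every file; any other status shows only matching files.
    """
    if status == "All":
        return list(files)
    return [f for f in files if f["status"] == status]

queue = [{"title": "Movie A", "status": "Loading"},
         {"title": "Movie B", "status": "Completed"}]
```

The item counter 112 would then simply display the length of the filtered list.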
FIG. 2 shows a snapshot of a video page 200 of the logger reached by clicking a title including a media file name (e.g., 120 in FIG. 1). In one implementation, the video page 200 of the logger includes sections, controls, and commands that are used to view, verify, and capture events.
For example, the video page 200 of the logger provides/displays the following: an adjustable filmstrip of thumbnails for all or part of a video file; audio waveforms for the video; the video with timing information (e.g., time code, tape time code, frame number); events associated with the video and their location in the file (e.g., by time code); interfaces to display and playback video and audio waveforms; interfaces to create, edit, and delete events for the video file; interfaces to create re-usable clips from a video file (e.g., creating new logos); interfaces for editing, importing, and copying events or groups of events within a file or across files; interfaces to a user through a web browser.
In the illustrated implementation of FIG. 2, the video page 200 includes the following sections, controls, and commands:
page selector 210, event overview 212, master strip 214, looking glass 216, event strip 218, event indicator 220, anchor 222, audio waveform 224, audio magnification 226, standard timecode 228, tape timecode 230, frame number 232, player controls 234, magnification slider 236, volume slider 238, player pane 242, and stack view 240. The page selector 210 is used to choose which page to view (e.g., Queue, Video, or Audio). The event overview 212 represents sections of the file containing events. In one case, known events and unknown events are represented by different colors.
The master strip 214 represents the entire file timeline from start to end. The looking glass 216 is located in the
master strip 214 and magnifies the surrounded section of the file in the event strip 218. The default position of the looking glass 216 upon opening a new file contains the entire file. The event strip 218 is a magnified section located inside the looking glass 216 on the master strip 214 that can split the file into events. The event indicator 220 is a stroke that outlines each individual event. For example, a first thumbnail within the event indicator 220 is the first frame of the event, and a last thumbnail within the event indicator 220 is the last frame of the event. The anchor 222 is represented by a vertical line that crosses the event strip 218 and audio waveform which represents the location in the file. This file location will display in the player pane 242. The player controls 234 are buttons that control basic playback tasks such as playing, pausing, fast forwarding, and rewinding. The magnification slider 236 adjusts the size of the looking glass 216 which can increase or decrease the amount of the master strip 214 that is displayed in the event strip 218. The player pane 242 displays the frame located to the right side of the anchor 222. The stack view section 240 is the action center of the logger video page 200.
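The timing displays described above (standard timecode 228, frame number 232) imply a conversion between frame numbers and timecode. A minimal non-drop-frame sketch, with the frame rate as an assumption since the text does not fix one:

```python
def frame_to_timecode(frame, fps=24):
    """Convert a zero-based frame number to an HH:MM:SS:FF timecode string.

    Non-drop-frame only; fps=24 is an assumption for illustration.
    """
    ff = frame % fps
    total_seconds = frame // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"
```

For example, `frame_to_timecode(24)` yields `"00:00:01:00"` at 24 fps; a tape timecode display would add the tape's start offset.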
In one implementation, the video page 200 of the logger can be navigated using the sections, controls, and commands described above. For example, the master strip 214 can be navigated by clicking and dragging the looking glass 216 to the right or left to view different sections of the file in the event strip 218. The size of the looking glass 216 can be adjusted by moving the magnification slider 236 toward minutes to increase the size of the looking glass 216 and toward frames to decrease the size of the looking glass 216.
In another example, the event strip 218 can be navigated by clicking and dragging the anchor 222 to the right or left along the event strip 218. The event strip 218 can be dragged to the right or left while the anchor 222 remains in the same location. Dragging the event strip 218 also moves the looking glass 216 in the master strip 214. When the desired event on the event strip 218 is clicked, the event strip 218 will move to place the anchor 222 before the first frame of the selected event. Either the Enter key can be pressed or the event on the event strip 218 can be clicked to also expand the event in the center of the strip 218. The up or down arrow key can be used to move to the next or previous event. In yet another example, when an event in the stack view 240 is selected, the event strip 218 will move to place the anchor 222 before the first frame of the selected event, and expand the event in the center of the event strip 218.
FIG. 3A shows a snapshot of a stack view 300 in the video page 200 of the logger in accordance with one implementation of the present invention. The stack view 300 shows the tasks being completed as well as filter tools and other information. In the illustrated implementation of FIG.
3A, the stack view pane 300 includes track information 310 (including a track drop down button 312 and an add track button 314), tabs 330 for showing filters 332 (see FIG. 3B), video information 334 (see FIG.
3D), and event row 320. As described above, known events and unknown events can be represented by different colors 322.
The stack view pane 300 further includes 'All Notes Expander' 316 and 'Notes Expander' 318. The track information 310 section provides options to: import quality control report, captions, subtitles, or script alignment; copy from a selected title; or create an unknown default event that represent the entire file.
FIG. 3B shows a screenshot of a list of filters 332 displayed when the filter tab is selected. A selection of one or more filters from the list of filters allows viewing of the events contained in an individual track by category.
Thus, the filter can be selected to show in the track only the events in that filter category. More than one filter can be turned on at one time to allow viewing of the events in the selected filter categories by pressing multiple filter buttons.
FIG. 3C shows video information 334 displayed when the video info tab is selected in the tabs area 330. The video information 334 provides information such as frame rate, language, and other pertinent video information.
FIG. 3D shows logos information 336 displayed when Logos is selected in the tabs area 330. To view logos in the logos window of the stack view 300, click the logos button under the track name. To search logos, click to place the cursor in the search field with the logo window open. To create a new logo, execute the following steps: create an event that represents the logo from start to end; click on the 'edit mode' icon in the stack view for the event that contains the logo; choose 'Logo' in the event category menu and the corresponding logo type (e.g., Logo, Production Company Logo, Distribution Logo, or Production Logo); place the anchor on the frame that most accurately represents the logo in the event strip; click the 'OK' button or double-click the correct event type in the event category menu; type in the desired logo name in the search field when the logo window appears over the stack view; click the 'Create New' button;
and click the 'submit' button to assign the newly created logo to the event when the new logo appears in the stack view.
Returning to FIG. 3A, each event row 320 will display the event type it has been assigned, the event description, duration, as well as start and end. The duration and start and end information will display based on the highlighted measurement field. Each event type is represented by a different color 322 in the 'event type' column in the stack view 300. Table 1 shown below broadly defines the available event types.
Table 1: Event types (Type: Categories, with definitions where given)

- Audio: Program Audio Start; Audio Fade Out; Audio "Two-Pop"; Audio Sync Point (a hard effect that can be used to sync the audio track with the visual cue).
- Bars And Tone: Bars And Tone (SMPTE color bars together with a continuous 1000 Hz audio tone at the beginning of the file to calibrate playback equipment).
- Blacks: Fade to Black; Commercial Black (periods of black picture over MOS placed where commercial breaks would be inserted); Black Roll-up / Pre-roll (periods of black picture over MOS typically preceding bars and tone).
- Caption: Caption (verifying that the caption is correct and in sync with the video).
- Credits: End Credit Start; End Credit End (usually located at the end of the program, credits contain information regarding the making of the program); Credits out of Safe Action; Credit Change; Scrolling End Credits Start; Foreign Credit / Dub Card (credits that have been fully localized / a white-on-black card that states the dub talent).
- Cropping: Cropping.
- Dialogue: Dialogue; Foreign Dialogue.
- Foreign Texted (by Language): Foreign Texted Start; Foreign Texted End (dialogue that is in a language other than the stated OV of the file); Foreign Texted Slate.
- Graphics: Graphics / Text; Text Over Picture; Text In Picture; Graphics Overlay.
- Insert: Insert Start; Insert End (texted video clip that is meant to be inserted in the program to take the place of texted OV material); Insert Slate.
- Language: Language.
- Logo: Logo; Production Company Logo (graphic and audio that represents the entity that produced the material); Distribution Logo (graphic and audio that represents the line of business that is distributing the material); Production Logo, customized to title (a production company logo that has been integrated into the program in such a fashion that it is not a standard production company logo).
- Mains: Main Title; Main Title Start; Main Title End; First Hard Cut after Mains; Mains Over Picture; Out of Safe Title; Within Safe Action.
- Mastering Note: Mastering Note.
- Music: Music.
- Program: Program Start; Program End.
- QC Issue: QC - Picture Issue; QC - Audio Issue.
- Slates: Insert Slate; Program Slate (information card that displays tape metadata relevant to the file, such as feature title, aspect ratio, part, timecode, runtime, audio configuration, date, P.O. # / vendor facility, textless material, source material, etc.); Trailers Slate; Textless Slate.
- Speaker: Speaker; Gender.
- Subtitles: Subtitle (in picture): textual versions of the dialog in films and television programs, usually displayed at the bottom of the screen; they can either be a form of written translation of a dialog in a foreign language, or a written rendering of the dialog in the same language, with or without added information to help viewers who are deaf or hard-of-hearing follow the dialog; Subtitle (component validation).
- Tape: Start of Reel / Part; End of Reel; MultiPart Join Parts.
- Textless: Textless Start; Textless End (non-texted portions of the program located at the end of the file; some titles do not have textless material available); Textless Slate.
- Trailer: Trailer (English); Foreign Language Trailer (by language).
- Transitions: Last Hard Cut.

Each track includes at least one event that represents the entire file from beginning to end, or many imported or copied events that combined include the entire file. Each new event is a portion of an existing event. Thus, to create a new event, place the anchor on or directly in front of the first frame of the event to be created in the event strip.
This will display the first frame of the event in the player pane. Select to split the current event into two events.
The frame to the right of the anchor now represents the first frame of the new event and the frame to the left of the anchor represents the last frame of the previous event. The event will automatically be categorized as Unknown.
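The event-splitting behavior described above can be sketched as follows, modeling events as inclusive frame ranges (a simplification of the logger's actual data model):

```python
def split_event(events, anchor_frame):
    """Split the event containing anchor_frame into two events.

    events: list of dicts with inclusive 'start' and 'end' frame numbers.
    The frame at the anchor becomes the first frame of the new event,
    which is automatically categorized as Unknown, mirroring the
    described behavior.
    """
    out = []
    for ev in events:
        if ev["start"] < anchor_frame <= ev["end"]:
            out.append({**ev, "end": anchor_frame - 1})
            out.append({"start": anchor_frame, "end": ev["end"],
                        "category": "Unknown"})
        else:
            out.append(ev)
    return out

# A track starts as one event covering the entire file.
track = [{"start": 0, "end": 999, "category": "Unknown"}]
track = split_event(track, 400)
```

The frame to the left of the anchor (frame 399) ends the previous event, and the frame to its right (frame 400) begins the new one.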
FIG. 4A illustrates a representation of a computer system 400 and a user 402. The user 402 uses the computer system 400 to log events in a media file, such as a movie.
The computer system 400 stores and executes a logger 490.
FIG. 4B is a functional block diagram illustrating the computer system 400 hosting the logger 490. The controller 410 is a programmable processor and controls the operation of
the computer system 400 and its components. The controller 410 loads instructions (e.g., in the form of a computer program) from the memory 420 or an embedded controller memory (not shown) and executes these instructions to control the system. In its execution, the controller 410 provides the logger 490 as a software system, such as to enable logging of events in a media file. Alternatively, this service can be implemented as separate hardware components in the controller 410 or the computer system 400.
Memory 420 stores data temporarily for use by the other components of the computer system 400. In one implementation, memory 420 is implemented as RAM. In one implementation, memory 420 also includes long-term or permanent memory, such as flash memory and/or ROM.
Storage 430 stores data temporarily or long term for use by other components of the computer system 400, such as for storing data used by the logger 490. In one implementation, storage 430 is a hard disk drive.
The media device 440 receives removable media and reads and/or writes data to the inserted media. In one implementation, for example, the media device 440 is an optical disc drive. -The user interface 450 includes components for accepting user input from the user of the computer system 400 and presenting information to the user. In one implementation, the user interface 450 includes a keyboard, a mouse, audio speakers, and a display. The controller 410 uses input from the user to adjust the operation of the computer system 400.
The I/O interface 460 includes one or more I/O ports to connect to corresponding I/O devices, such as external storage or supplemental devices (e.g., a printer or a PDA).
In one implementation, the ports of the I/O interface 460 include ports such as: USB ports, PCMCIA ports, serial ports, and/or parallel ports. In another implementation, the I/O
interface 460 includes a wireless interface for communication with external devices wirelessly.
The network interface 470 includes a wired and/or wireless network connection, such as an RJ-45 or "Wi-Fi"
interface (including, but not limited to 802.11) supporting an Ethernet connection.
The computer system 400 includes additional hardware and software typical of computer systems (e.g., power, cooling, operating system), though these components are not specifically shown in FIG. 4B for simplicity. In other implementations, different configurations of the computer system can be used (e.g., different bus or storage configurations or a multi-processor configuration).
FIG. 5 is a flowchart illustrating a method 500 of logging events in a media file in accordance with one implementation of the present invention. In the illustrated implementation, the method comprises configuring a logger tool, at box 510, to allow a user to view media in multiple ways (box 512). The user also captures and validates key events within the media file, at box 514. Events in the media file are tracked and logged, at box 520, by adding information to the media file, at box 522, including locations of bars and tone, slates, content, logos, commercial blacks, quality control issues, subtitles, and captions.
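One assumed realization of adding event information to the media file (box 522) is a JSON sidecar written next to the file; embedding the data in container metadata would be another. The path and field names here are illustrative:

```python
import json

def log_events_to_sidecar(media_path, events):
    """Write logged events next to the media file as a JSON sidecar.

    Each event is a dict (e.g., category and timecode fields); the real
    on-disk format is not specified in the text.
    """
    sidecar = media_path + ".events.json"
    with open(sidecar, "w") as fh:
        json.dump({"media": media_path, "events": events}, fh, indent=2)
    return sidecar
```

Downstream automated post-production processes could then read the sidecar to locate bars and tone, slates, commercial blacks, and the other logged events.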
Various implementations can include, but are not limited to, one or more of the following items:
- Providing an adjustable filmstrip of thumbnails for all or part of a video file
- Displaying audio waveforms for the video
- Displaying the video with timing information (e.g., time code, tape time code, frame number)
- Displaying events associated with the video and their location in the file (e.g., by time code)
- Providing a UI (user interface) to control the display and playback of video and audio waveforms
- Providing a UI to create, edit, and delete events for the video file
- Providing a UI to create re-usable clips from a video file (e.g., creating new logos)
- Providing a UI for editing, importing, and copying events or groups of events within a file or across files
- Automatically creating selected events by analyzing a file (e.g., commercial blacks or slates)
- Providing a UI and operations for frame matching, allowing a user to match frames within or across files
- Providing a UI and operations for audio conforms, finding similarities and differences in audio waveforms
- Providing a UI and operations for audio component creation
- Providing a UI and operations for AVID export: the ability to export the frame match data to an Avid AAF (Advanced Authoring Format) / EDL (Edit Decision List) or QuickTime reference movie
- Providing a UI and operations for on-screen annotation in the video UI (i.e., hand drawing on the frame)
- Providing a UI and operations for QC report generation
- Providing a UI and operations for automatic text-in-picture detection
- Providing a UI and operations for speech-to-text processing and results display with editing capability
- Providing a UI and operations for manual transcription tools
- Providing the interface to a user through a web browser
- Providing the audio and video through the logger using streaming from a server, instead of or in addition to using download and local copies of files

In one implementation, the logger includes components to support a set of features that allow users to match frames from the same, or different, movie files. This frame matching feature's underlying algorithm centers on a basic concept known as the law of absolute difference, and compares positive and negative frames to determine the relevance of the match and then returns the results based on a defined threshold. This functionality provides users with the ability to create textless masters and foreign texted masters by matching the inserts to the original program. In one example, a first file contains the original movie and a second file contains inserts, groups of frames to replace corresponding groups of frames in the original (e.g., frames with localized text for a particular language). Using frame matching, the user can identify the original frames that match the insert frames and then indicate which frames in the original movie file to replace with which frames from the insert file (manually and/or automatically). The logger can then output a new version using the original frames with selected frames replaced with the selected insert frames. Alternatively, the logger can create a file (e.g., a table of references) that guides playback between the original file and the insert file.
A user can then create another file for a different language using a different insert file.
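A minimal sketch of the frame-matching comparison, reading the "law of absolute difference" as a mean-of-absolute-differences score, with the threshold value as an assumption. Frames are modeled as flat lists of pixel intensities; a real logger would operate on decoded image buffers:

```python
def frame_distance(a, b):
    """Mean absolute pixel difference between two equal-size frames
    (lower means a closer match)."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def match_insert(original_frames, insert_frame, threshold=8.0):
    """Return the index of the original frame that best matches
    insert_frame, or None if no frame falls under the threshold."""
    best_i, best_d = None, threshold
    for i, frame in enumerate(original_frames):
        d = frame_distance(frame, insert_frame)
        if d < best_d:
            best_i, best_d = i, d
    return best_i

# Three toy "frames" of 16 pixels each.
originals = [[0] * 16, [100] * 16, [200] * 16]
```

An insert frame close to the second original (e.g., the same shot with localized text over a few pixels) matches index 1; a frame far from all originals returns None, leaving the user to match it manually.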
The UI provides the users with the ability to see the results play side by side in a player window and includes a default storyboard mode as well as a view in "precise" mode where users can fix and adjust if there are insert/original frame mismatches or inconsistencies. In addition, users can preview their versions in real time and toggle between languages. This allows for a preview of the foreign language master "virtual edit" before it is rendered into an actual file. During playback in the preview area, inserts will be added to the movie on the fly by keying off of the EDL
created during the matching process. Users can also select audio and text elements to render as part of the preview.
In one implementation, the logger includes components to support a set of features for audio conforming and audio analysis that allow users to compare wave forms to each other and find similarities and/or differences. This technology is part of the logger feature set and is part of the audio UI.
In another implementation, the audio analysis and/or audio UI
can be implemented in a separate program or component. The following figures illustrate aspects of the audio UI. Users can select a "gold" reference channel: this is a channel that all of the other channels will be conformed to (e.g., offset or shifted to synchronize). Once the results come back, the audio channels will lock into place and offsets will be recorded. Users then validate that the conform results are accurate and lock the component. As new components are ingested for that title and this auto-conform process runs, only unlocked components will be analyzed.
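The offset search against a "gold" reference channel can be sketched as a brute-force correlation; a production conform would presumably use FFT-based cross-correlation on full waveforms rather than this toy search:

```python
def best_offset(gold, channel, max_shift=50):
    """Sample offset that best aligns a channel to the gold reference.

    A positive result means the channel lags the reference by that many
    samples. Scores each candidate shift by a dot product and keeps the
    best one.
    """
    def score(shift):
        return sum(gold[i] * channel[i + shift]
                   for i in range(len(gold))
                   if 0 <= i + shift < len(channel))
    return max(range(-max_shift, max_shift + 1), key=score)

# A toy reference waveform and a copy of it delayed by 7 samples.
gold = [0] * 20 + [1, 2, 3, 2, 1] + [0] * 20
delayed = [0] * 7 + gold[:-7]
```

Once the offset is found (here, 7 samples), the channel can be shifted into place and the offset recorded, matching the lock-and-validate workflow described above.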
In another implementation, the logger includes components to support a set of features for creating audio components. For example, a user can combine multi-part audio components into one part, e.g., 6 reels of audio rendered into 1 longplay file. In one implementation, this audio component creation is also part of the logger audio UI. The audio UI allows users to ingest multi-part audio components, conform them, and then render a new component. The resulting components can then be used in a distribution system or other post-production workflows. One implementation of audio component creation also provides these features: sample rate conversion, sync pop removal, basic envelopes, and real-time preview.
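The reel-to-longplay step above is essentially concatenation of conformed per-reel audio, with any sync pops trimmed. A minimal sketch, treating audio as sample lists and assuming the sync pop occupies a fixed number of head samples on each reel after the first (the function name and that assumption are illustrative, not from the disclosure):

```python
def render_longplay(reels, sync_pop_samples=0):
    """Concatenate per-reel audio (lists of samples) into one longplay
    component, trimming `sync_pop_samples` from the head of each reel
    after the first. Conform offsets are assumed to be applied already."""
    longplay = list(reels[0])
    for reel in reels[1:]:
        longplay.extend(reel[sync_pop_samples:])
    return longplay
```

Sample rate conversion and envelope application would be additional passes over the resulting buffer before the new component is rendered and handed to downstream distribution workflows.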
The above description of the disclosed implementations is provided to enable any person skilled in the art to make or use the invention. Various modifications to these implementations will be readily apparent to those skilled in the art, and the generic principles described herein can be applied to other implementations without departing from the spirit or scope of the invention. Accordingly, additional implementations and variations are also within the scope of the invention. For example, the examples focus on displaying and logging for movies, but a logger can be specialized for other video, such as television shows, internet video, or user generated content, or for audio, such as radio or podcasts, or other content, such as games or text, or combinations thereof (e.g., matching and conforming video, audio, and text, such as for screenplay matching and tracking). All features of each example are not necessarily required in a particular logger implementation. Further, it is to be understood that the description and drawings presented herein are representative of the subject matter which is broadly contemplated by the present invention. It is further understood that the scope of the present invention fully encompasses other implementations that may become obvious to those skilled in the art and that the scope of the present invention is accordingly limited by nothing other than the appended claims.
Claims (16)
1. A method of logging events in a media file, the method comprising:
providing a logger tool to allow a user to view media in multiple ways and to capture and validate key events within the media file; and tracking and logging events in the media file by adding information to the media file including locations of bars and tone, slates, content, logos, commercial blacks, quality control issues, subtitles, and captions.
2. The method of claim 1, wherein the media is a movie.
3. The method of claim 1, wherein the key events are required to enable downstream automated post production processes and workflows.
4. The method of claim 1, wherein the logger tool automatically presents and associates the events with the media file at proper locations.
5. The method of claim 1, wherein tracking and logging events in the media file comprises importing a quality control report having quality control entries into the logger tool; and generating events for the media file matching the quality control entries.
6. The method of claim 1, wherein the events are generated manually by the user within the logger tool.
7. A logger tool to log events in video, the logger tool comprising:
an adjustable filmstrip of thumbnails for at least a part of the video;
at least one audio waveform for the video;
timing information for the video;
a plurality of events associated with the video and locations of the events in the video;
at least one interface to display and playback the video and the at least one audio waveform;
at least one interface to create, edit, and delete events for the video;
at least one interface to create re-usable clips from the video; and at least one interface to edit, import, and copy events or groups of events within a file or across files.
8. The logger tool of claim 7, wherein the timing information comprises at least one of standard time code, tape time code, and frame number.
9. The logger tool of claim 7, wherein the at least one interface to create re-usable clips comprises at least one interface to create new logos.
10. The logger tool of claim 7, wherein the at least one interface to display and playback the video and the at least one audio waveform comprises at least one user interface through a web browser.
11. The logger tool of claim 7, wherein the at least one interface to display and playback the video and the at least one audio waveform comprises at least one of a master strip, a looking glass, an event strip, an event indicator, an anchor, an audio magnification, player controls, a magnification slider, a volume slider, a player pane, and a stack view.
12. A non-transitory tangible storage medium storing a computer program for logging events in a media file, the computer program comprising executable instructions that cause a computer to:
enable a user to view media in multiple ways and to capture and validate key events within the media file; and track and log events in the media file by adding information to the media file including locations of bars and tone, slates, content, logos, commercial blacks, quality control issues, subtitles, and captions.
13. The non-transitory tangible storage medium of claim 12, wherein the media is a movie.
14. The non-transitory tangible storage medium of claim 12, wherein the key events are required to enable downstream automated post production processes and workflows.
15. The non-transitory tangible storage medium of claim 12, wherein executable instructions that cause a computer to enable a user to view media in multiple ways and to capture and validate key events within the media file comprise executable instructions that cause a computer to automatically present and associate the events with the media file at proper locations.
16. The non-transitory tangible storage medium of claim 12, wherein executable instructions that cause a computer to track and log events in the media file comprise executable instructions that cause a computer to:
import a quality control report having quality control entries; and generate events for the media file matching the quality control entries.
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161429720P | 2011-01-04 | 2011-01-04 | |
US61/429,720 | 2011-01-04 | ||
US13/026,134 | 2011-02-11 | ||
US13/026,134 US20120170914A1 (en) | 2011-01-04 | 2011-02-11 | Logging events in media files |
US201161534275P | 2011-09-13 | 2011-09-13 | |
US61/534,275 | 2011-09-13 | ||
PCT/US2012/020220 WO2012094417A1 (en) | 2011-01-04 | 2012-01-04 | Logging events in media files |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2823742A1 true CA2823742A1 (en) | 2012-07-12 |
Family
ID=46457700
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2823742A Abandoned CA2823742A1 (en) | 2011-01-04 | 2012-01-04 | Logging events in media files |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP2661701A1 (en) |
JP (1) | JP2014506434A (en) |
KR (1) | KR20140051115A (en) |
CN (1) | CN103534695A (en) |
BR (1) | BR112013017179A2 (en) |
CA (1) | CA2823742A1 (en) |
WO (1) | WO2012094417A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013040244A1 (en) * | 2011-09-13 | 2013-03-21 | Sony Corporation | Logging events in media files including frame matching |
CN111741231B (en) | 2020-07-23 | 2022-02-22 | 北京字节跳动网络技术有限公司 | Video dubbing method, device, equipment and storage medium |
KR20220099016A (en) * | 2021-01-05 | 2022-07-12 | 삼성전자주식회사 | Electronic apparatus and controlling method thereof |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6357042B2 (en) * | 1998-09-16 | 2002-03-12 | Anand Srinivasan | Method and apparatus for multiplexing separately-authored metadata for insertion into a video data stream |
US7548565B2 (en) * | 2000-07-24 | 2009-06-16 | Vmark, Inc. | Method and apparatus for fast metadata generation, delivery and access for live broadcast program |
US20040125124A1 (en) * | 2000-07-24 | 2004-07-01 | Hyeokman Kim | Techniques for constructing and browsing a hierarchical video structure |
US20050165840A1 (en) * | 2004-01-28 | 2005-07-28 | Pratt Buell A. | Method and apparatus for improved access to a compacted motion picture asset archive |
US9665629B2 (en) * | 2005-10-14 | 2017-05-30 | Yahoo! Inc. | Media device and user interface for selecting media |
US8515757B2 (en) * | 2007-03-20 | 2013-08-20 | Nuance Communications, Inc. | Indexing digitized speech with words represented in the digitized speech |
US20090210395A1 (en) * | 2008-02-12 | 2009-08-20 | Sedam Marc C | Methods, systems, and computer readable media for dynamically searching and presenting factually tagged media clips |
US8311390B2 (en) * | 2008-05-14 | 2012-11-13 | Digitalsmiths, Inc. | Systems and methods for identifying pre-inserted and/or potential advertisement breaks in a video sequence |
2012
- 2012-01-04 JP JP2013548487A patent/JP2014506434A/en active Pending
- 2012-01-04 BR BR112013017179A patent/BR112013017179A2/en not_active IP Right Cessation
- 2012-01-04 CN CN201280008597.3A patent/CN103534695A/en active Pending
- 2012-01-04 CA CA2823742A patent/CA2823742A1/en not_active Abandoned
- 2012-01-04 WO PCT/US2012/020220 patent/WO2012094417A1/en active Application Filing
- 2012-01-04 KR KR1020137020640A patent/KR20140051115A/en not_active Application Discontinuation
- 2012-01-04 EP EP12731902.8A patent/EP2661701A1/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
BR112013017179A2 (en) | 2016-09-20 |
CN103534695A (en) | 2014-01-22 |
WO2012094417A1 (en) | 2012-07-12 |
KR20140051115A (en) | 2014-04-30 |
JP2014506434A (en) | 2014-03-13 |
EP2661701A1 (en) | 2013-11-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10404959B2 (en) | Logging events in media files | |
US7432940B2 (en) | Interactive animation of sprites in a video production | |
US11856315B2 (en) | Media editing application with anchored timeline for captions and subtitles | |
US8302010B2 (en) | Transcript editor | |
US9437247B2 (en) | Preview display for multi-camera media clips | |
US7334026B2 (en) | Collaborative remote operation of computer programs | |
US9881215B2 (en) | Apparatus and method for identifying a still image contained in moving image contents | |
US20100050080A1 (en) | Systems and methods for specifying frame-accurate images for media asset management | |
US20110307526A1 (en) | Editing 3D Video | |
EP1587109A1 (en) | Editing system for audiovisual works and corresponding text for television news | |
US20190107906A1 (en) | Time-based metadata management system for digital media | |
US8819558B2 (en) | Edited information provision device, edited information provision method, program, and recording medium | |
WO2013040244A1 (en) | Logging events in media files including frame matching | |
GB2520041A (en) | Automated multimedia content editing | |
US20140006978A1 (en) | Intelligent browser for media editing applications | |
CA2823742A1 (en) | Logging events in media files | |
US20220180902A1 (en) | Media management system | |
US20220189511A1 (en) | User interface for video editing system | |
US10692536B1 (en) | Generation and use of multiclips in video editing | |
Denoue et al. | Content-based copy and paste from video documents | |
Brenneis et al. | Final Cut Pro X: Visual QuickStart Guide | |
AU2002301447B2 (en) | Interactive Animation of Sprites in a Video Production | |
Althagafi et al. | MIDB: A Web-Based Film Annotation Tool. | |
KR20110010082A (en) | Method and apparatus for generating meta data of content data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request |
Effective date: 20130703 |
FZDE | Discontinued |
Effective date: 20160817 |