WO2001011627A1 - Generation and versatile usage of effect tree presets - Google Patents

Info

Publication number
WO2001011627A1
WO2001011627A1 (PCT/US2000/021522)
Authority
WO
WIPO (PCT)
Prior art keywords
effect
tree
input nodes
group
effect tree
Prior art date
Application number
PCT/US2000/021522
Other languages
French (fr)
Inventor
Thomas P. Nadas
Shailendra Mathur
Original Assignee
Avid Technology, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Avid Technology, Inc. filed Critical Avid Technology, Inc.
Priority to AU65266/00A priority Critical patent/AU6526600A/en
Publication of WO2001011627A1 publication Critical patent/WO2001011627A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34Indicating arrangements 
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing

Definitions

  • the present invention relates generally to the application of an effect tree to a clip or clips of video information, and more particularly to the application of an effect tree to a clip or clips of video information in a timeline editing context with a variable number of inputs and outputs which are defined by the user, prior to the application.
  • Computer software is routinely used to depict graphic images on a computer display, under the control of specially designed graphics software.
  • the graphics software is meant to execute in conjunction with an operating system that supports the graphics capabilities of the system; i.e. the software must understand graphics concepts such as brush, pen, and color support.
  • Dedicated graphics processors and general purpose computer systems with high speed processors and large, high resolution monitors have increased the ability to display high quality graphics when used in conjunction with the graphics software.
  • Video editing and computer animation applications are the typical uses of the software.
  • Softimage Co. of Montreal, Canada provides an example of computer graphics software that is commercially available.
  • This software provides for the ability to modify video and animation clips by applying certain effects to the clips, according to the specifications of the user. Examples of these effects include a brush effect for painting the clip, a dissolve effect for transitioning between clips, and a color selection effect for selectively changing a specified color in a clip to a new color.
  • other effects are possible, and it is not intended that those identified above are the only ones contemplated.
  • Effects may be chained together to form an "effect tree” for implementing more complex effects.
  • an "effect tree” is a Directed Acyclic Graph, or DAG, of effects.
  • the output of one effect is input to one or more subsequent effects.
  • Effects can be applied in a layered or sequential manner but are not restricted to a chain.
  • a typical effect tree operates on a set of input images by passing them through one or more effects such as a blur or tint, and then mixing them together using operations such as "multiply" or "over”.
  • An effect tree is usually presented to the user as a graph in which each effect within the effect tree is a node of the graph, and is represented as an icon on a display device.
  • the graph depicts the input or inputs to each node of the effect tree and its resulting output.
  • the effect tree is editable by the user. The user can change interconnections between effects, add or delete effects using the effect tree user interface, and can store the edited effect tree for later retrieval and use (a preset).
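The DAG structure described above can be sketched in a few lines of Python. This is a minimal illustrative model, not the patented implementation: the class name `EffectNode`, the toy effects, and the use of numbers in place of images are all assumptions made for the sake of a self-contained example.

```python
# Minimal sketch of an effect tree as a DAG of effect nodes.
# EffectNode, the effect names, and numeric "images" are illustrative only.

class EffectNode:
    """One effect in the tree: a function applied to its inputs' outputs."""
    def __init__(self, name, fn, inputs=()):
        self.name = name
        self.fn = fn                 # callable taking the rendered inputs
        self.inputs = list(inputs)   # upstream EffectNodes (the DAG edges)

    def render(self):
        # Recursively render upstream nodes, then apply this effect.
        return self.fn(*[node.render() for node in self.inputs])

# Toy "images" are plain numbers so the example stays self-contained.
src_a = EffectNode("clip A", lambda: 10.0)
src_b = EffectNode("clip B", lambda: 4.0)
blur = EffectNode("blur", lambda img: img * 0.9, [src_a])
tint = EffectNode("tint", lambda img: img + 1.0, [src_b])
# Mixing node: the outputs of both chains feed a single mixing operation.
mix = EffectNode("mix", lambda a, b: (a + b) / 2, [blur, tint])

print(mix.render())  # blurred A and tinted B, mixed together -> 7.0
```

Because each node only references its upstream inputs, the same structure supports both a simple chain and an arbitrary acyclic graph of effects.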
  • Effect trees can be applied to a video or animation clip or multiple clips in a compositing context or a timeline editing context.
  • In a compositing context, the emphasis is on applying a DAG of effects on different layers of material which typically occupy a completely overlapping time span. The layers are composited together at the end to produce an output clip which occupies a time span which is the intersection of the input material time spans. In this context multiple inputs can be fed through a DAG to produce the final output.
  • In a timeline editing context, the emphasis is on piecing together input material so that the material is almost stacked end to end to produce an output clip. Some material may be overlapped, and transition effects applied in the overlap area.
  • In addition, single effects or stacks of effects can be applied to single clips, changing only that clip.
  • the present invention advantageously provides for an effect tree in which the input source media and possibly the outputs are disassociated from the tree of effects.
  • the same effects tree preset can be applied in different effects environments, acting for example as a compositor, a filter, transition or source generator on a timeline, or as a packaged effect within another effect tree.
  • the input material to the effect tree is a video or animation clip, but the invention is not limited to only video or animation data.
  • Other forms of multimedia input are contemplated to be within the scope of the invention.
  • a timing operation is imposed on an effect tree that has been stored either previously as a preset for subsequent retrieval or newly created.
  • the input to the effect tree is transitioned to the output in response to the timing parameters that are specified by a user.
  • the timing parameters may specify the time period over which the effect tree will be active on an input, or may determine the progression of a transition effect from one state to another.
  • the number of input or output nodes for an effect tree may be less than the desired number of external connections at the time of applying the effect tree and consequently more input or output nodes are created dynamically in order to accommodate the increased connections.
  • the number of input or output nodes for an effect tree may be more than the desired number of external connections at the time of applying the effect tree and thus input or output nodes are ignored by the effect tree in order to satisfy the reduced number of connections.
  • the timing parameters are user selectable and independent of the effect tree itself. Consequently, any effect tree can have a timing operation imposed on it, and the creator of the effect tree need not be concerned with this aspect of the effect tree when it is constructed.
  • the timing parameters for a timing operation are easily changed in order to experiment with the response of the effect tree to the timing changes. Consequently, the user enjoys a significant flexibility in applying the timing parameters to a particular effect tree.
  • specifying a timing operation for an effect tree automatically causes a change in any graph associated with the effect tree in order to conform to the timing operation.
  • FIG. 1 is a drawing of a computer system suitable for implementing a system for editing effect trees, according to the present invention.
  • FIG. 2 depicts the hardware components of the computer system of FIG. 1 in further detail.
  • FIG. 3 depicts a user interface, according to the invention, that allows for the application of an effect tree to two video clips.
  • FIG. 4 depicts a user interface, according to the invention, that allows for the saving of a newly created effect tree as a preset for future use.
  • FIG. 5 depicts a user interface, according to the invention, that allows for the application of an effect tree to an input video in a timed manner.
  • FIG. 6 shows the creation of an effect tree preset, according to the invention.
  • FIG. 7 shows the application of the preset of Fig. 6 as a transition.
  • FIG. 8 shows the creation of an effect tree preset, according to the invention.
  • FIG. 9 shows the application of the preset of Fig. 8 as a filter.
DETAILED DESCRIPTION
  • the present invention is described for illustrative purposes with reference to the editing of video information.
  • the invention, in its broadest sense, is applicable to applications other than video applications, and it is not intended that the scope of the invention be so limited.
  • the present invention is also applicable to the editing of audio data, and to media data in general.
  • a computer graphics imaging system 10 is schematically depicted in FIG. 1 and FIG. 2.
  • the graphics imaging system 10 includes a computer 11 that has a central processing unit (CPU) 12, a system bus 14, a static memory 16, a main memory 18, a mass memory 20, an alphanumeric input device 22, a pointer device 24 for manipulating a cursor and making selections of data, and a display adapter 26 for coupling control signals to a video display 28 such as a computer monitor. Since the graphics imaging system 10 is particularly suited to high-resolution, high-speed graphics imaging, the display or monitor 28 is most preferably a high-resolution wide-screen display.
  • the CPU 12 executes imaging software described below to allow the system 10 to render high quality graphics images on the monitor 28.
  • the CPU 12 comprises a suitable processing device such as a microprocessor, for example, and may comprise a plurality of suitable processing devices.
  • the CPU 12 executes instructions stored in the static memory 16, main memory 18, and/or mass memory 20.
  • the static memory 16 may comprise read only memory (ROM) or any other suitable memory device.
  • the static memory may store, for example, a boot program for execution by CPU 12 to initialize the graphics imaging system 10.
  • the main memory 18 may comprise random access memory (RAM) or any other suitable memory device.
  • the mass memory 20 may include a hard disk device, a floppy disk, an optical disk, a flash memory device, a CDROM, a file server device or any other suitable memory device.
  • the term memory comprises a single memory device and any combination of suitable devices for the storage of data and instructions.
  • the system bus 14 provides for the transfer of digital information between the hardware devices of the graphics imaging system 10.
  • the CPU 12 also receives data over the system bus 14 that is input by a user through alphanumeric input device 22 and/or the pointer device 24.
  • the alphanumeric input device 22 may comprise a keyboard, for example, that comprises alphanumeric keys.
  • the alphanumeric input device 22 may comprise other suitable keys such as function keys for example.
  • the pointer device 24 may comprise a mouse, track-ball, and/or joystick, for example, for controlling the movement of a cursor displayed on the computer display 28.
  • the graphics imaging system 10 of FIG. 1 also includes display adapter hardware 26 that may be implemented as a circuit that interfaces with system bus 14 for facilitating rendering of images on the computer display 28.
  • the display adapter hardware 26 may, for example, be implemented with a special graphics processor printed circuit board including dedicated random access memory that helps speed the rendering of high resolution, color images on a viewing screen of the display 28.
  • the display 28 may comprise a cathode ray tube (CRT) or a liquid crystal display particularly suited for displaying graphics on its viewing screen.
  • the invention can be implemented using high-speed graphics workstations as well as personal computers having one or more high-speed processors.
  • the graphics imaging system 10 utilizes specialized graphics software particularly suited to take advantage of the imaging hardware included in the display system 10 depicted in FIG. 1 and FIG. 2.
  • the software implements nonlinear editing, compositing, audio mixing, and graphics design suites which are used to create multimedia presentations.
  • Source material for use with such a system can be obtained from a media storage device 50 that can include videotape, film reel, and digitally recorded videodisks.
  • the source material can also be in the form of previously digitized materials stored on a computer memory 20 such as computer generated animations, graphic images or video files stored on a large capacity hard or fixed disk storage device.
  • the system 10 includes a multimedia interface 30 for converting image data into a form suitable for use by the software executing on CPU 12 and display adapter 26.
  • a representative display by the graphics software presents multiple images 52 of different resolutions.
  • FIG. 3 illustrates a typical user interface or screen display 110 for use in graphics imaging by a graphics design suite that forms a part of the graphics software.
  • the screen display includes an image viewing area 112, an effect tree viewing area 114 for viewing the current effect tree, and a property editor region 116 for displaying a curve or graph of the user specified timing to be associated with the effect tree in the effect tree view area 114.
  • the user interface may include a number of sculpted buttons that are actuated by selecting or clicking on the button with the pointer device 24.
  • the graphics software is executed under an operating system that includes functions for creating multiple tasks to be executed on the computer system 10.
  • a taskbar 118 located alongside the effect tree view area 114 allows the user to switch between tasks by activating or selecting different icons in the taskbar 118.
  • the graphic design suite having the user interface depicted in FIG. 3 is launched by choosing or selecting a graphics icon from the multiple icons displayed in the taskbar. Others of the multiple icons cause the software to display graphics particularly suited for other tasks such as editing, audio mixing, or combining multiple video layers into a combined audiovisual output.
  • the graphics software of graphics imaging system 10 includes the capability for a user to specify certain effects that can be applied to modify a video image or animation.
  • the software may support a "blur" effect that results in a blur of an input image.
  • a "tint” effect applies a tint to the received image.
  • An effect is a digital filter for modifying in some manner an input image to produce an output image.
  • the effect tree 113, shown in the effect tree area 114, has a first input 120, which is an input channel for a first video clip - C1, and a second input 122, which is an input channel for a second video clip - C2.
  • the first input 120 is introduced into a blur effect 124 for blurring the image presented at first input 120.
  • the second input 122 is introduced into a color correction effect 126 for changing the color in some portion of the second image.
  • the output of blur effect 124 is input to a dissolve effect 128, along with the output of the color correction effect 126.
  • a dissolve effect blends between the images from the first input 134 and the second input 136 over the overlapping time spans of the inputs.
  • the output 132 of the effect tree 113 is presented in the viewing area 114.
  • the combination of the blur effect 124, the color correction effect 126, the dissolve effect 128, along with the associated inputs and output comprise an example of an effect tree.
  • the effect tree 113 includes two inputs since the dissolve effect requires two inputs.
  • an effect tree may allow for any number of inputs and outputs.
  • Effect trees may be named and stored in non-volatile memory for future use as a preset effect tree for later retrieval and application to an object. It may also be included in other stored effect trees to generate an even more complex effect. It is important to note that what is stored are only the effects, their parameters with animation information, their interconnections with each other, and the input nodes. The actual media or a reference to the media connected to the input nodes of the effect tree is not stored. Therefore, a stored effect tree becomes a stored object that is available for modifying an input object or objects to produce an output in a deterministic fashion. It may be retrieved from storage and also used as a template to create other effect trees by editing the original effect or by adding or deleting other effects, inputs, and outputs to create the new effect tree.
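The key property above, that a preset stores the effects, their parameters, and their interconnections but never the media itself, can be sketched as a serialization routine. The function name `save_preset`, the field names, and the JSON encoding are illustrative assumptions; the patent does not specify a storage format.

```python
# Hedged sketch: saving an effect tree as a preset. Only the effects, their
# parameters, their interconnections, and the input nodes are serialized;
# the media attached to the input nodes is deliberately left out.
import json

def save_preset(effects, connections, input_nodes):
    """effects: {name: params}; connections: [(src, dst)]; input_nodes: [names]."""
    preset = {
        "effects": effects,          # effect names and their parameter values
        "connections": connections,  # DAG edges between effects/input nodes
        "input_nodes": input_nodes,  # placeholders; no media reference stored
    }
    return json.dumps(preset)

blob = save_preset(
    effects={"blur": {"radius": 4}, "dissolve": {"curve": "linear"}},
    connections=[("in1", "blur"), ("blur", "dissolve"), ("in2", "dissolve")],
    input_nodes=["in1", "in2"],
)
restored = json.loads(blob)
print(restored["input_nodes"])  # ['in1', 'in2'] - nodes survive, media does not
```

Because the input nodes are stored as bare placeholders, the restored preset can later be bound to entirely different media, which is what makes it reusable as a template.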
  • FIG. 4 there is shown a second display screen 150 including a window or panel 152 for saving an effect tree as a preset.
  • Panel 152 is generated by clicking on the icon 154 in taskbar 118 to request saving the effect tree.
  • the effect tree that is saved is the effect tree 113 displayed in the effect tree area 114.
  • the name of the saved effect tree is entered into a file name entry 154 to allow future retrieval of the saved effect tree.
  • an effect tree, whether retrieved from storage or newly created, is applied to an object in a timed manner.
  • a third display screen 160 shows a user interface in FIG. 5 for applying an effect tree in a timed manner.
  • the effect tree 113 that is to be applied to one or more video clips is shown in effect tree panel 162.
  • the output image from the effect tree is again shown in the image viewer 112.
  • the display screen 160 includes a timing panel 168 for specifying timing parameters to be used in the timed application of the effect tree 113.
  • Timing panel 168 depicts three timing markers 170, 172, and 174 which identify the timing associated with the dissolve effect of the effect tree 113 shown in effect tree panel 162.
  • Timing marker 170 identifies that initially the first input video clip 164 is fed directly to the viewer as the output. At the time identified by the timing marker 172, the second video clip 166 is also input and the dissolve effect tree is active.
  • the dissolve effect is active until the time identified at timing marker 174 wherein the second video clip is the only input and is fed directly to the viewer 112 as the output.
  • the time period between the timing markers 172 and 174 is user specifiable, and specifies the time period when both inputs are active. This time period is identified in the timing panel 168 as a shaded area 176 to emphasize this condition.
  • the user may define a timed application of the video inputs to the effect tree.
  • the effect tree transitions the input video to the prescribed output over a specified period of time which is identified by the timing markers set in the timing panel 168.
  • the graphical imaging system advantageously provides for the timed control of both the input to an effect tree and the duration of the response of the effect tree.
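The three-marker behaviour described above can be sketched as a small state function: before the first overlap marker only the first clip feeds the output, inside the shaded overlap both clips feed the dissolve, and after it only the second clip remains. The function name and the marker times used here are illustrative assumptions, not values from the patent.

```python
# Sketch of the timing-marker semantics (markers 170/172/174): which input
# clips are active at a given time t. Marker times are example values.

def active_inputs(t, overlap_in=4.0, overlap_out=6.0):
    """Return which clips feed the output at time t."""
    if t < overlap_in:
        return ("clip1",)            # clip 1 passed straight to the viewer
    if t <= overlap_out:
        return ("clip1", "clip2")    # shaded overlap: dissolve is active
    return ("clip2",)                # clip 2 passed straight to the viewer

print(active_inputs(2.0))   # ('clip1',)
print(active_inputs(5.0))   # ('clip1', 'clip2')
print(active_inputs(7.0))   # ('clip2',)
```

Because the marker times are plain parameters, the same effect tree can be re-timed freely without touching the tree itself, which is the independence the summary claims.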
  • a graph area 180 is provided for specifying, in graphical form, a rate of response for a transition effect that is included in the effect tree.
  • a transition effect is an effect that morphs the input into the output over a period of time.
  • the graphical imaging system provides that the transition effect may be applied at different rates over the time period of the transition.
  • the graph 182 indicates that the dissolve effect is applied in a linear manner from 0% dissolve to 100% dissolve (i.e. at a constant rate from the first video clip to the second video clip).
  • Graph 182 is a plot of the percentage of the effect versus time, and different rates for applying a transition effect may be specified by plotting a different graph in the graph area 180.
  • the graph 182 in the graph area 180 is automatically resized, in response to a change in the timing panel 168, for any transition effect specified in the effect tree.
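The linear percentage-versus-time curve of graph 182 amounts to a clamped linear interpolation over the transition span. The sketch below assumes that reading: the function name and its parameters are illustrative, and a non-linear curve would simply substitute a different mapping.

```python
# Sketch of the percentage-vs-time transition graph: a linear ramp from
# 0% to 100% dissolve over the transition span [start, end].

def dissolve_amount(t, start, end):
    """Fraction of the second clip shown at time t, clamped to [0, 1]."""
    if end <= start:
        return 1.0
    return min(1.0, max(0.0, (t - start) / (end - start)))

# At the midpoint of the transition the dissolve is at 50%.
print(dissolve_amount(5.0, start=4.0, end=6.0))  # 0.5
```

Plotting a different graph in area 180 corresponds to replacing this linear mapping with another monotone function of the same clamped range.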
  • the actual media or a reference to the original media associated with the input or inputs to the effect tree are disassociated from the effect tree itself.
  • the inputs to an effects tree are represented by specially marked "input nodes". These input nodes serve the purpose of preserving the connections of effect nodes to the inputs.
  • input nodes serve the purpose of preserving the connections of effect nodes to the inputs.
  • When an effects tree preset is saved, the input nodes as well as the connections of other effects to these nodes are saved.
  • an external connection to the input's source is made to none, some, or all of these input nodes, depending on the context of application and the number of input sources available at the time of application.
  • the number of input nodes inside the effects tree preset may be less than the number of source clips that are desired or available to be connected. Consequently, additional inputs are created dynamically within the effects tree so that every media source available to the tree has a corresponding input node in the tree (even if there was not one there at the time of the effect tree creation). These newly created input nodes would initially not have any connections to any effects in the effects tree DAG. The user is free to use the effects tree interface to make the connections after the fact. Under all circumstances, all source media externally available to the effects tree are available through input nodes within the effects tree itself. For example, this condition may arise if the preset had been saved with only one input node, but is being applied as a transition, which by definition has two clips available as inputs.
  • When the effects tree preset is applied as a source generator, none of the input nodes need be connected externally.
  • Application of the preset in this context is applicable if the DAG of effects inside the effects tree is capable of generating material such as a noise generator.
  • When the preset is applied in a context where only a single clip of material is available, such as when applied as a clip effect or track effect, only one input node is connected externally to the clip automatically. The rest of the input nodes, if they exist, remain unconnected to any external material. This externally connected and unconnected state of the input nodes is also presented visually to the user.
  • When the preset is initially loaded as a node within another effects tree, no external connections are made to the effects tree inputs. Since the preset shows up as a node with all of its inputs displayed as ports, the user is free to make the external connections to these inputs by drawing connections to other nodes in the DAG.
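The context-dependent binding just described, grow the node list when there are more sources than nodes, leave surplus nodes unconnected when there are fewer, can be sketched as one function. The name `bind_inputs` and the node-naming scheme are hypothetical; only the matching behaviour is taken from the text.

```python
# Hedged sketch of matching a preset's input nodes to the sources available
# in the application context: extra nodes are created dynamically when there
# are more sources than nodes, and surplus nodes are left unconnected (None).

def bind_inputs(input_nodes, sources):
    """Return {node: source-or-None}, growing the node list if needed."""
    nodes = list(input_nodes)
    while len(nodes) < len(sources):          # preset saved with too few inputs
        nodes.append(f"in{len(nodes) + 1}")   # create a new, unconnected node
    return {node: (sources[i] if i < len(sources) else None)
            for i, node in enumerate(nodes)}

print(bind_inputs(["in1"], ["clipA", "clipB"]))   # transition: node added
print(bind_inputs(["in1", "in2"], ["clipA"]))     # filter: in2 unconnected
print(bind_inputs(["in1"], []))                   # source generator: none bound
```

The three calls correspond to the three contexts in the text: a one-input preset applied as a transition, a two-input preset applied as a filter, and any preset applied as a source generator.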
  • FIG. 6 shows the creation of an effect tree preset 610.
  • FIG. 6 depicts a User Interface 600 for producing a graphics output and showing an effect tree 610 having three effects.
  • the effects are Blur effect 612, Color Correction effect 614, and Wipe effect 616.
  • a single input 618 is presented to the Blur effect 612, and the output of that blur is passed to the Color Correction effect 614.
  • the output of the Color Correction effect 614 is input to the Wipe effect 616.
  • the Wipe effect 616 provides for two inputs, Input1 640 and Input2 642.
  • User Interface 600 includes a View Area 620 for showing the graphical output of the effect tree 610, and a timeline 630 for adjusting the display of View Area 620 according to time.
  • the effect tree 610 is saved as a preset for subsequent retrieval and reuse.
  • FIG. 7 shows the application of the same effect tree 610 as a preset.
  • the effect tree 610b is now applied as a transition in which two source clips are available to be used, Input1 710 and Input2 720.
  • the original effect tree in FIG. 6 was created with only one input node.
  • the available input node is connected externally to one of the source clips.
  • the effect tree creates a second input node dynamically and makes an external connection to the second source clip automatically. After these operations, both input source clips are available to the tree through the two input nodes, to be used for any further connections within the effect tree.
  • the user is free to connect this new input node to the input of any other effect in the effect tree.
  • FIG. 8 is similar to FIG. 6 and shows the creation of an effect tree preset including a Wipe effect in which two inputs are specified, Input1 820 and Input2 830.
  • effect tree 810 is shown to be connected to two source clips.
  • Effect tree 810 is saved as a preset and applied as a filter in FIG. 9.
  • effect tree 810 is connected to only one source clip through input node 1. Since no other source media is available, there is no external connection made for input node 2. In one practice of the invention, user feedback is provided to show that the second input has no external connection.

Abstract

The present invention provides for the application of an effect tree (114) to a clip or clips of video information, and more particularly to the application of an effect tree (114) to a clip or clips of video information in a timeline editing context with a variable number of inputs and outputs which are defined by the user, prior to the application.

Description

GENERATION AND VERSATILE USAGE OF EFFECT TREE PRESETS
RELATED APPLICATIONS
This application is a Continuation-in-Part of the pending application having serial number 09/369,615, which was filed on August 6, 1999 and incorporates by reference the subject matter disclosed therein.
FIELD OF THE INVENTION
The present invention relates generally to the application of an effect tree to a clip or clips of video information, and more particularly to the application of an effect tree to a clip or clips of video information in a timeline editing context with a variable number of inputs and outputs which are defined by the user, prior to the application.
BACKGROUND OF THE INVENTION
Computer software is routinely used to depict graphic images on a computer display, under the control of specially designed graphics software. The graphics software is meant to execute in conjunction with an operating system that supports the graphics capabilities of the system; i.e. the software must understand graphics concepts such as brush, pen, and color support. Dedicated graphics processors and general purpose computer systems with high speed processors and large, high resolution monitors have increased the ability to display high quality graphics when used in conjunction with the graphics software.
Graphics software packages that run on various hardware configurations are commercially available. Video editing and computer animation applications are the typical uses of the software. Softimage Co. of Montreal, Canada provides an example of computer graphics software that is commercially available. This software provides for the ability to modify video and animation clips by applying certain effects to the clips, according to the specifications of the user. Examples of these effects include a brush effect for painting the clip, a dissolve effect for transitioning between clips, and a color selection effect for selectively changing a specified color in a clip to a new color. Of course other effects are possible, and it is not intended that those identified above are the only ones contemplated.
Effects may be chained together to form an "effect tree" for implementing more complex effects. In essence, an "effect tree" is a Directed Acyclic Graph, or DAG, of effects. The output of one effect is input to one or more subsequent effects. Effects can be applied in a layered or sequential manner but is not restricted to a chain. A typical effect tree operates on a set of input images by passing them through one or more effects such as a blur or tint, and then mixing them together using operations such as "multiply" or "over".
An effect tree is usually presented to the user as a graph in which each effect within the effect tree is a node of the graph, and is represented as an icon on a display device. The graph depicts the input or inputs to each node of the effect tree and its resulting output. The effect tree is editable by the user. The user can change interconnections between effects, add or delete effects using the effect tree user interface, and can store the edited effect tree for later retrieval and use (a preset).
Effect trees can be applied to a video or animation clip or multiple clips in a compositing context or a timeline editing context. In a compositing context, the emphasis is on applying a DAG of effects on different layers of material which typically occupy a completely overlapping time span. The layers are composited together at the end to produce an output clip which occupies a time span which is the intersection of the input material time spans. In this context multiple inputs can be fed through a DAG to produce the final output. In a timeline editing context, the emphasis is on piecing together input material so that the material is almost stacked end to end to produce an output clip. Some material may be overlapped, and transition effects applied in the overlap area. In addition, single or stacks of effects can be applied to single clips, changing only that clip. In this context, a single effect or an effects tree composed of multiple effects can only operate on a single input or in the case of transitions a maximum of two inputs. There is a fixed limitation on the number of inputs and output in this context. Hence, the basis of distinction between the two contexts is this limitation of inputs and outputs. Henceforth, this context will be referred to as the limited context.
It would thus be desirable to provide computer graphics software that permits re-use of the same effect tree between compositing, editing, and other limited contexts regardless of the number of inputs and outputs present in the effect tree.
SUMMARY OF THE INVENTION
The present invention advantageously provides for an effect tree in which the input source media and possibly the outputs are disassociated from the tree of effects. The same effects tree preset can be applied in different effects environments, acting for example as a compositor, a filter, transition or source generator on a timeline, or as a packaged effect within another effect tree.
In one aspect of the invention, the input material to the effect tree is a video or animation clip, but the invention is not limited to only video or animation data. Other forms of multimedia input are contemplated to be within the scope of the invention.
In a yet further aspect of the invention, a timing operation is imposed on an effect tree that has been stored either previously as a preset for subsequent retrieval or newly created. The input to the effect tree is transitioned to the output in response to the timing parameters that are specified by a user. The timing parameters may specify the time period over which the effect tree will be active on an input, or may determine the progression of a transition effect from one state to another.
In a still further aspect of the invention, the number of input or output nodes for an effect tree may be less than the desired number of external connections at the time of applying the effect tree and consequently more input or output nodes are created dynamically in order to accommodate the increased connections.
In a further aspect of the invention, the number of input or output nodes for an effect tree may be more than the desired number of external connections at the time of applying the effect tree, and thus the excess input or output nodes are ignored by the effect tree in order to satisfy the reduced number of connections.
In another aspect of the invention, the timing parameters are user selectable and independent of the effect tree itself. Consequently, any effect tree can have a timing operation imposed on it, and the creator of the effect tree need not be concerned with this aspect of the effect tree when it is constructed.
In a further aspect of the invention, the timing parameters for a timing operation are easily changed in order to experiment with the response of the effect tree to the timing changes. Consequently, the user enjoys a significant flexibility in applying the timing parameters to a particular effect tree.
In a still further aspect of the invention, specifying a timing operation for an effect tree automatically causes a change in any graph associated with the effect tree in order to conform to the timing operation.

BRIEF DESCRIPTION OF THE DRAWINGS
A specific embodiment of the invention will now be described, by way of example, with reference to the accompanying drawings in which:
FIG. 1 is a drawing of a computer system suitable for implementing a system for editing effect trees, according to the present invention.
FIG. 2 depicts the hardware components of the computer system of FIG. 1 in further detail.
FIG. 3 depicts a user interface, according to the invention, that allows for the application of an effect tree to two video clips.
FIG. 4 depicts a user interface, according to the invention, that allows for the saving of a newly created effect tree as a preset for future use.
FIG. 5 depicts a user interface, according to the invention, that allows for the application of an effect tree to an input video in a timed manner.
FIG. 6 shows the creation of an effect tree preset, according to the invention.
FIG. 7 shows the application of the preset of FIG. 6 as a transition.
FIG. 8 shows the creation of an effect tree preset, according to the invention.
FIG. 9 shows the application of the preset of FIG. 8 as a filter.

DETAILED DESCRIPTION
In the following discussion, the present invention is described for illustrative purposes with reference to the editing of video information. However, one of ordinary skill in the art will recognize that the invention, in its broadest sense, is applicable to applications other than video applications, and it is not intended that the scope of the invention be so limited. For example, the present invention is also applicable to the editing of audio data, and to media data in general.
A computer graphics imaging system 10 is schematically depicted in FIG. 1 and FIG. 2. The graphics imaging system 10 includes a computer 11 that has a central processing unit (CPU) 12, a system bus 14, a static memory 16, a main memory 18, a mass memory 20, an alphanumeric input device 22, a pointer device 24 for manipulating a cursor and making selections of data, and a display adapter 26 for coupling control signals to a video display 28 such as a computer monitor. Since the graphics imaging system 10 is particularly suited to high-resolution, high-speed graphics imaging, the display or monitor 28 is most preferably a high-resolution, wide-screen display.
The CPU 12 executes imaging software described below to allow the system 10 to render high quality graphics images on the monitor 28. The CPU 12 comprises a suitable processing device such as a microprocessor, for example, and may comprise a plurality of suitable processing devices. The CPU 12 executes instructions stored in the static memory 16, main memory 18, and/or mass memory 20.
The static memory 16 may comprise read only memory (ROM) or any other suitable memory device. The static memory may store, for example, a boot program for execution by CPU 12 to initialize the graphics imaging system 10. The main memory 18 may comprise random access memory (RAM) or any other suitable memory device. The mass memory 20 may include a hard disk device, a floppy disk, an optical disk, a flash memory device, a CDROM, a file server device or any other suitable memory device. For this detailed description, the term memory comprises a single memory device and any combination of suitable devices for the storage of data and instructions.
The system bus 14 provides for the transfer of digital information between the hardware devices of the graphics imaging system 10. The CPU 12 also receives data over the system bus 14 that is input by a user through alphanumeric input device 22 and/or the pointer device 24. The alphanumeric input device 22 may comprise a keyboard, for example, that comprises alphanumeric keys. The alphanumeric input device 22 may comprise other suitable keys such as function keys for example. The pointer device 24 may comprise a mouse, track-ball, and/or joystick, for example, for controlling the movement of a cursor displayed on the computer display 28.
The graphics imaging system 10 of FIG. 1 also includes display adapter hardware 26 that may be implemented as a circuit that interfaces with system bus 14 for facilitating rendering of images on the computer display 28. The display adapter hardware 26 may, for example, be implemented with a special graphics processor printed circuit board including dedicated random access memory that helps speed the rendering of high resolution, color images on a viewing screen of the display 28.
The display 28 may comprise a cathode ray tube (CRT) or a liquid crystal display particularly suited for displaying graphics on its viewing screen. The invention can be implemented using high-speed graphics workstations as well as personal computers having one or more high-speed processors.
The graphics imaging system 10 utilizes specialized graphics software particularly suited to take advantage of the imaging hardware included in the display system 10 depicted in FIG. 1 and FIG. 2. The software implements nonlinear editing, compositing, audio mixing, and graphics design suites which are used to create multimedia presentations. Source material for use with such a system can be obtained from a media storage device 50 that can include videotape, film reel, and digitally recorded videodisks. The source material can also be in the form of previously digitized materials stored on a computer memory 20, such as computer generated animations, graphic images or video files stored on a large capacity hard or fixed disk storage device. To utilize the stored images from the media storage 50, the system 10 includes a multi-media interface 30 for converting image data into a form suitable for use by the software executing on CPU 12 and display adapter 26. A representative display by the graphics software presents multiple images 52 of different resolutions.
Graphics Imaging User Interface
FIG. 3 illustrates a typical user interface or screen display 110 for use in graphics imaging by a graphics design suite that forms a part of the graphics software. The screen display includes an image viewing area 112, an effect tree viewing area 114 for viewing the current effect tree, and a property editor region 116 for displaying a curve or graph of the user specified timing to be associated with the effect tree in the effect tree view area 114. The user interface may include a number of sculpted buttons that are actuated by selecting or clicking on the button with the pointer device 24. The graphics software is executed under an operating system that includes functions for creating multiple tasks to be executed on the computer system 10. A taskbar 118 located alongside the effect tree view area 114 allows the user to switch between tasks by activating or selecting different icons in the taskbar 118. The graphics design suite having the user interface depicted in FIG. 3 is launched by choosing or selecting a graphics icon from the multiple icons displayed in the taskbar. Others of the multiple icons cause the software to display graphics particularly suited for other tasks such as editing, audio mixing, or combining multiple video layers into a combined audiovisual output.
The Effect Tree
The graphics software of graphics imaging system 10 includes the capability for a user to specify certain effects that can be applied to modify a video image or animation. For example, the software may support a "blur" effect that results in a blur of an input image. In another example, a "tint" effect applies a tint to the received image. An effect is a digital filter for modifying in some manner an input image to produce an output image.
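Since an effect is described above as a digital filter from an input image to an output image, a one-dimensional box blur is enough to illustrate the idea. This is a generic textbook sketch, not the patent's blur implementation:

```python
def box_blur(samples, radius=1):
    """A minimal 1-D box blur: each output sample is the average of its
    neighborhood.  This is the 'digital filter' view of an effect: a pure
    function from input samples to output samples."""
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - radius), min(len(samples), i + radius + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out


# A single bright sample is smeared across its neighbors.
assert box_blur([0, 0, 3, 0, 0]) == [0.0, 1.0, 1.0, 1.0, 0.0]
```

A 2-D image blur applies the same averaging over a pixel neighborhood; a tint effect would likewise be a per-pixel function of the input.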
As is shown in FIG. 3, multiple effects may be linked together to form a more complex effect. The output of one effect is the input to another effect, and thus this DAG of effects is applied to the input image or multiple input images as in FIG. 3. In FIG. 3, we see that the effect tree 113, shown in the effect tree area 114, has a first input 120, which is an input channel for a first video clip C1, and a second input 122, which is an input channel for a second video clip C2. The first input 120 is introduced into a blur effect 124 for blurring the image presented at first input 120. The second input 122 is introduced into a color correction effect 126 for changing the color in some portion of the second image. The output of blur effect 124 is input to a dissolve effect 128, along with the output of the color correction effect 126. As used in this compositing example, a dissolve effect blends between the images from the first input 134 and the second input 136 over the overlapping time spans of the inputs. The output 132 of the effect tree 113 is presented in the image viewing area 112. The combination of the blur effect 124, the color correction effect 126, and the dissolve effect 128, along with the associated inputs and output, comprises an example of an effect tree. In FIG. 3, the effect tree 113 includes two inputs since the dissolve effect requires two inputs. However, one of ordinary skill in the art will recognize that an effect tree may allow for any number of inputs and outputs.
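The dissolve used in this compositing example is, at bottom, a per-pixel linear blend between the two inputs. The sketch below is illustrative; the 0-to-1 progress parameter and function name are assumptions, not the patent's API:

```python
def dissolve_pixel(a, b, t):
    """Blend corresponding pixel values from two inputs.
    t = 0.0 shows only the first input; t = 1.0 shows only the second."""
    return (1.0 - t) * a + t * b


# At the start only the first clip is visible; halfway through, the
# output is the average of both frames.
assert dissolve_pixel(200.0, 100.0, 0.0) == 200.0
assert dissolve_pixel(200.0, 100.0, 0.5) == 150.0
```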
Effect trees may be named and stored in non-volatile memory as preset effect trees for later retrieval and application to an object. A stored effect tree may also be included in other stored effect trees to generate an even more complex effect. It is important to note that what is stored are only the effects, their parameters with animation information, their interconnections with each other, and the input nodes. The actual media, or a reference to the media, connected to the input nodes of the effect tree is not stored. Therefore, a stored effect tree becomes a stored object that is available for modifying an input object or objects to produce an output in a deterministic fashion. It may be retrieved from storage and also used as a template to create other effect trees by editing the original effects or by adding or deleting other effects, inputs, and outputs to create the new effect tree.
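The save behavior described above, persisting topology and parameters but never the media, can be sketched as a small serializer. The field names and file format below are illustrative assumptions:

```python
import json


def save_preset(effects, connections, input_nodes, path):
    """Persist the effects, their parameters, their interconnections, and
    the input nodes.  Deliberately omits any media (or media references)
    bound to the inputs, so the preset is reusable with other sources."""
    preset = {
        "effects": [{"name": e["name"], "params": e["params"]} for e in effects],
        "connections": connections,   # e.g. [["Blur", "Dissolve"], ...]
        "input_nodes": input_nodes,   # node names only, no media attached
    }
    with open(path, "w") as f:
        json.dump(preset, f)


def load_preset(path):
    """Reload a preset; the caller binds media to the input nodes later."""
    with open(path) as f:
        return json.load(f)
```

On load, the input nodes are empty sockets: the context of application decides which of them receive external connections.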
Referring now to FIG. 4, there is shown a second display screen 150 including a window or panel 152 for saving an effect tree as a preset. Panel 152 is generated by clicking on the icon 154 in taskbar 118 to request saving the effect tree. The effect tree that is saved is the effect tree 113 displayed in the effect tree area 114. The name of the saved effect tree is entered into a file name entry 154 to allow future retrieval of the saved effect tree.
In one embodiment of the invention, an effect tree, whether retrieved from storage or newly created, is applied to an object in a timed manner. A third display screen 160, shown in FIG. 5, provides a user interface for applying an effect tree in a timed manner. The effect tree 113 that is to be applied to one or more video clips is shown in effect tree panel 162. The output image from the effect tree is again shown in the image viewer 112. The display screen 160 includes a timing panel 168 for specifying timing parameters to be used in the timed application of the effect tree 113.
Specifically, two input video clips 164 and 166 to the effect tree are identified in the timing panel 168, and the timing relationship for the video input is defined. Referring to the effect tree 113 shown in the effect tree panel 162, effect tree 113 requires two video inputs and when active "dissolves" the first video clip 164 into the second video clip 166. Timing panel 168 depicts three timing markers 170, 172, and 174 which identify the timing associated with the dissolve effect of the effect tree 113 shown in effect tree panel 162. Timing marker 170 identifies that initially the first input video clip 164 is fed directly to the viewer as the output. At the time identified by the timing marker 172, the second video clip 166 is also input and the dissolve effect tree is active. The dissolve effect is active until the time identified at timing marker 174 wherein the second video clip is the only input and is fed directly to the viewer 112 as the output. The time period between the timing markers 172 and 174 is user specifiable, and specifies the time period when both inputs are active. This time period is identified in the timing panel 168 as a shaded area 176 to emphasize this condition.
By specifying different timing markers, the user may define a timed application of the video inputs to the effect tree. In a further embodiment of the invention, the effect tree transitions the input video to the prescribed output over a specified period of time, which is identified by the timing markers set in the timing panel 168. The graphical imaging system, according to the present invention, advantageously provides for the timed control of both the input to an effect tree and the duration of the response of the effect tree. In a further embodiment of the invention, a graph area 180 is provided for specifying, in graphical form, a rate of response for a transition effect that is included in the effect tree. A transition effect is an effect that morphs the input into the output over a period of time. The graphical imaging system, according to the present invention, provides that the transition effect may be applied at different rates over the time period of the transition. In the graph area 180, the graph 182 indicates that the dissolve effect is applied in a linear manner from 0% dissolve to 100% dissolve (i.e., at a constant rate from the first video clip to the second video clip). Graph 182 is a plot of the percentage of the effect versus time, and different rates for applying a transition effect may be specified by plotting a different graph in the graph area 180. In one practice of the invention, the graph 182 in the graph area 180 is automatically resized, for any transition effect specified in the effect tree, in response to a change in the timing panel 168. Referring to the graph 182 shown in FIG. 5, a change in the time period specified for the dissolve, the period identified by the timing markers 172 and 174, changes the slope at which the transition effect is applied, since the graph is linear. If the graph 182 were non-linear, it would be modified to conform to the new time period.
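The linear graph 182 amounts to mapping the marker-bounded time span onto a 0-100% progress value. The sketch below is illustrative; the function name and marker variables stand in for the markers 172 and 174 and are not taken from the patent:

```python
def dissolve_progress(t, start, end):
    """Percentage of the dissolve applied at time t, where start and end
    play the role of the two timing markers bounding the overlap.  Before
    the overlap only the first clip shows (0%); after it, only the second
    clip shows (100%); in between, the ramp is linear."""
    if t <= start:
        return 0.0
    if t >= end:
        return 100.0
    return 100.0 * (t - start) / (end - start)


# Moving the markers rescales the ramp automatically: the midpoint of the
# span is always 50%, whatever the span's absolute duration.
assert dissolve_progress(5.0, 0.0, 10.0) == 50.0
assert dissolve_progress(2.0, 0.0, 4.0) == 50.0
```

A non-linear rate would replace the final line with a lookup into the user-drawn curve, normalized to the same span.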
In a further embodiment of the invention, the actual media, or a reference to the original media, associated with the input or inputs to the effect tree is disassociated from the effect tree itself. When first creating an effects tree, the user is free to add as many inputs to the effect tree as needed. The inputs to an effects tree are represented by specially marked "input nodes". These input nodes serve the purpose of preserving the connections of effect nodes to the inputs. When saving an effects tree preset, the input nodes as well as the connections of other effects to these nodes are saved. At the time of application of the effect tree preset, an external connection to the input's source is made to none, some or all of these input nodes, depending on the context of application and the number of input sources available at the time of application.
In some contexts, at the time of applying the preset, the number of input nodes inside the effects tree preset may be less than the number of source clips that is desired or available to be connected. For example, this condition arises if the preset had been saved with only one input node but is being applied as a transition, which by definition has two clips available as inputs. Consequently, additional input nodes are created dynamically within the effects tree so that every media source available to the tree has a corresponding input node in the tree (even if there was not one there at the time of the effect tree's creation). These newly created input nodes initially have no connections to any effects in the effects tree DAG; the user is free to use the effects tree interface to make the connections after the fact. Under all circumstances, all source media externally available to the effects tree (external sources) are available through input nodes within the effects tree itself.
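The dynamic input-node creation described above can be sketched as padding the preset's input list until every available source has a node. The function and node names are hypothetical:

```python
def bind_sources(input_nodes, sources):
    """Return (nodes, bindings).  If more sources than input nodes are
    available, new input nodes are created on the fly; if fewer, the extra
    nodes simply remain unbound (and can be flagged as such in the UI)."""
    nodes = list(input_nodes)
    while len(nodes) < len(sources):
        nodes.append(f"Input{len(nodes) + 1}")  # dynamically created node
    bindings = dict(zip(nodes, sources))         # unbound nodes get no entry
    return nodes, bindings


# A preset saved with one input node, applied as a transition with two
# source clips: a second node is created and bound automatically.
nodes, bindings = bind_sources(["Input1"], ["from_clip", "to_clip"])
assert nodes == ["Input1", "Input2"]
assert bindings == {"Input1": "from_clip", "Input2": "to_clip"}
```

The newly created node carries no connections inside the DAG; wiring it to an effect remains a user action, exactly as the text describes.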
Examples follow of how the same preset can be applied in different contexts in a very versatile fashion. The application of the effect tree presets is in no way limited to just these scenarios. These examples are presented to illustrate the versatility of the invention. All the following examples assume that the effects tree in question has been originally constructed in a context which contained enough input nodes to make the external connections, and that they are being applied in a different and limited context which has fewer sources available.
When the effects tree preset is applied as a source generator, none of the input nodes need be connected externally. Application of the preset in this context is appropriate if the DAG of effects inside the effects tree is capable of generating material, as a noise generator does. When the preset is applied in a context where only a single clip of material is available, such as when applied as a clip effect or track effect, only one input node is automatically connected externally to the clip. The rest of the input nodes, if they exist, remain unconnected to any external material. This externally connected and unconnected state of the input nodes is visually presented to the user as well.
When the effect tree preset is applied as a transition, only the first two input nodes are connected externally, to the "to" and "from" clips respectively.
When the preset is initially loaded as a node within another effects tree, no external connections are made to the effects tree inputs. Since the preset shows up as a node with all of its inputs displayed as ports on that node, the user is free to make the external connections to these inputs by drawing connections to other nodes in the DAG.
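The four application contexts above each imply a fixed quota of automatic external connections. A small table-driven sketch, with context names chosen for illustration:

```python
# Quota of input nodes connected automatically in each context (names assumed).
CONNECTIONS_BY_CONTEXT = {
    "source_generator": 0,  # tree generates its own material, no sources
    "clip_effect": 1,       # the single clip, bound automatically
    "track_effect": 1,
    "transition": 2,        # the "to" and "from" clips
    "nested_node": 0,       # user draws connections inside the host tree
}


def connect_externally(context, input_nodes, clips):
    """Bind at most the context's quota of input nodes to available clips.
    Remaining input nodes stay unconnected (and are shown as such)."""
    n = min(CONNECTIONS_BY_CONTEXT[context], len(input_nodes), len(clips))
    return dict(zip(input_nodes[:n], clips[:n]))


# As a transition, only the first two nodes are bound; a third stays free.
bound = connect_externally("transition", ["In1", "In2", "In3"],
                           ["to_clip", "from_clip"])
assert bound == {"In1": "to_clip", "In2": "from_clip"}
```

In the nested-node and source-generator cases the quota is zero, matching the text: any wiring is left to the user or is unnecessary.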
FIG. 6 shows the creation of an effect tree preset 610. FIG. 6 depicts a user interface 600 for producing a graphics output and showing an effect tree 610 having three effects: a Blur effect 612, a Color Correction effect 614, and a Wipe effect 616. A single input 618 is presented to the Blur effect 612, and the output of that blur is passed to the Color Correction effect 614. The output of the Color Correction effect 614 is input to the Wipe effect 616. The Wipe effect 616 provides for two inputs, Input1 640 and Input2 642. User interface 600 includes a view area 620 for showing the graphical output of the effect tree 610, and a timeline 630 for adjusting the display of view area 620 according to time. The effect tree 610 is saved as a preset for subsequent retrieval and reuse. FIG. 7 shows the application of the same effect tree 610 as a preset. The effect tree 610b is now applied as a transition in which two source clips are available to be used, Input1 710 and Input2 720. However, the original effect tree in FIG. 6 was created with only one input node. The available input node is connected externally to one of the source clips. The effect tree creates a second input node dynamically and makes an external connection to the second source clip automatically. After these operations, both input source clips are available to the tree through the two input nodes for any further connections within the effect tree. The user is free to connect this new input node to the input of any other effect in the effect tree.
FIG. 8 is similar to FIG. 6 and shows the creation of an effect tree preset including a Wipe effect in which two inputs are specified, Input1 820 and Input2 830. In FIG. 8, effect tree 810 is shown connected to two source clips. Effect tree 810 is saved as a preset and applied as a filter in FIG. 9. In FIG. 9, effect tree 810 is connected to only one source clip, through input node 1. Since no other source media is available, no external connection is made for input node 2. In one practice of the invention, user feedback is provided to show that the second input has no external connection.
Having described the invention, it should be apparent to those of ordinary skill in the art that the foregoing is illustrative and not limiting. Numerous modifications and other embodiments are within the scope of one of ordinary skill in the art and are contemplated as falling within the scope of the invention as defined by the appended claims.

Claims

WE CLAIM:
1. A graphics imaging system, said system comprising:
an effect tree for producing a first graphical output from said system, said effect tree having an initial defined number of input nodes for connecting a first group of external sources to said effect tree;
wherein said effect tree dynamically adjusts the number of said input nodes when said first group of external sources differs from said initial defined number of input nodes.
2. The graphics system of claim 1 wherein said effect tree dynamically increases said input nodes when said first group of external sources exceeds said initial defined number of input nodes.
3. The graphics system of claim 1 wherein said effect tree ignores said input nodes when said initial defined number of input nodes exceeds said first group of external sources.
4. The graphics imaging system of claim 1 further comprising:
a preset generator for storing said effect tree exclusive of said first group of external sources;
wherein said effect tree produces a second graphical output in response to a second group of connected external sources.
5. The graphics imaging system of claim 1 wherein said effect tree is a media generator and said first group of external sources has no members.
6. The graphics system of claim 1 wherein said effect tree dynamically connects a default external source to input nodes that are not connected to one of said first group of external sources.
7. A graphics imaging system, said system comprising:
a plurality of effects for producing a first graphical output from said system, said effects having an initial defined number of input nodes for connecting a first group of external sources to said effects;
an effect tree generator for combining two or more of said effects to produce an effect tree;
a preset generator for storing said effect tree exclusive of said first group of external sources;
wherein said stored effect tree produces a second graphical output in response to a second group of external sources.
8. The graphical imaging system of claim 7 further comprising:
a timer element for applying timing constraints to said effect tree.
9. The graphical imaging system of claim 8 wherein said timer element includes a graphical user interface for specifying timing information in applying timing constraints to said effect tree.
10. The graphical imaging system of claim 9 wherein said graphical user interface includes markers for specifying said timing information.
11. The graphical imaging system of claim 7 wherein said effect tree dynamically adjusts the number of said input nodes in an effect in said effect tree when said first group of external sources differs from said initial defined number of input nodes in said effect.
12. The graphics system of claim 11 wherein said effect tree dynamically increases said input nodes in an effect in said effect tree when said first group of external sources exceeds said initial defined number of input nodes in said effect.
13. The graphics system of claim 11 wherein said effect tree ignores said input nodes in an effect in said effect tree when said initial defined number of input nodes in said effect exceeds said first group of external sources.
PCT/US2000/021522 1999-08-06 2000-08-04 Generation and versatile usage of effect tree presets WO2001011627A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU65266/00A AU6526600A (en) 1999-08-06 2000-08-04 Generation and versatile usage of effect tree presets

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US36961599A 1999-08-06 1999-08-06
US09/369,615 1999-08-06

Publications (1)

Publication Number Publication Date
WO2001011627A1 (en) 2001-02-15

Family

ID=23456178

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/021522 WO2001011627A1 (en) 1999-08-06 2000-08-04 Generation and versatile usage of effect tree presets

Country Status (2)

Country Link
AU (1) AU6526600A (en)
WO (1) WO2001011627A1 (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4114440A1 (en) * 1991-05-03 1992-11-05 Broadcast Television Syst Multistage video signal mixer - has switchable mixer stage inputs to form cascade or planar structures
EP0564247A1 (en) * 1992-04-03 1993-10-06 Adobe Systems Inc. Method and apparatus for video editing
US5359712A (en) * 1991-05-06 1994-10-25 Apple Computer, Inc. Method and apparatus for transitioning between sequences of digital information
US5682326A (en) * 1992-08-03 1997-10-28 Radius Inc. Desktop digital video processing system
WO1998005034A1 (en) * 1996-07-29 1998-02-05 Avid Technology, Inc. Graphical user interface for a motion video planning and editing system for a computer
WO1998006099A1 (en) * 1996-08-06 1998-02-12 Interval Research Corporation Time-based media processing system
US5892506A (en) * 1996-03-18 1999-04-06 Discreet Logic, Inc. Multitrack architecture for computer-based editing of multimedia sequences


Non-Patent Citations (1)

Title
Nowara, T.: "Nonlinear Editing: Avid Xpress auf dem Prüfstand", Fernseh- und Kinotechnik, VDE Verlag GmbH, Berlin, DE, vol. 52, no. 10, pp. 594-596, 598-600, ISSN 0015-0142, XP000860879 *

Cited By (11)

Publication number Priority date Publication date Assignee Title
US9123380B2 (en) 1998-12-18 2015-09-01 Gvbb Holdings S.A.R.L. Systems, methods, and computer program products for automated real-time execution of live inserts of repurposed stored content distribution, and multiple aspect ratio automated simulcast production
US9558786B2 (en) 1998-12-18 2017-01-31 Gvbb Holdings S.A.R.L. Systems, methods, and computer program products for multiple aspect ratio automated simulcast production
US9711180B2 (en) 1998-12-18 2017-07-18 Gvbb Holdings S.A.R.L. Systems, methods, and computer program products for automated real-time execution of live inserts of repurposed stored content distribution
US10056111B2 (en) 1998-12-18 2018-08-21 Gvbb Holdings S.A.R.L. Systems, methods, and computer program products for multiple aspect ratio automated simulcast production
EP1552685A1 (en) * 2002-05-09 2005-07-13 Parkervision, Inc. Video production system for automating the execution of a video show
EP1552685A4 (en) * 2002-05-09 2006-06-07 Parkervision Inc Video production system for automating the execution of a video show
US10360944B2 (en) 2002-05-09 2019-07-23 Gvbb Holdings S.A.R.L. Systems, methods, and computer program products for multiple aspect ratio automated simulcast production
US10546612B2 (en) 2002-05-09 2020-01-28 Gvbb Holdings S.A.R.L. Systems, methods, and computer program products for automated real-time execution of live inserts of repurposed stored content distribution
AU2005201165B2 (en) * 2004-03-24 2008-04-10 Canon Kabushiki Kaisha Rendering Images Containing Video
WO2015074059A1 (en) * 2013-11-18 2015-05-21 Google Inc. Configurable media processing with meta effects
CN105900175A (en) * 2013-11-18 2016-08-24 谷歌公司 Configurable media processing with meta effects

Also Published As

Publication number Publication date
AU6526600A (en) 2001-03-05


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP