GB2096868A - Audio/video editing system having touch responsive function display screen


Info

Publication number
GB2096868A
GB2096868A (application GB8207089A)
Authority
GB
United Kingdom
Prior art keywords
coupled
controller
editing
recorder
processor
Prior art date
Legal status
Granted
Application number
GB8207089A
Other versions
GB2096868B (en)
Current Assignee
Ampex Corp
Original Assignee
Ampex Corp
Priority date
Filing date
Publication date
Application filed by Ampex Corp
Publication of GB2096868A
Application granted
Publication of GB2096868B
Expired

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/022 Electronic editing of analogue information signals, e.g. audio or video signals
    • G11B27/028 Electronic editing of analogue information signals, e.g. audio or video signals with computer assistance
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034 Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 Indicating arrangements
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00 Record carriers by type
    • G11B2220/20 Disc-shaped record carriers
    • G11B2220/25 Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2508 Magnetic discs
    • G11B2220/2512 Floppy disks
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00 Record carriers by type
    • G11B2220/90 Tape-like record carriers
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/022 Electronic editing of analogue information signals, e.g. audio or video signals
    • G11B27/024 Electronic editing of analogue information signals, e.g. audio or video signals on tapes

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Management Or Editing Of Information On Record Carriers (AREA)
  • Position Input By Displaying (AREA)
  • Television Signal Processing For Recording (AREA)
  • Input From Keyboards Or The Like (AREA)
  • Image Input (AREA)

Abstract

There is described (Fig. 3) a data-editing monitor having a touch input system consisting of complementary arrays of LEDs and photodiodes providing a grid of light paths selectively interruptible by the operator's finger.

Description

SPECIFICATION

Audio/video editing system having touch responsive function display screen

Over the years audio/video editing systems have developed into complex and sophisticated arrangements for manipulating video and audio information recorded therewith in a desired fashion. In a typical editing system selected video or audio information from a plurality of different inputs is recorded on a record audio/video tape recorder under the careful control of an operator.
The information to be recorded is typically contained in one or more of a plurality of source tape recorders but can also come from other input sources such as a microphone or video camera.
A switcher coupled between the record tape recorder and the source tape recorders and any other input devices responds to editing commands to control the exact information recorded by the record tape recorder. Proper synchronization during the recording operation characterizing a typical editing process is maintained by reference to a coded reference track which accompanies each recording on tape and which identifies each frame of the recording.
Such reference tracks typically involve the use of time code (recorded either directly on the reference track, within the video tracks during the vertical interval, or as a combination of both) as specified in various documents by the Society of Motion Picture and Television Engineers (SMPTE) and the European Broadcasting Union (EBU).
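The frame-accurate addressing such time code provides can be illustrated with a minimal sketch: each frame is labelled as hours:minutes:seconds:frames, and a controller can convert such a value to an absolute frame count for edit-point arithmetic. This is a generic non-drop-frame sketch assuming 25 fps (EBU) or a nominal 30 fps (SMPTE); it does not reproduce the bit layout of the recorded code words.

```python
def timecode_to_frames(tc: str, fps: int = 25) -> int:
    """Convert an HH:MM:SS:FF time code string to an absolute frame count.

    Simplified, non-drop-frame sketch: EBU material is commonly 25 fps,
    SMPTE (NTSC) material nominally 30 fps.
    """
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff


def frames_to_timecode(frames: int, fps: int = 25) -> str:
    """Inverse conversion, used e.g. to display a computed edit point."""
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"


# Example: length of an edit between an entry and an exit point.
entry = timecode_to_frames("01:00:10:00")
exit_ = timecode_to_frames("01:00:12:12")
print(exit_ - entry, "frames")             # 62 frames at 25 fps
print(frames_to_timecode(exit_ - entry))   # 00:00:02:12
```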
In a typical editing operation the operator manipulates a keyboard while viewing a video monitor which provides a visual display of the video information being recorded on the record tape recorder. The keyboard is typically a designated keyboard, meaning that it is comprised of different keys representing specific editing functions. The keys may be arranged in an ASCII format or in any other format which is convenient. Keys arranged in the ASCII format are also designated by letters arranged in a conventional typewriter keyboard format so that the various different editing functions can be selected largely by feel and based on experience and memory, much in the same manner as an experienced typist types.
As editing systems have become more sophisticated and complex, the keyboards used therewith have grown larger and more confusing.
It is not unusual, for example, to have an editing keyboard comprised of more than 100 keys, each representing a different editing function or type of function. As a result considerable time and experience are required with a given machine before the operator can perform editing functions quickly and efficiently. However, even in cases where the operator is experienced and totally familiar with a given editing system, there is an inherent problem with conventional editing systems in that the mechanical considerations imposed by a fixed keyboard with so many functions seriously detract from the creative aspects of editing. Time code representations of edit entry and exit points are typically relied upon rather than editing in accordance with the picture itself.
A further limitation of many conventional editing systems resides in the confinement of system intelligence to one central location. Those systems employing a central processor or other processing unit in conjunction with the editing functions typically concentrate limited processing equipment at one central location, thereby restricting future flexibility of a particular system.
Over the years various attempts have been made at improving the human-machine interface involved in editing. One such approach, for example, involves the use of a light pen. The pen is manipulated relative to a responding member to select certain editing functions, thereby reducing the number of buttons needed. At best, however, such systems replace keyboard buttons with the light pen, requiring that the operator pick up and handle the light pen with each operation.
The present invention aims to achieve one or more of the following objects: (1) to provide an editing system with improved human-machine interfacing so as to facilitate the artistic aspects and approach to video editing; (2) to provide an editing system capable of displaying different menus of editing functions to be chosen by the operator; (3) to provide an editing system capable of accommodating varying numbers and types of information inputs; and (4) to provide an editing system capable of intelligent operations at peripheral portions of such system in addition to a central portion of the system.
In a preferred form of the invention an editing system is capable of displaying different menus of editing information and selections on a display screen, which is responsive to the touching of or close proximity to selected areas thereof by the operator so as to change the menus and at the same time accomplish desired editing functions.
The presentation of a limited menu of editing information and selections at any given instant enables the operator to concentrate on a particular type or types of editing operations being performed. At the same time, however, considerable versatility is afforded in that the operator can choose the level at which he desires to work and can change each menu as he sees fit so as to combine levels and afford the custom designing of menus with virtually unlimited possibilities.
The versatility and character of operation of editing systems in accordance with the invention are enhanced by utilizing a system arrangement which places intelligent capabilities at peripheral locations as well as at a central location. In this manner a single central processor can communicate with a variety of different input equipment virtually simultaneously. Proper interfacing equipment provides for the conversion of data into a machine usable form tailored to the particular requirements of input and output peripheral equipment.
In a preferred arrangement of an audio/video editing system in accordance with the invention a data monitor comprising a cathode ray tube or other display screen device is coupled to an edit controller from which it receives the different menus of editing information and selections. The display screen is equipped with apparatus for providing a network of interruptable beams thereacross. When the operator points to a particular character or group of characters on the display screen, the beams in this area of the display screen are interrupted and a corresponding signal is provided to the edit controller. The edit controller, which is comprised of a central processing unit, a memory and a character generator, responds by making appropriate changes in the menu provided to the visual display. The character generator coupled to the central processing unit provides the various characters comprising the different menus displayed on the data monitor. At the same time the central processing unit together with its associated memory provides data to other portions of the editing system to accomplish editing functions commanded when the operator touches particular portions of the displayed menus. In addition to the touch responsive data monitor, the editing system can be provided with one or more conventional keyboards also capable of providing editing information and selections to the edit controller.
The edit controller includes a plurality of intelligent line controllers coupled to the central processing unit. One of the intelligent line controllers is coupled to the touch sensitive apparatus associated with the data monitor.
Another one of the intelligent line controllers is coupled to a record tape recorder on which the material being edited is recorded. Another one of the intelligent line controllers is coupled to a switcher. The remaining ones of the intelligent line controllers are each coupled to a different input information source. The input information sources can comprise such things as video cameras or audio microphones but are typically comprised of source tape recorders. The switcher is coupled between the record tape recorder and the various input sources such as the source tape recorders to control coupling of the input sources to the record tape recorder. Each of the intelligent line controllers, which includes its own processing unit and memory, is capable of performing intelligent functions in locations remote from the central processing unit. In addition to converting parallel data from the central processing unit to a serial data form and vice versa, each intelligent line controller acts as a data buffer between the central processing unit and the various tape recorders and other peripheral equipment. The intelligent functions performed by the intelligent line controllers enable the central processing unit to communicate with the various pieces of peripheral equipment virtually simultaneously.
The result is an open-ended editing system capable of being coupled to input devices of varying numbers and types. The various input devices such as the source tape recorders as well as the switcher and the record tape recorder are coupled to the various intelligent line controllers through interfaces which serve to convert the serial data from the intelligent line controllers into a form usable by the particular recorder or other device coupled thereto.
The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular description of a preferred embodiment of the invention, as illustrated in the accompanying drawings, in which:

Fig. 1 is a perspective view of an audio/video editing system in accordance with the invention;
Fig. 2 is a perspective view of the data monitor of the editing system of Fig. 1 showing the touch responsive apparatus used in conjunction therewith;
Fig. 3 is a block diagram of the editing system of Fig. 1;
Fig. 4 is a block diagram of the human interface portion of the editing system of Fig. 3;
Fig. 5 is a block diagram of the different basic functions performed by the edit controller of the editing system of Fig. 3;
Fig. 6 is a block diagram of a portion of the editing system of Fig. 3 showing the edit controller in greater detail;
Fig. 7 is a block diagram of one of the video machine interfaces of the arrangement of Fig. 6;
Fig. 8 is a block diagram of one of the intelligent line controllers of the arrangement of Fig. 6;
Fig. 9 is a block diagram of the switcher and effects generator of the arrangement of Fig. 6; and
Fig. 10 is a flow chart showing the manner in which the editing system uses the data monitor and touch responsive apparatus to convert user decisions into editing actions.
Fig. 1 depicts a video editing system 10 in accordance with the invention. The video editing system 10 is mounted on or contained within a desk 12 and includes a touchscreen data monitor 14 and a video monitor 16 mounted on top of the desk 12. Also mounted on the top of the desk 12 is a joystick panel 18. A designated keyboard 20 or an ASCII keyboard 22 having keys arranged in an ASCII format, both of which are shown in Fig. 1, can also be used to provide alternative human interfaces with the system 10.
The video editing system 10 of Fig. 1 can have up to three different human interfaces which are comprised of the touchscreen data monitor 14, the designated keyboard 20 and the ASCII keyboard 22, each utilizing a joystick panel 18.
The joystick panel 18 has several control buttons thereon plus a joystick. When control of a particular tape transport is obtained by one of the human interfaces 14, 20 and 22, the joystick panel 18 may be used to control the operation of that transport. The designated keyboard 20 may be of conventional design and includes all of the various keys and controls needed to perform the various editing functions used in a modern, sophisticated editing system. The ASCII keyboard 22 has the various keys thereof arranged in conventional typewriter format so that the various editing functions can be touch typed into the system 10 by the operator. The keyboards 20 and 22 comprise two different alternative ways in which the editing system 10 can be controlled by the operator. The video monitor 16 provides a picture of the video information being recorded on a record tape recorder as editing is carried out by the operator.
In accordance with the invention a superior and highly advantageous human interface with the editing system 10 is provided by the touchscreen data monitor 14. As shown in detail in Fig. 2, the touchscreen data monitor 14 is comprised of a cathode ray tube 26 having a display screen 28 at the front thereof. Mounted adjacent the display screen 28 is a touch input system 30. The touch input system 30 establishes a pattern of interruptable beams across the face of the display screen 28 using light emitting diodes and photodetectors. In the present example the touch input system 30 is of the type manufactured by Carroll Manufacturing Company of Champaign, Illinois and includes two different pluralities of light emitting diodes 32 and 34 on the left side and bottom of the display screen 28 respectively. The touch input system 30 also includes a first plurality of photodetectors 36 on the right-hand side of the display screen 28 opposite the light emitting diodes 32 and a second plurality of photodetectors 38 along the top of the display screen 28 and opposite the light emitting diodes 34. The light emitting diodes 32 emit infrared rays which extend in generally parallel fashion across the face of the display screen 28 to the photodetectors 36. Each one of the light emitting diodes 32 corresponds with a particular one of the photodetectors 36 so that the infrared ray from the light emitting diode falls upon the photodetector. In like fashion the light emitting diodes 34 emit infrared rays which extend upwardly along the face of the display screen 28 in generally parallel fashion to the photodetectors 38.
The light emitting diodes 32 and 34 and the photodetectors 36 and 38 establish an X-Y coordinate system of beams or rays across the face of the display screen 28. When the operator touches a particular location on the display screen 28 as shown in Fig. 1, one or more of the infrared rays from the light emitting diodes 32 and 34 are interrupted. A control logic box 40 which is coupled to the photodetectors 36 and 38 responds by providing a signal representative of the particular location on the screen 28 touched by the operator. The control logic box 40 does this by determining the average Y location of the infrared rays from the light emitting diodes 32 which are broken and by determining the average X location of the infrared rays from the light emitting diodes 34 which are broken. The resulting output signal from the control logic box 40 thus represents the average X and Y position of the area on the display screen 28 touched by the operator. The control logic box 40 is of conventional design and may be designed or purchased separately or in conjunction with the light emitting diodes 32 and 34 and the photodetectors 36 and 38 as part of a complete touch input system 30.
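The averaging performed by the control logic box 40 can be sketched as follows; the beam indices and the number of broken beams are illustrative assumptions and do not represent the actual Carroll hardware interface.

```python
def touch_position(broken_rows, broken_cols):
    """Return the (x, y) touch location as the average index of the broken
    beams, mimicking the behaviour described for the control logic box 40.

    broken_rows: indices of horizontal beams (LEDs 32 -> detectors 36) that
                 are interrupted; these give the Y coordinate.
    broken_cols: indices of vertical beams (LEDs 34 -> detectors 38) that
                 are interrupted; these give the X coordinate.
    Returns None if no beam is broken in either axis.
    """
    if not broken_rows or not broken_cols:
        return None
    y = sum(broken_rows) / len(broken_rows)
    x = sum(broken_cols) / len(broken_cols)
    return (x, y)


# A finger typically interrupts two or three adjacent beams in each axis.
print(touch_position(broken_rows=[11, 12], broken_cols=[30, 31, 32]))  # (31.0, 11.5)
```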
The particular touch input system 30 shown and described is for purposes of illustration only, and it should be understood that other touch systems which are capable of providing an output signal indicating an area touched on the display screen 28 can be used. Other touch systems which can be used include a variety of different manufactured products such as those employing a conductive glass and plastic laminate, those employing acoustic waves and those in which a conductive pattern is etched on the display screen.
As will become more fully apparent from the discussion to follow, the data monitor 14 with its included touch input system 30 provides a highly versatile and advantageous human interface with the editing system 10. The editing system 10 is capable of displaying any one of a plurality of different menus of editing information and selections on the data monitor 14. The operator then chooses one or more of the editing selections by touching the characters on the display screen 28 which describe or represent such selections. Each time the display screen 28 is touched by the operator, the resulting output signal from the control logic box 40 is applied to a portion of the editing system 10 which compares such information with the characters displayed on the screen to determine the particular function selected by the operator. The system 10 then executes any required editing functions and at the same time changes the information displayed on the screen 28. Such changes may be accomplished by displaying an entirely new menu of information and selections on the monitor 14 or by simply modifying the original menu displayed. This enables the operator to concentrate on editing from the standpoint of different levels of different types of functions.
Also, because the makeup of a particular displayed menu is virtually unlimited, a great amount of versatility and flexibility is introduced into the editing system 10.
Fig. 3 depicts a block diagram of the editing system 10. The editing system 10 includes a human interface 42 which is shown in Fig. 4 as being comprised of the joystick panel 18, the designated keyboard 20, the ASCII keyboard 22 and the touch input system 30. The human interface 42 is coupled to an edit controller 44 as is the touchscreen data monitor 14. Each time one of the keys or other controls on the panel 18 or the keyboards 20 and 22 is manipulated or the touch input system 30 senses the touching of an area on the display screen of the data monitor 14, a corresponding signal is provided to the edit controller 44. In addition to providing the menus of editing information and selections to the data monitor 14, the edit controller 44 controls all other portions of the editing system 10 including three different source audio/video tape recorders 46, 48 and 50, a record audio/video tape recorder 52 and a switcher and effects generator 54. Each of the source audio/video tape recorders 46, 48 and 50 provides a different audio and video input to the switcher and effects generator 54. The switcher and effects generator 54 responds to commands from the edit controller 44 to couple one or more of the audio and video signal inputs thereto to the record audio/video tape recorder 52. The video information recorded on the record audio/video tape recorder 52 is also applied to a preview switcher 56 controlled by the edit controller 44. The preview switcher 56 in turn provides the video information for display on the video monitor 16. The preview switcher 56 is required for certain types of tape recorders when used as the record audio/video tape recorder. Still other types of tape recorders eliminate the need for the preview switcher 56. The edit controller 44 is also coupled to control a printer and paper tape system 58.
A floppy disk system 60 is coupled to the edit controller 44 so as to both input data to and receive data from the edit controller 44. The floppy disk system 60 comprises one example of a mass storage medium which can be used for this purpose, with various other types of mass storage media being usable for this purpose.
Initially, instructions for the various menus to be displayed on the data monitor 14 and the various editing functions which are recorded on a floppy disk are loaded from the disk system 60 into the edit controller 44. Thereafter, as editing functions are performed, certain information defining the location and nature of different editing functions performed along the length of the tape within the record audio/video tape recorder 52 may be outputted by the edit controller 44 for storage in the floppy disk system 60. Alternate mass storage systems may store the information in an on-board PROM or a CCD memory.
After the menus and other information are loaded into the edit controller 44 from the floppy disk system 60, the edit controller 44 provides an initial menu of information and selections on the data monitor 14. When the operator responds by punching a key on one of the keyboards 20 and 22 or by touching the display screen 28 of the data monitor 14, the signal is sent from the human interface 42 to the edit controller 44. The edit controller 44 identifies the edit function represented by the signal, and then, if necessary, changes the display on the data monitor 14 by either making changes in the present menu or by providing a new menu to the data monitor 14. At the same time any edit functions to be performed within the system 10 are carried out under the control of the edit controller 44. For example, where video information from one or more of the source recorders 46, 48 and 50 is to be recorded on the record recorder 52, the edit controller 44 controls those of the source recorders 46, 48 and 50 involved, the record recorder 52 and the switcher and effects generator 54. The processing of the various different video information is carefully synchronized using the time code or control track recorded on the various lengths of tape and which identifies the recordings on a frame-by-frame basis. The edit controller 44 continues to respond to inputs from the human interface 42 and to change the menus on the data monitor 14 and effect editing functions accordingly until all editing operations have been completed.
As previously noted the edit controller 44 controls the source recorders 46, 48 and 50, the record recorder 52 and the switcher and effects generator 54. The edit controller 44 has a choice of cuts, keys, dissolves and special effects. Cue points are selected manually to control tape speed and direction. Edit-in and edit-out points are marked manually in this mode of operation. If desired, cut points can also be selected by entering time code values for entry and exit points. Values for pre-roll and post-roll are also entered, as well as dissolve and special effect duration.
The source audio/video tape recorders 46, 48 and 50 are coupled to the edit controller 44 via machine interfaces 62, 64 and 66 respectively.
The record audio/video tape recorder 52 is coupled to the edit controller 44 by a machine interface 68. The printer and paper tape system 58 is coupled to the edit controller 44 by an interface 70. The switcher and effects generator 54 is coupled to the edit controller 44 by a switcher interface 72. The various interfaces 62, 64, 66, 68, 70 and 72 serve to convert the data communicated from and to the edit controller 44 between a serial form as it appears at the output of the edit controller 44 and a form which is usable by the particular recorder or other component to which the interface is coupled. For example, one particular type of tape recorder requires a DC voltage ranging from 0 volts to + 10 volts to control forward and reverse tape motion, while a different type of tape recorder requires a signal varying in frequency to control tape motion.
The various interfaces perform the appropriate conversion of the data from the edit controller 44 into such forms. A typical interface is described in detail in connection with Fig. 7.
Fig. 5 depicts the basic functions performed by the edit controller 44, while Fig. 6 depicts the controller 44 in detail. Referring first to Fig. 6, it will be seen that the edit controller 44 includes a central processing unit 74 and an associated memory 76. The memory 76 includes both read only memory (ROM) portions and random access memory (RAM) portions. The central processing unit 74 is coupled to the floppy disk system 60 through a disk controller 78. The central processing unit 74 is also coupled to the human interface 42 through a character generator 80 which sends an output to the data monitor 14 representing the information being displayed on the monitor 14. When the information from the floppy disk within the floppy disk system 60 is loaded into the edit controller 44, such information is stored in the memory 76. The central processing unit 74 responds to the signals from the human interface 42 applied thereto via an intelligent line controller 82 and a data bus 84.
Based on the signals communicated from the human interface 42, the central processing unit 74 performs any editing functions within the system 10 that are dictated by the signals. At the same time, the central processing unit 74 causes the menu presently being displayed on the data monitor 14 to change or a new menu to be presented if needed. This is accomplished by determining the desired characters to be displayed on the data monitor 14 from the memory 76 and providing such characters to the character generator 80. The character generator 80 which stores the various characters in a dot matrix format responds to characters outputted from the memory 76 by causing the display of the desired characters on the monitor 14.
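The character generator's dot-matrix storage can be sketched as a lookup from a character to a small bit pattern that is painted into the display raster. The 5x7 cell size and the pattern below are illustrative assumptions, not the actual contents of character generator 80.

```python
# Hypothetical 5x7 dot-matrix cell for the letter "E"; one row of bits per
# scan line of the cell.
DOT_MATRIX = {
    "E": [0b11111,
          0b10000,
          0b10000,
          0b11110,
          0b10000,
          0b10000,
          0b11111],
}


def render(text: str) -> str:
    """Render a string as rows of dots, as a character generator would paint
    characters into the raster of the data monitor."""
    rows = []
    for r in range(7):
        line = ""
        for ch in text:
            bits = DOT_MATRIX[ch][r]
            line += "".join("#" if bits & (1 << (4 - c)) else "." for c in range(5))
            line += " "
        rows.append(line)
    return "\n".join(rows)


print(render("E"))
```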
Included in the data stored in the memory 76 from the floppy disk system 60 are overall instructions referred to as the operating system 86 in Fig. 5. The operating system 86 controls four separate processes referred to as command generator 88, display generator 90, input/output control 92 and machine control 94. The four different processes 88, 90, 92 and 94 are carried out in parallel under the control of the operating system 86. The command generator 88 responds to signals from the human interface 42 by determining what is to be done. This results in the generation of data which is entered in a common data base 96. The display generation process 90 involves the evaluation of the data entered in the common data base 96 to determine what is to be displayed on the data monitor 14. The display generator 90 provides any needed data in connection with changes in the display to the character generator 80. The input/output control process 92 provides for the control of the printer/paper tape system 58 and the file system as well as any other peripheral equipment aside from the various tape recorders and switchers.
The machine control process 94 responds to data entered in the common data base 96 by determining what actions are to be taken by the source tape recorders 46, 48 and 50, the record tape recorder 52 and the switcher. This results in appropriate commands being provided by the edit controller 44 to the tape recorders and switchers.
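The division of work among these processes can be sketched with a shared data structure into which the command generator writes and from which the display generator and machine control read. This is a minimal sketch: real scheduling under the operating system 86 is reduced to direct calls, the input/output control process is omitted, and the field names are assumptions.

```python
# A minimal sketch of the cooperating processes around the common data base 96.
common_data_base = {"pending_commands": [], "display_dirty": False,
                    "machine_actions": []}


def command_generator(event):
    """Decide what is to be done in response to a human-interface event."""
    common_data_base["pending_commands"].append(event)
    common_data_base["display_dirty"] = True


def display_generator():
    """Evaluate the data base and decide what must be (re)displayed."""
    if common_data_base["display_dirty"]:
        common_data_base["display_dirty"] = False
        return "send updated menu characters to character generator 80"
    return None


def machine_control():
    """Turn pending commands into transport and switcher actions."""
    while common_data_base["pending_commands"]:
        cmd = common_data_base["pending_commands"].pop(0)
        common_data_base["machine_actions"].append(f"execute {cmd}")


command_generator("MARK ENTRY touched")
print(display_generator())
machine_control()
print(common_data_base["machine_actions"])
```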
Because the editing process has become increasingly sophisticated, the efficient use of costly on-line editing time has become a major priority. Edits which are constructed and previewed on less costly off-line time, on the other hand, must have their parameters recorded for future reference, reproduction or transfer to on-line equipment. To provide for this a list of such editing decisions is stored together with all the parameters necessary to carry them out. Such information can be stored in the floppy disk system 60, in the memory 76 or in the printer and paper tape system 58 shown in Fig. 3. If an editing session is interrupted, the editing decisions and parameters can be temporarily stored in one of such locations for later reloading and completion. Once the edit construction is complete, such information can be used to preview the edit or to transfer such edits to on-line equipment.
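An edit decision of the kind stored in such a list can be sketched as a small record holding the parameters the text mentions (entry and exit points, transition type and duration, pre-roll and post-roll). The exact fields kept by the system are not given, so the names and values below are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class EditDecision:
    """One entry of an edit decision list; field names are assumptions."""
    source_transport: str        # e.g. "source recorder 46"
    record_entry: str            # time code on the record recorder 52
    record_exit: str
    source_entry: str            # time code on the source recorder
    transition: str = "cut"      # cut, dissolve, wipe or key
    transition_frames: int = 0   # duration of a dissolve/wipe/key
    pre_roll_frames: int = 125   # e.g. 5 seconds at 25 fps
    post_roll_frames: int = 50


edit_list = [
    EditDecision("source recorder 46", "01:00:10:00", "01:00:12:12",
                 "02:15:00:00", transition="dissolve", transition_frames=30),
]
print(edit_list[0])
```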
Referring again to Fig. 6 the central processing unit 74 may be comprised of any appropriate central processing unit such as the single board central processing unit sold under the designation LSI-11 by Digital Equipment Corporation. The memory 76 preferably has a RAM capacity of 64K bytes or more. A timing generator 98 generates system timing signals for the edit controller 44.
As previously discussed the touch input system 30 functions in conjunction with the touchscreen data monitor 14 to perform various functions in response to being touched by the operator. These functions include system operation, the entry of data and the changing of an edit decision. The joystick panel 18 is used in conjunction with the touch input system 30 to provide for manual transport control and edit entry/exit point selection. The joystick panel 18 can also be used in conjunction with the designated keyboard 20 or the ASCII keyboard 22. Together the keyboard 20 or 22 and the panel 18 allow the operator to enter system edit data, select menus, trim edit points, and control transport operation. The designated keyboard 20 has special purpose keys for specific system operations. All system operating functions are controlled by dedicated keys such as preview, edit, search, cue, rewind, play, mark entry/mark exit, key, dissolve, and set/trim entry/exit. In addition, soft keys on the designated keyboard 20 may be used by the editor to define functions which may change from edit to edit. Each such soft key can be used to control the edit controller 44 in a manner similar to the operation of the touch input system 30 to change items on a menu presented by the data monitor 14 or to change menus.
The switcher and effects generator 54 may be used to select a cut, dissolve, key or one of many different wipe patterns. The wipe patterns occur from one source to another, with selection of hard or soft edges being possible. All effects are reversible in direction. The key mode provides fade-up or fade-down of a keyed foreground into a background. The key input effectively cuts a hole in the background in the shape of the video to be keyed, such as a title or graphic. The foreground of the key fills the hole in the background. Fade durations are selectable. During all effects except keys, audio is dissolved from one source to another.
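The key and dissolve behaviour just described follows the usual mix arithmetic: the key signal cuts a hole in the background and the foreground fills it, while a dissolve cross-fades two sources. The per-pixel formula below is a generic sketch of that arithmetic, not Ampex's actual mixer circuit; the pixel values are illustrative.

```python
def key_mix(background, foreground, key, fade=1.0):
    """Per-pixel key: where key = 1 the foreground replaces the background
    (the 'hole'); fade scales the key for fade-up/fade-down of the effect."""
    k = key * fade
    return [(1 - k) * b + k * f for b, f in zip(background, foreground)]


def dissolve(source_a, source_b, position):
    """Cross-fade two sources; position runs 0.0 -> 1.0 over the duration.
    The text notes audio is dissolved the same way during most effects."""
    return [(1 - position) * a + position * b for a, b in zip(source_a, source_b)]


bg = [0.2, 0.2, 0.2]   # a dark background pixel (illustrative RGB)
fg = [1.0, 1.0, 0.0]   # a title graphic pixel
print(key_mix(bg, fg, key=1.0, fade=0.5))   # half-faded key
print(dissolve(bg, fg, position=0.25))
```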
In order to understand the nature of the different menus displayed on the data monitor 14, it will be helpful to consider a particular example of an editing system 10 having both primary menus and secondary menus. The primary menus are designed to accomplish the more commonly performed editing tasks. They typically display a moderate amount of information, allowing the editor to keep track of numerous steps without changing menus while at the same time limiting the total information displayed on any one menu in order to avoid confusion and clutter.
There are six primary menus which include the following:

1. System Parameters-This is the first menu to appear on the data monitor 14 after the editing system is turned on. It is used primarily as a reference point, or dispatcher, from which to select other menus for editing. General system set-up controls are typically located on this menu.
2. Edit Decision List-This menu displays a portion of the current edit decision and provides the control to modify it. Also displayed are controls for previewing edits with the switcher 54.
3. Edit Decision List Configuration-This menu is used to select the edit decision parameters. The menu lists all parameters that may be used as headings on the edit decision list and their current configuration. The menu retains multiple versions of the edit decision list headings and provides the controls to add, remove, or rearrange parameters within the edit decision listings.
4. Edit Construction-This menu comprises an active display of the particular edit under construction. It indicates and allows the operator to select in-edit and out-edit points, transports, reels, and effects. This menu can be combined with a number of secondary menus to accommodate more specific editing requirements.
5. Transport Configuration-This menu provides for selection and the display of the status of transport parameters such as logical order, intelligent line controller channel, interface type, reel number and transport status.
The secondary menus in the present example are designed for discrete, specialized editing functions and are not included in the primary menus. All secondary menus except for a numerical keypad appear as additions to the lower portion of the primary menus. The keypads allow the operator to enter numerical data such as pre-roll and post-roll duration. Transport controls for the video tape recorders provide play, rewind, fast forward, search and shuttle functions.
Special effects are carried out with a variety of flexible controls. Split edit modes of audio and video may be selected from different points on the source video tapes. Controls for preview edits, cuts, dissolves, wipes, fades to or from black and key modes with programmable duration are included, as well as controls over rolls.
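A menu of the kind described above can be modelled as a table that maps each touchable label either to an editing action or to another menu. The sketch below uses primary menu names mentioned in the text, but the labels, action names and table layout are illustrative assumptions, not the system's actual menu contents.

```python
# Hypothetical menu table: each entry maps a displayed label to the action
# taken and the menu shown next (None means stay on the current menu).
MENUS = {
    "system parameters": {
        "EDIT DECISION LIST": (None, "edit decision list"),
        "EDIT CONSTRUCTION": (None, "edit construction"),
    },
    "edit construction": {
        "MARK ENTRY": ("mark_entry_point", None),
        "MARK EXIT": ("mark_exit_point", None),
        "PREVIEW": ("preview_edit", None),
        "SYSTEM PARAMETERS": (None, "system parameters"),
    },
}


def handle_touch(current_menu: str, label: str):
    """Return (editing_action, next_menu) for a touched label."""
    action, next_menu = MENUS[current_menu][label]
    return action, next_menu or current_menu


print(handle_touch("edit construction", "MARK ENTRY"))
# ('mark_entry_point', 'edit construction')
```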
Fig. 7 shows the details of a typical one of the interfaces such as the machine interfaces 62, 64, 66 and 68. As previously noted each such interface provides bidirectional communications between the controller 44 and the recorder being controlled. The input to the interface includes a pair of serial data lines 100 and 102. The line 100 provides serial data from a serial data processor 104 to the edit controller 44. Conversely, the line 102 provides serial data from the edit controller 44 to the serial data processor 104. The serial data processor 104 includes a universal asynchronous receiver transmitter which converts incoming serial data to parallel form before applying it to a CPU and memory 105. A personality module 106 is peculiar to the particular peripheral device being interfaced such as a tape recorder 108. The personality module 106 translates the parallel data from the CPU and memory 105 into transport commands in proper form for use by the tape recorder 108. As previously noted different peripheral devices require different types of signals. For example some tape recorders require a DC voltage to control capstan speed, while others require an AC signal whose frequency determines tape speed.
The tape recorder 108 is coupled to the CPU and memory 105 both through the personality module 106 and a time code processor/reader 110. The connection from the tape recorder 108 to the time code processor/reader 110 provides communication for the time code track between the tape and the time code processor/reader 110.
Typically, the time code track is read from the tape on the tape recorder 108 and passed to the time code processor/reader 110 where it is decoded from a serial bit stream to parallel data.
Output signals from the interface of Fig. 7 to the edit controller 44 consist of transport tally signals, status signals and time code data. Tally and status signals are buffered and then passed to the data bus 84 within the edit controller 44.
Serial time code data is converted to parallel data by the time code processor/reader 110 as previously noted. This data is then communicated to the edit controller 44.
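The role of the personality module 106 described above can be sketched as a mapping from an abstract transport command to the electrical form a particular recorder expects: a DC level for one machine and a frequency for another, as the text notes. The class names, the 0 to +10 V range and the frequency range are assumptions for illustration only.

```python
class DCVoltagePersonality:
    """Transport whose capstan is driven by a DC level (0 V to +10 V here,
    matching the example in the text); the sign of 'speed' selects direction
    in this sketch."""
    def shuttle(self, speed: float) -> dict:
        speed = max(-1.0, min(1.0, speed))
        return {"control": "dc_volts", "value": round(abs(speed) * 10.0, 2),
                "direction": "forward" if speed >= 0 else "reverse"}


class FrequencyPersonality:
    """Transport whose tape speed is set by the frequency of an AC signal;
    the 0-9600 Hz range is an assumed illustration."""
    def shuttle(self, speed: float) -> dict:
        speed = max(-1.0, min(1.0, speed))
        return {"control": "frequency_hz", "value": round(abs(speed) * 9600),
                "direction": "forward" if speed >= 0 else "reverse"}


# The same parallel command from the interface CPU, rendered two ways:
for module in (DCVoltagePersonality(), FrequencyPersonality()):
    print(module.shuttle(0.5))
```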
Referring again to Fig. 6 it was previously noted that the human interface 42 is coupled to the central processing unit 74 by an intelligent line controller 82. In the present example the edit controller 44 includes the intelligent line controller 82 and three additional intelligent line controllers 112, 114 and 116. The intelligent line controller 112 couples the central processing unit 74 to the machine interface 62. The intelligent line controller 114 couples the central processing unit 74 to the switcher interface 72. The intelligent line controller 116 couples the central processing unit 74 to the machine interface 68.
Each of the intelligent line controllers 82, 112, 114 and 116 comprises an intelligent communications interface between the edit controller 44 and external peripheral apparatus such as the audio/video tape recorders and the human interface 42. Data from the central processing unit 74 communicated on the data bus 84 is in parallel form. Each of the controllers 82, 112, 114 and 116 converts the data to serial form before applying the data to the peripheral apparatus. Conversely, data communicated from the peripheral apparatus to the central processing unit 74 is converted from serial to parallel form by the various controllers 82, 112, 114 and 116. In addition to conversion of the data form, the various controllers 82, 112, 114 and 116 act as buffers between the peripheral apparatus and the central processing unit 74. The intelligent line controllers also verify data input formats, convert data formats and perform certain primitive controller functions such as automatic time code and status requests. The central processing unit 74 cannot communicate with two or more pieces of peripheral apparatus simultaneously. However, the effect of the presence of the various controllers is to provide for substantially simultaneous communication by receiving and holding data until the central processing unit 74 is free to receive such data.
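The "virtually simultaneous" communication is essentially a buffering effect: each intelligent line controller holds traffic from its peripheral until the central processing unit is free, and holds traffic from the central processing unit until the peripheral is ready. A minimal sketch, with queue structure and method names chosen only for illustration:

```python
from collections import deque


class IntelligentLineController:
    """Buffers and reformats traffic between one peripheral and the CPU 74."""

    def __init__(self, name: str):
        self.name = name
        self.to_cpu = deque()     # data held until the CPU polls this ILC
        self.to_device = deque()  # data held until the peripheral is ready

    def receive_serial(self, byte_stream: bytes):
        """Serial data arriving from the peripheral is held in parallel form."""
        self.to_cpu.append(list(byte_stream))

    def cpu_write(self, words):
        """Parallel data from the CPU, queued for serial transmission."""
        self.to_device.append(bytes(words))

    def cpu_poll(self):
        """Called when the CPU 74 is free; returns one buffered message."""
        return self.to_cpu.popleft() if self.to_cpu else None


# Several ILCs can accept peripheral data while the CPU is busy elsewhere.
ilcs = [IntelligentLineController(n) for n in ("source 46", "record 52")]
ilcs[0].receive_serial(b"\x01\x02")      # say, time code bytes
ilcs[1].receive_serial(b"\x7f")          # say, a status byte
ilcs[1].cpu_write([0x10, 0x20])          # a transport command awaiting transmission
print([ilc.cpu_poll() for ilc in ilcs])
```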
The various intelligent line controllers 82, 112, 114 and 116 are identical in construction, with one of them being shown in Fig. 8. The intelligent line controller shown in Fig. 8 includes a first bus transceiver 118 coupled to the central processing unit 74 to communicate address and data between the intelligent line controller and the central processing unit 74. The bus transceiver 118 is coupled to an ILC CPU 120 by an address bus 122 and a data bus 124. The address bus 122 is also coupled to a 4K RAM 126 and to a 1K ROM 128. The data bus 124 is also coupled to the 4K RAM 126 and to the 1K ROM 128. A second bus transceiver 130 is coupled to the ILC CPU 120 via a control bus 132 and to the central processing unit 74 and is operative to handle timing and control requests communicated between the intelligent line controller and the central processing unit 74.
An address comparator 134 has two different inputs, one from the bus transceiver 118 and the other from a board address select switch group 136. The address comparator 134 compares the contents of the address received from the central processing unit 74 with the setting of the board address select switch group 136. If a valid comparison is made, the comparator 134 requests control of the data and address buses 122 and 124 from the ILC CPU 120. When such control is relinquished by the intelligent line controller, data is transferred from the central processing unit 74 to the intelligent line controller RAM 126. The ILC CPU 120 in turn formats the data in the RAM 126 and then transmits the data via a universal asynchronous receiver transmitter 138 to the interface as shown in Fig. 7. Incoming serial data from the interface is converted to parallel form by the universal asynchronous receiver transmitter 138, which generates and communicates an interrupt signal to the ILC CPU 120. When the ILC CPU 120 services the interrupt signal, the incoming data is formatted and written in the RAM 126. The new data in the RAM 126 is then read by the central processing unit 74 by requesting bus control from the ILC CPU 120 and reading the data. The frequency at which the serial signals from the universal asynchronous receiver transmitter 138 are communicated to the interface is determined by a baud rate generator 140 in conjunction with a baud rate select switch group 142.
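The board-address comparison can be sketched as follows; the 4-bit switch width and the address layout are assumptions for illustration, not the actual bus map of the edit controller.

```python
def board_selected(bus_address: int, switch_group: int, width: int = 4) -> bool:
    """Compare the high-order bits of the address from the CPU 74 with the
    setting of the board address select switches 136 (switch width assumed)."""
    return (bus_address >> (16 - width)) == (switch_group & ((1 << width) - 1))


# With the switches set to 0b0011, only addresses 0x3000-0x3FFF select this ILC.
print(board_selected(0x3420, 0b0011))  # True: request bus control, do the transfer
print(board_selected(0x7420, 0b0011))  # False: ignore this bus cycle
```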
The switcher and effects generator 54 is shown in detail in Fig. 9. The intelligent line controller 114 associated with the switcher 54 is coupled through the switcher interface 72 to the switcher 54. The switcher interface 72, which may be of like construction to that shown in Fig. 7, is coupled to receive composite sync and composite blanking signals. The switcher interface 72 has an address bus 144 and a data bus 146 at the output thereof. The buses 144 and 146 are coupled to a video processor/mixer 148 at the output of the switcher 54 by either a video crosspoint circuit 150 or a waveform generator 152 and a video processor 154. The buses 144 and 146 are also coupled to an audio crosspoint circuit 156.
The switcher and effects generator 54 is used principally for mixing and wiping operations. In the case of a mixing operation one of the inputs to the audio crosspoint circuit 156 and one of the inputs to the video crosspoint circuit 150 are applied to the video processor/mixer 148. The video processor 154 is also used in such operation. A wiping operation is performed using the waveform generator 152 and the video processor 154.
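A wipe of the kind performed by the waveform generator 152 and video processor 154 can be sketched per pixel: a generated waveform (here a horizontal ramp) is compared against a moving threshold to select between the two sources, with a softness term giving the hard or soft edge mentioned earlier. This is a generic sketch; the numbers and the line-based representation are illustrative, not the actual circuit.

```python
def horizontal_wipe(line_a, line_b, position, softness=0.0):
    """Mix one scan line of two sources as a left-to-right wipe.

    position runs 0.0 (all source A) to 1.0; as it increases, source B is
    revealed from the left. softness widens the transition band to give a
    soft edge instead of a hard one.
    """
    n = len(line_a)
    out = []
    for x, (a, b) in enumerate(zip(line_a, line_b)):
        ramp = x / (n - 1)                        # the generated waveform
        if softness == 0.0:
            k = 1.0 if ramp < position else 0.0   # hard edge
        else:
            k = min(1.0, max(0.0, (position - ramp) / softness + 0.5))
        out.append(k * b + (1 - k) * a)
    return out


line_a = [0.0] * 8          # black source
line_b = [1.0] * 8          # white source
print(horizontal_wipe(line_a, line_b, position=0.5, softness=0.25))
```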
Fig. 10 is a flow chart depicting the manner in which the editing system 10 uses the data monitor 14 together with the touch input system 30 to convert decisions by the operator into editing actions.
In a step 160 depicted in Fig. 10, the system 10 creates via the edit controller 44 a list of valid choices to be presented to the operator. At different points in an editing process, a different set of operator requests is considered valid. The edit controller 44 first determines this set of choices in the step 160. Usually, this is a function of the type of action currently being performed. In addition, the edit controller 44 may have to examine information received from other transports or sources as well as information previously stored in the memory 76.
In the next step 162, each valid choice is displayed on the screen 28 of the data monitor 14. Associated with each choice in the set of valid choices are several important pieces of information. One of these is a description of where on the screen 28 the choice should be presented to the operator. An additional piece of information is what label should be printed on the screen 28 so that the operator is able to identify and understand the choice. Once a valid set of choices has been assembled, each label is displayed at its corresponding position on screen 28. The operator is thus made aware of what the current set of choices consists of and is expected to now select one of them.
In a next step 164, the screen 28 of the data monitor 14 is scanned and a decision 166 is made as to whether or not a beam has been broken. When the operator has determined which of the labels displayed on the screen 28 corresponds to the action he wants the editing system 10 to take, he places a finger (or other object) on that label on the screen 28. As previously noted, the light emitting diodes 32 and 34 mounted adjacent the screen 28 emit infra-red beams which are directed onto corresponding ones of the photodetectors 36 and 38. Placement of the operator's finger at a particular location on the screen 28 results in one or more of the photodetectors 36 and 38 sensing interruption of the infrared beams normally received thereby. By periodically examining the outputs of the photodetectors 36 and 38, the editing system detects the presence of a finger. By assigning numeric values to the various photodetectors 36 and 38, the editing system can also calculate numbers which represent the horizontal and vertical coordinates of the finger on the screen 28. As a result of the periodic examination of the outputs of the photodetectors 36 and 38, the decision 166 indicates whether or not one or more of the infrared beams have been broken.
If the decision 166 is that one or more of the infrared beams have been broken, then in a next step 168 the distance from the finger to each of the choices displayed on the screen 28 is calculated. It was previously noted that each choice displayed on the screen 28 has associated with it an indication of a position on the screen where it is displayed. Each element in a set of choices is examined and the distance between the position associated with that choice and the position of the finger is calculated. The choice which has the smallest distance from the position of the finger is assumed to be the one intended by the operator. If the finger is too far away from the nearest choice, no action is taken. These actions are represented by a decision 170 shown in Fig. 10.
If the decision 170 results in a "yes" determination indicating that the finger has been placed at least within a nominal distance of one of the choices and that choice has been determined to be the one desired by the operator, the value associated with that choice is computed in a following step 172. Associated with each choice in a set of determined choices is a value. During the step 172 the value associated with the choice selected by the operator is determined. Using this value, the editing system 10 performs the desired action which is depicted in a following step 174 in Fig. 10. The value determined in the step 172 corresponds to the type of signal provided when either the designated keyboard 20 or the ASCII keyboard 22 is utilized.
The various operations depicted in the flow chart of Fig. 10 can be better understood by considering an example. If it is assumed that at a particular point in an editing session the operator is expected to enter a digit between 0 and 9, then under these conditions there are ten possible choices. Associated with each of the choices is a position on the screen 28 of the data monitor 14.
Each choice has a label printed on the screen 28 so that the numerals "1", "2", "3", etc., are displayed on the screen. If the operator wants to select the numeral "7", he places a finger on the screen 28 at the place where the label "7" has been displayed. The infra-red beams corresponding to this location are broken and the editing system determines the horizontal and vertical coordinates of the finger. Each of the ten possible choices is examined, and the distance between its position on the screen 28 and the position of the finger on the screen 28 is calculated. In the present example, the choice corresponding to the numeral "7" is determined to be the closest. Associated with each choice is a value. In the present example, the value for "7" might be the ASCII code for "7". This is the value which most computer and communication equipment uses to represent a key labelled "7".
The editing system 10 now uses this value as if it had come from a conventional keyboard or communication device such as the designated keyboard 20 or the ASCII keyboard 22.
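Putting the steps of Fig. 10 together, the selection loop can be sketched as: assemble the valid choices with their screen positions and values, read the touch coordinates, find the nearest choice within a tolerance, and hand its value to the rest of the system. The positions, tolerance and screen geometry below are illustrative assumptions; the values follow the ASCII convention of the digit example above.

```python
import math

# Steps 160/162: the set of valid choices, each with a screen position, a
# label and a value (here the ASCII code, matching the digit example).
choices = [{"label": str(d), "pos": (40 + 20 * d, 150), "value": ord(str(d))}
           for d in range(10)]

MAX_DISTANCE = 15.0   # assumed tolerance; farther touches are ignored (decision 170)


def select_choice(touch_xy):
    """Steps 164-172: given the touch coordinates, return the nearest choice's
    value, or None if no choice is close enough."""
    nearest = min(choices, key=lambda c: math.dist(c["pos"], touch_xy))
    if math.dist(nearest["pos"], touch_xy) > MAX_DISTANCE:
        return None
    return nearest["value"]


value = select_choice((181, 153))            # finger near the "7" label
print(value, chr(value) if value else "")    # 55 '7'
```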

Claims
1. An editing system in which information from at least one source is recorded under the control of a controller, the system including a monitor coupled to the controller for providing a visual display of editing information, sensing means responsive to the touching of or close proximity to selected areas of the visual display by an operator for providing an indication of the area touched to the controller and control means for providing for an editing function in response to the indication provided to the controller.
2. An editing system according to claim 1, further comprising means associated with the controller for changing the visual display in response to the indication provided to the controller.
3. An editing system according to claim 1 or claim 2 wherein the monitor comprises a cathode ray tube having a screen and the sensing means includes means for providing a network of rays across and adjacent the screen and means for providing an indication when at least one of the rays is interrupted by an operator.
4. An editing system according to any foregoing claim, including means for providing different menus of editing information for display on the visual display, and wherein the editing function comprises selecting one of the menus of editing information for display on the visual display.
5. An editing system according to claim 4, wherein the control means is arranged to select a menu of editing information for display on the visual display based on the editing information displayed at a particular location of the visual display touched by an operator.
6. An editing system according to any foregoing claim and including at least one source of information, a recorder, a switcher for selectively coupling the source of information to the recorder, a processor for controlling the source of information, the recorder and the switcher, a memory coupled to the processor and a character generator coupled to the processor for providing characters to be displayed under the control of the processor, the monitor being coupled to the character generator for displaying characters provided by the character generator.
7. An editing system according to claim 6, further including a data storage system coupled to the processor.
8. An editing system according to claim 6 further comprising a plurality of controllers coupled between the processor and the at least one source of information, the recorder, the switcher and the sensing means, each of the controllers being operative to convert data between serial and parallel forms.
9. An editing system according to claim 6 further comprising a plurality of interfaces coupled between the processor and the at least one source of information, the recorder and the switcher, each of the interfaces being operative to convert data between serial and machine usable forms.
10. An editing system comprising the combination of at least one source of information to be edited, a recorder for recording information being edited and means for controlling the recording of information to be edited in the recorder, the means for controlling including a visual display, means for providing different menus of editing information for display on the visual display, and means responsive to the touching of or close proximity to selected areas of the visual display by an operator for selecting different ones of the menus of editing information for display on the visual display.
11. An editing system comprising the combination of at least one source of information, a recorder, a switcher for selectively coupling the source of information to the recorder, a processor for controlling the source of information, the recorder and the switcher, a memory coupled to the processor, a character generator coupled to the processor for providing characters to be displayed under the control of the processor, a display monitor coupled to the character generator for displaying characters provided by the character generator and means responsive to selection of a particular portion of the display on the monitor for providing a corresponding indication to the processor.
12. The invention set forth in claim 11 wherein the means for providing a corresponding indication is responsive to the proximity of a human digit to particular portions of the display on the monitor.
13. An editing system comprising the combination of a central processing unit, a memory coupled to the central processing unit, a character generator coupled to the central processing unit, an operator's interface coupled to the character generator, means coupling the human interface to the central processing unit, at least one source recorder, a first controller coupling the at least one source recorder to the central processing unit, a recording recorder, a second controller coupling the recording recorder to the central processing unit, a switcher coupled between the at least one source recorder and the recording recorder and a third controller coupled between the switcher and the central processing unit, the first, second and third controller each being operative to store and convert the form of data being transferred from or to the central processing unit.
14. An editing system according to claim 13, wherein at least one of the first, second and third controllers includes a central processing unit and a memory.
15. An editing system according to claim 13, wherein at least one of the first, second and third controllers includes a processor, a transceiver coupled to the processor, a memory coupled to the processor and to the transceiver and an address comparator coupled between the transceiver and the processor.
16. An editing system according to claim 15 further including a receiver transmitter coupled to the processor and operative to convert data between parallel and serial forms.
17. An editing system according to claim 13, further including a first interface coupled between the first controller and the at least one source recorder, a second interface coupled between the second controller and the recording recorder and a third interface coupled between the third controller and the switcher, each of the first, second and third interfaces including a processor for converting data between serial and machine usable forms.
18. An editing system according to claim 17, wherein at least one of the first, second and third interfaces includes a serial data processor coupled to one of the controllers, a time code processor coupled to the serial data processor and a data conversion module coupled to the serial data processor.
19. An audio/video editing system comprising the combination of an edit controller, a data monitor coupled to the edit controller and having a screen for providing a display, an interface coupled to the edit controller and operative to provide to the edit controller signals representing specific portions of the display on the monitor manually selected by an operator, the edit controller providing displays on the data monitor and being operative to change the displays in accordance with signals provided thereto from the interface, a plurality of source audio/video tape recorders coupled to the edit controller, a record audio/video tape recorder coupled to the edit controller and a switcher coupled between the plurality of source audio/video tape recorders and the record audio/video tape recorder, the switcher also being coupled to the edit controller, the edit controller being operative to control the plurality of source audio/video tape recorders, the record audio/video tape recorder and the switcher.
20. A system according to claim 19, further including a data storage system coupled to the edit controller.
21. An editing system according to claim 19, wherein the interface includes a means for sensing proximity of an operator's hand or part thereof.
22. An editing system according to claim 21, wherein the interface also includes at least one keyboard having a plurality of buttons thereon representing predetermined editing functions.
23. An editing system according to claim 19, further including a plurality of interfaces, each coupled between the edit controller and a different one of the plurality of source audio/video tape recorders, the record audio/video tape recorder and the switcher.
24. An editing system according to claim 19, wherein the edit controller includes a plurality of intelligent line controllers, each coupled to a different one of the plurality of source audio/video tape recorders, the record audio/video tape recorder, the switcher and the interface.
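Claims 10 to 12 turn on mapping a touch, or the near proximity of an operator's finger, on the display screen to a particular menu of editing functions. The specification does not disclose source code, so the following C sketch is an editorial illustration only: the light-beam grid size, soft-key layout and menu names are assumptions, not taken from the patent.

/* Minimal sketch (not from the patent): mapping a touch position reported by
 * an optical touch grid to a soft-key cell, and switching the displayed menu.
 * Grid size, menu names and key layout are illustrative assumptions. */
#include <stdio.h>

#define TOUCH_COLS 32          /* assumed horizontal beam count            */
#define TOUCH_ROWS 24          /* assumed vertical beam count              */
#define KEY_COLS    4          /* soft keys per row on the displayed menu  */
#define KEY_ROWS    6          /* soft-key rows                            */

enum menu { MENU_MAIN, MENU_MARK, MENU_PREVIEW, MENU_COUNT };

static const char *menu_name[MENU_COUNT] = { "MAIN", "MARK", "PREVIEW" };

/* Convert interrupted-beam coordinates into a soft-key index (0..23). */
static int touch_to_key(int beam_col, int beam_row)
{
    int kc = beam_col * KEY_COLS / TOUCH_COLS;
    int kr = beam_row * KEY_ROWS / TOUCH_ROWS;
    return kr * KEY_COLS + kc;
}

int main(void)
{
    enum menu current = MENU_MAIN;

    /* Simulated touch at beam (26, 3): top-right area of the screen. */
    int key = touch_to_key(26, 3);
    printf("touched soft key %d on menu %s\n", key, menu_name[current]);

    /* Illustrative rule: key 3 of the main menu calls up the MARK menu. */
    if (current == MENU_MAIN && key == 3)
        current = MENU_MARK;

    printf("now displaying menu %s\n", menu_name[current]);
    return 0;
}

In this sketch the controller never needs to know which physical beams were broken; it only works in soft-key coordinates, so the same dispatch logic serves every menu it draws.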
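Claims 13 to 16 describe line controllers that sit between the central processing unit and the machines, accept only traffic matching their own address, and convert data between parallel and serial forms. As a rough illustration only (the frame layout, address width and acceptance rule below are assumptions, not the patent's), this sketch shows a UART-style parallel-to-serial framing and the address-comparator test such a controller might apply before acting on a frame.

/* Rough illustration (assumed frame format, not the patent's): a data byte is
 * framed with a start bit, 8 data bits (LSB first) and a stop bit, and each
 * frame is preceded by an address byte that a controller compares with its
 * own station address before accepting the payload. */
#include <stdio.h>
#include <stdint.h>

/* Parallel byte -> 10-bit serial frame: start(0), 8 data bits LSB first, stop(1). */
static void byte_to_serial(uint8_t byte, int bits[10])
{
    bits[0] = 0;                               /* start bit */
    for (int i = 0; i < 8; i++)
        bits[1 + i] = (byte >> i) & 1;         /* data bits */
    bits[9] = 1;                               /* stop bit  */
}

/* Serial frame -> parallel byte (framing errors ignored for brevity). */
static uint8_t serial_to_byte(const int bits[10])
{
    uint8_t byte = 0;
    for (int i = 0; i < 8; i++)
        byte |= (uint8_t)(bits[1 + i] << i);
    return byte;
}

/* Address comparator: act on the payload only if it is addressed to us. */
static int accept_frame(uint8_t my_address, uint8_t frame_address)
{
    return my_address == frame_address;
}

int main(void)
{
    const uint8_t MY_ADDRESS = 0x03;           /* assumed station address  */
    int addr_bits[10], data_bits[10];

    byte_to_serial(0x03, addr_bits);           /* address byte on the link */
    byte_to_serial('P', data_bits);            /* payload, e.g. "play"     */

    uint8_t addr = serial_to_byte(addr_bits);
    uint8_t data = serial_to_byte(data_bits);

    if (accept_frame(MY_ADDRESS, addr))
        printf("controller 0x%02X accepted command 0x%02X\n", MY_ADDRESS, data);
    else
        printf("controller 0x%02X ignored frame for 0x%02X\n", MY_ADDRESS, addr);
    return 0;
}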
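The time code processor of claim 18 implies converting between tape time code and absolute frame counts so that edit points can be compared and cued. The patent gives no formulas; the sketch below assumes non-drop-frame SMPTE time code at 30 frames per second, which is one plausible convention for a system of this kind, and the 90-frame offset is purely an example.

/* Minimal sketch, assuming non-drop-frame SMPTE time code at 30 fps (an
 * assumption; the claim only names a "time code processor"): convert
 * HH:MM:SS:FF to an absolute frame count and back, as an interface might do
 * when comparing an edit-in point against the current tape position. */
#include <stdio.h>

#define FPS 30L

struct timecode { long hh, mm, ss, ff; };

static long tc_to_frames(struct timecode t)
{
    return ((t.hh * 60 + t.mm) * 60 + t.ss) * FPS + t.ff;
}

static struct timecode frames_to_tc(long frames)
{
    struct timecode t;
    t.ff = frames % FPS;  frames /= FPS;
    t.ss = frames % 60;   frames /= 60;
    t.mm = frames % 60;   frames /= 60;
    t.hh = frames;
    return t;
}

int main(void)
{
    struct timecode edit_in = { 1, 2, 3, 15 };      /* 01:02:03:15 */
    long target = tc_to_frames(edit_in);

    /* Tape parked 90 frames (3 seconds) before the edit-in point. */
    struct timecode park = frames_to_tc(target - 90);

    printf("edit-in  = %02ld:%02ld:%02ld:%02ld (%ld frames)\n",
           edit_in.hh, edit_in.mm, edit_in.ss, edit_in.ff, target);
    printf("pre-roll = %02ld:%02ld:%02ld:%02ld\n",
           park.hh, park.mm, park.ss, park.ff);
    return 0;
}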
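Claims 19 to 24 describe the edit controller cueing several source VTRs and a record VTR and operating the switcher between them. The following sequence is likewise purely illustrative: the device names, command set and five-second pre-roll are assumptions, sketched only to show the order of operations such a controller might issue for a simple cut from one source onto the record machine.

/* Purely illustrative sequence (commands and 5-second pre-roll are assumed,
 * not taken from the patent): cue a source VTR and the record VTR ahead of
 * the edit point, roll both, and switch the source onto the record machine
 * at the in-point. */
#include <stdio.h>

#define FPS      30L
#define PREROLL  (5L * FPS)     /* assumed 5-second pre-roll */

static void command(const char *device, const char *action, long frame)
{
    printf("%-8s %-12s at frame %ld\n", device, action, frame);
}

int main(void)
{
    long source_in = 111705;    /* source edit-in point, in frames */
    long record_in =  54000;    /* record edit-in point, in frames */

    /* Park both transports ahead of their edit points. */
    command("VTR-A", "cue", source_in - PREROLL);
    command("VTR-R", "cue", record_in - PREROLL);

    /* Roll both machines in sync, then switch and drop into record. */
    command("VTR-A", "play", source_in - PREROLL);
    command("VTR-R", "play", record_in - PREROLL);
    command("SWITCHER", "take VTR-A", record_in);
    command("VTR-R", "record", record_in);
    return 0;
}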
GB8207089A 1981-04-09 1982-03-11 Audio/video editing system having touch responsive function display screen Expired GB2096868B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US25257181A 1981-04-09 1981-04-09

Publications (2)

Publication Number Publication Date
GB2096868A true GB2096868A (en) 1982-10-20
GB2096868B GB2096868B (en) 1985-09-11

Family

ID=22956583

Family Applications (1)

Application Number Title Priority Date Filing Date
GB8207089A Expired GB2096868B (en) 1981-04-09 1982-03-11 Audio/video editing system having touch responsive function display screen

Country Status (10)

Country Link
JP (1) JPH0644384B2 (en)
AU (1) AU545936B2 (en)
CA (1) CA1177969A (en)
DE (1) DE3213036A1 (en)
FR (1) FR2509075B1 (en)
GB (1) GB2096868B (en)
IE (1) IE52655B1 (en)
IT (1) IT8248111A0 (en)
NL (1) NL8201110A (en)
SE (1) SE454030B (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2126054A (en) * 1982-08-11 1984-03-14 Philips Electronic Associated Display system with nested information display
GB2154109A (en) * 1984-01-27 1985-08-29 Hitachi Shipbuilding Eng Co Ship collision preventive aid apparatus
GB2181627A (en) * 1985-09-10 1987-04-23 Interactive Tech Limited Information processing system
WO1988002958A1 (en) * 1986-10-16 1988-04-21 David Burton Control system
US4937685A (en) * 1983-12-02 1990-06-26 Lex Computer And Management Corporation Method of display presentation for video editing
US4939594A (en) * 1982-12-22 1990-07-03 Lex Computer And Management Corporation Method and apparatus for improved storage addressing of video source material
US4949193A (en) * 1983-12-02 1990-08-14 Lex Computer And Management Corporation Video composition method employing action scrolling
US4964004A (en) * 1983-12-02 1990-10-16 Lex Computer And Management Corporation Video composition method and apparatus employing visual and tactile feedback
US4979050A (en) * 1983-12-02 1990-12-18 Lex Computer And Management Corporation Video composition method for assembling video segments
GB2235815A (en) * 1989-09-01 1991-03-13 Compact Video Group Inc Digital dialog editor
EP0440408A1 (en) * 1990-01-29 1991-08-07 Pioneer Electronic Corporation Recording and reproduction method and apparatus
US5119474A (en) * 1989-06-16 1992-06-02 International Business Machines Corp. Computer-based, audio/visual creation and presentation system and method
GB2274232A (en) * 1993-01-09 1994-07-13 Ibm A data processing system
US5384667A (en) * 1989-05-05 1995-01-24 Quantel Limited Video processing system
ES2068071A2 (en) * 1992-04-13 1995-04-01 Rodriguez Francisco Casau Computer-controlled audio mixing console
US5798800A (en) * 1994-03-19 1998-08-25 Sony Corporation Apparatus for controlling a switcher and a special effects device
EP0902431A2 (en) * 1997-09-12 1999-03-17 Philips Patentverwaltung GmbH System for editing of digital video and audio information
FR2768843A1 (en) * 1997-09-24 1999-03-26 Sony Pictures Entertainment USER INTERFACE SYSTEM AND USER INTERFACE METHOD
GB2329811A (en) * 1997-09-24 1999-03-31 Sony Pictures Entertainment Providing Graphical User Interfaces for player/recorder systems.
ITPD20110405A1 (en) * 2011-12-22 2013-06-23 Edutech S R L MULTI-USER ELECTRONIC EQUIPMENT FOR GENERATING ANIMATED SEQUENCES

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU601671B2 (en) * 1982-12-22 1990-09-20 Lex Computer And Management Corporation Video composition method and apparatus
US4685003A (en) * 1983-12-02 1987-08-04 Lex Computing & Management Corporation Video composition method and apparatus for providing simultaneous inputting and sorting of video source material
JPS6084631A (en) * 1983-09-16 1985-05-14 Yokogawa Hewlett Packard Ltd Device for input on display screen
JPS6083129A (en) * 1983-10-14 1985-05-11 Toshiba Corp Display controller
US4943866A (en) * 1983-12-02 1990-07-24 Lex Computer And Management Corporation Video composition method and apparatus employing smooth scrolling
JPS60138627A (en) * 1983-12-27 1985-07-23 Hitachi Ltd Multi-item input device
EP0156593A3 (en) * 1984-03-22 1985-12-27 AMP INCORPORATED (a New Jersey corporation) Method and apparatus for mode changes and/or touch mouse control
US4692809A (en) * 1984-11-20 1987-09-08 Hughes Aircraft Company Integrated touch paint system for displays
JPS61245228A (en) * 1985-04-23 1986-10-31 Arupain Kk Position detecting method for optical touch panel
JPH0744674B2 (en) * 1986-01-31 1995-05-15 キヤノン株式会社 Recording / playback device
JPH0512828Y2 (en) * 1986-03-20 1993-04-05
JPS62158571U (en) * 1986-03-27 1987-10-08
JPS6472523A (en) * 1987-09-11 1989-03-17 Seiko Instr & Electronics Manufacture of semiconductor device
FR2630572B1 (en) * 1988-04-22 1990-08-24 Eliane De Latour Dejean METHOD OF MOUNTING ELEMENTS OF IMAGES AND / OR SOUNDS AND DEVICE FOR IMPLEMENTING SAME
JPH0322296U (en) * 1989-07-13 1991-03-07
EP0520655A3 (en) * 1991-06-28 1993-11-18 Ncr Int Inc Item selection method and apparatus
DE9216539U1 (en) * 1992-12-04 1993-02-25 Nitzsche, Wolfgang, 07552 Gera Arrangement of elements to make data visible, especially advertising data with feedback

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3478220A (en) * 1966-05-11 1969-11-11 Us Navy Electro-optic cursor manipulator with associated logic circuitry
FR2080100A5 (en) * 1970-02-24 1971-11-12 Omera Segid Optique Meca
US3721757A (en) * 1971-02-08 1973-03-20 Columbia Broadcasting Syst Inc Method and apparatus for automatically editing television information
GB1387286A (en) * 1972-03-27 1975-03-12 Cbs Inc Method and apparatus for automatically editing television information
DE2748453C2 (en) * 1976-10-29 1983-08-25 Ampex Corp., 94063 Redwood City, Calif. Arrangement for simultaneous recording of a plurality of digital signal components of an analog video information signal
US4271479A (en) * 1977-10-20 1981-06-02 International Business Machines Corporation Display terminal with modularly attachable features
CA1109539A (en) * 1978-04-05 1981-09-22 Her Majesty The Queen, In Right Of Canada, As Represented By The Minister Of Communications Touch sensitive computer input device
JPS5510639A (en) * 1978-07-10 1980-01-25 Hitachi Ltd Input-output display unit
JPS5520528A (en) * 1978-07-31 1980-02-14 Hitachi Ltd Picture introduction system for data processing terminal
JPS55110330A (en) * 1979-02-16 1980-08-25 Mitsubishi Electric Corp Information input unit

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2126054A (en) * 1982-08-11 1984-03-14 Philips Electronic Associated Display system with nested information display
US4939594A (en) * 1982-12-22 1990-07-03 Lex Computer And Management Corporation Method and apparatus for improved storage addressing of video source material
US4949193A (en) * 1983-12-02 1990-08-14 Lex Computer And Management Corporation Video composition method employing action scrolling
US4979050A (en) * 1983-12-02 1990-12-18 Lex Computer And Management Corporation Video composition method for assembling video segments
US4964004A (en) * 1983-12-02 1990-10-16 Lex Computer And Management Corporation Video composition method and apparatus employing visual and tactile feedback
US4937685A (en) * 1983-12-02 1990-06-26 Lex Computer And Management Corporation Method of display presentation for video editing
GB2154109A (en) * 1984-01-27 1985-08-29 Hitachi Shipbuilding Eng Co Ship collision preventive aid apparatus
GB2181627B (en) * 1985-09-10 1989-12-28 Interactive Tech Limited Information processing system
GB2181627A (en) * 1985-09-10 1987-04-23 Interactive Tech Limited Information processing system
WO1988002958A1 (en) * 1986-10-16 1988-04-21 David Burton Control system
AU604783B2 (en) * 1986-10-16 1991-01-03 Compumedics Limited Control system
US5384667A (en) * 1989-05-05 1995-01-24 Quantel Limited Video processing system
US5119474A (en) * 1989-06-16 1992-06-02 International Business Machines Corp. Computer-based, audio/visual creation and presentation system and method
GB2235815A (en) * 1989-09-01 1991-03-13 Compact Video Group Inc Digital dialog editor
EP0440408A1 (en) * 1990-01-29 1991-08-07 Pioneer Electronic Corporation Recording and reproduction method and apparatus
US5237426A (en) * 1990-01-29 1993-08-17 Pioneer Electronic Corporation Record regenerative method and regenerative apparatus
ES2068071A2 (en) * 1992-04-13 1995-04-01 Rodriguez Francisco Casau Computer-controlled audio mixing console
EP0606735A2 (en) * 1993-01-09 1994-07-20 International Business Machines Corporation A data processing system
GB2274232A (en) * 1993-01-09 1994-07-13 Ibm A data processing system
EP0606735A3 (en) * 1993-01-09 1995-04-26 Ibm A data processing system.
US5798800A (en) * 1994-03-19 1998-08-25 Sony Corporation Apparatus for controlling a switcher and a special effects device
EP0902431A2 (en) * 1997-09-12 1999-03-17 Philips Patentverwaltung GmbH System for editing of digital video and audio information
US6185538B1 (en) 1997-09-12 2001-02-06 Us Philips Corporation System for editing digital video and audio information
EP0902431A3 (en) * 1997-09-12 1999-08-11 Philips Patentverwaltung GmbH System for editing of digital video and audio information
GB2329812A (en) * 1997-09-24 1999-03-31 Sony Pictures Entertainment Configurable Graphic User Interface for audio/video data manipulation
FR2770735A1 (en) * 1997-09-24 1999-05-07 Sony Pictures Entertainment METHOD AND APPARATUS FOR GRAPHICAL USER INTERFACE
GB2329811A (en) * 1997-09-24 1999-03-31 Sony Pictures Entertainment Providing Graphical User Interfaces for player/recorder systems.
JPH11259960A (en) * 1997-09-24 1999-09-24 Sony Pictures Entertainment Track control method and controller
FR2768843A1 (en) * 1997-09-24 1999-03-26 Sony Pictures Entertainment USER INTERFACE SYSTEM AND USER INTERFACE METHOD
GB2329811B (en) * 1997-09-24 2002-02-27 Sony Pictures Entertainment Providing graphical user interfaces for player/recorder systems
GB2329812B (en) * 1997-09-24 2002-04-10 Sony Pictures Entertainment User interface systems and methods
US7167763B2 (en) 1997-09-24 2007-01-23 Sony Corporation Method and apparatus for providing a graphical user interface for a player/recorder system
ITPD20110405A1 (en) * 2011-12-22 2013-06-23 Edutech S R L MULTI-USER ELECTRONIC EQUIPMENT FOR GENERATING ANIMATED SEQUENCES

Also Published As

Publication number Publication date
IE820688L (en) 1982-10-09
IT8248111A0 (en) 1982-03-29
GB2096868B (en) 1985-09-11
DE3213036C2 (en) 1989-10-19
SE454030B (en) 1988-03-21
FR2509075B1 (en) 1988-03-18
CA1177969A (en) 1984-11-13
AU545936B2 (en) 1985-08-08
IE52655B1 (en) 1988-01-06
SE8202264L (en) 1982-10-10
NL8201110A (en) 1982-11-01
AU8107082A (en) 1982-10-14
JPH0644384B2 (en) 1994-06-08
FR2509075A1 (en) 1983-01-07
DE3213036A1 (en) 1982-12-09
JPS57178532A (en) 1982-11-02

Similar Documents

Publication Publication Date Title
US4521870A (en) Audio/video system having touch responsive function display screen
CA1177969A (en) Audio/video editing system having touch responsive function display screen
US10674216B2 (en) Electronic apparatus, display controlling method for electronic apparatus and graphical user interface
US4538188A (en) Video composition method and apparatus
US7174518B2 (en) Remote control method having GUI function, and system using the same
US10013154B2 (en) Broadcast control
US5206929A (en) Offline editing system
EP1619888B1 (en) Electronic apparatus, display controlling method for electronic apparatus and graphical user interface
EP0625783B1 (en) Method and apparatus for displaying available source material for editing
US6744968B1 (en) Method and system for processing clips
EP0560624B1 (en) Electronic video system with simultaneous real-time processing
Nicholls A New Edit Room Using One-Inch Continuous-Field Helical VTRs
US5050003A (en) Image processing apparatus capable of displaying a plurality of screens
MXPA97005547A Apparatus and method for controlling the presentation of the electronic program guide
WO1993021595A1 (en) Media composer including pointer-based display of sequentially stored samples
KR19990067919A (en) Editing system and editing method
JPH11213174A (en) Animation editing method
KR100481415B1 (en) Centralized control system of electronic equipment
JPH09259515A AV controller
US20050117878A1 (en) Data editing apparatus, data editing method and data recording/reproducing apparatus
JPH08125996A (en) Screen display system in video display device
JPH10112889A (en) Pointer display controller and display controlling method
US5444580A (en) Apparatus for recording/reproducing digital data on/from recording medium
JP2000312390A (en) Display device
JPH05176225A Virtual controller for automating a video editing studio

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee