US20010037499A1 - Method and system for recording auxiliary audio or video signals, synchronizing the auxiliary signal with a television signal, and transmitting the auxiliary signal over a telecommunications network - Google Patents

Method and system for recording auxiliary audio or video signals, synchronizing the auxiliary signal with a television signal, and transmitting the auxiliary signal over a telecommunications network

Info

Publication number
US20010037499A1
US20010037499A1 (application US09/816,632, US81663201A)
Authority
US
United States
Prior art keywords
signal
auxiliary
video signal
video
auxiliary signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/816,632
Inventor
David Turock
Richard Robbins
Warren Gifford
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US09/816,632
Publication of US20010037499A1
Legal status: Abandoned (current)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234336Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by media transcoding, e.g. video is transformed into a slideshow of still pictures or audio is converted into text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2368Multiplexing of audio and video streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/262Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
    • H04N21/2625Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists for delaying content or additional data distribution, e.g. because of an extended sport event
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4341Demultiplexing of audio and video streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/08Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division

Abstract

A method and system for recording an auxiliary signal, synchronizing the auxiliary signal with a video signal, and transmitting the auxiliary signal over a telecommunications network is provided.
The method includes receiving a video signal and generating an auxiliary signal derived at least in part from the video signal. The auxiliary signal is transmitted over a telecommunications network and the video signal is delayed as a function of the auxiliary signal. The auxiliary signal and video signal are synchronized.
To accomplish this method, a system is used which includes at least one video signal receiver, an auxiliary signal recorder, at least two telecommunications network interfaces, a signal comparator, and a video signal buffer.

Description

    BACKGROUND OF THE INVENTION
  • When a television signal is recorded or broadcast, it typically includes a video signal with a synchronized audio signal “attached” to it. In many cases it is desirable for a person to view the video but to be able to listen to a different audio signal. For example, the person may not speak the language of the attached audio signal, the person may be sight impaired and need a more descriptive audio interpretation, or the language in the attached audio signal may offend the person. The person may desire to see the video essentially when it is delivered or broadcast, for example, with live news coverage or a sporting event, so that they can discuss it with their friends, perhaps even in the same room. [0001]
  • SUMMARY OF THE INVENTION
  • The present invention includes a method and system for recording an auxiliary signal, synchronizing the auxiliary signal with a video signal, and transmitting the auxiliary signal over a telecommunications network. [0002]
  • The method includes receiving a video signal and generating an auxiliary signal derived at least in part from the video signal. The auxiliary signal is transmitted over a telecommunications network and the video signal is delayed as a function of the auxiliary signal. The auxiliary signal and video signal are synchronized. [0003]
  • To accomplish this method, a system is used which includes at least one video signal receiver, an auxiliary signal recorder, at least two telecommunications network interfaces, a signal comparator, and a video signal buffer. [0004]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic illustration of a first embodiment of the method of the invention. [0005]
  • FIG. 2 is a schematic illustration of a second embodiment of the method of the invention. [0006]
  • FIG. 3 is a schematic illustration of an implementation of the methods of the invention.[0007]
  • DETAILED DESCRIPTION
  • A first embodiment 100 of the method of the invention is schematically illustrated in FIG. 1. The first embodiment begins with Existing Video Signal 110. This signal is distributed to both New Synchronized Signal Generation Process 120 and Playback Process 130. In addition, the New Synchronized Signal Generation Process 120 creates New Synchronized Signal 140, which is distributed to the Playback Process 130. This embodiment applies when the playback process has access to the existing video signal at essentially the same time as the new signal generation process, for example if the existing video signal is broadcast or available on a common transmission system, such as cable TV. [0008]
  • A second embodiment 200 of the method of the invention, schematically illustrated in FIG. 2, applies if the playback process does not have access to the existing video signal at essentially the same time as the new signal generation process. For example, the transmission mechanisms may be different for the two processes, or the playback process may not have access to the existing signal directly. The second embodiment also begins with an Existing Video Signal 210. This signal is distributed to the New Synchronized Signal Generation Process 220. The New Synchronized Signal Generation Process 220 creates Synchronized Combined Signal 240, which is distributed to the Playback Process 230. This signal may be in a different format from the original video signal, for example, due to different transmission media, different technology, or storage and transmission in non-real time. [0009]
  • The Existing Video Signal 110, 210 can be in any format, including, for example, the variety of commonly used video formats, or newly developing formats. The method can work with any video signal, any combined video and audio signal, and other formats that include video and possibly other information, such as text in a multi-media format. The signal can be “broadcast” or local, live, or recorded. [0010]
  • In accordance with the New Synchronized Signal Generation Process 120, 220 of the current invention, many methods of creating a new audio signal, and/or other signals, to be synchronized with the video are contemplated. One class of methods relies solely on the Existing Video Signal 110, 210. For example, a person, or even a computer, could monitor the existing video and record a new audio signal as they are monitoring. It is optional whether the combined audio signal, if it exists, is used. [0011]
  • Monitoring the audio signal may be required in the case of a translation of a speech which is carried in the existing video and audio signal, but may be extraneous if a completely different interpretation is being created, such as for the visually disabled. [0012]
  • Another class of methods may use additional information that may be available, such as a prepared text or other multi-media information available either separately from the video signal or combined with it in some way. These methods may include an automated process for generating an auxiliary signal designed to translate speech, synthesize speech if there is a prepared text, or describe the situation depicted in the video signal. [0013]
  • A key feature of these processes is that the generation of the synchronized signal may require that the video be paused for an interval or the viewing otherwise delayed. For example, the person or process generating the new information may need more time to describe the situation, to complete the translation, or to look up information. [0014]
  • Another feature is that portions of the existing video may be omitted, or otherwise processed, such as freeze-frame or slow motion, to permit better description or to omit offensive portions of the video signal. The new information signal is then marked with time stamps to correspond to the original video signal timing, and to provide the control information for playback of the video signal. [0015]
  • Multiple audio signals may be simultaneously prepared, for example different languages, and other information can be provided as well, including text, image, other video, etc. These can be synchronized with the original video. [0016]
  • The output of the New Synchronized Signal Generation Process 120, 220 depends on whether the Playback Process 130, 230 is receiving the Existing Video Signal 110, 210 in nearly real time with the New Synchronized Signal Generation Process 120, 220. [0017]
  • In the first embodiment, the Existing Video Signal 110 is available to the Playback Process 130 directly. The New Synchronized Signal 140 need only contain the new information, the synchronizing information, and any playback control information, for proper playback. [0018]
  • In the second embodiment, the Existing Video Signal 210 is not directly available to the Playback Process 230. The New Signal 240, containing the new information as well as synchronizing and playback control information, can either (1) be combined with a representation of the existing video signal for distribution to the Playback Process 230, or (2) be sent separately over the telecommunications network in a video and audio delivery format. [0019]
  • The Playback Process 130, 230 takes the inputs and generates a combined experience for the viewer. The viewer should see a fully synchronized signal with the audio and other descriptions corresponding to the video. [0020]
  • In the first embodiment 100, the Playback Process 130 has the ability to buffer the video, audio, and other signals independently so that they can be synchronized, and to execute the commands specified in the New Synchronized Signal Generation Process 120. The Playback Process 130 can run on a PC or other device that synchronizes the audio, video, and other signals, and executes the specified commands. In the second embodiment, the Synchronized Combined Signal 240 can be either a combined video and audio signal, for example, a conventional television format, or it can be the same information as the New Synchronized Signal 140 plus a video signal in any format. In the former case, the Playback Process 230 can simply be any compatible video and audio display system, for example, a TV set. In the latter case, the same functions are required in the Playback Process 230 as in the first embodiment 100. The viewer will also have control over the playing of the combined and individual signals, for example, pause and replay. [0021]
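To make the independent buffering described above concrete, here is a minimal Python sketch; it is not taken from the patent, and the class, method, and stream names are illustrative. Each signal type is held in its own timestamp-ordered queue, and material is released only up to the point every stream has reached, which is the property that keeps the audio, video, and other signals aligned:

```python
import heapq
from collections import defaultdict

class PlaybackBuffer:
    """Hypothetical buffering structure for a playback process: one
    timestamp-ordered queue per stream (video, audio, text, ...), released
    only as far as the slowest stream has data, so streams stay in sync."""

    def __init__(self):
        # One priority queue of (timestamp, payload) per stream name.
        self._queues = defaultdict(list)

    def push(self, stream, timestamp, payload):
        heapq.heappush(self._queues[stream], (timestamp, payload))

    def _ready_time(self):
        # Playback may only advance to the newest timestamp that every
        # stream has reached; otherwise one stream would run ahead.
        newest = [max(t for t, _ in q) for q in self._queues.values() if q]
        return min(newest) if newest and len(newest) == len(self._queues) else None

    def pop_synchronized(self):
        """Yield (stream, timestamp, payload) for everything presentable now,
        in timestamp order across all streams."""
        limit = self._ready_time()
        if limit is None:
            return
        due = []
        for stream, q in self._queues.items():
            while q and q[0][0] <= limit:
                t, payload = heapq.heappop(q)
                due.append((t, stream, payload))
        for t, stream, payload in sorted(due):
            yield stream, t, payload

# Example: the audio stream lags, so the 10.5 s video frame is held back.
buf = PlaybackBuffer()
buf.push("video", 10.0, "frame@10.0")
buf.push("video", 10.5, "frame@10.5")
buf.push("audio", 10.0, "commentary chunk @10.0")
for stream, t, item in buf.pop_synchronized():
    print(t, stream, item)   # releases only the items both streams have reached
```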
  • There may be several synchronized audio signals to choose from, for example, different languages. There may be other types of signals provided, such as text, images and other video signals that the viewer can select and control. The viewer may have a convenient, easy-to-use interface to control the playback and to select the various options and operate the controls. The system may interface with standard video and audio displays and recording systems. [0022]
  • The New Synchronized Signal 140 may include an audio signal, a set of time stamps or other signals to allow the playback process to synchronize the audio and other signals with the video, control signals to instruct the Playback Process 130 how to act on the other signals, or other types of information, such as text, graphics, images, or other video. The signal 140 is sent as a combined package of information so that the Playback Process 130 can receive it, decode the various components, synchronize with the Existing Video Signal 110, and carry out the functions as specified in the New Synchronized Signal Generation Process 120. [0023]
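Purely as an illustration, the "combined package of information" can be read as a record carrying the new content together with its time stamp and control commands. The sketch below is one assumed layout; the field names, JSON encoding, and base64 wrapping are not specified by the patent:

```python
import base64
import json
from dataclasses import dataclass, field, asdict
from typing import Optional

@dataclass
class SyncPacket:
    """One unit of a hypothetical New Synchronized Signal 140.
    The patent only requires that the package carry the new content,
    synchronizing time stamps, and playback control information."""
    video_timestamp: float                         # instant in Existing Video Signal 110 this refers to
    audio_chunk: Optional[bytes] = None            # new (e.g., translated) audio for that instant
    controls: list = field(default_factory=list)   # e.g., [{"cmd": "pause", "seconds": 2.0}]
    extras: dict = field(default_factory=dict)     # text, captions, image references, etc.

    def encode(self) -> bytes:
        d = asdict(self)
        if d["audio_chunk"] is not None:
            d["audio_chunk"] = base64.b64encode(d["audio_chunk"]).decode("ascii")
        return json.dumps(d).encode("utf-8")

    @staticmethod
    def decode(raw: bytes) -> "SyncPacket":
        d = json.loads(raw.decode("utf-8"))
        if d["audio_chunk"] is not None:
            d["audio_chunk"] = base64.b64decode(d["audio_chunk"])
        return SyncPacket(**d)

# Example: a packet carrying commentary that asks the playback side to
# pause the video for two seconds while the commentary plays.
pkt = SyncPacket(video_timestamp=83.5,
                 audio_chunk=b"...pcm samples...",
                 controls=[{"cmd": "pause", "seconds": 2.0}])
assert SyncPacket.decode(pkt.encode()).video_timestamp == 83.5
```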
  • The Synchronized Combined Signal 240 contains the same functional information as the New Synchronized Signal 140, but in addition includes the video signal itself, although possibly in a different format from the original video signal. The combined signal 240 may be delivered in a conventional video format so that it can be played on any compatible audio and video system. The advantage is the ability to use existing standards, media, and receiving and display devices. [0024]
  • A schematic illustration of an implementation of the methods of the invention is shown in FIG. 3. There are many different systems, devices, and configurations that can be used to implement the proposed method. The system of this implementation includes a standard TV signal 310 sent over a commercial coaxial cable. This TV signal 310 is delivered to two Personal Computer (“PC”) systems 330, 350. Each PC system 330, 350 is equipped with a video signal receiver 331, 351 and an Internet interface 332, 352. The two PC systems 330, 350 are connected together over the Internet 340. [0025]
  • PC system 330 takes the incoming TV signal 310 and derives 333 a synchronizing signal 334 that can specify a precise instant in the TV signal time stream, namely the clock time associated with receipt of the video signal. PC system 330 displays the TV signal 310 on a portion of its monitor 335. A human editor 320 views this TV signal 310 and records a new audio signal 380 to be synchronized with the TV signal. The human editor 320 can control the time stamps on the new audio signal, for example delaying the clock time to give the human editor 320 time to think. Thus, the new audio signal may cause the video to be delayed in playback relative to actual clock time. This new audio signal 380, as well as the synchronizing signal 334, is coded 336 and transmitted 370 over the Internet 390 to PC system 350. [0026]
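A rough sketch of PC system 330's role follows, under the assumption of synchronized clocks and a simple JSON-over-TCP transport; the host, port, helper functions, and field names are invented for illustration and are not part of the patent:

```python
import json
import socket
import time

def capture_tv_frame():
    """Stand-in for video signal receiver 331: simulate arrival of one TV frame."""
    time.sleep(1.0 / 30)                      # pretend the signal runs at ~30 frames/s
    return b"frame-bytes"

def record_editor_audio():
    """Stand-in for the human editor's microphone: return a short audio chunk."""
    return b"\x00\x01" * 160

def run_editor_side(host="127.0.0.1", port=9000, editor_delay=3.0, n_chunks=100):
    """Hypothetical sketch of PC system 330: derive the synchronizing signal 334
    as the clock time at which each piece of TV signal 310 is received, attach
    the editor's new audio 380 and any extra "time to think" delay, code 336 the
    result, and transmit 370 it toward PC system 350."""
    with socket.create_connection((host, port)) as sock:
        for _ in range(n_chunks):
            capture_tv_frame()
            sync_time = time.time()           # clock time associated with receipt of the video
            audio = record_editor_audio()
            packet = {
                "sync_time": sync_time,       # synchronizing signal 334
                "editor_delay": editor_delay, # shift requested by human editor 320
                "audio": audio.hex(),         # coded form of new audio signal 380
            }
            sock.sendall((json.dumps(packet) + "\n").encode("utf-8"))
```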
  • PC system 350 receives the TV signal 310 and derives 353 a clock time signal 357 from it. PC system 350 also receives the combined signal 370, including synchronizing signal 334 and the linked new audio signal 380, over the Internet connection 352 from PC system 330. PC system 350 now compares 354 the two time-stamp signals 334, 357 and buffers 355 the incoming TV signal so that it matches the synchronizing timing signal 334 coming over the Internet connection 352. PC system 350 then displays 356 to the viewer 360 the delayed TV signal 359 synchronized with the new audio signal 380. Thus the viewer 360 perceives that the TV signal has the new audio signal seamlessly integrated with it. [0027]
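The corresponding playback side can be sketched as follows; again the transport, helper functions, and the assumption that both PCs' clocks agree are illustrative choices rather than details from the patent. The key step is computing how long to hold each locally received frame so that it is displayed together with the editor's audio for the same instant:

```python
import collections
import json
import socket
import time

def receive_tv_frame():
    """Stand-in for video signal receiver 351: simulate one frame of TV signal 310."""
    return b"frame-bytes"

def display(frame, audio_chunk):
    """Stand-in for display 356: report what would be shown and heard together."""
    print(f"showing {len(frame)}-byte frame with {len(audio_chunk)}-byte audio chunk")

def run_playback_side(listen_port=9000):
    """Hypothetical sketch of PC system 350: stamp each locally received frame
    with clock time 357, compare 354 it with the synchronizing signal 334 that
    arrives over the Internet, and buffer 355 frames just long enough that each
    one is displayed with the editor's audio for the same instant."""
    frame_buffer = collections.deque()          # (local_receive_time, frame)
    with socket.create_server(("", listen_port)) as srv:
        conn, _ = srv.accept()
        with conn, conn.makefile("r", encoding="utf-8") as stream:
            for line in stream:
                packet = json.loads(line)
                # How far behind real time the video must run: transit time of
                # the auxiliary packet plus the editor's requested extra delay.
                target_delay = (time.time() - packet["sync_time"]) + packet["editor_delay"]

                # Locally received copy of the TV signal, stamped with clock time.
                frame_buffer.append((time.time(), receive_tv_frame()))

                # Release every buffered frame that is now old enough to be in
                # sync with the auxiliary audio; newer frames stay buffered.
                while frame_buffer and time.time() - frame_buffer[0][0] >= target_delay:
                    _, frame = frame_buffer.popleft()
                    display(frame, bytes.fromhex(packet["audio"]))
```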
  • This implementation also provides many other capabilities. The human editor 320, who can actually be more than one person, may control the display of the TV signal. For example, the video signal may be paused for a specified period of time, viewed in slow motion, or sped up. These effects are then replicated by PC system 350 so that human viewer 360 perceives the video and audio signals as directed by the human editor. Similarly, human viewer 360 may control the playback of the signal, much as they would from a VCR or other recorded audio and video source. Thus, human viewer 360 may pause, speed up, slow down, or replay the derived audio and video signal. More advanced capabilities can also be provided, such as zooming in or out on the video image, special effects, such as transitions from one image to another, and myriad other capabilities which are becoming available in video playback systems and on personal computers. Human editor 320 can also add additional video or other signals 380, for example, an image of the human editor 320. A signal containing this additional information 370 is transmitted over the Internet connection 340 to PC system 350. Additional information and control information, such as music, subtitles, text, still images, other audio and video, etc., can also be added and synchronized. A further extension is to allow two or more TV signals to be combined and otherwise controlled by the human editor during preparation of new information, and by the human viewer during playback. Thus this implementation, and more generally the method, can take one or more existing media and create a variety of new media from this under the control of both the human editor and human viewer. [0028]
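The editor and viewer controls described above (pause, slow motion, speed-up, replay) can be thought of as small commands applied to the local playback state. One hypothetical encoding and interpreter, not taken from the patent, might look like this:

```python
from dataclasses import dataclass

@dataclass
class PlaybackState:
    position: float = 0.0   # seconds into the (delayed) video
    rate: float = 1.0       # 1.0 = real time, 0.0 = paused, 0.5 = slow motion

def apply_command(state: PlaybackState, cmd: dict) -> PlaybackState:
    """Tiny interpreter for illustrative editor/viewer commands: pause, slow
    motion, speed-up, and replay are all just changes to the playback rate or
    position that the playback system applies to the buffered, delayed signal."""
    name = cmd["cmd"]
    if name == "pause":
        state.rate = 0.0
    elif name == "resume":
        state.rate = 1.0
    elif name == "slow_motion":
        state.rate = cmd.get("rate", 0.5)
    elif name == "speed_up":
        state.rate = cmd.get("rate", 1.5)
    elif name == "replay":
        state.position = max(0.0, state.position - cmd.get("seconds", 5.0))
    return state

# The same commands can originate from the editor (carried in the synchronized
# signal) or from the viewer's own interface; either way they act on the local state.
state = PlaybackState(position=42.0)
for command in [{"cmd": "slow_motion"}, {"cmd": "replay", "seconds": 10.0}, {"cmd": "resume"}]:
    state = apply_command(state, command)
print(state)   # PlaybackState(position=32.0, rate=1.0)
```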
  • It will be understood that the above-described embodiments are merely illustrative of the principles of the invention and that other arrangements may be devised by those skilled in the art without departing from the spirit and scope of the invention. [0029]

Claims (20)

What is claimed is:
1. A system for recording an auxiliary signal, synchronizing the auxiliary signal with a video signal, and transmitting the auxiliary signal over a telecommunications network comprising a first video signal receiver, an auxiliary signal recorder, a first telecommunications network interface, a second telecommunications network interface, a signal comparator, and a video signal buffer.
2. The system of claim 1 further comprising a first computer and a second computer.
3. The system of claim 2 wherein said first computer includes said first video signal receiver, said auxiliary signal recorder, said first telecommunications network interface, said signal comparator, and said video signal buffer, and said second computer includes said second telecommunications network interface.
4. The system of claim 2 wherein said first computer includes said first video signal receiver, said auxiliary signal recorder, and said first telecommunications network interface, and said second computer includes said second telecommunications network interface, said signal comparator, said video signal buffer, and a second video signal receiver.
5. The system of claim 1 wherein the auxiliary signal includes an audio signal.
6. The system of claim 1 wherein the auxiliary signal includes a video signal.
7. The system of claim 2 wherein one of said first and second computers further includes a video signal speed controller.
8. The system of claim 2 wherein one of said first and second computers further includes an image size controller.
9. The system of claim 2 wherein one of said first and second computers further includes a video signal clock signal deriver.
10. The system of claim 1 wherein one of said first and second telecommunications network interfaces is an Internet interface.
11. The system of claim 1 wherein said signal comparator is a clock signal comparator.
12. A system for recording an auxiliary signal, synchronizing the auxiliary signal with a video signal, and transmitting the auxiliary signal over a telecommunications network comprising:
a first computer, said first computer having means for recording the auxiliary signal, means for receiving the video signal, means for deriving a synchronizing signal from the video signal, and means for transmitting the auxiliary signal over the telecommunications network; and
a second computer, said second computer having means for receiving the auxiliary signal and the synchronizing signal from the telecommunications network, means for receiving the video signal, and means for synchronizing the auxiliary signal and the synchronizing signal with the video signal to form an integrated combined signal.
13. A method for recording an auxiliary signal, synchronizing the auxiliary signal with a video signal, and transmitting the auxiliary signal over a telecommunications network comprising the steps of:
receiving the video signal;
generating the auxiliary signal, the auxiliary signal derived at least in part from said video signal;
transmitting the auxiliary signal over the telecommunications network;
receiving the auxiliary signal;
delaying the video signal as a function of said auxiliary signal; and
synchronizing the video signal with the auxiliary signal.
14. The method of claim 13 wherein said video signal receiving, auxiliary signal generating, video signal delaying, auxiliary signal transmitting, and synchronizing steps are performed with a first computer, and said auxiliary signal receiving step is performed using a second computer.
15. The method of claim 13 wherein said video signal receiving, auxiliary signal generating, and auxiliary signal transmitting steps are performed using a first computer, and said auxiliary signal receiving, video signal delaying, and synchronizing steps are performed using a second computer.
16. The method of claim 15 further including the step of receiving the video signal using said second computer.
17. The method of claim 13 wherein said generating step further includes deriving a first synchronizing signal from the video signal.
18. The method of claim 17 wherein said delaying step further includes deriving a second synchronizing signal from the video signal and comparing said first and second synchronizing signals.
19. The method of claim 13 in which the auxiliary signal is an audio signal.
20. The method of claim 13 further including the step of playing said synchronized video and auxiliary signals using a computer.
US09/816,632 2000-03-23 2001-03-23 Method and system for recording auxiliary audio or video signals, synchronizing the auxiliary signal with a television signal, and transmitting the auxiliary signal over a telecommunications network Abandoned US20010037499A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/816,632 US20010037499A1 (en) 2000-03-23 2001-03-23 Method and system for recording auxiliary audio or video signals, synchronizing the auxiliary signal with a television signal, and transmitting the auxiliary signal over a telecommunications network

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US19198400P 2000-03-23 2000-03-23
US09/816,632 US20010037499A1 (en) 2000-03-23 2001-03-23 Method and system for recording auxiliary audio or video signals, synchronizing the auxiliary signal with a television signal, and transmitting the auxiliary signal over a telecommunications network

Publications (1)

Publication Number Publication Date
US20010037499A1 2001-11-01

Family

ID=26887614

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/816,632 Abandoned US20010037499A1 (en) 2000-03-23 2001-03-23 Method and system for recording auxiliary audio or video signals, synchronizing the auxiliary signal with a television signal, and transmitting the auxiliary signal over a telecommunications network

Country Status (1)

Country Link
US (1) US20010037499A1 (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050081252A1 (en) * 2003-10-14 2005-04-14 International Business Machines Corporation Device and method for bandwidth optimization using a local cache
US20050105894A1 (en) * 2003-08-05 2005-05-19 Samsung Electronics Co., Ltd. Information storage medium, and apparatus and method of reproducing information from the same
US20050188297A1 (en) * 2001-11-01 2005-08-25 Automatic E-Learning, Llc Multi-audio add/drop deterministic animation synchronization
US20050237378A1 (en) * 2004-04-27 2005-10-27 Rodman Jeffrey C Method and apparatus for inserting variable audio delay to minimize latency in video conferencing
US20100218223A1 (en) * 2009-02-20 2010-08-26 At&T Intellectual Property I, L.P. Network recording system
US8374479B1 (en) * 2006-11-02 2013-02-12 National Public Radio, Inc. Live-chase video-description buffer display
JP2013055479A (en) * 2011-09-02 2013-03-21 Nippon Hoso Kyokai <Nhk> Device and program for generating communication content
US9232335B2 (en) 2014-03-06 2016-01-05 Sony Corporation Networked speaker system with follow me
US9288597B2 (en) 2014-01-20 2016-03-15 Sony Corporation Distributed wireless speaker system with automatic configuration determination when new speakers are added
US9369801B2 (en) 2014-01-24 2016-06-14 Sony Corporation Wireless speaker system with noise cancelation
US9426551B2 (en) 2014-01-24 2016-08-23 Sony Corporation Distributed wireless speaker system with light show
US9483997B2 (en) 2014-03-10 2016-11-01 Sony Corporation Proximity detection of candidate companion display device in same room as primary display using infrared signaling
US9560449B2 (en) 2014-01-17 2017-01-31 Sony Corporation Distributed wireless speaker system
US9693169B1 (en) 2016-03-16 2017-06-27 Sony Corporation Ultrasonic speaker assembly with ultrasonic room mapping
US9693168B1 (en) 2016-02-08 2017-06-27 Sony Corporation Ultrasonic speaker assembly for audio spatial effect
US9696414B2 (en) 2014-05-15 2017-07-04 Sony Corporation Proximity detection of candidate companion display device in same room as primary display using sonic signaling
US9794724B1 (en) 2016-07-20 2017-10-17 Sony Corporation Ultrasonic speaker assembly using variable carrier frequency to establish third dimension sound locating
US9826330B2 (en) 2016-03-14 2017-11-21 Sony Corporation Gimbal-mounted linear ultrasonic speaker assembly
US9826332B2 (en) 2016-02-09 2017-11-21 Sony Corporation Centralized wireless speaker system
US9854362B1 (en) 2016-10-20 2017-12-26 Sony Corporation Networked speaker system with LED-based wireless communication and object detection
US9866986B2 (en) 2014-01-24 2018-01-09 Sony Corporation Audio speaker system with virtual music performance
US9924286B1 (en) 2016-10-20 2018-03-20 Sony Corporation Networked speaker system with LED-based wireless communication and personal identifier
US10070291B2 (en) 2014-05-19 2018-09-04 Sony Corporation Proximity detection of candidate companion display device in same room as primary display using low energy bluetooth
US10075791B2 (en) 2016-10-20 2018-09-11 Sony Corporation Networked speaker system with LED-based wireless communication and room mapping
US10623859B1 (en) 2018-10-23 2020-04-14 Sony Corporation Networked speaker system with combined power over Ethernet and audio delivery

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5387943A (en) * 1992-12-21 1995-02-07 Tektronix, Inc. Semiautomatic lip sync recovery system
US6240555B1 (en) * 1996-03-29 2001-05-29 Microsoft Corporation Interactive entertainment system for presenting supplemental interactive content together with continuous video programs
US6324694B1 (en) * 1996-09-06 2001-11-27 Intel Corporation Method and apparatus for providing subsidiary data synchronous to primary content data

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050188297A1 (en) * 2001-11-01 2005-08-25 Automatic E-Learning, Llc Multi-audio add/drop deterministic animation synchronization
US20050105894A1 (en) * 2003-08-05 2005-05-19 Samsung Electronics Co., Ltd. Information storage medium, and apparatus and method of reproducing information from the same
US20050081252A1 (en) * 2003-10-14 2005-04-14 International Business Machines Corporation Device and method for bandwidth optimization using a local cache
US20050237378A1 (en) * 2004-04-27 2005-10-27 Rodman Jeffrey C Method and apparatus for inserting variable audio delay to minimize latency in video conferencing
US7170545B2 (en) 2004-04-27 2007-01-30 Polycom, Inc. Method and apparatus for inserting variable audio delay to minimize latency in video conferencing
US8374479B1 (en) * 2006-11-02 2013-02-12 National Public Radio, Inc. Live-chase video-description buffer display
US20140013351A1 (en) * 2006-11-02 2014-01-09 National Public Radio Live-chase video-description buffer display
US20100218223A1 (en) * 2009-02-20 2010-08-26 At&T Intellectual Property I, L.P. Network recording system
US9667918B2 (en) 2009-02-20 2017-05-30 At&T Intellectual Property I, L.P. Network recording system
JP2013055479A (en) * 2011-09-02 2013-03-21 Nippon Hoso Kyokai <Nhk> Device and program for generating communication content
US9560449B2 (en) 2014-01-17 2017-01-31 Sony Corporation Distributed wireless speaker system
US9288597B2 (en) 2014-01-20 2016-03-15 Sony Corporation Distributed wireless speaker system with automatic configuration determination when new speakers are added
US9369801B2 (en) 2014-01-24 2016-06-14 Sony Corporation Wireless speaker system with noise cancelation
US9426551B2 (en) 2014-01-24 2016-08-23 Sony Corporation Distributed wireless speaker system with light show
US9866986B2 (en) 2014-01-24 2018-01-09 Sony Corporation Audio speaker system with virtual music performance
US9699579B2 (en) 2014-03-06 2017-07-04 Sony Corporation Networked speaker system with follow me
US9232335B2 (en) 2014-03-06 2016-01-05 Sony Corporation Networked speaker system with follow me
US9483997B2 (en) 2014-03-10 2016-11-01 Sony Corporation Proximity detection of candidate companion display device in same room as primary display using infrared signaling
US9696414B2 (en) 2014-05-15 2017-07-04 Sony Corporation Proximity detection of candidate companion display device in same room as primary display using sonic signaling
US9858024B2 (en) 2014-05-15 2018-01-02 Sony Corporation Proximity detection of candidate companion display device in same room as primary display using sonic signaling
US10070291B2 (en) 2014-05-19 2018-09-04 Sony Corporation Proximity detection of candidate companion display device in same room as primary display using low energy bluetooth
US9693168B1 (en) 2016-02-08 2017-06-27 Sony Corporation Ultrasonic speaker assembly for audio spatial effect
US9826332B2 (en) 2016-02-09 2017-11-21 Sony Corporation Centralized wireless speaker system
US9826330B2 (en) 2016-03-14 2017-11-21 Sony Corporation Gimbal-mounted linear ultrasonic speaker assembly
US9693169B1 (en) 2016-03-16 2017-06-27 Sony Corporation Ultrasonic speaker assembly with ultrasonic room mapping
US9794724B1 (en) 2016-07-20 2017-10-17 Sony Corporation Ultrasonic speaker assembly using variable carrier frequency to establish third dimension sound locating
US9854362B1 (en) 2016-10-20 2017-12-26 Sony Corporation Networked speaker system with LED-based wireless communication and object detection
US9924286B1 (en) 2016-10-20 2018-03-20 Sony Corporation Networked speaker system with LED-based wireless communication and personal identifier
US10075791B2 (en) 2016-10-20 2018-09-11 Sony Corporation Networked speaker system with LED-based wireless communication and room mapping
US10623859B1 (en) 2018-10-23 2020-04-14 Sony Corporation Networked speaker system with combined power over Ethernet and audio delivery

Similar Documents

Publication Publication Date Title
US20010037499A1 (en) Method and system for recording auxiliary audio or video signals, synchronizing the auxiliary signal with a television signal, and transmitting the auxiliary signal over a telecommunications network
US5900908A (en) System and method for providing described television services
US8065710B2 (en) Apparatuses and methods for interactive communication concerning multimedia content
US5818441A (en) System and method for simulating two-way connectivity for one way data streams
US8810728B2 (en) Method and apparatus for synchronizing audio and video streams
US7219363B2 (en) Device and method for processing broadcast program related information
US6751800B1 (en) Information processing apparatus, method, and computer-readable medium
EP1449213B1 (en) System for synchronizing the playback of two or more connected playback devices using closed captioning
US20120033133A1 (en) Closed captioning language translation
EP0969668A2 (en) Copyright protection for moving image data
US20080219641A1 (en) Apparatus and method for synchronizing a secondary audio track to the audio track of a video source
WO2006048963A1 (en) Captioned still image content creating device, captioned still image content creating program and captioned still image content creating system
KR20020006970A (en) Simultsneous recording and playback apparatus with indexing/searching/browsing functionality
JP2002027429A (en) Method for supplying audio translation data on demand and receiver used therefor
JPH08502390A (en) Method for encoding a video signal having multilingual characteristics and apparatus therefor
EP2356654A1 (en) Method and process for text-based assistive program descriptions for television
KR20010075043A (en) Simulating two way connectivity for one way data streams for multiple parties
US11758245B2 (en) Interactive media events
US7346692B2 (en) Information processing apparatus, information processing method, and program
CN102196206A (en) Linkage method of video apparatus, video apparatus and video system
US7296055B2 (en) Information providing system, information providing apparatus, information providing method, information processing apparatus, information processing method, and program
US8001576B2 (en) Information providing system, information processing apparatus and information processing method for transmitting sound and image data
JP2976889B2 (en) Moving image data playback system
US20100091188A1 (en) Synchronization of secondary decoded media streams with a primary media stream
JP5452400B2 (en) Content reproducing apparatus and combination method description data providing apparatus

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION