MXPA99011215A - System and method for processing audio-only programs in a television receiver - Google Patents

System and method for processing audio-only programs in a television receiver

Info

Publication number
MXPA99011215A
MXPA99011215A MXPA/A/1999/011215A MX9911215A
Authority
MX
Mexico
Prior art keywords
program
audio
type
information
data
Prior art date
Application number
MXPA/A/1999/011215A
Other languages
Spanish (es)
Inventor
Richard Schneidewend Daniel
Louise Brown Megan
Sheridan Westlake Mark
Joseph Mclane Michael
Wayne Randall Darrel
Original Assignee
Louise Brown Megan
Joseph Mclane Michael
Wayne Randall Darrel
Richard Schneidewend Daniel
Thomson Consumer Electronics Inc
Sheridan Westlake Mark
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Louise Brown Megan, Joseph Mclane Michael, Wayne Randall Darrel, Richard Schneidewend Daniel, Thomson Consumer Electronics Inc, Sheridan Westlake Mark
Publication of MXPA99011215A

Links

Abstract

An apparatus and a method for processing programs indicated by their associated program description to be audio-only programs, including the following. A respective program description is received for the programs. After a user selects a program, a determination is made as to whether the selected program is an audio-only program. If the selected program is an audio-only program, previously programmed on-screen display information is displayed while the selected audio-only program is played, to provide additional visual entertainment for users.

Description

SYSTEM AND METHOD FOR PROCESSING AUDIO-ONLY PROGRAMS

FIELD OF THE INVENTION

This invention relates generally to the field of electronic program guide processing and, more particularly, to a system and method for processing a program indicated by its program description information to be an audio-only program.
BACKGROUND OF THE INVENTION

Electronic devices, such as televisions and personal computers (PCs), require a control system that includes a user interface system. Normally, a user interface provides information to a user and simplifies the use of the device. An example of a user interface is an Electronic Program Guide (EPG) in a television system. An electronic program guide is an interactive on-screen display feature that presents information analogous to the television listings found in local newspapers and other print media. In addition, an electronic program guide also includes the information necessary to collate and decode the programs. An electronic program guide provides information about each program within the time frame it covers, usually from the next hour up to seven days ahead. The information contained in an electronic program guide includes programming characteristics such as channel number, program title, start time, end time, elapsed time, remaining time, rating (if available), topic, theme, and a brief description of the program content. Electronic program guides are usually arranged in a two-dimensional table or grid format, with time information on one axis and channel information on the other axis. Unlike non-interactive guides, which reside on a dedicated channel and merely scroll through the current programming on the other channels over the next two to three hours, electronic program guides allow viewers to select any channel at any time during some period in the future, for example up to seven days ahead. In addition, electronic program guide features include the ability to highlight individual cells of the grid that contain program information. Once a cell is highlighted, the viewer can perform functions pertaining to that selected program. For example, the viewer could instantly switch to that program if it is currently airing. Viewers could also schedule a videocassette recorder (VCR) recording or the like, if the television is properly configured and connected to a recording device. Such electronic program guides are known in the art and are described, for example, in U.S. Patent Nos. 5,353,121; 5,479,268; and 5,479,266, issued to Young et al. and assigned to StarSight Telecast, Inc.

In addition, U.S. Patent No. 5,515,106, issued to Chaney et al. and assigned to the same assignee as the present invention, describes in detail an exemplary embodiment including the data packet structure necessary to implement an exemplary program guide system. The exemplary data packet structure is designed so that both channel information (for example, channel name, call letters, channel number, type, etc.) and program description information (for example, content, title, rating, star, etc.) relating to a program can be transmitted efficiently from a program guide database provider to a receiving device. Also, as discussed in the Chaney patent, it is anticipated that different types of programs will be available to users, including, for example, audio-video programs, audio-only programs, video-only programs, or data-type programs such as an executable computer program or e-mail. In order to uniquely identify the different program types mentioned above, a "class" field is designated, for example, in the program guide packet structure to indicate the type of program being transmitted.
The "class" field can be, for example, "audio-video", "audio", "video", or "data", corresponding respectively to the program types described above. D1, U.S. Patent No. 5,585,866, discloses a receiver that can receive both an audio-visual program and an audio-only program. The receiver of D1 can play a received audio-only program while displaying motionless text data associated with the program on the receiver's display. Of course, various methods for generating graphics, including animated graphics, on a display are well known in the art. For example, various methods are disclosed in D2, an article by Richard G. Shoup entitled "Color Table Animation", in Proc. of the 6th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH '79), Chicago, IL, USA, 8-10 August 1979; Computer Graphics (ACM), volume 13, number 2, August 1979, pages 8-13, XP002075128. However, references D1 and D2, either alone or in combination, neither teach nor suggest that it is desirable to display previously stored animated images, either automatically or in response to a user request, when an audio-only program is playing. The animated image can serve as additional entertainment for a user and/or can function as a screen saver to prevent screen burn-in.
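To make the role of the "class" field concrete, the following C fragment is a minimal, hypothetical sketch; the enum values, field names, and the dispatch function are assumptions for illustration and are not taken from the Chaney packet structure. It only shows how a receiver might branch on the class value carried in a program description.

```c
#include <stdio.h>

/* Hypothetical program "class" values carried in the guide data. */
typedef enum {
    PROGRAM_CLASS_AUDIO_VIDEO,
    PROGRAM_CLASS_AUDIO_ONLY,
    PROGRAM_CLASS_VIDEO_ONLY,
    PROGRAM_CLASS_DATA
} program_class_t;

/* Minimal stand-in for one program's description record. */
typedef struct {
    const char     *title;
    program_class_t prog_class;   /* parsed from the "class" field */
} program_desc_t;

static void present_program(const program_desc_t *p)
{
    switch (p->prog_class) {
    case PROGRAM_CLASS_AUDIO_VIDEO:
        printf("%s: decode and present audio and video normally\n", p->title);
        break;
    case PROGRAM_CLASS_AUDIO_ONLY:
        printf("%s: play audio; show stored animation or screen saver\n", p->title);
        break;
    case PROGRAM_CLASS_VIDEO_ONLY:
        printf("%s: decode and present video only\n", p->title);
        break;
    case PROGRAM_CLASS_DATA:
        printf("%s: hand off to data application (e-mail, software)\n", p->title);
        break;
    }
}
```

A real receiver would of course drive its decoders and on-screen display hardware here rather than print messages; the point is simply that one field in the guide data selects the processing path.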
SUMMARY OF THE INVENTION

Accordingly, the present inventors recognize that it is desirable to be able to process each type of program differently, depending on the associated program description received in the program guide information. In particular, the present inventors recognize that it is desirable to provide an animated image on a screen, so that a user can be better entertained visually and/or so that the image can serve as a screen saver while an audio-only program is playing.

Accordingly, in accordance with aspects of the invention, an apparatus is disclosed for processing a first type of program having both audio and video content and a second type of program having audio content only, characterized by: a memory element for storing visual display information representing an animated image; a control element for determining whether a selected program is the first type of program having both audio and video content or the second type of program having audio content only; and the control element causing the playing of the audio content and the display of the video content when the selected program is a first type of program, and causing the playing of the audio content only and the display of the animated image when the selected program is a second type of program.

Also disclosed is a method for processing a first type of program having both audio and video content and a second type of program having audio content only, characterized by: storing visual display information representing an animated image; determining whether a selected program is the first type of program having both audio and video content or the second type of program having audio content only; causing the playing of the audio content and the display of the video content when the selected program is the first type of program; and causing the playing of the audio content only and the display of the animated image when the selected program is a second type of program.

BRIEF DESCRIPTION OF THE DRAWINGS

In the drawing: Figure 1 shows an example of a television system suitable for processing different types of programs, including audio-only programs, and the associated program description information, in accordance with the present invention. Figure 2 shows an example of a digital video processing apparatus suitable for processing different types of programs, including audio-only programs, and the associated program description information, in accordance with the present invention. Figure 3 shows a block diagram of a specific implementation of a digital satellite system suitable for processing audio-only programs and the associated program description information, in accordance with the present invention. Figure 4 shows an example of a displayed program guide. Figure 5 shows a flow chart, in accordance with the present invention, for processing user inputs and audio-only programs.
Figure 6 shows an example of an animation screen.
DETAILED DESCRIPTION

Figure 1 shows an example of a television system suitable for processing different types of programs, including audio-only programs, and the associated program guide information, in accordance with the present invention. The television receiver shown in Figure 1 is capable of processing both analog NTSC television signals and Internet information. The system shown in Figure 1 has a first input 1100 for receiving the television signal RF_IN at radio frequencies, and a second input 1102 for receiving the baseband television signal VIDEO IN. The RF_IN signal may be supplied from a source such as an antenna or a cable system, while the VIDEO IN signal may be supplied, for example, by a videocassette recorder (VCR). The tuner 1105 and the intermediate frequency (IF) processor 1130 operate in a conventional manner to tune and demodulate a particular television signal included in the RF_IN signal. The IF processor 1130 produces a baseband video signal VIDEO, which represents the video program portion of the tuned television signal. The IF processor 1130 also produces a baseband audio signal, which is coupled to an audio processing section (not shown in Figure 1) for further audio processing. Although Figure 1 shows input 1102 as a baseband signal, the television receiver could include a second tuner and IF processor, similar to units 1105 and 1130, to produce a second baseband video signal from the RF_IN signal or from a second radio frequency signal source.

The system shown in Figure 1 also includes a main microprocessor (mP) 1110 for controlling components of the television receiver, such as the tuner 1105, the picture-in-picture (PIP) processing unit 1140, the video signal processor 1155, and the StarSight® data processing module 1160. As used herein, the term "microprocessor" represents various apparatus including, but not limited to, microprocessors, microcomputers, microcontrollers, and controllers. The microprocessor 1110 controls the system by sending and receiving both commands and data via the serial data bus I2C BUS, which uses the well-known I2C serial data bus protocol. More specifically, the central processing unit (CPU) 1112 within the mP 1110 executes control programs contained within memory, such as the EEPROM 1127 shown in Figure 1, in response to commands provided by a user, for example via the infrared remote control 1125 and the infrared receiver 1122. For example, activation of a "CHANNEL UP" feature on the remote control 1125 causes the CPU 1112 to send a "change channel" command, along with channel data, to the tuner 1105 via the I2C BUS. As a result, the tuner 1105 tunes the next channel in the channel scan list. Another example of a control program stored in the EEPROM 1127 is the software implementing the operations shown in Figure 5, discussed below, in accordance with the present invention. The main microprocessor 1110 also controls the operation of a communications interface unit 1113 to provide the capability to upload and download information to and from the Internet. The communications interface unit 1113 includes, for example, a modem for connecting to an Internet service provider, for example via a telephone line or via a cable television line.
The communication capability allows the system shown in Figure 1 to provide e-mail capability and Internet-related features, such as web browsing, in addition to receiving television programming.
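As a rough illustration of the remote-control-to-tuner command path described above, the sketch below shows how a control program might translate a key event into a bus command. Everything here is hypothetical: the `i2c_write` helper, the device address, the opcode, and the message layout merely stand in for the I2C traffic the text describes.

```c
#include <stdint.h>

/* Hypothetical I2C write helper; a real receiver would use its bus driver. */
extern int i2c_write(uint8_t dev_addr, const uint8_t *buf, int len);

#define TUNER_I2C_ADDR     0x60   /* assumed 7-bit address of tuner 1105 */
#define CMD_CHANGE_CHANNEL 0x01   /* assumed opcode */

/* Issue a "change channel" command with the target channel number. */
static int tune_channel(uint16_t channel)
{
    uint8_t msg[3] = {
        CMD_CHANGE_CHANNEL,
        (uint8_t)(channel >> 8),    /* channel number, high byte */
        (uint8_t)(channel & 0xFF),  /* channel number, low byte  */
    };
    return i2c_write(TUNER_I2C_ADDR, msg, sizeof msg);
}

/* CHANNEL UP: step to the next entry in the channel scan list. */
static int channel_up(const uint16_t *scan_list, int list_len, int *cur_idx)
{
    *cur_idx = (*cur_idx + 1) % list_len;
    return tune_channel(scan_list[*cur_idx]);
}
```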
The central processing unit 1112 controls functions included within the mP 1110 via the bus 1119 inside the mP 1110. In particular, the central processing unit 1112 controls the auxiliary data processor 1115 and the on-screen display (OSD) processor 1117. The auxiliary data processor 1115 extracts auxiliary data, such as StarSight® data, from the video signal PIPV. StarSight® data, which provides the program guide data information in a known format, is normally received only on a particular television channel, and the television receiver must tune that channel to extract the StarSight® data. To prevent StarSight® data extraction from interfering with normal use of the television receiver, the central processing unit 1112 initiates StarSight® data extraction by tuning the particular channel only during a period when the television receiver is not normally in use (for example, 2:00 AM). At that time, the central processing unit 1112 configures the decoder 1115 such that the auxiliary data is extracted from the horizontal line intervals, such as line 16, that are used for StarSight® data. The central processing unit 1112 controls the transfer of the extracted StarSight® data from the decoder 1115, via the I2C BUS, to the StarSight® module 1160. A processor internal to the module formats and stores the data in memory within the module. In response to activation of the StarSight® EPG display (for example, a user activating a particular key on the remote control 1125), the central processing unit 1112 transfers the formatted StarSight® EPG display data from the StarSight® module 1160, via the I2C BUS, to the on-screen display processor 1117. The on-screen display processor 1117 operates in a conventional manner to produce the video signals R, G, and B (RGB_OSD) which, when coupled to a display apparatus (not shown), produce a displayed image representing the on-screen display information, such as on-screen graphics and/or text, in accordance with the flow chart shown in Figure 5 and described later. The on-screen display processor 1117 also produces the fast-switch control signal FSW, which is intended to control a fast switch that inserts the RGB_OSD signals into the system's output video signal at the times when an on-screen display is to be shown. Accordingly, when a user enables the animation feature of the present invention described later, the on-screen display processor 1117 produces the corresponding RGB_OSD signals representing the display information previously stored or programmed in the memory 1127. For example, when a user enables an electronic program guide, for example by activating a particular switch on the remote control 1125, the central processing unit 1112 enables the processor 1117. In response, the processor 1117 produces the RGB_OSD signals representing the program guide data information previously extracted and already stored in memory, as discussed above. The processor 1117 also produces the FSW signal, which indicates when the electronic program guide is to be displayed. The video signal processor (VSP) 1155 performs conventional video signal processing functions, such as luma and chroma processing.
The output signals produced by the video signal processor 1155 are suitable for coupling to a display apparatus, for example a kinescope or a liquid crystal display (not shown in Figure 1), to produce a displayed image. The video signal processor 1155 also includes a fast switch for coupling the signals produced by the on-screen display processor 1117 into the output video signal path at the times when graphics and/or text are to be included in the displayed image. The fast switch is controlled by the control signal FSW, which is generated by the on-screen display processor 1117 in the main microprocessor 1110 at the times when text and/or graphics are to be displayed. The input signal to the video signal processor 1155 is the signal PIPV, which is produced by the picture-in-picture (PIP) processor 1140. When a user activates the PIP mode, the PIPV signal represents a large picture (large pix) into which a small picture (small pix) is inserted. When the PIP mode is inactive, the PIPV signal represents just the large picture; that is, no small picture signal is included in the PIPV signal. The picture-in-picture processor 1140 provides the described functionality in a conventional manner, using features included in the unit 1140 such as a video switch, an analog-to-digital converter (ADC), RAM, and a digital-to-analog converter (DAC).

As mentioned above, the display data included in the EPG display is produced by the on-screen display processor 1117 and is included in the output signal of the video signal processor 1155 in response to the fast-switch signal FSW. When the controller 1110 detects activation of the EPG display, for example when a user presses an appropriate key on the remote control 1125, the controller 1110 causes the on-screen display processor 1117 to produce the EPG display using information such as the program guide data from the StarSight® module 1160. The controller 1110 causes the video signal processor 1155 to combine the EPG display data from the on-screen display processor 1117 and the video image signal, in response to the FSW signal, to produce a display that includes the EPG. The EPG can occupy all or any portion of the display area. When the electronic program guide display is active, the controller 1110 executes a control program stored in the EEPROM 1127. The control program monitors the location of a position indicator, such as a cursor and/or highlight, in the EPG display. A user controls the location of the position indicator using direction and selection keys of the remote control 1125. Alternatively, the system could include a mouse device. The controller 1110 detects activation of a selection device, such as the clicking of a mouse button, and evaluates the current cursor location information in conjunction with the EPG data being displayed to determine the desired function, for example tuning to a particular program. The controller 1110 subsequently activates the control action associated with the selected feature.
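A toy sketch of this selection handling is shown below. The structures and function names are invented for illustration and do not correspond to the actual control program in the EEPROM 1127; `tune_channel` reuses the hypothetical tuning helper sketched earlier.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical EPG cell, as presented in the on-screen grid. */
typedef struct {
    uint16_t channel;
    bool     currently_airing;
} epg_cell_t;

/* Assumed helpers provided elsewhere in the receiver firmware. */
extern int tune_channel(uint16_t channel);
extern int schedule_recording(const epg_cell_t *cell);

/* Invoked when the user activates "select" on the highlighted cell. */
static int on_epg_select(const epg_cell_t *highlighted, bool record_requested)
{
    if (record_requested)
        return schedule_recording(highlighted);     /* e.g., a VCR timer   */
    if (highlighted->currently_airing)
        return tune_channel(highlighted->channel);  /* switch immediately  */
    return -1;  /* future program: nothing to tune yet */
}
```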
An exemplary embodiment of the system features shown in Figure 1 and described so far comprises an ST9296 microprocessor produced by SGS-Thomson Microelectronics, providing the features associated with the mP 1110; an M65616 picture-in-picture processor produced by Mitsubishi, providing the basic picture-in-picture functionality described in association with the picture-in-picture processor 1140; and an LA7612 video signal processor produced by Sanyo, providing the functions of the video signal processor 1155.

Figure 2 shows another example of an electronic device capable of processing different types of programs, including audio-only programs, and the associated program guide, in accordance with the present invention. As described below, the system shown in Figure 2 is an MPEG-compatible system for receiving MPEG-encoded transport streams representing transmitted programs. However, the system shown in Figure 2 is exemplary only. The user interface systems are also applicable to other types of digital signal processing equipment, including systems that are not MPEG compatible and that involve other types of encoded data streams. Other such devices include, for example, digital video disc (DVD) systems and MPEG program streams, and systems that combine computer and television functions, such as the so-called "PCTV". Furthermore, although the system described below is described as processing broadcast programs, this is exemplary only. The term "program" is used to represent any form of packetized data, such as telephone messages, computer programs, Internet data, or other communications, for example.

In overview, in the video receiver system of Figure 2, a carrier modulated with video data is received by the antenna 10 and processed by the unit 15. The resulting digital output signal is demodulated by the demodulator 20 and decoded by the decoder 30. The output from the decoder 30 is processed by the transport system 25, which responds to commands from the remote control unit 125. The system 25 provides compressed data outputs for storage, for further decoding, or for communication with other devices. The video and audio decoders 85 and 80, respectively, decode the compressed data from the system 25 to provide outputs for display. The data port 75 provides an interface for communicating the compressed data from the system 25 to other devices, such as a computer or a High Definition Television (HDTV) receiver, for example. The storage device 90 stores the compressed data from the system 25 on the storage medium 105. In a playback mode, the device 90 also supports retrieval of the compressed data from the storage medium 105 for processing by the system 25 for decoding, communication with other devices, or storage on a different storage medium (not shown, to simplify the drawing).

Considering Figure 2 in detail, a carrier modulated with video data, received by the antenna 10, is converted to digital form and processed by the input processor 15. The processor 15 includes the radio frequency (RF) tuner, the intermediate frequency (IF) mixer, and amplification stages for down-converting the input video signal to a lower frequency band suitable for further processing. The resulting digital output signal is demodulated by the demodulator 20 and decoded by the decoder 30. The output from the decoder 30 is further processed by the transport system 25.
The multiplexer (mux) 37 of the service detector 33 is provided, via the selector 35, with either the output from the decoder 30 or the output of the decoder 30 further processed by a descrambling unit 40. The descrambling unit 40 may be, for example, a removable unit, such as a smart card in accordance with ISO 7816 and the NRSS (National Renewable Security Standards) Committee standards (the NRSS removable conditional access system is defined in EIA Draft Document IS-679, Project PN-3639). The selector 35 detects the presence of a compatible insertable descrambling card and provides the output of the unit 40 to the multiplexer 37 only if the card is currently inserted in the video receiving unit. Otherwise, the selector 35 provides the output from the decoder 30 to the multiplexer 37. The presence of the insertable card allows the unit 40 to descramble additional premium program channels, for example, and provide additional program services to a viewer. It should be noted that, in the preferred embodiment, the NRSS unit 40 and the smart card unit 130 (the smart card unit 130 is discussed below) share the same system interface, so that only one NRSS card or one smart card may be inserted at a time. However, the interfaces may also be separate, to allow parallel operation.

The data provided to the multiplexer 37 is in the form of an MPEG-compliant packetized transport data stream, as defined in section 2.4 of the MPEG systems standard, and includes the program guide information and the data content of one or more program channels. The individual packets that comprise particular program channels are identified by Packet Identifiers (PIDs). The transport stream contains Program Specific Information (PSI) for use in identifying the PIDs and assembling the individual data packets to recover the content of all the program channels that comprise the packetized data stream. The transport system 25, under the control of the system controller 115, acquires and collates the program guide information from the incoming transport stream, from the storage device 90, or from an Internet service provider via the communications interface unit 116. The individual packets that comprise either the content of a particular program channel or the program guide information are identified by their Packet Identifiers (PIDs) contained within the header information. As discussed above, the program description contained in the program guide information may comprise different descriptive fields of the program, such as title, star, rating, etc., related to a program.
The software control program of Figure 5 can be stored, for example, in the built-in memory (not shown) of the system controller 115. By using the remote control unit 125 (or other selection element, such as a mouse ), a user may select, from the items of the on-screen display menu, such as a program to be viewed, a program to be stored, the type of storage medium, and the manner of storage. The system controller 115 uses the selection information, provided through the interface 120, to configure the system 25 in order to select the programs to be stored and displayed, and to generate Program-Specific Information suitable for the selected storage medium and apparatus. . The controller 115 configures the elements 45, 47, 50, 55, 65, and 95 of the system 25, establishing the control register values within these elements by means of a data bus, and by selecting signal paths by means of the multiplexers 37 and 110, with the control signal C. In response to the control signal C, the multiplexer 37 selects any of the transport current from the unit 35, or in a reproduction mode, a current data recovered from the storage device 90 by means of the storage interface 95. In a normal operation without reproduction, the data packets comprising the program that the user selected to view, are identified by their packet identifiers, by means of the selection of unit 45. If a cryptic encoding indicator in the header data of the selected program packages indicates that the packets are encoded cryptic In this case, the unit 45 provides the packets to the cryptic decoding unit 50. Otherwise, the unit 45 provides the crypto-encoded packets to the transport decoder 55. Similarly, the data packets comprising the programs that the user selected for their storage are identified by their packet identifiers by the selection unit 47. The unit 47 provides the packets cryptically encoded to the cryptic decoding unit 50, or the packets non-encoded cryptically to the multiplexer 110, based on the flag information cryptic encoding of the packet header. The functions of the cryptic decoders 40 and 50 can be implemented in a single removable smart card that is compatible with the NRSS standard. The approach places all security-related functions in a removable unit that can be easily replaced if a service provider decides to change the cryptic coding techniques, or allow to easily change the security system, for example, to demix a different service. The units 45 and 47 employ packet identifier detection filters, which are coupled with the packet identifiers of the input packets provided by the multiplexer 37, with packet identifier values previously loaded in the control registers inside the units 45 and 47 by the controller 115. The previously loaded packet identifiers are used in the units 47 and 45 to identify the data packets to be stored, and the data packets to be decoded for use in the provision of data. a video image. The identifiers of ~ previously loaded packages are stored in the query tables of units 45 and 47. The query tables of packet identifiers are mapped in memory, in the cryptic encoding key tables in units 45 and 47, that associate the cryptic encoding keys with each packet identifier previously loaded. 
The memory-mapped packet identifier and encryption key look-up tables allow the units 45 and 47 to match encrypted packets containing a preloaded packet identifier with the associated encryption keys that permit their decryption. Non-encrypted packets have no associated encryption keys. The units 45 and 47 provide both the identified packets and their associated encryption keys to the decryptor 50. The packet identifier look-up table of the unit 45 is also memory mapped to a destination table that matches packets containing preloaded packet identifiers with corresponding destination buffer locations in the packet buffer 60. The encryption keys and the destination buffer location addresses associated with the programs selected by a user for viewing or storage are preloaded into the units 45 and 47, together with the assigned packet identifiers, by the controller 115. The encryption keys are generated by the ISO 7816-3 compliant smart card system 130 from encryption codes extracted from the input data stream. The generation of the encryption keys is subject to customer entitlement, determined from information coded in the input data stream and/or previously stored in the insertable smart card itself (International Standards Organization document ISO 7816-3 of 1989 defines the interface and signal structures for a smart card system). The packets provided by the units 45 and 47 to the unit 50 are encrypted using an encryption technique such as the Data Encryption Standard (DES), defined in Federal Information Processing Standards (FIPS) Publications 46, 74, and 81, provided by the National Technical Information Service, Department of Commerce. The unit 50 decrypts the encrypted packets using the corresponding encryption keys provided by the units 45 and 47, applying decryption techniques appropriate for the selected encryption algorithm. The decrypted packets from the unit 50, and the non-encrypted packets from the unit 45, comprising the program to be displayed, are provided to the decoder 55. The decrypted packets from the unit 50, and the non-encrypted packets from the unit 47, comprising the program for storage, are provided to the multiplexer 110.

The unit 60 contains four packet buffers accessible by the controller 115. One of the buffers is allocated to hold data intended for use by the controller 115, and the other three buffers are allocated to hold packets intended for use by the application devices 75, 80, and 85. Access to the packets stored in the four buffers within the unit 60, by both the controller 115 and the application interface 70, is controlled by the buffer control unit 65. The unit 45 provides a destination indicator to the unit 65 for each packet identified by the unit 45 for decoding. The indicators indicate the individual destination locations within the unit 60 for the identified packets, and are stored by the control unit 65 in an internal memory table. The control unit 65 maintains a series of read and write pointers associated with the packets stored in the buffer 60, based on the first-in, first-out (FIFO) principle.
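Before turning to how those pointers are used, here is a minimal sketch of the kind of packet-identifier lookup just described. It is offered only as an illustrative assumption: the table size, field names, linear-search strategy, and key format are not taken from the actual units 45 and 47.

```c
#include <stdint.h>
#include <stddef.h>

#define MAX_PID_ENTRIES 8   /* assumed size of the preloaded lookup table */

/* One preloaded entry: a PID, its decryption key, and a destination buffer. */
typedef struct {
    uint16_t pid;
    uint8_t  key[8];        /* e.g., a DES key for this program component */
    int      dest_buffer;   /* index of the target buffer in unit 60      */
} pid_entry_t;

typedef struct {
    pid_entry_t entry[MAX_PID_ENTRIES];
    size_t      count;      /* number of entries loaded by the controller */
} pid_table_t;

/* Return the matching entry for a received packet's PID, or NULL to drop it. */
static const pid_entry_t *pid_lookup(const pid_table_t *t, uint16_t pid)
{
    for (size_t i = 0; i < t->count; i++)
        if (t->entry[i].pid == pid)
            return &t->entry[i];
    return NULL;
}
```

In hardware this matching is done by parallel filters rather than a software loop, but the association of PID, key, and destination buffer is the same idea.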
The write pointers, in conjunction with the destination indicators, allow sequential storage of a packet identified by the units 45 or 50 at the next empty location within the appropriate destination buffer in the unit 60. The read pointers allow the packets to be read in sequence from the appropriate destination buffers of the unit 60 by the controller 115 and the application interface 70. The non-encrypted and decrypted packets provided by the units 45 and 50 to the decoder 55 contain a transport header, as defined by section 2.4.3.2 of the MPEG systems standard. The decoder 55 determines, from the transport header, whether the non-encrypted and decrypted packets contain an adaptation field (in accordance with the MPEG systems standard). The adaptation field contains timing information including, for example, Program Clock References (PCRs), which allow synchronization and decoding of the content packets. Upon detecting a timing information packet, that is, a packet containing an adaptation field, the decoder 55 signals the controller 115 that the packet has been received, via an interrupt mechanism, by setting a system interrupt. In addition, the decoder 55 changes the destination indicator of the timing packet in the unit 65 and provides the packet to the unit 60. By changing the destination indicator in the unit 65, the unit 65 diverts the timing information packet provided by the decoder 55 to the buffer location of the unit 60 allocated to hold data for use by the controller 115, instead of to an application buffer location. Upon receiving the system interrupt set by the decoder 55, the controller 115 reads the timing information and the PCR value and stores them in its internal memory. The PCR values of successive timing information packets are used by the controller 115 to adjust the master clock (27 MHz) of the system 25. The difference between the PCR-based and master-clock-based estimates of the time interval between the reception of successive timing packets, generated by the controller 115, is used to adjust the master clock of the system 25. The controller 115 accomplishes this by applying the derived time difference estimate to adjust the input control voltage of a voltage controlled oscillator used to generate the master clock. The controller 115 resets the system interrupt after storing the timing information in the internal memory. The packets received by the decoder 55 from the units 45 and 50, which contain the program content, including the audio, video, caption, and other information, are directed, via the unit 65, from the decoder 55 to the designated application device buffers in the packet buffer 60. The application control unit 70 sequentially retrieves the audio, video, caption, and other data from the designated buffers in the buffer 60 and provides the data to the corresponding application devices 75, 80, and 85. The application devices comprise the audio and video decoders 80 and 85 and the high-speed data port 75. For example, the packet data is processed according to the type of program, in accordance with the flow chart shown in Figure 5, which is discussed later.
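The following fragment is a rough, hypothetical illustration of the PCR-based clock adjustment just described; the loop gain, the `set_vcxo_control` interface, and the unit conventions are assumptions, not the actual 27 MHz control loop. It simply compares the interval measured by successive PCR values with the interval measured by the local master clock and nudges the oscillator control accordingly.

```c
#include <stdint.h>

/* Assumed hardware hook: adjust the control voltage of the 27 MHz VCXO. */
extern void set_vcxo_control(int32_t correction);

#define LOOP_GAIN_DIV 64   /* assumed proportional gain */

/*
 * Called for each received timing packet.  pcr and local are the PCR value
 * and the local 27 MHz counter sampled at packet arrival.
 */
static void on_pcr_packet(uint64_t pcr, uint64_t local)
{
    static uint64_t prev_pcr, prev_local;
    static int initialized;

    if (initialized) {
        /* Interval between successive timing packets, by each clock. */
        int64_t pcr_delta   = (int64_t)(pcr - prev_pcr);
        int64_t local_delta = (int64_t)(local - prev_local);

        /* Positive error: local clock ran fast relative to the encoder. */
        int64_t error = local_delta - pcr_delta;
        set_vcxo_control((int32_t)(-error / LOOP_GAIN_DIV));
    }
    prev_pcr = pcr;
    prev_local = local;
    initialized = 1;
}
```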
Also, the packet data corresponding to a composite program guide generated by the controller 115, as described above, can be conveyed to the video decoder 85 to be formatted into a video signal suitable for display on a monitor (not shown) connected to the video decoder 85. Also, for example, the data port 75 can be used to provide high-speed data, such as computer programs, to a computer. Alternatively, the port 75 may be used to output data to an HDTV decoder, to display the images corresponding to a selected program or a program guide, for example. The packets containing the PSI information are recognized by the unit 45 as being intended for the buffer of the controller 115 in the unit 60. The PSI packets are directed to this buffer, via the unit 65, by the units 45, 50, and 55, in a manner similar to that described for the packets containing program content. The controller 115 reads the PSI from the unit 60 and stores it in internal memory. The controller 115 also generates condensed PSI (CPSI) from the stored PSI and incorporates the CPSI into a packetized data stream suitable for storage on a selectable storage medium. Packet identification and routing is governed by the controller 115 in conjunction with the packet identifier, destination, and encryption key look-up tables of the units 45 and 47 and the functions of the control unit 65, in the manner previously described. In addition, the controller 115 is coupled to a communications interface unit 116, which operates in a manner similar to the interface unit 1113 of Figure 1. That is, the unit 116 provides the capability to upload and download information to and from the Internet. The communications interface unit 116 includes, for example, a modem for connecting to an Internet service provider, for example via a telephone line or via a cable television line. The communication capability allows the system shown in Figure 2 to provide e-mail capability and Internet-related features, such as web browsing, in addition to receiving television programming.

Figure 3 is a specific implementation of the electronic apparatus generally shown in Figure 2 and described in detail above. Figure 3 depicts a satellite receiver set-top box, designed and manufactured by Thomson Consumer Electronics of Indianapolis, Indiana, USA, to receive the DirecTV™ satellite service provided by Hughes Electronics. As shown in Figure 3, the set-top box has a tuner 301 that receives and tunes the applicable satellite radio frequency signals, in the range of 950 to 1,450 MHz, from a satellite antenna 317. The tuned analog signals are output to a link module 302 for further processing. The link module 302 is responsible for the further processing of the tuned analog signals I Output and Q Output from the tuner 301, including filtering and conditioning of the analog signals and conversion of the analog signals to a digital output signal, DATA. The link module 302 is implemented as an integrated circuit (IC). The link module integrated circuit is manufactured by SGS-Thomson Microelectronics of Grenoble, France, and has the part number ST 15339-610. The digital output signal, DATA, from the link module 302 consists of an MPEG-compliant packetized data stream that is recognized and processed by the transport unit 303.
The data stream, as described in detail in relation to Figure 2, includes the program guide data information and the data content of one or more program channels of the DirecTV™ satellite broadcast service. Further, the program guide data contains information related to the type of each program (for example, audio only, video only, etc.), as indicated, for example, by the "class" type. The function of the transport unit 303 is the same as that of the transport system 25 shown in Figure 2 and already discussed. As described above, the transport unit 303 processes the data stream into packets according to the Packet Identifiers (PIDs) contained in the header information. The processed data stream is then formatted into MPEG-compatible compressed audio and video packets and coupled to an MPEG decoder 304 for further processing. The transport unit 303 is controlled by an Advanced RISC Machine (ARM) processor 315, which is a RISC-based microprocessor. The ARM processor 315 executes the control software that resides in ROM 308. One software component may be, for example, the control program shown in Figure 5, for processing programs according to their program type in accordance with aspects of the present invention, as will be discussed below. The transport unit 303 is normally implemented as an integrated circuit. For example, a preferred embodiment of the transport unit is an IC manufactured by SGS-Thomson Microelectronics with part number ST 15273-810 or 15103-65C. The MPEG-compatible compressed audio and video packets from the transport unit 303 are delivered to an MPEG decoder 304. The MPEG decoder decodes the compressed MPEG data stream from the transport unit 303. The decoder 304 then outputs the applicable audio data stream, which can be further processed by the audio digital-to-analog converter (DAC) 305 to convert the digital audio data to analog sound. The decoder 304 also outputs the applicable digital video data, representing the image pixel information, to an NTSC encoder 306. The NTSC encoder 306 then further processes this video data into an NTSC-compatible analog video signal, so that the video images can be displayed on a regular NTSC television screen. An example of a preferred embodiment of the MPEG decoder is an integrated circuit manufactured by SGS-Thomson Microelectronics, having part number ST 13520. An on-screen display processor 320 is included in the MPEG decoder 304 integrated circuit. The on-screen display processor 320 reads data from SDRAM 316, which contains the stored on-screen display information. The on-screen display information corresponds to bitmapped on-screen display text and graphics/images. The on-screen display processor 320 is capable of varying the color of each pixel of an on-screen display image under the control of the ARM 315 microprocessor, in a conventional manner. The on-screen display processor 320 is also responsible for generating an exemplary program guide, as shown in Figure 4, under the control of the ARM 315 processor. In our exemplary embodiment, upon detecting a user request to generate a guide display, the ARM 315 microprocessor processes the program guide data information obtained from a data stream provided by a program guide information provider and formats the guide data information into on-screen display pixel data corresponding to a complete "grid guide", as shown in Figure 4.
The on-screen display pixel data from the transport unit 303 is then sent to the on-screen display processor 320 in the MPEG audio/video decoder 304 to generate the guide image, as described above. As shown in Figure 4, the "grid guide" 400 normally occupies the entire screen of a display. The grid guide 400 shows programs in a time-and-channel format, similar to the television listings in a newspaper. In particular, one dimension (for example, the horizontal one) of the guide shows the time information, while the other dimension (for example, the vertical one) of the guide shows the channel information. The time information is conveyed to the user by a time line 401 along the upper portion of the guide, marked at half-hour intervals. The channel information is conveyed to the user by the channel numbers 410-416 and the station names of the corresponding channels 420-426. In addition, the program guide 400 contains the Internet icon 450 and the e-mail icon 460. By clicking on these icons, a user can browse the Internet and send/receive e-mail, respectively, through the communications interface unit 307. Furthermore, an Internet web site icon can also be incorporated in a grid cell of the program guide. For example, by clicking on "ESPN.com" within the grid cell 470, the user will automatically be linked, for example, to an ESPN web site. Additional relevant functional blocks of Figure 3 include the modem 307, which corresponds to the communications interface unit 116 shown in Figure 2, for accessing the Internet, for example. The Conditional Access Module (CAM) 309 corresponds to the NRSS decryption unit 130 shown in Figure 2, for providing conditional access information. The broadband data module 310 corresponds to the High Speed Data Port 75 shown in Figure 2, for providing high-speed data access, for example to an HDTV decoder or a computer. A keyboard/infrared receiver module 312 corresponds to the remote unit interface 120 shown in Figure 2, for receiving user control commands from a user control unit 314. The digital AV bus module 313 corresponds to the input/output port 100 shown in Figure 2, for connecting to an external device, such as a videocassette recorder or a digital video disc player.

Figure 5 shows the flow chart of an exemplary control program that can be executed by the central processing unit 1112 of Figure 1, the controller 115 of Figure 2, or the ARM microprocessor 315 of Figure 3, to implement features according to aspects of the present invention. A person skilled in the art will readily recognize that the control program of Figure 5, when executed by any of the systems described in Figures 1 to 3, provides the same features in accordance with the present invention. Accordingly, to avoid redundancy, the control program shown in Figure 5 is described below only with respect to the exemplary hardware implementation shown in Figure 3. As shown in step 510, and as discussed above, the on-screen display information representing the text and graphics/images to be displayed in accordance with aspects of the present invention is normally pre-programmed and already stored, for example, in SDRAM 316. The system shown in Figure 3 also processes and stores the program description information contained in the program guide data for each of the programs described in the program guide data, as shown in step 515.
In particular, the "class" information that indicates the type of the program (for example, audio only, video only, audio-video, data, etc.) is retrieved and stored in the SDRAM 316 by the ARM processor 315. In step 520, a user may select a program from the program guide shown in Figure 4, for example by highlighting the grid cell containing the program using the user control unit 314 of the system shown in Figure 3. As an example, as shown in Figure 4, the user has selected the program "SONG 1" in grid cell 430 by highlighting it. Once a program is selected, the ARM processor 315 determines whether the selected program is an audio-only program, as shown in step 525. As described above, the ARM processor determines this by examining the "class" information contained in the program guide data for this selected program. If the ARM processor 315 determines that this program is not an audio-only program but is, for example, a program having simultaneous audio and video information, the ARM processor 315 will then process this program as normal, simultaneously displaying the received video and playing the received audio portion of the program, as shown in step 530. On the other hand, if the ARM processor determines in step 525 that the received program is an audio-only program, the ARM processor 315 will further determine whether the user has previously selected an animation feature, as shown in step 535. If the ARM processor determines that the user has not previously selected the animation feature, the ARM processor will play the received audio program and display only a blank or blue screen, as shown in step 540. On the other hand, if the ARM 315 processor determines that the user has previously selected the animation feature and the selected program is an audio-only program, then the ARM processor will proceed to step 545. In step 545, the ARM processor 315 instructs the on-screen display processor 320 to retrieve from the memory 316 the previously programmed display information for implementing the animation feature in accordance with the present invention. The ARM processor 315 also instructs the on-screen display processor 320 to display the on-screen display information on a screen 600, as shown in Figure 6. The display information of our exemplary embodiment corresponds to a screen having a plurality of screen elements 601-606. The screen elements in this case are, for example, a series of footprints 601-606. The ARM processor also instructs the on-screen display processor 320 to display the associated descriptive program information, contained in the program guide information, about this audio-only program. For example, descriptive program information about the content, title, artist, and class type of this program is displayed on the screen 600, as shown in Figure 6. In addition, to achieve an animated effect of the screen elements 601-606 of our embodiment, the ARM processor changes the color scheme of the screen elements 601-606. For example, the ARM processor may instruct the on-screen display processor 320 to display all of the footprints 601-606 initially in the same color as the background color (for example, blue). The on-screen display can then change the color of each footprint in sequence, starting from footprint 601 and proceeding to footprint 606, and so on.
A viewer then has the visual effect of seeing a foot stepping along gradually, leaving behind a trail of footprints. In addition, the same display information used in the animation feature thus described can also be used as a screen saver. For example, the ARM processor 315 may have a timer routine that keeps track of when the user's last command was entered via the user control unit 314. If a certain amount of time has passed (for example, 3 minutes) since the last user input, the ARM processor will instruct the on-screen display processor to display the same display information used in the animation feature described above, to prevent screen burn-in. This is advantageous because system resources, especially memory resources, are conserved by using the same display information for both purposes. It should be understood that the embodiments and variations shown and described herein are for illustration only, and that those skilled in the art may implement different modifications without departing from the scope and spirit of the invention.
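To tie the pieces together, here is a compact, speculative sketch of the decision flow of Figure 5 and the footprint color cycling. The function names, the palette interface, and the mode handling are invented stand-ins, not the actual firmware stored in ROM 308.

```c
#include <stdbool.h>
#include <stdint.h>

/* Assumed services provided elsewhere in the receiver. */
extern void play_audio(void);
extern void display_video(void);
extern void show_blank_screen(void);
extern void osd_set_element_color(int element, uint32_t rgb);
extern void osd_show_program_text(const char *title, const char *artist);

#define NUM_FOOTPRINTS   6           /* screen elements 601-606 */
#define BACKGROUND_RGB   0x0000FFu   /* e.g., blue background   */
#define FOOTPRINT_RGB    0xFFFFFFu

typedef enum { CLASS_AUDIO_VIDEO, CLASS_AUDIO_ONLY } prog_class_t;

/* Figure 5, steps 525-545: choose what to present for the selected program. */
static void present_selected(prog_class_t cls, bool animation_enabled,
                             const char *title, const char *artist)
{
    if (cls != CLASS_AUDIO_ONLY) {          /* step 530: normal A/V program */
        play_audio();
        display_video();
        return;
    }
    play_audio();                           /* audio-only program           */
    if (!animation_enabled) {               /* step 540: blank/blue screen  */
        show_blank_screen();
        return;
    }
    osd_show_program_text(title, artist);   /* step 545: guide text + frames */
}

/* One animation tick: reveal footprints one at a time, then start over. */
static void animation_tick(void)
{
    static int next;
    if (next == 0)                          /* restart the trail */
        for (int i = 0; i < NUM_FOOTPRINTS; i++)
            osd_set_element_color(i, BACKGROUND_RGB);
    osd_set_element_color(next, FOOTPRINT_RGB);
    next = (next + 1) % NUM_FOOTPRINTS;
}
```

The same `animation_tick` routine could equally be driven from an idle timer (for example, after a few minutes with no remote-control input) to provide the screen-saver behavior described above, which is one way the display information can serve both purposes.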

Claims (16)

1. An apparatus for processing a first type of program having both audio and video content and a second type of program having audio content only, characterized by: a memory element for storing visual display information representing an animated image; a control element for determining whether a selected program is the first type of program having both audio and video content or the second type of program having audio content only; and the control element causing the playing of the audio content and the display of the video content when the selected program is a first type of program, and causing the playing of the audio content only and the display of the animated image when the selected program is a second type of program.
2. The apparatus of claim 1, wherein the control element determines the program type based on received program guide information.
3. The apparatus of claim 1, wherein program guide information is displayed together with the animated image.
4. The apparatus of claim 1, wherein the animated image serves as a screen saver.
5. An apparatus for processing a first type of program having both audio and video content and a second type of program having audio content only, characterized by: a memory element for storing visual display information representing an animated image; a user control element for selecting between a first mode and a second mode; a control element for determining whether a selected program is the first type of program having both audio and video content or the second type of program having audio content only; and the control element, when the selected program is a first type of program, causing the playing of the audio content and the display of the video content, and, when the selected program is a second type of program, causing, in the first mode, the playing of the audio content only and the display of the animated image, and, in the second mode, the playing of the audio content only and the display of a static screen.
6. The apparatus of claim 5, wherein the static screen is a blank screen.
7. The apparatus of claim 5, wherein the static screen is a blue screen.
8. The apparatus of claim 5, wherein the control element determines the program type based on received program guide information.
9. A method for processing a first type of program having both audio and video content and a second type of program having audio content only, characterized by: storing visual display information representing an animated image; determining whether a selected program is the first type of program having both audio and video content or the second type of program having audio content only; causing the playing of the audio content and the display of the video content when the selected program is the first type of program; and causing the playing of the audio content only and the display of the animated image when the selected program is a second type of program.
10. The method of claim 9, wherein the determining step is based on received program guide information.
11. The method of claim 9, further comprising the step of displaying program guide information along with the animated image.
12. A method for processing a first type of program having both audio and video content and a second type of program having audio content only, characterized by: storing visual display information representing an animated image; selecting between a first mode and a second mode; determining whether a selected program is the first type of program having both audio and video content or the second type of program having audio content only; causing, when the selected program is a first type of program, the playing of the audio content and the display of the video content; and causing, when the selected program is a second type of program, in the first mode, the playing of the audio content only and the display of the animated image, and, in the second mode, the playing of the audio content only and the display of a static screen.
  13. The method of claim 12, wherein the static screen is a blank screen.
  14. The method of claim 12, wherein the static screen is a blue screen.
15. The method of claim 12, wherein the determining step is based on received program guide information.
16. The method of claim 12, wherein the animated image serves as a screen saver.
MXPA/A/1999/011215A 1997-06-06 1999-12-03 System and method for processing audio-only programs in a television receiver MXPA99011215A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US60/048,879 1997-06-06

Publications (1)

Publication Number Publication Date
MXPA99011215A true MXPA99011215A (en) 2001-05-17

Family


Similar Documents

Publication Publication Date Title
EP0986903B1 (en) System and method for sorting program guide information
KR100629401B1 (en) Method for processing program guide information
EP1374574B1 (en) Method for searching of an electronic program guide
US20040078816A1 (en) System and method for simplifying different types of searches in electronic program guide
US20030223734A1 (en) System and method for providing recording function when no program information is available
US20040073922A1 (en) System and method for distinguishing between indentically titled programs
EP1197076B1 (en) Program guide processing
MXPA99011215A (en) System and method for processing audio-only programs in a television receiver
MXPA99011214A (en) System and method for sorting program guide information
MXPA99011217A (en) System and method for changing program guide format
MXPA99011216A (en) System and method for recording pay tv programs