EP1021808A1 - Device and methods for controlling household appliances - Google Patents

Device and methods for controlling household appliances

Info

Publication number
EP1021808A1
Authority
EP
European Patent Office
Prior art keywords
computer
operative
appliance
user
household
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP98920723A
Other languages
German (de)
English (en)
Inventor
Oz Gabai
Jacob Gabai
Nimrod Sandlerman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Creator Ltd
Original Assignee
Creator Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from IL12085597A (external priority: IL120855A0)
Application filed by Creator Ltd filed Critical Creator Ltd
Publication of EP1021808A1
Current legal status: Withdrawn

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • H04L12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • H04L12/282 Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/06 Receivers
    • H04B1/16 Circuits
    • H04B1/20 Circuits for coupling gramophone pick-up, recorder output, or microphone to receiver
    • H04B1/202 Circuits for coupling gramophone pick-up, recorder output, or microphone to receiver by remote control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L13/00 Speech synthesis; Text to speech systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/06 Receivers
    • H04B1/16 Circuits
    • H04B1/20 Circuits for coupling gramophone pick-up, recorder output, or microphone to receiver
    • H04B1/205 Circuits for coupling gramophone pick-up, recorder output, or microphone to receiver with control bus for exchanging commands between units
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • H04L2012/284 Home automation networks characterised by the type of medium used
    • H04L2012/2841 Wireless

Definitions

  • the present invention relates to household appliances and methods for their operation.
  • a signal receiving sensor such as an IR sensor receives a signal from a remote control unit, typically hand-held.
  • the IR signal is processed to be compatible with electrically operable equipment such as a sound entertainment system or electrical appliances.
  • central computer or microprocessor is associated with the dwelling structure and receives a signal from the remote control unit, via the signal receiving sensor.
  • the computer generates a control signal to control the electrically operable equipment. Signals may be sent from the electrically operable equipment through the computer to the remote control unit for providing feedback information to the user of the remote control unit.
  • toys include vehicles whose motion is controlled by a human user via a remote control device.
  • Haugerud describes computer control of a toy via a wired connection, wherein the user of the computer typically writes a simple program to control movement of a robot.
  • US Patent 4,840,602 to Rose describes a talking doll responsive to an external signal, in which the doll has a vocabulary stored in digital data in a memory which may be accessed to cause a speech synthesizer in the doll to simulate speech.
  • US Patent 5,142,803 to Lang describes an animated character system with realtime control.
  • US Patent 5,191,615 to Aldava et al. describes an interrelational audio kinetic entertainment system in which movable and audible toys and other animated devices spaced apart from a television screen are provided with program synchronized audio and control data to interact with the program viewer in relationship to the television program.
  • US Patent 5,195,920 to Collier describes a radio controlled toy vehicle which generates realistic sound effects on board the vehicle. Communications with a remote computer allows an operator to modify and add new sound effects.
  • the system uses radio signals to transfer audio, video and other control signals to the animated character to provide speech, hearing, vision and movement in real-time.
  • the system may be used with either a conventional
  • German Patent DE 3009-040 to Neuhierl describes a device for adding the capability to transmit sound from a remote control to a controlled model vehicle.
  • the sound is generated by means of a microphone or a tape recorder and transmitted to the controlled model vehicle by means of radio communications.
  • the model vehicle is equipped with a speaker that emits the received sounds.
  • the present invention seeks to provide improved household appliances with psychological added value such as entertainment value. Because of the considerable drudgery involved in the majority of household tasks, psychologically stimulating features in household appliances are extremely useful in providing a pleasant, supportive, entertaining and/or empathic experience to users of household appliances, rather than a neutral, negative or boring experience. According to a preferred embodiment of the present invention, a first household
  • a household appliance which is capable of holding a dialogue with a user of the appliance and acting upon
  • a particular advantage of a household appliance which is capable of holding and acting upon a dialog with a user is that homemakers, like drivers, are often in a situation in which
  • a computer system which generates oral messages regarding each appliance or each function of each appliance or each particular task (for example cooking chicken) in a different voice, e.g. a simulation of voices of different celebrities, in order to aid the user in differentiating between oral messages pertaining to different appliances or tasks.
  • Household appliances include but are not limited to kitchen equipment such as refrigerators, cooking devices, mixing devices and food processing devices; entertainment equipment such as VCRs and televisions; housekeeping equipment such as washing machines, dryers and vacuum cleaners; gardening equipment such as power lawn mowers; and electric tools such as electric drills.
  • a wireless computer controlled household appliance system including a computer system including a first wireless transmitter and a first wireless receiver and operative to transmit
  • the household appliance including a second wireless transmitter and a second wireless receiver, the household appliance
  • the household appliance receiving the first transmission via the second wireless receiver and operative to carry out at least one action based on the first transmission, the household appliance being operative to transmit a second transmission via the second wireless transmitter and wherein the computer system is
  • household appliance apparatus including at least one functional unit operative to
  • a household appliance including a functional unit operative to perform a household operation, and an entertainment generator operative to provide entertainment to a user of the functional unit.
  • an entertainment generator operative to provide entertainment to a user of the functional unit.
  • speaking household appliance apparatus including a functional unit operative to perform a household operation, and a speech generator operative to generate speech specimens audible to a user of the functional unit.
  • the speech generator is operative to generate speech specimens having entertainment value for the user of the functional unit.
  • the functional unit includes at least one sensor operative to sense at least one ambient condition relevant to the functionality of the functional unit and wherein the speech generator is operative responsive to the at least one sensor.
  • speaking household appliance apparatus also includes an inter-appliance communication unit operative to receive messages from at least one other household appliance and the speech
  • the entertainment generator is operative responsive to the messages. Still further in accordance with a preferred embodiment of the present invention, the entertainment generator includes a random entertainment generator operative to provide
  • appliance personifier includes a celebrity simulator operative to generate outputs causing a perception, on the part of the user of the at least one functional unit, that the at least one functional unit behaves similarly to a celebrity.
  • the functional unit includes a loudspeaker and the celebrity simulator includes a personified audio message provider operative to provide verbal messages in a voice resembling the voice of a celebrity, to the loudspeaker.
  • a computerized household appliance system including a plurality of household chore performing appliances distributed in a corresponding plurality of rooms throughout a dwelling- place, each household chore performing appliance including a functional unit, and a loudspeaker for conveying audio messages to a user, and a computer operative to generate audio messages for a user, and to convey the audio messages to the user using at least one of the plurality of loudspeakers.
  • each appliance also includes a sensor operative to sense presence of the user and wherein the computer
  • the sensor includes a microphone.
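
The clauses above describe a central computer that generates an audio message and conveys it through the loudspeaker of at least one appliance, preferably one whose sensor (such as a microphone) detects the user's presence. A minimal sketch of that routing follows; the class and method names are hypothetical and not taken from the patent.

```python
# Hypothetical sketch of the claimed message routing: the central computer
# generates an audio message and conveys it through the loudspeaker of an
# appliance whose sensor (e.g. a microphone) currently detects the user.

class Appliance:
    def __init__(self, name, room):
        self.name = name
        self.room = room

    def sense_presence(self):
        # Placeholder: a real implementation would check the microphone or
        # other presence sensor; here we simply report "no user detected".
        return False

    def play_audio(self, message):
        print(f"[{self.name} in {self.room}] {message}")


def convey_message(appliances, message):
    # Prefer loudspeakers of appliances that sense the user; otherwise fall
    # back to all loudspeakers, since the claim only requires "at least one".
    present = [a for a in appliances if a.sense_presence()]
    for appliance in (present or appliances):
        appliance.play_audio(message)
```
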
  • computer system includes a computer, a computer radio interface communicating commands to
  • the at least one household appliance and a sound board device having at least one audio channel and transmitting commands from the computer to the computer radio interface over the at least one audio channel.
  • the at least one audio channel also includes an audio channel from the computer radio interface to the sound board device over which digital information arriving from at least one appliance is transmitted to the computer.
  • the functional unit includes a loudspeaker and the appliance personifier includes a personified audio message provider operative to provide personified audio messages to the loudspeaker. Still further in accordance with a preferred embodiment of the present invention, the functional unit includes a food-related appliance and wherein the appliance personifier is operative to simulate a diet-facilitating personification of the food-related appliance.
  • a wireless computer controlled household appliance system including a computer system including a wireless transmitter for transmitting a command to perform at least one
  • At least one household appliance including a wireless receiver, the receiver
  • the appliance receiving the command from the transmitter, the appliance being operative to carry out at least one action based on the command.
  • a wireless computer controlled household appliance system including a
  • the household appliance being operative to transmit a transmission via the
  • the computer system being operative to receive the transmission via the wireless receiver and to perform at least one action based on the transmission.
  • a method for wireless computer control of household appliances including transmitting a first transmission from a computer via a first wireless transmitter, receiving the first transmission at at least one household appliance and carrying out at least one action based on the first transmission, and transmitting a second transmission from the at least one household appliance to the computer.
  • a household appliance personification method including providing a household appliance including a functional unit operative to perform a household operation, and using the household appliance to simulate a personification of the functional unit for a user of the
  • a household entertainment method including providing a household appliance including a functional unit operative to perform a household operation, and using the household
  • present invention is a method for performing household operations including providing a household appliance including a functional unit operative to perform a household operation, and generating speech specimens audible to a user of the functional unit.
  • invention is a computerized household appliance running method including distributing a plurality
  • each household chore performing appliance including a functional unit and a loudspeaker for conveying audio messages to a user, using a computer to generate audio messages for a user, and conveying the audio messages to the user using at least one of the plurality of loudspeakers.
  • a wireless computer controlled household appliance communication method including transmitting a command to perform at least one appliance action from a computer system including a wireless transmitter to at least one household appliance including a wireless receiver, and using the appliance to carry out at least one action based on the command.
  • a wireless computer controlled household appliance communication method including transmitting a message to a computer system including a wireless receiver from at least one household appliance including a wireless transmitter, and using the computer to perform at least one computer action based on the transmission.
  • a wireless computer controlled toy system including a computer system operative to transmit a first transmission via a first wireless transmitter and at least one toy including a first wireless receiver, the toy receiving the first transmission via the first wireless receiver and
  • the computer system may include a computer game.
  • the toy may include a
  • the at least one action may include a plurality of actions.
  • the first transmission may include a digital signal.
  • the first transmission includes
  • the computer system includes a computer having a MIDI port and wherein the computer may be operative to transmit the digital signal by way of the MIDI port.
  • the sound includes music, a pre-recorded sound and/or speech.
  • the speech may include recorded speech and synthesized speech.
  • the at least one toy has a plurality of states including at least a sleep state and an awake state, and the first transmission includes a state transition command, and the at least one action includes transitioning between the sleep state and the awake state.
  • a sleep state may typically include a state in which the toy consumes a reduced amount of energy and/or in which the toy is largely inactive, while an awake state is typically a state of normal operation.
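
The sleep/awake behaviour described above is a two-state machine driven by a state-transition command carried in the first transmission. The fragment below is a minimal illustration of that idea, with invented command names.

```python
# Illustrative sketch only: a toy with SLEEP and AWAKE states reacting to a
# state-transition command received over the wireless link. Names are hypothetical.

SLEEP, AWAKE = "sleep", "awake"

class ToyState:
    def __init__(self):
        self.state = SLEEP          # reduced-power, largely inactive state

    def handle_command(self, command):
        if command == "WAKE" and self.state == SLEEP:
            self.state = AWAKE      # resume normal operation
        elif command == "SLEEP" and self.state == AWAKE:
            self.state = SLEEP      # power down outputs, ignore most inputs
```
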
  • the computer system includes a plurality of computers.
  • the at least one toy is operative to transmit a second transmission via a second wireless transmitter and the computer system is operative to receive the second transmission via a second wireless receiver.
  • system includes at least one input device and the second transmission includes a status of the at least one input device.
  • the at least one toy includes at least a first toy and a second toy, and wherein the first toy is operative to transmit a toy-to-toy transmission to the second toy via the second wireless transmitter, and wherein the second toy is operative to carry out at least one action based on the toy-to-toy transmission.
  • operation of the computer system is controlled, at least in part, by the second transmission.
  • the computer system includes a computer game, and wherein operation of the game is controlled, at
  • the second transmission may include a digital signal and/or an analog signal.
  • the computer system has a plurality of states including at least a sleep state and an awake state, and the second transmission includes a state transition command, and the computer is operative,
  • At least one toy includes sound input apparatus, and the second transmission includes a sound signal
  • the computer system is also operative to perform at least one of the following actions: manipulate the sound signal; and play the sound signal.
  • the sound includes speech
  • the computer system is operative to perform a speech recognition operation on the speech.
  • the second transmission includes toy identification data
  • the computer system is operative to identify the at least one toy based, at least in part, on the toy identification data.
  • the first transmission includes toy identification data.
  • the computer system may adapt a mode of operation thereof based, at least in part, on the toy identification data.
  • the at least one action may include movement of the toy, movement of a part of the toy and/or an output of a sound.
  • the sound may be transmitted using a MIDI protocol.
  • a game system including a computer system operative to control a computer game and having a display operative to display at least one display object, and at least one toy in
  • the computer game including a plurality of game objects, and the plurality of game objects includes the at least one display object and the at
  • At least one toy is operative to transmit toy identification data to the computer system, and the computer system is operative to adapt a mode of operation of the computer game based, at least in part, on the toy identification data.
  • the computer system may include a plurality of computers. Additionally in accordance with a preferred embodiment of the present invention the first transmission includes computer identification data and the second transmission includes computer identification data.
  • a data transmission apparatus including first wireless apparatus including musical instrument data interface (MIDI) apparatus operative to receive and transmit MIDI data between a first wireless and a first MIDI device and second wireless apparatus including MIDI apparatus operative to receive and transmit MIDI data between a second wireless and a second MIDI device, the first wireless apparatus is operative to transmit MIDI data including data received from the first MIDI device to the second wireless apparatus, and to transmit MIDI data including data received from the second wireless apparatus to the first MIDI device, and the second
  • wireless apparatus is operative to transmit MIDI data including data received from the second MIDI device to the first wireless apparatus, and to transmit MIDI data including data received
  • second wireless apparatus includes a plurality of wirelesses each respectively associated with one of the plurality of MIDI devices, and each of the second plurality of wirelesses is operative to
  • the first MIDI device may include a computer, while the second MIDI device may include a toy.
  • the first wireless apparatus also includes analog interface apparatus operative to receive and transmit analog signals between the first wireless and a first analog device
  • the second wireless apparatus also includes analog interface apparatus operative to receive and transmit analog signals between the second wireless and a second analog device
  • the first wireless apparatus is also operative to transmit analog signals including signals received from the first analog device to the second wireless apparatus, and to transmit analog signal including signals received from the second wireless apparatus to the first analog device
  • the second wireless apparatus is also operative to transmit analog signals including signals received from the second analog device to the first wireless apparatus, and to transmit analog signals including data received from the first wireless apparatus to the second analog device.
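
The data transmission apparatus described above is essentially a bidirectional relay: each wireless apparatus forwards MIDI data (and, in the further embodiment, analog signals) from its local MIDI device to the peer, and delivers data arriving from the peer to its local MIDI device. A rough sketch of that relay loop follows; the wireless and MIDI interfaces are hypothetical placeholders rather than a real API.

```python
# Rough sketch of the bidirectional MIDI relay described above. The wireless
# and MIDI interfaces are hypothetical placeholders, not a real API.

def relay_once(midi_device, wireless):
    """Forward one pending MIDI message in each direction, if any."""
    outgoing = midi_device.read_midi()       # data from the local MIDI device
    if outgoing is not None:
        wireless.transmit(outgoing)          # ... to the peer wireless apparatus

    incoming = wireless.receive()            # data from the peer wireless apparatus
    if incoming is not None:
        midi_device.write_midi(incoming)     # ... to the local MIDI device

def run_bridge(midi_device, wireless):
    while True:
        relay_once(midi_device, wireless)
```
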
  • a method for generating control instructions for a computer controlled toy system includes selecting a toy, selecting at least one command from among a
  • the step of selecting at least one command includes choosing a command, and specifying at least one
  • the at least one control parameter includes at least one condition depending on a result of a
  • At least one of the steps of selecting a toy and the step of selecting at least one command includes utilizing a graphical user interface.
  • the previous command includes a previous command associated with a second toy.
  • the at least one control parameter includes an execution condition controlling execution of the command.
  • the execution condition may include a time at which to perform the command and/or a time at which to cease performing the command.
  • the execution condition may also include a status of the toy.
  • the at least one control parameter includes a command modifier modifying execution of the command. Still further in accordance with a preferred embodiment of the present invention
  • the at least one control parameter includes a condition dependent on a future event.
  • the at least one command includes a command to cancel a previous command.
  • apparatus including wireless transmission apparatus; and signal processing apparatus including at least one of the following: analog/digital sound conversion apparatus operative to convert analog
  • MIDI interface operative to transmit MIDI signals between the computer and a MIDI device using the wireless transmission apparatus.
  • a computer system including a computer, and a sound card operatively attached to the computer and having a MIDI connector and at least one analog connector, wherein the computer is operative to transmit digital signals by means of the MIDI connector and to transmit analog signals by means of the at least one analog connector.
  • the computer is also operative to receive digital signals by means of the MIDI connector and to receive analog signals by means of the at least one analog connector.
  • one or more of the appliances may be integrally formed therewith.
  • FIG. 1 - 32C illustrate a toy system for use in conjunction with a computer system wherein:
  • Fig. 1A is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and operative in accordance with a preferred embodiment of the present invention
  • Fig. 1B is a partly pictorial, partly block diagram illustration of a preferred implementation of the toy 122 of Fig. 1A;
  • Fig. 1C is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and operative in accordance with an alternative preferred embodiment of the present invention
  • Figs. 2 A - 2C are simplified pictorial illustrations of a portion of the system of Fig. 1 A in use;
  • Fig. 3 is a simplified block diagram of a preferred implementation of the computer radio interface 110 of Fig. 1A;
  • Fig. 4 is a more detailed block diagram of the computer radio interface 110 of Fig.
  • Figs. 5A - 5D taken together comprise a schematic diagram of the apparatus of Fig. 4;
  • Fig. 5E is a schematic diagram of an alternative implementation of the apparatus of Fig. 5D;
  • Fig. 6 is a simplified block diagram of a preferred implementation of the toy control device 130 of Fig. 1A;
  • Figs. 7A - 7F taken together with either Fig. 5D or Fig. 5E, comprise a schematic diagram of the toy control device of Fig. 6;
  • FIG. 8A is a simplified flowchart illustration of a preferred method for receiving radio signals, executing commands comprised therein, and sending radio signals, within the toy control device 130 of Fig. 1A;
  • Figs. 8B - 8T taken together, comprise a simplified flowchart illustration of a preferred implementation of the method of Fig. 8A;
  • Fig. 9A is a simplified flowchart illustration of a preferred method for receiving MIDI signals, receiving radio signals, executing commands comprised therein, sending radio signals, and sending MIDI signals, within the computer radio interface 110 of Fig. 1 A;
  • Figs. 9B - 9N taken together with Figs. 8D - 8M, comprise a simplified flowchart illustration of a preferred implementation of the method of Fig. 9 A;
  • Figs. 10A - 10C are simplified pictorial illustrations of a signal transmitted between the computer radio interface 110 and the toy control device 130 of Fig. 1A;
  • Fig. 11 is a simplified flowchart illustration of a preferred method for generating control instructions for the apparatus of Fig. 1 A;
  • Figs. 12A - 12C are pictorial illustrations of a preferred implementation of a graphical user interface implementation of the method of Fig. 11;
  • Fig. 13 is a block diagram of a first sub-unit of a multi-port multi-channel implementation of the computer radio interface 110 of Fig. 1A, which sub-unit resides within
  • Fig. 14 is a block diagram of a second sub-unit of a multi-port multi-channel
  • Figs. 15A - 15E taken together, form a detailed electronic schematic diagram of the apparatus of Fig. 6, suitable for the multi-channel implementation of Figs. 13 and 14;
  • Fig. 16 is a simplified flowchart illustration of a preferred method by which a computer selects a control channel pair in anticipation of a toy becoming available and starts a game-defining communication over the control channel each time both a toy and a transceiver of the computer radio interface are available;
  • Fig. 17 is a simplified flowchart illustration of a preferred method for implementing the "select control channel pair" step of Fig. 16;
  • Fig. 18A is a simplified flowchart illustration of a preferred method for implementing the "select information communication channel pair" step of Fig. 16;
  • Fig. 18B is a simplified flowchart illustration of a preferred method for performing the "locate computer" step of Fig. 18 A;
  • Fig. 19 is a simplified flowchart illustration of a preferred method of operation of the toy control device 130;
  • Fig. 20 is a simplified illustration of a remote game server in association with a
  • Fig. 21 is a simplified flowchart illustration of the operation of the computer or of the network computer of Fig. 20, when operating in conjunction with the remote server;
  • Fig. 22 is a simplified flowchart illustration of the operation of the remote game server of Fig. 20;
  • Fig. 23 is a semi-pictorial semi-block diagram illustration of a wireless computer controlled toy system including a proximity detection subsystem operative to detect proximity between the toy and the computer;
  • FIG. 24A - 24E taken together, form a detailed electronic schematic diagram of a
  • FIGS. 25A - 25F taken together, form a detailed schematic illustration of a computer radio interface which connects to a serial port of a computer rather than to the sound board of the computer;
  • Figs. 26A - 26D taken together, form a detailed schematic illustration of a computer radio interface which connects to a parallel port of a computer rather than to the sound board of the computer;
  • Figs. 27A - 27J are preferred flowchart illustrations of a preferred radio coding technique which is an alternative to the radio coding technique described above with reference to Figs. 8E, 8G - 8M and 10A - C;
  • Figs. 28A - 28K taken together, form a detailed electronic schematic diagram of the multi-port multi-channel computer radio interface sub-unit of Fig. 13;
  • Figs. 29A - 29I taken together, form a detailed electronic schematic diagram of the multi-port multi-channel computer radio interface sub-unit of Fig. 14;
  • Fig. 30 is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and operative in accordance with a further preferred embodiment of the present invention;
  • Fig. 31 is a simplified block diagram illustrating the combination of the computer radio interface and the toy control device as used in the embodiment of Fig. 30;
  • Fig. 33 is a pictorial illustration of personified household appliances associated with a central computer 2100 by means of two-way radio communication;
  • Fig. 34 is a pictorial illustration of a modification of the apparatus of Fig. 33 in which a first appliance is associated by means of a wire with the computer and other appliances communicate with the computer via the first appliance, communication between the first appliance and the other appliances being wireless;
  • Fig. 35 is a pictorial illustration of personified household appliances associated with a central computer by means of an existing electrical household wiring system;
  • Fig. 37 is an example of a portion of a user interface by which the computer is set up by a user of an appliance, to provide the user of the appliance with infotainment or entertainment;
  • FIG. 38A - 38B taken together, form a simplified flowchart illustration of a
  • Fig. 39 is a simplified diagram of the interface between computer radio interface and a soundboard of the computer
  • Fig. 40 is a simplified block diagram of a preferred implementation for the
  • Fig. 41 is a simplified flowchart illustration of a preferred communication method
  • Fig. 42 is a diagram of the analog and digital representation of the SYNC, SQ, zero-valued bit and the one-valued bit signals;
  • Fig. 44 is an example of a dialogue between a personified microwave oven and a personified dishwasher culminating in a verbal message emitted by a television;
  • Fig. 45 is a simplified flowchart illustration of a method of operation for a central computer according to a first preferred implementation of an inter-appliance dialogue such as the dialog of Fig. 44;
  • Fig. 46 is a flowchart illustration of a method of operation for an appliance according to a second preferred implementation of the dialogue of Fig. 45;
  • Fig. 47 is a pictorial illustration of a central computer 2100 accumulating information from users, via microphone bearing appliances, regarding consumable supplies to be
  • Fig. 48 is a script for the flowchart of Fig. 45 by which the computer 2100 implements the refrigerator's role in the interaction of Fig. 47;
  • Fig. 49 is a script for the flowchart of Fig. 45 by which the computer 2100 implements the washing machine's role in the interaction of Fig. 47;
  • Fig. 50 is a pictorial illustration of a scenario in which a central computer is accumulating information regarding household chore monitoring and timing;
  • Fig. 51 is a script for the flowchart of Fig. 45 by which the computer 2100
  • Fig. 52 is a script for the flowchart of Fig. 45 by which the computer 2100 implements the dryer's role in the interaction of Fig. 50;
  • Fig. 53 is a script for the flowchart of Fig. 45 by which the computer 2100 implements the microwave oven's role in the interaction of Fig. 50.
  • Fig. 1 A is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and operative in accordance with a preferred embodiment of the present invention.
  • the system of Fig. 1A comprises a computer 100, which may be any suitable computer such as, for example, an IBM-compatible personal computer.
  • the computer 100 is equipped with a screen 105.
  • the computer 100 is preferably equipped with a sound card such as, for example, a Sound Blaster Pro card commercially available from Creative Labs, Inc., 1901 McCarthy Boulevard, Milpitas CA 95035 or from Creative Technology Ltd., 67 Ayer Rajah Crescent #03-18, Singapore, 0513; a hard disk; and, optionally, a CD-ROM drive.
  • the computer 100 is equipped with a computer radio interface 110 operative to
  • commands transmitted from the computer 100 to the computer radio interface 110 are transmitted via both analog signals and digital signals, with the digital signals typically being transmitted by way of a MIDI port. Transmission of the analog and digital signals is described below with
  • the transmitted signal may be an analog signal or a digital signal.
  • the received signal may also be an analog signal or a digital signal.
  • Each signal typically comprises a message.
  • a preferred implementation of the computer radio interface 110 is described below with reference to Fig. 3.
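
The patent states only that digital command signals from the computer are conveyed by way of a MIDI port on the sound card. As a hedged illustration of what such a transmission could look like, the sketch below uses the `mido` Python package (an assumption; no library is named in the patent) to wrap hypothetical command bytes in a SysEx message and send them out of a MIDI port.

```python
# Hedged sketch: sending a digital command toward the computer radio interface
# through the sound card's MIDI port. The `mido` package, the port name and the
# command bytes used here are assumptions made for the example.
import mido

def send_command(port_name, command_bytes):
    # Wrap the command bytes in a SysEx message (each data byte must be 0-127).
    message = mido.Message('sysex', data=[b & 0x7F for b in command_bytes])
    with mido.open_output(port_name) as port:
        port.send(message)

if __name__ == "__main__":
    # Hypothetical port name and a hypothetical "perform action" command for toy 2.
    send_command('Sound Card MIDI Out', [0x01, 0x02, 0x0A])
```
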
  • the system of Fig. 1A also comprises one or more toys 120.
  • the system of Fig. 1A comprises a plurality of toys, namely three toys 122, 124, and 126, but it is appreciated that, alternatively, either one toy only or a large plurality of toys may be used.
  • Fig. 1B is a partly pictorial, partly block diagram illustration of the toy 122 of Fig. 1A.
  • Each toy 120 comprises a power source 125, such as a battery or a connection to line power.
  • Each toy 120 also comprises a toy control device 130, operative to receive a wireless signal transmitted by the computer 100 and to cause each toy 120 to perform an action based on the received signal.
  • the received signal may be, as explained above, an analog signal or a digital signal.
  • a preferred implementation of the toy control device 130 is described below with reference to Fig. 6.
  • Each toy 120 preferably comprises a plurality of input devices 140 and output
  • the input devices 140 may comprise, for example, one or more of the following: a microphone 141; a microswitch sensor 142; a touch sensor (not shown in Fig. 1B); a light sensor (not shown in Fig. 1B); a movement sensor 143, which may be, for example, a tilt sensor or an acceleration sensor.
  • Appropriate commercially available input devices include the
  • the output devices 150 may comprise, for example, one or more of the following: a speaker 151; a light 152; a solenoid 153 which may be operative to move a portion of the toy; a motor, such as a stepping motor, operative to move a portion of the toy or all of the toy (not shown in Fig. 1B).
  • Appropriate commercially available output devices include the following: DC motors available from Alkatel (dunkermotoren), Postfach 1240, D-7823, Bonndorf/Schwarzwald, Germany; stepping motors and miniature motors available from Haydon Switch and Instruments, Inc. (HSI), 1500 Meriden Road, Waterbury, CT, USA; and DC solenoids available from Communications Instruments, Inc., P.O. Box 520, Fairview, North Carolina 28730, USA.
  • Examples of actions which the toy may perform include the following: move a portion of the toy; move the entire toy; or produce a sound, which may comprise one or more of the following: a recorded sound, a synthesized sound, music including recorded music or synthesized music, speech including recorded speech or synthesized speech.
  • the received signal may comprise a condition governing the action as, for example, the duration of the action, or the number of repetitions of the action.
  • the portion of the received signal comprising a message comprising a command to perform a specific action as, for example, to produce a sound with a given duration
  • the portion of the received signal comprising a sound typically comprises an analog signal.
  • the portion of the received signal comprising a sound, including music may comprise a digital signal, typically a signal comprising MIDI data.
  • the action the toy may perform also includes reacting to signals transmitted by
  • the toy control device 130 is also operative to transmit a signal intended for the computer 100, to be received by the computer radio interface 110.
  • the computer radio interface 110 is preferably also operative to poll the toy control device 130, that is, transmit a signal comprising a request that the toy control device 130 transmit a signal to the computer radio interface 110. It is appreciated that polling is particularly preferred in the case where there are a plurality of toys having a plurality of toy control devices 130.
  • the signal transmitted by the toy control device 130 may comprise one or more of the following: sound, typically sound captured by a microphone input device 141; status of sensor input devices 140 as, for example, light sensors or micro switch; an indication of low power in the power source 125; or information identifying the toy.
  • a sound signal transmitted by the device 130 may also include speech.
  • the computer system is operative to perform a speech recognition operation on the speech signals.
  • Appropriate commercially available software for speech recognition is available from companies such as: Stylus Innovation Inc., One Kendall Square, Building 300, Cambridge, MA 02139, USA; A&G Graphics Interface, USA, Telephone No. (617) 492-0120, Telefax No. (617) 427-3625; "Dragon Dictate For Windows", available from Dragon Systems Inc., 320
  • the signal from the radio control interface 110 may also comprise, for example, one or more of the following: a request to ignore input from one or more input devices 140; a
  • all signals transmitted in both directions between the computer radio interface 110 and the toy control device 130 include information identifying the toy.
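
A transmission from the toy control device 130 therefore carries, among other things, information identifying the toy, the status of its sensor input devices, and an indication of low power in the power source 125. The sketch below parses such a status message; the byte layout is invented for the example, since the patent does not specify one.

```python
# Illustrative parser for a toy status transmission. The field layout shown
# here is invented for the example; the patent only lists the kinds of
# information carried (toy identity, sensor status, low-power indication).

def parse_status(payload: bytes):
    toy_id    = payload[0]                 # information identifying the toy
    sensors   = payload[1]                 # one bit per sensor input device
    low_power = bool(payload[2])           # indication of low power in the power source
    return {
        "toy_id": toy_id,
        "sensor_bits": [bool(sensors >> i & 1) for i in range(8)],
        "low_power": low_power,
    }
```
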
  • Fig. 1C is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and operative in accordance with an alternative preferred embodiment of the present invention.
  • the system of Fig. 1C comprises two computers 100. It is appreciated that, in general, a plurality of computers 100 may be used.
  • all signals transmitted in both directions between the computer radio interface 110 and the toy control device 130 typically include information identifying the computer.
  • computer 100 runs software comprising a computer game, typically a game including at least one animated character.
  • the software may comprise educational software or any other interactive software including at least one animated object.
  • animated object includes any object which may be depicted on the computer screen 105 and which interacts with the user of the computer via input to and output from the computer.
  • object may be any object depicted on the screen such as, for example: a doll; an action figure; a
  • toy such as, for example, an activity toy, a vehicle, or a ride-on vehicle; a drawing board or sketch board; or a household object such as, for example, a clock, a lamp, a chamber pot, or an
  • FIG. 2 A depicts a portion of the system of Fig. 1A in use.
  • the apparatus of Fig. 2 A comprises the computer screen 105 of Fig. 1 A.
  • animated objects 160 and 165 are depicted on the computer screen.
  • Fig. 2B depicts the situation after the toy 122 has been brought into range of the computer radio interface 110 of Fig. 1A, typically into the same room therewith.
  • the toy 122 corresponds to the animated object 160.
  • the toy 122 and the animated object 160, shown in Fig. 2A are both a teddy bear.
  • the apparatus of Fig. 2B comprises the computer screen 105, on which is depicted the animated object 165.
  • the apparatus of Fig. 2B also comprises the toy 122.
  • the computer 100, having received a message from the toy 122 via the computer radio interface 110, no longer displays the animated object 160 corresponding to the toy 122.
  • the functions of the animated object 160 are now performed through the toy 122, under control of the computer 100 through the computer radio interface 110 and the toy control device 130.
  • Fig. 2C depicts the situation after the toy 126 has also been brought into range of the computer radio interface 110 of Fig. 1A, typically into the same room therewith.
  • the toy 126 has also been brought into range of the computer radio interface 110 of Fig. 1A, typically into the same room therewith.
  • the toy 126 corresponds to the animated object 165.
  • the toy 126 and the animated object 165, shown in Figs. 2A and 2B, are both a clock.
  • the apparatus of Fig. 2C comprises the computer screen 105, on which no animated objects are depicted.
  • the apparatus of Fig. 2C also comprises the toy 126.
  • the computer 100, having received a message from the toy 126 via the computer radio interface 110, no longer displays the animated object 165 corresponding to the toy 126.
  • the functions of the animated object 165 are now performed through the toy 126, under control of the computer 100 through the computer radio interface 110 and the toy control device 130.
  • the user interacts with the animated objects 160 and 165 on the computer screen, typically using conventional methods.
  • the user also interacts with the toy 122, and in Fig. 2C typically with the toys 122 and 126, instead of interacting with the animated objects 160 and 165 respectively.
  • the user may interact with the toys 122 and 126 by moving the toys or parts of the toys; by speaking to the toys; by responding to movement of the toys which movement occurs in response to a signal received from the computer 100; by responding to a sound produced by the toys, which sound is produced in response to a signal received from the computer 100 and which may comprise music, speech, or another sound; or otherwise.
  • Reference is now made to Fig. 3, which is a simplified block diagram of a preferred embodiment of the computer radio interface 110 of Fig. 1A.
  • the apparatus of Fig. 3 comprises the computer radio interface 110.
  • the apparatus of Fig. 3 also comprises a sound card 190, as described above with reference to Fig. 1 A.
  • the connections between the computer radio interface 110 and the sound card 190 are shown.
  • the computer radio interface 110 comprises a DC unit 200 which is fed with power through a MIDI interface 210 from a sound card MIDI interface 194, and the following
  • a MIDI interface 210 which connects to the sound card MIDI interface 194; an audio interface 220 which connects to an audio interface 192 of the sound card 190; and a secondary
  • audio interface 230 which preferably connects to a stereo sound system for producing high quality sound under control of software running on the computer 100 (not shown).
  • the apparatus of Fig. 3 also comprises an antenna 240, which is operative to send and receive signals between the computer radio interface 110 and one or more toy control devices
  • Fig. 4 is a more detailed block diagram of the computer radio interface 110 of Fig.
  • the apparatus of Fig. 4 comprises the DC unit 200, the MIDI interface 210, the audio interface
  • the apparatus of Fig. 4 also comprises a multiplexer
  • a microcontroller 250 controlling the radio transceiver 260, a connection unit 270 connecting the radio transceiver 260 to the microcontroller 250, and a comparator 280.
  • Figs. 5A - 5D taken together comprise a schematic diagram of the apparatus of Fig. 4.
  • Transistors 2N2222 and MPSA14, available from Motorola, Phoenix, AZ, USA.
  • U1 of Fig. 5D may be replaced by:
  • U2 of Fig. 5D may be replaced by:
  • Fig. 5E is a schematic diagram of an alternative implementation of the apparatus of Fig. 5D.
  • the following is a preferred parts list for the apparatus of Fig. 5E: 1. U1 BIM-418-F low power UHF data transceiver module, Ginsburg Electronic
  • U1 may be replaced by:
  • U1 RY3GB021 RF 900 MHz units available from SHARP ELECTRONIC COMPONENTS GROUP, 5700 Northwest, Pacific Rim Boulevard #20, Camas, Washington, USA.
  • U1 RY3GB100 RF Units For DECT available from SHARP ELECTRONIC
  • one of item 1 or either of the alternate items 1 may be
  • the apparatus of Fig. 5E has similar functionality to the apparatus of Fig. 5D, but has higher bit rate transmission and reception capacity and is, for example, preferred when MIDI data is transmitted and received.
  • Figs. 5 A - 5E are self-explanatory with regard to the above parts lists.
  • Fig. 6 is a simplified block diagram of a preferred embodiment of the toy control device 130 of Fig. 1A.
  • the apparatus of Fig. 6 comprises a radio transceiver 260, similar to the radio transceiver 260 of Fig. 4.
  • the apparatus of Fig. 6 also comprises a microcontroller 250 similar to the microcontroller 250 of Fig. 4.
  • the apparatus of Fig. 6 also comprises a digital input/output interface (digital I/O interface) 290, which is operative to provide an interface between the microcontroller 250 and a plurality of input and output devices which may be connected thereto such as, for example, four input devices and four output devices.
  • A preferred implementation of the digital I/O interface 290 is described in more detail below with reference to Figs. 7A - 7F.
  • the apparatus of Fig. 6 also comprises an analog input/output interface (analog I/O interface) 300 operatively connected to the radio transceiver 260, and operative to receive
  • the apparatus of Fig. 6 also comprises a multiplexer 305 which is operative, in response to a signal from the microcontroller 250, to provide output to the analog I/O interface 300 only when analog signals are being transmitted by the radio transceiver 260, and to pass input from the analog I/O interface 300 only when such input is desired.
  • the apparatus of Fig. 6 also comprises input devices 140 and output devices 150.
  • the input devices 140 comprise, by way of example, a tilt switch operatively connected to the digital I/O interface 290, and a microphone operatively connected to the analog I/O interface 300. It is appreciated that a wide variety of input devices 140 may be used.
  • the output devices 150 comprise, by way of example, a DC motor operatively connected to the digital I/O interface 290, and a speaker operatively connected to the analog I/O interface 300. It is appreciated that a wide variety of output devices 150 may be used.
  • the apparatus of Fig. 6 also comprises a DC control 310, a preferred implementation of which is described in more detail below with reference to Figs. 7 A - 7F.
  • the apparatus of Fig. 6 also comprises a comparator 280, similar to the comparator 280 of Fig. 4.
  • the apparatus of Fig. 6 also comprises a power source 125, shown in Fig. 6 by way of example as batteries, operative to provide electrical power to the apparatus of Fig. 6 via the DC control 310.
  • Reference is now made to Figs. 7A - 7F which, taken together with either Fig. 5D or 5E, comprise a schematic diagram of the toy control device of Fig. 6. If the schematic of Fig. 5E is employed to implement the computer radio interface of Fig. 4, using RY3GB021 as U1 of Fig. 5E, then RY3GH021 is used to implement U1 rather than RY3GB021.
  • the signals transmitted between the computer radio interface 110 and the toy control device 130 may be either analog signals or digital signals. In the case of digital signals, the digital signals preferably comprise a plurality of predefined messages, known to both the computer 100 and to the toy control device 130. Each message sent by the computer radio interface 110 to the toy control device
  • Each message sent by the toy control device 130 to the computer radio interface 110 comprises an indication of the sender of the message.
  • messages also comprise the following:
  • each message sent by the computer radio interface 110 to the toy control device 130 comprises an indication of the sender of the message; and each message sent by the toy control device 130 to the computer radio interface
  • a preferred set of predefined messages is as follows:
  • Change Toy control device output pin to D for a period of time and then return to previous state.
  • T1,T2 time - 00-FF H
  • the Audio is sent to the Toy control device by the computer sound card and the Computer radio interface.
  • T1,T2 TIME 00-FF H (SEC) example:
  • CHI Transmit RF channel number 0-F H
  • CH2 Receive RF Channel number 0-F H
  • This command is available only with enhanced radio modules (alternate U1 of Fig. 5E) or with the modules described in Figs. 15A - 15E and 24A - 24E.
  • the computer radio interface number is 6.
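
The predefined messages above carry an identification of the sender or intended recipient together with command-specific parameters (times T1, T2 in the range 00-FF hex, RF channel numbers in the range 0-F hex). As a rough illustration of how such fixed-format messages might be packed, the sketch below invents a byte layout; the opcodes and field order are assumptions, not taken from the patent.

```python
# Illustrative packing of two of the predefined messages. Field sizes follow
# the parameter ranges quoted above (times T1, T2 in 00-FF hex; channel
# numbers 0-F hex); the opcodes and byte order are assumptions for the example.

def pack_output_pin_command(sender_id, toy_id, pin, t1, t2):
    # "Change toy control device output pin D for a period of time."
    assert 0 <= t1 <= 0xFF and 0 <= t2 <= 0xFF
    return bytes([sender_id, toy_id, 0x10, pin, t1, t2])   # 0x10: hypothetical opcode

def pack_channel_command(sender_id, toy_id, tx_channel, rx_channel):
    # Transmit/receive RF channel numbers, each in the range 0-F hex.
    assert 0 <= tx_channel <= 0xF and 0 <= rx_channel <= 0xF
    return bytes([sender_id, toy_id, 0x20, (tx_channel << 4) | rx_channel])
```
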
  • Fig. 8 A is a simplified flowchart illustration of a preferred method for receiving radio signals, executing commands comprised therein, and sending radio signals, within the toy control device 130 of Fig. 1A.
  • each message as described above comprises a command, which may include a command to process information also comprised in the message.
  • the method of Fig. 8A preferably comprises the following steps:
  • a synchronization signal or preamble is detected (step 400).
  • a header is detected (step 403).
  • a command contained in the signal is received (step 405).
  • the command contained in the signal is executed (step 410). Executing the command may be as described above with reference to Fig. 1 A.
  • a signal comprising a command intended for the computer radio interface 110 is sent (step 420).
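
Steps 400-420 above amount to a simple receive/execute/reply loop inside the toy control device 130. The following sketch restates that loop; the radio and command-execution primitives are hypothetical placeholders, not the patent's implementation.

```python
# Sketch of the receive/execute/reply loop of Fig. 8A (steps 400-420).
# The radio and command-execution objects are hypothetical placeholders.

def toy_control_loop(radio, executor):
    while True:
        if not radio.detect_preamble():      # step 400: synchronization signal
            continue
        if not radio.detect_header():        # step 403: message header
            continue
        command = radio.receive_command()    # step 405: command in the signal
        reply = executor.execute(command)    # step 410: e.g. move, play a sound
        if reply is not None:
            radio.send(reply)                # step 420: message for the computer
```
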
  • FIG. 8B - 8T which, taken together, comprise a simplified flowchart illustration of a preferred implementation of the method of Fig. 8A.
  • the method of Figs. 8B - 8T is self-explanatory.
  • Fig. 9 A is a simplified flowchart illustration of a preferred method for receiving MIDI signals, receiving radio signals, executing commands comprised therein, sending radio signals, and sending MIDI signals, within the computer radio interface 110 of Fig. 1A.
  • Fig. 9A also preferably comprises the following steps:
  • a MIDI command is received from the computer 100 (step 430).
  • a MIDI command is sent to the computer 100 (step 440).
  • the MIDI command may comprise a signal received from the toy control device 130, may comprise a response to a MIDI command previously received by the computer radio interface 110 from the computer 100, or may comprise a general command.
  • the command contained in the MIDI command or in the received signal is executed (step 450). Executing the command may comprise, in the case of a received signal, reporting the command to the computer 100, whereupon the computer 100 may typically carry out any appropriate action under program control as, for example, changing a screen display or taking any other appropriate action in response to the received command.
  • executing the command may comprise transmitting the command to the toy control device 130.
  • Executing a MIDI command may also comprise switching audio output of the computer radio interface 110 between the secondary audio interface 230 and the radio transceiver 260.
  • the secondary audio interface 230 is directly connected to the audio interface 220 preserving the connection between the computer sound board and the peripheral audio devices such as speakers, microphone and stereo system.
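
The flow of Fig. 9A can be summarised as a dispatch loop in the computer radio interface: MIDI commands from the computer are either executed locally (for example, switching the audio path between the secondary audio interface 230 and the radio transceiver 260) or forwarded by radio to the toy control device, while messages received by radio are reported back to the computer as MIDI. The sketch below is illustrative only; every object interface in it is a hypothetical placeholder.

```python
# Sketch of the dispatch logic of Fig. 9A. All object interfaces are hypothetical.

def interface_loop(midi, radio, audio_switch):
    while True:
        command = midi.poll()                         # step 430: MIDI command from the computer
        if command is not None:
            if command.is_audio_switch():
                audio_switch.route(command.target)    # secondary audio interface vs. radio
            else:
                radio.send(command)                   # forward to the toy control device

        received = radio.poll()
        if received is not None:
            midi.send(received)                       # step 440: report back to the computer
```
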
  • Reference is now made to Figs. 9B - 9N which, taken together with Figs. 8D - 8M, comprise a simplified flowchart illustration of a preferred implementation of the method of Fig. 9A.
  • Figs. 10A - 10C are simplified pictorial illustrations of a signal transmitted between the computer radio interface 110 and the toy control device 130 of Fig. 1A.
  • Fig. 10A comprises a synchronization preamble. The duration
  • T_SYNC of the synchronization preamble is preferably .500 millisecond, being preferably
  • Fig. 10B comprises a signal representing a bit with value 0
  • Fig. 10C comprises a signal representing a bit with value 1.
  • Figs. 10B and 10C refer to the case where the apparatus of Fig. 5D is used.
  • functionality corresponding to that depicted in Figs. 10B and 10C is provided within the apparatus of Fig. 5E.
  • each bit is assigned a predetermined duration T, which is the same for every bit.
  • a frequency modulated carrier is transmitted, using the method of frequency modulation keying as is well known in the art.
  • An "off" signal (typically less than 0.7 Volts) presented at termination 5 of U2 in Fig. 5D causes a transmission at a frequency below the median channel frequency.
  • An "on” signal (typically over 2.3 Volts) presented at pin 5 of U2 in Fig. 5D causes a transmission at a frequency above the median frequency.
  • These signals are received by the corresponding receiver U1.
  • Output signal from pin 6 of U1 is fed to the comparator 280 of Figs. 4 and 6 that is operative to determine whether the received signal is "off" or "on", respectively. It is also possible to use the comparator that is contained within U1 by connecting pin 7 of U1 of Fig. 5D, through pin 6 of the connector J1 of Fig. 5D, pin 6 of connector J1 of Fig.
  • Receipt of an off signal as shown in Fig. 10B, of duration greater than 0.40 * T is preferably taken to be a bit with value 0. Receipt of an on signal as shown in Fig. 10C, of duration greater than 0.40 * T is preferably taken to be a bit with value 1. Typically, T has a value
  • the sum of the durations of the on signal and the off signal must be between 0.90 T and 1.10 T for the bit to be considered valid. Otherwise, the bit is considered invalid and is ignored.
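
The bit-decoding rule just described, where an "on" portion longer than 0.40 * T yields a 1, an "off" portion longer than 0.40 * T yields a 0, and the bit is discarded unless the combined on and off durations fall between 0.90 * T and 1.10 * T, can be written directly as a small decision function. The sketch below is only an illustration of that rule, not the patent's implementation.

```python
# Sketch of the bit-validity rule described above: within each bit period T,
# classify the bit from the measured on/off durations and reject the bit when
# the total duration falls outside 0.90 * T .. 1.10 * T.

def decode_bit(on_duration, off_duration, T):
    total = on_duration + off_duration
    if not (0.90 * T <= total <= 1.10 * T):
        return None                                        # invalid bit, ignored
    if on_duration > 0.40 * T and on_duration >= off_duration:
        return 1                                           # long "on" portion -> bit value 1
    if off_duration > 0.40 * T:
        return 0                                           # long "off" portion -> bit value 0
    return None
```
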
  • Fig. 11 is a simplified flowchart illustration of a method for generating control instructions for the apparatus of Fig. 1A.
  • the method of Fig. 11 preferably includes the following steps:
  • a toy is selected (step 550). At least one command is selected, preferably from a plurality of commands associated with the selected toy (steps 560 - 580). Alternatively, a command may be entered by selecting, modifying, and creating a new binary command (step 585).
  • selecting a command in steps 560 - 580 may include choosing a command and specifying one or more control parameters associated with the command.
  • a control parameter may include, for example, a condition depending on a result of a previous command, the previous command being associated either with the selected toy or with another toy.
  • a control parameter may also include an execution condition governing execution of a command such as, for example: a condition stating that a specified output is to occur based on a status of the toy, that is, if and only if a specified input is received; a condition stating that the command is to be performed at a specified time; a condition stating that performance of the command is to cease at a specified time; a condition comprising a command modifier modifying execution of the command
  • command such as, for example, to terminate execution of the command in a case where execution of the command continues over a period of time; a condition dependent on the occurrence of a future event; or another condition.
  • the command may comprise a command to cancel a previous command.
  • the output of the method of Fig. 11 typically comprises one or more control instructions, typically stored in a command file. The command file is called from a driver program which typically determines which command is to be executed at a given point in time and then calls the command file associated with the given command.
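A minimal sketch of the command-file / driver-program arrangement described above, assuming a hypothetical binary record layout (toy id, command code, parameter); none of the file names or field widths come from the patent.

```python
# Illustrative sketch: a driver program selecting and "executing" command files
# generated by the method of Fig. 11. The binary format shown is hypothetical.

import struct
import time

def write_command_file(path: str, toy_id: int, command_code: int, parameter: int) -> None:
    """Write a single control instruction as a small binary record."""
    with open(path, "wb") as f:
        f.write(struct.pack("BBH", toy_id, command_code, parameter))

def run_driver(schedule):
    """schedule: list of (delay_seconds, command_file_path) pairs."""
    for delay, path in schedule:
        time.sleep(delay)                     # decide when a command is due
        with open(path, "rb") as f:
            toy_id, code, param = struct.unpack("BBH", f.read(4))
        print(f"sending command {code:#04x} (param={param}) to toy {toy_id}")

if __name__ == "__main__":
    write_command_file("wave_hand.cmd", toy_id=1, command_code=0x10, parameter=500)
    run_driver([(0.0, "wave_hand.cmd")])
```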
  • a user of the method of Fig. 11 performs steps 550 and 560 using a computer having a graphical user interface.
  • Figs. 12A - 12C are pictorial illustrations of a preferred embodiment of a graphical user interface implementation of the method of Fig. 11.
  • Fig. 12A comprises a toy selection area 600, comprising a plurality of toy selection icons 610, each depicting a toy.
  • the user of the graphical user interface of Figs. 12A - 12C typically selects one of the toy selection icons 610, indicating that a command is to be specified for the selected toy.
  • Fig. 12A also typically comprises action buttons 620, typically comprising one or more of the following: a button allowing the user, typically an expert user, to enter a direct binary command implementing an advanced or particularly complex command not otherwise available through the graphical user interface of Figs. 12A - 12C; a button allowing the user to install a new toy, thus adding a new toy selection
  • Fig. 12B depicts a command generator screen typically displayed after the user has selected one of the toy selection icons 610 of Fig. 12A.
  • Fig. 12B comprises an animation area 630, preferably comprising a depiction of the selected toy selection icon 610, and a text area 635 comprising text describing the selected toy.
  • Fig. 12B also comprises a plurality of command category buttons 640, each of which allows the user to select a category of commands such as, for example: output commands; input commands; audio in commands; audio out commands; and general commands.
  • Fig. 12B also comprises a cancel button 645 to cancel command selection and return to the screen of Fig. 12 A.
  • Fig. 12C comprises a command selection area 650, allowing the user to specify a specific command.
  • a wide variety of commands may be specified, and the commands shown in Fig. 12C are shown by way of example only.
  • Fig. 12C also comprises a file name area 655, in which the user may specify the name of the file which is to receive the generated control instructions.
  • Fig. 12C also comprises a cancel button 645, similar to the cancel button 645 of Fig. 12B.
  • Fig. 12C also comprises a make button 660. When the user actuates the make button 660, the control instruction generator of Fig. 11 generates control instructions implementing the chosen command for the chosen toy, and writes the control instructions to the specified file.
  • Fig. 12C also comprises a parameter selection area 665, in which the user may
  • a computer transmits the Availability Interrogation Command to verify that the radio channel is vacant. If another computer is already using this channel it will respond with the Availability Response Command. If no response is received within 250 msec the channel is deemed vacant.
  • Parameter: Computer address (00-03 H).
  • a computer transmits the Availability Response Command in response to an Availability Interrogation Command to announce that the radio channel is in use.
  • Parameter: Computer address (00-03 H).
  • a Toy transmits the Toy Availability Command to declare its existence and to receive in response a Channel Pair Selection Command designating the computer that will control it and the radio channels to use.
  • a computer transmits the Channel Pair Selection Command in response to a Toy Availability Command to inform the toy of the radio channels to be used.
  • Parameter: Computer address (00-03 H).
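The four control-channel commands described above could be modelled as in the following sketch; the command names follow the text, while the numeric codes and two-byte framing are illustrative assumptions.

```python
# Sketch of the four control-channel commands described above. The command
# names follow the text; the numeric codes and the framing are assumptions.

from dataclasses import dataclass
from enum import IntEnum

class Command(IntEnum):
    AVAILABILITY_INTERROGATION = 0x01  # computer asks: is this channel vacant?
    AVAILABILITY_RESPONSE = 0x02       # another computer answers: channel in use
    TOY_AVAILABILITY = 0x03            # toy declares its existence
    CHANNEL_PAIR_SELECTION = 0x04      # computer assigns channels to a toy

@dataclass
class ControlMessage:
    command: Command
    computer_address: int  # 0x00 - 0x03, per the parameter noted above

    def encode(self) -> bytes:
        return bytes([self.command, self.computer_address])

    @classmethod
    def decode(cls, raw: bytes) -> "ControlMessage":
        return cls(Command(raw[0]), raw[1])

if __name__ == "__main__":
    msg = ControlMessage(Command.AVAILABILITY_INTERROGATION, computer_address=0x02)
    print(ControlMessage.decode(msg.encode()))
```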
  • In Figs. 13 and 14 there are illustrated block diagrams of a multi-port, multi-channel implementation of the computer radio interface 110 of Fig. 1A.
  • Fig. 13 illustrates the processing sub-unit of the computer interface that is implemented as an add-in board installed inside a PC.
  • Fig. 14 illustrates the RF transceiver, which is a device external to the computer and connects to the processing sub-unit by means of a cable.
  • In the RF unit there are 4 transceivers, each capable of utilizing two radio channels simultaneously.
  • both sound and control commands may be transmitted via the MIDI connector 210 rather than transmitting sound commands via the analog connector 220.
  • the functions of the interfaces 210 and 220 between the computer radio interface 110 and the sound card 190 may, alternatively, be implemented as connections between the computer radio interface 110 to the serial and/or parallel ports of the computer 100, as shown in Figs. 25A - 25F.
  • each transceiver 260 which forms part of the computer radio interface 110 of Fig. 1A preferably is operative to transmit on a first channel and to receive on a second channel, whereas the transceiver 260 (Fig. 4) which forms part of the toy control device 130 of Fig. 1A preferably is operative to transmit on the second channel and to receive on the first channel.
  • any suitable technology may be employed to define at least two channel pairs such as narrow band technology or spread spectrum technologies such as frequency hopping technology or direct sequence technology, as illustrated in Figs. 15A - 15E, showing a Multi-Channel Computer Radio Interface, and in Figs. 24A - 24E, showing a Multi-Channel Toy Control Device.
  • Fig. 16 is a simplified flowchart illustration of a preferred method of operation of a computer radio interface (CRI) 110 operative to service an individual computer 100 of Fig. 1A without interfering with other computers or being interfered with by the other computers, each of which is similarly serviced by a similar CRI.
  • the method of Fig. 16 is implemented in software on the computer 100 of Fig. 1A.
  • the CRI includes a conventional radio transceiver (260 of Fig. 4) which may, for example, comprise an RY3 GB021 having 40 channels which are divided into 20 pairs of channels. Typically, 16 of the channel pairs are assigned to information communication and the remaining 4 channel pairs are designated as control channels.
  • one of the 4 control channel pairs is selected by the radio interface (step 810) as described in detail below in Fig. 17.
  • the selected control channel pair i is monitored by a first transceiver (step 820) to detect the appearance of a new toy which is signaled by arrival of a toy availability command from the new toy (step 816).
  • an information communication channel pair is selected (step 830) from among the 16 such channel pairs provided over which game program information will be transmitted to the new toy.
  • a preferred method for implementing step 830 is illustrated in self-explanatory flowchart Fig. 18 A.
  • the "Locate Computer" command in Fig. 18A (step 1004) is illustrated in the flowchart of Fig. 18B.
  • the identity of the selected information communication channel pair, also termed the channel pair selection command, is sent over the control channel pair to the new toy (step 840). A game program is then begun (step 850), using the selected information communication channel pair.
  • the control channel pair is then free to receive and act upon a toy availability command received from another toy. Therefore, it is desirable to assign another transceiver to that control channel pair since the current transceiver is now being used to provide
  • a transceiver availability table is then scanned until an available transceiver, i.e. a transceiver which is not marked as busy, is identified (step 854). This transceiver is then assigned to the control channel i (step 858).
  • Fig. 17 is a simplified flowchart illustration of a preferred method for implementing "select control channel pair" step 810 of Fig. 16.
  • the four control channels are scanned.
  • the computer sends an availability interrogation command (step 910) and waits for a predetermined time period, such as 250 ms, for a response (steps 930 and 940). If no other computer responds, i.e. sends back an "availability response command", then the channel pair is deemed vacant. If the channel pair is found to be occupied the next channel is scanned. If none of the four channel pairs are found to be vacant, a "no control channel available" message is returned.
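A sketch of this scan, assuming hypothetical radio helpers send_interrogation() and wait_for_response(); only the four-channel loop and the 250 ms response window come from the text.

```python
# Illustrative sketch of the "select control channel pair" scan of Fig. 17.
# send_interrogation() and wait_for_response() stand in for the radio layer
# and are assumptions, not part of the patent.

import time

TIMEOUT_S = 0.250  # 250 ms response window, as described above

def select_control_channel_pair(send_interrogation, wait_for_response,
                                control_channel_pairs=range(4)):
    for channel in control_channel_pairs:
        send_interrogation(channel)                # availability interrogation command
        deadline = time.monotonic() + TIMEOUT_S
        occupied = False
        while time.monotonic() < deadline:
            if wait_for_response(channel):         # availability response heard
                occupied = True
                break
        if not occupied:
            return channel                         # vacant pair found
    raise RuntimeError("no control channel available")

if __name__ == "__main__":
    # Demo: channels 0 and 1 are occupied, channel 2 is vacant.
    busy = {0, 1}
    print(select_control_channel_pair(
        send_interrogation=lambda ch: None,
        wait_for_response=lambda ch: ch in busy))
```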
  • Fig. 19 is a self-explanatory flowchart illustration of a preferred method of operation of the toy control device 130 which is useful in conjunction with the "multi-channel"
  • the toy control device sends a "toy availability command" (step 1160), which is a message advertising the toy's availability, on each control channel i in turn (steps 1140, 1150, 1210), until a control channel is reached which is being monitored by a computer.
  • the computer responds (step 1180) by transmitting a "channel pair selection command" which is a message designating the information channel pair over which the toy control device may
  • the device may begin receiving and executing game commands which the computer transmits over the
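The toy-side counterpart of Fig. 19 might look like the following sketch; the radio helper callables are assumptions, and only the advertise-then-wait loop reflects the text.

```python
# Sketch of the toy-side pairing loop of Fig. 19: the toy control device
# advertises itself on each control channel in turn until a computer replies
# with a channel pair selection command. The radio helpers are assumptions.

def pair_with_computer(send_toy_availability, receive_selection, num_control_channels=4):
    while True:
        for channel in range(num_control_channels):
            send_toy_availability(channel)            # "toy availability command"
            selection = receive_selection(channel)    # wait briefly for a reply
            if selection is not None:
                return selection                      # info channel pair for game commands

if __name__ == "__main__":
    # Demo: only a computer monitoring control channel 3 answers.
    print(pair_with_computer(
        send_toy_availability=lambda ch: None,
        receive_selection=lambda ch: ("info-pair-7" if ch == 3 else None)))
```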
  • a computer system in communication with a remote game server, as shown in Fig. 20.
  • the remote game server 1250 is operative to serve to the computer 100 at least a portion of at least one toy- operating game, which operates one or more toys 1260.
  • an entire game may be downloaded from the remote game server 1250.
  • a new toy action script or new text files may be downloaded from the remote game server 1250 whereas the remaining components of a particular game may already be present in the memory of computer 100.
  • Downloading from the remote game server 1250 to the computer 100 may take place either off-line, before the game begins, or on-line, in the course of the game. Alternatively, a first portion of the game may be received off-line whereas an additional portion of the game is received on-line.
  • the communication between the remote game server 1250 and the computer 100 may be based on any suitable technology such as but not limited to ISDN; X.25; Frame-Relay; and Internet.
  • An advantage of the embodiment of Fig. 20 is that a very simple computerized
  • Fig. 21 is a simplified flowchart illustration of the operation of the computer 100 or of the network computer 1260 of Fig. 20, when operating in conjunction with the remote game server 1250.
  • Fig. 22 is a simplified flowchart illustration of the operation of the remote game server 1250 of Fig. 20.
  • Fig. 23 is a semi-pictorial semi-block diagram illustration of a wireless computer controlled toy system including a toy 1500 having a toy control device 1504, a computer 1510 communicating with the toy control device 1504 by means of a computer-radio interface 1514 and a proximity detection subsystem operative to detect proximity between the toy and the computer.
  • the proximity detection subsystem may for example include a pair of ultrasound transducers 1520 and 1530 associated with the toy and computer respectively.
  • the toy's ultrasound transducer 1520 typically broadcasts ultrasonic signals which the computer's ultrasound transducer 1530 detects if the computer and toy are within ultrasonic communication range, e.g. are in the same room.
  • FIGS. 25A - 25F taken together, form a detailed schematic illustration of a computer radio interface which connects to a serial port of a computer rather than to the sound board of the computer.
  • FIGS. 26A - 26D taken together, form a detailed schematic illustration of a computer radio interface which connects to a parallel port of a computer rather than to the sound board of the computer.
  • Figs. 27A - 27J are preferred self-explanatory flowchart illustrations of a preferred radio coding technique, based on the Manchester coding, which is an alternative to the radio coding technique described above with reference to Figs. 8E, 8G - 8M and 10A - C.
  • Figs. 28A - 28K taken together, form a detailed electronic schematic diagram of
  • Figs. 29A - 29I taken together, form a detailed electronic schematic diagram of the multi-port multi-channel computer radio interface sub-unit of Fig. 14.
  • Fig. 30 illustrates a further embodiment of the present invention which includes a combination of a Computer Radio Interface (CRI) and a Toy Control Device (TCD), 1610.
  • the combined unit 1610 controls a toy 1620 which is connected to the computer
  • the toy 1620 is operated in a similar manner as the toy device 120.
  • Fig. 31 illustrates a simplified block diagram of the combined unit 1610.
  • The code to program the EP900 EPLD chip (U9) of Fig. 28H for this schematic diagram preferably uses the programming package "Max Plus II Ver. 6.2" available from Altera Corporation, 3525 Monroe Street, Santa Clara, CA 95051, USA.
  • Figs. 33 - 53, illustrated hereinbelow, illustrate various embodiments of the toy system of Figs. 1 - 32C.
  • Fig. 33 is a pictorial illustration of personified household appliances associated with a central computer 2100 by means of two-way wireless communication, typically via a
  • Suitable computer control provides a wide variety of personified household appliances such as but not limited to the following embodiments described in detail below: a refrigerator 2122 operating as a diet-mate, a microwave oven 2124 simulating a celebrity and a washing machine 2126
  • Each transceiver/controller 2130 is preferably associated with a suitable combination of electro-mechanical accessories such as microphones 2142, speakers 2144, switches 2146 and solenoids 2148.
  • the presence of a user in a particular room is sensed and audio messages to the user from any one of the appliances throughout the house are provided by an appliance in that room, via the computer 2100.
  • Any suitable method may be used to sense the user's presence in a particular room such as voice recognition, volume detection, or bi-directional paging.
  • the user may optionally attach a conventional personal locator, e.g. an IR badge, to his clothing or body such that his or her adjacency to a particular appliance may be sensed and the loudspeaker of that appliance may then be used to convey to the user information regarding that appliance or any other appliance in the household.
  • audio messages are broadcast to the user from all appliances in the house. It is appreciated that providing a loudspeaker in association with appliances is cost-effective relative to providing separate loudspeakers mounted in rooms and not in association with appliances.
  • Each such loudspeaker would need to be associated via wire with the central computer 2100 or alternatively would need to be associated with a transceiver/controller.
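One way to picture the routing of audio messages to the appliance nearest the user is sketched below; the class and method names are invented, and the presence-sensing mechanism (voice, volume detection, bi-directional paging or an IR badge) is abstracted into a single callback.

```python
# Illustrative sketch of routing an audio message from any appliance to the
# loudspeaker nearest the user, based on which appliance last sensed the user.

class HouseholdAudioRouter:
    def __init__(self, appliances):
        self.appliances = appliances          # name -> loudspeaker callable
        self.last_seen_near = None            # appliance that last sensed the user

    def user_sensed_near(self, appliance_name):
        self.last_seen_near = appliance_name

    def announce(self, source_appliance, message):
        target = self.last_seen_near or source_appliance
        self.appliances[target](f"[{source_appliance}] {message}")

if __name__ == "__main__":
    router = HouseholdAudioRouter({
        "washing machine": lambda m: print("laundry room speaker:", m),
        "microwave oven": lambda m: print("kitchen speaker:", m),
    })
    router.user_sensed_near("microwave oven")
    router.announce("washing machine", "The wash cycle has finished.")
```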
  • Communication between appliances and central computer 2100 as well as, preferably, communication between the appliances themselves may be either wireless (Fig. 33) or custom-wired (Fig. 34) or by means of PLC (power line carrier), i.e. the household's existing electrical wiring system.
  • Suitable systems include the PLC2.5-4.0/20W system, manufactured by Intracom SA, Greece, and the PLC system ACE32, manufactured by Neva, Norway.
  • Wireless communication may be implemented by means of any suitable technology such as radio wave technology or infra-red wave technology.
  • the radio system of the present invention is based on the following: Spread Spectrum Transceivers, AIC9001 with IRF9085DS, available from ALFA INCORPORATED, No. 15-1, Industry East Road IV, Science-Based Industrial Park, Hsinchu, Taiwan, R.O.C.
  • Preferred implementations of computer radio interface 2110, some of which are operative in conjunction with an associated audio card (sound board) installed in computer 2100, are described hereinabove with reference to Figs. 1 - 32C.
  • Preferred implementations of transceiver/controller 2130 are also described hereinabove with reference to Figs. 1 - 32C. It is appreciated that many modifications of the apparatus shown and described hereinabove with reference to Figs. 1 - 32C are possible. For example, in the embodiment of Figs. 6 and 7A - 7F and Figs. 15A - 15E, the motor 150 and the tilt switch 140 may be omitted.
  • the game systems and toy system described herein can be implemented by using a home's appliances as games or causing the home appliances to function as toys.
  • the central computer is programmed to sense the presence of appliances and to adapt the game program so as to incorporate into the game a newly sensed appliance.
  • Fig. 34 is a pictorial illustration of a modification of the apparatus of Fig. 33 in which a first appliance 2126 is associated by means of a wire with the computer 2100 and the other appliances 2122, 2124 communicate with the computer 2100 via the first appliance 2126, communication between the first appliance 2126 and the other appliances 2122, 2124 being wireless.
  • a first household appliance such as a washing machine 2126
  • a computer 2100 serves as a transceiver for transmitting commands
  • the computer radio interface 2110 provides celebrity simulation
  • Combined unit 2150 is in turn associated by means of a wire 2152 with the computer 2100.
  • a preferred implementation for combined unit 2150 is described hereinabove with reference to Fig. 31.
  • the microwave oven 2124 may, for example, simulate a celebrity in the sense that voice messages pertaining to the microwave oven are given in the voice of a particular celebrity (Fig. 45).
  • a particular advantage of giving voice messages pertaining to different appliances in the voices of different celebrities is that users of the appliances can learn to distinguish between messages "coming from" different appliances. This facilitates use of the appliances since an appliance delivering a voice message does not need to identify itself. Also, use of distinctive voices and mannerisms for different appliances enhances the perceived "personalities" of the various appliances. It is believed that granting a personality to an appliance can have a psychological effect on a user thereof, such as development of affection for the appliance, increased willingness to operate (interact with) the appliance and increased willingness to abide by
  • an appliance simulating a particular celebrity is designed to resemble the celebrity.
  • a microwave oven simulating Pooh Bear might be green and have a Pooh Bear doll mounted thereupon.
  • appliances may simulate a celebrity.
  • appliances also preferably provide status updates to the computer 2100 via the computer radio interface 2110 (Fig. 45).
  • the status update signals provided to the computer are augmented by oral status updates which are audible to users of the appliances.
  • Fig. 35 is a pictorial illustration of personified household appliances associated with a central computer 2100 by means of an existing electrical household wiring system 2128.
  • Figs. 36A - 36C taken together, form a simplified flowchart illustration of a preferred method by which the computer controls the transceiver/controller 2130 of the refrigerator 2122 of Fig. 33.
  • the method of Figs. 36A - 36C enables the refrigerator to function as a "diet-mate", in a first supportive mode or in a second, aggressive mode.
  • the mode (supportive or aggressive) may be preset by the user either initially or in the course of using the appliance (step 2350 of Fig. 36B) or may be randomly selected by the computer or may be programmed by the user such that different modes are used at different times of day or under different circumstances.
  • the mode may be set by the computer conditionally, depending on the behavior of the user.
  • the supportive mode is active until the dieter has exceeded his daily calorie count at which point the aggressive mode becomes active.
  • the computer performs a voice signature matching process to determine whether the user is an impostor, i.e. whether or not the user's name as supplied matches his voice (step 2326).
  • Conventional voice signature matching methods may be employed for this purpose (step 2326).
  • the refrigerator preferably includes a microswitch 2146 (Fig. 33) which detects opening of the refrigerator door (step 2318).
  • the refrigerator's loudspeaker 2144 prompts the person opening the refrigerator to identify himself (step 2322). If the user does not identify himself within a predetermined time period, the loudspeaker emits a message pleasantly urging or aggressively insisting that the user identify himself.
  • the refrigerator's loudspeaker 2144 then asks the person to identify the food product or products he is removing from the refrigerator (step 2354).
  • the user's utterance is then recognized by a conventional speaker-dependent or speaker-independent speech recognition unit of the computer 2100 (steps 2358 to 2368).
  • the loudspeaker emits a message pleasantly urging the user to do so or aggressively insisting that the user do so (step 2370).
  • the calories corresponding to a typical portion of the food substance being removed are added to the sum total of calories consumed that day by the identified person (step 2380).
  • the loudspeaker preferably emits a randomly selected message politely requesting the user not to eat the foodstuffs in question or, in aggressive mode, blasting the user for eating the foodstuffs and demanding that s/he cease (step 2386).
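A sketch of the diet-mate bookkeeping described in the preceding bullets; the calorie table, daily limit and message wording are illustrative assumptions, and the mode may be preset or switched as described above.

```python
# Illustrative sketch of the "diet-mate" logic of Figs. 36A - 36C: calories for
# each removed food item are accumulated per user, and the tone of the over-limit
# message depends on whether the supportive or the aggressive mode is active.

CALORIES = {"cheese": 110, "apple": 52, "chocolate cake": 350}  # illustrative values

class DietMate:
    def __init__(self, daily_limit, mode="supportive"):
        self.daily_limit = daily_limit
        self.mode = mode
        self.consumed = {}            # user name -> calories consumed today

    def food_removed(self, user, food):
        total = self.consumed.get(user, 0) + CALORIES.get(food, 100)
        self.consumed[user] = total
        if total <= self.daily_limit:
            return f"Enjoy your {food}, {user}. {self.daily_limit - total} calories left today."
        if self.mode == "supportive":
            return f"{user}, that {food} puts you over today's limit - maybe put it back?"
        return f"{user}! Put the {food} back right now!"

if __name__ == "__main__":
    fridge = DietMate(daily_limit=400, mode="supportive")
    print(fridge.food_removed("Dana", "cheese"))
    print(fridge.food_removed("Dana", "chocolate cake"))
    fridge.mode = "aggressive"        # e.g. once the daily count has been exceeded
    print(fridge.food_removed("Dana", "apple"))
```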
  • the above message may be provided if the food substance is in and of itself forbidden to the user, e.g. in accordance with a list of forbidden food items input in advance
  • the apparatus does not prevent the dieter from opening the refrigerator.
  • solenoid 2148 of Fig. 33 is employed to prevent
  • Fig. 37 is an example of a portion of a user interface by which the computer is set up by a user of an appliance, such as washing machine 2126 of Fig. 33, to provide the user of the appliance with infotainment or entertainment.
  • each preference statement includes an activity, a time period and a sequence of preferences, such as only two preferences in the present embodiment.
  • priorities are assigned by the system to the various preference statements. For example, preference statements stipulating a specific activity may take precedence over preference statements for "all" activities. Then, preference statements stipulating a narrow time period take precedence over preference statements stipulating a broad time period which includes the narrow time period.
  • preference statement 2410 has the highest priority for the laundry activity. Consequently, the user, when doing laundry, will be exposed to the weather report and, once it is over or if the user rejects the weather report, the system exposes the user to humor selections. If the user rejects the humor selections, then radio channel 2 is played to the user if the time is between 10:00 hours and 10:30 hours because the preference statement with the next highest priority is preference statement 2450. Otherwise, or if radio channel 2 was rejected by the user, chamber music is played to the user, assuming that it is
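The priority rule of the preceding bullets (a specific activity outranks "all" activities, and a narrow time period outranks a broad one) can be sketched as follows; the field names and the tuple-based ordering are assumptions.

```python
# Sketch of the preference-statement priority scheme described above.

from dataclasses import dataclass

@dataclass
class PreferenceStatement:
    activity: str           # e.g. "laundry" or "all"
    start_hour: float
    end_hour: float
    preferences: list       # ordered content choices, e.g. ["weather report", "humor"]

def priority_key(stmt: PreferenceStatement):
    specific_activity = 0 if stmt.activity != "all" else 1
    time_span = stmt.end_hour - stmt.start_hour
    return (specific_activity, time_span)     # lower tuple = higher priority

def applicable(stmt, activity, hour):
    return (stmt.activity in (activity, "all")) and (stmt.start_hour <= hour < stmt.end_hour)

def ordered_preferences(statements, activity, hour):
    hits = sorted((s for s in statements if applicable(s, activity, hour)), key=priority_key)
    return [p for s in hits for p in s.preferences]

if __name__ == "__main__":
    statements = [
        PreferenceStatement("laundry", 0, 24, ["weather report", "humor selections"]),
        PreferenceStatement("all", 10, 10.5, ["radio channel 2"]),
        PreferenceStatement("all", 0, 24, ["chamber music"]),
    ]
    # Doing laundry at 10:15 reproduces the ordering described in the text.
    print(ordered_preferences(statements, activity="laundry", hour=10.25))
```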
  • FIGs. 38A - 38B taken together, form a simplified flowchart illustration of a preferred method of operation by which the computer 2100 provides the user of a washing machine with infotainment or entertainment.
  • Fig. 38A illustrates the process of identifying the homemaker. After the user voice recognition procedure has been successfully completed (steps 2460 to 2510), the system then proceeds to present to the user the personal entertainment using his or her personal preference record as illustrated in Fig. 38B.
  • the system presents the user with his top preference (step 2515) and stands by to receive a rejection message from the user, typically orally. If a rejection message is received, the system presents the user with his next preference.
  • the system preferably advances to the next lower priority preference statement, according to a priority scheme between preference statements which may be either system-defined or user-defined (steps 2520 to 2522).
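A minimal sketch of the present-and-reject loop of Fig. 38B, assuming a prioritized list such as the one produced above and a stand-in for the speech-recognized rejection message.

```python
# Sketch of the presentation loop: the highest-priority item is offered first,
# and an oral rejection moves the system to the next item. user_rejects() stands
# in for the speech-recognition step and is an assumption.

def entertain(prioritized_items, play, user_rejects):
    for item in prioritized_items:
        play(item)
        if not user_rejects(item):
            return item          # the user accepted this item
    return None                  # everything was rejected

if __name__ == "__main__":
    rejected = {"weather report"}
    chosen = entertain(
        ["weather report", "humor selections", "radio channel 2"],
        play=lambda item: print("offering:", item),
        user_rejects=lambda item: item in rejected)
    print("playing:", chosen)
```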
  • the computer is operative to terminate provision of entertainment by the washing machine if the computer is informed that another appliance has detected the presence of the homemaker, indicating that the homemaker has left the vicinity of the washing machine.
  • Fig. 39 is a simplified diagram of the interface between computer radio interface 2110 and a soundboard 2600 of the computer 2100.
  • the apparatus of Fig. 39 is a modification of the apparatus of Fig. 3 except that the MIDI connectors are omitted, such that the apparatus of Fig. 39 is useful in conjunction with sound-boards or computers which lack MIDI connectors.
  • Fig. 40 is a simplified block diagram of computer radio interface 2110.
  • Fig. 40 is a modification of the apparatus of Fig. 4 except that the MIDI connectors are omitted, such that the apparatus of Fig. 40 is useful in conjunction with sound-boards or computers which lack MIDI connectors.
  • Fig. 41 is a simplified flowchart illustration of a preferred communication method allowing one of the computer radio interface 2110 and the computer 2100 to receive commands over the audio channel, rather than over the MIDI channel, from the other one of the computer radio interface 2110 and the computer 2100.
  • the method of Fig. 41 first detects whether an audio signal is currently arriving (step 2660) and if so, detects whether the audio signal comprises audio information (i.e. comprises the contents of an utterance which one of the appliances' speakers is supposed to emit) or a command. This is preferably effected by detecting whether or not a command-characterizing preamble has been received (step 2670).
  • the command-characterizing preamble typically comprises SYNC followed by SQ signals as described in detail below with reference to Fig. 42.
  • Fig. 42 is a diagram of analog and digital representations 2700 and 2710 respectively of the following signals: SYNC, SQ, zero-valued bit and one-valued bit.
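A sketch of the decision of Fig. 41, in which a frame arriving on the audio channel is treated as a command only if it begins with the SYNC/SQ preamble; the byte values chosen for SYNC and SQ below are assumptions, not the encoding of Fig. 42.

```python
# Illustrative sketch: classify an audio-channel frame as a command or as
# ordinary audio, based on a command-characterizing preamble.

SYNC = b"\xAA"   # assumed placeholder value
SQ = b"\x55"     # assumed placeholder value
PREAMBLE = SYNC + SQ

def classify_audio_frame(frame: bytes):
    if frame.startswith(PREAMBLE):
        return ("command", frame[len(PREAMBLE):])   # command bits follow the preamble
    return ("audio", frame)

if __name__ == "__main__":
    print(classify_audio_frame(PREAMBLE + b"\x01\x02"))   # ('command', b'\x01\x02')
    print(classify_audio_frame(b"speech samples..."))     # ('audio', ...)
```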
  • more than one audio channel connects the sound board 2600 and the computer radio interface 2110.
  • a first audio channel transmits audio signals from the sound board 2600 to the computer radio interface and a second audio channel transmits audio signals in the opposite direction.
  • Figs. 43 A - 43E taken together, form a detailed electronic schematic diagram of a preferred implementation of the apparatus of Fig. 40.
  • Fig. 44 is an example of a dialogue between a personified microwave oven and a personified dishwasher culminating in a verbal message emitted by a television.
  • the dialog is managed by the computer 2100.
  • the tone of voice of the personified microwave oven and of the personified dishwasher differ to allow listeners to differentiate therebetween.
  • the microwave oven's voice may simulate Paul Newman's voice and the dishwasher's voice may simulate Vivian Leigh's voice.
  • simulations of a celebrity's voice may be provided by pre-recording sentences, phrases or words produced by a human model mimicking the celebrity or by the celebrity herself.
  • text-to-speech systems exist which convert text to oral speech having a variety of characteristics such that
  • Fig. 45 is a simplified flowchart illustration of a method of operation for a central computer according to a first preferred implementation of an inter-appliance dialogue such as the dialog of Fig. 44.
  • a central computer controls all appliances and simulates, for the amusement of a user, a dialogue therebetween.
  • the computer maintains a state machine representing the possible states that the household appliances may be in and/or a plurality of separate state machines representing the possible states that each of a plurality of household appliances respectively may be in.
  • Each state defines at least one condition, each condition comprising a logical combination of events triggering a transition to another state and/or an action to be performed by one or other of the appliances.
  • the events may include at least one of the following types of events: Counter events, timer events, events in which an input such as speech is received from a user, events in which a change in the environment is detected, events defined in terms of computations carried out by the computer, etc.
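The state-machine arrangement described above might be sketched as follows; the states, conditions and utterances are invented for illustration and do not come from the scripts of Figs. 48 - 53.

```python
# Illustrative sketch: the central computer holds a state machine per appliance;
# each state lists conditions (combinations of events) that trigger an action
# and/or a transition to another state.

class ApplianceStateMachine:
    def __init__(self, name, transitions, initial_state):
        self.name = name
        self.transitions = transitions    # state -> list of (condition, action, next_state)
        self.state = initial_state

    def handle(self, events):
        for condition, action, next_state in self.transitions.get(self.state, []):
            if condition(events):
                if action:
                    print(f"{self.name}: {action}")
                self.state = next_state
                return

if __name__ == "__main__":
    microwave = ApplianceStateMachine(
        "microwave oven",
        transitions={
            "idle": [(lambda ev: "timer_done" in ev,
                      "Dinner is ready - shall I tell the dishwasher to expect dishes?",
                      "waiting_for_reply")],
            "waiting_for_reply": [(lambda ev: "user_said_yes" in ev,
                                   "Dishwasher, please get ready.",
                                   "idle")],
        },
        initial_state="idle")
    microwave.handle({"timer_done"})
    microwave.handle({"user_said_yes"})
```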
  • Fig. 46 is a flowchart illustration of a method of operation for an appliance according to a second preferred implementation of the dialogue of Fig. 45 in which each appliance is intelligent and preferably has speech recognition capabilities. Therefore, the dialogue is a real dialogue rather than a simulation of a dialogue which is in fact generated by the central computer.
  • Fig. 47 is a pictorial illustration of a central computer 2100 accumulating information from users, via microphone bearing appliances, regarding consumable supplies to be replenished.
  • the central computer 2100 also accumulates messages from the appliances themselves, e.g. indications of the number of times they have operated, which impacts on the supply of consumables for that appliance.
  • the central computer then generates a shopping list,
  • the central computer receives an indication of each operation of the washing machine and accumulates these indications. After a predetermined number of operations, the computer may add laundry powder to the list.
  • the computer automatically generates a shopping list, sends it to the store e.g. electronically, and zeroes all consumable supply counters.
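A sketch of the consumable bookkeeping described above: operation counters per item, voice requests from users, and zeroing of all counters once the list is sent; the thresholds and item names are illustrative.

```python
# Illustrative sketch of shopping-list accumulation by the central computer.

class ShoppingListManager:
    def __init__(self, thresholds):
        self.thresholds = thresholds          # item -> operations per purchase
        self.counters = {item: 0 for item in thresholds}
        self.user_requests = set()            # items users asked for by voice

    def appliance_operated(self, item):
        self.counters[item] += 1

    def user_requested(self, item):
        self.user_requests.add(item)

    def build_and_send_list(self, send):
        needed = {item for item, count in self.counters.items()
                  if count >= self.thresholds[item]} | self.user_requests
        if needed:
            send(sorted(needed))
            self.counters = {item: 0 for item in self.thresholds}  # zero all counters
            self.user_requests.clear()

if __name__ == "__main__":
    manager = ShoppingListManager({"laundry powder": 3})
    for _ in range(3):
        manager.appliance_operated("laundry powder")
    manager.user_requested("milk")
    manager.build_and_send_list(send=lambda items: print("shopping list:", items))
```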
  • oral messages regarding a particular task such as shopping are provided in the tone of voice of a celebrity.
  • Fig. 48 is a script for the flowchart of Fig. 45 by which the computer 2100
  • Fig. 49 is a script for the flowchart of Fig. 45 by which the computer 2100 implements the washing machine's role in the interaction of Fig. 47.
  • Fig. 50 is a pictorial illustration of a scenario in which a central computer is accumulating information regarding household chore monitoring and timing. This information may be provided by users, via sensor- bearing appliances, and/or by the appliances themselves. For example, in the illustrated scenario, the washing machine and dryer notify the homemaker, via the central computer, via the amplifier of the microwave oven, that washing and drying operations have both terminated. This facilitates coordination of household tasks since the home-maker is not only alerted to go downstairs and deal with the laundry but is alerted only once allowing him or her to go downstairs only once and deal with both the washer and the dryer.
  • the messages to the home-maker are coordinated such that, for example, the alert that two laundry tasks can now be performed (communication F) is not provided in the midst of a food preparation task (microwave oven operation), even though the information is available to the computer. Instead, the alert regarding the two laundry tasks is
  • Fig. 51 is a script for the flowchart of Fig. 45 by which the computer 2100 implements the washing machine's role in the interaction of Fig. 50.
  • Fig. 52 is a script for the flowchart of Fig. 45 by which the computer 2100
  • Fig. 53 is a script for the flowchart of Fig. 45 by which the computer 2100
  • Each appliance may have substantial computing power and therefore have independent intelligence, rather than being a "slave" to a central computer providing intelligence to a plurality of household appliances.
  • Intelligence may be provided to "slave" appliances by a "master" appliance associated with each of the "slave" appliances either wirelessly or by means of wire.
  • the software components of the present invention may, if desired, be implemented in ROM (read-only memory) form.
  • the software components may, generally, be implemented in hardware, if desired, using conventional techniques. It is appreciated that various features of the invention which are, for clarity,

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Toys (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The invention concerns a wireless computer controlled household appliance system (Fig. 33, 2100) comprising a first wireless transmitter (2110) and a first wireless receiver (211), operative to transmit a first transmission via the first wireless transmitter, as well as at least one household appliance (2122, 2124 and 2126) having a second wireless transmitter (2130) and a second wireless receiver (2130), the household appliance (2122, 2124, 2126) receiving the first transmission via the second wireless receiver and performing at least one action on the basis of the first transmission, the household appliance (2122, 2124, 2126) being operative to transmit a second transmission via the second wireless transmitter and the computer system (2110) being operative to receive the second transmission via the first wireless receiver.
EP98920723A 1997-05-19 1998-05-19 Dispositif et procedes de commande d'appareils electromenagers Withdrawn EP1021808A1 (fr)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
IL12085597 1997-05-19
IL12085597A IL120855A0 (en) 1997-05-19 1997-05-19 Apparatus and methods for controlling household appliances
US09/062,502 US20020005787A1 (en) 1997-05-19 1998-04-17 Apparatus and methods for controlling household appliances
US62502 1998-04-17
PCT/IL1998/000223 WO1998053456A1 (fr) 1997-05-19 1998-05-19 Dispositif et procedes de commande d'appareils electromenagers

Publications (1)

Publication Number Publication Date
EP1021808A1 true EP1021808A1 (fr) 2000-07-26

Family

ID=26323426

Family Applications (1)

Application Number Title Priority Date Filing Date
EP98920723A Withdrawn EP1021808A1 (fr) 1997-05-19 1998-05-19 Dispositif et procedes de commande d'appareils electromenagers

Country Status (5)

Country Link
EP (1) EP1021808A1 (fr)
JP (1) JP2002512757A (fr)
AU (1) AU7349898A (fr)
CA (1) CA2290348A1 (fr)
WO (1) WO1998053456A1 (fr)

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001092562A (ja) * 1999-09-20 2001-04-06 Toshiba Corp 装着型情報処理装置および制御方法
US8412377B2 (en) 2000-01-24 2013-04-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US6956348B2 (en) 2004-01-28 2005-10-18 Irobot Corporation Debris sensor for cleaning apparatus
US7277765B1 (en) 2000-10-12 2007-10-02 Bose Corporation Interactive sound reproducing
CN100393093C (zh) * 2000-12-25 2008-06-04 广东科龙电器股份有限公司 基于无线应用协议通讯的智能家居控制方法
US6690134B1 (en) 2001-01-24 2004-02-10 Irobot Corporation Method and system for robot localization and confinement
US7571511B2 (en) 2002-01-03 2009-08-11 Irobot Corporation Autonomous floor-cleaning robot
US7663333B2 (en) 2001-06-12 2010-02-16 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US9128486B2 (en) 2002-01-24 2015-09-08 Irobot Corporation Navigational control system for a robotic device
US8428778B2 (en) 2002-09-13 2013-04-23 Irobot Corporation Navigational control system for a robotic device
US8386081B2 (en) 2002-09-13 2013-02-26 Irobot Corporation Navigational control system for a robotic device
US7170994B2 (en) 2003-10-15 2007-01-30 Motorola, Inc. Method and apparatus for selecting an alert mode based on user biometrics
US7332890B2 (en) 2004-01-21 2008-02-19 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
DE112005000738T5 (de) 2004-03-29 2007-04-26 Evolution Robotics, Inc., Pasadena Verfahren und Vorrichtung zur Positionsbestimmung unter Verwendung von reflektierten Lichtquellen
ATE536577T1 (de) 2004-06-24 2011-12-15 Irobot Corp Fernbediente ablaufsteuerung und verfahren für eine autonome robotervorrichtung
US8972052B2 (en) 2004-07-07 2015-03-03 Irobot Corporation Celestial navigation system for an autonomous vehicle
US7706917B1 (en) 2004-07-07 2010-04-27 Irobot Corporation Celestial navigation system for an autonomous robot
DE602006014364D1 (de) 2005-02-18 2010-07-01 Irobot Corp Autonomer oberflächenreinigungsroboter für nass- und trockenreinigung
US7620476B2 (en) 2005-02-18 2009-11-17 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8392021B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8930023B2 (en) 2009-11-06 2015-01-06 Irobot Corporation Localization by learning of wave-signal distributions
ATE442619T1 (de) 2005-12-02 2009-09-15 Irobot Corp Modularer roboter
ES2706729T3 (es) 2005-12-02 2019-04-01 Irobot Corp Sistema de robot
EP2466411B1 (fr) 2005-12-02 2018-10-17 iRobot Corporation Système de robot
US7441298B2 (en) 2005-12-02 2008-10-28 Irobot Corporation Coverage robot mobility
EP3404505B1 (fr) 2006-03-17 2023-12-06 iRobot Corporation Robot d'entretien de pelouses
US20090044370A1 (en) 2006-05-19 2009-02-19 Irobot Corporation Removing debris from cleaning robots
US8417383B2 (en) 2006-05-31 2013-04-09 Irobot Corporation Detecting robot stasis
KR101529848B1 (ko) 2007-05-09 2015-06-17 아이로보트 코퍼레이션 표면 처리 로봇
JP5647269B2 (ja) 2010-02-16 2014-12-24 アイロボット コーポレイション 掃除機ブラシ
WO2015153109A1 (fr) 2014-03-31 2015-10-08 Irobot Corporation Robot mobile autonome
US9516806B2 (en) 2014-10-10 2016-12-13 Irobot Corporation Robotic lawn mowing boundary determination
US9510505B2 (en) 2014-10-10 2016-12-06 Irobot Corporation Autonomous robot localization
US9420741B2 (en) 2014-12-15 2016-08-23 Irobot Corporation Robot lawnmower mapping
US9538702B2 (en) 2014-12-22 2017-01-10 Irobot Corporation Robotic mowing of separated lawn areas
US11115798B2 (en) 2015-07-23 2021-09-07 Irobot Corporation Pairing a beacon with a mobile robot
US10021830B2 (en) 2016-02-02 2018-07-17 Irobot Corporation Blade assembly for a grass cutting mobile robot
US10459063B2 (en) 2016-02-16 2019-10-29 Irobot Corporation Ranging and angle of arrival antenna system for a mobile robot
WO2019013989A1 (fr) 2017-07-14 2019-01-17 Irobot Corporation Ensemble lame pour robot mobile de coupe d'herbe

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4520576A (en) * 1983-09-06 1985-06-04 Whirlpool Corporation Conversational voice command control system for home appliance
EP0467305B1 (fr) * 1990-07-19 1997-07-02 Sony Corporation Appareil pour l'interconnexion de dispositifs électroniques

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO9853456A1 *

Also Published As

Publication number Publication date
CA2290348A1 (fr) 1998-11-26
JP2002512757A (ja) 2002-04-23
WO1998053456A1 (fr) 1998-11-26
AU7349898A (en) 1998-12-11

Similar Documents

Publication Publication Date Title
WO1998053456A1 (fr) Dispositif et procedes de commande d'appareils electromenagers
US20020005787A1 (en) Apparatus and methods for controlling household appliances
US6206745B1 (en) Programmable assembly toy
US6773322B2 (en) Programmable assembly toy
WO1997018871A2 (fr) Poupee i*
US6290566B1 (en) Interactive talking toy
US20020107591A1 (en) "controllable toy system operative in conjunction with a household audio entertainment player"
US6352478B1 (en) Techniques and apparatus for entertainment sites, amusement parks and other information and/or entertainment dispensing sites
US6089942A (en) Interactive toys
US20010031652A1 (en) 1*doll
US6368177B1 (en) Method for using a toy to conduct sales over a network
EP0934105A1 (fr) Techniques et dispositifs pour installations de divertissement, parcs d'attractions et autres lieux avec service d'informations et/ou de divertissements
CA2332582A1 (fr) Poupee intelligente
WO1998053567A1 (fr) Jouet pouvant etre commande fonctionnant avec un lecteur audio
EP0935492A2 (fr) Jouet parlant interactif
US20220047956A1 (en) Systems and Methods for Interactive Communication Between an Object and a Smart Device
KR20010028654A (ko) 무선데이터 통신을 이용한 로봇 제어 시스템
US10537812B2 (en) Systems and methods for interactive communication between an object and a smart device
WO2005038776A1 (fr) Jouet a commande vocale
CA2234330A1 (fr) Jouets interactifs
WO2002029761A1 (fr) Procede d'utilisation d'un jouet pour conduire une vente

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19991213

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20020102

R18D Application deemed to be withdrawn (corrected)

Effective date: 20011201