US11154776B2 - Semantic gaming and application transformation - Google Patents


Info

Publication number: US11154776B2
Application number: US16/184,610
Other versions: US20190091571A1
Authority: US (United States)
Legal status: Active
Prior art keywords: data, user input, input device, handheld, processor
Inventors: Arno Penzias, Matthew G. Liberty
Original assignee: IDHL Holdings, Inc.
Current assignee: DRNC Holdings, Inc. (assignment of assignors interest from IDHL Holdings, Inc.)
Events: application filed by IDHL Holdings, Inc.; priority to US16/184,610; published as US20190091571A1; application granted; published as US11154776B2; assigned to DRNC Holdings, Inc.

Classifications

    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/211: Input arrangements for video game devices characterised by their sensors, purposes or types, using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F13/213: Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/215: Input arrangements comprising means for detecting acoustic signals, e.g. using a microphone
    • A63F13/235: Input arrangements for interfacing with the game device, e.g. specific interfaces between game controller and console, using a wireless connection, e.g. infrared or piconet
    • A63F13/428: Processing input control signals involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/837: Special adaptations for executing a specific game genre or game mode; shooting of targets
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/0354: Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543: Mice or pucks
    • A63F2300/1006: Input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • A63F2300/1031: Details of the interface with the game device, using a wireless connection, e.g. Bluetooth, infrared connections
    • A63F2300/105: Input arrangements using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F2300/1081: Input via voice recognition
    • A63F2300/1087: Input arrangements comprising photodetecting means, e.g. a camera
    • A63F2300/6045: Methods for processing data by mapping control signals received from the input arrangement into game commands
    • A63F2300/64: Methods for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/8076: Features specially adapted for executing a specific type of game; shooting

Definitions

  • the present invention relates, generally, to user interfaces and methods associated therewith and, more specifically, to user interfaces and methods involving semantically coupling the actions of handheld devices into application inputs.
  • User interfaces are ubiquitous in today's society. Computers, cell phones, fax machines and televisions, to name a few products, all employ user interfaces. User interfaces are intended to provide a mechanism for users to easily access and manipulate the sometimes complex functionality of the devices that they support.
  • An example of a user interface is found in U.S. patent application Ser. No. 10/768,432, filed on Jan. 30, 2004, entitled “A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items”, the disclosure of which is incorporated here by reference.
  • Such user interfaces employ remote, handheld devices to provide inputs to their respective applications.
  • a gaming system comprises a handheld user input device, a processor for receiving input data from the handheld user input device and a gaming application for receiving data from the processor based on the received input data from the handheld user input device.
  • the handheld user input device emulates one of a plurality of different devices associated with the gaming application.
  • Each of the plurality of different devices has a set of input commands associated therewith.
  • the set of input commands associated with the one of a plurality of different devices includes a command based on one of motion and orientation of the handheld device.
  • a method for using a handheld user input device comprises: executing an application program on a processor, establishing a communication data link between the handheld user input device and the processor, receiving user input data by the processor via the handheld user input device, receiving data by the application program from the processor based on the received user input data, and reflecting user input on a display wherein the user input corresponds to one of motion and orientation of the handheld user input device.
  • a system comprises: a handheld user input device, a processor for receiving input data from the handheld user input device, and a software application executing on the processor and receiving data from the processor based on input data received from the handheld user input device.
  • The handheld user input device emulates one of a plurality of different devices associated with the application. Each of the plurality of different devices has a set of input commands associated therewith, wherein the commands include a command based on one of motion and orientation of the handheld device.
  • FIG. 1 depicts an exemplary system according to an exemplary embodiment of the present invention
  • FIG. 2 depicts an exemplary media system in which exemplary embodiments of the present invention can be implemented
  • FIG. 3 depicts a system controller of FIG. 2 in more detail
  • FIG. 4 depicts a free-space pointing device
  • FIG. 5 depicts sensors in a free-space pointing device
  • FIG. 6 depicts a process model that describes the general operation of a free-space pointing device
  • FIG. 7 illustrates an exemplary hardware architecture using a free-space pointing device
  • FIG. 8 is a state diagram depicting a stationary detection mechanism for a free-space pointing device
  • FIGS. 9A-9D illustrate an exemplary free-space pointing device
  • FIG. 10 illustrates a method in accordance with exemplary embodiments.
  • Exemplary embodiments of the present invention provide a generic user input device that tracks user movement in free-space. Both gestures and actions can be measured, recorded and responded to by the associated application(s). The result is a low-cost and flexible gaming device that can, via the appropriate application software, become all manner of units. For example, any number of weapon types like swords, guns, grenade launchers and so on can be modeled with one common handheld device. Exemplary embodiments of the present invention are also much cheaper to implement than full immersion virtual reality systems with and without haptic feedback.
  • A high-level view of an exemplary embodiment of the present invention is provided in FIG. 1 .
  • a handheld device (or, user input detector) 100 provides input to computer 110 which is running an application 120 such as a gaming application for example.
  • the handheld device 100 is a free-space device capable of sensing movement in at least 2 degrees of freedom and preferably 6 degrees of freedom.
  • An example of a free-space pointing device is found in U.S. Provisional Patent Application 60/612,571, filed on Sep. 23, 2004, entitled “Free Space Pointing Devices and Methods” (corresponding to U.S. application Ser. No. 11/119,663 cited above), the disclosure of which is incorporated here by reference.
  • The computer 110 could be a gaming console, PC or set-top box, among others, and the application 120 may be a game.
  • The application 120 can be a gaming application using a user interface such as that described in U.S. Patent Application Publication No. US 2005/0125826, which corresponds to U.S. patent application Ser. No. 11/029,329 filed on Jan. 5, 2005, which is a continuation of U.S. patent application Ser. No. 10/768,432 filed on Jan. 30, 2004, that is also incorporated here by reference.
  • an exemplary aggregated media system 200 in which the present invention can be implemented will first be described with respect to FIG. 2 .
  • Those skilled in the art will appreciate, however, that this invention is not restricted to implementation in this type of media system and that more or fewer components can be included therein.
  • the media system 200 may include an input/output (I/O) bus 210 that connects the components in the media system together.
  • The I/O bus 210 represents any of a number of different mechanisms and techniques for routing signals between the media system components.
  • the I/O bus 210 may include an appropriate number of independent audio “patch” cables that route audio signals, coaxial cables that route video signals, two-wire serial lines or IR or radio frequency (RF) transceivers that route control signals, optical fiber or any other routing mechanisms that route other types of signals.
  • the media system 200 includes a television/monitor 212 , a gaming console or gaming device 214 , digital video disk (DVD) recorder/playback device 216 , audio/video tuner 218 , and compact disk (CD) player 220 coupled to the I/O bus 210 .
  • The DVD recorder/playback device 216 and CD player 220 may be single-disk or single-cassette devices, or alternatively may be multiple-disk or multiple-cassette devices. They may be independent units or integrated together.
  • the media system 200 includes a microphone/speaker system 222 , video camera 224 , and a wireless I/O control device 226 .
  • the wireless I/O control device 226 may be a media system remote control unit that supports free-space pointing, has a minimal number of buttons to support navigation, and communicates with the entertainment system 200 through RF signals.
  • wireless I/O control device 226 can be a free-space pointing device that uses a gyroscope or other mechanism to define both a screen position and a motion vector to determine the particular command desired.
  • a set of buttons can also be included on the wireless I/O device 226 to initiate a “click” primitive as well as a “back” button.
  • wireless I/O control device 226 is a media system remote control unit, which communicates with the components of the entertainment system 200 through IR signals.
  • wireless I/O control device 226 may be an IR remote control device similar in appearance to a typical entertainment system remote control with the added feature of a track-ball or other navigational mechanisms, which allows a user to position a cursor on a display.
  • the system 200 also includes a system controller 228 , which may operate to store and display entertainment system data available from a plurality of entertainment system data sources and to control a wide variety of features associated with each of the system components.
  • system controller 228 is coupled, either directly or indirectly, to each of the system components, as necessary, through I/O bus 210 .
  • In addition to, or in place of, I/O bus 210 , system controller 228 is configured with a wireless communication transmitter (or transceiver), which is capable of communicating with the system components via IR signals or RF signals. Regardless of the control medium, the system controller 228 is configured to control the media components of the media system 200 via a GUI described below.
  • media system 200 may be configured to receive media items from various media sources and service providers.
  • media system 200 receives media input from and, optionally, sends information to, any or all of the following sources: cable broadcast 230 , satellite broadcast 232 (e.g., via a satellite dish), very high frequency (VHF) or ultra high frequency (UHF) radio frequency communication of the broadcast television networks 234 (e.g., via an aerial antenna), telephone network 236 , and cable modem 238 (or another source of Internet content).
  • FIG. 3 is a block diagram illustrating an embodiment of an exemplary system controller 228 , which can, for example, be implemented as a set-top box and include, for example, a processor 300 , memory 302 , a display controller 304 , other device controllers 306 (e.g., associated with the other components of system 200 ), one or more data storage devices 308 , and an I/O interface 310 . These components communicate with the processor 300 via bus 312 . Those skilled in the art will appreciate that processor 300 can be implemented using one or more processing units.
  • Memory device(s) 302 may include, for example, DRAM or SRAM, ROM, some of which may be designated as cache memory, which store software to be run by processor 300 and/or data usable by such programs, including software and/or data associated with the GUIs described below.
  • Display controller 304 is operable by processor 300 to control the display of monitor 212 to, among other things, display GUI screens and objects as described below. Zoomable GUIs provide resolution independent zooming, so that monitor 212 can provide displays at any resolution.
  • Device controllers 306 provide an interface between the other components of the media system 200 and the processor 300 .
  • Data storage 308 may include one or more of a hard disk drive, a floppy disk drive, a CD-ROM device, or other mass storage device.
  • Input/output interface 310 may include one or more of a plurality of interfaces including, for example, a keyboard interface, an RF interface, an IR interface and a microphone/speech interface.
  • I/O interface 310 may include an interface for receiving location information associated with movement of a wireless pointing device.
  • Generation and control of a GUI to display media item selection information is performed by the system controller 228 in response to the processor 300 executing sequences of instructions contained in the memory 302 .
  • Such instructions may be read into the memory 302 from other computer-readable media such as data storage device(s) 308 or from a computer connected externally to the media system 200 .
  • Execution of the sequences of instructions contained in the memory 302 causes the processor to generate GUI objects and controls, among other things, on monitor 212 .
  • various different types of remote devices can be used as the input device 100 , 226 , including, for example, trackballs, “mouse”-type pointing devices, light pens, etc., as well as free-space pointing devices.
  • free-space pointing refers to the ability of an input device to move in three (or more) dimensions in the air in front of a display screen, for example, and the corresponding ability of the user interface to translate those motions directly into user interface commands, e.g., movement of a cursor on the display screen.
  • free-space pointing differs from conventional computer mouse pointing techniques, for example, which use a surface, e.g., a desk surface or mouse pad, as a proxy surface from which relative movement of the mouse is translated into cursor movement on the computer display screen.
  • An exemplary free-space pointing device 400 (corresponding to user input detector 100 of FIG. 1 , for example) is depicted in FIG. 4 .
  • User movement of the free-space pointing device can be defined, for example, in terms of a combination of x-axis attitude (roll), y-axis elevation (pitch) and/or z-axis heading (yaw) motion of the device 400 .
  • Linear movement of the pointing device 400 along the x, y, and z axes may also be measured to generate cursor movement or other user interface commands.
  • the free-space pointing device 400 includes two buttons 402 , 404 and a scroll wheel 406 , although other embodiments can have other physical configurations.
  • the free-space pointing device 400 may be held by a user in front of a display 408 , such as the monitor 212 , and motion of the pointing device 400 is translated into output signals that are usable for interaction with information presented on the display 408 , e.g., to move a cursor 410 on the display 408 .
  • the display 408 may be included in the computer 110 depicted in FIG. 1 .
  • Rotation of the device 400 about the y-axis can be sensed by the device 400 , for example, and translated into an output usable by the system to move cursor 410 along the y2 axis of the display 408 .
  • Rotation of the device 400 about the z-axis can be sensed and translated into an output usable by the system to move cursor 410 along the x2 axis of the display 408 .
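  • As an illustration of this mapping (not part of the patent text; the function name and gain value below are assumptions), sensed angular rates about the device y- and z-axes can be converted into cursor deltas along the display's y2 and x2 axes roughly as follows:

        # Python sketch: map sensed angular rates of device 400 to cursor motion
        def rotation_to_cursor_delta(yaw_rate, pitch_rate, dt, gain=400.0):
            """yaw_rate/pitch_rate are angular rates (rad/s) about the device
            z- and y-axes, dt is the sampling period, and gain is an assumed
            pixels-per-radian scale factor."""
            dx = gain * yaw_rate * dt    # z-axis rotation -> cursor motion along x2
            dy = gain * pitch_rate * dt  # y-axis rotation -> cursor motion along y2
            return dx, dy

        # Example: a yaw of 0.1 rad/s sampled at 200 Hz (dt = 5 ms)
        print(rotation_to_cursor_delta(yaw_rate=0.1, pitch_rate=0.0, dt=0.005))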
  • the output of pointing device 400 can be used to interact with the display 408 in a number of ways other than (or in addition to) cursor movement.
  • the device 400 can control cursor fading or control volume or media transport (play, pause, fast-forward, and rewind) in a system such as the media entertainment system 200 .
  • Input commands may also include, for example, a zoom in or zoom out on a particular region of a display.
  • a cursor may or may not be visible.
  • rotation of the free-space pointing device 400 sensed about the x-axis of free space pointing device 400 can be used in addition to, or as an alternative to, y-axis and/or z-axis rotation to provide input to a user interface.
  • rotational sensors 502 , 504 and an accelerometer 506 can be employed as sensors in the device 400 .
  • the sensors 502 , 504 can, for example, be ADXRS150 sensors made by Analog Devices, although it will be appreciated by those skilled in the art that other types of rotational sensors can be used and that ADXRS150 sensors are simply illustrative examples. If the rotational sensors 502 , 504 have a single sensing axis (as an ADXRS150 sensor does, for example), then they may be mounted in the free-space pointing device 400 such that their sensing axes are aligned with the rotations to be measured, although this is not necessary. In the exemplary embodiment depicted in FIGS. 4 and 5 , this means that rotational sensor 502 is mounted such that its sensing axis is parallel to the y-axis and that rotational sensor 504 is mounted such that its sensing axis is parallel to the z-axis as shown.
  • Measurements and calculations are performed by the device 400 that are used to adjust the outputs of one or more of the sensors 502 , 504 , 506 and/or as part of the input used by a processor to determine an appropriate output for the user interface based on the outputs of the sensors 502 , 504 , 506 . These measurements and calculations are used to compensate for several factors, such as errors associated with the sensors 502 , 504 , 506 and the manner in which a user uses the free-space pointing device 400 , e.g., linear acceleration, tilt and tremor.
  • a process model 600 that describes the general operation of a free-space pointing device 400 is illustrated in FIG. 6 .
  • the sensors 502 , 504 , 506 produce analog signals that are sampled periodically, such as 200 samples/second, for example.
  • the sampled output from the accelerometer 506 is indicated at block 602 , and the sampled output values are converted from raw units to units of acceleration, e.g., gravities (g), as indicated by conversion function 604 .
  • An acceleration calibration block 606 provides values used for the conversion function 604 .
  • This calibration of the accelerometer output 602 can include, for example, compensation for one or more of scale, offset, and axis misalignment error associated with the accelerometer 506 .
  • the accelerometer 506 may be used to compensate for fluctuations in the readings generated by the rotational sensors 502 , 504 that are caused by variances in linear acceleration by multiplying the converted accelerometer readings by a gain matrix 610 and subtracting (or adding) the results from (or to) the corresponding sampled rotational sensor data 612 .
  • linear acceleration compensation for the sampled rotational data from sensor 504 can be provided at block 614 .
  • the sampled rotational data 612 is then converted from a sampled unit value into a value associated with a rate of angular rotation, e.g., radians/s, at function 616 .
  • This conversion step can also include calibration provided by function 618 to compensate the sampled rotational data for factors such as scale and offset.
  • an input from a temperature sensor 619 may be used in rotation calibration function 618 .
  • the inputs from the rotational sensors 502 , 504 can be further processed to rotate those inputs into an inertial frame of reference, i.e., to compensate for tilt associated with the manner in which the user is holding the free-space pointing device 400 , at function 620 .
  • Tilt correction to compensate for a user's holding the pointing device 400 at different x-axis rotational positions can be accomplished by determining the tilt of the device 400 using the inputs y and z received from accelerometer 506 at function 622 . After the acceleration data is converted and calibrated as described above, it can be low-pass filtered at LPF 624 to provide an average acceleration value to the tilt determination function 622 .
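  • A minimal sketch of this processing chain (blocks 602-624 of FIG. 6 ) is given below. The calibration constants, array shapes and function name are illustrative assumptions rather than values taken from the disclosure:

        import numpy as np

        def process_samples(raw_accel, raw_rot, accel_scale, accel_offset,
                            gain_matrix, rot_scale, rot_offset):
            # conversion 604 / calibration 606: raw counts -> gravities (g)
            accel_g = (np.asarray(raw_accel, dtype=float) - accel_offset) * accel_scale
            # blocks 610/614: remove the linear-acceleration contribution
            # from the sampled rotational data 612
            rot_comp = np.asarray(raw_rot, dtype=float) - gain_matrix @ accel_g
            # functions 616/618: sampled units -> rad/s with scale/offset calibration
            omega = (rot_comp - rot_offset) * rot_scale
            # function 622: tilt estimated from the accelerometer y and z readings
            tilt = np.arctan2(accel_g[1], accel_g[2])
            # function 620: rotate the rotational inputs into an inertial frame
            c, s = np.cos(tilt), np.sin(tilt)
            return np.array([c * omega[0] - s * omega[1],
                             s * omega[0] + c * omega[1]]), tilt

        # One 200 samples/second reading with placeholder calibration values
        omega, tilt = process_samples(
            raw_accel=[510, 530, 760], raw_rot=[498, 502],
            accel_scale=1 / 256.0, accel_offset=512.0,
            gain_matrix=np.zeros((2, 3)), rot_scale=0.001, rot_offset=500.0)
        print(omega, tilt)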
  • post-processing can be performed at blocks 626 and 628 to compensate for factors such as human tremor.
  • Tremor may be removed using several different methods; one way to remove tremor is by using hysteresis.
  • the angular velocity produced by rotation function 620 is integrated to produce an angular position.
  • Hysteresis of a calibrated magnitude is then applied to the angular position.
  • the derivative is taken of the output of the hysteresis block to again yield an angular velocity.
  • the resulting output is then scaled at function 628 (e.g., based on the sampling period) and used to generate a result within the interface, e.g., movement of the cursor 410 on the display 408 .
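  • The hysteresis step described above can be sketched as follows (the deadband width and sampling period are assumed values; the patent does not specify them):

        def remove_tremor_hysteresis(angular_velocities, dt=0.005, deadband=0.002):
            """Integrate angular velocity to angular position (block 626), apply a
            calibrated hysteresis (deadband), differentiate back to angular
            velocity, and let the caller scale the result (block 628)."""
            position = held = prev_held = 0.0
            out = []
            for w in angular_velocities:
                position += w * dt                     # integrate to angular position
                if abs(position - held) > deadband:    # hysteresis: ignore small wiggles
                    held = position - deadband if position > held else position + deadband
                out.append((held - prev_held) / dt)    # differentiate back to velocity
                prev_held = held
            return out

        # Small oscillations (tremor) stay inside the deadband and produce zero
        # output, while a larger intentional motion passes through.
        print(remove_tremor_hysteresis([0.0003, -0.0003] * 5 + [0.2] * 5))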
  • FIG. 7 illustrates an exemplary hardware architecture, including a processor 700 that communicates with other elements of the free-space pointing device 400 including a scroll wheel 702 , JTAG 704 , LEDs 706 , switch matrix 708 , IR photodetector 710 , rotational sensors 712 , accelerometer 714 and transceiver 716 .
  • the scroll wheel 702 is an optional input component that enables a user to provide input to the interface by rotating the scroll wheel 702 .
  • JTAG 704 provides a programming and debugging interface to the processor.
  • LEDs 706 provide visual feedback to a user, for example, when a button is pressed or a function activated.
  • Switch matrix 708 receives inputs, e.g., indications that a button on the free-space pointing device 400 has been depressed or released, that are then passed on to processor 700 .
  • the optional IR photodetector 710 can be provided to enable the exemplary free-space pointing device to learn IR codes from other remote controls.
  • Rotational sensors 712 provide readings to processor 700 regarding, e.g., the y-axis and z-axis rotation of the free-space pointing device as described above.
  • Accelerometer 714 provides readings to processor 700 regarding the linear acceleration of the free-space pointing device 400 that can be used as described above, e.g., to perform tilt compensation and to compensate for errors which linear acceleration introduces into the rotational readings generated by rotational sensors 712 .
  • Transceiver 716 communicates information to and from free-space pointing device 400 , e.g., to a controller in a device such as an entertainment system or to a processor associated with the computer 110 .
  • the transceiver 716 can be a wireless transceiver, e.g., operating in accordance with the BLUETOOTH standards for short-range wireless communication or an IR transceiver.
  • free-space pointing device 400 can communicate with systems via a wire-line connection.
  • Stationary detection function 608 can operate to determine whether the free-space pointing device 400 is, for example, either stationary or active (moving). This categorization can be performed in a number of different ways. One way is to compute the variance of the sampled input data of all inputs (x, y, z, αy, αz) over a predetermined window, e.g., every quarter of a second, where αy and αz are rotational data from the sensors 502 , 504 , respectively. This variance is then compared with a threshold to classify the free-space pointing device as either stationary or active.
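  • A sketch of this variance test follows (the threshold and window length are assumptions, not values from the disclosure):

        import statistics

        def classify_activity(window_samples, threshold=0.01):
            """window_samples: one list per input channel (x, y, z, alpha-y, alpha-z)
            covering a predetermined window, e.g. a quarter second at 200 samples/s.
            The summed variance is compared with a threshold (block 608)."""
            total_variance = sum(statistics.pvariance(ch) for ch in window_samples)
            return "ACTIVE" if total_variance > threshold else "STATIONARY"

        quarter_second = [[0.0] * 50 for _ in range(5)]   # 5 channels, 50 samples
        print(classify_activity(quarter_second))          # -> STATIONARY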
  • the processor 700 can also determine whether the free-space pointing device 400 is either stationary or active and detect the small movements of the free-space pointing device 400 introduced by a user's hand tremor.
  • Tremor can be identified as peaks in the range of human tremor frequencies, e.g., nominally 8-12 Hz.
  • the variances in the frequency domain can be sensed within a particular frequency range, the actual frequency range to be monitored and used to characterize the status of the free-space pointing device 400 may vary.
  • The nominal tremor frequency range may shift based on, e.g., the ergonomics and weight of the free-space pointing device 400 , e.g., from 8-12 Hz to 4-7 Hz.
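  • One way to perform such frequency-domain sensing is sketched below; the 50% energy criterion, sampling rate and band edges are illustrative assumptions:

        import numpy as np

        def has_tremor_peak(signal, fs=200.0, band=(8.0, 12.0)):
            """Return True if the spectral energy inside the nominal hand-tremor
            band dominates the (non-DC) spectrum of the sampled signal."""
            spectrum = np.abs(np.fft.rfft(signal))
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            in_band = (freqs >= band[0]) & (freqs <= band[1])
            return spectrum[in_band].sum() > 0.5 * spectrum[1:].sum()

        t = np.arange(0, 1.0, 1.0 / 200.0)
        print(has_tremor_peak(np.sin(2 * np.pi * 10.0 * t)))   # 10 Hz tremor -> True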
  • tremor data may be memorized as typically each user will exhibit a different tremor pattern.
  • This property of user tremor can also be used to identify users.
  • a user's tremor pattern can be memorized by the system (either stored in the free-space pointing device 400 or transmitted to the system) during an initialization procedure wherein the user is requested to hold the free-space pointing device as steadily as possible for a period, e.g., 10 seconds.
  • This pattern can be used as the user's unique signature to perform a variety of user interface functions.
  • the user interface can identify the user from a group of users by comparing a current tremor pattern with patterns stored in memory. The identification can then be used, for example, to retrieve preference settings associated with the identified user.
  • the media selection item display preferences associated with that user can be activated after the system recognizes the user via tremor pattern comparison.
  • System security can also be implemented using tremor recognition, e.g., access to the system may be forbidden or restricted based on the user identification performed after a user picks up the free-space pointing device 400 .
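  • A sketch of such tremor-based identification is given below; the stored-signature format, distance metric and acceptance threshold are assumptions made for illustration only:

        import numpy as np

        def identify_user(current_signature, stored_signatures, max_distance=1.0):
            """Compare a tremor signature captured from the handheld device with the
            signatures memorized for each user during initialization and return the
            closest match, or None if nothing is close enough (access denied)."""
            best_user, best_dist = None, max_distance
            for user, signature in stored_signatures.items():
                dist = np.linalg.norm(np.asarray(current_signature) - np.asarray(signature))
                if dist < best_dist:
                    best_user, best_dist = user, dist
            return best_user

        stored = {"alice": [0.1, 0.8, 0.3], "bob": [0.5, 0.2, 0.9]}
        print(identify_user([0.12, 0.75, 0.33], stored))   # -> alice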
  • Stationary detection mechanism 608 can include a state machine, an example of which is depicted in FIG. 8 .
  • An ACTIVE state is, in the example, the default state during which the free-space pointing device 400 is being used to provide inputs to a user interface for example.
  • the free-space pointing device 400 can enter the ACTIVE state on power-up of the device as indicated by a reset input. If the free-space pointing device 400 stops moving, it may then enter an INACTIVE state.
  • the various state transitions depicted in FIG. 8 can be triggered by any of a number of different criteria including, but not limited to, data output from one or both of the rotational sensors 502 and 504 , data output from the accelerometer 506 , time domain data, frequency domain data, or any combination thereof.
  • State transition conditions are generically referred to here using the convention "condition_stateA→stateB".
  • Condition_active→inactive can, in an exemplary free-space pointing device 400 , occur when mean and/or standard deviation values from both the rotational sensor(s) and the accelerometer fall below predetermined threshold values for a predetermined time period.
  • State transitions can be determined by a number of different conditions based upon the interpreted sensor outputs.
  • Exemplary condition metrics include the variance of the interpreted signals over a time window, the threshold between a reference value and the interpreted signal over a time window, the threshold between a reference value and the filtered interpreted signal over a time window, and the threshold between a reference value and the interpreted signal from a start time. All, or any combination, of these condition metrics can be used to trigger state transitions. Alternatively, other metrics can also be used.
  • a transition from the INACTIVE state to the ACTIVE state may occur either when (1) a mean value of sensor output(s) over a time window is greater than predetermined threshold(s) or (2) a variance of values of sensor output(s) over a time window is greater than predetermined threshold(s) or (3) an instantaneous delta between sensor values is greater than a predetermined threshold.
  • The INACTIVE state enables the stationary detection mechanism 608 to distinguish between brief pauses during which the free-space pointing device 400 is still being used, e.g., on the order of a tenth of a second, and an actual transition to either a stable or stationary condition. This prevents the functions that are performed during the STABLE and STATIONARY states, described below, from inadvertently being performed when the free-space pointing device is being used.
  • The free-space pointing device 400 will transition back to the ACTIVE state when condition_inactive→active occurs, e.g., if the free-space pointing device 400 starts moving again such that the measured outputs from the rotational sensor(s) and the accelerometer exceed the first threshold before a second predetermined time period in the INACTIVE state elapses.
  • the free-space pointing device 400 will transition to either the STABLE state or the STATIONARY state after the second predetermined time period elapses.
  • the STABLE state reflects the characterization of the free-space pointing device 400 as being held by a person but being substantially unmoving
  • the STATIONARY state reflects a characterization of the free-space pointing device as not being held by a person.
  • an exemplary state machine can provide for a transition to the STABLE state after a second predetermined time period has elapsed if minimal movement associated with hand tremor is present or, otherwise, transition to the STATIONARY state.
  • the STABLE and STATIONARY states define times during which the free-space pointing device 400 can perform various functions. For example, since the STABLE state is intended to reflect times when the user is holding the free-space pointing device 400 but is not moving it, the device can record the movement of the free-space pointing device 400 when it is in the STABLE state e.g., by storing outputs from the rotational sensor(s) and/or the accelerometer while in this state. These stored measurements can be used to determine a tremor pattern associated with a particular user or users as described below. Likewise, when in the STATIONARY state, the free-space pointing device 400 can take readings from the rotational sensors and/or the accelerometer for use in compensating for offset.
  • Movement detected while the free-space pointing device 400 is in the STABLE or STATIONARY state can trigger a return to the ACTIVE state. Otherwise, after measurements are taken, the device can transition to the SLEEP state. While in the SLEEP state, the device can enter a power-down mode wherein power consumption of the free-space pointing device is reduced and, e.g., the sampling rate of the rotational sensors and/or the accelerometer is also reduced.
  • the SLEEP state can also be entered via an external command so that the user or another device can command the free-space pointing device 400 to enter the SLEEP state.
  • If the free-space pointing device begins to move again, or another command is received, the device can transition from the SLEEP state to the WAKEUP state.
  • the WAKEUP state provides an opportunity for the device to confirm that a transition to the ACTIVE state is justified, e.g., that the free-space pointing device 400 was not inadvertently jostled.
  • the conditions for state transitions may be symmetrical or may differ.
  • The threshold associated with condition_active→inactive may be the same as (or different from) the threshold(s) associated with condition_inactive→active. This enables free-space pointing devices to capture user input more accurately.
  • exemplary embodiments which include a state machine implementation allow, among other things, for the threshold for transition into a stationary condition to be different from the threshold for the transition out of a stationary condition.
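  • The FIG. 8 state machine can be sketched as a simple transition function; the timing constants below, and the reduction of the various threshold tests to the boolean inputs "moving" and "tremor_present", are simplifying assumptions:

        def next_state(state, moving, tremor_present, time_in_state,
                       t_inactive=0.5, t_idle=30.0):
            """One step of the stationary-detection state machine (FIG. 8)."""
            if state == "ACTIVE":
                return "INACTIVE" if not moving else "ACTIVE"
            if state == "INACTIVE":
                if moving:                        # condition inactive -> active
                    return "ACTIVE"
                if time_in_state >= t_inactive:   # second predetermined time period
                    return "STABLE" if tremor_present else "STATIONARY"
                return "INACTIVE"
            if state in ("STABLE", "STATIONARY"):
                if moving:
                    return "ACTIVE"
                return "SLEEP" if time_in_state >= t_idle else state
            if state == "SLEEP":
                return "WAKEUP" if moving else "SLEEP"
            if state == "WAKEUP":
                # confirm the transition was not caused by an inadvertent jostle
                return "ACTIVE" if moving else "SLEEP"
            return "ACTIVE"                       # power-up / reset default

        print(next_state("INACTIVE", moving=False, tremor_present=True,
                         time_in_state=0.6))      # -> STABLE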
  • Entering or leaving a state can be used to trigger other device functions as well.
  • The user interface can be powered up based on a transition from any state to the ACTIVE state.
  • the free-space pointing device and/or the user interface can be turned off (or enter a sleep mode) when the free-space pointing device transitions from ACTIVE or STABLE to STATIONARY or INACTIVE.
  • the cursor 410 can be displayed or removed from the screen based on the transition from or to the stationary state of the free-space pointing device 400 .
  • As described above, the STABLE state can be used to memorize tremor data as the user's unique signature, enabling user identification, retrieval of per-user preference settings (e.g., media selection item display preferences) and tremor-based system security.
  • operation of a handheld device may be as follows.
  • the handheld device 100 communicates both device movement (delta x, delta y, delta z) and orientation to the computer 110 , e.g., over a wireless data link such as Bluetooth.
  • other actions performed by the user are communicated, e.g., button presses and gesture recognition.
  • Computer 110 then makes that information available to the application software 120 for processing.
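  • The patent does not specify a message format for this link; purely as an illustration, a per-sample report from device 100 to computer 110 might look like the following (the field names and the JSON encoding are assumptions):

        import json

        def encode_report(dx, dy, dz, yaw, pitch, roll, buttons, gesture=None):
            """Bundle one sample of movement, orientation, button and gesture data
            for transmission over the wireless data link."""
            return json.dumps({
                "delta": [dx, dy, dz],               # device movement
                "orientation": [yaw, pitch, roll],   # device orientation
                "buttons": buttons,                  # e.g. {"trigger": True}
                "gesture": gesture,                  # e.g. "reload" once recognized
            }).encode("utf-8")

        print(encode_report(0.01, 0.0, -0.02, 0.1, 0.0, 0.0, {"trigger": False}))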
  • an exemplary handheld device (or free-space pointing device) 100 may resemble a mouse such as mouse 400 illustrated in FIGS. 4 and 5 .
  • Button presses on mouse 400 correspond to actuation of buttons 402 and 404 and scroll wheel 406 for example.
  • a button may correspond to a particular function or action within a gaming application for example.
  • Free-space pointing device 100 may resemble a loop-shaped device such as that illustrated in FIGS. 9A-9D and designed by Hillcrest Labs of Rockville, Md.
  • Free-space pointing device 900 includes buttons 902 and 904 and scroll wheel 906 (corresponding to buttons 402 and 404 and scroll wheel 406 of FIG. 4 ).
  • pointing device 900 may include programmable buttons 908 and 910 . Each of these buttons may be programmed to perform a particular function or action within a gaming application for example.
  • Free-space pointing device 900 may also include a grip 915 , which may give a user or game participant a more secure hold on the device. In some embodiments, or for use in particular gaming applications, grip 915 may include a plurality of sensors associated with particular functionality, for example.
  • the application software 120 takes the input data (from device 100 via computer 110 ) and processes it according to the game scenario of interest.
  • the handheld device 100 emulates a gun or similar weapon and the bullet, grenade, ray blast or similar projectile is fired when the user hits the button.
  • the handheld device 100 could be modeling a gun.
  • The game application 120 , and potentially the movement characteristics, could change depending on what type of gun the device 100 was being used to model. Even the number of firings could be changed depending on the type of weapon.
  • the direction the gun fires in the game corresponds to the direction that the handheld device 100 is pointed.
  • the movement characteristics of the handheld device 100 may change depending on what type of device it is currently emulating. For example, if the handheld device 100 is meant to be a shoulder-mount weapon, the movement processing equations are altered to simulate inertia by requiring more significant handheld device movement to result in any on-screen motion. By way of comparison, if the handheld device 100 is emulating a handgun, then a different set of movement processing equations are applied to the data, such that less significant handheld device movements result in on-screen adjustment of the direction of fire of the weapon.
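  • The idea of weapon-dependent movement processing can be illustrated with a one-line gain model (the gain values below are invented for illustration; the actual movement processing equations are not given in the patent):

        def aim_response(device_delta, weapon):
            """Same handheld motion, different on-screen aim change: a heavy
            shoulder-mount weapon gets a low gain to simulate inertia, while a
            handgun responds to smaller handheld movements."""
            gains = {"shoulder_mount": 0.2, "handgun": 1.0}
            return gains[weapon] * device_delta

        print(aim_response(10.0, "shoulder_mount"))   # -> 2.0
        print(aim_response(10.0, "handgun"))          # -> 10.0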
  • A specific weapon-reloading gesture of the handheld device 100 can be recognized, e.g., a rapid back-and-forth movement of the handheld device that simulates the pump action of a shotgun.
  • A gesture command could be defined to involve a forward motion of the handheld device followed within a predetermined time period by a backward motion of the handheld device, which gesture could be associated with a pump action.
  • the forward and backward motions could further be specified to involve a predetermined magnitude change in one or more of position, velocity and acceleration of the handheld device 100 .
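  • A sketch of such a gesture detector appears below; the velocity threshold, sampling period and time window are assumed values:

        def detect_pump_reload(velocities, dt=0.005, v_threshold=0.5, window=0.3):
            """Detect a forward motion of the handheld device followed, within a
            predetermined time window, by a backward motion of sufficient magnitude."""
            forward_at = None
            for i, v in enumerate(velocities):       # v: forward velocity of device 100
                t = i * dt
                if v > v_threshold:
                    forward_at = t                   # forward stroke seen
                elif v < -v_threshold and forward_at is not None:
                    if t - forward_at <= window:     # backward stroke arrived in time
                        return True
                    forward_at = None
            return False

        # 0.1 s forward stroke, a brief pause, then a backward stroke -> True
        print(detect_pump_reload([0.8] * 20 + [0.0] * 10 + [-0.8] * 20))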
  • The provision of handheld devices and user interfaces according to the present invention enables the full 6 degrees of freedom of motion data and button press activity of the handheld device 100 to interact with the game application 120 to produce a full immersion experience.
  • the game alters to fit the emulated device and those changes are made in both the handheld motion response equations and in the visualization.
  • exemplary embodiments of the present invention couple the application 120 and the movements and actions of the device 100 together in unique combinations.
  • the result is an experience that goes beyond individual device and software performance and instead achieves a holistic performance that rivals immersion gaming.
  • Weapon size, capacity, reloading and activation can all be emulated by fully exploiting the measurement of motion and button activations of the handheld device.
  • the software can adjust how that motion activity is interpreted to fully convey the size. This then leads to a semantic gaming implementation.
  • the generic device 100 is semantically transformed into the desired operational device by the application.
  • special purpose devices 100 could be designed that emphasize certain performance traits over others. These devices would still be able to convey 2 or 6 degree of freedom information to the computer 110 and/or application 120 , but would also be configured to provide special information, or simply ergonomic tailoring. One item in particular that could be adjusted is grip.
  • a recoil feature could be added. For example, pulling a trigger could release a spring-loaded mass within the gun, causing the mass to strike against a stop. Between shots, a gear drive, worm, or similar machine would recompress the spring and reset the latch. The actual movements of the handheld device 100 associated with the recoil could be measured and transmitted back to the application 120 such that the visual display of the gun on the screen showed the movements of the emulated gun caused by the recoil. In this way, users that were better able to handle the recoil or adapt to the recoil, would see their weapons operate more effectively in the gaming application.
  • While exemplary embodiments describe a user interacting with or facing a user interface (displaying a gaming application for example) on a single display (such as display 408 of FIG. 4 ), the user interface may be extended to cover multiple displays. For example, four such displays may be positioned on each of the four sides surrounding a user or game participant. Similarly, three such displays may be used—one on each of two sides of the user and one in front of the user. Alternatively, multiple displays may be positioned above each other in a vertical orientation. Multiple displays can also be positioned next to each other in a horizontal orientation. The use of multiple displays requires an operating system that supports multiple screens.
  • a user's manipulation or motion of the device 100 may result in a pointer movement from a first display to a second display for example. Once the pointer has been moved or navigated from a first display to a second display, the user may re-orient his or her position so that the user now faces the second display.
  • a sub channel may be used to communicate data from the second handheld device. This may be applicable in those gaming environments where two (or, more) players may be involved for example.
  • the embodiments described above may facilitate adapting a user input device 100 to perform functionality associated with a steering wheel, a squash racket or a fishing rod/reel in a gaming environment for example.
  • the method 1000 includes executing an application program 120 on processor 110 at 1010 .
  • a communication data link may be established between the handheld user input device 100 and the processor 110 at 1020 .
  • User input may be received at 1030 by processor 110 from handheld user input device 100 .
  • Data may be received by application program 120 from processor 110 at 1040 .
  • User input corresponding to one of motion and orientation of handheld user input device 100 may be reflected on a display at 1050 .

Abstract

A gaming system comprises a handheld user input device, a processor for receiving input data from the handheld user input device and a gaming application for receiving data from the processor based on the received input data from the handheld user input device. The handheld user input device emulates one of a plurality of different devices associated with the gaming application. Each of the plurality of different devices has a set of input commands associated therewith. The set of input commands associated with the one of a plurality of different devices includes a command based on one of motion and orientation of the handheld device.

Description

RELATED APPLICATIONS
This application is a continuation of U.S. patent application Ser. No. 14/218,156, filed on Mar. 18, 2014; which is a continuation of U.S. patent application Ser. No. 13/421,132, filed on Mar. 15, 2012, now U.S. Pat. No. 8,795,079, issued on Aug. 5, 2014; which is a continuation of U.S. patent application Ser. No. 11/286,702, filed on Nov. 23, 2005, now U.S. Pat. No. 8,137,195, issued on Mar. 20, 2012; which claims the benefit of U.S. Provisional Patent Application No. 60/630,408, filed on Nov. 23, 2004; the contents of each of which are incorporated herein by reference.
This application is also related to U.S. patent application Ser. No. 11/119,663, filed on May 2, 2005, entitled “Freespace Pointing Devices and Methods”; and U.S. patent application Ser. No. 10/768,432, filed on Jan. 30, 2004, entitled “A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items”. The subject matter of each of these applications is incorporated in its entirety herein by reference.
BACKGROUND
The present invention relates, generally, to user interfaces and methods associated therewith and, more specifically, to user interfaces and methods involving semantically coupling the actions of handheld devices into application inputs.
User interfaces are ubiquitous in today's society. Computers, cell phones, fax machines and televisions, to name a few products, all employ user interfaces. User interfaces are intended to provide a mechanism for users to easily access and manipulate the sometimes complex functionality of the devices that they support. An example of a user interface is found in U.S. patent application Ser. No. 10/768,432, filed on Jan. 30, 2004, entitled "A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items", the disclosure of which is incorporated here by reference. Typically, such user interfaces employ remote, handheld devices to provide inputs to their respective applications.
Most interface designs for gaming applications include special purpose remote input devices with a limited range of motion sensitivity. The result is that morphing devices for different purposes within one gaming application, or across different games, is difficult and restrictive. So-called virtual reality systems provide more flexibility but, e.g., by requiring a user to "suit up", are physically burdensome and expensive. Accordingly, it would be desirable to provide new user interfaces and handheld devices which overcome the drawbacks associated with existing gaming (and other) applications.
SUMMARY OF THE INVENTION
In one embodiment, a gaming system comprises a handheld user input device, a processor for receiving input data from the handheld user input device and a gaming application for receiving data from the processor based on the received input data from the handheld user input device. The handheld user input device emulates one of a plurality of different devices associated with the gaming application. Each of the plurality of different devices has a set of input commands associated therewith. The set of input commands associated with the one of a plurality of different devices includes a command based on one of motion and orientation of the handheld device.
In another embodiment, a method for using a handheld user input device comprises: executing an application program on a processor, establishing a communication data link between the handheld user input device and the processor, receiving user input data by the processor via the handheld user input device, receiving data by the application program from the processor based on the received user input data, and reflecting user input on a display wherein the user input corresponds to one of motion and orientation of the handheld user input device.
In a further embodiment, a system comprises: a handheld user input device, a processor for receiving input data from the handheld user input device, and a software application executing on the processor and receiving data from the processor based on input data received from the handheld user input device. The handheld user input device emulates one of a plurality of different devices associated with the application. Each of the plurality of different devices has a set of input commands associated therewith wherein the commands include a command based on one of motion and orientation of the handheld device.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings illustrate exemplary embodiments of the present invention, wherein:
FIG. 1 depicts an exemplary system according to an exemplary embodiment of the present invention;
FIG. 2 depicts an exemplary media system in which exemplary embodiments of the present invention can be implemented;
FIG. 3 depicts a system controller of FIG. 2 in more detail;
FIG. 4 depicts a free-space pointing device;
FIG. 5 depicts sensors in a free-space pointing device;
FIG. 6 depicts a process model that describes the general operation of a free-space pointing device;
FIG. 7 illustrates an exemplary hardware architecture using a free-space pointing device;
FIG. 8 is a state diagram depicting a stationary detection mechanism for a free-space pointing device;
FIGS. 9A-9D illustrate an exemplary free-space pointing device; and
FIG. 10 illustrates a method in accordance with exemplary embodiments.
DETAILED DESCRIPTION
The following detailed description of the invention refers to the accompanying drawings. The same reference numbers in different drawings identify the same or similar elements. Also, the following detailed description does not limit the invention. Instead, the scope of the invention is defined by the appended claims.
Exemplary embodiments of the present invention provide a generic user input device that tracks user movement in free-space. Both gestures and actions can be measured, recorded and responded to by the associated application(s). The result is a low-cost and flexible gaming device that can, via the appropriate application software, become all manner of units. For example, any number of weapon types like swords, guns, grenade launchers and so on can be modeled with one common handheld device. Exemplary embodiments of the present invention are also much cheaper to implement than full immersion virtual reality systems with and without haptic feedback.
A high level view of an exemplary embodiment of the present invention is provided in FIG. 1. Therein, a handheld device (or, user input detector) 100 provides input to computer 110 which is running an application 120 such as a gaming application for example. According to one exemplary embodiment of the present invention, the handheld device 100 is a free-space device capable of sensing movement in at least 2 degrees of freedom and preferably 6 degrees of freedom. An example of a free-space pointing device is found in U.S. Provisional Patent Application 60/612,571, filed on Sep. 23, 2004, entitled "Free Space Pointing Devices and Methods" (corresponding to U.S. application Ser. No. 11/119,663 cited above), the disclosure of which is incorporated here by reference. Also, in the same exemplary embodiment, the computer 110 could be a gaming console, PC or set-top box, among others, and the application 120 may be a game.
The application 120 can be a gaming application using a user interface such as that described in U.S. Patent Application Publication No. US 2005/0125826, which corresponds to U.S. patent application Ser. No. 11/029,329 filed on Jan. 5, 2005, which is a continuation of U.S. patent application Ser. No. 10/768,432 filed on Jan. 30, 2004, that is also incorporated here by reference.
In order to provide some context for this description, an exemplary aggregated media system 200 in which the present invention can be implemented will first be described with respect to FIG. 2. Those skilled in the art will appreciate, however, that this invention is not restricted to implementation in this type of media system and that more or fewer components can be included therein.
The media system 200 may include an input/output (I/O) bus 210 that connects the components in the media system together. The I/O bus 210 represents any of a number of different mechanisms and techniques for routing signals between the media system components. For example, the I/O bus 210 may include an appropriate number of independent audio "patch" cables that route audio signals, coaxial cables that route video signals, two-wire serial lines or IR or radio frequency (RF) transceivers that route control signals, optical fiber or any other routing mechanisms that route other types of signals.
In this exemplary embodiment, the media system 200 includes a television/monitor 212, a gaming console or gaming device 214, digital video disk (DVD) recorder/playback device 216, audio/video tuner 218, and compact disk (CD) player 220 coupled to the I/O bus 210. The DVD 216 and CD player 220 may be single disk or single cassette devices, or alternatively may be multiple disk or multiple cassette devices. They may be independent units or integrated together. In addition, the media system 200 includes a microphone/speaker system 222, video camera 224, and a wireless I/O control device 226.
The wireless I/O control device 226 may be a media system remote control unit that supports free-space pointing, has a minimal number of buttons to support navigation, and communicates with the entertainment system 200 through RF signals. For example, wireless I/O control device 226 can be a free-space pointing device that uses a gyroscope or other mechanism to define both a screen position and a motion vector to determine the particular command desired. A set of buttons can also be included on the wireless I/O device 226 to initiate a “click” primitive as well as a “back” button. In another exemplary embodiment, wireless I/O control device 226 is a media system remote control unit, which communicates with the components of the entertainment system 200 through IR signals. In yet another embodiment, wireless I/O control device 226 may be an IR remote control device similar in appearance to a typical entertainment system remote control with the added feature of a track-ball or other navigational mechanisms, which allows a user to position a cursor on a display.
The system 200 also includes a system controller 228, which may operate to store and display entertainment system data available from a plurality of entertainment system data sources and to control a wide variety of features associated with each of the system components. As depicted in FIG. 2, system controller 228 is coupled, either directly or indirectly, to each of the system components, as necessary, through I/O bus 210. In one exemplary embodiment, in addition to or in place of I/O bus 210, system controller 228 is configured with a wireless communication transmitter (or transceiver), which is capable of communicating with the system components via IR signals or RF signals. Regardless of the control medium, the system controller 228 is configured to control the media components of the media system 200 via a GUI described below.
As further illustrated in FIG. 2, media system 200 may be configured to receive media items from various media sources and service providers. In this exemplary embodiment, media system 200 receives media input from and, optionally, sends information to, any or all of the following sources: cable broadcast 230, satellite broadcast 232 (e.g., via a satellite dish), very high frequency (VHF) or ultra high frequency (UHF) radio frequency communication of the broadcast television networks 234 (e.g., via an aerial antenna), telephone network 236, and cable modem 238 (or another source of Internet content). Those skilled in the art will appreciate that the media components and media sources illustrated and described with respect to FIG. 2 are purely exemplary and that media system 200 may include more or fewer of both. For example, other types of inputs to the system include AM/FM radio and satellite radio.
FIG. 3 is a block diagram illustrating an embodiment of an exemplary system controller 228, which can, for example, be implemented as a set-top box and include, for example, a processor 300, memory 302, a display controller 304, other device controllers 306 (e.g., associated with the other components of system 200), one or more data storage devices 308, and an I/O interface 310. These components communicate with the processor 300 via bus 312. Those skilled in the art will appreciate that processor 300 can be implemented using one or more processing units.
Memory device(s) 302 may include, for example, DRAM or SRAM, ROM, some of which may be designated as cache memory, which store software to be run by processor 300 and/or data usable by such programs, including software and/or data associated with the GUIs described below. Display controller 304 is operable by processor 300 to control the display of monitor 212 to, among other things, display GUI screens and objects as described below. Zoomable GUIs provide resolution independent zooming, so that monitor 212 can provide displays at any resolution. Device controllers 306 provide an interface between the other components of the media system 200 and the processor 300. Data storage 308 may include one or more of a hard disk drive, a floppy disk drive, a CD-ROM device, or other mass storage device. Input/output interface 310 may include one or more of a plurality of interfaces including, for example, a keyboard interface, an RF interface, an IR interface and a microphone/speech interface. I/O interface 310 may include an interface for receiving location information associated with movement of a wireless pointing device.
Generation and control of a GUI to display media item selection information is performed by the system controller 228 in response to the processor 300 executing sequences of instructions contained in the memory 302. Such instructions may be read into the memory 302 from other computer-readable media such as data storage device(s) 308 or from a computer connected externally to the media system 200. Execution of the sequences of instructions contained in the memory 302 causes the processor to generate GUI objects and controls, among other things, on monitor 212.
In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions. It will be understood that control frameworks described herein overcome limitations of conventional interface frameworks, for example those associated with the television industry. The terms “GUI”, “GUI screen”, “display” and “display screen” are intended to be generic and refer to television displays, computer displays, and any other display device.
As described in the above-incorporated Provisional Patent Application No. 60/612,571, various different types of remote devices can be used as the input device 100, 226, including, for example, trackballs, “mouse”-type pointing devices, light pens, etc., as well as free-space pointing devices. The phrase “free-space pointing” refers to the ability of an input device to move in three (or more) dimensions in the air in front of a display screen, for example, and the corresponding ability of the user interface to translate those motions directly into user interface commands, e.g., movement of a cursor on the display screen.
Data can be transferred between the free-space pointing device and the computer or other device either wirelessly or via a wire connecting the free-space pointing device to the other device. Thus “free-space pointing” differs from conventional computer mouse pointing techniques, for example, which use a surface, e.g., a desk surface or mouse pad, as a proxy surface from which relative movement of the mouse is translated into cursor movement on the computer display screen.
An exemplary free-space pointing device 400 (corresponding to user input detector 100 of FIG. 1 for example) is depicted in FIG. 4. User movement of the free-space pointing device can be defined, for example, in terms of a combination of x-axis attitude (roll), y-axis elevation (pitch) and/or z-axis heading (yaw) motion of the device 400. Linear movement of the pointing device 400 along the x, y, and z axes may also be measured to generate cursor movement or other user interface commands. In the exemplary embodiment depicted in FIG. 4, the free-space pointing device 400 includes two buttons 402, 404 and a scroll wheel 406, although other embodiments can have other physical configurations.
The free-space pointing device 400 may be held by a user in front of a display 408, such as the monitor 212, and motion of the pointing device 400 is translated into output signals that are usable for interaction with information presented on the display 408, e.g., to move a cursor 410 on the display 408. It will be understood that the display 408 may be included in the computer 110 depicted in FIG. 1.
Rotation of the device 400 about the y-axis can be sensed by the device 400, for example, and translated into an output usable by the system to move cursor 410 along the y2 axis of the display 408. Likewise, rotation of the device 400 about the z-axis can be sensed and translated into an output usable by the system to move cursor 410 along the x2 axis of the display 408. It will be appreciated that the output of pointing device 400 can be used to interact with the display 408 in a number of ways other than (or in addition to) cursor movement.
For example, the device 400 can control cursor fading or control volume or media transport (play, pause, fast-forward, and rewind) in a system such as the media entertainment system 200. Input commands may also include, for example, a zoom in or zoom out on a particular region of a display. A cursor may or may not be visible. Similarly, rotation of the free-space pointing device 400 sensed about the x-axis of free space pointing device 400 can be used in addition to, or as an alternative to, y-axis and/or z-axis rotation to provide input to a user interface.
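By way of illustration only, the following sketch shows one way the sensed rotations just described might be mapped to cursor motion. It is not the patented algorithm; the function name, gain value and sample interval are assumptions.

```python
# Minimal sketch (assumed gain and naming): map sensed rotation of the
# free-space pointing device 400 into cursor motion on display 408.

def rotation_to_cursor_delta(yaw_rate_rad_s, pitch_rate_rad_s, dt_s,
                             gain_px_per_rad=800.0):
    """Convert angular rates about the z-axis (yaw) and y-axis (pitch) into
    cursor deltas along the display's x2 and y2 axes, respectively."""
    dx = gain_px_per_rad * yaw_rate_rad_s * dt_s    # z-axis rotation -> x2 motion
    dy = gain_px_per_rad * pitch_rate_rad_s * dt_s  # y-axis rotation -> y2 motion
    return dx, dy

# Example: 0.2 rad/s of yaw over a 5 ms sample moves the cursor about 0.8 px in x2.
print(rotation_to_cursor_delta(0.2, 0.0, 0.005))
```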
Referring to FIG. 5, rotational sensors 502, 504 and an accelerometer 506 can be employed as sensors in the device 400. The sensors 502, 504 can, for example, be ADXRS150 sensors made by Analog Devices, although it will be appreciated by those skilled in the art that other types of rotational sensors can be used and that ADXRS150 sensors are simply illustrative examples. If the rotational sensors 502, 504 have a single sensing axis (as an ADXRS150 sensor does, for example), then they may be mounted in the free-space pointing device 400 such that their sensing axes are aligned with the rotations to be measured, although this is not necessary. In the exemplary embodiment depicted in FIGS. 4 and 5, this means that rotational sensor 502 is mounted such that its sensing axis is parallel to the y-axis and that rotational sensor 504 is mounted such that its sensing axis is parallel to the z-axis as shown.
Measurements and calculations are performed by the device 400 that are used to adjust the outputs of one or more of the sensors 502, 504, 506 and/or as part of the input used by a processor to determine an appropriate output for the user interface based on the outputs of the sensors 502, 504, 506. These measurements and calculations are used to compensate for several factors, such as errors associated with the sensors 502, 504, 506 and the manner in which a user uses the free-space pointing device 400, e.g., linear acceleration, tilt and tremor.
A process model 600 that describes the general operation of a free-space pointing device 400 is illustrated in FIG. 6. The sensors 502, 504, 506 produce analog signals that are sampled periodically, e.g., at 200 samples/second. The sampled output from the accelerometer 506 is indicated at block 602, and the sampled output values are converted from raw units to units of acceleration, e.g., gravities (g), as indicated by conversion function 604. An acceleration calibration block 606 provides values used for the conversion function 604. This calibration of the accelerometer output 602 can include, for example, compensation for one or more of scale, offset, and axis misalignment error associated with the accelerometer 506.
The accelerometer 506 may be used to compensate for fluctuations in the readings generated by the rotational sensors 502, 504 that are caused by variances in linear acceleration by multiplying the converted accelerometer readings by a gain matrix 610 and subtracting (or adding) the results from (or to) the corresponding sampled rotational sensor data 612. Similarly, linear acceleration compensation for the sampled rotational data from sensor 504 can be provided at block 614.
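A minimal sketch of these conversion, calibration and compensation steps is given below. The calibration constants, gain matrix values and function names are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

# Illustrative sketch of blocks 602-614: raw accelerometer counts are converted
# to gravities and corrected for offset, scale and axis misalignment; the
# converted readings are then multiplied by a gain matrix and subtracted from
# the sampled rotational sensor data to remove linear-acceleration effects.

ACC_OFFSET = np.array([512.0, 512.0, 512.0])     # assumed raw counts at 0 g
ACC_SCALE = np.diag([1 / 256.0] * 3)             # assumed counts -> gravities
ACC_MISALIGN = np.eye(3)                         # axis misalignment correction
GAIN_MATRIX = np.array([[0.01, 0.0, 0.0],        # assumed mapping from accel (g)
                        [0.0, 0.01, 0.0]])       # to spurious rotational reading

def convert_accel(raw_counts):
    """Blocks 602/604/606: calibrated acceleration in gravities."""
    return ACC_MISALIGN @ (ACC_SCALE @ (np.asarray(raw_counts) - ACC_OFFSET))

def compensate_rotation(raw_rotation_yz, accel_g):
    """Blocks 610-614: remove the linear-acceleration-induced error from the
    sampled rotational readings (alpha_y, alpha_z)."""
    return np.asarray(raw_rotation_yz) - GAIN_MATRIX @ accel_g
```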
Like the accelerometer data, the sampled rotational data 612 is then converted from a sampled unit value into a value associated with a rate of angular rotation, e.g., radians/s, at function 616. This conversion step can also include calibration provided by function 618 to compensate the sampled rotational data for factors such as scale and offset. To accomplish dynamic offset compensation, an input from a temperature sensor 619 may be used in rotation calibration function 618.
After conversion/calibration at block 616, the inputs from the rotational sensors 502, 504 can be further processed to rotate those inputs into an inertial frame of reference, i.e., to compensate for tilt associated with the manner in which the user is holding the free-space pointing device 400, at function 620.
Tilt correction to compensate for a user's holding the pointing device 400 at different x-axis rotational positions can be accomplished by determining the tilt of the device 400 using the inputs y and z received from accelerometer 506 at function 622. After the acceleration data is converted and calibrated as described above, it can be low-pass filtered at LPF 624 to provide an average acceleration value to the tilt determination function 622.
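The sketch below illustrates one way such tilt compensation could be organized (functions 620, 622 and 624). The filter constant and the simple rotation form are assumptions for illustration, not the patent's actual equations.

```python
import numpy as np

# Sketch of functions 620-624: low-pass filter the calibrated acceleration,
# estimate the tilt (x-axis rotation) from the y and z gravity components, and
# rotate the body-frame y/z angular rates into the inertial (display) frame.

def lowpass(prev_avg, sample, alpha=0.1):
    """LPF 624: exponential average supplying the tilt determination 622."""
    return (1.0 - alpha) * prev_avg + alpha * sample

def tilt_angle(accel_y, accel_z):
    """Function 622: tilt about the x-axis implied by gravity components y, z."""
    return np.arctan2(accel_y, accel_z)

def rotate_to_inertial(rate_y, rate_z, theta):
    """Function 620: rotate the sensed y/z angular rates by the tilt angle."""
    c, s = np.cos(theta), np.sin(theta)
    return c * rate_y - s * rate_z, s * rate_y + c * rate_z
```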
After compensation for linear acceleration, processing into readings indicative of angular rotation of the free-space pointing device 400, and compensation for tilt, post-processing can be performed at blocks 626 and 628 to compensate for factors such as human tremor. Although tremor may be removed using several different methods, one way to remove tremor is by using hysteresis. The angular velocity produced by rotation function 620 is integrated to produce an angular position.
Hysteresis of a calibrated magnitude is then applied to the angular position. The derivative is taken of the output of the hysteresis block to again yield an angular velocity. The resulting output is then scaled at function 628 (e.g., based on the sampling period) and used to generate a result within the interface, e.g., movement of the cursor 410 on the display 408.
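A minimal sketch of this integrate/hysteresis/differentiate chain (blocks 626 and 628) follows. The class structure, hysteresis magnitude and sample period are assumed values.

```python
import math

# Sketch of tremor removal by hysteresis: integrate angular velocity to an
# angular position, apply a dead band of calibrated magnitude, differentiate
# back to a velocity and scale it (block 628).

class TremorFilter:
    def __init__(self, hysteresis_rad=0.002, sample_period_s=1.0 / 200.0):
        self.hysteresis = hysteresis_rad
        self.dt = sample_period_s
        self.position = 0.0    # integrated angular position
        self.held = 0.0        # hysteresis-limited position
        self.prev_held = 0.0

    def step(self, angular_velocity_rad_s, scale=1.0):
        self.position += angular_velocity_rad_s * self.dt
        error = self.position - self.held
        # The output position only follows the input once the input has moved
        # further than the calibrated hysteresis magnitude away from it.
        if abs(error) > self.hysteresis:
            self.held = self.position - math.copysign(self.hysteresis, error)
        # Differentiate and scale to recover a tremor-free angular velocity.
        velocity = (self.held - self.prev_held) / self.dt
        self.prev_held = self.held
        return scale * velocity
```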
FIG. 7 illustrates an exemplary hardware architecture, including a processor 700 that communicates with other elements of the free-space pointing device 400 including a scroll wheel 702, JTAG 704, LEDs 706, switch matrix 708, IR photodetector 710, rotational sensors 712, accelerometer 714 and transceiver 716. The scroll wheel 702 is an optional input component that enables a user to provide input to the interface by rotating the scroll wheel 702. JTAG 704 provides a programming and debugging interface to the processor.
LEDs 706 provide visual feedback to a user, for example, when a button is pressed or a function activated. Switch matrix 708 receives inputs, e.g., indications that a button on the free-space pointing device 400 has been depressed or released, that are then passed on to processor 700. The optional IR photodetector 710 can be provided to enable the exemplary free-space pointing device to learn IR codes from other remote controls.
Rotational sensors 712 provide readings to processor 700 regarding, e.g., the y-axis and z-axis rotation of the free-space pointing device as described above. Accelerometer 714 provides readings to processor 700 regarding the linear acceleration of the free-space pointing device 400 that can be used as described above, e.g., to perform tilt compensation and to compensate for errors which linear acceleration introduces into the rotational readings generated by rotational sensors 712.
Transceiver 716 communicates information to and from free-space pointing device 400, e.g., to a controller in a device such as an entertainment system or to a processor associated with the computer 110. The transceiver 716 can be a wireless transceiver, e.g., operating in accordance with the BLUETOOTH standards for short-range wireless communication or an IR transceiver. Alternatively, free-space pointing device 400 can communicate with systems via a wire-line connection.
Stationary detection function 608 can operate to determine whether the free-space pointing device 400 is, for example, either stationary or active (moving). This categorization can be performed in a number of different ways. One way is to compute the variance of the sampled input data of all inputs (x, y, z, α_y, α_z) over a predetermined window, e.g., every quarter of a second. α_y and α_z are rotational data from the sensors 502, 504, respectively. This variance is then compared with a threshold to classify the free-space pointing device as either stationary or active.
By analyzing inputs from the pointing device 400 in the frequency domain, e.g., by performing a Fast Fourier Transform (FFT) and using peak detection, the processor 700 can also determine whether the free-space pointing device 400 is either stationary or active and detect the small movements of the free-space pointing device 400 introduced by a user's hand tremor. Tremor can be identified as peaks in the range of human tremor frequencies, e.g., nominally 8-12 Hz.
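The sketch below captures both ideas in simplified form; the window shape, variance threshold, peak test and sample rate are assumed values.

```python
import numpy as np

# Illustrative sketch of stationary detection 608: classify the device as
# active or stationary from the variance of a short window of samples, and
# look for a hand-tremor peak (nominally 8-12 Hz) in the spectrum.

SAMPLE_RATE_HZ = 200.0

def is_active(window, variance_threshold=1e-4):
    """window: array of shape (N, 5) holding x, y, z, alpha_y, alpha_z samples,
    e.g. a quarter second of data. Active if any channel's variance exceeds
    the threshold."""
    return bool(np.any(np.var(np.asarray(window), axis=0) > variance_threshold))

def tremor_detected(channel, band=(8.0, 12.0)):
    """True if a single channel's spectrum has a clear peak in the tremor band."""
    channel = np.asarray(channel, dtype=float)
    spectrum = np.abs(np.fft.rfft(channel - channel.mean()))
    freqs = np.fft.rfftfreq(len(channel), d=1.0 / SAMPLE_RATE_HZ)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    out_band = (~in_band) & (freqs > 0)
    return spectrum[in_band].max() > 3.0 * spectrum[out_band].mean()
```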
Although the variances in the frequency domain can be sensed within a particular frequency range, the actual frequency range to be monitored and used to characterize the status of the free-space pointing device 400 may vary. For example, the nominal tremor frequency range may shift based on, e.g., the ergonomics and weight of the free-space pointing device 400, e.g., from 8-12 Hz to 4-7 Hz.
As mentioned above, tremor data may be memorized as typically each user will exhibit a different tremor pattern. This property of user tremor can also be used to identify users. For example, a user's tremor pattern can be memorized by the system (either stored in the free-space pointing device 400 or transmitted to the system) during an initialization procedure wherein the user is requested to hold the free-space pointing device as steadily as possible for a period, e.g., 10 seconds. This pattern can be used as the user's unique signature to perform a variety of user interface functions.
For example, the user interface can identify the user from a group of users by comparing a current tremor pattern with patterns stored in memory. The identification can then be used, for example, to retrieve preference settings associated with the identified user.
For example, if the free-space pointing device is used in conjunction with the media systems described in the patent application incorporated by reference above, then the media selection item display preferences associated with that user can be activated after the system recognizes the user via tremor pattern comparison. System security can also be implemented using tremor recognition, e.g., access to the system may be forbidden or restricted based on the user identification performed after a user picks up the free-space pointing device 400.
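One way such tremor-pattern matching could be sketched is shown below. The spectral feature, distance metric and acceptance threshold are illustrative assumptions, not the patent's method.

```python
import numpy as np

# Sketch of tremor-pattern user identification: store a normalized tremor-band
# spectrum ("signature") per user during initialization, then identify the
# current holder by the nearest stored signature.

def tremor_signature(channel, sample_rate_hz=200.0, band=(4.0, 12.0)):
    channel = np.asarray(channel, dtype=float)
    spectrum = np.abs(np.fft.rfft(channel - channel.mean()))
    freqs = np.fft.rfftfreq(len(channel), d=1.0 / sample_rate_hz)
    sig = spectrum[(freqs >= band[0]) & (freqs <= band[1])]
    return sig / (np.linalg.norm(sig) + 1e-12)

def identify_user(channel, stored_signatures, max_distance=0.5):
    """stored_signatures: dict of user_id -> signature (same length as above).
    Returns the best-matching user_id, or None when no signature is close
    enough (which could be used to restrict access)."""
    sig = tremor_signature(channel)
    best_user, best_dist = None, float("inf")
    for user_id, stored in stored_signatures.items():
        dist = float(np.linalg.norm(sig - stored))
        if dist < best_dist:
            best_user, best_dist = user_id, dist
    return best_user if best_dist <= max_distance else None
```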
Stationary detection mechanism 608 can include a state machine, an example of which is depicted in FIG. 8. An ACTIVE state is, in the example, the default state during which the free-space pointing device 400 is being used to provide inputs to a user interface for example. The free-space pointing device 400 can enter the ACTIVE state on power-up of the device as indicated by a reset input. If the free-space pointing device 400 stops moving, it may then enter an INACTIVE state. The various state transitions depicted in FIG. 8 can be triggered by any of a number of different criteria including, but not limited to, data output from one or both of the rotational sensors 502 and 504, data output from the accelerometer 506, time domain data, frequency domain data, or any combination thereof.
State transition conditions are generically referred to here using the convention "condition_{stateA→stateB}". For example, the free-space pointing device 400 will transition from the ACTIVE state to the INACTIVE state when condition_{active→inactive} occurs. For the sole purpose of illustration, consider that condition_{active→inactive} can, in an exemplary free-space pointing device 400, occur when mean and/or standard deviation values from both the rotational sensor(s) and the accelerometer fall below predetermined threshold values for a predetermined time period.
State transitions can be determined by a number of different conditions based upon the interpreted sensor outputs. Exemplary condition metrics include the variance of the interpreted signals over a time window, the threshold between a reference value and the interpreted signal over a time window, the threshold between a reference value and the filtered interpreted signal over a time window, and the threshold between a reference value and the interpreted signal from a start time. All, or any combination, of these condition metrics can be used to trigger state transitions. Alternatively, other metrics can also be used. A transition from the INACTIVE state to the ACTIVE state may occur either when (1) a mean value of sensor output(s) over a time window is greater than predetermined threshold(s) or (2) a variance of values of sensor output(s) over a time window is greater than predetermined threshold(s) or (3) an instantaneous delta between sensor values is greater than a predetermined threshold.
The INACTIVE state enables the stationary detection mechanism 608 to distinguish between brief pauses during which the free-space pointing device 400 is still being used, e.g., on the order of a tenth of a second, and an actual transition to either a stable or stationary condition. This prevents the functions which are performed during the STABLE and STATIONARY states, described below, from inadvertently being performed when the free-space pointing device is being used. The free-space pointing device 400 will transition back to the ACTIVE state when condition_{inactive→active} occurs, e.g., if the free-space pointing device 400 starts moving again such that the measured outputs from the rotational sensor(s) and the accelerometer exceed the first threshold before a second predetermined time period in the INACTIVE state elapses.
The free-space pointing device 400 will transition to either the STABLE state or the STATIONARY state after the second predetermined time period elapses. As mentioned earlier, the STABLE state reflects the characterization of the free-space pointing device 400 as being held by a person but being substantially unmoving, while the STATIONARY state reflects a characterization of the free-space pointing device as not being held by a person. Thus, an exemplary state machine can provide for a transition to the STABLE state after a second predetermined time period has elapsed if minimal movement associated with hand tremor is present or, otherwise, transition to the STATIONARY state.
The STABLE and STATIONARY states define times during which the free-space pointing device 400 can perform various functions. For example, since the STABLE state is intended to reflect times when the user is holding the free-space pointing device 400 but is not moving it, the device can record the movement of the free-space pointing device 400 when it is in the STABLE state e.g., by storing outputs from the rotational sensor(s) and/or the accelerometer while in this state. These stored measurements can be used to determine a tremor pattern associated with a particular user or users as described below. Likewise, when in the STATIONARY state, the free-space pointing device 400 can take readings from the rotational sensors and/or the accelerometer for use in compensating for offset.
If the free-space pointing device 400 starts to move while in either the STABLE or STATIONARY state, this can trigger a return to the ACTIVE state. Otherwise, after measurements are taken, the device can transition to the SLEEP state. While in the SLEEP state, the device can enter a power-down mode wherein power consumption of the free-space pointing device is reduced and, e.g., the sampling rate of the rotational sensors and/or the accelerometer is also reduced. The SLEEP state can also be entered via an external command so that the user or another device can command the free-space pointing device 400 to enter the SLEEP state.
Upon receipt of another command, or if the free-space pointing device 400 begins to move, the device can transition from the SLEEP state to the WAKEUP state. Like the INACTIVE state, the WAKEUP state provides an opportunity for the device to confirm that a transition to the ACTIVE state is justified, e.g., that the free-space pointing device 400 was not inadvertently jostled.
The conditions for state transitions may be symmetrical or may differ. Thus, the threshold associated with condition_{active→inactive} may be the same as (or different from) the threshold(s) associated with condition_{inactive→active}. This enables free-space pointing devices to capture user input more accurately. For example, exemplary embodiments which include a state machine implementation allow, among other things, for the threshold for transition into a stationary condition to be different from the threshold for the transition out of a stationary condition.
Entering or leaving a state can be used to trigger other device functions as well. For example, the user interface can be powered up based on a transition from any state to the ACTIVE state. Conversely, the free-space pointing device and/or the user interface can be turned off (or enter a sleep mode) when the free-space pointing device transitions from ACTIVE or STABLE to STATIONARY or INACTIVE. Alternatively, the cursor 410 can be displayed or removed from the screen based on the transition from or to the stationary state of the free-space pointing device 400.
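A compact sketch of the state machine of FIG. 8 is given below, with assumed timeouts and a simplified motion/tremor test standing in for the condition metrics described above; the real transition conditions may combine sensor, time-domain and frequency-domain data.

```python
from enum import Enum, auto

# Minimal sketch of the FIG. 8 stationary detection state machine.
# Timeout values are assumptions; `moving` and `tremor_present` stand in for
# whatever combination of condition metrics is actually used.

class State(Enum):
    ACTIVE = auto()
    INACTIVE = auto()
    STABLE = auto()
    STATIONARY = auto()
    SLEEP = auto()
    WAKEUP = auto()

class StationaryDetector:
    def __init__(self, inactive_timeout_s=0.1, stationary_timeout_s=1.0):
        self.state = State.ACTIVE          # default state after reset/power-up
        self.quiet_time = 0.0
        self.inactive_timeout = inactive_timeout_s
        self.stationary_timeout = stationary_timeout_s

    def step(self, moving, tremor_present, dt):
        if moving:
            self.quiet_time = 0.0
            # Motion from SLEEP goes through WAKEUP to confirm; otherwise ACTIVE.
            self.state = State.WAKEUP if self.state == State.SLEEP else State.ACTIVE
            return self.state
        self.quiet_time += dt
        if self.state == State.ACTIVE and self.quiet_time > self.inactive_timeout:
            self.state = State.INACTIVE
        elif self.state == State.INACTIVE and self.quiet_time > self.stationary_timeout:
            # Hand tremor implies the device is held -> STABLE, else STATIONARY.
            self.state = State.STABLE if tremor_present else State.STATIONARY
        elif self.state in (State.STABLE, State.STATIONARY):
            self.state = State.SLEEP       # after measurements are taken, power down
        return self.state
```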
The STABLE state can be used to memorize tremor data. Typically, each user will exhibit a different tremor pattern. This property of user tremor can also be used to identify users. For example, a user's tremor pattern can be memorized by the system (either stored in the free-space pointing device 400 or transmitted to the system) during an initialization procedure in which the user is requested to hold the free-space pointing device as steadily as possible for, e.g., 10 seconds.
This pattern can be used as the user's unique signature to perform a variety of user interface functions. For example, the user interface can identify the user from a group of users by comparing a current tremor pattern with those stored in memory. The identification can then be used, for example, to retrieve preference settings associated with the identified user. For example, if the free-space pointing device is used in conjunction with a media system, then the media selection item display preferences associated with that user can be activated after the system recognizes the user via tremor pattern comparison. System security can also be implemented using tremor recognition, e.g., access to the system may be forbidden or restricted based on the user identification performed after a user picks up the free-space pointing device 400.
According to one exemplary embodiment of the present invention, operation of a handheld device may be as follows. The handheld device 100 communicates both device movement (delta x, delta y, delta z) and orientation to the computer 110, e.g., over a wireless data link such as Bluetooth. In addition, other actions performed by the user are communicated, e.g., button presses and gesture recognition. Computer 110 then makes that information available to the application software 120 for processing.
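For illustration only, the sketch below shows one possible shape for the per-update report the handheld device 100 might transmit to computer 110 over the wireless data link. The field names, units and byte layout are assumptions, not a documented protocol.

```python
import struct
from dataclasses import dataclass

# Sketch of an assumed report format: movement deltas, orientation, a button
# bitmask, and an identifier for any gesture recognized on the device.

@dataclass
class HandheldReport:
    delta_x: float        # movement since the previous report
    delta_y: float
    delta_z: float
    roll: float           # orientation, radians
    pitch: float
    yaw: float
    buttons: int          # bitmask of currently pressed buttons
    gesture_id: int       # nonzero when an on-device gesture was recognized

def decode_report(payload: bytes) -> HandheldReport:
    """Unpack a fixed-layout little-endian report: six floats then two uint16."""
    values = struct.unpack("<6f2H", payload)
    return HandheldReport(*values[:6], buttons=values[6], gesture_id=values[7])
```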
As described above, an exemplary handheld device (or free-space pointing device) 100 may resemble a mouse such as mouse 400 illustrated in FIGS. 4 and 5. Button presses on mouse 400 correspond to actuation of buttons 402 and 404 and scroll wheel 406 for example. A button may correspond to a particular function or action within a gaming application for example.
Another exemplary free-space pointing device 100 may resemble a loop-shaped device such as that illustrated in FIGS. 9A-9D and designed by Hillcrest Labs of Rockville, Md. Free-space pointing device 900 includes buttons 902 and 904 and scroll wheel 906 (corresponding to buttons 402 and 404 and scroll wheel 406 of FIG. 4).
In addition, pointing device 900 may include programmable buttons 908 and 910. Each of these buttons may be programmed to perform a particular function or action within a gaming application for example. Free-space pointing device 900 may also include a grip 915 which may facilitate a better hold of the device to a user or game participant. In some embodiments or for use in particular gaming applications, grip 915 may include a plurality of sensors associated with a particular functionality for example.
The application software 120 takes the input data (from device 100 via computer 110) and processes it according to the game scenario of interest. In the most basic of implementations, the handheld device 100 emulates a gun or similar weapon and the bullet, grenade, ray blast or similar projectile is fired when the user hits the button. In one exemplary, but purely illustrative, embodiment, the handheld device 100 could be modeling a gun. The game application 120, and potentially the movement characteristics, could change depending on what type of gun the device 100 was being used to model. Even the number of firings could be changed depending on the type of weapon. The direction the gun fires in the game corresponds to the direction that the handheld device 100 is pointed.
For more complex interactions, the movement characteristics of the handheld device 100 may change depending on what type of device it is currently emulating. For example, if the handheld device 100 is meant to be a shoulder-mount weapon, the movement processing equations are altered to simulate inertia by requiring more significant handheld device movement to result in any on-screen motion. By way of comparison, if the handheld device 100 is emulating a handgun, then a different set of movement processing equations are applied to the data, such that less significant handheld device movements result in on-screen adjustment of the direction of fire of the weapon.
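The sketch below shows one way such per-weapon "movement processing equations" might be parameterized; the profile values, dead zone and inertia model are illustrative assumptions rather than the patent's equations.

```python
from dataclasses import dataclass

# Sketch of per-weapon movement processing: a shoulder-mount weapon demands
# larger handheld motion (simulated inertia) before the on-screen aim moves,
# while a handgun responds to smaller motions.

@dataclass
class WeaponProfile:
    gain: float        # on-screen aim change per unit of handheld rotation
    dead_zone: float   # minimum rotation (rad) before any on-screen motion
    inertia: float     # 0..1, fraction of previous aim velocity carried over

SHOULDER_MOUNT = WeaponProfile(gain=0.4, dead_zone=0.02, inertia=0.8)
HANDGUN = WeaponProfile(gain=1.2, dead_zone=0.002, inertia=0.2)

def update_aim(aim_velocity, handheld_rotation, profile):
    """Apply the active weapon's movement processing to one rotation sample."""
    if abs(handheld_rotation) < profile.dead_zone:
        handheld_rotation = 0.0
    target = profile.gain * handheld_rotation
    return profile.inertia * aim_velocity + (1.0 - profile.inertia) * target
```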
Furthermore, the usage of the weapon in the gaming application 120 can be altered based on movement of the handheld device 100. For example, in order to reload the weapon in the game, a specific weapons reloading gesture of the handheld device 100 can be recognized, e.g., a rapid back and forth movement of the handheld device that simulates the pump action on a shotgun. More specifically, a gesture command could be defined to involve a forward motion of the handheld device followed within a predetermined time period by a backward motion of the handheld which gesture could be associated with a pump action. The forward and backward motions could further be specified to involve a predetermined magnitude change in one or more of position, velocity and acceleration of the handheld device 100.
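A simple sketch of recognizing such a pump-action reload gesture from the device's forward-axis motion follows; the thresholds, sample period and timing window are assumed values.

```python
# Sketch of the pump-action reload gesture: a forward stroke followed, within
# a predetermined time period, by a backward stroke, each exceeding a
# predetermined magnitude. All numeric values are assumptions.

def detect_pump_gesture(samples, forward_threshold=0.5, backward_threshold=-0.5,
                        max_gap_s=0.5, sample_period_s=1.0 / 200.0):
    """samples: sequence of forward-axis velocities from the handheld device.
    Returns True if a forward stroke is followed by a backward stroke within
    the allowed time window."""
    forward_index = None
    for i, v in enumerate(samples):
        if v >= forward_threshold and forward_index is None:
            forward_index = i
        elif forward_index is not None and v <= backward_threshold:
            if (i - forward_index) * sample_period_s <= max_gap_s:
                return True
            forward_index = None  # backward stroke came too late; start over
    return False
```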
Unlike conventional systems, some of which only recognize point and click operations, the provision of handheld devices and user interfaces according to the present invention enable the full 6-degrees of motion data and button press activity of the handheld device 100 to interact with the game application 120 to produce a full immersion experience. The game alters to fit the emulated device and those changes are made in both the handheld motion response equations and in the visualization.
Thus, exemplary embodiments of the present invention couple the application 120 and the movements and actions of the device 100 together in unique combinations. The result is an experience that goes beyond individual device and software performance and instead achieves a holistic performance that rivals immersion gaming. Weapon size, capacity, reloading and activation can all be emulated by fully exploiting the measurement of motion and button activations of the handheld device. The software can adjust how that motion activity is interpreted to fully convey the size. This then leads to a semantic gaming implementation. The generic device 100 is semantically transformed into the desired operational device by the application.
According to other exemplary embodiments of the present invention, special purpose devices 100 could be designed that emphasize certain performance traits over others. These devices would still be able to convey 2 or 6 degree of freedom information to the computer 110 and/or application 120, but would also be configured to provide special information, or simply ergonomic tailoring. One item in particular that could be adjusted is grip.
Additionally, or alternatively, a recoil feature could be added. For example, pulling a trigger could release a spring-loaded mass within the gun, causing the mass to strike against a stop. Between shots, a gear drive, worm, or similar mechanism would recompress the spring and reset the latch. The actual movements of the handheld device 100 associated with the recoil could be measured and transmitted back to the application 120 such that the visual display of the gun on the screen showed the movements of the emulated gun caused by the recoil. In this way, users that were better able to handle the recoil or adapt to the recoil would see their weapons operate more effectively in the gaming application.
While exemplary embodiments describe a user interacting with or facing a user interface (displaying a gaming application for example) on a single display (such as display 408 of FIG. 4), the user interface may be extended to cover multiple displays. For example, four such displays may be positioned on each of the four sides surrounding a user or game participant. Similarly, three such displays may be used—one on each of two sides of the user and one in front of the user. Alternatively, multiple displays may be positioned above each other in a vertical orientation. Multiple displays can also be positioned next to each other in a horizontal orientation. The use of multiple displays requires an operating system that supports multiple screens.
A user's manipulation or motion of the device 100 (in a particular pre-defined manner for example) may result in a pointer movement from a first display to a second display for example. Once the pointer has been moved or navigated from a first display to a second display, the user may re-orient his or her position so that the user now faces the second display.
Similarly, while exemplary embodiments describe the use of a single handheld device, multiple handheld devices can also be used. In such a scenario, a sub-channel may be used to communicate data from the second handheld device. This may be applicable in those gaming environments where two (or more) players may be involved for example.
The embodiments described above may facilitate adapting a user input device 100 to perform functionality associated with a steering wheel, a squash racket or a fishing rod/reel in a gaming environment for example.
A method in accordance with exemplary embodiments may be more clearly understood with respect to FIG. 10. The method 1000 includes executing an application program 120 on processor 110 at 1010. A communication data link may be established between the handheld user input device 100 and the processor 110 at 1020. User input may be received at 1030 by processor 110 from handheld user input device 100. Data may be received by application program 120 from processor 110 at 1040. User input corresponding to one of motion and orientation of handheld user input device 100 may be reflected on a display at 1050.
The above-described exemplary embodiments are intended to be illustrative in all respects, rather than restrictive, of the present invention. Thus the present invention is capable of many variations in detailed implementation that can be derived from the description contained herein by a person skilled in the art. For example, while the description focused on a gaming environment, exemplary embodiments may be equally applicable in a training environment or in visual acuity exercises for physical therapy.
All such variations and modifications are considered to be within the scope and spirit of the present invention as defined by the following claims. No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article “a” is intended to include one or more items.

Claims (20)

What is claimed is:
1. A system comprising a handheld user input device, a processor and an application, wherein:
the handheld user input device comprising a plurality of inertial sensors, including a three-axis accelerometer and at least one gyroscope, configured to provide data corresponding to motion of the handheld user input device in any of three to six degrees of freedom in connection with emulating one of a plurality of different devices associated with the application;
the processor is configured to:
receive the data; and
compensate the data corresponding to the motion of the handheld user input device by transforming the data corresponding to the motion from a body frame of reference associated with the handheld user input device into an inertial frame of reference, wherein:
the data comprises an angular position in the inertial frame of reference,
the angular position has an axis of rotation in a plane perpendicular to gravity and comprises first, second and third values for first, second and third axes of rotation, respectively,
the first value is an estimate of roll of the handheld device,
the third axis of rotation corresponds to the handheld device's heading; and
the third value corresponds to an estimate of a current heading relative to an arbitrary reference point;
the application configured to:
execute on the processor;
receive the data from the processor;
render the one of the plurality of different devices; and
adjust on-screen motion of the one of the plurality of different devices based on the data.
2. The system of claim 1, wherein the application corresponds to a training program for evaluating a user undergoing therapy.
3. The system of claim 1, wherein the data provided by the handheld user input device comprises first data corresponding to a linear acceleration of the handheld user input device measured by the three-axis accelerometer and second data corresponding to at least one angular velocity of the handheld user input device measured by the at least one gyroscope.
4. The system of claim 1, wherein the system is a gaming system.
5. The system of claim 1, wherein the display comprises a plurality of displays being positioned surrounding a user.
6. The system of claim 1, wherein a particular motion associated with the handheld user input device results in moving a corresponding cursor from one display to a second display.
7. The system of claim 1, wherein the data is communicated to the processor via a wireless signal.
8. The system of claim 3, wherein the at least one gyroscope is a Coriolis vibratory gyroscope.
9. The system of claim 3, wherein the data comprises a value y generated by the three-axis accelerometer associated with acceleration of the handheld user input device in a y-axis direction and a value z generated by the three-axis accelerometer associated with acceleration of the handheld user input device in a z-axis direction.
10. The system of claim 1, wherein the processor is configured to:
determine a tremor pattern present in the data, and
determine a user identity of the user based on the tremor pattern matching at least one of a plurality of tremor patterns associated with user identities.
11. The system of claim 1, wherein the processor is configured to:
restrict access to the system based on recognition of a tremor pattern present in the data.
12. A method comprising:
provide, by a handheld user input device comprising a plurality of inertial sensors, including a three-axis accelerometer and at least one gyroscope, data corresponding to motion of the handheld user input device in any of three to six degrees of freedom in connection with emulating one of a plurality of different devices associated with an application;
receiving, by a processor, the data;
compensating, by the processor, the data corresponding to the motion of the handheld user input device by transforming the data corresponding to the motion from a body frame of reference associated with the handheld user input device into an inertial frame of reference, wherein:
the data comprises an angular position in the inertial frame of reference,
the angular position has an axis of rotation in a plane perpendicular to gravity and comprises first, second and third values for first, second and third axes of rotation, respectively,
the first value is an estimate of roll of the handheld device,
the third axis of rotation corresponds to the handheld device's heading; and
the third value corresponds to an estimate of a current heading relative to an arbitrary reference point;
executing, by the processor, the application, including:
receiving the data from said processor;
rendering the one of the plurality of different devices; and
adjusting on-screen motion of the one of the plurality of different devices based on the data.
13. The method of claim 12, wherein the data comprises first data corresponding to a linear acceleration of the handheld user input device measured by the three-axis accelerometer and second data corresponding to at least one angular velocity of the handheld user input device measured by the at least one gyroscope.
14. The method of claim 12, wherein the display comprises a plurality of displays being positioned surrounding a user.
15. The method of claim 12, wherein a particular motion associated with the handheld user input device results in moving a corresponding cursor from one display to a second display.
16. The method of claim 12, wherein the data is communicated to the processor via a wireless signal.
17. The method of claim 13, wherein the at least one gyroscope is a Coriolis vibratory gyroscope.
18. The method of claim 13, wherein the data comprises a value y generated by the three-axis accelerometer associated with acceleration of the handheld user input device in a y-axis direction and a value z generated by the three-axis accelerometer associated with acceleration of the handheld user input device in a z-axis direction.
19. The method of claim 12, further comprising:
determining a tremor pattern present in the data, and
determining a user identity of the user based on the tremor pattern matching at least one of a plurality of tremor patterns associated with user identities.
20. The method of claim 12, further comprising restricting access to the system based on recognition of a tremor pattern present in the data.
US16/184,610 2004-11-23 2018-11-08 Semantic gaming and application transformation Active US11154776B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/184,610 US11154776B2 (en) 2004-11-23 2018-11-08 Semantic gaming and application transformation

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US63040804P 2004-11-23 2004-11-23
US11/286,702 US8137195B2 (en) 2004-11-23 2005-11-23 Semantic gaming and application transformation
US13/421,132 US8795079B2 (en) 2004-11-23 2012-03-15 Semantic gaming and application transformation including movement processing equations based on inertia
US14/218,156 US10159897B2 (en) 2004-11-23 2014-03-18 Semantic gaming and application transformation
US16/184,610 US11154776B2 (en) 2004-11-23 2018-11-08 Semantic gaming and application transformation

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/218,156 Continuation US10159897B2 (en) 2004-11-23 2014-03-18 Semantic gaming and application transformation

Publications (2)

Publication Number Publication Date
US20190091571A1 US20190091571A1 (en) 2019-03-28
US11154776B2 true US11154776B2 (en) 2021-10-26

Family

ID=36498509

Family Applications (4)

Application Number Title Priority Date Filing Date
US11/286,702 Active 2028-01-03 US8137195B2 (en) 2004-11-23 2005-11-23 Semantic gaming and application transformation
US13/421,132 Active US8795079B2 (en) 2004-11-23 2012-03-15 Semantic gaming and application transformation including movement processing equations based on inertia
US14/218,156 Active 2028-04-06 US10159897B2 (en) 2004-11-23 2014-03-18 Semantic gaming and application transformation
US16/184,610 Active US11154776B2 (en) 2004-11-23 2018-11-08 Semantic gaming and application transformation

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US11/286,702 Active 2028-01-03 US8137195B2 (en) 2004-11-23 2005-11-23 Semantic gaming and application transformation
US13/421,132 Active US8795079B2 (en) 2004-11-23 2012-03-15 Semantic gaming and application transformation including movement processing equations based on inertia
US14/218,156 Active 2028-04-06 US10159897B2 (en) 2004-11-23 2014-03-18 Semantic gaming and application transformation

Country Status (2)

Country Link
US (4) US8137195B2 (en)
WO (1) WO2006058129A2 (en)

Families Citing this family (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7749089B1 (en) 1999-02-26 2010-07-06 Creative Kingdoms, Llc Multi-media interactive play system
US7445550B2 (en) 2000-02-22 2008-11-04 Creative Kingdoms, Llc Magical wand and interactive play experience
US7878905B2 (en) 2000-02-22 2011-02-01 Creative Kingdoms, Llc Multi-layered interactive play experience
US6761637B2 (en) 2000-02-22 2004-07-13 Creative Kingdoms, Llc Method of game play using RFID tracking device
US7066781B2 (en) 2000-10-20 2006-06-27 Denise Chapman Weston Children's toy with wireless tag/transponder
US6990639B2 (en) 2002-02-07 2006-01-24 Microsoft Corporation System and process for controlling electronic components in a ubiquitous computing environment using multimodal integration
US6967566B2 (en) 2002-04-05 2005-11-22 Creative Kingdoms, Llc Live-action interactive adventure game
US20070066396A1 (en) 2002-04-05 2007-03-22 Denise Chapman Weston Retail methods for providing an interactive product to a consumer
US10086282B2 (en) * 2002-07-27 2018-10-02 Sony Interactive Entertainment Inc. Tracking device for use in obtaining information for controlling game program execution
US7674184B2 (en) 2002-08-01 2010-03-09 Creative Kingdoms, Llc Interactive water attraction and quest game
US8745541B2 (en) 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
US9446319B2 (en) 2003-03-25 2016-09-20 Mq Gaming, Llc Interactive gaming toy
US7665041B2 (en) * 2003-03-25 2010-02-16 Microsoft Corporation Architecture for controlling a computer using hand gestures
US7038661B2 (en) * 2003-06-13 2006-05-02 Microsoft Corporation Pointing device and cursor for use in intelligent computing environments
US20050227217A1 (en) * 2004-03-31 2005-10-13 Wilson Andrew D Template matching on interactive surface
US7394459B2 (en) 2004-04-29 2008-07-01 Microsoft Corporation Interaction between objects and a virtual environment display
US7787706B2 (en) * 2004-06-14 2010-08-31 Microsoft Corporation Method for controlling an intensity of an infrared source used to detect objects adjacent to an interactive display surface
US7593593B2 (en) 2004-06-16 2009-09-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US8560972B2 (en) 2004-08-10 2013-10-15 Microsoft Corporation Surface UI for gesture-based interaction
US7942745B2 (en) 2005-08-22 2011-05-17 Nintendo Co., Ltd. Game operating device
JP4805633B2 (en) * 2005-08-22 2011-11-02 任天堂株式会社 Game operation device
US8313379B2 (en) 2005-08-22 2012-11-20 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US7927216B2 (en) 2005-09-15 2011-04-19 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US8870655B2 (en) 2005-08-24 2014-10-28 Nintendo Co., Ltd. Wireless game controllers
JP4262726B2 (en) 2005-08-24 2009-05-13 任天堂株式会社 Game controller and game system
US8308563B2 (en) 2005-08-30 2012-11-13 Nintendo Co., Ltd. Game system and storage medium having game program stored thereon
US7911444B2 (en) 2005-08-31 2011-03-22 Microsoft Corporation Input method for surface of interactive display
US8157651B2 (en) 2005-09-12 2012-04-17 Nintendo Co., Ltd. Information processing program
US8060840B2 (en) * 2005-12-29 2011-11-15 Microsoft Corporation Orientation free user interface
JP4530419B2 (en) 2006-03-09 2010-08-25 任天堂株式会社 Coordinate calculation apparatus and coordinate calculation program
JP4151982B2 (en) 2006-03-10 2008-09-17 任天堂株式会社 Motion discrimination device and motion discrimination program
JP4684147B2 (en) 2006-03-28 2011-05-18 任天堂株式会社 Inclination calculation device, inclination calculation program, game device, and game program
JP2007304666A (en) 2006-05-08 2007-11-22 Sony Computer Entertainment Inc Information output system and information output method
US7907117B2 (en) * 2006-08-08 2011-03-15 Microsoft Corporation Virtual controller for visual displays
JP4689585B2 (en) * 2006-11-29 2011-05-25 任天堂株式会社 Information processing apparatus and information processing program
JP5177735B2 (en) * 2006-12-01 2013-04-10 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
JP5127242B2 (en) 2007-01-19 2013-01-23 任天堂株式会社 Acceleration data processing program and game program
US8212857B2 (en) 2007-01-26 2012-07-03 Microsoft Corporation Alternating light sources to reduce specular reflection
US20080188277A1 (en) 2007-02-01 2008-08-07 Ritter Janice E Electronic Game Device And Method Of Using The Same
JP5060798B2 (en) * 2007-02-23 2012-10-31 任天堂株式会社 Information processing program and information processing apparatus
JP4918376B2 (en) * 2007-02-23 2012-04-18 任天堂株式会社 Information processing program and information processing apparatus
US20100292007A1 (en) 2007-06-26 2010-11-18 Nintendo Of America Inc. Systems and methods for control device including a movement detector
US7988555B2 (en) * 2007-07-27 2011-08-02 Empire Of Sports Developments Ltd. Method and device for controlling a motion-sequence within a simulated game or sports event
US7826999B1 (en) 2007-08-20 2010-11-02 Pni Corporation Magnetic tilt compensated heading compass with adaptive zoffset
KR101182286B1 (en) * 2007-09-19 2012-09-14 삼성전자주식회사 Remote controller for sensing motion, image display apparatus controlling pointer by the remote controller, and methods thereof
US8823645B2 (en) * 2010-12-28 2014-09-02 Panasonic Corporation Apparatus for remotely controlling another apparatus and having self-orientating capability
US9171454B2 (en) * 2007-11-14 2015-10-27 Microsoft Technology Licensing, Llc Magic wand
US20090131173A1 (en) * 2007-11-20 2009-05-21 Gurnsey Lori A Electronic elimination game system and method
WO2010002997A1 (en) * 2008-07-01 2010-01-07 Hillcrest Laboratories, Inc. 3d pointer mapping
US8159455B2 (en) * 2008-07-18 2012-04-17 Apple Inc. Methods and apparatus for processing combinations of kinematical inputs
US8847739B2 (en) * 2008-08-04 2014-09-30 Microsoft Corporation Fusing RFID and vision for surface object tracking
US20100031202A1 (en) * 2008-08-04 2010-02-04 Microsoft Corporation User-defined gesture set for surface computing
US20100105479A1 (en) 2008-10-23 2010-04-29 Microsoft Corporation Determining orientation in an external reference frame
US8645871B2 (en) 2008-11-21 2014-02-04 Microsoft Corporation Tiltable user interface
US8310447B2 (en) * 2008-11-24 2012-11-13 Lsi Corporation Pointing device housed in a writing device
US20100171635A1 (en) * 2009-01-02 2010-07-08 Ewig Industries Macao Commerical Offshore, Ltd. System And Method For Motion-Sensitive Remote Control For Audio-Visual Entertainment System
JP2010231736A (en) * 2009-03-30 2010-10-14 Sony Corp Input device and method, information processing device and method, information processing system, and program
US8719714B2 (en) 2009-07-08 2014-05-06 Steelseries Aps Apparatus and method for managing operations of accessories
US9737796B2 (en) * 2009-07-08 2017-08-22 Steelseries Aps Apparatus and method for managing operations of accessories in multi-dimensions
US20110050563A1 (en) * 2009-08-31 2011-03-03 Timothy Douglas Skutt Method and system for a motion compensated input device
US8816991B2 (en) * 2009-10-02 2014-08-26 Dedo Interactive, Inc. Touch input apparatus including image projection
US8549418B2 (en) * 2009-12-23 2013-10-01 Intel Corporation Projected display to enhance computer device use
US8810514B2 (en) * 2010-02-12 2014-08-19 Microsoft Corporation Sensor-based pointing device for natural input and interaction
GB2477959A (en) * 2010-02-19 2011-08-24 Sony Europ Navigation and display of an array of selectable items
US9336647B2 (en) 2011-08-29 2016-05-10 Igt Attract based on mobile device
KR20130060757A (en) * 2011-11-30 2013-06-10 삼성전기주식회사 Device for detecting motions and method for detecting motions
US8811938B2 (en) 2011-12-16 2014-08-19 Microsoft Corporation Providing a user interface experience based on inferred vehicle state
US9423874B2 (en) 2013-03-15 2016-08-23 Steelseries Aps Gaming accessory with sensory feedback device
US9687730B2 (en) 2013-03-15 2017-06-27 Steelseries Aps Gaming device with independent gesture-sensitive areas
USD751072S1 (en) 2014-02-18 2016-03-08 Merge Labs, Inc. Mobile head mounted display
US9274340B2 (en) 2014-02-18 2016-03-01 Merge Labs, Inc. Soft head mounted display goggles for use with mobile computing devices
CN105404385B (en) * 2014-05-30 2018-11-27 阿里巴巴集团控股有限公司 A kind of method and device of intelligent display terminal and somatosensory device realization data interaction
CN105808182B (en) 2015-01-15 2019-09-17 财团法人工业技术研究院 Display control method and system, advertisement breach judging device and video and audio processing device
USD760701S1 (en) 2015-02-20 2016-07-05 Merge Labs, Inc. Mobile head mounted display controller
USD755789S1 (en) 2015-02-20 2016-05-10 Merge Labs, Inc. Mobile head mounted display
US11273367B1 (en) * 2019-09-24 2022-03-15 Wayne Hughes Beckett Non-CRT pointing device
WO2021190752A1 (en) * 2020-03-26 2021-09-30 Eaton Intelligent Power Limited Tremor cancellation
US11731048B2 (en) * 2021-05-03 2023-08-22 Sony Interactive Entertainment LLC Method of detecting idle game controller

Citations (390)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB591019A (en) 1945-04-13 1947-08-05 Charles Stanley Hudson Improvements in or relating to remote indicating compasses
US2878286A (en) 1945-06-09 1959-03-17 Du Pont Process for preparing symmetrical hexachlorodiphenyl urea
US3474241A (en) 1966-10-06 1969-10-21 Jack Kuipers Coordinate transformer
US3660648A (en) 1969-10-15 1972-05-02 Northrop Corp Angular rate coordinate transformer
US3931747A (en) 1974-02-06 1976-01-13 Sperry Rand Corporation Gyroscopic stable reference device
US4038876A (en) 1976-03-04 1977-08-02 Systron Donner Corporation Acceleration error compensated attitude sensing and control apparatus and method
US4402250A (en) 1979-06-29 1983-09-06 Hollandse Signaalapparaten B.V. Automatic correction of aiming in firing at moving targets
US4558313A (en) 1981-12-31 1985-12-10 International Business Machines Corporation Indicator to data processing interface
US4558604A (en) 1981-02-02 1985-12-17 Teldix Gmbh Directional gyro
US4578674A (en) 1983-04-20 1986-03-25 International Business Machines Corporation Method and apparatus for wireless cursor position control
US4601206A (en) 1983-09-16 1986-07-22 Ferranti Plc Accelerometer system
US4617634A (en) 1983-06-28 1986-10-14 Mitsubishi Denki Kabushiki Kaisha Artificial satellite attitude control system
US4623930A (en) 1983-12-29 1986-11-18 Matsushita Electric Industrial Co., Ltd. Camera apparatus
US4686772A (en) 1986-05-23 1987-08-18 Elbit Computers Ltd. Electronic magnetic compass system
US4718078A (en) 1985-08-19 1988-01-05 Siemens Aktiengesellschaft System for controlling motion of a robot
US4745402A (en) 1987-02-19 1988-05-17 Rca Licensing Corporation Input device for a display system using phase-encoded signals
US4787051A (en) 1986-05-16 1988-11-22 Tektronix, Inc. Inertial mouse system
US4839838A (en) 1987-03-30 1989-06-13 Labiche Mitchell Spatial input apparatus
US4906907A (en) 1987-03-30 1990-03-06 Hitachi, Ltd. Robot system
US4961369A (en) 1983-01-21 1990-10-09 The Secretary Of State For Defence In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland Gun laying
JPH0359619A (en) 1989-07-28 1991-03-14 Nippon Telegr & Teleph Corp <Ntt> Optical amplifier
DE3930581A1 (en) 1989-09-13 1991-03-21 Asea Brown Boveri Work station for process control personnel - has display fields with windows accessed by mouse selection
US5005551A (en) 1989-02-03 1991-04-09 Mcnelley Jerald R In-line fuel heater
GB2237911A (en) 1989-10-17 1991-05-15 Sharp Kk Display processing system
US5045843A (en) 1988-12-06 1991-09-03 Selectech, Ltd. Optical pointing device
US5060175A (en) 1989-02-13 1991-10-22 Hughes Aircraft Company Measurement and control system for scanning sensors
US5128671A (en) 1990-04-12 1992-07-07 Ltv Aerospace And Defense Company Control device having multiple degrees of freedom
US5138154A (en) 1990-04-04 1992-08-11 Gyration Inc. Shaft angle encoder with rotating off-axis interference pattern
US5181181A (en) 1990-09-27 1993-01-19 Triton Technologies, Inc. Computer apparatus input device for three-dimensional information
US5280744A (en) 1992-01-27 1994-01-25 Alliedsignal Inc. Method for aiming towed field artillery pieces
US5293879A (en) 1991-09-23 1994-03-15 Vitatron Medical, B.V. System and method for detecting tremors such as those which result from Parkinson's disease
US5327161A (en) 1989-08-09 1994-07-05 Microtouch Systems, Inc. System and method for emulating a mouse input device with a touchpad input device
US5329276A (en) 1990-12-19 1994-07-12 Kabushiki Kaisha Yaskawa Denki Multidimensional signal input device
US5331563A (en) 1991-12-10 1994-07-19 Pioneer Electronic Corporation Navigation device and direction detection method therefor
NL9300171A (en) 1993-01-28 1994-08-16 Josephus Godefridus Wilhelmus Computer mouse based on a system of acceleration sensors disposed therein
US5341466A (en) 1991-05-09 1994-08-23 New York University Fractal computer user centerface with zooming capability
US5359348A (en) 1992-05-21 1994-10-25 Selectech, Ltd. Pointing device having improved automatic gain control and information reporting
JPH06308879A (en) 1992-08-19 1994-11-04 Fujitsu Ltd Optical pointing system
US5369889A (en) 1986-07-07 1994-12-06 Honeywell Inc. Single gyro northfinder
US5373857A (en) 1993-06-18 1994-12-20 Forte Technologies, Inc. Head tracking apparatus
US5383363A (en) 1993-02-10 1995-01-24 Ford Motor Company Inertial measurement unit providing linear and angular outputs using only fixed linear accelerometer sensors
JPH0728591A (en) 1993-05-13 1995-01-31 Toshiba Corp Space manipulation mouse system and space operation pattern input method
JPH0744315A (en) 1993-05-21 1995-02-14 Sony Corp Input device
US5393974A (en) 1993-03-06 1995-02-28 Jee; Sung N. Method and apparatus for detecting the motion variation of a projectile
US5396265A (en) 1990-09-17 1995-03-07 Massachusetts Institute Of Technology Three-dimensional tactile computer input device
US5404307A (en) 1991-12-19 1995-04-04 Pioneer Electronic Corporation Navigation apparatus with detected angular speed correction
US5405152A (en) * 1993-06-08 1995-04-11 The Walt Disney Company Method and apparatus for an interactive video game with physical feedback
US5412421A (en) 1992-04-30 1995-05-02 Westinghouse Electric Corporation Motion compensated sensor
JPH07146123A (en) 1993-11-25 1995-06-06 Alps Electric Co Ltd Inclination detection apparatus and input apparatus using same
GB2284478A (en) 1993-11-25 1995-06-07 Alps Electric Co Ltd Inclination measurement for cursor control
US5430435A (en) 1992-11-13 1995-07-04 Rhys Resources Adjustable athletic training system
JPH07200142A (en) 1993-12-27 1995-08-04 Alps Electric Co Ltd Device and method for position detection
US5440326A (en) 1990-03-21 1995-08-08 Gyration, Inc. Gyroscopic pointer
US5453758A (en) 1992-07-31 1995-09-26 Sony Corporation Input apparatus
US5459489A (en) 1991-12-05 1995-10-17 Tv Interactive Data Corporation Hand held electronic remote control device
JPH07271546A (en) 1994-03-31 1995-10-20 Olympus Optical Co Ltd Image display control method
JPH07284166A (en) 1993-03-12 1995-10-27 Mitsubishi Electric Corp Remote controller
JPH07302148A (en) 1994-05-02 1995-11-14 Wacom Co Ltd Information input device
JPH07318332A (en) 1994-05-26 1995-12-08 Alps Electric Co Ltd Angle detection apparatus and input device using it
US5481957A (en) 1992-07-06 1996-01-09 Alliedsignal Inc. Aiming and pointing system for ground based weapons equipment
US5485171A (en) 1991-10-04 1996-01-16 Micromed Systems, Inc. Hand held computer input apparatus and method
US5484355A (en) * 1993-10-01 1996-01-16 Smith & Nephew Roylan, Inc. System for therapeutic exercise and evaluation
JPH0834569A (en) 1994-07-25 1996-02-06 Sogo Keibi Hosho Co Ltd System for controlling nonstop of elevator
JPH0836459A (en) 1994-07-22 1996-02-06 Fujitsu Ltd Pointing device
US5506605A (en) 1992-07-27 1996-04-09 Paley; W. Bradford Three-dimensional mouse with tactile feedback
JPH0895704A (en) 1994-09-28 1996-04-12 Alps Electric Co Ltd Spatial coordinate detecting device
WO1996011435A1 (en) 1994-10-07 1996-04-18 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
JPH08106352A (en) 1994-10-05 1996-04-23 Alps Electric Co Ltd Spatial coordinate detecting device
JPH08114415A (en) 1994-10-13 1996-05-07 Alps Electric Co Ltd Space coordinate detector
JPH08122070A (en) 1994-10-24 1996-05-17 Alps Electric Co Ltd Inclination detection device and input device using it
US5524196A (en) 1992-12-18 1996-06-04 International Business Machines Corporation Method and system for manipulating data through a graphic user interface within a data processing system
JPH08152959A (en) 1994-11-30 1996-06-11 Alps Electric Co Ltd Remote coordinate indicating device
US5525764A (en) 1994-06-09 1996-06-11 Junkins; John L. Laser scanning graphic input system
US5546309A (en) 1993-10-20 1996-08-13 The Charles Stark Draper Laboratory, Inc. Apparatus and method for autonomous satellite attitude sensing
JPH08211993A (en) 1995-01-31 1996-08-20 Alps Electric Co Ltd Inclination detector
JPH08262517A (en) 1995-03-27 1996-10-11 Olympus Optical Co Ltd Camera shake prediction device
JPH08263255A (en) 1995-03-23 1996-10-11 Canon Inc Hierarchical data display method and browser system
US5572221A (en) 1994-10-26 1996-11-05 Telefonaktiebolaget Lm Ericsson Method and apparatus for detecting and predicting motion of mobile terminals
US5574479A (en) 1994-01-07 1996-11-12 Selectech, Ltd. Optical system for determining the roll orientation of a remote unit relative to a base unit
US5573011A (en) 1994-04-08 1996-11-12 Felsing; Gary W. System for quantifying neurological function
JPH08314625A (en) 1995-05-03 1996-11-29 Mitsubishi Electric Res Lab Inc Apparatus and system for control of computer
JPH08335136A (en) 1995-06-08 1996-12-17 Canon Inc Device and method for detecting coordinate
US5587558A (en) 1992-01-24 1996-12-24 Seiko Instruments Inc. Coordinate detecting apparatus having acceleration detectors
US5598187A (en) 1993-05-13 1997-01-28 Kabushiki Kaisha Toshiba Spatial motion pattern input system and input method
JPH0944311A (en) 1995-08-01 1997-02-14 Alpine Electron Inc Pointer positioning method for pointing device
US5615132A (en) 1994-01-21 1997-03-25 Crossbow Technology, Inc. Method and apparatus for determining position and orientation of a moveable object using accelerometers
US5617515A (en) 1994-07-11 1997-04-01 Dynetics, Inc. Method and apparatus for controlling and programming a robot or other moveable object
JPH09114959A (en) 1995-10-16 1997-05-02 Sharp Corp Device and method for information retrieval
US5627565A (en) 1994-05-26 1997-05-06 Alps Electric Co., Ltd. Space coordinates detecting device and input apparatus using same
GB2307133A (en) 1995-11-13 1997-05-14 Secr Defence Video camera image stabilisation system
US5638092A (en) 1994-12-20 1997-06-10 Eng; Tommy K. Cursor control system
US5638523A (en) 1993-01-26 1997-06-10 Sun Microsystems, Inc. Method and apparatus for browsing information in a computer database
CA2246412A1 (en) 1995-12-12 1997-06-19 Acceleron Technologies, Llc. System and method for measuring movement of objects
US5644082A (en) 1994-06-30 1997-07-01 Matsushita Electric Industrial Co., Ltd. Vehicle rotational-angle calculating apparatus
US5645077A (en) 1994-06-16 1997-07-08 Massachusetts Institute Of Technology Inertial orientation tracker apparatus having automatic drift compensation for tracking human head and other similarly sized body
CN1153675A (en) 1995-10-12 1997-07-09 科乐美株式会社 Image game apparatus, input support and control method for image game and image game medium
DE19701374A1 (en) 1996-01-17 1997-07-24 Lg Electronics Inc Hand held wireless three dimensional cursor control, remote control for computer systems, television, robots, computer peripherals
DE19701344A1 (en) 1996-01-17 1997-07-24 Lg Electronics Inc Hand held wireless three dimensional input unit for computer systems
US5661502A (en) 1996-02-16 1997-08-26 Ast Research, Inc. Self-adjusting digital filter for smoothing computer mouse movement
JPH09230997A (en) 1996-02-20 1997-09-05 Ricoh Co Ltd Pen type input device
JPH09251350A (en) 1996-03-18 1997-09-22 Matsushita Electric Ind Co Ltd Menu selection device
US5671342A (en) 1994-11-30 1997-09-23 Intel Corporation Method and apparatus for displaying information relating to a story and a story indicator in a computer system
JPH09274534A (en) 1996-04-04 1997-10-21 Ricoh Co Ltd Pen type input device
JPH09282085A (en) 1996-04-15 1997-10-31 Mitsumi Electric Co Ltd Position information input device
US5692956A (en) * 1996-02-09 1997-12-02 Mattel, Inc. Combination computer mouse and game play control
JPH09319510A (en) 1996-05-27 1997-12-12 Ricoh Co Ltd Pen type input device
JPH09322089A (en) 1996-05-27 1997-12-12 Fujitsu Ltd Broadcasting program transmitter, information transmitter, device provided with document preparation function and terminal equipment
US5698784A (en) 1996-01-24 1997-12-16 Gyration, Inc. Vibratory rate gyroscope and methods of assembly and operation
US5701424A (en) 1992-07-06 1997-12-23 Microsoft Corporation Palladian menus and methods relating thereto
US5703623A (en) 1996-01-24 1997-12-30 Hall; Malcolm G. Smart orientation sensing circuit for remote control
US5706448A (en) 1992-12-18 1998-01-06 International Business Machines Corporation Method and system for manipulating data through a graphic user interface within a data processing system
US5714698A (en) 1994-02-03 1998-02-03 Canon Kabushiki Kaisha Gesture input method and apparatus
GB2316482A (en) 1993-11-25 1998-02-25 Alps Electric Co Ltd Inclination detection apparatus
US5736923A (en) 1995-07-11 1998-04-07 Union Switch & Signal Inc. Apparatus and method for sensing motionlessness in a vehicle
JPH1093936A (en) 1996-09-18 1998-04-10 Toshiba Corp Broadcast transmitter, broadcast receiver and video reservation device
US5740471A (en) 1991-03-06 1998-04-14 Nikon Corporation Camera shake compensation device
US5741182A (en) 1994-06-17 1998-04-21 Sports Sciences, Inc. Sensing spatial movement
US5745226A (en) 1996-11-04 1998-04-28 Litton Systems, Inc. Passive optical velocity measurement device and method
US5745710A (en) 1993-05-24 1998-04-28 Sun Microsystems, Inc. Graphical user interface for selection of audiovisual programming
US5748189A (en) * 1995-09-19 1998-05-05 Sony Corp Method and apparatus for sharing input devices amongst plural independent graphic display devices
GB2319374A (en) 1994-11-30 1998-05-20 Alps Electric Co Ltd Remote coordinate designating apparatus
US5757362A (en) 1995-01-05 1998-05-26 International Business Machines Corporation Recursive digital filter using fixed point arithmetic
DE19648487C1 (en) 1996-11-12 1998-06-10 Primax Electronics Ltd Computer mouse with additional display window controls
US5771406A (en) 1993-12-10 1998-06-23 Nikon Corporation Camera with a shift optical system responsive to an external device
CA2248364A1 (en) 1997-01-08 1998-07-16 Ferdinand Peer Instrument for compensating for hand tremor during the manipulation of fine structures
JPH10191468A (en) 1996-10-25 1998-07-21 Matsushita Electric Ind Co Ltd Audio and video system
US5786805A (en) 1996-12-27 1998-07-28 Barry; Edwin Franklin Method and apparatus for improving object selection on a computer display by providing cursor control with a sticky property
US5790121A (en) 1996-09-06 1998-08-04 Sklar; Peter Clustering user interface
US5794081A (en) 1994-03-03 1998-08-11 Olympus Optical Co., Ltd. Camera capable of detecting camera shake and compensating image blur due to camera shake
US5796354A (en) 1997-02-07 1998-08-18 Reality Quest Corp. Hand-attachable controller with direction sensing
US5796395A (en) 1996-04-02 1998-08-18 Wegener Internet Projects Bv System for publishing and searching interests of individuals
JPH10240434A (en) 1997-02-27 1998-09-11 Matsushita Electric Ind Co Ltd Command menu selecting method
WO1998043183A1 (en) 1997-03-25 1998-10-01 Sony Electronics, Inc. Integrated search of electronic program guide, internet and other information resources
US5822713A (en) 1993-04-05 1998-10-13 Contraves Usa Guided fire control system
JPH10275048A (en) 1997-03-28 1998-10-13 Ricoh Co Ltd Pen type input device
DE19814254A1 (en) 1997-03-31 1998-10-15 Microsoft Corp Query-based electronic program guide
US5825350A (en) 1996-03-13 1998-10-20 Gyration, Inc. Electronic pointing apparatus and method
US5828987A (en) 1995-08-28 1998-10-27 Data Tec Co., Ltd. Movement detecting device
US5835156A (en) 1996-08-14 1998-11-10 Samsung Electroncis, Ltd. Television graphical user interface employing remote random access pointing device
US5835077A (en) 1995-01-13 1998-11-10 Remec, Inc., Computer control device
JPH10307676A (en) 1997-05-07 1998-11-17 Ricoh Co Ltd Pen type input device
US5870079A (en) 1996-11-12 1999-02-09 Legaltech, Inc. Computer input device and controller therefor
RU2125853C1 (en) 1987-04-29 1999-02-10 Др.Хельмут Хуберти Device for checking of load on parts of body
RU2126161C1 (en) 1994-06-27 1999-02-10 Коновалов Сергей Феодосьевич Compensation accelerometer
JPH1145150A (en) 1997-07-25 1999-02-16 Ricoh Co Ltd Pen type input device
US5878286A (en) 1996-09-10 1999-03-02 Nikon Corporation Motion detection device for a photographic apparatus
US5881321A (en) 1997-05-09 1999-03-09 Cammotion, Inc.. Camera motion sensing system
US5880722A (en) 1997-11-12 1999-03-09 Futuretel, Inc. Video cursor with zoom in the user interface of a video editor
JPH1185387A (en) 1997-09-12 1999-03-30 Ricoh Co Ltd Posture input device, pen type input device with posture input function, and pen type input system with pen type input device
US5902968A (en) 1996-02-20 1999-05-11 Ricoh Company, Ltd. Pen-shaped handwriting input apparatus using accelerometers and gyroscopes and an associated operational device for determining pen movement
JPH11146299A (en) 1997-11-11 1999-05-28 Hitachi Ltd Device and method for displaying data television broadcasting
EP0919906A2 (en) 1997-11-27 1999-06-02 Matsushita Electric Industrial Co., Ltd. Control method
US5912612A (en) 1997-10-14 1999-06-15 Devolpi; Dean R. Multi-speed multi-direction analog pointing device
US5940072A (en) 1996-08-15 1999-08-17 Samsung Information Systems America Graphics decompression using system ROM indexing in TV set top box
US5953683A (en) 1997-10-09 1999-09-14 Ascension Technology Corporation Sourceless orientation sensor
US5955988A (en) 1996-08-14 1999-09-21 Samsung Electronics Co., Ltd. Graphical user interface for establishing installation location for satellite based television system
US5978043A (en) 1996-08-14 1999-11-02 Samsung Electronics Co., Ltd. TV graphical user interface that provides customized lists of programming
US5982369A (en) 1997-04-21 1999-11-09 Sony Corporation Method for displaying on a screen of a computer system images representing search results
US5989157A (en) 1996-08-06 1999-11-23 Walton; Charles A. Exercising system with electronic inertial game playing
US6002394A (en) 1995-10-02 1999-12-14 Starsight Telecast, Inc. Systems and methods for linking television viewers with advertisers and broadcasters
US6001014A (en) * 1996-10-01 1999-12-14 Sony Computer Entertainment Inc. Game machine control module and game machine
US6005551A (en) 1997-04-25 1999-12-21 Microsoft Corporation Offline force effect rendering
US6005578A (en) 1997-09-25 1999-12-21 Mindsphere, Inc. Method and apparatus for visual navigation of information objects
US6016144A (en) 1996-08-14 2000-01-18 Samsung Electronics Co., Ltd. Multi-layered television graphical user interface
DE19937307A1 (en) 1998-08-10 2000-02-17 Deutsch Zentr Luft & Raumfahrt Method for technical control operations using control wheel, where pressure and turning actions are converted into translational and rotational movements of objects being controlled with wheel
US6028594A (en) 1996-06-04 2000-02-22 Alps Electric Co., Ltd. Coordinate input device depending on input speeds
JP2000056897A (en) 1998-08-11 2000-02-25 Toshiba Corp Computer system
US6037933A (en) 1996-11-13 2000-03-14 Samsung Electronics Co., Ltd. TV graphical user interface for providing user access to preset time periods of TV program information
US6043807A (en) 1997-09-23 2000-03-28 At&T Corp. Mouse for positioning a cursor on a computer display and having a removable pen-type input device
US6047132A (en) 1998-03-04 2000-04-04 Nikon Corporation Camera system and interchangeable lens to compensate for image blur when photographing at close range
US6049823A (en) 1995-10-04 2000-04-11 Hwang; Ivan Chung-Shung Multi server, interactive, video-on-demand television system utilizing a direct-access-on-demand workgroup
JP2000115652A (en) 1998-10-02 2000-04-21 Matsushita Electric Ind Co Ltd Epg information display method and device, image recording and reproducing device and program recording medium
US6057831A (en) 1996-08-14 2000-05-02 Samsung Electronics Co., Ltd. TV graphical user interface having cursor position indicator
US6069594A (en) * 1991-07-29 2000-05-30 Logitech, Inc. Computer input device with multiple switches using single line
TW392066B (en) 1998-12-17 2000-06-01 Tokin Corp Orientation angle detector
US6072467A (en) 1996-05-03 2000-06-06 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Continuously variable control of animated on-screen characters
WO2000033566A1 (en) 1998-11-30 2000-06-08 Sony Corporation Information providing device and method
WO2000034474A2 (en) 1998-12-07 2000-06-15 Zymogenetics, Inc. Growth factor homolog zvegf3
JP2000172248A (en) 1998-09-28 2000-06-23 Fujitsu Ltd Method of displaying electronic information, and device and program storage medium for reading electronic information
US6088031A (en) 1997-07-21 2000-07-11 Samsung Electronics Co., Ltd. Method and device for controlling selection of a menu item from a menu displayed on a screen
US6092076A (en) 1998-03-24 2000-07-18 Navigation Technologies Corporation Method and system for map display in a navigation application
US6104969A (en) 1998-12-30 2000-08-15 Honeywell International Inc. Methods and apparatus for operating an input device in a turbulent environment
US6115028A (en) 1996-08-22 2000-09-05 Silicon Graphics, Inc. Three dimensional input system using tilt
JP2000242383A (en) 1999-02-19 2000-09-08 Nec Corp Screen display enlargement control unit
JP2000270237A (en) 1999-03-15 2000-09-29 Nippon Hoso Kyokai <Nhk> Selector for image display device
JP2000308756A (en) 1999-04-27 2000-11-07 Taito Corp Input controller of game device
US6144366A (en) 1996-10-18 2000-11-07 Kabushiki Kaisha Toshiba Method and apparatus for generating information input using reflected light image of target object
US6154723A (en) 1996-12-06 2000-11-28 The Board Of Trustees Of The University Of Illinois Virtual reality 3D interface system for data creation, viewing and editing
US6154199A (en) 1998-04-15 2000-11-28 Butler; Craig L. Hand positioned mouse
US6163021A (en) 1998-12-15 2000-12-19 Rockwell Collins, Inc. Navigation system for spinning projectiles
US6164808A (en) 1996-02-09 2000-12-26 Murata Mfg. Co., Ltd. Three-dimensional data input device
JP2001008384A (en) 1999-06-21 2001-01-12 Toshiba Corp System screen display and recording medium
US6175362B1 (en) 1997-07-21 2001-01-16 Samsung Electronics Co., Ltd. TV graphical user interface providing selection among various lists of TV channels
US6181333B1 (en) 1996-08-14 2001-01-30 Samsung Electronics Co., Ltd. Television graphical user interface having channel and program sorting capabilities
US6188392B1 (en) 1997-06-30 2001-02-13 Intel Corporation Electronic pen device
US6191774B1 (en) 1995-11-17 2001-02-20 Immersion Corporation Mouse interface for providing force feedback
US6191781B1 (en) 1996-08-14 2001-02-20 Samsung Electronics, Ltd. Television graphical user interface that combines electronic program guide with graphical channel changer
JP2001052009A (en) 1999-08-06 2001-02-23 Sony Corp Device and method for processing information and medium
US6195089B1 (en) 1996-08-14 2001-02-27 Samsung Electronics Co., Ltd. Television graphical user interface having variable channel changer icons
US6198470B1 (en) 1998-12-28 2001-03-06 Uri Agam Computer input device
US6208936B1 (en) 1999-06-18 2001-03-27 Rockwell Collins, Inc. Utilization of a magnetic sensor to compensate a MEMS-IMU/GPS and de-spin strapdown on rolling missiles
JP2001100908A (en) 1999-09-28 2001-04-13 Fujitsu Ltd Pen tip trace generating method, pen type input device, and pen mounting unit
JP2001159951A (en) 1999-12-02 2001-06-12 Nec Corp Information processor and method for processing information
JP2001175412A (en) 1999-12-15 2001-06-29 Shigekazu Koshiba Remote controller for electronic equipment with multi- axial integral acceleration detector
US6268849B1 (en) 1998-06-30 2001-07-31 United Video Properties, Inc. Internet television program guide system with embedded real-time data
US20010015123A1 (en) 2000-01-11 2001-08-23 Yoshiki Nishitani Apparatus and method for detecting performer's motion to interactively control performance of music or the like
US6282467B1 (en) 1997-10-14 2001-08-28 The Boeing Company Three-axis inertial attitude determination for spinning spacecraft
US6295646B1 (en) 1998-09-30 2001-09-25 Intel Corporation Method and apparatus for displaying video data and corresponding entertainment data for multiple entertainment selection sources
WO2001078055A1 (en) 2000-04-05 2001-10-18 Feinstein David Y View navigation and magnification of a hand-held device with a display
US6314575B1 (en) 1994-09-14 2001-11-06 Time Warner Entertainment Company, L.P. Telecasting service for providing video programs on demand with an interactive interface for facilitating viewer selection of video programs
US6330856B1 (en) 1999-01-28 2001-12-18 Innovative Product Achievements, Inc. Garment dispensing and receiving apparatus
DE10029173A1 (en) 2000-06-19 2002-01-03 Deutsch Zentr Luft & Raumfahrt Method and arrangement for commanding control operations for kinematic movements of an object using a hand-operated input device
US20020015064A1 (en) 2000-08-07 2002-02-07 Robotham John S. Gesture-based user interface to multi-level and multi-modal sets of bit-maps
US6346959B1 (en) 1995-11-06 2002-02-12 Riso Kagaku Corporation Image forming apparatus and method for image formation
US6349257B1 (en) 1999-09-15 2002-02-19 International Business Machines Corporation System for personalized mobile navigation information
US20020021278A1 (en) 2000-07-17 2002-02-21 Hinckley Kenneth P. Method and apparatus using multiple sensors in a device with a display
JP2002062981A (en) 2000-08-16 2002-02-28 Nippon Hoso Kyokai <Nhk> Display screen indicating device
US20020032696A1 (en) 1994-12-16 2002-03-14 Hideo Takiguchi Intuitive hierarchical time-series data display method and system
US20020033848A1 (en) 2000-04-21 2002-03-21 Sciammarella Eduardo Agusto System for managing data objects
JP2002082773A (en) 2001-08-22 2002-03-22 Sony Corp Input device and its method
JP2002091692A (en) 2000-09-12 2002-03-29 Seiko Instruments Inc Pointing system
US6369794B1 (en) 1998-09-09 2002-04-09 Matsushita Electric Industrial Co., Ltd. Operation indication outputting device for giving operation indication according to type of user's action
US6369837B1 (en) 1998-07-17 2002-04-09 International Business Machines Corporation GUI selector control
US6385542B1 (en) 2000-10-18 2002-05-07 Magellan Dis, Inc. Multiple configurations for a vehicle navigation system
US20020054158A1 (en) 2000-08-31 2002-05-09 Akiko Asami Information-processing apparatus and computer-graphic display program
US20020054129A1 (en) 1999-12-24 2002-05-09 U.S. Philips Corporation 3D environment labelling
KR200276592Y1 (en) 2002-02-22 2002-05-24 오성훈 Cutter for cutting round wire
US6397387B1 (en) 1997-06-02 2002-05-28 Sony Corporation Client and server system
US6400996B1 (en) 1999-02-01 2002-06-04 Steven M. Hoffberg Adaptive pattern recognition based control system and method
US6400406B1 (en) 1996-06-28 2002-06-04 Samsung Electronics, Co., Ltd. Device and method for displaying broadcast program guide in a programmed recording system
US6404416B1 (en) 1994-06-09 2002-06-11 Corporation For National Research Initiatives Unconstrained pointing interface for natural human interaction with a display-based computer system
US6411308B1 (en) 1996-08-14 2002-06-25 Samsung Electronics Co., Ltd. Television graphical user interface having variable channel control bars
US6412110B1 (en) 1996-08-06 2002-06-25 Starsight Telecast, Inc. Electronic program guide with interactive areas
US6415225B1 (en) 1999-08-06 2002-07-02 Aisin Aw Co., Ltd. Navigation system and a memory medium
US6421067B1 (en) 2000-01-16 2002-07-16 Isurftv Electronic programming guide
JP2002207703A (en) 2001-01-11 2002-07-26 Sony Corp Electronic equipment
US6426761B1 (en) 1999-04-23 2002-07-30 Internation Business Machines Corporation Information presentation system for a graphical user interface
JP2002215327A (en) 2001-01-18 2002-08-02 Nec Eng Ltd Mouse control mechanism
US6429813B2 (en) 1999-01-14 2002-08-06 Navigation Technologies Corp. Method and system for providing end-user preferences with a navigation system
US20020112237A1 (en) 2000-04-10 2002-08-15 Kelts Brett R. System and method for providing an interactive display interface for information objects
US20020118123A1 (en) 2001-02-27 2002-08-29 Kim Sung-Cheol Space keyboard system using force feedback and method of inputting information therefor
US20020126121A1 (en) 2001-03-12 2002-09-12 Robbins Daniel C. Visualization of multi-dimensional data having an unbounded dimension
US20020126026A1 (en) 2001-03-09 2002-09-12 Samsung Electronics Co., Ltd. Information input system using bio feedback and method thereof
JP2002259335A (en) 2001-03-01 2002-09-13 Konica Corp Information recognition system, digital image pickup device, store printer, portable telephone set, information processor, information recognizing method and information recording medium
US6452609B1 (en) 1998-11-06 2002-09-17 Supertuner.Com Web application for accessing media streams
US20020130835A1 (en) 2001-03-16 2002-09-19 Brosnan Michael John Portable electronic device with mouse-like capabilities
US20020140745A1 (en) 2001-01-24 2002-10-03 Ellenby Thomas William Pointing systems for addressing objects
US6463328B1 (en) 1996-02-02 2002-10-08 Michael Sasha John Adaptive brain stimulation method and system
US6466200B1 (en) 1999-11-03 2002-10-15 Innalabs, Inc. Computer input device
US6466199B2 (en) 1998-07-23 2002-10-15 Alps Electric Co., Ltd. Method for moving a pointing cursor
JP2002312117A (en) 2001-04-17 2002-10-25 Japan Aviation Electronics Industry Ltd Cylindrical image spherical image control device
US6473713B1 (en) 1999-09-20 2002-10-29 American Gnc Corporation Processing method for motion measurement
US20020158843A1 (en) 2001-04-26 2002-10-31 International Business Machines Corporation Method and adapter for performing assistive motion data processing and/or button data processing external to a computer
US6492981B1 (en) 1997-12-23 2002-12-10 Ricoh Company, Ltd. Calibration of a system for tracking a writing instrument with multiple sensors
US6496779B1 (en) 2000-03-30 2002-12-17 Rockwell Collins Inertial measurement unit with magnetometer for detecting stationarity
US20020189870A1 (en) 1999-03-15 2002-12-19 Kamen Dean L. Control of a balancing personal vehicle
JP2003009577A (en) 2001-06-20 2003-01-10 Yamaha Motor Co Ltd Drive controller for brushless dc motor
US6515669B1 (en) 1998-10-23 2003-02-04 Olympus Optical Co., Ltd. Operation input device applied to three-dimensional input device
KR20030009577A (en) 2001-06-27 2003-02-05 (주)이에스비컨 Three-dimensional input device using a gyroscope
US20030038778A1 (en) 2001-08-13 2003-02-27 Siemens Information And Communication Mobile, Llc Tilt-based pointing for hand-held devices
US6529161B2 (en) 2001-02-08 2003-03-04 Mitsubishi Denki Kabushiki Kaisha Antenna control method and antenna controller
US6529218B2 (en) 1998-07-13 2003-03-04 Matsushita Electric Industrial Co., Ltd. Display control with movable or updatable auxiliary information
WO2003021947A1 (en) 2001-08-29 2003-03-13 Digeo, Inc. System and method for displaying option representations with multiple levels of specificity
US6544126B2 (en) 2000-04-25 2003-04-08 Nintendo Co., Ltd. Portable game machine with download capability
US6556127B1 (en) 1996-10-15 2003-04-29 Swisscom Ag Speaker verification method
TW530998U (en) 2000-08-14 2003-05-01 Jr-Feng Chen Computer input device with two elastic fulcrums for 6 degrees of freedom data input
US20030080282A1 (en) 2001-10-26 2003-05-01 Walley Thomas M. Apparatus and method for three-dimensional relative movement sensing
US6557350B2 (en) 2001-05-17 2003-05-06 General Electric Company Method and apparatus for cooling gas turbine engine igniter tubes
US6561993B2 (en) 2001-02-26 2003-05-13 International Business Machines Corporation Device driver system for minimizing adverse tremor effects during use of pointing devices
US20030093311A1 (en) 2001-11-05 2003-05-15 Kenneth Knowlson Targeted advertising
US6577350B1 (en) 1998-12-21 2003-06-10 Sony Corporation Method and apparatus for displaying an electronic program guide
WO2003048909A2 (en) 2001-12-04 2003-06-12 Applied Neural Computing, L.L.C. Validating the identity of a user using a pointing device
US20030107551A1 (en) 2001-12-10 2003-06-12 Dunker Garrett Storm Tilt input device
US6583781B1 (en) 2000-10-17 2003-06-24 International Business Machines Corporation Methods, systems and computer program products for controlling events associated with user interface elements by capturing user intent based on pointer movements
US20030115930A1 (en) 2001-11-13 2003-06-26 Nokia Corporation Method, device and system for calibrating angular rate measurement sensors
US6590536B1 (en) * 2000-08-18 2003-07-08 Charles A. Walton Body motion detecting system with correction for tilt of accelerometers and remote measurement of body position
US20030158699A1 (en) 1998-12-09 2003-08-21 Christopher P. Townsend Orientation sensor
US20030159051A1 (en) 2000-03-27 2003-08-21 Wilhelm Hollnagel Method for generating electronic signatures
US6614420B1 (en) 1999-02-22 2003-09-02 Microsoft Corporation Dual axis articulated electronic input device
US6613000B1 (en) * 2000-09-30 2003-09-02 The Regents Of The University Of California Method and apparatus for mass-delivered movement rehabilitation
US20030172283A1 (en) 2001-10-25 2003-09-11 O'hara Sean M. Biometric characteristic-enabled remote control device
US6621452B2 (en) 1997-08-19 2003-09-16 Siemens Vdo Automotive Corporation Vehicle information system
US20030193572A1 (en) 2002-02-07 2003-10-16 Andrew Wilson System and process for selecting objects in a ubiquitous computing environment
DE10219198A1 (en) 2002-04-29 2003-11-06 Univ Leipzig Cursor movement control device comprises a device that is moved in space so that sensors, such as acceleration sensors, detect the movement which is converted into a command signal that is transmitted to a computer or similar
US20030210230A1 (en) 2002-05-09 2003-11-13 Waters Richard C. Invisible beam pointer system
US6661410B2 (en) 2001-09-07 2003-12-09 Microsoft Corporation Capacitive sensing and data input device power management
US6672962B1 (en) 1998-05-13 2004-01-06 Kabushiki Kaisha Sega Enterprises Gun-shaped controller and game device
US20040008191A1 (en) 2002-06-14 2004-01-15 Ivan Poupyrev User interface apparatus and portable information apparatus
US20040036650A1 (en) 2002-08-26 2004-02-26 Morgan Kenneth S. Remote velocity sensor slaved to an integrated GPS/INS
JP2004062774A (en) 2002-07-31 2004-02-26 Sharp Corp Presentation display device
JP2004061502A (en) 2002-07-26 2004-02-26 Microsoft Corp Electrostatic capacitive sensing system and its method
US20040050924A1 (en) 2000-12-06 2004-03-18 Falk Mletzko Enabling of devices
JP2004086511A (en) 2002-08-27 2004-03-18 Innotech Corp Electronic equipment and pointer moving method
US20040064252A1 (en) 2002-09-26 2004-04-01 Honeywell International Inc. Method and system for processing pulse signals within an inertial navigation system
US20040070564A1 (en) 2002-10-15 2004-04-15 Dawson Thomas P. Method and system for controlling a display device
US6724368B2 (en) 2001-12-14 2004-04-20 Koninklijke Philips Electronics N.V. Remote control system and method for a television receiver
US20040078194A1 (en) 1997-06-10 2004-04-22 Coding Technologies Sweden Ab Source coding enhancement using spectral-band replication
US20040075650A1 (en) 1999-05-25 2004-04-22 Lapstun Paul Orientation sensing device with processor
JP2004126756A (en) 2002-09-30 2004-04-22 Pentel Corp Correction value setting method for coordinate input device
US6727887B1 (en) 1995-01-05 2004-04-27 International Business Machines Corporation Wireless pointing device for remote cursor control
US6735777B1 (en) 1998-10-28 2004-05-11 Samsung Electronics Co., Ltd. Method for controlling program guide for displaying broadcast program title
US20040095317A1 (en) 2002-11-20 2004-05-20 Jingxi Zhang Method and apparatus of universal remote pointing control for home entertainment system and computer
US6744420B2 (en) 2000-06-01 2004-06-01 Olympus Optical Co., Ltd. Operation input apparatus using sensor attachable to operator's hand
US6753849B1 (en) 1999-10-27 2004-06-22 Ken Curran & Associates Universal remote TV mouse
US20040123320A1 (en) 2002-12-23 2004-06-24 Mike Daily Method and system for providing an interactive guide for multimedia selection
US6757446B1 (en) 2000-11-27 2004-06-29 Microsoft Corporation System and process for image-based relativistic rendering
US20040125073A1 (en) 2002-12-30 2004-07-01 Scott Potter Portable electronic apparatus and method employing motion sensor for function control
US6766456B1 (en) 2000-02-23 2004-07-20 Micron Technology, Inc. Method and system for authenticating a user of a computer system
US6765598B2 (en) 1998-10-27 2004-07-20 Samsung Electronics Co., Ltd. Method and apparatus for enabling selection in an on-screen menu
US20040151218A1 (en) 2002-12-23 2004-08-05 Vlad Branzoi Systems and methods for tremor cancellation in pointers
US20040176991A1 (en) 2003-03-05 2004-09-09 Mckennan Carol System, method and apparatus using biometrics to communicate dissatisfaction via stress level
US20040189620A1 (en) 2003-03-19 2004-09-30 Samsung Electronics Co., Ltd. Magnetic sensor-based pen-shaped input system and a handwriting trajectory recovery method therefor
US20040193925A1 (en) 2003-03-26 2004-09-30 Matnn Safriel Portable password manager
US20040193413A1 (en) 2003-03-25 2004-09-30 Wilson Andrew D. Architecture for controlling a computer using hand gestures
US20040196287A1 (en) 2003-04-01 2004-10-07 Wong Pak Chung Dynamic visualization of data streams
US20040204240A1 (en) 2000-02-22 2004-10-14 Barney Jonathan A. Magical wand and interactive play experience
US20040218104A1 (en) 2003-05-01 2004-11-04 Smith Gregory C. Multimedia user interface
US20040221243A1 (en) 2003-04-30 2004-11-04 Twerdahl Timothy D Radial menu interface for handheld computing device
US20040227725A1 (en) 2002-10-14 2004-11-18 Stmicroelectronics S.R.L. User controlled device for sending control signals to an electric appliance, in particular user controlled pointing device such as mouse of joystick, with 3D-motion detection
US20040229693A1 (en) 2003-05-13 2004-11-18 Clifton Lind Multiple video display gaming machine and gaming system
US20040236573A1 (en) 2001-06-19 2004-11-25 Sapeluk Andrew Thomas Speaker recognition systems
US20040252120A1 (en) 2003-05-08 2004-12-16 Hunleth Frank A. Systems and methods for node tracking and notification in a control framework including a zoomable graphical user interface
US20040268393A1 (en) 2003-05-08 2004-12-30 Hunleth Frank A. Control framework with a zoomable graphical user interface for organizing, selecting and launching media items
US20050008148A1 (en) 2003-04-02 2005-01-13 Dov Jacobson Mouse performance identification
US6844871B1 (en) 1999-11-05 2005-01-18 Microsoft Corporation Method and apparatus for computer input using six degrees of freedom
US20050017454A1 (en) * 2003-06-09 2005-01-27 Shoichi Endo Interactive gaming systems with haptic feedback
US20050033200A1 (en) 2003-08-05 2005-02-10 Soehren Wayne A. Human motion identification and measurement system and method
US20050037843A1 (en) 2003-08-11 2005-02-17 William Wells Three-dimensional image display for a gaming apparatus
US20050041014A1 (en) 2003-08-22 2005-02-24 Benjamin Slotznick Using cursor immobility to suppress selection errors
US6873931B1 (en) 2000-10-10 2005-03-29 Csi Technology, Inc. Accelerometer based angular position sensor
US6871413B1 (en) 1997-12-15 2005-03-29 Microstrain, Inc. Miniaturized inclinometer for angle measurement with accurate measurement indicator
US6874413B2 (en) 2001-09-27 2005-04-05 Thieme Gmbh & Co. Kg Top part of a screen printing machine with bearing elements for a screen printing stencil
US20050097474A1 (en) 2003-10-31 2005-05-05 Accot Johnny I. Spiral scrollbar
US6897854B2 (en) 2001-04-12 2005-05-24 Samsung Electronics Co., Ltd. Electronic pen input device and coordinate detecting method therefor
US6902482B1 (en) * 1997-11-25 2005-06-07 Thomas G. Woolston Interactive electronic game having gyroscopic output effect
US20050160813A1 (en) 2004-01-27 2005-07-28 Nobuyuki Imai Clock generating device, vibration type gyro sensor, navigation device, imaging device, and electronic apparatus
US20050176505A1 (en) * 2004-02-09 2005-08-11 Stanley Mark J. Method and apparatus for providing computer pointing device input to a video game console
US20050174324A1 (en) 2003-10-23 2005-08-11 Hillcrest Communications, Inc. User interface devices and methods employing accelerometers
US6929548B2 (en) 2002-04-23 2005-08-16 Xiaoling Wang Apparatus and a method for more realistic shooting video games on computers or similar devices
US20050212749A1 (en) 2004-03-23 2005-09-29 Marvit David L Motion sensor engagement for a handheld device
US20050212766A1 (en) * 2004-03-23 2005-09-29 Reinhardt Albert H M Translation controlled cursor
US20050213840A1 (en) 2003-10-17 2005-09-29 Mei Chen Method for image stabilization by adaptive filtering
US20050212767A1 (en) 2004-03-23 2005-09-29 Marvit David L Context dependent gesture response
US20050219217A1 (en) 2002-07-11 2005-10-06 International Business Machines Corporation Peripheral device for a data processing system
US20050222784A1 (en) 2004-04-01 2005-10-06 Blue Line Innovations Inc. System and method for reading power meters
US20050222802A1 (en) 2002-08-27 2005-10-06 Yasuhiro Tamura Mobile terminal apparatus
US20050243062A1 (en) 2004-04-30 2005-11-03 Hillcrest Communications, Inc. Free space pointing devices with tilt compensation and improved usability
US20050243061A1 (en) 2004-04-30 2005-11-03 Hillcrest Communications, Inc. Methods and devices for identifying users based on tremor
US20050246109A1 (en) 2004-04-29 2005-11-03 Samsung Electronics Co., Ltd. Method and apparatus for entering information into a portable electronic device
US20050253806A1 (en) 2004-04-30 2005-11-17 Hillcrest Communications, Inc. Free space pointing devices and methods
US6975959B2 (en) 2002-12-03 2005-12-13 Robert Bosch Gmbh Orientation and navigation for a mobile device using inertial sensors
US20050275623A1 (en) 2004-06-14 2005-12-15 Siemens Information And Communication Mobile Llc Optical joystick for hand-held communication device
US6984208B2 (en) 2002-08-01 2006-01-10 The Hong Kong Polytechnic University Method and apparatus for sensing body gesture, posture and movement
US6990639B2 (en) 2002-02-07 2006-01-24 Microsoft Corporation System and process for controlling electronic components in a ubiquitous computing environment using multimodal integration
US20060028446A1 (en) 2004-04-30 2006-02-09 Hillcrest Communications, Inc. Methods and devices for removing unintentional movement in free space pointing devices
US6998966B2 (en) 2003-11-26 2006-02-14 Nokia Corporation Mobile communication device having a functional cover for controlling sound applications by motion
US20060061545A1 (en) 2004-04-02 2006-03-23 Media Lab Europe Limited ( In Voluntary Liquidation). Motion-activated control with haptic feedback
JP2006113019A (en) 2004-10-18 2006-04-27 Alps Electric Co Ltd Triaxial type electronic compass, and azimuth detecting method using same
US7038661B2 (en) 2003-06-13 2006-05-02 Microsoft Corporation Pointing device and cursor for use in intelligent computing environments
US20060092133A1 (en) 2004-11-02 2006-05-04 Pierre A. Touma 3D mouse and game controller based on spherical coordinates system and system for use
US20060109242A1 (en) 2004-11-19 2006-05-25 Simpkins Daniel S User interface for impaired users
US7061469B2 (en) 2000-02-24 2006-06-13 Innalabs Technologies, Inc. Method of data input into a computer
US20060125789A1 (en) 2002-12-23 2006-06-15 Jiawen Tu Contactless input device
US20060150734A1 (en) 2002-12-10 2006-07-13 Koninklijke Philips Electronics N.V. Activity monitoring
US7093201B2 (en) 2001-09-06 2006-08-15 Danger, Inc. Loop menu navigation apparatus and method
US7098891B1 (en) 1992-09-18 2006-08-29 Pryor Timothy R Method for providing human input to a computer
US20060262116A1 (en) 2005-05-19 2006-11-23 Hillcrest Laboratories, Inc. Global navigation objects in user interfaces
US7155974B2 (en) 2003-10-20 2007-01-02 Honda Motor Co., Ltd. Inertia sensor unit
WO2007007227A2 (en) 2005-07-11 2007-01-18 Philips Intellectual Property & Standards Gmbh Method of controlling a control point position on a command area and method for control of a device
US7166832B2 (en) 2004-07-09 2007-01-23 Funai Electric Co., Ltd. Self-running robot
US7173604B2 (en) 2004-03-23 2007-02-06 Fujitsu Limited Gesture identification of controlled devices
US20070035518A1 (en) 2005-07-01 2007-02-15 Hillcrest Laboratories, Inc. 3D pointing devices
US7188045B1 (en) 2006-03-09 2007-03-06 Dean A. Cirielli Three-dimensional position and motion telemetry input
US20070066394A1 (en) 2005-09-15 2007-03-22 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US7194816B2 (en) 2004-07-15 2007-03-27 C&N Inc. Mobile terminal apparatus
US20070072680A1 (en) 2005-08-24 2007-03-29 Nintendo Co., Ltd. Game controller and game system
US20080016962A1 (en) 2006-07-24 2008-01-24 Honeywell International Inc, Medical use angular rate sensor
US20080024435A1 (en) 2006-07-25 2008-01-31 Nintendo Co., Ltd. Information processing device and storage medium storing information processing program
US7353134B2 (en) 2006-03-09 2008-04-01 Dean A. Cirielli Three-dimensional position and motion telemetry input
US20080108870A1 (en) 2006-11-06 2008-05-08 Wiita Bruce E Apparatus and method for stabilizing an image from an endoscopic camera
US7383517B2 (en) 2004-04-21 2008-06-03 Microsoft Corporation System and method for acquiring a target with intelligent pointer movement
US20080134784A1 (en) 2006-12-12 2008-06-12 Industrial Technology Research Institute Inertial input apparatus with six-axial detection ability and the operating method thereof
US7409292B2 (en) 2006-05-26 2008-08-05 Honeywell International Inc. Method and system for degimbalization of vehicle navigation data
US7421343B2 (en) 2005-10-27 2008-09-02 Honeywell International Inc. Systems and methods for reducing vibration-induced errors in inertial sensors
US20090002203A1 (en) 2005-02-25 2009-01-01 Kenji Kataoka Mobile Device
US7487045B1 (en) 2005-03-10 2009-02-03 William Vieira Projected score area calculator and method of use
US20090153389A1 (en) 2007-12-14 2009-06-18 Apple Inc. Scroll bar with video region in a media system
US20100153872A1 (en) 2008-12-11 2010-06-17 Samsung Electronics Co., Ltd. Method for providing graphical user interface and electronic device using the same
US7843425B2 (en) 2005-12-16 2010-11-30 Industrial Technology Research Institute Motion recognition system and method for controlling electronic devices
US8629836B2 (en) 2004-04-30 2014-01-14 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6937277B1 (en) * 1998-04-24 2005-08-30 Canon Kabushiki Kaisha Image input apparatus employing read region size determination
KR20010011596A (en) 1999-07-29 2001-02-15 윤종용 Method for controlling diplay of an electric program guide picture
KR20020076592A (en) 2001-03-29 2002-10-11 엘지전자 주식회사 Intelligence pointing apparatus and method for mouse

Patent Citations (459)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB591019A (en) 1945-04-13 1947-08-05 Charles Stanley Hudson Improvements in or relating to remote indicating compasses
US2878286A (en) 1945-06-09 1959-03-17 Du Pont Process for preparing symmetrical hexachlorodiphenyl urea
US3474241A (en) 1966-10-06 1969-10-21 Jack Kuipers Coordinate transformer
US3660648A (en) 1969-10-15 1972-05-02 Northrop Corp Angular rate coordinate transformer
US3931747A (en) 1974-02-06 1976-01-13 Sperry Rand Corporation Gyroscopic stable reference device
US4038876A (en) 1976-03-04 1977-08-02 Systron Donner Corporation Acceleration error compensated attitude sensing and control apparatus and method
US4402250A (en) 1979-06-29 1983-09-06 Hollandse Signaalapparaten B.V. Automatic correction of aiming in firing at moving targets
US4558604A (en) 1981-02-02 1985-12-17 Teldix Gmbh Directional gyro
US4558313A (en) 1981-12-31 1985-12-10 International Business Machines Corporation Indicator to data processing interface
US4961369A (en) 1983-01-21 1990-10-09 The Secretary Of State For Defence In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland Gun laying
US4578674A (en) 1983-04-20 1986-03-25 International Business Machines Corporation Method and apparatus for wireless cursor position control
US4617634A (en) 1983-06-28 1986-10-14 Mitsubishi Denki Kabushiki Kaisha Artificial satellite attitude control system
US4601206A (en) 1983-09-16 1986-07-22 Ferranti Plc Accelerometer system
US4623930A (en) 1983-12-29 1986-11-18 Matsushita Electric Industrial Co., Ltd. Camera apparatus
US5062696A (en) 1983-12-29 1991-11-05 Matsushita Electric Industrial Co., Ltd. Camera apparatus
US4718078A (en) 1985-08-19 1988-01-05 Siemens Aktiengesellschaft System for controlling motion of a robot
US4787051A (en) 1986-05-16 1988-11-22 Tektronix, Inc. Inertial mouse system
US4686772A (en) 1986-05-23 1987-08-18 Elbit Computers Ltd. Electronic magnetic compass system
US5369889A (en) 1986-07-07 1994-12-06 Honeywell Inc. Single gyro northfinder
US4745402A (en) 1987-02-19 1988-05-17 Rca Licensing Corporation Input device for a display system using phase-encoded signals
US4839838A (en) 1987-03-30 1989-06-13 Labiche Mitchell Spatial input apparatus
US4906907A (en) 1987-03-30 1990-03-06 Hitachi, Ltd. Robot system
RU2125853C1 (en) 1987-04-29 1999-02-10 Dr. Helmut Huberti Device for checking of load on parts of body
US5045843B1 (en) 1988-12-06 1996-07-16 Selectech Ltd Optical pointing device
US5045843A (en) 1988-12-06 1991-09-03 Selectech, Ltd. Optical pointing device
US5005551A (en) 1989-02-03 1991-04-09 Mcnelley Jerald R In-line fuel heater
US5060175A (en) 1989-02-13 1991-10-22 Hughes Aircraft Company Measurement and control system for scanning sensors
JPH0359619A (en) 1989-07-28 1991-03-14 Nippon Telegr & Teleph Corp <Ntt> Optical amplifier
US5327161A (en) 1989-08-09 1994-07-05 Microtouch Systems, Inc. System and method for emulating a mouse input device with a touchpad input device
DE3930581A1 (en) 1989-09-13 1991-03-21 Asea Brown Boveri Work station for process control personnel - has display fields with windows accessed by mouse selection
GB2237911A (en) 1989-10-17 1991-05-15 Sharp Kk Display processing system
US5440326A (en) 1990-03-21 1995-08-08 Gyration, Inc. Gyroscopic pointer
US5898421A (en) 1990-03-21 1999-04-27 Gyration, Inc. Gyroscopic pointer and method
US5138154A (en) 1990-04-04 1992-08-11 Gyration Inc. Shaft angle encoder with rotating off-axis interference pattern
US5128671A (en) 1990-04-12 1992-07-07 Ltv Aerospace And Defense Company Control device having multiple degrees of freedom
US5396265A (en) 1990-09-17 1995-03-07 Massachusetts Institute Of Technology Three-dimensional tactile computer input device
US5181181A (en) 1990-09-27 1993-01-19 Triton Technologies, Inc. Computer apparatus input device for three-dimensional information
US5329276A (en) 1990-12-19 1994-07-12 Kabushiki Kaisha Yaskawa Denki Multidimensional signal input device
US5740471A (en) 1991-03-06 1998-04-14 Nikon Corporation Camera shake compensation device
US5341466A (en) 1991-05-09 1994-08-23 New York University Fractal computer user centerface with zooming capability
US6069594A (en) * 1991-07-29 2000-05-30 Logitech, Inc. Computer input device with multiple switches using single line
US5293879A (en) 1991-09-23 1994-03-15 Vitatron Medical, B.V. System and method for detecting tremors such as those which result from Parkinson's disease
US5640152A (en) 1991-10-04 1997-06-17 Copper; John M. Hand held computer input apparatus and method
US5485171A (en) 1991-10-04 1996-01-16 Micromed Systems, Inc. Hand held computer input apparatus and method
US5459489A (en) 1991-12-05 1995-10-17 Tv Interactive Data Corporation Hand held electronic remote control device
US5331563A (en) 1991-12-10 1994-07-19 Pioneer Electronic Corporation Navigation device and direction detection method therefor
US5404307A (en) 1991-12-19 1995-04-04 Pioneer Electronic Corporation Navigation apparatus with detected angular speed correction
US5587558A (en) 1992-01-24 1996-12-24 Seiko Instruments Inc. Coordinate detecting apparatus having acceleration detectors
US5280744A (en) 1992-01-27 1994-01-25 Alliedsignal Inc. Method for aiming towed field artillery pieces
US5412421A (en) 1992-04-30 1995-05-02 Westinghouse Electric Corporation Motion compensated sensor
US5359348A (en) 1992-05-21 1994-10-25 Selectech, Ltd. Pointing device having improved automatic gain control and information reporting
US5481957A (en) 1992-07-06 1996-01-09 Alliedsignal Inc. Aiming and pointing system for ground based weapons equipment
US5701424A (en) 1992-07-06 1997-12-23 Microsoft Corporation Palladian menus and methods relating thereto
US5506605A (en) 1992-07-27 1996-04-09 Paley; W. Bradford Three-dimensional mouse with tactile feedback
US5453758A (en) 1992-07-31 1995-09-26 Sony Corporation Input apparatus
JPH06308879A (en) 1992-08-19 1994-11-04 Fujitsu Ltd Optical pointing system
US7098891B1 (en) 1992-09-18 2006-08-29 Pryor Timothy R Method for providing human input to a computer
US5430435A (en) 1992-11-13 1995-07-04 Rhys Resources Adjustable athletic training system
US5524196A (en) 1992-12-18 1996-06-04 International Business Machines Corporation Method and system for manipulating data through a graphic user interface within a data processing system
US5706448A (en) 1992-12-18 1998-01-06 International Business Machines Corporation Method and system for manipulating data through a graphic user interface within a data processing system
US5638523A (en) 1993-01-26 1997-06-10 Sun Microsystems, Inc. Method and apparatus for browsing information in a computer database
NL9300171A (en) 1993-01-28 1994-08-16 Josephus Godefridus Wilhelmus Computer mouse based on a system of acceleration sensors disposed therein
US5383363A (en) 1993-02-10 1995-01-24 Ford Motor Company Inertial measurement unit providing linear and angular outputs using only fixed linear accelerometer sensors
US5393974A (en) 1993-03-06 1995-02-28 Jee; Sung N. Method and apparatus for detecting the motion variation of a projectile
JPH07284166A (en) 1993-03-12 1995-10-27 Mitsubishi Electric Corp Remote controller
US5554980A (en) 1993-03-12 1996-09-10 Mitsubishi Denki Kabushiki Kaisha Remote control system
US5822713A (en) 1993-04-05 1998-10-13 Contraves Usa Guided fire control system
JPH0728591A (en) 1993-05-13 1995-01-31 Toshiba Corp Space manipulation mouse system and space operation pattern input method
US5598187A (en) 1993-05-13 1997-01-28 Kabushiki Kaisha Toshiba Spatial motion pattern input system and input method
JPH0744315A (en) 1993-05-21 1995-02-14 Sony Corp Input device
US5745710A (en) 1993-05-24 1998-04-28 Sun Microsystems, Inc. Graphical user interface for selection of audiovisual programming
US5405152A (en) * 1993-06-08 1995-04-11 The Walt Disney Company Method and apparatus for an interactive video game with physical feedback
US5373857A (en) 1993-06-18 1994-12-20 Forte Technologies, Inc. Head tracking apparatus
US5484355A (en) * 1993-10-01 1996-01-16 Smith & Nephew Roylan, Inc. System for therapeutic exercise and evaluation
US5546309A (en) 1993-10-20 1996-08-13 The Charles Stark Draper Laboratory, Inc. Apparatus and method for autonomous satellite attitude sensing
GB2284478A (en) 1993-11-25 1995-06-07 Alps Electric Co Ltd Inclination measurement for cursor control
JPH07146123A (en) 1993-11-25 1995-06-06 Alps Electric Co Ltd Inclination detection apparatus and input apparatus using same
GB2316482A (en) 1993-11-25 1998-02-25 Alps Electric Co Ltd Inclination detection apparatus
JP3059619B2 (en) 1993-11-25 2000-07-04 アルプス電気株式会社 Tilt detecting device and input device using the same
US5771406A (en) 1993-12-10 1998-06-23 Nikon Corporation Camera with a shift optical system responsive to an external device
JP2901476B2 (en) 1993-12-27 1999-06-07 アルプス電気株式会社 Position detecting device and position detecting method
JPH07200142A (en) 1993-12-27 1995-08-04 Alps Electric Co Ltd Device and method for position detection
US5574479A (en) 1994-01-07 1996-11-12 Selectech, Ltd. Optical system for determining the roll orientation of a remote unit relative to a base unit
US5615132A (en) 1994-01-21 1997-03-25 Crossbow Technology, Inc. Method and apparatus for determining position and orientation of a moveable object using accelerometers
US5819206A (en) 1994-01-21 1998-10-06 Crossbow Technology, Inc. Method and apparatus for determining position and orientation of a moveable object using accelerometers
US5714698A (en) 1994-02-03 1998-02-03 Canon Kabushiki Kaisha Gesture input method and apparatus
US5794081A (en) 1994-03-03 1998-08-11 Olympus Optical Co., Ltd. Camera capable of detecting camera shake and compensating image blur due to camera shake
JPH07271546A (en) 1994-03-31 1995-10-20 Olympus Optical Co Ltd Image display control method
US5573011A (en) 1994-04-08 1996-11-12 Felsing; Gary W. System for quantifying neurological function
JPH07302148A (en) 1994-05-02 1995-11-14 Wacom Co Ltd Information input device
JP3262677B2 (en) 1994-05-02 2002-03-04 株式会社ワコム Information input device
JPH07318332A (en) 1994-05-26 1995-12-08 Alps Electric Co Ltd Angle detection apparatus and input device using it
US5627565A (en) 1994-05-26 1997-05-06 Alps Electric Co., Ltd. Space coordinates detecting device and input apparatus using same
JP3204844B2 (en) 1994-05-26 2001-09-04 アルプス電気株式会社 Angle detecting device and input device using the same
US5525764A (en) 1994-06-09 1996-06-11 Junkins; John L. Laser scanning graphic input system
US6404416B1 (en) 1994-06-09 2002-06-11 Corporation For National Research Initiatives Unconstrained pointing interface for natural human interaction with a display-based computer system
US5807284A (en) 1994-06-16 1998-09-15 Massachusetts Institute Of Technology Inertial orientation tracker apparatus method having automatic drift compensation for tracking human head and other similarly sized body
US5645077A (en) 1994-06-16 1997-07-08 Massachusetts Institute Of Technology Inertial orientation tracker apparatus having automatic drift compensation for tracking human head and other similarly sized body
US5741182A (en) 1994-06-17 1998-04-21 Sports Sciences, Inc. Sensing spatial movement
RU2126161C1 (en) 1994-06-27 1999-02-10 Коновалов Сергей Феодосьевич Compensation accelerometer
US6073490A (en) 1994-06-27 2000-06-13 Sergy Feodosievich Konovalov Servo accelerometer
US5644082A (en) 1994-06-30 1997-07-01 Matsushita Electric Industrial Co., Ltd. Vehicle rotational-angle calculating apparatus
US5617515A (en) 1994-07-11 1997-04-01 Dynetics, Inc. Method and apparatus for controlling and programming a robot or other moveable object
JPH0836459A (en) 1994-07-22 1996-02-06 Fujitsu Ltd Pointing device
JPH0834569A (en) 1994-07-25 1996-02-06 Sogo Keibi Hosho Co Ltd System for controlling nonstop of elevator
US6314575B1 (en) 1994-09-14 2001-11-06 Time Warner Entertainment Company, L.P. Telecasting service for providing video programs on demand with an interactive interface for facilitating viewer selection of video programs
JPH0895704A (en) 1994-09-28 1996-04-12 Alps Electric Co Ltd Spatial coordinate detecting device
JPH08106352A (en) 1994-10-05 1996-04-23 Alps Electric Co Ltd Spatial coordinate detecting device
WO1996011435A1 (en) 1994-10-07 1996-04-18 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
JPH08114415A (en) 1994-10-13 1996-05-07 Alps Electric Co Ltd Space coordinate detector
JP3194841B2 (en) 1994-10-24 2001-08-06 アルプス電気株式会社 Tilt detecting device and input device using the same
JPH08122070A (en) 1994-10-24 1996-05-17 Alps Electric Co Ltd Inclination detection device and input device using it
RU2141738C1 (en) 1994-10-26 1999-11-20 Telefonaktiebolaget LM Ericsson Method and device for detection and estimation of movement of mobile sets
US5572221A (en) 1994-10-26 1996-11-05 Telefonaktiebolaget Lm Ericsson Method and apparatus for detecting and predicting motion of mobile terminals
US5671342A (en) 1994-11-30 1997-09-23 Intel Corporation Method and apparatus for displaying information relating to a story and a story indicator in a computer system
JP3273531B2 (en) 1994-11-30 2002-04-08 アルプス電気株式会社 Remote coordinate pointing device
GB2319374A (en) 1994-11-30 1998-05-20 Alps Electric Co Ltd Remote coordinate designating apparatus
JPH08152959A (en) 1994-11-30 1996-06-11 Alps Electric Co Ltd Remote coordinate indicating device
US20020032696A1 (en) 1994-12-16 2002-03-14 Hideo Takiguchi Intuitive hierarchical time-series data display method and system
US5638092A (en) 1994-12-20 1997-06-10 Eng; Tommy K. Cursor control system
US6727887B1 (en) 1995-01-05 2004-04-27 International Business Machines Corporation Wireless pointing device for remote cursor control
US5757362A (en) 1995-01-05 1998-05-26 International Business Machines Corporation Recursive digital filter using fixed point arithmetic
US5835077A (en) 1995-01-13 1998-11-10 Remec, Inc. Computer control device
JPH08211993A (en) 1995-01-31 1996-08-20 Alps Electric Co Ltd Inclination detector
JP3228845B2 (en) 1995-01-31 2001-11-12 アルプス電気株式会社 Tilt detector
JPH08263255A (en) 1995-03-23 1996-10-11 Canon Inc Hierarchical data display method and browser system
JPH08262517A (en) 1995-03-27 1996-10-11 Olympus Optical Co Ltd Camera shake prediction device
US5757360A (en) 1995-05-03 1998-05-26 Mitsubishi Electric Information Technology Center America, Inc. Hand held computer control device
JPH08314625A (en) 1995-05-03 1996-11-29 Mitsubishi Electric Res Lab Inc Apparatus and system for control of computer
JPH08335136A (en) 1995-06-08 1996-12-17 Canon Inc Device and method for detecting coordinate
JP3517482B2 (en) 1995-06-08 2004-04-12 キヤノン株式会社 Coordinate detection device and method
US5736923A (en) 1995-07-11 1998-04-07 Union Switch & Signal Inc. Apparatus and method for sensing motionlessness in a vehicle
JPH0944311A (en) 1995-08-01 1997-02-14 Alpine Electron Inc Pointer positioning method for pointing device
US5828987A (en) 1995-08-28 1998-10-27 Data Tec Co., Ltd. Movement detecting device
US5748189A (en) * 1995-09-19 1998-05-05 Sony Corp Method and apparatus for sharing input devices amongst plural independent graphic display devices
US6002394A (en) 1995-10-02 1999-12-14 Starsight Telecast, Inc. Systems and methods for linking television viewers with advertisers and broadcasters
US6049823A (en) 1995-10-04 2000-04-11 Hwang; Ivan Chung-Shung Multi server, interactive, video-on-demand television system utilizing a direct-access-on-demand workgroup
CN1153675A (en) 1995-10-12 1997-07-09 科乐美株式会社 Image game apparatus, input support and control method for image game and image game medium
US5807174A (en) 1995-10-12 1998-09-15 Konami Co., Ltd. Method of assisting player in entering commands in video game, video game system, video game storage medium, and method of controlling video game
US5969706A (en) 1995-10-16 1999-10-19 Sharp Kabushiki Kaisha Information retrieval apparatus and method
JPH09114959A (en) 1995-10-16 1997-05-02 Sharp Corp Device and method for information retrieval
US6346959B1 (en) 1995-11-06 2002-02-12 Riso Kagaku Corporation Image forming apparatus and method for image formation
GB2307133A (en) 1995-11-13 1997-05-14 Secr Defence Video camera image stabilisation system
US6191774B1 (en) 1995-11-17 2001-02-20 Immersion Corporation Mouse interface for providing force feedback
CA2246412A1 (en) 1995-12-12 1997-06-19 Acceleron Technologies, Llc. System and method for measuring movement of objects
DE19701374A1 (en) 1996-01-17 1997-07-24 Lg Electronics Inc Hand held wireless three dimensional cursor control, remote control for computer systems, television, robots, computer peripherals
US5892501A (en) 1996-01-17 1999-04-06 Lg Electronics Inc. Three dimensional wireless pointing device
US5867146A (en) 1996-01-17 1999-02-02 Lg Electronics Inc. Three dimensional wireless pointing device
DE19701344A1 (en) 1996-01-17 1997-07-24 Lg Electronics Inc Hand held wireless three dimensional input unit for computer systems
US5703623A (en) 1996-01-24 1997-12-30 Hall; Malcolm G. Smart orientation sensing circuit for remote control
US5698784A (en) 1996-01-24 1997-12-16 Gyration, Inc. Vibratory rate gyroscope and methods of assembly and operation
US6463328B1 (en) 1996-02-02 2002-10-08 Michael Sasha John Adaptive brain stimulation method and system
US6466831B1 (en) 1996-02-09 2002-10-15 Murata Mfg. Co., Ltd. Three-dimensional data input device
US6164808A (en) 1996-02-09 2000-12-26 Murata Mfg. Co., Ltd. Three-dimensional data input device
US5692956A (en) * 1996-02-09 1997-12-02 Mattel, Inc. Combination computer mouse and game play control
US5661502A (en) 1996-02-16 1997-08-26 Ast Research, Inc. Self-adjusting digital filter for smoothing computer mouse movement
JPH09230997A (en) 1996-02-20 1997-09-05 Ricoh Co Ltd Pen type input device
US5902968A (en) 1996-02-20 1999-05-11 Ricoh Company, Ltd. Pen-shaped handwriting input apparatus using accelerometers and gyroscopes and an associated operational device for determining pen movement
US6084577A (en) 1996-02-20 2000-07-04 Ricoh Company, Ltd. Pen-shaped handwriting input apparatus using accelerometers and gyroscopes and an associated operational device for determining pen movement
US5825350A (en) 1996-03-13 1998-10-20 Gyration, Inc. Electronic pointing apparatus and method
JPH09251350A (en) 1996-03-18 1997-09-22 Matsushita Electric Ind Co Ltd Menu selection device
US5796395A (en) 1996-04-02 1998-08-18 Wegener Internet Projects Bv System for publishing and searching interests of individuals
JPH09274534A (en) 1996-04-04 1997-10-21 Ricoh Co Ltd Pen type input device
JPH09282085A (en) 1996-04-15 1997-10-31 Mitsumi Electric Co Ltd Position information input device
US6072467A (en) 1996-05-03 2000-06-06 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Continuously variable control of animated on-screen characters
US6230324B1 (en) 1996-05-27 2001-05-08 Fujitsu Limited Device for transmitting broadcast-program information and allowing other information sources to be accessed
JPH09322089A (en) 1996-05-27 1997-12-12 Fujitsu Ltd Broadcasting program transmitter, information transmitter, device provided with document preparation function and terminal equipment
JPH09319510A (en) 1996-05-27 1997-12-12 Ricoh Co Ltd Pen type input device
US6028594A (en) 1996-06-04 2000-02-22 Alps Electric Co., Ltd. Coordinate input device depending on input speeds
US6400406B1 (en) 1996-06-28 2002-06-04 Samsung Electronics Co., Ltd. Device and method for displaying broadcast program guide in a programmed recording system
US6412110B1 (en) 1996-08-06 2002-06-25 Starsight Telecast, Inc. Electronic program guide with interactive areas
US5989157A (en) 1996-08-06 1999-11-23 Walton; Charles A. Exercising system with electronic inertial game playing
US6411308B1 (en) 1996-08-14 2002-06-25 Samsung Electronics Co., Ltd. Television graphical user interface having variable channel control bars
US6181333B1 (en) 1996-08-14 2001-01-30 Samsung Electronics Co., Ltd. Television graphical user interface having channel and program sorting capabilities
US5835156A (en) 1996-08-14 1998-11-10 Samsung Electronics, Ltd. Television graphical user interface employing remote random access pointing device
US6191781B1 (en) 1996-08-14 2001-02-20 Samsung Electronics, Ltd. Television graphical user interface that combines electronic program guide with graphical channel changer
US6195089B1 (en) 1996-08-14 2001-02-27 Samsung Electronics Co., Ltd. Television graphical user interface having variable channel changer icons
US6057831A (en) 1996-08-14 2000-05-02 Samsung Electronics Co., Ltd. TV graphical user interface having cursor position indicator
US6016144A (en) 1996-08-14 2000-01-18 Samsung Electronics Co., Ltd. Multi-layered television graphical user interface
US5955988A (en) 1996-08-14 1999-09-21 Samsung Electronics Co., Ltd. Graphical user interface for establishing installation location for satellite based television system
US5978043A (en) 1996-08-14 1999-11-02 Samsung Electronics Co., Ltd. TV graphical user interface that provides customized lists of programming
US5940072A (en) 1996-08-15 1999-08-17 Samsung Information Systems America Graphics decompression using system ROM indexing in TV set top box
US6115028A (en) 1996-08-22 2000-09-05 Silicon Graphics, Inc. Three dimensional input system using tilt
US5790121A (en) 1996-09-06 1998-08-04 Sklar; Peter Clustering user interface
US5878286A (en) 1996-09-10 1999-03-02 Nikon Corporation Motion detection device for a photographic apparatus
JPH1093936A (en) 1996-09-18 1998-04-10 Toshiba Corp Broadcast transmitter, broadcast receiver and video reservation device
US6001014A (en) * 1996-10-01 1999-12-14 Sony Computer Entertainment Inc. Game machine control module and game machine
US6556127B1 (en) 1996-10-15 2003-04-29 Swisscom Ag Speaker verification method
US6144366A (en) 1996-10-18 2000-11-07 Kabushiki Kaisha Toshiba Method and apparatus for generating information input using reflected light image of target object
JPH10191468A (en) 1996-10-25 1998-07-21 Matsushita Electric Ind Co Ltd Audio and video system
US5889506A (en) 1996-10-25 1999-03-30 Matsushita Electric Industrial Co., Ltd. Video user's environment
US5745226A (en) 1996-11-04 1998-04-28 Litton Systems, Inc. Passive optical velocity measurement device and method
US5883619A (en) 1996-11-12 1999-03-16 Primax Electronics Ltd. Computer mouse for scrolling a view of an image
US5870079A (en) 1996-11-12 1999-02-09 Legaltech, Inc. Computer input device and controller therefor
DE19648487C1 (en) 1996-11-12 1998-06-10 Primax Electronics Ltd Computer mouse with additional display window controls
US6037933A (en) 1996-11-13 2000-03-14 Samsung Electronics Co., Ltd. TV graphical user interface for providing user access to preset time periods of TV program information
US6154723A (en) 1996-12-06 2000-11-28 The Board Of Trustees Of The University Of Illinois Virtual reality 3D interface system for data creation, viewing and editing
US5786805A (en) 1996-12-27 1998-07-28 Barry; Edwin Franklin Method and apparatus for improving object selection on a computer display by providing cursor control with a sticky property
CA2248364A1 (en) 1997-01-08 1998-07-16 Ferdinand Peer Instrument for compensating for hand tremor during the manipulation of fine structures
US5796354A (en) 1997-02-07 1998-08-18 Reality Quest Corp. Hand-attachable controller with direction sensing
JPH10240434A (en) 1997-02-27 1998-09-11 Matsushita Electric Ind Co Ltd Command menu selecting method
WO1998043183A1 (en) 1997-03-25 1998-10-01 Sony Electronics, Inc. Integrated search of electronic program guide, internet and other information resources
JPH10275048A (en) 1997-03-28 1998-10-13 Ricoh Co Ltd Pen type input device
DE19814254A1 (en) 1997-03-31 1998-10-15 Microsoft Corp Query-based electronic program guide
US8051450B2 (en) 1997-03-31 2011-11-01 Microsoft Corporation Query-based electronic program guide
US5982369A (en) 1997-04-21 1999-11-09 Sony Corporation Method for displaying on a screen of a computer system images representing search results
US6005551A (en) 1997-04-25 1999-12-21 Microsoft Corporation Offline force effect rendering
JPH10307676A (en) 1997-05-07 1998-11-17 Ricoh Co Ltd Pen type input device
US5881321A (en) 1997-05-09 1999-03-09 Cammotion, Inc. Camera motion sensing system
US6397387B1 (en) 1997-06-02 2002-05-28 Sony Corporation Client and server system
US20040078194A1 (en) 1997-06-10 2004-04-22 Coding Technologies Sweden Ab Source coding enhancement using spectral-band replication
US6188392B1 (en) 1997-06-30 2001-02-13 Intel Corporation Electronic pen device
US6175362B1 (en) 1997-07-21 2001-01-16 Samsung Electronics Co., Ltd. TV graphical user interface providing selection among various lists of TV channels
US6088031A (en) 1997-07-21 2000-07-11 Samsung Electronics Co., Ltd. Method and device for controlling selection of a menu item from a menu displayed on a screen
JPH1145150A (en) 1997-07-25 1999-02-16 Ricoh Co Ltd Pen type input device
US6621452B2 (en) 1997-08-19 2003-09-16 Siemens Vdo Automotive Corporation Vehicle information system
JPH1185387A (en) 1997-09-12 1999-03-30 Ricoh Co Ltd Posture input device, pen type input device with posture input function, and pen type input system with pen type input device
US6043807A (en) 1997-09-23 2000-03-28 At&T Corp. Mouse for positioning a cursor on a computer display and having a removable pen-type input device
US6005578A (en) 1997-09-25 1999-12-21 Mindsphere, Inc. Method and apparatus for visual navigation of information objects
US5953683A (en) 1997-10-09 1999-09-14 Ascension Technology Corporation Sourceless orientation sensor
US6282467B1 (en) 1997-10-14 2001-08-28 The Boeing Company Three-axis inertial attitude determination for spinning spacecraft
US5912612A (en) 1997-10-14 1999-06-15 Devolpi; Dean R. Multi-speed multi-direction analog pointing device
JPH11146299A (en) 1997-11-11 1999-05-28 Hitachi Ltd Device and method for displaying data television broadcasting
US5880722A (en) 1997-11-12 1999-03-09 Futuretel, Inc. Video cursor with zoom in the user interface of a video editor
US6902482B1 (en) * 1997-11-25 2005-06-07 Thomas G. Woolston Interactive electronic game having gyroscopic output effect
EP0919906A2 (en) 1997-11-27 1999-06-02 Matsushita Electric Industrial Co., Ltd. Control method
US6871413B1 (en) 1997-12-15 2005-03-29 Microstrain, Inc. Miniaturized inclinometer for angle measurement with accurate measurement indicator
US6492981B1 (en) 1997-12-23 2002-12-10 Ricoh Company, Ltd. Calibration of a system for tracking a writing instrument with multiple sensors
US6047132A (en) 1998-03-04 2000-04-04 Nikon Corporation Camera system and interchangeable lens to compensate for image blur when photographing at close range
US6092076A (en) 1998-03-24 2000-07-18 Navigation Technologies Corporation Method and system for map display in a navigation application
US6154199A (en) 1998-04-15 2000-11-28 Butler; Craig L. Hand positioned mouse
US6672962B1 (en) 1998-05-13 2004-01-06 Kabushiki Kaisha Sega Enterprises Gun-shaped controller and game device
US6268849B1 (en) 1998-06-30 2001-07-31 United Video Properties, Inc. Internet television program guide system with embedded real-time data
US6529218B2 (en) 1998-07-13 2003-03-04 Matsushita Electric Industrial Co., Ltd. Display control with movable or updatable auxiliary information
US6369837B1 (en) 1998-07-17 2002-04-09 International Business Machines Corporation GUI selector control
US6466199B2 (en) 1998-07-23 2002-10-15 Alps Electric Co., Ltd. Method for moving a pointing cursor
DE19937307A1 (en) 1998-08-10 2000-02-17 Deutsch Zentr Luft & Raumfahrt Method for technical control operations using control wheel, where pressure and turning actions are converted into translational and rotational movements of objects being controlled with wheel
US6583783B1 (en) 1998-08-10 2003-06-24 Deutsches Zentrum Fur Luft- Und Raumfahrt E.V. Process for performing operations using a 3D input device
JP2000056897A (en) 1998-08-11 2000-02-25 Toshiba Corp Computer system
US6369794B1 (en) 1998-09-09 2002-04-09 Matsushita Electric Industrial Co., Ltd. Operation indication outputting device for giving operation indication according to type of user's action
JP2000172248A (en) 1998-09-28 2000-06-23 Fujitsu Ltd Method of displaying electronic information, and device and program storage medium for reading electronic information
US6650343B1 (en) 1998-09-28 2003-11-18 Fujitsu Limited Electronic information displaying method, electronic information browsing apparatus and electronic information browsing program storing medium
US6295646B1 (en) 1998-09-30 2001-09-25 Intel Corporation Method and apparatus for displaying video data and corresponding entertainment data for multiple entertainment selection sources
JP2000115652A (en) 1998-10-02 2000-04-21 Matsushita Electric Ind Co Ltd Epg information display method and device, image recording and reproducing device and program recording medium
EP1126701A1 (en) 1998-10-02 2001-08-22 Matsushita Electric Industrial Co., Ltd. Epg information display method, epg information display device, video recording/reproducing device, and program
US6515669B1 (en) 1998-10-23 2003-02-04 Olympus Optical Co., Ltd. Operation input device applied to three-dimensional input device
US6765598B2 (en) 1998-10-27 2004-07-20 Samsung Electronics Co., Ltd. Method and apparatus for enabling selection in an on-screen menu
US6735777B1 (en) 1998-10-28 2004-05-11 Samsung Electronics Co., Ltd. Method for controlling program guide for displaying broadcast program title
US6452609B1 (en) 1998-11-06 2002-09-17 Supertuner.Com Web application for accessing media streams
US6978472B1 (en) 1998-11-30 2005-12-20 Sony Corporation Information providing device and method
WO2000033566A1 (en) 1998-11-30 2000-06-08 Sony Corporation Information providing device and method
JP2011052009A (en) 1998-12-07 2011-03-17 Zymogenetics Inc Growth factor homolog zvegf3
WO2000034474A2 (en) 1998-12-07 2000-06-15 Zymogenetics, Inc. Growth factor homolog zvegf3
US20030158699A1 (en) 1998-12-09 2003-08-21 Christopher P. Townsend Orientation sensor
US6163021A (en) 1998-12-15 2000-12-19 Rockwell Collins, Inc. Navigation system for spinning projectiles
TW392066B (en) 1998-12-17 2000-06-01 Tokin Corp Orientation angle detector
US6577350B1 (en) 1998-12-21 2003-06-10 Sony Corporation Method and apparatus for displaying an electronic program guide
US6198470B1 (en) 1998-12-28 2001-03-06 Uri Agam Computer input device
US6104969A (en) 1998-12-30 2000-08-15 Honeywell International Inc. Methods and apparatus for operating an input device in a turbulent environment
US6429813B2 (en) 1999-01-14 2002-08-06 Navigation Technologies Corp. Method and system for providing end-user preferences with a navigation system
US6330856B1 (en) 1999-01-28 2001-12-18 Innovative Product Achievements, Inc. Garment dispensing and receiving apparatus
US6400996B1 (en) 1999-02-01 2002-06-04 Steven M. Hoffberg Adaptive pattern recognition based control system and method
JP2000242383A (en) 1999-02-19 2000-09-08 Nec Corp Screen display enlargement control unit
US6614420B1 (en) 1999-02-22 2003-09-02 Microsoft Corporation Dual axis articulated electronic input device
JP2000270237A (en) 1999-03-15 2000-09-29 Nippon Hoso Kyokai <Nhk> Selector for image display device
US20020189870A1 (en) 1999-03-15 2002-12-19 Kamen Dean L. Control of a balancing personal vehicle
US6426761B1 (en) 1999-04-23 2002-07-30 International Business Machines Corporation Information presentation system for a graphical user interface
JP2000308756A (en) 1999-04-27 2000-11-07 Taito Corp Input controller of game device
US6737591B1 (en) 1999-05-25 2004-05-18 Silverbrook Research Pty Ltd Orientation sensing device
US20040075650A1 (en) 1999-05-25 2004-04-22 Lapstun Paul Orientation sensing device with processor
US6208936B1 (en) 1999-06-18 2001-03-27 Rockwell Collins, Inc. Utilization of a magnetic sensor to compensate a MEMS-IMU/GPS and de-spin strapdown on rolling missiles
JP2001008384A (en) 1999-06-21 2001-01-12 Toshiba Corp System screen display and recording medium
US6833844B1 (en) 1999-06-21 2004-12-21 Kabushiki Kaisha Toshiba System display apparatus and storing medium
US6415225B1 (en) 1999-08-06 2002-07-02 Aisin Aw Co., Ltd. Navigation system and a memory medium
JP2001052009A (en) 1999-08-06 2001-02-23 Sony Corp Device and method for processing information and medium
US6349257B1 (en) 1999-09-15 2002-02-19 International Business Machines Corporation System for personalized mobile navigation information
US6473713B1 (en) 1999-09-20 2002-10-29 American Gnc Corporation Processing method for motion measurement
JP2001100908A (en) 1999-09-28 2001-04-13 Fujitsu Ltd Pen tip trace generating method, pen type input device, and pen mounting unit
US6753849B1 (en) 1999-10-27 2004-06-22 Ken Curran & Associates Universal remote TV mouse
US6466200B1 (en) 1999-11-03 2002-10-15 Innalabs, Inc. Computer input device
US6466198B1 (en) 1999-11-05 2002-10-15 Innoventions, Inc. View navigation and magnification of a hand-held device with a display
US6844871B1 (en) 1999-11-05 2005-01-18 Microsoft Corporation Method and apparatus for computer input using six degrees of freedom
JP2001159951A (en) 1999-12-02 2001-06-12 Nec Corp Information processor and method for processing information
JP2001175412A (en) 1999-12-15 2001-06-29 Shigekazu Koshiba Remote controller for electronic equipment with multi- axial integral acceleration detector
US20020054129A1 (en) 1999-12-24 2002-05-09 U.S. Philips Corporation 3D environment labelling
US20010015123A1 (en) 2000-01-11 2001-08-23 Yoshiki Nishitani Apparatus and method for detecting performer's motion to interactively control performance of music or the like
US6421067B1 (en) 2000-01-16 2002-07-16 Isurftv Electronic programming guide
US20040204240A1 (en) 2000-02-22 2004-10-14 Barney Jonathan A. Magical wand and interactive play experience
US6766456B1 (en) 2000-02-23 2004-07-20 Micron Technology, Inc. Method and system for authenticating a user of a computer system
US7061469B2 (en) 2000-02-24 2006-06-13 Innalabs Technologies, Inc. Method of data input into a computer
US20030159051A1 (en) 2000-03-27 2003-08-21 Wilhelm Hollnagel Method for generating electronic signatures
US6496779B1 (en) 2000-03-30 2002-12-17 Rockwell Collins Inertial measurement unit with magnetometer for detecting stationarity
WO2001078055A1 (en) 2000-04-05 2001-10-18 Feinstein David Y View navigation and magnification of a hand-held device with a display
US6933923B2 (en) 2000-04-05 2005-08-23 David Y. Feinstein View navigation and magnification of a hand-held device with a display
US20020112237A1 (en) 2000-04-10 2002-08-15 Kelts Brett R. System and method for providing an interactive display interface for information objects
US20020033848A1 (en) 2000-04-21 2002-03-21 Sciammarella Eduardo Agusto System for managing data objects
US6544126B2 (en) 2000-04-25 2003-04-08 Nintendo Co., Ltd. Portable game machine with download capability
US6744420B2 (en) 2000-06-01 2004-06-01 Olympus Optical Co., Ltd. Operation input apparatus using sensor attachable to operator's hand
DE10029173A1 (en) 2000-06-19 2002-01-03 Deutsch Zentr Luft & Raumfahrt Method and arrangement for commanding control operations for kinematic movements of an object using a hand-operated input device
US20020021278A1 (en) 2000-07-17 2002-02-21 Hinckley Kenneth P. Method and apparatus using multiple sensors in a device with a display
US20020015064A1 (en) 2000-08-07 2002-02-07 Robotham John S. Gesture-based user interface to multi-level and multi-modal sets of bit-maps
TW530998U (en) 2000-08-14 2003-05-01 Jr-Feng Chen Computer input device with two elastic fulcrums for 6 degrees of freedom data input
JP2002062981A (en) 2000-08-16 2002-02-28 Nippon Hoso Kyokai <Nhk> Display screen indicating device
US6590536B1 (en) * 2000-08-18 2003-07-08 Charles A. Walton Body motion detecting system with correction for tilt of accelerometers and remote measurement of body position
US20020054158A1 (en) 2000-08-31 2002-05-09 Akiko Asami Information-processing apparatus and computer-graphic display program
JP2002091692A (en) 2000-09-12 2002-03-29 Seiko Instruments Inc Pointing system
US6613000B1 (en) * 2000-09-30 2003-09-02 The Regents Of The University Of California Method and apparatus for mass-delivered movement rehabilitation
US6873931B1 (en) 2000-10-10 2005-03-29 Csi Technology, Inc. Accelerometer based angular position sensor
US6583781B1 (en) 2000-10-17 2003-06-24 International Business Machines Corporation Methods, systems and computer program products for controlling events associated with user interface elements by capturing user intent based on pointer movements
US6385542B1 (en) 2000-10-18 2002-05-07 Magellan Dis, Inc. Multiple configurations for a vehicle navigation system
US6757446B1 (en) 2000-11-27 2004-06-29 Microsoft Corporation System and process for image-based relativistic rendering
US20040050924A1 (en) 2000-12-06 2004-03-18 Falk Mletzko Enabling of devices
JP2002207703A (en) 2001-01-11 2002-07-26 Sony Corp Electronic equipment
JP2002215327A (en) 2001-01-18 2002-08-02 Nec Eng Ltd Mouse control mechanism
US20020140745A1 (en) 2001-01-24 2002-10-03 Ellenby Thomas William Pointing systems for addressing objects
US6529161B2 (en) 2001-02-08 2003-03-04 Mitsubishi Denki Kabushiki Kaisha Antenna control method and antenna controller
US6561993B2 (en) 2001-02-26 2003-05-13 International Business Machines Corporation Device driver system for minimizing adverse tremor effects during use of pointing devices
US20020118123A1 (en) 2001-02-27 2002-08-29 Kim Sung-Cheol Space keyboard system using force feedback and method of inputting information therefor
JP2002259335A (en) 2001-03-01 2002-09-13 Konica Corp Information recognition system, digital image pickup device, store printer, portable telephone set, information processor, information recognizing method and information recording medium
US20020126026A1 (en) 2001-03-09 2002-09-12 Samsung Electronics Co., Ltd. Information input system using bio feedback and method thereof
US6819344B2 (en) 2001-03-12 2004-11-16 Microsoft Corporation Visualization of multi-dimensional data having an unbounded dimension
US20020126121A1 (en) 2001-03-12 2002-09-12 Robbins Daniel C. Visualization of multi-dimensional data having an unbounded dimension
US20020130835A1 (en) 2001-03-16 2002-09-19 Brosnan Michael John Portable electronic device with mouse-like capabilities
US6897854B2 (en) 2001-04-12 2005-05-24 Samsung Electronics Co., Ltd. Electronic pen input device and coordinate detecting method therefor
JP2002312117A (en) 2001-04-17 2002-10-25 Japan Aviation Electronics Industry Ltd Cylindrical image spherical image control device
US20020158843A1 (en) 2001-04-26 2002-10-31 International Business Machines Corporation Method and adapter for performing assistive motion data processing and/or button data processing external to a computer
US6650313B2 (en) 2001-04-26 2003-11-18 International Business Machines Corporation Method and adapter for performing assistive motion data processing and/or button data processing external to a computer
US6557350B2 (en) 2001-05-17 2003-05-06 General Electric Company Method and apparatus for cooling gas turbine engine igniter tubes
US20040236573A1 (en) 2001-06-19 2004-11-25 Sapeluk Andrew Thomas Speaker recognition systems
JP2003009577A (en) 2001-06-20 2003-01-10 Yamaha Motor Co Ltd Drive controller for brushless dc motor
KR20030009577A (en) 2001-06-27 2003-02-05 (주)이에스비컨 Three-dimensional input device using a gyroscope
US20040239626A1 (en) 2001-08-13 2004-12-02 Noguera Gritsko Perez Tilt-based pointing for hand-held devices
US20030038778A1 (en) 2001-08-13 2003-02-27 Siemens Information And Communication Mobile, Llc Tilt-based pointing for hand-held devices
JP2002082773A (en) 2001-08-22 2002-03-22 Sony Corp Input device and its method
WO2003021947A1 (en) 2001-08-29 2003-03-13 Digeo, Inc. System and method for displaying option representations with multiple levels of specificity
US7093201B2 (en) 2001-09-06 2006-08-15 Danger, Inc. Loop menu navigation apparatus and method
US6661410B2 (en) 2001-09-07 2003-12-09 Microsoft Corporation Capacitive sensing and data input device power management
US6874413B2 (en) 2001-09-27 2005-04-05 Thieme Gmbh & Co. Kg Top part of a screen printing machine with bearing elements for a screen printing stencil
US20030172283A1 (en) 2001-10-25 2003-09-11 O'hara Sean M. Biometric characteristic-enabled remote control device
DE10241392A1 (en) 2001-10-26 2003-05-15 Agilent Technologies Inc Optical sensor device for sensing relative movement, includes movable motion sensor with two arrays of photo detectors, and multiple lenses that direct far-field images onto arrays of photo detectors
US6770863B2 (en) 2001-10-26 2004-08-03 Agilent Technologies, Inc. Apparatus and method for three-dimensional relative movement sensing
US20030080282A1 (en) 2001-10-26 2003-05-01 Walley Thomas M. Apparatus and method for three-dimensional relative movement sensing
US20030093311A1 (en) 2001-11-05 2003-05-15 Kenneth Knowlson Targeted advertising
US20030115930A1 (en) 2001-11-13 2003-06-26 Nokia Corporation Method, device and system for calibrating angular rate measurement sensors
WO2003048909A2 (en) 2001-12-04 2003-06-12 Applied Neural Computing, L.L.C. Validating the identity of a user using a pointing device
US20030107551A1 (en) 2001-12-10 2003-06-12 Dunker Garrett Storm Tilt input device
US6724368B2 (en) 2001-12-14 2004-04-20 Koninklijke Philips Electronics N.V. Remote control system and method for a television receiver
US6990639B2 (en) 2002-02-07 2006-01-24 Microsoft Corporation System and process for controlling electronic components in a ubiquitous computing environment using multimodal integration
US20030193572A1 (en) 2002-02-07 2003-10-16 Andrew Wilson System and process for selecting objects in a ubiquitous computing environment
US6982697B2 (en) 2002-02-07 2006-01-03 Microsoft Corporation System and process for selecting objects in a ubiquitous computing environment
KR200276592Y1 (en) 2002-02-22 2002-05-24 오성훈 Cutter for cutting round wire
US6929548B2 (en) 2002-04-23 2005-08-16 Xiaoling Wang Apparatus and a method for more realistic shooting video games on computers or similar devices
DE10219198A1 (en) 2002-04-29 2003-11-06 Univ Leipzig Cursor movement control device comprises a device that is moved in space so that sensors, such as acceleration sensors, detect the movement which is converted into a command signal that is transmitted to a computer or similar
US20030210230A1 (en) 2002-05-09 2003-11-13 Waters Richard C. Invisible beam pointer system
US20040008191A1 (en) 2002-06-14 2004-01-15 Ivan Poupyrev User interface apparatus and portable information apparatus
US20050219217A1 (en) 2002-07-11 2005-10-06 International Business Machines Corporation Peripheral device for a data processing system
JP2004061502A (en) 2002-07-26 2004-02-26 Microsoft Corp Electrostatic capacitive sensing system and its method
US6954867B2 (en) 2002-07-26 2005-10-11 Microsoft Corporation Capacitive sensing employing a repeatable offset charge
US20060007115A1 (en) 2002-07-31 2006-01-12 Sharp Kabushiki Kaisha Display device for presentation
JP2004062774A (en) 2002-07-31 2004-02-26 Sharp Corp Presentation display device
US6984208B2 (en) 2002-08-01 2006-01-10 The Hong Kong Polytechnic University Method and apparatus for sensing body gesture, posture and movement
US20040036650A1 (en) 2002-08-26 2004-02-26 Morgan Kenneth S. Remote velocity sensor slaved to an integrated GPS/INS
US20050222802A1 (en) 2002-08-27 2005-10-06 Yasuhiro Tamura Mobile terminal apparatus
JP2004086511A (en) 2002-08-27 2004-03-18 Innotech Corp Electronic equipment and pointer moving method
US20040064252A1 (en) 2002-09-26 2004-04-01 Honeywell International Inc. Method and system for processing pulse signals within an inertial navigation system
JP2004126756A (en) 2002-09-30 2004-04-22 Pentel Corp Correction value setting method for coordinate input device
US20040227725A1 (en) 2002-10-14 2004-11-18 Stmicroelectronics S.R.L. User controlled device for sending control signals to an electric appliance, in particular user controlled pointing device such as mouse or joystick, with 3D-motion detection
US20040070564A1 (en) 2002-10-15 2004-04-15 Dawson Thomas P. Method and system for controlling a display device
US20040095317A1 (en) 2002-11-20 2004-05-20 Jingxi Zhang Method and apparatus of universal remote pointing control for home entertainment system and computer
US6975959B2 (en) 2002-12-03 2005-12-13 Robert Bosch Gmbh Orientation and navigation for a mobile device using inertial sensors
US20060150734A1 (en) 2002-12-10 2006-07-13 Koninklijke Philips Electronics N.V. Activity monitoring
US20060125789A1 (en) 2002-12-23 2006-06-15 Jiawen Tu Contactless input device
US20040123320A1 (en) 2002-12-23 2004-06-24 Mike Daily Method and system for providing an interactive guide for multimedia selection
US20040151218A1 (en) 2002-12-23 2004-08-05 Vlad Branzoi Systems and methods for tremor cancellation in pointers
US20040125073A1 (en) 2002-12-30 2004-07-01 Scott Potter Portable electronic apparatus and method employing motion sensor for function control
US20040176991A1 (en) 2003-03-05 2004-09-09 Mckennan Carol System, method and apparatus using biometrics to communicate dissatisfaction via stress level
US20040189620A1 (en) 2003-03-19 2004-09-30 Samsung Electronics Co., Ltd. Magnetic sensor-based pen-shaped input system and a handwriting trajectory recovery method therefor
US20040193413A1 (en) 2003-03-25 2004-09-30 Wilson Andrew D. Architecture for controlling a computer using hand gestures
US20040193925A1 (en) 2003-03-26 2004-09-30 Matan Safriel Portable password manager
US20040196287A1 (en) 2003-04-01 2004-10-07 Wong Pak Chung Dynamic visualization of data streams
US20050008148A1 (en) 2003-04-02 2005-01-13 Dov Jacobson Mouse performance identification
US20040221243A1 (en) 2003-04-30 2004-11-04 Twerdahl Timothy D Radial menu interface for handheld computing device
US20040218104A1 (en) 2003-05-01 2004-11-04 Smith Gregory C. Multimedia user interface
US20040252120A1 (en) 2003-05-08 2004-12-16 Hunleth Frank A. Systems and methods for node tracking and notification in a control framework including a zoomable graphical user interface
US20050125826A1 (en) 2003-05-08 2005-06-09 Hunleth Frank A. Control framework with a zoomable graphical user interface for organizing selecting and launching media items
US20040268393A1 (en) 2003-05-08 2004-12-30 Hunleth Frank A. Control framework with a zoomable graphical user interface for organizing, selecting and launching media items
US20040229693A1 (en) 2003-05-13 2004-11-18 Clifton Lind Multiple video display gaming machine and gaming system
US20050017454A1 (en) * 2003-06-09 2005-01-27 Shoichi Endo Interactive gaming systems with haptic feedback
US7038661B2 (en) 2003-06-13 2006-05-02 Microsoft Corporation Pointing device and cursor for use in intelligent computing environments
US20050033200A1 (en) 2003-08-05 2005-02-10 Soehren Wayne A. Human motion identification and measurement system and method
US20050037843A1 (en) 2003-08-11 2005-02-17 William Wells Three-dimensional image display for a gaming apparatus
US20050041014A1 (en) 2003-08-22 2005-02-24 Benjamin Slotznick Using cursor immobility to suppress selection errors
US20050213840A1 (en) 2003-10-17 2005-09-29 Mei Chen Method for image stabilization by adaptive filtering
US7254279B2 (en) 2003-10-17 2007-08-07 Hewlett-Packard Development Company, L.P. Method for image stabilization by adaptive filtering
US7155974B2 (en) 2003-10-20 2007-01-02 Honda Motor Co., Ltd. Inertia sensor unit
US20050174324A1 (en) 2003-10-23 2005-08-11 Hillcrest Communications, Inc. User interface devices and methods employing accelerometers
US20050097474A1 (en) 2003-10-31 2005-05-05 Accot Johnny I. Spiral scrollbar
US6998966B2 (en) 2003-11-26 2006-02-14 Nokia Corporation Mobile communication device having a functional cover for controlling sound applications by motion
US20050160813A1 (en) 2004-01-27 2005-07-28 Nobuyuki Imai Clock generating device, vibration type gyro sensor, navigation device, imaging device, and electronic apparatus
US20050176505A1 (en) * 2004-02-09 2005-08-11 Stanley Mark J. Method and apparatus for providing computer pointing device input to a video game console
US7173604B2 (en) 2004-03-23 2007-02-06 Fujitsu Limited Gesture identification of controlled devices
US20050212767A1 (en) 2004-03-23 2005-09-29 Marvit David L Context dependent gesture response
US20050212766A1 (en) * 2004-03-23 2005-09-29 Reinhardt Albert H M Translation controlled cursor
US20050212749A1 (en) 2004-03-23 2005-09-29 Marvit David L Motion sensor engagement for a handheld device
JP2007531942A (en) 2004-04-01 2007-11-08 Dov Jacobson Mouse motion identification
WO2005099166A2 (en) 2004-04-01 2005-10-20 Dov Jacobson Mouse performance identification
US20050222784A1 (en) 2004-04-01 2005-10-06 Blue Line Innovations Inc. System and method for reading power meters
US20060061545A1 (en) 2004-04-02 2006-03-23 Media Lab Europe Limited ( In Voluntary Liquidation). Motion-activated control with haptic feedback
US7383517B2 (en) 2004-04-21 2008-06-03 Microsoft Corporation System and method for acquiring a target with intelligent pointer movement
US20050246109A1 (en) 2004-04-29 2005-11-03 Samsung Electronics Co., Ltd. Method and apparatus for entering information into a portable electronic device
US20080158154A1 (en) 2004-04-30 2008-07-03 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US20080158155A1 (en) 2004-04-30 2008-07-03 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
US10782792B2 (en) 2004-04-30 2020-09-22 Idhl Holdings, Inc. 3D pointing devices with orientation compensation and improved usability
US7158118B2 (en) 2004-04-30 2007-01-02 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US10514776B2 (en) 2004-04-30 2019-12-24 Idhl Holdings, Inc. 3D pointing devices and methods
US9298282B2 (en) 2004-04-30 2016-03-29 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US9261978B2 (en) 2004-04-30 2016-02-16 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US8994657B2 (en) 2004-04-30 2015-03-31 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
US8937594B2 (en) 2004-04-30 2015-01-20 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US8766917B2 (en) 2004-04-30 2014-07-01 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US8629836B2 (en) 2004-04-30 2014-01-14 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US8072424B2 (en) 2004-04-30 2011-12-06 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US20050243062A1 (en) 2004-04-30 2005-11-03 Hillcrest Communications, Inc. Free space pointing devices with tilt compensation and improved usability
US7236156B2 (en) 2004-04-30 2007-06-26 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
US7239301B2 (en) 2004-04-30 2007-07-03 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US20050243061A1 (en) 2004-04-30 2005-11-03 Hillcrest Communications, Inc. Methods and devices for identifying users based on tremor
US7262760B2 (en) 2004-04-30 2007-08-28 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US20070247425A1 (en) 2004-04-30 2007-10-25 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
US20060028446A1 (en) 2004-04-30 2006-02-09 Hillcrest Communications, Inc. Methods and devices for removing unintentional movement in free space pointing devices
US7535456B2 (en) 2004-04-30 2009-05-19 Hillcrest Laboratories, Inc. Methods and devices for removing unintentional movement in 3D pointing devices
US7489298B2 (en) 2004-04-30 2009-02-10 Hillcrest Laboratories, Inc. 3D pointing devices and methods
US7414611B2 (en) 2004-04-30 2008-08-19 Hillcrest Laboratories, Inc. 3D pointing devices with orientation compensation and improved usability
US20050253806A1 (en) 2004-04-30 2005-11-17 Hillcrest Communications, Inc. Free space pointing devices and methods
US20050275623A1 (en) 2004-06-14 2005-12-15 Siemens Information And Communication Mobile Llc Optical joystick for hand-held communication device
US7166832B2 (en) 2004-07-09 2007-01-23 Funai Electric Co., Ltd. Self-running robot
US7194816B2 (en) 2004-07-15 2007-03-27 C&N Inc. Mobile terminal apparatus
JP2006113019A (en) 2004-10-18 2006-04-27 Alps Electric Co Ltd Triaxial type electronic compass, and azimuth detecting method using same
US20060092133A1 (en) 2004-11-02 2006-05-04 Pierre A. Touma 3D mouse and game controller based on spherical coordinates system and system for use
US20060109242A1 (en) 2004-11-19 2006-05-25 Simpkins Daniel S User interface for impaired users
US20090002203A1 (en) 2005-02-25 2009-01-01 Kenji Kataoka Mobile Device
US8106795B2 (en) 2005-02-25 2012-01-31 Nec Corporation Mobile device
US7487045B1 (en) 2005-03-10 2009-02-03 William Vieira Projected score area calculator and method of use
US20060262116A1 (en) 2005-05-19 2006-11-23 Hillcrest Laboratories, Inc. Global navigation objects in user interfaces
US20070035518A1 (en) 2005-07-01 2007-02-15 Hillcrest Laboratories, Inc. 3D pointing devices
WO2007007227A2 (en) 2005-07-11 2007-01-18 Philips Intellectual Property & Standards Gmbh Method of controlling a control point position on a command area and method for control of a device
JP2007083013A (en) 2005-08-24 2007-04-05 Nintendo Co Ltd Game controller and game system
US20070072680A1 (en) 2005-08-24 2007-03-29 Nintendo Co., Ltd. Game controller and game system
US20070066394A1 (en) 2005-09-15 2007-03-22 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US7421343B2 (en) 2005-10-27 2008-09-02 Honeywell International Inc. Systems and methods for reducing vibration-induced errors in inertial sensors
US7843425B2 (en) 2005-12-16 2010-11-30 Industrial Technology Research Institute Motion recognition system and method for controlling electronic devices
US7188045B1 (en) 2006-03-09 2007-03-06 Dean A. Cirielli Three-dimensional position and motion telemetry input
US7353134B2 (en) 2006-03-09 2008-04-01 Dean A. Cirielli Three-dimensional position and motion telemetry input
US7409292B2 (en) 2006-05-26 2008-08-05 Honeywell International Inc. Method and system for degimbalization of vehicle navigation data
US20080016962A1 (en) 2006-07-24 2008-01-24 Honeywell International Inc. Medical use angular rate sensor
US20080024435A1 (en) 2006-07-25 2008-01-31 Nintendo Co., Ltd. Information processing device and storage medium storing information processing program
US20080108870A1 (en) 2006-11-06 2008-05-08 Wiita Bruce E Apparatus and method for stabilizing an image from an endoscopic camera
US20080134784A1 (en) 2006-12-12 2008-06-12 Industrial Technology Research Institute Inertial input apparatus with six-axial detection ability and the operating method thereof
US20090153389A1 (en) 2007-12-14 2009-06-18 Apple Inc. Scroll bar with video region in a media system
US20100153872A1 (en) 2008-12-11 2010-06-17 Samsung Electronics Co., Ltd. Method for providing graphical user interface and electronic device using the same

Non-Patent Citations (300)

* Cited by examiner, † Cited by third party
Title
Alexander D. Chinoy's Letter Requesting Return of Physical Exhibits; United States International Trade Commission Investigation No. 337-TA-658; Oct. 16, 2009.
Allen et al., "Tracking: Beyond 15 Minutes of Thought," SIGGRAPH, Aug. 12-17, 2001.
Analog Devices ADXL202, "Low Cost ± 2g Dual Axis iMEMS® Accelerometer with Digital Output," 1998.
Analog Devices ADXL202/ADXL210, "Low Cost ± 2 g/±10 g Dual Axis iMEMS® Accelerometers with Digital Output," 1999.
Analog Devices ADXL202E, "Low-Cost ± 2g Dual-Axis Accelerometer with Duty Cycle Output," 2000.
Analog Devices ADXL330, "Small, Low Power, 3-Axis ± 3 g iMEMS® Accelerometer," 2006.
Analog Devices ADXL50, "Monolithic Accelerometer With Signal Conditioning," 1996.
Analog Devices, "+/− 150 degree/s Single Chip Yaw Rate Gyro with Signal Conditioning", ADXRS150, 2003, 14 pages.
Ang et al., "Physical Model of a MEMS Accelerometer for Low-g Motion Tracking Applications," Proceedings of the 2004 IEEE International Conference on Robotics & Automation, Apr. 2004, New Orleans, LA, USA.
Ang, W.T. et al., "Design of All-Accelerometer Inertial Measurement Unit for Tremor Sensing in Hand-held Microsurgical Instrument," Proceedings of the 2003 IEEE International Conference on Robotics & Automation, Sep. 2003, pp. 1781-1786.
Ang, W.T., et al., "Design and Implementation of Active Error Canceling in Hand-held Microsurgical Instrument," Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 2, Oct. 2001, pp. 1106-1111.
Answer of Nintendo Co., Ltd. and Nintendo of America Inc. to the Complaint of Hillcrest Laboratories, Inc. Under Section 337 of the Tariff Act of 1930, as Amended; United States International Trade Commission Investigation No. 337-TA-658; Oct. 20, 2008.
Anthony, Jhon, Wireless Mouse and Keyboard [online], Aug. 7, 2003 [retrieved on Nov. 15, 2015]. Retrieved from the Internet: <URL: https://web.archive.org/web/20030807075132/http://computerhelpatoz.com/tech-wireless-mouse.html>.
Appellant, Movea SA, Notice of Appeal of the Decision of the Opposition Division rejecting the Opposition filed against European Patent No. EP1741088, EPO Communication of the Board of Appeal for Appeal No. T1912/17-3.5.05 dated Sep. 1, 2017, 5 pages.
Appellant, Movea SA, Reply to Respondent, IDHL Holdings, Inc., Response to Movea SA Statement of Grounds of Appeal in Appeal No. T1912/17-3.5.05 dated Oct. 29, 2018, 9 pages.
Appellant, Movea SA, Response to the Preliminary Opinion of the Board of Appeal in Appeal No. T1912/17-3.5.05 dated Jan. 10, 2020, 16 pages.
Appellant, Movea SA, Statement of Grounds of Appeal of the Decision of the Opposition Division rejecting the Opposition filed against European Patent No. EP1741088, EPO Communication of the Board of Appeal for Appeal No. T1912/17-3.5.05 dated Nov. 6, 2017, 45 pages.
Appendices A, B, and C from U.S. Pat. No. 6,069,594 to Barnes et al., pp. 1-104, May 30, 2000.
Arangarasan et al., "Modular Approach of Multimodal Integration in a Virtual Environment," Proceedings of the Fourth IEEE International Conference on Multimodal Interfaces (ICMI'02), Oct. 16, 2002, pp. 331-336.
Bachmann et al., "Design and Implementation of MARG Sensors for 3-DOF Orientation Measurement of Rigid Bodies," Proceedings of the 2003 IEEE International Conference on Robotics and Automation, ICRA '03, Sep. 14-19, 2003, pp. 1171-1178, vol. 1.
Bachmann et al., "Orientation Tracking for Humans and Robots Using Inertial Sensors," Proceedings of the 1999 International Symposium on Computational Intelligence in Robotics & Automation (CIRA99), Dec. 1999, pp. 187-194.
Bachmann, "Inertial and Magnetic Tracking of Limb Segment Orientation for Inserting Humans Into Synthetic Environments," Dissertation, Naval Postgraduate School, Monterey, California, Dec. 2000.
Baerveldt et al., "A Low-cost and Low-weight Attitude Estimation System for an Autonomous Helicopter," 1997 IEEE International Conference on Intelligent Engineering Systems, 1997, INES '97 Proceedings, Sep. 15-17, 1997, pp. 391-395.
Barshan et al., "Inertial Navigation Systems for Mobile Robots," IEEE Transactions on Robotics and Automation, Jun. 1995, vol. 11, No. 3.
Bederson, "Quantum Treemaps and Bubblemaps for a Zoomable Image Browser," UIST 2001, ACM Symposium on User Interface Software and Technology, CHI Letters, 3(2), Nov. 11-14, 2001, Orlando, FL, U.S.A., pp. 71-80.
Bibi et al., "Modeling and Simulation of Bias Uncertainty of Gyroscope for Improved Navigation," 2nd International Bhurban Conference on Applied Sciences and Technology, Jun. 16-21, 2003, Bhurban, Pakistan.
Bier et al., "A Taxonomy of See-Through Tools," Proceedings of CHI '94, ACM, Apr. 24-28, 1994, Boston, MA, U.S.A., pp. 358-364.
Bier et al., "Toolglass and Magic Lenses: The See-Through Interface," Proceedings of Siggraph '93, Computer Graphics Annual Conference Series, ACM, Aug. 1993, Anaheim, CA, U.S.A., pp. 73-80.
Blomster, "Orientation estimation combining vision and gyro measurements," KTH Electrical Engineering, Master's Degree Project, Apr. 6, 2006, Stockholm, Sweden.
Bowman et al., "3D User Interfaces: Theory and Practice," Aug. 2004, Addison Wesley Longman Publishing Co., Inc., Redwood City, CA, USA.
Card et al., "The Design Space of Input Devices," CHI '90 Proceedings, Apr. 1990.
Caruso et al., "Vehicle Detection and Compass Applications using AMR Magnetic Sensors," Proceedings Sensors Expo, May 4-6, 1999, Baltimore, MD, USA.
Caruso, "Applications of Magnetoresistive Sensors in Navigation Systems," SAE Technical Paper Series, 970602, SAE The International Engineering Society for Advancing Mobility Land Sea Air and Space, International Congress & Exposition, Feb. 24-27, 1997, Detroit, MI, USA.
Choukroun et al., "A Novel Quaternion Kalman Filter," Faculty of Aerospace Engineering, Technion—Israel Institute of Technology, TAE No. 930, Jan. 2004.
Choukroun, "Novel Methods for Attitude Determination Using Vector Observations," Research Thesis, Senate of the Technion—Israel Institute of Technology, May 2003.
Ciciora et al., "Modern Cable Television Technology; Video, Voice, and Data Communications," 1999, Morgan Kaufmann Publishers, Inc., San Francisco, CA, USA, Chapters 1, 7, 8, and 18-20.
Clarendon Press, Oxford, "The New Shorter Oxford English Dictionary on Historical Principles," vol. 1, A-M, 1993, pp. 286, 371, 374, 1076, 2266-2268, 2371, 2542-2543.
Commission Investigative Staff's List of Admitted Exhibits (Public and Confidential), United States International Trade Commission Investigation No. 337-TA-658, Jul. 21, 2009.
Commission Investigative Staff's Rebuttal Exhibit List, United States International Trade Commission Investigation No. 337-TA-658, Apr. 29, 2009.
Commission Investigative Staff's Response to Motion by Donald S. Odell to Quash Subpoena Duces Tecum and Ad Testificandum, United States International Trade Commission Investigation No. 337-TA-658, Feb. 13, 2009.
Commission Investigative Staff's Response to Motion of Nintendo Co., Ltd. and Nintendo of America Inc. for Leave to File Second Amended Answer to Complaint of Hillcrest Laboratories, Inc., United States International Trade Commission Investigation No. 337-TA-658, Mar. 30, 2009.
Commission Investigative Staff's Response to Respondent Nintendo's Motion for Summary Determination of Unpatentability of the '118, '760, and '611 Patents Under 35 U.S.C. § 101, United States International Trade Commission Investigation No. 337-TA-658, Mar. 4, 2009.
Complainant's Combined List of Confidential and Public Exhibits Admitted or Rejected During the Evidentiary Hearing, United States International Trade Commission Investigation No. 337-TA-658, May 21, 2009.
Complainant's List of Confidential Exhibits Admitted or Rejected During the Evidentiary Hearing, United States International Trade Commission Investigation No. 337-TA-658, May 21, 2009.
Complainant's List of Public Exhibits Admitted or Rejected During the Evidentiary Hearing, United States International Trade Commission Investigation No. 337-TA-658, May 21, 2009.
Day 1 Final Hearing Transcript, United States International Trade Commission Investigation No. 337-TA-658, May 11, 2009.
Day 2 Final Hearing Transcript, United States International Trade Commission Investigation No. 337-TA-658, May 12, 2009.
Day 3 Final Hearing Transcript, United States International Trade Commission Investigation No. 337-TA-658, May 13, 2009.
Day 4 Final Hearing Transcript, United States International Trade Commission Investigation No. 337-TA-658, May 14, 2009.
Day 5 Final Hearing Transcript, United States International Trade Commission Investigation No. 337-TA-658, May 15, 2009.
Decision of Rejection dated Feb. 14, 2012 in related JP Application No. 2007-511071.
Decision of Rejection dated Feb. 15, 2011 in related JP Application No. 2007-511062.
Decision of Rejection dated Jun. 23, 2014 in related JP Application No. 2012-126909.
Decision of the Boards of Appeal in Appeal No. T1912/17-3.5.05, EPO Communication of the Board of Appeal dated May 11, 2020, 17 pages.
Decision of the Opposition Division rejecting the Opposition filed against European Patent No. EP1741088, EPO Communication dated Jun. 21, 2017, 29 pages.
Decision on Appeal dated Sep. 26, 2014 in related U.S. Appl. No. 11/821,018.
Decision to Refuse Application dated Dec. 30, 2011 in related EP Application No. 05757855.1.
Demonstratives Submitted in United States International Trade Commission Investigation No. 337-TA-658, 2009.
Digalakis, V., et al., "ML Estimation of a Stochastic Linear System with the EM Algorithm and Its Application to Speech Recognition," IEEE Transactions on Speech and Audio Processing, vol. 1, No. 4, Oct. 1993, pp. 431-442.
Dogancay, "Bias Compensation for the Bearings-Only Pseudolinear Target Track Estimator," IEEE Transactions on Signal Processing, Jan. 2006, vol. 54, No. 1.
Eibele et al., "Orientation as an additional User Interface in Mixed-Reality Environments," Workshop Erweiterte Und Virtuelle Realitat, 2004, pp. 79-90.
European Patent Office Communication dated Apr. 28, 2011 for EP Application No. 09769324.6 comprising Third Party Observation dated Apr. 14, 2011.
Ex Parte Application for Issuance of Subpoena Duces Tecum and Ad Testificandum to Christopher Roller, United States International Trade Commission Investigation No. 337-TA-658, Jan. 7, 2009.
Extended European Search Report dated Apr. 11, 2014 in related EP Application No. 13005551.0.
Extended European Search Report dated Dec. 7, 2010 in related EP Application No. 10011316.6.
Extended European Search Report dated Jun. 6, 2011 in related EP Application No. 10011833.0.
Extended European Search Report dated May 10, 2011 in related EP Application No. 10014911.1.
F J de Kermadec, Apple Wireless Mouse and Keyboard Tips [online], Oct. 14, 2003 [retrieved Nov. 15, 2015]. Retrieved from the Internet: <URL: http://www.macdevcenter.com/lpt/a/4286>.
Feltens, "Vector methods to compute azimuth, elevation, ellipsoidal normal, and the Cartesian (X, Y, Z) to geodetic (ϕ, λ, h) transformation," Journal of Geodesy, Aug. 2008, pp. 493-504, vol. 82, No. 8.
Fishkin et al., "Enhanced Dynamic Queries via Movable Filters," CHI '95 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, May 1, 1995, Addison-Wesley Publishing Co., New York, NY, USA.
Flaherty, "Silicon springs to its sensors," Electronics Times, Mar. 2, 1998, http://findarticles.com/p/articles/mi_m0WVI/is_1998_March_2/ai_58372213/.
Foley et al., "Second Edition Computer Graphics: Principles and Practice," Nov. 1992, Chapters 1, 5, 7, Appendix, Addison-Wesley Publishing Company, Inc., Reading, MA, USA.
Foxlin, "Chapter 7. Motion Tracking Requirements and Technologies," Handbook of Virtual Environment Technology, 2002.
Foxlin, "Intertial Head-Tracker Sensor Fusion by a Complementary Separate-Bias Kalman Filter," Proceedings of the IEEE 1996 Virtual Reality Annual International Symposium, Mar. 30-Apr. 3, 1996, pp. 185-194, 267, Santa Clara, CA, USA.
Foxlin, "Pedestrian Tracking with Shoe-Mounted Inertial Sensors," IEEE Computer Graphics & Applications, Nov./Dec. 2005, pp. 38-46, vol. 25, No. 6.
Francis, Erik, Home > Counter-Strike > Tips [online], Oct. 2, 2003 [retrieved on Nov. 15, 2015]. Retrieved from the Internet: <URL: https://web.archive.org/web/20031002090301/http://www.bosskey.net/cs/tips.html>.
Friedberg et al., "Linear Algebra, 4th Edition," 2003, Pearson Education, Inc., Upper Saddle River, NJ, USA.
Fuerst et al., "Interactive Television: A Survey of the State of Research and the Proposal and Evaluation of a User Interface," Jun. 1996, http://wwwai.wu-wien.ac.at/˜koch/stud/itv/paper.html.
Gebre-Egziabher et al., "A Gyro-Free Quaternion-Based Attitude Determination System Suitable for Implementation Using Low Cost Sensors," IEEE 2000 Position Location and Navigation Symposium, Mar. 13-16, 2000, pp. 185-192, San Diego, CA, USA.
Geen, J., et al., "New iMEMS Angular-Rate-Sensing Gyroscope," Analog Dialogue, 37-03 (2003), pp. 1-4.
Gripton, "The Application and Future Development of a MEMS SiVSG® for Commercial and Military Inertial Products," 2002 IEEE Position Location and Navigation Symposium, Apr. 15-18, 2002, pp. 28-35.
Hamilton, "Lectures on Quaternions: Containing a Systematic Statement of a New Mathematical Method," 1853, Hodges and Smith, Dublin, IE.
Haykin, S., et al., "Adaptive Tracking of Linear Time-Variant Systems by Extended RLS Algorithms," IEEE Transactions on Signal Processing, vol. 45, No. 5, May 1997, pp. 1118-1128.
Hide et al., "Multiple Model Kalman Filtering for GPS and Low-cost INS integration," Proceedings of the 17th International Technical Meeting of the Satellite Division of the Institute of Navigation (ION GNSS 2004), Sep. 21-24, 2004, pp. 1096-1103, Long Beach, CA, USA.
Hogue, "MARVIN: a Mobile Automatic Realtime Visual and INertial tracking system," Thesis, May 2003, Graduate Program in Computer Science, York University, Toronto, Ontario, CA.
Honeywell, "Three-Axis Magnetic Sensor Hybrid", HMC2003 Datasheet, Apr. 2003, 4 pages.
Hong et al., "Observability of Error States in GPS/INS Integration," IEEE Transactions on Vehicular Technology, Mar. 2005, vol. 54, No. 2.
IBM, "Dictionary of Computing: Information Processing, Personal Computing, Telecommunications, Office Systems, IBM-specific Terms," 8th Edition, Mar. 1987, p. 475.
IEEE Standards, "802.16 IEEE Standard for Local and metropolitan area networks, Part 16: Air Interface for Fixed Broadband Wireless Access Systems," IEEE Std 802.16 2004, Oct. 1, 2004, pp. 1-857.
Inter Partes Reexamination Certificate in Reexamination Control No. 95/002,036 issued Feb. 25, 2019, 8 pages.
International Preliminary Report on Patentability dated Jan. 5, 2011 in related International Application No. PCT/EP2009/057978.
International Search Report dated Aug. 5, 2002 in related International Application No. PCT/US01/08261.
International Search Report dated Feb. 9, 2005 in related International Application No. PCT/US04/14487.
International Search Report dated Jul. 25, 2002 in related International Application No. PCT/US01/08377.
International Search Report dated Jun. 12, 2006 in related International Application No. PCT/US05/14702.
International Search Report dated Jun. 14, 2006 in related International Application No. PCT/US05/15068.
International Search Report dated Nov. 13, 2002 in related International Application No. PCT/US01/08331.
International Search Report for International application No. PCT/US04/35369, dated May 11, 2006.
International Search Report issued in International application No. PCT/US05/15051, dated Feb. 19, 2008.
International Search Report issued in International application No. PCT/US05/15096, dated May 15, 2006.
International Search Report issued in International application No. PCT/US05/42558, dated Nov. 30, 2006.
International Standard, "Information technology—Vocabulary—Part 13: Computer Graphics," Second Edition, Jun. 1, 1996.
Jakubowski, J., et al., "Higher Order Statistics and Neural Network for Tremor Recognition," IEEE Transactions on Biomedical Engineering, vol. 49, No. 2, Feb. 2002, pp. 152-159.
Jakubowski, J., et al., "Increasing Effectiveness of Human Hand Tremor Separation Process by Using Higher-Order Statistics," Measurement Science Review, vol. 1, No. 1, 2001, pp. 43-46.
James D. Richards III's Response to Respondent's Motion to Certify to the Commission a Request for Judicial Enforcement of a Subpoena Duces Tecum and Ad Testificandum Directed to James D. Richards III, United States International Trade Commission Investigation No. 337-TA-658, Mar. 20, 2009.
Joint Stipulation Regarding Technology at Issue, United States International Trade Commission Investigation No. 337-TA-658, May 11, 2009.
Joint Stipulation Regarding the Technology in Issue, United States International Trade Commission Investigation No. 337-TA-658, Feb. 23, 2009.
Jurman et al., "Calibration and data fusion solution for the miniature attitude and heading reference system," Science Direct, Sensors and Actuators A, Aug. 26, 2007, pp. 411-420, vol. 138, No. 2.
Karin J. Norton's Letter regarding Feb. 18 and Feb. 20 Motion, United States International Trade Commission Investigation No. 337-TA-658, Feb. 24, 2009.
Karush, "Webster's New World Dictionary of Mathematics," 1989, Macmillan, USA.
Kevin B. Collins Letter Withdrawing Feb. 17, 2009 Motion, United States International Trade Commission Investigation No. 337-TA-658, Feb. 20, 2009.
Kraft, "A Quaternion-based Unscented Kalman Filter for Orientation Tracking," Proceedings of the Sixth International Conference of Information Fusion, 2003 (vol. 1), Jul. 8-11, 2003, pp. 47-54, Cairns, Queensland, AU.
La Scala, "Design of an Extended Kalman Filter Frequency Tracker," IEEE Transactions on Signal Processing, Mar. 1996, vol. 44, No. 3.
La Scala, B.F., et al., "Design of an Extended Kalman Filter Frequency Tracker," IEEE Transactions on Signal Processing, vol. 44, No. 3, Mar. 1996, pp. 739-742.
Leavitt et al., "High Bandwidth Tilt Measurement Using Low-Cost Sensors," IEEE/ASME Transactions on Mechatronics, Jun. 2006, pp. 320-327, vol. 11, No. 3.
Lexisnexis, "Analog Devices Announces Two Design Wins for Versatile Micromachined Sensors," Feb. 25, 1999, PR Newswire Association, Inc.
Lexisnexis, "Analog Devices Partners with Caveo Technology to Develop Next-Generation Security Technology for Laptop Computers," Intel Developer Forum 2000, Aug. 17, 2000, Business Wire, Inc.
Liu, C., et al., "Enhanced Fisher Linear Discriminant Models for Face Recognition," Proc. 14th International Conference on Pattern Recognition, Queensland, Australia, Aug. 17-20, 1998, pp. 1-5.
Lobo et al., "Vision and Inertial Sensor Cooperation Using Gravity as a Vertical Reference," IEEE Transactions on Pattern Analysis and Machine Intelligence, Dec. 2003, vol. 25, No. 12.
Mahony et al., "Complementary filter design on the special orthogonal group SO(3)," 44th IEEE Conference on Decision & Control, 2005 and 2005 European Control Conference, CDC-ECC '05, Dec. 12-15, 2005, pp. 1477-1484.
Mathematical Relationship Between Equations in U.S. Appl. No. 12/147,811 and U.S. Appl. No. 12/188,595.
McGraw-Hill, "Dictionary of Scientific and Technical Terms, Fifth Edition," 1994, pp. 11, 12, 592,1299, 1409, 1551, 2125, McGraw-Hill , Inc., USA.
Meyer et al., "A Survey of Position Trackers," Communication Technology and Cognition Group, University of North Carolina, 1991.
Minutes of the Oral Proceedings before the Opposition Division on May 17, 2017 for the Opposition filed against European Patent No. EP1741088, EPO Communication dated Jun. 21, 2017, 25 pages.
Minutes of the Oral Proceedings held Jan. 31, 2020 in Appeal No. T1912/17-3.5.05, EPO Communication of the Board of Appeal dated Feb. 7, 2020, 4 pages.
Morrison, "ADI debuts accelerometer," Electronic News, Mar. 16, 1998, http://findarticles.com/p/articles/mi_m0EKF/is_n2210_v44/ai_20403891/.
Motion by Donald S. Odell to Quash Subpoena Duces Tecum and Ad Testificandum, United States International Trade Commission Investigation No. 337-TA-658, Feb. 5, 2009.
Motion by James D. Richards III to Quash Subpoena Duces Tecum and Ad Testificandum, United States International Trade Commission Investigation No. 337-TA-658, Jan. 21, 2009.
Motion of Nintendo Co., Ltd. and Nintendo of America Inc. for Leave to File Amended Answer to Complaint of Hillcrest Laboratories, Inc., United States International Trade Commission Investigation No. 337-TA-658, Nov. 25, 2008.
Motion of Nintendo Co., Ltd. and Nintendo of America Inc. for Summary Determination of Unpatentability of the '118, '760, and '611 Patents Under 35 U.S.C. § 101, United States International Trade Commission Investigation No. 337-TA-658, Feb. 18, 2009.
Motion of Nintendo Co., Ltd. and Nintendo of America Inc. for Summary Determination of Unpatentability of the '118, '760, and '611 Patents Under 35 U.S.C. § 101, United States International Trade Commission Investigation No. 337-TA-658, Feb. 20, 2009.
Movea SA Notice of Opposition to European Patent No. EP1741088 (Aug. 8, 2012), available at http://register.epo.org/espacenet/application?number=EP05744089&Ing=en&tab=doclist.
Navarrete, P., et al., "Eigenspace-based Recognition of Faces: Comparisons and a new Approach," Image Analysis and Processing, 2001, pp. 1-6.
Nintendo Co., Ltd., Presentation Slides submitted in United States International Trade Commission Investigation No. 337-TA-658, 2009, 211 pages.
Nishiyama, K., "A Nonlinear Filter for Estimating a Sinusoidal Signal and its Parameters in White Noise: On the Case of a Single Sinusoid," IEEE Transactions on Signal Processing, vol. 45, No. 4, Apr. 1997, pp. 970-981.
Nishiyama, K., "Robust Estimation of a Single Complex Sinusoid in White Noise—H∞Filtering Approach," IEEE Transactions on Signal Processing, vol. 47, No. 10, Oct. 1999, pp. 2853-2856.
Notice of Allowance dated Jan. 14, 2011 in related JP Application No. 2007-510993.
Notice of Allowance dated Nov. 17, 2014 in related JP Application No. 2012-126909.
Notice of Intent to Issue an Inter Partes Reexamination Certificate in Reexamination Control No. 95/002,036, Jan. 30, 2019, 6 pages.
O'Driscoll, "The Essential Guide to Digital Set-top Boxes and Interactive TV," Apr. 2000, Prentice Hall PTR, USA, Chapters 1, 2, 3, 6, 9, 10, 12.
Office Action dated Apr. 1, 2010 in related KR Application No. 10-2006-7025233.
Office Action dated Apr. 17, 2012 in related JP Application No. 2007-511065.
Office Action dated Apr. 21, 2015 in related CN Application No. 201110369736.2.
Office Action dated Apr. 8, 2011 in related JP Application No. 2006-532896.
Office Action dated Aug. 1, 2011 in related JP Application No. 2006-532896.
Office Action dated Aug. 17, 2007 in related KR Application No. 10-2005-702119.
Office Action dated Aug. 17, 2010 in related JP Application No. 2007-510993.
Office Action dated Aug. 19, 2008 in related EP Application No. 05757855.1.
Office Action dated Aug. 19, 2008 in related EP Application No. 05761047.9.
Office Action dated Aug. 24, 2010 in related JP Application No. 2007-511062.
Office Action dated Aug. 5, 2009 in related KR Application No. 10-2007-7029462.
Office Action dated Dec. 22, 2006 in related CN Application No. 200480012477.6.
Office Action dated Dec. 5, 2011 in related EP Application No. 04760954.0.
Office Action dated Dec. 9, 2011 in related JP Application No. 2011-237899.
Office Action dated Feb. 2, 2009 in related JP Application No. 2006-532896.
Office Action dated Jan. 10, 2014 in related IN Application No. 6379/DELNP/2006.
Office Action dated Jan. 19, 2010 in related JP Application No. 2007-510993.
Office Action dated Jan. 19, 2010 in related JP Application No. 2007-511065.
Office Action dated Jan. 22, 2016 in related EP Application No. 04760954.0.
Office Action dated Jan. 29, 2012 in related CN Application No. 200810095047.5.
Office Action dated Jan. 29, 2013 in related JP Application No. 2011-133123.
Office Action dated Jan. 8, 2010 in related CN Application No. 200580021162.2.
Office Action dated Jul. 10, 2009 in related EP Application No. 05744089.3.
Office Action dated Jul. 13, 2010 in related JP Application No. 2007-511071.
Office Action dated Jul. 23, 2010 in related JP Application No. 2006-532896.
Office Action dated Jul. 30, 2013 in related JP Application No. 2012-126909.
Office Action dated Jul. 4, 2011 in related CN Application No. 200810095047.5.
Office Action dated Jun. 16, 2009 in related JP Application No. 2007-511062.
Office Action dated Jun. 16, 2009 in related JP Application No. 2007-511071.
Office Action dated Jun. 24, 2009 in related KR Application No. 10-2006-7025233.
Office Action dated Mar. 11, 2013 in related EP Application No. 04760954.0.
Office Action dated Mar. 17, 2010 in related EP Application No. 05744089.3.
Office Action dated Mar. 17, 2010 in related EP Application No. 05760711.1.
Office Action dated Mar. 18, 2010 in related EP Application No. 05757855.1.
Office Action dated Mar. 22, 2011 in related JP Application No. 2007-511071.
Office Action dated Mar. 24, 2014 in related CN Application No. 201110369736.2.
Office Action dated Mar. 3, 2010 in related KR Application No. 10-2007-7029462.
Office Action dated Mar. 7, 2016 in related JP Application No. 2014-216431.
Office Action dated Mar. 8, 2012 in related EP Application No. 10014911.1.
Office Action dated Mar. 9, 2011 in related CN Application No. 200580021162.2.
Office Action dated May 16, 2011 in related JP Application No. 2009-124732.
Office Action dated May 18, 2010 in related KR Application No. 10-2009-7023531.
Office Action dated May 31, 2010 in related CN Application No. 200810181100.3.
Office Action dated May 4, 2011 in related CN Application No. 200810181100.3.
Office Action dated Nov. 16, 2010 in related JP Application No. 2007-511065.
Office Action dated Nov. 25, 2008 in related EP Application No. 05744089.3.
Office Action dated Nov. 3, 2015 in related CN Application No. 201110369736.2.
Office Action dated Nov. 6, 2007 in related IN Application No. 4829/DELNP/2005.
Office Action dated Oct. 11, 2011 in related JP Application No. 2007-511065.
Office Action dated Oct. 30, 2009 in related CN Application No. 200810095047.5.
Office Action dated Oct. 31, 2014 in related CN Application No. 201110369736.2.
Office Action dated Oct. 4, 2011 in related TW Application No. 9411402.
Office Action dated Sep. 11, 2009 in related KR Application No. 10-2006-7025226.
Office Action dated Sep. 11, 2012 in related EP Application No. 09769324.6.
Office Action dated Sep. 22, 2015 in related EP Application No. 05760711.1 (reference previously cited with IDS Jul. 31, 2015).
Office Action dated Sep. 6, 2010 in related KR Application No. 10-2010-7016483.
Office Action for Chinese Application No. 200580021163.7, dated Jan. 25, 2008.
Order No. 13: Initial Determination Terminating the Investigation as to Certain Claims, United States International Trade Commission Investigation No. 337-TA-658, Feb. 23, 2009.
Order No. 14: Relating to Third-Party James D. Richards III's Motion No. 658-18 for Sanctions and for a New Protective Order and His Motion No. 658-13 to Quash Subpoena Duces Tecum and Ad Testificandum and Ordering Richards to Comply With Said Subpoena, United States International Trade Commission Investigation No. 337-TA-658, Feb. 27, 2009.
Order No. 19: Initial Determination Requesting Judicial Enforcement of Subpoena, United States International Trade Commission Investigation No. 337-TA-658, Mar. 23, 2009.
Order No. 25: Denying Respondent Nintendo's Motion for Summary Determination of Unpatentability of the '760 and 611 Patents Under 35 U.S.C. § 101, United States International Trade Commission Investigation No. 337-TA-658, Mar. 26, 2009.
Order No. 30: Denying Respondent's Motion to Stay, United States International Trade Commission Investigation No. 337-TA-658, Apr. 2, 2009.
Order No. 31: Granting Nintendo's Motion No. 658-37 to File Second Amended Answer to Complaint, United States International Trade Commission Investigation No. 337-TA-658, Apr. 2, 2009.
Order No. 37: Requiring Submissions From Complainant, Respondents and the Staff, United States International Trade Commission Investigation No. 337-TA-658, Apr. 23, 2009.
Order No. 39: Granting Complainant's Motion to Submit Riviere Supplemental Report, United States International Trade Commission Investigation No. 337-TA-658, Apr. 27, 2009.
Pancratov et al., "Why Computer Architecture Matters," Computing in Science & Engineering, May/Jun. 2008, pp. 59-63, vol. 10, No. 4.
Park et al., "Covariance Analysis of Strapdown INS Considering Gyrocompass Characteristics," IEEE Transactions on Aerospace and Electronic Systems, Jan. 1995, vol. 31, No. 1.
Park et al., "Examples of Estimation Filters from Recent Aircraft Projects at MIT," Nov. 2004.
Pique, "Semantics of Interactive Rotations," 1986 Workshop on Interactive 3D Graphics, Oct. 23-24, 1986, pp. 259-269, Chapel Hill, NC, USA.
planetmath.org, "PlanetMath: derivation of rotation matrix using polar coordinates," Apr. 25, 2009, http://planetmath.org/encyclopedia/DerivationOfRotationMatrixUsin . . . .
Preliminary Conference, United States International Trade Commission Investigation No. 337-TA-658, Oct. 23, 2008.
Preliminary Opinion of the Board of Appeal in Appeal No. T1912/17-3.5.05, EPO Communication of the Board of Appeal pursuant to Article 15(1) of the Rules of Procedure of the Boards of Appeal dated Dec. 5, 2019, 13 pages.
Press Release, "NetTV Selected for 800 Kansas City Classrooms," http://www.fno.org/mar98/NKCSDPR1.html, Mar. 23, 1998.
Private Parties' Combined List of Confidential and Public Joint Exhibits Admitted or Rejected During the Evidentiary Hearing, United States International Trade Commission Investigation No. 337-TA-658, May 21, 2009.
Private Parties' List of Confidential Joint Exhibits Admitted or Rejected During the Evidentiary Hearing, United States International Trade Commission Investigation No. 337-TA-658, May 21, 2009.
Private Parties' List of Public Joint Exhibits Admitted or Rejected During the Evidentiary Hearing, United States International Trade Commission Investigation No. 337-TA-658, May 21, 2009.
Proposed Conclusions of Law of Respondents Nintendo Co., Ltd and Nintendo of America Inc., United States International Trade Commission Investigation No. 337-TA-658, Jun. 3, 2009.
Quesenbery et al., "Designing for Interactive Television," Aug. 15, 2003, http://www.wqusability.com/articles/itv-design.html.
Raethjen, J., et al., "Tremor Analysis in Two Normal Cohorts," Clinical Neurophysiology 115, 2004, pp. 2151-2156.
Request for Inter Partes Reexamination of U.S. Pat. No. 7,158,118, filed with the U.S. Patent and Trademark Office on Jul. 13, 2012.
Respondent, IDHL Holdings, Inc., Response to Appellant, Movea SA, Statement of Grounds of Appeal in Appeal No. T1912/17-3.5.05 dated Mar. 16, 2018, 41 pages.
Respondent, IDHL Holdings, Inc., Submission in Response to the Preliminary Opinion of the Board of Appeal in Appeal No. T1912/17-3.5.05 dated Jan. 10, 2020, 8 pages.
Respondents Nintendo Co., Ltd. and Nintendo of America Inc.'s Combined List of Confidential and Public Exhibits Admitted or Rejected During the Evidentiary Hearing, United States International Trade Commission Investigation No. 337-TA-658, May 21, 2009.
Respondents Nintendo Co., Ltd. and Nintendo of America Inc.'s Corrected Responses to Complainant Hillcrest Laboratories, Inc.'s Fourth Set of Interrogatories, United States International Trade Commission Investigation No. 337-TA-658, Feb. 18, 2009.
Respondents Nintendo Co., Ltd. and Nintendo of America Inc.'s List of Confidential Exhibits Admitted or Rejected During the Evidentiary Hearing, United States International Trade Commission Investigation No. 337-TA-658, May 21, 2009.
Respondents Nintendo Co., Ltd. and Nintendo of America Inc.'s List of Public Exhibits Admitted or Rejected During the Evidentiary Hearing, United States International Trade Commission Investigation No. 337-TA-658, May 21, 2009.
Respondents Nintendo Co., Ltd. and Nintendo of America Inc.'s Responses to Complainant Hillcrest Laboratories, Inc.'s Fourth Set of Interrogatories, United States International Trade Commission Investigation No. 337-TA-658, Feb. 17, 2009.
Respondents Nintendo Co., Ltd. and Nintendo of America Inc.'s Second Motion for Leave to Submit Supplemental Notice of Prior Art, United States International Trade Commission Investigation No. 337-TA-658, Feb. 24, 2009.
Respondents Nintendo Co., Ltd. and Nintendo of America Inc.'s Third Motion for Leave to Submit Supplemental Notice of Prior Art, United States International Trade Commission Investigation No. 337-TA-658, Mar. 17, 2009.
Respondents Nintendo Co., Ltd. and Nintendo of America Inc.'s Unopposed Motion for Leave to File Supplemental Notice of Prior Art, United States International Trade Commission Investigation No. 337-TA-658, Feb. 6, 2009.
Rios et al., "Fusion Filter Algorithm Enhancements for a MEMS GPS/IMU," Proceedings of the 14th International Technical Meeting of the Satellite Division of the Institute of Navigation (ION GPS 2001), Sep. 11-14, 2001, pp. 1382-1393, Salt Lake City, UT, USA.
Riviere, C.N., et al., "Adaptive Canceling of Physiological Tremor for Improved Precision in Microsurgery," IEEE Transactions on Biomedical Engineering, vol. 45, No. 7, Jul. 1998, pp. 839-846.
Riviere, C.N., et al., "Toward Active Tremor Canceling in Handheld Microsurgical Instruments," IEEE Transactions on Robotics and Automation, vol. 19, No. 5, Oct. 2003, pp. 793-800.
Roumeliotis et al., "Circumventing Dynamic Modeling: Evaluation of the Error-State Kalman Filter applied to Mobile Robot Localization," 1999 IEEE International Conference on Robotics and Automation, 1999 Proceedings, May 10-15, 1999, pp. 1656-1663, vol. 2, Detroit, MI, USA.
Sawada et al., "A Wearable Attitude-Measurement System Using a Fiberoptic Gyroscope," Presence, Apr. 2002, pp. 109-118, vol. 11, No. 2.
Sayed, A.H., et al., "A Framework for State-Space Estimation with Uncertain Models," IEEE Transactions on Automatic Control, vol. 46, No. 7, Jul. 2001, pp. 998-1013.
SIGGRAPH, Presentation Slides submitted in United States International Trade Commission Investigation No. 337-TA-658, 2009, 103 pages.
Simon et al., "The YoYo: A Handheld Input Combining Elastic and Isotonic Input," Human Computer Interaction—Interact '03, Jan. 2003, pp. 303-310.
Srivastava, "Interactive TV Technology and Markets," Jan. 1, 2002, Chapters 1, 2, 3, and 7, Artech House, Boston, MA, USA.
ST Microelectronics, "MEMS Inertial Sensor: 3-axis-+/− 2g Ultracompact Linear Accelerometer," LIS3L02AL, Sep. 2005.
ST Microelectronics, "MEMS Inertial Sensor: 3-axis-+/− 2g Ultracompact Linear Accelerometer," LIS3L02ALE, Feb. 2006.
ST, "MEMS Accelerometers," Apr. 29, 2009, pp. 2508-2511.
STMicroelectronics, "Inertial Sensor: 3-Axis-2g/6g Linear Accelerometer," LIS3L02AS, Feb. 2003.
Stone et al., "The Movable Filter as a User Interface Tool," Proceedings of CHI '94, ACM, Apr. 24-28, 1994, pp. 306-312, Boston, MA, USA.
Strachan et al., "Muscle Tremor as an Input Mechanism," UIST '04, Oct. 24-27, 2004, Santa Fe, NM, USA.
Summons to Attend Oral Proceedings by Board of Appeals mailed Nov. 6, 2015 in related EP Application No. 05757855.
Summons to Attend Oral Proceedings mailed Apr. 28, 2011 in related EP Application No. 05744089.3.
Summons to Attend Oral Proceedings mailed Aug. 3, 2011 in related EP Application No. 05757855.1.
Summons to Oral Proceedings pursuant to Rule 115(1) EPC for Appeal No. T1912/17-3.5.05, EPO Communication of the Board of Appeal dated Aug. 7, 2019, 2 pages.
Supplemental European Search Report issued in European Application No. EP 05 74 4089, dated Mar. 6, 2008.
Supplementary European Search Report dated Apr. 10, 2008 in related EP Application No. 05757855.1.
Supplementary European Search Report dated Apr. 2, 2008 in related EP Application No. 04796360.8.
Supplementary European Search Report dated Apr. 2, 2008 in related EP Application No. 05761047.9.
Supplementary European Search Report dated Aug. 9, 2011 in related EP Application No. 04760954.0.
Supplementary European Search Report dated Oct. 8, 2009 in related EP Application No. 05760711.1.
SupremeOverlord, Advantages and Disadvantages for a mouse/keyboard using USB, in Ars Open Forum [online], Oct. 20, 2001; 12:07 PM [retrieved on Nov. 19, 2015]. Retrieved from the Internet: <URL: http://arstechnica.com/civis/viewtopic.php?f=11&t=895076>.
Tech Infobase, "ADI Advances Tilt/Motion Sensing and System Hardware Monitoring," Oct. 1, 2000, http://cma.zdnet.com/texis/techinfobase/techinfobase/+ID6e4DLxzmwwwxqFqo_/display.htm.
Telephone Conference, United States International Trade Commission Investigation No. 337-TA-658, Mar. 9, 2009.
Timmer, J., et al., "Characteristics of Hand Tremor Time Series," Biological Cybernetics, vol. 70, 1993, pp. 75-80.
Timmer, J., et al., "Cross-Spectral Analysis of Physiological Tremor and Muscle Activity: I Theory and Application to Unsynchronized Electromyogram," Biological Cybernetics, vol. 78, 1998, pp. 349-357.
Timmer, J., et al., "Cross-Spectral Analysis of Physiological Tremor and Muscle Activity: II Application to Synchronized Electromyogram," Biological Cybernetics, vol. 78, 1998, pp. 359-368.
Timmer, J., et al., "Cross-Spectral Analysis of Tremor Time Series," International Journal of Bifurcation and Chaos, vol. 10, No. 11, 2000, pp. 2595-2610.
Timmer, J., et al., "Modeling Noisy Time Series: Physiological Tremor," International Journal of Bifurcation and Chaos, vol. 8, No. 7, 1998, pp. 1505-1516.
Timmer, J., et al., "Pathological Tremors: Deterministic Chaos or Nonlinear Stochastic Oscillators?" Chaos, vol. 10, No. 1, Mar. 2000, pp. 278-288.
Titterton et al., "Strapdown inertial navigation technology," IEE Radar, Sonar, Navigation and Avionics Series 5, 1997, Peter Peregrinus Ltd., Chapters 1, 2, 3, 10.
U.S. Appl. No. 60/355,368, filed Feb. 7, 2002.
U.S. Appl. No. 60/566,444, filed Apr. 30, 2004.
U.S. Appl. No. 60/612,571, filed Sep. 23, 2004.
U.S. Appl. No. 60/641,410, filed Jan. 5, 2005.
Utility U.S. Appl. No. 11/286,702, filed Nov. 23, 2005.
Vane et al., "Programming for TV, Radio, and Cable," Ch. 3: The Marketplace: Part II, 1994, pp. 29-46, Butterworth-Heinemann, Newton, MA, USA.
Verhoeven et al., "Hypermedia on the Map: Spatial Hypermedia in HyperMap," International Conference on Information, Communications and Signal Processing, ICICS '97, Singapore, Sep. 9-12, 1997, pp. 589-592.
Vigyan Prasar, "Vigyan Prasar: An Insight, Technology to the aid of science popularisation," Jan. 1999, http://www.vigyanprasar.com/dream/jan99/janvpinsight.htm.
Vogel et al., "Distant Freehand Pointing and Clicking on Very Large, High Resolution Displays," UIST '05, Oct. 23-27, 2005, pp. 33-42, Seattle, WA, USA.
VOiD, Counter-Strike Manual, Feb. 1, 2001, http://voidclan.tripod.com/csmanual.htm. *
Website: "Freiburg Center for Data Analysis and Modeling—Publications," http://www.fdm.uni-freiburg.de/cms/puplications/publications/, retrieved Aug. 17, 2007, pp. 1-11.
Website: Beuter, A., Publications, University of Quebec at Montreal, http://www.er.uqam.ca/nobel/r11040/publicat.htm, retrieved Aug. 17, 2007, pp. 1-7.
Website: Jian, Z., et al., "Adaptive Noise Cancellation," Rice University, http://www.ece.rice.edu/˜klwang/elec434/elec434.htm, retrieved Aug. 17, 2007, pp. 1-6.
Website: Murray-Smith, R., Hamilton Institute, http://www.dcs.gla.ac.uk/˜rod/, retrieved Aug. 17, 2007, pp. 1-5.
Website: Riviere, C., Robotics Institute, http://www.ri.cmu.edu/people/riviere_cameron.html, retrieved Aug. 17, 2007, pp. 1-4.
Website: Sayed, A.H., "UCLA Adaptive Systems Laboratory—Home Page," UCLA, http://asl.ee.ucla.edu/index.php?option=com_frontpage&Itemid=1, retrieved Aug. 17, 2007, p. 1.
Website: Timmer, J., "Data Analysis and Modeling of Dynamic Processes in the Life Sciences," Freiburg Center for Data Analysis and Modeling, http://webber.physik.uni-freiburg.de/˜jeti/, retrieved Aug. 17, 2007, pp. 1-2.
Weisstein, "Rotation Matrix," From MathWorld—A Wolfram Web Resource, http://mathworld.wolfram.com/RotationMatrix.html (last updated May 29, 2012).
Welch, "SCAAT: Incremental Tracking with Incomplete Information," TR96-051, Oct. 1996, Department of Computer Science, UNC—Chapel Hill.
Widrow, B., et al., "Fundamental Relations Between the LMS Algorithm and the DFT," IEEE Transactions on Circuits and Systems, vol. 34, No. CAS-7, Jul. 1987, pp. 814-820.
Wikipedia, "List of trigonometric identities," Dec. 28, 2007, retrieved from Internet Aug. 23, 2012, http://en.wikipedia.org/w/index.php?title=List_of_trigonometric_identities.
Wilson et al., "Demonstration of the XWand Interface for Intelligent Spaces," UIST '02 Companion, Demonstrations, Oct. 27-30, 2002, pp. 37-38, Paris, FR.
Wilson et al., "Gesture Recognition Using The XWand," Assistive Intelligent Environments Group, Robotics Institute, Carnegie Mellon University, Apr. 2004.
Wilson et al., "XWand: UI for Intelligent Spaces," CHI 2003, Apr. 5-10, 2003. Ft. Lauderdale, FL, USA.
Wilson, "XWand: UI for Intelligent Environments," Apr. 26, 2004, http://research.microsoft.com/en-us/um/people/awilson/wand/default.htm.
Written Decision of the Board of Appeals mailed Feb. 22, 2016 in related EP Application No. 05757855.
Written Opinion dated Feb. 9, 2005 in related International Application No. PCT/US04/14487.
Written Opinion dated Jun. 12, 2006 in related International Application No. PCT/US05/14702.
Written Opinion dated Jun. 14, 2006 in related International Application No. PCT/US05/15068.
Written Opinion issued in International application No. PCT/US04/35369, dated May 11, 2006.
Written Opinion issued in International application No. PCT/US05/15051, dated Feb. 19, 2008.
Written Opinion issued in International application No. PCT/US05/42558, dated Nov. 30, 2006.
Written Opinion issued in International application No. PCT/US05/15096, dated May 15, 2006.
Yokoi et al., "Development of 3D-Input Device Using Adaptive Control," Hokkaido University, 1995.
Zhai, "Human Performance in Six Degree of Freedom Input Control," Thesis, University of Toronto, 1995.

Also Published As

Publication number Publication date
US20060178212A1 (en) 2006-08-10
US8137195B2 (en) 2012-03-20
US10159897B2 (en) 2018-12-25
WO2006058129A3 (en) 2007-03-01
US20190091571A1 (en) 2019-03-28
US20140295961A1 (en) 2014-10-02
WO2006058129A2 (en) 2006-06-01
US20120178533A1 (en) 2012-07-12
US8795079B2 (en) 2014-08-05

Similar Documents

Publication Publication Date Title
US11154776B2 (en) Semantic gaming and application transformation
US8723794B2 (en) Remote input device
JP5204224B2 (en) Object detection using video input combined with tilt angle information
JP6026483B2 (en) Free-space pointing device with tilt compensation and improved usability
US8427426B2 (en) Remote input device
JP5363533B2 (en) 3D handheld device and 3D control method
US10086282B2 (en) Tracking device for use in obtaining information for controlling game program execution
KR101617562B1 (en) 3d pointer mapping
US20060287085A1 (en) Inertially trackable hand-held controller
JP2007535774A5 (en)
JP5726811B2 (en) Method and apparatus for use in determining lack of user activity, determining user activity level, and / or adding a new player to the system
US20060287084A1 (en) System, method, and apparatus for three-dimensional input control
JP2007535776A5 (en)
JP2007535773A5 (en)
WO2007130792A2 (en) System, method, and apparatus for three-dimensional input control
JP2007509448A (en) User interface device and method using accelerometer
WO2011011898A1 (en) Input system, and method
US8725445B2 (en) Calibration of the accelerometer sensor of a remote controller
EP3057035A1 (en) Information processing program, information processing device, information processing system, and information processing method
EP2013864A2 (en) System, method, and apparatus for three-dimensional input control

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: SENT TO CLASSIFICATION CONTRACTOR

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: DRNC HOLDINGS, INC., DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IDHL HOLDINGS, INC.;REEL/FRAME:063327/0188

Effective date: 20221104