US20090305785A1 - Gesture controlled game screen navigation
- Publication number
- US20090305785A1 (application US 12/134,448)
- Authority
- US
- United States
- Prior art keywords
- game controller
- controller
- game
- action
- operational mode
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A63F13/428—Processing input control signals of video game devices by mapping the input signals into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
- A63F13/42—Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/211—Input arrangements for video game devices using inertial sensors, e.g. accelerometers or gyroscopes
- A63F13/215—Input arrangements for video game devices comprising means for detecting acoustic signals, e.g. using a microphone
- G06F3/0346—Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- A63F2300/105—Input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
- A63F2300/1081—Input via voice recognition
- A63F2300/6045—Methods for processing data by mapping control signals received from the input arrangement into game commands
Definitions
- Video game consoles employ controllers to allow users to interface with software, such as video games.
- A typical controller has a number of controls.
- A gamepad-type controller will typically incorporate one or more directional controls, such as a number of buttons arranged in a directional keypad, one or more analog sticks, or a combination of such controls.
- A controller will typically also include one or more action buttons that may be located on the face or shoulders of the controller.
- Directional controls may provide action-selection functionality as well.
- For example, the analog sticks in a controller compatible with the XBOX 360® brand video game console, available from Microsoft Corp. of Redmond, Wash., can be pushed in as well as moved directionally.
- A controller can be used to navigate a graphic user interface, such as the dashboard presented to users of the XBOX 360® brand video game console.
- The graphic user interface may include a number of menus and sub-menus that allow a user to, for example, execute game software, access media resources such as image, video, or audio files or media discs, configure system settings, etc. Navigating the graphic user interface is conventionally accomplished via a combination of directional navigation commands (e.g., left, right, up, and down) input using the directional controls and action buttons. While this control scheme can work well for traditional gamepad-type controllers, it relies on the use of several buttons and is thus not well-suited for controllers that lack the buttons typically found on gamepad-type controllers.
- Buttons on such a controller may detract from the overall gaming experience by reducing the degree of realism of the controller.
- A game controller, such as a microphone controller, incorporates motion sensors that are configured to detect gestures performed by a user of the game controller.
- The gestures can be used to navigate and perform actions in a graphic user interface that a game console employs to provide a consistent user experience when navigating to different media types available on the game console.
- One embodiment is directed to a method for using a game controller to navigate a graphic user interface presented by a video game console to a user.
- A motion of the game controller is detected and recognized as a gesture.
- An operational mode in which the game controller is operating is then determined. If the game controller is operating in a first operational mode, a navigation command corresponding to the recognized gesture is executed in the graphic user interface. On the other hand, if the game controller is operating in a second operational mode, an action corresponding to the recognized gesture is performed in the graphic user interface.
- This method may be performed by a computer executing instructions stored on a computer readable storage medium.
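The two-mode dispatch described above can be sketched in Python; the gesture names, mode labels, and mapping tables below are illustrative assumptions, not terms drawn from the claims:

```python
from enum import Enum
from typing import Optional

class Mode(Enum):
    NAVIGATION = 1  # first operational mode: gestures become navigation commands
    ACTION = 2      # second operational mode: gestures become actions

# Hypothetical gesture-to-command tables for illustration only
NAV_COMMANDS = {"tilt_left": "move_left", "tilt_right": "move_right",
                "tilt_up": "move_up", "tilt_down": "move_down"}
ACTIONS = {"flick": "select", "shake": "back"}

def dispatch(gesture: str, mode: Mode) -> Optional[str]:
    """Map a recognized gesture to a GUI command according to the
    controller's current operational mode; unknown gestures map to None."""
    table = NAV_COMMANDS if mode is Mode.NAVIGATION else ACTIONS
    return table.get(gesture)
```

The key point is that the same physical gesture resolves to different interface behavior depending solely on the operational mode in effect when it is recognized.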
- Another embodiment is directed to a game controller for use with a video game console.
- The game controller includes a microcontroller in electrical communication with at least one motion sensor configured to detect motion of the game controller. While not required, in some embodiments the at least one motion sensor is incorporated in the game controller.
- The microcontroller or the video game console is configured to recognize the detected motion as a gesture. If the game controller is operating in a first operational mode, the recognized gesture is mapped to a navigation command that is executed in a graphic user interface presented by the video game console to a user. If the game controller is operating in a second operational mode, the recognized gesture is mapped to an action that is performed in the graphic user interface.
- The game controller thus avoids the need to use a separate gamepad-type controller to navigate the graphic user interface, while also avoiding the need to incorporate additional buttons. As a result, the sense of realism and the overall gaming experience may be enhanced.
- FIG. 1 is a block diagram representing an exemplary computing device.
- FIG. 2 is a block diagram illustrating an implementation of the computing device of FIG. 1 as a game console.
- FIG. 3 is a plan view illustrating an example implementation of a microphone controller according to one embodiment.
- FIG. 4 is a block diagram representing the microphone controller of FIG. 3.
- FIG. 5 is a flow diagram illustrating an example method of performing navigation commands and actions in a graphic user interface using gestures.
- FIG. 1 illustrates an example of a suitable computing system environment 100 in which the subject matter described above may be implemented.
- The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the subject matter described above. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
- The computing system environment 100 includes a general purpose computing device in the form of a computer 110.
- Components of computer 110 may include, but are not limited to, a processing unit 120 , a system memory 130 , and a system bus 121 that couples various system components including the system memory to the processing unit 120 .
- The system bus 121 may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
- Such architectures include the Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus (also known as the Mezzanine bus).
- Computer 110 typically includes a variety of computer readable media.
- Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media.
- Computer readable media may comprise computer storage media and communication media.
- Computer storage media include both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
- Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110 .
- Communication media typically embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media.
- The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- Communication media include wired media, such as a wired network or direct-wired connection, and wireless media, such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
- The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory, such as read only memory (ROM) 131 and random access memory (RAM) 132.
- RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120 .
- FIG. 1 illustrates operating system 134 , application programs 135 , other program modules 136 , and program data 137 .
- The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media.
- FIG. 1 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152 , and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 , such as a CD-RW, DVD-RW or other optical media.
- Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
- The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface, such as interface 140, while the magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150.
- The hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies.
- A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and a pointing device 161, such as a mouse, trackball, or touch pad.
- Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
- A graphics interface 182 may also be connected to the system bus 121.
- One or more graphics processing units (GPUs) 184 may communicate with graphics interface 182 .
- A monitor 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190, which may in turn communicate with video memory 186.
- Computers may also include other peripheral output devices, such as speakers 197 and a printer 196, which may be connected through an output peripheral interface 195.
- The computer 110 may operate in a networked or distributed environment using logical connections to one or more remote computers, such as a remote computer 180.
- The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device, or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 1.
- The logical connections depicted in FIG. 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks/buses.
- Such networking environments are commonplace in homes, offices, enterprise-wide computer networks, intranets and the Internet.
- When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170.
- When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet.
- The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160 or another appropriate mechanism.
- In a networked environment, program modules depicted relative to the computer 110 may be stored in the remote memory storage device.
- FIG. 1 illustrates remote application programs 185 as residing on memory device 181 . It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
- The game console 200 has a central processing unit (CPU) 201 having a level 1 (L1) cache 202, a level 2 (L2) cache 204, and a flash ROM (Read-Only Memory) 206.
- The level 1 cache 202 and level 2 cache 204 temporarily store data and hence reduce the number of memory access cycles, thereby improving processing speed and throughput.
- The flash ROM 206 can store executable code that is loaded during an initial phase of a boot process when the game console 200 is initially powered. Alternatively, the executable code that is loaded during the initial boot phase can be stored in a FLASH memory device (not shown).
- Game console 200 can, optionally, be a multi-processor system; for example, game console 200 can have three processors 201 , 203 , and 205 , where processors 203 and 205 have similar or identical components to the CPU 201 .
- A graphics processing unit (GPU) 208 and a video encoder/video codec (coder/decoder) 214 form a video processing pipeline for high speed and high resolution graphics processing. Data is carried from the graphics processing unit 208 to the video encoder/video codec 214 via a bus. The video processing pipeline outputs data to an A/V (audio/video) port 240 for transmission to a television or other display device.
- A memory controller 210 is connected to the GPU 208 and CPU 201 to facilitate processor access to various types of memory 212, such as, but not limited to, a RAM (Random Access Memory).
- Game console 200 includes an I/O controller 220 , a system management controller 222 , an audio processing unit 223 , a network interface controller 224 , a first USB controller 226 , a second USB controller 228 and a front panel I/O subassembly 230 that may be implemented on a module 218 .
- The USB controllers 226 and 228 serve as hosts for peripheral controllers 242(1)-242(2), a wireless adapter 248, and an external memory unit 246 (e.g., flash memory, external CD/DVD ROM drive, removable media, etc.).
- The network interface 224 and/or wireless adapter 248 provide access to a network (e.g., the Internet, a home network, etc.) and may be any of a wide variety of wired or wireless interface components, including an Ethernet card, a modem, a Bluetooth module, a cable modem, and the like.
- The game console 200 may be connected to a controller sensing device 254 to sense the position or motion of the peripheral controllers 242(1)-242(2) or other accessories.
- The controller sensing device may be implemented using, for example, a three-dimensional camera or an ultrasonic triangulation system.
- System memory 243 is provided to store application data that is loaded during the boot process.
- A media drive 244 is provided and may comprise a DVD/CD drive, a hard drive, a removable media drive, etc.
- The media drive 244 may be internal or external to the game console 200.
- Where the media drive 244 is a drive or reader for removable media (such as removable optical disks or flash cartridges), the media drive 244 is an example of an interface onto which (or into which) media are mountable for reading.
- Application data may be accessed via the media drive 244 for execution, playback, etc. by the game console 200.
- The media drive 244 is connected to the I/O controller 220 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394). While media drive 244 may generally refer to various storage embodiments (e.g., hard disk, removable optical disk drive, etc.), the game console 200 may specifically include a hard disk 253, which can be used to store game data.
- The system management controller 222 provides a variety of service functions related to assuring availability of the game console 200.
- The audio processing unit 223 and an audio codec 232 form a corresponding audio processing pipeline with high fidelity, 3D, surround, and stereo audio processing according to aspects of the present subject matter described herein. Audio data is carried between the audio processing unit 223 and the audio codec 232 via a communication link.
- The audio processing pipeline outputs data to the A/V port 240 for reproduction by an external audio player or device having audio capabilities.
- The front panel I/O subassembly 230 supports the functionality of the power button 250 and the eject button 252, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the game console 200.
- A system power supply module 236 provides power to the components of the game console 200.
- A fan 238 cools the circuitry within the game console 200.
- The CPU 201, GPU 208, memory controller 210, and various other components within the game console 200 are interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures.
- Application data can be loaded from the system memory 243 into memory 212 and/or caches 202, 204 and executed on the CPU 201.
- The game console 200 can present a graphic user interface that provides a consistent user experience when navigating to different media types available on the game console 200.
- Applications and/or other media contained within the media drive 244 may be launched or played from the media drive 244 to provide additional functionalities to the game console 200.
- The game console 200 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the game console 200 may allow one or more users to interact with the system, watch movies, listen to music, and the like. However, with the integration of broadband connectivity made available through the network interface 224 or the wireless adapter 248, the game console 200 may further be operated as a participant in a larger network community.
- The game console 200 can be used with a variety of controllers 242, such as the controllers 242(1), 242(2), and 242(3) of FIG. 2.
- Controllers 242(1) and 242(2) are wired controllers that communicate with the game console 200 via the USB controller 226.
- Controller 242(3) is a wireless controller that communicates with the game console 200 via the wireless adapter 248.
- One or more of the controllers 242 can be implemented as specialized controllers for playing certain types of games.
- For example, one or more of the controllers 242 can be implemented as a microphone controller for playing singing and music games.
- FIG. 3 illustrates a microphone controller 300, which may embody the controller 242(3) of FIG. 2.
- The microphone controller 300 may communicate with the game console 200 via a wired connection, for example, using the USB controller 226, or a wireless connection, for example, using the wireless adapter 248 of FIG. 2.
- The microphone controller 300 may incorporate a power button 302 for turning the microphone controller 300 on and off.
- The microphone controller 300 may also incorporate a modifier button 304 for selecting an operational mode of the microphone controller 300, as explained more fully below. It will be appreciated that some embodiments of the microphone controller 300 may omit either or both of the power button 302 and the modifier button 304.
- FIG. 4 is a block diagram illustrating the functional components of one embodiment of the microphone controller 300 .
- The microphone controller 300 includes a transducer 402 configured to receive an acoustic input, such as a human voice, and to generate an electrical signal.
- The transducer 402 may be implemented using any of a variety of well-known technologies.
- The generated electrical signal is output to the game console 200 using a wired connection, such as via the USB controller 226 or the USB controller 228, or a wireless connection, such as via the wireless adapter 248.
- The microphone controller 300 may perform some processing on the electrical signal via an analog-to-digital (A/D) codec 403 before outputting the electrical signal to the game console 200.
- Alternatively, the electrical signal may be output without any processing.
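As an illustration of the A/D step, a codec of this kind might quantize each analog sample to a signed integer code; the 16-bit width and clamping behavior here are assumptions for the sketch, not details from the patent:

```python
def quantize(sample: float, bits: int = 16) -> int:
    """Quantize an analog sample in [-1.0, 1.0] to a signed integer
    code, clamping out-of-range input (a simple A/D conversion model)."""
    full_scale = 2 ** (bits - 1) - 1       # 32767 for 16 bits
    clipped = max(-1.0, min(1.0, sample))  # clamp to the nominal input range
    return round(clipped * full_scale)
```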
- The power button 302 is connected to a power control module 404, which is in turn connected to a power source 406, such as one or more batteries.
- When the power button 302 is actuated, the power control module 404 causes the microphone controller 300 to draw power from the power source 406, activating the microphone controller 300. If the microphone controller 300 is already activated, actuating the power button 302 may cause the power control module 404 to deactivate the microphone controller 300.
- The power control module 404 may consist of a simple electrical switch that completes a circuit when the power button 302 is actuated.
- Alternatively, the power control module 404 may be more complex, such that, for example, the power button 302 must be continuously actuated for some period to either activate or deactivate the microphone controller 300.
- The power control module 404 may also be configured to automatically deactivate the microphone controller 300 when the microphone controller 300 is not used beyond a specified timeout period. This configuration may provide the advantage of avoiding excess power consumption when the microphone controller 300 is not in use.
- Similarly, the power control module 404 may be configured to place the microphone controller 300 in a low-power "sleep" mode, either in response to actuation of the power button 302 or after the timeout period.
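The power-management behaviors described above amount to a small state machine. A sketch under assumed names and a made-up default timeout (the patent specifies neither):

```python
class PowerControl:
    """Toggles the controller on/off and sleeps it after an idle timeout."""

    def __init__(self, timeout_s: float = 300.0):
        self.timeout_s = timeout_s    # inactivity period before sleeping
        self.state = "off"            # "off", "on", or "sleep"
        self.last_activity = 0.0

    def press_power(self, now: float) -> None:
        # A press toggles power: on -> off, otherwise (off or sleep) -> on.
        self.state = "off" if self.state == "on" else "on"
        self.last_activity = now

    def activity(self, now: float) -> None:
        # Any use of the controller wakes it and resets the idle clock.
        if self.state == "sleep":
            self.state = "on"
        self.last_activity = now

    def tick(self, now: float) -> None:
        # Periodic check: enter low-power sleep after the idle timeout.
        if self.state == "on" and now - self.last_activity > self.timeout_s:
            self.state = "sleep"
```

Passing timestamps in explicitly keeps the sketch testable; a real module would read a hardware clock.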
- The microphone controller 300 also includes one or more motion sensors. While FIG. 4 depicts three motion sensors 408, 410, and 412, it will be appreciated by those of skill in the art that more or fewer motion sensors may be used.
- The motion sensors 408, 410, and 412 may be implemented using any of a variety of known technologies, including, for example, accelerometers, gyroscopes, and the like.
- Although the motion sensors 408, 410, and 412 are depicted as being located in the microphone controller 300, some embodiments may incorporate motion sensing devices located outside the microphone controller 300, such as the controller sensing device 254 of FIG. 2.
- The number and type of motion sensors affects the capability of the microphone controller 300 to detect motion in various dimensions.
- The motion sensors 408, 410, and 412 generate signals in response to motion of the microphone controller 300.
- The signals generated by the motion sensors 408, 410, and 412 may be output to the game console 200 for further processing, as indicated by the solid lines emanating from the motion sensors 408, 410, and 412 in FIG. 4.
- the signals may be provided to a microcontroller 414 , which is indicated by a dashed box in FIG. 4 to denote that it may be omitted in some embodiments.
- the microcontroller 414 In embodiments in which the signals are provided to the microcontroller 414 , the microcontroller 414 generates an output signal based on the signals received from the motion sensors 408 , 410 , and 412 . This output signal may also be based in part on the signal generated by the transducer 402 , as indicated by the dashed line connecting the transducer 402 to the microcontroller 414 in FIG. 4 . If the microphone controller 300 incorporates the modifier button 304 , the output signal generated by the microcontroller 414 may also be affected by which operational mode has been selected with the modifier button 304 .
- the output signal generated by the microcontroller 414 is output to a radio block 413 , to a USB port block 415 , or to both the radio block 413 and the USB port block 415 , for output to the game console 200 via a wired or wireless connection, e.g., to the USB controller 226 or the wireless adapter 248 of FIG. 2 .
- the motion sensors 408 , 410 , and 412 detect gestures performed by a user of the microphone controller 300 .
- each motion sensor 408 , 410 , and 412 detects motion in one or more orthogonal directions and outputs motion data.
- Gestures are derived from the motion data by software.
- the software that converts the motion data to gestures may reside in the game console 200 or may be embedded in the microphone game controller 300 . If the software resides in the game console 200 , the game controller 300 may be configured to output the motion data to the game console 200 .
- the gestures can be simple directional movements, e.g., UP, DOWN, LEFT, or RIGHT, or more complex movements to represent commands such as START, BACK, ENTER, ESCAPE, and the like. More complex movements can be represented by simple movements combined with an actuation of the modifier button 304 .
- the gestures are then used to control various aspects of the operation of the game console 200 , including, for example, selecting and launching a game.
- the game console 200 can present a graphic user interface that provides a consistent user experience when navigating to different media types available on the game console 200 .
- One example of such a graphic user interface is the dashboard menu used by the XBOX 360® brand video game console.
- gestures detected by the motion sensors 408 , 410 , and 412 are used to navigate the graphic user interface and to perform actions using the graphic user interface.
- the modifier button 304 may be used to switch between one operational mode in which gestures are used to perform navigation commands, such as UP, DOWN, LEFT, and RIGHT, and another operational mode in which gestures are used to perform actions, such as START, BACK, ENTER, and ESCAPE. It will be appreciated by those skilled in the art that the modifier button 304 may also be used to place the microphone controller 300 in operational modes other than those specifically described in this disclosure.
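The mode switching performed with the modifier button 304 can be sketched in code. The following is a minimal illustrative sketch, not an implementation from the patent; the gesture names, mode names, and mapping tables are all assumptions.

```python
# Sketch of the two operational modes described above: the same recognized
# gesture maps to a navigation command in one mode and to an action in the
# other. All names here are illustrative assumptions.

NAVIGATION_MODE, ACTION_MODE = "navigation", "action"

# Hypothetical gesture -> command tables for each operational mode.
GESTURE_MAP = {
    NAVIGATION_MODE: {"flick_up": "UP", "flick_down": "DOWN",
                      "flick_left": "LEFT", "flick_right": "RIGHT"},
    ACTION_MODE:     {"flick_up": "START", "flick_down": "BACK",
                      "flick_left": "ESCAPE", "flick_right": "ENTER"},
}

class ModalController:
    """Tracks the operational mode toggled by the modifier button."""
    def __init__(self):
        self.mode = NAVIGATION_MODE

    def press_modifier(self):
        # The modifier button switches between the two operational modes.
        self.mode = ACTION_MODE if self.mode == NAVIGATION_MODE else NAVIGATION_MODE

    def map_gesture(self, gesture):
        # The same physical gesture resolves differently per mode.
        return GESTURE_MAP[self.mode].get(gesture)

ctrl = ModalController()
print(ctrl.map_gesture("flick_up"))   # UP
ctrl.press_modifier()
print(ctrl.map_gesture("flick_up"))   # START
```

Because the mapping tables are keyed by mode, additional operational modes beyond the two described here could be added without changing the dispatch logic.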
- the microphone controller 300 avoids the need to use a separate gamepad-type controller to navigate the graphic user interface, while also avoiding the need to incorporate additional buttons on the body of the microphone controller 300 . As a result, the sense of realism and the overall gaming experience may be enhanced.
- both the power button 302 and the modifier button 304 are optional.
- the functionality provided by the power button 302 can be implemented using gestures.
- the microphone controller 300 may be configured to activate when the user picks up the microphone controller 300 .
- the microphone controller 300 may enter a “sleep” mode after a specified timeout period expires without any sensed motion.
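The button-free power behavior described above, waking on pick-up and sleeping after a motionless timeout, can be sketched as follows. This is a hedged sketch under assumed values: the wake threshold and timeout are arbitrary, and the single motion-magnitude input is a simplification of the sensor signals.

```python
# Illustrative sketch (not from the patent text) of button-free power
# management: the controller wakes when sensed motion exceeds a threshold,
# as when the user picks it up, and sleeps after a timeout with no motion.
# Threshold and timeout values are arbitrary assumptions.

WAKE_THRESHOLD = 0.5   # minimum motion magnitude treated as "picked up"
SLEEP_TIMEOUT = 120.0  # seconds of stillness before entering sleep mode

class PowerState:
    def __init__(self):
        self.awake = False
        self.last_motion_time = 0.0

    def on_sensor_sample(self, magnitude, now):
        if magnitude > WAKE_THRESHOLD:
            self.awake = True           # pick-up motion activates the controller
            self.last_motion_time = now
        elif self.awake and now - self.last_motion_time > SLEEP_TIMEOUT:
            self.awake = False          # no motion past the timeout: sleep

p = PowerState()
p.on_sensor_sample(1.2, 0.0)     # user picks up the controller
p.on_sensor_sample(0.0, 200.0)   # still for longer than the timeout
print(p.awake)  # False
```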
- the functionality provided by the modifier button 304 can be implemented using gestures.
- the microphone controller 300 can be configured to switch between the operational mode in which gestures are used to perform navigation commands and the other operational mode in which gestures are used to perform actions when a specified modifier gesture or combination of gestures is performed.
- the game console 200 may send a command to the microphone controller 300 to switch between operational modes, for example, in response to the occurrence of a triggering event or based on the context in which a gesture or combination of gestures is performed.
- the microphone controller 300 can be implemented without any buttons, thereby enhancing the simulation of a real microphone.
- FIG. 5 is a flow diagram illustrating a method of using the microphone controller 300 to perform navigation commands and actions in a graphic user interface using gestures.
- one or more of the motion sensors 408 , 410 , and 412 detects motion of the microphone controller 300 .
- the motion of the microphone controller 300 is recognized as a gesture, either by the microcontroller 414 in the microphone controller 300 or by software stored in the system memory 243 of the game console 200 or on a media disc accessible by the media drive 244 .
- the game console 200 determines the operational mode in which the microphone controller 300 is currently operating.
- the game console 200 maps the detected gesture to a navigation command, such as UP, DOWN, LEFT, or RIGHT.
- This gesture mapping can be performed by software located either in the system memory 243 or on a media disc loaded in the media drive 244 .
- the navigation command is executed, causing the graphic user interface to respond appropriately.
- gestures that are mapped to the navigation commands UP and DOWN may cause various items, such as games and movies, in a menu to be highlighted, while gestures that are mapped to the navigation commands LEFT and RIGHT may cause the graphic user interface to rotate between displays of various menus, such as a game menu, a system configuration menu, etc.
- the game console 200 maps the detected gesture to an action, such as START, BACK, ENTER, or ESCAPE.
- the action is performed, causing the graphic user interface to respond appropriately. For example, if the detected gesture is mapped to the action START, the highlighted item, e.g., a game or a movie, may be initiated. As another example, if the detected gesture is mapped to the action BACK, the graphic user interface may display a previously displayed menu or menu item. Using a combination of navigation and action gestures, the user can use the microphone controller 300 to perform most or all of the functions that are supported by a gamepad type controller without needing to use a separate controller.
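The flow just described, detect motion, recognize a gesture, check the operational mode, then either navigate or act, can be sketched end to end. The menu structure and handler names below are illustrative assumptions; only the command and action vocabulary (UP, DOWN, START, etc.) comes from the description above.

```python
# Minimal sketch of the FIG. 5 flow: a recognized gesture is dispatched as
# either a navigation command or an action depending on the current
# operational mode. Dashboard behavior here is a toy assumption.

def handle_gesture(gesture, mode, ui):
    """Dispatch a recognized gesture against the graphic user interface."""
    if mode == "navigation":
        command = {"up": "UP", "down": "DOWN",
                   "left": "LEFT", "right": "RIGHT"}[gesture]
        ui.navigate(command)
    else:
        action = {"up": "START", "down": "BACK",
                  "left": "ESCAPE", "right": "ENTER"}[gesture]
        ui.perform(action)

class Dashboard:
    """Toy menu: UP/DOWN move the highlight, START launches the item."""
    def __init__(self, items):
        self.items, self.index, self.launched = items, 0, None

    def navigate(self, command):
        if command == "UP":
            self.index = max(0, self.index - 1)
        elif command == "DOWN":
            self.index = min(len(self.items) - 1, self.index + 1)

    def perform(self, action):
        if action == "START":
            self.launched = self.items[self.index]

ui = Dashboard(["Game A", "Movie B"])
handle_gesture("down", "navigation", ui)  # highlight moves to "Movie B"
handle_gesture("up", "action", ui)        # START launches the highlighted item
print(ui.launched)  # Movie B
```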
Abstract
A game controller, such as a microphone controller, incorporates motion sensors that are configured to detect gestures performed by a user of the game controller. The gestures can be used to navigate and perform actions in a graphic user interface that a game console employs to provide a consistent user experience when navigating to different media types available on the game console. In this way, the game controller avoids the need to use a separate gamepad-type controller to navigate the graphic user interface, while also avoiding the need to incorporate additional buttons. As a result, the sense of realism and the overall gaming experience may be enhanced.
Description
- Video game consoles employ controllers to allow users to interface with software, such as video games. A typical controller has a number of controls. For example, a gamepad type controller will typically incorporate one or more directional controls, such as a number of buttons arranged in a directional keypad, one or more analog sticks, or a combination of such controls. In addition to the directional controls, a controller will typically include one or more action buttons that may be located on the face or shoulders of the controller. In some cases, directional controls may provide action selection functionality as well. For example, the analog sticks in a controller compatible with the XBOX 360® brand video game console, available from Microsoft Corp. of Redmond, Wash., can be pushed in as well as moved directionally.
- In addition to controlling the movement and actions of a character in a video game, a controller can be used to navigate a graphic user interface, such as the dashboard presented to users of the XBOX 360® brand video game console. The graphic user interface may include a number of menus and sub-menus that allow a user to, for example, execute game software, access media resources such as image, video, or audio files or media discs, configure system settings, etc. Navigating the graphic user interface is conventionally accomplished via a combination of directional navigation commands, e.g., left, right, up, and down, input using the directional controls and action buttons. While this control scheme can work well for traditional gamepad type controllers, it relies on the use of several buttons and is thus not well-suited for controllers that lack the buttons typically found on gamepad type controllers.
- For example, some video games involve singing into a microphone type controller. The limited space available on the surface of the microphone type controller limits the number of buttons that can be implemented on the microphone type controller. In addition, the presence of buttons on such a controller may detract from the overall gaming experience by reducing the degree of realism of the controller.
- According to various embodiments, a game controller, such as a microphone controller, incorporates motion sensors that are configured to detect gestures performed by a user of the game controller. The gestures can be used to navigate and perform actions in a graphic user interface that a game console employs to provide a consistent user experience when navigating to different media types available on the game console.
- One embodiment is directed to a method for using a game controller to navigate a graphic user interface presented by a video game console to a user. A motion of the game controller is detected and is recognized as a gesture. An operational mode in which the game controller is operating is then determined. If the game controller is operating in a first operational mode, a navigation command corresponding to the recognized gesture is executed in the graphic user interface. On the other hand, if the game controller is operating in a second operational mode, an action corresponding to the recognized gesture is performed in the graphic user interface. This method may be performed by a computer executing instructions stored on a computer readable storage medium.
- Another embodiment is directed to a game controller for use with a video game console. The game controller includes a microcontroller in electrical communication with at least one motion sensor configured to detect motion of the game controller. While not required, in some embodiments, the at least one motion sensor is incorporated in the game controller. The microcontroller or the video game console is configured to recognize the detected motion as a gesture. If the game controller is operating in a first operational mode, the recognized gesture is mapped to a navigation command that is executed in a graphic user interface presented by the video game console to a user. If the game controller is operating in a second operational mode, the recognized gesture is mapped to an action that is performed in the graphic user interface.
- Various embodiments may realize certain advantages. For example, by using gestures to perform navigation commands and actions in the graphic user interface, the game controller avoids the need to use a separate gamepad-type controller to navigate the graphic user interface, while also avoiding the need to incorporate additional buttons. As a result, the sense of realism and the overall gaming experience may be enhanced.
- This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- The illustrative embodiments will be better understood after reading the following detailed description with reference to the appended drawings, in which:
- FIG. 1 is a block diagram representing an exemplary computing device.
- FIG. 2 is a block diagram illustrating an implementation of the computing device of FIG. 1 as a game console.
- FIG. 3 is a plan view illustrating an example implementation of a microphone controller according to one embodiment.
- FIG. 4 is a block diagram representing the microphone controller of FIG. 3.
- FIG. 5 is a flow diagram illustrating an example method of performing navigation commands and actions in a graphic user interface using gestures.
- The inventive subject matter is described with specificity to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, it is contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of steps similar to the ones described in this document, in conjunction with other present or future technologies.
- FIG. 1 illustrates an example of a suitable computing system environment 100 in which the subject matter described above may be implemented. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the subject matter described above. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
- With reference to FIG. 1, computing system environment 100 includes a general purpose computing device in the form of a computer 110. Components of computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120. The system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus (also known as Mezzanine bus).
- Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media include both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
- The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, FIG. 1 illustrates operating system 134, application programs 135, other program modules 136, and program data 137.
- The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 1 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156, such as a CD-RW, DVD-RW or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM and the like. The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150.
- The drives and their associated computer storage media discussed above and illustrated in FIG. 1 provide storage of computer readable instructions, data structures, program modules and other data for the computer 110. In FIG. 1, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146 and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136 and program data 137. Operating system 144, application programs 145, other program modules 146 and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus 121, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A graphics interface 182 may also be connected to the system bus 121. One or more graphics processing units (GPUs) 184 may communicate with graphics interface 182. A monitor 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190, which may in turn communicate with video memory 186. In addition to monitor 191, computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 195.
- The computer 110 may operate in a networked or distributed environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks/buses. Such networking environments are commonplace in homes, offices, enterprise-wide computer networks, intranets and the Internet.
- When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 1 illustrates remote application programs 185 as residing on memory device 181. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
- It will be appreciated that one particular application of computer 110 is in the form of a game console 200 as depicted in FIG. 2. As seen therein, game console 200 has a central processing unit (CPU) 201 having a level 1 (L1) cache 202, a level 2 (L2) cache 204, and a flash ROM (Read-Only Memory) 206. The level 1 cache 202 and level 2 cache 204 temporarily store data and hence reduce the number of memory access cycles, thereby improving processing speed and throughput. The flash ROM 206 can store executable code that is loaded during an initial phase of a boot process when the game console 200 is initially powered. Alternatively, the executable code that is loaded during the initial boot phase can be stored in a FLASH memory device (not shown). Further, the flash ROM 206 can be located separate from CPU 201. Game console 200 can, optionally, be a multi-processor system; for example, game console 200 can have three processors similar to the CPU 201.
- A graphics processing unit (GPU) 208 and a video encoder/video codec (coder/decoder) 214 form a video processing pipeline for high speed and high resolution graphics processing. Data is carried from the graphics processing unit 208 to the video encoder/video codec 214 via a bus. The video processing pipeline outputs data to an A/V (audio/video) port 240 for transmission to a television or other display device. A memory controller 210 is connected to the GPU 208 and CPU 201 to facilitate processor access to various types of memory 212, such as, but not limited to, a RAM (Random Access Memory).
- Game console 200 includes an I/O controller 220, a system management controller 222, an audio processing unit 223, a network interface controller 224, a first USB controller 226, a second USB controller 228 and a front panel I/O subassembly 230 that may be implemented on a module 218. The USB controllers 226 and 228 serve as hosts for peripheral controllers 242(1)-242(2), a wireless adapter 248, and an external memory unit 246 (e.g., flash memory, external CD/DVD ROM drive, removable media, etc.). The network interface 224 and/or wireless adapter 248 provide access to a network (e.g., the Internet, a home network, etc.) and may be any of a wide variety of various wired or wireless interface components including an Ethernet card, a modem, a Bluetooth module, a cable modem, and the like. The game console 200 may be connected to a controller sensing device 254 to sense the position or motion of the peripheral controllers 242(1)-242(2) or other accessories. The controller sensing device may be implemented using, for example, a three-dimensional camera or an ultrasonic triangulation system.
- System memory 243 is provided to store application data that is loaded during the boot process. A media drive 244 is provided and may comprise a DVD/CD drive, a hard drive, or a removable media drive, etc. The media drive 244 may be internal or external to the game console 200. When the media drive 244 is a drive or reader for removable media (such as removable optical disks or flash cartridges), then the media drive 244 is an example of an interface onto which (or into which) media are mountable for reading. Application data may be accessed via the media drive 244 for execution, playback, etc. by game console 200. Media drive 244 is connected to the I/O controller 220 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394). While media drive 244 may generally refer to various storage embodiments (e.g., hard disk, removable optical disk drive, etc.), game console 200 may specifically include a hard disk 253, which can be used to store game data.
- The system management controller 222 provides a variety of service functions related to assuring availability of the game console 200. The audio processing unit 223 and an audio codec 232 form a corresponding audio processing pipeline with high fidelity, 3D, surround, and stereo audio processing according to aspects of the present subject matter described herein. Audio data is carried between the audio processing unit 223 and the audio codec 232 via a communication link. The audio processing pipeline outputs data to the A/V port 240 for reproduction by an external audio player or device having audio capabilities.
- The front panel I/O subassembly 230 supports the functionality of the power button 250 and the eject button 252, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the game console 200. A system power supply module 236 provides power to the components of the game console 200. A fan 238 cools the circuitry within the game console 200.
- The CPU 201, GPU 208, memory controller 210, and various other components within the game console 200 are interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures.
- When the game console 200 is powered on or rebooted, application data can be loaded from the system memory 243 into memory 212 and/or caches 202 and 204 and executed on the CPU 201. The game console 200 can present a graphic user interface that provides a consistent user experience when navigating to different media types available on the game console 200. In operation, applications and/or other media contained within the media drive 244 may be launched or played from the media drive 244 to provide additional functionalities to the game console 200.
- The game console 200 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the game console 200 may allow one or more users to interact with the system, watch movies, listen to music, and the like. However, with the integration of broadband connectivity made available through the network interface 224 or the wireless adapter 248, the game console 200 may further be operated as a participant in a larger network community.
- The game console 200 can be used with a variety of controllers 242, such as the controllers 242(1), 242(2), and 242(3) of FIG. 2. In the embodiment shown in FIG. 2, controllers 242(1) and 242(2) are wired controllers that communicate with the game console 200 via the USB controller 226. Controller 242(3) is a wireless controller that communicates with the game console 200 via the wireless adapter 248. According to certain embodiments, one or more of the controllers 242 can be implemented as specialized controllers for playing certain types of games. As a particular example, one or more of the controllers 242 can be implemented as a microphone controller for playing singing and music games.
- One example implementation of a microphone controller is depicted at FIG. 3 as a microphone controller 300, which may embody the controller 242(3) of FIG. 2. The microphone controller 300 may communicate with the game console 200 via a wired connection, for example, using the USB controller 226, or a wireless connection, for example, using the wireless adapter 248 of FIG. 2. As shown in FIG. 3, the microphone controller 300 may incorporate a power button 302 for turning the microphone controller 300 on and off. The microphone controller 300 may also incorporate a modifier button 304 for selecting an operational mode of the microphone controller 300, as explained more fully below. It will be appreciated that some embodiments of the microphone controller 300 may omit either or both of the power button 302 and the modifier button 304.
FIG. 4 is a block diagram illustrating the functional components of one embodiment of themicrophone controller 300. Themicrophone controller 300 includes atransducer 402 configured to receive an acoustic input, such as a human voice, and to generate an electrical signal. Thetransducer 402 may be implemented using any of a variety of well-known technologies. The generated electrical signal is output to thegame console 200 using a wired connection, such as using theUSB controller 226 or theUSB controller 228, or a wireless connection, such as using thewireless adapter 248. In some embodiments, themicrophone controller 300 may perform some processing on the electrical signal via an analog-to-digital (A/D)codec 403 before outputting the electrical signal to thegame console 200. In other embodiments, the electrical signal may be output without any processing. - The
power button 302 is connected to apower control module 404, which is in turn connected to apower source 406, such as one or more batteries. When thepower button 302 is actuated, thepower control module 404 causes themicrophone controller 300 to draw power from thepower source 406, activating themicrophone controller 300. If themicrophone controller 300 is already activated, actuating thepower button 302 may cause thepower control module 404 to deactivate themicrophone controller 300. In some embodiments, thepower control module 404 may consist of a simple electrical switch that completes a circuit when thepower button 302 is actuated. In other embodiments, thepower control module 404 may be more complex, such that, for example, thepower button 302 must be continuously actuated for some period to either activate or deactivate themicrophone controller 300. As another example, thepower control module 404 may be configured to automatically deactivate themicrophone controller 300 when themicrophone controller 300 is not used beyond a specified timeout period. This configuration may provide the advantage of avoiding excess power consumption when themicrophone controller 300 is not in use. In some embodiments, as an alternative to deactivating themicrophone controller 300, thepower control module 404 may be configured to place themicrophone controller 300 in a low-power “sleep” mode, either in response to actuation of thepower button 302 or after the timeout period. - The
microphone controller 300 also includes one or more motion sensors. While FIG. 4 depicts three motion sensors located within the microphone controller 300, some embodiments may incorporate motion sensing devices located outside the microphone controller 300, such as the controller sensing device 254 of FIG. 2. The number and type of motion sensors affect the capability of the microphone controller 300 to detect motion in various dimensions.

In the embodiment shown in
FIG. 4, the motion sensors are located within the microphone controller 300. The signals generated by the motion sensors may be provided directly to the game console 200 for further processing, as indicated by the solid lines emanating from the motion sensors in FIG. 4. Alternatively, as indicated by the dashed lines emanating from the motion sensors, the signals may be provided to a microcontroller 414, which is indicated by a dashed box in FIG. 4 to denote that it may be omitted in some embodiments.

In embodiments in which the signals are provided to the
microcontroller 414, the microcontroller 414 generates an output signal based on the signals received from the motion sensors. The output signal may also be based on the electrical signal generated by the transducer 402, as indicated by the dashed line connecting the transducer 402 to the microcontroller 414 in FIG. 4. If the microphone controller 300 incorporates the modifier button 304, the output signal generated by the microcontroller 414 may also be affected by which operational mode has been selected with the modifier button 304. The output signal generated by the microcontroller 414 is output to a radio block 413, to a USB port block 415, or to both the radio block 413 and the USB port block 415, for output to the game console 200 via a wired or wireless connection, e.g., to the USB controller 226 or the wireless adapter 248 of FIG. 2.

In operation, the
motion sensors detect motion of the microphone controller 300 and generate corresponding motion data. The motion data is converted into gestures by software, which may reside in the game console 200 or may be embedded in the microphone controller 300. If the software resides in the game console 200, the microphone controller 300 may be configured to output the motion data to the game console 200. The gestures can be simple directional movements, e.g., UP, DOWN, LEFT, or RIGHT, or more complex movements to represent commands such as START, BACK, ENTER, ESCAPE, and the like. More complex movements can also be represented by simple movements combined with an actuation of the modifier button 304.

The gestures are then used to control various aspects of the operation of the
game console 200, including, for example, selecting and launching a game. Further, as described above, the game console 200 can present a graphic user interface that provides a consistent user experience when navigating to different media types available on the game console 200. One particular example of such a graphic user interface is the dashboard menu used by the XBOX 360® brand video game console. According to various embodiments, gestures detected by the motion sensors can be used to navigate such a graphic user interface. For example, the modifier button 304 may be used to switch between one operational mode in which gestures are used to perform navigation commands, such as UP, DOWN, LEFT, and RIGHT, and another operational mode in which gestures are used to perform actions, such as START, BACK, ENTER, and ESCAPE. It will be appreciated by those skilled in the art that the modifier button 304 may also be used to place the microphone controller 300 in operational modes other than those specifically described in this disclosure. By using gestures to perform navigation commands and actions in the graphic user interface, the microphone controller 300 avoids the need for a separate gamepad-type controller to navigate the graphic user interface, while also avoiding the need to incorporate additional buttons on the body of the microphone controller 300. As a result, the sense of realism and the overall gaming experience may be enhanced.

As described above in connection with
FIG. 3, both the power button 302 and the modifier button 304 are optional. In some embodiments, the functionality provided by the power button 302 can be implemented using gestures. For example, the microphone controller 300 may be configured to activate when the user picks up the microphone controller 300. When the microphone controller 300 is set down after use, the microphone controller 300 may enter a “sleep” mode after a specified timeout period expires without any sensed motion. Similarly, the functionality provided by the modifier button 304 can be implemented using gestures. For example, the microphone controller 300 can be configured to switch between the operational mode in which gestures are used to perform navigation commands and the other operational mode in which gestures are used to perform actions when a specified modifier gesture or combination of gestures is performed. As another alternative, the game console 200 may send a command to the microphone controller 300 to switch between operational modes, for example, in response to the occurrence of a triggering event or based on the context in which a gesture or combination of gestures is performed. Thus, the microphone controller 300 can be implemented without any buttons, thereby enhancing the simulation of a real microphone.
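The button-free behavior just described can be sketched as a small state machine: motion wakes the controller, an idle timeout puts it to sleep, and a modifier gesture toggles the operational mode. The class name, timeout value, and gesture names below are illustrative assumptions, not details taken from the disclosure.

```python
import time

SLEEP_TIMEOUT_S = 60.0  # assumed idle period before entering sleep


class ButtonFreeController:
    """Hypothetical sketch of a controller with no physical buttons."""

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.awake = False
        self.mode = "navigation"  # or "action"
        self.last_motion = clock()

    def on_motion(self):
        """Any sensed motion wakes the controller and resets the idle timer."""
        self.awake = True
        self.last_motion = self.clock()

    def on_gesture(self, gesture):
        """A dedicated modifier gesture switches between operational modes."""
        if gesture == "MODE_SWITCH":
            self.mode = "action" if self.mode == "navigation" else "navigation"

    def tick(self):
        """Called periodically; enters a low-power sleep after the timeout."""
        if self.awake and self.clock() - self.last_motion > SLEEP_TIMEOUT_S:
            self.awake = False
        return self.awake
```

Injecting the clock as a parameter keeps the timeout logic testable without waiting out real time.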
FIG. 5 is a flow diagram illustrating a method of using the microphone controller 300 to perform navigation commands and actions in a graphic user interface using gestures. At a step 500, one or more of the motion sensors detect a motion of the microphone controller 300. Next, at a step 502, the motion of the microphone controller 300 is recognized as a gesture, either by the microcontroller 414 in the microphone controller 300 or by software stored in the system memory 243 of the game console 200 or on a media disc accessible by the media drive 244. After the motion of the microphone controller 300 is recognized as a gesture, the game console 200 determines the operational mode in which the microphone controller 300 is currently operating. If the microphone controller 300 is operating in the operational mode in which gestures are used to perform navigation commands, then, at a step 504, the game console 200 maps the detected gesture to a navigation command, such as UP, DOWN, LEFT, or RIGHT. This gesture mapping can be performed by software located either in the system memory 243 or on a media disc loaded in the media drive 244. At a step 506, the navigation command is executed, causing the graphic user interface to respond appropriately. For example, gestures that are mapped to the navigation commands UP and DOWN may cause various items, such as games and movies, in a menu to be highlighted, while gestures that are mapped to the navigation commands LEFT and RIGHT may cause the graphic user interface to rotate between displays of various menus, such as a game menu, a system configuration menu, etc.

On the other hand, if the
microphone controller 300 is operating in the operational mode in which gestures are used to perform actions, then, at a step 508, the game console 200 maps the detected gesture to an action, such as START, BACK, ENTER, or ESCAPE. At a step 510, the action is performed, causing the graphic user interface to respond appropriately. For example, if the detected gesture is mapped to the action START, the highlighted item, e.g., a game or a movie, may be initiated. As another example, if the detected gesture is mapped to the action BACK, the graphic user interface may display a previously displayed menu or menu item. Using a combination of navigation and action gestures, the user can use the microphone controller 300 to perform most or all of the functions that are supported by a gamepad-type controller without needing to use a separate controller.

While the above embodiments have been described in the context of a microphone controller, it will be appreciated that the principles described herein can be applied to any of a variety of game controllers. These principles may be particularly suitable for application to game controllers for which it is desirable to minimize the number of buttons, for example, to enhance realism. Other types of game controllers in connection with which the principles described herein may be particularly beneficial may include, but are not limited to, exercise controllers intended to be worn on the wrists and/or feet of a user for use in exercise or dancing games, pointing controllers such as guns, and specialized sports controllers for use in sports games, such as simulated tennis rackets, baseball bats, and the like.
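The two-mode flow of FIG. 5 (steps 504 through 510) can be illustrated with a toy dashboard model in which a recognized gesture is mapped either to a navigation command or to an action, depending on the current operational mode. The menu structure, class, and method names here are assumptions for illustration only.

```python
NAV_MODE, ACTION_MODE = "nav", "action"


class Dashboard:
    """Toy model of a console dashboard driven by recognized gestures."""

    def __init__(self, menus):
        self.menus = menus      # list of (menu_name, [items])
        self.menu_idx = 0       # which menu is displayed
        self.item_idx = 0       # which item is highlighted
        self.history = []       # previously displayed menus, for BACK
        self.started = None     # last launched item, if any

    def handle_gesture(self, gesture, mode):
        if mode == NAV_MODE:
            self._navigate(gesture)   # steps 504/506
        else:
            self._act(gesture)        # steps 508/510

    def _navigate(self, g):
        _name, items = self.menus[self.menu_idx]
        if g == "UP":
            self.item_idx = max(0, self.item_idx - 1)
        elif g == "DOWN":
            self.item_idx = min(len(items) - 1, self.item_idx + 1)
        elif g in ("LEFT", "RIGHT"):
            # Rotate between menus, remembering where we came from.
            self.history.append(self.menu_idx)
            step = 1 if g == "RIGHT" else -1
            self.menu_idx = (self.menu_idx + step) % len(self.menus)
            self.item_idx = 0

    def _act(self, g):
        _name, items = self.menus[self.menu_idx]
        if g == "START":
            self.started = items[self.item_idx]   # launch highlighted item
        elif g == "BACK" and self.history:
            self.menu_idx = self.history.pop()    # previously displayed menu
            self.item_idx = 0
```

For example, with menus `[("games", ["Halo", "Forza"]), ("movies", ["A", "B"])]`, a DOWN gesture in navigation mode highlights the second game, and a subsequent START gesture in action mode launches it.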
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features or acts described above are disclosed as example forms of implementing the claims.
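The recognition step of the description above (step 502, converting raw motion data into a directional gesture) can be sketched as a simple dominant-axis classifier over accelerometer deltas. The threshold, axis convention, and function name are assumptions; real recognition software in the controller or console would be considerably more robust.

```python
def recognize_gesture(samples, threshold=1.0):
    """Classify one movement window into a directional gesture.

    samples: iterable of (x, y) acceleration deltas for one movement.
    Returns "UP", "DOWN", "LEFT", "RIGHT", or None if motion is too small.
    """
    sx = sum(x for x, _ in samples)
    sy = sum(y for _, y in samples)
    if max(abs(sx), abs(sy)) < threshold:
        return None                        # below threshold: no gesture
    if abs(sx) >= abs(sy):                 # horizontal axis dominates
        return "RIGHT" if sx > 0 else "LEFT"
    return "UP" if sy > 0 else "DOWN"      # vertical axis dominates
```

A third axis, gesture sequences for START/BACK/ENTER/ESCAPE, and noise filtering would follow the same pattern.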
Claims (20)
1. A method for using a game controller to navigate a graphic user interface presented by a video game console to a user, the method comprising:
detecting a motion of the game controller;
recognizing the detected motion as a gesture;
determining an operational mode in which the game controller is operating;
if the game controller is operating in a first operational mode, executing in the graphic user interface a navigation command corresponding to the recognized gesture; and
if the game controller is operating in a second operational mode, performing in the graphic user interface an action corresponding to the recognized gesture.
2. The method of claim 1 , further comprising switching the operation of the game controller between the first operational mode and the second operational mode in response to at least one of actuation of a button on the game controller, recognizing the detected motion as a modifier gesture, and a command received by the game controller from the video game console.
3. The method of claim 1 , wherein the navigation command comprises at least one of an UP command, a DOWN command, a LEFT command, and a RIGHT command.
4. The method of claim 1 , wherein the action comprises at least one of an ENTER action, a BACK action, an ESCAPE action, and a START action.
5. The method of claim 1 , further comprising:
using the executed navigation command to select a game; and
using the performed action to launch the selected game.
6. A computer-readable storage medium storing computer-executable instructions for:
detecting a motion of a game controller;
recognizing the detected motion as a gesture;
determining an operational mode in which the game controller is operating;
if the game controller is operating in a first operational mode, executing in a graphic user interface a navigation command corresponding to the recognized gesture; and
if the game controller is operating in a second operational mode, performing in the graphic user interface an action corresponding to the recognized gesture.
7. The computer-readable storage medium of claim 6 , wherein the computer-readable storage medium stores further computer-executable instructions for switching the operation of the game controller between the first operational mode and the second operational mode in response to at least one of actuation of a button on the game controller, recognizing the detected motion as a modifier gesture, and a command received by the game controller from the video game console.
8. The computer-readable storage medium of claim 6 , wherein the navigation command comprises at least one of an UP command, a DOWN command, a LEFT command, and a RIGHT command.
9. The computer-readable storage medium of claim 6 , wherein the action comprises at least one of an ENTER action, a BACK action, an ESCAPE action, and a START action.
10. The computer-readable storage medium of claim 6 , wherein the computer-readable storage medium stores further computer-executable instructions for:
using the executed navigation command to select a game; and
using the performed action to launch the selected game.
11. A game controller for use with a video game console, the game controller comprising:
a microcontroller in electrical communication with at least one motion sensor configured to detect motion of the game controller, wherein at least one of the microcontroller and the video game console is configured to recognize the detected motion as a gesture,
wherein if the game controller is operating in a first operational mode, the recognized gesture is mapped to a navigation command that is executed in a graphic user interface presented by the video game console to a user, and
wherein if the game controller is operating in a second operational mode, the recognized gesture is mapped to an action that is performed in the graphic user interface.
12. The game controller of claim 11 , further comprising the at least one motion sensor.
13. The game controller of claim 11 , wherein the at least one motion sensor is configured to generate motion data, the game controller is configured to output motion data to the video game console, and the video game console is configured to convert the motion data to the recognized gesture.
14. The game controller of claim 11 , further comprising a modifier button in electrical communication with the microcontroller and configured to switch the game controller between the first operational mode and the second operational mode when the modifier button is actuated.
15. The game controller of claim 11 , wherein the operation of the game controller is switched between the first operational mode and the second operational mode in response to the microcontroller recognizing the detected motion as a modifier gesture.
16. The game controller of claim 11 , wherein the navigation command comprises at least one of an UP command, a DOWN command, a LEFT command, and a RIGHT command.
17. The game controller of claim 11 , wherein the action comprises at least one of an ENTER action, a BACK action, an ESCAPE action, and a START action.
18. The game controller of claim 11 , further comprising a transducer configured to receive an acoustic input and generate a signal based on the acoustic input.
19. The game controller of claim 11 , wherein the game controller is configured to activate when the at least one motion sensor detects motion of the game controller.
20. The game controller of claim 11 , wherein the game controller comprises at least one of an exercise controller configured to be worn by the user, a pointing controller, and a sports controller.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/134,448 US20090305785A1 (en) | 2008-06-06 | 2008-06-06 | Gesture controlled game screen navigation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/134,448 US20090305785A1 (en) | 2008-06-06 | 2008-06-06 | Gesture controlled game screen navigation |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090305785A1 true US20090305785A1 (en) | 2009-12-10 |
Family
ID=41400813
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/134,448 Abandoned US20090305785A1 (en) | 2008-06-06 | 2008-06-06 | Gesture controlled game screen navigation |
Country Status (1)
Country | Link |
---|---|
US (1) | US20090305785A1 (en) |
Patent Citations (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5607157A (en) * | 1993-04-09 | 1997-03-04 | Sega Enterprises, Ltd. | Multi-connection device for use in game apparatus |
US20110300941A1 (en) * | 2000-02-22 | 2011-12-08 | Creative Kingdoms, Llc | Motion-sensitive Input Device and Interactive Gaming System |
US7690988B2 (en) * | 2000-11-21 | 2010-04-06 | Sony Computer Entertainment Inc. | Information processing method |
US20040196265A1 (en) * | 2001-07-17 | 2004-10-07 | Nohr Steven P. | System and method for finger held hardware device |
US20030173829A1 (en) * | 2002-03-13 | 2003-09-18 | Yu-Wen Zeng | Sound-activated wake-up device for electronic input devices having a sleep-mode |
US20070021208A1 (en) * | 2002-07-27 | 2007-01-25 | Xiadong Mao | Obtaining input for controlling execution of a game program |
US20060274032A1 (en) * | 2002-07-27 | 2006-12-07 | Xiadong Mao | Tracking device for use in obtaining information for controlling game program execution |
US20060287085A1 (en) * | 2002-07-27 | 2006-12-21 | Xiadong Mao | Inertially trackable hand-held controller |
US20060287084A1 (en) * | 2002-07-27 | 2006-12-21 | Xiadong Mao | System, method, and apparatus for three-dimensional input control |
US20070015558A1 (en) * | 2002-07-27 | 2007-01-18 | Sony Computer Entertainment America Inc. | Method and apparatus for use in determining an activity level of a user in relation to a system |
US20070015559A1 (en) * | 2002-07-27 | 2007-01-18 | Sony Computer Entertainment America Inc. | Method and apparatus for use in determining lack of user activity in relation to a system |
US20060264259A1 (en) * | 2002-07-27 | 2006-11-23 | Zalewski Gary M | System for tracking user manipulations within an environment |
US20060264258A1 (en) * | 2002-07-27 | 2006-11-23 | Zalewski Gary M | Multi-input game control mixer |
US20040193413A1 (en) * | 2003-03-25 | 2004-09-30 | Wilson Andrew D. | Architecture for controlling a computer using hand gestures |
US7233316B2 (en) * | 2003-05-01 | 2007-06-19 | Thomson Licensing | Multimedia user interface |
US20070259717A1 (en) * | 2004-06-18 | 2007-11-08 | Igt | Gesture controlled casino gaming system |
US20070259716A1 (en) * | 2004-06-18 | 2007-11-08 | Igt | Control of wager-based game using gesture recognition |
US20080174550A1 (en) * | 2005-02-24 | 2008-07-24 | Kari Laurila | Motion-Input Device For a Computing Terminal and Method of its Operation |
US20060209014A1 (en) * | 2005-03-16 | 2006-09-21 | Microsoft Corporation | Method and system for providing modifier key behavior through pen gestures |
US8064827B2 (en) * | 2005-05-15 | 2011-11-22 | Sony Computer Entertainment Inc. | Center device |
US20070026869A1 (en) * | 2005-07-29 | 2007-02-01 | Sony Ericsson Mobile Communications Ab | Methods, devices and computer program products for operating mobile devices responsive to user input through movement thereof |
US20070066394A1 (en) * | 2005-09-15 | 2007-03-22 | Nintendo Co., Ltd. | Video game system with wireless modular handheld controller |
US7927216B2 (en) * | 2005-09-15 | 2011-04-19 | Nintendo Co., Ltd. | Video game system with wireless modular handheld controller |
US7667686B2 (en) * | 2006-02-01 | 2010-02-23 | Memsic, Inc. | Air-writing and motion sensing input for portable devices |
US20070218994A1 (en) * | 2006-03-14 | 2007-09-20 | Sony Computer Entertainment Inc. | Game Controller |
US20090305787A1 (en) * | 2006-07-28 | 2009-12-10 | Sony Computer Entertainment Inc. | Game control program, game control method, and game device |
US20080080789A1 (en) * | 2006-09-28 | 2008-04-03 | Sony Computer Entertainment Inc. | Object detection using video input combined with tilt angle information |
US20080287189A1 (en) * | 2007-05-09 | 2008-11-20 | Nintendo Of America Inc. | System and method for using accelerometer outputs to control an object rotating on a display |
US20080300055A1 (en) * | 2007-05-29 | 2008-12-04 | Lutnick Howard W | Game with hand motion control |
US20090027337A1 (en) * | 2007-07-27 | 2009-01-29 | Gesturetek, Inc. | Enhanced camera-based input |
US20100234094A1 (en) * | 2007-11-09 | 2010-09-16 | Wms Gaming Inc. | Interaction with 3d space in a gaming system |
US20090153288A1 (en) * | 2007-12-12 | 2009-06-18 | Eric James Hope | Handheld electronic devices with remote control functionality and gesture recognition |
US20090153289A1 (en) * | 2007-12-12 | 2009-06-18 | Eric James Hope | Handheld electronic devices with bimodal remote control functionality |
US20090217211A1 (en) * | 2008-02-27 | 2009-08-27 | Gesturetek, Inc. | Enhanced input using recognized gestures |
US20090291759A1 (en) * | 2008-05-22 | 2009-11-26 | International Business Machines Corporation | Simulation of writing on game consoles through the use of motion-sensing technology |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090197635A1 (en) * | 2008-02-01 | 2009-08-06 | Kim Joo Min | user interface for a mobile device |
US20090197615A1 (en) * | 2008-02-01 | 2009-08-06 | Kim Joo Min | User interface for mobile devices |
US8195220B2 (en) * | 2008-02-01 | 2012-06-05 | Lg Electronics Inc. | User interface for mobile devices |
US8423076B2 (en) | 2008-02-01 | 2013-04-16 | Lg Electronics Inc. | User interface for a mobile device |
US20100248832A1 (en) * | 2009-03-30 | 2010-09-30 | Microsoft Corporation | Control of video game via microphone |
US20130147850A1 (en) * | 2011-12-08 | 2013-06-13 | Motorola Solutions, Inc. | Method and device for force sensing gesture recognition |
US20130346906A1 (en) * | 2012-06-25 | 2013-12-26 | Peter Farago | Creation and exposure of embedded secondary content data relevant to a primary content page of an electronic book |
US8904304B2 (en) * | 2012-06-25 | 2014-12-02 | Barnesandnoble.Com Llc | Creation and exposure of embedded secondary content data relevant to a primary content page of an electronic book |
US20150052472A1 (en) * | 2012-06-25 | 2015-02-19 | Barnesandnoble.Com Llc | Creation and Exposure of Embedded Secondary Content Data Relevant to a Primary Content Page of An Electronic Book |
US10042519B2 (en) * | 2012-06-25 | 2018-08-07 | Nook Digital, Llc | Creation and exposure of embedded secondary content data relevant to a primary content page of an electronic book |
US10024660B2 (en) | 2012-08-27 | 2018-07-17 | Universite Du Quebec A Chicoutimi | Method to determine physical properties of the ground |
WO2014060647A1 (en) * | 2012-10-15 | 2014-04-24 | Trick Technologies Oy | A microphone device, method to operate and a system thereof |
US9936319B2 (en) | 2012-10-15 | 2018-04-03 | Trick Technologies Oy | Microphone device, method to operate and a system thereof |
CN103064513A (en) * | 2012-12-10 | 2013-04-24 | 深圳市茁迩科技发展有限公司 | Method and device for switching operation modes of motion sensing device automatically |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090305785A1 (en) | Gesture controlled game screen navigation | |
US10960298B2 (en) | Boolean/float controller and gesture recognition system | |
US8355003B2 (en) | Controller lighting activation by proximity and motion | |
US9095775B2 (en) | User interface and method of user interaction | |
US20100304868A1 (en) | Multi-positional three-dimensional controller | |
US20130154958A1 (en) | Content system with secondary touch controller | |
JP5116679B2 (en) | Intensive computer image and sound processing and input device for interfacing with computer programs | |
JP5665752B2 (en) | Controller with spherical end with configurable mode | |
EP2347320A1 (en) | Control device for communicating visual information | |
EP2379189B1 (en) | Enhanced video game jukebox-type system and methodology | |
US7145569B2 (en) | Data processing method | |
US8328637B2 (en) | Combat action selection using situational awareness | |
Parker | Buttons, simplicity, and natural interfaces | |
US11806625B2 (en) | Patch and bulk operations for a legacy game | |
WO2010065211A1 (en) | Three-dimensional control with a multi-positional controller | |
CN115686523A (en) | AOT compiler for legacy games |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BEEMAN, STEVEN M.;GIAIMO, EDWARD C., III;FILER, ERIC;AND OTHERS;SIGNING DATES FROM 20080602 TO 20080604;REEL/FRAME:025463/0713 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034766/0509 Effective date: 20141014 |