WO2012075199A2 - Multiplexed numeric keypad and touchpad


Info

Publication number
WO2012075199A2
WO2012075199A2 (PCT/US2011/062723)
Authority
WO
WIPO (PCT)
Prior art keywords
mode
motion
processor
signal
touchpad
Prior art date
Application number
PCT/US2011/062723
Other languages
French (fr)
Other versions
WO2012075199A3 (en)
Inventor
Randal J. Marsden
Steve Hole
Original Assignee
Cleankeys Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cleankeys Inc. filed Critical Cleankeys Inc.
Priority to EP11844754.9A priority Critical patent/EP2646893A2/en
Publication of WO2012075199A2 publication Critical patent/WO2012075199A2/en
Publication of WO2012075199A3 publication Critical patent/WO2012075199A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 … with force or tactile feedback as computer generated output to the user
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233 Character input methods
    • G06F3/0237 … using prediction or retrieval techniques
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414 … using force sensing means to determine a position
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 … using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 … using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 … for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886 … by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/0489 … using dedicated keyboard keys or combinations thereof
    • G06F3/04895 Guidance during keyboard input operation, e.g. prompting
    • G06F3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00 Aspects of interface with display user

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Input From Keyboards Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A method and system integrate a numeric keypad and a touchpad at the same physical location on a touch-sensitive display device. The operational mode of the shared location is determined automatically from the user's actions on the surface, or manually by the user. The system operates in at least one mode selected from: numpad mode, touchpad mode, keyboard mode, and auto-detect mode. A visual indicator shows the user which mode is currently active.

Description

MULTIPLEXED NUMERIC KEYPAD AND TOUCHPAD
FIELD OF THE INVENTION
[0001] The invention relates to a smooth touch-sensitive surface that allows the user to rest their hands or fingers on the surface without causing an event actuation. More specifically, the touch surface may be made up of both a keypad and a touchpad occupying the same physical space.
BACKGROUND OF THE INVENTION
[0002] The origin of the modern keyboard as the primary method for inputting text and data from a human to a machine dates back to early typewriters in the 19th century. As computers were developed, it was a natural evolution to adapt the typewriter keyboard for use as the primary method for inputting text and data. While the implementation of the keys on typewriters, and subsequently on computer keyboards, has evolved from mechanical to electrical and finally to electronic, the size, placement, and mechanical nature of the keys themselves have remained largely unchanged.
[0003] As computers evolved and graphical user interfaces were developed, the mouse became a common user input device. With the introduction of portable "laptop" computers, various new pointing devices were invented as alternatives to the mouse, such as trackballs, joysticks, and touchpads (also referred to as "trackpads"). The overwhelming majority of laptop computers now incorporate the touchpad as the primary pointing device.
[0004] Prior to computers, a common office instrument used for performing numerical calculations was the "adding machine". This device incorporated number keys along with keys for common mathematical operations such as add, subtract, multiply, and divide. The operator would perform data entry on these machines, which would then display the result, print the result, or do both. Experienced operators of adding machines were able to memorize the location of the keys and could enter data and perform operations very quickly without looking. As computers became common, the need for efficient numeric entry persisted, and the "adding machine" functions were added to computer keyboards in the form of a numeric keypad (or "numpad") typically located to the right of the standard keyboard.
[0005] Combining the three primary user interface devices of keyboard, touchpad, and numpad into a single device results in the device becoming unreasonably large. The problem is further complicated by the fact that many modern keyboards incorporate yet additional keys for page navigation, multimedia controls, gaming, and keyboard settings functions. The result can be a "keyboard" that is often larger than the computer itself.
SUMMARY OF THE INVENTION
[0006] The present invention describes a method and system that solves the space problem by integrating the numeric keypad part of the keyboard and the touchpad in the same physical location.
[0007] Keyboard technology has now evolved to the point of eliminating the traditional mechanical keys, in favor of a touch-sensitive surface that can detect user input through the correlation of touch and vibration sensors (Marsden, U.S. Patent Application Ser. No. 12/234,053). This surface can be used to provide all the functions of the keyboard, numpad, and touchpad, but in a much smaller space since it makes it possible to "multiplex" or use the same physical space on the surface for multiple functions. The touch surface may incorporate either a dynamic or static display beneath it, or a mixture of both.
[0008] In one aspect of the invention, the numeric keypad and the touchpad occupy the same physical space. This is possible because the touch-sensitive surface, unlike traditional mechanical keys, can have the spacing, size, orientation, and function of its "keys" dynamically assigned.
[0009] In another aspect of the invention, the system has three modes of operation: numpad mode, touchpad mode, and auto-detect mode. Visual indicators provide feedback to the user as to which mode the device is in. The user changes the mode by activating a key or key combination on the keyboard.
[0010] In a further aspect of the invention, the system automatically determines which mode the user intends based on their interaction with the touch surface. For example, if the user slides their finger across the surface, they most likely intend for it to act as a touchpad, causing the pointer to move. Similarly, if the user taps their finger on a specific sector of the touch surface assigned to a number key, then they most likely intend for it to be used as a numpad.
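By way of illustration only, this auto-detection heuristic might be sketched as follows in Python; the TouchEvent type, the slide threshold, and all other names here are inventions of this summary, not part of the disclosure.

```python
# Illustrative sketch of the auto-detect heuristic in paragraph [0010].
# All names and the threshold value are assumptions, not the disclosure.
from dataclasses import dataclass

SLIDE_THRESHOLD_MM = 3.0  # assumed travel beyond which a contact is a "slide"

@dataclass
class TouchEvent:
    x_mm: float  # contact position reported by the touch sensors
    y_mm: float

def classify_intent(trace: list[TouchEvent]) -> str:
    """Classify one completed contact as touchpad or numpad intent."""
    if len(trace) < 2:
        return "numpad"  # a bare tap on a key sector reads as numeric entry
    dx = trace[-1].x_mm - trace[0].x_mm
    dy = trace[-1].y_mm - trace[0].y_mm
    travel = (dx * dx + dy * dy) ** 0.5
    # A slide across the surface suggests pointer movement (touchpad);
    # a stationary tap suggests a number-key press (numpad).
    return "touchpad" if travel >= SLIDE_THRESHOLD_MM else "numpad"
```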
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] Preferred and alternative examples of the present invention are described in detail below with reference to the following drawings:
[0012] FIGURE 1 is a hardware block diagram showing the typical hardware components of a system formed in accordance with an embodiment of the present invention;
[0013] FIGURE 2 shows an exemplary process performed by the system shown in FIGURE 1; and
[0014] FIGURE 3 is a schematic partial view of an exemplary touch-sensitive surface formed in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
[0015] FIGURE 1 shows a block diagram of the hardware components of a device 100 for providing a multiplexed numeric keypad and touchpad. The device 100 includes one or more touch sensors 120 that provide input to a CPU (processor) 110, notifying the processor 110 of contact events when the surface has been touched, typically mediated by a hardware controller that interprets the raw signals received from the touch sensor(s) 120 and communicates the information to the processor 110 using a known communication protocol via an available data port. Similarly, the device 100 includes one or more vibration sensors 130 that communicate with the processor 110 when the surface is tapped, in a manner similar to that of the touch sensor(s) 120. The processor 110 communicates with an optional hardware controller to cause a display 140 to present an appropriate image. A speaker 150 is also coupled to the processor so that any appropriate auditory signals can be passed on to the user as guidance. The processor 110 has access to a memory 160, which may include a combination of temporary and/or permanent storage, both read-only and writable: random access memory (RAM), read-only memory (ROM), writable non-volatile memory such as FLASH memory, hard drives, floppy disks, and so forth. The memory 160 includes program memory 170 that contains all programs and software, such as an operating system 171, the User Gesture Recognition software 172, and any other application programs 173. The memory 160 also includes data memory 180 that includes the user options and preferences 181 required by the User Gesture Recognition software 172, and any other data 182 required by any element of the device 100.
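Purely as an illustration of how the FIGURE 1 components relate, the structure might be modelled as below; the patent describes hardware, so every class and field name in this sketch is an assumption made for exposition.

```python
# Hypothetical software model of the FIGURE 1 hardware blocks.
from dataclasses import dataclass, field

@dataclass
class Memory160:
    # program memory 170
    operating_system: str = "os-171"
    gesture_software: str = "user-gesture-recognition-172"
    applications: list = field(default_factory=list)  # other programs 173
    # data memory 180
    user_options: dict = field(default_factory=dict)  # options/preferences 181
    other_data: dict = field(default_factory=dict)    # other data 182

@dataclass
class Device100:
    touch_sensors: list      # touch sensors 120: contact events to the processor
    vibration_sensors: list  # vibration sensors 130: tap events to the processor
    display: object          # display 140, driven via an optional controller
    speaker: object          # speaker 150 for auditory guidance
    memory: Memory160 = field(default_factory=Memory160)
```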
[0016] FIGURE 2 shows a flow chart of an exemplary process 200 that allows the same physical area on a touchscreen keyboard to be used to perform the functions of both a numeric keypad and a touchpad. The process 200 is not intended to detail the software of the present invention in its entirety; it is provided as an overview and an enabling disclosure of the present invention.
[0017] The process 200 is provided by the User Gesture Recognition Software 172. At block 205, when the process is first started, various system variables are initialized. For example, event time out (threshold time) is set to zero. At block 210, the process waits to be notified that user contact has occurred within the common area. While the system is waiting in block 210, a counter is incremented with the passage of time. Once user contact has occurred, block 215 determines if the counter has exceeded the maximum time (threshold) allowed for user input (stored as a user option in Data Memory 181).
[0018] If the maximum time allowed for user input has been exceeded, the system resets the mode of the common area to the default mode in block 220. At a decision block 225, the processor 110 determines whether the current mode is the touchpad mode. If it is, the processor 110 interprets the user contact as a touchpad event and outputs the command accordingly in block 230.
[0019] If the current mode is not the touchpad mode, the processor 110 assumes the common area is in number pad (numpad) mode and proceeds to decision block 235. In touchpad operation, the user makes an initial touch followed by a sliding motion with one or more fingers. In numpad operation, the user taps on a number key and typically does not slide their finger. The processor 110 uses this difference in typical operation to interpret the user's input in decision block 235: if a touch-and-slide motion is detected by the processor 110 based on signals provided by the sensors 120, 130, the processor 110 changes the current mode to the touchpad mode in block 240 and outputs the user action as a touchpad event in block 245. If the user action is not a touch-and-slide motion, then the user action is output by the processor 110 as a numpad event in block 250. After blocks 230, 245, and 250, the process 200 returns to block 210.
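The control flow of process 200 can be summarized with the following sketch; the timeout value, the mode names, and the is_touch_and_slide flag (standing in for the block 235 decision made from the sensor signals) are assumptions made here for clarity, not the claimed implementation. Note how, once in touchpad mode, every contact is reported as a touchpad event, so a quick tap after a slide still acts as a click, matching paragraph [0020] below.

```python
import time

DEFAULT_MODE = "numpad"  # default mode, a user option (cf. data memory 181)
IDLE_THRESHOLD_S = 2.0   # assumed maximum idle time before the mode resets

class CommonArea:
    """Sketch of process 200 for the shared numpad/touchpad area."""

    def __init__(self):
        self.mode = DEFAULT_MODE
        self.last_contact = None  # time of the previous user contact

    def handle_contact(self, is_touch_and_slide: bool) -> str:
        now = time.monotonic()
        # Blocks 215/220: reset to the default mode after the idle threshold.
        if self.last_contact is not None and now - self.last_contact > IDLE_THRESHOLD_S:
            self.mode = DEFAULT_MODE
        self.last_contact = now

        # Blocks 225/230: in touchpad mode every contact is a touchpad event,
        # so a tap shortly after a slide still acts as a "left button" click.
        if self.mode == "touchpad":
            return "touchpad event"

        # Blocks 235-250: a touch-and-slide switches the area to touchpad
        # mode; a plain tap is reported as a numpad key event.
        if is_touch_and_slide:
            self.mode = "touchpad"
            return "touchpad event"
        return "numpad event"
```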
[0020] Note that single taps (or multiple taps in succession) are also common when using a touchpad, and are commonly assigned to functions such as "select" or what is commonly referred to as a "mouse left button" action. These types of actions typically occur shortly after a touch-and-slide motion, and so the system will still be in touchpad mode (since the counter will not yet have reached the threshold in block 215).
[0021] Other user gestures on the touchpad are interpreted and assigned to functions, such as multiple finger swipes across the touchpad. While the device 100 is in the touchpad mode, all these gestures are interpreted as touchpad input and sent to the device's operating system as such to be interpreted by whatever system software resides therein. In this way, the system and method of the present invention acts exactly like any other touchpad when in touchpad mode.
[0022] In one embodiment, the default mode is set by the user (typically through control panel software). If the device 100 is at rest with no user input for the user-settable amount of time (threshold), the mode is restored to the default mode.
[0023] FIGURE 3 shows a schematic view representative of a touch- and tap-sensitive keyboard 300 that incorporates on its forward-facing surface an area 310 incorporating the functions of both a numeric keypad and a touchpad. The term "keyboard" in this application refers to any keyboard that is implemented on a touch- and tap-sensitive surface, including a keyboard presented on a touch-sensitive display. The keyboard 300 includes the outline of the area 310 incorporating the functions of the touchpad, the keys assigned to the numeric keypad, as well as the selection keys commonly referred to as the "left and right mouse buttons" 330. "Mode" refers to the type of function that is assigned to the commonly-shared area 310. A separate mode key 320 allows the user to manually select among touchpad mode, numeric keypad (or "numpad") mode, and "Auto" mode (whereby the function assigned to the common area 310 is determined by the system according to the actions of the user on the surface of the common area 310).
[0024] In one embodiment, the system of the present invention displays the current mode (touchpad or number pad) with visual indicators 320 along with an "Auto" mode visual indicator. In this way, the user can know which mode the system is in at all times. In one embodiment, a mode key 324 is provided below the indicators 320 on the keyboard. User activation of the mode key 324 causes the processor 110 to switch to another mode.
[0025] In one embodiment, the user may define the default mode to be the touchpad mode by first selecting Auto mode with the mode key 324 immediately followed by a touch-and-slide motion on the common area 310. In the absence of a touch-and-slide motion immediately following the selection of Auto mode, the processor 110 will set the default mode to numpad mode.
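A compact restatement of this default-mode rule, as a sketch: the timing window standing in for "immediately following" is an assumption of this summary.

```python
AUTO_FOLLOW_WINDOW_S = 1.0  # assumed window for "immediately following"

def default_mode_after_auto(first_gesture: str, delay_s: float) -> str:
    """Pick the default mode right after the user selects Auto mode."""
    if first_gesture == "touch-and-slide" and delay_s <= AUTO_FOLLOW_WINDOW_S:
        return "touchpad"
    return "numpad"  # no prompt touch-and-slide: default to numpad mode
```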
[0026] In another embodiment of the invention, the touch surface is used in a fourth mode: keyboard. In the fourth mode, the surface represents a keyboard, on which the user may enter text using a plethora of methods designed for smaller touch surfaces (such as those invented for smartphones). This mode is manually selected by the user through some scheme implemented on the keyboard or computer software, or it is selected by functionality provided by the auto-detect mode. The device stays in keyboard mode for as long as the user is typing. To exit the keyboard mode and return to the touchpad mode, the user performs a predefined gesture - such as pressing and holding all their fingers for a few seconds in the same location. The processor recognizes the unique gesture, then changes mode accordingly. Other gestures could also be recognized.
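The exit gesture could be detected along these lines; the finger count and hold time are illustrative stand-ins for "all their fingers" and "a few seconds", not disclosed values.

```python
EXIT_FINGER_COUNT = 8    # assumed stand-in for "all their fingers"
EXIT_HOLD_SECONDS = 2.0  # assumed stand-in for "a few seconds"

def is_keyboard_exit_gesture(active_contacts: int, held_still_s: float) -> bool:
    """Detect the press-and-hold gesture that leaves keyboard mode."""
    return active_contacts >= EXIT_FINGER_COUNT and held_still_s >= EXIT_HOLD_SECONDS
```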
[0027] In another embodiment of the invention, the touch surface incorporates a dynamic display. The display changes in accordance with the current mode setting to present the appropriate image in the common area. For example, when numpad mode is selected, a numeric keypad is displayed; when touchpad mode is selected, a blank rounded rectangle is displayed; and so on.
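As a sketch, the display update could be a simple mapping from mode to image; the asset names and the display.show() driver call are invented here for illustration.

```python
MODE_IMAGES = {
    "numpad": "numeric_keypad.png",        # numpad mode: number-key image
    "touchpad": "blank_rounded_rect.png",  # touchpad mode: blank rounded rectangle
    "keyboard": "keyboard_layout.png",     # keyboard mode: text-entry layout
}

def refresh_common_area(display, mode: str) -> None:
    """Have display 140 present the image for the current mode."""
    display.show(MODE_IMAGES[mode])  # display.show() is a hypothetical driver call
```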

Claims

[0028] The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows:
1. A system comprising:
a surface comprising a multi-mode area;
a plurality of touch sensors coupled to the surface, the plurality of touch sensors configured to generate at least one sense signal based on sensed user contact with the surface;
a plurality of motion sensors, the plurality of motion sensors configured to generate a motion signal based on sensed vibrations of the surface; and
a processor in signal communication with the surface, the plurality of touch sensors, and the plurality of motion sensors, wherein the processor is configured to determine mode of operation associated with the multi-mode area based on interpretation of at least one of the generated at least one sense signal and the motion signal associated with the multi-mode area.
2. The system of Claim 1, wherein the modes of operation comprise at least two of a keyboard mode, a numeric keypad mode, or a touchpad mode.
3. The system of Claim 2, wherein the processor is further configured to determine the mode of operation based on a signal associated with a user selection.
4. The system of Claim 3, wherein the surface comprises a display device coupled to the processor, wherein the user selection comprises activation of a mode key displayed by the processor on the surface.
5. The system of Claim 1, wherein the surface comprises at least one visual indicator, wherein the processor illuminates at least one visual indicator based on the determined mode of operation.
6. The system of Claim 2, wherein the processor identifies a default mode of operation.
7. The system of Claim 6, wherein the processor identifies the default mode of operation to be the touchpad mode after an auto mode selection has occurred followed within a predefined amount of time by a determination of a sliding motion at least on or near the multi-mode area based on the at least one sense signal,
wherein the processor identifies the default mode to be the numeric keypad mode if after the auto mode selection no sliding motion is detected within the predefined amount of time based on the at least one sense signal.
8. The system of Claim 6, wherein the processor determines mode of operation to be the touchpad mode, if the processor detects a touch-and-slide motion at the multi-mode area based on the generated at least one sense signal and the motion signal,
wherein the processor determines mode of operation to be at least one of the numeric keypad mode or the keyboard mode, if the processor detects only a tap motion based on the generated motion signals and the detected tap motion did not occur within a threshold amount of time since the detected touch-and-slide motion.
9. The system of Claim 8, wherein the processor returns interpretation of the generated at least one sense signal and the motion signal associated with the multi-mode area to the default mode after a predefined period of time has expired since a previously generated at least one sense signal and motion signal associated with the multi-mode area.
10. The system of Claim 2, wherein the surface comprises a display device coupled to the processor,
wherein the processor is configured to generate an image and present the generated image in the multi-mode area of the surface, wherein the generated image is associated with current mode of operation.
11. The system of Claim 1, wherein the surface comprises a static representation of at least one of a numeric keypad, keyboard or touchpad.
12. A method comprising:
at a plurality of touch sensors, generating at least one sense signal based on sensed user contact with a surface;
at a plurality of motion sensors, generating a motion signal based on sensed vibrations of the surface; and
at a processor in signal communication with the surface, the plurality of touch sensors, and the plurality of motion sensors,
receiving the generated at least one sense signal and the motion signal; and determining mode of operation associated with a multi-mode area of the surface based on interpretation of at least one of the received at least one sense signal and the motion signal associated with the multi-mode area.
13. The method of Claim 12, wherein the modes of operation comprise at least two of a keyboard mode, a numeric keypad mode, or a touchpad mode.
14. The method of Claim 13, wherein determining the mode of operation comprises determining the mode of operation based on a signal associated with a user selection.
15. The method of Claim 12, further comprising at the processor illuminating at least one visual indicator associated with the surface based on the determined mode of operation.
16. The method of Claim 13, further comprising at the processor identifying a default mode of operation.
17. The method of Claim 16, wherein identifying comprises:
identifying the default mode of operation is the touchpad mode after receiving an auto mode selection followed within a predefined amount of time by receiving at least one sense signal determined to be a sliding motion at least on or near the multi-mode area; and identifying the default mode is the numeric keypad mode if after receiving the auto mode selection no sense signal determined to be a sliding motion is received within the predefined amount of time.
18. The method of Claim 16, wherein determining mode of operation comprises:
determining the mode of operation is the touchpad mode, if a touch-and-slide motion at the multi-mode area has been detected based on the generated at least one sense signal and the motion signal,
determining the mode of operation is at least one of the numeric keypad mode or the keyboard mode, if only a tap motion has been detected based on the generated motion signals and the detected tap motion did not occur within a threshold amount of time since the detected touch-and-slide motion.
19. The method of Claim 18, further comprising at the processor returning interpretation of the generated at least one sense signal and the motion signal associated with the multi-mode area to the default mode after a predefined period of time has expired since a previously generated at least one sense signal and motion signal associated with the multi-mode area.
20. The method of Claim 13, further comprising at the processor:
generating an image based on current mode of operation; and
presenting the generated image in the multi-mode area of the surface.
PCT/US2011/062723 2010-11-30 2011-11-30 Multiplexed numeric keypad and touchpad WO2012075199A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP11844754.9A EP2646893A2 (en) 2010-11-30 2011-11-30 Multiplexed numeric keypad and touchpad

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US41827910P 2010-11-30 2010-11-30
US61/418,279 2010-11-30
US201161472799P 2011-04-07 2011-04-07
US61/472,799 2011-04-07

Publications (2)

Publication Number Publication Date
WO2012075199A2 (en) 2012-06-07
WO2012075199A3 WO2012075199A3 (en) 2012-09-27

Family

ID=46172548

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2011/062723 WO2012075199A2 (en) 2010-11-30 2011-11-30 Multiplexed numeric keypad and touchpad
PCT/US2011/062721 WO2012075197A2 (en) 2010-11-30 2011-11-30 Dynamically located onscreen keyboard

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/US2011/062721 WO2012075197A2 (en) 2010-11-30 2011-11-30 Dynamically located onscreen keyboard

Country Status (5)

Country Link
EP (2) EP2646893A2 (en)
JP (2) JP5782133B2 (en)
KR (1) KR101578769B1 (en)
CN (2) CN103443744B (en)
WO (2) WO2012075199A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106662976A (en) * 2014-07-31 2017-05-10 埃西勒国际通用光学公司 Dynamic calibrating of a touch-screen-implemented virtual braille keyboard

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6248635B2 (en) * 2011-11-08 2017-12-20 ソニー株式会社 Sensor device, analysis device, and storage medium
JP2017084404A (en) * 2012-02-23 2017-05-18 パナソニックIpマネジメント株式会社 Electronic apparatus
US20150261310A1 (en) * 2012-08-01 2015-09-17 Whirlscape, Inc. One-dimensional input system and method
US8816985B1 (en) 2012-09-20 2014-08-26 Cypress Semiconductor Corporation Methods and apparatus to detect a touch pattern
WO2014063642A1 (en) * 2012-10-25 2014-05-01 中国中化股份有限公司 Substituted pyrimidine compound and uses thereof
US9965179B2 (en) 2012-11-27 2018-05-08 Thomson Licensing Adaptive virtual keyboard
US10048861B2 (en) 2012-11-27 2018-08-14 Thomson Licensing Adaptive virtual keyboard
JP6165485B2 (en) * 2013-03-28 2017-07-19 国立大学法人埼玉大学 AR gesture user interface system for mobile terminals
JP5801348B2 (en) 2013-06-10 2015-10-28 レノボ・シンガポール・プライベート・リミテッド Input system, input method, and smartphone
US9483176B2 (en) * 2013-07-08 2016-11-01 Samsung Display Co., Ltd. Method and apparatus to reduce display lag of soft keyboard presses
JP6154690B2 (en) * 2013-07-22 2017-06-28 ローム株式会社 Software keyboard type input device, input method, electronic device
US9335831B2 (en) 2013-10-14 2016-05-10 Adaptable Keys A/S Computer keyboard including a control unit and a keyboard screen
CN103885632B (en) * 2014-02-22 2018-07-06 小米科技有限责任公司 Input method and device
JP6330565B2 (en) * 2014-08-08 2018-05-30 富士通株式会社 Information processing apparatus, information processing method, and information processing program
CN104375647B (en) * 2014-11-25 2017-11-03 杨龙 Exchange method and electronic equipment for electronic equipment
CN105718069B (en) * 2014-12-02 2020-01-31 联想(北京)有限公司 Information processing method and electronic equipment
CN106155502A (en) * 2015-03-25 2016-11-23 联想(北京)有限公司 A kind of information processing method and electronic equipment
JP6153588B2 (en) * 2015-12-21 2017-06-28 レノボ・シンガポール・プライベート・リミテッド Information processing apparatus, sensing layout updating method, and program
KR101682214B1 (en) * 2016-04-27 2016-12-02 김경신 an electric ink keyboard
US10234985B2 (en) * 2017-02-10 2019-03-19 Google Llc Dynamic space bar
CN107704186B (en) * 2017-09-01 2022-01-18 联想(北京)有限公司 Control method and electronic equipment
CN107493365A (en) * 2017-09-13 2017-12-19 深圳传音通讯有限公司 The switching method and switching device of a kind of dial for smart machine
US11159673B2 (en) 2018-03-01 2021-10-26 International Business Machines Corporation Repositioning of a display on a touch screen based on touch screen usage statistics
US10725506B2 (en) * 2018-08-21 2020-07-28 Dell Products, L.P. Context-aware user interface (UI) for multi-form factor information handling systems (IHSs)
CN109582211B (en) * 2018-12-25 2021-08-03 努比亚技术有限公司 Touch area adaptation method and device and computer readable storage medium
JP2020135529A (en) * 2019-02-21 2020-08-31 シャープ株式会社 Touch panel, compound machine, program and control method of touch panel
US11150751B2 (en) * 2019-05-09 2021-10-19 Dell Products, L.P. Dynamically reconfigurable touchpad
EP4004695A4 (en) * 2019-09-18 2022-09-28 Samsung Electronics Co., Ltd. Electronic apparatus and controlling method thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4725694A (en) * 1986-05-13 1988-02-16 American Telephone And Telegraph Company, At&T Bell Laboratories Computer interface device
US20030122784A1 (en) * 2001-12-27 2003-07-03 Mark Shkolnikov Active keyboard for handheld electronic gadgets
US20050122313A1 (en) * 2003-11-11 2005-06-09 International Business Machines Corporation Versatile, configurable keyboard
US20070091070A1 (en) * 2005-10-20 2007-04-26 Microsoft Corporation Keyboard with integrated key and touchpad
US20080225006A1 (en) * 2005-10-11 2008-09-18 Abderrahim Ennadi Universal Touch Screen Keyboard
KR20100029026A (en) * 2008-09-05 2010-03-15 미테이크 인포메이션 코퍼레이션 On-screen virtual keyboard system

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3260240B2 (en) * 1994-05-31 2002-02-25 株式会社ワコム Information input method and device
US6278441B1 (en) * 1997-01-09 2001-08-21 Virtouch, Ltd. Tactile interface system for electronic data display system
KR100595925B1 (en) * 1998-01-26 2006-07-05 웨인 웨스터만 Method and apparatus for integrating manual input
US7768501B1 (en) * 1998-05-01 2010-08-03 International Business Machines Corporation Method and system for touch screen keyboard and display space sharing
US6525717B1 (en) * 1999-12-17 2003-02-25 International Business Machines Corporation Input device that analyzes acoustical signatures
CA2462058A1 (en) * 2001-09-21 2003-04-03 International Business Machines Corporation Input apparatus, computer apparatus, method for identifying input object, method for identifying input object in keyboard, and computer program
CN1280700C (en) * 2002-07-04 2006-10-18 皇家飞利浦电子股份有限公司 Automatically adaptable virtual keyboard
JP2004341813A (en) * 2003-05-15 2004-12-02 Casio Comput Co Ltd Display control method for input device and input device
KR100537280B1 (en) * 2003-10-29 2005-12-16 삼성전자주식회사 Apparatus and method for inputting character using touch screen in portable terminal
US20050190970A1 (en) * 2004-02-27 2005-09-01 Research In Motion Limited Text input system for a mobile electronic device and methods thereof
US20060066590A1 (en) * 2004-09-29 2006-03-30 Masanori Ozawa Input device
JP2006127488A (en) * 2004-09-29 2006-05-18 Toshiba Corp Input device, computer device, information processing method, and information processing program
JP4417224B2 (en) * 2004-10-25 2010-02-17 本田技研工業株式会社 Fuel cell stack
US9019209B2 (en) * 2005-06-08 2015-04-28 3M Innovative Properties Company Touch location determination involving multiple touch location processes
WO2009039365A2 (en) * 2007-09-19 2009-03-26 Madentec Limited Cleanable touch and tap-sensitive surface
KR101352994B1 (en) * 2007-12-10 2014-01-21 삼성전자 주식회사 Apparatus and method for providing an adaptive on-screen keyboard
KR101456490B1 (en) * 2008-03-24 2014-11-03 삼성전자주식회사 Touch screen keyboard display method and apparatus thereof
US8633901B2 (en) * 2009-01-30 2014-01-21 Blackberry Limited Handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device
CN101937313B (en) * 2010-09-13 2019-11-12 中兴通讯股份有限公司 A kind of method and device of touch keyboard dynamic generation and input


Also Published As

Publication number Publication date
KR101578769B1 (en) 2015-12-21
JP2014514785A (en) 2014-06-19
EP2646893A2 (en) 2013-10-09
CN103443744B (en) 2016-06-08
CN106201324B (en) 2019-12-13
WO2012075197A3 (en) 2012-10-04
KR20140116785A (en) 2014-10-06
JP5782133B2 (en) 2015-09-24
JP6208718B2 (en) 2017-10-04
CN103443744A (en) 2013-12-11
WO2012075199A3 (en) 2012-09-27
JP2015232889A (en) 2015-12-24
WO2012075197A2 (en) 2012-06-07
EP2646894A2 (en) 2013-10-09
CN106201324A (en) 2016-12-07

Similar Documents

Publication Publication Date Title
EP2646893A2 (en) Multiplexed numeric keypad and touchpad
US20120075193A1 (en) Multiplexed numeric keypad and touchpad
US10126942B2 (en) Systems and methods for detecting a press on a touch-sensitive surface
US20210132796A1 (en) Systems and Methods for Adaptively Presenting a Keyboard on a Touch-Sensitive Display
US20090153495A1 (en) Input method for use in an electronic device having a touch-sensitive screen
JP5721323B2 (en) Touch panel with tactilely generated reference keys
CN102224483B (en) Touch-sensitive display screen with absolute and relative input modes
US20100259482A1 (en) Keyboard gesturing
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
US20090077493A1 (en) Method for the Selection of Functions with the Aid of a User Interface, and User Interface
JP5755219B2 (en) Mobile terminal with touch panel function and input method thereof
US20090066653A1 (en) Systems and methods for using a keyboard as a touch panel
JP2011221640A (en) Information processor, information processing method and program
US20110063222A1 (en) Method and apparatus for switching of kvm switch ports using gestures on a touch panel
JP5556398B2 (en) Information processing apparatus, information processing method, and program
US20100220067A1 (en) Portable electronic device with a menu selection interface and method for operating the menu selection interface
JP6162299B1 (en) Information processing apparatus, input switching method, and program
EP2615534A1 (en) Electronic device and method of controlling the same
CN101470575B (en) Electronic device and its input method
US20130234997A1 (en) Input processing apparatus, input processing program, and input processing method
CN116601586A (en) Virtual keyboard processing method and related equipment
WO2010084973A1 (en) Input device, information processing device, input method, and program
EP2557491A2 (en) Hand-held devices and methods of inputting data
KR20090093250A (en) Method of transparent virtual mouse on touch type virtual keyboard
CN114690887B (en) Feedback method and related equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11844754

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2011844754

Country of ref document: EP