GB2534386A - Smart wearable input apparatus - Google Patents


Info

Publication number
GB2534386A
GB2534386A (application GB1501018.4A / GB201501018A)
Authority
GB
United Kingdom
Prior art keywords
text input
input system
virtual
hand
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1501018.4A
Other versions
GB201501018D0 (en)
Inventor
Kong Liang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to GB1501018.4A priority Critical patent/GB2534386A/en
Publication of GB201501018D0 publication Critical patent/GB201501018D0/en
Priority to CN201510988545.2A priority patent/CN106445094A/en
Priority to PCT/CN2016/070067 priority patent/WO2016115976A1/en
Publication of GB2534386A publication Critical patent/GB2534386A/en
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014: Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0236: Character input methods using selection techniques to select from displayed items
    • G06F3/0426: Digitisers using a single imaging device (e.g. a video camera) for tracking fingers with respect to a virtual keyboard projected or printed on a surface
    • G06F3/04845: Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0485: Scrolling or panning
    • G06F3/04855: Interaction with scrollbars
    • G06F3/04886: Interaction using a touch-screen or digitiser by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
    • G06F2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Input From Keyboards Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A text and pointer input system for a computing device comprises: a wearable sensor device (e.g. a glove or bracelet band) 12a for measuring movement of a hand 100a, the sensor measuring both individual finger movement and whole-hand movement with respect to its surroundings, and transmitting the movement data to a controller; a display screen arranged to receive display information from the controller; and a controller which displays a virtual hand 18a that moves in response to sensor input, identifies key presses from finger movement when the virtual hand is in a position corresponding to a key on a virtual keyboard 20, and emulates a mouse or other pointer input when the virtual hands are not in a position corresponding with the virtual keyboard. The ratio of distances moved by the sensors/hands may be adjustable, and gestures may be detected by the system. The size/position of the virtual keyboard may be adjustable, and the size of the virtual hands may be adjusted automatically with the size of the virtual keyboard. The system may include force feedback (e.g. vibrators) to provide feedback when a key is pressed. The virtual hand may be used to scroll items, select items from a list, or select a portion of text.

Description

SMART WEARABLE INPUT APPARATUS
The present invention relates to input apparatus for controlling a computing device, and particularly to wearable input apparatus which allows text input.
BACKGROUND TO THE INVENTION
The most common means of providing text input to a computer system is via a standard keyboard. The keyboard allows for high-speed text input, but does have a number of drawbacks. The use of keyboards is associated with the development of repetitive strain injury and other pains in the palms, wrists, hands, shoulders, neck, and back.
Keyboards are not especially portable, and therefore are not generally used with tablets or mobile telephones. On these devices, the most common mode of text input is a 'virtual' keyboard displayed on a touch-sensitive screen. Text may be input by 'typing' on the touch-screen keyboard. However, this in itself has a number of problems. For example, it can be difficult to find a position for the tablet in which the user can easily look at the screen and also easily type on it. In the case of a mobile phone, the screen will almost certainly be too small to allow anything other than 'two finger' typing, and therefore a high typing speed cannot be achieved.
Motion tracking and gesture recognition devices and systems are also known, for example the Xbox (RTM) Kinect games system. However, these systems are generally not suitable for high-speed text input. Text input is possible in essentially the same way as an on-screen keyboard is used on a touch-screen device, but the typing speed is generally slow.
It is an object of the invention to reduce or substantially obviate the above-mentioned problems.
STATEMENT OF INVENTION
According to the present invention, there is provided a text input system for a computer, the system comprising: a wearable sensor device for measuring movement of at least one hand of a user, the sensor being capable of measuring movement of at least one individual finger with respect to the hand as well as movement of the hand as a whole with respect to its surroundings, and the sensor device being arranged to transmit movement data to a controller; a display screen arranged to receive display information from the controller; and a controller adapted to display an image of a virtual keyboard on the display screen, and to display an image of a virtual hand on the display screen, the virtual hand moving on the display screen in response to input from the sensor device, the controller being further adapted to identify a key press when the at least one finger is moved when the corresponding finger on the virtual hand is in a position corresponding with a key on the virtual keyboard, and to pass the key press as text input to the computer.
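The key-press rule above can be sketched in a few lines of code. This is a minimal illustrative sketch, not the patented implementation: all names (`Key`, `detect_key_press`) and the rectangular key geometry are assumptions introduced for clarity.

```python
# Illustrative sketch of the controller's key-press rule: a press is registered
# only when the sensor reports a finger movement AND the corresponding virtual
# fingertip lies over a key of the virtual keyboard. Names are hypothetical.
from dataclasses import dataclass

@dataclass
class Key:
    char: str
    x: float        # left edge of the key on the virtual keyboard
    y: float        # top edge
    w: float = 40.0
    h: float = 40.0

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def detect_key_press(keys, fingertip_pos, finger_moved):
    """Return the pressed character, or None if no key press should be emitted."""
    if not finger_moved:
        return None
    px, py = fingertip_pos
    for key in keys:
        if key.contains(px, py):
            return key.char
    return None  # fingertip outside the keyboard: handled as pointer input instead

keys = [Key('q', 0, 0), Key('w', 40, 0), Key('e', 80, 0)]
print(detect_key_press(keys, (50, 10), True))   # fingertip over the 'w' key
print(detect_key_press(keys, (200, 10), True))  # fingertip outside the keyboard
```

The same predicate (fingertip over the keyboard or not) is what lets the controller later switch between text-input and pointer-input modes.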
The text input system is advantageous, because it allows high-speed text input to a computer by movement of the hands in exactly the same way as text input is achieved with a standard keyboard. Because a standard keyboard is likely to be familiar to the user, relatively little learning will be required to use the device. At the same time, the user is not constrained to have his hands in a certain position, as with a regular physical keyboard.
Since the device is wearable, the user can input text to a computer with his hands in more or less any position, for example on the arms of a chair or by his sides while lying in bed. This may be a particular advantage to users having certain physical disabilities, who may not be able to position their hands correctly to use a standard keyboard.
Preferably, the controller is implemented as software running on the computer to which text input is being provided.
The text input system is suitable for use with desktop or laptop computers, and also with more portable devices such as tablets or smartphones. In the latter case, the text input system of the invention allows for high-speed text input which is simply not possible using the touch screen keyboard which is typically provided with these devices.
Preferably, two wearable sensor devices are provided, so that the user can use one sensor device to measure movement in each of his two hands. Correspondingly, two virtual hands may be displayed on the screen and keys on the virtual keyboard may be pressed by a finger on either hand. In this case, the position of each virtual hand with respect to the real hand may be calibrated independently, allowing the physical hands to be separated by some distance, for example at either side of the user's body, whilst the virtual hands are displayed relatively close together, both over the virtual keyboard.
The ratio of the distance moved by the or each real hand to the distance moved by the or each virtual hand may be set by the user. This allows the perceived 'responsiveness' of the virtual hands to be set to a level which is found to be most natural for the user.
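The adjustable hand-to-cursor ratio amounts to a single scaling factor applied to each measured hand movement. A minimal sketch, with assumed names and units:

```python
# Hypothetical sketch of the user-adjustable responsiveness ratio described
# above: virtual-hand pixels moved per millimetre of real hand movement.
def virtual_hand_delta(real_delta_mm, ratio):
    """Map a real hand movement (mm) to a virtual hand movement (pixels).

    A higher ratio makes the virtual hands feel more responsive; a lower
    ratio gives finer control."""
    dx, dy = real_delta_mm
    return (dx * ratio, dy * ratio)

print(virtual_hand_delta((10.0, -4.0), 2.5))  # (25.0, -10.0)
```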
Preferably, the or each sensor device may be capable of measuring movement individually in each of the five fingers of a user's hand. With two such sensor devices, the user is able to use all his fingers to achieve a high typing speed, in exactly the same way as with a standard physical keyboard.
The controller may be adapted to provide for pointer input, as well as text input, to the computer. In this case, when any virtual hand is not positioned over the virtual keyboard, that hand can be used to point and 'click' items on the display screen. In this way, the system can replace both keyboard and mouse input, again allowing each hand to remain in a natural position whilst providing for full use of the computer. The controller may also be adapted to recognise certain gestures other than 'typing' on the virtual keyboard. For example, gestures for the up, down, left and right keys may be provided to make those functions accessible without the need to move the virtual hands over the corresponding keys on the virtual keyboard. The combination of typing on the virtual keyboard, pointing and clicking with the virtual hand, and gesture control to access common functions makes it possible to operate a computer very quickly, as compared with standard known input devices.
The size of the virtual keyboard on the display screen may be adjustable by the user. In particular, the size of the keyboard may be adjusted when a 'pinching' motion between two fingers is measured by the sensor device when the virtual hand is positioned near an edge of the virtual keyboard. In this way, the user can pinch or grab one or more edges of the keyboard, and move the edge(s) to adjust the size of the keyboard.
The size of the virtual hand(s) on the display screen may also be adjustable by the user. In some embodiments, the size of the virtual hand(s) may be adjusted automatically to match the size of the virtual keyboard when the size of the virtual keyboard is adjusted.
Different visual, audio, or any other feedback effects may be provided when a key on the virtual keyboard is pressed by the virtual hand. In some embodiments, the wearable sensor device may incorporate force feedback components, so that the user can 'feel' when he has pressed a key.
The text input system of the invention provides for high-speed text input on a computer with any size of screen, whilst allowing the hands of the user to remain in substantially any position. In addition, the system is highly flexible when compared to a standard keyboard. As described above, the size of the virtual keyboard may be adjusted. In addition, the virtual keyboard may be moved to different points on the display screen, and the transparency, design and colour of the virtual hands and keyboard can be set at the user's discretion. Even the shape of the keys on the virtual keyboard, the spacing between keys, and the arrangement of keys may be changed to suit the individual user. This level of flexibility is impossible with a standard physical keyboard.
DESCRIPTION OF THE DRAWINGS
For a better understanding of the present invention, and to show more clearly how it may be carried into effect, reference will now be made by way of example only to the accompanying drawings, in which:
Figure 1 is a schematic illustration of a text input system according to the invention;
Figure 2a shows the size of the virtual keyboard being adjusted in the text input system of Figure 1;
Figure 2b shows the position of the virtual keyboard being adjusted in the text input system of Figure 1;
Figure 2c shows the virtual keyboard being temporarily hidden or switched off in the text input system of Figure 1;
Figure 3 shows the arrangement of keys on the virtual keyboard being adjusted in the text input system of Figure 1;
Figure 4 shows example gestures which may be used to access common functions in the text input system of Figure 1;
Figures 5a and 5b show an example gesture which may be used to emulate a mouse click in the text input system of Figure 1;
Figure 6 shows how the system of Figure 1 may be used to scroll up and down a document;
Figure 7 shows how the system of Figure 1 may be used to select items on a screen; and
Figure 8 shows how the system of Figure 1 may be used to select text on a screen.
DESCRIPTION OF THE PREFERRED EMBODIMENT(S)
Referring firstly to Figure 1, an embodiment of a text input system according to the invention is indicated generally at 10. The text input system 10 includes a first wearable sensor device 12a and a second wearable sensor device 12b. Each sensor device 12a, 12b is attached to one of the user's wrists 100a, 100b. In this embodiment, the sensor devices are worn like a bracelet, but it is envisaged that other forms of sensor device may be used, for example sensors in the form of gloves.
Each sensor device 12a, 12b tracks the movement of the corresponding hand 100a, 100b. In particular, the overall speed and direction of movement of each hand with respect to its surroundings is measured, as well as the movement of each finger on each hand. The measurements from the sensor devices 12a, 12b are transmitted to a controller 14, which in this embodiment is implemented as software running on the computing device which is being controlled.
The system 10 further includes a display screen 16, which is the display screen which is already used by the computing device for output. The controller 14 is adapted to display items on the display screen 16, superimposed over other items which are being displayed by the computing device. In particular, the controller 14 displays a virtual hand 18a and a virtual hand 18b. The controller moves the virtual hands 18a, 18b around the screen in response to movement of the corresponding real hands 100a, 100b with respect to their surroundings, as measured by the sensor devices 12a, 12b. The controller 14 also moves individual fingers on the virtual hands 18a, 18b in response to movement of the corresponding fingers on the real hands 100a, 100b, the movement of the fingers being measured by the sensor devices 12a, 12b.
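The two kinds of measurement described above (whole-hand movement with respect to the surroundings, and per-finger movement with respect to the hand) might be combined as follows. This is a sketch under assumed names; the patent does not specify a data model.

```python
# Sketch of how a controller might update one virtual hand from a sensor frame:
# the palm follows whole-hand movement, while each fingertip offset follows
# per-finger movement relative to the hand. All names are illustrative.
class VirtualHand:
    def __init__(self, palm=(0.0, 0.0), fingers=5):
        self.palm = palm
        # fingertip positions relative to the palm, one per finger
        self.finger_offsets = [(i * 10.0, -30.0) for i in range(fingers)]

    def apply_sensor_frame(self, hand_delta, finger_deltas):
        """hand_delta: movement of the hand as a whole;
        finger_deltas: movement of each finger with respect to the hand."""
        self.palm = (self.palm[0] + hand_delta[0], self.palm[1] + hand_delta[1])
        self.finger_offsets = [
            (ox + dx, oy + dy)
            for (ox, oy), (dx, dy) in zip(self.finger_offsets, finger_deltas)
        ]

    def fingertip(self, i):
        """Absolute on-screen position of fingertip i."""
        ox, oy = self.finger_offsets[i]
        return (self.palm[0] + ox, self.palm[1] + oy)

hand = VirtualHand()
# whole hand moves by (100, 50); only the fifth finger also moves on its own
hand.apply_sensor_frame((100.0, 50.0), [(0.0, 0.0)] * 4 + [(0.0, 5.0)])
print(hand.fingertip(4))
```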
The controller also displays a virtual keyboard 20 on the display screen 16. By moving his hands 100a, 100b, the user can control the movement of the virtual hands 18a, 18b and therefore use all ten fingers to 'type' on the virtual keyboard 20.
When not typing, the user can also use his hands 100a, 100b to control the virtual hands 18a, 18b outside of the virtual keyboard 20, to access functions other than text input. For example, as shown in Figure 2a, the user can 'pinch' with his thumb and forefinger. The virtual hands 18a, 18b will replicate this motion, and the controller (14) can interpret this to allow access to a re-sizing function. In the Figure, it is the virtual keyboard 20 which is being re-sized, by 'pinching' at the corners and 'stretching' the virtual keyboard 20. However, it will be understood that other on-screen windows or other elements may be re-sized in this way.
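The pinch-to-resize behaviour can be illustrated with a simple rectangle update. This is a sketch under stated assumptions (a single bottom-right resize handle and an arbitrary 20-pixel corner tolerance); the patent allows any edge or corner to be grabbed.

```python
# Hypothetical sketch of pinch-to-resize: when a pinch is detected near the
# keyboard's bottom-right corner, dragging rescales the keyboard rectangle.
def resize_keyboard(rect, pinch_pos, drag_delta, corner_tolerance=20.0):
    """rect = (x, y, w, h). If the pinch starts near the bottom-right corner,
    grow or shrink the keyboard by the drag; otherwise leave it unchanged."""
    x, y, w, h = rect
    corner = (x + w, y + h)
    if abs(pinch_pos[0] - corner[0]) <= corner_tolerance and \
       abs(pinch_pos[1] - corner[1]) <= corner_tolerance:
        return (x, y, max(1.0, w + drag_delta[0]), max(1.0, h + drag_delta[1]))
    return rect

print(resize_keyboard((0, 0, 400, 150), (398, 149), (50, 20)))  # pinch at corner
```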
In Figure 2b, the virtual keyboard 20 is being moved using virtual hand 18b. To move the virtual keyboard (or again, any other movable on-screen item), the user moves his hand 100b so that the virtual hand 18b is at a corner of the item to be moved. He then makes a 'grasping' gesture, i.e. a clenched fist. The controller (14) interprets this to allow access to a moving function. The item is then moved to its new location by moving the hand 100b, before releasing the clenched fist to 'drop' the item in place.
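The grasp-move-release sequence is effectively a small state machine: clench at a corner to grab, move while clenched, open the fist to drop. A sketch with assumed names:

```python
# Illustrative state machine for the grasp / move / release sequence above.
# The class and flag names are assumptions, not patent terminology.
class DragController:
    def __init__(self, item_pos):
        self.item_pos = item_pos
        self.grabbing = False

    def update(self, fist_clenched, near_corner, hand_delta):
        if not fist_clenched:
            self.grabbing = False          # releasing the fist 'drops' the item
            return
        if near_corner:
            self.grabbing = True           # clench at a corner to grab the item
        if self.grabbing:
            dx, dy = hand_delta            # carry the item with the hand
            self.item_pos = (self.item_pos[0] + dx, self.item_pos[1] + dy)

drag = DragController((100, 100))
drag.update(True, True, (0, 0))      # grab at the corner
drag.update(True, False, (30, -10))  # move while clenched
drag.update(False, False, (5, 5))    # release: further movement is ignored
print(drag.item_pos)
```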
As well as moving and resizing the virtual keyboard 20, the virtual keyboard can conveniently be minimized, or temporarily switched off, when text input is not required.
This allows more space on the display screen for other items. As shown in Figure 2c, this may very simply be achieved by providing an 'off' key 22 on the virtual keyboard 20. When the 'off' key is pressed, the keyboard shrinks to leave only one key, an 'on' key 24. The 'on' key can be pressed to restore the full-size keyboard when required.
As shown in Figure 3, the layout of the keys on the virtual keyboard 20 may be adjusted to suit the user's preferences. A simple and common adjustment may be to adjust the spacing between keys or the size of the keys, but full flexibility is available if required. In Figure 3 the user is completely rearranging the keys on the virtual keyboard 20.
As already described with reference to Figures 2a and 2b, the virtual hands 18a, 18b (as controlled by real hands 100a, 100b) may be used to access functions other than text input.
For high-speed control of a computer, it is useful to combine text input with the virtual keyboard as described with 'point and click' input, and other gestures to quickly allow access to common functions. Figure 4 shows an example set of gestures which may be used to provide the functions of the arrow keys (up, down, left and right) without having to move the virtual hands to any particular position on the screen. On the far left, the up function is shown as a movement of the left middle finger away from the palm. In the next illustration, the down function is shown as a movement of the left middle finger towards the palm. The left arrow function is accessed by stretching out the left ring finger, and the right arrow function is accessed by stretching out the left index finger. These gestures are of course only examples, and mapping of gestures to common functions may be fully configurable by the user.
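Because the mapping of gestures to functions is user-configurable, it reduces naturally to a lookup table. The sketch below encodes the example gesture set of Figure 4; the tuple keys are illustrative descriptions, not terminology from the patent.

```python
# The example gesture set of Figure 4 as a remappable lookup table
# (hand, finger, motion) -> function. All labels are illustrative.
DEFAULT_GESTURES = {
    ('left', 'middle', 'away_from_palm'): 'up',
    ('left', 'middle', 'towards_palm'):   'down',
    ('left', 'ring', 'stretch_out'):      'left',
    ('left', 'index', 'stretch_out'):     'right',
}

def gesture_to_function(hand, finger, motion, mapping=DEFAULT_GESTURES):
    """Look up a recognised gesture; the user could remap this table freely."""
    return mapping.get((hand, finger, motion))

print(gesture_to_function('left', 'ring', 'stretch_out'))  # 'left'
```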
Figures 5a and 5b show how the input system may be used in place of a standard mouse. The user's right hand 100b can be moved to control movement of virtual hand 18b around the screen and, for example, a pressing motion with the index finger can be configured to correspond with a left mouse-click, and a pressing motion with the ring finger can be configured to correspond with a right mouse-click.
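The mode switch between typing and pointing can then be expressed in one function: over the virtual keyboard, finger presses become key presses; elsewhere, they become mouse clicks. A sketch with assumed names:

```python
# Hypothetical mapping of finger presses to mouse buttons, per Figures 5a/5b.
FINGER_TO_BUTTON = {'index': 'left_click', 'ring': 'right_click'}

def emulate_pointer(over_keyboard, finger_pressed):
    """Outside the virtual keyboard, a finger press becomes a mouse click;
    over the keyboard it is handled as a key press instead (returns None)."""
    if over_keyboard:
        return None  # text-input mode takes over
    return FINGER_TO_BUTTON.get(finger_pressed)

print(emulate_pointer(False, 'index'))  # pointer mode: left click
print(emulate_pointer(True, 'index'))   # typing mode: no click emitted
```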
Figure 6 shows how the input system may be used for scrolling up and down a document. In one scrolling mode (shown on the far left of the Figure), an outstretched index finger gesture of the hand 100b causes scrolling in a 'hand tracking' mode, where the hand is simply moved to scroll the page up and down. Alternatively, as shown in the centre of the Figure, the virtual hand 18b can be used to operate a scroll bar 50, in the same way as the scroll bar 50 can be operated with a mouse pointer. On the right of the Figure, a third scrolling method is shown, in which the virtual hand 18b is manipulated to scroll just as a real hand would on a touch-sensitive screen.
In Figure 7, the input system 10 is being used to select items on a screen. The virtual hand 18b is used in exactly the same way as a conventional mouse to draw a box 70 around items 52, 54, 56, 58, 60, 62, to select those items. Items 64, 66, 68 are not selected in this example. Virtual hand 18b is moved to the top left corner of the area to be selected, and held there for a predetermined period (for example, two seconds). The virtual hand 18b is then dragged to the bottom right corner of the area to be selected, to create a box 70 and select the items within the box 70.
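The dwell-then-drag selection of Figure 7 can be sketched as follows. The two-second dwell threshold comes from the example in the text; the function and parameter names are assumptions.

```python
# Sketch of dwell-then-drag selection: hold the virtual hand still for a dwell
# period to anchor one corner of the box, then drag to the opposite corner;
# items inside the box are selected. Names and thresholds are illustrative.
def select_items(items, anchor, release, dwell_seconds, dwell_threshold=2.0):
    """items: {name: (x, y)}. Returns the names inside the drag box, or an
    empty set if the hand was not held still long enough to start a box."""
    if dwell_seconds < dwell_threshold:
        return set()
    x0, x1 = sorted((anchor[0], release[0]))
    y0, y1 = sorted((anchor[1], release[1]))
    return {name for name, (x, y) in items.items()
            if x0 <= x <= x1 and y0 <= y <= y1}

items = {'a': (10, 10), 'b': (50, 40), 'c': (200, 200)}
print(select_items(items, (0, 0), (100, 100), dwell_seconds=2.5))
```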
The input system is extremely flexible, and allows text input at a similar speed to a standard keyboard, without associated constraints on posture. In addition, the system can provide high-speed text input in situations where a standard keyboard is not available, such as on a mobile telephone.
In Figure 8, like Figure 7, a selection is being made using the input system 10. In this case, a portion of text 70 (the first two lines in the figure) is being selected.
The embodiments described above are provided by way of example only, and various changes and modifications will be apparent to persons skilled in the art without departing from the scope of the present invention as defined by the appended claims.

Claims (20)

  1. A text and pointer input system for a computing device, the system comprising: a wearable sensor device for measuring movement of at least one hand of a user, the sensor being capable of measuring movement of at least one individual finger with respect to the hand as well as movement of the hand as a whole with respect to its surroundings, and the sensor device being arranged to transmit movement data to a controller; a display screen arranged to receive display information from the controller; and a controller adapted to display an image of a virtual keyboard on the display screen, and to display an image of a virtual hand on the display screen, the virtual hand moving on the display screen in response to input from the sensor device, the controller being further adapted to identify a key press when the at least one finger is moved when the corresponding finger on the virtual hand is in a position corresponding with a key on the virtual keyboard, and to pass the key press as text input to the computer, the controller being further adapted to emulate mouse or other pointer input when the virtual hands are not in a position corresponding with the virtual keyboard.
  2. A text input system as claimed in claim 1, in which two wearable sensor devices are provided for measuring movement of two hands.
  3. A text input system as claimed in claim 2, in which two virtual hands are displayed on the screen, each virtual hand moving in response to input from one of the two sensor devices.
  4. A text input system as claimed in any of the preceding claims, in which the ratio of the moved distance measured by the or each sensor to the distance moved by the or each virtual hand is adjustable.
  5. A text input system as claimed in any of the preceding claims, in which the or each sensor device is capable of measuring movement of each of the five fingers on the user's hand individually.
  6. A text input system as claimed in any of the preceding claims, in which the controller is adapted to recognise gestures other than key-press gestures, and to activate functions dependent on recognised gestures.
  7. A text input system as claimed in any of the preceding claims, in which the size of the virtual keyboard on the display screen is adjustable.
  8. A text input system as claimed in any of the preceding claims, in which the position of the virtual keyboard on the display screen is adjustable.
  9. A text input system as claimed in any of the preceding claims, in which the size of the or each virtual hand is adjustable.
  10. A text input system as claimed in claim 9, when dependent on claim 7, in which the size of the or each virtual hand is automatically adjusted when the size of the virtual keyboard is adjusted.
  11. A text input system as claimed in any of the preceding claims, in which the wearable sensor device(s) include output means for providing feedback when a key is pressed.
  12. A text input system as claimed in claim 11, in which the output means on the wearable sensor device include a force-feedback output device.
  13. A text input system as claimed in claim 12, in which the force-feedback output device includes a vibrator.
  14. A text input system as claimed in any of the preceding claims, in which the sensor device is in the form of a band.
  15. A text input system as claimed in any of the preceding claims, in which the sensor device is in the form of a glove.
  16. A text input system as claimed in any of the preceding claims, in which the virtual hand may be used to scroll a page on the screen.
  17. A text input system as claimed in any of the preceding claims, in which the virtual hand may be used to select from a set of items on the screen.
  18. A text input system as claimed in any of the preceding claims, in which the virtual hand may be used to select a portion of text on the screen.
19. A text input system substantially as described herein, with reference to and as illustrated in Figures 1 to 5 of the accompanying drawings.

AMENDMENTS TO THE CLAIMS HAVE BEEN FILED AS FOLLOWS

CLAIMS

1. A text and pointer input system for a computing device, the system comprising:
a wearable sensor device for measuring movement of at least one hand of a user, the sensor being capable of measuring movement of at least one individual finger with respect to the hand as well as movement of the hand as a whole with respect to its surroundings, and the sensor device being arranged to transmit movement data to a controller;
a display screen arranged to receive display information from the controller; and
a controller adapted to display an image of a virtual keyboard on the display screen, and to display an image of a virtual hand or hands on the display screen, the virtual hand(s) moving on the display screen in response to input from the sensor device,
the controller being further adapted to identify a key press from any position of the user's at least one hand when the at least one finger is moved while the corresponding finger on the virtual hand is in a position corresponding with a key on the virtual keyboard, and to pass the key press as text input to the computer,
the controller being further adapted to emulate mouse or other pointer input whenever the virtual hand is not in a position corresponding with the virtual keyboard, the input type depending on the position of the virtual hand on the display screen.

2. A text input system as claimed in claim 1, in which two wearable sensor devices are provided for measuring movement of two hands.

3. A text input system as claimed in claim 2, in which two virtual hands are displayed on the screen, each virtual hand moving in response to input from one of the two sensor devices.

4. A text input system as claimed in any of the preceding claims, in which the ratio of the moved distance measured by the or each sensor to the distance moved by the or each virtual hand is adjustable.

5. A text input system as claimed in any of the preceding claims, in which the or each sensor device is capable of measuring movement of each of the five fingers on the user's hand individually.

6. A text input system as claimed in any of the preceding claims, in which the controller is adapted to recognise gestures other than key-press gestures, and to activate functions dependent on recognised gestures.

7. A text input system as claimed in any of the preceding claims, in which a combination of keyboard input, pointer input and gestures is supported for operating the computing device.

8. A text input system as claimed in any of the preceding claims, in which the size of the virtual keyboard on the display screen is adjustable.

9. A text input system as claimed in any of the preceding claims, in which the position of the virtual keyboard on the display screen is adjustable.

10. A text input system as claimed in any of the preceding claims, in which the size of the or each virtual hand is adjustable.

11. A text input system as claimed in claim 10, when dependent on claim 8, in which the size of the or each virtual hand is automatically adjusted when the size of the virtual keyboard is adjusted.

12. A text input system as claimed in any of the preceding claims, in which the wearable sensor device(s) include output means for providing feedback when a key is pressed.

13. A text input system as claimed in claim 12, in which the output means on the wearable sensor device include a force-feedback output device.

14. A text input system as claimed in claim 13, in which the force-feedback output device includes a vibrator.

15. A text input system as claimed in any of the preceding claims, in which the sensor device is in the form of a band.

16. A text input system as claimed in any of the preceding claims, in which the sensor device is in the form of a glove.

17. A text input system as claimed in any of the preceding claims, in which the virtual hand may be used to scroll a page on the screen.

18. A text input system as claimed in any of the preceding claims, in which the virtual hand may be used to select from a set of items on the screen.

19. A text input system as claimed in any of the preceding claims, in which the virtual hand may be used to select a portion of text on the screen.

20. A text input system substantially as described herein, with reference to and as illustrated in Figures 1 to 5 of the accompanying drawings.
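The dual-mode behaviour of claim 1 — key presses registered when the virtual hand overlaps the virtual keyboard, pointer emulation anywhere else on the screen — can be illustrated with a short sketch. This is not the patented implementation; all names here (`Rect`, `Key`, `Controller`, `on_finger_press`) are hypothetical, chosen only to make the claimed logic concrete.

```python
# Illustrative sketch of the claim 1 controller: a finger-press gesture over a
# virtual key is passed on as text input; the same gesture elsewhere on the
# screen is treated as emulated pointer input. All names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h


@dataclass
class Key:
    label: str
    area: Rect


@dataclass
class Controller:
    keys: list              # virtual keyboard layout on the display screen
    text: list = field(default_factory=list)   # accumulated text input
    pointer: tuple = (0.0, 0.0)                # emulated pointer position

    def on_finger_press(self, finger_x: float, finger_y: float):
        """Handle a key-press gesture reported by the wearable sensor."""
        for key in self.keys:
            if key.area.contains(finger_x, finger_y):
                # Virtual finger is over a virtual key: pass it on as text.
                self.text.append(key.label)
                return key.label
        # Not over the keyboard: emulate pointer input at this position instead.
        self.pointer = (finger_x, finger_y)
        return None
```

A press at (5, 5) over a key occupying that region would be logged as text, while a press at an empty screen position would only move the emulated pointer.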
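Claim 4's adjustable ratio between the distance measured by the sensor and the distance moved by the virtual hand is, in effect, a pointer-gain setting. A minimal sketch, assuming a hypothetical `gain` parameter not named in the claims:

```python
# Sketch of claim 4: the ratio between physical hand movement (sensor delta)
# and on-screen virtual-hand movement is adjustable. `gain` is a hypothetical
# name for that ratio.
class VirtualHand:
    def __init__(self, gain: float = 1.0):
        self.gain = gain          # screen distance per unit of sensor distance
        self.x, self.y = 0.0, 0.0

    def apply_sensor_delta(self, dx: float, dy: float):
        """Move the on-screen hand by the sensor delta scaled by the gain."""
        self.x += dx * self.gain
        self.y += dy * self.gain
        return self.x, self.y
```

A high gain lets a small physical movement cross the whole virtual keyboard; a low gain trades reach for precision, which is presumably why the claim makes the ratio user-adjustable.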
GB1501018.4A 2015-01-21 2015-01-21 Smart wearable input apparatus Withdrawn GB2534386A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB1501018.4A GB2534386A (en) 2015-01-21 2015-01-21 Smart wearable input apparatus
CN201510988545.2A CN106445094A (en) 2015-01-21 2015-12-22 Smart wearable input apparatus
PCT/CN2016/070067 WO2016115976A1 (en) 2015-01-21 2016-01-04 Smart wearable input apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1501018.4A GB2534386A (en) 2015-01-21 2015-01-21 Smart wearable input apparatus

Publications (2)

Publication Number Publication Date
GB201501018D0 GB201501018D0 (en) 2015-03-04
GB2534386A true GB2534386A (en) 2016-07-27

Family

ID=52630911

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1501018.4A Withdrawn GB2534386A (en) 2015-01-21 2015-01-21 Smart wearable input apparatus

Country Status (2)

Country Link
CN (1) CN106445094A (en)
GB (1) GB2534386A (en)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107562205B (en) * 2017-09-15 2021-08-13 上海展扬通信技术有限公司 Projection keyboard of intelligent terminal and operation method of projection keyboard
CN109683667A (en) * 2018-12-25 2019-04-26 上海萃钛智能科技有限公司 A kind of Wearing-on-head type computer and its data inputting method
CN109782999A (en) * 2019-01-30 2019-05-21 上海摩软通讯技术有限公司 A kind of input method, input equipment and a kind of computer-readable medium
CN109658758A (en) * 2019-02-18 2019-04-19 西安科技大学 A kind of computer accounting teaching simulation System
US11137908B2 (en) * 2019-04-15 2021-10-05 Apple Inc. Keyboard operation with head-mounted device
CN110728828A (en) * 2019-11-11 2020-01-24 中国地质大学(武汉) Office sitting posture correction instrument and use method thereof
CN115176224A (en) * 2020-04-14 2022-10-11 Oppo广东移动通信有限公司 Text input method, mobile device, head-mounted display device, and storage medium
CN113499219A (en) * 2021-07-05 2021-10-15 西安交通大学 Multi-sense organ stimulation hand function rehabilitation system and method based on virtual reality game

Citations (3)

Publication number Priority date Publication date Assignee Title
KR20020072081A (en) * 2001-03-08 2002-09-14 은탁 Virtual input device sensed finger motion and method thereof
US20020130844A1 (en) * 1998-12-31 2002-09-19 Natoli Anthony James Francis Virtual reality keyboard system and method
US20130265300A1 (en) * 2011-07-03 2013-10-10 Neorai Vardi Computer device in form of wearable glasses and user interface thereof

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
KR20020072367A (en) * 2001-03-09 2002-09-14 삼성전자 주식회사 Information input system using bio feedback and method thereof
JP4611667B2 (en) * 2003-11-25 2011-01-12 健爾 西 Information input device, storage device, information input device, and information processing device
CN102063183A (en) * 2011-02-12 2011-05-18 深圳市亿思达显示科技有限公司 Virtual input device of glove type
KR101896947B1 (en) * 2011-02-23 2018-10-31 엘지이노텍 주식회사 An apparatus and method for inputting command using gesture
CN102736726A (en) * 2011-04-11 2012-10-17 曾亚东 Stealth technology for keyboard and mouse


Cited By (5)

Publication number Priority date Publication date Assignee Title
US20170038830A1 (en) * 2015-08-04 2017-02-09 Google Inc. Context sensitive hand collisions in virtual reality
US10635161B2 (en) * 2015-08-04 2020-04-28 Google Llc Context sensitive hand collisions in virtual reality
CN107024984A (en) * 2017-01-12 2017-08-08 瑞声科技(新加坡)有限公司 The feedback response method and terminal of a kind of button
US10782793B2 (en) 2017-08-10 2020-09-22 Google Llc Context-sensitive hand interaction
US11181986B2 (en) 2017-08-10 2021-11-23 Google Llc Context-sensitive hand interaction

Also Published As

Publication number Publication date
GB201501018D0 (en) 2015-03-04
CN106445094A (en) 2017-02-22

Similar Documents

Publication Publication Date Title
GB2534386A (en) Smart wearable input apparatus
US10417880B2 (en) Haptic device incorporating stretch characteristics
US8065624B2 (en) Virtual keypad systems and methods
Harrison et al. Using shear as a supplemental two-dimensional input channel for rich touchscreen interaction
US5917476A (en) Cursor feedback text input method
Hinckley et al. Input/Output Devices and Interaction Techniques.
US20130275907A1 (en) Virtual keyboard
JPH0778120A (en) Hand-held arithmetic unit and processing method of input signal in hand-held arithmetic unit
US10048760B2 (en) Method and apparatus for immersive system interfacing
KR20090096528A (en) Human computer interaction device, electronic device and human computer interaction method
Bachl et al. Challenges for designing the user experience of multi-touch interfaces
US20170316717A1 (en) Semi-wearable Device for Interpretation of Digital Content for the Visually Impaired
Bergström et al. Human--Computer interaction on the skin
US20190034070A1 (en) Flexible & customisable human computer interaction (HCI) device that combines the functionality of traditional keyboard and pointing device (mouse/touchpad) on a laptop & desktop computer
US20160328024A1 (en) Method and apparatus for input to electronic devices
WO2016115976A1 (en) Smart wearable input apparatus
CN112041795A (en) Wearable data input device and operation method
US20200168121A1 (en) Device for Interpretation of Digital Content for the Visually Impaired
Benko et al. Imprecision, inaccuracy, and frustration: The tale of touch input
Blaskó et al. Single-handed interaction techniques for multiple pressure-sensitive strips
CN104484073A (en) Hand touch interaction system
CN106648086A (en) Visible blind-click operation mouse and keyboard glove and operation system
CN104063046A (en) Input Device And Method Of Switching Input Mode Thereof
WO2019073490A1 (en) 3d mouse and ultrafast keyboard
TWI510967B (en) Touch input device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)