CN203241934U - System for identifying hand gestures, user input device and processor - Google Patents


Info

Publication number
CN203241934U
CN203241934U
Authority
CN
China
Prior art keywords
user
optical sensor
user input
mobile data
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 201320273762
Other languages
Chinese (zh)
Inventor
J·雷纳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
STMicroelectronics Research and Development Ltd
Original Assignee
STMicroelectronics Ltd Great Britain
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by STMicroelectronics Ltd Great Britain filed Critical STMicroelectronics Ltd Great Britain
Priority to CN 201320273762 priority Critical patent/CN203241934U/en
Application granted granted Critical
Publication of CN203241934U publication Critical patent/CN203241934U/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The utility model discloses a system for gesture recognition, a user input device and a processor. The system comprises a user input device and a gesture processor. The user input device comprises a plurality of optical sensors, each arranged to detect the velocity of one of one or more user parts relative to that optical sensor, and the user input device is arranged to generate movement data corresponding to the detected velocities of the one or more user parts. The gesture processor is arranged to receive the movement data, match the movement data against one or more predetermined gestures, and generate corresponding control messages associated with the one or more predetermined gestures, wherein the movement data corresponds to motion vectors representing the velocity of the one or more user parts relative to the optical sensors. Because the user input device is provided with a plurality of optical sensors, an improved gesture recognition system can be realized which is cheaper to implement and performs gesture recognition more simply than conventional techniques.

Description

System, user input device and processor for gesture recognition
Technical field
The utility model relates to systems and devices for gesture recognition, and in particular to receiving gesture input from a user.
Background art
The use of pointing devices (for example mice, trackballs, touch pads and so on) to allow a user to control the position of a cursor or the like on a display screen has been known for many years. More recently, however, gesture-based control techniques have been developed which attempt to go beyond simple cursor control by enabling a device to recognize specific "gestures" input by the user. Such gestures have control actions associated with them. For example, a "pinch" gesture may be used to zoom out, a "spread" gesture to zoom in, a "swipe" gesture to scroll, and so on.
Gesture-based control allows the user to interact with computing devices such as smartphones, tablet computers and portable personal computers.
For example, it is well known to provide devices such as smartphones and tablet computers with a touch-sensitive surface overlaid on the display screen. The touch-sensitive surface detects the movement of one or more of the user's fingers on the surface; the device then associates this movement with one or more predetermined gestures and generates corresponding control messages for controlling the device. For example, when viewing an image on the display screen of such a device, if the user places two fingers on the touch-covered display screen and then moves the two fingers apart, this movement is recognized as a predetermined "zoom in" gesture and the image on the display screen is enlarged accordingly.
Similarly, most portable personal computers (such as laptops, notebooks, netbooks and so on) are equipped with a touch-sensitive pad, typically located below the keyboard, which allows the user to control a cursor on the display screen. In some examples, such portable personal computers are also arranged to recognize gestures that the user inputs on the touch pad.
Enabling a computing device to recognize and respond to gesture-based control is clearly advantageous, since it gives the user more control over the device. However, integrating conventional gesture recognition hardware into a computing device can be complicated and expensive. Fitting a touch-sensitive surface to a device increases the cost of the device and requires additional hardware and software to convert the user's finger touches into meaningful gesture control. Thus, although gesture-based control enhances the ways in which a user can control a device, providing a computing device with hardware capable of recognizing gesture input is expensive and complicated.
Summary of the utility model
Conventional gesture control techniques generate gesture control messages by monitoring the position of user contact points (i.e. user parts, such as the user's fingers) on a two-dimensional surface (for example a touch pad, touch-sensitive screen and so on) over time, and attempting to recognize the user's gesture from this. The processing required to generate gesture control messages using such techniques is complicated. The positions of one or more distinct contact points must be tracked accurately in two-dimensional space, and processing must be provided to reduce false positives (i.e. the detection of a gesture when the user has not in fact performed the corresponding gesture). This is particularly difficult in "multi-touch" implementations in which the user inputs gestures with two or more contact points.
Moreover, the touch-sensitive surfaces required to implement conventional gesture recognition techniques (such as capacitive touch screens and touch pads) are expensive and consume considerable device power during operation, and are therefore unsuitable for many applications that would otherwise benefit from receiving gesture control input.
To address the above problems, according to a first aspect of the utility model there is provided a system for gesture recognition, comprising:
a user input device comprising a plurality of optical sensors, each of the optical sensors being arranged to detect the velocity of one of one or more user parts relative to that optical sensor, the user input device being arranged to generate movement data corresponding to the detected velocities of the one or more user parts, and
a gesture processor arranged to receive the movement data, match the movement data against one or more predetermined gestures, and generate corresponding control messages associated with the one or more predetermined gestures, wherein
the movement data corresponds to motion vectors representing the velocity of the one or more user parts relative to the optical sensors.
In accordance with the utility model, it has been recognized that by providing a user input device with two or more optical sensors, an improved gesture recognition system can be realized which is cheaper and simpler to implement than gesture recognition using conventional techniques. Whereas conventional techniques rely on "position over time" monitoring, it has been recognized that by providing a number of suitably arranged optical sensors, velocity information relating to the speed of a user part relative to the optical sensors can be captured, from which gesture control messages can easily be derived. There is thus no need to monitor the actual position of a user part within a two-dimensional region over time; only the velocity of the user part relative to the optical sensors need be monitored.
The reduction in complexity afforded by capturing only velocity information means that most of the gesture recognition processing that would otherwise be performed on the central processing unit of a computing device can be carried out on the user input device itself, if desired even at the optical sensors. Furthermore, the type of optical sensor required to detect the relative velocity of a user part is cheaper than the corresponding position-monitoring hardware (for example capacitive touch screens, touch pads and so on).
By expressing the movement data as motion vectors, accurate information about the velocity of the user parts relative to the optical sensors can be provided in a form that is easy to transfer to other parts of the system and easy to process. In some embodiments, the movement data corresponds to direction quadrants, each direction quadrant being the one of a plurality of direction quadrants into which a motion vector falls. A motion vector typically comprises a value representing magnitude (or a normalized unit magnitude) and a direction value. According to these embodiments, the motion vector is simplified by representing its direction component as one of the plurality of direction quadrants. This reduces the amount of information needed to represent the movement data while retaining enough information to allow meaningful gesture information to be derived. In some embodiments, the direction quadrants comprise four direction quadrants corresponding to up, down, left and right. The movement data can then be represented by a further reduced amount of information, for example two bits (e.g. 00 = up, 01 = down, 10 = right, 11 = left).
In some embodiments, the gesture processor receives the movement data only when the motion vector has a magnitude greater than a threshold magnitude. Accordingly, a threshold speed must be detected before movement data is generated. This reduces the likelihood that small or very slow user movements are incorrectly interpreted as gestures (i.e. false positives), and reduces the impact of noise, particularly in systems using low-cost optical sensors.
In some embodiments, the gesture processor is incorporated in the user input device. In such embodiments, gesture recognition is performed on the user input device itself, reducing the amount of processing required at the computing device to which the user input device may be attached.
In some embodiments, each of the plurality of optical sensors is arranged to capture a series of images of one of the one or more user parts and to detect the velocity of that user part by comparing differences between the images of the series. Such optical sensors are widely available because of their use in other technical fields, for example as motion detectors in mass-produced devices such as optical mice. They typically cost less than the touch-sensitive surfaces used conventionally, further reducing the cost of implementing a user input device in accordance with examples of the utility model. In such embodiments, the one or more optical sensors comprise a photodetector coupled to a motion processor, the motion processor being arranged to receive signals from the photodetector to generate the series of images.
The reduced cost and complexity of user input devices according to exemplary arrangements of the utility model make it possible to implement gesture recognition functionality in low-cost peripheral devices. For example, in some embodiments the user input device is a keyboard. In some embodiments, the plurality of optical sensors are positioned substantially between the keys of the keyboard. In other embodiments, the plurality of optical sensors are positioned such that they replace one or more keys of the keyboard.
In some embodiments, the user input device comprises a further optical sensor for providing cursor control.
In some embodiments, the system further comprises a computing device coupled to the user input device, the computing device being arranged to control a graphics display unit in accordance with the control messages. The user input device described above is suitable for providing user input data for generating gesture control messages for any suitable application, but is particularly useful for controlling the graphical display of a display screen (such as a computing device display unit, a television set and so on).
In some embodiments, the one or more user parts are one or more of the user's fingers.
According to a second aspect of the utility model, there is provided a user input device comprising a plurality of optical sensors, each optical sensor being arranged to detect the velocity of one of one or more user parts relative to that optical sensor, the user input device being arranged to generate movement data corresponding to the detected velocities of the one or more user parts, wherein
the movement data is suitable for matching against one or more predetermined gestures so that corresponding control messages associated with the one or more predetermined gestures can be generated, and wherein
the movement data corresponds to motion vectors representing the velocity of the one or more user parts relative to the optical sensors.
In some embodiments, the one or more user parts are one or more of the user's fingers.
According to a third aspect of the utility model, there is provided a processor for realizing gesture recognition, the processor being arranged to detect, based on data output from one or more optical sensors, the velocity of one or more user parts relative to the optical sensors, and to generate movement data corresponding to the detected velocities of the one or more user parts, wherein
the movement data is suitable for matching against one or more predetermined gestures so that corresponding control messages associated with the one or more predetermined gestures can be generated, wherein
the movement data corresponds to motion vectors representing the velocity of the one or more user parts relative to the optical sensors.
In some embodiments, the one or more user parts are one or more of the user's fingers.
Various further aspects and features of the utility model are defined in the appended claims.
Brief description of the drawings
Embodiments of the utility model will now be described, by way of example only, with reference to the accompanying drawings, in which like parts are provided with corresponding reference numerals:
Fig. 1 provides a schematic diagram of an optical motion sensor;
Fig. 2 provides a schematic diagram of a system according to an exemplary arrangement of the utility model;
Fig. 3a provides a schematic diagram illustrating typical output of an optical sensor;
Fig. 3b provides a schematic diagram illustrating a motion vector corresponding to the optical sensor output shown in Fig. 3a;
Fig. 4a illustrates a motion vector simplification function according to an example of the utility model;
Fig. 4b illustrates a motion vector threshold function according to an example of the utility model;
Fig. 4c illustrates the combination, according to an example of the utility model, of the motion vector simplification function shown in Fig. 4a with the motion vector threshold function shown in Fig. 4b;
Fig. 5a to Fig. 5c provide schematic diagrams of example implementations of user input devices according to examples of the utility model; and
Fig. 6 provides a schematic diagram of a system according to an exemplary arrangement of the utility model.
Detailed description of embodiments
Fig. 1 provides a schematic diagram illustrating a conventional optical motion sensor 101. The optical motion sensor comprises an illuminating light source 102 (such as a light emitting diode (LED)) and a photodetector 103 coupled to a motion processor 104. The optical motion sensor 101 is arranged to track the movement of a surface 105 relative to the optical motion sensor 101. This is accomplished by the photodetector 103 capturing image data corresponding to a region 106 beneath the optical motion sensor 101 that is illuminated by the light source 102. As will be understood, although not shown in Fig. 1, the optical sensor typically also comprises optical elements that direct light from the light source 102 onto the imaged region 106 and that focus light reflected from the imaged region 106 onto the photodetector 103. The motion processor 104 continuously receives the captured image data from the photodetector 103 and forms a series of images of the region 106. These images are compared to determine the relative movement of the optical motion sensor 101 across the surface 105. Typically, before comparison, the raw captured images are processed to enhance image features (such as edges) so as to emphasize the differences between one image and the next. The movement data corresponding to the relative movement determined by the motion processor 104 is then typically output as a series of X and Y coordinate movement values. The X and Y coordinate movement values output from the processor 104 are sometimes referred to as "X counts" and "Y counts", because they correspond to the number of units of movement detected in the X plane and the number of units of movement detected in the Y plane during a given time period.
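By way of illustration, the image-comparison step described above can be sketched in C as follows. This is a minimal sketch only: the frame size, search range and function name are assumptions made for illustration and do not appear in the original disclosure, and real optical motion sensors implement this matching in dedicated hardware rather than software.

#include <stdint.h>
#include <stdlib.h>
#include <limits.h>

#define N 16   /* assumed frame width/height in pixels */
#define R 4    /* assumed maximum shift searched, in pixels */

/* Estimate the (dx, dy) shift between two successive frames by minimizing
   the sum of absolute differences (SAD) over a small search window. */
void estimate_shift(const uint8_t prev[N][N], const uint8_t cur[N][N],
                    int *best_dx, int *best_dy)
{
    long best = LONG_MAX;
    *best_dx = 0;
    *best_dy = 0;
    for (int dy = -R; dy <= R; dy++) {
        for (int dx = -R; dx <= R; dx++) {
            long sad = 0;
            /* compare only the region where the shifted frames overlap */
            for (int y = R; y < N - R; y++)
                for (int x = R; x < N - R; x++)
                    sad += labs((long)cur[y][x] - (long)prev[y + dy][x + dx]);
            if (sad < best) {
                best = sad;
                *best_dx = dx;
                *best_dy = dy;
            }
        }
    }
}

The shifts accumulated in this way correspond to the X counts and Y counts described above.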
Typically, when movement is detected by the motion sensor 101, a "motion" signal is asserted by the motion processor 104. The "motion" signal is sent to an external processor (not shown) to indicate that the optical motion sensor has detected movement. On receiving the "motion" signal, the external processor then reads from the motion processor 104 the X count and Y count values corresponding to the movement since the last motion data read.
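The read-out flow just described might be sketched as follows. The register addresses and the i2c_read_reg() helper are hypothetical placeholders for illustration; they are not taken from the patent or from any particular sensor's register map.

#include <stdint.h>

#define REG_MOTION  0x02   /* hypothetical: motion-occurred flag */
#define REG_DELTA_X 0x03   /* hypothetical: signed X count since last read */
#define REG_DELTA_Y 0x04   /* hypothetical: signed Y count since last read */

extern uint8_t i2c_read_reg(uint8_t dev, uint8_t reg); /* platform I2C read */

/* Returns 1 and fills dx/dy if the sensor reported motion, else 0. */
int read_motion_counts(uint8_t dev, int8_t *dx, int8_t *dy)
{
    if (!(i2c_read_reg(dev, REG_MOTION) & 0x80))
        return 0;                       /* no movement since last read */
    *dx = (int8_t)i2c_read_reg(dev, REG_DELTA_X);
    *dy = (int8_t)i2c_read_reg(dev, REG_DELTA_Y);
    return 1;
}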
A well-known application of optical motion sensors of the type shown in Fig. 1 is to provide movement tracking in optical mice.
Fig. 2 provides a schematic diagram of a system 201 according to an exemplary arrangement of the utility model. The system is arranged to detect the velocity of one or more user parts relative to optical sensors and to convert these velocities into gesture-based control messages. In the following, the user parts are described mainly in terms of the user's fingers, i.e. the digits of the user's hand (such as the thumb, index finger, middle finger, ring finger or little finger of the left or right hand). However, it will be appreciated that any suitable user part whose velocity can be detected using an optical sensor may be used, such as the palm, wrist, forearm and so on. Similarly, it will be appreciated that the terms "finger movement", "finger movement data" and "finger velocity data" used hereafter may refer respectively to the movement, velocity and velocity data of any suitable user part.
The system comprises a user input device 202 and a computing device 203. The computing device may be any type of computing device, such as a personal computer, a games console or an equivalent device.
The user input device 202 comprises a first optical sensor 204 and a second optical sensor 205. In some examples, the first optical sensor 204 and the second optical sensor 205 correspond at least in part to the optical motion sensor 101 shown in Fig. 1 and comprise an illuminating light source, a photodetector and a motion processor. However, it will be appreciated that in other examples any suitable optical sensor capable of detecting the velocity of a user part (such as a user's finger) relative to the sensor may be used.
The first optical sensor 204 and the second optical sensor 205 are typically connected via a data bus 214 to ensure timing synchronization and so on. The user input device 202 also comprises an input/output (I/O) interface unit 206 coupled to the first optical sensor 204 and the second optical sensor 205. The computing device 203 comprises a graphics display unit 213 controlled by a graphics display processor 212.
In operation, each of the first optical sensor 204 and the second optical sensor 205 is arranged to detect the velocity, over that optical sensor, of one of one or more user parts (such as the user's fingers 207, 208). The manner in which the user's finger velocity is detected corresponds to the manner in which the optical motion sensor shown in Fig. 1 determines the movement of the surface 105 relative to the optical motion sensor 101. In other words, a given sensor captures a series of images of the user's finger. These images are then compared to determine the relative movement of the finger with respect to the optical sensor over a given time period (typically the period between read signals).
Each of the optical sensors 204, 205 is arranged to output finger movement data corresponding to the velocity of the user's finger relative to that optical sensor. More detail on the finger movement data is provided below. The finger movement data is read from each of the optical sensors 204, 205 by the I/O interface unit 206.
In some examples, the I/O interface unit 206 reads the finger movement data from the optical sensors at regular intervals, for example polling the optical sensors for finger movement data after a predetermined time period has elapsed. In this way, the I/O interface unit 206 receives finger movement data at a regular rate. In other examples, for instance where power consumption is an important factor, each optical sensor remains in a sleep mode if no finger movement is detected. If movement is detected, the optical sensor sends an interrupt signal to the I/O interface unit 206, and only then does the I/O interface unit 206 read the finger movement data from the optical sensor.
After reading the finger movement data, the I/O interface unit 206 performs any further processing necessary to interpret the finger movement data, and then converts the finger movement data from the optical sensors 204, 205 into a format suitable for transmission between the user input device 202 and the computing device 203. The finger movement data is then transferred from the user input device 202 to the computing device 203 via a connection 209.
The finger movement data output from the user input device 202 is received at the computing device 203 by an I/O interface unit 210, which converts it into a suitable format and then sends it to a gesture processor 211. In some examples, the gesture processor is the central processing unit of the computing device, provided with suitable driver and application programming.
The gesture processor 211 is arranged to associate the finger movement data with one or more of a number of predetermined gestures and to output control signals corresponding to the predetermined gesture. The control signals are input to the graphics display processor 212, which converts them into display control output for controlling the graphics display unit 213.
For example, the user may place two fingers 207, 208 on the user input device 202 (one finger on each optical sensor) and move the fingers 207, 208 towards each other; in other words, from the perspective of the system shown in Fig. 2, the first finger 207 moves to the right and the second finger 208 moves to the left. The velocities of the user's fingers are detected by the optical sensors 204, 205 as described above, and corresponding finger movement data is generated by each optical sensor 204, 205 and sent to the user input device I/O interface unit 206. This finger movement data is processed, converted into a suitable transmission format, sent to the computing device 203 via the connection 209, and received at the computing device I/O interface unit 210. The received finger movement data is sent to the gesture processor. The gesture processor processes the finger movement data, interprets it as a "pinch" gesture, and determines that this is associated with a graphical "zoom out" command. The gesture processor 211 outputs a corresponding zoom-out control signal to the graphics display processor 212, which performs the zoom-out operation, for example by reducing the size of a graphical object displayed on the graphics display unit 213.
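The gesture-matching step in this example can be sketched as a simple lookup over the simplified per-sensor directions. The enumerations and the particular gesture set below are illustrative assumptions, not the patent's exact encoding:

typedef enum { DIR_NONE, DIR_UP, DIR_DOWN, DIR_LEFT, DIR_RIGHT } dir_t;

typedef enum {
    GESTURE_NONE,
    GESTURE_PINCH,        /* e.g. mapped to "zoom out" */
    GESTURE_SPREAD,       /* e.g. mapped to "zoom in"  */
    GESTURE_SWIPE_LEFT,
    GESTURE_SWIPE_RIGHT
} gesture_t;

/* left = direction reported by the left sensor, right = by the right sensor */
gesture_t match_gesture(dir_t left, dir_t right)
{
    if (left == DIR_RIGHT && right == DIR_LEFT)  return GESTURE_PINCH;   /* fingers converge */
    if (left == DIR_LEFT  && right == DIR_RIGHT) return GESTURE_SPREAD;  /* fingers diverge  */
    if (left == DIR_LEFT  && right == DIR_LEFT)  return GESTURE_SWIPE_LEFT;
    if (left == DIR_RIGHT && right == DIR_RIGHT) return GESTURE_SWIPE_RIGHT;
    return GESTURE_NONE;
}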
Finger movement data
As described above, the user input device 202 outputs finger movement data based on the velocities of the user's fingers as detected by the optical sensors. The finger movement data may be any suitable data representing the velocity of the user's fingers relative to the optical sensors. In some examples, the finger movement data takes the form of motion vectors. This is explained in more detail below.
Fig. 3a provides a schematic diagram illustrating the typical output of an optical sensor such as the optical motion sensor 101 shown in Fig. 1.
Each time the optical sensor is read, the external processor receives the X count and Y count detected since the last read (that is, the units of movement detected in the X direction and the units of movement detected in the Y direction). An example plot of this information is shown in Fig. 3a. As can be seen from Fig. 3a, the X count and Y count information generated by the optical sensor corresponds to the distance travelled in the X and Y directions over a given time period (for example since the optical sensor was last read). The X count and Y count data can be converted into a single "motion vector", i.e. a vector whose direction corresponds to the direction of movement of the user's finger relative to the optical sensor and whose magnitude corresponds to the speed of the user's finger relative to the optical sensor.
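One way to form such a motion vector from the raw counts is sketched below, assuming the counts and the elapsed time between reads are available: the magnitude is the Euclidean norm of the counts divided by the elapsed time (in sensor counts per second), and the direction comes from atan2. The axis convention in the comments is an assumption:

#include <math.h>

typedef struct {
    double magnitude;      /* counts per second */
    double direction_rad;  /* 0 = +X, pi/2 = +Y (assumed convention) */
} motion_vec_t;

motion_vec_t counts_to_vector(int dx, int dy, double dt_seconds)
{
    motion_vec_t v;
    v.magnitude     = sqrt((double)dx * dx + (double)dy * dy) / dt_seconds;
    v.direction_rad = atan2((double)dy, (double)dx);
    return v;
}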
As described above, in some examples of the utility model the optical sensors are polled at regular intervals, so the time period between X count and Y count reads is known from the polling frequency. In other examples, for instance where the optical sensor sends an interrupt signal when it detects motion, the time between X count and Y count reads can be determined using other timing information, for example by reference to a system clock. For example, each time X count and Y count data is read in response to an interrupt from the optical sensor, the system clock time is recorded at the motion processor of the optical sensor and/or at the I/O interface unit. To determine the time between X count and Y count reads, the system clock time recorded at the previous read is subtracted from the system clock time of the current read.
Fig. 3b provides a schematic diagram illustrating a motion vector 301 derived from the X count and Y count information shown in Fig. 3a. It will be appreciated that each time new X count and Y count data is read from the optical sensor (whether through regular polling of the optical sensor or through an interrupt signal generated when movement is detected), the magnitude and direction of the motion vector 301 can be updated.
In some examples, the motion processor associated with each optical sensor 204, 205 is arranged to convert the X count and Y count data collected as described above into motion vector data, which is then output to the I/O interface unit 206. In such examples, the finger movement data read from each optical sensor corresponds to a stream of motion vectors, one motion vector being generated each time the optical sensor is read. In other examples, the optical sensors are arranged to output X counts and Y counts in the same way as a conventional optical motion sensor, and the I/O interface unit 206 is arranged to convert the X count and Y count data into motion vector data.
In some examples, a motion vector simplification function is performed. This is illustrated in Fig. 4a. As will be appreciated, depending on which of the optical sensor and the I/O processing unit converts the X count and Y count data into motion vector data, the motion vector simplification function may be performed by the motion processor of the optical sensor or by the I/O processing unit.
Fig. 4a shows a plot of a motion vector 401 generated from X count and Y count data as described above. As can be seen from Fig. 4a, however, the plot is divided into four quadrants: up, down, left and right. In one example, once the motion processor (or I/O processing unit) has generated the motion vector from the X count and Y count data described above, rather than generating finger movement data corresponding to the exact motion vector (i.e. magnitude and direction), the motion processor (or I/O processing unit) instead outputs the finger movement data in the form of simplified movement data corresponding to the quadrant into which the motion vector falls. For example, if the motion vector 401 falls within the right quadrant (indicating that the user's finger is moving to the right relative to the optical sensor), the optical sensor (or I/O processing unit) will output simplified movement data indicating that the user's finger is moving to the right. On the other hand, if the user's finger moves substantially upwards relative to the optical sensor, the motion vector derived from the X count and Y count data will fall within the upper quadrant and the optical sensor (or I/O processing unit) will output simplified movement data indicating that the user's finger is moving upwards, and so on. It will be appreciated that the simplified motion vector in this case can be represented by two data bits or switch states, for example 00 = up, 01 = down, 10 = right, 11 = left. In this example, the magnitude of each motion vector is normalized to unit magnitude.
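A minimal sketch of this quadrant simplification is given below, using the two-bit code above (00 = up, 01 = down, 10 = right, 11 = left). Picking the quadrant from the dominant axis of the raw counts, and treating positive Y as "up", are assumptions made for illustration:

#include <stdint.h>
#include <stdlib.h>

/* Reduce a raw (dx, dy) count pair to a two-bit direction quadrant code. */
uint8_t quadrant_code(int dx, int dy)
{
    if (abs(dx) >= abs(dy))            /* horizontal movement dominates */
        return dx >= 0 ? 0x2 : 0x3;    /* 10 = right, 11 = left */
    else                               /* vertical movement dominates */
        return dy >= 0 ? 0x0 : 0x1;    /* 00 = up,    01 = down */
}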
In some examples, a motion vector threshold function is performed. This is illustrated in Fig. 4b. As will be appreciated, the motion vector threshold function may be performed by the motion processor of the optical sensor or by the I/O processing unit.
Fig. 4b shows a plot of a motion vector 402 relating to the finger velocity detected over a first period and a motion vector 403 relating to the finger velocity detected over a second period. In this example, the optical sensor (or I/O processing unit) does not output motion vector data unless the motion vector exceeds a threshold magnitude. The threshold magnitude is illustrated in Fig. 4b as a dashed region 404. As can be seen from Fig. 4b, the finger velocity detected by the optical sensor during the first period results in a motion vector 402 that does not exceed the motion vector threshold. Accordingly, the optical sensor (or I/O processing unit) will not generate any finger movement data for the first period. On the other hand, the finger velocity detected by the optical sensor during the second period results in a motion vector 403 that exceeds the motion vector threshold. Accordingly, the optical sensor (or I/O processing unit) will output corresponding movement data for the second period.
In some examples, the motion vector simplification function and the motion vector threshold function can be performed together. This concept is illustrated in Fig. 4c. In this example, a motion vector must exceed the motion vector magnitude threshold 404 for finger movement data to be generated by the optical sensor (or I/O processing unit). If the motion vector exceeds the motion vector magnitude threshold 404, simplified movement data corresponding to the quadrant into which the motion vector falls is output. Accordingly, a user finger velocity corresponding to the first motion vector 402 does not cause any finger movement data to be output, whereas a user finger velocity corresponding to the second motion vector 403 causes the optical sensor (or I/O processing unit) to output simplified movement data indicating that the user's finger is moving to the right.
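Combining the two functions might look like the following sketch, which reuses quadrant_code() from the previous sketch; the threshold value is illustrative only:

#include <math.h>
#include <stdint.h>

#define MOTION_THRESHOLD 8.0   /* counts per read; assumed value */

extern uint8_t quadrant_code(int dx, int dy);  /* from the previous sketch */

/* Returns -1 when below threshold (emit no movement data),
   otherwise the two-bit quadrant code. */
int simplified_motion(int dx, int dy)
{
    double mag = sqrt((double)dx * dx + (double)dy * dy);
    if (mag <= MOTION_THRESHOLD)
        return -1;             /* suppress noise and very slow movement */
    return (int)quadrant_code(dx, dy);
}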
Tap detection
In some examples, as well as detecting finger velocity, the optical sensors are arranged to detect a "tap" of the user's finger, i.e. to detect that the user has briefly placed their finger on, and then removed it from, an optical sensor. The optical sensor can be arranged to detect this by recognizing the presence of the user's finger for a predetermined duration consistent with a human finger "tap" movement, together with limited (for example below-threshold) finger movement. On detecting a tap, the optical sensor can be arranged to output data indicating that a tap has been detected.
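The tap test described here amounts to checking that a finger was present for a bounded, human-plausible duration while its measured movement stayed below a threshold. The constants in the following sketch are illustrative assumptions:

#include <stdint.h>

#define TAP_MIN_MS       30   /* assumed shortest plausible tap   */
#define TAP_MAX_MS      300   /* assumed longest plausible tap    */
#define TAP_MOVE_LIMIT    4   /* assumed max counts of movement   */

/* present_ms: how long the finger was detected over the sensor;
   total_counts_moved: total movement measured during that time. */
int is_tap(uint32_t present_ms, int total_counts_moved)
{
    return present_ms >= TAP_MIN_MS &&
           present_ms <= TAP_MAX_MS &&
           total_counts_moved <= TAP_MOVE_LIMIT;
}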
In other examples, a user tap is detected when the first optical sensor detects a non-moving user finger while, at the same time, the second optical sensor detects user finger movement.
Gesture recognition performed at the user input device
In the example shown in Fig. 2, the gesture processor 211 is located outside the user input device 202. In some examples, however, the gesture processor is incorporated in the user input device. In such embodiments, gesture recognition is performed on the user input device itself, and the user input device outputs data corresponding to the detected gesture, i.e. gesture data corresponding to the one of a number of predetermined gestures that has been detected.
A single processor on the user input device
In the example user input device shown in Fig. 2, the optical sensors (each comprising a motion processor) and the I/O processing unit 206 are shown as separate units. It will be appreciated, however, that this is for illustrative purposes only and that any suitable hardware arrangement may be used. In some examples, the functions associated with the optical sensors and the I/O processing unit 206 can be provided by a single device (for example an integrated circuit) mounted in the user input device. Such a device may take as input the images captured by the photodetectors, and output either the finger movement data described above or the gesture data described above.
User input device
The user input device 202 shown in Fig. 2 can be arranged in any suitable manner. In some examples, the user input device comprises a keyboard into which the optical sensors have been integrated. Examples of this kind are shown in Fig. 5a, Fig. 5b and Fig. 5c.
Fig. 5a provides a schematic diagram of a keyboard-based user input device 501 according to an exemplary arrangement of the utility model. The user input device 501 comprises a keyboard 502 with keys 503. Unlike a conventional keyboard-based user input device, however, the user input device 501 comprises a first optical sensor 504 and a second optical sensor 505, which operate as described above with reference to the first and second optical sensors shown in Fig. 2. The first optical sensor 504 and the second optical sensor 505 are positioned between the keys 503 of the keyboard. As will be appreciated, the keyboard-based user input device 501 typically comprises an I/O processing unit to receive the data output from the optical sensors 504, 505, convert it, perform any of the further processing described above and output this data. The keyboard-based user input device 501 comprises a data output connection 506 for transmitting user input data, including the finger movement data and, for example, keystroke data, to an external computing device (such as a personal computer).
Fig. 5b provides a schematic diagram of a second keyboard-based user input device 507 according to another exemplary arrangement of the utility model. Parts of the second keyboard-based user input device 507 similar to those of the keyboard-based user input device shown in Fig. 5a are numbered correspondingly.
Like the keyboard-based user input device shown in Fig. 5a, the keyboard-based user input device 507 shown in Fig. 5b comprises two optical sensors 508, 509. However, these optical sensors are positioned as if they were keys of the keyboard 502; in other words, they are sized and/or positioned such that they take the place of keys of the keyboard.
Fig. 5c provides a schematic diagram of a third keyboard-based user input device 510 according to another exemplary arrangement of the utility model. Parts of the third keyboard-based user input device 510 similar to those of the keyboard-based user input device shown in Fig. 5a are numbered correspondingly. As can be seen from Fig. 5c, the keyboard-based user input device 510 corresponds to the keyboard-based user input device shown in Fig. 5a, except that the keyboard-based user input device 510 comprises a third optical sensor 511. In some examples, in addition to the optical sensors arranged to detect the user's finger velocities from which gesture information is derived, the third optical sensor is arranged to detect finger movement from which cursor control data is derived.
Example implementation
Fig. 6 provides a schematic diagram illustrating an implementation of a system according to an exemplary arrangement of the utility model. The system comprises a keyboard-based user input device 601 connected to a personal computer (PC) computing device 602 via a universal serial bus (USB) interface. The keyboard-based user input device 601 comprises a keyboard unit 603 and an optical sensor unit 604, the optical sensor unit 604 comprising a first optical sensor 605 and a second optical sensor 606. Each optical sensor comprises a photodiode 607 and a motion processor based on the STMicroelectronics VD5376 motion sensor device. It will be appreciated that any equivalent motion processor could be used, such as the STMicroelectronics VD5377 motion sensor device.
The first optical sensor 605 and the second optical sensor 606 are connected to a microcontroller 609 via MOTION lines (MOTIONL for the first optical sensor 605 and MOTIONR for the second optical sensor 606) and an I2C bus 608.
If one of the optical sensor units detects movement, it sends an interrupt signal to the microcontroller 609 on the corresponding MOTION line. On receiving the interrupt signal, the microcontroller reads the X count and Y count data detected by the corresponding VD5376 motion sensor device since the X count and Y count data was last read. The first and second optical sensors are arranged to detect a user "tap" (i.e. a finger that is present but not moving) by using VD5376 registers (number of features [0x31, 0x32], maximum exposed pixels [0x4F] and exposure [0x41]).
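The microcontroller-side flow can be sketched as below: an asserted MOTION line raises an interrupt, and the handler fetches the accumulated counts from the corresponding sensor over the I2C bus. The I2C addresses and the vd5376_read_counts() helper are hypothetical; the actual register map is defined by the VD5376 datasheet and is not reproduced here.

#include <stdint.h>

#define VD5376_LEFT_ADDR  0x53   /* assumed I2C address of sensor 605 */
#define VD5376_RIGHT_ADDR 0x57   /* assumed I2C address of sensor 606 */

/* hypothetical helper: fetch accumulated X/Y counts over the I2C bus */
extern void vd5376_read_counts(uint8_t i2c_addr, int16_t *x, int16_t *y);

static volatile int16_t left_x, left_y, right_x, right_y;

void motionl_isr(void)   /* MOTIONL asserted by the first optical sensor 605 */
{
    int16_t x, y;
    vd5376_read_counts(VD5376_LEFT_ADDR, &x, &y);
    left_x = x;
    left_y = y;
}

void motionr_isr(void)   /* MOTIONR asserted by the second optical sensor 606 */
{
    int16_t x, y;
    vd5376_read_counts(VD5376_RIGHT_ADDR, &x, &y);
    right_x = x;
    right_y = y;
}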
The microcontroller 609 outputs the finger movement data to the PC 602 via the USB interface. The PC 602 is provided with driver software 610 and application software 611 which associate the finger movement data received from the keyboard-based user input device 601 with one of a number of predetermined gestures and output corresponding control messages.
The microcontroller 609 is arranged to convert the X count and Y count data received from the first optical sensor 605 and the second optical sensor 606 (corresponding to the velocities of the user's fingers relative to the sensors) into output switch data in accordance with a modified USB HID mouse class standard, as listed in the table below.
[Table not reproduced in this text: mapping of X count and Y count data to modified USB HID mouse class switch data.]
As described above, the driver software 610 and application software 611 installed on the PC are arranged to interpret the HID mouse class switch information as one of a predetermined number of gestures and to output corresponding control messages. For an implementation in which the finger movement data output from the keyboard-based user input device is used to control the display of a graphics display unit, the mapping of detected motion to corresponding gesture control can be realized as listed in the following table:
[Table not reproduced in this text: mapping of detected motions to predetermined gestures and display control commands.]
It will be appreciated that the specific embodiments described above have been described by way of example only, and that other embodiments and variants are envisaged.
For example, although the specific embodiments above have been described with reference to optical sensors detecting the velocity of a user's fingers, it will be appreciated that any suitable gesture input means whose velocity can be detected by the optical sensors may be used, such as a stylus or pointer. Furthermore, as explained above, "finger" can generally be taken to refer to any suitable part of the user, such as any digit of either of the user's hands, the user's palm, wrist and so on.
Furthermore, it will be appreciated that the specific components comprised in the user input device and computing device, for example the motion processor, the I/O interface units, the gesture processor and so on, are in some examples logical designations. Accordingly, the functions provided by these components may be realized in a manner that does not correspond exactly to the forms described above and shown in the accompanying drawings. For example, aspects of the utility model may be implemented in the form of a computer program comprising instructions (i.e. a computer program product) implemented on a processor, stored on a data carrier (such as a floppy disk, optical disc, hard disk, PROM, RAM, flash memory or any combination of these or other storage media), transmitted as a data signal over a network (such as an Ethernet network, a wireless network, the Internet or any combination of these or other networks), or realized in hardware such as an ASIC (application-specific integrated circuit) or an FPGA (field-programmable gate array) or other configurable or bespoke circuit suitable for adapting a conventional equivalent device.

Claims (17)

1. A system for gesture recognition, characterized in that it comprises:
a user input device comprising a plurality of optical sensors, each of the optical sensors being arranged to detect the velocity of one of one or more user parts relative to that optical sensor, the user input device being arranged to generate movement data corresponding to the detected velocities of the one or more user parts, and
a gesture processor arranged to receive the movement data, match the movement data against one or more predetermined gestures, and generate corresponding control messages associated with the one or more predetermined gestures, wherein
the movement data corresponds to motion vectors representing the velocity of the one or more user parts relative to the optical sensors.
2. The system according to claim 1, characterized in that the movement data corresponds to direction quadrants, each direction quadrant being the one of a plurality of direction quadrants into which a motion vector falls.
3. The system according to claim 2, characterized in that the direction quadrants comprise four direction quadrants corresponding to up, down, left and right.
4. The system according to any one of claims 1 to 3, characterized in that the gesture processor receives the movement data only when the motion vector has a magnitude greater than a threshold magnitude.
5. The system according to any one of claims 1 to 3, characterized in that the gesture processor is incorporated in the user input device.
6. The system according to any one of claims 1 to 3, characterized in that each of the plurality of optical sensors is arranged to capture a series of images of one of the one or more user parts and to detect the velocity of the one or more user parts by comparing differences between the images of the series.
7. The system according to claim 6, characterized in that the one or more optical sensors comprise a photodetector coupled to a motion processor, the motion processor being arranged to receive signals from the photodetector to generate the series of images.
8. The system according to any one of claims 1 to 3, characterized in that the user input device is a keyboard.
9. The system according to claim 8, characterized in that the plurality of optical sensors are positioned substantially between the keys of the keyboard.
10. The system according to claim 8, characterized in that the plurality of optical sensors are positioned such that they replace one or more keys of the keyboard.
11. The system according to any one of claims 1 to 3, characterized in that the user input device comprises a further optical sensor for providing cursor control.
12. The system according to any one of claims 1 to 3, characterized in that the system further comprises a computing device coupled to the user input device, the computing device being arranged to control a graphics display unit in accordance with the control messages.
13. The system according to any one of claims 1 to 3, characterized in that the one or more user parts are one or more of the user's fingers.
14. A user input device, characterized in that it comprises a plurality of optical sensors, each optical sensor being arranged to detect the velocity of one of one or more user parts relative to that optical sensor, the user input device being arranged to generate movement data corresponding to the detected velocities of the one or more user parts, wherein
the movement data is suitable for matching against one or more predetermined gestures so that corresponding control messages associated with the one or more predetermined gestures can be generated, and wherein
the movement data corresponds to motion vectors representing the velocity of the one or more user parts relative to the optical sensors.
15. The user input device according to claim 14, characterized in that the one or more user parts are one or more of the user's fingers.
16. A processor for realizing gesture recognition, characterized in that the processor is arranged to detect, based on data output from one or more optical sensors, the velocity of one or more user parts relative to the optical sensors, and to generate movement data corresponding to the detected velocities of the one or more user parts, wherein
the movement data is suitable for matching against one or more predetermined gestures so that corresponding control messages associated with the one or more predetermined gestures can be generated, wherein
the movement data corresponds to motion vectors representing the velocity of the one or more user parts relative to the optical sensors.
17. The processor according to claim 16, characterized in that the one or more user parts are one or more of the user's fingers.
CN 201320273762 2012-05-15 2013-05-14 System for identifying hand gestures, user input device and processor Expired - Fee Related CN203241934U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201320273762 CN203241934U (en) 2012-05-15 2013-05-14 System for identifying hand gestures, user input device and processor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP1208523.9 2012-05-15
CN 201320273762 CN203241934U (en) 2012-05-15 2013-05-14 System for identifying hand gestures, user input device and processor

Publications (1)

Publication Number Publication Date
CN203241934U true CN203241934U (en) 2013-10-16

Family

ID=49319208

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201320273762 Expired - Fee Related CN203241934U (en) 2012-05-15 2013-05-14 System for identifying hand gestures, user input device and processor

Country Status (1)

Country Link
CN (1) CN203241934U (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110045824A (en) * 2014-02-10 2019-07-23 Apple Inc. Motion gesture input detected using optical sensors
CN110045824B (en) * 2014-02-10 2022-06-17 Apple Inc. Motion gesture input detected using optical sensors
US11422635B2 (en) 2014-02-10 2022-08-23 Apple Inc. Optical sensing device
CN110045822A (en) * 2014-12-01 2019-07-23 Logitech Europe S.A. Keyboard with aerial object detection
CN110411453A (en) * 2015-02-26 2019-11-05 STMicroelectronics, Inc. Reconfigurable sensor unit for electronic equipment
CN110646938A (en) * 2018-06-27 2020-01-03 Facebook Technologies, LLC Near-eye display system
CN110646938B (en) * 2018-06-27 2022-05-24 Facebook Technologies, LLC Near-eye display system

Similar Documents

Publication Publication Date Title
CN103425244A (en) Gesture recognition
US11009950B2 (en) Arbitrary surface and finger position keyboard
Taylor et al. Type-hover-swipe in 96 bytes: A motion sensing mechanical keyboard
US8086971B2 (en) Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications
Yang et al. Magic finger: always-available input through finger instrumentation
CN104679401B (en) Touch control method for a terminal, and terminal
US10209881B2 (en) Extending the free fingers typing technology and introducing the finger taps language technology
TWI471755B (en) Device for operation and control of motion modes of electrical equipment
Schwarz et al. Probabilistic palm rejection using spatiotemporal touch features and iterative classification
US20090051671A1 (en) Recognizing the motion of two or more touches on a touch-sensing surface
WO2014045953A1 (en) Information processing device and method, and program
US20170003762A1 (en) Systems and methods for text entry
CN104969156A (en) Interaction sensing
US8743061B2 (en) Touch sensing method and electronic device
CN103809733A (en) Man-machine interactive system and method
CN203241934U (en) System for identifying hand gestures, user input device and processor
CN104541232A (en) Multi-modal touch screen emulator
TW201701119A (en) Touch control device and operation method thereof
CN102103462A (en) Multifunctional electronic pen, character writing and sampling method and track data storage method
KR20100001918A (en) Method, apparatus for sensing moved touch and computer readable record-medium on which program for executing method thereof
US10088943B2 (en) Touch control device and operating method thereof
CN106774839A (en) Intelligent wearable keyboard device and input method therefor
Liang et al. Indexmo: exploring finger-worn RFID motion tracking for activity recognition on tagged objects
Zhang et al. A virtual keyboard implementation based on finger recognition
WO2019134606A1 (en) Terminal control method, device, storage medium, and electronic apparatus

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
CU01 Correction of utility model patent

Correction item: Priority

Correct: 1208523.9 2012.05.16 GB

False: 1208523.9 2012.05.15 JP

Number: 42

Page: The title page

Volume: 29

CU03 Correction of utility model patent gazette

Correction item: Priority

Correct: 1208523.9 2012.05.16 GB

False: 1208523.9 2012.05.15 JP

Number: 42

Volume: 29

ERR Gazette correction

Free format text: CORRECT: PRIORITY DATA; FROM: 1208523.9 2012.05.15 JP TO: 1208523.9 2012.05.16 GB

RECT Rectification
ASS Succession or assignment of patent right

Owner name: ST MICROELECTRONICS (RD) S. A.

Free format text: FORMER OWNER: ST MICROELECTRONICS SA

Effective date: 20140408

C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20140408

Address after: Buckinghamshire

Patentee after: Stmicroelectronics (Research & Development) Limited

Address before: Buckinghamshire

Patentee before: ST Microelectronics SA

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20131016

Termination date: 20160514

CF01 Termination of patent right due to non-payment of annual fee