CN102693000B - Computing device and method for performing multi-finger gesture functions - Google Patents


Info

Publication number
CN102693000B
CN102693000B (application CN201210010388.4A)
Authority
CN
China
Prior art keywords
computing device
function
input
finger gesture
perform
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201210010388.4A
Other languages
Chinese (zh)
Other versions
CN102693000A (en)
Inventor
叶宗颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Elan Microelectronics Corp
Original Assignee
Elan Microelectronics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Elan Microelectronics Corp filed Critical Elan Microelectronics Corp
Publication of CN102693000A
Application granted
Publication of CN102693000B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention discloses a computing device and method for performing multi-finger gesture functions. The computing device comprises: a receiver for receiving at least one of a first input from a first object or a second input from at least two second objects; a look-up table module for storing a second group of functions associated with the at least two second objects; and a mapping module for mapping a category of the first input from the first object to a corresponding function in the second group of functions, wherein the computing device is configured to execute the corresponding function in the second group of functions according to the category of the first input.

Description

Computing device and method for performing multi-finger gesture functions
Technical field
The present invention relates to a computing device, and more particularly to a computing device that performs multi-finger gesture functions by means of a stylus.
Background technology
A touch panel allows a user to enter messages or commands with a stylus or a finger instead of a keyboard, providing a convenient user interface. Moreover, because no keyboard is needed, a computing device with such a touch panel can be made more compact and more portable. Thanks to these advantages of portability and a convenient user interface, such computing devices — for example smartphones, tablet computers, and personal digital assistants (PDAs) — have become the mainstream of the information technology (IT) product market. When operating such a device, the user must use at least two fingers to perform many functions (i.e., multi-finger gesture functions), such as zooming an image in or out on the touch panel. However, when the user operates the device with a conventional stylus rather than fingers, only conventional functions (i.e., single-touch functions) can be performed, which is very inconvenient.
There is therefore a need for a computing device with a touch panel that can perform both single-touch functions and multi-finger gesture functions while a conventional stylus is in use.
Summary of the invention
A primary object of the present invention is to provide a computing device for performing multi-finger gesture functions, and a method of performing multi-finger gesture functions in a computing device, so as to solve the problems in the prior art.
To achieve the above object, the present invention provides a computing device for performing multi-finger gesture functions, the computing device comprising: a receiver for receiving at least one of a first input from a first object or a second input from at least two second objects; a look-up table module for storing a second group of functions associated with the at least two second objects; and a mapping module for mapping a category of the first input from the first object to a corresponding function in the second group of functions, wherein the computing device is configured to execute the corresponding function in the second group of functions according to the category of the first input.
To achieve the above object, the present invention also provides a method of performing multi-finger gesture functions in a computing device, the method comprising: receiving a first input from a first object on a receiver of the computing device; identifying whether the category of the first input matches one of a set of predetermined categories; if the category of the first input matches one of the predetermined categories, determining the multi-finger gesture corresponding to that category; and executing the function associated with the corresponding multi-finger gesture.
To achieve the above object, the present invention further provides a method of performing multi-finger gesture functions in a computing device, the method comprising: detecting a first input from a first object on a receiver of the computing device; identifying whether the category of the first input corresponds to one of a set of predetermined categories; and if the category of the first input corresponds to one of the predetermined categories, executing the function corresponding to that predetermined category, wherein the function corresponds to a multi-finger gesture supported by the computing device.
Further features and advantages of the invention are set forth in part in the description that follows, and in part will be apparent from the description or may be learned by practice of the invention. The features and advantages of the invention may be realized and attained by means of the elements particularly pointed out in the appended claims and combinations thereof.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only, and do not limit the invention as claimed herein.
Accompanying drawing explanation
The foregoing detailed description is better understood when read in conjunction with the accompanying drawings. For the purpose of illustrating the invention, the drawings depict various embodiments. It should be understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown in those embodiments.
In the drawings:
Fig. 1 is a block diagram of a computing device that uses a stylus to perform functions associated with multi-finger gestures, according to an embodiment of the invention;
Figs. 2A to 2C are schematic diagrams illustrating how a stylus or a user's finger is identified, according to embodiments of the invention;
Fig. 3 is a block diagram of a look-up table (LUT) module according to an embodiment of the invention;
Figs. 4A and 4B are flowcharts of a method of performing multi-finger gesture functions via a stylus, according to an embodiment of the invention; and
Fig. 4C is a schematic diagram of the stylus and computing device associated with the method illustrated in Figs. 4A and 4B.
Description of reference numerals: 10 - computing device; 11 - receiver; 12 - processing unit; 13 - control module; 14 - recognition module; 15 - mapping module; 16 - analysis module; 17 - look-up table (LUT) module; 18 - first look-up table; 19 - second look-up table; 20 - third look-up table; D1-D3 - predetermined buttons; E1-E3 - predetermined buttons; I1-I3 - predetermined icons; P - stylus; F - finger; R1, R2 - responses; B1, B2 - base areas; P1, P2 - peaks; A1, C1, F1 - predetermined values.
Embodiment
Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numerals are used throughout the drawings to refer to the same or similar parts.
Fig. 1 is a block diagram of a computing device 10 that uses a stylus to perform functions associated with multi-finger gestures, according to an embodiment of the invention. Referring to Fig. 1, the computing device 10 may comprise a receiver 11 and a processing unit 12 electrically coupled to each other.
The receiver 11 can receive a first input from a first object and/or a second input from at least two second objects. In one embodiment of the invention, the receiver 11 may comprise the touch panel of a personal digital assistant (PDA), a smartphone (such as an iPhone), or a tablet computer (such as an iPad), wherein the touch panel serves as the receiver 11 to receive the input applied by the first and/or second objects of the user, and further serves as a display to show images or text.
In another embodiment of the invention, the receiver 11 may comprise the trackpad of a laptop computer, which is arranged below the keyboard and is used only to receive input from the user.
Furthermore, in some embodiments the first object comprises a stylus, which applies pressure or voltage to the receiver 11 and thereby applies the first input to the receiver 11. In some embodiments the second objects comprise the user's fingers, which apply the second input to the receiver 11 on contact. Moreover, in some embodiments the first input of the first object comprises a trajectory or motion of the stylus, and the second input of the at least two second objects comprises a multi-finger gesture of at least two fingers.
The receiver 11 can convert the received first and/or second input into digital signals and transmit the digital signals to the processing unit 12. The processing unit 12, which can be configured to process the digital signals, can then transmit a corresponding command to the receiver 11 in response to the digital signals, thereby instructing the receiver 11 to perform a corresponding function.
The processing unit 12 may comprise a recognition module 14, a look-up table (LUT) module 17, a mapping module 15, an analysis module 16, and a control module 13, which are configured to process the digital signals from the receiver 11.
Specifically, the recognition module 14 can be configured to identify which of the first object, the second objects, or both is applying input to the receiver 11.
The LUT module 17 can be configured to store a first group of functions associated with the first object and a second group of functions associated with the second objects. In particular, each of the first group of functions can be associated with a corresponding category of the first input of the first object, and each of the second group of functions can be associated with a corresponding category of the second input of the second objects. In addition, the second group of functions can also be mapped to corresponding categories of the first input of the first object. Accordingly, the mapping module 15 can be configured to map each category of the first input to a corresponding function in the second group of functions. Furthermore, the analysis module 16 can be configured to analyze the first input from the first object, to identify whether the category of the first input maps to any function in the second group of functions. If it does, the computing device 10 can perform the mapped function in the second group of functions associated with the category of the first input. In this way, the computing device 10 can perform the second group of functions by means of the first object, in response to the first input from the first object.
In operation, if only the first object is identified, the computing device 10 can operate in one of a first mode and a second mode associated with the first object. Specifically, the computing device 10 can perform the first group of functions in the first mode, or the second group of functions in the second mode, in response to the first input of the first object.
In addition, if only the second objects are identified, the computing device 10 can operate in a third mode associated with the second objects. In the third mode, the computing device 10 can perform the second group of functions in response to the second input of the second objects.
Moreover, if the first and second objects are identified at the same time, the control module 13 can accept the first input of the first object and ignore the second input from the second objects. The computing device 10 then operates in the first or second mode associated with the first object.
Because the first object has a higher priority than the second objects, even if the computing device 10 is operating in the third mode to perform the second group of functions associated with the second objects, the control module 13 can switch the computing device 10 to operate in the first or second mode when the first object is identified. The detailed operation of the computing device 10 is discussed in the following paragraphs with reference to Figs. 2A to 4C.
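The priority rule above can be sketched as a small arbitration function. This is an illustrative reading of the described behavior, not code from the patent; the mode names and the function itself are invented for the example.

```python
def arbitrate(stylus_detected, finger_detected, current_mode):
    """Pick the operating mode when contacts are recognized.

    The stylus (first object) outranks fingers (second objects): if a
    stylus is present, the device stays in or enters a stylus mode
    ('first' or 'second') and finger input is ignored, even when the
    device was running in the finger-only 'third' mode.
    """
    if stylus_detected:
        return current_mode if current_mode in ("first", "second") else "first"
    if finger_detected:
        return "third"
    return current_mode  # nothing recognized: keep the current mode
```

For instance, a stylus touching down while the device is in the third (finger) mode pre-empts the fingers and moves the device into the first mode.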
Figs. 2A to 2C are schematic diagrams illustrating how a stylus or the user's finger is identified, according to embodiments of the invention. Referring to Fig. 2A, when the first object (i.e., the stylus "P") or a second object (i.e., the user's finger "F") applies input on the receiver 11, analog responses R1 and R2, respectively, can be produced. The responses R1 and R2 may correspond to induced capacitances, which can be detected or observed on the receiver 11. Specifically, the profile of each of the responses R1 and R2 can take a three-dimensional form. For example, the profile can be represented by a base area defined by the "X" and "Y" axes parallel to the planar surface of the receiver 11, and a height defined along the "Z" axis perpendicular to the base area. In the present embodiment, the profiles of the responses R1 and R2 have base areas B1 and B2, respectively. The areas of B1 and B2 can correspond to the extents of the responses R1 and R2, respectively. In addition, the peaks P1 and P2 of the profiles observed along the "Z" axis can correspond to the intensities of the responses R1 and R2, respectively. The recognition module 14 can be configured to compute the areas of the base areas B1 and B2 or the peaks P1 and P2 of the profiles of the responses R1 and R2. The size information of a profile can be used to determine the source of the input associated with that profile, namely a stylus, a finger, or both. The conditions under which the recognition module 14 identifies a stylus or a finger are discussed below with reference to Fig. 2B.
Referring to Fig. 2B, the cross-sectional area of a stylus tip is generally smaller than the contact area of a user's finger. Accordingly, the area of the base area B1 of the profile of the response R1 produced by the stylus is smaller than a predetermined value A1, or its diameter is smaller than a predetermined value F1. An input whose profile is smaller than A1 or F1 in size can therefore be identified as input from a stylus. Conversely, the area of the base area B2 of the profile of the response R2 produced by the user's finger is larger than A1, or its diameter is larger than F1. An input whose profile is larger than A1 or F1 in size can therefore be identified as input from the user's finger.
On the other hand, a stylus generally induces a lower capacitance than a user's finger. Accordingly, the peak P1 of the profile of the response R1 produced by the stylus can be smaller than a predetermined value C1, and an input whose profile has a peak or height smaller than C1 can be identified as input from a stylus. Conversely, the peak P2 of the profile of the response R2 produced by the finger can be larger than C1, and an input whose profile has a peak or height larger than C1 can be identified as input from the user's finger.
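The two recognition criteria above — base-area size and peak height — can be combined into a single classifier, sketched below. The patent leaves the predetermined values A1 and C1 unspecified, so the thresholds here are arbitrary placeholders, and requiring both criteria to agree before reporting a stylus is one possible reading, not the patent's stated rule.

```python
# Hypothetical thresholds; the patent only requires that a stylus profile
# fall below A1 (base area) and C1 (peak) while a finger profile exceeds them.
A1 = 40.0   # base-area threshold, arbitrary units
C1 = 1.5    # peak-height threshold, arbitrary units

def classify_contact(base_area, peak):
    """Identify the source of one contact profile per Figs. 2A-2B:
    a small, low-peak profile is taken as a stylus tip; a large or
    high-peak profile is taken as a finger."""
    if base_area < A1 and peak < C1:
        return "stylus"
    return "finger"
```

A borderline contact (small area but a finger-like peak, say) falls back to "finger" under this conservative rule.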
Referring to Fig. 2C, the user can also notify the computing device 10 that stylus mode or finger mode has been selected. For example, the user can select the desired mode by pressing a predetermined button D1 on the stylus, clicking a predetermined icon I1 on the receiver 11, or pressing a predetermined button E1 on the computing device 10.
Fig. 3 is a block diagram of the look-up table (LUT) module 17 according to an embodiment of the invention. Referring to Fig. 3, the LUT module 17 may comprise a first LUT 18, a second LUT 19, and a third LUT 20. Specifically, the first LUT 18 can store the first group of functions associated with the first object, wherein each of the first group of functions can be associated with a corresponding category of the input of the first object. In an embodiment of the invention, the first object can comprise a stylus, and each of the first group of functions can be associated with a corresponding category of stylus trajectory or motion, as listed in Table 1:
Table 1
When the computing device 10 operates in the first mode, it can perform a corresponding function in the first group of functions, as listed in the second row of Table 1, associated with the category of stylus trajectory or motion listed in the first row of Table 1. For example, if the receiver 11 receives a short click from the stylus, the computing device 10 can perform the corresponding function, thereby selecting a specific item. Moreover, if the receiver 11 receives a long click from the stylus, the computing device 10 can perform another corresponding function, thereby copying the content of a specific item and temporarily storing the content in the memory of the computing device 10.
In addition, the second LUT 19 can store the second group of functions associated with the second objects, wherein each of the second group of functions can be associated with input of a corresponding category from the second objects. In one embodiment of the invention, the second objects can comprise at least two fingers of the user, and each of the second group of functions can be associated with a corresponding category of multi-finger gesture, as listed in the second and third rows of Table 2:
Table 2
When the computing device 10 operates in the third mode, i.e., with finger input, it can perform a corresponding function in the second group of functions as listed in the third row of Table 2, associated with the category of multi-finger gesture listed in the second row of Table 2. For example, if the receiver 11 receives a "two fingers apart" multi-finger gesture (the distance between two fingers increases), the computing device 10 can perform the corresponding function, thereby zooming in on the image currently displayed on the computing device 10. Moreover, if the receiver 11 receives a "two fingers together" multi-finger gesture (the distance between two fingers decreases), the computing device 10 can perform the corresponding function, thereby zooming out of the image currently displayed on the computing device 10.
The second LUT 19 can further store, in the first row of Table 2, a group of categories of stylus trajectory or motion (i.e., stylus gestures). The stylus gestures in the first row of Table 2, the multi-finger gestures in the second row, and the second group of functions in the third row correspond to one another. When the computing device 10 operates in the second mode, if the input trajectory or motion of the stylus matches one of the categories listed in the first row, the computing device 10 looks up the category of multi-finger gesture corresponding to the input category of the stylus, and then performs the function corresponding to that multi-finger gesture. For example, if the receiver 11 receives the stylus trajectory "<", the computing device 10 first looks up which multi-finger gesture the trajectory corresponds to, finds from Table 2 that the trajectory "<" corresponds to the "two fingers apart" gesture, and then performs the function corresponding to that gesture, i.e., zooming in on the image displayed on the computing device 10. However, if the receiver 11 receives an erroneous or undefined trajectory or motion from the stylus, the computing device 10 does not perform any function.
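The two-step lookup of the second mode can be sketched with two dictionaries standing in for the rows of Table 2. Only the "<" → "two fingers apart" → zoom-in chain comes from the text; the ">" entry and the zoom-out pairing are assumed counterparts added to make the sketch run, and the table itself is not reproduced in this document.

```python
# First row of Table 2: stylus trajectory categories (stylus gestures).
STYLUS_TO_GESTURE = {
    "<": "two fingers apart",
    ">": "two fingers together",  # assumed counterpart, not stated in the text
}

# Second and third rows of Table 2: multi-finger gesture -> function.
GESTURE_TO_FUNCTION = {
    "two fingers apart": "zoom in",
    "two fingers together": "zoom out",
}

def dispatch_second_mode(trajectory):
    """Second-mode dispatch per Table 2: first find the multi-finger
    gesture matching the stylus trajectory, then that gesture's function.
    An undefined trajectory yields None, and no function is performed."""
    gesture = STYLUS_TO_GESTURE.get(trajectory)
    if gesture is None:
        return None
    return GESTURE_TO_FUNCTION[gesture]
```

The intermediate gesture lookup is the distinguishing feature of this LUT: the device explicitly resolves which multi-finger gesture the stylus is imitating before choosing a function.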
Moreover, the third LUT 20 (shown in Table 3) can be similar to the second LUT 19 (shown in Table 2), except that with Table 2 the computing device 10 first determines which multi-finger gesture the stylus input corresponds to and then performs the function corresponding to that gesture, whereas with Table 3 the computing device 10 does not need to determine the multi-finger gesture corresponding to the stylus input, and instead directly determines the function corresponding to the second category of the stylus input. The functions listed in the second row of Table 3 are the second group of functions associated with the at least two fingers, i.e., the functions corresponding to the multi-finger gestures.
Table 3
In Table 3, the second categories of stylus trajectory or motion correspond to the second group of functions associated with the at least two fingers. That is, the function corresponding to a second category of stylus input also corresponds to a multi-finger gesture. When the computing device 10 operates in the second mode, if the computing device 10 determines that the stylus input matches one of the second categories listed in the first row of Table 3, it performs the corresponding function in the second group of functions listed in the second row of Table 3. For example, if the receiver 11 receives a clockwise trajectory of the stylus, the computing device 10 can perform the corresponding function, thereby rotating the image displayed on the computing device 10. However, if the receiver 11 receives an erroneous or undefined trajectory or motion from the stylus, the computing device 10 does not perform any function.
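By contrast, the third LUT collapses the lookup to a single step. A minimal sketch follows; only the clockwise-rotation pairing comes from the text, and the other entry is an invented placeholder.

```python
# Table 3 as a single dictionary: a second category of stylus trajectory
# maps directly to a multi-finger function, with no intermediate gesture.
DIRECT_LUT = {
    "clockwise": "rotate image",              # example given in the text
    "counterclockwise": "rotate image back",  # invented placeholder entry
}

def dispatch_direct(trajectory):
    """Second-mode dispatch per Table 3: one lookup, no intermediate
    gesture; None (no function) for erroneous or undefined trajectories."""
    return DIRECT_LUT.get(trajectory)
```

The trade-off between the two LUTs is indirection versus simplicity: Table 2 keeps the stylus-gesture-to-multi-finger-gesture mapping explicit, while Table 3 folds it into the table at build time.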
The manner in which the computing device 10 switches among the first, second, and third modes and performs a corresponding function in the first or second group of functions listed in Tables 1, 2, and 3 is discussed in the following paragraphs with reference to Figs. 4A to 4C.
Figs. 4A and 4B are flowcharts of a method of performing multi-finger gesture functions via a stylus, according to an embodiment of the invention. It is assumed that the computing device 10 is initially in a standby state and the receiver 11 is ready to receive input. Referring to Fig. 4A, in step 41, the receiver 11 can receive input from the first object (i.e., the stylus) and/or the second objects (i.e., the user's fingers).
Then, in step 42, the recognition module 14 can identify whether the input received on the receiver 11 comes from the stylus and/or the user's fingers.
If the stylus and the user's fingers are identified at the same time, the input from the stylus is accepted because of its higher priority. That is, the input from the stylus can be processed by the processing unit 12, and the input from the user's fingers can be ignored. Then, in step 43, a first group of parameters associated with the stylus can be set, to facilitate the processing unit 12 processing the input from the stylus. In one embodiment of the invention, the first group of parameters associated with the stylus can include the sensitivity of the receiver 11 with respect to the stylus, which can improve accuracy.
However, if the inputs on the receiver 11 do not come from the stylus and fingers at the same time, in step 46 the recognition module 14 can identify whether the input on the receiver 11 comes from the stylus. If so, in step 43 the first group of parameters associated with the stylus can be set.
On the other hand, if the input on the receiver 11 does not come from the stylus, in step 47 the recognition module 14 can further identify whether the input on the receiver 11 comes from the user's fingers. If so, in step 48 a second group of parameters associated with the fingers can be set, to facilitate the processing unit 12 processing the input from the user's fingers. Similar to the first group of parameters associated with the stylus, the second group of parameters associated with the fingers can include the sensitivity of the receiver 11 with respect to the fingers (which can improve accuracy).
Then, in step 49, the computing device 10 operates in the third mode, associated with the user's fingers. In the third mode, the computing device 10 can perform a corresponding function in the second group of functions associated with the at least two fingers, which is mapped to a category of input from the at least two fingers (i.e., a multi-finger gesture) as listed in Table 2. While operating in the third mode in association with the finger input, the receiver 11 can detect whether there is any input from the stylus. Then, in step 50, if input from the stylus is identified, the first group of parameters associated with the stylus can be set in step 43, to facilitate processing the input from the stylus. The processing unit 12 may skip subsequent input from the user's fingers and process the input from the stylus.
Then, in step 44, following step 43, the computing device 10 can operate in the first mode or the second mode associated with the stylus. The detailed operation of the computing device 10 in the first mode and the second mode is discussed in the following paragraphs with reference to Figs. 4B and 4C, respectively.
Referring to Fig. 4B, in step 441, the computing device 10 initially operates in the first mode, to perform the first group of functions associated with the stylus (shown in Table 1).
Then, in step 442, the control module 13 can identify whether a first switching event has been triggered to switch the computing device 10 to operate in the second mode associated with the stylus. The first switching event can be triggered manually, for example by pressing a predetermined button D2 on the stylus, pressing a predetermined button E2 on the computing device, or clicking a predetermined icon I2 on the receiver 11, as shown in Fig. 4C. In another embodiment, the first switching event is triggered automatically when a first category of stylus trajectory or motion is identified. Of course, in various embodiments, a particular stylus trajectory or input can also be defined to trigger the first switching event.
Referring back to Fig. 4B, if the first switching event is identified, the computing device 10 is switched to operate in the second mode in step 443. However, if the first switching event is not triggered, the computing device 10 can remain in the first mode, as in step 441.
In various embodiments, step 442 can also be carried out first, after step 43, to determine whether to operate in the first mode or the second mode.
Step 443 carries out the second mode; the operation in the second mode can proceed according to Table 2 or Table 3 and the related description above.
Then, in step 444, which is similar to step 442, the control module 13 identifies whether a second switching event has been triggered to switch the computing device 10 back to the first mode. Similar to the first switching event, the second switching event may be triggered manually, for example by pressing the predetermined button D2 or another predetermined button D3 on the pointer, pressing the predetermined button E2 or another predetermined button E3 on the computing device 10, or clicking the predetermined icon I2 or another predetermined icon I3 on the receiver 11, as shown in Fig. 4C. In another embodiment, the second switching event is triggered automatically when the second category of the track or motion of the pointer is recognized. Of course, in other embodiments, a particular track or input of the pointer may also be defined to trigger the second switching event.
Referring back to Fig. 4B, if the second switching event is recognized, the computing device 10 is switched back to operate in the first mode in step 441. If the second switching event is not triggered, the computing device 10 remains in the second mode, as in step 443.
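The mode switching of steps 441–444 amounts to a small two-state machine. The following is a minimal sketch of that flow, not the patent's implementation; the class, event names, and string encoding of events are all hypothetical illustrations:

```python
from enum import Enum

class Mode(Enum):
    FIRST = 1   # pointer functions (Table 1)
    SECOND = 2  # simulated multi-finger gesture functions (Tables 2/3)

class ModeController:
    """Hypothetical control module tracking the switching flow of Fig. 4B."""

    def __init__(self):
        self.mode = Mode.FIRST  # step 441: start in the first mode

    def on_event(self, event):
        # Step 442: a first switching event (e.g. button D2/E2 pressed or
        # icon I2 clicked) moves the device to the second mode (step 443).
        if self.mode is Mode.FIRST and event == "first_switch":
            self.mode = Mode.SECOND
        # Step 444: a second switching event (e.g. button D3/E3 or icon I3)
        # switches the device back to the first mode (step 441).
        elif self.mode is Mode.SECOND and event == "second_switch":
            self.mode = Mode.FIRST
        # Any other event leaves the device in its current mode.
        return self.mode
```

As in the figure, an unrecognized or non-matching event simply keeps the device in whichever mode it already occupies.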
The invention thus uses pointer input to perform the functions corresponding to the multi-finger gestures built into the computing device 10. As can be understood from the above description and the related figures, the invention is applicable to a touchpad or a transparent touch panel. Further, in the first mode the device can respond to pointer input by performing the first group of functions, while in the second mode the pointer can simulate multi-finger operations on the touchpad to perform the second group of functions; two embodiments of the second mode are described in Tables 2 and 3 above and their related descriptions, respectively. In other embodiments, dividing the pointer operation into a first mode and a second mode may not be essential; the operation may instead be divided simply into a stylus mode and a finger mode, with the track of the pointer determining whether the first group or the second group of functions is performed.
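The mapping from a recognized pointer-track category to a built-in multi-finger gesture function can be illustrated as a simple look-up, in the spirit of the look-up table module described above. All category and function names below are hypothetical stand-ins; the actual mappings are given in Tables 2 and 3 of the patent:

```python
# Hypothetical mapping from recognized pointer-track categories to the
# functions the device associates with its built-in multi-finger gestures.
GESTURE_FUNCTIONS = {
    "clockwise_circle": "rotate_image_cw",  # like two fingers rotating clockwise
    "outward_stroke":   "zoom_in",          # like two fingers moving apart
    "inward_stroke":    "zoom_out",         # like two fingers moving together
}

def perform_for_category(category):
    """Return the gesture function mapped to a track category, or None when
    the category matches none of the predetermined categories."""
    return GESTURE_FUNCTIONS.get(category)
```

In the second mode, the device would classify the pointer's track, perform such a look-up, and invoke the resulting function; a category that matches no predetermined entry simply falls through with no effect.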
Those skilled in the art will understand that the embodiments described above can be modified without departing from the broad inventive concept. It should therefore be understood that the invention is not limited to the particular embodiments disclosed, but is intended to cover modifications within the spirit and scope of the invention as defined by the appended claims.
In addition, in describing representative embodiments of the invention, the specification may present the method and/or process of the invention as a particular sequence of steps; however, since the method or process does not depend on the particular order of steps set forth, it should not be limited to that order. Other sequences of steps are possible, as those skilled in the art will appreciate, and the particular order of steps set forth in the specification should therefore not be construed as a limitation on the claims. Moreover, the claims directed to the method and/or process of the invention should not be limited to performing their steps in the order written; those skilled in the art will readily appreciate that the sequences may be varied and still remain within the spirit and scope of the invention.

Claims (17)

1. A computing device for performing multi-finger gesture functions, characterized in that the computing device comprises:
a receiver, comprising a touchpad or a transparent touch panel, for receiving a first input from a first object or a second input from at least two second objects, wherein the first object comprises a pointer and each second object comprises a finger;
a look-up table module, for storing a second group of functions associated with the at least two second objects; and
a mapping module, for mapping a category of the first input from the first object to a multi-finger gesture and to a corresponding function in the second group of functions, wherein a track or motion of the first input is different from the multi-finger gesture,
wherein the computing device is configured to perform the corresponding function in the second group of functions according to the category of the first input, the corresponding function in the second group of functions being a function associated with a multi-finger gesture; the computing device identifies whether the category of the first input matches one of a set of predetermined categories, and if it does, determines the corresponding multi-finger gesture.
2. The computing device according to claim 1, characterized in that the look-up table module further stores a first group of functions associated with the first object.
3. The computing device according to claim 2, characterized in that it further comprises a control module configured to switch the computing device to operate among a first mode, a second mode and a third mode.
4. The computing device according to claim 3, characterized in that the computing device is configured to perform, in the first mode, a corresponding function in the first group of functions according to the category of the first input.
5. The computing device according to claim 3, characterized in that the computing device is configured to perform, in the second mode, the mapped function in the second group of functions according to the category of the first input.
6. The computing device according to claim 3, characterized in that the computing device is configured to perform, in the third mode, a corresponding function in the second group of functions according to a category of the second input.
7. The computing device according to claim 3, characterized in that it further comprises a recognition module for recognizing the first and second inputs, wherein the control module is configured to switch the computing device to operate in one of the first and second modes if the first input is recognized, and to switch the computing device to operate in the third mode if only the second input is recognized.
8. The computing device according to claim 1, characterized in that the first object comprises a pointer, and the first input of the first object comprises a track or motion of the pointer.
9. The computing device according to claim 8, characterized in that the at least two second objects comprise at least two fingers, and the second input from the at least two second objects comprises a multi-finger gesture.
10. A method for performing multi-finger gesture functions in a computing device, characterized in that the method comprises:
receiving, by a receiver of the computing device, a first input from a first object, wherein the first object comprises a pointer and the receiver comprises a touchpad or a transparent touch panel;
identifying whether a category of the first input matches one of a set of predetermined categories;
if the category of the first input matches one of the predetermined categories, determining the corresponding multi-finger gesture, wherein a track or motion of the first input is different from the multi-finger gesture; and
performing the function associated with the corresponding multi-finger gesture.
11. The method according to claim 10, characterized in that performing the function comprises rotating an image currently displayed on the computing device, the multi-finger gesture being two fingers moving in a clockwise or counterclockwise direction across the receiver of the computing device.
12. The method according to claim 10, characterized in that performing the function comprises zooming in on an image currently displayed on the computing device, the multi-finger gesture being two fingers moving apart.
13. The method according to claim 10, characterized in that performing the function comprises zooming out on an image currently displayed on the computing device, the multi-finger gesture being two fingers moving together.
14. A method for performing multi-finger gesture functions in a computing device, characterized in that the method comprises:
detecting, by a receiver of the computing device, a first input from a first object, wherein the first object comprises a pointer and the receiver comprises a touchpad or a transparent touch panel;
identifying whether a category of the first input corresponds to one of a set of predetermined categories; and
if the category of the first input corresponds to one of the predetermined categories, performing the function corresponding to that predetermined category,
wherein the function corresponds to a multi-finger gesture built into the computing device, and a track or motion of the first input is different from the multi-finger gesture.
15. The method according to claim 14, characterized in that performing the function comprises rotating an image currently displayed on the computing device, the multi-finger gesture being two fingers moving in a clockwise or counterclockwise direction across the receiver of the computing device.
16. The method according to claim 14, characterized in that performing the function comprises zooming in on an image currently displayed on the computing device, the multi-finger gesture being two fingers moving apart.
17. The method according to claim 14, characterized in that performing the function comprises zooming out on an image currently displayed on the computing device, the multi-finger gesture being two fingers moving together.
CN201210010388.4A 2011-01-13 2012-01-13 Computing device and method for performing multi-finger gesture functions Expired - Fee Related CN102693000B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161432302P 2011-01-13 2011-01-13
US61/432,302 2011-01-13

Publications (2)

Publication Number Publication Date
CN102693000A CN102693000A (en) 2012-09-26
CN102693000B true CN102693000B (en) 2016-04-27

Family

ID=46490445

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210010388.4A Expired - Fee Related CN102693000B (en) Computing device and method for performing multi-finger gesture functions

Country Status (3)

Country Link
US (1) US8830192B2 (en)
CN (1) CN102693000B (en)
TW (1) TWI461962B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8464153B2 (en) * 2011-03-01 2013-06-11 Lucasfilm Entertainment Company Ltd. Copying an object in an animation creation application
WO2014098798A1 (en) * 2012-12-17 2014-06-26 Empire Technology Development, Llc Progressively triggered auto-fill
CN103092510B (en) * 2012-12-28 2016-06-22 中兴通讯股份有限公司 The guard method of application program when electronic installation and Screen sharing thereof
JP5705885B2 (en) * 2013-01-09 2015-04-22 シャープ株式会社 Input display device
CN103149723B (en) 2013-03-20 2016-02-03 敦泰电子有限公司 The touch control method of liquid crystal indicator and can touch control liquid crystal display device
KR102157270B1 (en) * 2013-04-26 2020-10-23 삼성전자주식회사 User terminal device with a pen and control method thereof
KR102138913B1 (en) * 2013-07-25 2020-07-28 삼성전자주식회사 Method for processing input and an electronic device thereof
CN104199602B (en) * 2014-08-26 2018-11-09 联想(北京)有限公司 A kind of information processing method and electronic equipment
WO2017070926A1 (en) * 2015-10-30 2017-05-04 Hewlett-Packard Development Company, L. P. Touch device
US11133582B2 (en) * 2016-09-27 2021-09-28 Sharp Kabushiki Kaisha Antenna module, display device, antenna driving method, control program, and recording medium
TWI767263B (en) * 2020-06-24 2022-06-11 仁寶電腦工業股份有限公司 Electronic device and control method thereof

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102323875A (en) * 2011-10-26 2012-01-18 中国人民解放军国防科学技术大学 Mouse event-based multi-point touch gesture interaction method and middleware

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4295280B2 (en) * 2003-08-29 2009-07-15 ノキア コーポレイション Method and apparatus for recognizing two-point user input with a touch-based user input device
EP1969452A2 (en) * 2005-12-30 2008-09-17 Apple Inc. Portable electronic device with multi-touch input
US8049732B2 (en) * 2007-01-03 2011-11-01 Apple Inc. Front-end signal compensation
US7956847B2 (en) * 2007-01-05 2011-06-07 Apple Inc. Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
WO2008095139A2 (en) * 2007-01-31 2008-08-07 Perceptive Pixel, Inc. Methods of interfacing with multi-point input devices and multi-point input systems employing interfacing techniques
JP2011503709A (en) * 2007-11-07 2011-01-27 エヌ−トリグ リミテッド Gesture detection for digitizer
US20090213083A1 (en) * 2008-02-26 2009-08-27 Apple Inc. Simulation of multi-point gestures with a single pointing device
TWI518560B (en) * 2008-03-04 2016-01-21 Sentelic Corp Multi - finger gesture coding method and coding system
US20090243998A1 (en) * 2008-03-28 2009-10-01 Nokia Corporation Apparatus, method and computer program product for providing an input gesture indicator
US8289289B2 (en) * 2008-04-03 2012-10-16 N-trig, Ltd. Multi-touch and single touch detection
US20090309847A1 (en) * 2008-06-12 2009-12-17 You I Labs, Inc. Apparatus and method for providing multi-touch interface capability
US20100053111A1 (en) * 2008-09-04 2010-03-04 Sony Ericsson Mobile Communications Ab Multi-touch control for touch sensitive display
US20100149114A1 (en) * 2008-12-16 2010-06-17 Motorola, Inc. Simulating a multi-touch screen on a single-touch screen
US8125347B2 (en) * 2009-04-09 2012-02-28 Samsung Electronics Co., Ltd. Text entry system with depressable keyboard on a dynamic display
TW201040823A (en) * 2009-05-11 2010-11-16 Au Optronics Corp Multi-touch method for resistive touch panel
US9323398B2 (en) * 2009-07-10 2016-04-26 Apple Inc. Touch and hover sensing
US8957918B2 (en) * 2009-11-03 2015-02-17 Qualcomm Incorporated Methods for implementing multi-touch gestures on a single-touch touch surface


Also Published As

Publication number Publication date
US8830192B2 (en) 2014-09-09
CN102693000A (en) 2012-09-26
US20120182322A1 (en) 2012-07-19
TWI461962B (en) 2014-11-21
TW201232331A (en) 2012-08-01


Legal Events

C06 / PB01  Publication
C10 / SE01  Entry into substantive examination (entry into force of request for substantive examination)
C14 / GR01  Grant of patent or utility model
CF01        Termination of patent right due to non-payment of annual fee
            Granted publication date: 2016-04-27; Termination date: 2022-01-13