KR101661606B1 - Method for processing touch event when a touch point is rotating respectively to other touch point - Google Patents

Method for processing touch event when a touch point is rotating respectively to other touch point

Info

Publication number
KR101661606B1
Authority
KR
South Korea
Prior art keywords
touch
predetermined
area
processing step
tool
Prior art date
Application number
KR1020140042616A
Other languages
Korean (ko)
Other versions
KR20140122683A (en)
Inventor
강회식
소병철
장선웅
윤일현
Original Assignee
주식회사 지니틱스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 지니틱스
Priority to PCT/KR2014/003116 (WO2014168431A1)
Priority to CN201480020923.1A (CN105308540A)
Publication of KR20140122683A
Application granted
Publication of KR101661606B1

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention discloses a method for processing a user-input gesture as the intended user-input gesture when two fingers in contact with the touch-sensitive surface are rotated relative to each other while the contact is maintained.


Description

[0001] The present invention relates to a touch event processing method and a device for the same.

The present invention relates to a method of processing a touch event in which a touch tool is brought into contact with a touch sensitive surface of a touch input apparatus.

Touch input devices can be used in various user devices. To date they have been used in devices such as smartphones, PDAs, laptops, and tablets that provide both a display screen and a touch input pad. In the future, a touch input device may also be used in a user device having a very small display screen and touch input pad, such as a wristwatch.

Using a finger as the input means of a touch input device is convenient because it avoids the inconvenience of using a tool such as a stylus pen.

The tip of a stylus pen is thin enough to allow precise input. When a finger is used, however, the contact surface between the finger and the touch-sensitive surface of the touch input device is large, so it is difficult to perform user input gestures with a finger if the total area of the touch-sensitive surface is relatively small, and the finger gesture may not be recognized correctly. For example, this problem can occur in a touch input device of wristwatch size. There is therefore a need for a new type of touch input technology that can accept efficient user input even when the touch-sensitive surface of the touch input device is narrow.

The present invention provides a new processing technique for processing a touch event generated in a touch input device. Specifically, it is intended to provide a technique for accurately conveying a user's input intention even on a touch-sensitive surface having a small area.

The touch event processing method provided in accordance with the first aspect of the present invention can be performed when two fingers, while kept in contact with the touch-sensitive surface, make a gesture of rotational form (by comparison, a gesture in which the fingers move in parallel while maintaining contact with the sensing surface may be referred to as a "drag").

In the first aspect of the present invention, a value is calculated relating to the angle formed between a straight line connecting the points touched by the two fingers and a reference line of a fixed application window. If the change in this angle between a first time and a second time of the period during which the touch by the two fingers is continuously maintained exceeds a threshold value, it can be judged that a meaningful user input has been made.

According to a second aspect of the present invention, there is provided a method for determining whether to execute a predetermined processing step when a touch event is generated by a first touch tool and a second touch tool on the touch-sensitive surface of a touch input apparatus. The method includes: calculating a first value relating to the angle of a line connecting a first point, indicated by a first region of the touch-sensitive surface determined to have been touched by the first touch tool at a first time of the duration of the touch event, and a second point, indicated by a second region determined to have been touched by the second touch tool; calculating a second value relating to the angle of a line connecting a third point, indicated by a third region of the touch-sensitive surface determined to have been touched by the first touch tool at a second time of the duration of the touch event, and a fourth point, indicated by a fourth region determined to have been touched by the second touch tool; and, if the difference between the first value and the second value is greater than or equal to a predetermined threshold value, executing the predetermined processing step, and otherwise determining not to execute the predetermined processing step.

The predetermined processing step may include causing a change in the display state of a display device adapted to display an image in a fixed application window, such that the first image displayed in the fixed application window at the first time is rotated by a predetermined angle with respect to the fixed application window at a time after the second time.

The determining step may execute a predetermined first processing step or a predetermined second processing step when the difference between the first value and the second value is equal to or greater than the predetermined threshold value. In this case, if the difference between a first area occupied by the first region and a second area occupied by the second region is larger than a predetermined area threshold value, the determining step executes the first processing step, and otherwise executes the second processing step.

Likewise, the determining step may execute a predetermined first processing step or a predetermined second processing step when the difference between the first value and the second value is equal to or greater than the predetermined threshold value. In this case, if the distance between the first point and the third point is greater than a predetermined distance threshold value, the determining step executes the first processing step, and otherwise executes the second processing step.

According to a third aspect of the present invention, there is provided a user device including a touch input device having a touch-sensitive surface, a processor, a memory, and a program stored in the memory and configured to be executed by the processor. The program includes instructions for performing the steps of: calculating a first value relating to the angle of a line connecting a first point, indicated by a first region of the touch-sensitive surface determined to have been touched by a first touch tool at a first time of the duration of a touch event generated by the touch tools, and a second point, indicated by a second region determined to have been touched by a second touch tool; calculating a second value relating to the angle of a line connecting a third point, indicated by a third region of the touch-sensitive surface determined to have been touched by the first touch tool at a second time of the duration of the touch event, and a fourth point, indicated by a fourth region determined to have been touched by the second touch tool; and, if the difference between the first value and the second value is greater than or equal to a predetermined threshold value, executing a predetermined processing step, and otherwise determining not to execute the predetermined processing step.

The user device may further include a display device, and the predetermined processing step may include causing a change in the display state of the display device, which is adapted to display an image in a fixed application window, such that the first image displayed in the fixed application window at the first time is rotated by a predetermined angle with respect to the fixed application window.

According to a fourth aspect of the present invention, a computer-readable medium can be provided. The medium contains a program that causes a user device, which includes a touch input device having a touch-sensitive surface, a processor, and a memory, to perform the steps of: calculating a first value relating to the angle of a line connecting a first point, indicated by a first region of the touch-sensitive surface determined to have been touched by a first touch tool at a first time of the duration of a touch event generated by the touch tools, and a second point, indicated by a second region determined to have been touched by a second touch tool; calculating a second value relating to the angle of a line connecting a third point, indicated by a third region of the touch-sensitive surface determined to have been touched by the first touch tool at a second time of the duration of the touch event, and a fourth point, indicated by a fourth region determined to have been touched by the second touch tool; and, if the difference between the first value and the second value is greater than or equal to a predetermined threshold value, executing a predetermined processing step, and otherwise determining not to execute the predetermined processing step. The program is stored in the memory and is configured to be executed by the processor.

According to the present invention, a new technique for processing a touch event generated in a touch input device can be provided. Specifically, a technique can be provided that accurately conveys the user's input intention even on a touch-sensitive surface having a small area. Even if only a narrow touch-sensitive surface on which multi-touch is particularly difficult is provided, the present invention can be used to perform various user inputs. Likewise, even if only a touch-sensitive surface on which dragging is difficult is provided, the present invention can be used to perform various user inputs.

FIG. 1 illustrates an example of an internal structure of a user device capable of performing a touch event processing method according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram illustrating the principle of a capacitive touch input device that can be used in an embodiment of the present invention.
FIG. 3 illustrates a touch event in which two touch points rotate relative to each other while maintaining contact with the touch-sensitive surface, in an embodiment of the present invention.
FIG. 4 shows an example of image processing that may follow when a touch event according to FIG. 3 occurs.
FIG. 5 shows, as a specific example of FIG. 3, a case in which one of the two touch points remains at substantially the same position.
FIG. 6 is a flowchart of the method according to the embodiment of the present invention described with reference to FIG. 3.

Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. However, the present invention is not limited to the embodiments described herein, but may be implemented in various other forms. The terminology used herein is for the purpose of understanding the embodiments and is not intended to limit the scope of the present invention. Also, the singular forms as used below include plural forms unless the phrases expressly have the opposite meaning.

FIG. 1 shows an example of a user device capable of performing a touch event processing method according to an embodiment of the present invention.

The user device 100 includes a memory 110, a control chip 120, an external port 130, a power unit 140, an input/output subsystem 150, and other functional units not shown.

The control chip 120 may include a memory control unit 121 for controlling the memory 110, a processor 122, and a peripheral device interface unit 123. The power unit 140 may provide power to all power-consuming elements included in the user device 100. The input/output subsystem 150 may include a touch input device control unit 151 for controlling the touch input device 10, a display device control unit 152 for controlling the display device 20, and an other-input/output device control unit 153 for controlling the other input/output devices 30. The external port 130 may refer to a physical/logical port for connecting the user device 100 to an external device.

The touch input device 10, the display device 20, and the other input/output devices 30 may be integrated in the user device 100, may be provided separately from the user device 100, or may be devices connected to the user device 100 through a network.

FIG. 2 is a conceptual diagram illustrating the principle of a capacitive touch input device that can be used in an embodiment of the present invention.

The operating principle of some embodiments of the capacitive touch input device is disclosed in Korean Patent Laid-Open Nos. KR 10-2011-0076059 and KR 10-2011-0126026, and the contents of these prior patent documents are incorporated herein by reference.

FIG. 2(a) shows the touch-sensitive surface 2 included in the capacitive touch input device (hereinafter referred to as the touch input device). The touch-sensitive surface 2 is the surface provided to receive touch input and may be covered with a cover or the like to prevent external contaminants from entering the user device. Embodiments of the touch-sensitive surface 2 are shown in the above-mentioned prior patent documents.

The touch-sensitive surface 2 may include a plurality of touch nodes 21. Each touch node can be connected to a touch IC of the touch input device, and the touch IC can measure the electrostatic capacitance, that is, the capacitance value, of each touch node. A touch node is the basic unit at which the touch IC can measure capacitance separately from other nodes, and it may have a constant area. For example, in one example of the self-capacitance type each touch node may be provided by an electrode clearly distinct from the others, while in one example of the mutual-capacitance type it may be defined as the intersection region of a driving electrode and a sensing electrode. The idea of the invention does not depend on the specific way in which the touch nodes are implemented.

Assume that a touch tool touches the area enclosed by the dotted circle 200 over the plurality of touch nodes 21 shown in FIG. 2(a). The amount of change in capacitance caused by this contact can then be calculated for each of the touch nodes 201 to 204 shown in FIG. 2(b). The amount of change in the capacitance of each touch node may be proportional to the area of the contact surface between that touch node and the touch tool. For example, the change in capacitance of touch node 204 is the largest, and the change decreases in the order of touch node 203, touch node 202, and touch node 201.

In FIG. 2, a region where contact occurs, as shown by the dotted circle 200, may be referred to as a 'touch region'. The representative position 299 of the touch region 200 can be obtained based on the capacitance-change values of each touch node obtained as shown in FIG. 2(b). The representative position 299 may lie inside the touch region 200 and may be given in the form of coordinates representing one point. In the embodiments of the present invention described below, the above concepts of the touch region and the representative position can be used.
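As an illustration of the representative-position concept described above, the following minimal Python sketch (not part of the patent; the node coordinates, capacitance-change values, and function name are hypothetical) computes a representative position as the capacitance-change-weighted centroid of the touch nodes in a touch region.

```python
def representative_position(nodes):
    """Capacitance-weighted centroid of a touch region.

    `nodes` is a list of (x, y, delta_cap) tuples, one per touch node,
    where delta_cap is the capacitance change measured at that node.
    Assumption (not stated in the patent): the representative position
    is the delta-capacitance-weighted average of the node centers.
    """
    total = sum(dc for _, _, dc in nodes)
    if total == 0:
        raise ValueError("no capacitance change in this region")
    x = sum(xn * dc for xn, _, dc in nodes) / total
    y = sum(yn * dc for _, yn, dc in nodes) / total
    return (x, y)

# Example: touch nodes 201-204 of FIG. 2(b) with made-up coordinates and
# capacitance changes that decrease in the order 204 > 203 > 202 > 201.
region = [(0.0, 0.0, 5.0),   # node 201
          (1.0, 0.0, 10.0),  # node 202
          (0.0, 1.0, 20.0),  # node 203
          (1.0, 1.0, 40.0)]  # node 204
print(representative_position(region))  # a point inside the touch region
```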

FIG. 3 is a diagram for explaining how one touch event is defined and how subsequent processing is performed according to that touch event, in accordance with an embodiment of the present invention. This touch event is defined by two touch tools, i.e., a first touch tool and a second touch tool, touching the touch-sensitive surface together.

FIG. 3(a) shows the time from the start to the end of one touch event. The instant Ts at which the two touch tools 61 and 62 (e.g., the fingers 61 and 62) touch the touch-sensitive surface 2 can be defined as the starting point of the touch event. The instant Te at which the two touch tools 61 and 62 are separated from the touch-sensitive surface 2 can be defined as the end point of the touch event. One touch event therefore exists from time Ts to time Te, and this period can be defined as the duration 55 of the touch event.

In the example of FIG. 3(a), both fingers 61 and 62 start to contact the touch-sensitive surface 2 at time Ts, and both fingers 61 and 62 leave the touch-sensitive surface 2 at time Te.

In the example of FIG. 3(a), at the first time T1 the line 71 connecting the two fingers 61 and 62 is inclined counterclockwise from the vertical line 81 by an angle θ1°, and at the second time T2 it is inclined counterclockwise from the vertical line 81 by an angle θ2°. Since it is assumed that the fingers do not leave the touch-sensitive surface 2 during the duration 55 of the touch event, the fingers 61 and 62, in moving from the first time T1 to the second time T2, can be regarded as rotating clockwise relative to each other while maintaining contact with the touch-sensitive surface 2.

At the first time T1 of the duration 55 of the touch event, a first value is calculated relating to the angle θ1°, measured with respect to the vertical line 81, of the line 71 connecting the first point (x11, y11), indicated by the first region of the touch-sensitive surface 2 determined to have been touched by the first touch tool (e.g., first finger) 61, and the second point (x21, y21), indicated by the second region determined to have been touched by the second touch tool (e.g., second finger) 62 (step S210).

Next, at the second time T2 of the duration 55 of the touch event, a second value is calculated relating to the angle θ2°, measured with respect to the vertical line 81, of the line 71 connecting the third point (x31, y31), indicated by the third region of the touch-sensitive surface 2 determined to have been touched by the first touch tool 61, and the fourth point (x41, y41), indicated by the fourth region determined to have been touched by the second touch tool 62 (step S220).

At this time, the second time T2 may be later than the first time T1.

Then, a step (S230) can be executed in which, if the difference between the first value and the second value is greater than or equal to a predetermined threshold value, the predetermined processing step is executed, and otherwise it is determined that the predetermined processing step is not to be executed.

The first value and the second value may be, for example, the first angle θ1° and the second angle θ2° themselves. Step S230 determines that a predetermined user input has been performed when two fingers touching the touch-sensitive surface 2 are rotated relative to each other in a clockwise or counterclockwise direction while maintaining contact. The predetermined processing step is then executed in response to that user input.

In the case of FIG. 3, it can be assumed that the difference (θ2° − θ1°) is larger than the predetermined threshold value. It is therefore determined that a user command has reliably been input, and a subsequent process, such as rotating the screen displayed to the user in a certain direction, can be started.
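To make steps S210 to S230 concrete, here is a minimal Python sketch of the angle calculation and the threshold test. It is not the patent's implementation: the angle is measured from the vertical reference line 81 with `atan2`, the point names mirror (x11, y11) through (x41, y41) above, and the 15-degree threshold and example coordinates are assumptions.

```python
import math

def angle_to_vertical_deg(p_a, p_b):
    """Angle (in degrees) between the line connecting p_a and p_b and a
    vertical reference line; a sketch of the 'first value' / 'second
    value' computed in steps S210 and S220."""
    dx = p_b[0] - p_a[0]
    dy = p_b[1] - p_a[1]
    # atan2(dx, dy) gives the angle of the line measured from the +y
    # (vertical) axis toward the +x axis.
    return math.degrees(math.atan2(dx, dy))

def rotation_gesture_detected(p1, p2, p3, p4, threshold_deg=15.0):
    """Step S230: True if the change in angle between time T1 (points
    p1, p2) and time T2 (points p3, p4) reaches the predetermined
    threshold; the 15-degree default is an assumption."""
    theta1 = angle_to_vertical_deg(p1, p2)   # S210
    theta2 = angle_to_vertical_deg(p3, p4)   # S220
    diff = abs(theta2 - theta1)
    diff = min(diff, 360.0 - diff)           # shortest angular distance
    return diff >= threshold_deg

# Hypothetical coordinates for the situation of FIG. 3:
p1, p2 = (10.0, 30.0), (30.0, 30.0)   # (x11, y11), (x21, y21) at T1
p3, p4 = (12.0, 35.0), (28.0, 20.0)   # (x31, y31), (x41, y41) at T2
print(rotation_gesture_detected(p1, p2, p3, p4))  # True: execute the step
```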

Hereinafter, the meaning of the predetermined processing step will be described with reference to FIGS. 3 and 4.

FIG. 4(a) shows an example of a first image 91 displayed on a user screen according to an embodiment of the present invention. The dotted line of the first image 91 represents the virtual boundary of the first image 91.

FIG. 4(b) illustrates the relative arrangement of the fixed application window 95 and the first image 91 provided according to an embodiment of the present invention. The fixed application window 95 may be a region printed in advance on the synthetic resin or glass forming the case of the screen display part of the user device, so that its inside and outside are distinguished. Alternatively, the fixed application window 95 may be a fixed display area defined and displayed in software in the screen display unit of the user device. In FIG. 4(b) the first image 91 is arranged upright in the fixed application window 95, and as a result a screen as shown in FIG. 4(c) can be displayed.

In FIG. 4(d), the first image 91 is rotated 90° clockwise in the fixed application window 95, and as a result a screen as shown in FIG. 4(e) can be displayed.

In FIG. 4(f), the first image 91 is rotated 40° clockwise in the fixed application window 95, and as a result a screen as shown in FIG. 4(g) can be displayed.

The predetermined processing step of step S230 may be a step of causing a change in the display state of the display device, which is adapted to display an image in the fixed application window 95, such that the first image 91 displayed in the window 95 at the first time T1 is rotated by a predetermined angle (e.g., 40° or 90°) with respect to the fixed application window 95 and displayed after the second time T2. Therefore, when a touch event as shown in FIG. 3 occurs, the screen displayed as shown in FIG. 4(c) at the first time T1 can change, after the second time T2, to a screen such as that of FIG. 4(e) or FIG. 4(g).
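As a sketch of how the predetermined processing step of step S230 could be connected to the display, the fragment below keeps the fixed application window unchanged and only updates the orientation at which the first image is drawn inside it. The class and callback names are hypothetical; only the fixed-window and rotated-image behavior comes from the description above.

```python
class FixedWindowImageView:
    """Tracks the orientation of an image shown inside a fixed
    application window (a sketch; the redraw callback is a placeholder)."""

    def __init__(self, redraw):
        self.orientation_deg = 0   # upright, as in FIG. 4(c)
        self._redraw = redraw      # callback that repaints the window

    def on_rotation_gesture(self, clockwise=True, step_deg=90):
        """Called after step S230 decides the gesture is valid;
        step_deg is the predetermined angle (e.g., 40 or 90 degrees)."""
        delta = step_deg if clockwise else -step_deg
        self.orientation_deg = (self.orientation_deg + delta) % 360
        self._redraw(self.orientation_deg)

# Example: one 90-degree step turns the FIG. 4(c) screen into FIG. 4(e).
view = FixedWindowImageView(redraw=lambda deg: print(f"draw image at {deg} deg"))
view.on_rotation_gesture(clockwise=True, step_deg=90)
```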

FIG. 5 shows a specific example of the embodiment shown in FIG. 3.

The method of processing the touch event shown in FIG. 3 does not depend on the distance between the second point (x21, y21) and the fourth point (x41, y41). FIG. 5, on the other hand, shows the special case in which the distance between the second point (x21, y21) and the fourth point (x41, y41) is smaller than a predetermined distance threshold value. For example, the touch event of FIG. 5 corresponds to rotating the finger 61 clockwise while using the other finger 62 as a rotational axis.

Therefore, in one embodiment of the present invention, as a specific case of the touch event shown in FIG. 3, a predetermined second processing step is performed when the point touched by one finger serves as the central axis, as shown in FIG. 5. If the point touched by one finger does not serve as the central axis, the predetermined first processing step is performed. Here, the first processing step may be different from the second processing step.
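The following sketch (the threshold value and names are hypothetical) shows one way to separate the FIG. 5 pivot case from the general FIG. 3 case by checking how far the second touch point moved between T1 and T2.

```python
import math

def choose_processing_step(p2, p4, distance_threshold=5.0):
    """p2 = (x21, y21) at time T1 and p4 = (x41, y41) at time T2 are the
    points touched by the second touch tool. If the second touch point
    barely moved, the gesture is treated as the FIG. 5 pivot case and the
    second processing step is chosen; otherwise the first processing step
    is chosen. The threshold of 5.0 (same units as the coordinates) is an
    assumption."""
    moved = math.hypot(p4[0] - p2[0], p4[1] - p2[1])
    return "second" if moved < distance_threshold else "first"
```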

Hereinafter, another specific example based on the embodiment described in Fig. 3 will be described.

The touch event processing method shown in FIG. 3 does not depend on the difference between the first area of the first region, determined to have been touched by the first touch tool (e.g., first finger) 61 at the first time T1 or the second time T2, and the second area of the second region, determined to have been touched by the second touch tool (e.g., second finger) 62.

However, the case in which the difference between the first area and the second area is larger than a predetermined area threshold value and the case in which it is smaller can be processed differently.

For example, if the difference between the first area and the second area is greater than a predetermined area threshold, then one of the two fingers may be a thumb and the other one an index finger. In this case, the area of contact with the thumb is generally larger than the area of contact with the index finger.

Conversely, when the difference between the first area and the second area is smaller than the predetermined area threshold value, one of the two fingers may be the second finger (index finger) and the other may be the third finger (middle finger). In this case, the contact areas of the second finger and the third finger are similar to each other.

Therefore, in one embodiment of the present invention, as a specific case of the touch event shown in FIG. 3, the predetermined second processing step is performed when the area contacted by one of the two touch tools is larger than that of the other by more than the predetermined area threshold value. If the difference between the contacted areas is smaller than the predetermined area threshold value, the predetermined first processing step is performed.

In the above embodiment, when the two fingers perform the rotating gesture, it is determined whether one of the two fingers is a thumb, and a different subsequent process is performed depending on whether a thumb is included.
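A minimal sketch of this area-based discrimination (the threshold value and the function name are hypothetical): if the two contact areas differ by more than the area threshold, one of the contacts is assumed to be a thumb, and the subsequent processing step can be chosen accordingly.

```python
def thumb_likely_involved(area1, area2, area_threshold=30.0):
    """area1 and area2 are the contact areas (e.g., in touch-node units)
    of the two touch regions. A difference above the area threshold
    suggests a thumb plus a slimmer finger; a smaller difference suggests
    two similar fingers such as the index and middle fingers. The
    threshold of 30.0 is an assumption."""
    return abs(area1 - area2) > area_threshold
```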

FIG. 6 is a flowchart of the method according to the embodiment of the present invention described with reference to FIG. 3.

According to another embodiment of the present invention, a user device 100 can be provided that includes a touch input device 10 having a touch-sensitive surface 2, a processor 122, a memory 110, and a program stored in the memory 110 and configured to be executed by the processor 122.

At this time, the program may include instructions for executing the steps S210, S220, and S230 described above.

Meanwhile, according to another embodiment of the present invention, a computer-readable medium may be provided that contains a program causing the user device 100, which includes the touch input device 10 having the touch-sensitive surface 2, the processor 122, and the memory 110, to execute steps S210, S220, and S230. The program is stored in the memory 110 and is configured to be executed by the processor 122.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the essential characteristics thereof. The contents of each claim in the claims may be combined with other claims without departing from the scope of the claims.

Claims (7)

A method for determining whether to execute a predetermined processing step when a touch event is generated by a first touch tool and a second touch tool on a touch-sensitive surface of a touch input device, the method comprising:
calculating a first value relating to an angle of a line connecting a first point, indicated by a first region of the touch-sensitive surface determined to have been touched by the first touch tool at a first time of the duration of the touch event, and a second point, indicated by a second region determined to have been touched by the second touch tool;
calculating a second value relating to an angle of a line connecting a third point, indicated by a third region of the touch-sensitive surface determined to have been touched by the first touch tool at a second time of the duration of the touch event, and a fourth point, indicated by a fourth region determined to have been touched by the second touch tool;
determining, if the difference between the first value and the second value is greater than or equal to a predetermined threshold value, whether the difference between a first area occupied by the first region and a second area occupied by the second region is larger or smaller than a predetermined area threshold value; and
executing a predetermined first processing step if the difference between the first area and the second area is larger than the predetermined area threshold value, and otherwise executing a predetermined second processing step,
A method for processing a touch event.
The method according to claim 1,
wherein the predetermined first processing step or the predetermined second processing step is a step of causing a change in the display state of a display device adapted to display an image in a fixed application window, such that the first image displayed in the fixed application window is rotated by a predetermined angle with respect to the fixed application window.
(Claim 3 deleted)
The method according to claim 1,
wherein a predetermined third processing step or a predetermined fourth processing step is executed in the determining step when the difference between the first value and the second value is equal to or greater than the predetermined threshold value,
wherein the third processing step is executed in the determining step if the distance between the first point and the third point is greater than a predetermined distance threshold value, and otherwise the fourth processing step is executed,
A method for processing a touch event.
A user device comprising a touch input device having a touch-sensitive surface, a processor, a memory, and a program stored in the memory and configured to be executed by the processor,
wherein the program comprises instructions for performing the steps of:
calculating a first value relating to an angle of a line connecting a first point, indicated by a first region of the touch-sensitive surface determined to have been touched by a first touch tool at a first time of the duration of a touch event generated by the touch tools with respect to the touch-sensitive surface, and a second point, indicated by a second region determined to have been touched by a second touch tool;
calculating a second value relating to an angle of a line connecting a third point, indicated by a third region of the touch-sensitive surface determined to have been touched by the first touch tool at a second time of the duration of the touch event, and a fourth point, indicated by a fourth region determined to have been touched by the second touch tool;
determining, if the difference between the first value and the second value is greater than or equal to a predetermined threshold value, whether the difference between a first area occupied by the first region and a second area occupied by the second region is larger or smaller than a predetermined area threshold value; and
executing a predetermined first processing step if the difference between the first area and the second area is larger than the predetermined area threshold value, and otherwise executing a predetermined second processing step,
A user device.
The user device according to claim 5,
further comprising a display device,
wherein the predetermined first processing step or the predetermined second processing step is a step of causing a change in the display state of the display device, which is adapted to display an image in a fixed application window, such that the first image displayed in the fixed application window is rotated by a predetermined angle with respect to the fixed application window,
A user device.
A computer-readable medium storing a program that causes a user device, which includes a touch input device having a touch-sensitive surface, a processor, and a memory, to perform the steps of:
calculating a first value relating to an angle of a line connecting a first point, indicated by a first region of the touch-sensitive surface determined to have been touched by a first touch tool at a first time of the duration of a touch event generated by the touch tools with respect to the touch-sensitive surface, and a second point, indicated by a second region determined to have been touched by a second touch tool;
calculating a second value relating to an angle of a line connecting a third point, indicated by a third region of the touch-sensitive surface determined to have been touched by the first touch tool at a second time of the duration of the touch event, and a fourth point, indicated by a fourth region determined to have been touched by the second touch tool;
determining, if the difference between the first value and the second value is greater than or equal to a predetermined threshold value, whether the difference between a first area occupied by the first region and a second area occupied by the second region is larger or smaller than a predetermined area threshold value; and
executing a predetermined first processing step if the difference between the first area and the second area is larger than the predetermined area threshold value, and otherwise executing a predetermined second processing step,
wherein the program is stored in the memory and configured to be executed by the processor,
A computer-readable medium.
KR1020140042616A 2013-04-10 2014-04-09 Method for processing touch event when a touch point is rotating respectively to other touch point KR101661606B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/KR2014/003116 WO2014168431A1 (en) 2013-04-10 2014-04-10 Method for processing touch event and apparatus for same
CN201480020923.1A CN105308540A (en) 2013-04-10 2014-04-10 Method for processing touch event and apparatus for same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020130039553 2013-04-10
KR20130039553 2013-04-10

Publications (2)

Publication Number Publication Date
KR20140122683A KR20140122683A (en) 2014-10-20
KR101661606B1 true KR101661606B1 (en) 2016-09-30

Family

ID=51993693

Family Applications (3)

Application Number Title Priority Date Filing Date
KR20140042615A KR20140122682A (en) 2013-04-10 2014-04-09 Method for processing touch event where touch area rotates and device for the same
KR1020140042616A KR101661606B1 (en) 2013-04-10 2014-04-09 Method for processing touch event when a touch point is rotating respectively to other touch point
KR1020140043052A KR102191321B1 (en) 2013-04-10 2014-04-10 Method for processing touch event and device for the same

Family Applications Before (1)

Application Number Title Priority Date Filing Date
KR20140042615A KR20140122682A (en) 2013-04-10 2014-04-09 Method for processing touch event where touch area rotates and device for the same

Family Applications After (1)

Application Number Title Priority Date Filing Date
KR1020140043052A KR102191321B1 (en) 2013-04-10 2014-04-10 Method for processing touch event and device for the same

Country Status (2)

Country Link
KR (3) KR20140122682A (en)
CN (1) CN105308540A (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106020712B (en) * 2016-07-29 2020-03-27 青岛海信移动通信技术股份有限公司 Touch gesture recognition method and device
CN106250022B (en) * 2016-07-29 2019-07-09 努比亚技术有限公司 Content selection method of adjustment, device and terminal
CN106569723A (en) * 2016-10-28 2017-04-19 努比亚技术有限公司 Device and method for controlling cursor movement
US11922008B2 (en) 2021-08-09 2024-03-05 Samsung Electronics Co., Ltd. Electronic device processing input of stylus pen and method for operating the same

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102464A1 (en) * 2009-11-03 2011-05-05 Sri Venkatesh Godavari Methods for implementing multi-touch gestures on a single-touch touch surface
WO2012064128A2 (en) 2010-11-10 2012-05-18 Chae Sang-Woo Touch screen apparatus and method for controlling same

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100194701A1 (en) * 2008-10-28 2010-08-05 Hill Jared C Method of recognizing a multi-touch area rotation gesture
EP2378403A1 (en) * 2010-04-19 2011-10-19 Tyco Electronics Services GmbH Method and device for determining a user's touch gesture
CN101917548A (en) * 2010-08-11 2010-12-15 无锡中星微电子有限公司 Image pickup device and method for adaptively adjusting picture
KR101718893B1 (en) * 2010-12-24 2017-04-05 삼성전자주식회사 Method and apparatus for providing touch interface
TWI478041B (en) * 2011-05-17 2015-03-21 Elan Microelectronics Corp Method of identifying palm area of a touch panel and a updating method thereof

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102464A1 (en) * 2009-11-03 2011-05-05 Sri Venkatesh Godavari Methods for implementing multi-touch gestures on a single-touch touch surface
WO2012064128A2 (en) 2010-11-10 2012-05-18 Chae Sang-Woo Touch screen apparatus and method for controlling same

Also Published As

Publication number Publication date
CN105308540A (en) 2016-02-03
KR20140122683A (en) 2014-10-20
KR102191321B1 (en) 2020-12-16
KR20140122682A (en) 2014-10-20
KR20140122687A (en) 2014-10-20

Similar Documents

Publication Publication Date Title
US10379727B2 (en) Moving an object by drag operation on a touch panel
TWI569171B (en) Gesture recognition
EP2359224B1 (en) Generating gestures tailored to a hand resting on a surface
JP5738707B2 (en) Touch panel
CN107741824B (en) Detection of gesture orientation on repositionable touch surface
EP2715485B1 (en) Target disambiguation and correction
CN104007932A (en) Touch point recognition method and device
US9569045B2 (en) Stylus tilt and orientation estimation from touch sensor panel images
US20140210742A1 (en) Emulating pressure sensitivity on multi-touch devices
US8542207B1 (en) Pencil eraser gesture and gesture recognition method for touch-enabled user interfaces
JP6410537B2 (en) Information processing apparatus, control method therefor, program, and storage medium
KR101661606B1 (en) Method for processing touch event when a touch point is rotating respectively to other touch point
CN105653177A (en) Method for selecting clickable elements of terminal equipment interface and terminal equipment
KR102198596B1 (en) Disambiguation of indirect input
US9256360B2 (en) Single touch process to achieve dual touch user interface
JP6255321B2 (en) Information processing apparatus, fingertip operation identification method and program
JP5757118B2 (en) Information processing apparatus, information processing method, and program
JP2014175012A (en) Mouse pointer control method
JP2015146090A (en) Handwritten input device and input control program
TW201528114A (en) Electronic device and touch system, touch method thereof
WO2017034425A1 (en) System and method for disambiguating touch interactions

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190903

Year of fee payment: 4