WO2011150607A1 - Method for automatically recognizing gestures and mobile terminal - Google Patents

Method for automatically recognizing gestures and mobile terminal

Info

Publication number
WO2011150607A1
WO2011150607A1 (PCT/CN2010/076742, CN2010076742W)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
mobile terminal
finger
touch screen
fingers
Prior art date
Application number
PCT/CN2010/076742
Other languages
English (en)
French (fr)
Inventor
魏兰英
赵薇
杨新力
胡博
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 filed Critical 中兴通讯股份有限公司
Priority to EP10852402.6A priority Critical patent/EP2570901B1/en
Publication of WO2011150607A1 publication Critical patent/WO2011150607A1/zh

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/04166Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • The present invention relates to gesture recognition technology in the field of mobile terminals, and in particular to a method for automatically recognizing a gesture and a mobile terminal. Background Art
  • FIG. 1 is a schematic structural diagram of a prior-art system for implementing gesture recognition on the Android platform. As shown in FIG. 1, consider one two-finger split or pinch action, and assume the driver layer reports data to the architecture layer at a frequency of 80 Hz.
  • The architecture layer preprocesses the information of a complete event and places it in a motion class. Because the driver layer reports data to the architecture layer at 80 Hz, at most 80 motions are generated per second, and the data in each preprocessed motion class is then sent to the gesture algorithm processing layer for processing.
  • The gesture algorithm processing layer runs once every 28 ms, or roughly 35 times per second.
  • The drawback of this method is that the algorithm used by the gesture algorithm processing layer runs roughly 35 times per second, so the computational load of that layer is large and the response of the mobile terminal is relatively slow. Moreover, the processing speed is dictated by the reporting frequency of the driver layer, with no ability to adapt to actual requirements. Summary of the Invention
  • the main object of the present invention is to provide a method for automatically recognizing a gesture and a mobile terminal, which can realize automatic recognition of multi-finger gestures quickly and efficiently.
  • The invention discloses a method for automatically recognizing a gesture, comprising: calibrating the touch screen of the mobile terminal, and acquiring and saving a time-variation threshold and a distance-variation threshold for finger movement on the touch screen; acquiring the touch information of fingers on the touch screen and preprocessing the data of the touch information; and extracting the time-variation threshold and the distance-variation threshold, and recognizing the gesture according to the preprocessed data, the touch information, and the extracted time-variation threshold and distance-variation threshold.
  • Calibrating the touch screen of the mobile terminal and acquiring and saving the thresholds comprises: obtaining a motion speed from the change in distance between the fingers and the time taken to produce that change; finding, among pre-stored empirical speed values, the speed closest to the obtained motion speed; and obtaining the distance-variation threshold from the closest speed and a time-variation threshold selected from pre-stored empirical time values, then saving the time-variation threshold and the distance-variation threshold.
  • The pre-stored empirical speed values are gesture speeds sampled from users of different age groups, heights, and genders, and are arranged according to a normal distribution.
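The lookup described above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the function names and the sample speed values are invented, and "arranged according to a normal distribution" is interpreted here as ordering the stored speeds so the most probable values (those nearest the population mean) are examined first.

```python
# Hypothetical sketch: find the pre-stored empirical gesture speed closest to
# a measured speed. All names and numbers are illustrative assumptions.

def arrange_by_likelihood(speeds, mean):
    """Order speeds by closeness to the population mean, so a scan that
    starts from the front usually terminates early for typical users."""
    return sorted(speeds, key=lambda v: abs(v - mean))

def closest_speed(measured, arranged):
    """Return the stored speed nearest to the measured one."""
    return min(arranged, key=lambda v: abs(v - measured))

empirical = [120.0, 180.0, 90.0, 240.0, 150.0, 60.0, 300.0]  # px/s, made up
arranged = arrange_by_likelihood(empirical, mean=150.0)
print(closest_speed(175.0, arranged))  # → 180.0
```

A real implementation would likely keep the arranged list in the terminal's database, as the patent's database module suggests.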
  • Acquiring the touch information of fingers on the touch screen of the mobile terminal comprises: a chip in the driver layer of the mobile terminal acquires the touch information in real time and sends it to the architecture layer. The touch information includes: the coordinate values of each finger in a coordinate system whose origin is the upper-left corner of the touch screen, the finger width value, the pressure of the finger on the touch screen, and the press-touch (BTN_TOUCH) value of the finger.
  • Preprocessing the data of the touch information comprises: the architecture layer of the mobile terminal records the motion-state information of the fingers into a motion class according to the data of the touch information, records the data of the touch information into the motion class, and sends the data in the motion class to the gesture algorithm processing layer. The motion-state information includes: a finger is moving, all fingers are lifted, and a finger is pressed down.
  • Recognizing the gesture according to the preprocessed data, the touch information, and the extracted time-variation and distance-variation thresholds comprises: the gesture algorithm processing layer of the mobile terminal obtains the motion state of the fingers on the touch screen from the preprocessed data. When the motion state indicates that fingers are moving on the touch screen, the number of moving fingers is determined from the number of separators in the touch information that divide the touch information of different fingers. When two fingers are determined to be moving, the coordinates of the two fingers and the current time are recorded in real time, and the distance between the two fingers is calculated; when the absolute value of the difference between two successive two-finger distances is greater than the distance-variation threshold, and the difference between the two corresponding times is greater than the time-variation threshold, the gesture is a valid gesture.
  • the method further includes:
  • When three fingers are determined to be moving on the touch screen, the coordinates of the three fingers and the current time are recorded in real time, and the radius of the circumscribed circle through the three coordinates is calculated; when the absolute value of the difference between two successive circumscribed-circle radii is greater than the distance-variation threshold and the difference between the two corresponding times is greater than the time-variation threshold, the gesture is a valid gesture.
  • the method further includes:
  • The gesture is an invalid gesture when the motion state indicates that all fingers are lifted or that a finger has just been pressed down.
  • the method further includes: implementing a function corresponding to the gesture on the mobile terminal according to the recognition result.
  • the present invention also discloses a mobile terminal that automatically recognizes a gesture, including: a gesture calibration module, a driver layer, an architecture layer, and a gesture algorithm processing layer;
  • a gesture calibration module configured to calibrate the touch screen of the mobile terminal, and to acquire and save the time-variation threshold and distance-variation threshold for finger movement on the touch screen;
  • a driving layer configured to acquire touch information of a finger on a touch screen of the mobile terminal
  • An architecture layer configured to preprocess data of the acquired touch information
  • a gesture algorithm processing layer configured to extract the pre-stored time-variation threshold and distance-variation threshold, and to recognize the gesture according to the preprocessed data, the touch information, and the extracted thresholds.
  • The mobile terminal further includes an application layer and a database, wherein the application layer is configured to implement the function corresponding to the gesture on the mobile terminal according to the recognition result, and the database is configured to save the obtained time-variation threshold and distance-variation threshold, and also to save the empirical speed values and empirical time values.
  • The present invention provides a method and a mobile terminal for automatically recognizing a gesture: pre-stored time-variation and distance-variation thresholds are extracted, and the gesture is recognized according to the preprocessed data, the touch information, and the extracted thresholds; that is, a new algorithm is used at the gesture algorithm processing layer. Compared with roughly 35 calculations per second in the prior art, the present invention performs only 10 to 15 calculations per second, which greatly reduces the computational load of the gesture algorithm processing layer, improves the response speed of the mobile terminal, and realizes quick and efficient automatic gesture recognition, bringing a good user experience to touch-screen users and making operation more convenient.
  • Moreover, gesture calibration can be performed for different users, giving the gesture recognition good adaptability.
  • FIG. 1 is a schematic structural diagram of a system for implementing gesture recognition on an android platform in the prior art
  • FIG. 2 is a schematic flowchart of a method for automatically recognizing a gesture according to the present invention
  • FIG. 3 is a schematic diagram of the data format when different numbers of fingers are pressed and the BTN_TOUCH value is included, according to the present invention;
  • FIG. 4 is a schematic diagram of the data format when different numbers of fingers are pressed and the BTN_TOUCH value is not included, according to the present invention;
  • FIG. 5 is a schematic structural diagram of a mobile terminal for implementing automatic gesture recognition according to the present invention. Detailed Description
  • The basic idea of the present invention is: calibrate the touch screen of the mobile terminal, and acquire and save the time-variation threshold and distance-variation threshold for finger movement on the touch screen; acquire the touch information of fingers on the touch screen and preprocess its data; and extract the time-variation threshold and distance-variation threshold, then recognize the gesture according to the preprocessed data, the touch information, and the extracted thresholds.
  • FIG. 2 is a schematic flowchart of a method for automatically recognizing a gesture according to the present invention. As shown in FIG. 2, the method includes the following steps:
  • Step 201: Calibrate the touch screen of the mobile terminal, and acquire and save the time-variation threshold and distance-variation threshold for finger movement on the touch screen.
  • The gesture calibration function of the mobile terminal is activated and the touch screen is calibrated; once calibrated, recalibration is not required for each subsequent use. The gesture calibration function can be implemented by a calibration application.
  • Taking a two-finger gesture as an example, the calibration process is as follows: the distance between the two fingers and the elapsed time are recorded during the period from when the two fingers press the touch screen until they are lifted, thereby yielding the speed of the two fingers; the closest speed V is then found among the empirical speed values stored in advance in the mobile terminal. According to factors such as the size of the touch screen and the frequency at which the driver layer sends touch information to the architecture layer, a time-variation threshold ΔT is selected, and V is multiplied by ΔT to obtain the distance-variation threshold ΔS; ΔT and ΔS are saved for later extraction. The value of ΔT must be an integer multiple of the interrupt interval, i.e., the interval between two successive transmissions of touch information from the driver layer to the architecture layer; for example, when the driver layer sends touch information at 80 Hz, the interrupt interval is 12.5 ms.
  • The pre-stored empirical speed values and empirical time values are gesture movement times and gesture speeds sampled from users of different age groups, heights, genders, and so on. The empirical speed values are arranged according to a normal distribution, which improves the efficiency of finding the closest speed V in the database of the mobile terminal and thus speeds up the response of the mobile terminal.
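The calibration step above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the 80 Hz rate and the rule ΔS = V × ΔT come from the description, while the rounding policy, function names, and all numeric inputs are invented for the example.

```python
# Hypothetical calibration sketch: the distance threshold ΔS is the closest
# empirical speed V multiplied by a time threshold ΔT, where ΔT must be an
# integer multiple of the driver-layer interrupt interval (12.5 ms at 80 Hz).

INTERRUPT_HZ = 80.0
INTERRUPT_INTERVAL = 1.0 / INTERRUPT_HZ  # 12.5 ms between driver reports

def pick_delta_t(desired_seconds):
    """Round the desired time window to an integer multiple of the
    interrupt interval, as the description requires."""
    multiples = max(1, round(desired_seconds / INTERRUPT_INTERVAL))
    return multiples * INTERRUPT_INTERVAL

def calibrate(distance_change, elapsed, empirical_speeds, desired_delta_t):
    """Compute the (ΔT, ΔS) threshold pair from one calibration gesture."""
    speed = distance_change / elapsed                        # measured speed
    v = min(empirical_speeds, key=lambda s: abs(s - speed))  # closest stored V
    delta_t = pick_delta_t(desired_delta_t)
    delta_s = v * delta_t                                    # ΔS = V * ΔT
    return delta_t, delta_s

dt, ds = calibrate(300.0, 2.0, [100.0, 150.0, 200.0], 0.1)
print(dt, ds)  # ΔT = 0.1 s (8 interrupt intervals), ΔS ≈ 15.0
```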
  • Step 202 Acquire touch information of a finger on a touch screen of the mobile terminal.
  • the chip in the driving layer of the mobile terminal acquires the touch information of the finger on the touch screen of the mobile terminal in real time, and sends the touch information to the architecture layer through the transmission channel connected by the driver layer and the architecture layer according to a certain data format;
  • The touch information includes: the coordinate values x and y of each finger in a coordinate system whose origin is the upper-left corner of the touch screen of the mobile terminal, the finger width value w, the pressure p of the finger on the touch screen, and the press-touch (BTN_TOUCH, Button Touch) value of the finger.
  • When the BTN_TOUCH value is 1, it indicates that a finger has pressed down for the first time; when the BTN_TOUCH value is 0, it indicates that all fingers have been lifted. The BTN_TOUCH value is sent to the architecture layer as part of the touch information only when a finger is first pressed or all fingers are lifted, that is, only when the BTN_TOUCH value changes. The data formats are shown in FIG. 3 and FIG. 4.
  • FIG. 3 shows the data format when one, two, three, or N fingers are pressed and the BTN_TOUCH value is included; FIG. 4 shows the corresponding formats when the BTN_TOUCH value is not included.
  • The SYN_MT_REPORT value is a separator that divides the touch information of different fingers, and SYN_REPORT is a separator that divides the touch information sent each time. The driver layer sends touch information to the architecture layer at a certain frequency, also called the interrupt frequency; different touch-screen manufacturers provide different interrupt frequencies, usually 60 Hz to 80 Hz, and in some cases up to 250 Hz.
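The per-finger format above can be illustrated with a small parser. The event names mirror the Linux multi-touch protocol named in the text; the tuple encoding of the event stream is an assumption made for this sketch, and the coordinate values are invented.

```python
# Illustrative parser for the reported event stream: each finger contributes
# x, y, width w, pressure p, then a SYN_MT_REPORT separator, and the whole
# report ends with SYN_REPORT, so M fingers yield N = 5*M + 1 items
# (6 for one finger, 11 for two).

def count_fingers(events):
    """Number of fingers = number of SYN_MT_REPORT separators in one report."""
    return sum(1 for e in events if e[0] == "SYN_MT_REPORT")

def split_contacts(events):
    """Group (x, y, w, p) tuples per finger using the separators."""
    contacts, current = [], {}
    for name, value in events:
        if name == "SYN_MT_REPORT":
            contacts.append((current.get("x"), current.get("y"),
                             current.get("w"), current.get("p")))
            current = {}
        elif name == "SYN_REPORT":
            break
        else:
            current[name] = value
    return contacts

two_finger_report = [
    ("x", 10), ("y", 20), ("w", 3), ("p", 40), ("SYN_MT_REPORT", 0),
    ("x", 90), ("y", 80), ("w", 3), ("p", 35), ("SYN_MT_REPORT", 0),
    ("SYN_REPORT", 0),
]
print(len(two_finger_report))            # N = 11 = 5*2 + 1
print(count_fingers(two_finger_report))  # → 2
print(split_contacts(two_finger_report)) # two (x, y, w, p) contacts
```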
  • Step 203 Perform preprocessing on the obtained data in the touch information.
  • The architecture layer of the mobile terminal receives the touch information sent by the driver layer according to the data format in which the driver layer sends it. For example, if the driver layer sends the touch information in the order coordinate value x, coordinate value y, finger width value w, and pressure p of the finger on the touch screen, the architecture layer receives it in that same order.
  • The data in the touch information is then preprocessed, that is, the motion-state information of the fingers is recorded into a motion class according to the received data. The motion-state information includes ACTION_MOVE, ACTION_UP, and ACTION_DOWN, where ACTION_MOVE indicates that a finger is moving, ACTION_UP indicates that all fingers have been lifted, and ACTION_DOWN indicates that a finger has been pressed down. Whether the state is ACTION_MOVE is determined by whether the touch information contains a BTN_TOUCH value: if there is no BTN_TOUCH value, a finger is moving on the touch screen, i.e., the ACTION_MOVE state; if there is a BTN_TOUCH value, its value is examined, where 0 means all fingers have been lifted (the ACTION_UP state) and 1 means a finger has been pressed (the ACTION_DOWN state).
  • At the same time, the architecture layer records the data of the touch information into the motion class, so that the motion trajectory of each finger can be obtained from the recorded data, and sends the data in the motion class to the gesture algorithm processing layer. The motion class is a class in the programming-language sense, storing data of the same nature; it is equivalent to a storage medium for the touch information and its preprocessing results.
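The BTN_TOUCH-based classification rule described above can be sketched directly. The ACTION_* names follow the Android MotionEvent convention the text uses; the event encoding and the numeric constant values are assumptions of this sketch.

```python
# Sketch of the architecture-layer preprocessing rule: a report with no
# BTN_TOUCH event means fingers are moving (ACTION_MOVE); BTN_TOUCH = 1 means
# a first press (ACTION_DOWN); BTN_TOUCH = 0 means all fingers lifted
# (ACTION_UP). Constant values are illustrative.

ACTION_DOWN, ACTION_UP, ACTION_MOVE = 0, 1, 2

def classify_motion(events):
    """Map one driver report (list of (name, value) pairs) to a motion state."""
    btn = dict(events).get("BTN_TOUCH")  # absent unless the value changed
    if btn is None:
        return ACTION_MOVE
    return ACTION_DOWN if btn == 1 else ACTION_UP

print(classify_motion([("x", 5), ("y", 6)]))          # → 2 (ACTION_MOVE)
print(classify_motion([("BTN_TOUCH", 1), ("x", 5)]))  # → 0 (ACTION_DOWN)
print(classify_motion([("BTN_TOUCH", 0)]))            # → 1 (ACTION_UP)
```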
  • Step 204: Extract the pre-stored time-variation threshold and distance-variation threshold, and recognize the gesture according to the preprocessed data, the touch information, and the extracted thresholds.
  • The gesture algorithm processing layer of the mobile terminal receives the data in the motion class sent by the architecture layer and determines the motion state of the fingers on the touch screen from the received motion-state information. Because the SYN_MT_REPORT value in the touch information is the separator dividing the touch information of different fingers, the number of fingers moving on the touch screen can be determined from the number of SYN_MT_REPORT values in the data.
  • If the motion-state information obtained from the motion class is ACTION_UP or ACTION_DOWN, the fingers have all been lifted or have only just pressed down, so no finger is moving on the mobile terminal; the gesture is an invalid gesture, no recognition is needed, and the flow ends.
  • If the state is ACTION_MOVE and the gesture algorithm processing layer determines that two fingers are moving on the touch screen, taking two-finger pinch and two-finger spread gestures as examples: as the two fingers move on the touch screen, the gesture algorithm processing layer records the coordinates of the two fingers and the current time in real time and calculates the distance between the two fingers. The layer extracts the ΔT and ΔS saved in step 201 from the database; when the absolute value of the difference between two successive two-finger distances is greater than ΔS and the difference between the two corresponding times is greater than ΔT, the gesture is a valid gesture, and its change coefficient is calculated.
  • If the gesture algorithm processing layer determines that three fingers are moving on the touch screen, taking three-finger pinch and three-finger spread gestures as examples: the layer records the coordinates of the three fingers and the current time in real time and, by the principle that three non-collinear points determine a circumscribed circle, obtains the circumscribed circle from the three coordinates and calculates its radius R and the current time. When the coordinates of the three fingers are next recorded, the new radius and time are obtained; when the absolute value of the difference between the two radii is greater than ΔS and the difference between the two times is greater than ΔT, the gesture is a valid gesture, and its change coefficient is calculated.
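The threshold test for both finger counts can be sketched with one helper. The circumradius formula R = abc / (4A) for three non-collinear points is standard geometry consistent with the text; the function names and all numeric inputs are illustrative assumptions.

```python
# Sketch of the recognition-step threshold test: a two-finger gesture is valid
# when |d2 - d1| > ΔS and t2 - t1 > ΔT; for three fingers, the radius of the
# circumscribed circle through the three contacts replaces the distance.
import math

def circumradius(p1, p2, p3):
    """Radius of the circle through three non-collinear points: R = abc/(4A)."""
    a = math.dist(p2, p3); b = math.dist(p1, p3); c = math.dist(p1, p2)
    # Twice the triangle area via the cross product magnitude
    area2 = abs((p2[0]-p1[0])*(p3[1]-p1[1]) - (p3[0]-p1[0])*(p2[1]-p1[1]))
    return (a * b * c) / (2 * area2)

def is_valid_gesture(m1, m2, t1, t2, delta_s, delta_t):
    """m1/m2 are the metric (two-finger distance or circumradius) at t1/t2."""
    return abs(m2 - m1) > delta_s and (t2 - t1) > delta_t

# Two-finger spread: distance grows from 100 to 140 px over 0.15 s
print(is_valid_gesture(100.0, 140.0, 0.0, 0.15, delta_s=20.0, delta_t=0.1))

# Three-finger triangle (0,0), (4,0), (0,3): circumradius = hypotenuse / 2
print(circumradius((0, 0), (4, 0), (0, 3)))  # → 2.5
```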
  • Step 205: Implement the function corresponding to the gesture on the mobile terminal according to the recognition result.
  • The application layer of the mobile terminal receives the recognition result sent by the gesture algorithm processing layer and compares the value of the change coefficient scale with 1. If scale is less than 1, the two-finger or three-finger gesture is a pinch-close; for example, an image-reduction function can be implemented on the mobile terminal, with the reduction ratio calculated from the value of scale. If scale is greater than 1, the two-finger or three-finger gesture is a spread; for example, an image-enlargement function can be implemented, with the magnification likewise calculated from the value of scale. If scale equals 1, the fingers did not move on the touch screen, and no operation is performed.
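The application-layer dispatch on the scale value can be sketched as follows; the image-resizing interpretation is the example the text itself gives, while the function name and sizes are invented for illustration.

```python
# Hypothetical application-layer handling of the recognition result:
# scale < 1 → pinch-close, shrink the image; scale > 1 → spread, enlarge;
# scale == 1 → fingers did not move, so no operation is performed.

def apply_gesture_result(scale, image_size):
    """Return the new (width, height) after applying the zoom coefficient."""
    if scale == 1.0:
        return image_size            # no movement: no-op
    w, h = image_size
    return (round(w * scale), round(h * scale))

print(apply_gesture_result(0.8, (200, 100)))   # pinch close → (160, 80)
print(apply_gesture_result(1.25, (200, 100)))  # spread apart → (250, 125)
print(apply_gesture_result(1.0, (200, 100)))   # unchanged → (200, 100)
```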
  • The gesture in the present invention may be composed of multiple fingers; in general, two-finger and three-finger gestures work best.
  • The present invention can be applied to a variety of operating systems, such as the Windows Mobile operating system, the Symbian operating system, and the Android operating system, and can also be applied to global positioning system devices.
  • FIG. 5 is a schematic structural diagram of a mobile terminal that implements automatic gesture recognition according to the present invention. The mobile terminal includes: a gesture calibration module 51, a driver layer 52, an architecture layer 53, and a gesture algorithm processing layer 54, wherein:
  • The gesture calibration module 51 is configured to calibrate the touch screen of the mobile terminal, and to acquire and save the time-variation threshold and distance-variation threshold for finger movement on the touch screen.
  • Specifically, the gesture calibration module 51 obtains the motion speed from the change in distance between the fingers and the time taken to produce that change; finds the speed closest to that motion speed among the pre-stored empirical speed values; obtains the distance-variation threshold from the closest speed and a time-variation threshold selected from the pre-stored empirical time values; and saves the two thresholds.
  • The driver layer 52 is configured to acquire the touch information of fingers on the touch screen of the mobile terminal.
  • Specifically, a chip in the driver layer 52 acquires the touch information of fingers on the touch screen of the mobile terminal in real time, and sends the acquired touch information, in a certain data format and at a certain frequency, to the architecture layer 53 through the transmission channel between the driver layer 52 and the architecture layer 53.
  • The touch information includes: the coordinate values of each finger in a coordinate system whose origin is the upper-left corner of the touch screen, the finger width value, the pressure of the finger on the touch screen, and the press-touch value of the finger.
  • The architecture layer 53 is configured to preprocess the data of the touch information.
  • Specifically, the architecture layer 53 records the motion-state information of the fingers into the motion class according to the data of the touch information, where the motion-state information includes: a finger is moving, all fingers are lifted, and a finger is pressed down. The architecture layer also records the data of the touch information into the motion class, and sends the data in the motion class to the gesture algorithm processing layer 54.
  • The gesture algorithm processing layer 54 is configured to extract the pre-stored time-variation threshold and distance-variation threshold, and to recognize the gesture according to the preprocessed data, the touch information, and the extracted thresholds.
  • Specifically, the gesture algorithm processing layer 54 obtains the motion state of the fingers on the touch screen from the preprocessed data. When the motion state indicates that fingers are moving on the touch screen, it determines the number of moving fingers from the number of separators dividing the touch information of different fingers. When two fingers are moving on the touch screen, it records the coordinates of the two fingers and the current time in real time and calculates the distance between the two fingers; when the absolute value of the difference between two successive distances is greater than the distance-variation threshold and the difference between the two corresponding times is greater than the time-variation threshold, the gesture is valid and its change coefficient is calculated. When three fingers are moving on the touch screen, it records the coordinates of the three fingers and the current time in real time and calculates the radius of the circumscribed circle through the three coordinates; when the absolute value of the difference between two successive radii is greater than the distance-variation threshold and the difference between the two corresponding times is greater than the time-variation threshold, the gesture is valid and its change coefficient is calculated. When the motion state indicates that all fingers are lifted or a finger has just been pressed, the gesture is an invalid gesture, and the flow ends.
  • the mobile terminal further includes: an application layer 55; wherein
  • the application layer 55 is configured to implement a gesture corresponding function on the mobile terminal according to the recognition result.
  • The mobile terminal further includes: a database 56, configured to save the obtained time-variation threshold and distance-variation threshold, and also to save the empirical speed values, empirical time values, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Description

Method for Automatically Recognizing Gestures and Mobile Terminal

Technical Field

The present invention relates to gesture recognition technology in the field of mobile terminals, and in particular to a method for automatically recognizing gestures and a mobile terminal.

Background Art
The rapid development of mobile communications has changed every aspect of society to varying degrees, and mobile terminals have become an indispensable part of most people's lives. Future human-computer interaction on mobile terminals will develop mainly toward naturalness, multimodality, and collaboration, attempting to form a multi-channel, multi-mode natural dialogue between the user and the mobile terminal through natural human means of communication such as gestures, speech, and facial expressions, so as to improve the user experience. The trend of the mobile terminal user interface (UI, User Interface) developing from "technology-centered" to "user-centered" makes natural, intuitive human-computer interaction an inevitable direction of UI development. Among these, gesture interaction, as a UI interaction form suited to the trend toward natural interaction, is gradually attracting attention and being applied ever more widely.
Mobile terminal manufacturers have already spent great effort on UI technology, including UI design, mice, keyboards, trackballs, gravity sensors, and so on. With the popularization of smart mobile terminals, the functions of the touch screen have become increasingly irreplaceable, and gestures, thanks to their novelty, convenience, and ease of use, have become a new human-computer interaction technology. Two-finger touch interaction based on natural gestures is a new interaction technology for a natural, harmonious dialogue between the user and the mobile terminal. This approach is "user-centered" interaction: unlike traditional touch screens that support only single-finger operation, it allows a user to operate a mobile terminal with multiple fingers at once, and even allows multiple users to operate it simultaneously. However, simultaneous multi-finger operation means handling more complex tasks, so realizing gesture interaction both quickly and efficiently is an urgent problem for mobile terminal manufacturers. So far, Apple has been doing research in this area; the functions already realized mainly include slide-to-unlock, zooming, and flipping, but they focus mainly on UI design. In addition, some touch-screen manufacturers have done low-level gesture interaction processing, mainly studying low-level algorithms and structures; but because the algorithms and structures differ, mobile terminals from different manufacturers are difficult to make mutually compatible.
图 1是现有技术中 android平台上实现手势识别的***结构示意图, 如 图 1所示, 做一次两指分离或者合拢的动作, 假设驱动层以 80Hz的频率上 报数据给架构层, 每秒架构层需要进行 80*N次的计算, 其中 N代表一次 完整事件所需要手指的触点信息, 触点信息主要包括: 以移动终端屏幕左 上角为原点的 X坐标的值和 y坐标的值, 手指的指宽 w, 手指对屏幕的压 力 p,多点同步上才艮( SY MT REPORT, Synchronize Multi-Touch Report ) 的值, 同步上报( SY _REPORT, Synchronize Report )的值; 如果是单指, 则 N=6, 如果是两指, 则 N=l l , 如果是 M个指头, 则 N = 5*M+1 ; 架构 层把一次完整事件的信息进行预处理,放在 motion类中,因为驱动层以 80Hz 的频率上报数据给架构层, 所以每秒最多产生 80次 motion, 然后将每次预 处理后的 motion类中的数据发送给手势算法处理层进行处理, 手势算法处 理层每 28ms处理一次, 所以每秒大概计算 35次。
The calculation performed by the algorithm used in the gesture algorithm processing layer is as follows: record the coordinates (x1, y1) and (x2, y2) of the two points at the first two-finger press, and the coordinates (x1', y1') and (x2', y2') of the two points at the second two-finger press; the previous and current distances between the two points are

Δs_pre = √((x2 − x1)² + (y2 − y1)²) and Δs_cur = √((x2' − x1')² + (y2' − y1')²),

and the scale factor is scale = Δs_cur / Δs_pre, where Δs_cur denotes the current distance between the two points and Δs_pre the distance at the previous report. The scale value is sent to the application layer; for a two-finger separating action, the scale value sent may be at most 1.25 times the previous scale value, and for a two-finger closing action, at least 0.8 times the previous scale value. The drawbacks of this method are: the algorithm used by the gesture algorithm processing layer performs roughly 35 calculations per second, so the computational load of this layer is large and the mobile terminal responds rather slowly; moreover, the processing speed also depends on the reporting frequency of the driver layer, and the method has no ability to adapt itself to demand.

Summary of the Invention
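The clamping of the scale value in the prior-art algorithm described above can be sketched as follows; this is a minimal illustration with assumed variable names, not the actual platform code.

```python
def clamped_scale(ds_cur, ds_pre, prev_scale):
    """Scale factor sent to the application layer, limited to at most
    1.25x the previous scale when the fingers separate and at least
    0.8x the previous scale when they close together."""
    scale = ds_cur / ds_pre
    if scale > 1.0:                      # fingers moving apart
        scale = min(scale, 1.25 * prev_scale)
    elif scale < 1.0:                    # fingers closing together
        scale = max(scale, 0.8 * prev_scale)
    return scale
```

The clamp keeps a single noisy report from producing a sudden jump in the zoom factor shown to the user.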
In view of this, the main objective of the present invention is to provide a method and a mobile terminal for automatically recognizing gestures, capable of recognizing multi-finger gestures automatically, quickly, and efficiently.

To achieve the above objective, the technical solution of the present invention is implemented as follows:

The present invention discloses a method for automatically recognizing gestures, including:

calibrating the touch screen of a mobile terminal, and obtaining and saving a time variation threshold and a distance variation threshold for fingers moving on the touch screen;

obtaining touch information of fingers on the touch screen of the mobile terminal, and preprocessing the data of the obtained touch information;

extracting the time variation threshold and the distance variation threshold, and recognizing the gesture according to the preprocessed data, the touch information, and the extracted time variation threshold and distance variation threshold.
In the above method, calibrating the touch screen of the mobile terminal and obtaining and saving the time variation threshold and the distance variation threshold for fingers moving on the touch screen is:

obtaining the movement speed from the amount of change in the distance between the fingers and the time taken to produce that change; finding, among pre-stored empirical speed values, the speed closest to the obtained movement speed; obtaining the distance variation threshold from the closest speed and a time variation threshold selected from pre-stored empirical time values; and saving the obtained time variation threshold and distance variation threshold.

In the above method, the pre-stored empirical speed values are empirical gesture speed values obtained by sampling users of different ages, heights, and genders, and the pre-stored empirical speed values are arranged according to a normal distribution.

In the above method, obtaining the touch information of fingers on the touch screen of the mobile terminal is: a chip in the driver layer of the mobile terminal obtains, in real time, the touch information of fingers on the touch screen of the mobile terminal, and sends the touch information to the framework layer; the touch information includes: the coordinate values of the fingers in a coordinate system whose origin is the upper-left corner of the touch screen of the mobile terminal, the finger width values, the fingers' pressure on the touch screen, and the press-touch value of the fingers.

In the above method, preprocessing the data of the touch information is:

the framework layer of the mobile terminal records the motion state information of the fingers into a motion class according to the data of the touch information; the framework layer records the data in the touch information into the motion class, and sends the data in the motion class to the gesture algorithm processing layer; the motion state information includes: a finger is moving, all fingers are lifted, and a finger is pressed down.

In the above method, recognizing the gesture according to the preprocessed data, the touch information, and the extracted time variation threshold and distance variation threshold is:

the gesture algorithm processing layer of the mobile terminal obtains the motion state of the fingers on the touch screen from the preprocessed data; when it determines that the motion state is that fingers are moving on the touch screen, it judges the number of fingers moving on the touch screen from the number of separators in the touch information used to separate the touch information of different fingers; when it determines that two fingers are moving on the touch screen, it records the current coordinates of the two fingers and the current time in real time, and calculates the distance between the two fingers; when the absolute value of the difference between two successive two-finger distances is greater than the distance variation threshold, and the difference between the two current times is greater than the time variation threshold, the gesture is a valid gesture.
In the above method, the method further includes:

when it is determined that three fingers are moving on the touch screen, recording the current coordinates of the three fingers and the current time in real time, and calculating the radius of the circumscribed circle of the coordinates of the three fingers; when the absolute value of the difference between the circumscribed circle radii of two successive sets of three-finger coordinates is greater than the distance variation threshold, and the difference between the two current times is greater than the time variation threshold, the gesture is a valid gesture.

In the above method, the method further includes:

when it is determined that the motion state is that all fingers are lifted or that a finger is pressed down, the gesture is an invalid gesture.

In the above method, the method further includes: implementing the function corresponding to the gesture on the mobile terminal according to the recognition result. The present invention also discloses a mobile terminal for automatically recognizing gestures, including: a gesture calibration module, a driver layer, a framework layer, and a gesture algorithm processing layer; wherein,
the gesture calibration module is configured to calibrate the touch screen of the mobile terminal, and to obtain and save the time variation threshold and the distance variation threshold for fingers moving on the touch screen;

the driver layer is configured to obtain the touch information of fingers on the touch screen of the mobile terminal;

the framework layer is configured to preprocess the data of the obtained touch information;

the gesture algorithm processing layer is configured to extract the pre-stored time variation threshold and distance variation threshold, and to recognize the gesture according to the preprocessed data, the touch information, and the extracted time variation threshold and distance variation threshold.

In the above mobile terminal, the mobile terminal further includes: an application layer and a database; wherein the application layer is configured to implement the function corresponding to the gesture on the mobile terminal according to the recognition result, and the database is configured to save the obtained time variation threshold and distance variation threshold, and also to save the empirical speed values and the empirical time values.

The present invention provides a method and a mobile terminal for automatically recognizing gestures: the pre-stored time variation threshold and distance variation threshold are extracted, and the gesture is recognized according to the preprocessed data, the touch information, and the extracted thresholds; that is, a new algorithm is adopted in the gesture algorithm processing layer to recognize gestures. Compared with the roughly 35 calculations per second of the prior art, the present invention requires only 10 to 15 calculations per second, which greatly reduces the computational load of the gesture algorithm processing layer, improves the response speed of the mobile terminal, and achieves fast and efficient automatic gesture recognition, thereby giving users of touch-screen mobile terminals a good experience and making operation more convenient and faster. In addition, the present invention can perform gesture calibration for different users, so gesture recognition has good adaptability.

Brief Description of the Drawings
Fig. 1 is a schematic structural diagram of a prior-art system for gesture recognition on the android platform; Fig. 2 is a schematic flowchart of the method for automatically recognizing gestures according to the present invention;

Fig. 3 is a schematic diagram of the data formats, in the present invention, when different numbers of fingers are pressed down and the BTN_TOUCH value is included;

Fig. 4 is a schematic diagram of the data formats, in the present invention, when different numbers of fingers are pressed down and the BTN_TOUCH value is not included;

Fig. 5 is a schematic structural diagram of the mobile terminal for automatically recognizing gestures according to the present invention.

Detailed Description
The basic idea of the present invention is: calibrate the touch screen of the mobile terminal, and obtain and save the time variation threshold and distance variation threshold for fingers moving on the touch screen; obtain the touch information of fingers on the touch screen of the mobile terminal, and preprocess the data of the obtained touch information; extract the time variation threshold and the distance variation threshold, and recognize the gesture according to the preprocessed data, the touch information, and the extracted time variation threshold and distance variation threshold.

The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.

The present invention provides a method for automatically recognizing gestures. Fig. 2 is a schematic flowchart of this method; as shown in Fig. 2, the method includes the following steps:

Step 201: calibrate the touch screen of the mobile terminal, and obtain and save the time variation threshold and distance variation threshold for fingers moving on the touch screen.

Specifically, when the user's fingers touch the touch screen of the mobile terminal for the first time, the gesture calibration function of the mobile terminal is started and the touch screen is calibrated; after calibration, no further calibration is needed on subsequent use. Here, the gesture calibration function can be implemented by a calibration application.
Taking a two-finger gesture as an example to illustrate the calibration of the touch screen: record the amount of change Δs_i in the distance between the two fingers and the elapsed time Δt during the interval from when the two fingers press down on the touch screen to when they lift off, and from these obtain the movement speed of the two fingers; find, among the empirical speed values pre-stored in the mobile terminal, the speed v closest to it; according to the size of the touch screen, the frequency at which the driver layer sends touch information to the framework layer, and so on, select the time variation threshold ΔT from the empirical time values pre-stored in the mobile terminal; multiply v by ΔT to obtain the distance variation threshold ΔS; and save ΔT and ΔS for later extraction. Here, the value of ΔT must be an integer multiple of the interrupt time, the interrupt time being the interval between two successive transmissions of touch information from the driver layer to the framework layer; taking a driver reporting frequency of 80 Hz as an example, three to five times the interrupt time is a suitable value for ΔT.

The pre-stored empirical speed values and empirical time values are empirical gesture movement times and gesture speeds obtained by sampling users of different ages, heights, genders, and so on. The value closest to the calculated speed, and the time variation threshold, are selected from the empirical values rather than using the calculated values directly, because these empirical values are representative: they exclude speed values and gesture movement times from fairly extreme cases, such as extremely fast or extremely slow movements, and thereby prevent erroneous operations from introducing error into the calibration process. In the present invention, the pre-stored empirical speed values are arranged according to a normal distribution, which makes it more efficient to find the speed v closest to the measured speed among the empirical speed values pre-stored in the database of the mobile terminal, and so speeds up the response of the mobile terminal.
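The calibration step can be sketched as follows. The empirical speed table, the 12.5 ms interrupt period (80 Hz), and the choice of four interrupt periods for ΔT are illustrative assumptions; only the structure, namely snapping to the nearest empirical speed, taking ΔT as an integer multiple of the interrupt time, and setting ΔS = v × ΔT, follows the text. (The table is kept sorted here so the nearest value can be found by binary search; the patent instead orders the values by a normal distribution.)

```python
import bisect

# Empirical speeds (px/ms) sampled from users; the numbers are made up.
EMPIRICAL_SPEEDS = [0.2, 0.35, 0.5, 0.8, 1.2]

def calibrate(distance_change, elapsed_ms, interrupt_ms=12.5, multiple=4):
    """Return (time threshold dT, distance threshold dS).

    dT must be an integer multiple of the interrupt period (the interval
    between two driver reports); the text suggests 3x to 5x, and 4x is an
    assumption made here.
    """
    measured = distance_change / elapsed_ms
    # Snap to the nearest stored empirical speed rather than the raw
    # measurement, to filter out extremely fast or slow (likely erroneous)
    # calibration swipes.
    i = bisect.bisect_left(EMPIRICAL_SPEEDS, measured)
    candidates = EMPIRICAL_SPEEDS[max(0, i - 1):i + 1]
    v = min(candidates, key=lambda s: abs(s - measured))
    dT = multiple * interrupt_ms          # time variation threshold
    dS = v * dT                           # distance variation threshold
    return dT, dS
```

A swipe of 100 px in 200 ms measures 0.5 px/ms, matches the stored 0.5, and yields ΔT = 50 ms, ΔS = 25 px under these assumptions.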
Step 202: obtain the touch information of fingers on the touch screen of the mobile terminal.

Specifically, a chip in the driver layer of the mobile terminal obtains, in real time, the touch information of fingers on the touch screen of the mobile terminal, and sends the touch information, in a certain data format, to the framework layer through the transmission channel connecting the driver layer and the framework layer. The touch information includes: in a coordinate system whose origin is the upper-left corner of the touch screen of the mobile terminal, the coordinate values x and y of a finger, the finger width value w, the finger's pressure p on the touch screen, and the press-touch (BTN_TOUCH, Button Touch) value of the finger. A BTN_TOUCH value of 1 indicates that a finger has pressed down; a BTN_TOUCH value of 0 indicates that all fingers have lifted. The BTN_TOUCH value is sent to the framework layer as part of the touch information only when a finger presses down for the first time or all fingers lift, that is, only when the BTN_TOUCH value changes. The data formats are shown in Fig. 3 and Fig. 4.

Fig. 3 shows the data formats when one, two, three, and N fingers are pressed down and the BTN_TOUCH value is included; Fig. 4 shows the data formats when one, two, three, and N fingers are pressed down and the BTN_TOUCH value is not included. Here, the SYN_MT_REPORT value is the separator used to separate the touch information of different fingers, and SYN_REPORT is the separator used to separate each transmission of touch information. The driver layer sends the touch information to the framework layer at a certain frequency, also called the interrupt frequency; different touch screen manufacturers provide different interrupt frequencies, typically 60 Hz to 80 Hz, and some as high as 250 Hz.
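The separator layout described above makes the finger count cheap to recover: each finger's values end with SYN_MT_REPORT, and the whole report ends with SYN_REPORT. The sketch below uses symbolic strings for the markers (real drivers use Linux input-event codes); the list layout is an illustration of Fig. 3 and Fig. 4, not the exact wire format.

```python
SYN_MT_REPORT = "SYN_MT_REPORT"   # ends one finger's values
SYN_REPORT = "SYN_REPORT"         # ends one complete report

def count_fingers(event):
    """Count fingers in one complete report by counting the per-finger
    separators, exactly as the gesture algorithm layer does."""
    assert event and event[-1] == SYN_REPORT
    return event.count(SYN_MT_REPORT)

# A two-finger report: (x, y, w, p) per finger, each finger terminated
# by SYN_MT_REPORT, the whole report terminated by SYN_REPORT.
two_finger_event = ["x1", "y1", "w1", "p1", SYN_MT_REPORT,
                    "x2", "y2", "w2", "p2", SYN_MT_REPORT,
                    SYN_REPORT]
```

Counting separators rather than values also explains the N = 5*M+1 sizing: five values per finger plus the final SYN_REPORT.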
Step 203: preprocess the data in the obtained touch information.

Specifically, the framework layer of the mobile terminal receives the touch information sent by the driver layer according to the data format in which the driver layer sends it. For example, if the driver layer sends the touch information in the order coordinate value x, coordinate value y, finger width value w, finger pressure p on the touch screen, then the framework layer receives the touch information in the order coordinate value x, coordinate value y, finger width value w, finger pressure p on the touch screen. The framework layer preprocesses the data in the received touch information, that is: according to the data in the received touch information, it records the motion state information of the fingers into a motion class. The motion state information includes ACTION_MOVE, ACTION_UP, and ACTION_DOWN, where ACTION_MOVE indicates that a finger is moving, ACTION_UP indicates that all fingers are lifted, and ACTION_DOWN indicates that a finger is pressed down. Whether the state is ACTION_MOVE is judged from whether the touch information contains a BTN_TOUCH value: if there is no BTN_TOUCH value, a finger is moving on the touch screen, that is, the state is ACTION_MOVE; if there is a BTN_TOUCH value, it is checked whether the value is 0 or 1: 0 indicates that all fingers are lifted, that is, the state is ACTION_UP, and 1 indicates that a finger is pressed down, that is, the state is ACTION_DOWN. At the same time, the framework layer records the data in the touch information into the motion class, so that the movement track of each finger can be obtained from the recorded data, and sends the data in the motion class to the gesture algorithm processing layer. The motion class here is a class in the programming-language sense: data of the same kind is stored in one motion class, and in the present invention the motion class serves as a storage medium for the touch information and for the preprocessed touch information.
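The state classification just described is a three-way decision on the presence and value of BTN_TOUCH; the following minimal sketch makes it explicit (the dict representation of a report is our assumption):

```python
def classify_motion(report):
    """Map one report to the framework layer's motion states.

    `report` is a dict of the report's values; the BTN_TOUCH key is
    present only when the first finger lands (1) or the last finger
    lifts (0), as described above.
    """
    if "BTN_TOUCH" not in report:
        return "ACTION_MOVE"      # fingers already down are moving
    if report["BTN_TOUCH"] == 1:
        return "ACTION_DOWN"      # first finger pressed down
    return "ACTION_UP"            # all fingers lifted
```

Only ACTION_MOVE reports proceed to gesture recognition; the other two states mark the gesture as invalid.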
Step 204: extract the pre-stored time variation threshold and distance variation threshold, and recognize the gesture according to the preprocessed data, the touch information, and the extracted time variation threshold and distance variation threshold.

Specifically, the gesture algorithm processing layer of the mobile terminal receives the data in the motion class sent by the framework layer, and learns the motion state of the fingers on the touch screen from the received motion state information. Because the SYN_MT_REPORT value in the touch information is the separator used to separate the touch information of different fingers, the number of fingers moving on the touch screen can be learned from the number of SYN_MT_REPORT values in the touch information in the data.

The gesture algorithm processing layer obtains the motion state information of the fingers from the data in the motion class. If the state is ACTION_UP or ACTION_DOWN, the fingers have all lifted, or have merely pressed down, so no finger is moving on the mobile terminal; the gesture is an invalid gesture, no gesture recognition is needed, and the flow ends.

If the state is ACTION_MOVE and the gesture algorithm processing layer determines that two fingers are moving on the touch screen, take the two-finger closing and two-finger separating gestures as examples: while the two fingers move on the touch screen of the mobile terminal, the gesture algorithm processing layer records in real time the current coordinates of the two fingers, (x1, y1) and (x2, y2), and the current time T1, and calculates the distance between the two fingers, S1 = √((x2 − x1)² + (y2 − y1)²). When the next recording of the two fingers' coordinates arrives, it records the coordinates of the two fingers and calculates the distance S2 and the time T2. The gesture algorithm processing layer extracts the ΔT and ΔS stored in step 201 from the database, and compares |S2 − S1| with ΔS and T2 − T1 with ΔT. Only when |S2 − S1| > ΔS and T2 − T1 > ΔT is this two-finger movement a valid gesture, and its scale coefficient scale = S2 / S1 is calculated; if this two-finger movement is invalid, the next gesture is processed.

When the gesture algorithm processing layer determines that three fingers are moving on the touch screen, take the three-finger closing and three-finger separating gestures as examples: the gesture algorithm processing layer records the current coordinates of the three fingers in real time; based on the principle that three points not on the same straight line determine a circumscribed circle, it obtains the circumscribed circle from the current coordinates of the three fingers, and calculates its radius r1 and the current time T3. When the next recording of the three fingers' coordinates arrives, it records the current coordinates of the three fingers and the current time T4, and calculates the radius r2 of the circumscribed circle of the three points at that moment. Only when |r2 − r1| > ΔS and T4 − T3 > ΔT is this three-finger movement a valid gesture, and its scale coefficient scale = r2 / r1 is calculated. The gesture algorithm processing layer sends the recognition result, that is, the value of the scale coefficient scale, to the application layer of the mobile terminal.
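The two measurements above, the two-finger distance and the three-finger circumradius, and the shared validity test can be sketched in a few functions. The circumradius uses the standard formula r = abc / (4 · area); the function names and the generic `gesture_scale` wrapper are illustrative, not the patent's code.

```python
import math

def two_finger_distance(p1, p2):
    """Distance between two touch points: S = sqrt(dx^2 + dy^2)."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def circumradius(p1, p2, p3):
    """Radius of the circle through three non-collinear points,
    used for the three-finger gesture: r = abc / (4 * area)."""
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    area = abs((p2[0] - p1[0]) * (p3[1] - p1[1])
               - (p3[0] - p1[0]) * (p2[1] - p1[1])) / 2.0
    return a * b * c / (4.0 * area)

def gesture_scale(m1, m2, t1, t2, d_thresh, t_thresh):
    """Return scale = m2/m1 if the gesture is valid, else None.
    m1, m2 are successive measurements (two-finger distance or
    three-finger circumradius); the gesture is valid only when both
    |m2 - m1| > dS and t2 - t1 > dT hold."""
    if abs(m2 - m1) > d_thresh and t2 - t1 > t_thresh:
        return m2 / m1
    return None
```

Because both finger counts reduce to one scalar measurement per report, the same threshold test covers the two-finger and three-finger cases.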
Step 205: implement the function corresponding to the gesture on the mobile terminal according to the recognition result. Specifically, the application layer of the mobile terminal receives the recognition result sent by the gesture algorithm processing layer and compares the value of scale with 1. If scale is less than 1, the two-finger or three-finger gesture is a closing gesture; for example, a picture-shrinking function can be implemented on the mobile terminal, with the shrink ratio calculated from the value of scale. If scale is greater than 1, the two-finger or three-finger gesture is a separating gesture; for example, a picture-enlarging function can be implemented on the mobile terminal, with the magnification likewise calculated from the value of scale. If scale equals 1, the fingers have not moved on the touch screen of the mobile terminal, and no operation is performed.
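The application-layer dispatch just described is a simple three-way branch on the scale value; the sketch below shows it with an assumed return convention (the zoom action names are ours):

```python
def apply_gesture(scale):
    """Dispatch on the recognition result: scale < 1 means the fingers
    closed (e.g. shrink the picture by that ratio), scale > 1 means
    they separated (e.g. enlarge it), scale == 1 means no movement."""
    if scale < 1:
        return ("zoom_out", scale)
    if scale > 1:
        return ("zoom_in", scale)
    return ("no_op", 1)
```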
The gestures in the present invention may be gestures formed by multiple fingers; in general, two-finger or three-finger gestures give the best results. In addition, the present invention can be applied to a variety of operating systems, such as the Windows Mobile operating system, the Symbian operating system, and the Android operating system, and can also be applied to global positioning systems.
To implement the above method, the present invention also provides a mobile terminal for automatically recognizing gestures. Fig. 5 is a schematic structural diagram of this mobile terminal; as shown in Fig. 5, the mobile terminal includes: a gesture calibration module 51, a driver layer 52, a framework layer 53, and a gesture algorithm processing layer 54; wherein,

the gesture calibration module 51 is configured to calibrate the touch screen of the mobile terminal, and to obtain and save the time variation threshold and the distance variation threshold for fingers moving on the touch screen;

the gesture calibration module 51 calibrates the touch screen of the mobile terminal and obtains and saves the time variation threshold and the distance variation threshold specifically as follows: the gesture calibration module 51 of the mobile terminal obtains the movement speed from the amount of change in the distance between the fingers and the time taken to produce that change; finds, among the pre-stored empirical speed values, the speed closest to that movement speed; obtains the distance variation threshold from the closest speed and a time variation threshold selected from the pre-stored empirical time values; and saves the two variation thresholds. The driver layer 52 is configured to obtain the touch information of fingers on the touch screen of the mobile terminal;

the driver layer 52 obtains the touch information of fingers on the touch screen of the mobile terminal specifically as follows: a chip in the driver layer 52 obtains, in real time, the touch information of fingers on the touch screen of the mobile terminal, and sends the obtained touch information, in a certain data format and at a certain frequency, to the framework layer 53 through the transmission channel between the driver layer 52 and the framework layer 53; the touch information includes: the coordinate values of the fingers in a coordinate system whose origin is the upper-left corner of the touch screen of the mobile terminal, the finger width values, the fingers' pressure on the touch screen, and the press-touch value of the fingers;

the framework layer 53 is configured to preprocess the data of the touch information;

the framework layer 53 preprocesses the data of the touch information specifically as follows: the framework layer 53 records the motion state information of the fingers into a motion class according to the data of the touch information, the motion state information including a finger moving, all fingers lifted, and a finger pressed down; the framework layer records the data in the touch information into the motion class, and sends the data in the motion class to the gesture algorithm processing layer 54;

the gesture algorithm processing layer 54 is configured to extract the pre-stored time variation threshold and distance variation threshold, and to recognize the gesture according to the preprocessed data, the touch information, and the extracted time variation threshold and distance variation threshold;

the gesture algorithm processing layer 54 extracts the pre-stored thresholds and recognizes the gesture specifically as follows: the gesture algorithm processing layer 54 of the mobile terminal obtains the motion state of the fingers on the touch screen from the preprocessed data; when it determines that the motion state is that fingers are moving on the touch screen, it judges the number of fingers moving on the touch screen from the number of separators in the touch information used to separate the touch information of different fingers; when it determines that two fingers are moving on the touch screen, it records the current coordinates of the two fingers and the current time in real time, calculates the distance between the two fingers, and, when the absolute value of the difference between two successive two-finger distances is greater than the distance variation threshold and the difference between the two current times is greater than the time variation threshold, calculates the scale coefficient of the gesture; when it determines that three fingers are moving on the touch screen, it records the current coordinates of the three fingers and the current time in real time, calculates the radius of the circumscribed circle of the three fingers' coordinates, and, when the absolute value of the difference between the circumscribed circle radii of two successive sets of three-finger coordinates is greater than the distance variation threshold and the difference between the two current times is greater than the time variation threshold, calculates the scale coefficient of the gesture; when it determines that the motion state is that all fingers are lifted or that a finger is pressed down, the gesture is an invalid gesture, and the current flow ends.

The mobile terminal further includes: an application layer 55; wherein,

the application layer 55 is configured to implement the function corresponding to the gesture on the mobile terminal according to the recognition result. The mobile terminal further includes: a database 56, configured to save the obtained time variation threshold and distance variation threshold, and also to save the empirical speed values, the empirical time values, and the like.

The above are only preferred embodiments of the present invention and are not intended to limit the protection scope of the present invention; any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims

1. A method for automatically recognizing gestures, characterized in that the method includes:

calibrating the touch screen of a mobile terminal, and obtaining and saving a time variation threshold and a distance variation threshold for fingers moving on the touch screen;

obtaining touch information of fingers on the touch screen of the mobile terminal, and preprocessing the data of the obtained touch information;

extracting the time variation threshold and the distance variation threshold, and recognizing the gesture according to the preprocessed data, the touch information, and the extracted time variation threshold and distance variation threshold.
2. The method according to claim 1, characterized in that calibrating the touch screen of the mobile terminal and obtaining and saving the time variation threshold and the distance variation threshold for fingers moving on the touch screen is:

obtaining the movement speed from the amount of change in the distance between the fingers and the time taken to produce that change; finding, among pre-stored empirical speed values, the speed closest to the obtained movement speed; obtaining the distance variation threshold from the closest speed and a time variation threshold selected from pre-stored empirical time values; and saving the obtained time variation threshold and distance variation threshold.

3. The method according to claim 2, characterized in that the pre-stored empirical speed values are empirical gesture speed values obtained by sampling users of different ages, heights, and genders, and the pre-stored empirical speed values are arranged according to a normal distribution.

4. The method according to claim 1, characterized in that obtaining the touch information of fingers on the touch screen of the mobile terminal is:

a chip in the driver layer of the mobile terminal obtains, in real time, the touch information of fingers on the touch screen of the mobile terminal, and sends the touch information to the framework layer; the touch information includes: the coordinate values of the fingers in a coordinate system whose origin is the upper-left corner of the touch screen of the mobile terminal, the finger width values, the fingers' pressure on the touch screen, and the press-touch value of the fingers.
5. The method according to claim 1, characterized in that preprocessing the data of the touch information is:

the framework layer of the mobile terminal records the motion state information of the fingers into a motion class according to the data of the touch information; the framework layer records the data in the touch information into the motion class, and sends the data in the motion class to the gesture algorithm processing layer; the motion state information includes: a finger is moving, all fingers are lifted, and a finger is pressed down.

6. The method according to claim 1, characterized in that recognizing the gesture according to the preprocessed data, the touch information, and the extracted time variation threshold and distance variation threshold is:

the gesture algorithm processing layer of the mobile terminal obtains the motion state of the fingers on the touch screen from the preprocessed data; when it determines that the motion state is that fingers are moving on the touch screen, it judges the number of fingers moving on the touch screen from the number of separators in the touch information used to separate the touch information of different fingers; when it determines that two fingers are moving on the touch screen, it records the current coordinates of the two fingers and the current time in real time, and calculates the distance between the two fingers; when the absolute value of the difference between two successive two-finger distances is greater than the distance variation threshold, and the difference between the two current times is greater than the time variation threshold, the gesture is a valid gesture.

7. The method according to claim 6, characterized in that the method further includes: when it is determined that three fingers are moving on the touch screen, recording the current coordinates of the three fingers and the current time in real time, and calculating the radius of the circumscribed circle of the coordinates of the three fingers; when the absolute value of the difference between the circumscribed circle radii of two successive sets of three-finger coordinates is greater than the distance variation threshold, and the difference between the two current times is greater than the time variation threshold, the gesture is a valid gesture.

8. The method according to claim 6, characterized in that the method further includes: when it is determined that the motion state is that all fingers are lifted or that a finger is pressed down, the gesture is an invalid gesture.

9. The method according to claim 1, characterized in that the method further includes: implementing the function corresponding to the gesture on the mobile terminal according to the recognition result.
10. A mobile terminal for automatically recognizing gestures, characterized in that the mobile terminal includes: a gesture calibration module, a driver layer, a framework layer, and a gesture algorithm processing layer; wherein,

the gesture calibration module is configured to calibrate the touch screen of the mobile terminal, and to obtain and save the time variation threshold and the distance variation threshold for fingers moving on the touch screen;

the driver layer is configured to obtain the touch information of fingers on the touch screen of the mobile terminal;

the framework layer is configured to preprocess the data of the obtained touch information;

the gesture algorithm processing layer is configured to extract the pre-stored time variation threshold and distance variation threshold, and to recognize the gesture according to the preprocessed data, the touch information, and the extracted time variation threshold and distance variation threshold.

11. The mobile terminal according to claim 10, characterized in that the mobile terminal further includes: an application layer and a database; wherein,

the application layer is configured to implement the function corresponding to the gesture on the mobile terminal according to the recognition result; and the database is configured to save the obtained time variation threshold and distance variation threshold, and also to save the empirical speed values and the empirical time values.
PCT/CN2010/076742 2010-05-31 2010-09-08 一种自动识别手势的方法及移动终端 WO2011150607A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP10852402.6A EP2570901B1 (en) 2010-05-31 2010-09-08 Method and mobile terminal for automatically recognizing gesture

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2010101884228A CN101853133B (zh) 2010-05-31 2010-05-31 一种自动识别手势的方法及移动终端
CN201010188422.8 2010-05-31

Publications (1)

Publication Number Publication Date
WO2011150607A1 true WO2011150607A1 (zh) 2011-12-08

Family

ID=42804647

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2010/076742 WO2011150607A1 (zh) 2010-05-31 2010-09-08 一种自动识别手势的方法及移动终端

Country Status (3)

Country Link
EP (1) EP2570901B1 (zh)
CN (1) CN101853133B (zh)
WO (1) WO2011150607A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102999269A (zh) * 2012-12-26 2013-03-27 东莞宇龙通信科技有限公司 终端和终端操控方法
CN105607853A (zh) * 2015-12-21 2016-05-25 联想(北京)有限公司 一种信息处理方法及电子设备
CN114579033A (zh) * 2022-05-05 2022-06-03 深圳市大头兄弟科技有限公司 安卓平台的手势切换方法、装置、设备及存储介质
CN116166143A (zh) * 2023-04-25 2023-05-26 麒麟软件有限公司 全局触摸手势识别方法

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101984396A (zh) * 2010-10-19 2011-03-09 中兴通讯股份有限公司 一种自动识别旋转手势的方法及移动终端
CN101980118B (zh) * 2010-10-22 2012-07-04 福建鑫诺通讯技术有限公司 一种在Android平台下实现触摸屏校准的方法
CN101980153B (zh) * 2010-10-22 2015-07-22 中兴通讯股份有限公司 一种识别硬件手势的方法及移动终端
JP5813787B2 (ja) * 2011-02-17 2015-11-17 ナイキ イノベイト シーブイ ワークアウトセッション中のユーザーパフォーマンス指標の追跡
CN102193682A (zh) * 2011-05-30 2011-09-21 苏州瀚瑞微电子有限公司 在gui下自动校准触摸屏的方法
CN103842945B (zh) * 2011-10-11 2016-09-28 国际商业机器公司 对象指向方法、设备
CN102768595B (zh) * 2011-11-23 2015-08-26 联想(北京)有限公司 一种识别触摸屏上触控操作指令的方法及装置
CN103455266A (zh) * 2012-06-04 2013-12-18 华为终端有限公司 一种触摸屏的误触摸操作的处理方法及终端设备
CN103529976B (zh) * 2012-07-02 2017-09-12 英特尔公司 手势识别***中的干扰消除
CN102830858B (zh) * 2012-08-20 2015-12-02 深圳市真多点科技有限公司 一种手势识别方法、装置及触摸屏终端
TWI475440B (zh) * 2012-09-10 2015-03-01 Elan Microelectronics Corp 觸控裝置及其手勢判斷方法
JP5998085B2 (ja) * 2013-03-18 2016-09-28 アルプス電気株式会社 入力装置
CN103336611B (zh) * 2013-06-17 2017-07-11 惠州Tcl移动通信有限公司 一种触控操作方法、装置及其触控终端
CN103902101A (zh) * 2014-04-10 2014-07-02 上海思立微电子科技有限公司 智能终端的手势识别方法以及实现该方法的智能终端
CN103942053A (zh) * 2014-04-17 2014-07-23 北京航空航天大学 一种基于移动终端的三维模型手势触控浏览交互方法
CN104102450A (zh) * 2014-06-18 2014-10-15 深圳贝特莱电子科技有限公司 一种基于触摸屏手势识别的方法及***
CN104281411B (zh) * 2014-10-13 2017-10-24 惠州Tcl移动通信有限公司 基于移动终端的触摸屏解锁方法及***
CN104601795B (zh) * 2014-11-03 2017-03-29 中国科学技术大学苏州研究院 一种智能手机用户左右手识别方法
CN105681540B (zh) * 2014-11-18 2020-07-03 青岛海信移动通信技术股份有限公司 一种彩信播放方法及装置
CN105744322B (zh) * 2014-12-10 2019-08-02 Tcl集团股份有限公司 一种屏幕焦点的控制方法及装置
CN105892877A (zh) * 2015-10-23 2016-08-24 乐卡汽车智能科技(北京)有限公司 多指并拢或打开手势的识别方法、装置及终端设备
CN105892895A (zh) * 2015-10-23 2016-08-24 乐卡汽车智能科技(北京)有限公司 多指滑动手势的识别方法、装置及终端设备
CN105302467B (zh) * 2015-11-05 2018-10-23 网易(杭州)网络有限公司 触控操作识别和响应方法、装置及游戏操控方法、装置
CN105426722A (zh) * 2015-11-17 2016-03-23 厦门美图移动科技有限公司 一种移动终端的解锁装置及方法
CN105573545A (zh) * 2015-11-27 2016-05-11 努比亚技术有限公司 一种手势校准方法、装置及手势输入处理方法
CN106598231B (zh) * 2016-11-22 2019-12-10 深圳市元征科技股份有限公司 手势识别方法及装置
CN106598232B (zh) * 2016-11-22 2020-02-28 深圳市元征科技股份有限公司 手势识别方法及装置
US10254871B2 (en) 2017-04-10 2019-04-09 Google Llc Using pressure sensor input to selectively route user inputs
CN107168636B (zh) * 2017-05-18 2020-05-12 广州视源电子科技股份有限公司 多点触摸的手势识别方法、装置、触摸屏终端及存储介质
WO2019127195A1 (zh) 2017-12-28 2019-07-04 华为技术有限公司 一种触控方法及终端
CN108595007A (zh) * 2018-04-25 2018-09-28 四川斐讯信息技术有限公司 基于手势识别的无线中继的方法及***、无线路由设备
CN108960177B (zh) * 2018-07-13 2020-12-22 浪潮金融信息技术有限公司 一种将手势进行数字化处理的方法及装置
CN108733271A (zh) * 2018-07-19 2018-11-02 清远市蓝海慧谷智能科技有限公司 一种电容触摸屏用的触摸传导器
CN109460176A (zh) * 2018-10-22 2019-03-12 四川虹美智能科技有限公司 一种快捷菜单展示方法和智能冰箱
CN111352562B (zh) * 2019-01-22 2022-03-15 鸿合科技股份有限公司 一种粉笔字实现方法、装置、电子设备与存储介质
CN110850966A (zh) * 2019-10-22 2020-02-28 深圳市云顶信息技术有限公司 电动牙刷控制方法、装置、计算机设备和存储介质
CN111352529B (zh) * 2020-02-20 2022-11-08 Oppo(重庆)智能科技有限公司 触摸事件的上报方法、装置、终端及存储介质
CN113535057B (zh) * 2021-06-28 2022-12-16 荣耀终端有限公司 一种手势交互方法及终端设备
CN114647362B (zh) * 2022-03-22 2024-04-12 天马微电子股份有限公司 显示面板的触控算法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101295217A (zh) * 2008-06-05 2008-10-29 中兴通讯股份有限公司 手写输入处理装置和方法
CN101408814A (zh) * 2007-10-04 2009-04-15 株式会社东芝 姿态确定装置及方法
CN101546233A (zh) * 2009-05-05 2009-09-30 上海华勤通讯技术有限公司 触摸屏界面手势识别操作方法
CN101634565A (zh) * 2009-07-27 2010-01-27 深圳市凯立德计算机***技术有限公司 导航***及其操作方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005018129A2 (en) * 2003-08-15 2005-02-24 Semtech Corporation Improved gesture recognition for pointing devices
US7761814B2 (en) * 2004-09-13 2010-07-20 Microsoft Corporation Flick gesture
JP2007128497A (ja) * 2005-10-05 2007-05-24 Sony Corp 表示装置および表示方法
US10437459B2 (en) * 2007-01-07 2019-10-08 Apple Inc. Multitouch data fusion
US8681104B2 (en) * 2007-06-13 2014-03-25 Apple Inc. Pinch-throw and translation gestures
US8122384B2 (en) * 2007-09-18 2012-02-21 Palo Alto Research Center Incorporated Method and apparatus for selecting an object within a user interface by performing a gesture
US8390577B2 (en) * 2008-07-25 2013-03-05 Intuilab Continuous recognition of multi-touch gestures



Also Published As

Publication number Publication date
CN101853133A (zh) 2010-10-06
EP2570901A1 (en) 2013-03-20
EP2570901B1 (en) 2019-03-27
CN101853133B (zh) 2013-03-20
EP2570901A4 (en) 2016-07-20


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10852402

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2010852402

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE