WO2024093978A1 - Touch object recognition method, touch device, and storage medium - Google Patents

Touch object recognition method, touch device, and storage medium

Info

Publication number
WO2024093978A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
sensing
type
value
preset
Prior art date
Application number
PCT/CN2023/128246
Other languages
English (en)
French (fr)
Inventor
郝帅凯
Original Assignee
广州视源电子科技股份有限公司
广州视睿电子科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广州视源电子科技股份有限公司 and 广州视睿电子科技有限公司
Publication of WO2024093978A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by capacitive means

Definitions

  • The embodiments of the present application relate to the field of computer technology, for example, to a touch object recognition method, a touch device, and a storage medium.
  • In capacitive touch technology, electronic circuits are usually added to an active stylus, and the active stylus actively transmits communication signals so that upper-level software can identify whether the current touch object is an active stylus or a finger by whether a communication signal can be actively generated. Different operations are then performed according to the touch object. For example, if a stylus is recognized, drawing, writing, and similar operations are performed; if a finger is recognized, operations such as moving, scaling, and rotating the canvas are carried out.
  • However, the cost of an active stylus is higher than that of a passive stylus (a passive stylus has no electronic circuit and cannot actively generate communication signals).
  • In addition, an active stylus requires regular battery replacement or regular charging, which brings certain inconvenience to users.
  • the present application provides a touch object recognition method, a touch device and a storage medium to solve some or all of the above-mentioned technical problems in the related art.
  • the present application provides a touch object recognition method, the method comprising:
  • the type of the touch object is determined according to the sensing values of the sensing positions included in the touch area.
  • determining the type of the touch object according to the sensing value of the sensing position included in the touch area includes:
  • each sensing position within the preset area is obtained as a candidate position
  • the type of the touch object is determined based on the sensing value corresponding to the candidate position.
  • the preset area range includes:
  • with the peak position as a reference point, the positions immediately adjacent to the peak position in the four directions of up, down, left, and right are selected, and together with the peak position they constitute the preset area range.
  • the preset area range includes:
  • the area formed by the sensing positions whose sensing values rank in the first h places is selected as the preset area range, wherein the peak position is the sensing position whose sensing value ranks first, and h is a positive integer greater than or equal to 2.
  • determining the type of the touch object based on the sensing value corresponding to the candidate position includes:
  • a total sensing value or a sensing average value corresponding to each cycle is obtained, wherein the total sensing value or the sensing average value is determined according to the sensing value of the sensing position in the reference area obtained in each cycle, and the reference area includes the touch area or a preset area range in the touch area;
  • the maximum value within the preset time period is selected from the total sensing values or sensing average values respectively corresponding to the cycles within the preset time period;
  • when the maximum value is greater than or equal to a first preset threshold, the type of the touch object is determined to be the first type; or, when the maximum value is less than the first preset threshold, the type of the touch object is determined to be the second type.
  • determining the type of the touch object based on the sensing value corresponding to the candidate position includes:
  • a total sensing value or a sensing average value corresponding to each cycle is obtained, wherein the total sensing value or the sensing average value is determined according to the sensing value of the sensing position in the reference area obtained in each cycle, and the reference area includes the touch area or a preset area range in the touch area;
  • the maximum value within the preset time period is selected from the total sensing values or sensing average values respectively corresponding to the cycles within the preset time period;
  • when the maximum value is less than a second preset threshold and greater than or equal to the first preset threshold, the type of the touch object is determined to be the first type; when the maximum value is less than the first preset threshold and greater than or equal to a third preset threshold, the type of the touch object is determined to be the second type; or, when the maximum value is less than the third preset threshold, or greater than or equal to the second preset threshold, the type of the touch object is determined to be a third type other than the first type and the second type; wherein the second preset threshold is greater than the first preset threshold.
  • determining the type of the touch object based on the sensing value corresponding to the candidate position includes:
  • a total sensing value or sensing average value corresponding to each scanning frame in a preset number of continuous scanning frames is obtained, and the maximum value among the frames is selected; when the maximum value is greater than or equal to the first preset threshold, the type of the touch object is determined to be the first type; or, when the maximum value is less than the first preset threshold, the type of the touch object is determined to be the second type.
  • determining the type of the touch object based on the sensing value corresponding to the candidate position includes:
  • a total sensing value or sensing average value corresponding to each scanning frame in a preset number of continuous scanning frames is obtained, and the maximum value among the frames is selected; when the maximum value is less than the second preset threshold and greater than or equal to the first preset threshold, the type of the touch object is determined to be the first type; when the maximum value is less than the first preset threshold and greater than or equal to the third preset threshold, the type of the touch object is determined to be the second type; or, otherwise, the type of the touch object is determined to be a third type other than the first type and the second type, wherein the second preset threshold is greater than the first preset threshold.
  • the method further includes:
  • the touch coordinate data and the type of the touch object are reported to a preset application program, so that the preset application program generates a control instruction according to the type of the touch object and the touch coordinate data, and performs a touch operation corresponding to the control instruction.
  • a touch device including: a touch sensor, a processor, and a memory;
  • a memory, used to store a computer program;
  • a touch sensor is used to detect the touch action when the touch object touches the touch screen
  • the processor is used to implement the steps of the touch object recognition method of any embodiment of the first aspect when executing the program stored in the memory.
  • a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, the steps of the touch object recognition method according to any embodiment of the first aspect are implemented.
  • In the method provided in the embodiments of the present application, when it is detected that a touch object touches the touch screen, the touch area corresponding to the touch object is determined, and the sensing value corresponding to each sensing position included in the touch area is obtained. Then, the type of the touch object is determined according to the sensing values of the sensing positions included in the touch area.
  • Compared with other positions, the sensing values in the touch area are larger and more distinct. Different touch objects produce relatively large differences in sensing values after touching the touch screen. Therefore, judging the type of the touch object based on the sensing values of the sensing positions in the touch area is more accurate.
  • FIG. 1 is a schematic flow chart of a touch object recognition method provided in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the internal structure of the touch screen provided by the present application.
  • FIG. 3 is a schematic diagram of sensing data generated when a touch object provided by the present application touches different positions with different touch forces.
  • FIG. 4 is a schematic diagram of sensing data generated when another touch object provided by the present application touches different positions with different touch forces.
  • FIG. 5 is a schematic diagram of sensing data generated when another touch object provided by the present application touches different positions with different touch forces.
  • FIG. 6 is a schematic flow chart of another touch object recognition method provided in an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the structure of a preset area range formed by candidate positions provided in the present application.
  • FIG. 8 is a flowchart of a method for obtaining a preset area range provided by the present application.
  • FIG. 9 is a schematic flow chart of a method for determining the type of a touch object provided by the present application.
  • FIG. 10 is a schematic diagram comparing the sums of sensing values generated after the touch screen is touched by a finger and by a passive stylus, respectively, provided by the present application.
  • FIG. 11 is a schematic flow chart of another method for determining the type of a touch object provided by the present application.
  • FIG. 12 is a schematic flow chart of another method for determining the type of a touch object provided by the present application.
  • FIG. 13 is a schematic flow chart of another method for determining the type of a touch object provided by the present application.
  • FIG. 14 is a schematic flow chart of another touch object recognition method provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of the structure of a passive stylus provided by the present application.
  • FIG. 16 is a schematic diagram of the structure of a touch device provided in an embodiment of the present application.
  • FIG. 1 is a schematic flow chart of a touch object recognition method provided by an embodiment of the present application. The method steps include:
  • Step 110: After detecting that a touch object touches the touch screen, determine a touch area corresponding to the touch object.
  • There are a variety of implementable methods for determining the touch area corresponding to the touch object; commonly used technical means include the nine-grid sliding window method and the watershed algorithm. The steps for determining the touch area corresponding to the touch object are not repeated here.
  • Step 120 Acquire the sensing value corresponding to the sensing position included in the touch area.
  • the entire panel in the touch screen includes interlaced driving electrodes and sensing electrodes.
  • Each sensing position is a unit position where the driving electrode and the sensing electrode intersect each other, and after being touched, the capacitance will change. As shown in FIG2 , the sensing positions are marked.
  • FIG. 3 to 5 illustrate the sensing data (capacitance data) generated when the touch object touches different positions with different touch forces.
  • the sensing value of the sensing position farther away from the touch center will be smaller, and the sensing data of the sensing position outside the touch area is usually 0.
  • Step 130 determining the type of the touch object according to the sensing values of the sensing positions included in the touch area.
  • When different types of touch objects touch the touch screen, the generated sensing values differ. For example, when a capacitive passive stylus touches the touch screen, the total sensing value in the touch area is usually between 413 and 453, while when a finger touches the touch screen, the total sensing value in the touch area is usually between 886 and 1356.
  • the type of the touch object can be determined based on the sensing values of the sensing positions included in the touch area. For example, the sum of the sensing values of all sensing positions included in the touch area, or the sensing average value corresponding to all sensing positions in the touch area, is calculated. Then, the type of the touch object is determined based on the sum of the sensing values or the sensing average value.
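The computation in this step can be sketched as follows. This is a minimal Python illustration; the helper names are ours, the ranges 413-453 (passive stylus) and 886-1356 (finger) and the cut-off of 660 are the example values quoted in this document, and the sample readings reuse values from the FIG. 3 and FIG. 7 examples described later.

```python
# Sketch of step 130 (helper names are illustrative, not from the
# document): classify a touch object by the sum or mean of the sensing
# values inside the detected touch area. 660 is the document's example
# cut-off between a finger (>= 660) and a passive stylus (< 660).

def area_total_and_average(sensing_values):
    """sensing_values: capacitance readings at the touch-area positions."""
    total = sum(sensing_values)
    return total, total / len(sensing_values)

def classify_by_total(sensing_values, threshold=660):
    total, _ = area_total_and_average(sensing_values)
    return "finger" if total >= threshold else "passive stylus"

# Readings reusing the FIG. 3 example values (peak 446 plus neighbours):
fig3 = [446, 36, 108, 37, 127, 132, 30, 114, 35]   # sums to 1065
# Readings reusing the FIG. 7 example values (peak 211 plus the cross):
fig7 = [211, 203, 199, 15, 15]                     # sums to 643
```

With these numbers, `classify_by_total(fig3)` yields "finger" (1065 is at least 660) and `classify_by_total(fig7)` yields "passive stylus" (643 is below 660).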
  • In the touch object recognition method provided in the embodiments of the present application, after it is detected that a touch object touches the touch screen, the corresponding touch area is determined, and the sensing value corresponding to each sensing position included in the touch area is obtained. The type of the touch object is then determined according to the sensing values of the sensing positions included in the touch area.
  • Compared with other positions, the sensing values in the touch area are larger and more distinct, and different touch objects produce relatively large differences in sensing values after touching the touch screen. Therefore, judging the type of the touch object based on the sensing values of the sensing positions in the touch area is more accurate.
  • However, if the sensing values of all sensing positions in the touch area are used directly to determine the type of the touch object, some errors may occur.
  • When the touch area is relatively large, calculating the sensing values over the entire touch area to determine the type of the touch object may reduce the characteristic differences between touch objects.
  • For example, multiple touch objects may be present at the same time.
  • Suppose the touch area corresponding to each touch object includes 100 sensing positions. After the sensing values of the 100 positions are superimposed, both sums may fall between 886 and 1356, even though one of the touch objects is in fact a passive stylus and the other is a finger.
  • In addition, the collected capacitance data (the sensing values of the sensing positions) contains noise, and the larger the touch area, the more noise data there is, which also affects the determination of the touch object type.
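The blurring effect can be shown numerically. In this sketch all grid values are synthetic (invented for illustration, not taken from the document's figures): the full-area sums of the two objects come out fairly close, while sums over a 3x3 window centred on the peak separate them clearly.

```python
# Synthetic 5x5 capacitance grids (values invented for illustration):
# a stylus-like object with a sharp small tip plus a wide low halo, and
# a finger-like object with a broad strong bump.
stylus = [
    [20, 20, 20, 20, 20],
    [20, 30, 60, 30, 20],
    [20, 60, 211, 60, 20],
    [20, 30, 60, 30, 20],
    [20, 20, 20, 20, 20],
]
finger = [
    [0, 0, 0, 0, 0],
    [0, 30, 114, 35, 0],
    [0, 108, 446, 127, 0],
    [0, 37, 132, 30, 0],
    [0, 0, 0, 0, 0],
]

def total(grid):
    return sum(sum(row) for row in grid)

def peak_3x3_total(grid):
    # Locate the peak, then sum the 3x3 window around it (the peak is
    # interior in these examples, so no boundary clipping is needed).
    r, c = max(((i, j) for i in range(5) for j in range(5)),
               key=lambda rc: grid[rc[0]][rc[1]])
    return sum(grid[i][j] for i in range(r - 1, r + 2)
                          for j in range(c - 1, c + 2))

# Full-area sums: 891 vs 1059 (fairly close).
# 3x3 peak-window sums: 571 vs 1059 (clearly separated).
```

The halo and noise contribute heavily to the full-area sum but barely to the peak window, which is the motivation for the preset-area approach introduced next.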
  • the present application also provides another touch object recognition method. As shown in FIG6 , the method steps include:
  • Step 610 Select the peak position with the largest sensing value from the touch area.
  • For example, the peak position in FIG. 3 is the position where the sensing value is 446, the peak position in FIG. 4 is the position where the sensing value is 394, and the peak position in FIG. 5 is the position where the sensing value is 211.
  • Step 620 taking the peak position as a reference point, obtaining each sensing position within the preset area as a candidate position.
  • Step 630 Determine the type of the touch object based on the sensing value corresponding to the candidate position.
  • the method of determining the type of the touch object is similar to step 130 , except that the data used here are the sensing values corresponding to all candidate positions within the preset area, rather than the sensing values corresponding to all sensing positions in the touch area.
  • When taking the peak position as a reference point and obtaining each sensing position within the preset area range as a candidate position, the candidate positions can be selected in the following manners:
  • the peak position can be used as a reference point, and an N times M selection range can be selected as the preset area range.
  • N and M are both positive integers greater than or equal to 2.
  • For example, in FIG. 3, the position where the sensing value is 446 is taken as the reference point, the positions corresponding to the sensing values 36, 108, 37, 127, 132, 30, 114, and 35 are selected as candidate positions, and the area formed by all the candidate positions is the preset area range.
  • Alternatively, with the peak position as a reference point, the positions immediately adjacent to the peak position in the four directions of up, down, left, and right are selected, and together with the peak position they constitute the preset area range.
  • FIG. 7 shows that, based on the peak sensing value 211, the positions immediately adjacent to the peak in the four directions of up, down, left, and right, that is, the positions where the sensing values are 203, 199, 15, and 15 respectively, are selected as candidate positions, and these candidate positions together with the peak constitute the preset area range.
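Both selection rules can be sketched as follows (Python; the function names are ours). `nxm_window` implements the N-by-M box (defaulting to 3x3) and `cross_window` the peak-plus-four-neighbours rule; both drop candidates that would fall outside the grid.

```python
def nxm_window(grid, peak, n=3, m=3):
    """Positions in the n-by-m box centred on the peak, clipped at edges."""
    r, c = peak
    rows, cols = len(grid), len(grid[0])
    return [(i, j)
            for i in range(max(0, r - n // 2), min(rows, r + n // 2 + 1))
            for j in range(max(0, c - m // 2), min(cols, c + m // 2 + 1))]

def cross_window(grid, peak):
    """The peak plus its up/down/left/right neighbours inside the grid."""
    r, c = peak
    rows, cols = len(grid), len(grid[0])
    candidates = [(r, c), (r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return [(i, j) for i, j in candidates if 0 <= i < rows and 0 <= j < cols]
```

For an interior peak, `nxm_window` returns nine positions (as in the FIG. 3 example) and `cross_window` five (as in the FIG. 7 example); near an edge, fewer positions survive the clipping.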
  • the preset area range may also be determined by the following method, as shown in FIG8 , the method steps include:
  • Step 810 sort all the sensing values from large to small.
  • Step 820 selecting an area formed by sensing positions of sensing values with the first h in sorting order as a preset area range.
  • all the sensing values shown in FIG. 4 are sorted from large to small, namely: 394, 382, 131, 126, 114, 112, 54, 36, 22, 21, 13, 13.
  • For example, when h is 6, the preset area range consists of the sensing positions corresponding to the sensing values 394, 382, 131, 126, 114, and 112, which are the positions of the middle two rows in FIG. 4.
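The sorting-based rule can be sketched as follows (Python; the function name is ours, and the grid below reuses the twelve FIG. 4 values in an arrangement we assumed for the sketch, since the figure itself is not reproduced here).

```python
def top_h_positions(grid, h):
    """Positions of the h largest sensing values (the peak is rank 1)."""
    flat = [(grid[r][c], (r, c))
            for r in range(len(grid)) for c in range(len(grid[0]))]
    flat.sort(key=lambda item: item[0], reverse=True)   # large to small
    return [pos for _, pos in flat[:h]]

# The twelve FIG. 4 values, laid out in an assumed arrangement:
grid = [
    [22, 21, 13, 13],
    [131, 394, 382, 126],
    [114, 112, 54, 36],
]
```

With h = 6, the selected positions are those holding 394, 382, 131, 126, 114, and 112, matching the first six entries of the sorted list above.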
  • determining the type of the touch object may be implemented in the following manner, as shown in FIG. 9 , the method steps include:
  • Step 910 Obtain the total sensing value or the sensing average value corresponding to each cycle within a preset time period.
  • the total sensing value or sensing average value is determined according to the sensing values of the sensing positions in the reference area obtained in each cycle, and the reference area includes the touch area or a preset area range in the touch area. That is, the total sensing value or sensing average value of each cycle can be determined according to the sensing values of the touch area obtained in each cycle, or the total sensing value or sensing average value of each cycle can be determined according to the sensing values of all candidate positions in the preset area range obtained in each cycle.
  • the capacitive screen works in a scanning manner at certain time intervals, so there may be at least one cycle within a preset time period.
  • the method further comprises:
  • Step 920 selecting the maximum value within the preset time period from the total sensing values or sensing average values corresponding to each cycle within the preset time period.
  • Step 930 comparing the maximum value with a first preset threshold value, obtaining a comparison result, and determining the type of the touch object according to the comparison result.
  • When the maximum value is greater than or equal to the first preset threshold, the type of the touch object is determined to be the first type;
  • or, when the maximum value is less than the first preset threshold, the type of the touch object is determined to be the second type.
  • When a passive stylus touches the touch screen, the sum of the sensing values in the reference area (for example, a preset 3×3 nine-square-grid area centred on the peak) is usually between 413 and 453, while when a finger touches the touch screen, the sum of the sensing values in the reference area is usually between 886 and 1356.
  • FIG10 is a schematic diagram showing the comparison of the sum of the sensing values generated after the touch screen is touched by a finger and a passive stylus, respectively.
  • For example, when the maximum value is greater than or equal to 660, it is determined that the type of the touch object is a finger; otherwise, when the maximum value is less than 660, it is determined that the type of the touch object is a passive stylus.
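Steps 910 to 930 can be sketched as follows (Python; the function name is ours, and 660 is the example threshold given above).

```python
def classify_over_cycles(per_cycle_totals, threshold=660):
    """per_cycle_totals: total (or average) sensing value of the
    reference area for each scan cycle in the preset time period."""
    peak = max(per_cycle_totals)        # step 920: maximum over cycles
    # Step 930: compare the maximum with the first preset threshold.
    return "finger" if peak >= threshold else "passive stylus"
```

With hypothetical per-cycle totals in the stylus range, `classify_over_cycles([413, 441, 453, 428])` returns "passive stylus" (453 is below 660), while `classify_over_cycles([886, 1200, 990])` returns "finger".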
  • the method can also include the following method steps, as shown in FIG11:
  • Step 1110 obtaining a total sensing value or a sensing average value corresponding to each scanning frame in a preset number of continuous scanning frames.
  • the total sensing value or the sensing average value is determined according to the sensing value of the sensing position in the reference area acquired in each scanning frame, and the reference area includes the touch area or a preset area range in the touch area.
  • Step 1120 selecting a maximum value among the preset number of continuous scanning frames from the total sensing values or sensing average values corresponding to each scanning frame among the preset number of continuous scanning frames.
  • Step 1130 comparing the maximum value with the first preset threshold value, obtaining a comparison result, and determining the type of the touch object according to the comparison result.
  • When the maximum value is greater than or equal to the first preset threshold, the type of the touch object is determined to be the first type;
  • or, when the maximum value is less than the first preset threshold, the type of the touch object is determined to be the second type.
  • the judgment process is similar to the principle of judging the type of the touch object in the previous example, and will not be explained in detail here.
  • the type of the touch object can also be determined in the following manner. Please refer to the method steps shown in Figures 12 and 13 respectively.
  • determining the type of the touch object may be achieved in the following manner, as shown in FIG. 12 .
  • the method steps include:
  • Step 1210 obtaining a total sensing value or a sensing average value corresponding to each cycle within a preset time period.
  • the total sensing value or the sensing average value is determined according to the sensing value of the sensing position in the reference area acquired in each cycle, and the reference area includes the touch area or a preset area range in the touch area.
  • the capacitive screen works in a scanning manner at certain time intervals, so there may be at least one cycle within a preset time period.
  • the method further comprises:
  • Step 1220 selecting a maximum value within the preset time period from the total sensing values or sensing average values corresponding to each cycle within the preset time period.
  • the maximum value of the total sensing values within the preset time period is selected from the total sensing values corresponding to all cycles;
  • the maximum value of the sensing average values within a preset time period is selected from the sensing average values corresponding to all cycles.
  • Step 1230 compare the maximum value with the first preset threshold value and/or the second preset threshold value and/or the third preset threshold value respectively, obtain a comparison result, and determine the type of the touch object according to the comparison result.
  • When the maximum value is less than the second preset threshold and greater than or equal to the first preset threshold, the type of the touch object is determined to be the first type;
  • when the maximum value is less than the first preset threshold and greater than or equal to the third preset threshold, the type of the touch object is determined to be the second type;
  • or, when the maximum value is less than the third preset threshold, or greater than or equal to the second preset threshold, the type of the touch object is determined to be a third type other than the first type and the second type; wherein the second preset threshold is greater than the first preset threshold.
  • When a passive stylus touches the touch screen, the sum of the sensing values in the reference area (for example, a preset 3×3 nine-square-grid area centred on the peak) is usually between 413 and 453, while when a finger touches the touch screen, the sum of the sensing values in the reference area is usually between 886 and 1356.
  • FIG. 10 is a schematic diagram showing the comparison of the sum of the sensing values generated after the touch screen is touched by a finger and a passive stylus, respectively.
  • For example, the first preset threshold is 660;
  • the second preset threshold is 1357;
  • the third preset threshold is 413 (consistent with the requirement that the second preset threshold be greater than the first).
  • If the maximum value is less than 1357 and greater than or equal to 660, the type of the touch object is determined to be a hand. If the maximum value is less than 660 and greater than or equal to 413, the type of the touch object is determined to be a passive stylus.
  • Otherwise, the type of the touch object is the third type other than the first type and the second type.
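Since the text requires the second preset threshold to exceed the first, the example values read as first = 660, second = 1357, third = 413. A sketch of the resulting three-way rule (Python; names and the "other" label are ours):

```python
# Example thresholds from the text; the claim requires SECOND > FIRST.
FIRST, SECOND, THIRD = 660, 1357, 413

def classify_three_way(per_cycle_totals):
    peak = max(per_cycle_totals)
    if FIRST <= peak < SECOND:
        return "hand"            # first type
    if THIRD <= peak < FIRST:
        return "passive stylus"  # second type
    return "other"               # third type (outside both bands)
```

Values at or above 1357, or below 413, fall outside both bands and are mapped to the third type, which could cover, for example, unintended contacts.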
  • the method can also include the following method steps, as shown in FIG13:
  • Step 1310 obtaining a total sensing value or a sensing average value corresponding to each scanning frame in a preset number of continuous scanning frames.
  • the total sensing value or the sensing average value is determined according to the sensing value of the sensing position in the reference area acquired in each scanning frame, and the reference area includes the touch area or a preset area range in the touch area.
  • Step 1320 selecting a maximum value among the preset number of continuous scanning frames from the total sensing values or sensing average values corresponding to each scanning frame among the preset number of continuous scanning frames.
  • Step 1330 compare the maximum value with the first preset threshold value and/or the second preset threshold value and/or the third preset threshold value respectively, obtain a comparison result, and determine the type of the touch object according to the comparison result.
  • the comparison process and the process of determining the type of the touch object are similar to the processes performed in the method embodiment corresponding to FIG. 12 , and will not be described in detail here.
  • the method may further include the following method steps, as shown in FIG14 , the method steps include:
  • Step 1410 acquiring touch coordinate data of the touch object.
  • Step 1420 reporting the touch coordinate data and the type of the touch object to a preset application.
  • the solution for acquiring the touch coordinate data of the touch object is a relatively mature technology and will not be described in detail here.
  • Reporting the touch coordinate data and the type of the touch object to the preset application facilitates the preset application to generate a control instruction according to the type of the touch object and the touch coordinate data, and to execute a touch operation corresponding to the control instruction.
  • When the touch object is a passive stylus, performing touch operations can realize operations such as drawing and writing; when the touch object is a hand, performing touch operations can realize moving, scaling, and rotating the canvas.
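On the application side, the reported pair of touch type and coordinates then selects the operation. A minimal dispatch sketch (Python; the function name and returned strings are illustrative, not part of the document):

```python
def handle_touch(touch_type, x, y):
    """Pick an operation from the reported touch type and coordinates."""
    if touch_type == "passive stylus":
        return f"draw at ({x}, {y})"        # drawing / writing
    if touch_type == "hand":
        return f"pan canvas to ({x}, {y})"  # move / scale / rotate
    return "ignore"                         # e.g. a third, unknown type
```

A real application would route these into its rendering and gesture layers; the point here is only that the reported type drives the branch.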
  • FIG. 15 shows a schematic diagram of the structure of a passive stylus, which is a common capacitive passive stylus.
  • FIG16 is a touch control device provided in an embodiment of the present application, and the touch control device includes: a touch sensor 111, a processor 112, and a memory 113.
  • a communication interface 114 and a communication bus 115 may also be included, wherein the touch sensor 111, the processor 112, the memory 113, and the communication interface 114 communicate with each other through the communication bus 115.
  • the memory 113 is used for storing a computer program;
  • the touch sensor 111 is used to detect a touch action when a touch object touches the touch screen;
  • the processor 112 is configured to execute the program stored in the memory 113 to implement the touch object recognition method provided by any one of the aforementioned method embodiments, including:
  • the type of the touch object is determined according to the sensing values of the sensing positions included in the touch area.
  • determining the type of the touch object according to the sensing value of the sensing position included in the touch area includes:
  • each sensing position within the preset area is obtained as a candidate position
  • the type of the touch object is determined based on the sensing value corresponding to the candidate position.
  • the preset area range includes:
  • with the peak position as a reference point, the positions immediately adjacent to the peak position in the four directions of up, down, left, and right are selected, and together with the peak position they constitute the preset area range.
  • the preset area range includes:
  • the area formed by the sensing positions whose sensing values rank in the first h places is selected as the preset area range, wherein the peak position is the sensing position whose sensing value ranks first, and h is a positive integer greater than or equal to 2.
  • determining the type of the touch object based on the sensing value corresponding to the candidate position includes:
  • a total sensing value or a sensing average value corresponding to each cycle is obtained, wherein the total sensing value or the sensing average value is determined according to the sensing value of the sensing position in the reference area obtained in each cycle, and the reference area includes the touch area or a preset area range in the touch area;
  • the type of the touch object is determined to be the second type.
  • determining the type of the touch object based on the sensing value corresponding to the candidate position includes:
  • a total sensing value or a sensing average value corresponding to each cycle is obtained, wherein the total sensing value or the sensing average value is determined according to the sensing value of the sensing position in the reference area obtained in each cycle, and the reference area includes the touch area or a preset area range in the touch area;
  • the type of the touch object is determined to be a third type other than the first type and the second type; wherein the second preset threshold is greater than the first preset threshold.
  • determining the type of the touch object based on the sensing value corresponding to the candidate position includes:
  • the type of the touch object is determined to be the second type.
  • determining the type of the touch object based on the sensing value corresponding to the candidate position includes:
  • the type of the touch object is determined to be a third type other than the first type and the second type, wherein the second preset threshold is greater than the first preset threshold.
  • the method further includes:
  • the touch coordinate data and the type of the touch object are reported to a preset application program, so that the preset application program generates a control instruction according to the type of the touch object and the touch coordinate data, and performs a touch operation corresponding to the control instruction.
  • a touch device determines a touch area corresponding to the touch object and obtains a sensing value corresponding to a sensing position included in the touch area after detecting that a touch object touches the touch screen. Then, the type of the touch object is determined based on the sensing value of the sensing position included in the touch area.
  • the sensing value of the touch area is larger and more obvious than the sensing value of other positions. There will be relatively large differences in the sensing values generated by different touch objects touching the touch screen. Therefore, it will be more accurate to judge the type of the touch object based on the sensing value of the sensing position in the touch area.
  • the embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor,
  • the steps of the touch object recognition method provided in any of the aforementioned method embodiments are implemented.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Position Input By Displaying (AREA)

Abstract

A touch object recognition method, a touch device, and a storage medium. The method includes: after detecting that a touch object touches a touch screen, determining a touch area corresponding to the touch object (110); obtaining sensing values corresponding to the sensing positions included in the touch area (120); and determining the type of the touch object according to the sensing values of the sensing positions included in the touch area (130). After a touch object touches the touch screen, a change in the sensing values within the touch area can usually be detected. Compared with the sensing values at other positions, those within the touch area are larger and more distinct, and different touch objects produce markedly different sensing values when they touch the touch screen. Judging the type of the touch object from the sensing values of the sensing positions within the touch area is therefore more accurate.

Description

Touch object recognition method, touch device, and storage medium
This application claims priority to Chinese patent application No. 202211352689.5, entitled "Touch object recognition method, touch device and storage medium" and filed with the China National Intellectual Property Administration on October 31, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of this application relate to the field of computer technology, and relate, for example, to a touch object recognition method, a touch device, and a storage medium.
Background
In capacitive touch technology, an electronic circuit is usually added to an active stylus so that the stylus actively transmits a communication signal; upper-layer software can then identify whether the current touch object is an active stylus or a finger according to whether such a signal is actively transmitted, and perform different operations accordingly. For example, if a stylus is identified, drawing and writing operations are performed; if a finger is identified, canvas operations such as moving, zooming, and rotating are performed.
However, in the above scheme, an active stylus costs considerably more than a passive stylus (which has no electronic circuit and cannot actively transmit a communication signal), and an active stylus also needs periodic battery replacement or charging, which is inconvenient for users.
If a passive stylus is adopted to avoid these problems, current technology cannot distinguish a passive stylus from a finger. A method capable of distinguishing a passive stylus from a finger is therefore urgently needed.
Summary
This application provides a touch object recognition method, a touch device, and a storage medium to solve some or all of the above technical problems in the related art.
In a first aspect, this application provides a touch object recognition method, the method including:
after detecting that a touch object touches a touch screen, determining a touch area corresponding to the touch object;
obtaining sensing values corresponding to the sensing positions included in the touch area;
determining the type of the touch object according to the sensing values of the sensing positions included in the touch area.
Optionally, determining the type of the touch object according to the sensing values of the sensing positions included in the touch area includes:
selecting, from the touch area, the peak position with the largest sensing value;
taking the peak position as a reference point, obtaining each sensing position within a preset area range as a candidate position;
determining the type of the touch object based on the sensing values corresponding to the candidate positions.
Optionally, the preset area range includes:
taking the peak position as a reference point, selecting an N-by-M box as the preset area range, where N and M are both positive integers greater than or equal to 2;
or, taking the peak position as a reference point, forming the preset area range from the positions immediately adjacent to the peak position in the four directions of up, down, left, and right, together with the peak position.
Optionally, the preset area range includes:
sorting all the sensing values in descending order;
taking, as the preset area range, the area formed by the sensing positions of the top h sensing values, where the peak position is the sensing position whose sensing value ranks first and h is a positive integer greater than or equal to 2.
Optionally, determining the type of the touch object based on the sensing values corresponding to the candidate positions includes:
within a preset time period, obtaining a total sensing value or an average sensing value corresponding to each cycle, where the total or average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each cycle, and the reference area includes the touch area or the preset area range within the touch area;
selecting the maximum value within the preset time period from the total or average sensing values respectively corresponding to the cycles;
when the maximum value is greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
or, when the maximum value is less than the first preset threshold, determining that the type of the touch object is a second type.
Optionally, determining the type of the touch object based on the sensing values corresponding to the candidate positions includes:
within a preset time period, obtaining a total sensing value or an average sensing value corresponding to each cycle, where the total or average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each cycle, and the reference area includes the touch area or the preset area range within the touch area;
selecting the maximum value within the preset time period from the total or average sensing values respectively corresponding to the cycles;
when the maximum value is less than a second preset threshold and greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
when the maximum value is less than the first preset threshold and greater than or equal to a third preset threshold, determining that the type of the touch object is a second type;
or,
when the maximum value is less than the third preset threshold, or greater than or equal to the second preset threshold, determining that the type of the touch object is a third type other than the first type and the second type; the second preset threshold is greater than the first preset threshold.
Optionally, determining the type of the touch object based on the sensing values corresponding to the candidate positions includes:
obtaining the total sensing value or average sensing value respectively corresponding to each scan frame in a preset number of consecutive scan frames, where the total or average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each scan frame, and the reference area includes the touch area or the preset area range within the touch area;
selecting the maximum value among the preset number of consecutive scan frames from the total or average sensing values respectively corresponding to the frames;
when the maximum value is greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
or, when the maximum value is less than the first preset threshold, determining that the type of the touch object is a second type.
Optionally, determining the type of the touch object based on the sensing values corresponding to the candidate positions includes:
obtaining the total sensing value or average sensing value respectively corresponding to each scan frame in a preset number of consecutive scan frames, where the total or average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each scan frame, and the reference area includes the touch area or the preset area range within the touch area;
selecting the maximum value among the preset number of consecutive scan frames from the total or average sensing values respectively corresponding to the frames;
when the maximum value is less than a second preset threshold and greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
when the maximum value is less than the first preset threshold and greater than or equal to a third preset threshold, determining that the type of the touch object is a second type;
or,
when the maximum value is less than the third preset threshold, or greater than or equal to the second preset threshold, determining that the type of the touch object is a third type other than the first type and the second type, where the second preset threshold is greater than the first preset threshold.
Optionally, after determining the type of the touch object based on the sensing values corresponding to the candidate positions, the method further includes:
obtaining touch coordinate data of the touch object;
reporting the touch coordinate data and the type of the touch object to a preset application program, so that the preset application program generates a control instruction according to the type of the touch object and the touch coordinate data, and performs a touch operation corresponding to the control instruction.
In a second aspect, a touch device is provided, including a touch sensor, a processor, and a memory;
the memory is configured to store a computer program;
the touch sensor is configured to detect a touch action when a touch object touches the touch screen;
the processor is configured to implement, when executing the program stored in the memory, the steps of the touch object recognition method of any embodiment of the first aspect.
In a third aspect, a computer-readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, the steps of the touch object recognition method of any embodiment of the first aspect are implemented.
Compared with the related art, the above technical solutions provided by the embodiments of this application have the following advantage:
with the method provided by the embodiments of this application, after a touch object is detected to have touched the touch screen, the touch area corresponding to the touch object is determined and the sensing values corresponding to the sensing positions included in the touch area are obtained; the type of the touch object is then determined according to those sensing values. After a touch object touches the touch screen, a change in the sensing values within the touch area can usually be detected; compared with other positions, the sensing values in the touch area are larger and more distinct, and different touch objects produce markedly different sensing values. Judging the type of the touch object from the sensing values of the sensing positions within the touch area is therefore more accurate.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of a touch object recognition method provided by an embodiment of this application;
FIG. 2 is a schematic diagram of the internal structure of a touch screen provided by this application;
FIG. 3 is a schematic diagram of the sensing data generated when a touch object touches different positions with different forces, provided by this application;
FIG. 4 is a schematic diagram of the sensing data generated when another touch object touches different positions with different forces, provided by this application;
FIG. 5 is a schematic diagram of the sensing data generated when another touch object touches different positions with different forces, provided by this application;
FIG. 6 is a schematic flowchart of another touch object recognition method provided by an embodiment of this application;
FIG. 7 is a schematic structural diagram of a preset area range formed by candidate positions, provided by this application;
FIG. 8 is a schematic flowchart of a method for obtaining a preset area range provided by this application;
FIG. 9 is a schematic flowchart of a method for determining the type of a touch object provided by this application;
FIG. 10 is a schematic comparison of the sums of sensing values generated after a finger and a passive stylus respectively touch the touch screen, provided by this application;
FIG. 11 is a schematic flowchart of another method for determining the type of a touch object provided by this application;
FIG. 12 is a schematic flowchart of another method for determining the type of a touch object provided by this application;
FIG. 13 is a schematic flowchart of another method for determining the type of a touch object provided by this application;
FIG. 14 is a schematic flowchart of another touch object recognition method provided by an embodiment of this application;
FIG. 15 is a schematic structural diagram of a passive stylus provided by this application;
FIG. 16 is a schematic structural diagram of a touch device provided by an embodiment of this application.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of this application clearer, the technical solutions in the embodiments of this application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.
To facilitate understanding of the embodiments of this application, they are further explained below with reference to the drawings; the embodiments do not constitute a limitation on the embodiments of this application.
In view of the technical problems mentioned in the Background, an embodiment of this application provides a touch object recognition method. In one embodiment, referring to FIG. 1, a schematic flowchart of a touch object recognition method provided by an embodiment of this application, the method includes the following steps.
Step 110: after detecting that a touch object has touched the touch screen, determine the touch area corresponding to the touch object.
In one embodiment, the touch area corresponding to the touch object can be determined in multiple ways; commonly used techniques include the nine-grid sliding-window method and the watershed algorithm. The detailed steps for determining the touch area are not repeated here.
Step 120: obtain the sensing values corresponding to the sensing positions included in the touch area.
In one embodiment, taking a touch object touching a capacitive touch screen as an example and referring to FIG. 2, in current capacitive touch technology the entire panel of the touch screen contains interleaved drive electrodes and sensing electrodes. Each sensing position is a unit position where a drive electrode and a sensing electrode cross; when it is touched, its capacitance changes. The sensing positions are marked in FIG. 2.
After a touch object touches the touch screen, the capacitance data change within the touch area centered on the touch object; after hardware driving, analog-to-digital conversion, and similar operations, the changes typically appear as shown in FIG. 3 to FIG. 5, which illustrate the sensing data (capacitance data) produced when a touch object touches different positions with different forces. The farther a sensing position is from the touch center, the smaller its sensing value, and sensing positions outside the touch area usually have a sensing value of 0.
Step 130: determine the type of the touch object according to the sensing values of the sensing positions included in the touch area.
In one embodiment, the sensing values produced by different types of touch objects differ to some extent. For example, when a capacitive passive stylus touches the touch screen, the sum of the sensing values within the touch area is usually between 413 and 453, whereas when a finger touches the touch screen, the sum is usually between 886 and 1356.
That is, the sums of the sensing values in the touch area differ markedly between touch objects, so the type of the touch object can be determined from the sensing values of the sensing positions in the touch area: for example, compute the sum of the sensing values of all sensing positions in the touch area, or the average sensing value over all of them, and then determine the type from the sum or the average.
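The computation described for step 130 can be sketched as follows. This is a minimal illustration, assuming the touch area is available as a list of (row, column, sensing value) triples; the sample values are illustrative, not taken from the figures.

```python
def area_features(touch_area):
    """Sum and average of the sensing values of the sensing positions in a touch area."""
    values = [value for _, _, value in touch_area]
    total = sum(values)
    return total, total / len(values)

# Illustrative touch area: nine (row, column, sensing value) triples.
touch_area = [(0, 0, 36), (0, 1, 108), (0, 2, 37),
              (1, 0, 127), (1, 1, 446), (1, 2, 132),
              (2, 0, 30), (2, 1, 114), (2, 2, 35)]

# Either the sum or the average can then be compared against preset thresholds.
total, average = area_features(touch_area)
```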
With the touch object recognition method provided by the embodiments of this application, after a touch object is detected to have touched the touch screen, the touch area corresponding to the touch object is determined and the sensing values corresponding to the sensing positions included in the touch area are obtained; the type of the touch object is then determined according to those sensing values. After a touch object touches the touch screen, a change in the sensing values within the touch area can usually be detected; compared with other positions, the sensing values in the touch area are larger and more distinct, and different touch objects produce markedly different sensing values. Judging the type of the touch object from the sensing values of the sensing positions within the touch area is therefore more accurate.
Optionally, building on the above embodiment, note that under certain conditions, directly using the sensing values of all sensing positions in the touch area to determine the type may introduce errors. For example, if the touch area is large, computing over the whole area may reduce the feature differences between touch objects. Suppose there are multiple touch objects, each with a touch area containing 100 sensing positions; after the 100 sensing values are summed, both sums may fall between 886 and 1356 even though one of the objects is actually a passive stylus and the other a finger.
In addition, if the collected capacitance data (the sensing values) contain noise, a larger touch area means more noise data, which also affects the judgment of the touch object type.
To avoid these special cases and make the judgment of the touch object type more rigorous and accurate, this application optionally provides another touch object recognition method. Referring to FIG. 6, the method includes the following steps.
Step 610: select, from the touch area, the peak position with the largest sensing value.
In one embodiment, for example, the peak position in FIG. 3 is the position whose sensing value is 446; in FIG. 4 it is the position whose sensing value is 394; and in FIG. 5 it is the position whose sensing value is 211.
Step 620: taking the peak position as the reference point, obtain each sensing position within a preset area range as a candidate position.
In one embodiment, as introduced above, when a touch object touches the touch screen the sensing values change, and the closer a position is to the touch center, the larger its sensing value. We can therefore assume that the position with the largest sensing value, i.e. the peak position, is the touch center; taking it as the reference point, each sensing position within the preset area range is taken as a candidate position, and all the candidate positions together form that preset area range.
Step 630: determine the type of the touch object based on the sensing values corresponding to the candidate positions.
The way the type is determined is the same as in step 130, except that the data used are the sensing values of all candidate positions within the preset area range, rather than those of all sensing positions in the touch area.
Optionally, building on the previous embodiment, when obtaining each sensing position within the preset area range as a candidate position with the peak position as the reference point, the candidate positions may be selected as follows.
In a first case, with the peak position as the reference point, an N-by-M box may be selected as the preset area range.
In one example, N and M are both positive integers greater than or equal to 2.
Taking FIG. 3 as an example, with the position whose sensing value is 446 as the reference point, the positions whose sensing values are 36, 108, 37, 127, 132, 30, 114, and 35 are selected as candidate positions, and the area formed by all the candidate positions is the preset area range.
Alternatively, with the peak position as the reference point, the positions immediately adjacent to the peak in the four directions of up, down, left, and right, together with the peak position, form the preset area range.
In one embodiment, see FIG. 7, which takes the peak whose sensing value is 211 as the reference and selects the positions immediately adjacent to it in the four directions, i.e., the positions whose sensing values are 203, 199, 15, and 15, as candidate positions; these candidate positions form the preset area range.
In another optional example, the preset area range may also be obtained as follows; referring to FIG. 8, the method includes the following steps.
Step 810: sort all the sensing values in descending order.
Step 820: take the area formed by the sensing positions of the top h sensing values as the preset area range.
Illustratively, taking FIG. 4 as an example, sorting all the sensing values shown in FIG. 4 in descending order gives: 394, 382, 131, 126, 114, 112, 54, 36, 22, 21, 13, 13.
If the area formed by the sensing positions of the top six sensing values is taken as the preset area range, with those positions as candidate positions, then the preset area range consists of the sensing positions whose values are 394, 382, 131, 126, 114, and 112, i.e., the middle two rows in FIG. 4.
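The three ways of choosing the preset area range described above can be sketched as follows. This is an illustration only, assuming the scan data arrive as a 2-D list of sensing values; the sample frame is illustrative rather than an exact grid from the figures.

```python
def peak_position(frame):
    """Row/column of the largest sensing value in the frame (the peak position)."""
    return max(((r, c) for r in range(len(frame)) for c in range(len(frame[0]))),
               key=lambda rc: frame[rc[0]][rc[1]])

def box_candidates(frame, n=3, m=3):
    """Candidate positions: an n-by-m box centred on the peak, clipped at the edges."""
    pr, pc = peak_position(frame)
    rows, cols = len(frame), len(frame[0])
    return [(r, c)
            for r in range(max(0, pr - n // 2), min(rows, pr + n // 2 + 1))
            for c in range(max(0, pc - m // 2), min(cols, pc + m // 2 + 1))]

def cross_candidates(frame):
    """Candidate positions: the peak plus its up/down/left/right neighbours."""
    pr, pc = peak_position(frame)
    rows, cols = len(frame), len(frame[0])
    return [(pr + dr, pc + dc)
            for dr, dc in ((0, 0), (-1, 0), (1, 0), (0, -1), (0, 1))
            if 0 <= pr + dr < rows and 0 <= pc + dc < cols]

def top_h_candidates(frame, h=6):
    """Candidate positions: the sensing positions of the h largest sensing values."""
    cells = [(r, c) for r in range(len(frame)) for c in range(len(frame[0]))]
    return sorted(cells, key=lambda rc: frame[rc[0]][rc[1]], reverse=True)[:h]

# Illustrative 3x4 frame of sensing values; the peak (394) is at row 1, column 1.
frame = [[13, 22, 21, 13],
         [54, 394, 382, 36],
         [112, 131, 126, 114]]
```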
Optionally, in one example, the type of the touch object may be determined as follows; referring to FIG. 9, the method includes the following steps.
Step 910: within a preset time period, obtain the total sensing value or average sensing value corresponding to each cycle.
Here, the total or average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each cycle, the reference area being the touch area or the preset area range within the touch area. That is, the per-cycle total or average may be computed either from the sensing values of the touch area obtained in each cycle, or from the sensing values of all candidate positions within the preset area range obtained in each cycle.
In one embodiment, a capacitive screen works by scanning at a certain time interval, so at least one cycle can exist within the preset time period.
The method therefore further includes the following steps.
Step 920: select the maximum value within the preset time period from the total or average sensing values respectively corresponding to the cycles.
Step 930: compare the maximum value with a first preset threshold, obtain the comparison result, and determine the type of the touch object according to the result.
In one embodiment, when the maximum value is greater than or equal to the first preset threshold, the type of the touch object is determined to be a first type;
or, when the maximum value is less than the first preset threshold, the type of the touch object is determined to be a second type.
In one embodiment, as introduced above, statistics over a large amount of experimental data show that when a capacitive passive stylus touches the touch screen, the sum of the sensing values within the reference area (for example, the preset area range formed by a 3×3 nine-grid centered on the peak) is usually between 413 and 453, whereas for a finger it is usually between 886 and 1356. See FIG. 10, which compares the sums of sensing values produced after a finger and a passive stylus respectively touch the touch screen.
Then, suppose an intermediate value between 453 and 886, namely 660, is taken as the preset threshold.
That is, when the maximum value is greater than or equal to 660, the touch object is determined to be a finger; otherwise, when the maximum value is less than 660, it is determined to be a passive stylus.
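Steps 910 to 930 can be sketched as follows. This is a minimal illustration, assuming each scan cycle already yields the total sensing value of the reference area, and using the threshold of 660 discussed above.

```python
FIRST_PRESET_THRESHOLD = 660  # intermediate value between the stylus and finger ranges

def classify_by_cycles(cycle_totals, threshold=FIRST_PRESET_THRESHOLD):
    """Pick the maximum per-cycle total within the preset time period and
    compare it with the first preset threshold."""
    maximum = max(cycle_totals)
    return "finger" if maximum >= threshold else "passive stylus"

kind_a = classify_by_cycles([430, 441, 428])    # every cycle in the stylus range
kind_b = classify_by_cycles([912, 1104, 1356])  # cycles in the finger range
```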
In another optional example, since within the preset time period the capacitive screen works by scanning at a certain time interval, the preset time threshold may also be converted into a threshold on the number of scan frames. The method may therefore further include the following steps; referring to FIG. 11:
Step 1110: obtain the total sensing value or average sensing value respectively corresponding to each scan frame in a preset number of consecutive scan frames.
Here, the total or average sensing value is determined according to the sensing values of the sensing positions within the reference area obtained in each scan frame, the reference area being the touch area or the preset area range within the touch area.
Step 1120: select the maximum value among the preset number of consecutive scan frames from the total or average sensing values respectively corresponding to the frames.
Step 1130: compare the maximum value with the first preset threshold, obtain the comparison result, and determine the type of the touch object according to the result.
In one embodiment, when the maximum value is greater than or equal to the first preset threshold, the type of the touch object is determined to be a first type;
or, when the maximum value is less than the first preset threshold, the type of the touch object is determined to be a second type.
The judgment process is similar in principle to that of the previous example and is not explained further here.
Optionally, during the judgment of the touch object type, some special cases may occur, for example a maximum value greater than 1356 or a maximum value less than 413. Therefore, to make the judgment of the touch object type more accurate, the type may also be determined in the following ways; see the steps shown in FIG. 12 and FIG. 13 respectively.
In one example, the type of the touch object may be determined as follows; referring to FIG. 12, the method includes the following steps.
Step 1210: within a preset time period, obtain the total sensing value or average sensing value corresponding to each cycle.
Here, the total or average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each cycle, the reference area being the touch area or the preset area range within the touch area.
In one embodiment, a capacitive screen works by scanning at a certain time interval, so at least one cycle can exist within the preset time period.
The method therefore further includes the following steps.
Step 1220: select the maximum value within the preset time period from the total or average sensing values respectively corresponding to the cycles.
In one embodiment, if step 1210 obtains the total sensing value corresponding to each cycle, the maximum total sensing value within the preset time period is selected from the totals of all cycles;
or, if step 1210 obtains the average sensing value corresponding to each cycle, the maximum average sensing value within the preset time period is selected from the averages of all cycles.
Step 1230: compare the maximum value with the first and/or second and/or third preset thresholds, obtain the comparison result, and determine the type of the touch object according to the result.
Specifically, when the maximum value is less than the second preset threshold and greater than or equal to the first preset threshold, the type of the touch object is determined to be a first type;
or, when the maximum value is less than the first preset threshold and greater than or equal to the third preset threshold, the type of the touch object is determined to be a second type;
or, when the maximum value is less than the third preset threshold, or greater than or equal to the second preset threshold, the type of the touch object is determined to be a third type other than the first type and the second type; the second preset threshold is greater than the first preset threshold.
In one embodiment, taking the data of the embodiments corresponding to FIG. 9 or FIG. 11 as an example: when a capacitive passive stylus touches the touch screen, the sum of the sensing values within the reference area (for example, the preset area range formed by a 3×3 nine-grid centered on the peak) is usually between 413 and 453, whereas for a finger it is usually between 886 and 1356. See FIG. 10, which compares the sums of sensing values produced after a finger and a passive stylus respectively touch the touch screen.
Then, suppose the first preset threshold is an intermediate value between 453 and 886, namely 660, the second preset threshold is 1357, and the third preset threshold is 413 (note that the second preset threshold must be greater than the first).
When the maximum among the per-cycle sums of the sensing values of all candidate positions within the preset time period is less than 1357 and greater than or equal to 660, the touch object is determined to be a hand (finger). If the maximum is less than 660 and greater than or equal to 413, the touch object is determined to be a passive stylus.
Of course, if neither of the above two cases applies, the touch object is of another type.
That is, when the maximum value is less than 413, or greater than or equal to 1357, the touch object can be determined to be a third type other than the first type and the second type.
As for what the third type actually is, a method similar to the above one for distinguishing a passive stylus and a finger may be applied again to further determine it; details are not repeated here.
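The three-threshold rule of FIG. 12 can be sketched as follows, using the example values above (first threshold 660, second 1357, third 413); the type labels are illustrative.

```python
FIRST, SECOND, THIRD = 660, 1357, 413  # second preset threshold > first preset threshold

def classify_three_way(maximum):
    """Map the maximum total/average sensing value to a touch-object type."""
    if FIRST <= maximum < SECOND:
        return "first type (finger)"
    if THIRD <= maximum < FIRST:
        return "second type (passive stylus)"
    # maximum < third threshold, or maximum >= second threshold
    return "third type (other)"
```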
In another optional example, since within the preset time period the capacitive screen works by scanning at a certain time interval, the preset time threshold may also be converted into a threshold on the number of scan frames. The method may therefore further include the following steps; referring to FIG. 13:
Step 1310: obtain the total sensing value or average sensing value respectively corresponding to each scan frame in a preset number of consecutive scan frames.
Here, the total or average sensing value is determined according to the sensing values of the sensing positions within the reference area obtained in each scan frame, the reference area being the touch area or the preset area range within the touch area.
Step 1320: select the maximum value among the preset number of consecutive scan frames from the total or average sensing values respectively corresponding to the frames.
Step 1330: compare the maximum value with the first and/or second and/or third preset thresholds, obtain the comparison result, and determine the type of the touch object according to the result.
The comparison process and the judgment of the touch object type are similar to those performed in the method embodiment corresponding to FIG. 12 and are not repeated here.
Optionally, on the basis of any of the above embodiments, the method may further include the following steps; referring to FIG. 14, the method includes:
Step 1410: obtain the touch coordinate data of the touch object.
Step 1420: report the touch coordinate data and the type of the touch object to a preset application program.
In one embodiment, obtaining the touch coordinate data of a touch object is a relatively mature technique and is not described here.
Reporting the touch coordinate data and the type of the touch object to the preset application program allows the application to generate a control instruction according to the type and the coordinate data, and to perform the touch operation corresponding to that instruction.
For example, when the touch object is a stylus, the touch operation may implement drawing, writing, and similar operations; when the touch object is a hand, the touch operation may implement canvas moving, zooming, rotating, and so on.
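How a preset application program might act on the reported pair can be sketched as follows, per step 1420; the operation names and type labels here are assumptions for illustration, not part of any defined reporting protocol.

```python
def handle_report(touch_type, x, y):
    """Turn a reported (type, coordinates) pair into a control instruction."""
    if touch_type == "stylus":
        return ("draw", x, y)   # drawing / writing operations
    if touch_type == "finger":
        return ("pan", x, y)    # canvas move / zoom / rotate operations
    return ("ignore", x, y)     # unrecognised third type

instruction = handle_report("stylus", 120, 48)
```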
FIG. 15 shows a schematic structural diagram of a passive stylus, i.e., a fairly common ordinary capacitive passive stylus.
The above are several method embodiments of touch object recognition provided by this application; other embodiments of touch object recognition provided by this application are described below.
FIG. 16 shows a touch device provided by an embodiment of this application. The touch device includes a touch sensor 111, a processor 112, and a memory 113. It may further include a communication interface 114 and a communication bus 115, and the touch sensor 111, the processor 112, the memory 113, and the communication interface 114 communicate with one another through the communication bus 115.
The memory 113 is configured to store a computer program.
In one embodiment of this application, the touch sensor 111 is configured to detect a touch action when a touch object touches the touch screen;
the processor 112 is configured to implement, when executing the program stored in the memory 113, the touch object recognition method provided by any of the foregoing method embodiments, including:
after detecting that a touch object touches the touch screen, determining a touch area corresponding to the touch object;
obtaining sensing values corresponding to the sensing positions included in the touch area;
determining the type of the touch object according to the sensing values of the sensing positions included in the touch area.
Optionally, determining the type of the touch object according to the sensing values of the sensing positions included in the touch area includes:
selecting, from the touch area, the peak position with the largest sensing value;
taking the peak position as a reference point, obtaining each sensing position within a preset area range as a candidate position;
determining the type of the touch object based on the sensing values corresponding to the candidate positions.
Optionally, the preset area range includes:
taking the peak position as a reference point, selecting an N-by-M box as the preset area range, where N and M are both positive integers greater than or equal to 2;
or, taking the peak position as a reference point, forming the preset area range from the positions immediately adjacent to the peak position in the four directions of up, down, left, and right, together with the peak position.
Optionally, the preset area range includes:
sorting all the sensing values in descending order;
taking, as the preset area range, the area formed by the sensing positions of the top h sensing values, where the peak position is the sensing position whose sensing value ranks first and h is a positive integer greater than or equal to 2.
Optionally, determining the type of the touch object based on the sensing values corresponding to the candidate positions includes:
within a preset time period, obtaining a total sensing value or an average sensing value corresponding to each cycle, where the total or average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each cycle, and the reference area includes the touch area or the preset area range within the touch area;
selecting the maximum value within the preset time period from the total or average sensing values respectively corresponding to the cycles;
when the maximum value is greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
or, when the maximum value is less than the first preset threshold, determining that the type of the touch object is a second type.
Optionally, determining the type of the touch object based on the sensing values corresponding to the candidate positions includes:
within a preset time period, obtaining a total sensing value or an average sensing value corresponding to each cycle, where the total or average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each cycle, and the reference area includes the touch area or the preset area range within the touch area;
selecting the maximum value within the preset time period from the total or average sensing values respectively corresponding to the cycles;
when the maximum value is less than a second preset threshold and greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
when the maximum value is less than the first preset threshold and greater than or equal to a third preset threshold, determining that the type of the touch object is a second type;
or,
when the maximum value is less than the third preset threshold, or greater than or equal to the second preset threshold, determining that the type of the touch object is a third type other than the first type and the second type; the second preset threshold is greater than the first preset threshold.
Optionally, determining the type of the touch object based on the sensing values corresponding to the candidate positions includes:
obtaining the total sensing value or average sensing value respectively corresponding to each scan frame in a preset number of consecutive scan frames, where the total or average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each scan frame, and the reference area includes the touch area or the preset area range within the touch area;
selecting the maximum value among the preset number of consecutive scan frames from the total or average sensing values respectively corresponding to the frames;
when the maximum value is greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
or, when the maximum value is less than the first preset threshold, determining that the type of the touch object is a second type.
Optionally, determining the type of the touch object based on the sensing values corresponding to the candidate positions includes:
obtaining the total sensing value or average sensing value respectively corresponding to each scan frame in a preset number of consecutive scan frames, where the total or average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each scan frame, and the reference area includes the touch area or the preset area range within the touch area;
selecting the maximum value among the preset number of consecutive scan frames from the total or average sensing values respectively corresponding to the frames;
when the maximum value is less than a second preset threshold and greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
when the maximum value is less than the first preset threshold and greater than or equal to a third preset threshold, determining that the type of the touch object is a second type;
or,
when the maximum value is less than the third preset threshold, or greater than or equal to the second preset threshold, determining that the type of the touch object is a third type other than the first type and the second type, where the second preset threshold is greater than the first preset threshold.
Optionally, after determining the type of the touch object based on the sensing values corresponding to the candidate positions, the method further includes:
obtaining touch coordinate data of the touch object;
reporting the touch coordinate data and the type of the touch object to a preset application program, so that the preset application program generates a control instruction according to the type of the touch object and the touch coordinate data, and performs a touch operation corresponding to the control instruction.
With the touch device provided by the embodiments of this application, after a touch object is detected to have touched the touch screen, the touch area corresponding to the touch object is determined and the sensing values corresponding to the sensing positions included in the touch area are obtained; the type of the touch object is then determined according to those sensing values. In one embodiment, after a touch object touches the touch screen, a change in the sensing values within the touch area can usually be detected; compared with other positions, the sensing values in the touch area are larger and more distinct, and different touch objects produce markedly different sensing values. Judging the type of the touch object from the sensing values of the sensing positions within the touch area is therefore more accurate.
An embodiment of this application further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the touch object recognition method provided by any of the foregoing method embodiments are implemented.
It should be noted that, herein, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes that element.
The above are only embodiments of this application, enabling those skilled in the art to understand or implement this application. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of this application. Therefore, this application is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features claimed herein.

Claims (11)

  1. A touch object recognition method, wherein the method comprises:
    after detecting that a touch object touches a touch screen, determining a touch area corresponding to the touch object;
    obtaining sensing values corresponding to the sensing positions included in the touch area;
    determining the type of the touch object according to the sensing values of the sensing positions included in the touch area.
  2. The method according to claim 1, wherein determining the type of the touch object according to the sensing values of the sensing positions included in the touch area comprises:
    selecting, from the touch area, a peak position with the largest sensing value;
    taking the peak position as a reference point, obtaining each sensing position within a preset area range as a candidate position;
    determining the type of the touch object based on the sensing values corresponding to the candidate positions.
  3. The method according to claim 2, wherein the preset area range comprises:
    taking the peak position as a reference point, selecting an N-by-M box as the preset area range, wherein N and M are both positive integers greater than or equal to 2;
    or, taking the peak position as a reference point, forming the preset area range from the positions immediately adjacent to the peak position in the four directions of up, down, left, and right, together with the peak position.
  4. The method according to claim 2, wherein the preset area range comprises:
    sorting all the sensing values in descending order;
    taking, as the preset area range, the area formed by the sensing positions of the top h sensing values, wherein the peak position is the sensing position whose sensing value ranks first, and h is a positive integer greater than or equal to 2.
  5. The method according to any one of claims 2-4, wherein determining the type of the touch object comprises:
    within a preset time period, obtaining a total sensing value or an average sensing value corresponding to each cycle, wherein the total sensing value or the average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each cycle, and the reference area comprises the touch area or the preset area range within the touch area;
    selecting the maximum value within the preset time period from the total or average sensing values respectively corresponding to the cycles within the preset time period;
    when the maximum value is greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
    or, when the maximum value is less than the first preset threshold, determining that the type of the touch object is a second type.
  6. The method according to any one of claims 2-4, wherein determining the type of the touch object comprises:
    within a preset time period, obtaining a total sensing value or an average sensing value corresponding to each cycle, wherein the total sensing value or the average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each cycle, and the reference area comprises the touch area or the preset area range within the touch area;
    selecting the maximum value within the preset time period from the total or average sensing values respectively corresponding to the cycles within the preset time period;
    when the maximum value is less than a second preset threshold and greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
    when the maximum value is less than the first preset threshold and greater than or equal to a third preset threshold, determining that the type of the touch object is a second type;
    or,
    when the maximum value is less than the third preset threshold, or greater than or equal to the second preset threshold, determining that the type of the touch object is a third type other than the first type and the second type; wherein the second preset threshold is greater than the first preset threshold.
  7. The method according to any one of claims 2-4, wherein determining the type of the touch object comprises:
    obtaining a total sensing value or an average sensing value respectively corresponding to each scan frame in a preset number of consecutive scan frames, wherein the total sensing value or the average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each scan frame, and the reference area comprises the touch area or the preset area range within the touch area;
    selecting the maximum value among the preset number of consecutive scan frames from the total or average sensing values respectively corresponding to the frames;
    when the maximum value is greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
    or, when the maximum value is less than the first preset threshold, determining that the type of the touch object is a second type.
  8. The method according to any one of claims 2-4, wherein determining the type of the touch object comprises:
    obtaining a total sensing value or an average sensing value respectively corresponding to each scan frame in a preset number of consecutive scan frames, wherein the total sensing value or the average sensing value is determined according to the sensing values of the sensing positions within a reference area obtained in each scan frame, and the reference area comprises the touch area or the preset area range within the touch area;
    selecting the maximum value among the preset number of consecutive scan frames from the total or average sensing values respectively corresponding to the frames;
    when the maximum value is less than a second preset threshold and greater than or equal to a first preset threshold, determining that the type of the touch object is a first type;
    when the maximum value is less than the first preset threshold and greater than or equal to a third preset threshold, determining that the type of the touch object is a second type;
    or,
    when the maximum value is less than the third preset threshold, or greater than or equal to the second preset threshold, determining that the type of the touch object is a third type other than the first type and the second type, wherein the second preset threshold is greater than the first preset threshold.
  9. The method according to any one of claims 1-4, wherein the method further comprises:
    obtaining touch coordinate data of the touch object;
    reporting the touch coordinate data and the type of the touch object to a preset application program, so that the preset application program generates a control instruction according to the type of the touch object and the touch coordinate data, and performs a touch operation corresponding to the control instruction.
  10. A touch device, wherein the touch device comprises a touch sensor, a processor, and a memory;
    the memory is configured to store a computer program;
    the touch sensor is configured to detect a touch action when a touch object touches a touch screen;
    the processor is configured to implement, when executing the program stored in the memory, the steps of the touch object recognition method according to any one of claims 1-9.
  11. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a touch device, implements the steps of the touch object recognition method according to any one of claims 1-9.
PCT/CN2023/128246 2022-10-31 2023-10-31 Touch object recognition method, touch device and storage medium WO2024093978A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211352689.5 2022-10-31
CN202211352689.5A CN117991919A (zh) 2022-10-31 2022-10-31 触控物识别方法、触控设备及存储介质

Publications (1)

Publication Number Publication Date
WO2024093978A1 true WO2024093978A1 (zh) 2024-05-10

Family

ID=90891565

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/128246 WO2024093978A1 (zh) 2022-10-31 2023-10-31 触控物识别方法、触控设备及存储介质

Country Status (2)

Country Link
CN (1) CN117991919A (zh)
WO (1) WO2024093978A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013088982A (ja) * 2011-10-17 2013-05-13 Rohm Co Ltd Touch input device, controller therefor, control method, and electronic apparatus
CN103984432A (zh) * 2013-01-25 2014-08-13 Hideep Inc. Touch screen controller and control method thereof
CN106155438A (zh) * 2014-12-05 2016-11-23 Elan Microelectronics Corp. Capacitive touch device and object identification method thereof
CN106445120A (zh) * 2016-09-05 2017-02-22 Huawei Technologies Co., Ltd. Touch operation recognition method and apparatus
CN106484184A (zh) * 2016-09-30 2017-03-08 Chipone Technology (Beijing) Co., Ltd. Touch screen and recognition method thereof
CN114035706A (zh) * 2021-11-05 2022-02-11 Chipone Technology (Beijing) Co., Ltd. Method, apparatus, device and storage medium for analyzing finger characteristics on a touch screen


Also Published As

Publication number Publication date
CN117991919A (zh) 2024-05-07

Similar Documents

Publication Publication Date Title
US9588621B2 (en) Touch screen controller and method for controlling thereof
JP5520336B2 (ja) タッチパネルの縁部把持検知方法およびそのタッチパネルの縁部把持検知方法に関するデバイス
US7916126B2 (en) Bottom-up watershed dataflow method and region-specific segmentation based on historic data to identify patches on a touch sensor panel
KR100866485B1 (ko) 다접점 위치 변화 감지 장치, 방법, 및 이를 이용한 모바일기기
US20090284495A1 (en) Systems and methods for assessing locations of multiple touch inputs
US20100194701A1 (en) Method of recognizing a multi-touch area rotation gesture
JP2013515302A (ja) 投影型静電容量式タッチパネルを走査する方法、投影型静電容量式タッチパネルの走査のための記憶媒体及び装置
US11256367B2 (en) Techniques for handling unintentional touch inputs on a touch-sensitive surface
US20120242595A1 (en) Method for determining touch point
US20160054831A1 (en) Capacitive touch device and method identifying touch object on the same
US9971429B2 (en) Gesture recognition method, apparatus and device, computer program product therefor
JPWO2015181979A1 (ja) 入力装置、入力方法およびプログラム
CN107037951B (zh) 操作模式自动识别方法及终端
WO2024093978A1 (zh) 触控物识别方法、触控设备及存储介质
US20120127120A1 (en) Touch device and touch position locating method thereof
KR101666580B1 (ko) 터치 검출방법
US10551934B2 (en) Gesture recognition method, apparatus and device, computer program product therefor
CN115237271A (zh) 触控传感器、触摸板、识别非预期触碰的方法与计算机
CN103713840B (zh) 可携式装置及其按键点击范围调整方法
WO2021139096A1 (zh) 触控方法、终端和存储介质
US10712883B2 (en) Electronic device validating multiple finger touch detection through donut shaped touch islands, and related methods
CN107402658B (zh) 侦测触碰或接近的方法与控制器
JP2021082240A (ja) 静電容量方式タッチパネルのマルチモード作業方法
CN106155433B (zh) 一种电容屏点位识别与跟踪的方法及其装置
TW201816578A (zh) 觸控偵測方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23884908

Country of ref document: EP

Kind code of ref document: A1