CN107728860B - Touch point identification method and device of infrared touch screen and touch screen equipment - Google Patents


Info

Publication number
CN107728860B
CN107728860B (application CN201710980632.2A)
Authority
CN
China
Prior art keywords: area, candidate, touch, scanning direction, scanning
Prior art date
Legal status
Active
Application number
CN201710980632.2A
Other languages
Chinese (zh)
Other versions
CN107728860A (en)
Inventor
张艺梅
王武军
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN201710980632.2A
Publication of CN107728860A
Application granted
Publication of CN107728860B
Status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F3/0423 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen using sweeping light beams, e.g. using rotating or vibrating mirror
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104 Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application provides a touch point identification method and device for an infrared touch screen, and a touch screen device. The method comprises the following steps: in one scanning period, performing scanning in all n scanning directions to obtain a touch area in each scanning direction; for the touch area in any scanning direction, acquiring the overlapping area of that touch area and the touch areas in the other scanning directions to obtain a candidate judgment area; if the candidate judgment area is an overlapping area of the touch areas in only m scanning directions, setting a virtual light path boundary that intersects the contour line of the candidate judgment area in a first scanning direction; and if no light path in the first scanning direction passes through the virtual light path boundary, determining that the candidate judgment area is a real touch point, where n is an integer greater than 2 and m is an integer less than n and greater than or equal to 2. The method and the device are applied to touch point identification on infrared touch screens.

Description

Touch point identification method and device of infrared touch screen and touch screen equipment
Technical Field
The application relates to the technical field of infrared touch control, in particular to a method and a device for identifying touch points of an infrared touch screen and touch screen equipment.
Background
A touch screen (also referred to as a touch panel) can detect the presence and location of a touch point within the display area, which simplifies human-computer interaction. Among existing touch technologies, infrared touch technology is widely applied because of its strong environmental adaptability, long service life, and large number of recognizable touch points.
An infrared touch screen is generally rectangular. As shown in fig. 1, it is composed of a long emitting edge 11, a long receiving edge 12, a short emitting edge 13 and a short receiving edge 14. Several emitting lamps 111 are arranged on the emitting edge 11, with several corresponding receiving lamps 121 on the receiving edge 12, and 1-to-1 or 1-to-3 scanning is usually adopted. Fig. 1 takes 1-to-3 scanning as an example: one emitting lamp 111 emits light in three scanning directions (indicated as direction 1, direction 2 and direction 3 in fig. 1), and the 3 opposite receiving lamps 121 receive the light in those three directions. The emitting lamps emit light in sequence within one scanning period, and the corresponding receiving lamps receive light in sequence, thereby forming an optical network. Whether a touch event occurs, and the position of the touch point, can then be judged according to whether the infrared light paths in the scanning directions are blocked.
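As an illustrative sketch (not part of the patent text), the optical network formed by 1-to-3 scanning can be modeled by enumerating emitter-receiver pairs; here each emitting lamp i is assumed to reach the opposite receiving lamps i-1, i and i+1, clipped to the edge of the screen:

```python
def light_paths_1_to_3(num_lamps):
    """Sketch of a 1-to-3 scanning optical network: emitting lamp tx
    sends light to the three opposite receiving lamps tx-1, tx, tx+1
    (clipped to the edge), giving up to three scanning directions
    per lamp. Returns (tx, rx) pairs, one per light path."""
    paths = []
    for tx in range(num_lamps):
        for rx in (tx - 1, tx, tx + 1):
            if 0 <= rx < num_lamps:  # receiver must exist on the edge
                paths.append((tx, rx))
    return paths

print(light_paths_1_to_3(3))
# -> [(0, 0), (0, 1), (1, 0), (1, 1), (1, 2), (2, 1), (2, 2)]
```

The three (tx, rx) offsets correspond to the three scanning directions of fig. 1; paths with the same tx - rx offset are parallel and share a slope.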
In the prior art, the traditional polygon superposition method is usually adopted to identify touch points. Specifically, as shown in fig. 2, the touch areas in each scanning direction are intersected; the overlapping area forms a convex polygon, and the area inside the convex polygon is taken to be a touch point. A touch point identified by the polygon superposition method is currently regarded as a real touch point. The touch point shown in fig. 2 (convex polygon a) is formed by intersecting the touch areas in four scanning directions (directions 1 to 4), where the touch area in each scanning direction is the area composed of 2 continuously blocked light paths in that direction.
Because the polygon superposition method identifies touch points using only the blocked light paths, without considering the influence of the unblocked light paths on accuracy, a touch point may be lost (that is, not identified) when it is not an overlapping area obtained by intersecting touch areas in all scanning directions of the touch screen. The accuracy of touch point identification is therefore low.
Disclosure of Invention
The embodiments of the invention provide a touch point identification method and device for an infrared touch screen, and a touch screen device, which can improve the identification accuracy of infrared touch points.
In a first aspect, a method for identifying touch points of an infrared touch screen is provided, and the method includes:
in a scanning period, scanning in all n scanning directions is performed to obtain a touch area in each scanning direction, wherein the touch area in each scanning direction is an area formed by a single shielded light path or at least two continuous shielded light paths in each scanning direction; wherein n is an integer greater than 2;
for a touch area in any scanning direction, acquiring an overlapping area of the touch area and touch areas in other scanning directions to obtain a candidate judgment area;
if the candidate judgment area is an overlapped area of the touch areas in the m scanning directions, setting a virtual light path boundary which is intersected with the contour line of the candidate judgment area in the first scanning direction; the first scanning direction is a scanning direction that does not overlap with the candidate determination region, and m is an integer smaller than n and greater than or equal to 2;
and if no light path in the first scanning direction passes through the virtual light path boundary, determining the candidate judging area as a real touch point.
Optionally, the method further comprises:
and if the candidate judging area is the overlapped area of the touch areas in the n scanning directions, determining the candidate judging area as a real touch point.
Optionally, the method further comprises:
if the optical path in the first scanning direction passes through the virtual optical path boundary, dividing the candidate judgment area by adopting the optical path to obtain at least two candidate judgment sub-areas;
for each candidate decision sub-region:
and if the candidate judging sub-area comprises at least one intersection point formed by intersecting the blocked light paths, the candidate judging sub-area is a real touch point.
Optionally, the method further comprises:
if the optical path in the first scanning direction passes through the virtual optical path boundary, dividing the candidate judgment area by adopting the optical path to obtain at least two candidate judgment sub-areas;
for each candidate decision sub-region:
and if the candidate judgment sub-area comprises at least two blocked light paths in the scanning direction, the candidate judgment sub-area is a real touch point.
Optionally, the method further comprises:
and if the candidate judging sub-area does not include an intersection point formed by intersecting blocked light paths, the candidate judging sub-area is a ghost point.
optionally, the method further comprises:
and if the candidate judging subarea does not comprise at least two blocked light paths in the scanning direction, the candidate judging subarea is a ghost point.
In a second aspect, an infrared touch screen touch point recognition device is provided, which includes:
the acquisition module is used for executing scanning in all n scanning directions in a scanning period to obtain a touch area in each scanning direction, and acquiring an overlapped area of the touch area and touch areas in other scanning directions for the touch area in any scanning direction to obtain a candidate judgment area; the touch area in each scanning direction is an area formed by each single shielded light path or at least two continuous shielded light paths; wherein n is an integer greater than 2;
the identification module is used for setting a virtual light path boundary which is intersected with the contour line of the candidate judgment area in the first scanning direction if the candidate judgment area is the overlapped area of the touch areas in the m scanning directions; if no light path in the first scanning direction passes through the virtual light path boundary, determining the candidate judging area as a real touch point; the first scanning direction is a scanning direction in which no touch region overlaps the candidate determination region, and m is an integer less than n and greater than or equal to 2.
Optionally, the identification module is further configured to determine that the candidate determination region is a real touch point if the candidate determination region is an overlapping region of touch regions in the n scanning directions.
Optionally, the identification module is further configured to, if an optical path in the first scanning direction passes through the virtual optical path boundary, divide the candidate determination region by using the optical path to obtain at least two candidate determination sub-regions;
for each candidate decision sub-region:
and if the candidate judging sub-area comprises at least one intersection point formed by intersecting the blocked light paths, the candidate judging sub-area is a real touch point.
Optionally, the identification module is further configured to, if an optical path in the first scanning direction passes through the virtual optical path boundary, divide the candidate determination region by using the optical path to obtain at least two candidate determination sub-regions;
for each candidate decision sub-region:
and if the candidate judgment sub-area comprises at least two blocked light paths in the scanning direction, the candidate judgment sub-area is a real touch point.
Optionally, the identification module is further configured to determine that the candidate sub-region is a ghost point if the candidate sub-region does not include an intersection point formed by intersecting blocked light paths.
Optionally, the identification module is further configured to determine that the candidate sub-region is a ghost point if the candidate sub-region does not include occluded light paths in at least two scanning directions.
In a third aspect, an infrared touch screen touch point recognition device is provided, which includes: a processor and a memory;
the memory is used for storing computer execution instructions, and when the infrared touch screen touch point identification device runs, the processor executes the computer execution instructions stored in the memory, so that the infrared touch screen touch point identification device executes the infrared touch screen touch point identification method according to the first aspect or the optional implementation manner thereof.
In a fourth aspect, a touch screen device is provided, which includes the infrared touch screen touch point recognition apparatus according to the second aspect or its optional implementation manner or the third aspect.
In a fifth aspect, a computer storage medium is provided, which includes computer instructions that, when executed on a computer, cause the computer to perform the method for infrared touch screen touch point recognition as described in the first aspect or its alternative implementations.
A sixth aspect provides a computer program product, which includes computer instructions, when the computer instructions are run on a computer, the computer is caused to execute the method for recognizing touch points on an infrared touch screen as described in the first aspect or its optional implementation manner.
According to the method, the device and the touch screen equipment for identifying the touch points of the infrared touch screen, scanning in all n scanning directions is performed in one scanning period to obtain the touch area in each scanning direction, and for the touch area in any one scanning direction, the overlapped area of the touch area and the touch areas in other scanning directions is obtained to obtain a candidate judgment area; if the candidate judgment area is an overlapped area of the touch areas in the m scanning directions, setting a virtual light path boundary which is intersected with the contour line of the candidate judgment area in the first scanning direction; and if no light path in the first scanning direction passes through the virtual light path boundary, determining the candidate judging area as a real touch point.
Compared with the prior art, by the scheme, the virtual light path boundary intersected with the contour line of the candidate judgment area in the first scanning direction can be set under the condition that the candidate judgment area is the overlapped area of the touch areas in the m scanning directions; and judging whether the optical path in the first scanning direction does not pass through the virtual optical path boundary or not to determine whether the candidate judging area is a real touch point or not. Therefore, under the condition that the candidate judging area is not an overlapped area obtained by intersecting the touch areas in all scanning directions, the candidate judging area is not directly discarded, and whether the candidate judging area is a real touch point or not is judged according to an unshielded light path in the first scanning direction, so that the identification precision of the touch point can be improved.
Drawings
FIG. 1 is a schematic structural diagram of an infrared touch screen in the prior art;
FIG. 2 is a schematic diagram illustrating a conventional polygon superposition method for identifying touch points in the prior art;
fig. 3 is a schematic diagram of 2 scanning directions corresponding to a long emitting edge in a 1-to-2 scanning manner according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a touch area provided by an embodiment of the invention;
fig. 5 is a first schematic view illustrating a method for identifying a touch point of an infrared touch screen according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a second method for identifying touch points of an infrared touch screen according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a candidate decision region according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of determining a virtual lightpath boundary according to an embodiment of the present invention;
FIG. 9 is a first schematic diagram illustrating a method for segmenting candidate decision regions according to an embodiment of the present invention;
FIG. 10 is a second schematic diagram illustrating a candidate decision region segmentation according to an embodiment of the present invention;
fig. 11 is a schematic diagram illustrating a determination of whether an intersection point formed by intersecting blocked optical paths exists in a candidate judging sub-region according to an embodiment of the present invention;
fig. 12 is a first schematic structural diagram of an infrared touch screen touch point identification device according to an embodiment of the present invention;
fig. 13 is a second schematic structural diagram of an infrared touch screen touch point identification device according to an embodiment of the present invention.
Detailed Description
The following describes a method and an apparatus for identifying touch points of an infrared touch screen, and a touch screen device in detail, with reference to the accompanying drawings.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to mean serving as an example, illustration or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
The term "and/or" herein merely describes an association between associated objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The symbol "/" herein denotes an "or" relationship between the associated objects; for example, A/B denotes A or B.
In the description of the present invention, the meaning of "a plurality" means two or more unless otherwise specified.
First, some concepts that may be involved in embodiments of the present invention will be described.
1. Scanning direction; in the 1-to-n (n > -1) scanning mode, each light path has a different angle for a particular lamp; for a particular emission lamp, we refer to it as a scan direction for each angle of the n light paths. Therefore, the 1-to-n scanning mode has n scanning directions, and each scanning direction can correspond to a set of parallel optical paths with the same slope, as shown in fig. 3, which is 2 scanning directions corresponding to the long emitting edge in the 1-to-2 scanning mode.
2. A touch area; each scanning direction corresponds to a group of parallel light paths with the same slope; when a touch occurs, the touch point will block one or several continuous parallel light paths, which is called a single blocked light path or a continuous area composed of at least two blocked light paths as a touch area in the scanning direction. Because the concept of the optical path related in the embodiment of the present invention is an analog quantity, and the optical path has a certain width in practical application, that is, generally, one optical path is actually a light band, a single blocked optical path in the embodiment of the present invention may also be a touch area.
Exemplarily, fig. 4 is a schematic diagram of a touch area. Fig. 4 shows a plurality of parallel light paths in one scanning direction, where the dotted lines represent blocked light paths. Among the consecutive blocked light paths, the starting boundary of the first blocked light path (e.g., light path 1 in fig. 4) and the ending boundary of the last one are taken (the starting and ending order of the blocked light paths may be reversed); the touch area in the scanning direction is then the rectangular area formed by light path 1, light path 2 and the boundary of the touch screen in fig. 4.
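The grouping of blocked light paths into touch areas can be sketched as follows (an illustrative sketch, not taken from the patent; light paths in one scanning direction are assumed to be given as a boolean blocked/unblocked sequence indexed along the receiving edge):

```python
def touch_areas(blocked):
    """Group blocked light paths (True = blocked) in one scanning
    direction into touch areas, each returned as an inclusive
    (start_index, end_index) pair. A single blocked light path
    also counts as a touch area, per the definition above."""
    areas, start = [], None
    for i, is_blocked in enumerate(blocked):
        if is_blocked and start is None:
            start = i                      # first path of a new run
        elif not is_blocked and start is not None:
            areas.append((start, i - 1))   # run ended at previous path
            start = None
    if start is not None:                  # run reaches the last path
        areas.append((start, len(blocked) - 1))
    return areas

# Paths 1-2 blocked (cf. fig. 4) plus a single blocked path further on:
print(touch_areas([False, True, True, False, False, True, False]))
# -> [(1, 2), (5, 5)]
```

Each index pair maps to the starting and ending boundaries that delimit the rectangular touch area in that scanning direction.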
Example one
The embodiment of the invention provides a touch point identification method of an infrared touch screen, which can be applied to touch screen equipment. As shown in fig. 5, the method may include:
s11, in one scanning cycle, scanning of all n scanning directions is performed to obtain a touch area in each scanning direction.
The touch area in each scanning direction is an area formed by a single shielded light path or at least two continuous shielded light paths in each scanning direction; and n is an integer greater than 2.
Generally, a touch screen device may include a plurality of scanning directions, and all the n scanning directions refer to all scanning directions included in the touch screen device. For example, if the long emission side of the touch screen device includes 2 scanning directions and the short emission side of the touch screen device includes 2 scanning directions, the touch screen device includes 4 scanning directions.
S12, for the touch area in any one scanning direction, acquiring the overlapping area of the touch area and the touch area in the other scanning direction to obtain the candidate judgment area.
Wherein, if the touch areas in different scanning directions intersect, an overlapping area of the touch areas appears; in the embodiment of the invention, for the touch area in any one scanning direction, the overlapping area of the touch area and the touch area in other scanning directions can be obtained. In the embodiment of the invention, at least one candidate judgment area can be obtained.
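As an illustrative sketch of S12 (not from the patent text), consider the simplified case of two perpendicular scanning directions, so that each touch area is an axis-aligned strip and the candidate judgment area is a rectangle intersection; touch areas in diagonal scanning directions would need general convex-polygon clipping instead:

```python
def overlap(rect_a, rect_b):
    """Intersect two axis-aligned touch-area rectangles, each given
    as (x0, y0, x1, y1); returns the candidate judgment area, or
    None when the touch areas do not overlap."""
    x0 = max(rect_a[0], rect_b[0])
    y0 = max(rect_a[1], rect_b[1])
    x1 = min(rect_a[2], rect_b[2])
    y1 = min(rect_a[3], rect_b[3])
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

# Touch areas in direction 1 and direction 3 (cf. fig. 7) overlapping:
print(overlap((0, 0, 2, 2), (1, 1, 3, 3)))  # -> (1, 1, 2, 2)
print(overlap((0, 0, 1, 1), (2, 2, 3, 3)))  # -> None
```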
S13, if the candidate determination region is an overlapping region of the touch regions in the m scanning directions, setting a virtual optical path boundary intersecting the contour line of the candidate determination region in the first scanning direction.
Wherein the first scanning direction is a scanning direction that does not overlap with the candidate determination region. m is an integer less than n and greater than or equal to 2.
When the real touch point is small, for example, the real touch point is smaller than the widths of two adjacent optical paths, the candidate determination region may be an overlapping region of touch regions in less than n scanning directions.
The first scanning direction in the embodiment of the present invention is a scanning direction satisfying one of the following 2 conditions: 1. there is no touch area in the scanning direction, i.e. none of its light paths is blocked; 2. there is a touch area in the scanning direction, but it does not overlap the candidate judgment area.
In the embodiment of the present invention, the virtual optical path boundary is located at the outermost side of the candidate decision region, and intersects with the contour line of the candidate decision region in the first scanning direction.
The virtual optical path boundary is not an actual optical path, but a virtual boundary provided.
In the embodiment of the invention, there may be one first scanning direction or a plurality of first scanning directions.
S14, if the optical path in the first scanning direction does not pass through the virtual optical path boundary, the candidate determination region is determined as the true touch point.
If no light path in the first scanning direction passes between the virtual light path boundaries, then the light paths in all directions through the candidate judgment area are blocked, and thus the candidate judgment area can be determined in S14 to be a real touch point.
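Steps S13-S14 can be sketched as follows (an illustrative sketch, not from the patent text; the candidate judgment area is assumed to be given by its corner points, each light path in the first scanning direction by its offset c in the line equation nx*x + ny*y = c, where (nx, ny) is the direction's normal):

```python
def is_real_touch_point(corners, normal, unblocked_offsets):
    """S13-S14 sketch: the two virtual light path boundaries in a
    first scanning direction are the extreme projections of the
    candidate judgment area's corners onto that direction's normal
    (nx, ny). The area is judged to be a real touch point if no
    unblocked light path offset falls strictly between them."""
    proj = [normal[0] * x + normal[1] * y for x, y in corners]
    lo, hi = min(proj), max(proj)          # the virtual boundaries
    return not any(lo < c < hi for c in unblocked_offsets)

# Unit-square candidate area, horizontal light paths (normal (0, 1)):
corners = [(0, 0), (1, 0), (0, 1), (1, 1)]
print(is_real_touch_point(corners, (0, 1), [1.5, -0.5]))  # -> True
print(is_real_touch_point(corners, (0, 1), [0.5]))        # -> False
```

In the second call an unblocked path at y = 0.5 passes between the boundaries, so the candidate area cannot be accepted outright and would be divided as in S108 of the second embodiment.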
Example two
The embodiment of the invention provides a touch point identification method of an infrared touch screen, which can be applied to touch screen equipment. As shown in fig. 6, the method may include:
s101, in one scanning period, scanning in all n scanning directions is performed to obtain a touch area in each scanning direction.
S102, acquiring an overlapped area of the touch area in any scanning direction and the touch areas in other scanning directions to obtain a candidate judgment area.
For the related description of S101 and S102, reference may be made to the description of S11 and S12, which are not repeated herein.
At least one candidate decision region can be obtained in the embodiment of the invention. For each candidate decision region, S103-S111 described below may be performed to determine whether the candidate decision region is a true touch point. For example, the embodiment of the present invention may sequentially traverse each candidate decision region by using the methods of S103 to S111 to determine whether each candidate decision region is a real touch point.
S103, judging whether the candidate judging area is an overlapping area of the touch areas in the n scanning directions.
If the candidate determination region is an overlapping region of the touch regions in the n scanning directions, performing the following S104; if the candidate determination region is an overlapping region of touch regions in m scanning directions (i.e., less than n scanning directions), the following S105 and S106 are performed.
In the case where the real touch point is sufficiently large, for example larger than two lamp pitches, the candidate judgment area determined at this time is the overlapping area of the touch areas in all n scanning directions, and may be determined to be a real touch point.
When the real touch point is small, for example, the real touch point is smaller than the widths of two adjacent optical paths, the candidate determination region may be an overlapping region of touch regions in less than n scanning directions.
Exemplarily, taking a touch screen device including 4 scanning directions in total as an example, as shown in fig. 2, the candidate determination region is an overlapping region of touch regions in the 4 scanning directions (identified as direction 1 to direction 4 in fig. 2); the candidate determination regions shown in fig. 7 are overlapping regions of the touch regions in 2 of the 4 scanning directions (i.e., direction 1 and direction 3).
And S104, determining the candidate judging area as a real touch point.
And S105, setting a virtual light path boundary which intersects with the contour line of the candidate judging area in the first scanning direction.
For the description of the first scanning direction and the virtual optical path boundary in S105, reference may be made to the description in S13, and details are not described here.
In the embodiment of the invention, there may be one first scanning direction or a plurality of first scanning directions. The following description takes as an example the case in which S105 determines two first scanning directions.
Illustratively, as shown in fig. 7 above, the candidate judgment area is the overlapping area of the touch areas in directions 1 and 3, so directions 2 and 4 can both be deduced to be first scanning directions.
Further, as shown in fig. 8, the virtual light path boundaries intersecting the contour line of the candidate judgment area in direction 2 may be determined as boundary 1 and boundary 2, and those in direction 4 as boundary 3 and boundary 4 (the virtual light path boundaries are indicated by broken lines in fig. 8).
S106, judging whether the optical path in the first scanning direction passes through the virtual optical path boundary.
If no light path in the first scanning direction passes through the virtual light path boundary, the following S107 is executed; if a light path in the first scanning direction passes between the virtual light path boundaries, the following S108 and S109 are executed.
When the real touch point (i.e., the actual touch point) is relatively small, the candidate judgment area may be larger than the real touch point. A light path in the first scanning direction may then pass through the candidate judgment area without passing through the real touch point, so S106 is needed in the embodiment of the present invention to determine whether this is the case.
For the description of the first scanning direction and the virtual optical path boundary in the above S106, reference may be specifically made to the related description in the above S13, and details are not described here again.
Illustratively, as shown in fig. 8, it may be determined whether an optical path in the direction 2 passes between the boundary 1 and the boundary 2, and whether an optical path in the direction 4 passes between the boundary 3 and the boundary 4.
And S107, determining the candidate judging area as a real touch point.
Since no light path in the first scanning direction passes through the virtual light path boundary, it is indicated that light paths in all scanning directions in the candidate determination region are blocked, so that the candidate determination region can be determined as a real touch point.
And S108, dividing the candidate judgment area by adopting the optical path to obtain at least two candidate judgment sub-areas.
If a light path in the first scanning direction passes through the virtual light path boundary, then not all light paths through the candidate judgment area are blocked: some light paths in that scanning direction pass through the area unblocked. In this case the candidate judgment area may be larger than the real touch point, or may contain a plurality of real touch points, so it needs to be divided in S108 to further identify the real touch points.
Illustratively, referring to fig. 8, as shown in fig. 9, if an optical path a in the direction 2 passes between the boundary 1 and the boundary 2, the candidate determination region is divided by the optical path a to obtain 2 candidate determination sub-regions (the 2 candidate determination sub-regions are respectively denoted as B and C).
Further exemplarily, referring to fig. 8, if the optical path a in the direction 2 passes between the boundary 1 and the boundary 2 and the optical path b in the direction 4 passes between the boundary 3 and the boundary 4, as shown in fig. 10, the candidate determination region is divided by the optical path a and the optical path b to obtain 4 candidate determination sub-regions (the 4 candidate determination sub-regions are respectively represented as D, E, F and G).
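The division in S108 can be sketched as below. For illustration only, the candidate determination region is assumed to be an axis-aligned rectangle, direction-2 light paths are assumed to be vertical lines, and direction-4 light paths horizontal lines; the real geometry depends on the screen's scanning angles.

```python
def split_region(rect, vertical_cuts=(), horizontal_cuts=()):
    """Split an axis-aligned candidate region (x0, y0, x1, y1) by the
    light paths that pass through it. Assumes (for illustration) that
    direction-2 paths are vertical lines x=c and direction-4 paths
    horizontal lines y=c; cuts outside the region are ignored."""
    x0, y0, x1, y1 = rect
    xs = sorted({x0, x1, *[c for c in vertical_cuts if x0 < c < x1]})
    ys = sorted({y0, y1, *[c for c in horizontal_cuts if y0 < c < y1]})
    return [(xa, ya, xb, yb)
            for xa, xb in zip(xs, xs[1:])
            for ya, yb in zip(ys, ys[1:])]

# One passing path a (direction 2) and one path b (direction 4) -> 4 sub-regions, as in fig. 10
subs = split_region((0, 0, 10, 10), vertical_cuts=[4], horizontal_cuts=[6])
print(len(subs))  # 4
```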
For each candidate decision sub-region, the following S109-S111 are performed to determine whether the candidate decision sub-region is a true touch point or a ghost point.
S109, judging whether the candidate judging sub-region comprises an intersection point formed by intersecting the blocked light paths.
If the candidate judgment sub-area comprises at least one intersection point formed by intersecting the blocked light paths, executing the following S110; if the candidate determination sub-region does not include the intersection point formed by intersecting the blocked optical paths, the following S111 is performed.
And S110, determining the candidate judgment sub-area as a real touch point.
And S111, determining the candidate judgment sub-region as a ghost point.
In conjunction with fig. 10, as shown in fig. 11, the intersection formed by the blocked light paths X and Y lies within candidate determination sub-region F, while D, E and G contain no intersection of blocked light paths; therefore candidate determination sub-region F can be determined to be a real touch point, and candidate determination sub-regions D, E and G are ghost points.
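S109-S111 can be sketched as a containment test: a sub-region that contains at least one intersection point of two blocked light paths is kept as a real touch point, any other is a ghost point. Sub-regions are again modeled as axis-aligned rectangles for illustration; the coordinates below are made up to mirror the D/E/F/G example.

```python
def classify_subregions(subregions, blocked_intersections):
    """S109-S111: label each candidate determination sub-region 'real'
    if it contains at least one intersection of two blocked light
    paths, else 'ghost'. Rectangles (x0, y0, x1, y1) for illustration."""
    def contains(rect, pt):
        x0, y0, x1, y1 = rect
        return x0 <= pt[0] <= x1 and y0 <= pt[1] <= y1
    return ['real' if any(contains(r, p) for p in blocked_intersections)
            else 'ghost' for r in subregions]

# Hypothetical D, E, F, G; only F contains the intersection of blocked paths X and Y
regions = [(0, 0, 4, 6), (4, 0, 10, 6), (4, 6, 10, 10), (0, 6, 4, 10)]
print(classify_subregions(regions, [(7, 8)]))  # ['ghost', 'ghost', 'real', 'ghost']
```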
Alternatively, S109 may be replaced with S109a described below.
S109a, judging whether the candidate judging sub-region comprises at least two blocked light paths in the scanning direction.
If the candidate determination sub-region includes at least two blocked light paths in the scanning direction, performing the above S110; if the candidate determination sub-region does not include the blocked optical paths in at least two scanning directions, the above S111 is performed.
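The alternative criterion S109a does not need explicit intersection points: it only asks whether blocked light paths from at least two distinct scanning directions cross the sub-region. A minimal sketch, with each blocked path encoded as a hypothetical `(direction_id, path_index)` pair:

```python
def is_real_by_directions(blocked_paths_in_subregion):
    """S109a variant: a sub-region crossed by blocked light paths from
    at least two distinct scanning directions is taken as a real touch
    point; otherwise it is a ghost point."""
    directions = {d for d, _ in blocked_paths_in_subregion}
    return len(directions) >= 2

print(is_real_by_directions([(2, 14), (4, 7)]))   # True  -> real touch point (S110)
print(is_real_by_directions([(2, 14), (2, 15)]))  # False -> ghost point (S111)
```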
According to the method for identifying a touch point of an infrared touch screen provided by the embodiment of the present invention, when the candidate determination region is an overlapping region of the touch regions in m scanning directions, a virtual light path boundary intersecting the contour line of the candidate determination region in the first scanning direction is determined; when a light path in the first scanning direction passes through the virtual light path boundary, the candidate determination region is divided by that light path to obtain at least two candidate determination sub-regions, and it is then determined whether each candidate determination sub-region is a real touch point.
When not all light paths in the scanning directions within the candidate determination region are blocked, but some light paths pass through the candidate determination region, the area of the candidate determination region may be larger than that of the real touch point, or multiple real touch points may exist within the candidate determination region. By dividing the candidate determination region into at least two candidate determination sub-regions and then determining whether each sub-region is a real touch point, this scheme identifies real touch points more accurately and thus improves the identification precision of touch points.
In addition, when the candidate determination region is an overlapping region of the touch regions in m scanning directions, a virtual light path boundary intersecting the contour line of the candidate determination region in the first scanning direction is determined, and whether any light path in the first scanning direction passes through the virtual light path boundary is judged to decide whether the candidate determination region is a real touch point. Thus, when the candidate determination region is not an overlapping region of the touch regions in all scanning directions, several real touch points close to each other will not be identified as a single touch point, and the area of an identified touch point will not be far larger than that of the real touch point. The identification accuracy of touch points is thereby improved.
Optionally, after the real touch point is determined, the coordinates of the real touch point may be output directly, or trajectory tracking and smoothing may be performed first; that is, the real touch point is matched with the real touch point determined in the previous scanning cycle, the touch trajectory is determined, and the coordinates of the real touch point are then output.
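The trajectory-tracking step can be sketched as a greedy nearest-neighbor association between cycles. This is only one plausible realization under stated assumptions; the patent does not specify the matching rule, and `max_dist` is a hypothetical per-cycle movement limit.

```python
from math import hypot

def match_to_previous(points, prev_points, max_dist=30.0):
    """Greedily associate this cycle's touch points with the previous
    cycle's by nearest neighbor; a point with no previous point within
    max_dist is treated as a new touch-down (matched to None)."""
    unmatched_prev = list(enumerate(prev_points))
    pairs = []
    for i, p in enumerate(points):
        if not unmatched_prev:
            pairs.append((i, None))  # new touch-down
            continue
        j, q = min(unmatched_prev,
                   key=lambda jq: hypot(p[0] - jq[1][0], p[1] - jq[1][1]))
        if hypot(p[0] - q[0], p[1] - q[1]) <= max_dist:
            pairs.append((i, j))
            unmatched_prev.remove((j, q))
        else:
            pairs.append((i, None))
    return pairs

# (12, 11) continues the trajectory of (10, 10); (80, 80) is a new touch
print(match_to_previous([(12, 11), (80, 80)], [(10, 10)]))  # [(0, 0), (1, None)]
```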
The real touch point determined according to the embodiment of the present invention is a convex polygon. For example, when the coordinates of the real touch point are output, the coordinates of the center of gravity of the real touch point (i.e., the convex polygon) may be output, and/or the coordinates of each vertex of the real touch point (i.e., the convex polygon) may be output.
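The center of gravity of the convex polygon can be computed with the standard area-weighted (shoelace-based) centroid formula; a minimal sketch:

```python
def convex_polygon_centroid(vertices):
    """Area-weighted centroid (center of gravity) of a simple polygon
    given as counter-clockwise (x, y) vertices, via the shoelace formula:
    Cx = (1/6A) * sum((x_i + x_{i+1}) * cross_i), cross_i = x_i*y_{i+1} - x_{i+1}*y_i."""
    a = cx = cy = 0.0
    n = len(vertices)
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return cx / (6 * a), cy / (6 * a)

# A 4x2 rectangular touch point -> center of gravity at its middle
print(convex_polygon_centroid([(0, 0), (4, 0), (4, 2), (0, 2)]))  # (2.0, 1.0)
```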
As shown in fig. 12, an embodiment of the present invention provides an infrared touch screen touch point identification apparatus, including:
an obtaining module 21, configured to perform scanning in all n scanning directions in a scanning cycle to obtain a touch area in each scanning direction, and obtain, for a touch area in any one scanning direction, an overlapping area between the touch area and touch areas in other scanning directions to obtain a candidate determination area; the touch area in each scanning direction is an area formed by each single shielded light path or at least two continuous shielded light paths; wherein n is an integer greater than 2;
an identifying module 22, configured to set a virtual optical path boundary intersecting with a contour line of the candidate determination region in the first scanning direction if the candidate determination region is an overlapping region of the touch regions in the m scanning directions; if no light path in the first scanning direction passes through the virtual light path boundary, determining the candidate judging area as a real touch point; the first scanning direction is a scanning direction in which no touch region overlaps the candidate determination region, and m is an integer less than n and greater than or equal to 2.
Optionally, the identifying module 22 is further configured to determine that the candidate determination region is a real touch point if the candidate determination region is an overlapping region of touch regions in the n scanning directions.
Optionally, the identifying module 22 is further configured to, if an optical path in the first scanning direction passes through the virtual optical path boundary, divide the candidate determination region by using the optical path to obtain at least two candidate determination sub-regions;
for each candidate decision sub-region:
and if the candidate judging sub-area comprises at least one intersection point formed by intersecting the blocked light paths, the candidate judging sub-area is a real touch point.
Optionally, the identifying module 22 is further configured to, if an optical path in the first scanning direction passes through the virtual optical path boundary, divide the candidate determination region by using the optical path to obtain at least two candidate determination sub-regions;
for each candidate decision sub-region:
and if the candidate judgment sub-area comprises at least two blocked light paths in the scanning direction, the candidate judgment sub-area is a real touch point.
Optionally, the identifying module 22 is further configured to determine that the candidate determination sub-region is a ghost point if it does not include an intersection point formed by intersecting blocked light paths.
Optionally, the identifying module 22 is further configured to determine that the candidate determination sub-region is a ghost point if it does not include blocked light paths in at least two scanning directions.
As shown in fig. 13, an embodiment of the present invention provides an infrared touch screen touch point identification apparatus, including: a processor 31 and a memory 32;
the memory is used for storing computer execution instructions, and when the touch screen infrared touch screen touch point identification device runs, the processor executes the computer execution instructions stored in the memory, so that the infrared touch screen touch point identification device executes the infrared touch screen touch point identification method shown in the embodiment of the invention.
Illustratively, as shown in fig. 13, the processor 31 and the memory 32 may be connected by a bus 33.
The bus 33 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus 33 may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 13, but this is not intended to represent only one bus or type of bus.
Embodiments of the present invention provide a touch screen device, which may include an infrared touch screen touch point recognition apparatus as shown in fig. 12 or 13.
The embodiment of the invention provides a computer storage medium, which comprises computer instructions, and when the computer instructions are run on a computer, the computer is enabled to execute the method for identifying the touch point of the infrared touch screen.
The embodiment of the invention provides a computer program product, which comprises a computer instruction, and when the computer instruction runs on a computer, the computer is enabled to execute the method for identifying the touch point of the infrared touch screen.
The technical solutions provided in the embodiments of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be implemented by software programs, hardware, firmware, or any combination thereof. When implemented using a software program, the computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, magnetic tape), an optical medium (e.g., Digital Video Disc (DVD)), or a semiconductor medium (e.g., Solid State Drive (SSD)), among others.
Through the above description of the embodiments, it will be clear to those skilled in the art that, for convenience and brevity of description, only the division of the above functional modules is illustrated. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. For the specific working processes of the system, apparatus, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (13)

1. A method for identifying touch points of an infrared touch screen is characterized by comprising the following steps:
in a scanning period, scanning in all n scanning directions is performed to obtain a touch area in each scanning direction, wherein the touch area in each scanning direction is an area formed by a single shielded light path or at least two continuous shielded light paths in each scanning direction; wherein n is an integer greater than 2;
acquiring an overlapped area of the touch area in any scanning direction and the touch areas in other scanning directions to obtain a candidate judgment area;
if the candidate judgment area is an overlapped area of the touch areas in the m scanning directions, setting a virtual light path boundary which is intersected with the contour line of the candidate judgment area in the first scanning direction; the first scanning direction is a scanning direction that does not overlap with the candidate determination region, and m is an integer smaller than n and greater than or equal to 2;
and if no light path in the first scanning direction passes through the virtual light path boundary, determining the candidate judging area as a real touch point.
2. The method of claim 1, further comprising:
and if the candidate judging area is the overlapped area of the touch areas in the n scanning directions, determining that the candidate judging area is a real touch point.
3. The method of claim 1, further comprising:
if the optical path in the first scanning direction passes through the virtual optical path boundary, dividing the candidate judging area by adopting the optical path to obtain at least two candidate judging sub-areas;
for each of the candidate decision sub-regions:
and if the candidate judgment sub-area comprises at least one intersection point formed by intersecting the blocked light paths, the candidate judgment sub-area is a real touch point.
4. The method of claim 1, further comprising:
if the optical path in the first scanning direction passes through the virtual optical path boundary, dividing the candidate judging area by adopting the optical path to obtain at least two candidate judging sub-areas;
for each of the candidate decision sub-regions:
and if the candidate judgment sub-area comprises at least two blocked light paths in the scanning direction, the candidate judgment sub-area is a real touch point.
5. The method of claim 3,
and if the candidate judging subarea does not comprise an intersection point formed by intersecting the blocked light paths, the candidate judging subarea is a ghost point.
6. The method of claim 4,
and if the candidate judging subarea does not comprise at least two blocked light paths in the scanning direction, the candidate judging subarea is a ghost point.
7. An infrared touch screen touch point recognition device, comprising:
the acquisition module is used for executing scanning in all n scanning directions in a scanning period to obtain a touch area in each scanning direction, and acquiring an overlapped area of the touch area and touch areas in other scanning directions for the touch area in any scanning direction to obtain a candidate judgment area; the touch area in each scanning direction is an area formed by a single shielded light path or at least two continuous shielded light paths in each scanning direction; wherein n is an integer greater than 2;
the identification module is used for setting a virtual light path boundary which is intersected with the contour line of the candidate judgment area in a first scanning direction if the candidate judgment area is an overlapped area of the touch areas in the m scanning directions; if no light path in the first scanning direction passes through the virtual light path boundary, determining the candidate judging area as a real touch point; the first scanning direction is a scanning direction that does not overlap with the candidate determination region, and m is an integer smaller than n and greater than or equal to 2.
8. The apparatus of claim 7, wherein the identifying module is further configured to determine the candidate decision region as a true touch point if the candidate decision region is an overlap region of touch regions in n scan directions.
9. The apparatus according to claim 7, wherein the identifying module is further configured to, if an optical path in a first scanning direction passes through the virtual optical path boundary, divide the candidate decision area by using the optical path to obtain at least two candidate decision sub-areas;
for each of the candidate decision sub-regions:
and if the candidate judgment sub-area comprises at least one intersection point formed by intersecting the blocked light paths, the candidate judgment sub-area is a real touch point.
10. The apparatus according to claim 7, wherein the identifying module is further configured to, if an optical path in a first scanning direction passes through the virtual optical path boundary, divide the candidate decision area by using the optical path to obtain at least two candidate decision sub-areas;
for each of the candidate decision sub-regions:
and if the candidate judgment sub-area comprises at least two blocked light paths in the scanning direction, the candidate judgment sub-area is a real touch point.
11. The apparatus of claim 9, wherein the identifying module is further configured to determine the sub-region as a ghost point if the sub-region does not include an intersection point formed by intersection of the blocked light paths.
12. The apparatus of claim 10, wherein the identifying module is further configured to determine the sub-region as a ghost point if the candidate sub-region does not include occluded optical paths in at least two scanning directions.
13. A touch screen device comprising the infrared touch screen touch point recognition apparatus according to any one of claims 7 to 12.
CN201710980632.2A 2017-10-19 2017-10-19 Touch point identification method and device of infrared touch screen and touch screen equipment Active CN107728860B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710980632.2A CN107728860B (en) 2017-10-19 2017-10-19 Touch point identification method and device of infrared touch screen and touch screen equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710980632.2A CN107728860B (en) 2017-10-19 2017-10-19 Touch point identification method and device of infrared touch screen and touch screen equipment

Publications (2)

Publication Number Publication Date
CN107728860A CN107728860A (en) 2018-02-23
CN107728860B true CN107728860B (en) 2020-06-23

Family

ID=61212158

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710980632.2A Active CN107728860B (en) 2017-10-19 2017-10-19 Touch point identification method and device of infrared touch screen and touch screen equipment

Country Status (1)

Country Link
CN (1) CN107728860B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109597529A (en) * 2018-11-06 2019-04-09 广州华欣电子科技有限公司 Infrared touch screen positioning method, device, equipment and medium
CN109542276B (en) * 2018-11-22 2022-02-08 青岛海信商用显示股份有限公司 Touch point identification method and device and display equipment
CN110275645B (en) * 2019-06-12 2022-08-05 青岛海信商用显示股份有限公司 Touch point identification method, touch point identification device, storage medium and touch device
CN110502160B (en) * 2019-08-19 2023-03-28 青岛海信商用显示股份有限公司 Touch point classification method and device, touch screen and display
CN113126795B (en) * 2019-12-31 2022-08-19 青岛海信商用显示股份有限公司 Touch identification method of touch display device and related equipment
CN111475056A (en) * 2020-06-01 2020-07-31 青岛海信商用显示股份有限公司 Touch identification method and device
CN114077359A (en) * 2020-08-11 2022-02-22 青岛海信商用显示股份有限公司 Touch recognition method and display device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010015408A1 (en) * 2008-08-07 2010-02-11 Owen Drumm Method and apparatus for detecting a multitouch event in an optical touch-sensitive device
CN103365492A (en) * 2013-08-01 2013-10-23 北京东方中原教育科技有限公司 Multi-point touch identification method for infrared touch screen
CN104571730A (en) * 2015-01-19 2015-04-29 青岛海信电器股份有限公司 Acquiring method of touch point, infrared touch device and display devic
CN104731418A (en) * 2015-03-27 2015-06-24 青岛海信电器股份有限公司 Touch screen positioning method and device and touch screen device
CN105045439A (en) * 2015-08-07 2015-11-11 青岛海信电器股份有限公司 Touch screen multi-point positioning method and apparatus and touchscreen device
CN106293261A (en) * 2015-05-21 2017-01-04 青岛海信电器股份有限公司 A kind of touch screen localization method, device and touch-screen equipment
CN106502475A (en) * 2016-10-26 2017-03-15 青岛海信电器股份有限公司 A kind of infrared touch point identification method, infrared contactor control device and display device

Also Published As

Publication number Publication date
CN107728860A (en) 2018-02-23

Similar Documents

Publication Publication Date Title
CN107728860B (en) Touch point identification method and device of infrared touch screen and touch screen equipment
CN110489015B (en) Touch point determining method and device, touch screen and display
CN207780668U (en) Display device and electronic equipment
US20200410240A1 (en) Identifying cable ends using augmented reality
CN110262685B (en) Touch detection method, touch detection device, storage medium and touch screen
CN106557209B (en) Processing method, device and the terminal device of infrared touch panel touching signals
CN104615310A (en) Method and device for determining candidate touch points
CN110458920A (en) A kind of handwriting erasing method and device
CN109886127A (en) Fingerprint identification method and terminal device
CN109656393A (en) Refer to tracking, device, equipment and the computer readable storage medium of contact more
CN104035702A (en) Method for preventing intelligent terminal operation error and intelligent terminal
CN110502160B (en) Touch point classification method and device, touch screen and display
CN108228065A (en) Detect the method, apparatus and electronic equipment of UI control informations
CN111611767A (en) Verification method and device
CN108845924A (en) Control response region display control method, electronic device and storage medium
CN113934312B (en) Touch object identification method based on infrared touch screen and terminal equipment
CN113126795B (en) Touch identification method of touch display device and related equipment
CN111460072B (en) Lane line detection method, device, equipment and storage medium
CN103809817A (en) Optical touch system and object position judgment method thereof
CN105373262A (en) Method and device for identifying invalid light paths of infrared touch screen
CN109444905B (en) Dynamic object detection method and device based on laser and terminal equipment
CN110333804A (en) The response method of textures, device, terminal device and storage medium in game system
CN109597524B (en) Touch detection method, touch detection device and touch display device
CN106293261B (en) A kind of touch screen localization method, device and touch-screen equipment
CN110275645B (en) Touch point identification method, touch point identification device, storage medium and touch device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 218 Hong Kong Road, Qingdao Economic and Technological Development Zone, Shandong, 266555

Applicant after: Hisense Visual Technology Co., Ltd.

Address before: No. 151 Zhuzhou Road, Laoshan District, Shandong, 266100

Applicant before: QINGDAO HISENSE ELECTRONICS Co.,Ltd.

GR01 Patent grant