CN102023761B - Multi-point detection and calculation method for touch screen


Info

Publication number
CN102023761B
Authority
CN
China
Prior art keywords
pixel
touch point
touch
family
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201010574634XA
Other languages
Chinese (zh)
Other versions
CN102023761A (en)
Inventor
郑金发
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vtron Group Co Ltd
Original Assignee
Vtron Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vtron Technologies Ltd
Priority to CN201010574634XA
Publication of CN102023761A
Application granted
Publication of CN102023761B

Abstract

The invention discloses a multi-point detection and calculation method for a touch screen. The method comprises the following steps: 1) start a camera and capture a touch image; 2) binarize the touch image to obtain the pixel values of the touch image; 3) compare each captured pixel value with a predetermined pixel value, and store the corresponding pixel coordinate of the touch image if the captured pixel value is greater than the predetermined value; and 4) a back-end processing module reads the stored pixel coordinates, classifies each pixel coordinate with a nearest touch-point pixel-family centroid algorithm, recomputes the centroid of each touch-point pixel family, and finally identifies one or more touch points. With this method, two touch points that are close to each other, or even connected at their boundaries, can be identified quickly and accurately with high reliability.

Description

Multi-point detection and calculation method for a touch screen
Technical field
The present invention relates to the field of computer vision, and in particular to a multi-point detection and calculation method for a touch screen based on camera positioning.
Background art
In recent years, large-screen touch technology has developed considerably: infrared LED scanning touch screens, ultrasonic touch screens, and rear-camera positioning touch screens, in which two cameras are mounted at the corners of the touch screen, have all appeared. Among these, the rear-camera positioning touch screen can implement multi-touch conveniently and flexibly and can more easily monitor and track more than two touch points, so it has been widely used in the multi-touch field.
In such rear-camera positioning touch screens, the image captured by the camera is generally binarized: a predetermined value is set, the brightness of each pixel captured by the camera is compared with this predetermined value, and a pixel brighter than the predetermined value is regarded as a valid touch-point pixel. A storage module stores the camera coordinates of the valid touch-point pixels of each captured frame. A back-end processing module classifies the stored valid touch-point pixel coordinates by touch point and calculates the barycentric (centroid) coordinates of each touch point; after the centroid coordinates of the touch points are obtained, a geometric correction yields the logical coordinates of the touch screen.
The valid touch-point pixel coordinates are conventionally classified from the relationship between the abscissa x and the ordinate y of the numerous valid pixels: whether two pixels belong to the same touch point is decided from the distance between successively read valid pixel coordinates. This is a very intuitive way to distinguish different touch points. Because the captured image has been binarized, the valid pixel coordinates belonging to the same touch point are not strictly contiguous; individual valid pixels of the same touch point may lie several pixels apart. Therefore, when touch points are judged intuitively from the distance between pixel coordinates, the decision boundary is usually set to several pixels to improve the accuracy of the judgment: two valid pixels closer than the set number of pixels are considered to belong to the same touch point, otherwise to different touch points. If the two fingers are far apart, this kind of judgment is acceptable; but if the two fingers are close together, or even touching, it produces wrong touch-point coordinates, because once the spacing between the two fingers approaches the set pixel spacing the algorithm treats the two finger touches as a single touch point. This misjudgment produces unreliable coordinates during multi-point operation and causes errors in gesture operations, in drawing lines with closely spaced fingers, and the like.
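The conventional classification described above can be pictured with the following minimal Python sketch; it is an illustration only (the names and the simple previous-pixel comparison are assumptions, not taken from the patent), showing how two fingers whose pixel runs ever come within the threshold of each other end up merged into one touch point.

```python
import math

def classify_by_previous_pixel(pixels, threshold=8.0):
    """Conventional approach: a newly read valid pixel joins the current touch
    point if it lies within `threshold` of the previously read pixel; otherwise
    it starts a new touch point.  Close or touching fingers get merged."""
    groups = []
    prev = None
    for x, y in pixels:
        if prev is not None and math.hypot(x - prev[0], y - prev[1]) < threshold:
            groups[-1].append((x, y))   # treated as the same touch point
        else:
            groups.append([(x, y)])     # treated as a new touch point
        prev = (x, y)
    return groups
```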
Summary of the invention
To overcome the defects of the prior art, the object of the present invention is to provide a multi-point detection and calculation method for a touch screen, proposed in order to improve the accuracy of the multi-touch detection algorithm when the distance between fingers is very small, or even when adjacent fingers touch each other. The method of the present invention effectively improves the detection accuracy of rear-camera multi-touch positioning and improves the usability of a multi-touch screen.
In order to accurately identify a plurality of touch points when they are close together under camera positioning, the present invention adopts the following technical scheme:
A multi-point detection and calculation method for a touch screen, comprising the following steps:
1) start the camera and capture a touch image;
2) binarize the touch image to obtain the pixel values of the touch image;
3) compare each captured pixel value with a predetermined pixel value, and store the corresponding pixel coordinate of the touch image only if the captured value is greater than the predetermined value;
4) a back-end processing module reads the stored pixel coordinates, classifies each pixel coordinate with a nearest touch-point pixel-family centroid algorithm, recomputes the centroid of the touch-point pixel family, and finally identifies one or more touch points.
In step 1), the camera captures one pixel value per pixel clock, in row order.
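Purely as an illustration (the patent does not prescribe an implementation), steps 2) and 3) could be realized as in the following Python sketch, which assumes the captured frame arrives as an 8-bit grayscale NumPy array and uses a hypothetical predetermined value of 128; the names are not from the patent.

```python
import numpy as np

PREDETERMINED_VALUE = 128  # hypothetical binarization threshold

def collect_touch_pixels(frame: np.ndarray) -> list[tuple[int, int]]:
    """Binarize a grayscale frame and return the (x, y) coordinates of pixels
    brighter than the predetermined value, in the same row-by-row,
    left-to-right order in which the camera reads them."""
    ys, xs = np.nonzero(frame > PREDETERMINED_VALUE)  # row-major scan order
    return list(zip(xs.tolist(), ys.tolist()))
```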
The coordinate classification method and the nearest touch-point pixel-family centroid algorithm in step 4) comprise the following steps (a code sketch of the complete computation is given after the centroid formulas below):
S1: initialize all variables of each touch-point pixel family to zero;
S2: read a pixel coordinate;
S3: judge whether the current pixel coordinate is the first coordinate read; if so, go to step S7; otherwise go to step S4;
S4: calculate the distance from the current pixel coordinate to the centroid of each saved touch-point family, and select the minimum distance;
S5: judge whether the minimum distance is less than the predetermined value; if so, go to step S6; otherwise go to step S7;
S6: add the current pixel coordinate to the nearest touch-point family, recompute the centroid of that family, and return to step S2;
S7: create a new touch-point family, add the current pixel coordinate to this new touch-point pixel family, calculate its centroid coordinates, and return to step S2.
In step S1, all variables of each touch-point pixel family are initialized to zero, where the variables include, for each touch-point pixel family, the abscissa sum ΣXn = 0 and the ordinate sum ΣYn = 0; the number of pixels contained in the touch-point pixel family ΣCn = 0; and the centroid of the current pixel family A'n = 0. The subscript n indicates the number of supported touches, and the sequence numbers of the touch-point families are 0, 1, 2, ..., N-1.
In step S5, said predetermined value is set according to the actual application environment and is decided by how many pixels a touch point contains: if a touch point contains more than a certain number of pixels, the predetermined value is set to a value greater than or equal to 8 pixels; if a touch point contains fewer pixels, a value less than or equal to 8 pixels is used. In existing large-screen displays the display pixel size is 1.3 millimeters, and the valid pixels produced by a normal finger touch form an approximately circular spot with a radius of about 5 pixels. The predetermined value can be adjusted according to the size of the finger touch area measured in the specific experimental environment; the pixel radius of a common touch point is not greater than 5 pixels and rarely exceeds 8 pixels.
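As a rough worked example of this sizing (only the 1.3 mm pixel size and the roughly 5-pixel touch radius come from the text; the 13 mm contact diameter is an assumption):

```python
PIXEL_SIZE_MM = 1.3                       # display pixel size quoted above
contact_diameter_mm = 13.0                # assumed diameter of a typical finger contact
touch_radius_px = (contact_diameter_mm / 2) / PIXEL_SIZE_MM   # = 5.0 pixels
PREDETERMINED_DISTANCE_PX = 8             # chosen comfortably above that radius
```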
The above centroid algorithm is as follows:
accumulate the abscissa sum ΣXn = ΣXn + X and the ordinate sum ΣYn = ΣYn + Y;
the number of pixels contained in the touch-point family ΣCn = ΣCn + 1;
centroid coordinates A'xn = ΣXn / ΣCn, A'yn = ΣYn / ΣCn;
where X is the abscissa and Y is the ordinate of the touch-point pixel just read, the subscript n indicates the number of supported touches, and the sequence numbers of the touch-point families are 0, 1, 2, ..., N-1. The method of the invention can quickly and accurately identify two touch points that are close to each other, or even connected at their boundaries, with high reliability.
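The classification of steps S1–S7 together with this centroid update can be written out as the following minimal Python sketch. It is an illustration under assumed details (Euclidean distance, a fixed 8-pixel threshold, hypothetical names), not the patent's own implementation.

```python
import math

PREDETERMINED_DISTANCE = 8.0  # step S5 threshold (assumed Euclidean, in pixels)

class TouchFamily:
    """One touch-point pixel family: coordinate sums, pixel count, centroid."""
    def __init__(self) -> None:
        self.sum_x = 0    # ΣXn
        self.sum_y = 0    # ΣYn
        self.count = 0    # ΣCn
        self.cx = 0.0     # A'xn
        self.cy = 0.0     # A'yn

    def add(self, x: int, y: int) -> None:
        # Centroid update: ΣXn += X, ΣYn += Y, ΣCn += 1, A'n = Σ / ΣCn
        self.sum_x += x
        self.sum_y += y
        self.count += 1
        self.cx = self.sum_x / self.count
        self.cy = self.sum_y / self.count

def classify(pixels: list[tuple[int, int]]) -> list[TouchFamily]:
    """Steps S1-S7: assign each stored pixel coordinate to the nearest
    touch-point family, or open a new family if none is close enough."""
    families: list[TouchFamily] = []              # S1: no families saved yet
    for x, y in pixels:                           # S2: read a pixel coordinate
        if families:                              # S3: not the first coordinate
            nearest = min(families,               # S4: minimum centroid distance
                          key=lambda f: math.hypot(x - f.cx, y - f.cy))
            d_min = math.hypot(x - nearest.cx, y - nearest.cy)
        else:
            nearest, d_min = None, math.inf
        if nearest is not None and d_min < PREDETERMINED_DISTANCE:
            nearest.add(x, y)                     # S6: join the nearest family
        else:
            family = TouchFamily()                # S7: start a new family
            family.add(x, y)
            families.append(family)
    return families                               # one family per touch point
```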
Description of drawings
Fig. 1 is a flow chart of the coordinate classification and centroid calculation of the present invention;
Fig. 2 is a schematic diagram of two touch points, each comprising a plurality of pixels, according to the present invention.
Embodiment
The present invention is described in detail below. It should be noted that the described embodiments are intended to facilitate understanding of the present invention and do not limit it in any way.
The camera photographs the touch-screen surface and, after binarization, the image shown in Fig. 2 is obtained; the back-end processing module reads each pixel coordinate in storage order. As shown in Fig. 2, pixel D is the first coordinate read and E is the second; the back-end processing module reads through one row, then moves down one row in the Y direction and continues reading pixel coordinates from left to right. There is a distance of two pixels between E and F, and D, E, F, G, and H are pixel coordinates in the same row. D, E, and F belong to touch point A, while G and H belong to touch point B.
The coordinate classification and centroid calculation method of the present invention is realized through the following steps:
Step S1: initialize all variables of each touch-point pixel family to zero, where the variables include, for each touch-point pixel family, the abscissa sum ΣXn = 0 and the ordinate sum ΣYn = 0; the number of pixels contained in the touch-point pixel family ΣCn = 0; and the centroid of the current pixel family A'n = 0. The subscript n indicates the number of supported touches, and the sequence numbers of the touch-point families are 0, 1, 2, ..., N-1;
Step S2: read a pixel coordinate;
Step S3: judge whether the current coordinate is the first coordinate read; if so, go to step S7; otherwise go to step S4;
Step S4: calculate the distance from the current coordinate to the centroid of each saved touch-point family, and select the minimum distance;
Step S5: judge whether the minimum distance is less than 8; if so, go to step S6; otherwise go to step S7;
Step S6: add the current coordinate to the nearest touch-point family, recompute the centroid of that family, and return to step S2;
Step S7: create a new touch-point family, add the current coordinate to this touch-point family, calculate its centroid coordinates, and return to step S2.
Using the above steps, the pixel classification process represented by Fig. 2 is described in detail as follows:
Coordinate D:
Coordinate D is the first coordinate read, so step S7 is entered directly: accumulate the abscissa sum ΣX0 = ΣX0 + XD and the ordinate sum ΣY0 = ΣY0 + YD; the touch-point family pixel count ΣC0 = ΣC0 + 1; centroid coordinates A'x0 = ΣX0 / ΣC0, A'y0 = ΣY0 / ΣC0.
Coordinate E:
Coordinate E is not the first coordinate read, so step S4 is entered and the distance to each currently saved touch-point family centroid is computed; only the centroid of family 0, (A'x0, A'y0), has been saved, and the distance to it is 1. Step S5 is entered and the centroid-distance judgment is made: the distance satisfies the condition of being less than 8. Step S6 is entered: accumulate ΣX0 = ΣX0 + XE and ΣY0 = ΣY0 + YE; ΣC0 = ΣC0 + 1; centroid coordinates A'x0 = ΣX0 / ΣC0, A'y0 = ΣY0 / ΣC0.
Coordinate F:
Coordinate F goes through the same process as coordinate E.
Coordinate G:
Coordinate G is not the first coordinate read, so step S4 is entered and the distance to each currently saved touch-point family centroid is computed; only the centroid of family 0, (A'x0, A'y0), has been saved, and the distance to it is greater than 8. Step S5 is entered and the centroid-distance judgment finds the distance greater than 8. Step S7 is entered: a new touch-point family is created, i.e. the touch-point family sequence number is incremented by 1; accumulate ΣX1 = ΣX1 + XG and ΣY1 = ΣY1 + YG; ΣC1 = ΣC1 + 1; centroid coordinates A'x1 = ΣX1 / ΣC1, A'y1 = ΣY1 / ΣC1.
Coordinate H:
Coordinate H is not the first coordinate read, so step S4 is entered and the distances to the currently saved touch-point family centroids are computed; the centroids of families 0 and 1, (A'x0, A'y0) and (A'x1, A'y1), have been saved, and the minimum distance, namely the distance to (A'x1, A'y1), is taken. Step S5 is entered and the centroid-distance judgment is made: the distance satisfies the condition of being less than 8. Step S6 is entered: accumulate ΣX1 = ΣX1 + XH and ΣY1 = ΣY1 + YH; ΣC1 = ΣC1 + 1; centroid coordinates A'x1 = ΣX1 / ΣC1, A'y1 = ΣY1 / ΣC1.
Coordinate C:
Coordinate C is not the first coordinate read, so step S4 is entered and the distances to the currently saved touch-point family centroids are computed; the centroids of families 0 and 1, (A'x0, A'y0) and (A'x1, A'y1), have been saved, and the minimum distance, namely the distance to (A'x1, A'y1), is taken. Step S5 is entered and the centroid-distance judgment is made: the distance satisfies the condition of being less than 8. Step S6 is entered: accumulate ΣX1 = ΣX1 + XC and ΣY1 = ΣY1 + YC; ΣC1 = ΣC1 + 1; centroid coordinates A'x1 = ΣX1 / ΣC1, A'y1 = ΣY1 / ΣC1.
The above process continues until all stored pixel coordinates have been read. After the coordinate classification process is finished, the centroid coordinates of touch point A, A' = (A'x0, A'y0), and the centroid coordinates of touch point B, B' = (A'x1, A'y1), are obtained.
In the above coordinate classification and centroid calculation process, the threshold of 8 for the distance between the current coordinate and a pixel-family centroid can be changed according to the actual application environment. During coordinate classification, the centroid of each pixel family is recomputed as each new coordinate is added.
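For example, feeding the sketch given earlier a hypothetical row-major list of pixel coordinates forming two small clusters more than 8 pixels apart (the numbers are invented for illustration, not taken from Fig. 2) yields two families whose centroids play the roles of touch points A and B:

```python
pixels = [(10, 5), (11, 5), (13, 5), (30, 5), (31, 5),   # one image row
          (10, 6), (12, 6), (30, 6), (31, 6)]            # the next row
families = classify(pixels)
for i, f in enumerate(families):
    print(f"touch point {i}: centroid = ({f.cx:.1f}, {f.cy:.1f}), pixels = {f.count}")
# Prints two families, one centred near x = 11.2 and one near x = 30.5.
```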
The above coordinate classification method and centroid calculation process solve the coordinate classification errors that arise when the boundaries of two touch points are close to, or even connected with, each other. The traditional coordinate classification method cannot correctly separate the two touch points, because it decides whether pixels belong to the same touch-point family from the distance between coordinates. As shown in Fig. 2, touch points A and B each comprise a plurality of pixels and their boundaries are connected. The touch-point pixel coordinates are read from left to right along the X direction; when a row has been read, the Y direction advances by one row and reading continues from left to right. If touch points are classified according to the distance between two successively read coordinates, then at the point C in the middle, where the boundaries of the two touch points are connected, it is clearly impossible to judge correctly from that distance whether the pixel at C belongs to touch point A or to touch point B.
The method of the present invention decides whether the pixel at C belongs to touch point A or touch point B from the distance between the pixel coordinate and the centroids. As shown in Fig. 2, A' is the centroid coordinate of touch point A and B' is the centroid coordinate of touch point B. The distances from the coordinate at C to A' and to B' are computed and the shorter one is found; this shorter distance also satisfies the setting of being less than 8 pixels. The setting of 8 pixels can be adjusted according to the actual application environment and is decided by how many pixels a touch point contains: if a touch point contains many pixels, the value should be set larger; if it contains fewer pixels, a smaller value can be used. Having computed the distances from C to A' and to B', it is easy to determine which touch point the coordinate at C belongs to.

Claims (3)

1. A multi-point detection and calculation method for a touch screen, characterized by comprising the following steps:
1) start the camera and capture a touch image;
2) binarize the touch image to obtain the pixel values of the touch image;
3) compare each captured pixel value with a predetermined pixel value, and store the corresponding pixel coordinate of the touch image only if the captured value is greater than the predetermined value;
4) a back-end processing module reads the stored pixel coordinates, classifies each pixel coordinate with a nearest touch-point pixel-family centroid algorithm, recomputes the centroid of the touch-point pixel family, and finally identifies one or more touch points;
wherein the coordinate classification method and the nearest touch-point pixel-family centroid algorithm in step 4) comprise the following steps:
S1: initialize all variables of each touch-point pixel family to zero;
S2: read a pixel coordinate;
S3: judge whether the current pixel coordinate is the first coordinate read; if so, go to step S7; otherwise go to step S4;
S4: calculate the distance from the current pixel coordinate to the centroid of each saved touch-point family, and select the minimum distance;
S5: judge whether the minimum distance is less than the predetermined value; if so, go to step S6; otherwise go to step S7;
S6: add the current pixel coordinate to the nearest touch-point family, recompute the centroid of that family, and return to step S2;
S7: create a new touch-point family, add the current pixel coordinate to this touch-point pixel family, calculate its centroid coordinates, and return to step S2;
wherein, in step S5, said predetermined value is set according to the actual application environment and is decided by how many pixels a touch point contains: if a touch point contains more than a certain number of pixels, the predetermined value is set to a value greater than or equal to 8 pixels; if a touch point contains fewer pixels, a value less than or equal to 8 pixels is used;
and the centroid algorithm is as follows:
accumulate the abscissa sum ΣXn = ΣXn + X and the ordinate sum ΣYn = ΣYn + Y;
the number of pixels contained in the touch-point family ΣCn = ΣCn + 1;
centroid coordinates A'xn = ΣXn / ΣCn, A'yn = ΣYn / ΣCn;
where X is the abscissa and Y is the ordinate of the touch-point pixel just read, the subscript n indicates the number of supported touches, and the sequence numbers of the touch-point families are 0, 1, 2, ..., N-1.
2. The multi-point detection and calculation method for a touch screen according to claim 1, characterized in that, in step 1), the camera captures one pixel value per pixel clock, in row order.
3. The multi-point detection and calculation method for a touch screen according to claim 1, characterized in that, in step S1, all variables of each touch-point pixel family are initialized to zero, where the variables include, for each touch-point pixel family, the abscissa sum ΣXn = 0 and the ordinate sum ΣYn = 0; the number of pixels contained in the touch-point pixel family ΣCn = 0; and the centroid of the current pixel family A'n = 0. The subscript n indicates the number of supported touches, and the sequence numbers of the touch-point families are 0, 1, 2, ..., N-1.
CN201010574634XA 2010-12-06 2010-12-06 Multi-point detection and calculation method for touch screen Expired - Fee Related CN102023761B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010574634XA CN102023761B (en) 2010-12-06 2010-12-06 Multi-point detection and calculation method for touch screen

Publications (2)

Publication Number Publication Date
CN102023761A CN102023761A (en) 2011-04-20
CN102023761B true CN102023761B (en) 2012-06-06

Family

ID=43865115

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010574634XA Expired - Fee Related CN102023761B (en) 2010-12-06 2010-12-06 Multi-point detection and calculation method for touch screen

Country Status (1)

Country Link
CN (1) CN102023761B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103164059A (en) * 2011-12-12 2013-06-19 联咏科技股份有限公司 Multi-point touch control positioning method
CN103513801B (en) * 2012-06-18 2016-08-10 宸鸿科技(厦门)有限公司 Contactor control device and detection method thereof
CN102880344B (en) * 2012-09-13 2015-10-07 广东威创视讯科技股份有限公司 A kind of method for identifying multiple touch points
CN103853388B (en) * 2012-12-05 2017-01-25 北京汇冠新技术股份有限公司 Method for improving touch precision of infrared touch screen
CN110045868B (en) * 2019-03-25 2022-07-26 深圳市德明利技术股份有限公司 Touch point correction method based on clustering algorithm, touch device and electronic equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1912816A (en) * 2005-08-08 2007-02-14 北京理工大学 Virtus touch screen system based on camera head
CN101882033A (en) * 2010-07-16 2010-11-10 广东威创视讯科技股份有限公司 Method and device for speeding up acquisition of coordinate of touch point

Also Published As

Publication number Publication date
CN102023761A (en) 2011-04-20

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CP03 Change of name, title or address

Address after: No. 233 Kezhu Road, High-tech Industrial Development Zone, Guangzhou, Guangdong Province, 510670

Patentee after: Vtron Group Co., Ltd.

Address before: No. 6 Caipin Road, High-tech Industrial Development Zone, Guangzhou, Guangdong, China, 510663

Patentee before: Guangdong Weichuangshixun Science and Technology Co., Ltd.

CP03 Change of name, title or address
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120606

Termination date: 20191206

CF01 Termination of patent right due to non-payment of annual fee