US20180275830A1 - Object characterisation for touch displays - Google Patents

Object characterisation for touch displays

Info

Publication number
US20180275830A1
US20180275830A1 (Application US 15/925,329)
Authority
US
United States
Prior art keywords
light
touch
touch surface
sensing apparatus
touch sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/925,329
Inventor
Tomas Christiansson
Kristofer Jakobson
Nicklas OHLSSON
Mattias KRUS
Magnus Hollström
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FlatFrog Laboratories AB
Original Assignee
FlatFrog Laboratories AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FlatFrog Laboratories AB filed Critical FlatFrog Laboratories AB
Publication of US20180275830A1
Legal status: Abandoned

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 — Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418 — Control or interface arrangements for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F 3/04186 — Touch location disambiguation
    • G06F 3/042 — Digitisers characterised by opto-electronic transducing means
    • G06F 3/0421 — Digitisers interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F 2203/00 — Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/041 — Indexing scheme relating to G06F 3/041 - G06F 3/045
    • G06F 2203/04104 — Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G06F 2203/04106 — Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection
    • G06F 2203/04109 — FTIR in optical digitiser, i.e. touch detection by frustrating the total internal reflection within an optical waveguide due to changes of optical properties or deformation at the touch location

Definitions

  • the present disclosure relates to techniques for detecting and identifying objects on a touch surface.
  • GUI graphical user interface
  • a fixed GUI may e.g. be in the form of printed matter placed over, under or inside the panel.
  • a dynamic GUI can be provided by a display screen integrated with, or placed underneath, the panel or by an image being projected onto the panel by a projector.
  • a plurality of optical emitters and optical receivers are arranged around the periphery of a touch surface to create a grid of intersecting light paths (otherwise known as detection lines) above the touch surface. Each light path extends between a respective emitter/receiver pair. An object that touches the touch surface will block or attenuate some of the light paths. Based on the identity of the receivers detecting a blocked light path, a processor can determine the location of the intercept between the blocked light paths.
  • a user may place a finger onto the surface of a touch panel to register a touch.
  • a stylus may be used.
  • a stylus is typically a pen shaped object with at least one end configured to be pressed against the surface of the touch panel.
  • An example of a stylus according to the prior art is shown in FIG. 2 .
  • Use of a stylus 60 may provide improved selection accuracy and pointer precision over a simple finger touch. This can be due to the engineered stylus tip 62 providing a smaller and/or more regular contact surface with the touch panel than is possible with a human finger. Also, muscular control of an entire hand in a pen holding position can be more precise than a single finger for the purposes of pointer control due to lifelong training in the use of pens and pencils.
  • PCT/SE2016/051229 describes an optical IR touch sensing apparatus configured to determine a position of a touching object on the touch surface and an attenuation value corresponding to the attenuation of the light resulting from the object touching the touch surface. Using these values, the apparatus can differentiate between different types of objects, including multiple stylus tips, fingers, palms. The differentiation between the object types may be determined by a function that takes into account how the attenuation of a touching object varies across the touch surface, compensating for e.g. light field height, detection line density, detection line angular density etc.
  • For larger objects applied to the touch surface, such as palms and board erasers, it is possible to use an attenuation map of the touch surface to determine an approximate shape of the object. For example, where an optical IR touch sensing apparatus is used, an attenuation map may be generated showing an area on the touch surface where the light is highly attenuated. The shape of an attenuated area may then be used to identify the position and shape of the touching object. In a technique known from the prior art, a rough shape of the large object can be determined by identifying all points with an attenuation above a threshold value. An approximate centroid and orientation of the large object may then be determined using the image moments of the identified points. Such techniques are described in “Image analysis via the general theory of moments” by Michael Reed Teague. Once the centroid and orientation of the large object are determined, the width and height of the board eraser can be found by determining the extent of the identified pixels in the direction of the orientation angle and the normal of the orientation angle.
  • However, for smaller objects, use of the attenuation map to determine object characteristics like size, orientation, and shape becomes very difficult due to the low resolution of the attenuation map.
  • a stylus tip may present only a few pixels of interaction on an attenuation map.
  • a first embodiment provides a touch sensing apparatus, comprising: a touch surface, a plurality of emitters arranged around the periphery of the touch surface to emit beams of light such that one or more objects touching the touch surface cause an attenuation or occlusion of the light; a plurality of light detectors arranged around the periphery of the touch surface to receive light from the plurality of emitters on a plurality of light paths, wherein each light detector is arranged to receive light from more than one emitter; and a processing element configured to: determine, based on output signals of the light detectors, a transmission value for each light path; process the transmission values to determine an object reference point on the touch surface where the light is attenuated or occluded by an object, determine a region around the object reference point, determine a plurality of light paths intersecting the region, determine a statistical measure for each of at least one light path variable of the plurality of light paths intersecting the region, including at least the transmission values of the light paths, and determine one or more characteristics of the object in dependence on the at least one statistical measure.
  • a second embodiment provides a method in a touch sensing apparatus, said touch sensing apparatus comprising: a touch surface, a plurality of emitters arranged around the periphery of the touch surface to emit beams of light such that one or more objects touching the touch surface cause an attenuation or occlusion of the light; and a plurality of light detectors arranged around the periphery of the touch surface to receive light from the plurality of emitters on a plurality of light paths, wherein each light detector is arranged to receive light from more than one emitter; said method comprising: determining, based on output signals of the light detectors, a transmission value for each light path; processing the transmission values to determine an object reference point on the touch surface where the light is attenuated or occluded by an object, determining a region around the object reference point, determining a plurality of light paths intersecting the region, determining a statistical measure of values for each of at least one light path variable of the plurality of light paths intersecting the region, including at least the transmission values of the light paths, and determining one or more characteristics of the object in dependence on the at least one statistical measure.
  • FIG. 1 is a top plan view of an optical touch apparatus.
  • FIG. 2 shows a cross-section of an above-surface-type IR optical touch apparatus according to the prior art.
  • FIG. 3 shows a cross-section of an FTIR-type IR optical touch apparatus according to the prior art.
  • FIG. 4 is a flow chart showing a process for determining characteristics of an interacting object.
  • FIG. 5 shows a top-down view of touch surface with an applied stylus tip and finger.
  • FIGS. 6 a -6 d show a sequence of steps for determining a plurality of detection lines intersecting a region around a touching object.
  • FIG. 7 a shows the set of detection lines intersecting a region around a finger and a subset of detection lines interacting with the finger.
  • FIG. 7 b shows the set of detection lines passing through a region around a stylus tip and a subset of detection lines interacting with the stylus tip.
  • FIG. 8 a shows a frequency distribution of transmission values for detection lines passing through a region around a finger.
  • FIG. 8 b shows a frequency distribution of transmission values for detection lines passing through a region around a stylus tip.
  • FIG. 9 a shows a frequency distribution of transmission values for detection lines passing through a region around a finger including a threshold value.
  • FIG. 9 b shows a frequency distribution of transmission values for detection lines passing through a region around a stylus tip including a threshold value.
  • the present disclosure relates to optical touch panels and the use of techniques for providing touch sensitivity to a display apparatus. Throughout the description the same reference numerals are used to identify corresponding elements.
  • a “touch object” or “touching object” is a physical object that touches, or is brought in sufficient proximity to, a touch surface so as to be detected by one or more sensors in the touch system.
  • the physical object may be animate or inanimate.
  • An “interaction” occurs when the touch object affects a parameter measured by the sensor.
  • a “touch” denotes a point of interaction as seen in the interaction pattern.
  • a “light field” is the light flowing between an emitter and a corresponding detector. Although an emitter may generate a large amount of light in many directions, only the light measured by a detector from an emitter defines the light field for the emitter and detector.
  • FIG. 1 is a top plan view of an optical touch apparatus which may correspond to the IR optical touch apparatus of FIG. 2 .
  • Emitters 30 a are distributed around the periphery of touch surface 20 , to project light across the touch surface 20 of touch panel 10 .
  • Detectors 30 b are distributed around the periphery of touch surface 20 , to receive part of the propagating light. The light from each of emitters 30 a will thereby propagate to a number of different detectors 30 b on a plurality of light paths 50 .
  • Light paths 50 may conceptually be represented as “detection lines” that extend across the touch surface 20 to the periphery of touch surface 20 between pairs of emitters 30 a and detectors 30 b , as shown in FIG. 1 .
  • the detection lines 50 correspond to a projection of the light paths 50 onto the touch surface 20 .
  • the emitters 30 a and detectors 30 b collectively define a grid of detection lines 50 (“detection grid”) on the touch surface 20 , as seen in the top plan view of FIG. 1 .
  • the spacing of intersections in the detection grid defines the spatial resolution of the touch-sensitive apparatus 100 , i.e. the smallest object that can be detected on the touch surface 20 .
  • the width of the detection line is a function of the width of the emitters and corresponding detectors.
  • a wide detector detecting light from a wide emitter provides a wide detection line with a broader surface coverage, minimising the space in between detection lines which provide no touch coverage.
  • a disadvantage of wide detection lines may be the reduced touch precision, worse point separation, and lower signal to noise ratio.
  • the light paths are a set of virtual light paths converted from the actual light paths via an interpolation step.
  • an interpolation step is described in PCT publication WO2011139213.
  • the virtual light paths may be configured so as to match the requirements of certain CT algorithms, viz. algorithms that are designed for processing efficient and/or memory efficient and/or precise tomographic reconstruction of an interaction field.
  • any characteristics of the object are determined from a statistical measure of the virtual light paths intersecting the region.
  • the emitters 30 a may be any type of device capable of emitting radiation in a desired wavelength range, for example a diode laser, a VCSEL (vertical-cavity surface-emitting laser), an LED (light-emitting diode), an incandescent lamp, a halogen lamp, etc.
  • the emitters 30 a may also be formed by the end of an optical fibre.
  • the emitters 30 a may generate light in any wavelength range. The following examples presume that the light is generated in the infrared (IR), i.e. at wavelengths above about 750 nm.
  • the detectors 30 b may be any device capable of converting light (in the same wavelength range) into an electrical signal, such as a photo-detector, a CCD device, a CMOS device, etc.
  • the detectors 30 b collectively provide an output signal, which is received and sampled by a signal processor 140 .
  • the output signal contains a number of sub-signals, also denoted “transmission values”, each representing the energy of light received by one of light detectors 30 b from one of light emitters 30 a .
  • the signal processor 140 may need to process the output signal for separation of the individual transmission values.
  • the transmission values represent the received energy, intensity or power of light received by the detectors 30 b on the individual detection lines 50 . Whenever an object touches a detection line 50 , the received energy on this detection line is decreased or “attenuated”. Where an object blocks the entire width of the detection line of an above-surface system, the detection line will be fully attenuated or occluded.
  • FIG. 2 shows a cross-section of an IR optical touch apparatus according to the prior art.
  • object 60 having tip 62 will attenuate light propagating along at least one light path 50 .
  • object 60 may even fully occlude the light on at least one light path 50 .
  • the touch apparatus is arranged according to FIG. 2 .
  • a light emitted by emitters 30 a is transmitted through transmissive panel 10 in a manner that does not cause the light to TIR within transmissive panel 10 . Instead, the light exits transmissive panel 10 through touch surface 20 and is reflected by reflector surface 80 of edge reflector 70 to travel along a path 50 in a plane parallel with touch surface 20 . The light will then continue until deflected by reflector surface 80 of the edge reflector 70 at an opposing or adjacent edge of the transmissive panel 10 , wherein the light will be deflected back down through transmissive panel 10 and onto detectors 30 b .
  • An object 60 (optionally having object tip 62 ) touching surface 20 will occlude light paths 50 that intersect with the location of the object on the surface resulting in an attenuated light signal received at detector 30 b.
  • the top edge of reflector surface 80 is 2 mm above touch surface 20 .
  • a 2 mm deep field is advantageous for this embodiment as it minimizes the distance that the object needs to travel into the light field to reach the touch surface and to maximally attenuate the light. The smaller the distance, the shorter time between the object entering the light field and contacting the surface. This is particularly advantageous for differentiating between large objects entering the light field slowly and small objects entering the light field quickly.
  • a large object entering the light field will initially cause a similar attenuation as a smaller object fully extended into the light field.
  • the transmitted light illuminates a touch surface 20 from within the panel 10 .
  • the panel 10 is made of solid material in one or more layers and may have any shape.
  • the panel 10 defines an internal radiation propagation channel, in which light propagates by internal reflections.
  • the propagation channel is defined between the boundary surfaces of the panel 10 , where the top surface allows the propagating light to interact with touching objects 60 and thereby defines the touch surface 20 .
  • This is achieved by injecting the light into the panel 10 such that the light is reflected by total internal reflection (TIR) in the touch surface 20 as it propagates through the panel 10 .
  • TIR total internal reflection
  • the light may be reflected by TIR in the bottom surface or against a reflective coating thereon.
  • an object 60 may be brought in contact with the touch surface 20 to interact with the propagating light at the point of touch.
  • part of the light may be scattered by the object 60
  • part of the light may be absorbed by the object 60
  • part of the light may continue to propagate in its original direction across the panel 10 .
  • the touching object 60 causes a local frustration of the total internal reflection, which leads to a decrease in the energy (or, equivalently, power or intensity) of the transmitted light.
  • the signal processor 140 may be configured to process the transmission values so as to determine a property of the touching objects, such as a position (e.g. in an x,y coordinate system), a shape, or an area. This determination may involve a straightforward triangulation based on the attenuated detection lines, e.g. as disclosed in U.S. Pat. No. 7,432,893 and WO2010/015408, or a more advanced processing to recreate a distribution of attenuation values (for simplicity, referred to as an “attenuation pattern”) across the touch surface 20 , where each attenuation value represents a local degree of light attenuation.
  • the attenuation pattern may be further processed by the signal processor 140 or by a separate device (not shown) for determination of a position, shape or area of touching objects.
  • the attenuation pattern may be generated e.g. by any available algorithm for image reconstruction based on transmission values, including tomographic reconstruction methods such as Filtered Back Projection, FFT-based algorithms, ART (Algebraic Reconstruction Technique), SART (Simultaneous Algebraic Reconstruction Technique), etc.
  • the attenuation pattern may be generated by adapting one or more basis functions and/or by statistical methods such as Bayesian inversion.
  • the term ‘signal processor’ is used throughout to describe one or more processing components for performing the various stages of processing required between receiving the signal from the detectors through to outputting a determination of touch including touch co-ordinates, touch properties, etc.
  • although the processing stages of the present disclosure may be carried out on a single processing unit (with a corresponding memory unit), the disclosure is also intended to cover multiple processing units and even remotely located processing units.
  • the signal processor 140 can include one or more hardware processors 130 and a memory 120 .
  • the hardware processors can include, for example, one or more computer processing units.
  • the hardware processor can also include microcontrollers and/or application specific circuitry such as ASICs and FPGAs.
  • the flowcharts and functions discussed herein can be implemented as programming instructions stored, for example, in the memory 120 or a memory of the one or more hardware processors.
  • the programming instructions can be implemented in machine code, C, C++, JAVA, or any other suitable programming languages.
  • the signal processor 140 can execute the programming instructions and accordingly execute the flowcharts and functions discussed herein.
  • FIG. 4 shows a flow diagram according to an embodiment.
  • in step 410 of FIG. 4 , the signal processor 140 receives and samples output signals from detectors 30 b.
  • the output signals are processed for determination of the transmission values (or ‘transmission signals’).
  • the transmission values represent the received energy, intensity or power of light received by the detectors 30 b on the individual detection lines 50 .
  • the signal processor 140 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface.
  • the signal processor 140 is configured to process the transmission values to generate a two-dimensional attenuation map of the attenuation field across the touch surface, i.e. a spatial distribution of attenuation values, in which each touching object typically appears as a region of changed attenuation. From the attenuation map, two-dimensional touch data may be extracted and one or more touch locations may be identified.
  • the transmission values may be processed according to a tomographic reconstruction algorithm to generate the two-dimensional attenuation map of the attenuation field.
  • the signal processor 140 may be configured to generate an attenuation map for the entire touch surface. In an alternative embodiment, the signal processor 140 may be configured to generate an attenuation map for a sub-section of the touch surface, the sub-section being selected according to one or more criteria determined during processing of the transmission values.
  • the signal processor 140 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface by determining intersections between attenuated or occluded detection lines, i.e. by triangulation. In yet another embodiment, the signal processor 140 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface using non-linear touch detection techniques such as those described in US patent application publication 20150130769 or 20150138105.
  • the signal processor 140 is configured to determine an object reference point 250 for each touching object 210 , 220 .
  • finger 210 and stylus 220 are applied to touch surface 20 .
  • Object reference point 250 ′ is determined for finger 210 .
  • object reference point 250 ′′ is determined for stylus 220 .
  • an image moment is applied to the attenuation map, or to a sub-region of the attenuation map, to determine a centroid of a detected touching object, for use as the object reference point.
  • raw image moments $M_{ij}$ are calculated by: $M_{ij} = \sum_{x}\sum_{y} x^{i} y^{j} I(x,y)$, where $I(x,y)$ is the attenuation value at position $(x,y)$ in the attenuation map.
  • the centroid of the image moment may be calculated as: $\{\bar{x}, \bar{y}\} = \{M_{10}/M_{00},\ M_{01}/M_{00}\}$.
  • the object reference point 250 is then set to the co-ordinates $(\bar{x}, \bar{y})$ of the centroid of the image moment.
  • signal processor 140 is configured to determine an object reference point 250 within the interaction area of the touching object by determining a local maximum (i.e. the point of highest attenuation) in the area of the attenuation map covered by the object.
  • the identified maxima may be further processed for determination of a touch shape and a center position, e.g. by fitting a two-dimensional second-order polynomial or a Gaussian bell shape to the attenuation values, or by finding the ellipse of inertia of the attenuation values.
  • Step 440 results in a collection of peak data, which may include values of position, attenuation, size, and shape for each detected peak.
  • the attenuation value may be calculated from a maximum attenuation value or a weighted sum of attenuation values within the peak shape.
  • signal processor 140 is configured to determine an object reference point 250 within the interaction area of a large touching object by selecting a point at random within the boundary of the touching object.
  • the object reference point is set to the intersection point or average of intersection points, including a weighted average determined in dependence on the attenuation of the detection lines used for computing the intersection points.
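As one possible reading of this intersection-based reference point, the following is a minimal Python sketch; the pairwise-intersection scheme and the attenuation weighting are illustrative assumptions, not the patent's prescribed method.

```python
import numpy as np
from itertools import combinations

def line_intersection(p1, u1, p2, u2):
    """Intersection of two 2-D lines given as point + unit direction,
    or None if the lines are (nearly) parallel."""
    cross = u1[0] * u2[1] - u1[1] * u2[0]
    if abs(cross) < 1e-9:
        return None
    d = p2 - p1
    t1 = (d[0] * u2[1] - d[1] * u2[0]) / cross
    return p1 + t1 * u1

def weighted_reference_point(lines):
    """lines: list of (point, unit_direction, attenuation) for the
    attenuated detection lines. Returns the attenuation-weighted
    average of all pairwise intersection points."""
    pts, wts = [], []
    for (p1, u1, a1), (p2, u2, a2) in combinations(lines, 2):
        x = line_intersection(p1, u1, p2, u2)
        if x is not None:
            pts.append(x)
            wts.append(a1 * a2)   # weight by both lines' attenuation
    if not pts:
        return None
    return np.average(np.array(pts), axis=0, weights=np.array(wts))
```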
  • a region 200 is determined around object 210 , 220 .
  • the region corresponds to an area of the touch surface at the point of and surrounding an object interacting with the touch surface.
  • region 200 may be a circular area, centred on object reference point 250 and having radius R.
  • Radius R may be a predetermined length.
  • radius R may be dynamically determined in dependence on properties of the touching object, including the contact area of the touching object, or a pressure exerted by the touching object on the touch surface.
  • alternative region shapes are envisaged, e.g. a rectangular region defined by a width and height and with object reference point 250 at its centre.
  • an ellipse may be used, defined by a width and height and with object reference point 250 at its centre.
  • a set of detection lines intersecting region 200 is determined.
  • region 200 is a circular area, centred on object reference point 250 and having radius R, the set of detection lines intersecting region 200 is determined to be the set of detection lines passing within distance R of the object reference point 250 .
  • an embodiment of step 460 is now described. This embodiment is recognised as one of numerous possible solutions for determining the detection lines intersecting region 200 .
  • each detection line is analysed in a counterclockwise direction.
  • the detection line from the first emitter e 0 on the bottom side of the touch surface and the first detector d 0 on the right side is the first detection line to be analysed.
  • the touch system shown in FIG. 6 a has emitters only along the left and bottom edges and detectors along the right and top edges.
  • the present concepts may be applied to touch systems having a variety of emitter and detector geometries including interleaved emitter and detector arrangements.
  • the detector counter is then incremented in counterclockwise direction (i.e. d i+1 ) and the detection line between emitter e 0 and the incremented detector d i+1 is analysed.
  • This loop continues and the detection lines from the emitter are therefore analysed in a counterclockwise pattern until a detection line is identified that passes sufficiently close to the object reference point 250 , i.e. distance 255 is within the specified radius R. In FIG. 6 a , this is the detection line 170 .
  • Measuring distance 255 is preferably achieved using the dot product: for a detection line through emitter position $\mathbf{e}$ with unit direction $\hat{\mathbf{u}}$ towards the detector, the distance from object reference point $\mathbf{p}$ is $d = \lVert (\mathbf{p} - \mathbf{e}) - ((\mathbf{p} - \mathbf{e}) \cdot \hat{\mathbf{u}})\,\hat{\mathbf{u}} \rVert$, i.e. the norm of the component of $\mathbf{p} - \mathbf{e}$ perpendicular to the line.
  • other search sequences are envisaged, including a binary search or a root-finding algorithm such as the secant method or Newton's method.
  • FIG. 6 a also shows detection line angle φ, the use for which is described below.
  • where region 200 is non-circular, other techniques for determining intersection of the region by the detection line may be used.
  • the loop then continues and the detection lines from the emitter continue to be analysed in a counterclockwise pattern, identifying all of the detection lines passing within distance R of the object reference point 250 until a detection line is identified that does not pass within distance R of the object reference point 250 .
  • All of the detection lines D 0 are defined as the set of detection lines from emitter e 0 intersecting region 200 . Of this set, the most clockwise detection line is d cw,0 and the most counterclockwise detection line is d ccw,0 .
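For reference, a brute-force version of the intersection test that the counterclockwise search above optimises might look as follows; names are illustrative, each detection line is treated as an infinite line, and the perpendicular-distance test matches the dot-product expression given earlier.

```python
import numpy as np

def lines_intersecting_region(emitters, detectors, ref_point, radius):
    """Return (emitter_idx, detector_idx) for every detection line
    passing within `radius` of the object reference point."""
    hits = []
    p = np.asarray(ref_point, dtype=float)
    for i, e in enumerate(emitters):
        e = np.asarray(e, dtype=float)
        for j, d in enumerate(detectors):
            u = np.asarray(d, dtype=float) - e
            u /= np.linalg.norm(u)              # unit direction e -> d
            v = p - e
            perp = v - (v @ u) * u              # component normal to the line
            if np.linalg.norm(perp) <= radius:  # distance 255 within R
                hits.append((i, j))
    return hits
```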
  • the transmission values and reference values are determined.
  • the reference values are an estimated background transmission value for the detection line without any touching objects present.
  • reference values can be a transmission value of the detection line recorded at a previous time, e.g. within 500 ms.
  • reference values can also be an average of transmission values over a period of time, e.g. within the last 500 ms. Such averaging techniques are described in U.S. Pat. No. 9,377,884.
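A per-line reference maintained as an exponential moving average might be sketched as follows; the update rule, the alpha value, and the freeze-while-touched behaviour are assumptions made for illustration (U.S. Pat. No. 9,377,884 describes the actual averaging techniques).

```python
def update_reference(reference, transmission, alpha=0.1, touched=False):
    """Exponentially-weighted background estimate for one detection line.
    `alpha` controls the 'forget' rate; the reference is frozen while a
    touch is present so the touch does not bleed into the background."""
    if touched:
        return reference
    return (1.0 - alpha) * reference + alpha * transmission
```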
  • the next step is to move on to the next emitter to determine detection lines from the next emitter that intersect region 200 .
  • the first detection line to be analysed may be [e j+1 , d cw,j ] and then continued in a counterclockwise direction. This allows a significant reduction in the number of computations required to determine the set of object boundary lines.
  • alternatively, the next detection line to be analysed may be determined using a binary search or a root-finding algorithm, as illustrated in FIGS. 6 a -6 d .
  • the loop then continues and the detection lines from the next emitter continue to be analysed in a counterclockwise pattern, identifying all of the detection lines passing within distance R of the object reference point 250 until a detection line is identified that does not pass within distance R of the object reference point 250 .
  • All of the detection lines D 1 are defined as the set of detection lines from emitter e 1 intersecting region 200 . Of this set, the most clockwise detection line is d cw,1 and the most counterclockwise detection line is d ccw,1 .
  • the signal processor 140 is configured to determine characteristics of the touching object in dependence on the set of detection lines intersecting region 200 around the touching object.
  • FIG. 7 a shows a set of detection lines that intersect region 200 ′ surrounding finger 210 .
  • Non-interacting detection lines 240 intersect region 200 ′ but do not interact with finger 210 in a significant way.
  • Interacting detection lines 230 intersect region 200 ′ and are also attenuated or occluded by finger 210 .
  • FIG. 7 b shows a similar arrangement to FIG. 7 a but where the interacting object is a stylus tip instead of a finger.
  • a set of detection lines are shown that intersect region 200 ′′.
  • Non-interacting detection lines 240 intersect region 200 ″ but do not interact with stylus tip 220 in a significant way.
  • Interacting detection lines 230 intersect region 200 ″ and are also attenuated or occluded by stylus tip 220 .
  • although FIGS. 7 a and 7 b show the detection lines as thin lines, an actual embodiment may have much wider detection lines in the plane of the touch surface.
  • a detection line may be determined to be an interacting detection line if it is attenuated by the object by between 30% and 100%.
  • alternatively, a detection line may be determined to be an interacting detection line if it is attenuated by the object by between 0.5% and 10%.
  • the ‘interaction’ threshold should be higher than the expected noise on the respective detection line.
  • an object type may be determined in dependence on at least one statistical measure of variables of the detection lines.
  • FIG. 8 a shows values of interacting detection lines 230 for finger 210 .
  • FIG. 8 b shows values of interacting detection lines 230 for stylus tip 220 .
  • Four separate stylus touch events are shown. From FIGS. 8 a and 8 b , it is clear that fingers have a negative skew while styli tend to have a more positive skew (skew as defined per its normal meaning in statistics).
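A sketch of skew-based discrimination consistent with FIGS. 8 a and 8 b follows; the zero cutoff is an illustrative assumption rather than a value taken from the patent.

```python
import numpy as np
from scipy.stats import skew

def classify_by_skew(attenuations):
    """attenuations: attenuation values of the interacting detection
    lines for one touch. Fingers show negative skew, styli positive."""
    s = skew(np.asarray(attenuations, dtype=float))
    return "stylus" if s > 0 else "finger"
```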
  • the attenuation is computed using the current transmission signals and an exponential forget average of 3-4 previous transmission signals.
  • changes in attenuation are on a relatively short time scale, i.e. during the touch down event.
  • FIGS. 9 a and 9 b show histograms of attenuation values for the interacting detection lines with corresponding first threshold levels.
  • FIG. 9 a shows values of interacting detection lines 230 for finger 210 .
  • FIG. 9 b shows values of interacting detection lines 230 for stylus tip 220 .
  • the first threshold shown in FIGS. 9 a and 9 b is computed as: $\text{first threshold} = \text{factor} \times \dfrac{\sum_{n} \text{frequency}(n)\,\text{attenuation}(n)}{\sum_{n} \text{frequency}(n)}$, with $\text{factor} = 2$.
  • the threshold factor may be adjusted in dependence on temporal information of the interactions of the touch system. In one embodiment, where a plurality of styli have been recently identified in an area, the threshold for detecting styli in that area may be reduced to make stylus classification more likely. Where a plurality of fingers have been recently identified in an area, the factor may be increased for determinations made in that area to make finger classification more likely. The factor may also be adjusted to ensure better performance when several proximal touches are detected, due to some detection lines passing more than one object.
  • a first threshold is used to find a ratio of detection lines above and below the first threshold. This ratio is small for fingers and higher for pens.
  • a second threshold is located between the finger ratio of approximately 0.02 and the pen ratio of approximately 0.08, e.g. at 0.05.
  • the second threshold is then used to determine the object type, i.e. an object having a set of detection lines with a ratio above the second threshold may be determined to be a stylus (or a finger if the ratio is below the second threshold).
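Combining the two thresholds, the classification might be sketched as follows, taking factor = 2 and a second threshold of 0.05 from the text; treating the ratio as above/below (rather than above/total) is an assumption.

```python
import numpy as np

def classify_by_threshold(attenuations, factor=2.0, second_threshold=0.05):
    a = np.asarray(attenuations, dtype=float)
    # First threshold: factor * sum(frequency(n) * attenuation(n)) / sum(frequency(n)),
    # which over the raw samples reduces to factor times the mean attenuation.
    first_threshold = factor * a.mean()
    above = int(np.count_nonzero(a > first_threshold))
    below = max(int(a.size) - above, 1)
    ratio = above / below        # small for fingers, higher for pens
    return "stylus" if ratio > second_threshold else "finger"
```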
  • in an alternative embodiment, the first threshold is computed in dependence on an attenuation peak from an attenuation map, e.g. the first threshold is set to a value corresponding to the peak value multiplied by a typical finger size.
  • the ratio of attenuated detection lines (whose attenuation is above a threshold) compared to the number of detection lines passing through the radii may be used to determine an object type.
  • a finger can be expected to affect almost all of the detection lines (most fingers are larger than 10 mm in diameter).
  • a stylus tip with 2-4 mm contact diameter will only affect around 10-70% of the detection lines depending on the width of the detection line. Consequently, in an embodiment, the object type may be determined to be a finger where the ratio of the number of affected detections vs total intersecting detections exceeds 0.7.
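The same idea expressed as a ratio against all intersecting lines, with the 0.7 cutoff taken from the text (a sketch; names are illustrative):

```python
def classify_by_affected_ratio(n_affected, n_intersecting, cutoff=0.7):
    """Fingers (typically >10 mm) affect almost every detection line
    through the region; a 2-4 mm stylus tip only ~10-70% of them."""
    ratio = n_affected / max(n_intersecting, 1)
    return "finger" if ratio > cutoff else "stylus"
```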
  • the statistical measure may comprise the symmetry, skewness, kurtosis, mode, support, head, tail, mean, median, variance or standard deviation of a variable of the set of intersecting detection lines.
  • characteristics of the object may be determined in dependence on a plurality of the statistical measures.
  • an object type and an orientation of the object are determined in dependence on the statistical measure of at least the angle of the light path in the plane of the touch surface (shown as φ in FIG. 6 a ) and the transmission value of the light path.
  • At least one statistical measure is a multivariate statistical measure of values for a plurality of light path variables of the set of intersecting light paths.
  • a combination of the median and the skewness of the attenuation values may be used to determine object type.
  • variance and median values may be used to determine object type.
  • an orientation of the object is determined in dependence on the statistical measure of the angle of the light path in the plane of the touch surface and the transmission value of the light path.
  • a true centre point of a touch object (as opposed to object reference point 250 ) can now be found as the solution to an over-determined set of linear equations, solved using the normal equations.
  • for each interacting detection line, a normal vector $\hat{\mathbf{n}}_k$ (having unit length) is determined, as well as a position $\mathbf{p}_k$ on the respective detection line (which can be the geometrical position of either emitter or detector or some other point). Each line then contributes one equation $\hat{\mathbf{n}}_k \cdot \mathbf{c} = \hat{\mathbf{n}}_k \cdot \mathbf{p}_k$ for the centre point $\mathbf{c}$; stacking these equations as $A\mathbf{c} = \mathbf{b}$, the least-squares solution follows from the normal equations $A^{\top}A\,\mathbf{c} = A^{\top}\mathbf{b}$.
  • This technique also allows a centre position to be determined for regular shapes, oblongs, etc.
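A sketch of the least-squares solve, assuming each interacting detection line k supplies a unit normal and a point on the line (the reconstruction of the stacked system is as given above):

```python
import numpy as np

def true_centre(normals, points):
    """Least-squares centre c from the stacked equations
    n_k . c = n_k . p_k, one row per interacting detection line."""
    A = np.asarray(normals, dtype=float)                 # K x 2 unit normals
    b = np.einsum("ki,ki->k", A, np.asarray(points, dtype=float))
    # Equivalent to solving the normal equations (A^T A) c = A^T b.
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    return c
```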
  • Geometric characteristics of the object may also be determined in dependence on the one or more statistical measures, including length, width, radii, orientation in the plane of the touch surface, and shape.
  • all determined detection lines for all emitters are analysed to determine their angle φ (phi), defined as the angle between the normal to the detection line and the touch surface x-axis 400 , and the shortest distance from the true centre point to the detection line. Given all detection lines passing through the region, a minimum average (over a small φ-region) of attenuation × (shortest distance from detection line to true centre point) provides the orientation of an elongated object.
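One possible reading of this φ-binned minimisation, as a sketch; the bin count and the use of unweighted bin means are assumptions.

```python
import numpy as np

def elongated_orientation(phis, attenuations, distances, n_bins=36):
    """phis: angle of each detection line's normal to the x-axis;
    distances: shortest distance from each line to the true centre.
    The phi-bin minimising mean(attenuation * distance) indicates the
    orientation of an elongated object: lines along the long axis stay
    close to the centre while being strongly attenuated."""
    phis = np.mod(np.asarray(phis, dtype=float), np.pi)
    score = np.asarray(attenuations, dtype=float) * np.asarray(distances, dtype=float)
    bins = np.linspace(0.0, np.pi, n_bins + 1)
    idx = np.clip(np.digitize(phis, bins) - 1, 0, n_bins - 1)
    means = np.full(n_bins, np.inf)
    for k in range(n_bins):
        sel = idx == k
        if sel.any():
            means[k] = score[sel].mean()
    best = int(np.argmin(means))
    return 0.5 * (bins[best] + bins[best + 1])   # centre of the best bin
```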
  • a boundary line may be determined as the detection line with the largest magnitude distance from centre point 250 where the attenuation is above a threshold.
  • the characteristics of the selected boundary line will provide useful information about the characteristics of object 210 , 220 .
  • the length (i.e. the major axis) of the object may be determined in dependence on a vector defining the shortest distance from the boundary line to the true centre point.
  • the magnitude of the vector may be assumed to be half of the length. Therefore, the length of the object may be determined to be twice the magnitude of the vector.
  • the angle of the vector also defines the orientation angle of the rectangular object.
  • i.e. the angle φ of the vector defines the wide axis of the object. Consequently, the angle of the narrow axis of the rectangle may be defined as φ + 90°.
  • the width of the object may then be determined to be twice the magnitude of the vector of the boundary line located at φ + 90°.
  • the φ and length values for the object are determined using an average of a plurality of the highest values.
  • in another embodiment, a touch system includes: a touch surface, a display, a touch sensor configured to detect one or more objects touching the touch surface and generate a touch signal, and a processing element configured to: determine a position of the one or more objects in dependence on the touch signal, determine whether an object is an eraser in dependence on the touch signal, and output a user interface to the display, wherein the user interface is configured to display one or more interaction objects, wherein the user interface is controlled via the one or more objects on the touch surface, and wherein an erase function may only be applied to the user interface by means of an object determined to be an eraser.
  • the eraser may have a rectangular surface for application to the touch surface allowing the touch system to easily identify the shape of the eraser, either according to the above techniques or techniques otherwise known to the skilled man.
  • where a teacher and children are interacting with a digital whiteboard and erasing objects on the digital whiteboard is only permitted by means of the physical eraser, it is surprisingly difficult for a child to accidentally or deliberately simulate the shape of a rectangular eraser on the touch surface using their fingers and hands. Therefore, it is advantageously possible to prevent a child from erasing objects (e.g. ink, text, or geometric shapes) on the digital whiteboard without using the eraser object, i.e. without the teacher's authorization.
  • the user interface may be a canvas or whiteboard application.
  • the one or more interaction objects may comprise ink, text, or geometric shapes.
  • the one or more interaction objects may be added to the user interface by means of a non-eraser object type applied to the touch surface.
  • the erase function may remove interaction objects from the user interface at a position on the user interface corresponding to the position of the eraser on the touch surface.
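The gating of the erase function might be sketched as follows; all types and method names (ObjectType, canvas.erase_at, canvas.draw_at) are illustrative assumptions, not an API defined by the patent.

```python
from enum import Enum, auto

class ObjectType(Enum):
    FINGER = auto()
    STYLUS = auto()
    ERASER = auto()

def handle_touch(canvas, position, object_type):
    """Only an object classified as an eraser may erase; any other
    object type adds ink at the touch position."""
    if object_type is ObjectType.ERASER:
        canvas.erase_at(position)   # remove interaction objects here
    else:
        canvas.draw_at(position)
```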

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An optical IR touch sensing apparatus configured to determine, based on output signals of light detectors, a light energy value for each light path across a touch surface, and generate a transmission value for each light path based on the light energy value. A processor is then configured to process the transmission values to determine an object reference point on the touch surface, a region around the object reference point, and a set of light paths intersecting the region. By performing statistical analysis of the set of light paths, characteristics of the object may be determined.

Description

    INCORPORATION BY REFERENCE TO ANY PRIORITY APPLICATIONS
  • Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57.
  • BACKGROUND OF THE INVENTION
  • Field of the Invention
  • The present disclosure relates to techniques for detecting and identifying objects on a touch surface.
  • Description of the Related Art
  • To an increasing extent, touch-sensitive panels are being used for providing input data to computers, electronic measurement and test equipment, gaming devices, etc. The panel may be provided with a graphical user interface (GUI) for a user to interact with using e.g. a pointer, stylus or one or more fingers. The GUI may be fixed or dynamic. A fixed GUI may e.g. be in the form of printed matter placed over, under or inside the panel. A dynamic GUI can be provided by a display screen integrated with, or placed underneath, the panel or by an image being projected onto the panel by a projector.
  • There are numerous known techniques for providing touch sensitivity to the panel, e.g. by using cameras to capture light scattered off the point(s) of touch on the panel, by using cameras to directly observe the objects interacting with the panel, by incorporating resistive wire grids, capacitive sensors, strain gauges, etc. into the panel.
  • In one category of touch-sensitive panels known as ‘above surface optical touch systems’ and known from e.g. U.S. Pat. No. 4,459,476, a plurality of optical emitters and optical receivers are arranged around the periphery of a touch surface to create a grid of intersecting light paths (otherwise known as detection lines) above the touch surface. Each light path extends between a respective emitter/receiver pair. An object that touches the touch surface will block or attenuate some of the light paths. Based on the identity of the receivers detecting a blocked light path, a processor can determine the location of the intercept between the blocked light paths.
  • For most touch systems, a user may place a finger onto the surface of a touch panel to register a touch. Alternatively, a stylus may be used. A stylus is typically a pen shaped object with at least one end configured to be pressed against the surface of the touch panel. An example of a stylus according to the prior art is shown in FIG. 2. Use of a stylus 60 may provide improved selection accuracy and pointer precision over a simple finger touch. This can be due to the engineered stylus tip 62 providing a smaller and/or more regular contact surface with the touch panel than is possible with a human finger. Also, muscular control of an entire hand in a pen holding position can be more precise than a single finger for the purposes of pointer control due to lifelong training in the use of pens and pencils.
  • PCT/SE2016/051229 describes an optical IR touch sensing apparatus configured to determine a position of a touching object on the touch surface and an attenuation value corresponding to the attenuation of the light resulting from the object touching the touch surface. Using these values, the apparatus can differentiate between different types of objects, including multiple stylus tips, fingers, palms. The differentiation between the object types may be determined by a function that takes into account how the attenuation of a touching object varies across the touch surface, compensating for e.g. light field height, detection line density, detection line angular density etc.
  • For larger objects applied to the touch surface, such as palms and board erasers, it is possible to use an attenuation map of the touch surface to determine an approximate shape of the object. For example, where an optical IR touch sensing apparatus is used, an attenuation map may be generated showing an area on the touch surface where the light is highly attenuated. The shape of an attenuated area may then be used to identify the position and shape of the touching object. In a technique known from the prior art, a rough shape of the large object can be determined by identifying all points with an attenuation above a threshold value. An approximate centroid and orientation of the large object may then be determined using the image moments of the identified points. Such techniques are described in “Image analysis via the general theory of moments” by Michael Reed Teague. Once the centroid and orientation of the large object are determined, the width and height of the board eraser can be found by determining the extent of the identified pixels in the direction of the orientation angle and the normal of the orientation angle.
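As an illustration of the moment-based estimate described above, here is a minimal sketch in Python, assuming the attenuation map is a 2-D NumPy array; the function name and the use of attenuation values as pixel weights are illustrative choices, not the patent's prescribed implementation.

```python
import numpy as np

def centroid_and_orientation(attn_map, threshold):
    """Estimate centroid and orientation of a large touch object
    from an attenuation map, via raw and central image moments."""
    # Keep only points attenuated above the threshold (rough object shape).
    ys, xs = np.nonzero(attn_map > threshold)
    w = attn_map[ys, xs]               # attenuation used as pixel weight

    m00 = w.sum()
    if m00 == 0:
        return None
    # Raw moments M10, M01 give the centroid.
    cx = (w * xs).sum() / m00
    cy = (w * ys).sum() / m00
    # Central second moments give the orientation of the major axis.
    mu20 = (w * (xs - cx) ** 2).sum() / m00
    mu02 = (w * (ys - cy) ** 2).sum() / m00
    mu11 = (w * (xs - cx) * (ys - cy)).sum() / m00
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return (cx, cy), theta
```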
  • However, for smaller objects, use of an attenuation map to determine object characteristics like size, orientation, and shape becomes very difficult due to the low resolution of the attenuation map. In particular, a stylus tip may present only a few pixels of interaction on an attenuation map.
  • Therefore, what is needed is a method of determining object characteristics that overcomes the above limitations.
  • SUMMARY OF THE INVENTION
  • It is an objective of the disclosure to at least partly overcome one or more of the above-identified limitations of the prior art.
  • One or more of these objectives, as well as further objectives that may appear from the description below, are at least partly achieved by means of a method for data processing, a computer readable medium, devices for data processing, and a touch-sensing apparatus according to the independent claims, embodiments thereof being defined by the dependent claims.
  • A first embodiment provides a touch sensing apparatus, comprising: a touch surface, a plurality of emitters arranged around the periphery of the touch surface to emit beams of light such that one or more objects touching the touch surface cause an attenuation or occlusion of the light; a plurality of light detectors arranged around the periphery of the touch surface to receive light from the plurality of emitters on a plurality of light paths, wherein each light detector is arranged to receive light from more than one emitter; and a processing element configured to: determine, based on output signals of the light detectors, a transmission value for each light path; process the transmission values to determine an object reference point on the touch surface where the light is attenuated or occluded by an object, determine a region around the object reference point, determine a plurality of light paths intersecting the region, determine a statistical measure for each of at least one light path variable of the plurality of light paths intersecting the region, including at least the transmission values of the light paths, and determine one or more characteristics of the object in dependence on the at least one statistical measure.
  • A second embodiment provides a method in a touch sensing apparatus, said touch sensing apparatus comprising: a touch surface, a plurality of emitters arranged around the periphery of the touch surface to emit beams of light such that one or more objects touching the touch surface cause an attenuation or occlusion of the light; and a plurality of light detectors arranged around the periphery of the touch surface to receive light from the plurality of emitters on a plurality of light paths, wherein each light detector is arranged to receive light from more than one emitter; said method comprising: determining, based on output signals of the light detectors, a transmission value for each light path; processing the transmission values to determine an object reference point on the touch surface where the light is attenuated or occluded by an object, determining a region around the object reference point, determining a plurality of light paths intersecting the region, determining a statistical measure of values for each of at least one light path variable of the plurality of light paths intersecting the region, including at least the transmission values of the light paths, and determining one or more characteristics of the object in dependence on the at least one statistical measure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments of the invention will now be described in more detail with reference to the accompanying schematic drawings.
  • FIG. 1 is a top plan view of an optical touch apparatus.
  • FIG. 2 shows a cross-section of an above-surface-type IR optical touch apparatus according to the prior art.
  • FIG. 3 shows a cross-section of an FTIR-type IR optical touch apparatus according to the prior art.
  • FIG. 4 is a flow chart showing a process for determining characteristics of an interacting object.
  • FIG. 5 shows a top-down view of touch surface with an applied stylus tip and finger.
  • FIGS. 6a-6d show a sequence of steps for determining a plurality of detection lines intersecting a region around a touching object.
  • FIG. 7a shows the set of detection lines intersecting a region around a finger and a subset of detection lines interacting with the finger.
  • FIG. 7b shows the set of detection lines passing through a region around a stylus tip and a subset of detection lines interacting with the stylus tip.
  • FIG. 8a shows a frequency distribution of transmission values for detection lines passing through a region around a finger.
  • FIG. 8b shows a frequency distribution of transmission values for detection lines passing through a region around a stylus tip.
  • FIG. 9a shows a frequency distribution of transmission values for detection lines passing through a region around a finger including a threshold value.
  • FIG. 9b shows a frequency distribution of transmission values for detection lines passing through a region around a stylus tip including a threshold value.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • The present disclosure relates to optical touch panels and the use of techniques for providing touch sensitivity to a display apparatus. Throughout the description the same reference numerals are used to identify corresponding elements.
  • In addition to having its ordinary meaning, the following terms can also mean:
  • A “touch object” or “touching object” is a physical object that touches, or is brought in sufficient proximity to, a touch surface so as to be detected by one or more sensors in the touch system. The physical object may be animate or inanimate.
  • An “interaction” occurs when the touch object affects a parameter measured by the sensor.
  • A “touch” denotes a point of interaction as seen in the interaction pattern.
  • A “light field” is the light flowing between an emitter and a corresponding detector. Although an emitter may generate a large amount of light in many directions, only the light measured by a detector from an emitter defines the light field for the emitter and detector.
  • FIG. 1 is a top plan view of an optical touch apparatus which may correspond to the IR optical touch apparatus of FIG. 2. Emitters 30 a are distributed around the periphery of touch surface 20, to project light across the touch surface 20 of touch panel 10. Detectors 30 b are distributed around the periphery of touch surface 20, to receive part of the propagating light. The light from each of emitters 30 a will thereby propagate to a number of different detectors 30 b on a plurality of light paths 50.
  • Light paths 50 may conceptually be represented as “detection lines” that extend across the touch surface 20 to the periphery of touch surface 20 between pairs of emitters 30 a and detectors 30 b, as shown in FIG. 1. Thus, the detection lines 50 correspond to a projection of the light paths 50 onto the touch surface 20. Thereby, the emitters 30 a and detectors 30 b collectively define a grid of detection lines 50 (“detection grid”) on the touch surface 20, as seen in the top plan view of FIG. 1. The spacing of intersections in the detection grid defines the spatial resolution of the touch-sensitive apparatus 100, i.e. the smallest object that can be detected on the touch surface 20. The width of the detection line is a function of the width of the emitters and corresponding detectors. A wide detector detecting light from a wide emitter provides a wide detection line with a broader surface coverage, minimising the space in between detection lines which provide no touch coverage. A disadvantage of wide detection lines may be the reduced touch precision, worse point separation, and lower signal to noise ratio.
  • In one embodiment, the light paths are a set of virtual light paths converted from the actual light paths via an interpolation step. Such an interpolation step is described in PCT publication WO2011139213. The virtual light paths may be configured so as to match the requirements of certain CT algorithms, viz. algorithms that are designed for processing-efficient and/or memory-efficient and/or precise tomographic reconstruction of an interaction field. In this embodiment, any characteristics of the object are determined from a statistical measure of the virtual light paths intersecting the region.
  • As used herein, the emitters 30 a may be any type of device capable of emitting radiation in a desired wavelength range, for example a diode laser, a VCSEL (vertical-cavity surface-emitting laser), an LED (light-emitting diode), an incandescent lamp, a halogen lamp, etc. The emitters 30 a may also be formed by the end of an optical fibre. The emitters 30 a may generate light in any wavelength range. The following examples presume that the light is generated in the infrared (IR), i.e. at wavelengths above about 750 nm. Analogously, the detectors 30 b may be any device capable of converting light (in the same wavelength range) into an electrical signal, such as a photo-detector, a CCD device, a CMOS device, etc.
  • The detectors 30 b collectively provide an output signal, which is received and sampled by a signal processor 140. The output signal contains a number of sub-signals, also denoted “transmission values”, each representing the energy of light received by one of light detectors 30 b from one of light emitters 30 a. Depending on implementation, the signal processor 140 may need to process the output signal for separation of the individual transmission values. The transmission values represent the received energy, intensity or power of light received by the detectors 30 b on the individual detection lines 50. Whenever an object touches a detection line 50, the received energy on this detection line is decreased or “attenuated”. Where an object blocks the entire width of the detection line of an above-surface system, the detection line will be fully attenuated or occluded.
  • FIG. 2 shows a cross-section of an IR optical touch apparatus according to the prior art. In the example apparatus shown in FIG. 2, object 60 having tip 62 will attenuate light propagating along at least one light path 50. In the example shown in FIG. 2, object 60 may even fully occlude the light on at least one light path 50.
  • In one embodiment, the touch apparatus is arranged according to FIG. 2. Light emitted by emitters 30 a is transmitted through transmissive panel 10 in a manner that does not cause the light to undergo total internal reflection (TIR) within transmissive panel 10. Instead, the light exits transmissive panel 10 through touch surface 20 and is reflected by reflector surface 80 of edge reflector 70 to travel along a path 50 in a plane parallel with touch surface 20. The light will then continue until deflected by reflector surface 80 of the edge reflector 70 at an opposing or adjacent edge of the transmissive panel 10, wherein the light will be deflected back down through transmissive panel 10 and onto detectors 30 b. An object 60 (optionally having object tip 62) touching surface 20 will occlude light paths 50 that intersect with the location of the object on the surface, resulting in an attenuated light signal received at detector 30 b.
  • In one embodiment, the top edge of reflector surface 80 is 2 mm above touch surface 20. This results in a light field 90 which is 2 mm deep. A 2 mm deep field is advantageous for this embodiment as it minimizes the distance that the object needs to travel into the light field to reach the touch surface and to maximally attenuate the light. The smaller the distance, the shorter the time between the object entering the light field and contacting the surface. This is particularly advantageous for differentiating between large objects entering the light field slowly and small objects entering the light field quickly. A large object entering the light field will initially cause a similar attenuation as a smaller object fully extended into the light field. The shorter the distance the objects have to travel, the fewer frames are required before a representative attenuation signal for each object can be observed. This effect is particularly apparent when the light field is between 0.5 mm and 2 mm deep.
  • In an alternative embodiment shown in FIG. 3, the transmitted light illuminates a touch surface 20 from within the panel 10. The panel 10 is made of solid material in one or more layers and may have any shape. The panel 10 defines an internal radiation propagation channel, in which light propagates by internal reflections. The propagation channel is defined between the boundary surfaces of the panel 10, where the top surface allows the propagating light to interact with touching objects 60 and thereby defines the touch surface 20. This is achieved by injecting the light into the panel 10 such that the light is reflected by total internal reflection (TIR) in the touch surface 20 as it propagates through the panel 10. The light may be reflected by TIR in the bottom surface or against a reflective coating thereon. In this embodiment, an object 60 may be brought in contact with the touch surface 20 to interact with the propagating light at the point of touch. In this interaction, part of the light may be scattered by the object 60, part of the light may be absorbed by the object 60, and part of the light may continue to propagate in its original direction across the panel 10. Thus, the touching object 60 causes a local frustration of the total internal reflection, which leads to a decrease in the energy (or, equivalently, power or intensity) of the transmitted light.
  • The signal processor 140 may be configured to process the transmission values so as to determine a property of the touching objects, such as a position (e.g. in a x,y coordinate system), a shape, or an area. This determination may involve a straight-forward triangulation based on the attenuated detection lines, e.g. as disclosed in U.S. Pat. No. 7,432,893 and WO2010/015408, or a more advanced processing to recreate a distribution of attenuation values (for simplicity, referred to as an “attenuation pattern”) across the touch surface 20, where each attenuation value represents a local degree of light attenuation. The attenuation pattern may be further processed by the signal processor 140 or by a separate device (not shown) for determination of a position, shape or area of touching objects. The attenuation pattern may be generated e.g. by any available algorithm for image reconstruction based on transmission values, including tomographic reconstruction methods such as Filtered Back Projection, FFT-based algorithms, ART (Algebraic Reconstruction Technique), SART (Simultaneous Algebraic Reconstruction Technique), etc. Alternatively, the attenuation pattern may be generated by adapting one or more basis functions and/or by statistical methods such as Bayesian inversion. Examples of such reconstruction functions designed for use in touch determination are found in WO2009/077962, WO2011/049511, WO2011/139213, WO2012/050510, and WO2013/062471, all of which are incorporated herein by reference.
  • For the purposes of brevity, the term ‘signal processor’ is used throughout to describe one or more processing components for performing the various stages of processing required between receiving the signal from the detectors through to outputting a determination of touch including touch co-ordinates, touch properties, etc. Although the processing stages of the present disclosure may be carried out on a single processing unit (with a corresponding memory unit), the disclosure is also intended to cover multiple processing units and even remotely located processing units. In an embodiment, the signal processor 140 can include one or more hardware processors 130 and a memory 120. The hardware processors can include, for example, one or more computer processing units. The hardware processor can also include microcontrollers and/or application specific circuitry such as ASICs and FPGAs. The flowcharts and functions discussed herein can be implemented as programming instructions stored, for example, in the memory 120 or a memory of the one or more hardware processors. The programming instructions can be implemented in machine code, C, C++, JAVA, or any other suitable programming languages. The one or more hardware processors 130 can execute the programming instructions and accordingly execute the flowcharts and functions discussed herein.
  • FIG. 4 shows a flow diagram according to an embodiment.
  • In step 410 of FIG. 4, the signal processor 140 receives and samples output signals from detectors 30 b.
  • In step 420, the output signals are processed for determination of the transmission values (or ‘transmission signals’). As described above, the transmission values represent the received energy, intensity or power of light received by the detectors 30 b on the individual detection lines 50.
  • In step 430, the signal processor 140 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface. In an embodiment, the signal processor 140 is configured to process the transmission values to generate a two-dimensional attenuation map of the attenuation field across the touch surface, i.e. a spatial distribution of attenuation values, in which each touching object typically appears as a region of changed attenuation. From the attenuation map, two-dimensional touch data may be extracted and one or more touch locations may be identified. The transmission values may be processed according to a tomographic reconstruction algorithm to generate the two-dimensional attenuation map of the attenuation field.
  • In one embodiment, the signal processor 140 may be configured to generate an attenuation map for the entire touch surface. In an alternative embodiment, the signal processor 140 may be configured to generate an attenuation map for a sub-section of the touch surface, the sub-section being selected according to one or more criteria determined during processing of the transmission values.
  • In an alternative embodiment, the signal processor 140 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface by determining intersections between attenuated or occluded detection lines, i.e. by triangulation. In yet another embodiment, the signal processor 140 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface using non-linear touch detection techniques such as those described in US patent application publication 20150130769 or 20150138105.
  • In step 440, the signal processor 140 is configured to determine an object reference point 250 for each touching object 210, 220. As shown in FIG. 5, finger 210 and stylus 220 are applied to touch surface 20. Object reference point 250′ is determined for finger 210. Similarly, object reference point 250″ is determined for stylus 220.
  • In one embodiment, an image moment is applied to the attenuation map, or to a sub-region of the attenuation map, to determine a centroid of a detected touching object, for use as the object reference point. For example, for a scalar attenuation map with pixel intensities I(x, y), the raw image moments $M_{ij}$ are calculated as:
  • $M_{ij} = \sum_{x} \sum_{y} x^{i} y^{j} I(x, y)$
  • The centroid of the image moment may be calculated as:

  • $\{\bar{x},\ \bar{y}\} = \{M_{10}/M_{00},\ M_{01}/M_{00}\}$
  • The object reference point 250 is then set to the co-ordinates of the centroid of the image moment.
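  • As a minimal illustrative sketch (not part of the original disclosure), the moment-based centroid above may be computed as follows; the function name and the NumPy array layout are assumptions:

```python
import numpy as np

def centroid_from_moments(attenuation_map: np.ndarray) -> tuple:
    """Centroid of a 2-D attenuation map via raw image moments."""
    ys, xs = np.indices(attenuation_map.shape)   # pixel coordinate grids
    m00 = attenuation_map.sum()                  # zeroth moment M00
    m10 = (xs * attenuation_map).sum()           # first moment in x, M10
    m01 = (ys * attenuation_map).sum()           # first moment in y, M01
    return m10 / m00, m01 / m00                  # centroid (x̄, ȳ) = (M10/M00, M01/M00)
```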
  • In another embodiment, signal processor 140 is configured to determine an object reference point 250 within the interaction area of the touching object by determining a local maximum (i.e. the point of highest attenuation) in the area of the attenuation map covered by the object. The identified maximum may be further processed for determination of a touch shape and a centre position, e.g. by fitting a two-dimensional second-order polynomial or a Gaussian bell shape to the attenuation values, or by finding the ellipse of inertia of the attenuation values. There are also numerous other well-known techniques, such as clustering algorithms, edge detection algorithms, standard blob detection, water shedding techniques, flood fill techniques, etc. Step 440 results in a collection of peak data, which may include values of position, attenuation, size, and shape for each detected peak. The attenuation value may be calculated from a maximum attenuation value or a weighted sum of attenuation values within the peak shape.
  • In another embodiment, signal processor 140 is configured to determine an object reference point 250 within the interaction area of a large touching object by selecting a point at random within the boundary of the touching object.
  • In an embodiment in which touching objects are identified using intersections between attenuated or occluded detection lines, i.e. by triangulation, the object reference point is set to the intersection point or average of intersection points, including a weighted average determined in dependence on the attenuation of the detection lines used for computing the intersection points.
  • In step 450, a region 200 is determined around object 210, 220. The region corresponds to an area of the touch surface at the point of and surrounding an object interacting with the touch surface. In one embodiment, region 200 may be a circular area, centred on object reference point 250 and having radius R. Radius R may be a predetermined length. Alternatively, radius R may be dynamically determined in dependence on properties of the touching object, including the contact area of the touching object, or a pressure exerted by the touching object on the touch surface. Other embodiments are envisioned in which the region has an alternative shape, e.g. a rectangular region defined by a width and a height and with object reference point 250 at its centre. Similarly, an ellipse may be used, defined by a width and a height and with object reference point 250 at its centre.
  • In step 460, a set of detection lines intersecting region 200 is determined. In an embodiment where region 200 is a circular area, centred on object reference point 250 and having radius R, the set of detection lines intersecting region 200 is determined to be the set of detection lines passing within distance R of the object reference point 250.
  • An embodiment of step 460 is now described. This embodiment is recognised as one of numerous possible solutions for determining the detection lines intersecting region 200.
  • 1) The emitter/detector pairs forming each detection line are analysed in a counterclockwise direction. As shown in FIG. 6a , the detection line from the first emitter e0 on the bottom side of the touch surface and the first detector d0 on the right side is the first detection line to be analysed. For the purposes of clear explanation, the touch system shown in FIG. 6a shows only emitters along left and bottom edges and detectors along the right and top edges. However, it is understood that the present concepts may be applied to touch systems having a variety of emitter and detector geometries including interleaved emitter and detector arrangements.
  • The detector counter is then incremented in the counterclockwise direction (i.e. di+1) and the detection line between emitter e0 and the incremented detector di+1 is analysed. This loop continues and the detection lines from the emitter are therefore analysed in a counterclockwise pattern until a detection line is identified that passes sufficiently close to the object reference point 250, i.e. distance 255 is within the specified radius R. In FIG. 6a , this is detection line 170. Measuring distance 255 is preferably achieved using the dot product:

  • s = dot product(normal[e0 − di], object reference point − detection line position[e0 − di])
  • where s is the closest distance from a point to a line.
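  • A minimal sketch of this distance test follows, assuming 2-D emitter/detector coordinates and using the emitter position as the detection line position; a detection line then intersects the circular region when |s| ≤ R:

```python
import numpy as np

def signed_distance(emitter, detector, point) -> float:
    """Closest signed distance from `point` to the detection line
    running from `emitter` to `detector` (all 2-D (x, y) coordinates)."""
    e, d, p = (np.asarray(v, dtype=float) for v in (emitter, detector, point))
    direction = d - e
    normal = np.array([-direction[1], direction[0]])
    normal /= np.linalg.norm(normal)             # unit normal to the line
    return float(np.dot(normal, p - e))          # s = normal · (point − line position)
```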
  • Other search sequences are envisaged including a binary search, or root-finding algorithm, such as secant method or Newton's method.
  • FIG. 6a also shows detection line angle ϕ, the use of which is described below.
  • In embodiments where region 200 is non-circular, other techniques for determining intersection of the region by a detection line may be used, e.g. ray/polygon intersection algorithms as known in the art.
  • As shown in FIG. 6b , the loop then continues and the detection lines from the emitter continue to be analysed in a counterclockwise pattern, identifying all of the detection lines passing within distance R of the object reference point 250 until a detection line is identified that does not pass within distance R of the object reference point 250. All of the detection lines D0 are defined as the set of detection lines from emitter e0 intersecting region 200. Of this set, the most clockwise detection line is dcw,0 and the most counterclockwise detection line is dccw,0.
  • For all detection lines D0, the transmission values and reference values are determined. In one embodiment, the reference values are an estimated background transmission value for the detection line without any touching objects present. In an alternative embodiment, reference values can be a transmission value of the detection line recorded at a previous time, e.g. within 500 ms. Alternatively, reference values can be an average of transmission values over a period of time, e.g. the last 500 ms. Such averaging techniques are described in U.S. Pat. No. 9,377,884.
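  • As one hedged illustration of such a time-averaged reference (class name, forget factor, and initialisation are assumptions, not taken from the cited patent), a per-line exponential moving average might look like:

```python
import numpy as np

class LineReference:
    """Per-detection-line reference transmission tracked as an
    exponential moving average of recent frames."""
    def __init__(self, n_lines: int, alpha: float = 0.3):
        self.alpha = alpha                   # per-frame forget factor (assumed value)
        self.reference = np.ones(n_lines)    # assume full transmission initially

    def update(self, transmission: np.ndarray) -> np.ndarray:
        """Return attenuation = 1 − T/T_ref, then fold the new frame
        into the reference."""
        attenuation = 1.0 - transmission / self.reference
        self.reference = (1.0 - self.alpha) * self.reference + self.alpha * transmission
        return attenuation
```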
  • As shown in FIG. 6c , the next step is to move on to the next emitter to determine detection lines from the next emitter that intersect region 200.
  • As the emitter/detectors are processed in a circular order, a geometric consequence is that the detection line defined by [ej+1, dk] will be further away from the region 200 than [ej, dk]. Therefore, in a preferable configuration, when detection lines for the next emitter in the counterclockwise direction are analysed, the first detection line to be analysed may be [ej+1, dcw,j], with the analysis then continuing in a counterclockwise direction. As an alternative to selecting the next detection line in the counterclockwise direction, the next detection line to be analysed may be determined using a binary search or a root-finding algorithm. As shown in FIG. 6c , once [e0, dcw,0] is determined to be the most clockwise detection line to intersect region 200 from emitter e0, detection line [e1, dcw,0], shown in the figure as detection line 172, is an effective detection line to start the next loop with. This allows a significant reduction in the number of computations required to determine the set of intersecting detection lines.
  • As shown in FIG. 6d , the loop then continues and the detection lines from the next emitter continue to be analysed in a counterclockwise pattern, identifying all of the detection lines passing within distance R of the object reference point 250 until a detection line is identified that does not pass within distance R of the object reference point 250. All of the detection lines D1 are defined as the set of detection lines from emitter e1 intersecting region 200. Of this set, the most clockwise detection line is dcw,1 and the most counterclockwise detection line is dccw,1.
  • The above steps are repeated for every emitter until every detection line intersecting region 200 is determined. It is noted that the order in which detection lines are analysed is arbitrary; it is possible to start from fixed emitters or detectors when searching for intersecting detection lines.
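  • For reference only, a brute-force determination of the intersecting set is sketched below (function name assumed; `signed_distance` is the helper sketched earlier). The counterclockwise sweep described above prunes this O(emitters × detectors) loop by warm-starting each emitter's scan at the previous emitter's most clockwise hit:

```python
def lines_intersecting_region(emitters, detectors, ref_point, radius):
    """All (emitter index, detector index) pairs whose detection line
    passes within `radius` of the object reference point."""
    return [(j, i)
            for j, e in enumerate(emitters)
            for i, d in enumerate(detectors)
            if abs(signed_distance(e, d, ref_point)) <= radius]
```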
  • In step 470 of FIG. 4, the signal processor 140 is configured to determine characteristics of the touching object in dependence on the set of detection lines intersecting region 200 around the touching object.
  • FIG. 7a shows a set of detection lines that intersect region 200′ surrounding finger 210. Non-interacting detection lines 240 intersect region 200′ but do not interact with finger 210 in a significant way. Interacting detection lines 230 intersect region 200′ and are also attenuated or occluded by finger 210.
  • FIG. 7b shows a similar arrangement to FIG. 7a but where the interacting object is a stylus tip instead of a finger. In FIG. 7b , a set of detection lines are shown that intersect region 200″. Non-interacting detection lines 240 intersect region 200″ but do not interact with stylus tip 220 in a significant way. Interacting detection lines 230 intersect region 200″ and are also attenuated or occluded by stylus tip 220.
  • Although FIGS. 7a and 7b show the detection lines as thin lines, an actual embodiment may have much wider detection lines in the plane of the touch surface. For a system such as that shown in FIG. 2 where an object may fully occlude light, an un-occluded portion of the detection line may still be received at the detector and so the detection line appears to only be attenuated and not occluded. Therefore, in an embodiment of a touch system of the type shown in FIG. 2, a detection line may be determined to be an interacting detection line if it is attenuated by the object by between 30% and 100%. In an embodiment of a touch system of the type shown in FIG. 3, a detection line may be determined to be an interacting detection line if it is attenuated by the object by between 0.5% and 10%. The ‘interaction’ threshold should be higher than the expected noise on the respective detection line.
  • From a visual inspection of FIGS. 7a and 7b , it is clear that the ratio of interacting detection lines 230 to non-interacting detection lines 240 is greater for a finger object than for a stylus object. Therefore, in one embodiment, an object type may be determined in dependence on at least one statistical measure of variables of the detection lines.
  • FIGS. 8a and 8b show histograms of attenuation values (where the attenuation values represent the drop in transmission of the signal, e.g. attenuation = 1 − transmission) for the interacting detection lines. FIG. 8a shows values of interacting detection lines 230 for finger 210; four separate finger touch events are shown. FIG. 8b shows values of interacting detection lines 230 for stylus tip 220; four separate stylus touch events are shown. From FIGS. 8a and 8b , it is clear that fingers have a negative skew while styli tend to have a more positive skew (as defined per its normal meaning in statistics). One can also note that there is almost no tail in the distribution of detection lines with high attenuation for the fingers, while there is a distinct tail in the distribution of detection lines for the pens. In the embodiment used to produce the histograms, the attenuation is computed using the current transmission signals and an exponential forget average of 3-4 previous transmission signals.
  • In one embodiment, changes in attenuation are measured on a relatively short time scale, i.e. during the touch-down event. Such an attenuation map is described in U.S. Pat. No. 9,377,884.
  • FIGS. 9a and 9b show histograms of attenuation values for the interacting detection lines with corresponding first threshold levels. FIG. 9a shows values of interacting detection lines 230 for finger 210. FIG. 9b shows values of interacting detection lines 230 for stylus tip 220.
  • The first threshold in FIGS. 9a and 9b is computed as:
  • $\text{first threshold} = \text{factor} \cdot \dfrac{\sum_{n} \text{frequency}(n) \cdot \text{attenuation}(n)}{\sum_{n} \text{frequency}(n)}$, where factor = 2.
  • The threshold factor may be adjusted in dependence on temporal information of the interactions of the touch system. In one embodiment, where a plurality of styli have been recently identified in an area, the threshold for detecting styli in that area may be reduced to make stylus classification more likely. Where a plurality of fingers have been recently identified in an area, the factor may be increased for determinations made in that area to make finger classification more likely. The factor may also be adjusted to ensure better performance when several proximal touches are detected, due to some detection lines passing through more than one object.
  • In one embodiment, a first threshold is used to find the ratio of detection lines above the first threshold to those below it. This ratio is small for fingers and higher for pens. In the example results of FIGS. 8a, 8b, 9a, and 9b , a second threshold is located between the finger ratio of approximately 0.02 and the pen ratio of approximately 0.08, i.e. at 0.05. The second threshold is then used to determine the object type, i.e. an object having a set of detection lines with a ratio above the second threshold may be determined to be a stylus (or a finger if below the second threshold). In an alternative embodiment, the first threshold is computed in dependence on an attenuation peak from an attenuation map, e.g. the first threshold is set to a value corresponding to the peak value multiplied by a typical finger size.
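  • A minimal sketch of this two-threshold classification, using the example factor of 2 and second threshold of 0.05 from the text; note that the histogram-weighted mean above reduces to the plain mean of the per-line attenuation values:

```python
import numpy as np

def classify_by_thresholds(attenuations: np.ndarray,
                           factor: float = 2.0,
                           second_threshold: float = 0.05) -> str:
    """Classify an object from the attenuation values of its
    interacting detection lines."""
    first_threshold = factor * attenuations.mean()   # Σ freq·atten / Σ freq == mean
    ratio = (attenuations > first_threshold).mean()  # fraction above first threshold
    return "stylus" if ratio > second_threshold else "finger"
```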
  • For systems where the detection line width is similar to that of the pen, reconstructed peaks of the same attenuation (touches and pens) have different attenuation histograms. Since a finger is generally bigger, it will have a lower attenuation per detection line (if the reconstructed attenuation is the same) than a pen (which attenuates fewer detection lines), even though the reconstructed attenuation value may end up at the same level.
  • In one embodiment, the ratio of attenuated detection lines (whose attenuation is above a threshold) to the total number of detection lines passing through the region may be used to determine an object type. For example, if all detection lines that pass within 5 mm of the touch point are analysed, a finger can be expected to affect almost all of the detection lines (most fingers are larger than 10 mm in diameter). A stylus tip with a 2-4 mm contact diameter will only affect around 10-70% of the detection lines, depending on the width of the detection lines. Consequently, in an embodiment, the object type may be determined to be a finger where the ratio of affected detection lines to total intersecting detection lines exceeds 0.7.
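  • The affected-ratio test might be sketched as follows, with the 0.7 cut-off from the text and an assumed per-system interaction threshold:

```python
def is_finger(attenuations, interaction_threshold, min_ratio=0.7):
    """True when most of the detection lines intersecting the region
    are actually affected by the object."""
    affected = sum(1 for a in attenuations if a > interaction_threshold)
    return affected / len(attenuations) > min_ratio
```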
  • In other embodiments, the statistical measure may comprise the symmetry, skewness, kurtosis, mode, support, head, tail, mean, median, variance or standard deviation of a variable of the set of intersecting detection lines.
  • In some embodiments, characteristics of the object may be determined in dependence on a plurality of the statistical measures. In one example, an object type and an orientation of the object are determined in dependence on the statistical measure of at least the angle of the light path in the plane of the touch surface (shown as ϕ in FIG. 6a ) and the transmission value of the light path.
  • In some embodiments, at least one statistical measure is a multivariate statistical measure of values for a plurality of light path variables of the set of intersecting light paths. For example, a combination of the median and the skewness of the attenuation values may be used to determine object type. Alternatively, variance and median values may be used to determine object type. In an alternative example, an orientation of the object is determined in dependence on the statistical measure of the angle of the light path in the plane of the touch surface and the transmission value of the light path.
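  • One possible feature vector for such a multivariate measure is sketched below; the exact combination of measures, and any classifier operating on them, are left open by the text:

```python
import numpy as np

def line_statistics(attenuations: np.ndarray) -> dict:
    """Median, variance and skewness of per-line attenuation values."""
    mean, std = attenuations.mean(), attenuations.std()
    skew = ((attenuations - mean) ** 3).mean() / std ** 3 if std > 0 else 0.0
    return {"median": float(np.median(attenuations)),
            "variance": float(attenuations.var()),
            "skewness": float(skew)}
```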
  • A True Centre Point
  • A true centre point of a touch object (as opposed to object reference point 250) can now be found as the solution to the following over-determined set of linear equations, solved using normal equations.
  • For each of the interacting detection lines 230, a normal vector (having unit length) is determined as well as a position on the respective detection line (which can be the geometrical position of either emitter or detector or some other point).
  • For each detection line passing through the region we get one “weighted” equation:

  • 0 = attenuation · dot product(normal[ej − di], object reference point − detection line position[ej − di])
  • where normal is the normal vector and detection line position[ej − di] is a position along the respective detection line. All of the linear equations are then solved to determine a centre position. Using the attenuation as a weight when solving the normal equations eliminates the need to threshold affected vs. unaffected detection lines when computing the centre point in this fashion.
  • This technique also allows a centre position to be determined for regular shapes, oblongs, etc.
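  • A minimal sketch of the weighted solve, assuming per-line unit normals, per-line positions, and attenuation weights supplied as NumPy arrays; `numpy.linalg.lstsq` solves the over-determined system in the least-squares sense:

```python
import numpy as np

def true_centre(normals: np.ndarray, line_points: np.ndarray,
                attenuations: np.ndarray) -> np.ndarray:
    """Solve attenuation · n · (c − p) = 0 for the centre c over all lines.

    normals:      (N, 2) unit normals of the interacting detection lines
    line_points:  (N, 2) a position on each respective detection line
    attenuations: (N,)   per-line weights
    """
    w = attenuations[:, None]
    A = w * normals                               # weighted normals, one row per line
    b = (w * normals * line_points).sum(axis=1)   # weighted n·p right-hand sides
    centre, *_ = np.linalg.lstsq(A, b, rcond=None)
    return centre                                 # (x, y) true centre point
```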
  • Geometric characteristics of the object may also be determined in dependence on the one or more statistical measures, including length, width, radius, orientation in the plane of the touch surface, and shape.
  • Orientation of an Elongated Touching Object
  • In one embodiment, all determined detection lines for all emitters are analysed to determine their angle φ (phi), defined as the angle between the normal to the detection line and the touch surface x-axis 400, and the shortest distance from the true centre point to the detection line. Given all detection lines passing through the region, a minimum average (over a small phi-region) of attenuation · (shortest distance from detection line to true centre point) provides the orientation of an elongated object.
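  • A hedged sketch of this minimisation, binning the lines by phi and scoring each bin by the average of attenuation × |distance to true centre|; the bin count and tie handling are assumptions:

```python
import numpy as np

def elongated_orientation(phis, distances, attenuations, n_bins=36) -> float:
    """Angle (radians, mod π) minimising mean attenuation·|distance| per phi bin."""
    phis = np.mod(np.asarray(phis), np.pi)            # line angles are 180°-periodic
    edges = np.linspace(0.0, np.pi, n_bins + 1)
    idx = np.digitize(phis, edges) - 1                # phi bin of each line
    scores = np.full(n_bins, np.inf)
    for b in range(n_bins):
        sel = idx == b
        if sel.any():
            scores[b] = (np.asarray(attenuations)[sel]
                         * np.abs(np.asarray(distances)[sel])).mean()
    best = int(scores.argmin())
    return 0.5 * (edges[best] + edges[best + 1])      # centre angle of the best bin
```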
  • Object Boundary Lines
  • A boundary line may be determined as the detection line with the largest magnitude distance from centre point 250 where the attenuation is above a threshold. The characteristics of the selected boundary line will provide useful information about the characteristics of object 210, 220. First, where the object is substantially rectangular, the length (i.e. the major axis) of the object may be determined in dependence on a vector defining the shortest distance from the boundary line to the true centre point. As the object is rectangular, the magnitude of the vector may be assumed to be half of the length. Therefore, the length of the object may be determined to be twice the magnitude of the vector.
  • Furthermore, the angle of the vector also defines the orientation angle of the rectangular object. The angle phi of the vector defines the wide axis of the object. Consequently, the angle of the narrow axis of the rectangle may be defined as $\phi \pm \frac{\pi}{2}$. Using $\phi \pm \frac{\pi}{2}$, we can also use the distance between the boundary line located at $\phi \pm \frac{\pi}{2}$ and the true centre point to determine the width of the object. Similar to above, the width of the object may be determined to be twice the magnitude of the vector of the boundary line located at $\phi \pm \frac{\pi}{2}$.
  • In one embodiment, the phi and length values for the object are determined using an average of a plurality of the highest values.
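  • Combining the boundary-line steps above, a sketch under the stated rectangular-object assumption; the angular tolerance and the array-based input layout are assumptions:

```python
import numpy as np

def rectangle_from_boundary_lines(phis, distances, attenuations,
                                  threshold, tol=0.15):
    """Estimate length, width and orientation of a rectangular object
    from its attenuated boundary detection lines."""
    phis, distances = np.asarray(phis), np.asarray(distances)
    mask = np.asarray(attenuations) > threshold
    i = int(np.argmax(np.where(mask, np.abs(distances), -np.inf)))
    length = 2.0 * abs(distances[i])                # farthest attenuated line
    phi = float(phis[i])                            # orientation of the wide axis
    d = np.abs(np.mod(phis - phi, np.pi))
    ang = np.minimum(d, np.pi - d)                  # angular distance mod 180°
    perp = mask & (np.abs(ang - np.pi / 2) < tol)   # lines near phi ± π/2
    width = 2.0 * np.abs(distances[perp]).max() if perp.any() else 0.0
    return length, width, phi
```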
  • In another embodiment, a touch system is provided that includes a touch surface, a display, a touch sensor configured to detect one or more objects touching the touch surface and generate a touch signal, and a processing element configured to: determine a position of the one or more objects in dependence on the touch signal, determine whether an object is an eraser in dependence on the touch signal, and output a user interface to the display, wherein the user interface is configured to display one or more interaction objects, wherein the user interface is controlled via the one or more objects on the touch surface, and wherein an erase function may only be applied to the user interface by means of an object determined to be an eraser. The eraser may have a rectangular surface for application to the touch surface, allowing the touch system to easily identify the shape of the eraser, either according to the above techniques or techniques otherwise known to the skilled person. In a classroom environment where a teacher and children are interacting with a digital whiteboard and where erasing objects on the digital whiteboard is only permitted by means of the physical eraser, it is surprisingly difficult for a child to accidentally or deliberately simulate the shape of a rectangular eraser on the touch surface using their fingers and hands. Therefore, it is advantageously possible to prevent a child from erasing objects (e.g. ink, text, or geometric shapes) on the digital whiteboard without using the eraser object, i.e. without the teacher's authorization.
  • In the embodiment above, the user interface may be a canvas or whiteboard application. Furthermore, the one or more interaction objects may comprise ink, text, or geometric shapes. The one or more interaction objects may be added to the user interface by means of a non-eraser object type applied to the touch surface. The erase function may remove interaction objects from the user interface at a position on the user interface corresponding to the position of the eraser on the touch surface.

Claims (20)

What is claimed is:
1. A touch sensing apparatus, comprising:
a touch surface,
a plurality of emitters, arranged around the periphery of the touch surface, configured to emit beams of light such that one or more objects touching the touch surface cause an attenuation or occlusion of light;
a plurality of detectors, arranged around the periphery of the touch surface, configured to receive light from the plurality of emitters on a plurality of light paths, wherein each detector in the plurality of detectors is arranged to receive light from more than one emitter in the plurality of emitters; and
a hardware processor configured to:
determine, based on output signals from the plurality of detectors, a plurality of transmission values, each of the plurality of transmission values corresponding to each of the plurality of light paths;
determine an object reference point on the touch surface where the light is attenuated or occluded by an object based on the plurality of transmission values;
determine an area on the touch surface including the object reference point;
determine one or more light paths of the plurality of light paths intersecting the area;
determine a numerical measure based on the determined one or more light paths intersecting the area, and
determine one or more characteristics of the object based on the numerical measure.
2. The touch sensing apparatus of claim 1, further comprising a light transmissive panel defining the touch surface and an opposite surface, wherein the emitters are configured to introduce light into the panel for propagation by internal reflection between the touch surface and the opposite surface, and the detectors are configured to receive the light propagating in the panel.
3. The touch sensing apparatus of claim 1, wherein the emitters are configured to transmit the beams of light above the touch surface and the detectors are configured to receive said beams of light travelling above the touch surface.
4. The touch sensing apparatus of claim 1, wherein processing the transmission values to determine the object reference point on the touch surface comprises processing the transmission values according to an image reconstruction algorithm to determine areas of the touch surface where the light is attenuated or occluded by an object, and selecting an object reference point at a position on the touch surface corresponding to an area of occlusion or high attenuation of the light.
5. The touch sensing apparatus of claim 4, wherein the image reconstruction algorithm is an algorithm for transmission tomography.
6. The touch sensing apparatus of claim 1, wherein processing the transmission values to determine the object reference point on the touch surface comprises triangulation of attenuated or occluded light paths.
7. The touch sensing apparatus of claim 1, wherein the region is defined as a circular region with a radius R from the object reference point at the centre.
8. The touch sensing apparatus of claim 7, wherein the plurality of light paths intersecting the region are determined to be the plurality of light paths passing within radius R of the object reference point.
9. The touch sensing apparatus of claim 1, wherein the at least one light path variable may further comprise:
an angle of the light path in the plane of the touch surface,
a closest distance from object reference point to the light path,
a noise value for the light path,
a validity status of light path,
a width of the light path in the plane of the touch surface.
10. The touch sensing apparatus of claim 1, wherein the one or more statistical measures comprise a ratio of values above a first threshold to values below the first threshold.
11. The touch sensing apparatus of claim 10, wherein the first threshold value is determined in dependence on a determination of the attenuation or occlusion of the light at the object reference point.
12. The touch sensing apparatus of claim 1, wherein the at least one statistical measure comprises: symmetry, skewness, kurtosis, mode, support, head, tail, mean, median, variance or standard deviation.
13. The touch sensing apparatus of claim 1, wherein the one or more characteristics of the object are determined in dependence on a plurality of the one or more statistical measures.
14. The touch sensing apparatus of claim 13, wherein an object type and an orientation of the object is determined in dependence on the statistical measure of at least the angle of the light path in the plane of the touch surface and the transmission value of the light path.
15. The touch sensing apparatus of claim 1, wherein the at least one statistical measure is a multivariate statistical measure of values for each of at least two light path variables of the plurality of light paths intersecting the region.
16. The touch sensing apparatus of claim 15, wherein an orientation of the object is determined in dependence on the statistical measure of the angle of the light path in the plane of the touch surface and the transmission value of the light path.
17. The touch sensing apparatus of claim 1, wherein the one or more characteristics of the object determined in dependence on the at least one statistical measure comprise object type.
18. The touch sensing apparatus of claim 17, wherein the touch sensing apparatus is configured to differentiate between a finger and at least one stylus.
19. The touch sensing apparatus of claim 1, wherein the touch sensing apparatus is configured to determine at least one geometric characteristic of the object from: length, width, radii, orientation in the plane of the touch surface, shape.
20. A method in a touch sensing apparatus, said touch sensing apparatus comprising:
a touch surface,
a plurality of emitters arranged around the periphery of the touch surface, configured to emit beams of light such that one or more objects touching the touch surface cause an attenuation or occlusion of the light; and
a plurality of light detectors, arranged around the periphery of the touch surface, configured to receive light from the plurality of emitters on a plurality of light paths, wherein each detector in the plurality of detectors is arranged to receive light from more than one emitter in the plurality of emitters;
said method comprising:
determining, based on output signals from the plurality of light detectors, a plurality of transmission values, each of the plurality of transmission values corresponding to each of the plurality of light paths;
determining an object reference point on the touch surface where the light is attenuated or occluded by an object based on the plurality of transmission values;
determining an area on the touch surface including the object reference point,
determining one or more light paths of the plurality of light paths intersecting the area,
determining a numerical measure based on the determined one or more light paths intersecting the area, and
determining one or more characteristics of the object based on the numerical measure.
US15/925,329 2017-03-22 2018-03-19 Object characterisation for touch displays Abandoned US20180275830A1 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
SE1730073-2 2017-03-22
SE1730073 2017-03-22
SE1730120-1 2017-04-28
SE1730120 2017-04-28
EP17172910.6 2017-05-24
EP17172910 2017-05-24
SE1730276-1 2017-10-05
SE1730276 2017-10-05

Publications (1)

Publication Number Publication Date
US20180275830A1 true US20180275830A1 (en) 2018-09-27

Family

ID=63582516

Family Applications (5)

Application Number Title Priority Date Filing Date
US15/925,333 Active 2038-05-16 US10481737B2 (en) 2017-03-22 2018-03-19 Pen differentiation for touch display
US15/925,230 Active 2038-05-05 US10606414B2 (en) 2017-03-22 2018-03-19 Eraser for touch displays
US15/925,329 Abandoned US20180275830A1 (en) 2017-03-22 2018-03-19 Object characterisation for touch displays
US16/654,393 Active US11016605B2 (en) 2017-03-22 2019-10-16 Pen differentiation for touch displays
US16/829,541 Active US11099688B2 (en) 2017-03-22 2020-03-25 Eraser for touch displays

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US15/925,333 Active 2038-05-16 US10481737B2 (en) 2017-03-22 2018-03-19 Pen differentiation for touch display
US15/925,230 Active 2038-05-05 US10606414B2 (en) 2017-03-22 2018-03-19 Eraser for touch displays

Family Applications After (2)

Application Number Title Priority Date Filing Date
US16/654,393 Active US11016605B2 (en) 2017-03-22 2019-10-16 Pen differentiation for touch displays
US16/829,541 Active US11099688B2 (en) 2017-03-22 2020-03-25 Eraser for touch displays

Country Status (3)

Country Link
US (5) US10481737B2 (en)
EP (2) EP3602258B1 (en)
WO (3) WO2018174787A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180181231A1 (en) * 2015-06-12 2018-06-28 Sharp Kabushiki Kaisha Eraser device and command input system
US10282035B2 (en) 2016-12-07 2019-05-07 Flatfrog Laboratories Ab Touch device
US10437389B2 (en) 2017-03-28 2019-10-08 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10474249B2 (en) 2008-12-05 2019-11-12 Flatfrog Laboratories Ab Touch sensing apparatus and method of operating the same
US10606414B2 (en) 2017-03-22 2020-03-31 Flatfrog Laboratories Ab Eraser for touch displays
US10761657B2 (en) 2016-11-24 2020-09-01 Flatfrog Laboratories Ab Automatic optimisation of touch signal
US10775937B2 (en) 2015-12-09 2020-09-15 Flatfrog Laboratories Ab Stylus identification
US11029783B2 (en) 2015-02-09 2021-06-08 Flatfrog Laboratories Ab Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel
US11182023B2 (en) 2015-01-28 2021-11-23 Flatfrog Laboratories Ab Dynamic touch quarantine frames
US11256371B2 (en) 2017-09-01 2022-02-22 Flatfrog Laboratories Ab Optical component
US11474644B2 (en) 2017-02-06 2022-10-18 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems
US11567610B2 (en) 2018-03-05 2023-01-31 Flatfrog Laboratories Ab Detection line broadening
US11893189B2 (en) 2020-02-10 2024-02-06 Flatfrog Laboratories Ab Touch-sensing apparatus
US11943563B2 (en) 2019-01-25 2024-03-26 FlatFrog Laboratories, AB Videoconferencing terminal and method of operating the same

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019013767A1 (en) * 2017-07-11 2019-01-17 Hewlett-Packard Development Company, L.P. Touch input detection
CN110083272B (en) * 2019-05-06 2023-07-07 深圳市康冠商用科技有限公司 Touch positioning method and related device of infrared touch frame
EP3839706B1 (en) * 2019-12-20 2023-07-05 The Swatch Group Research and Development Ltd Method and device for determining the position of an object on a given surface
CN113934312B (en) * 2020-06-29 2023-10-20 深圳市创易联合科技有限公司 Touch object identification method based on infrared touch screen and terminal equipment
US11054943B1 (en) * 2020-08-17 2021-07-06 Microsoft Technology Licensing, Llc Touch restriction region for touch-sensitive display

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020075243A1 (en) * 2000-06-19 2002-06-20 John Newton Touch panel display system
US20120068973A1 (en) * 2009-05-18 2012-03-22 Flatfrog Laboratories Ab Determining The Location Of An Object On A Touch Surface
US20120256882A1 (en) * 2009-12-21 2012-10-11 Flatfrog Laboratories Ab Touch surface with identification of reduced performance
US20130155027A1 (en) * 2008-06-19 2013-06-20 Neonode Inc. Optical touch screen systems using total internal reflection
US20140320459A1 (en) * 2009-02-15 2014-10-30 Neonode Inc. Optical touch screens
US20150130769A1 (en) * 2012-05-02 2015-05-14 Flatfrog Laboratories Ab Object detection in touch systems

Family Cites Families (647)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR1452041A (en) 1965-04-26 1966-02-25 Electronique & Automatisme Sa Communication device with an electronic calculator
US3440426A (en) 1966-01-11 1969-04-22 Us Navy Solar attitude encoder
US3673327A (en) 1970-11-02 1972-06-27 Atomic Energy Commission Touch actuable data input panel assembly
IT961146B (en) 1971-03-12 1973-12-10 Schlumberger Compteurs DEVICE PERMITTING ME TO DETERMINE THE DIRECTION OF A BRIGHT RADIATION
FR2172828B1 (en) 1972-02-23 1974-12-13 Dassault Electronique
DE2654464A1 (en) 1976-12-01 1978-06-08 Sick Optik Elektronik Erwin PHOTOELECTRIC LIGHT RECEIVING ARRANGEMENT
US4129384A (en) 1977-06-08 1978-12-12 Batelle Memorial Institute Optical extensometer
US4254333A (en) 1978-05-31 1981-03-03 Bergstroem Arne Optoelectronic circuit element
US4209255A (en) 1979-03-30 1980-06-24 United Technologies Corporation Single source aiming point locator
US4213707A (en) 1979-04-25 1980-07-22 Eastman Kodak Company Device for improving the accuracy of optical measuring apparatus and the like
US4254407A (en) 1979-07-18 1981-03-03 Ncr Corporation Data processing system having optically linked subsystems, including an optical keyboard
US4294543A (en) 1979-11-13 1981-10-13 Command Control & Communications Corporation Optical system for developing point coordinate information
US4346376A (en) 1980-04-16 1982-08-24 Bell Telephone Laboratories, Incorporated Touch position sensitive surface
US4484179A (en) 1980-04-16 1984-11-20 At&T Bell Laboratories Touch position sensitive surface
US4420261A (en) 1980-09-02 1983-12-13 Lowbar, Inc. Optical position location apparatus
JPS58111705A (en) 1981-12-25 1983-07-02 Mitsutoyo Mfg Co Ltd Optical measuring device
US4542375A (en) 1982-02-11 1985-09-17 At&T Bell Laboratories Deformable touch sensitive surface
GB2131544B (en) 1982-12-07 1986-03-05 Lowbar Inc Optical postition location apparatus
US4593191A (en) 1982-12-29 1986-06-03 At&T Bell Laboratories Pressure and optical sensitive device with deformable protrusions
GB8302997D0 (en) 1983-02-03 1983-03-09 Bergstrom A Electromagnetic radiation circuit element
US4507557A (en) 1983-04-01 1985-03-26 Siemens Corporate Research & Support, Inc. Non-contact X,Y digitizer using two dynamic ram imagers
US4550250A (en) 1983-11-14 1985-10-29 Hei, Inc. Cordless digital graphics input device
US4752655A (en) 1984-11-16 1988-06-21 Nippon Telegraph & Telephone Corporation Coordinate input device
US4692809A (en) 1984-11-20 1987-09-08 Hughes Aircraft Company Integrated touch paint system for displays
US4673918A (en) 1984-11-29 1987-06-16 Zenith Electronics Corporation Light guide having focusing element and internal reflector on same face
JPH0325219Y2 (en) 1985-02-15 1991-05-31
JPH0325220Y2 (en) 1985-02-15 1991-05-31
US4710760A (en) 1985-03-07 1987-12-01 American Telephone And Telegraph Company, At&T Information Systems Inc. Photoelastic touch-sensitive screen
US4688993A (en) 1985-03-21 1987-08-25 United Technologies Corporation Tangential link swashplate centering member
DE3511330A1 (en) 1985-03-28 1986-10-02 Siemens Ag Arrangement for inputting graphic patterns
US5159322A (en) 1985-04-19 1992-10-27 Loebner Hugh G Apparatus to digitize graphic and scenic information and to determine the position of a stylus for input into a computer or the like
US5073770A (en) 1985-04-19 1991-12-17 Lowbner Hugh G Brightpen/pad II
US4949079A (en) 1985-04-19 1990-08-14 Hugh Loebner Brightpen/pad graphic device for computer inputs and the like
US4688933A (en) 1985-05-10 1987-08-25 The Laitram Corporation Electro-optical position determining system
US4736191A (en) 1985-08-02 1988-04-05 Karl E. Matzke Touch activated control method and apparatus
JPH0318997Y2 (en) 1985-10-04 1991-04-22
JPH0762821B2 (en) 1986-05-30 1995-07-05 株式会社日立製作所 Touch panel input device
US4782328A (en) 1986-10-02 1988-11-01 Product Development Services, Incorporated Ambient-light-responsive touch screen data input method and system
US4891829A (en) 1986-11-19 1990-01-02 Exxon Research And Engineering Company Method and apparatus for utilizing an electro-optic detector in a microtomography system
US4868912A (en) 1986-11-26 1989-09-19 Digital Electronics Infrared touch panel
US4746770A (en) 1987-02-17 1988-05-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
US4820050A (en) 1987-04-28 1989-04-11 Wells-Gardner Electronics Corporation Solid-state optical position determining apparatus
FR2614711B1 (en) 1987-04-29 1992-03-13 Photonetics METHOD AND DEVICE FOR OPERATING THE SCREEN SIGNAL OF A TOUCH SCREEN
FR2617619B1 (en) 1987-07-02 1990-01-05 Photonetics OPTICAL TOUCH SCREEN MOUNTING DEVICE
FR2617620B1 (en) 1987-07-02 1992-09-25 Photonetics OPTICAL TYPE TOUCH SCREEN
US4772763A (en) 1987-08-25 1988-09-20 International Business Machines Corporation Data processing information input using optically sensed stylus features
JPH01195526A (en) 1988-01-29 1989-08-07 Sony Corp Touch panel device
FR2631438B1 (en) 1988-05-11 1991-06-21 Photonetics METHOD FOR POSITIONING AN OBJECT RELATIVE TO A PLANE, METHOD FOR MEASURING LENGTH AND DEVICES FOR CARRYING OUT SAID METHODS
US4988983A (en) 1988-09-02 1991-01-29 Carroll Touch, Incorporated Touch entry system with ambient compensation and programmable amplification
US4986662A (en) 1988-12-19 1991-01-22 Amp Incorporated Touch entry using discrete reflectors
FR2645645B1 (en) 1989-04-06 1991-07-12 Photonetics IMPROVEMENTS IN METHODS AND DEVICES FOR DETERMINING THE ANGLE OF CONTACT OF A DROP OF LIQUID PLACED ON A SUBSTRATE
US4916712A (en) 1989-07-27 1990-04-10 Mcdonnell Douglas Corporation Optically pumped slab laser
US5065185A (en) 1989-08-21 1991-11-12 Powers Edward A Multi-function detecting device for a document reproduction machine
ATE118208T1 (en) 1989-10-16 1995-02-15 Chiroscience Ltd CHIRAL AZABICYCLOHEPTANONES AND METHOD FOR THE PRODUCTION THEREOF.
US5105186A (en) 1990-05-25 1992-04-14 Hewlett-Packard Company Lcd touch screen
US6390370B1 (en) 1990-11-15 2002-05-21 Symbol Technologies, Inc. Light beam scanning pen, scan module for the device and method of utilization
DE4111710C2 (en) 1991-04-10 1995-01-12 Data Stream Corp Wireless input device for computers
FR2676275A1 (en) 1991-05-07 1992-11-13 Photonetics DEVICE FOR REMOTELY MEASURING THE POSITION OF AN OBJECT.
US5539514A (en) 1991-06-26 1996-07-23 Hitachi, Ltd. Foreign particle inspection apparatus and method with front and back illumination
US5345490A (en) 1991-06-28 1994-09-06 General Electric Company Method and apparatus for converting computed tomography (CT) data into finite element models
US5335557A (en) 1991-11-26 1994-08-09 Taizo Yasutake Touch sensitive input control device
JPH05190066A (en) 1992-01-14 1993-07-30 Matsushita Electric Ind Co Ltd Light shielding plate device of touch switch
CA2060564C (en) 1992-02-06 1996-05-21 Toru Suzuki Wireless input system for computer
US5483261A (en) 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
CH683370A5 (en) 1992-04-10 1994-02-28 Zumbach Electronic Ag Method and apparatus for measuring the dimension of an object.
CA2068191C (en) 1992-05-07 1994-11-22 Fernand Sergerie Reinforced composite backing tape
US7084859B1 (en) 1992-09-18 2006-08-01 Pryor Timothy R Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US5248856A (en) 1992-10-07 1993-09-28 Microfield Graphics, Inc. Code-based, electromagnetic-field-responsive graphic data-acquisition system
EP0599297B1 (en) 1992-11-25 1998-05-20 Sumitomo Electric Industries, Limited Method of detecting impurities in molten resin
US5502568A (en) 1993-03-23 1996-03-26 Wacom Co., Ltd. Optical position detecting unit, optical coordinate input unit and optical position detecting method employing a pattern having a sequence of 1's and 0's
JP3400485B2 (en) 1993-03-23 2003-04-28 株式会社ワコム Optical position detecting device and optical coordinate input device
DE4334937A1 (en) 1993-10-13 1995-10-05 Siemens Ag Computer tomograph
JP3135183B2 (en) 1993-10-29 2001-02-13 株式会社ワコム Position indicator
WO1995014286A1 (en) 1993-11-17 1995-05-26 Microsoft Corporation Wireless pen computer input system
US5484966A (en) 1993-12-07 1996-01-16 At&T Corp. Sensing stylus position using single 1-D image sensor
JPH07200137A (en) 1993-12-28 1995-08-04 Wacom Co Ltd Position detection device and its position indicator
US5515083A (en) 1994-02-17 1996-05-07 Spacelabs Medical, Inc. Touch screen having reduced sensitivity to spurious selections
JPH07261920A (en) 1994-03-17 1995-10-13 Wacom Co Ltd Optical position detector and optical coordinate input device
JP3421416B2 (en) 1994-03-18 2003-06-30 株式会社ワコム Position detecting device and its position indicator
US5525764A (en) 1994-06-09 1996-06-11 Junkins; John L. Laser scanning graphic input system
US5526422A (en) 1994-06-20 1996-06-11 At&T Corp. System and method for cleaning the display screen of a touch screen device
DE19521254A1 (en) 1994-06-24 1996-01-04 Minnesota Mining & Mfg Display system with brightness boosting film
US5740224A (en) 1994-09-27 1998-04-14 University Of Delaware Cone beam synthetic arrays in three-dimensional computerized tomography
US5686942A (en) 1994-12-01 1997-11-11 National Semiconductor Corporation Remote computer input system which detects point source on operator
US5736686A (en) 1995-03-01 1998-04-07 Gtco Corporation Illumination apparatus for a digitizer tablet with improved light panel
US5764223A (en) 1995-06-07 1998-06-09 International Business Machines Corporation Touch-screen input device using the monitor as a light source operating at an intermediate frequency
US6031524A (en) 1995-06-07 2000-02-29 Intermec Ip Corp. Hand-held portable data terminal having removably interchangeable, washable, user-replaceable components with liquid-impervious seal
CA2225734C (en) 1995-06-29 2006-11-14 Lynn Wiese Localized illumination using tir technology
GB9516441D0 (en) 1995-08-10 1995-10-11 Philips Electronics Uk Ltd Light pen input systems
WO1997041527A1 (en) 1996-05-01 1997-11-06 Xros, Inc. Compact, simple, 2d raster, image-building fingerprint scanner
PL330188A1 (en) 1996-05-29 1999-04-26 Deutsche Telekom Ag Information entering apparatus
US6067079A (en) 1996-06-13 2000-05-23 International Business Machines Corporation Virtual pointing device for touchscreens
DE19631414A1 (en) 1996-08-05 1998-02-19 Daimler Benz Ag Device for recording the retinal reflex image and superimposing additional images in the eye
JP3300856B2 (en) 1996-08-12 2002-07-08 イーエルオー・タッチシステムズ・インコーポレイテッド Acoustic state sensor using multiple mutually non-orthogonal waves
US5767517A (en) 1996-10-21 1998-06-16 Board Of Regents -Univ. Of Ne Hybrid resampling method for fan beam spect
DE69739633D1 (en) 1996-11-28 2009-12-10 Casio Computer Co Ltd display device
US6061177A (en) 1996-12-19 2000-05-09 Fujimoto; Kenneth Noboru Integrated computer display and graphical input apparatus and method
US6380732B1 (en) 1997-02-13 2002-04-30 Super Dimension Ltd. Six-degree of freedom tracking system having a passive transponder on the object being tracked
JPH113169A (en) 1997-06-13 1999-01-06 Tokai Rika Co Ltd Touch operation information output device
US6229529B1 (en) 1997-07-11 2001-05-08 Ricoh Company, Ltd. Write point detecting circuit to detect multiple write points
US6480187B1 (en) 1997-08-07 2002-11-12 Fujitsu Limited Optical scanning-type touch panel
US6141104A (en) 1997-09-09 2000-10-31 Image Guided Technologies, Inc. System for determination of a location in three dimensional space
US6909419B2 (en) 1997-10-31 2005-06-21 Kopin Corporation Portable microdisplay system
US5945980A (en) 1997-11-14 1999-08-31 Logitech, Inc. Touchpad with active plane for pen detection
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US6315156B1 (en) 1998-01-26 2001-11-13 Gpax International, Inc. Tape-form packaging system and apparatus for effecting assembly and disassembly thereof
KR100595925B1 (en) 1998-01-26 2006-07-05 웨인 웨스터만 Method and apparatus for integrating manual input
DE19809934A1 (en) 1998-03-07 1999-09-09 Robert Bosch GmbH Laser display panel with contact detection
WO1999046602A1 (en) 1998-03-09 1999-09-16 Gou Lite Ltd. Optical translation measurement
US6172667B1 (en) 1998-03-19 2001-01-09 Michel Sayag Optically-based touch screen input device
US6748098B1 (en) 1998-04-14 2004-06-08 General Electric Company Algebraic reconstruction of images from non-equidistant data
JP3827450B2 (en) 1998-08-18 2006-09-27 富士通株式会社 Optical scanning touch panel
US7268774B2 (en) 1998-08-18 2007-09-11 Candledragon, Inc. Tracking motion of a writing instrument
US6972753B1 (en) 1998-10-02 2005-12-06 Semiconductor Energy Laboratory Co., Ltd. Touch panel, display device provided with touch panel and electronic equipment provided with display device
JP3530758B2 (en) 1998-12-03 2004-05-24 キヤノン株式会社 Pointer for inputting coordinates
JP4007705B2 (en) 1998-11-20 2007-11-14 富士通株式会社 Optical scanning touch panel
US6175999B1 (en) 1999-01-12 2001-01-23 Dell Usa, L.P. Universal fixture for pre-assembly of computer components
JP4245721B2 (en) 1999-03-05 2009-04-02 プラスビジョン株式会社 Coordinate input pen
US6333735B1 (en) 1999-03-16 2001-12-25 International Business Machines Corporation Method and apparatus for mouse positioning device based on infrared light sources and detectors
JP4097353B2 (en) 1999-04-07 2008-06-11 富士通株式会社 Optical scanning touch panel
JP4939682B2 (en) 1999-04-27 2012-05-30 エーユー オプトロニクス コーポレイション Display device
DE19924448A1 (en) 1999-05-28 2000-12-07 Siemens Ag Three-dimensional data set extraction method for magnetic resonance imaging
FR2794246B1 (en) 1999-05-31 2001-08-10 Saint Louis Inst Device capable of determining the position of an object in an OXZ reference frame
EP1188069A2 (en) 1999-06-09 2002-03-20 Beamcontrol Aps A method for determining the channel gain between emitters and receivers
FR2795877B1 (en) 1999-06-30 2001-10-05 Photonetics Partially reflective optical component and laser source incorporating such component
US6366277B1 (en) 1999-10-13 2002-04-02 Elo Touchsystems, Inc. Contaminant processing system for an acoustic touchscreen
JP3606138B2 (en) 1999-11-05 2005-01-05 セイコーエプソン株式会社 Driver IC, electro-optical device and electronic apparatus
JP2001147772A (en) 1999-11-19 2001-05-29 Fujitsu Takamisawa Component Ltd Touch panel
JP3780785B2 (en) 1999-11-30 2006-05-31 三菱電機株式会社 Concavity and convexity pattern detector
US6429857B1 (en) 1999-12-02 2002-08-06 Elo Touchsystems, Inc. Apparatus and method to improve resolution of infrared touch systems
JP2001183987A (en) 1999-12-27 2001-07-06 Pioneer Electronic Corp Cooling structure and display device using the same
US20040252867A1 (en) 2000-01-05 2004-12-16 Je-Hsiung Lan Biometric sensor
JP3881148B2 (en) 2000-02-18 2007-02-14 株式会社リコー Photodetection device for coordinate detection, coordinate input / detection device, electronic blackboard, mounting position detection method, and storage medium
US6495832B1 (en) 2000-03-15 2002-12-17 Touch Controls, Inc. Photoelectric sensing array apparatus and method of using same
US20010030642A1 (en) 2000-04-05 2001-10-18 Alan Sullivan Methods and apparatus for virtual touchscreen computer interface controller
US7859519B2 (en) 2000-05-01 2010-12-28 Tulbert David J Human-machine interface
US6864882B2 (en) 2000-05-24 2005-03-08 Next Holdings Limited Protected touch panel display system
US6660964B1 (en) 2000-09-22 2003-12-09 David Benderly Optical modification of laser beam cross section in object marking systems
US6724489B2 (en) 2000-09-22 2004-04-20 Daniel Freifeld Three dimensional scanning camera
WO2002035460A1 (en) 2000-10-27 2002-05-02 Elo Touchsystems, Inc. Touch confirming touchscreen utilizing plural touch sensors
JP4087247B2 (en) 2000-11-06 2008-05-21 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Measuring method of input device movement
US6648485B1 (en) 2000-11-13 2003-11-18 International Business Machines Corporation Highly collimating tapered light guide for uniform illumination of flat panel displays
US6940286B2 (en) 2000-12-30 2005-09-06 University Of Leeds Electrical impedance tomography
JP4004025B2 (en) 2001-02-13 2007-11-07 日東電工株式会社 Transparent conductive laminate and touch panel
DE10110744A1 (en) 2001-03-07 2002-09-26 Franc Godler Large, touch-sensitive area with time and location-controlled transmitter and receiver modules
US6452996B1 (en) 2001-03-16 2002-09-17 Ge Medical Systems Global Technology Company, Llc Methods and apparatus utilizing generalized helical interpolation algorithm
JP4768143B2 (en) 2001-03-26 2011-09-07 株式会社リコー Information input / output device, information input / output control method, and program
US6738051B2 (en) 2001-04-06 2004-05-18 3M Innovative Properties Company Frontlit illuminated touch panel
JP4812181B2 (en) 2001-04-20 2011-11-09 オリンパス株式会社 Observation optical system, imaging optical system, and apparatus using the same
US6992659B2 (en) 2001-05-22 2006-01-31 Palmone, Inc. High transparency integrated enclosure touch screen assembly for a portable hand held device
JP3959678B2 (en) 2001-07-13 2007-08-15 ミネベア株式会社 Touch panel for display device
DE10136611C1 (en) 2001-07-23 2002-11-21 Jenoptik Laserdiode Gmbh Optical device, for laser light emitted by laser diode device, has collimation optical element and homogenizing element using multiple reflection of laser beam
US6927384B2 (en) 2001-08-13 2005-08-09 Nokia Mobile Phones Ltd. Method and device for detecting touch pad unit
US6985137B2 (en) 2001-08-13 2006-01-10 Nokia Mobile Phones Ltd. Method for preventing unintended touch pad input due to accidental touching
US6765193B2 (en) 2001-08-21 2004-07-20 National Science And Technology Development Agency Optical touch switch structures
US20030048257A1 (en) 2001-09-06 2003-03-13 Nokia Mobile Phones Ltd. Telephone set having a touch pad device
US7254775B2 (en) 2001-10-03 2007-08-07 3M Innovative Properties Company Touch panel system and method for distinguishing multiple touch inputs
KR20040045490A (en) 2001-10-09 2004-06-01 코닌클리케 필립스 일렉트로닉스 엔.브이. Device having touch sensitivity functionality
US20100238139A1 (en) 2009-02-15 2010-09-23 Neonode Inc. Optical touch screen systems using wide light beams
US8339379B2 (en) 2004-04-29 2012-12-25 Neonode Inc. Light-based touch screen
US9471170B2 (en) 2002-11-04 2016-10-18 Neonode Inc. Light-based touch screen with shift-aligned emitter and receiver lenses
US20120188206A1 (en) 2001-11-02 2012-07-26 Neonode, Inc. Optical touch screen with tri-directional micro-lenses
US6948840B2 (en) 2001-11-16 2005-09-27 Everbrite, Llc Light emitting diode light bar
US6664498B2 (en) 2001-12-04 2003-12-16 General Atomics Method and apparatus for increasing the material removal rate in laser machining
KR100449710B1 (en) 2001-12-10 2004-09-22 삼성전자주식회사 Remote pointing method and apparatus therefor
US7006080B2 (en) 2002-02-19 2006-02-28 Palm, Inc. Display system
JP4477811B2 (en) 2002-02-27 2010-06-09 Hoya株式会社 Mounting plate for solid-state image sensor and mounting method to the mounting plate
DE10211307A1 (en) 2002-03-13 2003-11-20 Mechaless Systems Gmbh Device and method for optoelectronic detection of the movement and / or position of an object
WO2003077192A1 (en) 2002-03-13 2003-09-18 O-Pen Aps A touch pad, a stylus for use with the touch pad, and a method of operating the touch pad
JP2005535004A (en) 2002-03-27 2005-11-17 ネルコアー ピューリタン ベネット インコーポレイテッド Infrared touch frame system
DE50308334D1 (en) 2002-05-07 2007-11-22 Schott Ag Lighting device for buttons
JP2003330603A (en) 2002-05-13 2003-11-21 Ricoh Co Ltd Coordinate detecting device and method, coordinate detecting program for making computer execute the same method and recording medium with its program recorded
US7176897B2 (en) 2002-05-17 2007-02-13 3M Innovative Properties Company Correction of memory effect errors in force-based touch panel systems
US7952570B2 (en) 2002-06-08 2011-05-31 Power2B, Inc. Computer navigation
US20090143141A1 (en) 2002-08-06 2009-06-04 Igt Intelligent Multiplayer Gaming System With Multi-Touch Display
US7151532B2 (en) 2002-08-09 2006-12-19 3M Innovative Properties Company Multifunctional multilayer optical film
JP2004078613A (en) 2002-08-19 2004-03-11 Fujitsu Ltd Touch panel system
WO2004032210A2 (en) 2002-10-01 2004-04-15 Microfabrica Inc. Monolithic structures including alignment and/or retention fixtures for accepting components
US7133031B2 (en) 2002-10-31 2006-11-07 Microsoft Corporation Optical system design for a universal computing device
JP4093308B2 (en) 2002-11-01 2008-06-04 富士通株式会社 Touch panel device and contact position detection method
US8587562B2 (en) 2002-11-04 2013-11-19 Neonode Inc. Light-based touch screen using elliptical and parabolic reflectors
US8902196B2 (en) 2002-12-10 2014-12-02 Neonode Inc. Methods for determining a touch location on a touch screen
US9389730B2 (en) * 2002-12-10 2016-07-12 Neonode Inc. Light-based touch screen using elongated light guides
US7042444B2 (en) 2003-01-17 2006-05-09 Eastman Kodak Company OLED display and touch screen
US7629967B2 (en) 2003-02-14 2009-12-08 Next Holdings Limited Touch screen signal processing
US7532206B2 (en) 2003-03-11 2009-05-12 Smart Technologies Ulc System and method for differentiating between pointers used to contact touch surface
US20070034783A1 (en) 2003-03-12 2007-02-15 Eliasson Jonas O P Multitasking radiation sensor
JP2006523869A (en) 2003-03-12 2006-10-19 オー−プン・アンパルトセルスカブ System and method for measuring the position of a radiation emitting element
KR100533839B1 (en) 2003-03-14 2005-12-07 삼성전자주식회사 Control device of electronic devices based on motion
US7465342B2 (en) 2003-04-07 2008-12-16 Silverbrook Research Pty Ltd Method of minimizing absorption of visible light in ink compositions comprising infrared metal-dithiolene dyes
US7786983B2 (en) 2003-04-08 2010-08-31 Poa Sana Liquidating Trust Apparatus and method for a data input device using a light lamina screen
US7133032B2 (en) 2003-04-24 2006-11-07 Eastman Kodak Company OLED display and touch screen
US7362320B2 (en) 2003-06-05 2008-04-22 Hewlett-Packard Development Company, L.P. Electronic device having a light emitting/detecting display screen
JP2005004278A (en) 2003-06-09 2005-01-06 Ricoh Elemex Corp Coordinate input device
US7432893B2 (en) 2003-06-14 2008-10-07 Massachusetts Institute Of Technology Input device based on frustrated total internal reflection
US7474772B2 (en) 2003-06-25 2009-01-06 Atrua Technologies, Inc. System and method for a miniature user input device
JP4405766B2 (en) 2003-08-07 2010-01-27 キヤノン株式会社 Coordinate input device, coordinate input method
US7796173B2 (en) 2003-08-13 2010-09-14 Lettvin Jonathan D Imaging system
US7359041B2 (en) 2003-09-04 2008-04-15 Avago Technologies Ecbu Ip Pte Ltd Method and system for optically tracking a target using a triangulation technique
US7442914B2 (en) 2003-09-12 2008-10-28 Flatfrog Laboratories Ab System and method of determining a position of a radiation emitting element
ATE514991T1 (en) 2003-09-12 2011-07-15 Flatfrog Lab Ab System and method for determining a position of a radiation scattering/reflecting element
KR100534968B1 (en) 2003-09-16 2005-12-08 현대자동차주식회사 Cooling structure of an electronic element
WO2005029395A2 (en) 2003-09-22 2005-03-31 Koninklijke Philips Electronics N.V. Coordinate detection system for a display monitor
KR20060135610A (en) 2003-09-22 2006-12-29 코닌클리케 필립스 일렉트로닉스 엔.브이. Touch input screen using a light guide
US9123077B2 (en) 2003-10-07 2015-09-01 Hospira, Inc. Medication management system
US7221374B2 (en) 2003-10-21 2007-05-22 Hewlett-Packard Development Company, L.P. Adjustment of color in displayed images based on identification of ambient light sources
JP2005165199A (en) 2003-12-05 2005-06-23 Alps Electric Co Ltd Prism sheet, lighting device, surface emitting apparatus, and liquid crystal display device
US7265748B2 (en) 2003-12-11 2007-09-04 Nokia Corporation Method and device for detecting touch pad input
US7344279B2 (en) 2003-12-11 2008-03-18 Philips Solid-State Lighting Solutions, Inc. Thermal management methods and apparatus for lighting devices
GB2409304B (en) 2003-12-19 2007-11-14 Westerngeco Ltd Processing geophysical data
JP4616559B2 (en) 2004-01-15 2011-01-19 大日本印刷株式会社 Display device and display system
US7087907B1 (en) 2004-02-02 2006-08-08 Advanced Micro Devices, Inc. Detection of contamination in imaging systems by fluorescence and/or absorption spectroscopy
US7342705B2 (en) 2004-02-03 2008-03-11 Idc, Llc Spatial light modulator with integrated optical compensation structure
JP4522113B2 (en) 2004-03-11 2010-08-11 キヤノン株式会社 Coordinate input device
US20060033725A1 (en) 2004-06-03 2006-02-16 Leapfrog Enterprises, Inc. User created interactive interface
US7310090B2 (en) 2004-03-25 2007-12-18 Avago Technologies Ecbu Ip (Singapore) Pte Ltd. Optical generic switch panel
US6965836B2 (en) 2004-04-19 2005-11-15 Battelle Energy Alliance, Llc Method and apparatus for two dimensional surface property analysis based on boundary measurement
US7538759B2 (en) 2004-05-07 2009-05-26 Next Holdings Limited Touch panel display system with illumination and detection provided from a single edge
WO2005112581A2 (en) 2004-05-11 2005-12-01 Motion Computing, Inc. Improved display for stylus input displays
JP4429083B2 (en) 2004-06-03 2010-03-10 キヤノン株式会社 Shading type coordinate input device and coordinate input method thereof
GB0413747D0 (en) 2004-06-19 2004-07-21 Atomic Energy Authority Uk Optical keyboard
US7743348B2 (en) 2004-06-30 2010-06-22 Microsoft Corporation Using physical objects to adjust attributes of an interactive display application
US8184108B2 (en) 2004-06-30 2012-05-22 Poa Sana Liquidating Trust Apparatus and method for a folded optical element waveguide for use with light based touch screens
US7565020B2 (en) 2004-07-03 2009-07-21 Microsoft Corp. System and method for image coding employing a hybrid directional prediction and wavelet lifting
ES2555309T3 (en) 2004-07-06 2015-12-30 Maricare Oy Sensor product for electric field detection
JP2006039686A (en) 2004-07-22 2006-02-09 Pioneer Electronic Corp Touch panel device, touch region detecting method, and touch region detecting program
US7653883B2 (en) 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
US20060038698A1 (en) 2004-08-19 2006-02-23 Chen Jim T Multi-purpose remote control input device
JP4761736B2 (en) 2004-08-20 2011-08-31 東芝モバイルディスプレイ株式会社 Liquid crystal display
US20060061861A1 (en) 2004-09-23 2006-03-23 Reflexite Corporation High performance rear-projection screen
US20060066586A1 (en) 2004-09-27 2006-03-30 Gally Brian J Touchscreens for displays
WO2006055830A2 (en) 2004-11-15 2006-05-26 Hologic, Inc. Matching geometry generation and display of mammograms and tomosynthesis images
US8599140B2 (en) 2004-11-17 2013-12-03 International Business Machines Corporation Providing a frustrated total internal reflection touch interface
US7847789B2 (en) 2004-11-23 2010-12-07 Microsoft Corporation Reducing accidental touch-sensitive device activation
US20060132454A1 (en) 2004-12-16 2006-06-22 Deng-Peng Chen Systems and methods for high resolution optical touch position systems
US20060158437A1 (en) 2005-01-20 2006-07-20 Blythe Michael M Display device
US7800594B2 (en) 2005-02-03 2010-09-21 Toshiba Matsushita Display Technology Co., Ltd. Display device including function to input information from screen by light
US8298078B2 (en) 2005-02-28 2012-10-30 Wms Gaming Inc. Wagering game machine with biofeedback-aware game presentation
WO2006095320A2 (en) 2005-03-10 2006-09-14 Koninklijke Philips Electronics, N.V. System and method for detecting the location, size and shape of multiple objects that interact with a touch screen display
US20060202974A1 (en) 2005-03-10 2006-09-14 Jeffrey Thielman Surface
US7705835B2 (en) 2005-03-28 2010-04-27 Adam Eikman Photonic touch screen apparatus and method of use
US7840625B2 (en) 2005-04-07 2010-11-23 California Institute Of Technology Methods for performing fast discrete curvelet transforms of data
US20060256092A1 (en) 2005-05-12 2006-11-16 Lee Daniel J Reconfigurable interactive interface device including an optical display and optical touchpad that use aerogel to direct light in a desired direction
US7646833B1 (en) 2005-05-23 2010-01-12 Marvell International Ltd. Channel equalization in receivers
US7995039B2 (en) 2005-07-05 2011-08-09 Flatfrog Laboratories Ab Touch pad system
US7916144B2 (en) 2005-07-13 2011-03-29 Siemens Medical Solutions Usa, Inc. High speed image reconstruction for k-space trajectory data using graphic processing unit (GPU)
US7629968B2 (en) 2005-07-29 2009-12-08 Avago Technologies Fiber Ip (Singapore) Pte. Ltd. Methods and systems for detecting selections on a touch screen display
US7737959B2 (en) 2005-09-08 2010-06-15 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Position detection system using laser speckle
KR20070030547A (en) 2005-09-13 2007-03-16 삼성전자주식회사 Condensing member, method of manufacturing thereof and display apparatus having the same
JP4510738B2 (en) 2005-09-28 2010-07-28 株式会社 日立ディスプレイズ Display device
US8847924B2 (en) 2005-10-03 2014-09-30 Hewlett-Packard Development Company, L.P. Reflecting light
JP2007128497A (en) 2005-10-05 2007-05-24 Sony Corp Display apparatus and method thereof
US20070109239A1 (en) 2005-11-14 2007-05-17 Den Boer Willem Integrated light sensitive liquid crystal display
US7655901B2 (en) 2005-11-18 2010-02-02 Research In Motion Limited Light assisted keyboard for mobile communication device
JP2007163891A (en) 2005-12-14 2007-06-28 Sony Corp Display apparatus
US8013845B2 (en) 2005-12-30 2011-09-06 Flatfrog Laboratories Ab Optical touch pad with multilayer waveguide
US8077147B2 (en) 2005-12-30 2011-12-13 Apple Inc. Mouse with optical sensing surface
EP1835464A1 (en) 2006-03-14 2007-09-19 GSF-Forschungszentrum für Umwelt und Gesundheit GmbH Method of reconstructing an image function from radon data
WO2007112742A1 (en) 2006-03-30 2007-10-11 Flatfrog Laboratories Ab A system and a method of determining a position of a scattering/reflecting element on the surface of a radiation transmissive element
US7397418B1 (en) 2006-06-05 2008-07-08 Sandia Corporation SAR image formation with azimuth interpolation after azimuth transform
JP4891666B2 (en) 2006-06-22 2012-03-07 東芝モバイルディスプレイ株式会社 Liquid crystal display
WO2008007276A2 (en) 2006-06-28 2008-01-17 Koninklijke Philips Electronics, N.V. Method and apparatus for object learning and recognition based on optical parameters
US8094136B2 (en) 2006-07-06 2012-01-10 Flatfrog Laboratories Ab Optical touchpad with three-dimensional position determination
US8031186B2 (en) 2006-07-06 2011-10-04 Flatfrog Laboratories Ab Optical touchpad system and waveguide for use therein
US20080007541A1 (en) 2006-07-06 2008-01-10 O-Pen A/S Optical touchpad system and waveguide for use therein
US7351949B2 (en) 2006-07-10 2008-04-01 Avago Technologies General Ip Pte Ltd Optical generic switch panel
US7394058B2 (en) 2006-07-12 2008-07-01 Agilent Technologies, Inc. Touch screen with light-enhancing layer
US8441467B2 (en) 2006-08-03 2013-05-14 Perceptive Pixel Inc. Multi-touch sensing display through frustrated total internal reflection
EP2047308A4 (en) 2006-08-03 2010-11-24 Perceptive Pixel Inc Multi-touch sensing display through frustrated total internal reflection
US8144271B2 (en) 2006-08-03 2012-03-27 Perceptive Pixel Inc. Multi-touch sensing through frustrated total internal reflection
US20090189874A1 (en) 2006-08-03 2009-07-30 France Telecom Image capture and haptic input device
US7969410B2 (en) 2006-08-23 2011-06-28 Avago Technologies General Ip (Singapore) Pte. Ltd. Optically detecting click events
KR20080023832A (en) 2006-09-12 2008-03-17 삼성전자주식회사 Touch screen for mobile terminal and power saving method thereof
CN101517521B (en) 2006-09-13 2012-08-15 皇家飞利浦电子股份有限公司 System for determining, and/or marking the orientation and/or identification of an object
JP4842747B2 (en) 2006-09-20 2011-12-21 株式会社リコー Optical scanning apparatus, image forming apparatus, and color image forming apparatus
WO2008034184A1 (en) 2006-09-22 2008-03-27 Rpo Pty Limited Waveguide configurations for optical touch systems
JP4567028B2 (en) 2006-09-26 2010-10-20 エルジー ディスプレイ カンパニー リミテッド Liquid crystal display device having multi-touch sensing function and driving method thereof
KR100782431B1 (en) 2006-09-29 2007-12-05 주식회사 넥시오 Multi position detecting method and area detecting method in infrared rays type touch screen
US7369724B2 (en) 2006-10-03 2008-05-06 National Semiconductor Corporation Apparatus and method for an improved lens structure for polymer wave guides which maximizes free space light coupling
US9063617B2 (en) 2006-10-16 2015-06-23 Flatfrog Laboratories Ab Interactive display system, tool for use with the system, and tool management apparatus
US8094129B2 (en) 2006-11-27 2012-01-10 Microsoft Corporation Touch sensing using shadow and reflective modes
US7924272B2 (en) 2006-11-27 2011-04-12 Microsoft Corporation Infrared sensor integrated in a touch panel
US8269746B2 (en) 2006-11-27 2012-09-18 Microsoft Corporation Communication with a touch screen
JPWO2008066004A1 (en) 2006-11-30 2010-03-04 株式会社セガ Position input device
EP2126673A4 (en) 2006-12-08 2015-03-04 Flatfrog Lab Ab Position determination in optical interface systems
TWM314487U (en) 2006-12-20 2007-06-21 Amtran Technology Co Ltd Remote control having the audio-video function
KR100833753B1 (en) 2006-12-21 2008-05-30 삼성에스디아이 주식회사 Organic light emitting diode display and driving method thereof
JP4775247B2 (en) 2006-12-21 2011-09-21 三菱電機株式会社 Position detection device
CN101211246B (en) 2006-12-26 2010-06-23 乐金显示有限公司 Organic light-emitting diode panel and touch-screen system including the same
US8125455B2 (en) 2007-01-03 2012-02-28 Apple Inc. Full scale calibration measurement for multi-touch surfaces
JP2008181411A (en) 2007-01-25 2008-08-07 Nitto Denko Corp Optical waveguide for touch panel
TWM318760U (en) 2007-01-26 2007-09-11 Pixart Imaging Inc Remote controller
US20080189046A1 (en) 2007-02-02 2008-08-07 O-Pen A/S Optical tool with dynamic electromagnetic radiation and a system and method for determining the position and/or motion of an optical tool
US20080192025A1 (en) 2007-02-13 2008-08-14 Denny Jaeger Touch input devices for display/sensor screen
WO2008112146A2 (en) 2007-03-07 2008-09-18 The Trustees Of The University Of Pennsylvania 2d partially parallel imaging with k-space surrounding neighbors based data reconstruction
WO2008112886A1 (en) 2007-03-13 2008-09-18 Evident Technologies, Inc. Infrared display with luminescent quantum dots
US8243048B2 (en) 2007-04-25 2012-08-14 Elo Touch Solutions, Inc. Touchscreen for detecting multiple touches
CA2688214A1 (en) 2007-05-11 2008-11-20 Rpo Pty Limited A transmissive body
US20080291668A1 (en) 2007-05-21 2008-11-27 Rohm And Haas Denmark Finance A/S Mini lightbar illuminators for LCD displays
US7936341B2 (en) 2007-05-30 2011-05-03 Microsoft Corporation Recognizing selection regions from multiple simultaneous inputs
CN101681230B (en) 2007-05-30 2012-02-29 马丁定点设备公司 Touch-sensitive pointing device with guiding lines
CN101075168B (en) 2007-06-22 2014-04-02 北京汇冠新技术股份有限公司 Method for discriminating multiple points on infrared touch screen
JP4368392B2 (en) 2007-06-13 2009-11-18 東海ゴム工業株式会社 Deformation sensor system
US7835999B2 (en) 2007-06-27 2010-11-16 Microsoft Corporation Recognizing input gestures using a multi-touch input device, calculated graphs, and a neural network with link weights
US9019245B2 (en) 2007-06-28 2015-04-28 Intel Corporation Multi-function tablet pen input device
EP2009541B1 (en) 2007-06-29 2015-06-10 Barco N.V. Night vision touchscreen
JP2009043636A (en) 2007-08-10 2009-02-26 Mitsubishi Electric Corp Surface light source device and display device
CN101802759A (en) 2007-08-30 2010-08-11 奈克斯特控股公司 Low profile touch panel systems
US8760400B2 (en) 2007-09-07 2014-06-24 Apple Inc. GUI applications for use with 3D remote controller
US8231250B2 (en) 2007-09-10 2012-07-31 Lighting Science Group Corporation Warm white lighting device
US20090067178A1 (en) 2007-09-11 2009-03-12 Kismart Corporation Method of forming light-scattering dots inside the diffusion plate and light guide plate by laser engraving
US8122384B2 (en) 2007-09-18 2012-02-21 Palo Alto Research Center Incorporated Method and apparatus for selecting an object within a user interface by performing a gesture
US8395588B2 (en) 2007-09-19 2013-03-12 Canon Kabushiki Kaisha Touch panel
US8587559B2 (en) 2007-09-28 2013-11-19 Samsung Electronics Co., Ltd. Multipoint nanostructure-film touch screen
US8004502B2 (en) 2007-10-05 2011-08-23 Microsoft Corporation Correcting for ambient light in an optical touch-sensitive device
US8716614B2 (en) 2007-10-10 2014-05-06 Flatfrog Laboratories Ab Touch pad and a method of operating the touch pad
US20100073318A1 (en) 2008-09-24 2010-03-25 Matsushita Electric Industrial Co., Ltd. Multi-touch surface providing detection and tracking of multiple touch points
CN100501657C (en) 2007-11-05 2009-06-17 广东威创视讯科技股份有限公司 Touch panel device and its locating method
JP5082779B2 (en) 2007-11-07 2012-11-28 株式会社日立製作所 Flat panel display
KR101407300B1 (en) 2007-11-19 2014-06-13 엘지디스플레이 주식회사 Multi touch flat display module
AR064377A1 (en) 2007-12-17 2009-04-01 Rovere Victor Manuel Suarez Device for sensing multiple contact areas against objects simultaneously
JP5381715B2 (en) 2007-12-17 2014-01-08 日本電気株式会社 Input device, information terminal including the same, and input method
US20090168459A1 (en) 2007-12-27 2009-07-02 Qualcomm Incorporated Light guide including conjugate film
US20090174679A1 (en) 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
US20090187842A1 (en) 2008-01-22 2009-07-23 3Dlabs Inc., Ltd. Drag and Drop User Interface for Portable Electronic Devices with Touch Sensitive Screens
US9857915B2 (en) 2008-01-25 2018-01-02 Microsoft Technology Licensing, Llc Touch sensing for curved displays
EP2250546A2 (en) 2008-02-11 2010-11-17 Next Holdings Limited Systems and methods for resolving multitouch scenarios for optical touchscreens
EP2469399B1 (en) 2008-02-11 2019-09-11 Idean Enterprises Oy Layer-based user interface
US8766925B2 (en) 2008-02-28 2014-07-01 New York University Method and apparatus for providing input to a processor, and a sensor pad
US9454256B2 (en) 2008-03-14 2016-09-27 Apple Inc. Sensor configurations of an input device that are switchable based on mode
US9372591B2 (en) 2008-04-10 2016-06-21 Perceptive Pixel, Inc. Methods of interfacing with multi-input devices and multi-input display systems employing interfacing techniques
US8209628B1 (en) 2008-04-11 2012-06-26 Perceptive Pixel, Inc. Pressure-sensitive manipulation of displayed objects
TW200945123A (en) 2008-04-25 2009-11-01 Ind Tech Res Inst A multi-touch position tracking apparatus and interactive system and image processing method thereof
WO2009137355A2 (en) 2008-05-06 2009-11-12 Next Holdings, Inc. Systems and methods for resolving multitouch scenarios using software filters
US8830181B1 (en) 2008-06-01 2014-09-09 Cypress Semiconductor Corporation Gesture recognition system for a touch-sensing surface
US8676007B2 (en) 2008-06-19 2014-03-18 Neonode Inc. Light-based touch surface with curved borders and sloping bezel
EP2318903A2 (en) 2008-06-23 2011-05-11 FlatFrog Laboratories AB Detecting the location of an object on a touch surface
TW201007530A (en) 2008-06-23 2010-02-16 Flatfrog Lab Ab Detecting the location of an object on a touch surface
TW201005606A (en) 2008-06-23 2010-02-01 Flatfrog Lab Ab Detecting the locations of a plurality of objects on a touch surface
TW201013492A (en) 2008-06-23 2010-04-01 Flatfrog Lab Ab Determining the location of one or more objects on a touch surface
TW201001258A (en) 2008-06-23 2010-01-01 Flatfrog Lab Ab Determining the location of one or more objects on a touch surface
CN101644854A (en) 2008-08-04 2010-02-10 鸿富锦精密工业(深圳)有限公司 Direct backlight module
CN201233592Y (en) 2008-08-05 2009-05-06 北京汇冠新技术有限公司 Reflective light path construction used for infrared touch screen
JP5003629B2 (en) 2008-08-06 2012-08-15 パナソニック株式会社 Information terminal equipment
US9092092B2 (en) 2008-08-07 2015-07-28 Rapt Ip Limited Detecting multitouch events in an optical touch-sensitive device using touch event templates
US8350831B2 (en) * 2008-08-07 2013-01-08 Rapt Ip Limited Method and apparatus for detecting a multitouch event in an optical touch-sensitive device
US8227742B2 (en) 2008-08-07 2012-07-24 Rapt Ip Limited Optical control system with modulated emitters
EP2338103A1 (en) 2008-08-07 2011-06-29 Owen Drumm Optical control systems with feedback control
US9063615B2 (en) * 2008-08-07 2015-06-23 Rapt Ip Limited Detecting multitouch events in an optical touch-sensitive device using line images
US8188986B2 (en) 2008-09-23 2012-05-29 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. User input device with dynamic ambient light calibration
US9317159B2 (en) 2008-09-26 2016-04-19 Hewlett-Packard Development Company, L.P. Identifying actual touch points using spatial dimension information obtained from light transceivers
US8093545B2 (en) 2008-09-26 2012-01-10 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Lensless user input device with optical interference based on diffraction with a small aperture
US8237684B2 (en) 2008-09-26 2012-08-07 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. User input device with planar light guide illumination plate
US20110205189A1 (en) * 2008-10-02 2011-08-25 John David Newton Stereo Optical Sensors for Resolving Multi-Touch in a Touch Detection System
KR100972932B1 (en) 2008-10-16 2010-07-28 인하대학교 산학협력단 Touch Screen Panel
KR101323045B1 (en) 2008-10-21 2013-10-29 엘지디스플레이 주식회사 Sensing device and method for amplifying output thereof
FI121862B (en) 2008-10-24 2011-05-13 Valtion Teknillinen Arrangement for touch screen and corresponding manufacturing method
KR101542129B1 (en) 2008-10-24 2015-08-06 삼성전자 주식회사 Input Device For Foldable Display Device And Input Method Thereof
JP2012508913A (en) 2008-11-12 2012-04-12 フラットフロッグ ラボラトリーズ アーベー Integrated touch sensing display device and manufacturing method thereof
US20100125438A1 (en) 2008-11-15 2010-05-20 Mathieu Audet Method of scanning, analyzing and identifying electromagnetic field sources
KR100940435B1 (en) 2008-11-26 2010-02-10 한국광기술원 Two dimensional optical fiber scanning module, optical fiber scanning system having the same and optical fiber scanning method
SE533704C2 (en) 2008-12-05 2010-12-07 Flatfrog Lab Ab Touch sensitive apparatus and method for operating the same
US8317352B2 (en) 2008-12-11 2012-11-27 Robert Saccomanno Non-invasive injection of light into a transparent substrate, such as a window pane through its face
JP5239835B2 (en) 2008-12-24 2013-07-17 富士ゼロックス株式会社 Optical waveguide and optical waveguide type touch panel
US8407606B1 (en) 2009-01-02 2013-03-26 Perceptive Pixel Inc. Allocating control among inputs concurrently engaging an object displayed on a multi-touch device
EP2377005B1 (en) 2009-01-14 2014-12-17 Citron GmbH Multitouch control panel
US20130181896A1 (en) 2009-01-23 2013-07-18 Qualcomm Mems Technologies, Inc. Integrated light emitting and light detecting device
KR20110113746A (en) 2009-01-23 2011-10-18 퀄컴 엠이엠스 테크놀로지스, 인크. Integrated light emitting and light detecting device
US8487914B2 (en) 2009-06-18 2013-07-16 Avago Technologies General Ip (Singapore) Pte. Ltd. Optical fingerprint navigation device with light guide film
WO2010092993A1 (en) 2009-02-13 2010-08-19 株式会社 東芝 Information processing device
US9158416B2 (en) 2009-02-15 2015-10-13 Neonode Inc. Resilient light-based touch surface
EP2399237B1 (en) 2009-02-20 2013-08-14 Werth Messtechnik GmbH Method for measuring an object
US8331751B2 (en) 2009-03-02 2012-12-11 mBio Diagnostics, Inc. Planar optical waveguide with core of low-index-of-refraction interrogation medium
JP5269648B2 (en) 2009-03-02 2013-08-21 パナソニック株式会社 Portable terminal device and input device
WO2010100796A1 (en) 2009-03-06 2010-09-10 シャープ株式会社 Display apparatus
TWI524238B (en) 2009-03-31 2016-03-01 萬國商業機器公司 Multi-touch optical touch panel
TWI399677B (en) 2009-03-31 2013-06-21 Arima Lasers Corp Optical detection apparatus and method
JP5146389B2 (en) 2009-04-03 2013-02-20 ソニー株式会社 Information processing apparatus and estimation method
WO2010119882A1 (en) 2009-04-17 2010-10-21 シャープ株式会社 Display device
WO2010123809A2 (en) 2009-04-20 2010-10-28 3M Innovative Properties Company Non-radiatively pumped wavelength converter
FI124221B (en) 2009-04-24 2014-05-15 Valtion Teknillinen User Feed Arrangement and Related Production Method
US20100277436A1 (en) 2009-04-29 2010-11-04 Hong Kong Applied Science And Technology Research Institute Co., Ltd. Sensing System for a Touch Sensitive Device
WO2010127241A2 (en) 2009-04-30 2010-11-04 The Regents Of The University Of California System and methods for fast implementation of equally-sloped tomography
US20100283785A1 (en) 2009-05-11 2010-11-11 Agilent Technologies, Inc. Detecting peaks in two-dimensional signals
US8154529B2 (en) 2009-05-14 2012-04-10 Atmel Corporation Two-dimensional touch sensors
US20100295821A1 (en) 2009-05-20 2010-11-25 Tom Chang Optical touch panel
US20100315379A1 (en) 2009-05-22 2010-12-16 Matthew Allard Display Devices With Integrated Optical Components For Use in Position Detection
US8358901B2 (en) 2009-05-28 2013-01-22 Microsoft Corporation Optic having a cladding
WO2010141453A2 (en) 2009-06-01 2010-12-09 Han Jefferson Y Touch sensing
US8736581B2 (en) 2009-06-01 2014-05-27 Perceptive Pixel Inc. Touch sensing with frustrated total internal reflection
TWI414974B (en) 2009-06-17 2013-11-11 Novatek Microelectronics Corp Touch position sensing method and position sensing system of touch panel
WO2010149651A1 (en) 2009-06-23 2010-12-29 Imec Optical tactile sensors
TWI420371B (en) 2009-06-23 2013-12-21 Raydium Semiconductor Corporation Optical touch system and operating method thereof
CN201437963U (en) 2009-07-07 2010-04-14 台湾奈普光电科技股份有限公司 Structural improvement for light guide plate
ES2626435T3 (en) 2009-07-16 2017-07-25 O-Net Wavetouch Limited A device and a method of coding a position of an object
CN201465071U (en) 2009-07-20 2010-05-12 贺伟 Infrared touch screen frame structure
KR100941927B1 (en) 2009-08-21 2010-02-18 이성호 Method and device for detecting touch input
US8730212B2 (en) 2009-08-21 2014-05-20 Microsoft Corporation Illuminator for touch- and object-sensitive display
GB2486843B (en) 2009-08-25 2014-06-18 Promethean Ltd Interactive surface with a plurality of input detection technologies
US7932899B2 (en) 2009-09-01 2011-04-26 Next Holdings Limited Determining the location of touch points in a position detection system
CN102597936B (en) 2009-09-02 2015-01-07 平蛙实验室股份公司 Touch surface with a compensated signal profile
SE534244C2 (en) 2009-09-02 2011-06-14 Flatfrog Lab Ab Touch sensitive system and method for functional control thereof
WO2011031215A1 (en) 2009-09-11 2011-03-17 Flatfrog Laboratories Ab Touch surface with variable refractive index
KR101606883B1 (en) 2009-09-18 2016-04-12 삼성디스플레이 주식회사 Touch sensing apparatus
KR20110032640A (en) 2009-09-23 2011-03-30 삼성전자주식회사 Multi-touch sensing display apparatus
DE102009042922B4 (en) 2009-09-24 2019-01-24 Siemens Healthcare GmbH Method and apparatus for image determination from X-ray projections taken when traversing a trajectory
US8749512B2 (en) 2009-09-30 2014-06-10 Apple Inc. Negative pixel compensation
US20110080344A1 (en) 2009-10-02 2011-04-07 Dedo Interactive Inc. Blending touch data streams that include touch input data
US8373679B2 (en) 2009-10-12 2013-02-12 Garmin International, Inc. Infrared touchscreen electronics
KR20120083916A (en) 2009-10-19 2012-07-26 플라트프로그 라보라토리즈 에이비 Extracting touch data that represents one or more objects on a touch surface
WO2011049512A1 (en) 2009-10-19 2011-04-28 Flatfrog Laboratories Ab Touch surface with two-dimensional compensation
RU2012118597A (en) 2009-10-19 2013-11-27 ФлэтФрог Лэборэторис АБ Determination of touch data for one or more objects on a touch surface
JP5483996B2 (en) 2009-10-23 2014-05-07 キヤノン株式会社 Compensating optical device, imaging device, and compensating optical method
CN201927010U (en) 2009-11-12 2011-08-10 北京汇冠新技术股份有限公司 Touch screen, touch system and light source
JP2013511100A (en) * 2009-11-17 2013-03-28 アールピーオー・ピーティワイ・リミテッド Apparatus and method for receiving touch input
US20110115748A1 (en) 2009-11-18 2011-05-19 Amlogic Co., Ltd. Infrared Touch Screen
KR101627715B1 (en) 2009-11-18 2016-06-14 엘지전자 주식회사 Touch Panel, Driving Method for Touch Panel, and Display Apparatus having a Touch Panel
KR20110056892A (en) 2009-11-23 2011-05-31 삼성전자주식회사 Multi touch detecting apparatus for LCD display unit and multi touch detecting method using the same
TWI425396B (en) 2009-11-25 2014-02-01 Coretronic Corp Optical touch apparatus and optical touch display apparatus
US8436833B2 (en) 2009-11-25 2013-05-07 Corning Incorporated Methods and apparatus for sensing touch events on a display
TWM379163U (en) 2009-11-26 2010-04-21 Truelight Corp Packaging apparatus for high power and high orientation matrix semiconductor light-emitting devices
GB0921216D0 (en) 2009-12-03 2010-01-20 St Microelectronics Res & Dev Improved touch screen device
WO2011069148A1 (en) 2009-12-04 2011-06-09 Next Holdings Limited Methods and systems for position detection using an interactive volume
KR101926406B1 (en) 2009-12-11 2018-12-07 넥스트 홀딩스 리미티드 Position sensing systems for use in touch screens and prismatic film used therein
CN102096526B (en) 2009-12-15 2015-11-25 乐金显示有限公司 Optical sensing unit, and display module and display device using the same
WO2011072588A1 (en) 2009-12-16 2011-06-23 北京汇冠新技术股份有限公司 Infrared touch screen
KR101579091B1 (en) 2010-01-07 2015-12-22 삼성디스플레이 주식회사 Method for detecting touch position, detecting apparatus of touch position for performing the method and display apparatus having the detecting apparatus of touch position
US8502789B2 (en) 2010-01-11 2013-08-06 Smart Technologies Ulc Method for handling user input in an interactive input system, and interactive input system executing the method
KR101704695B1 (en) 2010-03-09 2017-02-09 삼성디스플레이 주식회사 Method for detecting touch position, detecting apparatus of touch position for performing the method and display apparatus having the detecting apparatus of touch position
KR20110103140A (en) 2010-03-12 2011-09-20 삼성전자주식회사 Apparatus for multi touch and proximated object sensing by irradiating light selectively
FR2957718B1 (en) 2010-03-16 2012-04-20 Commissariat Energie Atomique Hybrid high performance electroluminescent diode
KR101749266B1 (en) 2010-03-24 2017-07-04 삼성디스플레이 주식회사 Touch sensing display device and computer-readable medium
CN101930322B (en) 2010-03-26 2012-05-23 深圳市天时通科技有限公司 Identification method capable of simultaneously identifying a plurality of contacts of touch screen
JP2011227574A (en) 2010-04-15 2011-11-10 Rohm Co Ltd Arithmetic apparatus, motion detecting apparatus, electronic device
WO2011130919A1 (en) 2010-04-23 2011-10-27 Motorola Mobility, Inc. Electronic device and method using touch-detecting surface
JP5523191B2 (en) 2010-04-30 2014-06-18 株式会社ジャパンディスプレイ Display device with touch detection function
TW201203052A (en) 2010-05-03 2012-01-16 Flatfrog Lab Ab Touch determination by tomographic reconstruction
US8274495B2 (en) 2010-05-25 2012-09-25 General Display, Ltd. System and method for contactless touch screen
US8294168B2 (en) 2010-06-04 2012-10-23 Samsung Electronics Co., Ltd. Light source module using quantum dots, backlight unit employing the light source module, display apparatus, and illumination apparatus
US9158401B2 (en) 2010-07-01 2015-10-13 Flatfrog Laboratories Ab Data processing in relation to a multi-touch sensing apparatus
CN102339168B (en) 2010-07-21 2013-10-16 北京汇冠新技术股份有限公司 Touch screen and multi-channel sampling method thereof
US20120019448A1 (en) 2010-07-22 2012-01-26 Nokia Corporation User Interface with Touch Pressure Level Sensing
CN101882034B (en) 2010-07-23 2013-02-13 广东威创视讯科技股份有限公司 Device and method for discriminating color of touch pen of touch device
KR20120012571A (en) 2010-08-02 2012-02-10 엘지이노텍 주식회사 Optical touch screen and method for assembling the same
US8648970B2 (en) 2010-08-02 2014-02-11 Chip Goal Electronics Corporation, Roc Remote controllable video display system and controller and method therefor
US9092089B2 (en) 2010-09-15 2015-07-28 Advanced Silicon Sa Method for detecting an arbitrary number of touches from a multi-touch device
US9411444B2 (en) 2010-10-11 2016-08-09 Flatfrog Laboratories Ab Touch determination by tomographic reconstruction
TWI422908B (en) 2010-10-12 2014-01-11 Au Optronics Corp Touch display device
CA2814183C (en) 2010-10-12 2018-07-10 New York University Apparatus for sensing utilizing tiles, sensor having a set of plates, object identification for multi-touch surfaces, and method
US8654064B2 (en) 2010-10-18 2014-02-18 Samsung Display Co., Ltd. Backlight having blue light emitting diodes and method of driving same
US9092135B2 (en) 2010-11-01 2015-07-28 Sony Computer Entertainment Inc. Control of virtual object using device touch interface functionality
US20130234991A1 (en) 2010-11-07 2013-09-12 Neonode Inc. Optimized hemi-ellipsoidal LED shell
US20120131490A1 (en) 2010-11-22 2012-05-24 Shao-Chieh Lin Touch-controlled device and method for displaying a virtual keyboard on the touch-controlled device thereof
US8503753B2 (en) 2010-12-02 2013-08-06 Kabushiki Kaisha Toshiba System and method for triangular interpolation in image reconstruction for PET
JP2013546094A (en) 2010-12-15 2013-12-26 フラットフロッグ ラボラトリーズ アーベー Touch determination with signal enhancement
EP2466428A3 (en) 2010-12-16 2015-07-29 FlatFrog Laboratories AB Touch apparatus with separated compartments
EP2466429A1 (en) 2010-12-16 2012-06-20 FlatFrog Laboratories AB Scanning FTIR systems for touch detection
US8546741B2 (en) 2011-01-13 2013-10-01 Avago Technologies General Ip (Singapore) Pte. Ltd. Compact optical finger navigation system based on speckles with an optical element including an optical redirection surface
EP2479642B1 (en) 2011-01-21 2017-08-16 BlackBerry Limited System and method for reducing power consumption in an electronic device having a touch-sensitive display
US8635560B2 (en) 2011-01-21 2014-01-21 Blackberry Limited System and method for reducing power consumption in an electronic device having a touch-sensitive display
KR101942114B1 (en) 2011-02-02 2019-01-24 플라트프로그 라보라토리즈 에이비 Optical incoupling for touch-sensitive systems
US8619062B2 (en) 2011-02-03 2013-12-31 Microsoft Corporation Touch-pressure sensing in a display panel
US9201520B2 (en) 2011-02-11 2015-12-01 Microsoft Technology Licensing, Llc Motion and context sharing for pen-based computing inputs
US8624858B2 (en) 2011-02-14 2014-01-07 Blackberry Limited Portable electronic device including touch-sensitive display and method of controlling same
US8912905B2 (en) 2011-02-28 2014-12-16 Chon Meng Wong LED lighting system
EP2684113A4 (en) 2011-03-09 2015-01-21 Flatfrog Lab Ab Touch determination with signal compensation
TW201239710A (en) 2011-03-29 2012-10-01 Genius Electronic Optical Co Ltd Optical touch system
KR20140022843A (en) 2011-04-19 2014-02-25 퍼셉티브 픽셀 인코포레이티드 Optical filtered sensor-in-pixel technology for touch sensing
US8558788B2 (en) 2011-04-29 2013-10-15 Hewlett-Packard Development Company, L.P. Diffusing light of a laser
US9541701B2 (en) 2011-05-13 2017-01-10 3M Innovative Properties Company Back-lit transmissive display having variable index light extraction layer
US20140085241A1 (en) 2011-05-16 2014-03-27 Flatfrog Laboratories Ab Device and method for determining reduced performance of a touch sensitive apparatus
US9001086B1 (en) 2011-06-08 2015-04-07 Amazon Technologies, Inc. Display illumination with light-based touch sensing
CN103975344B (en) 2011-06-15 2017-09-26 百安托国际有限公司 Installation system for modularization position sensing
GB201110218D0 (en) 2011-06-16 2011-08-03 St Microelectronics Res & Dev Optical navigation device
JP5453351B2 (en) 2011-06-24 2014-03-26 株式会社Nttドコモ Mobile information terminal, operation state determination method, program
US8963886B2 (en) 2011-07-13 2015-02-24 Flatfrog Laboratories Ab Touch-sensing display panel
US8884900B2 (en) 2011-07-13 2014-11-11 Flatfrog Laboratories Ab Touch-sensing display apparatus and electronic device therewith
TWI497376B (en) 2011-07-22 2015-08-21 Rapt Ip Ltd Optical coupler assembly for use in an optical touch sensitive device
US9075561B2 (en) 2011-07-29 2015-07-07 Apple Inc. Systems, methods, and computer-readable media for managing collaboration on a virtual work of art
US8959435B2 (en) 2011-08-23 2015-02-17 Garmin Switzerland GmbH System and methods for detecting debris on a touchscreen system display screen
KR101862123B1 (en) 2011-08-31 2018-05-30 삼성전자 주식회사 Input device and method on terminal equipment having a touch module
EP2764426B1 (en) 2011-09-09 2019-01-23 FlatFrog Laboratories AB Light coupling structures for optical touch panels
TW201329821A (en) 2011-09-27 2013-07-16 Flatfrog Lab Ab Image reconstruction for touch determination
US9019240B2 (en) 2011-09-29 2015-04-28 Qualcomm Mems Technologies, Inc. Optical touch device with pixilated light-turning features
TW201333787A (en) 2011-10-11 2013-08-16 Flatfrog Lab Ab Improved multi-touch detection in a touch system
EP2771771A4 (en) 2011-10-27 2015-06-17 Flatfrog Lab Ab Touch determination by tomographic reconstruction
US20130106709A1 (en) 2011-10-28 2013-05-02 Martin John Simmons Touch Sensor With User Identification
JP5846631B2 (en) 2011-11-02 2016-01-20 株式会社エンプラス Light guide plate and optical system including the same
US9582178B2 (en) 2011-11-07 2017-02-28 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US20130125016A1 (en) 2011-11-11 2013-05-16 Barnesandnoble.Com Llc System and method for transferring content between devices
WO2013081896A1 (en) 2011-11-28 2013-06-06 Corning Incorporated Robust optical touch-screen systems and methods using a planar transparent sheet
WO2013081894A1 (en) 2011-11-28 2013-06-06 Corning Incorporated Optical touch-screen systems and methods using a planar transparent sheet
US9823781B2 (en) 2011-12-06 2017-11-21 Nri R&D Patent Licensing, Llc Heterogeneous tactile sensing via multiple sensor types
US9927920B2 (en) 2011-12-16 2018-03-27 Flatfrog Laboratories Ab Tracking objects on a touch surface
US10022498B2 (en) 2011-12-16 2018-07-17 Icu Medical, Inc. System for monitoring and delivering medication to a patient and method of using the same to minimize the risks associated with automated therapy
JP2015505093A (en) 2011-12-16 2015-02-16 フラットフロッグ ラボラトリーズ アーベーFlatFrog Laboratories AB Tracking objects on contact surfaces
EP3506069A1 (en) 2011-12-16 2019-07-03 FlatFrog Laboratories AB Tracking objects on a touch surface
US9711752B2 (en) 2011-12-19 2017-07-18 Lg Electronics Inc. Display apparatus
JP5296185B2 (en) 2011-12-21 2013-09-25 シャープ株式会社 Touch sensor system
WO2013095271A2 (en) 2011-12-22 2013-06-27 Flatfrog Laboratories Ab Touch determination with interaction compensation
US20130181953A1 (en) 2012-01-13 2013-07-18 Microsoft Corporation Stylus computing environment
US9250794B2 (en) 2012-01-23 2016-02-02 Victor Manuel SUAREZ ROVERE Method and apparatus for time-varying tomographic touch imaging and interactive system using same
US9588619B2 (en) * 2012-01-31 2017-03-07 Flatfrog Laboratories Ab Performance monitoring and correction in a touch-sensitive apparatus
US9811209B2 (en) 2012-02-21 2017-11-07 Flatfrog Laboratories Ab Touch determination with improved detection of weak interactions
TWI439907B (en) 2012-02-29 2014-06-01 Pixart Imaging Inc Optical touch device and detection method thereof
EP2823388B1 (en) 2012-03-09 2019-01-23 FlatFrog Laboratories AB Efficient tomographic processing for touch determination
WO2013133757A2 (en) 2012-03-09 2013-09-12 Flatfrog Laboratories Ab Efficient tomographic processing for touch determination
US20130241887A1 (en) 2012-03-14 2013-09-19 Texas Instruments Incorporated Detecting and Tracking Touch on an Illuminated Surface
US8928590B1 (en) 2012-04-03 2015-01-06 Edge 3 Technologies, Inc. Gesture keyboard method and apparatus
US9448066B2 (en) 2012-04-17 2016-09-20 Massachusetts Institute Of Technology Methods and apparatus for jammable HCI interfaces
US9904457B2 (en) 2012-04-25 2018-02-27 Nokia Technologies Oy Causing display of a three dimensional graphical user interface with dynamic selectability of items
CN102662534A (en) 2012-04-27 2012-09-12 深圳市天时通科技有限公司 Touch display device
US9626018B2 (en) 2012-05-02 2017-04-18 Flatfrog Laboratories Ab Object detection in touch systems
JP5943699B2 (en) 2012-05-11 2016-07-05 スタンレー電気株式会社 Optical touch panel
KR101319543B1 (en) 2012-05-17 2013-10-21 삼성디스플레이 주식회사 Curved display apparatus and multi display apparatus including the same
US10168835B2 (en) 2012-05-23 2019-01-01 Flatfrog Laboratories Ab Spatial resolution in touch displays
US20150242055A1 (en) 2012-05-23 2015-08-27 Flatfrog Laboratories Ab Touch-sensitive apparatus with improved spatial resolution
US9678602B2 (en) 2012-05-23 2017-06-13 Flatfrog Laboratories Ab Touch-sensitive apparatus with improved spatial resolution
US9626040B2 (en) 2012-05-23 2017-04-18 Flatfrog Laboratories Ab Touch-sensitive apparatus with improved spatial resolution
US9524060B2 (en) 2012-07-13 2016-12-20 Rapt Ip Limited Low power operation of an optical touch-sensitive device for detecting multitouch events
US9857916B2 (en) 2012-07-24 2018-01-02 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems using diffusively transmitting element
US9405382B2 (en) 2012-07-24 2016-08-02 Rapt Ip Limited Augmented optical waveguide for use in an optical touch sensitive device
US9886116B2 (en) 2012-07-26 2018-02-06 Apple Inc. Gesture and touch input detection through force sensing
US20140036203A1 (en) 2012-07-31 2014-02-06 Apple Inc. Light mixture for a display utilizing quantum dots
US9317146B1 (en) 2012-08-23 2016-04-19 Rockwell Collins, Inc. Haptic touch feedback displays having double bezel design
US20140063853A1 (en) 2012-08-29 2014-03-06 Flex Lighting Ii, Llc Film-based lightguide including a wrapped stack of input couplers and light emitting device including the same
CN104662496B (en) 2012-09-11 2017-07-07 平蛙实验室股份公司 Touch force estimation in a projection-type touch-sensing device based on FTIR
CN202771401U (en) 2012-09-18 2013-03-06 北京汇冠新技术股份有限公司 Infrared touch screen
US9891759B2 (en) * 2012-09-28 2018-02-13 Apple Inc. Frustrated total internal reflection and capacitive sensing
US20140210770A1 (en) 2012-10-04 2014-07-31 Corning Incorporated Pressure sensing touch systems and methods
US9557846B2 (en) * 2012-10-04 2017-01-31 Corning Incorporated Pressure-sensing touch system utilizing optical and capacitive systems
US9229576B2 (en) 2012-10-09 2016-01-05 Stmicroelectronics Asia Pacific Pte Ltd Apparatus and method for preventing false touches in touch screen systems
CN203224848U (en) 2012-10-11 2013-10-02 华映视讯(吴江)有限公司 Touch control display module
US9207800B1 (en) 2014-09-23 2015-12-08 Neonode Inc. Integrated light guide and touch screen frame and multi-touch determination method
US8694791B1 (en) 2012-10-15 2014-04-08 Google Inc. Transitioning between access states of a computing device
EP2912650B1 (en) 2012-10-25 2018-12-05 LG Electronics Inc. Display device
US20140139467A1 (en) 2012-11-21 2014-05-22 Princeton Optronics Inc. VCSEL Sourced Touch Screen Sensor Systems
WO2014083437A2 (en) * 2012-11-30 2014-06-05 Julien Piot Optical touch tomography
WO2014086084A1 (en) 2012-12-05 2014-06-12 成都吉锐触摸技术股份有限公司 Infrared touch screen
US20140160762A1 (en) 2012-12-07 2014-06-12 GE Lighting Solutions, LLC Diffuser element and lighting device comprised thereof
US20150331545A1 (en) 2012-12-17 2015-11-19 FlatFrog Laboratories AB Laminated optical element for touch-sensing systems
WO2014098742A1 (en) 2012-12-17 2014-06-26 Flatfrog Laboratories Ab Edge-coupled touch-sensitive apparatus
US20150324028A1 (en) 2012-12-17 2015-11-12 Flatfrog Laboratories Ab Optical coupling of light into touch-sensing systems
US9785287B2 (en) 2012-12-17 2017-10-10 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems
WO2014098744A1 (en) 2012-12-20 2014-06-26 Flatfrog Laboratories Ab Improvements in TIR-based optical touch systems of projection-type
WO2014104968A1 (en) 2012-12-27 2014-07-03 Flatfrog Laboratories Ab A touch-sensing apparatus and a method for enabling control of a touch-sensing apparatus by an external device
WO2014104967A1 (en) 2012-12-27 2014-07-03 Flatfrog Laboratories Ab Method and apparatus for detecting visible ambient light
US9223442B2 (en) 2013-01-10 2015-12-29 Samsung Display Co., Ltd. Proximity and touch sensing surface for integration with a display
WO2014112913A1 (en) 2013-01-16 2014-07-24 Flatfrog Laboratories Ab Touch-sensing display panel
CN104956298A (en) 2013-01-30 2015-09-30 福建科创光电有限公司 One glass solution capacitive touch screen and manufacturing method thereof
KR20140101166A (en) 2013-02-08 2014-08-19 엘지전자 주식회사 Display apparatus
US20140237401A1 (en) 2013-02-15 2014-08-21 Flatfrog Laboratories Ab Interpretation of a gesture on a touch sensing device
US20140237408A1 (en) 2013-02-15 2014-08-21 Flatfrog Laboratories Ab Interpretation of pressure based gesture
US9910527B2 (en) 2013-02-15 2018-03-06 Flatfrog Laboratories Ab Interpretation of pressure based gesture
US20140237422A1 (en) 2013-02-15 2014-08-21 Flatfrog Laboratories Ab Interpretation of pressure based gesture
CN203189466U (en) 2013-03-10 2013-09-11 常州市龙春针织机械科技有限公司 Axial locking device
KR102052977B1 (en) 2013-03-11 2019-12-06 삼성전자 주식회사 Multi Input Control Method and System thereof, and Electronic Device supporting the same
US9785259B2 (en) 2013-03-11 2017-10-10 Barnes & Noble College Booksellers, Llc Stylus-based slider functionality for UI control of computing device
KR20140114913A (en) 2013-03-14 2014-09-30 삼성전자주식회사 Apparatus and Method for operating sensors in user device
US9158411B2 (en) 2013-07-12 2015-10-13 Tactual Labs Co. Fast multi-touch post processing
US10055067B2 (en) 2013-03-18 2018-08-21 Sony Corporation Sensor device, input device, and electronic apparatus
EP2966547B1 (en) 2013-04-07 2019-10-16 Guangzhou Shirui Electronics Co., Ltd. All-in-one machine and method and computer memory medium for realizing quick touch in all channels thereof
US20160050746A1 (en) 2013-04-11 2016-02-18 Flatfrog Laboratories Ab Printed Circuit Assembly And A Touch Sensitive System Comprising The Assembly
US10019113B2 (en) 2013-04-11 2018-07-10 Flatfrog Laboratories Ab Tomographic processing for touch detection
WO2014168569A1 (en) 2013-04-11 2014-10-16 Flatfrog Laboratories Ab A coupling arrangement, a panel and a touch sensitive system
US10187520B2 (en) 2013-04-24 2019-01-22 Samsung Electronics Co., Ltd. Terminal device and content displaying method thereof, server and controlling method thereof
WO2014188973A1 (en) 2013-05-21 2014-11-27 シャープ株式会社 Touch panel system and electronic device
CN105283744B (en) 2013-06-05 2018-05-18 EV Group E. Thallner GmbH Measuring device and method for determining a pressure map
US9256290B2 (en) 2013-07-01 2016-02-09 Blackberry Limited Gesture detection using ambient light sensors
TW201502607A (en) 2013-07-04 2015-01-16 Era Optoelectronics Inc Structure for guiding light into a light guide plate to achieve total internal reflection
US9874978B2 (en) 2013-07-12 2018-01-23 Flatfrog Laboratories Ab Partial detect mode
CN203453994U (en) 2013-07-16 2014-02-26 山东共达电声股份有限公司 Light guiding device for implementing light path of optical touch panel and optical touch panel
US20160154532A1 (en) 2013-07-19 2016-06-02 Hewlett-Packard Development Company, L.P. Light guide panel including diffraction gratings
US9366565B2 (en) 2013-08-26 2016-06-14 Flatfrog Laboratories Ab Light out-coupling arrangement and a touch sensitive system comprising the out-coupling arrangement
KR20150026056A (en) 2013-08-30 2015-03-11 삼성전자주식회사 An electronic device with curved bottom and operating method thereof
KR20150026044A (en) 2013-08-30 2015-03-11 엘지디스플레이 주식회사 Optical sheet, backlight unit and display device comprising the same
CN104626057B (en) 2013-11-06 2016-06-01 纬创资通股份有限公司 Auxiliary means for assembling a touch control display apparatus and method of using the same
JP2015095104A (en) 2013-11-12 2015-05-18 シャープ株式会社 Touch panel device
US10152176B2 (en) 2013-11-22 2018-12-11 Flatfrog Laboratories Ab Touch sensitive apparatus with improved spatial resolution
TWI528226B (en) 2014-01-15 2016-04-01 Wistron Corporation Image-based touch apparatus and control method thereof
US10146376B2 (en) 2014-01-16 2018-12-04 Flatfrog Laboratories Ab Light coupling in TIR-based optical touch systems
US20160328090A1 (en) 2014-01-16 2016-11-10 Flatfrog Laboratories Ab OLED display panel
US10126882B2 (en) 2014-01-16 2018-11-13 Flatfrog Laboratories Ab TIR-based optical touch systems of projection-type
US20160342282A1 (en) 2014-01-16 2016-11-24 Flatfrog Laboratories Ab Touch-sensing quantum dot LCD panel
WO2015111890A1 (en) 2014-01-24 2015-07-30 LG Electronics Inc. Display device
JP6276867B2 (en) 2014-02-12 2018-02-07 Apple Inc. Force determination using sheet sensor and capacitive array
US9298284B2 (en) 2014-03-11 2016-03-29 Qualcomm Incorporated System and method for optically-based active stylus input recognition
US20150271481A1 (en) 2014-03-21 2015-09-24 Christie Digital Systems Usa, Inc. System for forming stereoscopic images
US20150286698A1 (en) 2014-04-07 2015-10-08 Microsoft Corporation Reactive digital personal assistant
JP5792348B1 (en) 2014-04-16 2015-10-07 Sharp Corporation Position input device and touch panel
US9552473B2 (en) 2014-05-14 2017-01-24 Microsoft Technology Licensing, Llc Claiming data from a virtual whiteboard
CN105094456A (en) 2014-05-21 2015-11-25 Coretronic Corporation Optical touch device and correction method thereof
US9864470B2 (en) * 2014-05-30 2018-01-09 Flatfrog Laboratories Ab Enhanced interaction touch system
US10867149B2 (en) 2014-06-12 2020-12-15 Verizon Media Inc. User identification through an external device on a per touch basis on touch sensitive devices
KR20150145836A (en) 2014-06-19 2015-12-31 Samsung Display Co., Ltd. Display apparatus and manufacturing method thereof
EP3161594A4 (en) 2014-06-27 2018-01-17 FlatFrog Laboratories AB Detection of surface contamination
WO2016034947A2 (en) 2014-09-02 2016-03-10 Rapt Ip Limited Instrument detection with an optical touch sensitive device
US9626020B2 (en) 2014-09-12 2017-04-18 Microsoft Corporation Handedness detection from touch input
US10338725B2 (en) 2014-09-29 2019-07-02 Microsoft Technology Licensing, Llc Wet ink predictor
US9921685B2 (en) 2014-12-15 2018-03-20 Rapt Ip Limited Tactile effect waveguide surface for optical touch detection
US20160216844A1 (en) 2015-01-28 2016-07-28 Flatfrog Laboratories Ab Arrangement for a touch sensitive apparatus
US11182023B2 (en) 2015-01-28 2021-11-23 Flatfrog Laboratories Ab Dynamic touch quarantine frames
US10318074B2 (en) 2015-01-30 2019-06-11 Flatfrog Laboratories Ab Touch-sensing OLED display with tilted emitters
EP3537269A1 (en) * 2015-02-09 2019-09-11 FlatFrog Laboratories AB Optical touch system
KR102342869B1 (en) 2015-02-26 2021-12-23 Samsung Display Co., Ltd. Flexible display device and method of fabricating the same
KR102394204B1 (en) 2015-03-02 2022-05-09 Wacom Co., Ltd. Active capacitive stylus, sensor controller, system comprising these, and method executed by these
US10401546B2 (en) 2015-03-02 2019-09-03 Flatfrog Laboratories Ab Optical component for light coupling
CN205015574U (en) 2015-10-14 2016-02-03 Shenzhen Lianhesheng Electronics Co., Ltd. Touch-sensitive screen and LCD module laminating tool set
CN105224138B (en) 2015-10-22 2019-04-19 BOE Technology Group Co., Ltd. Hover touch display device
WO2017078684A1 (en) 2015-11-03 2017-05-11 Hewlett-Packard Development Company, L.P. Light guide and touch screen assembly
US10001882B2 (en) 2015-12-02 2018-06-19 Rapt Ip Limited Vibrated waveguide surface for optical touch detection
TWI573546B (en) 2016-02-01 2017-03-11 Wistron Corporation Frame fastening assembly, frame assembly and method of mounting a frame
US20190050074A1 (en) 2016-02-12 2019-02-14 Flatfrog Laboratories Ab Assembly tools for panel and touch-sensing system
CN205384833U (en) 2016-03-06 2016-07-13 Changsha Environmental Protection Vocational College Intelligent tourism electronic photo frame
CN107908353B (en) 2016-09-30 2020-12-18 eGalax_eMPIA Technology Inc. Electronic system, touch processing device and method thereof
KR20180037749A (en) 2016-10-05 2018-04-13 S-Printing Solution Co., Ltd. Display apparatus
US10437391B2 (en) 2016-11-17 2019-10-08 Shenzhen GOODIX Technology Co., Ltd. Optical touch sensing for displays and other applications
WO2018096430A1 (en) 2016-11-24 2018-05-31 Flatfrog Laboratories Ab Automatic optimisation of touch signal
KR102630571B1 (en) 2016-11-29 2024-01-30 LG Display Co., Ltd. Flat panel display embedding optical imaging sensor
KR102344055B1 (en) 2016-12-07 2021-12-28 FlatFrog Laboratories AB Improved touch device
WO2018106172A1 (en) 2016-12-07 2018-06-14 Flatfrog Laboratories Ab Active pen true ID
EP3602258B1 (en) 2017-03-22 2024-05-08 FlatFrog Laboratories AB Pen differentiation for touch displays
EP3602259A4 (en) 2017-03-28 2021-01-20 FlatFrog Laboratories AB Touch sensing apparatus and method for assembly
KR102403009B1 (en) 2017-04-28 2022-05-30 LG Display Co., Ltd. Display device integrated with fingerprint sensor using holographic optical element
KR102331584B1 (en) 2017-06-08 2021-11-30 LG Electronics Inc. Display device
WO2019073300A1 (en) 2017-10-10 2019-04-18 Rapt Ip Limited Thin couplers and reflectors for sensing waveguides
CN107957812B (en) 2017-11-15 2021-06-08 Qisda (Suzhou) Co., Ltd. Touch device and touch device identification method
US11169641B2 (en) 2018-01-23 2021-11-09 Beechrock Limited Compliant stylus interaction with touch sensitive surface
WO2019159012A1 (en) 2018-02-19 2019-08-22 Rapt Ip Limited Unwanted touch management in touch-sensitive devices
US11036338B2 (en) 2018-04-20 2021-06-15 Beechrock Limited Touch object discrimination by characterizing and classifying touch events
US10983611B2 (en) 2018-06-06 2021-04-20 Beechrock Limited Stylus with a control
US11003284B2 (en) 2018-06-12 2021-05-11 Beechrock Limited Touch sensitive device with a camera
US11016600B2 (en) 2018-07-06 2021-05-25 Beechrock Limited Latency reduction in touch sensitive systems
TWI734024B (en) 2018-08-28 2021-07-21 Industrial Technology Research Institute Direction determination system and direction determination method
KR102469722B1 (en) 2018-09-21 2022-11-22 Samsung Electronics Co., Ltd. Display apparatus and control methods thereof
KR102656834B1 (en) 2018-10-17 2024-04-16 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
CN111061391A (en) 2018-10-17 2020-04-24 Huawei Technologies Co., Ltd. Infrared touch frame, infrared touch screen and display device
EP3644167A1 (en) 2018-10-24 2020-04-29 Vestel Elektronik Sanayi ve Ticaret A.S. Electronic devices and methods of operating electronic devices
US11054935B2 (en) 2018-11-19 2021-07-06 Beechrock Limited Stylus with contact sensor
KR102625830B1 (en) 2018-11-27 2024-01-16 Samsung Electronics Co., Ltd. Display apparatus, method for controlling the same and recording media thereof
US10649585B1 (en) 2019-01-08 2020-05-12 Nxp B.V. Electric field sensor
TWI713987B (en) 2019-02-01 2020-12-21 Wistron Corporation Optical touch panel and pressure measurement method thereof
CN209400996U (en) 2019-02-19 2019-09-17 Guangzhou Shiyuan Electronic Technology Co., Ltd. Touch frame and touch display screen
WO2020201831A1 (en) 2019-03-29 2020-10-08 Rapt Ip Limited Unwanted touch management in touch-sensitive devices
US20200341587A1 (en) 2019-04-24 2020-10-29 Rapt Ip Limited Thin interactive display
WO2020225605A1 (en) 2019-05-03 2020-11-12 Rapt Ip Limited Waveguide-based image capture
US20200387237A1 (en) 2019-06-10 2020-12-10 Rapt Ip Limited Instrument with passive tip

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020075243A1 (en) * 2000-06-19 2002-06-20 John Newton Touch panel display system
US20130155027A1 (en) * 2008-06-19 2013-06-20 Neonode Inc. Optical touch screen systems using total internal reflection
US20140320459A1 (en) * 2009-02-15 2014-10-30 Neonode Inc. Optical touch screens
US20120068973A1 (en) * 2009-05-18 2012-03-22 Flatfrog Laboratories Ab Determining The Location Of An Object On A Touch Surface
US20120256882A1 (en) * 2009-12-21 2012-10-11 Flatfrog Laboratories Ab Touch surface with identification of reduced performance
US20150130769A1 (en) * 2012-05-02 2015-05-14 Flatfrog Laboratories Ab Object detection in touch systems

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10474249B2 (en) 2008-12-05 2019-11-12 Flatfrog Laboratories Ab Touch sensing apparatus and method of operating the same
US11182023B2 (en) 2015-01-28 2021-11-23 Flatfrog Laboratories Ab Dynamic touch quarantine frames
US11029783B2 (en) 2015-02-09 2021-06-08 Flatfrog Laboratories Ab Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel
US20180181231A1 (en) * 2015-06-12 2018-06-28 Sharp Kabushiki Kaisha Eraser device and command input system
US10466850B2 (en) * 2015-06-12 2019-11-05 Sharp Kabushiki Kaisha Eraser device and command input system
US10775937B2 (en) 2015-12-09 2020-09-15 Flatfrog Laboratories Ab Stylus identification
US11301089B2 (en) 2015-12-09 2022-04-12 Flatfrog Laboratories Ab Stylus identification
US10761657B2 (en) 2016-11-24 2020-09-01 Flatfrog Laboratories Ab Automatic optimisation of touch signal
US11579731B2 (en) 2016-12-07 2023-02-14 Flatfrog Laboratories Ab Touch device
US10775935B2 (en) 2016-12-07 2020-09-15 Flatfrog Laboratories Ab Touch device
US11281335B2 (en) 2016-12-07 2022-03-22 Flatfrog Laboratories Ab Touch device
US10282035B2 (en) 2016-12-07 2019-05-07 Flatfrog Laboratories Ab Touch device
US11474644B2 (en) 2017-02-06 2022-10-18 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems
US11740741B2 (en) 2017-02-06 2023-08-29 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems
US11099688B2 (en) 2017-03-22 2021-08-24 Flatfrog Laboratories Ab Eraser for touch displays
US11016605B2 (en) 2017-03-22 2021-05-25 Flatfrog Laboratories Ab Pen differentiation for touch displays
US10606414B2 (en) 2017-03-22 2020-03-31 Flatfrog Laboratories Ab Eraser for touch displays
US11281338B2 (en) 2017-03-28 2022-03-22 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US11269460B2 (en) 2017-03-28 2022-03-08 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10845923B2 (en) 2017-03-28 2020-11-24 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10739916B2 (en) 2017-03-28 2020-08-11 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10606416B2 (en) 2017-03-28 2020-03-31 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10437389B2 (en) 2017-03-28 2019-10-08 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US11256371B2 (en) 2017-09-01 2022-02-22 Flatfrog Laboratories Ab Optical component
US11650699B2 (en) 2017-09-01 2023-05-16 Flatfrog Laboratories Ab Optical component
US11567610B2 (en) 2018-03-05 2023-01-31 Flatfrog Laboratories Ab Detection line broadening
US11943563B2 (en) 2019-01-25 2024-03-26 Flatfrog Laboratories Ab Videoconferencing terminal and method of operating the same
US11893189B2 (en) 2020-02-10 2024-02-06 Flatfrog Laboratories Ab Touch-sensing apparatus

Also Published As

Publication number Publication date
US11016605B2 (en) 2021-05-25
US20180275788A1 (en) 2018-09-27
US20200393935A1 (en) 2020-12-17
WO2018174786A1 (en) 2018-09-27
US11099688B2 (en) 2021-08-24
WO2018174787A1 (en) 2018-09-27
US20200150822A1 (en) 2020-05-14
US20180275831A1 (en) 2018-09-27
US10606414B2 (en) 2020-03-31
EP3602257A1 (en) 2020-02-05
EP3602257A4 (en) 2021-01-13
EP3602258A1 (en) 2020-02-05
US10481737B2 (en) 2019-11-19
EP3602258A4 (en) 2021-01-06
WO2018174788A1 (en) 2018-09-27
EP3602258B1 (en) 2024-05-08

Similar Documents

Publication Publication Date Title
US20180275830A1 (en) Object characterisation for touch displays
US11301089B2 (en) Stylus identification
US10474249B2 (en) Touch sensing apparatus and method of operating the same
US11175767B2 (en) Unwanted touch management in touch-sensitive devices
JP5782446B2 (en) Determination of contact data for one or more objects on the contact surface
US8482547B2 (en) Determining the location of one or more objects on a touch surface
US8692807B2 (en) Touch surface with a compensated signal profile
US20120200538A1 (en) Touch surface with two-dimensional compensation
US20090278795A1 (en) Interactive input system and illumination assembly therefor
KR20100072207A (en) Detecting finger orientation on a touch-sensitive device
JP2017509955A (en) Dynamic allocation of possible channels in touch sensors
JP2017507406A (en) Apparatus and method for operating with reduced sensitivity in a touch sensing device
KR102053346B1 (en) Detecting multitouch events in an optical touch-sensitive device using touch event templates
JP2009199427A (en) Position input device, position input method, and position input program
WO2019018992A1 (en) Gesture recognition method, head-wearable device, and gesture recognition apparatus
KR101456834B1 (en) Apparatus and method for interface sensing of touch speed
JPS6167121A (en) Position detecting method on a display screen

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION