GB2484979A - Tracking and identifying physical objects in an interactive surface or vision system - Google Patents
- Publication number
- GB2484979A GB2484979A GB1018323.4A GB201018323A GB2484979A GB 2484979 A GB2484979 A GB 2484979A GB 201018323 A GB201018323 A GB 201018323A GB 2484979 A GB2484979 A GB 2484979A
- Authority
- GB
- United Kingdom
- Prior art keywords
- data
- camera
- application
- infra red
- vision system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Position Input By Displaying (AREA)
- Optical Communication System (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An active marker (fig 1, 2) (or active fiduciary marker) is described for use with a touch display system. The described active marker includes infrared LEDs (19, 22, fig 3) on its underside that may be used to register touch input in a camera-based (4, fig 1) multi-touch display system. The brightness of the LEDs may be altered so as to communicate data (e.g. marker identification) to the camera-based touch display system. A unique rotationally invariant binary sequence may be used to communicate the data. Transceiver hardware within the display system may be used to decode this communication. The active marker may also communicate using radio-wave communication.
Description
Patent Application of CASCOM Ltd for Touchbridge active marker system
of which the following is a specification:
FIELD OF THE INVENTION
The present invention pertains generally to the interface of humans and optical sensing technologies, and more particularly to making tangible interfaces for use with touch-sensitive visual display devices.
BACKGROUND OF THE INVENTION
Large interactive surfaces have proved their usefulness in a range of application scenarios. One of the most compelling of these is the co-located collaborative setting. In this capacity, tabletop surfaces leverage the natural social interaction that occurs when people meet around a table. Moreover, tabletop surfaces have been shown to act as a natural site for interaction with physical objects, engendering their examination, dissemination and manipulation during collaboration. Consequently, a class of tangible user interface (TUI) that exploits the union of physical objects with tabletop interactive surfaces has arisen. These systems commonly make use of a marker system to track the characteristics of physical objects (usually position and orientation) so they may be incorporated into interaction with the digital domain.
Fiduciary markers are the most prominent of these systems. A set of predetermined unique images (markers) are printed upon labels that adhere to the base of physical objects. A vision system is then used to track the position, orientation and identity (ID) of each marker. Although attractive from a cost standpoint, these systems have no provision for communicating the state of an object beyond its position and orientation; thus the hand's ability to make subtle and sophisticated manipulations of a physical object's form is not fully leveraged. The invention allows the creation of an active marker that overcomes this limitation.
SUMMARY OF THE INVENTION
This invention is a novel approach for tracking and identifying physical objects on an interactive surface using modulated IR light. A simple embodiment of the invention can be made using an existing camera-based multi-touch implementation such as that used with traditional printed markers.
The camera is used to identify the marker using light emitted from light emitting diodes (LEDs) on the device. The LEDs are modulated to give the device's identification number using a novel encoding technique, and the camera is also able to find the position and orientation. Additionally, the invention allows additional information about each object to be encoded into the modulated IR light. This information is relayed via a novel higher-bandwidth data link between the application and the active marker device, allowing active markers to communicate freely with the application. Those skilled in the art will note that the communication channel may be made uni- or bi-directional; thus simpler and lower-cost devices can be made by omitting part of the physical circuitry.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described in more detail with reference to the accompanying drawings, in which: Figure 1 shows the components used to make a typical vision-based multi-touch display. The multi-touch application runs on the computer (6), which analyses the images from the camera (4,11) to determine the user's interaction with the display panel (8). One embodiment of a display element might be a liquid crystal display (LCD). The user's finger touches on the screen are normally illuminated using a direct illumination (DI) source (7) behind the display or using a frustrated total internal reflection (FTIR) touch input panel (1). A camera (4,11) can be used to discriminate the touch inputs. The active marker part of the invention (2,15) uses LEDs (19,22) to identify its presence to the camera system. The camera images are analysed on the computer (6) to identify the active marker's (2,15) position coordinates upon the display. The invention (2,15) is controlled using a micro-controller (23) which modulates the LEDs (19,22) to produce a pattern that is unique and contains the invention's identification number (ID). This allows the camera image analysis application (12) to identify the device's ID. The invention may also use two or more LEDs to create a pattern without spatial rotational symmetry. In this way the image processing application (12) may also find the active marker's orientation on the display surface. A fast data transfer link is created between the computer and a transceiver module (5,14) in addition to the camera-based vision system. This data link will typically comprise an IR transmitting LED and an IR receiving device. This pair forms a bi-directional data channel between the computer and the invention. The channel signal-to-noise ratio (SNR) can be enhanced by using pulsed illumination of the FTIR (1) or DI (7) source, as this allows the fast data communication to occur whilst the illumination is inactive.
The pulsed illumination controller (3) connects to the camera (4,11) to only provide illumination whilst the camera's shutter is open.
Figure 2 is a block diagram of the system. The computer (6) and its application (13) obtain data about the position, orientation and device ID through the image processing software (12), which processes images from the camera (4,11). The same images are used to identify the touch inputs and any other marker systems in use. The PC is also connected to a hardware device (8,14) to allow data transmission at a higher rate between the active marker (2,15) and the PC application using the invention's own matched transceiver (10). This allows the device to relay information from peripherals (9,16,24) such as memory storage, accelerometers, visible LEDs (17) and buttons (21) to the application, as well as allowing the application to control these peripherals.
Figure 3 is an example of an active marker device, not showing any enclosure. The device can be made on a printed circuit board (20) which supports the other components (16,17,18,19,21,22,23,24).
To operate as the active marker part of the invention, the device requires a minimum of a power source, battery (18), a micro-controller (23), one or more IR LEDs (19,22) and an IR receiver (24).
The invention would also benefit from buttons (21), visible LEDs (17) and other sensors such as an accelerometer (16), depending upon the desired application. The device is orientated such that the IR LEDs (19,22) are directed at the display (8), so that the camera (4,11) can clearly see the modulation and the IR transceiver (5,14,24) can communicate effectively. The other visible LEDs (17), buttons (21) and sensors (16) would normally face outwards to the user.
While the invention shall now be described with reference to the embodiments shown in the drawings, it should be understood that the invention is not limited to these specific embodiments.
DETAILED DESCRIPTION OF THE DRAWINGS
One possible embodiment of the invention will now be described in detail by means of example using the attached referenced figures. The term active marker is used as a descriptive, non-technical term to describe the embodiment of the invention. FIG 1
Figure 1 depicts a system diagram of how the active marker system may be incorporated into a traditional camera-vision-based multi-touch system. The active marker (2) is the tangible object that the user will use to communicate with the application, in combination with the existing touch screen input formed by either an FTIR screen (1) or a DI (7) source. The device typically uses non-visible IR light in the 800-1000 nm wavelength range, which is conveniently visible to electronic camera (4,11) sensors. The light is produced by IR LEDs (19,22) of the type used for high-speed data transmission; these can be switched on and off in 10 ns, which is sufficiently fast to directly generate the required modulation. A reliable communication channel was made based on the IR carrier technology used for remote control systems such as in TVs. The receiver devices (24, and found in 5,10,14) were chosen from the Vishay family with part number TSOP555t. The higher-bandwidth 455 kHz carrier devices provided the highest data transfer rates required for bulk data transfer, whereas the 36 kHz carrier devices offered greater range and lower-power operation. The choice of micro-controller (23) used is of no real importance as long as it can control the LEDs fast enough to produce the waveforms required for transmission. The devices in both the active marker (2) and the transceiver device (5) are matched to operate at the same IR wavelength, carrier frequency and baud rate. The transceiver device (5,10,14) is connected to the PC via the universal serial bus (USB) to allow a reliable high-speed connection with the application software, as well as providing sufficient power for the transceiver to operate. In many cases, pulsed illumination is used for camera vision systems since this effectively increases the contrast ratio of the images from the camera.
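The carrier-keyed transmission described above can be illustrated with a short sketch. All figures here are illustrative assumptions, not values from this specification: a square-wave carrier, a burst of carrier cycles for a '1' bit and silence for a '0' bit, which is the on-off-keyed scheme that consumer IR receiver modules demodulate.

```python
# Illustrative OOK burst generation for a carrier-based IR link.
# Assumed figures (not from the specification): square-wave carrier
# sampled two samples per cycle, 12 carrier cycles per data bit.

SAMPLES_PER_CYCLE = 2    # one high, one low sample per carrier cycle
CYCLES_PER_BIT = 12      # burst length for each data bit

def ook_waveform(bits):
    """Return a list of 0/1 drive samples for the given bit string:
    '1' -> a burst of carrier cycles, '0' -> an idle gap of equal length."""
    samples = []
    for bit in bits:
        if bit == "1":
            samples += [1, 0] * CYCLES_PER_BIT   # carrier burst
        else:
            samples += [0, 0] * CYCLES_PER_BIT   # silence, same duration
    return samples

wave = ook_waveform("101")
print(len(wave), sum(wave))
```

Because both burst and gap occupy the same number of samples, every bit has a fixed duration, which keeps the receiver's bit clock simple.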
Pulsed illumination is achieved using a controller (3) to synchronise the illumination to be active only during the period when the camera (4,11) shutter is open. The rest of the frame time is then non-illuminated, offering a dark period for more reliable IR data transmission. If the pulsed illumination source is also formed from high-speed LEDs, then this can also be used as part of the transceiver (5,14) to transmit directly to the active marker by modulating it with the waveforms suitable for data transmission, thus eliminating the need for a separate transmitting LED. The pulsed illumination controller (3) takes signals from the camera (4), which must be of a full-frame capture type. The camera output is connected to the PC (6) where it is analysed by the processing application (12). Other applications (13) control the display (8), which is typically a projector or LCD technology. FIG 2
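A rough budget for the dark-period data channel can be sketched as follows. Every figure in this sketch is an assumption for illustration (frame rate, exposure time and cycles-per-bit are not stated in the specification; only the 455 kHz carrier appears above):

```python
# Rough timing budget for IR data bursts during the camera's dark period.
# Assumed figures: 60 fps camera, 2 ms exposure, ~10 carrier cycles per bit.

FRAME_RATE_HZ = 60
EXPOSURE_S = 0.002           # shutter-open (illuminated) time per frame
CARRIER_HZ = 455_000         # the higher-bandwidth carrier mentioned above
CYCLES_PER_BIT = 10          # assumed burst length per data bit

frame_period = 1 / FRAME_RATE_HZ           # ~16.7 ms per frame
dark_period = frame_period - EXPOSURE_S    # time free of illumination
bit_time = CYCLES_PER_BIT / CARRIER_HZ
bits_per_frame = int(dark_period / bit_time)
throughput_bps = bits_per_frame * FRAME_RATE_HZ

print(bits_per_frame, throughput_bps)
```

Under these assumptions the dark period supports several hundred bits per frame, orders of magnitude more than the few bits per frame available through the camera itself.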
Figure 2 is a block diagram of the system implementation, showing the types of data available from the vision-based system (11,12) and the higher-bandwidth data link from the IR data transmission hardware (14,10). The image processing application (12) can be one of the pre-existing touch input applications that convert the images into Cartesian coordinates of the touch inputs on the screen; a popular choice is the Community Core Vision software by the NUI Group. The LEDs on the underside of the active markers (19,22) produce bright spots that are easily visible to the camera (4,11) and are recorded as touch inputs by the image processing application. A second listening program (part of 12) monitors the output of the vision software and looks for patterns in the coordinates that may correspond to an active marker (2,15). This second application looks at the levels of brightness of the suspected inputs and compares these to the previous frame or frames. The micro-controller alters the brightness of the IR LED or LEDs (19,22) to produce a binary brightness-modulated data stream. The data transfer rate is confined to less than a few bits per LED per camera frame, which significantly reduces the available bandwidth to less than 500 baud, typically 50 baud. For this reason, only small amounts of data can be sent via this method. The binary stream can be encoded into a "unique rotationally invariant binary necklace", which allows the correct identification of the device regardless of which bit of the device ID is received first; the active marker would usually repeat this sequence continuously. If there are two or more LEDs, the orientation can also be found.
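The rotationally invariant encoding described above can be sketched as follows. This is an illustrative reconstruction, not the specification's actual encoding: the receiver maps every cyclic rotation of the observed bit sequence to one canonical representative (the "necklace"), so the ID can be read starting from any bit, and IDs with a repeating sub-period are rejected because two devices could otherwise alias onto the same observed stream.

```python
# Illustrative decoding of a rotationally invariant device ID.

def canonical(bits):
    """Lexicographically smallest rotation of the bit string --
    the necklace representative shared by all rotations."""
    rotations = [bits[i:] + bits[:i] for i in range(len(bits))]
    return min(rotations)

def is_usable_id(bits):
    """An ID is usable only if all of its rotations are distinct
    (i.e. the sequence is aperiodic)."""
    rotations = [bits[i:] + bits[:i] for i in range(len(bits))]
    return len(set(rotations)) == len(bits)

# The receiver may start sampling at any point in the repeating stream,
# yet every rotation decodes to the same canonical ID:
assert canonical("0111") == canonical("1110") == "0111"
assert is_usable_id("0111")      # aperiodic: all 4 rotations distinct
assert not is_usable_id("0101")  # period 2: rotations collide
```

With two LEDs the same idea extends to orientation: once the sequence phase is known, the geometric offset between the two bright spots resolves the marker's rotation on the surface.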
The transceiver hardware (5,10) structures its data transfer to use the same device ID as identified by the application; in this way the application can always know which device transmitted the data received, and is also able to control specific devices. The active marker can be connected to a variety of additional peripherals (9,16,17,21) such as sensors, buttons (21), visible LEDs (17) and accelerometers (16). These are used to enrich the application that the active marker was designed for. The PC application (13) can take control of these peripherals and receive data from sensor peripherals whilst simultaneously knowing the location and orientation of the device it is controlling.
The possible applications for this are very diverse, from simple devices such as keyboards and buttons to complex devices such as data storage and interactive toys. FIG 3
Figure 3 is an example drawing of a simple active marker device that can be used to make an interactive tangible object for use on a multi-touch vision-based display surface. The device has two high-speed downward-facing IR LEDs (19,22) to allow orientation data; for example, Vishay's TSMR1000 devices are ideal. The LEDs are located a sufficient distance apart so that the imaging software can easily recognise them as independent inputs (e.g. 40 mm). The device should have an IR receiver (24) capable of receiving carrier-based signals such as those used for remote control, for example Vishay's TSOP2436. A micro-controller (23) is needed to produce the necessary drive waveforms, such as Microchip's PIC16F84A. A power source is required, such as a battery (18), to power the electronics, and an enclosure (not shown) would also be required. In addition to these essential components, an accelerometer (16), such as Analog Devices' ADXL330, would provide useful data for many applications, as would a push button (21). Simple indicators such as visible LEDs (17) would also be useful as feedback to the user when controlled by the application.
ENHANCEMENTS
The high data rate transmission component of the invention lends itself to many enhancements.
One of these would be to replace the OOK (on-off keying) modulation scheme with another scheme.
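One candidate replacement scheme can be sketched as Manchester coding. This is an illustrative choice, not one named in the specification: each data bit becomes a two-chip transition, which keeps the LED duty cycle constant regardless of the data and gives the receiver a clock edge in every bit period.

```python
# Illustrative Manchester coding as an alternative to plain OOK.
# IEEE 802.3 convention assumed: 0 -> high-to-low, 1 -> low-to-high.

def manchester_encode(bits):
    """Expand each data bit into a two-chip transition."""
    table = {"0": "10", "1": "01"}
    return "".join(table[b] for b in bits)

def manchester_decode(chips):
    """Collapse each two-chip transition back into a data bit."""
    table = {"10": "0", "01": "1"}
    return "".join(table[chips[i:i + 2]] for i in range(0, len(chips), 2))

encoded = manchester_encode("1101")
print(encoded)                       # the chip stream driven onto the LED
print(manchester_decode(encoded))    # recovered data bits
```

The cost is halved data rate for the same chip rate, traded against self-clocking and a DC-balanced optical signal.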
A further enhancement would be to use an encryption algorithm on the scheme so that secure transactions could be made.
The physical form of the invention lends itself to many embodiments, and those skilled in the art will recognise that the invention may be integrated as a sub-circuit into a device such as a mobile telephone. In this form the invention provides a convenient mechanism to enable the aforementioned instrument to communicate data containing orientation and device state.
A further addition of a sensor (such as an accelerometer or electronic gyroscope) would enable classification of human interaction. This information can be used to put the device into a state of low power consumption when not being used, as well as providing another parameter for application developers to work with.
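A minimal sketch of such idle classification, assuming a simple variance threshold over a window of accelerometer magnitude samples (the threshold, window and sample values are all illustrative assumptions, not from the specification):

```python
# Illustrative idle detection from accelerometer magnitude samples (in g).
# If recent movement variance stays below a threshold, the marker can
# power down its LEDs and transceiver until motion resumes.

IDLE_THRESHOLD = 0.05   # assumed variance threshold, in g^2

def is_idle(samples, threshold=IDLE_THRESHOLD):
    """True when acceleration magnitude barely varies over the window."""
    mean = sum(samples) / len(samples)
    variance = sum((s - mean) ** 2 for s in samples) / len(samples)
    return variance < threshold

still = [1.00, 1.01, 0.99, 1.00]   # resting on the table (~1 g)
moving = [1.0, 1.8, 0.3, 1.5]      # being handled by a user

print(is_idle(still), is_idle(moving))
```

The same variance signal doubles as the extra interaction parameter mentioned above: an application could, for instance, distinguish a marker placed on the table from one being shaken.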
A further enhancement of the invention would be to integrate a radio onto the circuit; IEEE 802.15.3 or IEEE 802.15.4 compatible transceivers would both be suitable choices. This enhancement would allow the device to communicate data even when out of range of the IR transceiver. It may also be used to facilitate much higher data rate communication than is available over the IR communication link.
BENEFITS OF THE INVENTION
The broad benefits of the invention are to provide application developers with a configurable control and feedback mechanism that users can use to interact with underlying applications. Through this mechanism, rich application interfaces may be built that are more secure, tailored or adapted for specific users, and can be manipulated more naturally and intuitively. Although printed fiduciary markers go some way to achieving this (e.g. reacTIVision by Kaltenbrunner et al.), the invention provides a much richer set of parameters that can be used and extended.
An example application scenario might be that the invention is placed on one part of a display and rotated to control a parameter such as volume, while when rotated on another region of the display it would control tempo. Furthermore, controls on the actual invention could be configured to change global application parameters such as screen resolution or brightness. Another usage scenario would be to have two displays and use the invention to select data from one, store it in the invention's on-board memory and then transfer it to the second display; a copy-paste operation. If the invention were instrumented with a sensor (such as an accelerometer), the whole operation could be done using gesture-driven control without the need for physical buttons.
The benefits of this invention over a radio-frequency-based approach are that the invention allows determination of position and orientation directly on the display without extra sensors, instrumentation or printed symbols, thus lowering device cost. Furthermore, the use of IR transmission consumes significantly less power, and thus battery longevity is also improved.
OTHER EMBODIMENTS
From the foregoing description it will thus be evident that the present invention provides a design for realising a tangible interfacing technology for use with multi-touch displays. As various changes can be made in the above embodiments and operating methods without departing from the scope of the following claims, it is intended that all matter contained in the above description or shown in the accompanying drawings should be interpreted as illustrative and not in a limiting sense.
Variations or modifications to the design and construction of this invention, within the scope of the appended claims, may occur to those skilled in the art upon reviewing the disclosure herein (especially to those using computer-aided design systems). Such variations or modifications are intended to be encompassed within the scope of any claims to patent protection issuing upon this invention.
REFERENCES
Kaltenbrunner, M., Bencina, R. reacTIVision: a computer-vision framework for table-based tangible interaction. In Proc. TEI '07, ACM, 69-74.
Claims (14)
- CLAIMS The embodiments of the invention in which I claim an exclusive property or privilege are defined as follows: What is claimed is: 1. A device that can be simultaneously located and identified by a vision system, with a method of transmitting additional data wirelessly.
- 2. A device as in claim 1 where the method of transmitting additional data from the device to the vision system is by altering its appearance to the vision system in accordance with the data being sent and the method of encoding the data; such as, but not confined to, by temporally changing the brightness of LEDs on the device in a recognisable pattern.
- 3. A device as in claim 2 where the method of encoding the data uses an asymmetric unique repeating binary sequence to facilitate recognition of the data without synchronisation to the start of the sequence.
- 4. A device as in claim 2 where the vision-based system is part of a multi-touch display system such as, but not confined to, a projector or LCD display combined with a vision-based touch screen technology like direct illumination or frustrated total internal reflection.
- 5. A device as in claim 2 where the method of transmitting data is synchronised to the camera aperture using a synchronising signal that the device receives, and in this way can achieve much faster and more reliable data transmission through the vision system.
- 6. A device as in claim 1 where the method of transmitting additional data is using modulated infra red light such as, for illustration only, by using a transmission protocol like those published by the Infrared Data Association.
- 7. A device as in claim 1 where the method of transmission uses a modulated carrier of infra red light such as, but not limited to, by using the hardware designed for infra red remote control systems.
- 8. A device as in claim 2 but with the addition of a second method of data transmission in parallel to the first, using modulated infra red light, such that a second method of transmitting data at a higher rate is present.
- 9. A device as in claim 8 where the second method of data transmission uses an infra red carrier modulated with the data to achieve a greater range of data transmission, to increase reliability of the second data channel.
- 10. A device as in claim 8 but incorporating an additional data transmission channel such as, but not limited to, Bluetooth or other infra red carrier frequencies, for the purpose of achieving multiple data transmission channels and reducing channel congestion when using multiple devices.
- 11. An application using a device to interact with a computer application, where the device is capable of transmitting data to the application using a data transmission channel such that an interaction with, enhancement to, or augmentation of the application is possible.
- 12. As in claim 11 but where the communication channel uses the camera device as described in claim 2.
- 13. As in claim 11 but where the communication channel uses the camera device as described in claim 6.
- 14. As in claim 11 but where the communication channel uses the camera device as described in claim 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1018323.4A GB2484979A (en) | 2010-10-29 | 2010-10-29 | Tracking and identifying physical objects in an interactive surface or vision system |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201018323D0 GB201018323D0 (en) | 2010-12-15 |
GB2484979A true GB2484979A (en) | 2012-05-02 |
Family
ID=43401535
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1018323.4A Withdrawn GB2484979A (en) | 2010-10-29 | 2010-10-29 | Tracking and identifying physical objects in an interactive surface or vision system |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2484979A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10507063B2 (en) | 2014-11-21 | 2019-12-17 | Think Surgical, Inc. | Visible light communication system for transmitting data between visual tracking systems and tracking markers |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6608688B1 (en) * | 1998-04-03 | 2003-08-19 | Image Guided Technologies, Inc. | Wireless optical instrument for position measurement and method of use therefor |
WO2003081911A1 (en) * | 2002-03-22 | 2003-10-02 | British Telecommunications Public Limited Company | Interactive video system |
US20080029316A1 (en) * | 2006-08-07 | 2008-02-07 | Denny Jaeger | Method for detecting position of input devices on a screen using infrared light emission |
EP2107446A1 (en) * | 2008-04-04 | 2009-10-07 | ETH Zurich | System and a method for tracking input devices on LC-displays |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |