NL2004333C2 - Method and apparatus for touchlessly inputting information into a computer system. - Google Patents


Info

Publication number
NL2004333C2
NL2004333C2
Authority
NL
Netherlands
Prior art keywords
pointing
planes
plane
faces
interface object
Prior art date
Application number
NL2004333A
Other languages
Dutch (nl)
Inventor
Ruben Meijer
Original Assignee
Ruben Meijer
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ruben Meijer filed Critical Ruben Meijer
Priority to NL2004333A priority Critical patent/NL2004333C2/en
Application granted granted Critical
Publication of NL2004333C2 publication Critical patent/NL2004333C2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Description

Title: Method and apparatus for touchlessly inputting information into a computer system
Field of the invention
The present invention relates to a method for touchlessly inputting information into a computer system, and to an input apparatus implementing said method.
Background
Many of today’s computer systems are equipped with a graphical user interface (GUI) through which a user can interact with them. A user may be enabled to select items or objects presented to him by the GUI, for example on a video display, by a variety of input means.
Common input means include a trackball type pointing device, e.g. a computer mouse. A user may place his hand on the pointing device so as to move it and to correspondingly displace a position indicator, e.g. a cursor, across the display. The position indicator typically indicates the position of the display that will respond to further input from the pointing device or another input means. Once a cursor has been moved over an item presented on the display, the user may press a button on the pointing device to confirm a selection. Other input means finding increasing implementation are touch screens or touch panels. Relative to the aforementioned pointing devices, they are considered more user friendly and natural. This is primarily because touch screens enable a user to directly interact with any information presented to them, rather than indirectly via a pointing device. In addition, touch screens render separate and/or intermediate pointing devices, such as pointing sticks or pens, superfluous.
As computer systems become more and more ubiquitous, the demand for operating systems through which a computer system may be operated in a natural and intuitive manner, and without special hardware such as pointing devices, increases. A case in point is head-up display systems used in the automotive industry. Such display systems may run a GUI that displays a variety of information, e.g. speedometer and night vision data, on a windshield of an automobile. This allows a user/driver to take notice of the information without having to take his eyes off the road.
Providing an operating system that enables natural interaction between the driver and the GUI, however, presents a challenge. On the one hand, trackball type pointing devices are considered neither user-friendly nor conducive to natural interaction. Touch screens, on the other hand, are not an obvious choice either. This is because a capacitive or resistive touch panel is preferably implemented as an overlay, present immediately in front of a display to be operated. In most cars, windshields tend to extend away from the driver. Though the driver may be able to point at objects presented to him by the GUI, it may typically be cumbersome for him to reach forward to actually touch the windshield.
It is an object of the present invention to overcome or mitigate this problem associated with known operating systems, and more particularly, to provide for an economical input apparatus that enables a user to interact with, and input coordinates into, an associated computer system.
Summary of the invention
One aspect of the present invention is directed to an input apparatus for touchlessly inputting information into a computer system by pointing a pointing member at an interface object of said system. The apparatus includes monitoring means configured to monitor, within a region of interest, a plurality of planes for intersections by a pointing member, said plurality of planes comprising at least a first plane and a second plane. The apparatus further includes a controller that is operably connected to the monitoring means, and that is configured to detect, based on reference signals outputted by the monitoring means, intersections of said planes by the pointing member. Upon (simultaneous) detection of an intersection of both the first and the second plane, the controller is configured to determine first spatial coordinates at which the pointing member intersects the first plane and second spatial coordinates at which the pointing member intersects the second plane. The controller is also configured to determine from said first and second spatial coordinates a position and a pointing direction of the pointing member, and to determine from the determined position and pointing direction of the pointing member, and from supplementary information relating to the location of the interface object, coordinates of a target point on the interface object at which the pointing member points.
The apparatus according to the present invention provides for a way of touchlessly inputting information into a computer system, e.g. a board computer with a head-up display as described above, a portable navigation system or an industrial machine control system, in particular by pointing a pointing member, e.g. a human finger, at an interface object of said system, e.g. a video display. The pointing action of the pointing member is registered with the aid of a plurality of optically or sonically monitored planes, which may typically be located somewhere in front of the interface object, for example immediately adjacent thereto or at a distance therefrom. A first and second plane of the plurality of planes are employed in particular for assessing both a position and a direction of the pointing member as it points at the interface object. On the basis of these parameters, supplemented by information about the location (including information about spatial distribution or shape) of the interface object, the coordinates of the point on the interface object at which the pointing member points (i.e. the target point) may be determined. These latter coordinates may be passed on/communicated to a computer running an application program, for example one that provides a GUI on the interface object, in order to enable it to respond to the user’s input, for example by moving a position indicator to the target point.
It is understood that the input apparatus according to the present invention interprets a pointing gesture, e.g. an extended index finger, in accordance with its normal meaning as it seeks to identify the location that is pointed at. Accordingly, it provides for a natural way of inputting coordinate information, which is especially suitable for use in situations where it is desirable to control a computer system’s user interface from a distance and/or without touching the associated interface object.
The aforementioned planes may be monitored using electromagnetic radiation (EMR) in a part of the electromagnetic spectrum invisible to the human eye, e.g. infrared (IR). Alternatively, monitoring may take place using (ultra)sonic signals. Hence the planes may be fully transparent and permeable, and in fact imperceptible to humans unless indicated by indicating means present in or at the periphery of the planes. Such indicating means may for example include light emitting devices that emit light in a visible part of the electromagnetic spectrum, and within one or more of the planes, such that a pointing member that extends through said plane(s) is partly illuminated at the location of intersection of said plane(s), thereby providing the user with feedback about the location of the plane(s). It is noted that EMR monitoring means configured to monitor a single plane in space are known in the art, for example from the field of electronic (white)boards. In such boards a monitored plane may typically be present immediately adjacent and parallel to a writing surface of the board. Using a number of such single-plane monitoring means in parallel provides for an economical way of monitoring multiple parallel planes as proposed by the present invention. Technical alternatives, such as camera control systems based on stereoscopic imaging, do exist but are typically difficult and relatively costly to implement.
While a first and a second plane may be used in particular to ascertain the coordinates of a target point on the interface object, any number of desired further planes may be used to provide for additional functionality.
The plurality of planes may for example comprise a third plane that is located between, on the one hand, the first and second planes, and on the other hand, the interface object. Intersections of such a third plane may be recorded and communicated to an application program running on a computer connected to the input apparatus. The application program may be capable of executing a range of different actions, and assign to each of these actions, by way of activating command, a specific intersection pattern. A single, brief intersection of the third plane, for example, may be assigned the meaning of a selection-confirmation command, comparable to a single mouse click on conventional home computer systems or a single tap on a touch screen. Alternatively, such a selection-confirmation command may be linked to another intersection pattern, for example including two repetitive intersections within a particular time interval. An application program may in principle assign any intersection pattern of any of the plurality of planes to a specific action it is capable of executing.
Another aspect of the present invention relates to a method for touchlessly inputting information into a computer system by pointing a pointing member at an interface object of said system. The method includes defining in a region of interest a plurality of planes, said plurality of planes comprising at least a first plane and a second plane. The method also includes monitoring said planes for intersections by the pointing member, and upon detecting an intersection of both the first and the second plane, determining first spatial coordinates at which the pointing member intersects the first plane and second spatial coordinates at which the pointing member intersects the second plane. The method further includes determining from said first and second spatial coordinates a position and a pointing direction of the pointing member, and determining from the determined position and pointing direction of the pointing member, and from information relating to the location of the interface object, coordinates of a target point on the interface object at which the pointing member points.
These and other features and advantages of the invention will be more fully understood from the following detailed description of certain embodiments of the invention, taken together with the accompanying drawings, which are meant to illustrate and not to limit the invention.
Brief description of the drawings
Fig. 1 schematically illustrates an exemplary embodiment of a computer system including an input apparatus according to the present invention, and shows inter alia the intersection of two parallel planes by an index finger of a user’s hand; and
Figs. 2A-D schematically illustrate the intersection of the planes by the index finger as shown in Fig. 1, in a perspective front view, a perspective diagonal-front view, a perspective top view and a perspective side view, respectively.
Detailed description
Fig. 1 schematically illustrates a computer system of a general type. It includes an interface object 10, which is operably connected to and controlled by a computer 8. The computer 8, in turn, is operably connected to an input apparatus 1 according to the present invention comprising monitoring means 2, 4, 6 and a controller (not shown). The different components of the computer system will be briefly elucidated in turn.
The interface object 10 may typically include a video display capable of presenting variable information content to a user. The term video display is to be construed broadly, and intends to include any type of device that is suited for the purpose of visual information transfer. However, although the interface object may generally be described as a part of the computer system, it need not be an active or controllable part thereof. In fact, it may be any passive object, as long as information about its location/spatial distribution is known to the controller/computer, so as to allow it to extrapolate the pointing direction of the pointing member towards the object in order to determine the coordinates of the target point on the object. The interface object may, for example, be a blind wall that is to be painted, while the computer system including the input apparatus 1 according to the present invention is a spray gun control system, configured to control the aiming and firing of a spray gun that will paint the wall.
The information content presented to a user via the interface object 10 may be controlled by the computer 8. The computer 8 may include a processor for running an application program that provides for a GUI, which is presented to the user via the interface object 10. User input for the application program may be provided for via the monitoring means 2, 4, 6.
The monitoring means 2, 4, 6 may be configured to monitor a plurality of planes for the presence of a pointing member 14, e.g. an index finger of a user’s hand 12. In Fig. 1, only a first plane V1 and a second plane V2 are shown for clarity. The planes V1, V2 may extend in parallel, but this need not always be the case. Intersections between different planes V1, V2, however, may preferably be avoided. In case the planes V1, V2 do extend in parallel, a perpendicular distance between them is preferably smaller than a length of the pointing member, such that the pointing member may extend through (and thus operate) both planes at the same time. The planes V1, V2 may be disposed immediately adjacent the interface object 10, for example within a frame or bezel surrounding a video display thereof. Alternatively, as in Fig. 1, the planes may be located at a distance from the interface object 10. In either case, information may be inputted by intersecting the planes with a pointing member and without touching the interface object 10.
In one embodiment, the monitoring means 2, 4, 6 may be equipped with an embedded controller adapted to process any acquired monitoring data pertaining to intersections of the planes V1, V2. In general, the controller may be configured to assess which plane is intersected at what time, and at what spatial coordinates. As far as the first V1 and second V2 planes are concerned, the controller may in particular be adapted to process said data to extract therefrom a position and a pointing direction R of the pointing member 14, and to determine, using these parameters, the spatial or interface/display coordinates of a target point 11 on the interface object 10 at which the pointing member 14 points. The processed data, including for example the latter coordinates of the target point, may be outputted to the computer 8. Alternatively, the monitoring means 2 may pass on all or part of the acquired monitoring data to the computer 8 for processing and extraction of the respective parameters. Hence, in this latter case, the computer serves as the controller.
The monitoring means 2, 4, 6 may be implemented in a variety of ways. As in the embodiment depicted in Fig. 1, the monitoring means 2, 4, 6 may for example comprise a planar, rectangular frame 2 laterally bounding a region of interest. Infrared beam generators, including for example infrared light emitting diodes (LEDs) 4, may be disposed on two adjoining inner edges of the frame 2, while infrared beam sensors, such as infrared photosensors 6, may be placed on the opposite inner edges. The LEDs 4 and corresponding photosensors 6 may be arranged such that they create two or more substantially parallel 2D-grids of light beams across the frame 2. In the embodiment of Fig. 1, two such grids are created; they extend in the first plane V1 and second plane V2, respectively. A pointing member 14 that extends through the frame 2 may interrupt one or more of the light beams in each of the planes V1, V2, and cause a decrease in measured light intensity at the respective photosensors 6. Reference signals outputted by the photosensors 6 may be communicated to the aforementioned controller, and be used to determine the spatial coordinates of the positions at which the pointing member 14 intersects the planes V1, V2. From these spatial coordinates, both a position and a pointing direction R of the pointing member 14 may be extracted, as will be discussed in some more detail below.
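By way of illustration only (the patent does not prescribe a particular algorithm), the step from photosensor reference signals to in-plane coordinates may be sketched as follows: with an orthogonal grid of beams, the centroid of the interrupted column and row beams gives a serviceable estimate of where the pointing member crosses a plane. The beam pitch, the function name and the input representation below are assumptions.

    BEAM_PITCH_MM = 5.0  # assumed spacing between adjacent IR beams

    def plane_intersection_xy(blocked_cols, blocked_rows):
        """Estimate the intersection point in one monitored plane as the
        centroid of the interrupted vertical (column) and horizontal (row)
        beams.

        blocked_cols / blocked_rows: indices of beams whose photosensors
        report a light intensity below the occlusion threshold.
        Returns (x, y) in millimetres, or None if the plane is not intersected.
        """
        if not blocked_cols or not blocked_rows:
            return None  # no intersection detected in this plane
        x = sum(blocked_cols) / len(blocked_cols) * BEAM_PITCH_MM
        y = sum(blocked_rows) / len(blocked_rows) * BEAM_PITCH_MM
        return (x, y)

    # Example: a finger blocking column beams 41-43 and row beams 17-18
    print(plane_intersection_xy([41, 42, 43], [17, 18]))  # -> (210.0, 87.5)

Running this per plane yields the pair of intersection points P1, P2 from which the pointing direction is derived, as discussed below.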
In another embodiment of the input apparatus 1, infrared beam generators associated with at least one of the plurality of planes V1, V2 may be configured to generate infrared scanning beams that rotatively scan said at least one plane. The infrared beam sensors may detect any immediately incident light beams and/or light beams reflected off of inner edges of the frame 2 and/or of a pointing member 14, and infer from an absence of expected incident light, a duration of absence and/or a decreased light intensity of incident light whether and where a pointing member 14 is present. Accordingly, a position and a pointing direction of a pointing member 14 extending through the frame 2 may be determined.
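For such a scanning variant, a bearing-based triangulation, a classic technique in optical touch frames, could recover the in-plane position from the angles at which two rotating scanners register the occlusion. The sketch below is an assumption consistent with, but not dictated by, the embodiment described above; scanner placement, angle convention and names are illustrative.

    import math

    def triangulate(frame_width_mm, angle_a_deg, angle_b_deg):
        """Scanner A sits at (0, 0), scanner B at (frame_width_mm, 0).
        angle_a_deg / angle_b_deg: bearings of the occlusion, each measured
        inward from the baseline joining the two scanners (0-90 degrees).
        Returns the (x, y) position of the occluding member in the plane,
        from y = x*tan(a) and y = (W - x)*tan(b).
        """
        ta = math.tan(math.radians(angle_a_deg))
        tb = math.tan(math.radians(angle_b_deg))
        x = frame_width_mm * tb / (ta + tb)
        return (x, x * ta)

    # Example: a 400 mm wide frame; both scanners see the finger at 45
    # degrees, so it sits midway along the baseline: (200.0, 200.0)
    print(triangulate(400.0, 45.0, 45.0))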
One skilled in the art will appreciate that the above implementations of the monitoring means 2, 4, 6 are merely exemplary and susceptible to various modifications. In an alternative embodiment, for example, the optical monitoring means may be replaced with sonic monitoring means. In another alternative embodiment, the infrared based optical components may be replaced with components that operate in a different part of the electromagnetic spectrum.
Furthermore, it is noted that the number of planes to be monitored by the monitoring means 2, 4, 6 may be chosen as desired. Two planes V1, V2 are considered the minimum for retrieving information about the orientation of the pointing member 14. However, further planes may enable the input apparatus 1 to provide more detailed, three-dimensional information about the position and orientation of the pointing member 14. For example, a third plane that is substantially parallel to the first V1 and second V2 planes, and that is located between, on the one hand, the first and second planes V1, V2, and on the other hand, the interface object 10, may be added. The monitoring means 2, 4, 6 may monitor this third plane for intersections by the pointing member 14 and output data regarding any intersections to the computer 8.
An application program running on the computer 8 may interpret an intersection, or a certain sequence of intersections, of the third plane as a special type of user input. A single intersection may for example be given the meaning of a confirmation of a selection of a graphical item presented on the video display. Likewise, two or more intersections within a certain period of time and/or of a certain duration may be coupled to a specific function/action of the application program.
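A hedged sketch of such a pattern-to-action mapping follows; the class, the time window and the callback names are hypothetical, since the patent leaves the assignment of intersection patterns entirely to the application program.

    DOUBLE_WINDOW_S = 0.4  # assumed maximum gap between two intersections

    class ThirdPlaneGestures:
        def __init__(self, on_select, on_double):
            self.on_select = on_select  # action bound to a single intersection
            self.on_double = on_double  # action bound to a double intersection
            self._last_t = None

        def intersection(self, t, target):
            """Called by the controller each time the third plane is
            intersected, with timestamp t (seconds) and the current
            target-point coordinates. A first intersection triggers the
            single-intersection action; a second one within the window
            triggers the double-intersection action."""
            if self._last_t is not None and t - self._last_t <= DOUBLE_WINDOW_S:
                self._last_t = None
                self.on_double(target)
            else:
                self._last_t = t
                self.on_select(target)

    g = ThirdPlaneGestures(lambda p: print("select", p),
                           lambda p: print("open", p))
    g.intersection(0.00, (120, 80))  # select (120, 80)
    g.intersection(0.25, (120, 80))  # open (120, 80) -- second tap in window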
Figs. 2A-D schematically illustrate the intersection of the planes V1, V2 by the index finger 14, as shown in Fig. 1. Fig. 2A is a perspective front view of the situation depicted in Fig. 1, seen from the point of view of the user; Fig. 2B is a perspective diagonal-front view; Fig. 2C is a perspective top view, and Fig. 2D is a perspective side view.
From Figs. 2A-D, it is clear that the index finger 14 intersects both planes V1, V2 at different spatial positions P1, P2. The spatial coordinates of both P1 and P2 may be recorded relative to any suitable coordinate system.
To this end, Figs. 2A-D illustrate a three-dimensional Cartesian XYZ coordinate system, in which the planes V1 and V2 extend substantially perpendicularly to a Z-axis. Accordingly, any intersection of the first plane V1 occurs at a first Z-coordinate, e.g. Z1, while any intersection of the second plane V2 occurs at a second Z-coordinate, e.g. Z2. Furthermore, the planes V1 and V2 are oriented parallel to the interface object 10, which may be located at its own Z-coordinate, e.g. Z_interface. Intersection point P1 may thus be accorded spatial coordinates (X1, Y1, Z1), while intersection point P2 may be accorded spatial coordinates (X2, Y2, Z2). Any of the points P1, P2 may lend its spatial coordinates to the position of the pointing member 14, while the pointing direction R may be defined as the vectorial difference between the spatial coordinates of P1 and P2, e.g. as (X2-X1, Y2-Y1, Z2-Z1). From the position of the pointing member 14 and the pointing direction R, the spatial coordinates of the target point 11 on the interface object, (X’, Y’, Z_interface), may be determined by solving for the point at which a P1- or P2-based pointing line 16 (cf. Fig. 1), extending in the direction of vector R, intersects a Z-plane with coordinate Z_interface. Thus determined spatial coordinates of the target point 11 may be mapped onto corresponding video display coordinates, so as to allow a GUI of an application program run by computer 8 to provide for appropriate feedback at the target point. Such feedback may, for example, include moving a position indicator or cursor to the target point.
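The extrapolation just described reduces to a one-parameter line-plane intersection. The following sketch (with illustrative names, and assuming, as in Figs. 2A-D, that the interface object lies in a plane parallel to V1 and V2) computes the target point from P1, P2 and Z_interface.

    def target_point(p1, p2, z_interface):
        """p1, p2: (x, y, z) intersections with the first and second plane.
        Returns (x', y', z_interface), the target point on the interface
        object, which is assumed parallel to the monitored planes."""
        (x1, y1, z1), (x2, y2, z2) = p1, p2
        # Pointing direction R = P2 - P1; parameter t where the pointing
        # line reaches the interface plane: z1 + t * (z2 - z1) = z_interface
        t = (z_interface - z1) / (z2 - z1)
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1), z_interface)

    # Example: P1 = (205, 95, 60), P2 = (210, 90, 40), with Z measured toward
    # the user and the display plane at z = 0
    print(target_point((205.0, 95.0, 60.0), (210.0, 90.0, 40.0), 0.0))
    # -> (220.0, 80.0, 0.0)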
Above, various implementations of a computer system in conjunction with which the input apparatus, or the corresponding method, according to the present invention can be used have already been mentioned in passing. Indeed, particularly advantageous implementations are foreseen in the automotive industry, including the aforementioned head-up displays. It is understood, however, that the input apparatus may in principle be combined with any computer system, including wireless terminals such as PDAs, in which it may be used to mimic touch screen functionality, or audio and/or television systems, in which the input device may serve as a true remote control. Another example underlining the versatility of the input apparatus is the following. Imagine a conference room featuring a U-shaped table setup for participants, wherein a video display is provided at the head (i.e. at the open end of the U-configuration of tables). First and second, typically vertically extending parallel planes may then be provided along and over the entire U-shaped table configuration. This allows a participant, independent of the position of his seat, to extend his arm or hand across his table and towards the video display so as to point out some piece of information being shown thereon. By doing so, he would intersect the first and second parallel planes. From the points of intersection, the input apparatus according to the invention can derive a position and a direction of the pointing arm/hand, and determine coordinates of the target point on the video display that the participant points at. An associated computer running a presentation application program may then provide for feedback on the video display, for example in the form of a position indicator or a highlight effect, indicating to the other participants the aforementioned piece of information.
Although illustrative embodiments of the present invention have been described above, in part with reference to the accompanying drawings, it is to be understood that the invention is not limited to these embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, it is noted that particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner to form new, not explicitly described embodiments.
List of elements

1 coordinate inputting apparatus
2 frame with LEDs and photosensors
4 LEDs
6 photosensors
8 computer
10 interface object / video display
11 target point
12 human hand
14 extended index finger / pointing member
16 line-of-pointing
V1 first plane
V2 second plane
P1 intersection point of pointing member with first plane
P2 intersection point of pointing member with second plane
R pointing direction of pointing member
X1, Y1 XY-coordinates relating to first plane
X2, Y2 XY-coordinates relating to second plane
X’, Y’ XY-coordinates relating to interface object

Claims (13)

1. An input apparatus (1) for touchlessly inputting information into a computer system (8, 10) by pointing a pointing member (14) at an interface object (10) of said system, the input apparatus comprising:
- monitoring means (2, 4, 6) configured to monitor, within a region of interest, a plurality of planes for intersections by the pointing member (14), said plurality of planes comprising at least a first plane (V1) and a second plane (V2);
- a controller operably connected to the monitoring means, and configured to detect, based on reference signals outputted by the monitoring means, intersections of said planes by the pointing member (14), and, upon detection of an intersection of both the first (V1) and the second (V2) plane, to:
  - determine first spatial coordinates at which the pointing member (14) intersects the first plane (V1) and second spatial coordinates at which the pointing member intersects the second plane (V2);
  - determine, from said first and second spatial coordinates, a position and a pointing direction (R) of the pointing member (14); and
  - determine, from the determined position and pointing direction (R) of the pointing member (14), and from information relating to the location of the interface object (10), coordinates of a target point (11) on the interface object at which the pointing member (14) points.

2. The input apparatus according to claim 1, wherein said plurality of planes comprises a third plane located between, on the one hand, the first and second planes (V1, V2), and, on the other hand, the interface object (10).

3. The input apparatus according to any of claims 1-2, wherein the planes of said plurality of planes extend mutually in parallel, and wherein a perpendicular distance between two outermost planes of said plurality of planes is smaller than a length of the pointing member (14), such that the pointing member can intersect all planes simultaneously.

4. The input apparatus according to any of claims 1-3, wherein at least one of the planes (V1, V2) of said plurality of planes is monitored for intersections by the pointing member (14) by means of electromagnetic radiation.

5. The input apparatus according to claim 4, wherein the monitoring means (2, 4, 6) comprise, for the at least one plane (V1, V2) that is monitored by means of electromagnetic radiation:
- a plurality of infrared beam generators (4), disposed at the periphery of said at least one plane, the infrared beam generators being configured to generate infrared beams extending in said at least one plane; and
- a plurality of infrared beam sensors (6), disposed at the periphery of said at least one plane, each of the infrared beam sensors being configured to generate a reference signal in dependence on a detected infrared beam generated by said infrared beam generators.

6. The input apparatus according to claim 5, wherein the infrared beam generators are configured to generate infrared scanning beams that rotatively scan the at least one plane.

7. A computer system, comprising:
- an input apparatus (1) according to any of claims 1-6;
- an interface object (10) comprising a video display capable of presenting variable information; and
- a computer (8) operably connected to the controller of the input apparatus (1) for receiving therefrom coordinates of the target point and/or other information relating to intersections between one or more of the planes and the pointing member (14),
wherein said computer (8) is configured to run an application program capable of executing different actions, said application program having a graphical user interface for presentation on said video display, and wherein said application program has assigned a specific intersection pattern to one or more of said actions, such that, in use, the application program will execute the respective action when the associated intersection pattern is detected by and received from the controller.

8. The computer system according to claim 7, wherein said interface object (10) comprises an automobile windshield with a head-up video display.

9. A method for touchlessly inputting information into a computer system by pointing a pointing member (14) at an interface object (10) of said system, comprising:
- defining, within a region of interest, a plurality of planes, said plurality of planes comprising a first plane (V1) and a second plane (V2);
- monitoring said planes for intersections by the pointing member (14);
- upon detection of an intersection of both the first (V1) and the second plane (V2), determining first spatial coordinates at which the pointing member (14) intersects the first plane (V1) and second spatial coordinates at which the pointing member intersects the second plane (V2);
- determining, from said first and second spatial coordinates, a position and a pointing direction (R) of the pointing member (14); and
- determining, from the determined position and pointing direction (R) of the pointing member (14), and from information relating to the location of the interface object (10), coordinates of a target point (11) on the interface object at which the pointing member (14) points.

10. The method according to claim 9, wherein said plurality of planes comprises a third plane located between, on the one hand, the first and second planes (V1, V2), and, on the other hand, the interface object (10).

11. The method according to any of claims 9-10, wherein the planes (V1, V2) of said plurality of planes extend mutually in parallel, and wherein a perpendicular distance between two outermost planes of said plurality of planes is smaller than a length of the pointing member (14), such that the pointing member can intersect all planes simultaneously.

12. The method according to any of claims 9-11, wherein at least one of the planes (V1, V2) of said plurality of planes is monitored for intersections by the pointing member (14) by means of electromagnetic radiation.

13. The method according to any of claims 9-12, further comprising:
- communicating the coordinates of the target point (11) and/or other information relating to intersections of one or more of the planes (V1, V2) to a computer system running an application program capable of executing different actions; and
- assigning a specific intersection pattern to one or more of said actions, such that the application program will execute the respective action when the associated pattern is detected and communicated.
NL2004333A 2010-03-03 2010-03-03 Method and apparatus for touchlessly inputting information into a computer system. NL2004333C2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
NL2004333A NL2004333C2 (en) 2010-03-03 2010-03-03 Method and apparatus for touchlessly inputting information into a computer system.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NL2004333 2010-03-03
NL2004333A NL2004333C2 (en) 2010-03-03 2010-03-03 Method and apparatus for touchlessly inputting information into a computer system.

Publications (1)

Publication Number Publication Date
NL2004333C2 true NL2004333C2 (en) 2011-09-06

Family

ID=42830199

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2004333A NL2004333C2 (en) 2010-03-03 2010-03-03 Method and apparatus for touchlessly inputting information into a computer system.

Country Status (1)

Country Link
NL (1) NL2004333C2 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020140633A1 (en) * 2000-02-03 2002-10-03 Canesta, Inc. Method and system to present immersion virtual simulations using three-dimensional measurement
US20080030460A1 (en) * 2000-07-24 2008-02-07 Gesturetek, Inc. Video-based image control system
DE20112715U1 (en) * 2001-08-01 2001-10-11 Irschitz Oliver Information entry facility
US20030063115A1 (en) * 2001-09-10 2003-04-03 Namco Ltd. Image generation method, program, and information storage medium
DE102007038359A1 (en) * 2007-08-10 2009-02-12 Visumotion Gmbh User interface for determining spatial position of section of object, has two consecutively arranged position detectors, for detection of two dimensional positions, where electronic circuit is provided, which receives output signals

Similar Documents

Publication Publication Date Title
US10831281B2 (en) Systems and methods of free-space gestural interaction
US9001087B2 (en) Light-based proximity detection system and user interface
KR102335132B1 (en) Multi-modal gesture based interactive system and method using one single sensing system
CA2481396C (en) Gesture recognition method and touch system incorporating the same
US20110298708A1 (en) Virtual Touch Interface
US20110032215A1 (en) Interactive input system and components therefor
KR100974894B1 (en) 3d space touch apparatus using multi-infrared camera
CA2801563A1 (en) Interactive input system and method
US20130257734A1 (en) Use of a sensor to enable touch and type modes for hands of a user via a keyboard
CN103329079A (en) Camera-based multi-touch interaction and illumination system and method
CN102341814A (en) Gesture recognition method and interactive input system employing same
CN102741781A (en) Sensor methods and systems for position detection
US10261653B2 (en) Method and device for making available a user interface, in particular in a vehicle
KR101675228B1 (en) 3d touchscreen device, touchscreen device and method for comtrolling the same and display apparatus
KR101809678B1 (en) Touchscreen device and method for controlling the same and display apparatus
US11635804B2 (en) Systems and/or methods incorporating electrical tomography related algorithms and circuits
US9201519B2 (en) Three-dimensional pointing using one camera and three aligned lights
KR20120136719A (en) The method of pointing and controlling objects on screen at long range using 3d positions of eyes and hands
KR20150112198A (en) multi-user recognition multi-touch interface apparatus and method using depth-camera
KR20120072502A (en) Infrared light touch screen display device and method for determining touch point of the same
NL2004333C2 (en) Method and apparatus for touchlessly inputting information into a computer system.
Michel et al. Building a Multi-Touch Display Based on Computer Vision Techniques.
EP2315106A2 (en) Method and system for detecting control commands
US20160274672A1 (en) Input system
Ahn et al. A slim hybrid multi-touch tabletop interface with a high definition LED display and multiple cameras

Legal Events

Date Code Title Description
V1 Lapsed because of non-payment of the annual fee

Effective date: 20131001