GB2212354A - Apparatus for sensing the direction of view of an eye - Google Patents

Apparatus for sensing the direction of view of an eye

Info

Publication number
GB2212354A
GB2212354A GB8825825A
Authority
GB
United Kingdom
Prior art keywords
eye
view
camera
pupil
light source
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB8825825A
Other versions
GB8825825D0 (en)
Inventor
Terence Vernon Edwards
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BAE Systems Electronics Ltd
Original Assignee
GEC Marconi Ltd
Marconi Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GEC Marconi Ltd, Marconi Co Ltd filed Critical GEC Marconi Ltd
Publication of GB8825825D0
Publication of GB2212354A
Legal status: Withdrawn (current)

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The apparatus uses a video camera to view the pupil (3) of the eye (11) and track the position of a bright spot (1) in the pupil (3) being the reflection of a light source in the field of view of the eye (11). The apparatus further includes circuitry responsive to the camera output to develop an output indicating the direction of view of the eye. The circuitry may perform digital filtering operations on the digital data sequences (XR, YR) representing the coordinates of spot (1) position relative to the pupil (3) to produce smoothed values (PR, QR) of the coordinates using algorithms of a form specified in the application.

Description

Apparatus for sensing the direction of view of an eye

This invention relates to apparatus for sensing the direction of view of an eye.
In many electronic equipments communication between the equipment and a user of the equipment is effected by way of a visual display unit (VDU) and means, such as a keyboard, mouse, joystick or light pen, manually operated by the user to select one of several options displayed by the VDU. An alternative to use of such manually operated means is to sense the direction of view of an eye of the user as he views the selected option on the VDU.
It is an object of the present invention to provide an apparatus for sensing the direction of view of an eye which is suitable for use in such an application.
According to the present invention there is provided an apparatus for sensing the direction of view of an eye with respect to a viewed area comprising: a light source fixed with respect to said area and operable to direct light for reflection at said eye; a video camera fixed with respect to said area and arranged to view said eye, thereby to produce an output signal representing said eye and the reflection of said light source in said eye; and circuit means responsive to the position of said reflection relative to the pupil of said eye, as represented by the output signal of said camera, to produce an output representing the direction of view of said eye with respect to said area.
One apparatus in accordance with the present invention for sensing the direction of view of an eye will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 illustrates schematically the effect on a scene viewed by a video camera of the apparatus of movement of the eye to view the bottom left hand corner of a VDU screen;
Figures 2 and 3 each illustrate schematically the effect on the scene viewed by the camera of small movement of the head of the user, whilst looking at one point of the VDU screen;
Figure 4 shows a line scan of the video camera and subsequent derivations from this line scan;
Figures 5 to 11 illustrate schematically the co-ordinates of various points in the scene viewed by the camera and the hardware for the evaluation of these co-ordinates; and
Figure 12 is a graph showing the transmittance characteristic of an optical filter of the apparatus.
The apparatus provides an output which indicates the direction of view of an eye of a user of a VDU. The apparatus comprises a point light source, positioned with respect to the VDU screen as explained below, a video camera which views the user's eye, more particularly the reflection of the light source in the pupil of the user's eye, and circuitry which utilises the output signal of the camera to produce the desired output.
The VDU is associated with an electronic equipment (not shown), e.g. a computer, and is arranged to display, inter alia, at different positions on the screen options in respect of operation of the equipment. The electronic equipment includes means for utilising the output of the apparatus to cause the equipment to operate in the mode corresponding to the option viewed by the user's eye.
The light source and camera may be placed alongside, e.g. on top of, the VDU screen. However, in an ideal configuration a minute video camera, e.g. a video camera of pinhead size, is placed in the centre of the VDU screen and a point source of light is positioned also in the VDU screen close to the camera, in order to avoid parallax errors introduced when the user's face moves across the field of view of the camera.
In order to prevent the video camera viewing light reflected from the VDU screen, an infra-red transmitting Schott black glass filter is employed covering both the light source and the camera lens.
The optical characteristics of the eye are preserved, i.e. the sharp boundary between the pupil and the iris remains distinguishable, when light beyond the visible spectrum is used.
A partition at a wavelength of 800 nm is ideal for the following reasons:
(a) the upper limit of the visible spectrum is approximately 750 nm;
(b) the sensors in the video camera can cover a spectrum which peaks at 800 nm; and
(c) a Schott black glass filter has attractive characteristics at 800 nm.
Figure 12 indicates by full lines 50, 51 the transmittance characteristics of two suitable Schott black glass filters.
The spectral properties of Schott black glass filters are uniform over their entire aperture and insensitive to the angle of incidence. A Schott black glass filter of 3 mm thickness guarantees adequate short wavelength blockage for sharp cut-off.
Desirably, an infra-red semi-conductor light source is used.
Another advantage occurs besides elimination of parallax errors when a minute camera is used. The large depth of focus of a minute camera can accommodate the normal forwards/backwards movement of the user's head so that defocussing of the image will be minimal even with a fixed focus lens in the camera.
Operation of the apparatus will now be described with reference to Figures 1 to 12 taking the light source to be at the centre of the VDU screen.
Referring to Figure 1, the operation utilises the opto-physiological observation that when the eye of the user is looking directly at the centre 7 of the screen 9 of the VDU 5, the reflection of the light source (not shown) appears to the video camera (not shown) as a white dot 1 at the centre of the pupil 3 of the user's eye, as shown at A in Figure 1. When the eye glances across the screen 9, for example, to view the left hand bottom corner 8 of the screen 9, the white spot 1 moves across the pupil 3, as shown at B in Figure 1. There is therefore a relationship between the position of the white spot 1 in pupil 3 and the direction in which the user's eye is looking.
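The relationship between spot position and gaze direction can be sketched in software. In this hypothetical Python sketch (the application describes only the existence of the mapping, not its form), the glint offset from the pupil centre is mapped to a screen point through assumed linear calibration gains:

```python
def gaze_point(spot_xy, pupil_centre_xy, gain_x, gain_y, screen_centre_xy):
    """Map the glint offset (spot minus pupil centre) to a screen point.

    The linear gains are hypothetical calibration constants: the patent
    states only that spot-in-pupil co-ordinates map into the screen
    co-ordinates of the point the user is viewing."""
    dx = spot_xy[0] - pupil_centre_xy[0]
    dy = spot_xy[1] - pupil_centre_xy[1]
    return (screen_centre_xy[0] + gain_x * dx,
            screen_centre_xy[1] + gain_y * dy)

# Spot centred in the pupil (case A of Figure 1) => looking at screen centre.
print(gaze_point((128, 128), (128, 128), 4.0, 4.0, (320, 240)))  # (320.0, 240.0)
```

In practice the gains would be found by a calibration step, e.g. asking the user to fixate known screen corners.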
Referring to Figures 2 and 3, the size of the field of view 15 of the video camera is chosen so that relatively small upward, downward or sideways movement of the user's head 13 does not take the pupil 3 of the user's eye 11 outside the field of view 15. As shown in Figure 2, provided the user 13 continues to look directly at the centre 7 of the screen 9, the white spot 1 will remain in the centre of the pupil 3, and movement of the head 13 will merely move the pupil 3 about the field of view 15.
Similarly, as shown in Figure 3, provided the user 13 continues to look directly at the top right hand corner 12 of the screen 9, the white spot 1 will remain at the same point on the periphery of the pupil 3, and movement of the head 13 will merely move the pupil 3 about the field of view 15.
The circuitry required to process the video camera output to provide a usable output representative of the user's direction of view will now be considered.
The circuitry detects the bright white spot 1 which is the reflection of the light source in the dark circular pupil 3 of the eye by the use of voltage level detectors set respectively to detect high and low levels in the video camera signal.
Referring to Figure 4, the camera operates in the raster scan mode, and each line of the camera output signal 20 is quantized by the voltage level detectors (not shown) to give 'hard-clipped' waveforms 21, 23 with pulses 1A and 3A corresponding to the white spot 1 and dark pupil 3 parts of the user's eye respectively.
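The action of the two voltage level detectors can be modelled in software as simple thresholding of the sampled line signal. The following Python sketch is illustrative only (the application describes analogue hardware); the sample values and thresholds are assumptions:

```python
def hard_clip(line_samples, white_level, black_level):
    """Quantize one scan line into 'white spot' and 'pupil' pulse trains,
    mimicking the high and low voltage level detectors.  Thresholds are
    hypothetical; the hardware uses analogue comparators."""
    white = [1 if s >= white_level else 0 for s in line_samples]
    black = [1 if s <= black_level else 0 for s in line_samples]
    return white, black

# A line crossing grey skin, dark pupil, the bright glint, pupil again, skin.
line = [120, 125, 30, 28, 240, 245, 29, 118]
w, b = hard_clip(line, white_level=200, black_level=50)
print(w)  # [0, 0, 0, 0, 1, 1, 0, 0]  -> white spot pulse 1A
print(b)  # [0, 0, 1, 1, 0, 0, 1, 0]  -> pupil pulses 3A
```

The two resulting pulse trains correspond to waveforms 21 and 23 of Figure 4.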
The important information to be extracted from these waveforms 21, 23 is the relative position of pupil 3 and white spot 1, i.e. the co-ordinate differences between the position of the white spot 1 and the position of the centre of the pupil 3. These co-ordinates map directly into the co-ordinates of the point on the VDU screen 9 where the user 13 is looking.
Appropriate co-ordinate values can be obtained from the line scan numbers for the vertical axis and the pixel-clock count for the horizontal axis.
Referring to Figures 5 and 6, two eight bit counters 25, 27, each having an appropriate clock (not shown), started at the beginning of the video frame and line respectively and terminated when the bright spot 1 pulse occurs, give the vertical (YB) and horizontal (XB) co-ordinates of the white spot 1 directly.
More specifically, referring to Figure 5, in order to obtain the vertical co-ordinate YB of the white spot 1, the counter 25 is:
(a) reset to zero at the beginning of the video frame;
(b) incremented with the line sync pulses, limited to a maximum count of 255;
(c) stopped counting with the occurrence of the white spot 1 pulse; and
(d) interrogated at the end of the video frame cycle.
The count of the counter 25 is a representation of the vertical co-ordinate YB of the white spot 1.
Referring to Figure 6, in order to obtain the horizontal co-ordinate XB of the white spot 1, the counter 27 is:
(a) incremented with a 4 MHz oscillator;
(b) reset at the beginning of each video line;
(c) inhibited, together with the incrementing oscillator, at the occurrence of the white spot 1 pulse; and
(d) interrogated at the end of the video frame cycle.
The count of the counter 27 is representative of the horizontal co-ordinate XB of the white spot 1.
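The two counters together amount to latching the line number and pixel-clock count at the first white-spot pulse of the frame. A software stand-in (the frame representation as nested lists of 0/1 pulse samples is an assumption; the 255 saturation follows the text):

```python
def spot_coordinates(frame):
    """Emulate counters 25 and 27.  `frame` is a list of scan lines, each a
    list of 0/1 white-spot pulse samples.  YB latches the line count and XB
    the pixel-clock count at the first white-spot pulse; both saturate at
    255 if no pulse occurs in the frame."""
    for y, line in enumerate(frame):
        for x, pulse in enumerate(line):
            if pulse:
                return min(y, 255), min(x, 255)
    return 255, 255

frame = [[0] * 8,
         [0] * 8,
         [0, 0, 0, 1, 1, 0, 0, 0],   # spot pulse starts at pixel 3, line 2
         [0] * 8]
print(spot_coordinates(frame))  # (2, 3)
```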
Referring to Figure 7, in order to obtain the co-ordinates Xc, Yc of the centre of the pupil 3, the bounding rectangle 29 of the pupil 3 is detected.
It is sufficient to determine the Y co-ordinates of sides A and C of the rectangle 29 and the X co-ordinates of sides D and B of the rectangle 29. Suppose that these co-ordinates are Y1, Y2 and X1, X2 respectively; then the co-ordinates of the centre of the pupil 3 are: Xc = ½(X1 + X2) and Yc = ½(Y1 + Y2).
Referring to Figures 8 and 9, the values of Y1 and Y2 are obtained from the eight bit line counters 31, 33 which, together with appropriate clocks (not shown), detect the scan lines in which appropriate black levels occur.
Referring to Figure 8, in order to obtain Y1, the counter 31 is:
(a) reset to zero at the beginning of the video frame;
(b) incremented with the line sync pulses, limited to a maximum count of 255;
(c) stopped counting with the occurrence of the first black element pulse; and
(d) interrogated at the end of the video frame cycle.
The count of the counter 31 is a representation of the co-ordinate Y1 of the top of the pupil.
Referring to Figure 9, in order to obtain Y2, the counter 33 is:
(a) reset to zero at the beginning of the video frame;
(b) incremented with a line sync pulse if a black element occurs on that line; and
(c) interrogated at the end of the video frame cycle.
The count of the counter 33 is a representation of the co-ordinate difference F between top and bottom of the pupil. The co-ordinate Y2 of the bottom of the pupil is therefore given by Y2 = Y1 + F.
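Counters 31 and 33 can likewise be emulated in software: Y1 is the first scan line containing a black (pupil) element, F counts the lines containing one, and Y2 = Y1 + F as in the text. A hypothetical Python sketch (the frame representation is assumed, as before):

```python
def pupil_y_extent(frame):
    """Emulate counters 31 and 33.  `frame` is a list of scan lines, each a
    list of 0/1 pupil (black) pulse samples.  Counter 31 latches the first
    line containing a black element (Y1, saturating at 255 if none occurs);
    counter 33 counts the lines containing one (F); Y2 = Y1 + F."""
    y1, f = 255, 0
    for y, line in enumerate(frame):
        if any(line):
            if y < y1:
                y1 = y      # first black line seen: top of pupil
            f += 1          # one more line containing the pupil
    return y1, y1 + f

frame = [[0] * 8,
         [0, 0, 1, 1, 0, 0, 0, 0],
         [0, 1, 1, 1, 1, 0, 0, 0],
         [0, 0, 1, 1, 0, 0, 0, 0],
         [0] * 8]
print(pupil_y_extent(frame))  # (1, 4)
```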
Referring to Figures 10 and 11, the determination of the co-ordinates X1 and X2 requires the use of two binary magnitude comparator circuits 39, 41, two auxiliary registers 43, 45 and two counters 47, 49.
Referring to Figure 10, in order to obtain the co-ordinate X1:
(a) the register 43 is set to its highest value (255 decimal);
(b) the counter 47 is incremented with a 4 MHz oscillator 38;
(c) the counter 47 is reset at the beginning of each video line;
(d) the value in the counter 47 is transferred to the register 43 when a black pulse occurs if the comparator 39 determines that the counter 47 value is less than that already in the register 43; and
(e) the register 43 is interrogated at the frame end for the value of X1.
Referring to Figure 11, in order to obtain the co-ordinate X2:
(a) the register 45 is set to its lowest value (zero);
(b) the counter 49 is incremented with a 4 MHz oscillator 40;
(c) the counter 49 is reset at the beginning of each video line;
(d) the value in the counter 49 is transferred to the register 45 when a black pulse occurs if the comparator 41 determines that the counter 49 value is greater than that already in the register 45; and
(e) the register 45 is interrogated at the frame end for the value of X2.
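The register/comparator arrangement of Figures 10 and 11 is, in effect, a per-frame minimum/maximum tracker over the pixel positions of the black pulses. A software equivalent (the frame representation is an assumption; the initial register values follow the text):

```python
def pupil_x_extent(frame):
    """Emulate registers 43/45 with comparators 39/41.  Per line, the
    pixel-clock count at each black (pupil) pulse replaces the value in
    register 43 if smaller, and in register 45 if larger; the registers
    are interrogated at frame end for X1 and X2."""
    reg43, reg45 = 255, 0           # highest / lowest initial values
    for line in frame:              # frame: lists of 0/1 pupil pulse samples
        for x, black in enumerate(line):
            if black:
                if x < reg43:
                    reg43 = x       # leftmost black element so far -> X1
                if x > reg45:
                    reg45 = x       # rightmost black element so far -> X2
    return reg43, reg45

frame = [[0] * 8,
         [0, 0, 1, 1, 1, 0, 0, 0],
         [0, 1, 1, 1, 1, 1, 0, 0],
         [0, 0, 1, 1, 1, 0, 0, 0]]
print(pupil_x_extent(frame))  # (1, 5)
```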
Thus, as lines are scanned from line Y1 onwards (Figures 8 and 9), the register 43 will count down to the left hand side of the pupil circle 3 and the register 45 will count up to the right hand side of the pupil circle.
The required co-ordinate values Xc, Yc are given by: Xc = ½(value in register 43 + value in register 45) and Yc = (value in counter 31) + ½(value in counter 33). These values Xc, Yc are in the form of binary numbers with the same scale factor as the binary numbers XB, YB of the co-ordinates of the white spot 1. The differences XB - Xc and YB - Yc therefore give the horizontal and vertical co-ordinate values of the white spot 1 relative to the centre of the pupil 3, in binary form with the same scale factor.
These binary numbers XB - Xc, YB - Yc, representing the line of sight co-ordinates, will be heavily contaminated with noise and therefore unsuitable for accurate control of moving images.
It is therefore necessary to carry out smoothing filtering operations on the sequences of these numbers XB - Xc, YB - Yc produced in operation of the apparatus.
The type of filter with the required frequency selection and stability is a digital filter referred to as Second Order Linear Recursive. This type of filter exhibits a great degree of accuracy even when digital arithmetic of restricted word length is used.
The filtering operations effected are characterised by the algorithms: PR = 5/4PR-1 - 3/8PR-2 + 1/16(XR-1 + XR-2) where XR is the Rth number in the sequence of binary numbers XB - Xc, and PR is the corresponding smoothed output of the filter; and QR = 5/4QR-1 - 3/8QR-2 + 1/16(YR-1 + YR-2) where YR is the Rth number in the sequence of binary numbers YB - YC, and QR is the corresponding smoothed output of the filter.
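A direct software rendering of the recursion, with zero initial conditions assumed (the application implements it in machine code; this Python sketch is only for illustration):

```python
def smooth(sequence):
    """Second order linear recursive filter from the application:
    P_R = 5/4*P_{R-1} - 3/8*P_{R-2} + 1/16*(X_{R-1} + X_{R-2}),
    with zero initial conditions (an assumption; not stated in the text)."""
    p1 = p2 = 0.0          # P_{R-1}, P_{R-2}
    x1 = x2 = 0.0          # X_{R-1}, X_{R-2}
    out = []
    for x in sequence:
        p = 1.25 * p1 - 0.375 * p2 + 0.0625 * (x1 + x2)
        out.append(p)
        p2, p1 = p1, p
        x2, x1 = x1, x
    return out

# First few outputs for a unit-step input sequence.
print(smooth([1, 1, 1]))   # [0.0, 0.0625, 0.203125]
```

Note that all three coefficients (5/4, 3/8, 1/16) are exact binary fractions, which is why the filter remains accurate with restricted word length fixed-point arithmetic.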
The response to a step input (XR = 1 for R >= 0; XR = 0 for R < 0) is found by the inverse Z transform.
Applying the residue theorem to solve the integral gives: a simple pole at z = 1 with a residue of 1; a simple pole at z = 1/2 with a residue of (3/4)(1/2)^N; and a simple pole at z = 3/4 with a residue of -(7/4)(3/4)^N. The solution is therefore PN = 1 + (3/4)(1/2)^N - (7/4)(3/4)^N.
If 8-bit words define the whole field of view seen by the video camera, then the pupil-spot displacement can be represented by a 3-bit word only. Therefore 8-bit arithmetic is adequate.
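The closed-form step response PN = 1 + (3/4)(1/2)^N - (7/4)(3/4)^N can be checked numerically against the filter recurrence; this sketch (my own verification, not part of the application) confirms the two agree:

```python
def p_closed(n):
    """Closed-form step response: P_N = 1 + (3/4)(1/2)^N - (7/4)(3/4)^N."""
    return 1 + 0.75 * 0.5 ** n - 1.75 * 0.75 ** n

# Iterate the recurrence P_R = 5/4*P_{R-1} - 3/8*P_{R-2} + 1/16*(X_{R-1}+X_{R-2})
# for a unit step X_R = 1 (R >= 0) and compare term by term.
p1 = p2 = 0.0
x1 = x2 = 0.0
for n in range(20):
    p = 1.25 * p1 - 0.375 * p2 + 0.0625 * (x1 + x2)
    assert abs(p - p_closed(n)) < 1e-12, n
    p2, p1 = p1, p
    x2, x1 = x1, 1.0   # unit step input
print("closed form matches recurrence")
```

Note P0 = 1 + 3/4 - 7/4 = 0, as expected from the one-sample delay in the filter input terms, and PN tends to 1, confirming unity DC gain.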
The digital filtering is easily implemented in machine code since the coefficients are simple binary values and the input numbers are already in binary form. The output will be in binary form with the same scale factor as the input.
The filtering operations are equivalent to those effected by a second order analogue filter, critically damped and with no overshoot in response to a step function input. Thus, fixed point arithmetic can be used and no rescaling is necessary. Filter processing can therefore be rapid, of the order of one iteration in 100 microseconds and well within 1 millisecond, for both horizontal and vertical co-ordinate update.
The smoothed values are available for transfer to the control computer with every frame cycle.
Whilst the invention has been described above, by way of example, in connection with selecting a mode of operation of an electronic equipment, it will be appreciated that the invention is not restricted to such application, but finds application wherever it is desirable to sense the direction of view of an eye.

Claims (8)

1. An apparatus for sensing the direction of view of an eye with respect to a viewed area comprising: a light source fixed with respect to said area and operable to direct light for reflection at said eye; a video camera fixed with respect to said area and arranged to view said eye, thereby to produce an output signal representing said eye and the reflection of said light source in said eye; and circuit means responsive to the position of said reflection relative to the pupil of said eye, as represented by the output signal of said camera, to produce an output representing the direction of view of said eye.
2. An apparatus according to Claim 1 wherein said circuit means comprises: means responsive to the output signal of said camera to produce a sequence of signals, each of which signals comprises a pair of signals respectively representing numbers X and Y, the numbers X and Y respectively indicating the position of said reflection along the X and Y axes of an XY co-ordinate axis system; and means for processing said sequence of signals to reduce contamination by noise.
3. An apparatus according to Claim 2 wherein said processing means effects a first second order linear recursive digital filtering operation characterised by the algorithm: PR = 5/4PR-1 - 3/8PR-2 + 1/16(XR-1 + XR-2), where XR is the Rth value of X, and PR is the corresponding output of said first filtering operation; and a second second order linear recursive digital filtering operation characterised by the algorithm: QR = 5/4QR-1 - 3/8QR-2 + 1/16(YR-1 + YR-2), where YR is the Rth value of Y and QR is the corresponding output of said second filtering operation.
4. An apparatus according to Claim 1 or Claim 2 or Claim 3 wherein said viewed area comprises the screen of a visual display unit.
5. An apparatus according to Claim 4 wherein the visual display unit is associated with an electronic equipment and is arranged to display a plurality of options in respect of the mode of operation of the equipment and said equipment includes means for utilising the output of said apparatus to cause said equipment to operate in the mode corresponding to the option viewed by said eye.
6. An apparatus according to Claim 4 or Claim 5 wherein said video camera is positioned substantially at the centre of said screen; and said light source is positioned adjacent said camera.
7. An apparatus according to any one of the preceding claims wherein said light source emits and said camera is sensitive to infra-red radiation.
8. An apparatus for sensing the direction of view of an eye with respect to a viewed area substantially as hereinbefore described with reference to Figures 1 to 12 of the accompanying drawings.
GB8825825A 1987-11-09 1988-11-04 Apparatus for sensing the direction of view of an eye Withdrawn GB2212354A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB878726182A GB8726182D0 (en) 1987-11-09 1987-11-09 Eye position sensing

Publications (2)

Publication Number Publication Date
GB8825825D0 GB8825825D0 (en) 1988-12-07
GB2212354A true GB2212354A (en) 1989-07-19

Family

ID=10626650

Family Applications (2)

Application Number Title Priority Date Filing Date
GB878726182A Pending GB8726182D0 (en) 1987-11-09 1987-11-09 Eye position sensing
GB8825825A Withdrawn GB2212354A (en) 1987-11-09 1988-11-04 Apparatus for sensing the direction of view of an eye

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB878726182A Pending GB8726182D0 (en) 1987-11-09 1987-11-09 Eye position sensing

Country Status (1)

Country Link
GB (2) GB8726182D0 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1300108A1 (en) * 2001-10-01 2003-04-09 Ernst Univ. Prof. Dipl.-Ing. Dr. Pfleger Method for obtaining, evaluating and analyzing sequences of vision

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1175945A (en) * 1967-08-23 1970-01-01 Honeywell Inc Improvements in or relating to Optical Systems
EP0055338A1 (en) * 1980-12-31 1982-07-07 International Business Machines Corporation Eye controlled user-machine communication

Also Published As

Publication number Publication date
GB8726182D0 (en) 1987-12-16
GB8825825D0 (en) 1988-12-07

Similar Documents

Publication Publication Date Title
EP1200955B1 (en) Computer presentation system and method with optical tracking of wireless pointer
KR100452413B1 (en) Method and apparatus for calibrating a computer-generated projected image
US6454419B2 (en) Indicated position detection by multiple resolution image analysis
US7557935B2 (en) Optical coordinate input device comprising few elements
US6512838B1 (en) Methods for enhancing performance and data acquired from three-dimensional image systems
JP4132061B2 (en) Array sensor pointing input system and method
US7984995B2 (en) Method and apparatus for inhibiting a subject's eyes from being exposed to projected light
US7492357B2 (en) Apparatus and method for detecting a pointer relative to a touch surface
US20020136455A1 (en) System and method for robust foreground and background image data separation for location of objects in front of a controllable display within a camera view
US20030226968A1 (en) Apparatus and method for inputting data
JP2006522967A (en) Automatic alignment touch system and method
US20140168164A1 (en) Multi-dimensional touch input vector system for sensing objects on a touch panel
US6731330B2 (en) Method for robust determination of visible points of a controllable display within a camera view
JP2001142642A (en) Device for inputting coordinates
US8259063B2 (en) Input method of pointer input system
EP1356423B1 (en) System and method for extracting a point of interest of an object in front of a computer controllable display captured by an imaging device
US6231185B1 (en) Process and device for detecting a reflecting surface of a human being
US5583538A (en) Image display apparatus
GB2212354A (en) Apparatus for sensing the direction of view of an eye
US7170490B2 (en) Method and apparatus for locating a pointing element within a digital image
US20240069647A1 (en) Detecting method, detecting device, and recording medium
JP3080041U (en) Input device
JP4125162B2 (en) Coordinate input device
Renyan High Precision CCD Calibration and its Applications
IL171978A (en) Optical coordinate input device comprising few elements

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)