GB2281838A - Input for a virtual reality system


Info

Publication number
GB2281838A
GB2281838A
Authority
GB
United Kingdom
Prior art keywords
display
eyes
user
image
signal
Prior art date
Legal status
Withdrawn
Application number
GB9415530A
Other versions
GB9415530D0 (en)
Inventor
Michihiro Kaneko
Current Assignee
Pioneer Corp
Original Assignee
Pioneer Electronic Corp
Priority date
Filing date
Publication date
Application filed by Pioneer Electronic Corp filed Critical Pioneer Electronic Corp
Publication of GB9415530D0
Publication of GB2281838A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A detector is provided for detecting closing of eyelids of a user and for producing a condition signal dependent on the closing. An instruction signal is produced in response to the condition signal. In accordance with the instruction signal, an image on a display is changed.

Description

TITLE OF THE INVENTION

INPUT SYSTEM FOR A VIRTUAL REALITY SYSTEM

BACKGROUND OF THE INVENTION

The present invention relates to an input system for a virtual reality system, and more particularly to an input system for a head mounted display.
There has been proposed a virtual reality system which creates a stereoscopic computer graphics environment and shows it on a display. A user operates an input device, thereby obtaining an effect as if he were actually entering and walking in the imaginary computer world and arbitrarily moving the objects existing therein. With such a system, the user can simulate the assembly of various parts into an engine or quasi-experience the feeling of becoming Halley's comet, for example.
A typical input device for the virtual reality system comprises means for electromagnetically or optically detecting the movements of each joint of the body of the user, or the movements of the whole body, and converting them into parameters. More particularly, a coil for measuring the position of the user is attached to the head. When the user moves in an electromagnetic field generated by an exciting coil, a current is induced and flows through the coil on the head. A three-dimensional calculation is carried out using the strength of the current as a parameter so that the position of the user is detected.
In an optical input system, a user wears a pair of gloves to which optical fibers are attached. As the user bends his fingers, the transmission rate of light through the fibers changes in accordance with the bending angles. The same principle is applied to form a body suit which detects the whole movement of the wearer.
A head mounted display (HMD) worn over the user's head is a well known display device for a virtual reality system. The HMD has a construction similar to a pair of binoculars, fitted with a liquid crystal stereo display. The stereoscopic display is adapted to independently show an image to the right eye and the left eye, thereby providing a three-dimensional image.
Referring to Fig. 9, an example of a conventional virtual reality system has a glove 8 as an input device, provided with a plurality of optical fibers disposed at various portions of the glove. The glove produces electric signals in accordance with the position of the hand and the bending of each finger detected by the optical fibers. The electric signals are applied to a condition detector 9 which converts them into parameter signals fed to a coordinates calculator 10. The coordinates calculator 10 calculates the three-dimensional position of the wearer's hand and the bending of each finger and converts them into a predetermined coordinate format. A display controller 3 creates a stereo image in accordance with the coordinate format, synthesizing image data showing an imaginary hand at the corresponding position in the virtual world. The image data are fed to a display 4, thereby showing an image.
The image is, for example, an object to be manipulated and an image of the user's hand manipulating the object.
Namely, the conventional input device for the virtual reality system is a device attached to the human body. Hence the user wears a glove such as the one described above to create a virtual hand and grab objects in the virtual world, or wears a body suit to walk through it.
In an input device such as the glove, a large number of parameters must be detected by the sensors, thereby rendering the virtual reality system large and decreasing its mobility. Moreover, the accuracy of the sensors of such a device is limited, so that the device cannot be used for an operation requiring accurate pointing at a certain point on the display, such as picking one of the menu bars shown on the display.
In some virtual reality systems, the participant need not exist as a virtual image in the graphic image, depending on the purpose thereof. For example, where a plurality of menu bars indicating options are shown on the display, the user only needs to apply some kind of signal to select one of the bars. Namely, the participant needs only to input an on or off signal as a parameter, or two-dimensional coordinates as position data.
A conventional means for applying such a signal is a mouse, where a button is manually depressed. However, in the virtual reality system the hands are usually engaged in other operations, so it would be convenient if there were other means for applying the signals, particularly in the HMD, where the inner space is small.
SUMMARY OF THE INVENTION

An object of the present invention is to provide an input system which enables the user to operate a virtual reality system without using his hands.
According to the present invention, there is provided an input system for a virtual reality system comprising a display; detector means for detecting conditions of the eyes of a user and for producing a condition signal dependent on a detected condition; determining means for determining an instruction according to the condition signal and for producing an instruction signal dependent on a determined instruction; and display controller means responsive to the instruction signal for producing a display control signal for changing an image on the display.
In an aspect of the invention, the display is a head mounted display, and the detector means is a detector to detect closing of eyes of the user.
The other objects and features of this invention will become understood from the following description with reference to the accompanying drawings.
BRIEF DESCRIPTION OF DRAWINGS

Fig. 1 is a block diagram of an HMD provided with an input system of the present invention;
Fig. 2 is an illustration explaining the operation of the input system of Fig. 1;
Fig. 3 is a block diagram of an HMD having an input system of a second embodiment of the present invention;
Fig. 4 is an illustration explaining the relationship between lines of sight and coordinates of a target;
Fig. 5 is an illustration showing an example of a screen shown on the HMD of Fig. 3;
Fig. 6 is a block diagram showing an input system of a third embodiment of the present invention;
Figs. 7a and 7b show examples of the screen explaining the operation of the input system of Fig. 6;
Fig. 8 is an illustration explaining the operation of an input system of a fourth embodiment of the present invention; and
Fig. 9 is a block diagram of an HMD having a conventional input device.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring to Fig. 1, an HMD according to the present invention has a blink detector 1 which monitors the user's eyelids to produce a blink signal when the eyelids are intentionally closed in predetermined manners. The blink detector 1 comprises a sensor such as is used in various eyeball movement detecting methods and detects the reflectance of the eyes when the eyes are opened. Alternatively, the detector may be a sensor attached to the face for detecting the movement of the facial muscles which control the eyelids. Various devices may be used in practice, provided that the device produces electric signals which differ depending on whether the eyes are opened or closed.
The blink signal is fed to an instruction determining section 2 having a device for detecting the lapse of time, such as a clock, to measure the time during which the eyes are closed. When the lapsed time, or the number of closing actions of the eyelids within a predetermined time unit, shows that a predetermined blinking is intentionally performed by the user, an instruction signal is fed to a display controller 3.
More particularly, the input system of the present invention stores a plurality of predetermined eye operation protocols, each corresponding to an instruction given by the user. For example, closing the right eye for more than 0.5 seconds indicates a signal instructing to advance, namely to show the next screen. Closing the left eye for more than 0.5 seconds means a signal to reverse the screen. When both eyes are closed for more than 3 seconds, the current program is ended. Namely, when a sensor (not shown) detects that only one of the eyes is closed for more than 0.5 seconds, the determining means 2 determines which of the eyes is closed, and applies the instruction signal corresponding to the protocol to the display controller 3. Physiological blinking, which occurs when the eyes dry, must not be confused with intentional blinking, so the instruction signals must correspond to eyelid movements which do not occur unless the user performs them intentionally.
The signals may be a series of short blinks made within a predetermined time instead of a long blink. The blink detector 1 may be constructed to detect operations of only one of the eyes. The instruction signal may be designated as "1" for the advance signal, "2" for the reverse signal, and so on.
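By way of illustration, the decision logic of the instruction determining section 2 can be sketched as follows. This is a minimal sketch, not part of the disclosed apparatus: the thresholds are the 0.5 second and 3 second values of the example protocol above, and the measured closure durations are assumed to be supplied by the blink detector 1.

```python
ADVANCE, REVERSE, END = "advance", "reverse", "end"

def classify_blink(right_closed_s, left_closed_s):
    """Map eyelid-closure durations (seconds) to an instruction signal
    under the example protocol: right eye > 0.5 s advances the screen,
    left eye > 0.5 s reverses it, both eyes > 3 s end the program.
    Returns None for short, physiological blinks."""
    if right_closed_s > 3.0 and left_closed_s > 3.0:
        return END
    if right_closed_s > 0.5 and left_closed_s <= 0.5:
        return ADVANCE
    if left_closed_s > 0.5 and right_closed_s <= 0.5:
        return REVERSE
    return None  # unintentional blinking produces no instruction
```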
The display controller 3 retrieves one or more screens stored in a screen memory (not shown) in accordance with the instruction signal from the instruction determining section 2 and shows them on the head mounted display 4. The display 4 is the same as that provided in a conventional HMD, such as a liquid crystal display adapted to show three-dimensional graphics.
When an advance signal is applied as the instruction signal, the screen is replaced by the next screen, and when a reverse signal is applied, the preceding screen is shown. With an end signal, the program is terminated.
The operation of the present invention is described hereinafter with reference to Fig. 2.
Suppose the display 4 is showing a screen A. When the user closes his right eye for more than 0.5 seconds to advance the screen, the blink detector 1 applies the blink signal to the instruction determining section 2. The instruction determining section 2 determines which eye is blinked and produces the instruction signal corresponding to the advance instruction. When the instruction signal is fed, the display controller 3 retrieves from the screen memory a screen B following the screen A and shows it on the display 4. If the user keeps blinking his right eye, the screens are replaced one by one until a screen n is shown. When the left eye is closed, the current screen is replaced by the preceding one. The program is ended at any of the screens when both eyes are closed for more than 3 seconds.
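Continuing the sketch above, the screen replacement performed by the display controller 3 amounts to stepping through an ordered list of screens. The class below is only an illustrative model of this behaviour, with the screen memory represented as a simple Python list.

```python
class ScreenNavigator:
    """Illustrative model of the Fig. 2 walk-through: screens A, B,
    ..., n are replaced one by one by advance and reverse signals."""

    def __init__(self, screens):
        self.screens = screens      # stands in for the screen memory
        self.index = 0              # currently shown screen
        self.running = True

    def apply(self, instruction):
        if instruction == ADVANCE and self.index < len(self.screens) - 1:
            self.index += 1         # show the next screen
        elif instruction == REVERSE and self.index > 0:
            self.index -= 1         # show the preceding screen
        elif instruction == END:
            self.running = False    # terminate the program
        return self.screens[self.index]
```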
Thus, in accordance with the present invention, the user can easily give a simple instruction, without using any part of the body other than the eyes, while watching the display of the HMD.
The protocol of the instructions may be arbitrarily determined, as with ordinary switches. Moreover, a means may be provided so that the user may determine the protocol at his choice.
Fig. 3 shows a second embodiment, where the input parameter is the lines of sight of the user.
As shown in Fig. 3, the HMD has a line of sight detector 5 which constantly monitors the movements of the eyes to generate an electric line of sight signal. Line of sight detection is also called eyeball movement detection. One method of detecting the line of sight is the cornea/sclera reflection method, where an infrared LED and a photodiode are provided to detect the reflectances of light on the cornea (iris) and on the sclera (the white of the eye).
Another is the electro-oculogram method, where the difference in potential between the cornea and the retina of the eyeball is detected. A third method is the cornea reflection method, where a camera is provided to pick up a virtual image of a spot light applied to the eyeball. Since the center of curvature of the cornea does not coincide with the rotational center of the eyeball, the virtual image formed on the cornea moves with a certain relationship as the eyeball moves.
In each of the methods, a reference value, such as the value obtained when the eyes are directed straight ahead, is experimentally measured beforehand, and the difference between the reference value and the measured value is calculated. That is, a displacement of the eyeball corresponds to a movement of the eye target on a coordinate plane. Any of the methods may be employed in the present embodiment as long as the device therefor is small enough to be disposed within the HMD.
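The reference-value subtraction common to the three methods can be expressed compactly. The sketch below is an assumption-laden illustration: it posits a sensor returning a raw two-component reading and hypothetical linear gains converting sensor units into pixels.

```python
def displacement_to_pixels(raw, reference, gain=(1.0, 1.0)):
    """Subtract the reference reading, measured beforehand while the
    user looks straight ahead, from the current raw reading, and scale
    the resulting eyeball displacement into a movement of the eye
    target on the coordinate plane. The gains are calibration
    constants determined experimentally."""
    dx = (raw[0] - reference[0]) * gain[0]
    dy = (raw[1] - reference[1]) * gain[1]
    return dx, dy
```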
A line of sight signal, which represents the displacement of the eyeball, is applied to a coordinate calculator 6 which calculates the coordinates of the eye target in the virtual reality world, that is, the point on which the eyes are focused. The coordinates are applied to the display controller 3, which superimposes an icon, such as a hand or an arrow, as a position marker on the graphical image shown on the display 4.
Describing the operation with reference to Fig. 4, the lines of sight of the right eye and the left eye are detected by the line of sight detector 5. It can be assumed that the eye target on which the eyes are focused is a point A where the lines of sight of both eyes cross. Hence the coordinates of the eye target can be obtained by calculating the position of the intersection of the lines of sight. However, the image shown on the display in the present embodiment is two-dimensional, so that the eye target is on a fixed plane. The coordinates can hence be easily obtained by calculating the intersection of the line of sight from one of the eyes with the fixed plane. Thus the calculation is simplified.
Namely, in a simple two-dimensional coordinate system shown on a display comprising 640 by 400 pixels, the coordinates x, y of the point A are calculated by the coordinate calculator 6, for example as 290 and 105, respectively. The coordinates x and y are fed to the display controller 3, which then superimposes the icon on the corresponding position Di of the display 4 in Fig. 5. The icon indicating the target smoothly follows the movement of the eyes from the point Di to a point Di', as shown by a dotted line.
In other words, the icon is always positioned at the point which the user is watching.
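The simplified calculation can be illustrated geometrically: with the eye at the origin and the display on a fixed plane, the eye target is where the line of sight pierces that plane. The following sketch assumes a unit gaze direction vector and a hypothetical pixel scale; neither appears in the specification.

```python
def gaze_to_pixel(direction, plane_distance, px_per_unit,
                  screen_w=640, screen_h=400):
    """Intersect one eye's line of sight with the fixed display plane
    z = plane_distance and convert the hit point to pixel coordinates
    on a 640 x 400 screen, with (0, 0) at the top left."""
    dx, dy, dz = direction                 # gaze direction from the eye
    t = plane_distance / dz                # ray parameter at the plane
    x = screen_w / 2 + t * dx * px_per_unit
    y = screen_h / 2 - t * dy * px_per_unit  # screen y grows downward
    return int(x), int(y)
```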
Thus the input operation, which is done by hand in the conventional system, coincides with the natural movement of the user in the present embodiment, thereby enabling the system to be controlled with the eyes. For example, in a case where a file having a large area, such as a whole page of a newspaper or a map, is to be displayed, only a part of the file is shown at a sufficient resolution. The coordinates of the eye target are applied to the display as a parameter. When the target reaches an edge of the screen, the screen is automatically scrolled to show another part of the file in the direction toward which the eyes point. Hence the screen can be scrolled in any direction without using the hands.
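Edge-triggered scrolling of this kind reduces to comparing the eye-target coordinates with the screen borders. In the sketch below, the margin and step sizes are illustrative values, not part of the disclosure.

```python
def scroll_if_at_edge(x, y, view_x, view_y,
                      screen_w=640, screen_h=400, margin=10, step=8):
    """Shift the visible window over a large file (a newspaper page,
    a map) when the eye target reaches an edge of the screen, in the
    direction toward which the eyes point."""
    if x <= margin:
        view_x -= step                 # eyes at left edge: scroll left
    elif x >= screen_w - margin:
        view_x += step                 # right edge: scroll right
    if y <= margin:
        view_y -= step                 # top edge: scroll up
    elif y >= screen_h - margin:
        view_y += step                 # bottom edge: scroll down
    return view_x, view_y
```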
Thus, in accordance with the second embodiment, the eye target serves as a pointing device, so that a device responsive to the unintentional movements of the eyes can be constructed.
The third embodiment of the present invention, shown in Fig. 6, is a combination of the first and second embodiments, intended to be applied to a virtual reality system where a three-dimensional image is shown on the display as shown in Figs. 7a and 7b.
The same references in Fig. 6 as those in Figs. 1 and 3 designate the same parts.
Referring to Fig. 6, the line of sight detector 5 detects the lines of sight of the right and the left eyes and converts them into the line of sight signal in a predetermined format as in the second embodiment.
The line of sight signal is applied to the coordinate calculator 6 which calculates the coordinates of the eye target where the lines from both eyes converge.
The coordinates are applied to the display controller 3.
The HMD of the present embodiment is further provided with an image generator 7 for creating a virtual reality image. The virtual reality image may show an imaginary world of three-dimensional computer graphics, or may be a reproduced picture of the real world taken by a stereoscopic camera. The image may be two-dimensional if the display 4 is capable of showing a two-dimensional image.
Image data from the image generator 7 are fed to the display controller 3. The display controller 3 further produces image data of menu bars which are to be shown on the display for controlling the virtual reality system. The menu bars are stored in a memory (not shown) provided in the display controller 3 and read out on demand. The menu bars may be shown at a fixed position of the display, or at an appropriate position with regard to the three-dimensional image if they are allotted two-dimensional standard coordinates.
The display controller 3 operates the display 4 to show the three-dimensional image and the menu bars on the screen. Further, an icon such as an arrow indicating the eye target determined at the coordinate calculator 6 is combined with the image.
The line of sight detector 5 of the present embodiment further detects the closing of the eyelids.
The line of sight signal is further fed to the instruction determining section 2, which generates the instruction signal in accordance with a predetermined protocol of eye movements for giving instructions. The instruction signal is fed to the display controller 3, which accordingly controls the display 4.
In operation, as shown in Fig. 7a, the display 4 has 640 by 400 pixels, the coordinates at the top left being (0,0) and the coordinates at the bottom right being (639,399). A stereoscopic image D3d generated by the image generator 7 is shown on the entire screen, and a menu bar block Dm is shown on the right hand side of the display 4. The menu block Dm has standard coordinates (in the figure, the top left corner) of (485,10). The standard coordinates can be changed so as to transfer the menu block to another position on the display. The menu block Dm comprises five menu bars, Nos. 1 to 5, each designating an option.
When the lines of sight of the user are directed toward a point C having coordinates (480,50), an arrow-shaped icon I is shown on the display 4, pointing at the point C. Since the point C is outside the boundary of the menu block Dm, nothing happens even if the user closes his eyes for more than a predetermined time such as 0.5 seconds. Referring to Fig. 7b, when the user moves his eyes to a point C' having coordinates (521,210), the icon I moves, thereby pointing at the menu bar No. 3. If the user then closes his right eye for more than 0.5 seconds, for example, the instruction determining section 2 applies a select signal, as an instruction signal, to the display controller 3, thereby selecting the option No. 3. Namely, the display controller 3 is operated as though an actual menu bar were depressed by hand, thereby executing a predetermined program of the option No. 3.
The program is recalled by closing the left eye for more than 0.5 seconds.
These operations correspond to the operations of a computer provided with a mouse, wherein an icon on a display is moved by moving the mouse and an input button is pushed.
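The selection step thus amounts to a hit test of the eye target against the menu block, followed by a blink-triggered select. The sketch below uses the standard coordinates (485,10) given above; the block width and height are assumed values, chosen so that the worked example holds: the target (521,210) of Fig. 7b falls on bar No. 3, while (480,50) of Fig. 7a falls outside the block.

```python
MENU_X, MENU_Y = 485, 10      # standard coordinates of menu block Dm
MENU_W, MENU_H = 140, 380     # assumed block size in pixels
BAR_COUNT = 5                 # menu bars Nos. 1 to 5

def menu_bar_at(x, y):
    """Return the number of the menu bar under the eye target, or
    None when the target is outside the menu block, in which case
    closing an eye produces no selection."""
    inside = (MENU_X <= x < MENU_X + MENU_W and
              MENU_Y <= y < MENU_Y + MENU_H)
    if not inside:
        return None
    bar_height = MENU_H // BAR_COUNT           # 76 pixels per bar
    return (y - MENU_Y) // bar_height + 1

assert menu_bar_at(521, 210) == 3    # point C' selects bar No. 3
assert menu_bar_at(480, 50) is None  # point C: nothing happens
```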
Thus, in accordance with the third embodiment, the user wearing the HMD can control the virtual reality system with his eyes while watching the virtual reality world on the screen. The hands, which are used for the input operation in the conventional system, can be used for other purposes, for example holding a joystick with both hands in game applications. The input system can easily be provided in the HMD so that the virtual reality system can be controlled without impairing the mobility of the HMD.
The present embodiment may be applied to a two-dimensional display.
Whereas the second and third embodiments detect a two-dimensional position, the fourth embodiment of the present invention provides a system for obtaining three-dimensional coordinates. The HMD according to the fourth embodiment has the same construction as that of the third embodiment shown in Fig. 6, so that the description thereof is omitted.
Since the virtual reality system provides a three-dimensional world, the target of the eyes watching the world is also three-dimensionally positioned, as at a point B in Fig. 4. The coordinates x, y and z of the point B are expressed as (290, 105, 90). The letter z represents a position along an axis z perpendicular to the x-y plane and shows the distance from the eyes, a numerical value of which can be obtained in accordance with an appropriate standard value.
The intersection of the lines of sight of both eyes detected by the line of sight detector 5 is the eye target. Hence the z coordinate of the eye target can be obtained by carrying out a geometric calculation based on the basic axis of the eyes and the angle θ between the two lines of sight. The three-dimensional coordinates calculated by the coordinate calculator 6 are applied to the display controller 3, where the data are combined with the image data from the image generator 7.
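Under the simplifying assumption that the target lies midway between the eyes, the geometric calculation reduces to trigonometry on the interocular baseline and the convergence angle θ; the sketch below illustrates only that symmetric case.

```python
import math

def vergence_distance(baseline, theta):
    """Estimate the z coordinate (distance from the eyes) of the eye
    target from the basic axis of the eyes (interocular baseline, in
    metres) and the angle theta (radians) between the two lines of
    sight, assuming the target lies on the median plane."""
    return (baseline / 2.0) / math.tan(theta / 2.0)

# Example: a 65 mm baseline and a 2 degree convergence angle place
# the eye target roughly 1.86 m away.
z = vergence_distance(0.065, math.radians(2.0))
```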
Whereas only the graphic images generated at the image generator 7 are stereoscopic in the third embodiment, the image on the entire screen of the display in the present embodiment has a z coordinate which corresponds to a distance from the eyes. Hence the user sees every part of the image as having depth.
Referring to Fig. 8, which schematically shows a stereoscopic picture image, on the display 4 are shown an image of a rock R flying toward the user, a menu bar block Dm, and an icon pointing at an eye target of the user at a position F. The actual distance d1 between the eyes and the display 4 is determined by the design of the HMD and is therefore constant. Thus, although the focal distance of the eyes is adjusted to the distance d1, in the virtual reality world the rock R is seen as a rock R' positioned further away from the eyes by a distance d2. Hence the lines of sight of the eyes converge at a point E' on the rock R' while the user watches the rock approaching nearer toward him.
Similarly, the menu bar block Dm is indicated as a menu bar block Dm' at a position having a distance d3 from the eyes. If the user wishes to select a menu bar, he moves his eyes to see one of the menu bars, thereby moving the icon to a point F'. If the target is at a point F'' which is behind the bar block Dm', the icon is hidden from view behind the bar block.
Thus the icon can be moved back and forth in the image depending on the focal point of the eyes, pointing at different objects which may be positioned at the same x and y coordinates.
Accordingly, the fourth embodiment provides a system where the icon can be arbitrarily moved in the three-dimensional image, thereby causing an exciting visual effect in games and various other applications.
In the third and fourth embodiments, the icon, which is continually present on the screen, may be obtrusive to the user. Therefore, the system may be modified to show the icon only when the eye target is on the menu bar block. Furthermore, the selecting and confirming operations of the menus may be executed by manually operating switches, as in the conventional system, instead of by moving the eyelids.
The input system may further be applied to an HMD provided with a stereo head receiver so as to make the location of the sound image coincide with the eye target, thereby enabling an effective response to be sounded as an input confirmation signal, or dynamic sound effects to be created. More particularly, the calculated coordinates of the eye target are applied to a digital signal processor as a parameter so as to process the frequency components of a sound emitted from a sound source. As a result, the user perceives a three-dimensional sound image.
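As a highly simplified stand-in for such a digital signal processor stage, the sketch below derives left and right channel gains from the eye-target coordinates. An actual implementation would filter the frequency components of the source sound (for example with head-related transfer functions), which is beyond the scope of this illustration.

```python
def gaze_stereo_gains(x, z, screen_w=640, min_gain=0.2):
    """Pan and attenuate a confirmation sound so that its apparent
    location roughly coincides with the eye target: x sets the
    left/right balance, z (distance from the eyes) the loudness."""
    pan = x / (screen_w - 1)                # 0.0 far left, 1.0 far right
    level = max(min_gain, 1.0 / (1.0 + z))  # farther targets are quieter
    return (1.0 - pan) * level, pan * level  # (left gain, right gain)
```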
Various displays may be disposed in the HMD to which the present invention is applied, as long as they are capable of showing stereoscopic images; they may be either color or black and white. The display may be of any type having a flat surface, such as a liquid crystal display which is watched from the front side, a projection display where images are projected from the rear of a semi-transparent screen, a CRT, a plasma display panel or an electroluminescence panel. In order to provide a stereoscopic image, the display may be a lenticular display where a plurality of cylindrical lenses are provided to split the image into two, one for the right eye and one for the left eye. Alternatively, a parallax barrier display where each eye sees pixels through a slit may be employed. Furthermore, a display having an electronic shutter may be disposed for each of the right and left eyes, provided they do not take up the space for the line of sight detector in the HMD.
From the foregoing it will be understood that the present invention provides an input system for an HMD which enables the user to give instructions with only his eyes. Hence the present invention can be employed in various applications, such as a virtual reality game device or a driving device for actually driving a vehicle.
In the input system of the present invention, since the distance between the eyes and the sensors is constant in the HMD, and external light hardly enters therein, the movements of the eyelids and eyeballs can be accurately converted into electric signals.
While the presently preferred embodiments of the present invention have been shown and described, it is to be understood that these disclosures are for the purpose of illustration and that various changes and modifications may be made without departing from the scope of the invention as set forth in the appended claims.

Claims (5)

WHAT IS CLAIMED IS:
1. An input system for a virtual reality system comprising: a display; detector means for detecting conditions of eyes of a user and for producing a condition signal dependent on a detected condition; determining means for determining an instruction according to the condition signal and for producing an instruction signal dependent on a determined instruction; and display controller means responsive to the instruction signal for producing a display control signal for changing an image on the display.
2. The system according to claim 1 wherein the display is a head mounted display.
3. The system according to claim 1 wherein the detector means detects closing of at least one eye of the user.
4. The system according to claim 1 wherein the detector means detects lines of sight of eyes of the user.
5. An input system for a virtual reality system substantially as described herein with reference to Figures 1 to 8 of the drawings.
GB9415530A 1993-08-04 1994-08-01 Input for a virtual reality system Withdrawn GB2281838A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP5193469A JPH0749744A (en) 1993-08-04 1993-08-04 Head mounting type display input device

Publications (2)

Publication Number Publication Date
GB9415530D0 GB9415530D0 (en) 1994-09-21
GB2281838A 1995-03-15

Family

ID=16308537

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9415530A Withdrawn GB2281838A (en) 1993-08-04 1994-08-01 Input for a virtual reality system

Country Status (2)

Country Link
JP (1) JPH0749744A (en)
GB (1) GB2281838A (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0743589A2 (en) * 1995-05-15 1996-11-20 Canon Kabushiki Kaisha Interactive image generation method and apparatus
EP0903661A1 (en) * 1997-08-27 1999-03-24 Canon Kabushiki Kaisha Apparatus and method to input data based on visual-axis detection
EP0933720A2 (en) * 1998-01-29 1999-08-04 Shimadzu Corporation Input apparatus for the physically handicapped
EP0942350A1 (en) * 1998-03-13 1999-09-15 Canon Kabushiki Kaisha Visual axis decision input device and method
WO2000054134A1 (en) * 1999-03-09 2000-09-14 Siemens Aktiengesellschaft Input/output device for a user terminal
EP1331609A1 (en) * 2002-01-23 2003-07-30 Radica China Ltd. Optical controller
WO2004034241A2 (en) * 2002-10-09 2004-04-22 Raphael Bachmann Rapid input device
US6728632B2 (en) * 2001-08-30 2004-04-27 Ericsson Inc. Navigation devices, systems, and methods for determining location, position, and/or orientation information based on movement data generated by a movement detector
EP1637975A1 (en) * 2004-09-20 2006-03-22 Samsung Electronics Co., Ltd. Apparatus and method for inputting keys using biological signals in head mounted display information terminal
WO2009062492A2 (en) * 2007-11-15 2009-05-22 Spatial View Gmbh Method for representing image objects in a virtual three-dimensional image space
US7630524B2 (en) 2000-05-16 2009-12-08 Swisscom Mobile Ag Biometric identification and authentication method
EP2360653A2 (en) * 2008-11-17 2011-08-24 Byong-Hoon Jeon Emergency rescue system using eye expression recognition, and method for same
WO2013117727A1 (en) * 2012-02-09 2013-08-15 Universität Zürich System for examining eye movements, particularly the vestibulo-ocular reflex and dynamic visual acuity
US20140062867A1 (en) * 2012-09-06 2014-03-06 Michael Baumgartner Electrode Blinking Device
CN103677247A (en) * 2012-12-19 2014-03-26 苏州贝腾特电子科技有限公司 Method for virtual mouse clicking
WO2014068832A1 (en) * 2012-11-02 2014-05-08 Sony Corporation Image display device and information input device
CN104076512A (en) * 2013-03-25 2014-10-01 精工爱普生株式会社 Head-mounted display device and method of controlling head-mounted display device
GB2517059A (en) * 2013-06-11 2015-02-11 Sony Comp Entertainment Europe Head-mountable apparatus and systems
CN105260017A (en) * 2015-09-28 2016-01-20 南京民办致远外国语小学 Glasses mouse and working method therefor
US9390256B2 (en) 2012-03-06 2016-07-12 Paypal, Inc. System and methods for secure entry of a personal identification number (PIN)
WO2016137405A1 (en) * 2015-02-27 2016-09-01 Meditech Solution Company Limited A communicative system by monitoring patients' eye blinking
US9596391B2 (en) 2013-09-03 2017-03-14 Tobii Ab Gaze based directional microphone
US9678568B2 (en) 2014-01-17 2017-06-13 Casio Computer Co., Ltd. Apparatus, system, method for designating displayed items and for controlling operation by detecting movement
US9952883B2 (en) 2014-08-05 2018-04-24 Tobii Ab Dynamic determination of hardware
CN108108022A (en) * 2018-01-02 2018-06-01 联想(北京)有限公司 A kind of control method and auxiliary imaging devices
US9996683B2 (en) 2012-03-06 2018-06-12 Paypal, Inc. Physiological response pin entry
WO2018109570A1 (en) * 2016-12-15 2018-06-21 Sony Mobile Communications Inc. Smart contact lens and multimedia system including the smart contact lens
EP3407164A1 (en) * 2017-05-23 2018-11-28 Stichting IMEC Nederland A method and a system for monitoring an eye position
US10198622B2 (en) 2013-03-29 2019-02-05 Panasonic Intellectual Property Management Co., Ltd. Electronic mirror device
US10310597B2 (en) 2013-09-03 2019-06-04 Tobii Ab Portable eye tracking device
US10686972B2 (en) 2013-09-03 2020-06-16 Tobii Ab Gaze assisted field of view control

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09127459A (en) * 1995-11-02 1997-05-16 Canon Inc Display device provided with gaze detection system
JP2001522098A (en) * 1997-10-30 2001-11-13 ドクター・バルデヴェグ・ゲーエムベーハー Image processing method and apparatus
JP2000347596A (en) * 1998-08-31 2000-12-15 Semiconductor Energy Lab Co Ltd Portable information processing system
JP2003280805A (en) * 2002-03-26 2003-10-02 Gen Tec:Kk Data inputting device
JP2004038470A (en) * 2002-07-02 2004-02-05 Canon Inc Augmented reality system and information processing method
JP4839432B2 (en) * 2003-12-17 2011-12-21 国立大学法人静岡大学 Pointing device and method based on pupil position detection
EP2202609B8 (en) * 2004-06-18 2016-03-09 Tobii AB Eye control of computer apparatus
JP4730621B2 (en) * 2007-05-07 2011-07-20 敦 西川 Input device
JP5180258B2 (en) * 2010-05-28 2013-04-10 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME CONTROL METHOD, AND PROGRAM
JP5953714B2 (en) * 2011-11-24 2016-07-20 セイコーエプソン株式会社 Device, head-mounted display device, device control method, and head-mounted display device control method
KR101919010B1 (en) 2012-03-08 2018-11-16 삼성전자주식회사 Method for controlling device based on eye movement and device thereof
KR20140011203A (en) * 2012-07-18 2014-01-28 삼성전자주식회사 Control apparatus connected with a plurality of display apparatus and method for controlling a plurality of display apparatus, and display apparatus contol system thereof
JP6303274B2 (en) * 2013-03-25 2018-04-04 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device
JP2013210643A (en) * 2013-04-26 2013-10-10 Sony Corp Display device and display method
CA2924496A1 (en) * 2013-09-17 2015-03-26 Amazon Technologies, Inc. Approaches for three-dimensional object display
JP6349777B2 (en) * 2013-10-23 2018-07-04 株式会社ニコン Portable terminal device for information exchange
JP6608137B2 (en) * 2014-01-03 2019-11-20 ハーマン インターナショナル インダストリーズ インコーポレイテッド Detection of binocular transduction on display
CN104391567B (en) * 2014-09-30 2017-10-31 深圳市魔眼科技有限公司 A kind of 3D hologram dummy object display control method based on tracing of human eye
WO2016142933A1 (en) 2015-03-10 2016-09-15 Eyefree Assisting Communication Ltd. System and method for enabling communication through eye feedback
JP6354653B2 (en) * 2015-04-25 2018-07-11 京セラドキュメントソリューションズ株式会社 Augmented reality operation system and augmented reality operation program
JP6580624B2 (en) * 2017-05-11 2019-09-25 株式会社コロプラ Method for providing virtual space, program for causing computer to execute the method, and information processing apparatus for executing the program
JP6342038B1 (en) * 2017-05-26 2018-06-13 株式会社コロプラ Program for providing virtual space, information processing apparatus for executing the program, and method for providing virtual space
WO2019111257A1 (en) 2017-12-07 2019-06-13 Eyefree Assisting Communication Ltd. Communication methods and systems
JP6669183B2 (en) * 2018-03-05 2020-03-18 セイコーエプソン株式会社 Head mounted display and control method of head mounted display
JP6878346B2 (en) * 2018-04-02 2021-05-26 株式会社コロプラ A method for providing a virtual space, a program for causing a computer to execute the method, and an information processing device for executing the program.
JP7309830B2 (en) * 2020-10-01 2023-07-18 株式会社東芝 Electronic device and display method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2201069A (en) * 1987-01-21 1988-08-17 Jonathan David Waldern Method and apparatus for the perception of computer-generated imagery
WO1993020499A1 (en) * 1992-03-31 1993-10-14 The Research Foundation Of State University Of New York Apparatus and method for eye tracking interface

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0743589A3 (en) * 1995-05-15 1997-11-12 Canon Kabushiki Kaisha Interactive image generation method and apparatus
EP0743589A2 (en) * 1995-05-15 1996-11-20 Canon Kabushiki Kaisha Interactive image generation method and apparatus
US6437794B1 (en) 1995-05-15 2002-08-20 Canon Kabushiki Kaisha Interactive image generation method and apparatus utilizing a determination of the visual point position of an operator
US6426740B1 (en) 1997-08-27 2002-07-30 Canon Kabushiki Kaisha Visual-axis entry transmission apparatus and method therefor
EP0903661A1 (en) * 1997-08-27 1999-03-24 Canon Kabushiki Kaisha Apparatus and method to input data based on visual-axis detection
EP0933720A2 (en) * 1998-01-29 1999-08-04 Shimadzu Corporation Input apparatus for the physically handicapped
EP0933720A3 (en) * 1998-01-29 1999-11-10 Shimadzu Corporation Input apparatus for the physically handicapped
US6161932A (en) * 1998-03-13 2000-12-19 Canon Kabushiki Kaisha Visual axis input and decision transfer device and method
EP0942350A1 (en) * 1998-03-13 1999-09-15 Canon Kabushiki Kaisha Visual axis decision input device and method
WO2000054134A1 (en) * 1999-03-09 2000-09-14 Siemens Aktiengesellschaft Input/output device for a user terminal
US7630524B2 (en) 2000-05-16 2009-12-08 Swisscom Mobile Ag Biometric identification and authentication method
US6728632B2 (en) * 2001-08-30 2004-04-27 Ericsson Inc. Navigation devices, systems, and methods for determining location, position, and/or orientation information based on movement data generated by a movement detector
EP1331609A1 (en) * 2002-01-23 2003-07-30 Radica China Ltd. Optical controller
US6836751B2 (en) 2002-01-23 2004-12-28 Radica China Ltd. Optical controller
WO2004034241A3 (en) * 2002-10-09 2005-07-28 Raphael Bachmann Rapid input device
WO2004034241A2 (en) * 2002-10-09 2004-04-22 Raphael Bachmann Rapid input device
EP1637975A1 (en) * 2004-09-20 2006-03-22 Samsung Electronics Co., Ltd. Apparatus and method for inputting keys using biological signals in head mounted display information terminal
WO2009062492A2 (en) * 2007-11-15 2009-05-22 Spatial View Gmbh Method for representing image objects in a virtual three-dimensional image space
WO2009062492A3 (en) * 2007-11-15 2010-04-22 Spatial View Gmbh Method for representing image objects in a virtual three-dimensional image space
EP2360653A2 (en) * 2008-11-17 2011-08-24 Byong-Hoon Jeon Emergency rescue system using eye expression recognition, and method for same
EP2360653A4 (en) * 2008-11-17 2013-04-10 Byong-Hoon Jeon Emergency rescue system using eye expression recognition, and method for same
WO2013117727A1 (en) * 2012-02-09 2013-08-15 Universität Zürich System for examining eye movements, particularly the vestibulo-ocular reflex and dynamic visual acuity
US10362024B2 (en) 2012-03-06 2019-07-23 Paypal, Inc. System and methods for secure entry of a personal identification number (PIN)
US9996683B2 (en) 2012-03-06 2018-06-12 Paypal, Inc. Physiological response pin entry
US9390256B2 (en) 2012-03-06 2016-07-12 Paypal, Inc. System and methods for secure entry of a personal identification number (PIN)
US20140062867A1 (en) * 2012-09-06 2014-03-06 Michael Baumgartner Electrode Blinking Device
WO2014068832A1 (en) * 2012-11-02 2014-05-08 Sony Corporation Image display device and information input device
US9841812B2 (en) 2012-11-02 2017-12-12 Sony Corporation Image display device and information input device
CN104755023A (en) * 2012-11-02 2015-07-01 索尼公司 Image display device and information input device
CN103677247A (en) * 2012-12-19 2014-03-26 苏州贝腾特电子科技有限公司 Method for virtual mouse clicking
CN104076512A (en) * 2013-03-25 2014-10-01 精工爱普生株式会社 Head-mounted display device and method of controlling head-mounted display device
US9921646B2 (en) 2013-03-25 2018-03-20 Seiko Epson Corporation Head-mounted display device and method of controlling head-mounted display device
US10198622B2 (en) 2013-03-29 2019-02-05 Panasonic Intellectual Property Management Co., Ltd. Electronic mirror device
GB2517059B (en) * 2013-06-11 2016-10-19 Sony Computer Entertainment Europe Ltd Head-mountable displays and systems
GB2517059A (en) * 2013-06-11 2015-02-11 Sony Comp Entertainment Europe Head-mountable apparatus and systems
US10708477B2 (en) 2013-09-03 2020-07-07 Tobii Ab Gaze based directional microphone
US9710058B2 (en) 2013-09-03 2017-07-18 Tobii Ab Portable eye tracking device
US9665172B2 (en) 2013-09-03 2017-05-30 Tobii Ab Portable eye tracking device
US10686972B2 (en) 2013-09-03 2020-06-16 Tobii Ab Gaze assisted field of view control
US10389924B2 (en) 2013-09-03 2019-08-20 Tobii Ab Portable eye tracking device
US10116846B2 (en) 2013-09-03 2018-10-30 Tobii Ab Gaze based directional microphone
US10375283B2 (en) 2013-09-03 2019-08-06 Tobii Ab Portable eye tracking device
US9596391B2 (en) 2013-09-03 2017-03-14 Tobii Ab Gaze based directional microphone
US10277787B2 (en) 2013-09-03 2019-04-30 Tobii Ab Portable eye tracking device
US10310597B2 (en) 2013-09-03 2019-06-04 Tobii Ab Portable eye tracking device
US9678568B2 (en) 2014-01-17 2017-06-13 Casio Computer Co., Ltd. Apparatus, system, method for designating displayed items and for controlling operation by detecting movement
US9952883B2 (en) 2014-08-05 2018-04-24 Tobii Ab Dynamic determination of hardware
WO2016137405A1 (en) * 2015-02-27 2016-09-01 Meditech Solution Company Limited A communicative system by monitoring patients' eye blinking
CN105260017A (en) * 2015-09-28 2016-01-20 南京民办致远外国语小学 Glasses mouse and working method therefor
WO2018109570A1 (en) * 2016-12-15 2018-06-21 Sony Mobile Communications Inc. Smart contact lens and multimedia system including the smart contact lens
EP3407164A1 (en) * 2017-05-23 2018-11-28 Stichting IMEC Nederland A method and a system for monitoring an eye position
CN108108022A (en) * 2018-01-02 2018-06-01 联想(北京)有限公司 A kind of control method and auxiliary imaging devices

Also Published As

Publication number Publication date
GB9415530D0 (en) 1994-09-21
JPH0749744A (en) 1995-02-21

Similar Documents

Publication Publication Date Title
GB2281838A (en) Input for a virtual reality system
CN110045816B (en) Near-eye display and system
CN110647237B (en) Gesture-based content sharing in an artificial reality environment
US10635895B2 (en) Gesture-based casting and manipulation of virtual content in artificial-reality environments
JP4251673B2 (en) Image presentation device
US11810244B2 (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
CN103347437B (en) Gaze detection in 3D mapping environment
JP4871270B2 (en) System and method for operating in a virtual three-dimensional space and system for selecting operations via a visualization system
JP4413203B2 (en) Image presentation device
CN110646938B (en) Near-eye display system
CN114995594A (en) Interaction with 3D virtual objects using gestures and multi-DOF controllers
WO2005043218A1 (en) Image display device
KR20220120649A (en) Artificial Reality System with Varifocal Display of Artificial Reality Content
CN111949128A (en) Information processing method, program for causing computer to execute the information processing method, and computer
CN110018736A (en) The object via near-eye display interface in artificial reality enhances
JPH10334275A (en) Method and system for virtual reality and storage medium
EP2279469A1 (en) Display of 3-dimensional objects
WO2019142560A1 (en) Information processing device for guiding gaze
US11743447B2 (en) Gaze tracking apparatus and systems
JP7128473B2 (en) Character display method
Argelaguet et al. Visual feedback techniques for virtual pointing on stereoscopic displays
WO2021220407A1 (en) Head-mounted display device and display control method
JPH10334274A (en) Method and system for virtual realize and storage medium
WO2022044581A1 (en) Information processing device, information processing method, and program
KR20230121953A (en) Stereovision-Based Virtual Reality Device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)