WO2018076202A1 - Head-mounted display device that can perform eye tracking, and eye tracking method - Google Patents
- Publication number
- WO2018076202A1 (PCT/CN2016/103375)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eye
- head
- human eye
- image information
- eyeball
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
Definitions
- The present invention relates to the field of computer technologies, in particular to head-mounted visual devices, and more particularly to a head-mounted visual device capable of human eye tracking and a human eye tracking method.
- A head-mounted display (HMD, also known as a head-mounted visual device) reflects a two-dimensional image directly into the viewer's eyes: a set of optical systems (primarily precision optical lenses) magnifies the image on an ultra-micro display and projects it onto the retina, presenting a large-screen image in the viewer's eye. The effect is analogous to viewing an object through a magnifying glass, which presents an enlarged virtual image of the object.
- The image can be generated directly by a light-emitting diode (LED) display, an active-matrix liquid crystal display (AMLCD), an organic light-emitting diode (OLED) display, or liquid crystal on silicon (LCoS), or delivered indirectly through an optical fiber or the like.
- The display system is imaged at infinity by a collimating lens and then reflected by a reflecting surface into the human eye. Because of their portability and entertainment value, head-mounted visual devices are quietly changing modern life.
- Existing head-mounted visual devices cannot actively interact with the user: the wearer must actively operate the device, while the device cannot actively sense the user's attention or mood. Eye tracking technology has therefore been proposed as a way to actively perceive the user's attention and mood.
- There is currently no good solution for applying eye tracking technology in a head-mounted visual device to track human eye information in real time and obtain the gaze point of the human eye in space.
- In addition, the weight of the head-mounted visual device is a non-negligible factor.
- Although mature eye tracker products already exist, embedding one directly in a head-mounted visual device would undoubtedly increase the weight of the virtual reality helmet and degrade the user experience.
- The technical problem to be solved by the present invention is to provide a head-mounted visual device capable of human eye tracking and a human eye tracking method, solving the problem that existing head-mounted visual devices cannot track the viewing orientation of the human eye.
- A specific embodiment of the present invention provides a head-mounted visual device capable of human eye tracking, comprising: a virtual reality helmet for accommodating the head-mounted visual device; a light source disposed in the virtual reality helmet for illuminating the eyeball of the human eye; and a micro camera disposed in the virtual reality helmet for collecting eyeball image information of the human eye, so that a server determines the orientation information of the pupil of the human eye according to the eyeball image information.
- A specific embodiment of the present invention further provides a human eye tracking method for a head-mounted visual device, comprising: illuminating the eyeball with an LED light source; collecting eyeball image information of the human eye with a micro camera; and determining the orientation information of the pupil of the human eye according to the eyeball image information using a spatial mapping relationship.
- The head-mounted visual device capable of human eye tracking and the human eye tracking method have at least the following beneficial effects: a miniature camera and an LED light source are embedded in the head-mounted visual device, multiple reference points are set in the virtual scene, and the spatial mapping relationship among the miniature camera, the reference points, and the eyeball is constructed using a three-dimensional matrix; the miniature camera then captures eyeball image information, and the acquired image information is analyzed according to the spatial mapping relationship.
- This analysis yields the pupil focus area in real time, thereby determining the user's viewing orientation without increasing the weight of the head-mounted visual device and without leaking environmental information around the user, improving the user experience.
- FIG. 1A is a schematic structural diagram of a main body of a head-mounted visual device capable of tracking human eyes according to an embodiment of the present invention
- FIG. 1B is a schematic rear view of a head-mounted visual device capable of tracking human eyes according to an embodiment of the present invention
- FIG. 2 is a flowchart of Embodiment 1 of a human eye tracking method for a head-mounted visual device according to an embodiment of the present invention
- FIG. 3 is a flowchart of Embodiment 2 of a human eye tracking method for a head-mounted visual device according to an embodiment of the present invention
- FIG. 4 is a schematic diagram of three-dimensional coordinates of a spatial positional relationship between a miniature camera, a reference point, and a human eyeball according to an embodiment of the present invention
- FIG. 5 is a diagram of a coordinate conversion relationship provided by a specific embodiment of the present invention.
- Orientation terms used herein, for example up, down, left, right, front, or back, refer only to the orientation in the drawings. They are used for illustration and are not intended to limit the invention.
- FIG. 1A is a schematic diagram of the main structure of a head-mounted visual device capable of tracking human eyes according to an embodiment of the present invention, and FIG. 1B is a schematic rear view of the same device. As shown in FIG. 1A and FIG. 1B:
- A light source and a micro camera are disposed on each of the two sides of the virtual reality helmet lens: one light source and one micro camera correspond to one eye of the user, and the other light source and micro camera correspond to the other eye. The light source illuminates the eyeball of the human eye, and the miniature camera collects eyeball image information of the human eye so that the server can determine the orientation information of the pupil of the human eye according to the eyeball image information.
- the head mounted visual device comprises a virtual reality helmet 10, a light source 20 and a miniature camera 30, wherein the virtual reality helmet 10 is for accommodating a head mounted visual device;
- the light source 20 is used to illuminate the eyeball of the human eye;
- The micro camera 30 is disposed in the virtual reality helmet 10 and is configured to collect eyeball image information of the human eye, so that the server determines the orientation information of the pupil of the human eye according to the eyeball image information. The micro camera 30 can be a miniature still camera, a miniature video camera, or the like.
- the light source 20 can be a micro LED light source, and when the micro camera 30 collects the eyeball image information of the human eye, the light source 20 is instantly turned on and off.
- The miniature camera 30 is connected to the server via its HDMI data line.
- The orientation information of the pupil of the human eye specifically refers to the following: the straight line directly in front of the human eye is taken as a reference line, and the viewing target point is connected with the pupil of the human eye; the angle and positional relationship between this connecting line and the reference line constitute the orientation information of the pupil of the human eye.
- the server calculates the orientation information of the pupil of the human eye according to the spatial positional relationship between the micro camera 30, the reference point, and the eye of the human eye.
- the number of reference points is at least 4.
- the light source 20 specifically includes a first LED light source 201 and a second LED light source 202.
- a first LED light source 201 is disposed at a left lens edge of the virtual reality helmet 10;
- a second LED light source 202 is disposed at a right lens edge of the virtual reality helmet 10; and
- The first LED light source 201 is configured to illuminate the left eyeball;
- the second LED light source 202 is for illuminating the right eyeball.
- The miniature camera 30 specifically includes a first miniature camera 301 and a second miniature camera 302. The first miniature camera 301 is disposed at the left lens edge of the virtual reality helmet 10 and is configured to capture eyeball image information of the left eye; the second miniature camera 302 is disposed at the right lens edge of the virtual reality helmet 10 and is configured to capture eyeball image information of the right eye.
- the server obtains a left-eye optical axis vector of a left-eye gaze orientation according to the eyeball image information of the left eye, and obtains a right-eye optical axis vector of the right-eye gaze orientation according to the right-eye ocular image information, and then The orientation information of the pupil of the human eye is determined according to the intersection of the optical axis vector of the left eye and the optical axis vector of the right eye.
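In practice the two optical axes rarely intersect exactly because of measurement noise, so the "intersection" is usually taken as the midpoint of the common perpendicular between the two 3-D rays. A minimal sketch of that computation (not from the patent; the eye positions and directions in the usage line are illustrative assumptions):

```python
def gaze_point(p1, d1, p2, d2):
    """Midpoint of the common perpendicular between two rays p + s*d.

    p1, d1: origin and direction of the left-eye optical axis;
    p2, d2: origin and direction of the right-eye optical axis.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    w0 = tuple(x - y for x, y in zip(p1, p2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b            # zero only for parallel axes
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    q1 = tuple(p + s * v for p, v in zip(p1, d1))  # closest point on ray 1
    q2 = tuple(p + t * v for p, v in zip(p2, d2))  # closest point on ray 2
    return tuple((x + y) / 2 for x, y in zip(q1, q2))

# Illustrative eyes 6 cm apart, both aimed at a target 1 m straight ahead:
target = gaze_point((-0.03, 0, 0), (0.03, 0, 1.0),
                    (0.03, 0, 0), (-0.03, 0, 1.0))
```

Here `target` recovers the common fixation point of the two rays; with noisy axes it degrades gracefully to the midpoint of their closest approach.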
- a miniature camera and a light source are disposed in a virtual reality helmet, and a plurality of reference points are set in the virtual scene, and a spatial mapping relationship between the miniature camera, the reference point, and the eye of the human eye is constructed by using the three-dimensional matrix;
- The micro camera is used to collect eyeball image information, and the acquired image information is analyzed according to the spatial mapping relationship; the pupil focusing area can be obtained in real time, thereby determining the user's viewing orientation without increasing the weight of the head-mounted visual device and without revealing environmental information around the user.
- A power source is integrated via a USB interface (not shown) to supply power to electronic components such as the light source 20 and the micro camera 30 in the virtual reality helmet.
- The head-mounted visual device is connected to the server through an HDMI data line; the server controls the switching of the light source 20 and the image collection of the micro camera 30 through the HDMI data line, and the processing of the eyeball image information collected by the micro camera 30 is completed by the server.
- a processor may be provided in the virtual reality helmet 10 to perform the processing and control of the server.
- FIG. 2 is a flowchart of Embodiment 1 of a human eye tracking method for a head-mounted visual device according to an embodiment of the present invention.
- An LED light source is turned on instantaneously, and a micro camera collects the eyeball image information of the human eye; the orientation information of the pupil of the human eye is then determined by analyzing the acquired eyeball image information.
- Step 101 Illuminating the eyeball of the human eye with an LED light source.
- the LED light source is similar to the camera's flash, and is turned off immediately after turning on, without affecting the user's normal visual experience.
- Step 102 Acquire an eyeball image information of a human eye by using a micro camera.
- The miniature camera captures the eyeball image information of the human eye while the LED light source is on; it can be a miniature still camera, a miniature video camera, or the like.
- Step 103 Determine the orientation information of the pupil of the human eye according to the eyeball image information by using a spatial mapping relationship.
- the step 103 includes: collecting eyeball image information of the left eye and eyeball image information of the right eye; obtaining a left-eye optical axis vector of the left-eye gaze orientation according to the eyeball image information of the left eye, and according to the right The eyeball image information of the eye obtains a right eye optical axis vector of the right eye gaze orientation; and the orientation information of the human eye pupil is determined according to the left eye optical axis vector and the right eye optical axis vector.
- The micro camera (a sensor such as a miniature video camera can also be used) collects the eyeball image information of the human eye, and the acquired image information is analyzed according to the spatial mapping relationship to obtain the pupil focusing area in real time, thereby determining the user's viewing orientation without increasing the weight of the head-mounted visual device and without revealing environmental information around the user, improving the user experience.
- FIG. 3 is a flowchart of Embodiment 2 of a human eye tracking method for a head-mounted visual device according to an embodiment of the present invention. As shown in FIG. 3, before human eye tracking is performed, the spatial mapping relationship among the miniature camera, the reference points, and the eyeball must be constructed using a three-dimensional matrix.
- the method further includes:
- Step 100 Construct a spatial mapping relationship between the miniature camera, the reference point, and the eyeball of the human eye using the three-dimensional matrix.
- A three-dimensional matrix is used to fit the relationship between the coordinate system of the eyeball and the coordinate system of the reference points, together with the positional relationship between the miniature camera and the eyeball, finally constructing the spatial mapping relationship among the miniature camera, the reference points, and the eyeball. Using this spatial mapping relationship together with the collected eyeball image information, the user's visual gaze point in the virtual space can be calculated in real time.
- FIG. 4 is a schematic diagram of three-dimensional coordinates of a spatial positional relationship between a miniature camera, a reference point, and a human eyeball according to an embodiment of the present invention.
- The present invention provides pupil focus area tracking applied to a virtual reality helmet.
- The solution mainly installs a miniature camera (for example, a miniature video camera) on both sides of the lens of a head-mounted visual device (for example, a virtual reality helmet), with an LED light source at the edge of each miniature camera lens.
- Four reference points are set in the virtual scene; when the eyeball looks at a reference point, the LED light source is turned on.
- The miniature camera captures and records real-time image information of the eyeball and the pupil. Then, based on the spatial positional relationship among the coordinate systems of the miniature camera, the reference points, and the eyeball, different functional forms and matrix forms are used to fit the one-to-one correspondence between the eye reference frame and the reference frame of the reference points, obtaining the pupil position and its orientation information, from which the gaze point in space can be calculated.
- E1 and E2 are the origins of the rectangular spatial coordinate systems in which the left and right eyeballs are located; S1 and S2 are the origins of the coordinate systems in which the miniature cameras are located; O is the origin of the coordinate system in which the target fixation point is located; X1 and X2 are reference points set in the virtual reality scene, located on the midline of the line segment joining the two eyeballs; X3 is the target fixation point in the virtual reality scene; H1, H2, and Ct are the vertical distances between the cameras and the human eyes; L is the distance between the two eyeballs; Cs is the distance between the two miniature cameras; the distance between reference points X1 and X2 is equal to the distance between X1 and S0, both being ΔX; and the angle ∠E1X1E2 is 2θ.
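The "fitting with different functional forms and matrix forms" over the reference points can be illustrated with the simplest such model: an affine map from pupil-center pixel coordinates to two gaze angles, solved by least squares over four reference fixations. The affine form, the helper names, and the sample values below are illustrative assumptions, not the patent's actual fit:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_affine(pixels, angles):
    """Least-squares fit angle = a*px + b*py + c for each angle component."""
    X = [[px, py, 1.0] for px, py in pixels]
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(3)]
           for i in range(3)]
    coeffs = []
    for comp in range(len(angles[0])):
        Xty = [sum(X[k][i] * angles[k][comp] for k in range(len(X)))
               for i in range(3)]
        coeffs.append(solve3(XtX, Xty))
    return coeffs  # one (a, b, c) row per angle component

# Four hypothetical reference fixations: pupil pixel -> (theta, phi) samples
pixels = [(100, 80), (220, 80), (100, 200), (220, 200)]
angles = [(-0.2, 0.1), (0.2, 0.1), (-0.2, -0.1), (0.2, -0.1)]
model = fit_affine(pixels, angles)
```

Because the four samples here are generated from an affine relation, the least-squares fit recovers it exactly; real calibrations would use a richer functional form and noisy measurements.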
- From the spatial position and orientation information of the pupil, the vector coordinates of the pupil looking at a certain point can be obtained.
- Pupil movement in space spans three dimensions (the X, Y, and Z axes), so three unknown parameters would normally be required. However, since the pupil moves on the fixed plane of the eyeball, the two-dimensional space of the pupil's planar motion contains only two unknown parameters, θ0 and φ0, and the remaining parameter is directly related to θ0 and φ0.
- the gaze orientation of the pupil is the rotation angle of the pupil in three dimensions of the space in which it is located, denoted as R, and the spatial position and orientation data of the pupil are integrated, and the vector coordinate information [R, t] when the pupil looks at a certain point can be obtained.
- R is a 3x3 rotation matrix representing the gaze orientation of the pupil
- t is a 3x1 vector representing the spatial position information of the pupil. Since the rotation angle R is also on the fixed plane of the eyeball, there are two rotation angles which are unknown parameters, one is the rotation angle around the X axis, and the other is the rotation angle around the Z axis, and the two rotation angles determine the value of R.
- The value of R can be determined by equations (1) and (2).
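Equations (1) and (2) themselves are not reproduced in this text, but the preceding paragraphs state that R is fixed by two rotation angles, one about the X axis and one about the Z axis. A minimal sketch of composing such a rotation matrix; the composition order (X first, then Z) is an assumption:

```python
import math

def rot_x(a):
    """Rotation about the X axis by angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_z(g):
    """Rotation about the Z axis by angle g (radians)."""
    c, s = math.cos(g), math.sin(g)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def gaze_rotation(alpha, gamma):
    """R from a rotation about X (alpha) followed by one about Z (gamma)."""
    return matmul(rot_z(gamma), rot_x(alpha))
```

Any R built this way is orthonormal, so it is a valid 3×3 rotation matrix regardless of the two angle values.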
- The coordinate system of the reference points X1 and X2 is recorded as the plane coordinate system O;
- the coordinate system of the eyeball is recorded as the three-dimensional eye coordinate system E;
- the coordinate system of the camera is recorded as S;
- the coordinate system of the two-dimensional eye-movement image captured by the camera is recorded as B.
- According to the relationships among the camera, the reference points, and the eyeball coordinate systems in the virtual reality eye tracking system, the coordinate conversion relationship diagram shown in FIG. 5 can be obtained.
- T O→E = T O→S · T S→B · T B→E
- T O→E represents the conversion from the eye coordinate system E to the coordinate system O in which the reference points are located, and can be calibrated using the reference points.
- T O→S, the pose of the camera coordinate system S relative to the coordinate system O of the reference points, and T S→B, the transformation of the coordinate system B of the two-dimensional image captured by the camera relative to the camera coordinate system S, can both be obtained by calibration.
- T B→E, the transformation between the current eye coordinate system E and the coordinate system B in which the two-dimensional image is located, contains two unknown parameters (x, y) that are calculated from the reference points.
- The eye has two unknown quantities relative to the eye socket; constrained by the shapes of the eye socket and the eyeball, the eye moves only along the X and Y axes, so the two unknowns in T B→E can be obtained by one reference-point calibration, yielding the conversion relationship T B→E.
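The chain T O→E = T O→S · T S→B · T B→E is a composition of homogeneous transforms. A minimal sketch with 4×4 matrices; the pure-translation calibrations below are arbitrary illustrative values, not calibrated data:

```python
def hmul(A, B):
    """Multiply two 4x4 homogeneous transformation matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """4x4 homogeneous transform that only translates."""
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

def apply(T, p):
    """Apply a homogeneous transform to a 3-D point."""
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(T[i][k] * v[k] for k in range(4)) for i in range(3))

# Hypothetical calibrations for each stage of the chain:
T_O_S = translation(0.0, 0.1, 0.0)   # camera pose in reference frame O
T_S_B = translation(0.02, 0.0, 0.0)  # image frame B relative to camera S
T_B_E = translation(0.0, 0.0, 0.5)   # eye frame E relative to image frame B
T_O_E = hmul(hmul(T_O_S, T_S_B), T_B_E)
```

With pure translations the composition simply accumulates the offsets, which makes the chain easy to verify; real calibrations would carry rotation blocks as well.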
- the unknown parameters of the coordinate system can be calculated.
- R is a 3 ⁇ 3 rotation matrix
- t is a 3 ⁇ 1 vector
- C is the camera intrinsic matrix.
- the four external parameters of the pupil determine the position and orientation of the pupil relative to the scene, including two rotation angles, which can uniquely determine R, and the other two parameters constitute t.
- the starting point (x 0 , y 0 ) represents the pixel coordinates at the intersection of the optical axis and the reference point
- f x and f y represent the focal lengths in the horizontal and vertical directions, respectively.
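With C the intrinsic matrix (f x, f y and the offsets x 0, y 0) and [R, t] a pose, the mapping from a 3-D point to pixel coordinates is the standard pinhole projection. A sketch with illustrative intrinsic values (the specific numbers are assumptions, not the patent's calibration):

```python
def project(point, R, t, fx, fy, x0, y0):
    """Pinhole projection: pixel = C * (R * X + t), then perspective divide."""
    # camera-frame coordinates: Xc = R * X + t
    xc = [sum(R[i][j] * point[j] for j in range(3)) + t[i] for i in range(3)]
    u = fx * xc[0] / xc[2] + x0
    v = fy * xc[1] / xc[2] + y0
    return u, v

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# A point 2 m straight ahead lands on the assumed image center (320, 240):
center = project((0.0, 0.0, 2.0), I3, [0.0, 0.0, 0.0], 500, 500, 320, 240)
```

Inverting this projection for the pupil center, given C and the calibrated pose, is what converts the two-dimensional eye image into a gaze direction.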
- The two-dimensional eyeball image captured by the camera can thus be converted into the optical axis vector of the eye's gaze orientation. The intersection of the optical axis vectors acquired for the two eyes is the target gaze region, and there are mainly the following three situations:
- The first case: the optical axes intersect.
- The optical axes obtained for the two eyes intersect successfully, directly yielding the target fixation point.
- The second case: the light columns intersect. According to each user's eyeball characteristics, a light column with radius r (determined by the characteristics of the user's eyes) centered on the optical axis vector Fo is formed, and the intersection of the left-eye and right-eye light columns is the target attention area.
- The third case: the light cones intersect.
- In the actual line-of-sight geometry, the retina is the apex of the light cone, the line of sight is the central axis of the cone, and the cone subtends a viewing angle; that is, the field of view is an area on the focal plane.
- The intersection of the two such areas is the focus area, and the geometric center of the focus area is the focus point.
- the first two methods yield sufficient approximation accuracy.
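The light-column case above can be approximated by testing whether the minimum distance between the two optical-axis lines is within the sum of the column radii. A sketch under that reading (the radii in the assertions are illustrative; parallel axes are not handled):

```python
def columns_intersect(p1, d1, p2, d2, r1, r2):
    """True if the minimum distance between two non-parallel 3-D lines
    (the optical axes) does not exceed the sum of the column radii."""
    # cross product of the two direction vectors
    cx = d1[1] * d2[2] - d1[2] * d2[1]
    cy = d1[2] * d2[0] - d1[0] * d2[2]
    cz = d1[0] * d2[1] - d1[1] * d2[0]
    w0 = [a - b for a, b in zip(p1, p2)]
    norm = (cx * cx + cy * cy + cz * cz) ** 0.5
    # distance between skew lines: |w0 . (d1 x d2)| / |d1 x d2|
    dist = abs(w0[0] * cx + w0[1] * cy + w0[2] * cz) / norm
    return dist <= r1 + r2
```

When the test passes, the midpoint of the common perpendicular between the two axes can serve as the center of the target attention area.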
- Reference points are set in the virtual scene to collect eyeball image data of the pupil at different target points; through the spatial positional relationships of the system, the conversions between the different coordinate systems, and the image data, the user's visual gaze point in the virtual space can be calculated in real time.
- The solution of the invention mainly comprises the following steps: setting the cameras and LED light sources at the edge of the virtual reality helmet; setting the reference points in the virtual reality scene; photographing the pupil movement images; segmenting the eye white and the pupil according to the image information to obtain the positional relationship between the pupil and the eyeball; and calculating the real-time position and focus direction of the pupil based on the acquired data.
- a miniature camera is placed at the edge of the lens of the virtual reality helmet to capture the changes in the user's eye.
- an LED light source is arranged on the micro camera to emit light, which helps the camera to collect data.
- the position relationship of the miniature camera is shown in Fig. 4.
- Setting the reference points: before the user uses the virtual reality helmet, four target points are set from near to far in the default virtual scene as reference points.
- The reference points are set to obtain data when the eye focuses on them: when the user's pupils focus on a reference point, the camera captures the image information of the user's eyeball at that moment.
- the camera photographs the eye movement image: when the user's eyes look at each reference point, the LED light is turned on, and the camera takes a group of images to record the pupil motion information to obtain image data.
- Analyzing the image information to obtain the positional relationship between the pupil and the eyeball: the different sets of images captured by the camera are transmitted to the server, and the eye white and the pupil are segmented through image analysis.
- The one-to-one mapping between the eye reference frame and the reference frame of the reference points is fitted in different functional and matrix forms, the position and orientation information of the pupil are obtained, and the user's visual gaze point in the virtual space is calculated in real time.
- The application environment of the invention is eye tracking inside a virtual reality immersive helmet, using a geometric model of the near-field line of sight. The environment tracks no content other than the eye region and is controllable, protecting the user's personal information.
- The interaction does not reveal the user's surroundings and is convenient and easy to use; because a geometric line-of-sight model is used, the visual optical path reconstruction parameters of the user's lens, pupil, cornea, vitreous body, and so on need not be calculated, so the computation is small and the implementation is simple.
- an embodiment of the present invention may be implemented in various hardware, software code or combinations of both.
- An embodiment of the present invention may also be implemented as program code executing the above method on a digital signal processor (DSP).
- the invention may also relate to various functions performed by a computer processor, digital signal processor, microprocessor or Field Programmable Gate Array (FPGA).
- the above described processor may be configured to perform specific tasks in accordance with the present invention, which are accomplished by executing machine readable software code or firmware code that defines a particular method disclosed herein.
- Software code or firmware code can be developed into different programming languages and different formats or forms. Software code can also be compiled for different target platforms. However, different code patterns, types, and languages of software code and other types of configuration code for performing tasks in accordance with the present invention do not depart from the spirit and scope of the present invention.
Claims (10)
- 1. A head-mounted visual device capable of human eye tracking, characterized in that the head-mounted visual device comprises: a virtual reality helmet (10) for accommodating the head-mounted visual device; a light source (20), disposed in the virtual reality helmet (10), for illuminating the eyeball of a human eye; and a miniature camera (30), disposed in the virtual reality helmet (10), for collecting eyeball image information of the human eye, so that a server determines the orientation information of the pupil of the human eye according to the eyeball image information.
- 2. The head-mounted visual device capable of human eye tracking according to claim 1, characterized in that the server calculates the orientation information of the pupil of the human eye according to the spatial positional relationship among the miniature camera (30), the reference points, and the eyeball.
- 3. The head-mounted visual device capable of human eye tracking according to claim 2, characterized in that the number of the reference points is four.
- 4. The head-mounted visual device capable of human eye tracking according to claim 1, characterized in that the light source (20) specifically comprises: a first LED light source (201), disposed at the left lens edge of the virtual reality helmet (10), for illuminating the left eyeball; and a second LED light source (202), disposed at the right lens edge of the virtual reality helmet (10), for illuminating the right eyeball.
- 5. The head-mounted visual device capable of human eye tracking according to claim 1, characterized in that the miniature camera (30) specifically comprises: a first miniature camera (301), disposed at the left lens edge of the virtual reality helmet (10), for capturing eyeball image information of the left eye; and a second miniature camera (302), disposed at the right lens edge of the virtual reality helmet (10), for capturing eyeball image information of the right eye.
- 6. The head-mounted visual device capable of human eye tracking according to claim 5, characterized in that the server obtains a left-eye optical axis vector of the left-eye gaze orientation according to the eyeball image information of the left eye, obtains a right-eye optical axis vector of the right-eye gaze orientation according to the eyeball image information of the right eye, and determines the orientation information of the pupil of the human eye according to the intersection of the left-eye optical axis vector and the right-eye optical axis vector.
- 7. The head-mounted visual device capable of human eye tracking according to claim 1, characterized in that when the miniature camera (30) collects eyeball image information of the human eye, the light source (20) is instantaneously turned on and off.
- 8. A human eye tracking method for a head-mounted visual device, characterized in that the method comprises: illuminating the eyeball of a human eye with an LED light source; collecting eyeball image information of the human eye with a miniature camera; and determining the orientation information of the pupil of the human eye according to the eyeball image information using a spatial mapping relationship.
- 9. The human eye tracking method for a head-mounted visual device according to claim 8, characterized in that, before the step of illuminating the eyeball with the LED light source, the method further comprises: constructing the spatial mapping relationship among the miniature camera, the reference points, and the eyeball using a three-dimensional matrix.
- 10. The human eye tracking method for a head-mounted visual device according to claim 8, characterized in that the step of determining the orientation information of the pupil of the human eye according to the eyeball image information specifically comprises: collecting eyeball image information of the left eye and eyeball image information of the right eye; obtaining a left-eye optical axis vector of the left-eye gaze orientation according to the eyeball image information of the left eye, and obtaining a right-eye optical axis vector of the right-eye gaze orientation according to the eyeball image information of the right eye; and determining the orientation information of the pupil of the human eye according to the left-eye optical axis vector and the right-eye optical axis vector.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2016/103375 WO2018076202A1 (en) | 2016-10-26 | 2016-10-26 | Head-mounted display device that can perform eye tracking, and eye tracking method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018076202A1 true WO2018076202A1 (en) | 2018-05-03 |
Family
ID=62023004
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/103375 WO2018076202A1 (en) | 2016-10-26 | 2016-10-26 | Head-mounted display device that can perform eye tracking, and eye tracking method |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018076202A1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040150728A1 (en) * | 1997-12-03 | 2004-08-05 | Shigeru Ogino | Image pick-up apparatus for stereoscope |
CN103439794A (en) * | 2013-09-11 | 2013-12-11 | 百度在线网络技术(北京)有限公司 | Calibration method for head-mounted device and head-mounted device |
CN104603673A (en) * | 2012-09-03 | 2015-05-06 | Smi创新传感技术有限公司 | Head mounted system and method to compute and render stream of digital images using head mounted system |
CN104685541A (en) * | 2012-09-17 | 2015-06-03 | 感官运动仪器创新传感器有限公司 | Method and an apparatus for determining a gaze point on a three-dimensional object |
US20150160725A1 (en) * | 2013-12-10 | 2015-06-11 | Electronics And Telecommunications Research Institute | Method of acquiring gaze information irrespective of whether user wears vision aid and moves |
CN105393160A (en) * | 2013-06-28 | 2016-03-09 | 微软技术许可有限责任公司 | Camera auto-focus based on eye gaze |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111240464A (en) * | 2018-11-28 | 2020-06-05 | 简韶逸 | Eyeball tracking correction method and device |
CN111665932A (en) * | 2019-03-05 | 2020-09-15 | 宏达国际电子股份有限公司 | Head-mounted display device and eyeball tracking device thereof |
CN111665932B (en) * | 2019-03-05 | 2023-03-24 | 宏达国际电子股份有限公司 | Head-mounted display device and eyeball tracking device thereof |
CN110308794A (en) * | 2019-07-04 | 2019-10-08 | 郑州大学 | There are two types of the virtual implementing helmet of display pattern and the control methods of display pattern for tool |
CN110347260A (en) * | 2019-07-11 | 2019-10-18 | 歌尔科技有限公司 | A kind of augmented reality device and its control method, computer readable storage medium |
CN112540084A (en) * | 2019-09-20 | 2021-03-23 | 联策科技股份有限公司 | Appearance inspection system and inspection method |
CN110633014A (en) * | 2019-10-23 | 2019-12-31 | 哈尔滨理工大学 | Head-mounted eye movement tracking device |
CN110633014B (en) * | 2019-10-23 | 2024-04-05 | 常州工学院 | Head-wearing eye movement tracking device |
CN113362676A (en) * | 2020-03-04 | 2021-09-07 | 上海承尊器进多媒体科技有限公司 | Virtual reality driving system and method based on virtual reality |
CN111524175A (en) * | 2020-04-16 | 2020-08-11 | 东莞市东全智能科技有限公司 | Depth reconstruction and eye movement tracking method and system for asymmetric multiple cameras |
CN112633128A (en) * | 2020-12-18 | 2021-04-09 | 上海影创信息科技有限公司 | Method and system for pushing information of interested object in afterglow area |
CN112926521A (en) * | 2021-03-30 | 2021-06-08 | 青岛小鸟看看科技有限公司 | Eyeball tracking method and system based on light source on-off |
CN112926521B (en) * | 2021-03-30 | 2023-01-24 | 青岛小鸟看看科技有限公司 | Eyeball tracking method and system based on light source on-off |
US11863875B2 (en) | 2021-03-30 | 2024-01-02 | Qingdao Pico Technology Co., Ltd | Eyeball tracking method and system based on on-off of light sources |
CN113138664A (en) * | 2021-03-30 | 2021-07-20 | 青岛小鸟看看科技有限公司 | Eyeball tracking system and method based on light field perception |
CN113242384A (en) * | 2021-05-08 | 2021-08-10 | 聚好看科技股份有限公司 | Panoramic video display method and display equipment |
CN114209990A (en) * | 2021-12-24 | 2022-03-22 | 艾视雅健康科技(苏州)有限公司 | Method and device for analyzing effective work of medical device entering eye in real time |
CN114296233A (en) * | 2022-01-05 | 2022-04-08 | 京东方科技集团股份有限公司 | Display module, manufacturing method thereof and head-mounted display device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018076202A1 (en) | Head-mounted display device that can perform eye tracking, and eye tracking method | |
US11290706B2 (en) | Display systems and methods for determining registration between a display and a user's eyes | |
US10917634B2 (en) | Display systems and methods for determining registration between a display and a user's eyes | |
US9728010B2 (en) | Virtual representations of real-world objects | |
CN107991775B (en) | Head-mounted visual equipment capable of tracking human eyes and human eye tracking method | |
US9779512B2 (en) | Automatic generation of virtual materials from real-world materials | |
US9727132B2 (en) | Multi-visor: managing applications in augmented reality environments | |
CA2820950C (en) | Optimized focal area for augmented reality displays | |
JP5908491B2 (en) | Improved autofocus for augmented reality display | |
US20160131902A1 (en) | System for automatic eye tracking calibration of head mounted display device | |
CN112805659A (en) | Selecting depth planes for a multi-depth plane display system by user classification | |
US20140152558A1 (en) | Direct hologram manipulation using imu | |
CN108139806A (en) | Relative to the eyes of wearable device tracking wearer | |
JP2016507805A (en) | Direct interaction system for mixed reality environments | |
US11422620B2 (en) | Display systems and methods for determining vertical alignment between left and right displays and a user's eyes | |
CN112753037A (en) | Sensor fusion eye tracking | |
CN114581514A (en) | Method for determining fixation point of eyes and electronic equipment | |
WO2023195995A1 (en) | Systems and methods for performing a motor skills neurological test using augmented or virtual reality | |
JP2015013011A (en) | Visual field restriction image data creation program and visual field restriction apparatus using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16919734 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16919734 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 02.07.2019) |
|