WO2022205789A1 - Virtual reality-based eye tracking method and system - Google Patents

Virtual reality-based eye tracking method and system

Info

Publication number
WO2022205789A1
Authority
WO
WIPO (PCT)
Prior art keywords
tracking
eye
infrared light
light source
user
Prior art date
Application number
PCT/CN2021/118285
Other languages
English (en)
French (fr)
Inventor
吴涛
Original Assignee
青岛小鸟看看科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛小鸟看看科技有限公司 filed Critical 青岛小鸟看看科技有限公司
Priority to US17/878,023 (US11640201B2)
Publication of WO2022205789A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/20Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Definitions

  • the present disclosure relates to the technical field of virtual reality, and more particularly, to a method and system for eye tracking based on virtual reality.
  • virtual reality systems are becoming more and more common, and are used in many fields, such as computer games, health and safety, industry, and education and training.
  • mixed virtual reality systems are being integrated into mobile communication devices, game consoles, personal computers, movie theaters, theme parks, university labs, student classrooms, hospital exercise rooms, and more.
  • virtual reality is a form of reality that is adjusted in some way before being presented to the user, and may include virtual reality (VR), augmented reality (AR), mixed reality (MR), or some combination and/or derivative thereof.
  • a typical virtual reality system includes one or more devices configured to present and display content to a user.
  • a virtual reality system may include a head mounted display (HMD) worn by a user and configured to output virtual reality content to the user.
  • a virtual reality system configured as an all-in-one device is currently the most popular, that is, hardware such as the mobile computing unit and the image and graphics renderer is integrated into a single device. As all-in-one virtual reality devices are applied and popularized in many fields, some scenarios place relatively high demands on quality parameters such as the image clarity of the rendered content, which poses no small challenge to the processing and rendering capabilities of the device's mobile hardware.
  • existing eye tracking technology mainly installs two eye tracking modules at the left-eye and right-eye positions of the all-in-one virtual reality device's screen, using the same light source in both modules.
  • as a result, the light emitted by the light sources of the two eye tracking modules easily interferes during calibration or use, especially for users wearing myopia glasses, which increases the error of the calculated results and affects the positional accuracy of eye tracking.
  • the purpose of the present disclosure is to provide a virtual reality-based eye tracking method and system, so as to solve the problem that two eye tracking modules installed at the left-eye and right-eye positions of the all-in-one virtual reality device's screen, each equipped with the same light source, emit light that easily interferes during calibration or use, especially for users wearing myopia glasses, increasing the error of the calculated results and affecting the positional accuracy of eye tracking.
  • a virtual reality-based eye tracking method provided by the present disclosure includes:
  • the reflected infrared light of the lit left infrared light source is captured by the left tracking camera, and the reflected infrared light of the lit right infrared light source is captured by the right tracking camera, so that monocular tracking data is formed in each specific frame; the tracking data of the other eye in that frame is then calculated from the interpupillary distance and the monocular tracking data; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame;
  • the tracking data of one eye and the derived tracking data of the other eye are arranged in the time order of the specific frames to form binocular tracking data, completing the eye tracking.
  • the process of obtaining the user's interpupillary distance from the eyeball calibration data includes:
  • left-eye calibration data is formed by the left tracking camera capturing the user's action of adjusting the left eye according to the eyeball calibration data;
  • right-eye calibration data is formed by the right tracking camera capturing the user's action of adjusting the right eye according to the eyeball calibration data;
  • fitting is performed on the left-eye calibration data and the right-eye calibration data according to the relative positional relationship between the left tracking camera and the right tracking camera to obtain the distance between the centroids of the user's eyes, thereby generating the interpupillary distance; a sketch of this computation follows below.
  • the process by which the reflected infrared light of the lit left infrared light source is captured by the left tracking camera, and the reflected infrared light of the lit right infrared light source is captured by the right tracking camera, so as to form monocular tracking data in each specific frame, includes:
  • causing the left infrared light source and the right infrared light source to emit infrared light toward the user's left eye and right eye respectively, so that the infrared light forms reflected infrared light in the user's left and right eyes;
  • capturing the reflected infrared light sequentially, frame by frame, and forming monocular tracking data in each specific frame from the relative positions of the reflected infrared light by means of computer vision techniques.
  • the left infrared light source and the right infrared light source are lit alternately according to the odd/even parity of the specific frames, as sketched below.
  • the present disclosure also provides a virtual reality-based eye tracking system, configured to implement the aforementioned virtual reality-based eye tracking method, including a display provided in an all-in-one virtual reality device, and a processor, a monocular tracking module, and infrared light sources built into the device; the processor includes an interpupillary distance acquisition module and a tracking calculation module; the monocular tracking module includes a left tracking camera and a right tracking camera; the infrared light sources include a left infrared light source and a right infrared light source;
  • the display is configured to present eyeball calibration data to both eyes of the user;
  • the interpupillary distance acquisition module is configured to obtain the interpupillary distance of the user's eyes;
  • the left tracking camera is configured to capture the reflected infrared light of the lit left infrared light source, and the right tracking camera is configured to capture the reflected infrared light of the lit right infrared light source, so as to form monocular tracking data in each specific frame; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame;
  • the tracking calculation module is configured to calculate the tracking data of the other eye in the specific frame from the interpupillary distance and the monocular tracking data, and is further configured to arrange the monocular tracking data and the tracking data of the other eye in the time order of the specific frames to form binocular tracking data, completing the eye tracking.
  • the left tracking camera and the right tracking camera are built into the all-in-one virtual reality device at positions corresponding to the user's left and right eyes;
  • the left infrared light source and the right infrared light source are arranged around the left tracking camera and the right tracking camera, respectively.
  • the left tracking camera is further configured to capture the user's action of adjusting the left eye according to the eyeball calibration data to form left-eye calibration data, and the right tracking camera is further configured to capture the user's action of adjusting the right eye according to the eyeball calibration data to form right-eye calibration data, so that the interpupillary distance acquisition module acquires the left-eye calibration data and the right-eye calibration data and fits them according to the relative positional relationship between the left tracking camera and the right tracking camera to obtain the distance between the centroids of the user's eyes, thereby generating the interpupillary distance.
  • a camera may also be included; the camera is configured to capture the user's action of adjusting both eyes according to the calibration data to obtain a user calibration image, so that the interpupillary distance acquisition module performs positioning analysis on the user calibration image to obtain the distance between the centroids of the user's eyes, thereby generating the interpupillary distance; a sketch of such an analysis follows below.
  • the shooting frame rate of the left tracking camera and the right tracking camera is 60 Hz.
  • the virtual reality-based eye tracking method and system first present eyeball calibration data to both eyes of the user through the display, and then obtain the user's interpupillary distance from the eyeball calibration data; the left tracking camera then captures the reflected infrared light of the lit left infrared light source, and the right tracking camera captures the reflected infrared light of the lit right infrared light source, so that monocular tracking data is formed in each specific frame, and the tracking data of the other eye in that frame is calculated from the interpupillary distance and the monocular tracking data; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame; the monocular tracking data and the tracking data of the other eye are then arranged in the time order of the specific frames to form binocular tracking data, completing the eye tracking. Since only one of the two light sources can be lit in any given frame, there is no mutual interference, which solves the problems in binocular eye tracking of light sources interfering with each other and of large calculation errors that affect the positional accuracy of eye tracking.
  • FIG. 1 is a flowchart of an eye tracking method based on virtual reality according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of a virtual reality-based eye tracking system according to an embodiment of the present disclosure.
  • when two eye tracking modules are installed at the left-eye and right-eye positions of the all-in-one virtual reality screen, with the same light source arranged in both modules, the emitted light easily interferes, especially for users wearing myopia glasses, which increases the error of the calculated results and affects the positional accuracy of eye tracking.
  • the present disclosure provides an eye tracking method and system based on virtual reality, and the specific embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
  • FIG. 1 exemplarily illustrates the virtual reality-based eye tracking method according to an embodiment of the present disclosure;
  • FIG. 2 exemplarily illustrates the virtual reality-based eye tracking system according to an embodiment of the present disclosure.
  • the virtual reality-based eye tracking method includes:
  • S110 Present eyeball calibration data to both eyes of the user through the display;
  • S120 Obtain the user's interpupillary distance from the eyeball calibration data;
  • S130 Capture the reflected infrared light of the lit left infrared light source with the left tracking camera, and the reflected infrared light of the lit right infrared light source with the right tracking camera, so as to form monocular tracking data in each specific frame, and calculate the tracking data of the other eye in that frame from the interpupillary distance and the monocular tracking data; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame;
  • S140 Arrange the tracking data of one eye and the derived tracking data of the other eye in the time order of the specific frames to form binocular tracking data, completing the eye tracking. The four steps are sketched end to end below.
  • the process of obtaining the interpupillary distance of the user through the eyeball calibration data may include:
  • S1-121 Use the camera to capture the user's actions of adjusting the eyes according to the calibration data to obtain the user's calibration image;
  • S1-122 Perform positioning analysis on the user calibration image to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
  • the process of obtaining the interpupillary distance of the user through the eyeball calibration data may alternatively include:
  • S2-121 Use the left tracking camera to capture the user's movements of adjusting the left eye according to the eyeball calibration data to form the left eye calibration data; use the right tracking camera to capture the user's movements to adjust the right eye according to the eyeball calibration data to form the right eye calibration data;
  • S2-122 Perform fitting processing on the left eye calibration data and the right eye calibration data according to the relative positional relationship between the left tracking camera and the right tracking camera to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
  • in step S130, the reflected infrared light of the lit left infrared light source is captured by the left tracking camera, and the reflected infrared light of the lit right infrared light source is captured by the right tracking camera; the process of forming monocular tracking data in each specific frame includes:
  • S131 Make the left infrared light source and the right infrared light source emit infrared light to the left and right eyes of the user respectively, so that the infrared light forms reflected infrared light in the left and right eyes of the user;
  • S132 Capture the reflected infrared light sequentially, frame by frame, and use computer vision techniques to form monocular tracking data in each specific frame from the relative positions of the reflected infrared light; a sketch of this step follows below.
  • step S130 uses the left tracking camera to capture the reflected infrared light of the lit left infrared light source and the right tracking camera to capture the reflected infrared light of the lit right infrared light source, so that the tracking data of one eye is formed in each specific frame, and the tracking data of the other eye in that frame is calculated from the interpupillary distance and the monocular tracking data; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame.
  • throughout step S130, only the left infrared light source may be lit, or only the right infrared light source may be lit, or the two may be lit in a staggered, random pattern; in this embodiment, the left infrared light source and the right infrared light source are lit alternately according to the odd/even parity of the specific frames.
  • taking the case where the left infrared light source is lit first as an example: the left infrared light source is lit in the first frame and the right infrared light source is lit in the second frame;
  • that is, the left infrared light source is lit in odd-numbered specific frames,
  • and the right infrared light source is lit in even-numbered specific frames;
  • while one infrared light source is lit, the other must be off.
  • when the left infrared light source is lit, the left tracking camera captures the reflected infrared light in the user's left eyeball to obtain left-eye tracking data.
  • at that moment the right infrared light source is off; the corresponding right tracking camera may or may not capture an image, and even if it does, the captured data will not be clear because its light source is off, so data captured synchronously with the left tracking camera has no reference value.
  • although the reflected infrared light of the right infrared light source is not captured in that specific frame (an odd-numbered frame in this embodiment), the tracking data of the other eye (the right eye in this embodiment's odd frames) is calculated from the interpupillary distance obtained in step S120 and the monocular tracking data (here, the left-eye tracking data), so that the tracking data of both the left and right eyes is obtained for that frame; a sketch of this derivation follows below.
  • step S140 arranges the tracking data of one eye and the derived tracking data of the other eye in the time order of the specific frames to form binocular tracking data, completing the eye tracking.
  • once step S130 has obtained the tracking data of both the left and right eyes in each specific frame, the left-eye tracking data and the right-eye tracking data are arranged in the order of the first frame, the second frame, the third frame, and so on, to form the binocular tracking data, as sketched below.
  • as described above, the virtual reality-based eye tracking method first presents eyeball calibration data to both eyes of the user through the display, and then obtains the user's interpupillary distance from the eyeball calibration data; the left tracking camera then captures the reflected infrared light of the lit left infrared light source, and the right tracking camera captures the reflected infrared light of the lit right infrared light source, so that monocular tracking data is formed in each specific frame and the tracking data of the other eye in that frame is calculated from the interpupillary distance and the monocular tracking data; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame; the monocular tracking data and the tracking data of the other eye are then arranged in the time order of the specific frames to form binocular tracking data, completing the eye tracking. Since only one of the two infrared light sources can be lit in any given frame, there is no mutual interference, which solves the problems in binocular eye tracking of light sources easily interfering with each other and of large calculation errors that affect the positional accuracy of eye tracking.
  • the present disclosure also provides a virtual reality-based eye tracking system 100, configured to implement the aforementioned virtual reality-based eye tracking method, including a display 110 provided in the all-in-one virtual reality device, and a processor 120, a monocular tracking module 130, and infrared light sources 140 built into the device; the processor 120 includes an interpupillary distance acquisition module 121 and a tracking calculation module 122; the monocular tracking module 130 includes a left tracking camera 131 and a right tracking camera 132; the infrared light sources 140 include a left infrared light source 141 and a right infrared light source 142;
  • the display 110 is configured to present eyeball calibration data to both eyes of the user;
  • the interpupillary distance acquisition module 121 is configured to obtain the user's interpupillary distance;
  • the left tracking camera 131 is configured to capture the reflected infrared light of the lit left infrared light source 141, and the right tracking camera 132 is configured to capture the reflected infrared light of the lit right infrared light source 142, so as to form monocular tracking data in each specific frame; only one of the left infrared light source 141 and the right infrared light source 142 can be lit in the same specific frame;
  • the tracking calculation module 122 is configured to calculate the tracking data of the other eye in the specific frame from the interpupillary distance and the monocular tracking data, and is further configured to arrange the monocular tracking data and the tracking data of the other eye in the time order of the specific frames to form binocular tracking data, completing the eye tracking.
  • the positions of the left tracking camera 131 and the right tracking camera 132 are not specifically limited; in this embodiment they are built into the all-in-one virtual reality device at positions corresponding to the user's left and right eyes, and the left infrared light source 141 and the right infrared light source 142 are arranged around the left tracking camera 131 and the right tracking camera 132, respectively;
  • the left tracking camera 131 is further configured to capture the user's action of adjusting the left eye according to the eyeball calibration data to form left-eye calibration data, and the right tracking camera 132 is further configured to capture the user's action of adjusting the right eye according to the eyeball calibration data to form right-eye calibration data, so that the interpupillary distance acquisition module 121 acquires the left-eye calibration data and the right-eye calibration data and fits them according to the relative positional relationship between the left tracking camera and the right tracking camera to obtain the distance between the centroids of the user's eyes, thereby generating the interpupillary distance.
  • the virtual reality-based eye tracking system shown in FIG. 2 may further include a camera (not shown in the figure) configured to capture the user's action of adjusting both eyes according to the calibration data to obtain a user calibration image, so that the interpupillary distance acquisition module 121 performs positioning analysis on the user calibration image to obtain the distance between the centroids of the user's eyes and generate the interpupillary distance, from which the tracking data of the other eye in a specific frame is then calculated together with the monocular tracking data.
  • as can be seen from the above embodiments, the virtual reality-based eye tracking method and system first present eyeball calibration data to both eyes of the user through the display, and then obtain the user's interpupillary distance from the eyeball calibration data; the left tracking camera then captures the reflected infrared light of the lit left infrared light source, and the right tracking camera captures the reflected infrared light of the lit right infrared light source, so that monocular tracking data is formed in each specific frame and the tracking data of the other eye in that frame is calculated from the interpupillary distance and the monocular tracking data; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame; the monocular tracking data and the tracking data of the other eye are then arranged in the time order of the specific frames to form binocular tracking data, completing the eye tracking. Since only one of the two infrared light sources can be lit in any given frame, there is no mutual interference, which solves the problems in binocular eye tracking of light sources easily interfering with each other and of large calculation errors that affect the positional accuracy of eye tracking.
  • the modules or steps of the present disclosure can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed across a network composed of multiple computing devices;
  • they can be implemented in program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device, and in some cases the steps shown or described can be performed in a different order than here;
  • alternatively, the steps can each be made into individual integrated circuit modules, or multiple modules or steps among them can be made into a single integrated circuit module.
  • the present disclosure is not limited to any particular combination of hardware and software.
  • the virtual reality-based eye tracking method has the following beneficial effect: since only one of the left infrared light source and the right infrared light source can be lit in the same specific frame, there is no mutual interference, which solves the problems in binocular eye tracking of the light emitted by the light sources easily interfering with each other and of large errors in the calculated results.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Eye Examination Apparatus (AREA)
  • Position Input By Displaying (AREA)

Abstract

A virtual reality-based eye tracking method and system. The method first presents eyeball calibration data to both eyes of a user through a display (S110), then obtains the user's interpupillary distance from the eyeball calibration data (S120); the left tracking camera then captures the reflected infrared light of the lit left infrared light source, and the right tracking camera captures the reflected infrared light of the lit right infrared light source, so that monocular tracking data is formed in each specific frame, and the tracking data of the other eye in that frame is calculated from the interpupillary distance and the monocular tracking data (S130); the monocular tracking data and the tracking data of the other eye are then arranged in the time order of the specific frames to form binocular tracking data, completing the eye tracking (S140). Because only one of the left infrared light source and the right infrared light source can be lit in the same specific frame, there is no mutual interference, which solves the problems in binocular eye tracking of the light emitted by the light sources easily interfering with each other and of large errors in the calculated results.

Description

Virtual reality-based eye tracking method and system
Technical Field
The present disclosure relates to the technical field of virtual reality, and more particularly, to a virtual reality-based eye tracking method and system.
Background
With the advance of technology and the diversified development of market demand, virtual reality systems are becoming increasingly common and are applied in many fields, such as computer games, health and safety, industry, and education and training. To give a few examples, mixed virtual reality systems are being integrated into every corner of life: mobile communication devices, game consoles, personal computers, movie theaters, theme parks, university laboratories, student classrooms, hospital exercise rooms, and more.
Generally speaking, virtual reality is a form of reality that has been adjusted in some way before being presented to the user, and may include virtual reality (VR), augmented reality (AR), mixed reality (MR), or some combination and/or derivative thereof.
A typical virtual reality system includes one or more devices configured to present and display content to a user. For example, a virtual reality system may include a head-mounted display (HMD) worn by the user and configured to output virtual reality content to the user. The currently most popular configuration is the all-in-one virtual reality system, in which the mobile computing unit, the image and graphics renderer, and other hardware are all integrated into a single device. As all-in-one virtual reality devices are applied and popularized in many fields, some scenarios place relatively high demands on quality parameters such as the image clarity of the rendered content, which poses no small challenge to the processing and rendering capabilities of the device's mobile hardware.
Existing eye tracking technology mainly installs two eye tracking modules at the left-eye and right-eye positions of the all-in-one virtual reality device's screen, with the same light source used in both modules. As a result, during calibration or use, the light emitted by the light sources of the two eye tracking modules easily interferes, especially for users wearing myopia glasses, which increases the error of the calculated results and affects the positional accuracy of eye tracking.
Therefore, a virtual reality-based eye tracking method and system is urgently needed that can effectively avoid the mutual interference of the light sources of the two eye tracking modules during calibration or use, and improve tracking accuracy and stability.
Summary
In view of the above problems, the purpose of the present disclosure is to provide a virtual reality-based eye tracking method and system, so as to solve the problem that two eye tracking modules installed at the left-eye and right-eye positions of the all-in-one virtual reality device's screen, each arranged with the same light source, emit light that easily interferes during calibration or use, especially for users wearing myopia glasses, which increases the error of the calculated results and affects the positional accuracy of eye tracking.
The virtual reality-based eye tracking method provided by the present disclosure includes:
presenting eyeball calibration data to both eyes of a user through a display;
obtaining the user's interpupillary distance from the eyeball calibration data;
capturing the reflected infrared light of the lit left infrared light source with a left tracking camera, and capturing the reflected infrared light of the lit right infrared light source with a right tracking camera, so as to form monocular tracking data in each specific frame, and calculating the tracking data of the other eye in the specific frame from the interpupillary distance and the monocular tracking data; wherein only one of the left infrared light source and the right infrared light source can be lit in the same specific frame;
arranging the monocular tracking data and the tracking data of the other eye in the time order of the specific frames to form binocular tracking data to complete the eye tracking.
Preferably, the process of obtaining the user's interpupillary distance from the eyeball calibration data includes:
capturing, with a camera, the user's action of adjusting both eyes according to the calibration data to obtain a user calibration image;
performing positioning analysis on the user calibration image to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
Preferably, the process of obtaining the user's interpupillary distance from the eyeball calibration data includes:
capturing, with the left tracking camera, the user's action of adjusting the left eye according to the eyeball calibration data to form left-eye calibration data; capturing, with the right tracking camera, the user's action of adjusting the right eye according to the eyeball calibration data to form right-eye calibration data;
fitting the left-eye calibration data and the right-eye calibration data according to the relative positional relationship between the left tracking camera and the right tracking camera to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
Preferably, the process of capturing the reflected infrared light of the lit left infrared light source with the left tracking camera, and capturing the reflected infrared light of the lit right infrared light source with the right tracking camera, so as to form monocular tracking data in each specific frame, includes:
causing the left infrared light source and the right infrared light source to emit infrared light toward the user's left eye and right eye respectively, so that the infrared light forms reflected infrared light in the user's left and right eyes;
capturing the reflected infrared light sequentially, frame by frame, and forming monocular tracking data in each specific frame from the relative positions of the reflected infrared light by means of computer vision techniques.
Preferably, the left infrared light source and the right infrared light source are lit alternately according to the odd/even parity of the specific frames.
The present disclosure also provides a virtual reality-based eye tracking system, configured to implement the aforementioned virtual reality-based eye tracking method, including a display provided in an all-in-one virtual reality device, and a processor, a monocular tracking module, and infrared light sources built into the all-in-one virtual reality device, wherein the processor includes an interpupillary distance acquisition module and a tracking calculation module; the monocular tracking module includes a left tracking camera and a right tracking camera; and the infrared light sources include a left infrared light source and a right infrared light source;
the display is configured to present eyeball calibration data to both eyes of the user;
the interpupillary distance acquisition module is configured to obtain the user's interpupillary distance;
the left tracking camera is configured to capture the reflected infrared light of the lit left infrared light source, and the right tracking camera is configured to capture the reflected infrared light of the lit right infrared light source, so as to form monocular tracking data in each specific frame; wherein only one of the left infrared light source and the right infrared light source can be lit in the same specific frame;
the tracking calculation module is configured to calculate the tracking data of the other eye in the specific frame from the interpupillary distance and the monocular tracking data, and is further configured to arrange the monocular tracking data and the tracking data of the other eye in the time order of the specific frames to form binocular tracking data to complete the eye tracking.
Preferably, the left tracking camera and the right tracking camera are built into the all-in-one virtual reality device at positions corresponding to the user's left eye and right eye;
the left infrared light source and the right infrared light source are arranged around the left tracking camera and the right tracking camera, respectively.
Preferably, the left tracking camera is further configured to capture the user's action of adjusting the left eye according to the eyeball calibration data to form left-eye calibration data, and the right tracking camera is further configured to capture the user's action of adjusting the right eye according to the eyeball calibration data to form right-eye calibration data, so that the interpupillary distance acquisition module is configured to acquire the left-eye calibration data and the right-eye calibration data, and to fit them according to the relative positional relationship between the left tracking camera and the right tracking camera to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
Preferably, the system further includes a camera,
the camera being configured to capture the user's action of adjusting both eyes according to the calibration data to obtain a user calibration image, so that the interpupillary distance acquisition module performs positioning analysis on the user calibration image to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
Preferably, the shooting frame rate of the left tracking camera and the right tracking camera is 60 Hz.
As can be seen from the above technical solutions, the virtual reality-based eye tracking method and system provided by the present disclosure first present eyeball calibration data to both eyes of the user through the display, and then obtain the user's interpupillary distance from the eyeball calibration data; the left tracking camera then captures the reflected infrared light of the lit left infrared light source, and the right tracking camera captures the reflected infrared light of the lit right infrared light source, so that monocular tracking data is formed in each specific frame, and the tracking data of the other eye in the specific frame is calculated from the interpupillary distance and the monocular tracking data; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame; the monocular tracking data and the tracking data of the other eye are then arranged in the time order of the specific frames to form binocular tracking data to complete the eye tracking. Since only one of the left infrared light source and the right infrared light source can be lit in the same specific frame, there is no mutual interference, which solves the problems in binocular eye tracking of the light emitted by the light sources easily interfering with each other and of large errors in the calculated results that affect the positional accuracy of eye tracking.
Brief Description of the Drawings
Other objects and results of the present disclosure will become more apparent and more readily understood by reference to the following description taken in conjunction with the accompanying drawings, and as the present disclosure is more fully understood. In the drawings:
FIG. 1 is a flowchart of a virtual reality-based eye tracking method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a virtual reality-based eye tracking system according to an embodiment of the present disclosure.
Detailed Description
When two eye tracking modules are installed at the left-eye and right-eye positions of the all-in-one virtual reality device's screen, each arranged with the same light source, the light emitted by the light sources of the two modules easily interferes during calibration or use, especially for users wearing myopia glasses, which increases the error of the calculated results and affects the positional accuracy of eye tracking.
In view of the above problems, the present disclosure provides a virtual reality-based eye tracking method and system; specific embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
To illustrate the virtual reality-based eye tracking method and system provided by the present disclosure, FIG. 1 exemplarily illustrates the virtual reality-based eye tracking method according to an embodiment of the present disclosure, and FIG. 2 exemplarily illustrates the virtual reality-based eye tracking system according to an embodiment of the present disclosure.
The following description of the exemplary embodiments is merely illustrative and in no way limits the present disclosure or its application or use. Techniques and equipment known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, such techniques and equipment should be regarded as part of the specification.
As shown in FIG. 1, the virtual reality-based eye tracking method according to the embodiment of the present disclosure includes:
S110: presenting eyeball calibration data to both eyes of the user through the display;
S120: obtaining the user's interpupillary distance from the eyeball calibration data;
S130: capturing the reflected infrared light of the lit left infrared light source with the left tracking camera, and capturing the reflected infrared light of the lit right infrared light source with the right tracking camera, so as to form monocular tracking data in each specific frame, and calculating the tracking data of the other eye in the specific frame from the interpupillary distance and the monocular tracking data; wherein only one of the left infrared light source and the right infrared light source can be lit in the same specific frame;
S140: arranging the monocular tracking data and the tracking data of the other eye in the time order of the specific frames to form binocular tracking data to complete the eye tracking.
As shown in FIG. 1, in step S120, the process of obtaining the user's interpupillary distance from the eyeball calibration data may include:
S1-121: capturing, with a camera, the user's action of adjusting both eyes according to the calibration data to obtain a user calibration image;
S1-122: performing positioning analysis on the user calibration image to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
As shown in FIG. 1, in step S120, the process of obtaining the user's interpupillary distance from the eyeball calibration data may alternatively include:
S2-121: capturing, with the left tracking camera, the user's action of adjusting the left eye according to the eyeball calibration data to form left-eye calibration data; capturing, with the right tracking camera, the user's action of adjusting the right eye according to the eyeball calibration data to form right-eye calibration data;
S2-122: fitting the left-eye calibration data and the right-eye calibration data according to the relative positional relationship between the left tracking camera and the right tracking camera to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
As shown in FIG. 1, in step S130, the process of capturing the reflected infrared light of the lit left infrared light source with the left tracking camera, and capturing the reflected infrared light of the lit right infrared light source with the right tracking camera, so as to form monocular tracking data in each specific frame, includes:
S131: causing the left infrared light source and the right infrared light source to emit infrared light toward the user's left eye and right eye respectively, so that the infrared light forms reflected infrared light in the user's left and right eyes;
S132: capturing the reflected infrared light sequentially, frame by frame, and forming monocular tracking data in each specific frame from the relative positions of the reflected infrared light by means of computer vision techniques.
In the embodiment shown in FIG. 1, step S130 captures the reflected infrared light of the lit left infrared light source with the left tracking camera and the reflected infrared light of the lit right infrared light source with the right tracking camera, so as to form monocular tracking data in each specific frame, and calculates the tracking data of the other eye in the specific frame from the interpupillary distance and the monocular tracking data; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame. Specifically, throughout step S130, only the left infrared light source may be lit, or only the right infrared light source may be lit, or the left and right infrared light sources may be lit in a staggered, random pattern. In this embodiment, the left and right infrared light sources are lit alternately according to the odd/even parity of the specific frames. Taking the case where the left infrared light source is lit first as an example: the left infrared light source is lit in the first frame and the right infrared light source is lit in the second frame; in other words, in this embodiment the left infrared light source is lit in odd-numbered specific frames and the right infrared light source is lit in even-numbered specific frames, and while one infrared light source is lit the other must be off. Specifically, when the left infrared light source is lit, the left tracking camera captures the reflected infrared light in the user's left eyeball to obtain left-eye tracking data; at this moment the right infrared light source is off, and the corresponding right tracking camera may or may not capture an image; even if it does, the captured data will not be clear because the right infrared light source is off, so data captured synchronously with the left tracking camera has no reference value. Although the reflected infrared light of the right infrared light source is not captured in that specific frame (an odd-numbered frame in this embodiment), the tracking data of the other eye (the right eye in the odd-numbered frames of this embodiment) is calculated from the interpupillary distance obtained in step S120 and the monocular tracking data (the left-eye tracking data in this embodiment), so that the tracking data of both the left and right eyes is obtained for that specific frame.
As shown in FIG. 1, step S140 arranges the monocular tracking data and the tracking data of the other eye in the time order of the specific frames to form binocular tracking data to complete the eye tracking: after step S130 has obtained the tracking data of both the left and right eyes in each specific frame, the left-eye tracking data and the right-eye tracking data are arranged in the order of the first frame, the second frame, the third frame, and so on, to form the binocular tracking data and complete the eye tracking.
As described above, the virtual reality-based eye tracking method provided by the present disclosure first presents eyeball calibration data to both eyes of the user through the display, and then obtains the user's interpupillary distance from the eyeball calibration data; the left tracking camera then captures the reflected infrared light of the lit left infrared light source, and the right tracking camera captures the reflected infrared light of the lit right infrared light source, so that monocular tracking data is formed in each specific frame, and the tracking data of the other eye in the specific frame is calculated from the interpupillary distance and the monocular tracking data; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame; the monocular tracking data and the tracking data of the other eye are then arranged in the time order of the specific frames to form binocular tracking data to complete the eye tracking. Since only one of the left infrared light source and the right infrared light source can be lit in the same specific frame, there is no mutual interference, which solves the problems in binocular eye tracking of the light emitted by the light sources easily interfering with each other and of large errors in the calculated results that affect the positional accuracy of eye tracking.
As shown in FIG. 2, the present disclosure also provides a virtual reality-based eye tracking system 100, configured to implement the aforementioned virtual reality-based eye tracking method, which includes a display 110 provided in the all-in-one virtual reality device, and a processor 120, a monocular tracking module 130, and infrared light sources 140 built into the all-in-one virtual reality device, wherein the processor 120 includes an interpupillary distance acquisition module 121 and a tracking calculation module 122; the monocular tracking module 130 includes a left tracking camera 131 and a right tracking camera 132; and the infrared light sources 140 include a left infrared light source 141 and a right infrared light source 142;
the display 110 is configured to present eyeball calibration data to both eyes of the user;
the interpupillary distance acquisition module 121 is configured to obtain the user's interpupillary distance;
the left tracking camera 131 is configured to capture the reflected infrared light of the lit left infrared light source 141, and the right tracking camera 132 is configured to capture the reflected infrared light of the lit right infrared light source 142, so as to form monocular tracking data in each specific frame; wherein only one of the left infrared light source 141 and the right infrared light source 142 can be lit in the same specific frame;
the tracking calculation module 122 is configured to calculate the tracking data of the other eye in the specific frame from the interpupillary distance and the monocular tracking data, and is further configured to arrange the monocular tracking data and the tracking data of the other eye in the time order of the specific frames to form binocular tracking data to complete the eye tracking.
In the embodiment shown in FIG. 2, the positions of the left tracking camera 131 and the right tracking camera 132 are not specifically limited; in this embodiment, the left tracking camera 131 and the right tracking camera 132 are built into the all-in-one virtual reality device at positions corresponding to the user's left eye and right eye, and the left infrared light source 141 and the right infrared light source 142 are arranged around the left tracking camera 131 and the right tracking camera 132, respectively.
In a specific embodiment, the left tracking camera 131 is further configured to capture the user's action of adjusting the left eye according to the eyeball calibration data to form left-eye calibration data, and the right tracking camera 132 is further configured to capture the user's action of adjusting the right eye according to the eyeball calibration data to form right-eye calibration data, so that the interpupillary distance acquisition module 121 is configured to acquire the left-eye calibration data and the right-eye calibration data, and to fit them according to the relative positional relationship between the left tracking camera and the right tracking camera to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
In another specific embodiment, the virtual reality-based eye tracking system shown in FIG. 2 further includes a camera (not shown in the figure) configured to capture the user's action of adjusting both eyes according to the calibration data to obtain a user calibration image, so that the interpupillary distance acquisition module 121 performs positioning analysis on the user calibration image to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance, and the tracking data of the other eye in the specific frame is then calculated from the interpupillary distance and the monocular tracking data to complete the eye tracking.
It can be seen from the above embodiments that the virtual reality-based eye tracking method and system provided by the present disclosure first present eyeball calibration data to both eyes of the user through the display, and then obtain the user's interpupillary distance from the eyeball calibration data; the left tracking camera then captures the reflected infrared light of the lit left infrared light source, and the right tracking camera captures the reflected infrared light of the lit right infrared light source, so that monocular tracking data is formed in each specific frame, and the tracking data of the other eye in the specific frame is calculated from the interpupillary distance and the monocular tracking data; only one of the left infrared light source and the right infrared light source can be lit in the same specific frame; the monocular tracking data and the tracking data of the other eye are then arranged in the time order of the specific frames to form binocular tracking data to complete the eye tracking. Since only one of the left infrared light source and the right infrared light source can be lit in the same specific frame, there is no mutual interference, which solves the problems in binocular eye tracking of the light emitted by the light sources easily interfering with each other and of large errors in the calculated results that affect the positional accuracy of eye tracking.
The virtual reality-based eye tracking method and system according to the present disclosure have been described above by way of example with reference to the accompanying drawings. However, those skilled in the art should understand that various improvements can be made to the virtual reality-based eye tracking method and system proposed above without departing from the content of the present disclosure. Therefore, the scope of protection of the present disclosure should be determined by the content of the appended claims.
For specific examples in this embodiment, reference may be made to the examples described in the above embodiments and exemplary implementations, which are not repeated here.
Obviously, those skilled in the art should understand that the modules or steps of the present disclosure described above can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed across a network composed of multiple computing devices; they can be implemented in program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; in some cases, the steps shown or described can be performed in a different order than here, or they can be made into individual integrated circuit modules, or multiple modules or steps among them can be made into a single integrated circuit module. In this way, the present disclosure is not limited to any specific combination of hardware and software.
The above are merely preferred embodiments of the present disclosure and are not intended to limit it; for those skilled in the art, the present disclosure may have various modifications and changes. Any modification, equivalent replacement, improvement, and the like made within the principles of the present disclosure shall be included within the scope of protection of the present disclosure.
Industrial Applicability
As described above, the virtual reality-based eye tracking method provided by the embodiments of the present disclosure has the following beneficial effect: since only one of the left infrared light source and the right infrared light source can be lit in the same specific frame, there is no mutual interference, which solves the problems in binocular eye tracking of the light emitted by the light sources easily interfering with each other and of large errors in the calculated results.

Claims (12)

  1. A virtual reality-based eye tracking method, comprising:
    presenting eyeball calibration data to both eyes of a user through a display;
    obtaining the user's interpupillary distance from the eyeball calibration data;
    capturing the reflected infrared light of a lit left infrared light source with a left tracking camera, and capturing the reflected infrared light of a lit right infrared light source with a right tracking camera, so as to form monocular tracking data in each specific frame, and calculating the tracking data of the other eye in the specific frame from the interpupillary distance and the monocular tracking data; wherein only one of the left infrared light source and the right infrared light source can be lit in the same specific frame;
    arranging the monocular tracking data and the tracking data of the other eye in the time order of the specific frames to form binocular tracking data to complete the eye tracking.
  2. The virtual reality-based eye tracking method of claim 1, wherein the process of obtaining the user's interpupillary distance from the eyeball calibration data comprises:
    capturing, with a camera, the user's action of adjusting both eyes according to the calibration data to obtain a user calibration image;
    performing positioning analysis on the user calibration image to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
  3. The virtual reality-based eye tracking method of claim 1, wherein the process of obtaining the user's interpupillary distance from the eyeball calibration data comprises:
    capturing, with the left tracking camera, the user's action of adjusting the left eye according to the eyeball calibration data to form left-eye calibration data; capturing, with the right tracking camera, the user's action of adjusting the right eye according to the eyeball calibration data to form right-eye calibration data;
    fitting the left-eye calibration data and the right-eye calibration data according to the relative positional relationship between the left tracking camera and the right tracking camera to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
  4. The virtual reality-based eye tracking method of claim 1, wherein the process of capturing the reflected infrared light of the lit left infrared light source with the left tracking camera, and capturing the reflected infrared light of the lit right infrared light source with the right tracking camera, so as to form monocular tracking data in each specific frame, comprises:
    causing the left infrared light source and the right infrared light source to emit infrared light toward the user's left eye and right eye respectively, so that the infrared light forms reflected infrared light in the user's left and right eyes;
    capturing the reflected infrared light sequentially, frame by frame, and forming monocular tracking data in each specific frame from the relative positions of the reflected infrared light by means of computer vision techniques.
  5. The virtual reality-based eye tracking method of claim 4, wherein
    the left infrared light source and the right infrared light source are lit alternately according to the odd/even parity of the specific frames.
  6. A virtual reality-based eye tracking system, configured to implement the virtual reality-based eye tracking method of any one of claims 1-5, comprising a display provided in an all-in-one virtual reality device, and a processor, a monocular tracking module, and infrared light sources built into the all-in-one virtual reality device, wherein the processor comprises an interpupillary distance acquisition module and a tracking calculation module; the monocular tracking module comprises a left tracking camera and a right tracking camera; and the infrared light sources comprise a left infrared light source and a right infrared light source;
    the display is configured to present eyeball calibration data to both eyes of the user;
    the interpupillary distance acquisition module is configured to obtain the user's interpupillary distance;
    the left tracking camera is configured to capture the reflected infrared light of the lit left infrared light source, and the right tracking camera is configured to capture the reflected infrared light of the lit right infrared light source, so as to form monocular tracking data in each specific frame; wherein only one of the left infrared light source and the right infrared light source can be lit in the same specific frame;
    the tracking calculation module is configured to calculate the tracking data of the other eye in the specific frame from the interpupillary distance and the monocular tracking data, and is further configured to arrange the monocular tracking data and the tracking data of the other eye in the time order of the specific frames to form binocular tracking data to complete the eye tracking.
  7. The virtual reality-based eye tracking system of claim 6, wherein
    the left tracking camera and the right tracking camera are built into the all-in-one virtual reality device at positions corresponding to the user's left eye and right eye;
    the left infrared light source and the right infrared light source are arranged around the left tracking camera and the right tracking camera, respectively.
  8. The virtual reality-based eye tracking system of claim 7, wherein
    the left tracking camera is further configured to capture the user's action of adjusting the left eye according to the eyeball calibration data to form left-eye calibration data, and the right tracking camera is further configured to capture the user's action of adjusting the right eye according to the eyeball calibration data to form right-eye calibration data, so that the interpupillary distance acquisition module is configured to acquire the left-eye calibration data and the right-eye calibration data, and to fit the left-eye calibration data and the right-eye calibration data according to the relative positional relationship between the left tracking camera and the right tracking camera to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
  9. The virtual reality-based eye tracking system of claim 7, further comprising a camera,
    the camera being configured to capture the user's action of adjusting both eyes according to the calibration data to obtain a user calibration image, so that the interpupillary distance acquisition module performs positioning analysis on the user calibration image to obtain the distance between the centroids of the user's eyes to generate the interpupillary distance.
  10. The virtual reality-based eye tracking system of claim 7, wherein
    the shooting frame rate of the left tracking camera and the right tracking camera is 60 Hz.
  11. A computer-readable storage medium having a computer program stored therein, wherein the computer program, when executed by a processor, implements the method of any one of claims 1 to 5, or implements the method of any one of claims 6-10.
  12. An electronic apparatus, comprising a memory and a processor, the memory storing a computer program, wherein the processor is configured to run the computer program to perform the method of any one of claims 1 to 5, or to perform the method of any one of claims 6-10.
PCT/CN2021/118285 2021-03-30 2021-09-14 Virtual reality-based eye tracking method and system WO2022205789A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/878,023 US11640201B2 (en) 2021-03-30 2022-07-31 Virtual reality-based eyeball tracking method and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110340595.5 2021-03-30
CN202110340595.5A CN112926523B (zh) 2021-03-30 Virtual reality-based eye tracking method and system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/878,023 Continuation US11640201B2 (en) 2021-03-30 2022-07-31 Virtual reality-based eyeball tracking method and system

Publications (1)

Publication Number Publication Date
WO2022205789A1 true WO2022205789A1 (zh) 2022-10-06

Family

ID=76176602

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/118285 WO2022205789A1 (zh) 2021-03-30 2022-10-06 Virtual reality-based eye tracking method and system

Country Status (3)

Country Link
US (1) US11640201B2 (zh)
CN (1) CN112926523B (zh)
WO (1) WO2022205789A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112416125A * 2020-11-17 2021-02-26 青岛小鸟看看科技有限公司 VR head-mounted all-in-one device
CN112926523B * 2021-03-30 2022-07-26 青岛小鸟看看科技有限公司 Virtual reality-based eye tracking method and system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104834381A * 2015-05-15 2015-08-12 中国科学院深圳先进技术研究院 Wearable device for gaze focus positioning and gaze focus positioning method
CN106598260A * 2017-02-06 2017-04-26 上海青研科技有限公司 Eye tracking device, and VR and AR equipment using the same
CN106768361A * 2016-12-19 2017-05-31 北京小鸟看看科技有限公司 Position tracking method and system for a handle matched with a VR headset
WO2018093102A1 * 2016-11-16 2018-05-24 (주)스코넥엔터테인먼트 Binocular photographing device using a single camera
CN207650482U * 2017-11-13 2018-07-24 北京七鑫易维信息技术有限公司 Binocular gaze tracking device that eliminates interfering light
CN112099615A * 2019-06-17 2020-12-18 北京七鑫易维科技有限公司 Gaze information determination method and apparatus, eye tracking device, and storage medium
CN112926521A * 2021-03-30 2021-06-08 青岛小鸟看看科技有限公司 Eye tracking method and system based on turning light sources on and off
CN112926523A * 2021-03-30 2021-06-08 青岛小鸟看看科技有限公司 Virtual reality-based eye tracking method and system

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2773775C (en) 2010-04-01 2017-03-07 Enermotion Inc. A system and method for storing thermal energy as auxiliary power in a vehicle
JP6083761B2 * 2012-05-25 2017-02-22 国立大学法人静岡大学 Pupil detection method, corneal reflection detection method, face pose detection method, and pupil tracking method
WO2015143073A1 (en) 2014-03-19 2015-09-24 Intuitive Surgical Operations, Inc. Medical devices, systems, and methods integrating eye gaze tracking for stereo viewer
US9361519B2 (en) * 2014-03-28 2016-06-07 Intel Corporation Computational array camera with dynamic illumination for eye tracking
US9936195B2 (en) * 2014-11-06 2018-04-03 Intel Corporation Calibration for eye tracking systems
KR101745140B1 * 2015-09-21 2017-06-08 현대자동차주식회사 Gaze tracking apparatus and method
US9983709B2 (en) 2015-11-02 2018-05-29 Oculus Vr, Llc Eye tracking using structured light
CN105496351B * 2015-12-30 2017-11-14 深圳市莫廷影像技术有限公司 Binocular optometry device
CN106874895B * 2017-03-31 2019-02-05 北京七鑫易维信息技术有限公司 Gaze tracking device and head-mounted display equipment
KR102523433B1 * 2018-01-10 2023-04-19 삼성전자주식회사 Method and optical system for determining depth information
US10867408B1 (en) * 2018-07-23 2020-12-15 Apple Inc. Estimation of spatial relationships between sensors of a multi-sensor device
KR102469762B1 * 2018-11-16 2022-11-22 한국전자통신연구원 Pupil tracking device and method for measuring the pupil center position and the proximity depth between an object and a pupil moving by optokinetic reflex
CN109542240B * 2019-02-01 2020-07-10 京东方科技集团股份有限公司 Eye tracking device and tracking method
CN110502100B * 2019-05-29 2020-09-29 中国人民解放军军事科学院军事医学研究院 Virtual reality interaction method and device based on eye movement tracking
US11475641B2 (en) * 2020-07-21 2022-10-18 Microsoft Technology Licensing, Llc Computer vision cameras for IR light detection


Also Published As

Publication number Publication date
CN112926523B (zh) 2022-07-26
US11640201B2 (en) 2023-05-02
CN112926523A (zh) 2021-06-08
US20220374076A1 (en) 2022-11-24

Similar Documents

Publication Publication Date Title
CN113240601B Single depth tracked accommodation-vergence solutions
CN109558012B Eye tracking method and device
US11455032B2 (en) Immersive displays
CN105992965B Stereoscopic display responsive to focal shift
WO2022205789A1 Virtual reality-based eye tracking method and system
WO2022205788A1 Eye tracking method and system based on turning light sources on and off
CN108762496B Information processing method and electronic device
WO2016108720A1 Method and device for displaying three-dimensional objects
JP7471572B2 Calibration system and calibration method
US11774759B2 (en) Systems and methods for improving binocular vision
JP7491926B2 Dynamic convergence adjustment in augmented reality headsets
KR101965393B1 Computing device providing visual perception training, and method and program for providing visual perception training based on a head-mounted display device
US9805612B2 (en) Interest-attention feedback method for separating cognitive awareness into different left and right sensor displays
WO2018144890A1 (en) Rendering extended video in virtual reality
US11715176B2 (en) Foveated rendering method and system of virtual reality system based on monocular eyeball tracking
WO2022205770A1 Eye tracking system and method based on light field perception
CN110755241A Visual training method, visual training device, and storage medium
CN110897841A Visual training method, visual training device, and storage medium
EP4083854A1 (en) System and method of head mounted display personalisation
TW202017368A Smart glasses, system, and method of use thereof
CN113855987B Three-dimensional content sleep-aid method, apparatus, device, and storage medium
US12001605B2 (en) Head mounted display with visual condition compensation
CN111757089A Method and system for rendering images using pupil-enhanced accommodation of the eye
Jones Peripheral visual cues and their effect on the perception of egocentric depth in virtual and augmented environments

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21934426

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21934426

Country of ref document: EP

Kind code of ref document: A1