CN112686927B - Human eye position regression calculation method - Google Patents

Human eye position regression calculation method

Info

Publication number
CN112686927B
CN112686927B
Authority
CN
China
Prior art keywords
human eye
frames
eye position
calculation method
motion
Prior art date
Legal status
Active
Application number
CN202011642099.7A
Other languages
Chinese (zh)
Other versions
CN112686927A (en)
Inventor
朱志林
潘博
孟乒乒
张伟香
方勇
Current Assignee
Shanghai Evis Technology Co ltd
Original Assignee
Shanghai Evis Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Evis Technology Co ltd
Priority to CN202011642099.7A
Publication of CN112686927A
Application granted
Publication of CN112686927B

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention discloses a human eye position regression calculation method, which comprises the following steps: acquiring the human eye position in the current frame through a trained detection model, and storing the acquired positions until the cache requirement of n+1 frames is met, at which point regression of the actual human eye position begins; taking the latest n/4 frames out of the buffered n+1 frames and calculating the average inter-frame offset value mBias; calculating the offset values $\mu_i$ of the buffered n+1 frames and taking $\frac{\alpha}{|\mu_i - \mathrm{mBias}| + \varepsilon}$ as the weights $\omega_i$ of the last n frames; fitting the motion-equation parameters by a weighted least squares method and, according to the measured delay from camera exposure to the system, regressing the actual human eye position after that delay; and calculating the corresponding grating parameters with the regressed human eye position as a parameter, so that the naked-eye 3D display projects the optimal 3D effect to the human eye position. The human eye position regression calculation method provided by the invention can accurately calculate the actual human eye position of the viewer and can provide an optimal viewing experience.

Description

Human eye position regression calculation method
Technical Field
The invention belongs to the technical field of human eye tracking, relates to a human eye position calculation method, and particularly relates to a human eye position regression calculation method.
Background
As naked-eye 3D display technology combined with eye tracking gradually matures, improving the agreement between the human eye position acquired by the camera and the actual eye position of the viewer has become a problem to be solved. In eye-tracking-based naked-eye 3D display, the eye position is detected and the grating parameters are changed accordingly, so that stereoscopic content is presented to the viewer. This display mode can give the viewer a higher-resolution experience, but it requires an accurate viewer eye position and must not lag behind the viewer's movement.
Because a delay exists between the camera's image capture and its transmission to the system, the accuracy of the viewer's eye position is affected; the eye position after the delay is therefore regressed, so that the calculated grating parameters correspond to the viewer's actual eye position. By regressing across the time delay, the lag in grating-parameter changes while the viewer is moving is avoided, and the detected viewer is guaranteed an optimal viewing experience.
In view of this, a new human eye position regression method is designed to overcome at least some of the above-mentioned drawbacks of the existing human eye position calculation methods.
Disclosure of Invention
The invention provides a human eye position regression calculation method which can accurately calculate the actual human eye position of a viewer and can provide the best viewing experience.
In order to solve the technical problems, according to one aspect of the present invention, the following technical scheme is adopted:
a human eye position regression calculation method, the human eye position regression calculation method comprising:
step S1, acquiring the human eye position in the current frame through a trained detection model, and storing the acquired positions until the cache requirement of n+1 frames is met, at which point regression of the actual human eye position begins;
s2, taking out the data in the buffered n+1 frames from the latest n/4 frames, and calculating an average offset value mBias between the frames;
step S3, calculating the offset values $\mu_i$ of the buffered n+1 frames, and taking $\frac{\alpha}{|\mu_i - \mathrm{mBias}| + \varepsilon}$ as the weights $\omega_i$ of the last n frames;
step S4, fitting the motion-equation parameters by a weighted least squares method, and regressing the actual human eye position after the delay according to the measured delay from camera exposure to the system;
and step S5, calculating the corresponding grating parameters with the regressed human eye position as a parameter, so that the naked-eye 3D display projects the optimal 3D effect to the human eye position.
As an embodiment of the present invention, in step S1, the human eye positions detected by the detection model are stored, and regression of the human eye position starts once the requirement of n+1 frames is satisfied; the value of n+1 can be dynamically adjusted according to the delay duration or the model detection time, and 20 frames are used here as the cache, but the method is not limited to this frame count.
As an embodiment of the present invention, in step S2, considering that the most recently updated frames best reflect the current viewer's motion state, the average offset value mBias of the latest n/4 frames is taken as the baseline of the current viewer's motion offset.
As an embodiment of the present invention, in step S3, the offsets $\mu_i$ between all n+1 frames are calculated; considering that the most recent frames are most likely to match the current motion, the weight $\omega_i$ of each frame is assigned by the following formula:

$$\omega_i = \frac{\alpha}{|\mu_i - \mathrm{mBias}| + \varepsilon}$$

where $\mu_i$ denotes the offset value corresponding to the i-th of the n+1 frames, mBias denotes the average offset value of the latest n/4 frames, $\alpha$ denotes a proportionality coefficient (an adjustable parameter here), and $\varepsilon$ denotes the jitter error introduced by detection during motion, a constant here.

Weights are allocated according to the absolute difference between each frame's offset value and the average offset value mBias, so that when the viewer's motion state changes, the fitted motion-equation parameters change promptly with the most recently detected frame offsets.
As an embodiment of the present invention, in step S4, the motion-equation parameters are fitted by a weighted least squares method, and the actual human eye position after the delay is regressed according to the measured delay from camera exposure to the system.

The motion equation is a quadratic equation in one variable; given the limited number of cached frames, a quadratic is sufficient to describe the viewer's motion state when fitting within the cache. The weighted least squares objective is:

$$\min_{a_0, a_1, a_2} \sum_{i=1}^{n} \omega_i \left( x_i - \left( a_0 + a_1 t_i + a_2 t_i^2 \right) \right)^2$$

where $\omega_i$ is the weight parameter, $a_0$, $a_1$ and $a_2$ are the motion-equation parameters to be fitted, and $x_i$, $t_i$ are the detected eye position and time of the i-th cached frame.

The delay from camera exposure to the system is the fixed latency of camera-to-system transmission. Since a naked-eye 3D display must project the left and right images to the viewer in real time, this delay causes a deviation between the projected position and the viewer's actual position, degrading the viewing effect.
As an embodiment of the present invention, in step S5, the regressed human eye position needs to be interpolated and converted into the grating parameters of the display, so that the left and right images projected through the grating land at the corresponding spatial positions; only with the human eye position regressed past the delay can the optimal viewing effect be achieved.
The invention has the beneficial effects that: the human eye position regression calculation method provided by the invention can accurately calculate the actual human eye position of the viewer and can provide the best viewing experience.
Drawings
Fig. 1 is a flowchart of a method for calculating regression of eye position according to an embodiment of the invention.
Detailed Description
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
For a further understanding of the present invention, preferred embodiments of the invention are described below in conjunction with the examples, but it should be understood that these descriptions are merely intended to illustrate further features and advantages of the invention, and are not limiting of the claims of the invention.
The description of this section is intended to be illustrative of only a few exemplary embodiments and the invention is not to be limited in scope by the description of the embodiments. It is also within the scope of the description and claims of the invention to interchange some of the technical features of the embodiments with other technical features of the same or similar prior art.
The description of the steps in the various embodiments in the specification is merely for convenience of description, and the implementation of the present application is not limited by the order in which the steps are implemented. "connected" in the specification includes both direct and indirect connections.
The invention discloses a human eye position regression calculation method, and FIG. 1 is a flow chart of the human eye position regression calculation method in an embodiment of the invention; referring to FIG. 1, the human eye position regression calculation method includes:
Step S1, acquiring the human eye position in the current frame through a trained detection model, and storing the acquired positions until the cache requirement of n+1 frames is met, at which point regression of the actual human eye position begins.
In one embodiment, the human eye positions detected by the detection model are stored, and regression of the human eye position starts once the requirement of n+1 frames is met. The value of n+1 can be dynamically adjusted according to the delay duration or the model detection time; 20 frames are used here as the cache, but the method is not limited to this frame count.
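As an illustration only, this buffering stage might be sketched in Python as follows; the detect_eyes callable standing in for the trained detection model is a hypothetical placeholder, and n = 19 reflects the 20-frame default mentioned above.

```python
from collections import deque

N = 19                           # n; the cache holds n + 1 = 20 frames
positions = deque(maxlen=N + 1)  # oldest frames are evicted automatically

def on_new_frame(frame, detect_eyes):
    """Cache the detected eye position; return True once n+1 frames are stored."""
    pos = detect_eyes(frame)        # (x, y) eye position from the detection model
    positions.append(pos)
    return len(positions) == N + 1  # regression may start only when the cache is full
```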
Step S2, taking the latest n/4 frames out of the buffered n+1 frames and calculating the average inter-frame offset value mBias.
In one embodiment, considering that the most recently updated frames best reflect the current viewer's motion state, the average offset value mBias of the latest n/4 frames is taken as the baseline of the current viewer's motion offset.
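A minimal sketch of this step, assuming the cached positions are 2-D coordinates and NumPy is available; the newest n/4 inter-frame differences are averaged to form mBias.

```python
import numpy as np

def mean_offset(positions):
    """Average inter-frame offset mBias over the newest n/4 offsets (step S2)."""
    p = np.asarray(positions, dtype=float)  # shape (n+1, 2): cached eye positions
    k = max(1, (len(p) - 1) // 4)           # number of recent offsets to average
    offsets = np.diff(p, axis=0)            # per-frame displacements, n rows
    return offsets[-k:].mean(axis=0)        # mBias, the motion-offset baseline
```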
Step S3, calculating the offset values $\mu_i$ of the buffered n+1 frames, and taking $\frac{\alpha}{|\mu_i - \mathrm{mBias}| + \varepsilon}$ as the weights $\omega_i$ of the last n frames.
In one embodiment, the offsets $\mu_i$ between all n+1 frames are calculated; considering that the most recent frames are most likely to match the current motion, the weight $\omega_i$ of each frame is assigned by the following formula:

$$\omega_i = \frac{\alpha}{|\mu_i - \mathrm{mBias}| + \varepsilon}$$

where $\mu_i$ denotes the offset value corresponding to the i-th of the n+1 frames, mBias denotes the average offset value of the latest n/4 frames, $\alpha$ denotes a proportionality coefficient (an adjustable parameter here), and $\varepsilon$ denotes the jitter error introduced by detection during motion, a constant here.

Weights are allocated according to the absolute difference between each frame's offset value and the average offset value mBias, so that when the viewer's motion state changes, the fitted motion-equation parameters change promptly with the most recently detected frame offsets.
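The weight assignment might be sketched as follows; alpha and eps stand for the adjustable coefficient α and the jitter constant ε above (the default values here are placeholders, not values from the patent), and the Euclidean norm stands in for the absolute difference when the offsets are 2-D.

```python
import numpy as np

def frame_weights(positions, m_bias, alpha=1.0, eps=1e-3):
    """Assign omega_i = alpha / (|mu_i - mBias| + eps) to the last n frames (step S3)."""
    p = np.asarray(positions, dtype=float)
    mu = np.diff(p, axis=0)                     # offsets mu_i between consecutive frames
    dev = np.linalg.norm(mu - m_bias, axis=-1)  # |mu_i - mBias|, one value per offset
    return alpha / (dev + eps)                  # omega_i, n weights in total
```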
Step S4, fitting the motion-equation parameters by a weighted least squares method, and regressing the actual human eye position after the delay according to the measured delay from camera exposure to the system.
In one embodiment, the motion-equation parameters are fitted by a weighted least squares method, and the actual human eye position after the delay is regressed according to the measured delay from camera exposure to the system.

The motion equation is a quadratic equation in one variable; given the limited number of cached frames, a quadratic is sufficient to describe the viewer's motion state when fitting within the cache. The weighted least squares objective is:

$$\min_{a_0, a_1, a_2} \sum_{i=1}^{n} \omega_i \left( x_i - \left( a_0 + a_1 t_i + a_2 t_i^2 \right) \right)^2$$

where $\omega_i$ is the weight parameter, $a_0$, $a_1$ and $a_2$ are the motion-equation parameters to be fitted, and $x_i$, $t_i$ are the detected eye position and time of the i-th cached frame.

The delay from camera exposure to the system is the fixed latency of camera-to-system transmission. Since a naked-eye 3D display must project the left and right images to the viewer in real time, this delay causes a deviation between the projected position and the viewer's actual position, degrading the viewing effect.
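A sketch of the fit-and-extrapolate step, under the assumptions that frames are uniformly spaced frame_dt apart and that the camera-to-system delay is known and fixed; the quadratic x(t) = a0 + a1·t + a2·t² is fitted per axis by folding the weights into an ordinary least squares solve.

```python
import numpy as np

def regress_position(positions, weights, frame_dt, delay):
    """Fit the quadratic motion equation by weighted least squares and
    extrapolate the eye position past the measured delay (step S4)."""
    p = np.asarray(positions, dtype=float)[1:]         # the n weighted frames
    t = np.arange(len(p)) * frame_dt                   # frame timestamps
    A = np.stack([np.ones_like(t), t, t * t], axis=1)  # quadratic design matrix
    w = np.sqrt(weights)[:, None]                      # sqrt-weights turn WLS into OLS
    coef, *_ = np.linalg.lstsq(w * A, w * p, rcond=None)  # rows of coef: a0, a1, a2
    t_pred = t[-1] + delay                             # time of the delayed, actual position
    return np.array([1.0, t_pred, t_pred**2]) @ coef   # regressed position after the delay
```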
Step S5, calculating the corresponding grating parameters with the regressed human eye position as a parameter, so that the naked-eye 3D display projects the optimal 3D effect to the human eye position.
In one embodiment, the regressed human eye position needs to be interpolated and converted into the grating parameters of the display, so that the left and right images projected through the grating land at the corresponding spatial positions; only with the human eye position regressed past the delay can the optimal viewing effect be achieved.
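The patent does not specify how this interpolation is performed; purely as a hypothetical illustration, a one-dimensional pre-measured calibration table could be interpolated like this (calib_x and calib_phase are assumed calibration arrays, not part of the original disclosure).

```python
import numpy as np

def grating_phase(eye_pos, calib_x, calib_phase):
    """Hypothetical step-S5 mapping: linearly interpolate the regressed eye
    x coordinate into a pre-measured grating-phase calibration table."""
    return np.interp(eye_pos[0], calib_x, calib_phase)  # calibration arrays are assumptions
```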
In summary, the human eye position regression calculation method provided by the invention can accurately calculate the actual human eye position of the viewer and can provide the best viewing experience.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware; for example, an Application Specific Integrated Circuit (ASIC), a general purpose computer, or any other similar hardware device may be employed. In some embodiments, the software programs of the present application may be executed by a processor to implement the above steps or functions. Likewise, the software programs of the present application (including related data structures) may be stored in a computer-readable recording medium; such as RAM memory, magnetic or optical drives or diskettes, and the like. In addition, some steps or functions of the present application may be implemented in hardware; for example, as circuitry that cooperates with the processor to perform various steps or functions.
The technical features of the above embodiments may be combined arbitrarily; for brevity, not all possible combinations are described here, but as long as a combination of technical features contains no contradiction, it should be considered within the scope of this description.
The description and applications of the present invention herein are illustrative and are not intended to limit the scope of the invention to the embodiments described above. Effects or advantages referred to in the embodiments may not be embodied in the embodiments due to interference of various factors, and description of the effects or advantages is not intended to limit the embodiments. Variations and modifications of the embodiments disclosed herein are possible, and alternatives and equivalents of the various components of the embodiments are known to those of ordinary skill in the art. It will be clear to those skilled in the art that the present invention may be embodied in other forms, structures, arrangements, proportions, and with other assemblies, materials, and components, without departing from the spirit or essential characteristics thereof. Other variations and modifications of the embodiments disclosed herein may be made without departing from the scope and spirit of the invention.

Claims (6)

1. A human eye position regression calculation method, characterized by comprising the following steps:
step S1, acquiring the human eye position in the current frame through a trained detection model, and storing the acquired positions until the cache requirement of n+1 frames is met, at which point regression of the actual human eye position begins;
s2, taking out the data in the buffered n+1 frames from the latest n/4 frames, and calculating an average offset value mBias between the frames;
step S3, calculating the offset values $\mu_i$ of the buffered n+1 frames, and taking $\frac{\alpha}{|\mu_i - \mathrm{mBias}| + \varepsilon}$ as the weights $\omega_i$ of the last n frames;
wherein $\mu_i$ denotes the offset value corresponding to the i-th of the n+1 frames, mBias denotes the average offset value of the latest n/4 frames, $\alpha$ denotes a proportionality coefficient (an adjustable parameter here), and $\varepsilon$ denotes the jitter error introduced by detection during motion, a constant here;
s4, fitting motion equation parameters through a weighted least square method, and regressing the actual human eye position after the delay according to the delay of the measured camera exposure to the system;
and step S5, calculating the corresponding grating parameters with the regressed human eye position as a parameter, so that the naked-eye 3D display projects the optimal 3D effect to the human eye position.
2. The human eye position regression calculation method according to claim 1, wherein:
in step S1, the human eye positions detected by the detection model are stored, and regression of the human eye position starts once the requirement of n+1 frames is met; and the value of n+1 is dynamically adjusted according to the delay duration or the model detection time.
3. The human eye position regression calculation method according to claim 1, wherein:
in step S2, the average offset value mBias of the latest n/4 frames is used as the baseline of the current viewer's motion offset.
4. The human eye position regression calculation method according to claim 1, wherein:
in step S3, the offsets $\mu_i$ between all n+1 frames are calculated; considering that the most recent frames are most likely to match the current motion, the weight $\omega_i$ of each frame is assigned by the following formula:

$$\omega_i = \frac{\alpha}{|\mu_i - \mathrm{mBias}| + \varepsilon}$$

and weights are allocated according to the absolute difference between each frame's offset value and the average offset value mBias, so that when the viewer's motion state changes, the fitted motion-equation parameters change promptly with the most recently detected frame offsets.
5. The human eye position regression calculation method according to claim 1, wherein:
in step S4, the motion-equation parameters are fitted by a weighted least squares method, and the actual human eye position after the delay is regressed according to the measured delay from camera exposure to the system;

the motion equation is a quadratic equation in one variable; given the limited number of cached frames, the quadratic describes the viewer's motion state when fitting within the cache; the weighted least squares objective is:

$$\min_{a_0, a_1, a_2} \sum_{i=1}^{n} \omega_i \left( x_i - \left( a_0 + a_1 t_i + a_2 t_i^2 \right) \right)^2$$

wherein $\omega_i$ is the weight parameter and $a_0$, $a_1$, $a_2$ are the motion-equation parameters to be fitted.
6. The human eye position regression calculation method according to claim 1, wherein:
in step S5, the regressed human eye position needs to be interpolated and converted into the grating parameters of the display, so that the left and right images projected through the grating land at the corresponding spatial positions; only with the human eye position regressed past the delay can the optimal viewing effect be achieved.
CN202011642099.7A 2020-12-31 2020-12-31 Human eye position regression calculation method Active CN112686927B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011642099.7A CN112686927B (en) 2020-12-31 2020-12-31 Human eye position regression calculation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011642099.7A CN112686927B (en) 2020-12-31 2020-12-31 Human eye position regression calculation method

Publications (2)

Publication Number Publication Date
CN112686927A CN112686927A (en) 2021-04-20
CN112686927B (en) 2023-05-12

Family

ID=75456771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011642099.7A Active CN112686927B (en) 2020-12-31 2020-12-31 Human eye position regression calculation method

Country Status (1)

Country Link
CN (1) CN112686927B (en)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI504933B (en) * 2012-06-13 2015-10-21 Innolux Corp 2d/3d switchable display device and method for manufacturing the same
US10048749B2 (en) * 2015-01-09 2018-08-14 Microsoft Technology Licensing, Llc Gaze detection offset for gaze tracking models

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103810472A (en) * 2013-11-29 2014-05-21 南京大学 Method for pupil position filtering based on movement correlation
CN104318264A (en) * 2014-10-14 2015-01-28 武汉科技大学 Facial feature point tracking method based on human eye preferential fitting
CN106055104A (en) * 2015-03-26 2016-10-26 霍尼韦尔国际公司 Methods and apparatus for providing snapshot truthing system for tracker
CN106897662A (en) * 2017-01-06 2017-06-27 北京交通大学 The localization method of the face key feature points based on multi-task learning
CN107563346A (en) * 2017-09-20 2018-01-09 南京栎树交通互联科技有限公司 One kind realizes that driver fatigue sentences method for distinguishing based on eye image processing
CN109842793A (en) * 2017-09-22 2019-06-04 深圳超多维科技有限公司 A kind of naked eye 3D display method, apparatus and terminal
CN109104603A (en) * 2018-09-25 2018-12-28 上海玮舟微电子科技有限公司 A kind of viewpoint compensation method, apparatus, electronic equipment and storage medium
CN109658395A (en) * 2018-12-06 2019-04-19 代黎明 Optic disk method for tracing and system and eyeground acquisition device
CN109922343A (en) * 2019-04-12 2019-06-21 杭州电子科技大学上虞科学与工程研究院有限公司 A method of conspicuousness, which is extracted, from a small amount of user's eye movement data carries out video compress
CN111160292A (en) * 2019-12-31 2020-05-15 上海易维视科技有限公司 Human eye detection method
CN111281403A (en) * 2020-03-09 2020-06-16 西安交通大学 Fine-grained human body fatigue detection method and device based on embedded equipment

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"3D Gaze Estimation from 2D Pupil Positions on Monocular Head-Mounted Eye Trackers";Mohsen Mansouryar等;《ETRA2016》;20161231;第197-200页 *
"Hybrid Eye-Tracking on a Smartphone with CNN Feature Extraction and an Infrared 3D Model";Braiden Brousseau等;《sensors》;20200119;第20卷(第543期);第1-21页 *
"人眼跟踪技术研究及其在裸人眼跟踪技术研究及其在裸人眼跟踪技术研究及其在裸人眼跟踪技术研究及其在裸人眼跟踪技术研究及其在裸眼3D下的应用";陈国龙;《中国优秀硕士学位论文全文数据库 信息科技辑》;20160315(第03期);第1-5章 *
"狭缝光栅式裸眼3D显示沙盘的研究和制备";刘俊;《中国优秀硕士学位论文全文数据库 信息科技辑》;20180915(第09期);第2-5章 *

Also Published As

Publication number Publication date
CN112686927A (en) 2021-04-20

Similar Documents

Publication Publication Date Title
US20230336873A1 (en) Video Stabilization
KR101518531B1 (en) System and method for measuring potential eyestrain of stereoscopic motion pictures
CN109191506B (en) Depth map processing method, system and computer readable storage medium
JP2020520603A5 (en)
EP2544445A1 (en) Image processing device, image processing method, image processing program and storage medium
CN102959586A (en) Motion estimation device, depth estimation device, and motion estimation method
JP2012528519A (en) Multi-projector system and method
US20140132784A1 (en) Estimation of Picture Motion Blurriness
WO2020042581A1 (en) Focusing method and device for image acquisition apparatus
JP6393254B2 (en) Method and apparatus for correcting distortion error due to adjustment effect in stereoscopic display
CN109959919A (en) Automobile and monocular cam distance measuring method, device
CN112686927B (en) Human eye position regression calculation method
Hashimoto et al. Radiometric compensation for non-rigid surfaces by continuously estimating inter-pixel correspondence
CN110691228A (en) Three-dimensional transformation-based depth image noise marking method and device and storage medium
CN114543797A (en) Pose prediction method and apparatus, device, and medium
WO2007129444A1 (en) Image distortion correction method, distortion correction program, and optical device
JP5620516B2 (en) Stabilization method and computer system
JP2020107216A (en) Information processor, control method thereof, and program
JP7449715B2 (en) Framing area learning device, framing area estimating device, and programs thereof
JP2004012626A (en) Stereoscopic video display device and stereoscopic video display method
JP2014052426A (en) Imaging device, control method for imaging device, and program
JP6352182B2 (en) How to correct stereo film frame zoom settings and / or vertical offset
CN202841286U (en) Portable phone device provided with camera
CN113099106B (en) Video processing method, device, equipment and storage medium
WO2023189068A1 (en) Information processing device, information processing method, and information processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant