CN111568367A - Method for identifying and quantifying eye jump invasion - Google Patents

Method for identifying and quantifying eye jump invasion

Info

Publication number
CN111568367A
Authority
CN
China
Prior art keywords
eye
jump
eye jump
identifying
data
Prior art date
Legal status
Granted
Application number
CN202010404817.0A
Other languages
Chinese (zh)
Other versions
CN111568367B (en)
Inventor
靳慧斌
楚明健
常银霞
刘海波
冯朝辉
Current Assignee
Civil Aviation University of China
Original Assignee
Civil Aviation University of China
Priority date
Filing date
Publication date
Application filed by Civil Aviation University of China filed Critical Civil Aviation University of China
Priority to CN202010404817.0A priority Critical patent/CN111568367B/en
Publication of CN111568367A publication Critical patent/CN111568367A/en
Application granted granted Critical
Publication of CN111568367B publication Critical patent/CN111568367B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
    • A61B3/112 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring diameter of pupils

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Human Computer Interaction (AREA)
  • Eye Examination Apparatus (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for identifying and quantifying eye jump intrusion, comprising the following steps: extracting original eye movement data; preprocessing the extracted data; detecting eye jump eye movement; identifying fixation eye movement; identifying eye jump intrusion based on the fixation baseline; cleaning the data; and quantifying the eye jump intrusion. Built on current eye movement research techniques, the method identifies and quantifies eye jump intrusion stably, accurately and effectively.

Description

Method for identifying and quantifying eye jump invasion
Technical Field
The invention relates to the technical field of eye movement, and in particular to a method for identifying and quantifying eye jump intrusion (known in the literature as saccadic intrusion).
Background
In recent years, with the development of artificial intelligence, eye movement technology has become a popular and cutting-edge research area. The eye exhibits two distinct movements: fixation and eye jump (saccade). Eye jump intrusion is a particular horizontal eye jump deviation that occurs only during fixation. Research shows that the eye jump intrusion index is related to operator workload: when an operator's workload increases, the eye jump intrusion index increases.
Current research on eye movement techniques focuses mainly on identifying fixation and eye jump behavior, with relatively little work on recognizing other eye movement behaviors (e.g., blinking). Although existing eye movement behavior recognition algorithms can identify fixations and eye jumps to a certain degree, they lack accuracy, and few can quantify eye jump behavior.
Disclosure of Invention
The invention aims to provide a method for identifying and quantifying eye jump intrusion, which solves the problems in the prior art and can accurately quantify eye jump behaviors.
In order to achieve the purpose, the invention provides the following scheme: the invention provides a method for identifying and quantifying eye jump intrusion, which comprises the following steps:
S1, extracting original eye movement data;
S2, preprocessing the data extracted in step S1;
S3, performing eye jump eye movement detection;
S4, identifying fixation eye movement;
S5, identifying eye jump intrusion based on the fixation baseline;
S6, cleaning the data obtained in step S5;
S7, quantifying the eye jump intrusion.
Preferably, the original eye movement data in step S1 are obtained by a head-mounted eye tracker and saved as a TXT format file.
Preferably, the data preprocessing in step S2 is to replace data points with confidence lower than 0.8, data points not on the screen, and missing data points.
Preferably, the EK algorithm is used for the eye jump eye movement detection in step S3.
Preferably, the eye jump eye movements in step S3 include conventional eye jumps and eye-jump-type eye movement offsets.
Preferably, in step S4, the conventional eye jumps are obtained by eliminating the eye-jump-type eye movement offsets, and the complement of the conventional eye jumps is the fixation eye movement.
Preferably, in step S5, two moving median windows are used to identify eye jump type eye movement deviation during each segment of fixation eye movement, i.e. to obtain eye jump intrusion.
Preferably, the data cleaning in step S6 removes: conventional eye jump residue, unreasonable fixation processes, and invalid data.
Preferably, in step S7, the data obtained in step S6 are classified into eye jump intrusion values and smaller eye movement deviation values; the average of each class is calculated, giving the average eye jump intrusion value and the average smaller eye movement deviation value, which completes the eye jump intrusion quantification.
The invention discloses the following technical effects:
Compared with the prior art, the method is reliable, stable and accurate: it identifies eye jump intrusions stably and accurately and further quantifies them, facilitating research on their effects.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without inventive exercise.
FIG. 1 is a flow chart of a method of the present invention;
fig. 2 is a simplified schematic diagram of an eye tracker spatial coordinate system and a standardized spatial coordinate system.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Referring to fig. 1, the present invention provides a method for identifying and quantifying eye jump intrusions, comprising the steps of:
First, the original eye movement data of the subject are acquired with a head-mounted eye tracker. Referring to fig. 2: because a head-mounted eye tracker is used in this embodiment, the origin of the eye tracker coordinate system is fixed relative to the head, with the subject's right-hand direction as the positive X-axis, upward as the positive Y-axis, and forward as the positive Z-axis. The subject faces the center of the screen, with the normal of the face perpendicular to the screen. The visual axis is the straight line connecting the position viewed on the screen with the center of the pupil; point F is the intersection of the visual axis and the screen, i.e., the point on the screen the subject is observing at that moment. The standardized spatial coordinate system has the same origin and the same positive axis directions as the eye tracker spatial coordinate system.
In the original data sample, the three columns gaze_normal0_x, gaze_normal0_y and gaze_normal0_z are the X-axis, Y-axis and Z-axis coordinates of the visual axis in the standardized spatial coordinate system; each of the three coordinate axes ranges over [-1, 1]. diameter_3d is the pupil diameter detected by the eye tracker; when the detected pupil area is incomplete, the system automatically fits it to a circle and gives a diameter together with a confidence value, and the data are considered reliable when the confidence is greater than or equal to 0.8. x_scaled and y_scaled are the screen coordinates, in pixels, of the position the eye observes at the sampling instant; the computer used in this embodiment has a resolution of 1366x768, so x_scaled lies in [0, 1366] and y_scaled in [0, 768]. on_srf is a sequence of logical values indicating whether the eye is looking at the screen area at the sampling instant: 1 means the subject is looking at some position on the screen, and 0 means the subject is looking at a position outside the screen. The original data are finally saved as a ".txt" file. The types of eye movement data are shown in Table 1:
TABLE 1

Field            Meaning                                                      Range
gaze_normal0_x   X coordinate of the visual axis (standardized coordinates)   [-1, 1]
gaze_normal0_y   Y coordinate of the visual axis (standardized coordinates)   [-1, 1]
gaze_normal0_z   Z coordinate of the visual axis (standardized coordinates)   [-1, 1]
diameter_3d      pupil diameter detected by the eye tracker                   positive
confidence       reliability of the sample (reliable when >= 0.8)             [0, 1]
x_scaled         horizontal screen coordinate of the observed position (px)   [0, 1366]
y_scaled         vertical screen coordinate of the observed position (px)     [0, 768]
on_srf           whether the gaze is on the screen area                       0 or 1
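As a concrete illustration of this layout, the following minimal Python sketch loads such a recording with pandas; the tab-separated format and the exact column order are assumptions, and the column names follow the field descriptions above:

    import pandas as pd

    # Column names follow the field descriptions above; the column order
    # and the tab-separated layout are assumptions for illustration.
    COLUMNS = ["gaze_normal0_x", "gaze_normal0_y", "gaze_normal0_z",
               "diameter_3d", "confidence", "x_scaled", "y_scaled", "on_srf"]

    def load_eye_data(path: str) -> pd.DataFrame:
        """Read an original eye movement recording saved as a TXT file."""
        return pd.read_csv(path, sep="\t", names=COLUMNS, header=0)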
Second, the acquired original eye movement data are preprocessed: points with confidence below 0.8, points not on the screen, and missing data points are replaced, each with the median of the ten data points preceding it.
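A minimal sketch of this preprocessing rule, assuming the DataFrame layout of the loading example above (the column names are illustrative):

    import numpy as np
    import pandas as pd

    def preprocess(df: pd.DataFrame) -> pd.DataFrame:
        """Replace low-confidence, off-screen and missing samples with the
        median of the ten preceding data points."""
        df = df.copy()
        bad = (df["confidence"] < 0.8) | (df["on_srf"] == 0)
        for col in ("x_scaled", "y_scaled"):
            vals = df[col].to_numpy(dtype=float)
            vals[bad.to_numpy()] = np.nan        # mark points to replace
            for i in np.flatnonzero(np.isnan(vals)):
                history = vals[max(0, i - 10):i]
                history = history[~np.isnan(history)]
                if history.size:                 # leave the gap if no history
                    vals[i] = np.median(history)
            df[col] = vals
        return df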
Third, eye jump eye movement detection is performed using the EK (Engbert-Kliegl) algorithm, which is essentially a velocity-threshold algorithm. The velocity vₙ of each data point n is obtained from equation (1):

vₙ = (θₙ₊₂ + θₙ₊₁ − θₙ₋₁ − θₙ₋₂) / (6Δt)    (1)

where θₙ₊₂ is the horizontal angle of the second data point after data point n; θₙ₊₁ is the horizontal angle of the first data point after data point n; θₙ₋₁ is the horizontal angle of the first data point before data point n; θₙ₋₂ is the horizontal angle of the second data point before data point n; and Δt is the sampling time interval.

The velocity thresholds for eye jump eye movement detection in two-dimensional velocity space, the horizontal velocity threshold σx and the vertical velocity threshold σy, are obtained from equations (2) and (3) respectively:

σx = √(⟨vx²⟩ − ⟨vx⟩²)    (2)

σy = √(⟨vy²⟩ − ⟨vy⟩²)    (3)

where the symbol ⟨·⟩ denotes the median.

The detection thresholds ηx and ηy are multiples of these median-based estimates:

ηx = λσx

ηy = λσy

where λ is a parameter that must be set appropriately in the eye jump eye movement detection method.

Finally, a data point k is detected as eye jump eye movement when it satisfies equation (4):

(vx,k/ηx)² + (vy,k/ηy)² > 1    (4)

where k denotes the k-th data point.
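For clarity, a compact NumPy sketch of equations (1) to (4); the function name, the default λ and the handling of the first and last two samples are assumptions, since the text only states that λ must be set appropriately:

    import numpy as np

    def ek_saccade_mask(x_deg, y_deg, dt, lam=6.0):
        """Velocity-threshold eye jump detection in the style of
        equations (1)-(4); lam is the multiplier lambda."""
        x = np.asarray(x_deg, dtype=float)
        y = np.asarray(y_deg, dtype=float)
        v = np.full((x.size, 2), np.nan)
        # Equation (1): smoothed velocity from two samples before and after n.
        v[2:-2, 0] = (x[4:] + x[3:-1] - x[1:-3] - x[:-4]) / (6.0 * dt)
        v[2:-2, 1] = (y[4:] + y[3:-1] - y[1:-3] - y[:-4]) / (6.0 * dt)
        # Equations (2) and (3): median-based velocity dispersion estimates.
        med = np.nanmedian(v, axis=0)
        sigma = np.sqrt(np.nanmedian(v ** 2, axis=0) - med ** 2)
        eta = lam * sigma                        # detection thresholds
        # Equation (4): samples outside the ellipse are eye jump eye movement.
        with np.errstate(invalid="ignore"):
            crit = (v[:, 0] / eta[0]) ** 2 + (v[:, 1] / eta[1]) ** 2
        return crit > 1.0

At a 200 Hz sampling rate Δt would be 0.005 s; values of λ between 5 and 6 are common in the velocity-threshold literature, but the appropriate choice depends on the data.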
This step identifies the eye jump eye movements and calculates the eye jump peak velocity and the maximum eye jump amplitude for each. The peak velocity increases as the maximum amplitude increases; plotting maximum amplitude on the horizontal axis against peak velocity on the vertical axis shows a linear trend, and this main-sequence relation between the two reflects the accuracy of the algorithm.
Fourth, the eye jump eye movements, which include conventional eye jumps and eye-jump-type eye movement offsets, are identified and separated. Eliminating the eye-jump-type eye movement offsets yields the conventional eye jumps, and the complement of the conventional eye jumps comprises all the fixation eye movements.
Fifth, eye jump intrusion is identified based on the fixation baseline, i.e., eye-jump-type eye movement deviation data are acquired. Two moving median windows are used to identify the eye-jump-type eye movement deviation within each segment of fixation eye movement; this deviation is the eye jump intrusion. The two windows are a large and a small moving median window. The large window, 2000 ms long, takes the median of the data from 1000 ms before and 1000 ms after the current data point; the window is not contiguous, however: the central 500 ms of data do not participate in computing the fixation baseline. The large window determines the basic horizontal fixation position, i.e., the fixation baseline, which represents the trend of stable gaze. The small moving median window, 50 ms long, is used to detect the horizontal eye position deviation in real time.
An important application of the eye movement deviation quantification algorithm is to quantify eye-jump-type eye movement deviation relative to the fixation baseline position. The algorithm does not require the exact location of the visual target in the physical world, because it automatically detects the fixation process and estimates the fixation position, from which the eye-jump-type eye movement deviation is calculated. In this embodiment, the purpose of using two moving median windows of different lengths is to highlight the difference between the basic fixation position represented by the large window and the eye movement offset position represented by the small window.
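A sketch of the two-window scheme under the timing given above (2000 ms outer window with its central 500 ms excluded, 50 ms inner window); the sampling-rate argument and the edge handling are assumptions, and a production version would replace the inner loop with a running median:

    import numpy as np

    def baseline_and_offset(x_deg, fs):
        """Return the fixation baseline (large window) and the eye movement
        offset (small window minus baseline). fs is the sampling rate in Hz."""
        x = np.asarray(x_deg, dtype=float)
        half_big = int(1.000 * fs)    # 1000 ms before and after
        half_gap = int(0.250 * fs)    # central 500 ms excluded
        half_small = int(0.025 * fs)  # 50 ms window
        baseline = np.full(x.size, np.nan)
        current = np.full(x.size, np.nan)
        for i in range(x.size):
            before = x[max(0, i - half_big):max(0, i - half_gap)]
            after = x[i + half_gap + 1:i + half_big + 1]
            ring = np.concatenate([before, after])
            if ring.size:                        # fixation baseline
                baseline[i] = np.median(ring)
            small = x[max(0, i - half_small):i + half_small + 1]
            current[i] = np.median(small)        # momentary eye position
        return baseline, current - baseline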
Sixth, the acquired eye-jump-type eye movement deviation data are cleaned. Three kinds of data are removed: conventional eye jump residue, unreasonable fixation processes, and invalid data, so that the final quantified data contain only the eye movement deviations of the fixation process.
Offset values corresponding to conventional eye jump residue: for each detected conventional eye jump, a start time and an end time are marked, but the actual start of the eye jump may fall outside the marked period. This is caused by the fixed sampling interval of the eye tracker and cannot be corrected exactly. The offset values of this residual data are large, yet they are not caused by fixation eye movement deviation, so the residue must be cleaned: for each detected conventional eye jump, the offset values of the eye jump process itself and of the data point immediately before and immediately after it are set to 0.
Offset values corresponding to eye-jump-type deviations in unreasonable fixation processes: a fixation eye movement process lasting less than 1000 ms is called unreasonable, because the large moving median window cannot obtain enough data from it. The offset values corresponding to such fixation processes must therefore be cleared, which is done by setting them to 0.
Invalid data: the missing data that were replaced. Their offset values are produced by the replacement process rather than by fixation eye movement deviation and must therefore be cleared; the offset values corresponding to invalid data are set to 0.
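The three cleaning rules can be sketched as follows; the boolean masks and the (start, end) fixation representation are assumed interfaces:

    import numpy as np

    def clean_offsets(offset, saccade_mask, invalid_mask, fixations, fs):
        """Zero the offset values the cleaning step discards."""
        off = np.asarray(offset, dtype=float).copy()
        sac = np.asarray(saccade_mask, dtype=bool)
        # 1) Conventional eye jump residue: the eye jump itself plus one
        #    data point before and one after it.
        widened = sac.copy()
        widened[1:] |= sac[:-1]
        widened[:-1] |= sac[1:]
        off[widened] = 0.0
        # 2) Unreasonable fixation processes: shorter than 1000 ms.
        for start, end in fixations:
            if (end - start) / fs < 1.0:
                off[start:end] = 0.0
        # 3) Invalid (replaced) samples.
        off[np.asarray(invalid_mask, dtype=bool)] = 0.0
        return off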
Seventh, the cleaned eye movement deviation data are classified according to the specified criterion into two kinds of eye movement deviation: deviations with an amplitude of 0.4 to 4.1 degrees are called eye jump intrusions, and deviations with an amplitude below 0.4 degrees are called smaller eye movement deviations, or micro eye movements. The average of each class is then calculated. The average eye jump intrusion value is the sum of the deviation values of the eye jump intrusion data points divided by the corresponding time, in deg/s. After processing the data of each experiment, the eye movement deviation quantification algorithm outputs two values: the average eye jump intrusion value and the average smaller eye movement deviation value, which completes the eye jump intrusion quantification.
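A sketch of this final step, assuming the cleaned per-sample offsets are in degrees and dt is the sampling interval in seconds:

    import numpy as np

    def quantify(offset_clean, dt, low=0.4, high=4.1):
        """Return (average eye jump intrusion, average smaller eye movement
        deviation), each the sum of deviation values over the corresponding
        time, in deg/s."""
        off = np.abs(np.asarray(offset_clean, dtype=float))
        intrusion = (off >= low) & (off <= high)
        smaller = (off > 0.0) & (off < low)
        def mean_rate(mask):
            duration = mask.sum() * dt           # time spent in this class
            return off[mask].sum() / duration if duration > 0 else 0.0
        return mean_rate(intrusion), mean_rate(smaller)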
The accuracy of the identification method can be assessed from the main-sequence relation between the identified eye jump peak velocity and the maximum amplitude. The invention adopts the EK algorithm: compared with a fixed velocity-threshold algorithm, the EK algorithm obtains its velocity threshold adaptively, with no need to set it manually, so it is more reliable; and because the threshold is selected adaptively, the method suits different subjects and different data sampling schemes, giving good stability.
In the description of the present invention, it is to be understood that the terms "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on those shown in the drawings, are merely for convenience of description of the present invention, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and thus, are not to be construed as limiting the present invention.
The above-described embodiments are merely illustrative of the preferred embodiments of the present invention, and do not limit the scope of the present invention, and various modifications and improvements of the technical solutions of the present invention can be made by those skilled in the art without departing from the spirit of the present invention, and the technical solutions of the present invention are within the scope of the present invention defined by the claims.

Claims (9)

1. A method of identifying and quantifying eye jump intrusions, characterized by comprising the following steps:
S1, extracting original eye movement data;
S2, preprocessing the data extracted in step S1;
S3, performing eye jump eye movement detection;
S4, identifying fixation eye movement;
S5, identifying eye jump intrusion based on the fixation baseline, i.e., acquiring eye-jump-type eye movement deviation data;
S6, cleaning the data obtained in step S5;
S7, quantifying the eye jump intrusion.
2. The method of identifying and quantifying eye jump intrusions of claim 1, wherein: the original eye movement data in step S1 are obtained by a head-mounted eye tracker and saved as a TXT format file.
3. The method of identifying and quantifying eye jump intrusions of claim 1, wherein: the data preprocessing in step S2 is to replace the data points with confidence lower than a predetermined threshold, the data points not on the screen, and the missing data points.
4. The method of identifying and quantifying eye jump intrusions of claim 1, wherein: the EK algorithm is used for the eye jump eye movement detection in step S3.
5. The method of identifying and quantifying eye jump intrusions of claim 1, wherein: the eye jump eye movements in step S3 include conventional eye jumps and eye-jump-type eye movement offsets.
6. The method of identifying and quantifying eye jump intrusions of claim 1, wherein: in step S4, a conventional eye jump is obtained by eliminating the eye jump type eye movement offset, and the complement of the conventional eye jump is the fixation eye movement.
7. The method of identifying and quantifying eye jump intrusions of claim 1, wherein: in step S5, two moving median windows are used to identify the eye-jump-type eye movement deviation in each fixation eye movement process, i.e., to obtain the eye jump intrusion.
8. The method of identifying and quantifying eye jump intrusions of claim 1, wherein: the data cleaning in step S6 removes: conventional eye jump residue, unreasonable fixation processes, and invalid data.
9. The method of identifying and quantifying eye jump intrusions of claim 1, wherein: in step S7, the data obtained in step S6 are classified into eye jump intrusion values and smaller eye movement deviation values; the average of each class is calculated, giving the average eye jump intrusion value and the average smaller eye movement deviation value, thereby completing the eye jump intrusion quantification.
CN202010404817.0A 2020-05-14 2020-05-14 Method for identifying and quantifying eye jump invasion Active CN111568367B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010404817.0A CN111568367B (en) 2020-05-14 2020-05-14 Method for identifying and quantifying eye jump invasion

Publications (2)

Publication Number Publication Date
CN111568367A 2020-08-25
CN111568367B 2023-07-21

Family

ID=72121028

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010404817.0A Active CN111568367B (en) 2020-05-14 2020-05-14 Method for identifying and quantifying eye jump invasion

Country Status (1)

Country Link
CN (1) CN111568367B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08279247A (en) * 1995-04-05 1996-10-22 Fujitsu Ltd Method and device for processing reproduced signal as well as disk device
CN103713728A (en) * 2014-01-14 2014-04-09 东南大学 Method for detecting usability of complex system human-machine interface
CN104898823A * 2014-03-04 2015-09-09 中国电信股份有限公司 Method and device for controlling visual target movement
WO2016112690A1 * 2015-01-14 2016-07-21 北京工业大学 Eye movement data based online user state recognition method and device
CN109124657A * 2013-03-11 2019-01-04 亚特兰大儿童医疗保健公司 System and method for recognizing and detecting developmental conditions
CN109199412A * 2018-09-28 2019-01-15 南京工程学院 Abnormal emotion recognition method based on eye movement data analysis
CN109199411A * 2018-09-28 2019-01-15 南京工程学院 Method for identifying case insiders based on model fusion
CN109925678A * 2019-03-01 2019-06-25 北京七鑫易维信息技术有限公司 Training method, training device and equipment based on eye tracking technology
CN109933193A * 2019-03-01 2019-06-25 北京体育大学 Intelligent auxiliary training system based on real-time capture of athletes' eye movement information

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
郑秀娟: "Research progress on eye movement studies of driving safety based on visual characteristics" *
靳慧斌: "Visual search characteristics in security inspection based on eye movement data analysis" *

Also Published As

Publication number Publication date
CN111568367B (en) 2023-07-21

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant