CN108932473A - Eye movement feature extracting method, device and storage medium - Google Patents
- Publication number
- CN108932473A (application CN201810522110.2A)
- Authority
- CN
- China
- Prior art keywords
- angle
- fixation point
- straight line
- gaze region
- eye movement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention provides an eye-movement feature extraction method, device, and storage medium. The method includes: obtaining Q fixation-point data for an interface under test, where Q is a positive integer greater than 2 and the fixation-point data are acquired according to the dwell time of the eye at each observed position; generating, from the Q fixation-point data, directed lines, each formed by two consecutive fixation points; and obtaining the included angle formed by two consecutive directed lines, comparing the obtained angle with a preset angle, and, if the angle is not within the range of the preset angle, extracting and marking the two lines that form the angle as a first extracted track feature. By acquiring eye-movement gaze data, analyzing data that violate general browsing habits, and marking the fixation points or gaze regions suspected of violating those habits, the method generates images that can be used to locate interface-design problems, facilitating data analysis of display interfaces and reducing manual analysis workload.
Description
Technical field
The present invention relates to the field of data processing, and in particular to an eye-movement feature extraction method, device, and storage medium.
Background art
Eye-tracking technology is commonly used to obtain a user's gaze track, and it has been widely applied in scenarios such as browsing and operating specific interfaces. By analyzing where and for how long a user's gaze rests within specific regions, eye tracking can reveal the user's browsing patterns, such as the user's focus on a specific interface, the order in which different interface elements are browsed, and the duration of attention to each; the effectiveness of the interface can then be analyzed and the interface adjusted accordingly.
At present, eye-tracking technology mainly records the dwell time and coordinates of the user's gaze on an interface, derives the dwell duration of each gaze focus from the differences between successive timestamps, generates a heat-map distribution and a gaze-track map of the interface from the coordinates of the gaze foci, and performs statistics and analysis of gaze patterns for specific regions, specific types, and specific times to obtain eye-movement features. In such methods, eye-movement features are analyzed mainly through the relationships between fixation points and regions or times; the relationships among the fixation points themselves are not fully analyzed. Moreover, such methods can describe only the basic characteristics of the interface under test, such as the overall heat-map distribution; eye-movement feature information that does not conform to the user's browsing habits is neither identified nor processed, so these methods contribute little to judging the interface and the user experience, and the cost of subsequent manual judgment is high.
There are also existing techniques that analyze eye-movement data, but they merely cluster the eye-movement data produced as users browse an interface naturally, generate at least one eye-movement rule, and then analyze, at a global level, individual problems that may exist in the page design. This simple approach discards the true detail data; because it relies on aggregated data, the resulting rules are only weakly directive and make it hard to locate where problems in the interface may lie.
Summary of the invention
To solve the above technical problems, the present invention provides an eye-movement feature extraction method, device, and storage medium that address the incompleteness of current display-interface data analysis. By acquiring eye-movement gaze data, analyzing data that violate general browsing habits, and marking the fixation points or gaze regions suspected of violating those habits, the invention generates images that can be used to locate interface-design problems, facilitating data analysis of display interfaces and reducing manual analysis workload.
According to a first aspect of the embodiments of the present invention, an eye-movement feature extraction method is provided. The method includes:
obtaining Q fixation-point data for an interface under test, where Q is a positive integer greater than 2 and the fixation-point acquisition parameter is based on the minimum dwell time of the eye at an observed position;
generating, from the Q fixation-point data, directed lines, each formed by two consecutive fixation points;
obtaining the included angle formed by two consecutive directed lines, comparing the obtained angle with a preset angle, and, if the angle is not within the range of the preset angle, extracting and marking the two lines that form the angle as a first extracted track feature.
According to a second aspect of the embodiments of the present invention, an eye-movement feature extraction device is provided. The device includes:
an obtaining module for obtaining Q fixation-point data for an interface under test, where Q is a positive integer greater than 2;
a generation module for generating, from the Q fixation-point data, directed lines, each formed by two consecutive fixation points;
a first extraction module for obtaining the included angle formed by two consecutive directed lines, comparing the obtained angle with a preset angle, and, if the angle is not within the range of the preset angle, extracting and marking the two lines that form the angle as a first extracted track feature.
According to a third aspect of the embodiments of the present invention, a computer-readable storage medium is provided. The storage medium contains a computer program which, when executed by one or more computers, causes the one or more computers to perform operations comprising the steps included in any of the eye-movement feature extraction methods described above.
Implementing the eye-movement feature extraction method, device, and storage medium provided by the embodiments of the present invention offers the following advantages: (1) the scope of eye-movement data analysis is extended from the times and positions of fixation points to the relationships among fixation points, multiplying the yield of data mining; (2) problems that may exist in the interface can be revealed, expanding eye-movement data analysis from descriptive to analytical; (3) eye-movement data features can be extracted rapidly, locating the information points in the interface where the user's browsing behavior appears abnormal, which reduces manual analysis workload and improves data-analysis efficiency.
Brief description of the drawings
Fig. 1 is a flowchart of an eye-movement feature extraction method according to an embodiment of the present invention;
Fig. 2 is a flowchart of another eye-movement feature extraction method according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a first embodiment of the method of the present invention;
Fig. 4 is a schematic diagram of a second embodiment of the method of the present invention;
Fig. 5 is a structural schematic diagram of an eye-movement feature extraction device 1 according to an embodiment of the present invention.
Specific embodiment
To make the purposes, technical solutions, and advantages of the embodiments of the present invention clearer, the present invention is described in further detail below in conjunction with the accompanying drawings.
Fig. 1 is a flowchart of an eye-movement feature extraction method according to an embodiment of the present invention. Referring to Fig. 1, the method includes:
Step S1: obtain Q fixation-point data for an interface under test, where Q is a positive integer greater than 2 and the fixation-point acquisition parameter is based on the minimum dwell time of the eye at an observed position;
Step S2: from the Q fixation-point data, generate directed lines, each formed by two consecutive fixation points;
Step S3: obtain the included angle formed by two consecutive directed lines, compare the obtained angle with a preset angle, and, if the angle is not within the range of the preset angle, extract and mark the two lines that form the angle as a first extracted track feature.
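Steps S1-S3 can be sketched as follows. This is a minimal interpretation, not the patented implementation itself: fixation points are assumed to be (x, y) tuples already screened by dwell time, segment headings are taken via atan2, and the turn between consecutive segments is folded into the 0-360 degree range described later in the embodiments and tested against a preset 0-180 degree window (in screen coordinates, where y grows downward, this folded difference reads as a clockwise turn).

```python
import math

def extract_first_track_features(points, preset=(0.0, 180.0)):
    """Sketch of steps S1-S3: from consecutive fixation points build
    directed segments, measure the turn angle between each pair of
    consecutive segments, and flag pairs whose angle falls outside
    the preset range."""
    # Step S2: directed segments between consecutive fixation points.
    segments = list(zip(points, points[1:]))
    flagged = []
    for (a, b), (_, c) in zip(segments, segments[1:]):
        # Headings of the two segments, in degrees.
        h1 = math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))
        h2 = math.degrees(math.atan2(c[1] - b[1], c[0] - b[0]))
        # Step S3: fold the turn between the segments into 0-360 degrees.
        angle = (h2 - h1) % 360.0
        if not (preset[0] <= angle <= preset[1]):
            # Flag the two lines that form the out-of-range angle.
            flagged.append(((a, b), (b, c), angle))
    return flagged
```

With fixation points (0,0), (1,0), (2,1), (3,0), the first turn folds to 45 degrees and passes, while the second folds to 270 degrees and is flagged, mirroring the angle-2 example in the detailed description.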
In embodiments of the present invention, the method further includes:
according to a custom region-covering rule, generating Q gaze regions from the Q fixation points of the interface under test, where Q is a positive integer greater than 2; generating gaze-region sets according to how closely the gaze regions are packed; comparing the position of each directed line formed by two consecutive fixation points against the positions of the gaze-region sets; and extracting and marking any directed line that passes through an area not covered by the gaze-region sets as a second extracted track feature.
In embodiments of the present invention, a directed line is the line connecting the earlier-generated fixation point, as its starting point, to the later-generated fixation point, as its end point, with the direction of the line given by the direction pointed to by the line's extension.
In embodiments of the present invention, the preset angle is the range of included-angle values of a custom normal browsing path.
In embodiments of the present invention, the custom region-covering rule specifies that a gaze region comprises the physical region containing the fixation point together with its associated region.
In embodiments of the present invention, a gaze region is a region centered on the fixation point and diffused outward by 20-50 pixels.
In embodiments of the present invention, the minimum dwell time of the eye at an observed position, used as the fixation-point acquisition parameter, is 40-200 milliseconds.
The eye-movement feature extraction method and device of the present invention acquire eye-movement gaze data, analyze the relationships among fixation points, mark the fixation points or gaze regions suspected of violating general browsing habits, and generate images that can be used to locate interface-design problems, showing more clearly the problems the interface may have. This provides more useful data support for internet product design, advertisement design, and optimization, greatly facilitates usability work in user experience, and improves the user experience of the product.
Fig. 2 is a flowchart of another eye-movement feature extraction method according to an embodiment of the present invention; Fig. 3 is a schematic diagram of a first embodiment of the method of the present invention; Fig. 4 is a schematic diagram of a second embodiment of the method of the present invention. Referring to Fig. 2 in conjunction with Fig. 3 and Fig. 4, the method includes the following steps:
Step S101: obtain Q fixation-point data for the interface under test, where Q is a positive integer greater than 2. The fixation-point acquisition parameter can be based on the minimum dwell time for which the human eye realizes an observation at a given position, with a reference range of 40-200 milliseconds. Fixation points can be obtained by reading data such as the position, gaze start time, and gaze duration of each eye-movement data point recorded by, for example, Tobii Studio (eye-tracking software), and identifying the eye-movement data points that contain gaze behavior according to preset parameters and criteria. Fixation-point data refer to the data generated by the same subject while browsing or operating the page under test. The interface under test can be a web page or any other interface intended for reading, such as a mobile phone APP (application software), a tablet computer, or an e-book.
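The dwell-time screening in step S101 can be illustrated with a small helper. The record layout (x, y, start_ms, duration_ms) is an assumption made for this example, not the Tobii Studio export format, and the 100 ms default is just one value inside the 40-200 ms reference range:

```python
def filter_fixation_points(gaze_events, min_dwell_ms=100):
    """Keep only the coordinates of events whose gaze duration reaches
    the minimum dwell-time threshold; shorter samples are treated as
    not containing gaze behavior."""
    return [(x, y) for (x, y, _start_ms, duration_ms) in gaze_events
            if duration_ms >= min_dwell_ms]
```

For example, of three events with durations 250 ms, 30 ms, and 120 ms, only the first and third survive the default threshold.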
Step S102: from the Q acquired fixation-point data, generate Q-1 directed lines, each formed by two consecutive fixation points, where each line runs from the earlier-generated fixation point as its starting point to the later-generated fixation point as its end point, and the direction of the line is the direction pointed to by the line's extension. In an embodiment of the present invention, M, N, O, and P are four fixation points ordered from earliest to latest generation time; three line segments MN, NO, and OP can be generated and extended in the directions of points N, O, and P respectively to form directed lines.
Step S103: obtain the angle formed by two consecutive directed lines, where the angle is measured from 0 degrees at horizontal right, clockwise, over a 360-degree range. In an embodiment of the present invention, line MN and line NO form angle 1, and line NO and line OP form angle 2.
Step S104: compare the angle formed by any two consecutive directed lines with the preset angle, where the preset angle range refers to the angles formed by a typical normal human browsing path, with a reference parameter range of 0-180 degrees. In an embodiment of the present invention, angle 1 is less than 180 degrees and therefore within the reference parameter range, while angle 2 is greater than 180 degrees and therefore outside it.
Step S105: extract the two lines involved in any angle outside the preset range and mark them on the display material as the first kind of extracted eye-movement feature (a gaze region of suspected abnormal browsing order). In an embodiment of the present invention, the lines NO and OP forming angle 2 are marked in the plane ABCD.
Step S203: generate Q gaze regions from the Q acquired fixation points, where Q is a positive integer greater than 2 and a gaze region is the region covered by the element the subject gazes at in the display material. The region covered by an element includes the physical region the element occupies and its associated region. For example, the gaze region of a text element can be the region occupied by the word corresponding to the fixation point together with the words coherent with it, and the gaze region of a picture element can be the region occupied by the picture element corresponding to the fixation point together with other elements that form a cognitive combination with it. In an embodiment of the present invention, M, N, O, and P each form a covered region centered on the element at the fixation point.
Step S204: generate gaze-region sets according to how closely the gaze regions are packed. Each gaze region is based on the real area covered by its element and may diffuse outward from the element as the center, with a reference parameter range of 20-50 pixels. The diffusion range of the gaze regions should preferably yield 3-5 gaze-region sets within the tested region. In an embodiment of the present invention, the gaze regions formed by the three fixation points M, N, and O lie close together and form region set 1, while the region formed by fixation point P and its diffusion range lies farther from the other fixation points and forms region set 2.
Step S205: compare the position of each directed line formed by two consecutive fixation points with the positions of the gaze-region sets, where the lines and region sets are displayed in a plane with the measured material as background. In an embodiment of the present invention, lines MN, NO, and OP are compared with region set 1 and region set 2 in the plane of the measured material ABCD.
Step S206: extract the lines that pass through areas not covered by the gaze-region sets and mark them on the display material as the second kind of extracted eye-movement feature (a gaze region of suspected abnormal reading order). In an embodiment of the present invention, lines MN and NO lie within region set 1, while line OP runs from region set 1 to region set 2 and crosses the uncovered area between them; line OP is therefore extracted and marked in the plane ABCD.
By analyzing and simplifying the eye-movement data, the eye-movement feature extraction device of the embodiments of the present invention detects information that does not conform to general eye-movement patterns more accurately, provides better data support for judging problems in user experience and interfaces, makes the work simpler and more convenient, and improves working efficiency and user experience.
Fig. 5 is a structural schematic diagram of an eye-movement feature extraction device 1 according to an embodiment of the present invention. Referring to Fig. 5, the device 1 includes:
an obtaining module 100 for obtaining Q fixation-point data for an interface under test, where Q is a positive integer greater than 2;
a generation module 200 for generating, from the Q fixation-point data, directed lines, each formed by two consecutive fixation points;
a first extraction module 300 for obtaining the included angle formed by two consecutive directed lines, comparing the obtained angle with a preset angle, and, if the angle is not within the range of the preset angle, extracting and marking the two lines that form the angle as a first extracted track feature.
In embodiments of the present invention, the device further includes:
a second extraction module for, according to a custom region-covering rule, generating Q gaze regions from the Q fixation points of the interface under test, where Q is a positive integer greater than 2; generating gaze-region sets according to how closely the gaze regions are packed; comparing the position of each directed line formed by two consecutive fixation points against the positions of the gaze-region sets; and extracting and marking any directed line that passes through an area not covered by the gaze-region sets as a second extracted track feature.
In embodiments of the present invention, a directed line is the line connecting the earlier-generated fixation point, as its starting point, to the later-generated fixation point, as its end point, with the direction of the line given by the direction pointed to by the line's extension.
In embodiments of the present invention, the preset angle is the range of included-angle values of a custom normal browsing path.
In embodiments of the present invention, the custom region-covering rule specifies that a gaze region comprises the physical region containing the fixation point together with its associated region.
In embodiments of the present invention, a gaze region is a region centered on the fixation point and diffused outward by 20-50 pixels.
In embodiments of the present invention, the minimum dwell time of the eye at an observed position, used as the fixation-point acquisition parameter, is 40-200 milliseconds.
In addition, the present invention also provides a computer-readable storage medium containing a computer program which, when executed by one or more computers, causes the one or more computers to perform operations comprising the steps included in the eye-movement feature extraction method described above; this is not repeated here.
Through the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be realized by software combined with a hardware platform. Based on this understanding, the part of the technical solution of the present invention that contributes beyond the background art can be embodied, in whole or in part, in the form of a software product. The software product can be stored in a storage medium such as ROM/RAM, a magnetic disk, or an optical disc, and includes instructions that cause a computer device (which can be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments of the present invention or in certain parts of the embodiments.
The above disclosure is only a preferred embodiment of the present invention and certainly cannot limit the scope of protection of the present invention; equivalent variations of the above embodiments made according to the teaching of the claims of the present invention therefore still fall within the scope covered by the claims.
Claims (15)
1. An eye-movement feature extraction method, characterized in that the method includes:
obtaining Q fixation-point data for an interface under test, where Q is a positive integer greater than 2 and the fixation-point data are acquired according to the dwell time of the eye at an observed position;
generating, from the Q fixation-point data, directed lines, each formed by two consecutive fixation points;
obtaining the included angle formed by two consecutive directed lines, comparing the obtained angle with a preset angle, and, if the angle is not within the range of the preset angle, extracting and marking the two lines that form the angle as a first extracted track feature.
2. The method of claim 1, characterized in that the method further includes:
according to a custom region-covering rule, generating Q gaze regions from the Q fixation points of the interface under test, where Q is a positive integer greater than 2; generating gaze-region sets according to how closely the gaze regions are packed; comparing the position of each directed line formed by two consecutive fixation points against the positions of the gaze-region sets; and extracting and marking any directed line that passes through an area not covered by the gaze-region sets as a second extracted track feature.
3. The method of claim 1, characterized in that a directed line is:
the line connecting the earlier-generated fixation point, as its starting point, to the later-generated fixation point, as its end point, with the direction of the line given by the direction pointed to by the line's extension.
4. The method of claim 1, characterized in that the preset angle is the range of included-angle values of a custom normal browsing path.
5. The method of claim 2, characterized in that the custom region-covering rule includes:
a gaze region comprises the physical region containing the fixation point and its associated region.
6. The method of claim 5, characterized in that the gaze region includes:
a region centered on the fixation point and diffused outward by 20-50 pixels.
7. The method of claim 1, characterized in that the minimum dwell time of the eye at an observed position, used as the fixation-point acquisition parameter, includes:
40-200 milliseconds.
8. An eye-movement feature extraction device, characterized in that the device includes:
an obtaining module for obtaining Q fixation-point data for an interface under test, where Q is a positive integer greater than 2;
a generation module for generating, from the Q fixation-point data, directed lines, each formed by two consecutive fixation points;
a first extraction module for obtaining the included angle formed by two consecutive directed lines, comparing the obtained angle with a preset angle, and, if the angle is not within the range of the preset angle, extracting and marking the two lines that form the angle as a first extracted track feature.
9. The device of claim 8, characterized in that the device further includes:
a second extraction module for, according to a custom region-covering rule, generating Q gaze regions from the Q fixation points of the interface under test, where Q is a positive integer greater than 2; generating gaze-region sets according to how closely the gaze regions are packed; comparing the position of each directed line formed by two consecutive fixation points against the positions of the gaze-region sets; and extracting and marking any directed line that passes through an area not covered by the gaze-region sets as a second extracted track feature.
10. The device of claim 8, characterized in that a directed line is:
the line connecting the earlier-generated fixation point, as its starting point, to the later-generated fixation point, as its end point, with the direction of the line given by the direction pointed to by the line's extension.
11. The device of claim 8, characterized in that the preset angle is the range of included-angle values of a custom normal browsing path.
12. The device of claim 9, characterized in that the custom region-covering rule includes: a gaze region comprises the physical region containing the fixation point and its associated region.
13. The device of claim 12, characterized in that the gaze region includes:
a region centered on the fixation point and diffused outward by 20-50 pixels.
14. The device of claim 8, characterized in that the minimum dwell time of the eye at an observed position, used as the fixation-point acquisition parameter, includes:
40-200 milliseconds.
15. A computer-readable storage medium, the computer storage medium containing a computer program, characterized in that the computer program, when executed by one or more computers, causes the one or more computers to perform operations comprising the steps included in the eye-movement feature extraction method of any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810522110.2A CN108932473A (en) | 2018-05-28 | 2018-05-28 | Eye movement feature extracting method, device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108932473A true CN108932473A (en) | 2018-12-04 |
Family
ID=64449421
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810522110.2A Pending CN108932473A (en) | 2018-05-28 | 2018-05-28 | Eye movement feature extracting method, device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108932473A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109271030A (en) * | 2018-09-25 | 2019-01-25 | 华南理工大学 | Multi-dimensional comparison method for fixation-point tracks in three-dimensional space |
CN109925678A (en) * | 2019-03-01 | 2019-06-25 | 北京七鑫易维信息技术有限公司 | Training method, training device and equipment based on eye-tracking technology |
CN112596602A (en) * | 2019-09-17 | 2021-04-02 | 奥迪股份公司 | Apparatus for adjusting display of information on display screen and corresponding method and medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090086165A1 (en) * | 2007-09-28 | 2009-04-02 | Beymer David James | System and method of detecting eye fixations using adaptive thresholds |
CN103500011A (en) * | 2013-10-08 | 2014-01-08 | 百度在线网络技术(北京)有限公司 | Eye movement track law analysis method and device |
- 2018-05-28: application CN201810522110.2A filed; published as CN108932473A (status: pending)
Non-Patent Citations (2)
Title |
---|
Santella, Anthony, and Doug DeCarlo: "Robust clustering of eye movement recordings for quantification of visual interest", Proceedings of the 2004 Symposium on Eye Tracking Research & Applications * |
Tobii AB: "Tobii Studio User's Manual", 31 January 2016 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Bylinskii et al. | Eye fixation metrics for large scale evaluation and comparison of information visualizations | |
Lang et al. | Depth matters: Influence of depth cues on visual saliency | |
Kishishita et al. | Analysing the effects of a wide field of view augmented reality display on search performance in divided attention tasks | |
US7760910B2 (en) | Evaluation of visual stimuli using existing viewing data | |
Naspetti et al. | Automatic analysis of eye-tracking data for augmented reality applications: A prospective outlook | |
Ramos Gameiro et al. | Exploration and exploitation in natural viewing behavior | |
CN108932473A (en) | Eye movement feature extracting method, device and storage medium | |
Blascheck et al. | Visual comparison of eye movement patterns | |
CN109376598A (en) | Facial expression image processing method, device, computer equipment and storage medium | |
Masciocchi et al. | Alternatives to eye tracking for predicting stimulus-driven attentional selection within interfaces | |
Olejarczyk et al. | Incidental memory for parts of scenes from eye movements | |
Lagun et al. | Understanding mobile searcher attention with rich ad formats | |
Göbel et al. | FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps | |
CN109302301A (en) | Method and device for evaluating a business game | |
Golenia et al. | Implicit relevance feedback from electroencephalography and eye tracking in image search | |
Ni et al. | Touch saliency: Characteristics and prediction | |
Wedel et al. | Eye tracking methodology for research in consumer psychology | |
Rajashekar et al. | Foveated analysis of image features at fixations | |
Popelka et al. | Advanced map optimalization based on eye-tracking | |
Thurman et al. | Diagnostic spatial frequencies and human efficiency for discriminating actions | |
Rogalska et al. | Blinking extraction in eye gaze system for stereoscopy movies | |
EP3602343B1 (en) | Media content tracking | |
Xu et al. | Analyzing students' attention by gaze tracking and object detection in classroom teaching | |
CN104318223A (en) | Face distinguishing feature position determining method and system | |
Djamasbi et al. | Search results pages and competition for attention theory: An exploratory eye-tracking study |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20181204 |