CN115705088A - Calibration method, calibration device, terminal equipment and storage medium - Google Patents


Info

Publication number
CN115705088A
CN115705088A (application CN202110930920.3A)
Authority
CN
China
Prior art keywords
calibration
behavior data
data set
gazing
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110930920.3A
Other languages
Chinese (zh)
Inventor
Zhang Zhen (张朕)
(one inventor requested that their name not be published)
Current Assignee
Beijing 7Invensun Technology Co Ltd
Original Assignee
Beijing 7Invensun Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing 7Invensun Technology Co Ltd filed Critical Beijing 7Invensun Technology Co Ltd
Priority claimed from application CN202110930920.3A
Published as CN115705088A
Legal status: Pending

Landscapes

  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a calibration method, a calibration apparatus, a terminal device and a storage medium. The method comprises the following steps: acquiring a gaze behavior data set of a user while a calibrator flickers at a set frequency, wherein the calibrator is a target object selected from the usage scene for calibration, and the gaze behavior data included in the gaze behavior data set comprise eye feature data and electroencephalogram data; extracting an effective gaze behavior data set from the gaze behavior data set according to the electroencephalogram data and the set frequency; determining a calibration coefficient according to the eye feature data included in the effective gaze behavior data set and the position information of the calibrator; and completing calibration of the terminal device according to the calibration coefficient. With this method, calibration of the terminal device can be completed while the terminal device is in use, achieving the technical effect of unconscious calibration for eye tracking.

Description

Calibration method, calibration device, terminal equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of eye movement tracking, in particular to a calibration method, a calibration device, terminal equipment and a storage medium.
Background
Eye tracking is a technology for estimating the position coordinates of a user's gaze point through a software and hardware system, collecting and analyzing where people look at any time and in any place. It is widely applied in scientific research experiments, VR/AR immersive interaction, commercial testing and other fields.
In the multi-point calibration process of a terminal device such as an eye tracking device, the device needs to correctly associate the collected eye feature data with the position information of the calibration point the user is gazing at. Prior art schemes therefore require that, before the user uses the eye tracking function, the device first open a separate calibration interface that guides the user to concentrate and gaze at the calibration points in sequence, thereby completing the calibration process. If the calibration result is poor, the user is replaced, the user moves the head by a large amount, or the user adjusts the position of a head-mounted display, the user must re-enter the calibration interface and calibrate again, which degrades the user experience.
Disclosure of Invention
The embodiment of the invention provides a calibration method, a calibration device, terminal equipment and a storage medium, which effectively improve the efficiency of calibrating the terminal equipment and improve the use experience of a user.
In a first aspect, an embodiment of the present invention provides a calibration method, including:
acquiring a gaze behavior data set of a user while a calibrator flickers at a set frequency, wherein the calibrator is a target object selected from the usage scene for calibration, and the gaze behavior data included in the gaze behavior data set comprise eye feature data and electroencephalogram data;
extracting an effective fixation behavior data set from the fixation behavior data set according to the electroencephalogram data and the set frequency;
determining a calibration coefficient according to the eye feature data included in the effective fixation behavior data set and the position information of the calibration object;
and finishing the calibration of the terminal equipment according to the calibration coefficient.
Optionally, the number of the calibrators is at least one, and when the number of the calibrators is at least two, the set frequencies corresponding to the calibrators are not equal.
Optionally, when the number of the calibrators is at least two, each of the calibrators is displayed simultaneously.
Optionally, the selection criterion of the calibrator satisfies at least one of the following: the display area is less than or equal to a set area threshold; the display duration is greater than or equal to a set duration; the flicker attribute indicates that the object is allowed to flicker.
Optionally, extracting an effective gazing behavior data set from the gazing behavior data set according to the electroencephalogram data and the set frequency, including:
according to a time period corresponding to a target frequency, splitting the gazing behavior data set to obtain a gazing behavior data subset;
for each fixation behavior data subset, determining, according to the electroencephalogram data of the fixation behavior data subset and the electroencephalogram waveform features, whether the eye feature data included in the fixation behavior data subset are eye feature data acquired while the user gazed at the calibrator, and if yes, determining the fixation behavior data subset as an effective fixation behavior data subset;
wherein the electroencephalogram waveform feature is the waveform feature of the electroencephalogram signal evoked when the calibrator flickers at the set frequency; when there is one calibrator, the target frequency is the set frequency; when there are at least two calibrators, the target frequency is the largest of the set frequencies corresponding to the calibrators.
Optionally, extracting an effective gazing behavior data set from the gazing behavior data set according to the electroencephalogram data and the set frequency, including:
extracting eye feature data corresponding to electroencephalogram data with electroencephalogram waveform features from the gazing behavior data set;
and determining the extracted eye feature data as an effective fixation behavior data set.
Optionally, when the number of the calibrators is one, completing calibration of the terminal device according to the calibration coefficient, including: correcting the sight line estimation algorithm model according to the calibration coefficient;
when the number of the calibrators is at least two, completing calibration of the terminal device according to the calibration coefficient, including:
and after the calibration coefficients corresponding to all the calibrators are determined, correcting the sight line estimation algorithm model according to each calibration coefficient.
In a second aspect, an embodiment of the present invention further provides a calibration apparatus, including:
the device comprises an acquisition module, an extraction module, a determining module and a calibration module, wherein the acquisition module is used for acquiring a gazing behavior data set of a user while a calibrator flickers at a set frequency, the calibrator is a target object selected from the usage scene for calibration, and the gazing behavior data included in the gazing behavior data set comprise eye feature data and electroencephalogram data;
the extraction module is used for extracting an effective gazing behavior data set from the gazing behavior data set according to the electroencephalogram data and the set frequency;
the determining module is used for determining a calibration coefficient according to the eye feature data included in the effective gazing behavior data set and the position information of the calibrator;
and the calibration module is used for completing calibration of the terminal equipment according to the calibration coefficient.
In a third aspect, an embodiment of the present invention further provides a terminal device, including:
one or more processors;
storage means for storing one or more programs;
the one or more programs are executed by the one or more processors to cause the one or more processors to implement the calibration method provided by the embodiments of the present invention.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the calibration method provided by the embodiment of the present invention.
The embodiments of the invention provide a calibration method, a calibration apparatus, a terminal device and a storage medium. First, a gazing behavior data set of a user is acquired while a calibrator flickers at a set frequency, where the calibrator is a target object selected from the usage scene for calibration, and the gazing behavior data included in the gazing behavior data set comprise eye feature data and electroencephalogram data. Second, an effective gazing behavior data set is extracted from the gazing behavior data set according to the electroencephalogram data and the set frequency. Next, a calibration coefficient is determined according to the eye feature data included in the effective gazing behavior data set and the position information of the calibrator. Finally, calibration of the terminal device is completed according to the calibration coefficient. With this technical scheme, calibration of the terminal device can be completed while the terminal device is in use, achieving the technical effect of unconscious calibration for eye tracking.
Drawings
Fig. 1 is a schematic flowchart of a calibration method according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a calibration method according to an exemplary embodiment of the present invention;
FIG. 3 is a schematic diagram of an EEG waveform provided by an exemplary embodiment of the present invention;
fig. 4 is a schematic structural diagram of a calibration apparatus according to a second embodiment of the present invention;
fig. 5 is a schematic structural diagram of a terminal device according to a third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in greater detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like. In addition, the embodiments and features of the embodiments in the present invention may be combined with each other without conflict.
The term "including" and variations thereof as used herein is intended to be open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment".
It should be noted that the concepts of "first", "second", etc. mentioned in the present invention are only used for distinguishing corresponding contents, and are not used for limiting the order or interdependence relationship.
It is noted that the modifiers "a", "an" and "the" in the present invention are illustrative rather than limiting, and those skilled in the art will understand them to mean "one or more" unless the context clearly dictates otherwise.
Example one
Fig. 1 is a flowchart of a calibration method according to an embodiment of the present invention. The method is applicable to calibrating a terminal device and may be executed by a calibration apparatus, which may be implemented in software and/or hardware and is generally integrated in the terminal device. In this embodiment, terminal devices include, but are not limited to, devices with an eye tracking function such as computers and mobile phones.
The accuracy of eye tracking is an important index for measuring the reliability of the eye tracking function. It is described as the deviation between the position coordinates of the user's gaze point obtained by the sight line estimation algorithm and the position coordinates of the gazed object, and is usually expressed as the visual angle subtended at the user's eye by the line connecting the two position coordinates.
Since eye tracking technology has extremely high accuracy requirements, users usually need to first enter a calibration interface to complete calibration before using the device with eye tracking function. The calibration means that: the system acquires the eye feature data of the user while the user focuses on one or more calibration points, and performs correlation calculation on the calibration point position coordinates (namely the position information of the calibration object) corresponding to each calibration point and the eye feature data of the user in the current calibration point display process to obtain the calibration coefficient.
To ensure the accuracy of eye tracking, the system often requires the user to cooperate in multi-point calibration. In the prior art, the calibration points must be displayed one by one, and the user must immediately concentrate and gaze at each calibration point from the moment it starts being displayed until it stops being displayed. The system continuously acquires eye feature data during this process and associates the eye feature data with the position coordinates of the currently displayed calibration point.
In the multi-point calibration process of an eye tracking system, the system needs to correctly associate the acquired eye feature data with the position coordinates of the calibration point the user is gazing at. Prior art schemes therefore require that, before the eye tracking function is used, the system first open a separate calibration interface that guides the user to concentrate and gaze at the calibration points in sequence, thereby completing calibration. Once the calibration result is poor (possibly because the user was inattentive during calibration), the user is replaced, the user moves the head by a large amount (for a remote eye tracker), or the user adjusts the position of the head-mounted display (for a wearable eye tracker), the user is often required to quit the browsing interface and re-enter the calibration interface to calibrate again.
When a user is browsing content, performing immersive interaction in VR/AR scenes, and the like, the above situations may seriously affect the user experience, and increase the time cost of the user when using the eye tracking function.
In order to solve the above technical problem, as shown in fig. 1, a calibration method according to a first embodiment of the present invention includes the following steps:
and S110, acquiring a gazing behavior data set of the user when the calibrator flickers at a set frequency.
In this embodiment, the calibration object is a target object selected in a usage scenario for calibration, and the gaze behavior data included in the gaze behavior data set includes eye characteristic data and electroencephalogram data.
The selection of the calibration object is not limited herein, and may be determined based on the usage scenario, such as selecting the calibration object from the usage scenario of the terminal device based on the display area, the display duration and the flicker property of the object in the usage scenario.
For example, an object in the scene with a display area less than or equal to a set area threshold may be selected as the calibration object. The setting of the set area threshold may be determined based on the usage scenario. The smaller the display area, the higher the accuracy that can be achieved after calibration.
The present embodiment does not limit the specific content of the eye feature data and the electroencephalogram data; for example, the eye feature data may include pupil position coordinates, Purkinje-image (glint) position coordinates, and the like, and the electroencephalogram data may include a steady-state visual evoked potential (SSVEP) signal, a P300 evoked potential signal, and the like. It should be noted that each item of gazing behavior data comprises a plurality of data items, and all data items within one item of gazing behavior data must be acquired synchronously, i.e., with the same system time stamp.
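As a concrete illustration of the synchronization requirement above, one per-sample record might be modeled as follows. This is a minimal sketch: the field names, the placeholder values, and the 250 Hz sampling rate are this sketch's own assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class GazeBehaviorSample:
    """One synchronously acquired sample: every field shares the same
    system time stamp, as the text requires."""
    timestamp: float    # shared system time stamp, in seconds
    pupil_xy: tuple     # pupil position coordinates
    glint_xy: tuple     # Purkinje-image (glint) position coordinates
    eeg_uv: float       # one EEG channel sample, in microvolts

# A gazing behavior data set is then a time-ordered list of such samples;
# here, two seconds of placeholder data at the assumed 250 Hz rate.
dataset = [GazeBehaviorSample(t / 250.0, (0.0, 0.0), (0.0, 0.0), 0.0)
           for t in range(500)]
```

Keeping all modalities in one timestamped record makes the later per-period splitting and EEG-based filtering straightforward, since no cross-stream alignment is needed.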
Note that at least one calibrator is displayed during calibration. Each calibrator corresponds to one gazing behavior data set, which is used to determine the calibration coefficient corresponding to that calibrator. Each gazing behavior data set may comprise a plurality of gazing behavior data; for example, all gazing behavior data collected from the moment one calibration object starts being displayed until it stops being displayed may be determined as one gazing behavior data set. The electroencephalogram data are used to filter out of the gazing behavior data set the eye feature data acquired when the user was not gazing at the calibration object, thereby improving the accuracy of the calibration coefficient.
In this embodiment, the calibrator flickers at a set frequency in the usage scene, so that gazing at the calibrator evokes a corresponding electroencephalogram signal in the user.
A gazing behavior data set of the user is acquired during the display of the calibrator, and the electroencephalogram data in the gazing behavior data set are used to determine whether the corresponding eye feature data were acquired while the user gazed at the calibrator, thereby improving calibration efficiency.
The present embodiment does not limit the setting frequency, and may be determined based on the usage scenario.
The technical means for acquiring the gazing behavior data set is not limited in the step, and if different data correspond to different collectors, the corresponding data acquired by the collectors can be acquired in the step to form the gazing behavior data set.
In one embodiment, the number of the calibrators is at least one, and when the number of the calibrators is at least two, the set frequencies corresponding to the calibrators are not equal.
In this embodiment, a display page of the usage scenario of the terminal device may include a plurality of calibrators, and each calibrator flickers at a different set frequency, so that different calibrators can be distinguished.
When calibration is carried out, this embodiment may prompt the user that the calibration stage is currently in progress; the prompting means is not limited, for example, a display prompt or a voice broadcast. When at least two calibrators are present in one display page, the calibrators may flicker at their set frequencies simultaneously, or flicker sequentially at different or identical set frequencies.
In one embodiment, when the number of the calibrators is at least two, each of the calibrators is simultaneously displayed.
Each calibrator may flicker at a different set frequency while it is displayed.
In one embodiment, the selection criterion of the calibrator satisfies at least one of the following: the display area is less than or equal to a set area threshold; the display duration is greater than or equal to a set duration; the flicker attribute indicates that the object is allowed to flicker.
The display area may be considered as the area that an object in the usage scene occupies in the displayed page. The display duration may be considered as the length of time for which an object in the usage scene is continuously displayed. The flicker attribute characterizes whether an object in the usage scene is able to flicker, and is determined from the usage scenario. The specific values of the area threshold and the set duration are not limited.
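The three selection criteria above can be sketched as a simple filter. The dictionary keys, the example objects, and the threshold values below are hypothetical, chosen only to illustrate the selection logic:

```python
def select_calibrators(objects, area_threshold, min_duration):
    """Keep only scene objects meeting the three selection criteria:
    small enough, displayed long enough, and allowed to flicker."""
    return [o for o in objects
            if o["area"] <= area_threshold      # smaller -> higher accuracy
            and o["duration"] >= min_duration   # displayed long enough
            and o["can_flicker"]]               # flicker attribute allows it

scene = [
    {"id": "icon",  "area": 40.0,  "duration": 5.0, "can_flicker": True},
    {"id": "video", "area": 900.0, "duration": 9.0, "can_flicker": True},
    {"id": "label", "area": 30.0,  "duration": 0.5, "can_flicker": False},
]
picked = select_calibrators(scene, area_threshold=100.0, min_duration=2.0)
# only the small, persistent, flicker-capable "icon" object survives
```

In practice the thresholds would be tuned per usage scenario, as the text notes that their specific values are not limited.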
And S120, extracting an effective fixation behavior data set from the fixation behavior data set according to the electroencephalogram data and the set frequency.
The effective gazing behavior data set may be considered as the data acquired while the user gazed at the calibration point, such as the acquired electroencephalogram data and the acquired eye feature data.
This embodiment may determine a corresponding time period based on the set frequency, and then determine, based on that time period, the electroencephalogram waveform features that the electroencephalogram data should exhibit when the user gazes at the calibrator. The electroencephalogram data are filtered based on the determined waveform features to screen out the effective gazing behavior data set. Electroencephalogram waveform features include, but are not limited to: the latency range, the amplitude ranges of wave crests and troughs, and the like.
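A minimal sketch of such a waveform-feature test follows. The disclosure names latency and crest/trough amplitude features; checking for a dominant spectral peak at the set frequency instead is this sketch's own simplification, and the 250 Hz rate and 3x dominance ratio are assumptions:

```python
import numpy as np

def has_ssvep_feature(eeg_segment, set_freq_hz, sample_rate_hz, ratio=3.0):
    """Return True if the EEG segment's spectrum has a dominant peak at the
    calibrator's set (flicker) frequency -- a crude stand-in for the
    latency and crest/trough checks described in the text."""
    spectrum = np.abs(np.fft.rfft(eeg_segment))
    freqs = np.fft.rfftfreq(len(eeg_segment), d=1.0 / sample_rate_hz)
    k = int(np.argmin(np.abs(freqs - set_freq_hz)))  # bin nearest flicker rate
    background = (spectrum.sum() - spectrum[k]) / (len(spectrum) - 1)
    return bool(spectrum[k] > ratio * background)

# One second of a clean 10 Hz oscillation sampled at 250 Hz: the evoked
# response a 10 Hz calibrator would ideally produce while being gazed at.
t = np.arange(250) / 250.0
gazing = has_ssvep_feature(np.sin(2 * np.pi * 10.0 * t), 10.0, 250.0)
# gazing is True for this synthetic segment
```

Segments failing the check would be discarded, which is exactly the filtering role the text assigns to the electroencephalogram data.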
In one embodiment, when extracting effective gazing behavior data, this step may directly analyze all electroencephalogram data in the gazing behavior data set, and extract the eye feature data corresponding to electroencephalogram data that satisfy the electroencephalogram waveform features, to form the effective gazing behavior data set.
In one embodiment, when extracting effective gazing behavior data, this step may split the gazing behavior data set into a plurality of gazing behavior data subsets and then analyze each subset to obtain the effective gazing behavior data set. The splitting criterion is not limited here; for example, the flicker period of the calibrator may be used as the splitting criterion, with the gazing behavior data of a set number of periods forming one gazing behavior data subset. The set number is not limited.
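The period-based splitting described above can be sketched as follows; the sampling rate and the choice of one subset per flicker period of the target frequency are illustrative assumptions:

```python
def split_by_period(samples, target_freq_hz, sample_rate_hz):
    """Split a time-ordered gazing behavior data set into subsets, one per
    flicker period of the target frequency (the highest set frequency
    when several calibrators are shown)."""
    per_period = int(sample_rate_hz // target_freq_hz)  # samples per period
    return [samples[i:i + per_period]
            for i in range(0, len(samples), per_period)]

samples = list(range(1000))   # stand-in for 1000 synchronized samples
subsets = split_by_period(samples, target_freq_hz=10.0, sample_rate_hz=250.0)
# 10 Hz flicker sampled at 250 Hz -> 25 samples per subset, 40 subsets
```

Each resulting subset can then be judged valid or invalid independently, which is why the text notes that splitting speeds up processing.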
S130, determining a calibration coefficient according to the eye feature data included in the effective gazing behavior data set and the position information of the calibration object.
This step does not limit the specific technical means of determining the calibration coefficient. For example, the user's gaze point position information, such as gaze point position coordinates, may first be determined based on the eye feature data; the calibration coefficient is then determined based on the gaze point position information and the position information of the calibration point.
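One plausible way to realize the association just described is an ordinary least-squares fit from raw gaze estimates to the known calibration point coordinates. The affine model form below is an assumption of this sketch; the disclosure does not fix the form of the calibration coefficients:

```python
import numpy as np

def fit_calibration(raw_gaze_xy, calib_xy):
    """Least-squares fit of an affine map taking raw gaze point estimates to
    the known calibration point coordinates; the returned 3x2 matrix plays
    the role of the calibration coefficients."""
    raw = np.asarray(raw_gaze_xy, dtype=float)
    tgt = np.asarray(calib_xy, dtype=float)
    A = np.hstack([raw, np.ones((len(raw), 1))])  # rows of [x, y, 1]
    coef, *_ = np.linalg.lstsq(A, tgt, rcond=None)
    return coef

def apply_calibration(coef, xy):
    """Correct one raw gaze estimate with the fitted coefficients."""
    return tuple(np.array([xy[0], xy[1], 1.0]) @ coef)

# Raw estimates offset by (+5, -3) from the true calibrator positions:
true_pts = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
raw_pts = [(x + 5.0, y - 3.0) for x, y in true_pts]
coef = fit_calibration(raw_pts, true_pts)
# apply_calibration(coef, (55.0, 47.0)) recovers approximately (50.0, 50.0)
```

A linear model keeps the fit stable with the handful of calibration objects a scene typically offers; richer models would simply change the design matrix.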
And S140, completing the calibration of the terminal equipment according to the calibration coefficient.
The technical means for completing calibration of the terminal device based on the calibration coefficient is not limited.
In the calibration method provided by this embodiment of the invention, first, a gazing behavior data set of the user is acquired while a calibrator, a target object selected from the usage scene for calibration, flickers at a set frequency; the gazing behavior data include eye feature data and electroencephalogram data. Second, an effective gazing behavior data set is extracted from the gazing behavior data set according to the electroencephalogram data and the set frequency. Next, a calibration coefficient is determined according to the eye feature data included in the effective gazing behavior data set and the position information of the calibration object. Finally, calibration of the terminal device is completed according to the calibration coefficient. With this method, the calibration object is selected within the usage scene, calibration can be completed while the terminal device is in use, and the technical effect of unconscious calibration for eye tracking is achieved.
On the basis of the above-described embodiment, a modified embodiment of the above-described embodiment is proposed, and it is to be noted here that, in order to make the description brief, only the differences from the above-described embodiment are described in the modified embodiment.
In one embodiment, extracting an effective gaze behavior data set from the gaze behavior data set according to the brain electrical data and the set frequency comprises:
according to a time period corresponding to the target frequency, splitting the gazing behavior data set to obtain a gazing behavior data subset;
for each gazing behavior data subset, determining, according to the electroencephalogram data of the gazing behavior data subset and the electroencephalogram waveform features, whether the eye feature data included in the gazing behavior data subset are eye feature data acquired while the user gazed at the calibration object, and if yes, determining the gazing behavior data subset as an effective gazing behavior data subset;
wherein the electroencephalogram waveform feature is the waveform feature of the electroencephalogram signal evoked when the calibrator flickers at the set frequency; when there is one calibrator, the target frequency is the set frequency; when there are at least two calibrators, the target frequency is the largest of the set frequencies corresponding to the calibrators.
Splitting the gaze behavior data set and then analyzing for each subset of gaze behavior data can speed up processing speed.
Whether the eye feature data corresponding to the electroencephalogram data in each gazing behavior data subset are eye feature data of the user gazing at the calibration object is determined from the waveform features of the electroencephalogram data: when the electroencephalogram data exhibit the electroencephalogram waveform features, the corresponding eye feature data were collected while the user gazed at the calibration object; otherwise, they were collected while the user was not gazing at the calibration object.
In one embodiment, extracting an effective gaze behavior data set from the gaze behavior data set according to the brain electrical data and the set frequency comprises:
extracting eye feature data corresponding to electroencephalogram data with electroencephalogram waveform features from the gazing behavior data set;
and determining the extracted eye feature data as an effective fixation behavior data set.
The present embodiment may directly analyze the gaze behavior data set to screen out a valid gaze behavior data set.
In one embodiment, when the number of the calibrators is one, completing calibration of the terminal device according to the calibration coefficient, including: correcting a sight line estimation algorithm model according to the calibration coefficient;
when the number of the calibrators is at least two, completing calibration of the terminal device according to the calibration coefficient, including:
and after the calibration coefficients corresponding to all the calibrators are determined, correcting the sight line estimation algorithm model according to each calibration coefficient.
The sight line estimation algorithm model may be regarded as the model that performs sight line estimation. In this embodiment, after the calibration coefficients are determined, the sight line estimation algorithm model is corrected according to all the calibration coefficients, thereby completing calibration of the terminal device. When there is one calibration object, the terminal device is calibrated directly once its calibration coefficient is determined; when there are a plurality of calibrators, the terminal device is calibrated after the calibration coefficient corresponding to each calibrator has been determined.
It should be noted that the calibration object in the present invention can be regarded as a calibration point; the difference is that the calibration object in the present application is an object already present in the scene in which the terminal device is used. The invention can thus complete calibration directly during use of the terminal device, without providing a dedicated calibration interface.
The present invention is described below by way of example. The calibration method provided by this exemplary embodiment may be regarded as an eye tracking unconscious calibration method, whose technical principle is as follows: when a user gazes at an object flickering at a fixed frequency, the user's brain wave signal exhibits regular fluctuation; from this, the coordinates of the object the user is gazing at can be determined, and the terminal device can be calibrated.
Eye tracking unconscious calibration system: the device comprises a data acquisition module (namely an acquisition module), a data screening module (namely an extraction module) and a data calculation module (namely a calibration module).
1. The data acquisition module can automatically select a calibration object, namely a calibration object, in the scene according to a preset selection requirement, and acquires the gazing behavior data of the user while the calibration object flickers according to different frequencies (set frequencies). The fixation behavior data comprises eye characteristic data and electroencephalogram data.
2. The data screening module can judge which calibration object the collected gazing behavior data correspond to the user gazing at according to the electroencephalogram data, and then an effective gazing behavior data set is integrated.
3. The data calculation module associates the eye feature data in the effective gazing behavior data set with the calibration point position coordinates and calculates the calibration coefficients.
Fig. 2 is a schematic flow chart of a calibration method according to an exemplary embodiment of the present invention. Referring to fig. 2, the eye tracking unconscious calibration method comprises the following steps:
A calibration object selection requirement is preset.
The user directly enters the usage scenario. When eye tracking calibration begins, the system automatically selects a number of calibration objects, confirms the position coordinates of the corresponding calibration points, and collects the user's gazing behavior data. Each calibration object flickers at a different frequency. The calibration point position coordinates are determined from the geometric center of the calibration object. The gazing behavior data comprises eye feature data and electroencephalogram data.
The gazing behavior data is divided into a number of gazing behavior data packets.
For each gazing behavior data packet, whether the eye feature data it contains was recorded while the user gazed at a calibration object is judged according to the electroencephalogram data in the packet. If so, which calibration object the user was gazing at is further confirmed; if not, the packet is invalid.
The eye feature data recorded while the user gazed at a given calibration object is integrated into the corresponding effective gazing behavior data set.
The eye feature data in the effective gazing behavior data set is associated with the calibration point position coordinates, and the calibration coefficient is calculated.
The correction of the line-of-sight estimation algorithm model is completed based on all the calibration coefficients, and the calibration is finished.
The calibration point in this example may be considered a calibrator.
The eye tracking unconscious calibration method is described in detail below:
Step one: a selection requirement for the calibration object is preset; the selection requirement may be a threshold range of the display area of the object.
The threshold range of the display area of the calibration object may be determined based on the display area of the terminal device. For example, for a display with 1920 × 1080 resolution, the threshold may be set to 100 to 200 pixels when the scene demands high eye tracking accuracy, and to 500 to 2000 pixels when the demand for accuracy is low.
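Purely as an illustrative sketch (the object record layout and field names here are assumptions, not part of this disclosure), applying a selection requirement of this kind to candidate scene objects could look as follows:

```python
def select_calibration_objects(scene_objects, min_area_px, max_area_px,
                               min_duration_s):
    """Filter candidate calibration objects against a preset selection
    requirement: display area inside the threshold range, display duration
    long enough, and a blinking attribute that permits flickering.

    scene_objects : iterable of dicts such as
        {"name": ..., "area_px": ..., "display_s": ..., "can_blink": ...}
    (field names are illustrative assumptions)
    """
    return [obj for obj in scene_objects
            if min_area_px <= obj["area_px"] <= max_area_px  # e.g. 100-200 px
            and obj["display_s"] >= min_duration_s           # shown long enough
            and obj["can_blink"]]                            # may be flickered
```

For a high-accuracy scene on a 1920 × 1080 display, `min_area_px=100, max_area_px=200` would match the threshold range suggested above.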
Step two: the system directly displays the scene content the user needs to browse. When eye tracking calibration begins, the system automatically selects a number of calibration objects in the scene content according to the preset selection requirement, and the geometric center of each calibration object is taken as the position coordinates of its calibration point. The geometric center of a calibration object can be defined as the intersection between the longest axis (the line connecting the pair of points on the outer contour that are farthest apart) and the shortest axis of the outer contour of the object.
Step three: each calibration object starts to flicker at a different frequency while the system starts to collect the user's gazing behavior data. The gazing behavior data comprises eye feature data and electroencephalogram data.
Step four: the gazing behavior data is divided into several gazing behavior data packets, using the fastest flicker frequency among all flickering calibration objects as the criterion.
For example, if the flicker frequencies are 5 Hz, 3 Hz and 2 Hz, then with 5 Hz as the standard, every 200 ms of collected gazing behavior data is regarded as one gazing behavior data packet.
Step five: for each gazing behavior data packet, whether the eye feature data it contains was recorded while the user gazed at a calibration object is judged according to the electroencephalogram data in the packet. If so, which calibration object the user was gazing at is further confirmed; if not, the gazing behavior data packet is invalid.
For example, based on the flicker frequency of each calibration object, the brain wave waveform characteristics of the SSVEP signal or P300 signal that the user should generate when gazing at that calibration object are determined; the waveform characteristics may be a latency duration range, peak ranges of the peaks and valleys, and the like. If the electroencephalogram data in a gazing behavior data packet exhibits the waveform characteristics expected when the user gazes at a certain calibration object, the eye feature data in that packet is considered to be eye feature data of the user gazing at that calibration object; otherwise, the eye feature data was not recorded while the user gazed at a calibration object and is invalid data.
Fig. 3 is a schematic diagram of an electroencephalogram according to an exemplary embodiment of the present invention. Take as an example acquiring the P300 signal on the Pz electrode (top midline electrode) of the user's head, where the flicker frequency of a certain calibration point is 2 Hz, i.e., one flash every 500 ms, forming a visual stimulus to the user. When the 6.7 μV peak of the P300 signal in the electroencephalogram appears once every 500 ms, at the same frequency, it can be determined that the user is currently gazing at the first calibration point. As shown in fig. 3, the P300 peaks from 300 ms to 1800 ms in the diagram recur at the same frequency as the flicker of the calibration point, so the eye feature data in the gazing behavior data packets within this 1500 ms span is eye feature data of the user gazing at that calibration object.
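As a minimal sketch of the periodicity check illustrated by Fig. 3 (the local-maximum peak detector and the tolerance parameter are simplifying assumptions; practical P300 detection involves filtering, averaging and classification):

```python
def peaks_match_flicker(eeg_uv, fs_hz, flicker_hz,
                        peak_threshold_uv=5.0, tol=0.2):
    """Return True when prominent EEG peaks (such as the 6.7 uV P300 peaks
    of Fig. 3) recur at the calibration object's flicker period.

    eeg_uv     : EEG samples in microvolts
    fs_hz      : sampling rate in Hz
    flicker_hz : the object's set flicker frequency (2 Hz -> every 500 ms)
    tol        : allowed relative deviation of the inter-peak interval
    """
    # Local maxima above the amplitude threshold stand in for P300 peaks.
    peak_idx = [i for i in range(1, len(eeg_uv) - 1)
                if eeg_uv[i] > peak_threshold_uv
                and eeg_uv[i] >= eeg_uv[i - 1]
                and eeg_uv[i] > eeg_uv[i + 1]]
    if len(peak_idx) < 2:
        return False
    expected = fs_hz / flicker_hz        # samples per flicker period
    gaps = [b - a for a, b in zip(peak_idx, peak_idx[1:])]
    return all(abs(g - expected) <= tol * expected for g in gaps)
```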
Step six: the eye feature data corresponding to the calibration object at which the user was gazing is integrated into an effective gazing behavior data set.
For example, the eye feature data of the user gazing at calibration object A is integrated into effective gazing behavior data set A.
Step seven: the eye feature data in the effective gazing behavior data set is associated with the calibration point position coordinates of the calibration object, and the calibration coefficient is calculated.
For example, the gaze estimation algorithm reads the pupil position coordinates and Purkinje spot position coordinates contained in eye feature data set A and calculates the current user's gaze point position coordinates. The gaze estimation algorithm model is then corrected based on the deviation between the gaze point position coordinates and the calibration point position coordinates of calibration object A, achieving the purpose of deviation correction; the correction coefficient is calibration coefficient A.
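The disclosure does not fix the form of the sight line estimation model or its correction; one hedged illustration is to realize the deviation correction as an affine least-squares fit from estimated gaze points to the known calibration point coordinates (the affine form is an assumption of this sketch, not the patented method):

```python
import numpy as np

def fit_calibration_coefficients(est_points, true_points):
    """Fit an affine correction  p_true ~= A @ p_est + b  mapping gaze
    points estimated from the eye feature data to the known calibration
    point position coordinates, by least squares.

    est_points, true_points : (N, 2) arrays of screen coordinates
    """
    est = np.asarray(est_points, dtype=float)
    true = np.asarray(true_points, dtype=float)
    X = np.hstack([est, np.ones((len(est), 1))])   # append bias column
    coef, *_ = np.linalg.lstsq(X, true, rcond=None)
    return coef[:2].T, coef[2]                     # A (2x2), b (2,)

def apply_calibration(A, b, gaze_point):
    """Correct one estimated gaze point with the fitted coefficients."""
    return np.asarray(A) @ np.asarray(gaze_point, dtype=float) + b
```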
Step eight: the correction of the line-of-sight estimation algorithm model is completed based on all the calibration coefficients, and the calibration is finished.
Before using the eye tracking function, the user does not need to complete calibration on a separate calibration interface; by directly entering the usage scenario, "invisible" calibration can be completed in an unconscious state. If recalibration becomes necessary, the user need not exit the interface being browsed; calibration can be completed directly on the current interface. This significantly improves the user experience and reduces the time cost of using the eye tracking function.
In addition to the embodiment of the eye tracking unconscious calibration method described above, in which the P300 signal is acquired on the Pz electrode (top midline electrode) of the user's head in step five, another embodiment based on the present solution may be exemplified.
SSVEP signals are collected on the Fz (frontal midline), O1 (left occipital), Oz (occipital midline) and O2 (right occipital) electrodes. Feature extraction is performed on the SSVEP signals through canonical correlation analysis (CCA): the linear combination of the multi-channel SSVEP signals with the maximum correlation coefficient is calculated, and the SSVEP signal peak and the maximum correlation coefficient are then identified based on the canonical correlation analysis, yielding the frequency of the SSVEP signal.
Further, the frequency of the SSVEP signal in each gazing behavior data packet is compared with the flicker frequencies of the calibration objects; when the SSVEP frequency matches the flicker frequency of a calibration object, the eye feature data in that packet is eye feature data of the user gazing at that calibration object.
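The CCA step of this embodiment can be sketched as follows; building sine-cosine reference signals per candidate frequency and using two harmonics are common SSVEP practice and are assumptions of this sketch, not details fixed by the disclosure:

```python
import numpy as np

def canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_frequency(eeg, fs_hz, candidate_freqs, n_harmonics=2):
    """Identify the SSVEP frequency of multi-channel EEG (e.g. Fz, O1, Oz,
    O2) by CCA against sine-cosine references at each candidate frequency.

    eeg : (n_samples, n_channels) array
    Returns the candidate frequency with the maximum canonical correlation
    and that correlation coefficient.
    """
    t = np.arange(eeg.shape[0]) / fs_hz
    best_f, best_r = None, -1.0
    for f in candidate_freqs:
        refs = np.column_stack(
            [fn(2 * np.pi * f * h * t)
             for h in range(1, n_harmonics + 1)
             for fn in (np.sin, np.cos)])
        r = canonical_corr(eeg, refs)
        if r > best_r:
            best_f, best_r = f, r
    return best_f, best_r
```

Comparing the returned frequency with each calibration object's flicker frequency then tells which object the packet's eye feature data corresponds to, as described above.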
Example two
Fig. 4 is a schematic structural diagram of a calibration apparatus according to a second embodiment of the present invention. The apparatus is applicable to the case of calibrating a terminal device; it may be implemented by software and/or hardware and is generally integrated in the terminal device.
As shown in fig. 4, the apparatus includes:
an obtaining module 31, configured to obtain a gazing behavior data set of a user when the calibration object flickers at a set frequency, where the calibration object is a target object selected in a use scene and used for calibration, and gazing behavior data included in the gazing behavior data set includes eye feature data and electroencephalogram data;
an extracting module 32, configured to extract an effective gazing behavior data set from the gazing behavior data set according to the electroencephalogram data and the set frequency;
a determining module 33, configured to determine a calibration coefficient according to the eye feature data included in the effective behavior data set and the position information of the calibration object;
and the calibration module 34 is configured to complete calibration of the terminal device according to the calibration coefficient.
In this embodiment, the apparatus first acquires, through the acquisition module 31, the gazing behavior data set of a user while the calibration object flickers at a set frequency, where the calibration object is a target object selected in the usage scenario for calibration, and the gazing behavior data included in the set comprises eye feature data and electroencephalogram data; secondly, an effective gazing behavior data set is extracted from the gazing behavior data set by the extraction module 32 according to the electroencephalogram data and the set frequency; then, a calibration coefficient is determined by the determining module 33 according to the eye feature data included in the effective gazing behavior data set and the position information of the calibration object; finally, the calibration of the terminal device is completed through the calibration module 34 according to the calibration coefficient.
The embodiment provides a calibration device, which can complete calibration of terminal equipment in the using process of the terminal equipment by selecting a calibration object in a using scene, and achieves the technical effect of eye tracking unconscious calibration.
In one embodiment, the number of the calibrators is at least one, and when the number of the calibrators is at least two, the set frequencies corresponding to the calibrators are not equal.
In one embodiment, when the number of the calibrators is at least two, each of the calibrators is simultaneously displayed.
In one embodiment, the selection criterion for the calibration object satisfies at least one of: the display area is less than or equal to a set area threshold; the display duration is greater than or equal to a set duration; and the blinking attribute indicates that the object may be flickered.
In an embodiment, the extracting module 32 is specifically configured to:
according to the time period corresponding to a target frequency, splitting the gazing behavior data set to obtain gazing behavior data subsets;
for each gazing behavior data subset, determining, according to the electroencephalogram data of the subset and the brain wave waveform characteristics, whether the eye feature data included in the subset is eye feature data of the user gazing at a calibration object, and if so, determining the subset as an effective gazing behavior data subset;
wherein the brain wave waveform characteristic is the waveform characteristic of the brain wave signal triggered when the calibration object flickers at the set frequency; when there is one calibration object, the target frequency is its set frequency; and when there are at least two calibration objects, the target frequency is the largest of the set frequencies corresponding to the calibration objects.
In an embodiment, the extracting module 32 is specifically configured to:
extracting eye feature data corresponding to electroencephalogram data with electroencephalogram waveform features from the gazing behavior data set;
and determining the extracted eye feature data as an effective fixation behavior data set.
In one embodiment, when the number of the calibrators is one, completing calibration of the terminal device according to the calibration coefficient comprises: correcting the sight line estimation algorithm model according to the calibration coefficient;
when the number of the calibrators is at least two, completing calibration of the terminal device according to the calibration coefficient, including:
and after the calibration coefficients corresponding to all the calibrators are determined, correcting the sight line estimation algorithm model according to each calibration coefficient.
The calibration device can execute the calibration method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
Example three
Fig. 5 is a schematic structural diagram of a terminal device according to a third embodiment of the present invention. As shown in fig. 5, the terminal device provided in the third embodiment of the present invention includes: one or more processors 41 and a storage device 42; there may be one or more processors 41 in the terminal device, one processor 41 being taken as an example in fig. 5; the storage device 42 is used to store one or more programs; the one or more programs are executed by the one or more processors 41, so that the one or more processors 41 implement the calibration method according to any one of the embodiments of the invention.
The terminal device may further include: an input device 43 and an output device 44.
The processor 41, the storage device 42, the input device 43 and the output device 44 in the terminal equipment may be connected by a bus or other means, and the connection by the bus is exemplified in fig. 5.
The storage device 42 in the terminal device serves as a computer-readable storage medium, and can be used to store one or more programs, which may be software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the calibration method provided in the embodiment of the present invention (for example, the modules in the calibration device shown in fig. 4 include the obtaining module 31, the extracting module 32, the determining module 33, and the calibrating module 34). The processor 41 executes various functional applications and data processing of the terminal device by running software programs, instructions and modules stored in the storage device 42, namely, implements the calibration method in the above method embodiment.
The storage device 42 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. Further, the storage 42 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, storage 42 may further include memory located remotely from processor 41, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input means 43 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. The output device 44 may include a display device such as a display screen.
And, when the one or more programs included in the above-mentioned terminal device are executed by the one or more processors 41, the programs perform the following operations:
the method comprises the steps that when a calibrator flickers at a set frequency, a user's gaze behavior data set is obtained, the calibrator is a target object selected in a use scene and used for calibration, and gaze behavior data included in the gaze behavior data set comprise eye feature data and electroencephalogram data;
extracting an effective fixation behavior data set from the fixation behavior data set according to the electroencephalogram data and the set frequency;
determining a calibration coefficient according to the eye feature data included in the effective behavior data set and the position information of the calibration object;
and finishing the calibration of the terminal equipment according to the calibration coefficient.
Example five
An embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program performs a calibration method comprising:
the method comprises the steps that when a calibrator flickers at a set frequency, a user's gaze behavior data set is obtained, the calibrator is a target object selected in a use scene and used for calibration, and gaze behavior data included in the gaze behavior data set comprise eye feature data and electroencephalogram data;
extracting an effective fixation behavior data set from the fixation behavior data set according to the electroencephalogram data and the set frequency;
determining a calibration coefficient according to the eye characteristic data included in the effective behavior data set and the position information of the calibrator;
and finishing the calibration of the terminal equipment according to the calibration coefficient.
Optionally, the program, when executed by the processor, may be further configured to perform the calibration method provided by any of the embodiments of the invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a flash Memory, an optical fiber, a portable CD-ROM, an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. A computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take a variety of forms, including, but not limited to: an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing description is only exemplary of the invention and that the principles of the technology may be employed. Those skilled in the art will appreciate that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions will now be apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A method of calibration, comprising:
the method comprises the steps of obtaining a user's gaze behavior data set when a calibrator flickers at a set frequency, wherein the calibrator is a target object selected in a use scene and used for calibration, and gaze behavior data included in the gaze behavior data set comprises eye characteristic data and electroencephalogram data;
extracting an effective fixation behavior data set from the fixation behavior data set according to the electroencephalogram data and the set frequency;
determining a calibration coefficient according to the eye characteristic data included in the effective behavior data set and the position information of the calibrator;
and finishing the calibration of the terminal equipment according to the calibration coefficient.
2. The method according to claim 1, wherein the number of the calibrators is at least one, and when the number of the calibrators is at least two, the set frequencies corresponding to the calibrators are not equal.
3. The method of claim 1, wherein each of the calibrators is simultaneously displayed when the number of calibrators is at least two.
4. The method of claim 1, wherein the selection criterion of the calibrator satisfies at least one of: the display area is less than or equal to a set area threshold; the display duration is greater than or equal to a set duration; and the blinking attribute indicates that the object may be flickered.
5. The method of claim 1, wherein extracting an effective gaze behavior data set from the gaze behavior data set based on the brain electrical data and the set frequency comprises:
according to a time period corresponding to the target frequency, splitting the gazing behavior data set to obtain a gazing behavior data subset;
for each fixation behavior data subset, determining, according to the electroencephalogram data of the subset and the brain wave waveform characteristics, whether the eye feature data included in the subset is eye feature data of the user gazing at a calibrator, and if so, determining the subset as an effective fixation behavior data subset;
wherein the brain wave waveform characteristic is the waveform characteristic of the brain wave signal triggered when the calibrator flickers at the set frequency; when there is one calibrator, the target frequency is the set frequency; and when there are at least two calibrators, the target frequency is the largest of the set frequencies corresponding to the calibrators.
6. The method of claim 1, wherein extracting an effective gaze behavior data set from the gaze behavior data set based on the brain electrical data and the set frequency comprises:
extracting eye feature data corresponding to electroencephalogram data with electroencephalogram waveform features from the gazing behavior data set;
and determining the extracted eye characteristic data as a valid gazing behavior data set.
7. The method of claim 1,
wherein, when the number of the calibrators is one, completing calibration of the terminal device according to the calibration coefficient comprises: correcting the sight line estimation algorithm model according to the calibration coefficient;
when the number of the calibrators is at least two, completing calibration of the terminal device according to the calibration coefficient, including:
and after the calibration coefficients corresponding to all the calibrators are determined, correcting the sight line estimation algorithm model according to each calibration coefficient.
8. A calibration device, comprising:
the device comprises an acquisition module, a processing module and a control module, wherein the acquisition module is used for acquiring a gazing behavior data set of a user when a calibrator flickers at a set frequency, the calibrator is a target object selected in a use scene and used for calibration, and the gazing behavior data included in the gazing behavior data set comprises eye characteristic data and electroencephalogram data;
the extraction module is used for extracting an effective gazing behavior data set from the gazing behavior data set according to the electroencephalogram data and the set frequency;
the determining module is used for determining a calibration coefficient according to the eye characteristic data included in the effective behavior data set and the position information of the calibration object;
and the calibration module is used for completing calibration of the terminal equipment according to the calibration coefficient.
9. A terminal device, comprising:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the calibration method of any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the calibration method according to any one of claims 1 to 7.
CN202110930920.3A 2021-08-13 2021-08-13 Calibration method, calibration device, terminal equipment and storage medium Pending CN115705088A (en)

Publications (1)

Publication Number CN115705088A, Publication Date 2023-02-17



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination