CN116483209A - Cognitive disorder man-machine interaction method and system based on eye movement regulation and control - Google Patents


Info

Publication number
CN116483209A
CN116483209A (application CN202310735278.2A)
Authority
CN
China
Prior art keywords
eye movement
task
training
cognitive
patient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310735278.2A
Other languages
Chinese (zh)
Inventor
王晓怡
马珠江
Current Assignee
Beijing Smart Spirit Technology Co ltd
Original Assignee
Beijing Smart Spirit Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Smart Spirit Technology Co ltd filed Critical Beijing Smart Spirit Technology Co ltd
Priority to CN202310735278.2A
Publication of CN116483209A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/40Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4076Diagnosing or monitoring particular conditions of the nervous system
    • A61B5/4088Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Public Health (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Data Mining & Analysis (AREA)
  • Biophysics (AREA)
  • Surgery (AREA)
  • General Engineering & Computer Science (AREA)
  • Pathology (AREA)
  • Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Neurology (AREA)
  • Developmental Disabilities (AREA)
  • Child & Adolescent Psychology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Human Computer Interaction (AREA)
  • Physiology (AREA)
  • Social Psychology (AREA)
  • Evolutionary Biology (AREA)
  • Educational Technology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Epidemiology (AREA)
  • Signal Processing (AREA)

Abstract

The invention discloses a cognitive impairment human-computer interaction method and system based on eye-movement regulation. The human-computer interaction method comprises the following steps: acquiring personal information of a patient; acquiring evaluation data of the patient through scale evaluation and task tests, and acquiring effective eye movement data with an eye tracker; performing interest analysis on the effective eye movement data to obtain the patient's interest feature values; and selecting training tasks according to the interest feature values and the evaluation data, and carrying out the corresponding human-computer interaction cognitive training. The invention uses eye-movement monitoring to dynamically adjust the patient's interest feature values, thereby personalizing the training tasks; the system's training tasks are updated periodically, so that the patient's periodically changing preferences for training tasks are met and the training effect is improved.

Description

Cognitive disorder man-machine interaction method and system based on eye movement regulation and control
Technical Field
The invention relates to a cognitive disorder human-computer interaction method based on eye-movement regulation, and to a corresponding cognitive disorder human-computer interaction system, belonging to the technical field of human-computer interaction.
Background
The eyes are often called the window to the mind: in natural human interaction they perceive the surrounding environment, indicate a person's attention, convey emotion, and so on. By recording features such as eye movement trajectories and pupil changes while a person processes visual information, important information about the subject's visual cognition can be collected. Eye tracking is therefore widely studied and applied in the field of human-computer interaction.
In recent years, researchers have found that abnormal eye movements in Alzheimer's disease (AD) patients are not only associated with pathological structural changes in functional brain areas but also reflect the decline of the patients' cognitive function. The characteristic eye-movement changes of Alzheimer's disease are currently considered consistent with its pathological changes and can assist early diagnosis and clinical differential diagnosis. There is growing evidence that early-stage Alzheimer's patients show subtle deficits in cognitive inhibition, gradually lose effective control of attention as the disease progresses, and exhibit impairments in inhibitory control and eye-movement error correction. These deficits can be detected with relatively simple eye tracking, yet are often missed by traditional cognitive assessment.
Existing cognitive evaluation is mainly scale-based, and the acquisition of physiological indicators such as imaging and EEG is cumbersome, easily degrading the patient's test experience and, in turn, the test results. Meanwhile, existing computerized cognitive training is mainly game-based, but different games appeal differently to different patients, and a patient's experience of a training game directly influences the training task score, i.e., the ability level the game task represents. Therefore, how to identify a patient's interest preferences among training games so as to enhance the training experience, improve training compliance, and thereby improve the training effect remains one of the main problems that computerized cognitive training faces in task planning and push-rule design.
Disclosure of Invention
The primary object of the invention is to provide a cognitive impairment human-computer interaction method based on eye-movement regulation.
Another object of the invention is to provide a corresponding cognitive impairment human-computer interaction system based on eye-movement regulation.
In order to achieve the technical purpose, the invention adopts the following technical scheme:
according to a first aspect of an embodiment of the present invention, there is provided a cognitive impairment human-computer interaction method based on eye movement regulation, including the steps of:
acquiring personal information of a patient;
acquiring evaluation data of the patient through scale evaluation and task tests, and acquiring effective eye movement data with an eye tracker;
performing interest analysis on the effective eye movement data to obtain the patient's interest feature values;
and selecting training tasks according to the interest feature values and the evaluation data, and carrying out the corresponding human-computer interaction cognitive training.
Preferably, the interest analysis inputs the effective eye movement data into a machine learning model to obtain the patient's interest feature values.
Preferably, the machine learning model is a deep interest network (DIN) model.
Wherein preferably, the training task is selected according to the interest characteristic value and the evaluation data, and the method comprises the following substeps:
selecting a plurality of training tasks with corresponding capacities according to the scores of evaluation of the patient's scale and the scores of task tests to form a task library;
and selecting a specific training task from the task library according to the interest characteristic value of the patient.
Preferably, each training task in the task library has a plurality of classification labels, and the training tasks are ranked based on these labels so that a specific training task can be pushed;
the specific training task is the training task whose task type and task style best match the task-type and task-style features in the interest feature values.
Preferably, the classification labels comprise primary labels and secondary labels, wherein:
the primary labels comprise the task type and task duration obtained in the scale evaluation;
the secondary labels comprise the task style derived from the interest feature values.
Preferably, the cognitive impairment man-machine interaction method further comprises the following steps: when a certain patient performs each training task, the eye movement instrument is used for detecting the original eye movement data of the patient in each training task, the interest characteristic value is obtained, and the interest modeling data is updated in time so as to push the next human-computer interaction cognitive training task with corresponding capability.
Preferably, the scale evaluation score is obtained with the Montreal Cognitive Assessment (MoCA) scale;
the task test score is obtained from an anti-saccade ("reverse eye jump") task test.
According to a second aspect of the embodiments of the invention, a cognitive disorder human-computer interaction system based on eye-movement regulation is provided, comprising an input/output unit, a cognitive evaluation unit, a cognitive training unit, an eye movement monitoring unit and a data processing unit, wherein:
the input/output unit is respectively connected with the cognitive evaluation unit and the cognitive training unit and is connected with the data processing unit so as to receive information from the data processing unit and realize man-machine interaction with the cognitive evaluation unit and the cognitive training unit;
the cognitive evaluation unit cooperates with the input/output unit to evaluate the patient's overall cognitive function through an electronic evaluation scale, and receives raw eye movement data from the eye movement monitoring unit to evaluate inhibitory control and eye-movement error-correction capability;
the eye movement monitoring unit is connected with the cognitive evaluation unit and is used for providing raw eye movement data for the cognitive evaluation unit so as to provide refined cognitive impairment recognition in a cognitive evaluation stage;
the data processing unit is connected with the cognition evaluation unit and is used for processing data from the electronic scale and the eye movement monitoring in the evaluation stage and carrying out fusion analysis so as to determine the cognition damage degree of a patient;
the data processing unit is also connected with the cognitive training unit and is used for processing task data generated in the training process and interest tag data extracted by eye movement monitoring so as to perform fusion calculation and establish an interest model.
Compared with the prior art, the invention has the following technical effects:
(1) More evaluation indicators can be obtained through eye tracking, improving the accuracy with which the patient's overall cognitive ability level is judged;
(2) Through eye tracking, the patient's interest feature values can be adjusted dynamically, personalizing the training tasks;
(3) The system's training tasks are updated periodically, meeting the patient's periodically changing preferences for training tasks and effectively improving the training effect.
Drawings
Fig. 1 is a schematic structural diagram of a cognitive impairment man-machine interaction system based on eye movement regulation according to an embodiment of the present invention;
fig. 2 is a workflow diagram of a cognitive impairment man-machine interaction method based on eye movement regulation according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating the calculation of interest feature values according to an embodiment of the present invention.
Detailed Description
The technical contents of the present invention will be described in detail with reference to the accompanying drawings and specific examples.
Referring to fig. 1, the invention discloses a cognitive impairment human-computer interaction system based on eye movement regulation and control. The cognitive disorder man-machine interaction system at least comprises: the system comprises an input/output unit, a cognition evaluation unit, a cognition training unit, an eye movement monitoring unit and a data processing unit. The input/output unit is connected with the cognitive evaluation unit and the cognitive training unit respectively and is connected with the data processing unit so as to receive information from the data processing unit and realize man-machine interaction with the cognitive evaluation unit and the cognitive training unit.
The cognitive evaluation unit cooperates with the input/output unit and is used to evaluate the patient's overall cognitive function through an electronic evaluation scale and to receive the raw eye movement data from the eye movement monitoring unit in order to evaluate inhibitory control and eye-movement error-correction capability. The cognitive training unit provides training tasks, such as game-based training tasks, to intervene in the patient's impaired cognitive functions.
The eye movement monitoring unit is connected with the cognitive evaluation unit and is used for providing original eye movement data for the cognitive evaluation unit so as to provide refined cognitive impairment recognition in a cognitive evaluation stage, so that the cognitive impairment condition of a patient can be judged more accurately; the eye movement monitoring unit is connected with the cognitive training unit and is used for timely providing interest analysis of the patient in the cognitive training stage so as to judge whether the cognitive condition of the patient is improved.
The data processing unit is connected on the one hand with the cognitive evaluation unit to process the data from the electronic scale and the eye movement monitoring in the evaluation stage and perform fusion analysis on them to determine the degree of the patient's cognitive impairment; on the other hand it is connected with the cognitive training unit to process the task data generated during training and the interest tag data extracted by eye movement monitoring, perform fusion calculation on them, and build the interest model, which serves as one of the important bases for training task pushing.
The cognitive disorder man-machine interaction method provided by the embodiment of the invention is described in detail below by combining the cognitive disorder man-machine interaction system. Referring to fig. 2, the method for man-machine interaction of cognitive impairment based on eye movement control specifically includes the following steps.
S10, personal information of the patient is acquired.
Specifically, the patient needs to register an account in advance, and a login account is obtained by inputting personal information, so that the cognitive impairment man-machine interaction system can be logged in.
The personal information includes at least one of age, sex, preferences and the like, and is used for evaluation by the cognitive evaluation unit and for configuring training tasks in the cognitive training unit. For example, if the patient is under 12 years old and female, pink is preferentially recommended as the color in the training scheme.
S11, acquiring original eye movement data of a patient based on an eye movement instrument.
Specifically, the raw eye movement data include, for each task test, data such as the time of first fixation and fixation duration on task elements, the number of fixations, and the number of saccades. These data are collected synchronously with eye-tracking technology while the patient performs the scale evaluation and task tests, and can improve the accuracy of diagnostic analysis of the patient's attention-related functions.
S12, acquiring evaluation data of a patient through scale evaluation and task test; and effective eye movement data is acquired using the eye movement meter.
Specifically, the evaluation data obtained by the scale evaluation and the task test includes:
(1) Cognitive assessment scale evaluation data (score 1)
In one embodiment of the invention, the Montreal Cognitive Assessment (MoCA) is used as the assessment scale, but the invention is not limited thereto; other assessment scales may be used, such as the Beck Depression Inventory, the Mood Disorder Questionnaire, the Montgomery-Asberg Depression Rating Scale, or the Patient Health Questionnaire.
MoCA is a brief Alzheimer's disease screening tool covering attention and concentration, executive functions, memory, language, visuoconstructional skills, conceptual thinking, calculation, and orientation. The maximum score is 30; a score of 26 or above is considered cognitively normal.
In one embodiment of the invention, the MoCA total score is taken as the scale score (score 1) for the patient's overall cognitive function, and the specific tasks provide the scores of the individual cognitive domains, namely attention, executive function, memory, language, visuoconstructional skill, conceptual thinking, and calculation and orientation.
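As a hedged illustration of how score 1 could be assembled, the sketch below sums hypothetical per-domain MoCA subscores into the total (the domain names and point values are illustrative, not the scale's official breakdown):

```python
# Illustrative MoCA subscores per cognitive domain (hypothetical values).
moca = {
    "attention": 5, "executive": 4, "memory": 4, "language": 5,
    "visuoconstruction": 4, "abstraction": 2, "calculation": 2,
    "orientation": 4,
}

# Score 1 is the total (0-30); 26 or above is considered cognitively normal.
total = sum(moca.values())
print(total, "normal" if total >= 26 else "possible impairment")  # 30 normal
```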
(2) Task test evaluation data (score 2)
In one embodiment of the invention, an anti-saccade ("reverse eye jump") task test is used to evaluate the patient's inhibitory control and eye-movement error-correction capability, yielding the task test evaluation data (score 2). The invention is not limited thereto; other eye-movement-based task tests, such as a natural reading paradigm or a gap-overlap paradigm, may also be used.
In the anti-saccade task test, a 1 s instruction is displayed before each trial to remind the subject to look toward the target. The fixation point is white on a black screen background and is shown for 1 s. The saccade distractor is red, and a 200 ms blank interval precedes its appearance. The distractor is then presented randomly at 4 degrees of visual angle to the left or right of the fixation target, for a presentation time of 2 s. The subject is required to fixate the center of the screen and, as soon as the distractor appears, immediately make a saccade to the position opposite the distractor on the screen. In one embodiment of the invention, a total of 24 anti-saccade trials are run, and the correct rate and error rate (0-1) are taken as the level of the patient's inhibitory control and eye-movement error-correction capability.
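The scoring of these 24 trials can be sketched as follows (the trial record fields are assumptions for illustration): a trial counts as correct when the first saccade lands on the side opposite the distractor.

```python
from dataclasses import dataclass

@dataclass
class Trial:
    distractor_side: str   # "left" or "right", at 4 deg visual angle
    saccade_side: str      # side of the subject's first saccade

def score_antisaccade(trials):
    """Return (correct_rate, error_rate) in the 0-1 range used as the
    patient's inhibitory-control / eye-movement-correction level."""
    correct = sum(t.saccade_side != t.distractor_side for t in trials)
    n = len(trials)
    return correct / n, (n - correct) / n

# 18 correct and 6 erroneous trials out of the 24 described above.
trials = [Trial("left", "right")] * 18 + [Trial("left", "left")] * 6
print(score_antisaccade(trials))  # (0.75, 0.25)
```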
Based on the raw eye movement data provided by the eye tracker, EyeLink software (eye-tracking analysis software used to study human visual behavior and cognitive processes) can record information such as eye movement trajectories, fixation points and gaze direction in real time and convert it into visualized data; effective eye movement data are obtained after the raw eye movement data are processed offline.
Specifically, all frames in the raw eye movement data in which the velocity exceeds 1500 deg/s or the acceleration exceeds 100000 deg/s² are removed (such values typically result from blinks and similar artifacts rather than normal eye movement), and noise and peaks are filtered out. The remaining raw eye movement data are then analyzed with the Eyelink detection algorithm to extract all saccades in each test task together with a series of spatial and temporal attributes of each saccade; after the data are saved, microsaccades with an amplitude smaller than 0.7 degrees are removed. The saccade latency is measured from target onset to saccade onset, and only saccades within a window of 80 to 700 ms after target onset are counted as valid, in order to exclude anticipatory eye movements, i.e., eye movements initiated before the distractor appeared.
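A minimal sketch of these offline cleaning steps, under assumed record fields (the dict keys are illustrative, not the Eyelink file format): drop frames above the velocity/acceleration thresholds, then keep only saccades of sufficient amplitude whose latency falls in the 80-700 ms window.

```python
def clean_frames(frames):
    """Remove frames caused by blinks or noise rather than eye movement."""
    return [f for f in frames
            if f["velocity"] <= 1500 and f["acceleration"] <= 100_000]

def valid_saccades(saccades):
    """Discard microsaccades (<0.7 deg) and anticipatory/late saccades."""
    return [s for s in saccades
            if s["amplitude"] >= 0.7 and 80 <= s["latency_ms"] <= 700]

frames = [{"velocity": 300,  "acceleration": 5_000},    # normal
          {"velocity": 2000, "acceleration": 5_000},    # blink artifact
          {"velocity": 400,  "acceleration": 150_000}]  # noise spike
print(len(clean_frames(frames)))  # 1

saccades = [{"amplitude": 2.1, "latency_ms": 150},   # valid
            {"amplitude": 0.4, "latency_ms": 200},   # microsaccade
            {"amplitude": 3.0, "latency_ms": 60}]    # anticipatory
print(len(valid_saccades(saccades)))  # 1
```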
S13, according to the effective eye movement data, interest analysis is carried out by using a machine learning method, and the interest characteristic value of the patient is obtained.
In one embodiment of the invention, a deep interest network (DIN) model is used for the interest feature analysis, but the invention is not limited thereto; other neural network models may also be used.
Specifically, the deep interest network model includes:
(1) Input layer: data are acquired by the eye tracker and passed to the EyeLink software for analysis to obtain the interest analysis tags.
Specifically, each training session of each patient is taken as an interest monitoring period, and each training task interface is treated as an area of interest (AOI) for interest feature extraction and analysis, covering each training task, its interaction interface, and the content elements of each training task. After the EyeLink software collects the training data, the valid eye-tracking data are fused to describe each patient's responses within the area of interest, including variables such as count, duration, and frequency, from which the interest analysis tags are obtained.
As shown in Table 1 and FIG. 3, the interest analysis tags form several levels. The primary tags (first-class tags) are task preference and interface preference. The secondary tags (second-class tags) are: under task preference, task type and task duration; under interface preference, color, shape, position, size, and text. Concretely, the primary tag task preference K contains m secondary tags: task type K1 (memory class, attention class, ...), task duration K2 with three levels (≤60 s, 61-180 s, ≥180 s), interface preference K3, ... Km. An interface preference tag Kf contains n tertiary tags: color preference Ff1, shape preference Ff2 (circle, square, triangle, polygon, irregular figure, ...), size preference Ff3 (large, medium, small), position preference Ff4 (up, down, left, right, center, and combined positions), text preference Ff5 (including text length, style, etc.), ... Kfn, where m and n are positive integers.
Table 1: Interest analysis tags
Primary tag          | Secondary tag     | Levels / tertiary tags
Task preference K    | Task type K1      | memory class, attention class, ...
Task preference K    | Task duration K2  | ≤60 s; 61-180 s; ≥180 s
Interface preference Kf | Color Ff1, Shape Ff2, Size Ff3, Position Ff4, Text Ff5 | e.g., shape: circle, square, triangle, polygon, irregular; size: large, medium, small; position: up, down, left, right, center, combined
(2) Calculation layer: each interest analysis tag is scored. The deep interest network model introduces a module called the local activation unit, which learns the relationship between candidate task content and the patient's historical behaviors, outputs the degree of correlation (i.e., a weight parameter) between the candidate task content and each historical behavior, and then computes a weighted sum over the historical behavior sequence to obtain the feature representation of the patient's interest.
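A minimal numeric sketch of this weighted sum (a plain dot product stands in for the learned activation unit, and the embeddings are made-up toy vectors; note that DIN deliberately does not softmax-normalize the weights, so their sum can reflect interest intensity):

```python
def interest_feature(candidate, history):
    """Weighted sum of historical-behavior embeddings, weighted by each
    behavior's correlation with the candidate task content."""
    # Stand-in for the local activation unit: raw dot-product scores.
    weights = [sum(c * h for c, h in zip(candidate, hist)) for hist in history]
    dim = len(candidate)
    return [sum(w, ) if False else sum(w * hist[i] for w, hist in zip(weights, history))
            for i in range(dim)]

hist = [[1.0, 0.0],   # past behavior aligned with the candidate
        [0.0, 1.0]]   # unrelated past behavior
print(interest_feature([1.0, 0.0], hist))  # [1.0, 0.0]
```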
(3) Output layer: and obtaining an interest characteristic value of the patient through calculation, and carrying out interest modeling according to the interest characteristic value so as to be used for pushing a man-machine interaction scheme of a corresponding task for the patient.
And S14, selecting a training task according to the interest characteristic value and the evaluation data of the patient, and performing corresponding human-computer interaction cognitive training.
Specifically, the method comprises the following steps:
s141, determining training content: based on the cognitive ability scores (i.e., score 1 and score 2) of the patient, a plurality of training tasks corresponding to the ability are selected to form a task library. That is, a plurality of training tasks corresponding to the ability are selected to constitute a task library based on the score 1 (attention and concentration ability, executive function, memory, language, visual construction skills, conceptual thinking, calculation and localization) and the score 2 (suppression control and eye movement correction ability).
S142, selecting a training task: after determining that a plurality of training tasks of a patient form a task library, selecting a specific training task from the task library according to the interest characteristic value of the patient.
Before a training task is selected, the tasks are ranked, and the task with the highest secondary-label score given the patient's interest feature values is pushed preferentially. Note that each training task in the task library carries the plurality of classification labels, i.e., the n labels and their associated values, which enables the ranking prior to task selection. The classification labels comprise primary and secondary labels: the primary labels cover the task type (attention class, memory class, executive function class, inhibitory control and eye-movement error-correction class, ...) and the task duration obtained in the scale evaluation; the secondary labels cover the task style, i.e., color, shape, position, size, text, ..., derived from the interest feature values.
For example, a given memory training task might carry a duration label of (50 s), a color label of (red = 70 points, blue = 16 points, green = 4 points, other = 10 points), a shape label of (square = 90 points, other = 10 points), and a text label of (phrase = 99 points, other = 1 point; positive style = 69 points, neutral = 21 points, negative = 10 points). The specific training task that is pushed is therefore the one whose task type and task style best match the task-type and task-style features in the patient's interest feature values.
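A sketch of this ranking step under a hypothetical scoring scheme (the library contents, label names, and point values are illustrative): each task's per-label scores are summed for the label values the patient prefers, and the highest-scoring task is pushed.

```python
def task_score(task_labels, interest):
    """Sum the task's points for each label value the patient prefers."""
    return sum(task_labels.get(label, {}).get(value, 0)
               for label, value in interest.items())

library = {
    "memory_A": {"color": {"red": 70, "blue": 16}, "shape": {"square": 90}},
    "memory_B": {"color": {"red": 20, "blue": 60}, "shape": {"circle": 80}},
}
# Patient's interest feature values, reduced to preferred label values.
interest = {"color": "red", "shape": "square"}

best = max(library, key=lambda t: task_score(library[t], interest))
print(best)  # memory_A
```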
S143, updating training tasks: this covers both patient-level and system-level task updates. Specifically, whenever a patient performs a training task, the eye tracker records the patient's raw eye movement data during that task, the interest feature values are computed from it, and the interest modeling data are updated in time so that the next human-computer interaction cognitive training task of the corresponding ability can be pushed. Here, the interest value means the interest feature value, and the interest modeling data are the interest model and user-interest profile of the specific user.
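One simple way to realize this timely update is to blend the newest interest feature values into the running model, for example with an exponential moving average. The smoothing factor and feature keys below are assumptions for illustration; the patent does not specify the update rule:

```python
# Illustrative sketch: refresh the interest model after each training task
# using an exponential moving average over per-style interest feature values
# derived from the latest eye movement data.

def update_interest_model(model: dict, new_features: dict, alpha: float = 0.3) -> dict:
    """Blend the newest interest feature values into the running model.
    Unseen styles are initialized at their first observed value."""
    merged = dict(model)
    for style, value in new_features.items():
        old = merged.get(style, value)
        merged[style] = (1 - alpha) * old + alpha * value
    return merged

model = {"red": 0.7, "square": 0.9}
model = update_interest_model(model, {"red": 0.5, "phrase": 0.8})
```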
By repeatedly cycling through personalized cognitive training and interest modeling, the patient's cognitive ability can be improved until the patient's condition is under control. Meanwhile, the system periodically updates the training tasks (e.g., daily, weekly or monthly), and the patient's interest feature values maintained by the system are updated accordingly, so that the tasks track the patient's periodically changing interests and the effect of human-computer interaction cognitive training is improved. Specifically, system task updates follow this principle: the total number of tasks stays unchanged, and each replacement task must be a parallel task drawn from the task library of the same type. For example, if the system maintains red-style memory task libraries for difficulty levels 1-10 (ten libraries in total), and the data generated by the patient's training on a given day include a red-style memory task of difficulty level 1, the replacement task must be drawn at random from the level-1 red library. This effectively improves the patient's compliance with cognitive training. In addition, training tasks updated after a periodic re-evaluation must be extracted according to the new evaluation result.
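The replacement rule just stated can be sketched as follows. The library structure and task names are assumptions for illustration; only the rule itself (constant task count, random draw from the same style/type/level library) comes from the description:

```python
import random

# Illustrative sketch of the system task-update rule: keep the total number
# of tasks constant and swap a completed task for a random parallel task
# drawn from the library of the same style, type and difficulty level.

LIBRARIES = {  # assumed structure: (style, type, level) -> parallel tasks
    ("red", "memory", 1): ["red-mem-A", "red-mem-B", "red-mem-C"],
}

def replace_task(done_task: str, style: str, ttype: str, level: int,
                 rng: random.Random) -> str:
    """One task out, one parallel task in: total count is unchanged."""
    pool = [t for t in LIBRARIES[(style, ttype, level)] if t != done_task]
    return rng.choice(pool)

new_task = replace_task("red-mem-A", "red", "memory", 1, random.Random(0))
```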
In summary, the cognitive impairment human-computer interaction method and system based on eye movement regulation provided by the embodiments of the invention integrate eye movement monitoring into both the cognitive evaluation process and the cognitive training process. During evaluation, the system therefore obtains more cognitive-function labels and can judge the patient's overall cognitive ability level more accurately. During training, eye movement monitoring allows changes in the patient's interest feature values to be captured in time, making the training process more personalized and effectively improving the training effect. In addition, the system periodically updates its training tasks, accommodating the patient's periodically changing task preferences and further improving the training effect.
The eye movement regulation-based cognitive impairment human-computer interaction method and system provided by the invention have been described in detail above. Any obvious modification of the invention that does not depart from its spirit constitutes an infringement of the patent right of the invention and incurs the corresponding legal liability.

Claims (9)

1. A cognitive impairment man-machine interaction method based on eye movement regulation and control is characterized by comprising the following steps:
acquiring personal information of a patient;
acquiring evaluation data of a patient through scale evaluation and a task test, and acquiring effective eye movement data by using an eye tracker;
performing interest analysis on the effective eye movement data to obtain interest feature values of the patient;
and selecting a training task according to the interest characteristic value and the evaluation data, and performing corresponding human-computer interaction cognitive training.
2. The eye movement control-based cognitive impairment human-computer interaction method of claim 1, wherein:
the interest analysis comprises inputting the effective eye movement data into a machine learning model to obtain the interest feature values of the patient.
3. The eye movement control-based cognitive impairment human-computer interaction method of claim 2, wherein:
the machine learning model is a Deep Interest Network (DIN) model.
4. The eye movement regulation-based cognitive impairment human-computer interaction method of claim 3, wherein selecting a training task based on the interest feature values and the assessment data comprises the following sub-steps:
selecting a plurality of training tasks of the corresponding abilities according to the patient's scale evaluation scores and task test scores, so as to form a task library;
and selecting a specific training task from the task library according to the interest characteristic value of the patient.
5. The eye movement regulation-based cognitive impairment human-computer interaction method of claim 4, wherein:
each training task in the task library is provided with a plurality of classification labels, the training tasks are ranked on the basis of the classification labels, and a specific training task is pushed;
the specific training task is the training task whose task type and task style match the task-type and task-style characteristics in the interest feature values.
6. The eye movement control-based cognitive impairment human-computer interaction method according to claim 5, wherein:
the classification labels comprise primary labels and secondary labels; wherein,
the first-level label comprises a task type and a task duration which are obtained in the scale evaluation;
the secondary label comprises a task style obtained from the interest feature value.
7. The eye movement regulation-based cognitive impairment human-computer interaction method of claim 6, further comprising:
when a patient performs each training task, the eye tracker detects the patient's raw eye movement data in each training task, the interest feature values are obtained, and the interest modeling data are updated in time so as to push the next human-computer interaction cognitive training task of the corresponding ability.
8. The eye movement control-based cognitive impairment human-computer interaction method of claim 6, wherein:
the score of the scale evaluation is obtained with the Montreal Cognitive Assessment (MoCA) scale;
the score of the task test is obtained with an antisaccade (reverse eye jump) task test.
9. The cognitive disorder man-machine interaction system based on eye movement regulation is characterized by comprising an input/output unit, a cognitive evaluation unit, a cognitive training unit, an eye movement monitoring unit and a data processing unit; wherein,
the input/output unit is respectively connected with the cognitive evaluation unit and the cognitive training unit and is connected with the data processing unit so as to receive information from the data processing unit and realize man-machine interaction with the cognitive evaluation unit and the cognitive training unit;
the cognitive evaluation unit cooperates with the input/output unit, evaluates the patient's overall cognitive function through an electronic evaluation scale, and receives raw eye movement data from the eye movement monitoring unit so as to evaluate inhibition control and eye movement correction ability;
the eye movement monitoring unit is connected with the cognitive evaluation unit and is used for providing raw eye movement data for the cognitive evaluation unit so as to provide refined cognitive impairment recognition in a cognitive evaluation stage;
the data processing unit is connected with the cognition evaluation unit and is used for processing data from the electronic scale and the eye movement monitoring in the evaluation stage and carrying out fusion analysis so as to determine the cognition damage degree of a patient;
the data processing unit is also connected with the cognitive training unit and is used for processing task data generated in the training process and interest tag data extracted by eye movement monitoring so as to perform fusion calculation and establish an interest model.
CN202310735278.2A 2023-06-20 2023-06-20 Cognitive disorder man-machine interaction method and system based on eye movement regulation and control Pending CN116483209A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310735278.2A CN116483209A (en) 2023-06-20 2023-06-20 Cognitive disorder man-machine interaction method and system based on eye movement regulation and control


Publications (1)

Publication Number Publication Date
CN116483209A true CN116483209A (en) 2023-07-25

Family

ID=87212213

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310735278.2A Pending CN116483209A (en) 2023-06-20 2023-06-20 Cognitive disorder man-machine interaction method and system based on eye movement regulation and control

Country Status (1)

Country Link
CN (1) CN116483209A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2754835A1 (en) * 2011-10-07 2013-04-07 Baycrest Centre For Geriatric Care Methods and systems for assessing cognitive function
CN107929007A (en) * 2017-11-23 2018-04-20 北京萤视科技有限公司 A kind of notice and visual capacity training system and method that tracking and intelligent evaluation technology are moved using eye
CN109445578A (en) * 2018-10-12 2019-03-08 上海陈天桥脑疾病研究所 A kind of cognitive ability assessment system and method
CN113192600A (en) * 2021-04-06 2021-07-30 四川大学华西医院 Cognitive assessment and correction training system based on virtual reality and eye movement tracking
CN115868939A (en) * 2023-01-13 2023-03-31 北京中科睿医信息科技有限公司 Cognitive assessment method, device, equipment, medium and computer program product


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Zhe: "Deep Learning Recommender Systems", Publishing House of Electronics Industry, pages 86-88 *

Similar Documents

Publication Publication Date Title
JP3224675U (en) Interactive and adaptive learning using pupil response, face tracking, and emotion detection, neurocognitive disorder diagnosis, and non-following detection system
Kübler et al. SubsMatch 2.0: Scanpath comparison and classification based on subsequence frequencies
Li et al. Automated detection of cognitive engagement to inform the art of staying engaged in problem-solving
US20200046277A1 (en) Interactive and adaptive learning and neurocognitive disorder diagnosis systems using face tracking and emotion detection with associated methods
JP2021516099A (en) Cognitive screens, monitors, and cognitive therapies targeting immune-mediated and neurodegenerative disorders
US20170039876A1 (en) System and method for identifying learner engagement states
KR102381088B1 (en) Psychological test system based on artificial intelligence and operation method thereof
CN113658697B (en) Psychological assessment system based on video fixation difference
Zhang et al. A human-in-the-loop deep learning paradigm for synergic visual evaluation in children
JP7099377B2 (en) Information processing equipment and information processing method
CN114969557A (en) Propaganda and education pushing method and system based on multi-source information fusion
Vakanski et al. Metrics for performance evaluation of patient exercises during physical therapy
CN111317448A (en) Method and system for analyzing visual space cognition
CN108962397B (en) Pen and voice-based cooperative task nervous system disease auxiliary diagnosis system
Xia et al. Dynamic viewing pattern analysis: towards large-scale screening of children with ASD in remote areas
CN117442154A (en) Visual detection system based on children's attention
KR101150768B1 (en) A quality test method of dermatoglyphic patterns analysis and program recording medium
Giannakos et al. Sensing-based analytics in education: The rise of multimodal data enabled learning systems
US20230105077A1 (en) Method and system for evaluating and monitoring compliance, interactive and adaptive learning, and neurocognitive disorder diagnosis using pupillary response, face tracking emotion detection
CN116483209A (en) Cognitive disorder man-machine interaction method and system based on eye movement regulation and control
Gao et al. Detecting Teacher Expertise in an Immersive VR Classroom: Leveraging Fused Sensor Data with Explainable Machine Learning Models
CN114119932A (en) VR teaching method, apparatus, electronic device, storage medium and program product
Xia et al. An interpretable English reading proficiency detection model in an online learning environment: A study based on eye movement
CN115223232A (en) Eye health comprehensive management system
Bisogni et al. Gaze analysis: A survey on its applications

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230725