CN107256332A - Electroencephalogram experiment evaluation system and method based on eye movement data - Google Patents

Electroencephalogram experiment evaluation system and method based on eye movement data Download PDF

Info

Publication number
CN107256332A
Authority
CN
China
Prior art keywords
sequence
eye movement
movement data
fixation point
participation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710378148.2A
Other languages
Chinese (zh)
Other versions
CN107256332B (en)
Inventor
吕宝粮
郑伟龙
石振锋
周畅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Zero Unique Technology Co ltd
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN201710378148.2A priority Critical patent/CN107256332B/en
Publication of CN107256332A publication Critical patent/CN107256332A/en
Application granted granted Critical
Publication of CN107256332B publication Critical patent/CN107256332B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

An electroencephalogram (EEG) experiment evaluation system and method based on eye movement data. Eye movement data of each subject are collected with an eye tracker, and a spatio-temporal model is built from the fixation points in the data. A fast dynamic time warping algorithm then computes the similarity between sequences and builds a distance matrix; a density-based clustering algorithm performs outlier detection; and, after the sequences are quantified, a ranking model trained from the clustering result yields each subject's degree of participation. The invention objectively quantifies how attentively a subject participates in an experiment and feeds this back to the experiment and the model, so as to ensure data quality and improve classification accuracy. By quantitatively assessing the degree to which subjects participate in the experiment, it provides quantitative feedback for emotion recognition experiments.

Description

Electroencephalogram experiment evaluation system and method based on eye movement data
Technical field
The present invention relates to a technology in the field of information processing, specifically an electroencephalogram (EEG) experiment evaluation system and method based on eye movement data.
Background art
Machine learning is a branch of artificial intelligence whose main subject is how to improve the performance of specific algorithms through experience. By learning mode, machine learning can be divided into: 1. supervised learning; 2. unsupervised learning; 3. semi-supervised learning; 4. reinforcement learning. Supervised learning is now mature in many fields, but its dependence on sample labels limits its further development: inaccurate labels, or sample sets so large that labelling becomes too costly, can all hurt supervised-learning accuracy. In contrast, semi-supervised learning, unsupervised learning and reinforcement learning are closer to the way humans learn: the learner observes, takes actions that affect the environment, and judges from the feedback the environment returns. Semi-supervised, unsupervised and reinforcement learning therefore occupy a very important place in machine learning, and feedback is an essential part of realizing them. The proposal of this method is thus also an important step toward better unsupervised learning.
Most experiments now require subjects to participate in data collection, so the subjects' degree of participation directly affects data quality. In an emotion recognition experiment, for example, a subject watches stimulus material that induces the corresponding emotion while EEG data are collected, and the subject's emotion while watching each stimulus clip is predicted from the EEG. If the subject dazes off, becomes absent-minded, or deliberately watches the material half-heartedly, data quality drops and model prediction accuracy falls. Before this method was proposed, feedback was usually collected by questionnaire: after watching a clip, the subject filled in an evaluation of his or her own emotion. Such feedback is overly subjective, and subjects may actively conceal or deceive; an objective way to assess the degree of participation based on real data is therefore very important.
The dynamic time warping (DTW) algorithm was once a mainstream approach to speech recognition. It combines time alignment with a distance measure: using dynamic programming, it compares two patterns of different sizes and solves the problem of widely varying speaking rates in speech recognition. Dynamic programming effectively reduces search time, but for large sample sets the O(n²) time complexity still consumes considerable computation. This method therefore adopts a fast variant of DTW, the FastDTW algorithm proposed in Stan Salvador & Philip Chan, FastDTW: Toward Accurate Dynamic Time Warping in Linear Time and Space, KDD Workshop on Mining Temporal and Sequential Data. Because eye movement sequences differ in length, applying this technique, originally intended for speech recognition, to eye movement data is an important part of this method.
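The alignment idea behind DTW can be sketched with the classic quadratic-time dynamic program below (a minimal illustration, not the FastDTW variant the patent cites, which approximates the same distance in linear time):

```python
def dtw_distance(a, b):
    """Classic O(n*m) dynamic-time-warping distance between two sequences.

    Sequences may differ in length; per-step cost is the absolute
    difference of the aligned elements.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    # dp[i][j] = cost of the best warping path aligning a[:i] with b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # stretch a
                                  dp[i][j - 1],      # stretch b
                                  dp[i - 1][j - 1])  # match step
    return dp[n][m]

# Sequences with the same shape but different "speed" align cheaply:
slow = [1, 1, 2, 2, 3, 3]
fast = [1, 2, 3]
print(dtw_distance(slow, fast))   # 0.0 - warping absorbs the tempo change
```

This is exactly the property the method relies on: two subjects scanning the same material at different paces still produce a small distance.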
The eye tracker is a scientific product of recent years. Once worn, it precisely collects information on the wearer's eye movements, including blinks, fixation points, fixation durations and pupil size, from which statistics such as blink count, fixation count, saccade count, average pupil size, average blink duration, blink frequency and fixation frequency can be computed. Yifei Lu, Wei-Long Zheng, Binbin Li and Bao-Liang Lu, Combining Eye Movements and EEG to Enhance Emotion Recognition, in Proc. of the International Joint Conference on Artificial Intelligence (IJCAI'15), showed that features extracted from eye tracker statistics can be used for emotion recognition on their own or combined into a multimodal model.
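The summary statistics listed above can be sketched as follows (the field layout is illustrative only and is not the SMI BeGaze export format):

```python
from statistics import mean

def gaze_summary(fixation_ms, pupil_mm, blink_ms):
    """Summary statistics of the kind exported by eye-tracker software.

    fixation_ms : list of fixation durations in milliseconds
    pupil_mm    : list of pupil diameters in millimetres
    blink_ms    : list of blink durations in milliseconds
    """
    return {
        "fixation_count":   len(fixation_ms),
        "mean_fixation_ms": mean(fixation_ms) if fixation_ms else 0.0,
        "blink_count":      len(blink_ms),
        "mean_blink_ms":    mean(blink_ms) if blink_ms else 0.0,
        "mean_pupil_mm":    mean(pupil_mm) if pupil_mm else 0.0,
    }

stats = gaze_summary([220, 180, 350], [3.1, 3.3, 3.2], [90, 110])
print(stats["fixation_count"], stats["mean_fixation_ms"])  # 3 250
```

Vectors of such statistics are what the cited IJCAI'15 work feeds into the emotion classifier.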
Summary of the invention
Aiming at the above deficiencies of the prior art, the present invention proposes an EEG experiment evaluation system and method based on eye movement data, which objectively quantifies how attentively a subject participates in an experiment and feeds this back to the experiment and the model, so as to ensure data quality and improve classification accuracy. By quantitatively assessing the degree to which subjects participate in the experiment, it provides quantitative feedback for emotion recognition experiments.
The present invention is achieved by the following technical solutions:
The present invention relates to an EEG experiment evaluation system based on eye movement data, comprising: an eye tracker, a distance matrix generation module, a participation detection module and an emotion recognition module, wherein: the eye tracker is connected with the distance matrix generation module and transmits eye movement data; the distance matrix generation module is connected with the participation detection module and transmits distance information; the participation detection module is connected with the emotion recognition module and transmits participation detection results and emotion recognition object information.
The eye movement data include: fixation coordinates, fixation duration, fixation-state judgment time, saccade start coordinates, saccade path, saccade duration, saccade-state judgment time, and saccade angle.
The present invention relates to an EEG experiment evaluation method based on eye movement data using the above system: eye movement data of each subject are collected with an eye tracker, and a spatio-temporal model is built from the fixation points in the data; a fast dynamic time warping algorithm then computes the similarity between sequences and builds a distance matrix; finally, a density-based clustering algorithm performs outlier detection, the sequences are quantified and ranked, and each subject's degree of participation is obtained.
The distance matrix is obtained as follows:
i) Extract, in order, all fixation points of a subject watching one stimulus clip, i.e. {(x₁, y₁, t₁), (x₂, y₂, t₂), …, (xₙ, yₙ, tₙ)}, where (xᵢ, yᵢ) are the coordinates of the i-th fixation point, tᵢ is the duration of the i-th fixation point, and n is the number of fixation points in the clip;
ii) Fixation points whose duration is below a predetermined threshold δ are treated as invalid and deleted from the sequence. Each valid fixation point is then expanded in place, i.e. its coordinates are repeated t/δ times, where t is the duration of that fixation point;
iii) Encode the fixation points from step ii) into a sequence according to screen region;
iv) Compare every pair of sequences with the dynamic time warping algorithm, obtaining a concrete number that represents the similarity of the two sequences, with smaller values indicating higher similarity;
The dynamic time warping algorithm refers to FastDTW [Stan Salvador & Philip Chan, FastDTW: Toward Accurate Dynamic Time Warping in Linear Time and Space. KDD Workshop on Mining Temporal and Sequential Data, pp. 70-80, 2004].
v) Taking the similarity value of each pair of sequences as a distance, construct the distance matrix D = (Dij) of size m × m, where Dij = Dji is the distance between the i-th sequence and the j-th sequence, and m is the total number of samples for a single stimulus clip, i.e. the number of subjects.
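Steps i)-v) can be sketched end to end as follows. This is a minimal illustration: the 1920×1080 screen, the 4×3 region grid and δ = 100 ms are assumed values the patent leaves open, and plain DTW stands in for FastDTW:

```python
def expand_fixations(fixations, delta=100):
    """Steps i-ii: drop fixations shorter than delta (ms) and repeat each
    surviving point t // delta times, encoding dwell time as repetition."""
    seq = []
    for x, y, t in fixations:
        if t < delta:
            continue  # invalid fixation point
        seq.extend([(x, y)] * (t // delta))
    return seq

def encode_regions(seq, screen=(1920, 1080), grid=(4, 3)):
    """Step iii: map each coordinate to a screen-region code (grid choice
    is illustrative; the patent only says 'according to region')."""
    w, h = screen
    gx, gy = grid
    return [min(int(x * gx / w), gx - 1) + gx * min(int(y * gy / h), gy - 1)
            for x, y in seq]

def dtw(a, b):
    """Step iv: plain O(n*m) DTW over region codes; FastDTW would replace
    this for long sequences."""
    INF = float("inf")
    dp = [[INF] * (len(b) + 1) for _ in range(len(a) + 1)]
    dp[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1])
    return dp[-1][-1]

def distance_matrix(subject_fixations):
    """Step v: symmetric m x m matrix of pairwise sequence distances."""
    codes = [encode_regions(expand_fixations(f)) for f in subject_fixations]
    m = len(codes)
    D = [[0.0] * m for _ in range(m)]
    for i in range(m):
        for j in range(i + 1, m):
            D[i][j] = D[j][i] = dtw(codes[i], codes[j])
    return D

# Two subjects scanning the clip the same way, one wandering elsewhere:
a = [(100, 100, 300), (1800, 1000, 300)]
b = [(100, 100, 300), (1800, 1000, 300)]
c = [(960, 540, 600)]
D = distance_matrix([a, b, c])
print(D[0][1], D[0][2] > 0)   # 0.0 True
```

Similar scan paths yield a small distance, so attentive subjects cluster together in the matrix while inattentive ones end up far from everyone.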
The outlier detection and quantified ranking specifically include the following steps:
i) Based on the distance data from the previous module, detect outliers and non-outliers with a density-based clustering algorithm;
ii) For each sequence, extract its distances to the non-outliers in the clustering result as a feature vector <v1, v2, …, vp>, where vi is the i-th feature value and p is the number of features, i.e. the number of non-outliers;
iii) For each stimulus clip, sort the samples of the experiments that need feedback by emotion recognition accuracy, and use the resulting order as training labels;
iv) Take each stimulus clip in turn as the test set, with all remaining stimulus clips as the training set; train a model with SVM Rank and predict the ranking of the test samples.
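Steps i)-ii) can be sketched as below. This is a minimal density-based noise flag following DBSCAN's core/noise rule on a precomputed distance matrix (full cluster labels are not needed for the participation test); the eps and min_pts values are illustrative, and SVM Rank itself is not reimplemented here:

```python
def dbscan_noise(D, eps, min_pts):
    """Flag outliers on a precomputed distance matrix D: a sample is noise
    if it is neither a core point (>= min_pts neighbours within eps,
    counting itself) nor within eps of a core point."""
    m = len(D)
    neighbours = [[j for j in range(m) if D[i][j] <= eps] for i in range(m)]
    core = [len(nb) >= min_pts for nb in neighbours]
    outlier = []
    for i in range(m):
        reachable = core[i] or any(core[j] for j in neighbours[i])
        outlier.append(not reachable)
    return outlier

def feature_vectors(D, outlier):
    """Step ii: describe every sample by its distances to the non-outlier
    (attentive) samples, giving a p-dimensional feature vector."""
    keep = [j for j in range(len(D)) if not outlier[j]]
    return [[D[i][j] for j in keep] for i in range(len(D))]

# Three mutually close scan sequences and one far-away (inattentive) one:
D = [[0, 1, 1, 10],
     [1, 0, 1, 10],
     [1, 1, 0, 10],
     [10, 10, 10, 0]]
flags = dbscan_noise(D, eps=2, min_pts=3)
print(flags)   # [False, False, False, True]
```

The resulting feature vectors are what a ranking learner such as SVM Rank would consume in steps iii)-iv).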
The outlier detection divides subjects into two classes. Subjects whose eye movement data are detected as outliers did not participate in the experiment attentively, and their EEG and eye movement data are of poor quality, so this part of the data can be scientifically excluded. Subjects whose eye movement data are non-outliers participated attentively, their emotions should have been well induced and their data quality is good, so using only this part of the data can effectively improve classification accuracy.
The quantified ranking predicts the order of the test samples and quantifies each subject's degree of participation. The quantified result can be used to improve the experiment: when a subject takes part in repeated experiments, it provides feedback that raises the subject's degree of participation. The result can also be fed back into the original experiment, e.g. into the prediction model of EEG-based emotion recognition, improving model accuracy.
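The pairwise reduction that underlies ranking SVMs can be sketched as follows (a simple perceptron stands in for the SVM here; the patent itself uses SVM Rank):

```python
def rank_train(features, order, epochs=200, lr=0.1):
    """Learn a weight vector w so that w.x_better > w.x_worse for every
    ordered pair drawn from the ground-truth ranking `order` (best first).
    Each misordered pair nudges w toward the pair's difference vector."""
    dim = len(features[0])
    w = [0.0] * dim
    for _ in range(epochs):
        for hi in range(len(order)):
            for lo in range(hi + 1, len(order)):
                a, b = features[order[hi]], features[order[lo]]
                diff = [ai - bi for ai, bi in zip(a, b)]
                score = sum(wi * di for wi, di in zip(w, diff))
                if score <= 0:  # pair misordered under current w
                    w = [wi + lr * di for wi, di in zip(w, diff)]
    return w

def rank_predict(w, features):
    """Sort sample indices by learned score, best first."""
    scores = [sum(wi * xi for wi, xi in zip(w, x)) for x in features]
    return sorted(range(len(features)), key=lambda i: -scores[i])

# Toy 1-D features; true order (best first) is sample 0, then 2, then 1:
feats = [[3.0], [1.0], [2.0]]
w = rank_train(feats, order=[0, 2, 1])
print(rank_predict(w, feats))   # [0, 2, 1]
```

Training on pairwise orderings rather than absolute labels is what lets the method rank degrees of participation without a calibrated participation scale.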
Technical effects
Compared with the prior art, the present invention selects and uses high-quality data through participation detection, effectively improving emotion recognition accuracy; secondly, through participation detection the invention gives subjects feedback during the experiment, raising their degree of participation; and evaluating experiments with the method of the invention is simpler and more effective to implement than using a hidden Markov model (HMM).
Brief description of the drawings
Fig. 1 is a schematic diagram of the EEG experiment evaluation system based on eye movement data;
Fig. 2 is an eye movement sequence diagram of the embodiment;
Fig. 3 is an architecture diagram of the embodiment.
Embodiment
Embodiment 1
As shown in Fig. 1, this embodiment collects subject eye movement data with an SMI iView ETG eye tracker and extracts them with the BeGaze software; the fixation points in the eye movement data are turned into a distance matrix by the distance matrix module, and results are then obtained through the participation detection module and the emotion recognition module.
As shown in Fig. 3, this embodiment was carried out in a strictly controlled experimental environment: an independent, sound-insulated room whose lighting is controlled by an illumination system to keep the intensity moderate and constant, and whose temperature is kept comfortable by the air-conditioning system.
In this embodiment 10 subjects were tested; they participated in an emotion recognition experiment while wearing an eye tracker that collected eye movement data, of whom 5 were asked to watch the stimulus material attentively and 5 to watch it half-heartedly. Using the method of the invention, the classification accuracy reached 90%.
Table 1. Eye movement data clustering results with labels
Note: "1" means the stimulus material was watched attentively, "-1" means it was watched half-heartedly.
Embodiment 2
This embodiment uses the same environment and the same eye movement collection equipment as Embodiment 1. In addition, an ESI NeuroScan system was used to collect EEG. The EEG cap has 64 electrodes whose layout follows the international 10-20 system standard; two leads are unused, so 62 channels of EEG data are collected in total. The sampling frequency of the EEG cap is 1000 Hz.
In this embodiment 26 subjects were tested; they participated in an emotion recognition experiment while wearing an eye tracker that collected eye movement data, and their attentiveness during the experiment was unknown. Using the method of the invention, the subjects' data were divided into two classes, attentive and half-hearted. With an EEG-based emotion recognition method, each subject's emotion recognition accuracy was computed: the average over all subjects was 68.52%, the average of subjects classified as attentive was 81.70%, and the average of subjects classified as inattentive was 57.26%. Based on the clustering result, after quantification the correlation between the predicted ranking and the true ranking reached 0.77.
Table 2. Eye movement data clustering results without labels
Table 3. Quantitative evaluation ranking results
The above embodiments of the invention illustrate its effectiveness and notable results: the emotion recognition accuracy predicted from attention has a large correlation coefficient with the true emotion recognition accuracy.
Those skilled in the art can locally adjust the above specific implementation in different ways without departing from the principle and purpose of the invention. The protection scope of the invention is defined by the claims and is not limited by the above specific implementation; each implementation within that scope is bound by the present invention.

Claims (6)

1. An EEG experiment evaluation system based on eye movement data, characterised by comprising: an eye tracker, a distance matrix generation module, a participation detection module and an emotion recognition module, wherein: the eye tracker is connected with the distance matrix generation module and transmits eye movement data; the distance matrix generation module is connected with the participation detection module and transmits distance information; the participation detection module is connected with the emotion recognition module and transmits participation detection results and emotion recognition object information;
The eye movement data include: fixation coordinates, fixation duration, fixation-state judgment time, saccade start coordinates, saccade path, saccade duration, saccade-state judgment time, and saccade angle.
2. An EEG experiment evaluation method based on eye movement data using the above system, characterised in that: eye movement data of each subject are collected with an eye tracker, and a spatio-temporal model is built from the fixation points in the data; a fast dynamic time warping algorithm then computes the similarity between sequences and builds a distance matrix; finally, a density-based clustering algorithm performs outlier detection, the sequences are quantified and ranked, and each subject's degree of participation is obtained.
3. The method according to claim 2, characterised in that the distance matrix is obtained as follows:
i) Extract, in order, all fixation points of a subject watching one stimulus clip, i.e. {(x₁, y₁, t₁), (x₂, y₂, t₂), …, (xₙ, yₙ, tₙ)}, where (xᵢ, yᵢ) are the coordinates of the i-th fixation point, tᵢ is the duration of the i-th fixation point, and n is the number of fixation points in the clip;
ii) Fixation points whose duration is below a predetermined threshold δ are treated as invalid and deleted from the sequence. Each valid fixation point is then expanded in place, i.e. its coordinates are repeated t/δ times, where t is the duration of that fixation point;
iii) Encode the fixation points from step ii) into a sequence according to screen region;
iv) Compare every pair of sequences with the dynamic time warping algorithm, obtaining a concrete number that represents the similarity of the two sequences, with smaller values indicating higher similarity;
v) Taking the similarity value of each pair of sequences as a distance, construct the distance matrix D = (Dij) of size m × m, where Dij = Dji is the distance between the i-th sequence and the j-th sequence, and m is the total number of samples for a single stimulus clip, i.e. the number of subjects.
4. The method according to claim 2, characterised in that the outlier detection and quantified ranking specifically include the following steps:
i) Based on the distance data from the previous module, detect outliers and non-outliers with a density-based clustering algorithm;
ii) For each sequence, extract its distances to the non-outliers in the clustering result as a feature vector <v1, v2, …, vp>, where vi is the i-th feature value and p is the number of features, i.e. the number of non-outliers;
iii) For each stimulus clip, sort the samples of the experiments that need feedback by emotion recognition accuracy, and use the resulting order as training labels;
iv) Take each stimulus clip in turn as the test set, with all remaining stimulus clips as the training set; train a model with SVM Rank and predict the ranking of the test samples.
5. The method according to claim 2, characterised in that the outlier detection divides subjects into two classes.
6. The method according to claim 2, characterised in that the quantified ranking predicts the order of the test samples and quantifies each subject's degree of participation.
CN201710378148.2A 2017-05-24 2017-05-24 Electroencephalogram experiment evaluation system and method based on eye movement data Active CN107256332B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710378148.2A CN107256332B (en) 2017-05-24 2017-05-24 Electroencephalogram experiment evaluation system and method based on eye movement data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710378148.2A CN107256332B (en) 2017-05-24 2017-05-24 Electroencephalogram experiment evaluation system and method based on eye movement data

Publications (2)

Publication Number Publication Date
CN107256332A true CN107256332A (en) 2017-10-17
CN107256332B CN107256332B (en) 2020-09-29

Family

ID=60028053

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710378148.2A Active CN107256332B (en) 2017-05-24 2017-05-24 Electroencephalogram experiment evaluation system and method based on eye movement data

Country Status (1)

Country Link
CN (1) CN107256332B (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107977560A (en) * 2017-11-23 2018-05-01 北京航空航天大学 Identity identifying method and device based on Eye-controlling focus
CN108932532A (en) * 2018-07-11 2018-12-04 杭州电子科技大学 A kind of eye movement data number suggesting method required for the prediction of saliency figure
CN109199412A (en) * 2018-09-28 2019-01-15 南京工程学院 Abnormal emotion recognition methods based on eye movement data analysis
CN109620265A (en) * 2018-12-26 2019-04-16 中国科学院深圳先进技术研究院 Recognition methods and relevant apparatus
CN109697831A (en) * 2019-02-25 2019-04-30 湖北亿咖通科技有限公司 Fatigue driving monitoring method, device and computer readable storage medium
CN110464365A (en) * 2018-05-10 2019-11-19 深圳先进技术研究院 A kind of attention rate determines method, apparatus, equipment and storage medium
CN110517772A (en) * 2019-09-24 2019-11-29 清华大学 A kind of self-closing disease screening system
CN111081374A (en) * 2019-12-16 2020-04-28 华南师范大学 Autism auxiliary diagnosis device based on common attention paradigm
CN111311070A (en) * 2020-01-20 2020-06-19 南京航空航天大学 Product design scheme decision method combining electroencephalogram and eye movement and combining user similarity
CN112381559A (en) * 2020-10-14 2021-02-19 浪潮软件股份有限公司 Tobacco retailer segmentation method based on unsupervised machine learning algorithm
CN112669996A (en) * 2020-12-26 2021-04-16 深圳市龙华区妇幼保健院(深圳市龙华区妇幼保健计划生育服务中心、深圳市龙华区健康教育所) Remote diagnosis and treatment rehabilitation system based on reverse eye jump and memory-oriented eye jump
CN112836747A (en) * 2021-02-02 2021-05-25 首都师范大学 Eye movement data outlier processing method and device, computer equipment and storage medium
WO2021109855A1 (en) * 2019-12-04 2021-06-10 中国科学院深圳先进技术研究院 Deep learning-based autism evaluation assistance system and method
CN113326733A (en) * 2021-04-26 2021-08-31 吉林大学 Eye movement point data classification model construction method and system
CN113537295A (en) * 2021-06-22 2021-10-22 北京航空航天大学 Sight estimation cross-scene adaptation method and device based on outlier guidance
CN113869229A (en) * 2021-09-29 2021-12-31 电子科技大学 Deep learning expression recognition method based on prior attention mechanism guidance
CN114334140A (en) * 2022-03-08 2022-04-12 之江实验室 Disease prediction system and device based on multi-relation function connection matrix

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150029969A (en) * 2013-09-11 2015-03-19 세종대학교산학협력단 Sensibility classification method using brain wave
CN104462819A (en) * 2014-12-09 2015-03-25 国网四川省电力公司信息通信公司 Local outlier detection method based on density clustering
CN104700090A (en) * 2015-03-25 2015-06-10 武汉大学 Method and system for measuring eye movement fixation points based on density

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150029969A (en) * 2013-09-11 2015-03-19 세종대학교산학협력단 Sensibility classification method using brain wave
CN104462819A (en) * 2014-12-09 2015-03-25 国网四川省电力公司信息通信公司 Local outlier detection method based on density clustering
CN104700090A (en) * 2015-03-25 2015-06-10 武汉大学 Method and system for measuring eye movement fixation points based on density

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
STAN SALVADOR ET AL.: "FastDTW: Toward Accurate Dynamic Time Warping in Linear Time and Space", 《INTELLIGENT DATA ANALYSIS》 *
WEILONG ZHENG ET AL.: "Revealing Critical Channels and Frequency Bands for Emotion recognition from EEG with deep belief network", 《 2015 7TH INTERNATIONAL IEEE/EMBS CONFERENCE ON NEURAL ENGINEERING》 *
YIFEI LU ET AL.: "Combining Eye Movements and EEG to Enhance Emotion Recognition", 《PROCEEDINGS OF THE 24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE》 *
LEI PENG: "Research on Distance Measures and Clustering Methods in Trajectory Clustering", 《CHINA MASTER'S THESES FULL-TEXT DATABASE, INFORMATION SCIENCE AND TECHNOLOGY》 *

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107977560A (en) * 2017-11-23 2018-05-01 北京航空航天大学 Identity identifying method and device based on Eye-controlling focus
CN110464365A (en) * 2018-05-10 2019-11-19 深圳先进技术研究院 A kind of attention rate determines method, apparatus, equipment and storage medium
CN110464365B (en) * 2018-05-10 2022-08-12 深圳先进技术研究院 Attention degree determination method, device, equipment and storage medium
CN108932532A (en) * 2018-07-11 2018-12-04 杭州电子科技大学 A kind of eye movement data number suggesting method required for the prediction of saliency figure
CN109199412A (en) * 2018-09-28 2019-01-15 南京工程学院 Abnormal emotion recognition methods based on eye movement data analysis
CN109199412B (en) * 2018-09-28 2021-11-09 南京工程学院 Abnormal emotion recognition method based on eye movement data analysis
CN109620265A (en) * 2018-12-26 2019-04-16 中国科学院深圳先进技术研究院 Recognition methods and relevant apparatus
CN109697831A (en) * 2019-02-25 2019-04-30 湖北亿咖通科技有限公司 Fatigue driving monitoring method, device and computer readable storage medium
CN110517772A (en) * 2019-09-24 2019-11-29 清华大学 A kind of self-closing disease screening system
WO2021109855A1 (en) * 2019-12-04 2021-06-10 中国科学院深圳先进技术研究院 Deep learning-based autism evaluation assistance system and method
CN111081374B (en) * 2019-12-16 2022-09-13 华南师范大学 Autism auxiliary diagnosis device based on common attention paradigm
CN111081374A (en) * 2019-12-16 2020-04-28 华南师范大学 Autism auxiliary diagnosis device based on common attention paradigm
CN111311070B (en) * 2020-01-20 2020-12-25 南京航空航天大学 Product design scheme decision method combining electroencephalogram and eye movement and combining user similarity
CN111311070A (en) * 2020-01-20 2020-06-19 南京航空航天大学 Product design scheme decision method combining electroencephalogram and eye movement and combining user similarity
CN112381559A (en) * 2020-10-14 2021-02-19 浪潮软件股份有限公司 Tobacco retailer segmentation method based on unsupervised machine learning algorithm
CN112669996A (en) * 2020-12-26 2021-04-16 深圳市龙华区妇幼保健院(深圳市龙华区妇幼保健计划生育服务中心、深圳市龙华区健康教育所) Remote diagnosis and treatment rehabilitation system based on reverse eye jump and memory-oriented eye jump
CN112836747A (en) * 2021-02-02 2021-05-25 首都师范大学 Eye movement data outlier processing method and device, computer equipment and storage medium
CN113326733A (en) * 2021-04-26 2021-08-31 吉林大学 Eye movement point data classification model construction method and system
CN113537295A (en) * 2021-06-22 2021-10-22 北京航空航天大学 Sight estimation cross-scene adaptation method and device based on outlier guidance
CN113537295B (en) * 2021-06-22 2023-10-24 北京航空航天大学 View estimation cross-scene adaptation method and device based on outlier guidance
CN113869229A (en) * 2021-09-29 2021-12-31 电子科技大学 Deep learning expression recognition method based on prior attention mechanism guidance
CN113869229B (en) * 2021-09-29 2023-05-09 电子科技大学 Deep learning expression recognition method based on priori attention mechanism guidance
CN114334140A (en) * 2022-03-08 2022-04-12 之江实验室 Disease prediction system and device based on multi-relation function connection matrix

Also Published As

Publication number Publication date
CN107256332B (en) 2020-09-29

Similar Documents

Publication Publication Date Title
CN107256332A (en) The electric experimental evaluation system and method for brain based on eye movement data
Abd El Meguid et al. Fully automated recognition of spontaneous facial expressions in videos using random forest classifiers
Elfeki et al. Video summarization via actionness ranking
Bartlett et al. Recognizing facial expression: machine learning and application to spontaneous behavior
Karnati et al. LieNet: A deep convolution neural network framework for detecting deception
CN108921042A (en) A kind of face sequence expression recognition method based on deep learning
CN110348416A (en) Multi-task face recognition method based on multi-scale feature fusion convolutional neural network
CN105976397B (en) A kind of method for tracking target
CN110427881A (en) The micro- expression recognition method of integration across database and device based on the study of face local features
CN109330613A (en) Human body Emotion identification method based on real-time brain electricity
Akshay et al. Machine learning algorithm to identify eye movement metrics using raw eye tracking data
Tabassum et al. Non-intrusive identification of student attentiveness and finding their correlation with detectable facial emotions
Ugli et al. A transfer learning approach for identification of distracted driving
Shatnawi et al. Deep learning approach for masked face identification
Zhang et al. In the blink of an eye: Event-based emotion recognition
Van Huynh et al. Emotion recognition by integrating eye movement analysis and facial expression model
Yaseen et al. A novel approach based on multi-level bottleneck attention modules using self-guided dropblock for person re-identification
Chen et al. Distracted driving recognition using vision transformer for human-machine co-driving
Avola et al. Human body language analysis: a preliminary study based on kinect skeleton tracking
Danraka et al. Discrete firefly algorithm based feature selection scheme for improved face recognition
He et al. Dual multi-task network with bridge-temporal-attention for student emotion recognition via classroom video
Zhang et al. Automatic construction and extraction of sports moment feature variables using artificial intelligence
Li et al. Skeleton based action quality assessment of figure skating videos
CN113762217A (en) Behavior detection method
Li et al. Motion fatigue state detection based on neural networks

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220729

Address after: Room 23a, No. 19, Lane 99, Nandan East Road, Xuhui District, Shanghai 200030

Patentee after: Lv Baoliang

Address before: 200240 No. 800, Dongchuan Road, Shanghai, Minhang District

Patentee before: SHANGHAI JIAO TONG University

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20221011

Address after: Room 901, Building A, SOHO Fuxing Plaza, No. 388 Madang Road, Huangpu District, Shanghai, 200025

Patentee after: Shanghai Zero Unique Technology Co.,Ltd.

Address before: Room 23a, No. 19, Lane 99, Nandan East Road, Xuhui District, Shanghai 200030

Patentee before: Lv Baoliang

TR01 Transfer of patent right