CN111184509A - Emotion-induced electroencephalogram signal classification method based on transfer entropy - Google Patents


Info

Publication number
CN111184509A
Authority
CN
China
Prior art keywords
transfer entropy, matrix, frequency band, signals, EEG
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911196894.5A
Other languages
Chinese (zh)
Inventor
Gao Yunyuan (高云园)
Wang Xiangkun (王翔坤)
Gao Bo (高博)
Zhu Tao (朱涛)
She Qingshan (佘青山)
Meng Ming (孟明)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN201911196894.5A
Publication of CN111184509A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/369 Electroencephalography [EEG]
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203 Signal processing for noise prevention, reduction or removal
    • A61B5/7225 Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data involving training the classification device


Abstract

The invention discloses an emotion-induced electroencephalogram (EEG) signal classification method based on transfer entropy. EEG signals are first collected under different emotional stimuli and preprocessed. Ten channels (Fp1, Fp2, P3, P4, C3, C4, O1, O2, F3 and F4) are then selected, a transfer entropy relation matrix among the channels is constructed with a transfer entropy algorithm, and features are extracted from the resulting matrix relation map with a histogram of oriented gradients. Finally, the extracted features are trained and classified by a support vector machine with a radial basis function kernel, so that EEG signals can be analyzed and classified according to different emotional states. The method comprehensively, accurately and quickly reflects information interaction among different channels under emotional stimulation, forms the basis of subsequent classification and analysis, reduces feature redundancy, improves classification accuracy and shortens analysis time.

Description

Emotion-induced electroencephalogram signal classification method based on transfer entropy
Technical Field
The invention belongs to the field of biological signal processing, and relates to a method for classifying electroencephalogram signals under different emotional stimuli based on transfer entropy.
Background
An electroencephalogram (EEG) is a signal generated by the spontaneous or rhythmic activity of brain neuron groups, recorded through electrodes, and reflects the potential changes of neuron populations in active brain functional regions. Studies in cognitive psychology and brain neurology indicate that the development of and changes in emotion are directly related to the central nervous system of the brain. Emotion recognition mainly works by analyzing the speech, facial expressions, EEG signals and other physiological electrical signals of participants; because EEG signals offer good time resolution and are non-invasive, rapid and low-cost, they have become a main method for studying emotional change.
EEG signal features can generally be divided into time-domain, frequency-domain, time-frequency-domain and spatial-domain features. Time-domain features mainly comprise signal statistics, event-related potentials, power, energy, higher-order zero-crossing analysis, fractal dimension and the like. Frequency-domain features are mainly obtained by decomposing the original EEG signal into the θ (4-8 Hz), α (8-13 Hz), β (13-30 Hz) and γ (30-45 Hz) bands and then computing, for each band, higher-order spectra, event-related synchronization, event-related desynchronization, power spectral density and the like.
The brain is a highly complex nonlinear dynamical system, and transfer entropy is an asymmetric measure defined on transition probabilities that carries directional and dynamic information, does not depend on a pre-established model, supports nonlinear quantitative analysis, and can be used to estimate the functional coupling strength and information transfer between brain regions. The invention therefore uses transfer entropy (TE), a nonlinear method, for EEG feature extraction, constructs a transfer entropy relation matrix from the relations between channels, and explores EEG signal changes under different emotional states.
Disclosure of Invention
In order to objectively and effectively analyze and classify EEG signals under different emotional stimuli, the invention provides an emotion-induced EEG signal classification method based on transfer entropy. First, EEG signals of participants under different emotional stimuli are collected and preprocessed; then a transfer entropy relation matrix is constructed for the selected channels using transfer entropy and rendered as a picture; features are extracted from the generated picture with a histogram of oriented gradients (HOG); and finally the features are classified by a support vector machine with a radial basis function kernel. The invention can effectively analyze how EEG signals change under different emotional stimuli, and offers an approach for the wider application of EEG-based emotion classification and human-computer interaction.
The method mainly comprises the following steps:
step one, different emotions are induced by using different types of visual and auditory stimuli, and multichannel EEG signals of the participants in the period are collected.
Step two, preprocessing the collected different emotion-induced EEG signals as follows:
1) removing baseline drift, ocular artifacts and the 50 Hz power-line interference with EEGLAB, retaining the 4-45 Hz EEG components with a band-pass filter, and down-sampling the collected EEG signals from 512 Hz to 128 Hz before storage;
2) decomposing each sample into 4 frequency bands with the wavelet packet transform, namely the θ band (4-8 Hz), α band (8-13 Hz), β band (13-30 Hz) and γ band (30-45 Hz); performing a 6-level decomposition with the 'db5' wavelet basis, finding the wavelet packet tree nodes corresponding to the 4 bands, and reconstructing the wavelet packet coefficients of each band to obtain EEG signal data for the 4 bands;
3) respectively applying the dual-density dual-tree complex wavelet transform to the data of the 4 frequency bands to obtain cleaner EEG signals.
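The filtering and down-sampling of step 2.1 can be sketched as follows. This is a minimal illustration with SciPy under our own assumptions: the EEGLAB artifact-removal stage is not reproduced, and the function name `preprocess_eeg` is ours, not the patent's.

```python
import numpy as np
from scipy import signal

def preprocess_eeg(raw, fs_in=512, fs_out=128, band=(4.0, 45.0)):
    """Band-pass filter one EEG channel and down-sample it.

    `raw` is a 1-D array sampled at `fs_in` Hz. The 4-45 Hz components
    are kept with a zero-phase Butterworth band-pass, then the signal is
    decimated 512 Hz -> 128 Hz (factor 4), as in step two of the method.
    """
    nyq = fs_in / 2.0
    b, a = signal.butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = signal.filtfilt(b, a, raw)      # zero-phase filtering
    factor = fs_in // fs_out                   # 512 // 128 == 4
    return signal.decimate(filtered, factor)   # anti-alias + down-sample

# toy usage: 4 s of synthetic "EEG" (10 Hz rhythm + 50 Hz interference)
t = np.arange(0, 4, 1 / 512)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
clean = preprocess_eeg(raw)
```

`decimate` applies its own anti-aliasing filter before down-sampling, so the two stages do not conflict.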
Step three, the EEG signals of the 10 channels Fp1, Fp2, P3, P4, C3, C4, O1, O2, F3 and F4 are selected from the preprocessed data, a 10 × 10 transfer entropy relation matrix is constructed using transfer entropy, and a matrix relation map is generated after the two-dimensional relation matrix is normalized.
The transfer entropy is computed as follows. Given two time series X = {x_1, x_2, …, x_T} and Y = {y_1, y_2, …, y_T}, where T is the length of the series, x_1 and y_1 are the first observations, x_2 and y_2 the second, and so on, the transfer entropy from Y to X (TE_{Y→X}) and from X to Y (TE_{X→Y}) are given by formulas (1) and (2):

TE_{Y→X} = Σ_n p(x_{n+τ}, x_n, y_n) log[ p(x_{n+τ} | x_n, y_n) / p(x_{n+τ} | x_n) ]    (1)

TE_{X→Y} = Σ_n p(y_{n+τ}, y_n, x_n) log[ p(y_{n+τ} | y_n, x_n) / p(y_{n+τ} | y_n) ]    (2)

where n is the discrete time index, τ is the prediction time, and p(·) denotes a probability distribution.
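Formula (1) can be illustrated with a simple histogram (plug-in) estimator. The binning scheme and all parameters below are our assumptions for the sketch, not the patent's exact implementation.

```python
import numpy as np

def transfer_entropy(x, y, bins=8, tau=1):
    """Plug-in estimate of the transfer entropy TE(Y -> X) of formula (1).

    Both series are discretised into `bins` amplitude bins; probabilities
    are estimated from the joint histogram of (x_{n+tau}, x_n, y_n).
    """
    x = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    y = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    xf, xp, yp = x[tau:], x[:-tau], y[:-tau]   # x_{n+tau}, x_n, y_n
    joint = np.zeros((bins, bins, bins))
    for a, b, c in zip(xf, xp, yp):
        joint[a, b, c] += 1.0
    joint /= joint.sum()
    p_xp_yp = joint.sum(axis=0)                # p(x_n, y_n)
    p_xf_xp = joint.sum(axis=2)                # p(x_{n+tau}, x_n)
    p_xp = joint.sum(axis=(0, 2))              # p(x_n)
    te = 0.0
    for a in range(bins):
        for b in range(bins):
            for c in range(bins):
                pabc = joint[a, b, c]
                if pabc > 0:
                    num = pabc / p_xp_yp[b, c]     # p(x_{n+tau} | x_n, y_n)
                    den = p_xf_xp[a, b] / p_xp[b]  # p(x_{n+tau} | x_n)
                    te += pabc * np.log(num / den)
    return te

# y drives x with a one-step lag, so TE(Y->X) should exceed TE(X->Y)
rng = np.random.default_rng(0)
y = rng.standard_normal(3000)
x = np.empty_like(y)
x[0] = 0.0
x[1:] = 0.8 * y[:-1] + 0.2 * rng.standard_normal(2999)
te_yx = transfer_entropy(x, y)
te_xy = transfer_entropy(y, x)
```

Running this over every ordered channel pair yields the 10 × 10 relation matrix of step three; the asymmetry te_yx ≠ te_xy is what makes the matrix directional.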
Step four, features are extracted from the transfer entropy matrix relation map generated in step three using a histogram of oriented gradients (HOG).
Step five, the features from step four are classified with a support vector machine (SVM) using a radial basis function kernel; the optimal parameters are obtained by cross-validation and grid search, and the training set is used to train a well-performing classification model.
Compared with existing methods for classifying EEG signals under various emotional stimuli, the method has the following characteristics:
First, in the selection of EEG signal sources, the Fp1 and Fp2 channels on the frontal lobe, the P3, P4, C3 and C4 channels on the parietal lobe, the O1 and O2 channels on the occipital lobe, and the F3 and F4 channels on the temporal lobe are selected, so that information interaction among different channels during emotional stimulation can be comprehensively, accurately and quickly reflected, forming the basis for subsequent classification and analysis.
Second, during feature extraction, features among the selected EEG channels are first extracted with transfer entropy and a relation matrix among the channels is constructed; after normalization, a picture is generated with MATLAB, and HOG feature extraction is then performed on the picture.
Drawings
FIG. 1 shows a flow chart of EEG signal classification at emotional stimulation;
FIG. 2 shows an EEG signal channel selection diagram;
FIG. 3 shows a schematic wavelet frequency division;
FIG. 4 shows transfer entropy relation matrix diagrams (a) in a calm state and (b) in a stressed state;
FIG. 5 shows a flow chart for feature extraction using HOG;
FIG. 6 shows the relationship between Cell and Block in HOG feature extraction;
FIG. 7 shows the classification accuracy results of different methods.
Detailed Description
In order to efficiently extract and classify EEG features, the present invention mainly improves EEG feature extraction. The embodiments of the invention are described in detail below with reference to the accompanying drawings. The embodiment is implemented on the premise of the technical solution of the invention, and a detailed implementation and a specific operation process are given.
The overall flow of the emotion-induced EEG signal classification method based on transfer entropy is shown in FIG. 1, and the specific implementation method comprises the following steps:
step one, EEG signals at different emotional stimuli are collected. 32 volunteers were selected for 40 visual and auditory stimulation experiments, and EEG signals of 10 channels, Fp1, Fp2 located in frontal lobe, P3, P4, C3, C4 located in parietal lobe, O1, O2 located in occipital lobe, F3 located in temporal lobe, and F4 located in temporal lobe were collected at the same time, as shown in fig. 2. After the experiment is finished, each volunteer needs to score the 40 visual and auditory stimuli respectively from 1 to 9 in two dimensions of Valence (active-passive) and Arousal (awakened-not-awakened) according to own experience and a Valence-awakening degree two-dimensional emotion model.
Step two, preprocessing the collected different emotion-induced EEG signals as follows:
(1) Baseline drift, ocular artifacts and the 50 Hz power-line interference are removed with EEGLAB, only the 4-45 Hz EEG components are retained with a band-pass filter, and the collected EEG signals are down-sampled from 512 Hz to 128 Hz and stored.
(2) According to theories of psychology and brain science, the four rhythm-wave frequency bands of an EEG signal are closely related to a person's physiological and psychological activity, so each sample is decomposed into 4 frequency bands with the wavelet packet transform: the θ band (4-8 Hz), α band (8-13 Hz), β band (13-30 Hz) and γ band (30-45 Hz). A 6-level decomposition is performed with the 'db5' wavelet basis, the wavelet packet tree nodes corresponding to the 4 bands are found, and the wavelet packet coefficients of each band are reconstructed to obtain EEG signal data for the 4 bands.
(3) The data of these 4 bands are separately subjected to the dual-density dual-tree complex wavelet transform to obtain cleaner EEG signals, as shown in FIG. 3. According to previous research, the β band contains a large number of emotion-related features, so the invention analyzes the collected β-band EEG signals.
Step three, a 10 × 10 transfer entropy relation matrix is constructed for the processed data using transfer entropy, and a matrix relation map is generated after the two-dimensional relation matrix is normalized, as shown in fig. 4.
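The normalization of the two-dimensional relation matrix into a picture in step three can be sketched with min-max scaling to an 8-bit grayscale array. The patent generates the picture with MATLAB, so the NumPy function below (and its name `matrix_to_image`) is only a stand-in.

```python
import numpy as np

def matrix_to_image(te_matrix):
    """Min-max normalize a transfer entropy relation matrix to an
    8-bit grayscale image array, ready to be saved as a picture."""
    m = np.asarray(te_matrix, dtype=float)
    lo, hi = m.min(), m.max()
    scaled = (m - lo) / (hi - lo) if hi > lo else np.zeros_like(m)
    return np.round(scaled * 255).astype(np.uint8)

# toy 10x10 matrix standing in for pairwise channel transfer entropies
rng = np.random.default_rng(1)
img = matrix_to_image(rng.random((10, 10)))
```

Min-max scaling guarantees that the smallest entry maps to 0 and the largest to 255, so maps from different trials are directly comparable as images.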
Step four, using the HOG to perform feature extraction on the transfer entropy matrix relation diagram generated in step three, wherein the technical process is shown in fig. 5, and the specific steps are as follows:
(1) the Gamma space and the color space are normalized. The contrast of the image is adjusted to suppress noise interference.
(2) The image gradients are calculated. The image is processed in the horizontal and vertical directions using a one-dimensional discrete difference template.
(3) A direction histogram is created. The image is divided into several cells (cells), each pixel in a Cell is weighted and projected into a histogram with gradient directions (mapped to a fixed angular range), and then the gradient direction histogram of this Cell can be obtained.
(4) The cells are grouped into larger spatial blocks (Blocks), and each Block is then normalized separately. Fig. 6 shows the relationship between Blocks and Cells. The feature vectors of all cells in a block are concatenated to form the HOG feature of that spatial block.
(5) The HOG features are collected. All overlapping blocks in the detection window are collected, and their HOG features are merged into a final feature vector for classification.
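Steps (2)-(5) can be illustrated with a deliberately simplified NumPy sketch. Full HOG implementations also interpolate votes between neighboring bins and normalize each overlapping block separately; here block normalization is reduced to one global L2 normalization to keep the example short, so this is an illustration of the idea, not a complete HOG.

```python
import numpy as np

def hog_features(img, cell=8, nbins=9):
    """Minimal histogram-of-oriented-gradients sketch: per-pixel
    gradients, per-cell orientation histograms, flattened and
    L2-normalized into one feature vector."""
    img = img.astype(float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]      # horizontal [-1, 0, 1]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]      # vertical   [-1, 0, 1]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0   # unsigned orientation
    h, w = img.shape
    cy, cx = h // cell, w // cell
    feats = np.zeros((cy, cx, nbins))
    bin_w = 180.0 / nbins
    for i in range(cy):
        for j in range(cx):
            a = ang[i*cell:(i+1)*cell, j*cell:(j+1)*cell].ravel()
            m = mag[i*cell:(i+1)*cell, j*cell:(j+1)*cell].ravel()
            idx = np.minimum((a // bin_w).astype(int), nbins - 1)
            for k, wgt in zip(idx, m):          # magnitude-weighted vote
                feats[i, j, k] += wgt
    v = feats.ravel()
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

# a 64x64 toy image standing in for the matrix relation map
rng = np.random.default_rng(2)
vec = hog_features(rng.random((64, 64)) * 255)
```

With 8 × 8 cells and 9 orientation bins, a 64 × 64 image yields an 8 × 8 × 9 = 576-dimensional feature vector.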
Step five, the features from step four are classified with a support vector machine (SVM) using a radial basis function kernel; the optimal parameters are obtained by cross-validation and grid search, and the training set is used to train a well-performing classification model.
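Step five can be sketched with scikit-learn standing in for the patent's SVM implementation. The synthetic two-class data below is our own: it merely mimics the 125/127 sample counts reported for the calm and stress states in the example.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for the HOG feature vectors of the two emotion
# classes ("calm" vs "stress"), since the patent's data is not public.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 1.0, (125, 20)),
               rng.normal(1.5, 1.0, (127, 20))])
y = np.array([0] * 125 + [1] * 127)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0, stratify=y)

# RBF-kernel SVM; C and gamma are chosen by grid search with
# 5-fold cross-validation, as described in step five.
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.1]},
                    cv=5)
grid.fit(X_tr, y_tr)
acc = grid.score(X_te, y_te)
```

`grid.best_params_` holds the selected (C, gamma) pair, and `grid.best_estimator_` is the model refit on the whole training set with those parameters.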
This example analyzed the EEG signals of 32 participants under audiovisual stimuli for the two emotions calm and stress, yielding 125 sets of transfer entropy relation matrices for the calm state and 127 sets for the stress state. As shown in fig. 7, comparing the accuracy of the different methods shows that, once the transfer entropy relation matrix is obtained, extracting features from the transfer entropy relation matrix map with HOG effectively improves the classification accuracy and reveals the relationship between EEG signals of different channels during emotional stimulation.

Claims (1)

1. An emotion-induced electroencephalogram signal classification method based on transfer entropy, specifically comprising the following steps:
step one, inducing different emotions by using different types of visual and auditory stimuli, and collecting multichannel scalp electroencephalogram signals of participants in the period;
step two, preprocessing the collected EEG signals induced by different emotions as follows:
1) removing baseline drift, ocular artifacts and the 50 Hz power-line interference with EEGLAB, retaining the 4-45 Hz EEG components with a band-pass filter, and down-sampling the collected EEG signals from 512 Hz to 128 Hz before storage;
2) decomposing each sample into 4 frequency bands with the wavelet packet transform, namely the θ band (4-8 Hz), α band (8-13 Hz), β band (13-30 Hz) and γ band (30-45 Hz); performing a 6-level decomposition with the 'db5' wavelet basis, finding the wavelet packet tree nodes corresponding to the 4 bands, and reconstructing the wavelet packet coefficients of each band to obtain EEG signal data for the 4 bands;
3) respectively applying the dual-density dual-tree complex wavelet transform to the data of the 4 frequency bands to denoise the EEG signals;
step three, selecting the EEG signals of the 10 channels Fp1, Fp2, P3, P4, C3, C4, O1, O2, F3 and F4 from the preprocessed data, constructing a 10 × 10 transfer entropy relation matrix using transfer entropy, and normalizing the two-dimensional relation matrix to generate a matrix relation map;
the transfer entropy is computed as follows: given two time series X = {x_1, x_2, …, x_T} and Y = {y_1, y_2, …, y_T}, where T is the length of the series, x_1 and y_1 are the first observations, x_2 and y_2 the second, and so on, the transfer entropy from Y to X, TE_{Y→X}, and from X to Y, TE_{X→Y}, are obtained from formulas (1) and (2):

TE_{Y→X} = Σ_n p(x_{n+τ}, x_n, y_n) log[ p(x_{n+τ} | x_n, y_n) / p(x_{n+τ} | x_n) ]    (1)

TE_{X→Y} = Σ_n p(y_{n+τ}, y_n, x_n) log[ p(y_{n+τ} | y_n, x_n) / p(y_{n+τ} | y_n) ]    (2)

wherein n is the discrete time index, τ is the prediction time, and p(·) denotes a probability distribution;
step four, extracting features from the transfer entropy matrix relation map generated in step three using a histogram of oriented gradients;
and step five, classifying the features from step four with a support vector machine using a radial basis function kernel, obtaining the optimal parameters by cross-validation and grid search, and training on the training set to obtain a well-performing classification model.
CN201911196894.5A 2019-11-29 2019-11-29 Emotion-induced electroencephalogram signal classification method based on transfer entropy Pending CN111184509A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911196894.5A CN111184509A (en) 2019-11-29 2019-11-29 Emotion-induced electroencephalogram signal classification method based on transfer entropy


Publications (1)

Publication Number Publication Date
CN111184509A true CN111184509A (en) 2020-05-22

Family

ID=70684359

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911196894.5A Pending CN111184509A (en) 2019-11-29 2019-11-29 Emotion-induced electroencephalogram signal classification method based on transfer entropy

Country Status (1)

Country Link
CN (1) CN111184509A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112163518A (en) * 2020-09-28 2021-01-01 华南理工大学 Emotion modeling method for emotion monitoring and adjusting system
CN112244880A (en) * 2020-09-24 2021-01-22 杭州电子科技大学 Emotion-induced electroencephalogram signal analysis method based on variable-scale symbol compensation transfer entropy
CN112270235A (en) * 2020-10-20 2021-01-26 西安工程大学 Improved SVM electroencephalogram signal emotion recognition method
CN112450947A (en) * 2020-11-20 2021-03-09 杭州电子科技大学 Dynamic brain network analysis method for emotional arousal degree
CN112971811A (en) * 2021-02-09 2021-06-18 北京师范大学 Brain function positioning method and device and electronic equipment
CN113178195A (en) * 2021-03-04 2021-07-27 杭州电子科技大学 Speaker identification method based on sound-induced electroencephalogram signals
CN113576478A (en) * 2021-04-23 2021-11-02 西安交通大学 Electroencephalogram signal-based image emotion classification method, system and device
CN115631793A (en) * 2022-12-01 2023-01-20 新格元(南京)生物科技有限公司 Single Cell transcriptome Pseudo-Cell analysis method, model, storage medium and equipment
CN116369949A (en) * 2023-06-06 2023-07-04 南昌航空大学 Electroencephalogram signal grading emotion recognition method, electroencephalogram signal grading emotion recognition system, electronic equipment and medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103610461A (en) * 2013-10-17 2014-03-05 杭州电子科技大学 EEG noise elimination method based on dual-density wavelet neighborhood related threshold processing
CN106548160A (en) * 2016-11-09 2017-03-29 浙江博天科技有限公司 A kind of face smile detection method
CN106901728A (en) * 2017-02-10 2017-06-30 杭州电子科技大学 Multichannel brain myoelectricity coupling analytical method based on mutative scale symbol transfer entropy
CN107157477A (en) * 2017-05-24 2017-09-15 上海交通大学 EEG signals Feature Recognition System and method
CN107704805A (en) * 2017-09-01 2018-02-16 深圳市爱培科技术股份有限公司 method for detecting fatigue driving, drive recorder and storage device
CN108742613A (en) * 2018-05-30 2018-11-06 杭州电子科技大学 Orient coupling analytical method between the flesh of coherence partially based on transfer entropy and broad sense




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200522