CN114343674A - Joint discriminative subspace mining and semi-supervised electroencephalogram emotion recognition method - Google Patents

Joint discriminative subspace mining and semi-supervised electroencephalogram emotion recognition method Download PDF

Info

Publication number
CN114343674A
CN114343674A (application CN202111578215.8A)
Authority
CN
China
Prior art keywords
formula
electroencephalogram
matrix
semi
subspace
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111578215.8A
Other languages
Chinese (zh)
Other versions
CN114343674B (en)
Inventor
Peng Yong (彭勇)
Li Xing (李幸)
Zhang Yikai (张怿恺)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN202111578215.8A priority Critical patent/CN114343674B/en
Publication of CN114343674A publication Critical patent/CN114343674A/en
Application granted granted Critical
Publication of CN114343674B publication Critical patent/CN114343674B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a joint discriminative subspace mining and semi-supervised electroencephalogram (EEG) emotion recognition method. The method comprises the following steps. Subjects are guided to watch videos with clear emotional tendencies while their EEG data are acquired. The acquired EEG data are preprocessed and features are extracted to generate sample matrices. A joint discriminative-subspace and semi-supervised learning model is constructed: by projecting the sample matrix into a new feature space, the within-class scatter of the EEG data is reduced and the between-class scatter is increased, and unlabeled samples are added to the training model for semi-supervised learning after pseudo labels are assigned to them. Joint optimization is realized by fixing two variables and updating the third according to the update rule of the objective function, and the accuracy of emotion recognition is improved by continually and iteratively optimizing the discriminative subspace. By studying the physical meaning of the combined projection matrix, an EEG activation pattern for emotion recognition is obtained, yielding the key frequency bands and leads related to emotional effects.

Description

Joint discriminative subspace mining and semi-supervised electroencephalogram emotion recognition method
Technical Field
The invention belongs to the technical field of electroencephalogram (EEG) signal processing, and particularly relates to a joint discriminative subspace mining and semi-supervised EEG emotion recognition method.
Background
Emotion is an adaptive physiological expression produced when people are stimulated by the external environment in daily life and work; it serves both information transmission and behavior regulation. According to the definition in the dictionary of psychology, emotion is the attitude and experience produced when objective things are weighed against a person's needs. Emotion reflects a person's current physiological and psychological state and has an important influence on cognition, communication, decision making, and so on. From the perspective of artificial intelligence, the generation of emotion is accompanied by measurable changes in individual expression and psychological response, so it can be measured and modeled by scientific methods. Enabling machines to automatically and accurately identify human emotional states, and thereby realize affective human-computer interaction, is a research hotspot in information science, psychology, cognitive neuroscience, and related fields.
EEG signals are non-stationary, so the distributions of raw EEG recordings are often inconsistent. To obtain a stable emotion recognition pattern in a machine learning model, the EEG data can be projected into a discriminative subspace, which increases the between-class scatter and reduces the within-class scatter. This yields a more discriminative representation, improves the recognition accuracy of the model, and ensures the reliability of affective human-computer interaction.
Disclosure of Invention
The invention aims to provide a joint discriminative subspace mining and semi-supervised EEG emotion recognition method. The method jointly and iteratively optimizes the matrix A that projects the original data into a discriminative subspace, the projection matrix B that connects the subspace to the label matrix, and the pseudo labels Y_u of the unlabeled samples. Continually re-optimizing the discriminative subspace yields a better classification result and thus higher emotion recognition accuracy, and the learned projection matrices A and B reveal the key frequency bands and brain regions related to the occurrence of emotional effects.
The method comprises the following specific steps:
step 1, collecting electroencephalogram data of a tested person in K different emotional states.
Step 2, preprocessing the electroencephalogram data acquired in step 1 and extracting features. Each sample matrix X consists of the EEG features of one subject, and the label vector y holds the emotion labels corresponding to the EEG features in X; two different sample matrices are selected as the labeled data and the unlabeled data respectively.
Step 3, constructing the machine learning model of the joint discriminative subspace mining and semi-supervised EEG emotion recognition method, and integrating the discriminative subspace obtained by the mapping of the projection matrix A and the semi-supervised learning model into a unified framework to obtain a jointly optimized objective function.
3-1. Establish the objective function embedding the description factors v and θ, as shown in formula (1):

min_{A,B,Y_u} ||X^T A B - Y||_F^2 + λ||AB||_{2,1},  s.t. Y_u ≥ 0, Y_u 1 = 1   (1)

In formula (1), X ∈ R^{d×n} is the input sample matrix; the projection matrix A ∈ R^{d×k} projects the raw data into a more discriminative subspace; the projection matrix B ∈ R^{k×c} connects the data in the discriminative subspace with the label information; Y = [Y_l; Y_u] ∈ R^{n×c} is the label matrix, where n = l + u, l is the number of labeled samples, and u is the number of unlabeled samples; Y_u ∈ R^{u×c} denotes the pseudo labels of the unlabeled samples; ||·||_{2,1} denotes the l_{2,1} norm; λ is the regularization parameter.
3-2. The objective function (1) is further rewritten as formula (2):

min_{A,B,Y_u} Tr((X^T A B - Y)^T (X^T A B - Y)) + λ Tr(B^T A^T D A B)   (2)

In formula (2), Tr(·) is the trace of a matrix; D ∈ R^{d×d} is a diagonal matrix whose i-th diagonal element is

d_ii = 1 / (2||g_i||_2)   (3)

In formula (3), g_i denotes the i-th row of the matrix G = AB evaluated at the current A and B, and ||·||_2 denotes the l_2 norm.
Step 4, initialize the pseudo labels Y_u and the matrix D. According to the jointly optimized objective function obtained in step 3, the update rule of each variable is obtained by fixing the other two variables and updating it; the pseudo labels Y_u of the unlabeled samples, the projection matrix A, and the connection matrix B are optimized in turn, and this optimization process is repeated several times to realize joint iterative optimization.
Step 5, input the sample matrix X obtained in step 2 into the objective function iteratively optimized in step 4 to obtain the predicted labels; each predicted label is the subject's emotional state at the corresponding acquisition time. The obtained pseudo labels are added into the training process of the model to realize semi-supervised learning.
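Steps 3 to 5 above can be sketched as a single alternating loop. The following is an illustrative implementation only, not the patent's code: the function names, initialization, and shapes are our own choices, the pseudo-label step assumes a simplex constraint on each row of Y_u, and the A-step is solved as a generalized eigenproblem with SciPy.

```python
import numpy as np
from scipy.linalg import eigh

def project_simplex(c):
    """Euclidean projection of vector c onto {y : y >= 0, sum(y) = 1}."""
    u = np.sort(c)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(c) + 1)
    rho = np.nonzero(u + (1.0 - css) / idx > 0)[0][-1]
    eta = (1.0 - css[rho]) / (rho + 1)
    return np.maximum(c + eta, 0.0)

def fit(X, Yl, u, k, lam=0.1, n_iter=10, seed=0):
    """Alternating optimization of the joint model (hypothetical sketch).

    X  : (d, n) feature matrix; the first l columns are the labeled samples
    Yl : (l, c) one-hot label matrix for the labeled samples
    u  : number of unlabeled samples (l + u == n)
    k  : dimension of the discriminative subspace
    """
    d, n = X.shape
    l, c = Yl.shape
    assert l + u == n
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((d, k))
    B = rng.standard_normal((k, c))
    Yu = np.full((u, c), 1.0 / c)      # step 4: initialize pseudo labels uniformly
    D = np.eye(d)                      # and the reweighting matrix D
    for _ in range(n_iter):
        # Yu-step: project each unlabeled row of X^T A B onto the simplex
        P = (X.T @ A @ B)[l:]
        Yu = np.vstack([project_simplex(p) for p in P])
        Y = np.vstack([Yl, Yu])
        # B-step: B = (A^T (XX^T + lam*D) A)^{-1} A^T X Y
        M = A.T @ (X @ X.T + lam * D) @ A
        B = np.linalg.solve(M, A.T @ X @ Y)
        # A-step: top-k generalized eigenvectors of (S_b, S_t + lam*D)
        St, Sb = X @ X.T, X @ Y @ Y.T @ X.T
        _, V = eigh(Sb, St + lam * D)
        A = V[:, -k:]
        # D-step: d_ii = 1 / (2 ||g_i||_2), with a small guard against zero rows
        G = A @ B
        D = np.diag(1.0 / (2.0 * np.linalg.norm(G, axis=1) + 1e-12))
    return A, B, Yu
```

At test time, the predicted emotion for a sample is the argmax over the columns of its projected row of X^T A B, mirroring step 5.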
Preferably, in step 2 the different emotional states are induced by a video-induction method in which the subject watches different types of film clips; the emotion categories include happy, sad, neutral, fear, and disgust.
Preferably, the processes of optimizing Y_u, A, and B in step 4 are as follows:

4-1. Update Y_u by fixing A and B. Let P = X^T A B; formula (2) is rewritten as formula (4):

min_{Y_u} ||P_u - Y_u||_F^2,  s.t. Y_u ≥ 0, Y_u 1 = 1   (4)

where P_u denotes the rows of P corresponding to the unlabeled samples. Y_u is solved row by row: letting y_i^T denote the i-th row of Y_u and c_i^T the i-th row of P_u, formula (4) restricted to the i-th row can be expressed as formula (5):

min_{y_i} ||y_i - c_i||_2^2,  s.t. y_i ≥ 0, y_i^T 1 = 1   (5)

Formula (5) is solved for y_i by the Lagrange multiplier method, giving the Lagrangian in formula (6):

L(y_i, η, β) = (1/2)||y_i - c_i||_2^2 - η(y_i^T 1 - 1) - β^T y_i   (6)

Let y_i* denote the optimal solution of y_i, and η*, β* the multipliers corresponding to y_i*. From the KKT conditions, the optimal solution of y_i is:

y_i* = (c_i + η* 1)_+   (7)

where (·)_+ = max(·, 0) and η* is the value for which y_i*^T 1 = 1.
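The closed form in formula (7) is the Euclidean projection of c_i onto the probability simplex, and η* can be found by the standard sort-based search. A minimal sketch (the function name and the search procedure are our own, not the patent's):

```python
import numpy as np

def project_simplex(c):
    """Solve formula (5): min ||y - c||^2  s.t.  y >= 0, sum(y) = 1.
    Returns y* = (c + eta* . 1)_+ with eta* fixed by the KKT conditions."""
    u = np.sort(c)[::-1]                     # entries in descending order
    css = np.cumsum(u)
    idx = np.arange(1, len(c) + 1)
    rho = np.nonzero(u + (1.0 - css) / idx > 0)[0][-1]
    eta = (1.0 - css[rho]) / (rho + 1)       # eta* makes the kept entries sum to 1
    return np.maximum(c + eta, 0.0)
```

For example, project_simplex(np.array([0.9, 0.3, -0.2])) gives [0.8, 0.2, 0.0]: the negative coordinate is clipped to zero and the rest are shifted so the row sums to 1.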
4-2. Update B by fixing A and Y_u; formula (2) is rewritten as formula (8):

min_B Tr(B^T A^T (X X^T + λD) A B) - 2 Tr(Y^T X^T A B)   (8)

Differentiating formula (8) with respect to B and setting the derivative to 0, the update rule of B is formula (9):

B = (A^T (X X^T + λD) A)^{-1} A^T X Y   (9)
4-3. Update A by fixing Y_u and B. Substituting formula (9) into formula (2) gives formula (10):

max_A Tr((A^T (X X^T + λD) A)^{-1} A^T X Y Y^T X^T A)   (10)

Let S_t = X X^T and S_b = X Y Y^T X^T, where S_t and S_b play the roles of the within-class scatter and the between-class scatter in linear discriminant analysis, respectively. The optimal solution of the variable A can then be expressed as formula (11):

A* = argmax_A Tr((A^T (S_t + λD) A)^{-1} A^T S_b A)   (11)

whose columns are given by the leading generalized eigenvectors of the pair (S_b, S_t + λD).
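As a sanity check on formula (9): the objective in formula (8) is a convex quadratic in B, so the closed form must score no worse than any perturbed B. A small numerical check with synthetic data (all sizes here are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, k, c, lam = 8, 30, 4, 3, 0.5
X = rng.standard_normal((d, n))
Y = rng.random((n, c))
A = rng.standard_normal((d, k))
D = np.eye(d)                               # reweighting matrix, here identity
M = A.T @ (X @ X.T + lam * D) @ A           # positive definite for full-rank A

def obj(B):
    # formula (8): Tr(B^T M B) - 2 Tr(Y^T X^T A B)
    return np.trace(B.T @ M @ B) - 2.0 * np.trace(Y.T @ X.T @ A @ B)

B_star = np.linalg.solve(M, A.T @ X @ Y)    # formula (9)
for _ in range(5):
    B_pert = B_star + 0.01 * rng.standard_normal(B_star.shape)
    assert obj(B_star) <= obj(B_pert)       # the closed form is the minimizer
```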
Preferably, the preprocessing in step 2 is carried out as follows:

2-1. Down-sample the electroencephalogram data to 200 Hz and band-pass filter it to the 1-50 Hz range; following the five-band scheme, divide it into the Delta, Theta, Alpha, Beta, and Gamma bands.

2-2. Apply a short-time Fourier transform with a 4-second non-overlapping window to the electroencephalogram data of each of the 5 bands, and extract the differential entropy feature h(X) shown in formula (12):

h(X) = -∫ f(x) ln f(x) dx   (12)

In formula (12), X is the input sample matrix and x is an element of it; f(x) is the probability density function. For a sample matrix X obeying the Gaussian distribution N(μ, σ^2), the differential entropy feature h(X) reduces to formula (13):

h(X) = (1/2) ln(2πe σ^2)   (13)

In formula (13), σ is the standard deviation of the probability density function and μ is its expectation.
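Formula (13) can be checked against the definition in formula (12): for a Gaussian density, numerically integrating -f ln f reproduces (1/2)ln(2πeσ²). A quick verification sketch:

```python
import numpy as np

mu, sigma = 1.0, 2.0
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200001)
dx = x[1] - x[0]
# Gaussian probability density f(x)
f = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
numeric = np.sum(-f * np.log(f)) * dx                    # formula (12), discretized
closed = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)     # formula (13)
assert abs(numeric - closed) < 1e-3
```

Note that the result depends on σ but not on μ, which is why the differential entropy feature reduces to a function of the band power alone.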
Preferably, the electroencephalogram data acquisition uses 62 leads, and 5 frequency bands are selected; the 5 frequency bands are 1-4 Hz, 4-8 Hz, 8-14 Hz, 14-31 Hz, and 31-50 Hz respectively.
Preferably, the projection matrices A and B in step 4 are used to explore the activation patterns in emotion recognition. Letting G = AB, the vector θ ∈ R^d with θ_i = ||g_i||_2 represents the importance of each feature dimension, since the importance of a feature dimension can be evaluated by the l_2 norm of the corresponding row of G. From the obtained θ, the key frequency bands and brain regions for electroencephalogram emotion recognition are identified.
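A minimal illustration of this importance measure (the shapes and the zeroed row are our own contrivance): each row of G = AB corresponds to one input feature dimension, i.e. one (lead, band) pair, and its l_2 norm gives θ_i.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, c = 6, 4, 3                  # feature dims, subspace dim, classes
A = rng.standard_normal((d, k))
A[4, :] = 0.0                      # pretend feature dimension 4 never contributes
B = rng.standard_normal((k, c))
G = A @ B
theta = np.linalg.norm(G, axis=1)  # theta_i = ||g_i||_2, importance per dimension
ranking = np.argsort(theta)[::-1]  # most important feature dimensions first
assert theta[4] == 0.0 and ranking[-1] == 4
```

In practice the top entries of the ranking, mapped back to their (lead, band) pairs, indicate the key leads and frequency bands.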
The invention has the beneficial effects that:
1. The joint discriminative subspace mining and semi-supervised EEG emotion recognition method can project EEG data into a subspace to obtain a better classification boundary. Because EEG data are non-stationary, classification in the original data space is unsatisfactory; the method increases the between-class scatter of the samples in the SEED-V EEG dataset and reduces the within-class scatter. Experimental comparison shows that the accuracy of the emotion recognition model is greatly improved over the currently popular semi-supervised RLSR model.
2. The invention is a semi-supervised learning method that exploits unlabeled sample data for training: the labeled samples are first used to train the model, the trained model then assigns pseudo labels to the unlabeled samples, and the pseudo-labeled samples are added back into model training.
3. EEG data are acquired through an electrode cap with multiple leads; the sample data are affected by the experiment time and lead positions, and each lead contributes feature dimensions. By computing the projection matrices A and B, the method identifies the frequency bands and leads that are most favorable for model training, and the implicit information learned by the model yields the key frequency bands and brain regions in which emotional effects occur.
Drawings
FIG. 1 is a diagram of a model framework of the present invention;
FIG. 2 is a key band diagram of the present invention;
fig. 3 is a key lead diagram of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
The invention addresses the problem that the classification boundary in the original space of EEG emotion recognition datasets has low accuracy, and is based on the following observations. In emotion recognition the EEG signal is non-stationary; although preprocessing stabilizes the data, the differences between samples with different labels remain small, so the classification boundary obtained in the original space is poor and the resulting classifier is weak. If the samples are instead projected into a discriminative subspace, a good classification boundary can be obtained: the within-class scatter becomes small, the between-class scatter becomes large, and a model with good robustness results. Therefore, learning by projecting the dataset into a discriminative subspace is of great significance for improving the accuracy of emotion recognition.
As shown in fig. 1, the joint discriminative subspace mining and semi-supervised EEG emotion recognition method specifically comprises the following steps:
step 1, electroencephalogram data acquisition.
The emotion induction method adopted in the experiment is stimulus-material induction: the subject watches specific emotional stimulus material so that the corresponding emotional state is evoked. Human emotions are rarely strong under everyday conditions, so to acquire strong emotional information the subject must be deliberately induced. Five film clips with clear emotional tendencies are selected and played to the subject at different times; while the subject watches a film, the leads of the EEG cap connected to the corresponding brain regions record the subject's EEG data as the raw emotional EEG dataset.
EEG data are acquired M times from each of N subjects under the same emotion-inducing clips, giving N·M groups of EEG data. Each group has size d × n, where d is the feature dimension and n is the number of time-indexed EEG samples in a single acquisition. A group contains EEG data with multiple category labels acquired in one session. Each group is taken as a sample matrix X, and each sample matrix X corresponds to a label y; the label y corresponds to the emotion classification of the subject.
To study the stability of emotion recognition and ensure the effectiveness of the stimulation, each subject was asked to participate in 3 experiments, at least three days apart. In each experiment the subject watched 15 stimulus clips, 3 for each emotion type. To keep the stimulation effective and prevent the subject from becoming bored, the material watched in each experiment was completely different, and the total viewing time per experiment was controlled at about 50 minutes. In each trial, the participant viewed one of the film clips while EEG signals were collected using the ESI NeuroScan system with 62 leads.
Before playing, every stimulus clip has 15 seconds to introduce the background of the material and the emotion it is intended to evoke. After a clip is played there is a self-assessment and rest period of 15 or 30 seconds, depending on the type of material: 30 seconds for disgust or fear clips, and 15 seconds for happy, neutral, and sad clips.
Step 2, preprocess all the EEG data obtained in step 1 and extract features. This is carried out over 62 leads and 5 frequency bands (Delta 1-4 Hz, Theta 4-8 Hz, Alpha 8-14 Hz, Beta 14-31 Hz, Gamma 31-50 Hz), and differential entropy features are extracted. In practical applications, the number of leads depends on the EEG cap worn by the subject during acquisition; the band division follows the physiologically meaningful five-band scheme; the most common EEG features are power spectral density and differential entropy. Human EEG signals are very weak, which means they are easily interfered with, and the raw recordings are difficult to use directly for experiments. This imposes the following preprocessing requirements:
the pretreatment process is as follows:
2-1. Down-sample the EEG data to 200 Hz and band-pass filter it to the 1-50 Hz range. Following the five-band scheme, divide it into the Delta, Theta, Alpha, Beta, and Gamma bands.
2-2. Taking the EEG data of the 5 bands as sample matrices, apply a short-time Fourier transform with a 4-second non-overlapping window and extract the differential entropy features. The differential entropy feature h(X) is defined as:
h(X) = -∫ f(x) ln f(x) dx   (12)
In formula (12), X is the input sample matrix (i.e. the EEG data of one frequency band) and x is an element of it; f(x) is the probability density function. For a sample matrix X following the Gaussian distribution N(μ, σ^2), its differential entropy feature h(X) can be calculated as shown in formula (13):

h(X) = (1/2) ln(2πe σ^2)   (13)
in the formula (13), σ is a standard deviation of the probability density function; μ is the expectation of the probability density function.
It can be seen that the differential entropy feature is essentially a logarithmic form of the power spectral density feature: for a zero-mean band-limited signal the band power equals σ^2, so h(X) = (1/2)ln(2πe) + (1/2)ln(σ^2), i.e. half the logarithm of the power spectral density up to an additive constant.
The purpose of preprocessing the EEG signals is to improve the signal-to-noise ratio, thereby improving data quality and reducing interference.
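The preprocessing chain of step 2 can be sketched as follows. This is an illustrative approximation, not the patent's exact pipeline: the function names are ours, and the per-window differential entropy is computed from the variance of the band-filtered signal via formula (13) rather than via an explicit STFT.

```python
import numpy as np
from scipy.signal import butter, filtfilt, decimate

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def de_features(eeg, fs_in=1000, fs=200, win_s=4):
    """eeg: (n_leads, n_samples) raw recording at fs_in Hz.
    Returns (n_windows, n_leads * 5) differential entropy features."""
    eeg = decimate(eeg, fs_in // fs, axis=1)          # 2-1: down-sample to 200 Hz
    b, a = butter(4, [1, 50], btype="band", fs=fs)    # 2-1: 1-50 Hz band-pass
    eeg = filtfilt(b, a, eeg, axis=1)
    win = fs * win_s                                  # 4 s non-overlapping windows
    n_ch, n_win = eeg.shape[0], eeg.shape[1] // win
    out = np.empty((n_win, n_ch, len(BANDS)))
    for j, (lo, hi) in enumerate(BANDS.values()):
        b, a = butter(4, [lo, hi], btype="band", fs=fs)
        xb = filtfilt(b, a, eeg, axis=1)
        seg = xb[:, :n_win * win].reshape(n_ch, n_win, win)
        var = seg.var(axis=2)                         # band power per window
        out[:, :, j] = (0.5 * np.log(2 * np.pi * np.e * var)).T  # formula (13)
    return out.reshape(n_win, -1)
```

Each output row is one 4-second sample whose d = n_leads × 5 entries correspond to the (lead, band) feature dimensions used throughout the model.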
Step 3, construct the machine learning model of joint discriminative subspace mining and semi-supervised EEG emotion recognition, integrating the discriminative subspace obtained by the mapping of the projection matrix A and the semi-supervised learning model into a unified framework to obtain a jointly optimized objective function.
3-1. Establish the objective function embedding the description factors v and θ, as shown in formula (1):

min_{A,B,Y_u} ||X^T A B - Y||_F^2 + λ||AB||_{2,1},  s.t. Y_u ≥ 0, Y_u 1 = 1   (1)

In formula (1), X ∈ R^{d×n} is the input sample matrix; the projection matrix A ∈ R^{d×k} projects the raw data into a more discriminative subspace; the projection matrix B ∈ R^{k×c} connects the data in the discriminative subspace with the label information; Y = [Y_l; Y_u] ∈ R^{n×c} is the label matrix, where n = l + u, l is the number of labeled samples, and u is the number of unlabeled samples; Y_u ∈ R^{u×c} denotes the pseudo labels of the unlabeled samples; ||·||_{2,1} denotes the l_{2,1} norm; λ is the regularization parameter.
3-2. The objective function (1) is further rewritten as formula (2):

min_{A,B,Y_u} Tr((X^T A B - Y)^T (X^T A B - Y)) + λ Tr(B^T A^T D A B)   (2)

In formula (2), Tr(·) is the trace of a matrix; D ∈ R^{d×d} is a diagonal matrix whose i-th diagonal element is

d_ii = 1 / (2||g_i||_2)   (3)

In formula (3), g_i denotes the i-th row of the matrix G = AB evaluated at the current A and B, and ||·||_2 denotes the l_2 norm.
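The role of D in formula (3) is the standard reweighting trick for the l_{2,1} norm: with d_ii = 1/(2||g_i||_2), the smooth penalty Tr(G^T D G) equals exactly (1/2)||G||_{2,1} at the current iterate, so alternating between updating D and minimizing the trace term drives down the non-smooth norm. A check of this identity (random G, our own sizes):

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.standard_normal((6, 3))                # stand-in for G = AB
row_norms = np.linalg.norm(G, axis=1)
D = np.diag(1.0 / (2.0 * row_norms))           # formula (3)
lhs = np.trace(G.T @ D @ G)                    # smooth surrogate Tr(G^T D G)
rhs = 0.5 * row_norms.sum()                    # (1/2) ||G||_{2,1}
assert np.isclose(lhs, rhs)
```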
Step 4, initialize the pseudo labels Y_u and the matrix D. According to the jointly optimized objective function obtained in step 3, the update rule of each variable is obtained by fixing the other two variables and updating it; the pseudo labels Y_u of the unlabeled samples, the projection matrix A, and the connection matrix B are optimized in turn, and this optimization process is repeated several times to realize joint iterative optimization:
4-1. Update Y_u by fixing A and B. Let P = X^T A B; formula (2) is rewritten as formula (4):

min_{Y_u} ||P_u - Y_u||_F^2,  s.t. Y_u ≥ 0, Y_u 1 = 1   (4)

where P_u denotes the rows of P corresponding to the unlabeled samples. Y_u is solved row by row: letting y_i^T denote the i-th row of Y_u and c_i^T the i-th row of P_u, formula (4) restricted to the i-th row can be expressed as formula (5):

min_{y_i} ||y_i - c_i||_2^2,  s.t. y_i ≥ 0, y_i^T 1 = 1   (5)

Formula (5) is solved for y_i by the Lagrange multiplier method, giving the Lagrangian in formula (6):

L(y_i, η, β) = (1/2)||y_i - c_i||_2^2 - η(y_i^T 1 - 1) - β^T y_i   (6)

Let y_i* denote the optimal solution of y_i, and η*, β* the multipliers corresponding to y_i*. From the KKT conditions, the optimal solution of y_i is:

y_i* = (c_i + η* 1)_+   (7)

where (·)_+ = max(·, 0) and η* is the value for which y_i*^T 1 = 1.
4-2. Update B by fixing A and Y_u; formula (2) is rewritten as formula (8):

min_B Tr(B^T A^T (X X^T + λD) A B) - 2 Tr(Y^T X^T A B)   (8)

Differentiating formula (8) with respect to B and setting the derivative to 0, the update rule of B is formula (9):

B = (A^T (X X^T + λD) A)^{-1} A^T X Y   (9)
4-3. Update A by fixing Y_u and B. Substituting formula (9) into formula (2) gives formula (10):

max_A Tr((A^T (X X^T + λD) A)^{-1} A^T X Y Y^T X^T A)   (10)

Let S_t = X X^T and S_b = X Y Y^T X^T, where S_t and S_b play the roles of the within-class scatter and the between-class scatter in linear discriminant analysis, respectively. The optimal solution of the variable A can be expressed as:

A* = argmax_A Tr((A^T (S_t + λD) A)^{-1} A^T S_b A)   (11)
let G be AB,
Figure BDA0003425317620000083
representing the importance of each dimension feature, since the importance of each feature dimension can be normalized by its/2The norm, which yields the following equation:
Figure BDA0003425317620000084
wherein g isiRepresentation matrix
Figure BDA0003425317620000085
Row i element of (1);
and 5, inputting the sample matrix X obtained in the step 2 into the objective function subjected to iterative optimization in the step 4 to obtain a corresponding predicted value label, wherein the predicted value label is the emotional state of the testee corresponding to the sample at the acquisition moment, and adding the obtained pseudo label into the training process of the model to realize semi-supervised learning.
Compared with the currently popular semi-supervised RLSR method, the embodiment achieves higher recognition accuracy.

Claims (7)

1. A joint discriminative subspace mining and semi-supervised electroencephalogram emotion recognition method, characterized by comprising the following steps:
step 1, acquiring electroencephalogram data of a testee in K different emotional states;
step 2, preprocessing and extracting characteristics of the electroencephalogram data acquired in the step 1, wherein each sample matrix X consists of electroencephalogram characteristics of a testee, and a label vector y is an emotion label corresponding to the electroencephalogram characteristics in the sample matrix X; selecting two different sample matrixes as tagged data and untagged data respectively;
step 3, constructing a machine learning model of the joint discriminative subspace mining and semi-supervised electroencephalogram emotion recognition method, and integrating the discriminative subspace obtained by the mapping of the projection matrix A and the semi-supervised learning model into a unified framework to obtain a jointly optimized objective function;
step 4, initializing the pseudo labels Y_u and the matrix D; according to the jointly optimized objective function obtained in step 3, obtaining the update rule of each variable by fixing the other two variables and updating it; optimizing in turn the pseudo labels Y_u of the unlabeled samples, the projection matrix A, and the connection matrix B, and repeating this optimization process several times to realize joint iterative optimization;
and step 5, inputting the sample matrix X obtained in step 2 into the objective function iteratively optimized in step 4 to obtain the predicted labels, each predicted label being the emotional state of the subject at the corresponding acquisition time, and adding the obtained pseudo labels into the training process of the model to realize semi-supervised learning.
2. The joint discriminative subspace mining and semi-supervised electroencephalogram emotion recognition method according to claim 1, characterized in that step 3 specifically comprises:
3-1. establishing the objective function embedding the description factors v and θ, as shown in formula (1):
min_{A,B,Y_u} ||X^T A B - Y||_F^2 + λ||AB||_{2,1},  s.t. Y_u ≥ 0, Y_u 1 = 1   (1)
in formula (1), X ∈ R^{d×n} is the input sample matrix; the projection matrix A ∈ R^{d×k} projects the raw data into a more discriminative subspace; the projection matrix B ∈ R^{k×c} connects the data in the discriminative subspace with the label information; Y = [Y_l; Y_u] ∈ R^{n×c} is the label matrix, where n = l + u, l is the number of labeled samples, and u is the number of unlabeled samples; Y_u ∈ R^{u×c} denotes the pseudo labels of the unlabeled samples; ||·||_{2,1} denotes the l_{2,1} norm; λ is the regularization parameter;
3-2. rewriting the objective function (1) as formula (2):
min_{A,B,Y_u} Tr((X^T A B - Y)^T (X^T A B - Y)) + λ Tr(B^T A^T D A B)   (2)
in formula (2), Tr(·) is the trace of a matrix, and D ∈ R^{d×d} is a diagonal matrix whose i-th diagonal element is
d_ii = 1 / (2||g_i||_2)   (3)
in formula (3), g_i denotes the i-th row of the matrix G = AB evaluated at the current A and B, and ||·||_2 denotes the l_2 norm.
3. The joint discriminative subspace mining and semi-supervised electroencephalogram emotion recognition method according to claim 2, characterized in that step 4 specifically comprises:
4-1. updating Y_u by fixing A and B: letting P = X^T A B, formula (2) is rewritten as formula (4):
min_{Y_u} ||P_u - Y_u||_F^2,  s.t. Y_u ≥ 0, Y_u 1 = 1   (4)
where P_u denotes the rows of P corresponding to the unlabeled samples; Y_u is solved row by row: letting y_i^T denote the i-th row of Y_u and c_i^T the i-th row of P_u, formula (4) restricted to the i-th row is expressed as formula (5):
min_{y_i} ||y_i - c_i||_2^2,  s.t. y_i ≥ 0, y_i^T 1 = 1   (5)
formula (5) is solved for y_i by the Lagrange multiplier method, giving the Lagrangian in formula (6):
L(y_i, η, β) = (1/2)||y_i - c_i||_2^2 - η(y_i^T 1 - 1) - β^T y_i   (6)
letting y_i* denote the optimal solution of y_i and η*, β* the multipliers corresponding to y_i*, the optimal solution of y_i is obtained from the KKT conditions as:
y_i* = (c_i + η* 1)_+   (7)
where (·)_+ = max(·, 0) and η* is the value for which y_i*^T 1 = 1;
4-2. updating B by fixing A and Y_u: formula (2) is rewritten as formula (8):
min_B Tr(B^T A^T (X X^T + λD) A B) - 2 Tr(Y^T X^T A B)   (8)
differentiating formula (8) with respect to B and setting the derivative to 0, the update rule of B is formula (9):
B = (A^T (X X^T + λD) A)^{-1} A^T X Y   (9)
4-3. updating A by fixing Y_u and B: substituting formula (9) into formula (2) gives formula (10):
max_A Tr((A^T (X X^T + λD) A)^{-1} A^T X Y Y^T X^T A)   (10)
letting S_t = X X^T and S_b = X Y Y^T X^T, where S_t and S_b play the roles of the within-class scatter and the between-class scatter in linear discriminant analysis respectively, the variable A is given by formula (11):
A* = argmax_A Tr((A^T (S_t + λD) A)^{-1} A^T S_b A)   (11)
the variable A so obtained being the optimal solution.
4. The method of claim 1, wherein the emotion categories comprise: happy, sad, neutral, fear, and disgust.
5. The joint discriminative subspace mining and semi-supervised electroencephalogram emotion recognition method according to claim 1, characterized in that the preprocessing in step 2 comprises the following substeps:
2-1. down-sampling the electroencephalogram data to 200 Hz and band-pass filtering it to the 1-50 Hz range; following the five-band scheme, dividing it into the Delta, Theta, Alpha, Beta, and Gamma bands;
2-2. applying a short-time Fourier transform with a 4-second non-overlapping window to the electroencephalogram data of each of the 5 bands, and extracting the differential entropy feature h(X) shown in formula (12):
h(X) = -∫ f(x) ln f(x) dx   (12)
in formula (12), X is the input sample matrix and x is an element of it; f(x) is the probability density function; for a sample matrix X obeying the Gaussian distribution N(μ, σ^2), h(X) reduces to formula (13):
h(X) = (1/2) ln(2πe σ^2)   (13)
in formula (13), σ is the standard deviation of the probability density function and μ is its expectation.
6. The joint discrimination subspace mining and semi-supervised electroencephalogram emotion recognition method according to claim 1, characterized in that: the electroencephalogram data acquisition uses 62 leads, and 5 frequency bands are selected; the 5 frequency bands are 1-4 Hz, 4-8 Hz, 8-14 Hz, 14-31 Hz and 31-50 Hz respectively.
7. The joint discrimination subspace mining and semi-supervised electroencephalogram emotion recognition method according to claim 1, characterized in that: the projection matrices A and B obtained in step 4 are used to explore the activation patterns in emotion recognition. Let the matrix G = AB; the importance θi of the i-th feature dimension is measured by the l2 norm of the i-th row of G, normalized across all rows:

θi = ||g^i||_2 / Σ_j ||g^j||_2

The key frequency bands and brain regions for electroencephalogram emotion recognition are obtained from the resulting θ.
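A minimal sketch of the importance measure θ of claim 7, using random stand-ins for A and B (the mapping of feature dimensions back to frequency bands and leads is dataset-specific and omitted):

```python
import numpy as np

rng = np.random.default_rng(3)
d, k, c = 10, 4, 3                    # feature dim, subspace dim, classes
A = rng.standard_normal((d, k))
B = rng.standard_normal((k, c))

G = A @ B                             # combined projection matrix, d x c
theta = np.linalg.norm(G, axis=1)     # l2 norm of each row = per-dimension importance
theta /= theta.sum()                  # normalize so the importances sum to 1
ranking = theta.argsort()[::-1]       # most to least activated feature dimensions
```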
CN202111578215.8A 2021-12-22 2021-12-22 Combined discrimination subspace mining and semi-supervised electroencephalogram emotion recognition method Active CN114343674B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111578215.8A CN114343674B (en) 2021-12-22 2021-12-22 Combined discrimination subspace mining and semi-supervised electroencephalogram emotion recognition method

Publications (2)

Publication Number Publication Date
CN114343674A true CN114343674A (en) 2022-04-15
CN114343674B CN114343674B (en) 2024-05-03

Family

ID=81101572

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111578215.8A Active CN114343674B (en) 2021-12-22 2021-12-22 Combined discrimination subspace mining and semi-supervised electroencephalogram emotion recognition method

Country Status (1)

Country Link
CN (1) CN114343674B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114841214A (en) * 2022-05-18 2022-08-02 杭州电子科技大学 Pulse data classification method and device based on semi-supervised discrimination projection

Citations (3)

Publication number Priority date Publication date Assignee Title
CN108009571A (en) * 2017-11-16 2018-05-08 苏州大学 Transductive semi-supervised data classification method and system
US20200125897A1 (en) * 2018-10-18 2020-04-23 Deepnorth Inc. Semi-Supervised Person Re-Identification Using Multi-View Clustering
CN113157094A (en) * 2021-04-21 2021-07-23 杭州电子科技大学 Electroencephalogram emotion recognition method combining feature migration and graph semi-supervised label propagation


Non-Patent Citations (1)

Title
XIAOJUN CHEN et al.: "Semi-supervised Feature Selection via Rescaled Linear Regression", PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI-17), 1 August 2017 *


Similar Documents

Publication Publication Date Title
Li et al. Depression recognition using machine learning methods with different feature generation strategies
CN106886792B (en) Electroencephalogram emotion recognition method for constructing multi-classifier fusion model based on layering mechanism
CN103631941B (en) Target image searching system based on brain electricity
CN111134666A (en) Emotion recognition method of multi-channel electroencephalogram data and electronic device
CN108304917A (en) A kind of P300 signal detecting methods based on LSTM networks
CN108056774A (en) Experimental paradigm mood analysis implementation method and its device based on visual transmission material
CN112773378B (en) Electroencephalogram emotion recognition method for feature weight adaptive learning
CN113157094B (en) Electroencephalogram emotion recognition method combining feature migration and graph semi-supervised label propagation
CN108256579A (en) A kind of multi-modal sense of national identity quantization measuring method based on priori
Yudhana et al. Human emotion recognition based on EEG signal using fast fourier transform and K-Nearest neighbor
CN111476158A (en) Multi-channel physiological signal somatosensory gesture recognition method based on PSO-PCA-SVM
Wang et al. Maximum weight multi-modal information fusion algorithm of electroencephalographs and face images for emotion recognition
CN113974627B (en) Emotion recognition method based on brain-computer generated confrontation
CN114343674A (en) Combined judgment subspace mining and semi-supervised electroencephalogram emotion recognition method
Chu et al. An enhanced EEG microstate recognition framework based on deep neural networks: an application to Parkinson's disease
CN117883082A (en) Abnormal emotion recognition method, system, equipment and medium
Demir et al. Bio-inspired filter banks for frequency recognition of SSVEP-based brain–computer interfaces
CN112861629A (en) Multi-window distinguishing typical pattern matching method and brain-computer interface application
CN117407748A (en) Electroencephalogram emotion recognition method based on graph convolution and attention fusion
Zhang et al. TorchEEGEMO: A deep learning toolbox towards EEG-based emotion recognition
CN111265214B (en) Electroencephalogram signal analysis method based on data structured decomposition
Wei et al. An investigation of pilot emotion change detection based on multimodal physiological signals
CN114818822A (en) Electroencephalogram migration emotion recognition method combining semi-supervised regression and icon label propagation
CN114186591A (en) Method for improving generalization capability of emotion recognition system
CN114638253A (en) Identity recognition system and method based on emotion electroencephalogram feature fusion optimization mechanism

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant