CN115054270A - Sleep staging method and system for extracting sleep spectrogram features based on GCN - Google Patents


Info

Publication number: CN115054270A
Application number: CN202210687782.5A
Authority: CN (China)
Legal status: Pending
Prior art keywords: sleep, vector sequence, electroencephalogram, GCN
Other languages: Chinese (zh)
Inventors: 陈欣荣, 赵宇芳, 陈啸, 杜东书, 李文智
Applicant and current assignee: Shaoxing Institute of Shanghai University
Priority: CN202210687782.5A

Classifications

    • A61B5/24 — Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 — Modalities, i.e. specific diagnostic methods
    • A61B5/369 — Electroencephalography [EEG]
    • A61B5/4806 — Sleep evaluation
    • A61B5/4812 — Detecting sleep stages or cycles
    • A61B5/7203 — Signal processing for noise prevention, reduction or removal
    • A61B5/7225 — Details of analog processing, e.g. filtering, gain adjustment, drift compensation
    • A61B5/7257 — Details of waveform analysis using Fourier transforms
    • A61B5/7264 — Classification of physiological signals or data, e.g. using neural networks

    (All under A — Human necessities; A61 — Medical or veterinary science; A61B — Diagnosis; A61B5/00 — Measuring for diagnostic purposes.)


Abstract

The invention discloses a sleep staging method and system for extracting sleep spectrogram features based on a GCN. The method comprises the following steps: S1, acquiring an electroencephalogram (EEG) signal and processing it to obtain an EEG time-frequency image; S2, performing position-encoding processing on the time-frequency image to obtain a feature vector sequence; S3, modeling the feature vector sequence to obtain an output vector sequence; and S4, classifying the output vector sequence to obtain its staging result. The system comprises: a signal acquisition unit, an encoding processing unit, a modeling unit and a classification unit. Based on a Transformer backbone, the invention encodes the self-attention scores into heat maps, providing interpretability for the model. At the sequence-processing level, a novel multi-branch strategy is proposed; the final feature vector is computed, and the output is classified with a softmax layer to obtain the staging result of the sequence.

Description

Sleep staging method and system for extracting sleep spectrogram features based on GCN
Technical Field
The invention belongs to the field of sleep monitoring, and particularly relates to a sleep staging method and a sleep staging system for extracting sleep spectrogram features based on GCN.
Background
Sleep problems such as sleep disorders and sleep deprivation impose a huge burden on public health. How to analyze sleep conditions and better diagnose sleep problems is of great significance to global sleep health. Sleep staging is the first, and also a critical, step in sleep diagnosis and assessment. In recent years, thanks to the development of deep learning, researchers have tried to apply deep learning methods to automatic sleep staging; early attempts mainly used popular deep network frameworks such as convolutional neural networks, recurrent neural networks and autoencoders, and achieved certain results.
However, in clinical application, sleep evaluation and monitoring still depend heavily on manual scoring by sleep experts, and there is no automatic sleep staging scheme that can be effectively applied to fields such as home monitoring. This is mainly because existing automatic sleep staging methods have two main drawbacks: first, structural stacking or independent structures of different neural networks do not combine well with the characteristics of sleep EEG signals, so the classification results obtained by simple probability calculation on the final feature vectors have low accuracy; second, the "black box" nature of deep learning also limits, to a certain extent, sleep evaluation in application scenarios such as clinical practice and accurate home monitoring.
Disclosure of Invention
The invention provides a sleep staging method and system for extracting sleep spectrogram features based on GCN. At the epoch-processing level, self-attention scores are encoded into heat maps on a Transformer backbone, providing interpretability for the model. At the sequence-processing level, a novel multi-branch strategy is proposed; finally, the final feature vector is computed and the output is classified with a softmax layer to obtain the staging result of the sequence.
In order to achieve the purpose, the invention provides the following scheme:
the sleep staging method for extracting the sleep spectrogram characteristics based on the GCN comprises the following steps:
s1, acquiring an electroencephalogram signal, and processing the electroencephalogram signal to obtain an electroencephalogram time-frequency image;
s2, performing position coding processing based on the electroencephalogram time-frequency image to obtain a characteristic vector sequence;
s3, modeling based on the characteristic vector sequence to obtain an output vector sequence;
and S4, classifying the output vector sequence to obtain the staging result of the output vector sequence.
Preferably, the method for processing the electroencephalogram signal in S1 comprises:
s11, carrying out short-time Fourier transform on the electroencephalogram signal to obtain a power spectrum;
s12, transforming the power spectrum by adopting a logarithmic scaling method to obtain a logarithmic power spectrum; and obtaining the electroencephalogram time-frequency image based on the logarithmic power spectrum.
Preferably, the position encoding method in S2 comprises: encoding the electroencephalogram signals by adopting a Transformer as the backbone structure, combined with an attention mechanism.
Preferably, the modeling method in S3 comprises: modeling the feature vector sequence by adopting a graph convolution network and a bidirectional recurrent neural network.
Preferably, the output vector sequence classification in S4 adopts a softmax function.
Preferably, the staging result in S4 corresponds to five stages of sleep staging: awake phase, N1 phase, N2 phase, N3 phase and rapid eye movement phase.
The invention also provides a sleep staging system for extracting sleep spectrogram characteristics based on the GCN, which comprises the following steps: the device comprises a signal acquisition unit, a coding processing unit, a modeling unit and a classification unit;
the signal acquisition unit is connected with the coding processing unit and is used for acquiring electroencephalogram signals and processing the electroencephalogram signals to obtain electroencephalogram time-frequency images;
the encoding processing unit is also connected with the modeling unit and is used for encoding the electroencephalogram time-frequency image to obtain a characteristic vector sequence;
the modeling unit is also connected with the classification unit and is used for modeling the characteristic vector sequence to obtain an output vector sequence;
the classification unit is used for classifying the output vector sequence to obtain a stage result of the output vector sequence.
Preferably, the signal acquisition unit adopts a head-wearing electrode electroencephalogram signal acquisition device.
The invention has the beneficial effects that:
at the epoch processing level, the present invention encodes the self-attention scores into heatmaps based on the transform backbone, providing interpretability for the model. At the sequence processing level, the present invention proposes a novel multi-branch strategy. The convolution network is used as a main branch to deeply mine the potential characteristics of the electroencephalogram signal, and compared with the traditional neural network such as the convolution neural network, the convolution network can better process data with topological structures such as the electroencephalogram signal. On the auxiliary branch, the invention adopts a bidirectional circulation neural network to extract the characteristics. And finally, fusing the outputs of the two branches by using a fusion strategy, calculating a final feature vector, and classifying the outputs by using a softmax layer to obtain a staging result of the sequence.
Experimental results on the Sleep-EDF public dataset show that the method is superior to existing sleep staging methods, with an overall accuracy as high as 90%.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings needed to be used in the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
Fig. 1 is a schematic flowchart of a sleep staging method for extracting sleep spectrogram features based on a GCN according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of a neural network structure of a sleep staging method for extracting sleep spectrogram features based on GCN according to the present invention;
fig. 3 is a schematic structural diagram of a sleep staging system for extracting sleep spectrogram features based on a GCN according to a third embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Example one
As shown in fig. 1, a schematic flow chart of the sleep staging method for extracting sleep spectrogram features based on GCN of the present invention includes the following steps:
s1, acquiring an electroencephalogram signal, and processing the electroencephalogram signal to obtain an electroencephalogram time-frequency image;
In the first embodiment, the acquired EEG signal is divided into L EEG segments of 30 s duration, and the power spectrum of each segment is computed by a short-time Fourier transform with a 2 s time window; the power spectrum is then converted into a logarithmic power spectrum by logarithmic scaling, yielding a two-dimensional time-frequency image S ∈ R^(F×T), where F denotes the total number of frequency bins over all samples in the frequency domain and T the number of spectral columns. In the first embodiment, F = 129 and T = 29.
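The STFT and logarithmic scaling of steps S11–S12 can be sketched in NumPy; the 128 Hz sampling rate, Hann window, and 50% overlap are assumptions chosen because they reproduce the stated dimensions F = 129, T = 29 (the patent does not specify them):

```python
import numpy as np

def log_power_spectrogram(epoch, fs=128, win_sec=2.0, overlap=0.5):
    """Short-time Fourier transform of one 30 s EEG epoch,
    followed by logarithmic scaling of the power spectrum."""
    nper = int(fs * win_sec)            # 256 samples per 2 s window
    hop = int(nper * (1.0 - overlap))   # 128-sample hop (50% overlap)
    window = np.hanning(nper)
    n_frames = (len(epoch) - nper) // hop + 1
    frames = np.stack([epoch[i * hop:i * hop + nper] * window
                       for i in range(n_frames)], axis=1)
    power = np.abs(np.fft.rfft(frames, axis=0)) ** 2   # power spectrum
    return 10.0 * np.log10(power + 1e-10)              # log scaling

# A 30 s epoch at 128 Hz reproduces the stated S ∈ R^(129×29)
epoch = np.random.randn(30 * 128)
S = log_power_spectrogram(epoch)
print(S.shape)  # (129, 29)
```

Under these assumptions each 30 s epoch yields 29 spectral columns of 129 frequency bins, matching the F and T of the first embodiment.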
S2, based on the time-frequency image, carrying out position coding processing to obtain a characteristic vector sequence;
The time-frequency image S is processed as a sequence of T spectral columns and encoded with sine and cosine functions; taking row i, column 2j and column (2j+1) as examples:

P(i, 2j) = sin(i / 10000^(2j/F))
P(i, 2j+1) = cos(i / 10000^(2j/F))

where P denotes the encoding matrix and F denotes the total number of frequency bins over all samples in the frequency domain.

Position information is then encoded into the time-frequency image S:

X^(0) = S^T + P_ep

where X^(0) ∈ R^(T×F) denotes the time-frequency image after position-information encoding and P_ep ∈ R^(T×F) denotes the position-encoding matrix.
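A minimal NumPy sketch of the sinusoidal position coding described above, with T = 29 and F = 129 from the first embodiment; transposing S to T×F before adding the codes is an assumption about the intended layout:

```python
import numpy as np

def positional_encoding(T, F):
    """P[i, 2j] = sin(i / 10000^(2j/F)); P[i, 2j+1] = cos(i / 10000^(2j/F))."""
    P = np.zeros((T, F))
    i = np.arange(T)[:, None]                      # spectral-column index
    angle = i / np.power(10000.0, np.arange(0, F, 2) / F)
    P[:, 0::2] = np.sin(angle)                     # even columns 2j
    P[:, 1::2] = np.cos(angle[:, : F // 2])        # odd columns 2j+1
    return P

T, F = 29, 129
P = positional_encoding(T, F)
S = np.random.randn(F, T)        # time-frequency image S ∈ R^(F×T)
X0 = S.T + P                     # position-encoded input, X^(0) ∈ R^(T×F)
print(P.shape, X0.shape)  # (29, 129) (29, 129)
```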
N_E EpochTransformer encoders are then used for coding:

X^(i) = EpochTransformer(X^(i-1)), X^(i) ∈ R^(T×F), 1 ≤ i ≤ N_E

where X^(0) is the position-encoded image and X^(i) denotes the encoded sequence of spectral columns of X^(i-1).
To reduce the output of all transformers to a compact feature vector representing the epoch, the output sequence (X_1^(N_E), …, X_T^(N_E)) is combined by a weighted attention mechanism to obtain the epoch feature vectors (X_1, …, X_L); the calculation is:

X = Σ_{t=1}^{T} α_t X_t^(N_E)

where X ∈ R^F is the derived feature vector of the input epoch, X_t^(N_E) is the t-th element of the EpochTransformer output sequence, and α_t is the attention weight learned by the softmax layer.
Wherein, the attention weight calculation formula is as follows:
Figure BDA0003700334840000067
a e =tanh(W a X t +b a ),W a ∈R A×F ,b a ∈R A
wherein, W a ∈R A×F And b a ∈R A Respectively learnable weight matrix and bias, a e ∈R A Is the trained epoch context vector. A represents the attention size.
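The attentive pooling step can be sketched in NumPy as follows; the random parameters and the attention size A = 64 are illustrative assumptions:

```python
import numpy as np

def attention_pool(X, W_a, b_a, a_e):
    """Collapse a T×F transformer output into one F-dim epoch vector:
    a_t = tanh(W_a x_t + b_a); alpha = softmax(a_t . a_e); x = sum alpha_t x_t."""
    A_mat = np.tanh(X @ W_a.T + b_a)        # (T, A) per-column projections
    scores = A_mat @ a_e                    # (T,) similarity to context vector
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                    # attention weights, sum to 1
    return alpha @ X                        # weighted combination, (F,)

T, F, A = 29, 129, 64
rng = np.random.default_rng(0)
X = rng.normal(size=(T, F))
x = attention_pool(X, rng.normal(size=(A, F)), np.zeros(A), rng.normal(size=A))
print(x.shape)  # (129,)
```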
S3, modeling based on the characteristic vector sequence to obtain an output vector sequence;
As shown in fig. 2, the present invention proposes a novel multi-branch strategy: a multi-layer graph convolution network is adopted as the main branch to deeply mine the latent features of the EEG signal, and a bidirectional recurrent neural network is adopted as the auxiliary branch to extract features. The specific method is as follows:
In the multi-layer graph convolution network part, the attention feature vector sequence (X_1, …, X_L) is encoded as an output vector sequence O_G = (O_G1, O_G2, …, O_GL); the steps are as follows:
(11) constructing a multilayer graph convolution network;
The construction method is:

H^(l+1) = σ(D̃^(-1/2) Ã D̃^(-1/2) H^(l) W^(l))

where Ã = A_adj + I_N denotes the adjacency matrix with added self-connections, A_adj is the adjacency matrix representing the connection relations of adjacent nodes in the undirected graph G, I_N is the identity matrix, and W^(l) is a trainable matrix, the superscript (l) indicating that the matrix belongs to layer l. σ(·) is the activation function of the network. H^(l) ∈ R^(N×D) is the activation matrix of layer l; the activation matrix of layer 0 is H^(0) = X.
In the first embodiment, a two-layer graph convolution network is used to model the attention feature sequence, with the adjacency matrix A_adj given by:

a_ij = 1 when i = j+1 or i = j-1, and a_ij = 0 otherwise,

i.e. adjacent epochs in the sequence are connected.
(12) Compute the adjacency matrix with added self-connections and normalization, D̃^(-1/2) Ã D̃^(-1/2), according to:

Ã = A_adj + I_N, D̃_ii = Σ_j Ã_ij

where D̃ is the diagonal degree matrix of Ã.
(13) Compute the sequence of output vectors O_Gl according to:

O_G = D̃^(-1/2) Ã D̃^(-1/2) σ(D̃^(-1/2) Ã D̃^(-1/2) X W^(0)) W^(1)

where D̃^(-1/2) Ã D̃^(-1/2) denotes the normalized adjacency matrix with added self-connections, W^(0) ∈ R^(M×T) is the learnable weight matrix from the input layer to the hidden layer with T feature maps, and W^(1) ∈ R^(T×F) is the learnable weight matrix from the hidden layer to the output layer.
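A minimal NumPy sketch of the two-layer graph convolution over the chain adjacency described above; the hidden width and the ReLU activation between layers are assumptions:

```python
import numpy as np

def chain_adjacency(L):
    """a_ij = 1 when i = j+1 or i = j-1 (adjacent epochs), 0 otherwise."""
    A = np.zeros((L, L))
    idx = np.arange(L - 1)
    A[idx, idx + 1] = 1.0
    A[idx + 1, idx] = 1.0
    return A

def gcn_two_layer(X, A_adj, W0, W1):
    """H^(l+1) = sigma(D^-1/2 (A+I) D^-1/2 H^(l) W^(l)), ReLU between layers."""
    A_tilde = A_adj + np.eye(len(A_adj))          # add self-connections
    d = A_tilde.sum(axis=1)                       # degree of each node
    D_inv_sqrt = np.diag(d ** -0.5)
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt     # normalized adjacency
    H = np.maximum(A_hat @ X @ W0, 0.0)           # layer 1 + ReLU
    return A_hat @ H @ W1                         # layer 2 (linear output)

L_epochs, F, H_dim = 10, 129, 32
rng = np.random.default_rng(1)
O_G = gcn_two_layer(rng.normal(size=(L_epochs, F)), chain_adjacency(L_epochs),
                    rng.normal(size=(F, H_dim)), rng.normal(size=(H_dim, F)))
print(O_G.shape)  # (10, 129)
```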
In the first embodiment, all graph convolution layers share network parameters; to obtain a suitable residual combination, a fully-connected layer is used to convert X_l into a vector with the same structure as O_Rl.
In the bidirectional recurrent neural network part, the attention feature vector sequence (X_1, …, X_L) is encoded as an output vector sequence (O_R1, O_R2, …, O_RL), where 1 ≤ l ≤ L; the output sequence O_R is computed from the forward hidden state vectors and backward hidden state vectors.

The method specifically comprises the following steps:
(21) Calculate the forward hidden state vector:

h_l^(f) = RNN(X_l, h_(l-1)^(f))

(22) Calculate the backward hidden state vector:

h_l^(b) = RNN(X_l, h_(l+1)^(b))

(23) Calculate the output sequence:

O_Rl = W_o [h_l^(f) ; h_l^(b)] + b_o

where W_o is a learnable weight matrix, h_l^(f) and h_l^(b) denote the forward and backward hidden state vectors respectively, and b_o is a learnable bias.
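The auxiliary branch's bidirectional recurrence can be sketched with a vanilla tanh RNN cell; sharing the input and hidden weights between the two directions is a simplifying assumption:

```python
import numpy as np

def bi_rnn(X, Wx, Wh, Wo, bo):
    """Run a vanilla RNN forward and backward over the epoch sequence,
    then project the concatenated hidden states to the output O_Rl."""
    L, _ = X.shape
    H = Wh.shape[0]
    hf = np.zeros((L, H))
    hb = np.zeros((L, H))
    h = np.zeros(H)
    for l in range(L):                       # forward pass over epochs
        h = np.tanh(X[l] @ Wx + h @ Wh)
        hf[l] = h
    h = np.zeros(H)
    for l in reversed(range(L)):             # backward pass over epochs
        h = np.tanh(X[l] @ Wx + h @ Wh)
        hb[l] = h
    return np.concatenate([hf, hb], axis=1) @ Wo + bo   # (L, F_out)

L_epochs, F, H = 10, 129, 16
rng = np.random.default_rng(2)
O_R = bi_rnn(rng.normal(size=(L_epochs, F)),
             rng.normal(size=(F, H)), rng.normal(size=(H, H)),
             rng.normal(size=(2 * H, F)), np.zeros(F))
print(O_R.shape)  # (10, 129)
```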
S4, classifying the output vector sequence to obtain the staging result of the output vector sequence;

The output vectors of the two branches, the bidirectional recurrent neural network and the multi-layer graph convolution network, are added to obtain the output vector sequence: O_l = O_Rl + O_Gl, 1 ≤ l ≤ L, where L denotes the number of EEG segments.

The output vector sequence O = (O_1, O_2, …, O_L) is input into a shared softmax layer, which effectively reduces the number of parameters in the network, and the sleep-stage probability values ŷ_l corresponding to each sequence element are computed.
The sleep stage corresponding to each sequence element is obtained by computing its probability value for each sleep stage in turn and taking the stage with the maximum probability. For example, if the probability values of a sequence element for the five sleep stages are 0.6, 0.1, 0.1, 0.1 and 0.1, the maximum probability 0.6 determines the result, and the corresponding sleep stage is the awake stage.
Example two
In order to judge the reliability of the constructed model, the invention introduces a loss function to calculate the loss of the model prediction result and the real result. The method comprises the following steps:
In the second embodiment, after the output vector sequence O = (O_1, O_2, …, O_L) is input into a shared softmax layer, the sleep-stage probability values ŷ_l corresponding to each sequence element are computed.

During network training, the input sequence (S_1, S_2, …, S_L) and its corresponding encoded label vectors (y_1, y_2, …, y_L) serve as the ground truth to evaluate the reliability of the model; when the loss is large, the parameters are adaptively adjusted. The loss function of the network is:

L(θ) = - (1/L) Σ_{l=1}^{L} y_l^T log(ŷ_l) + λ ||θ||_2^2

where θ denotes the parameters of the network and λ is a hyper-parameter trading off the error term against the l2 regularization term.
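The training loss of the second embodiment (cross-entropy against the ground-truth labels plus an l2 penalty weighted by λ) can be sketched in NumPy; averaging over epochs and the default λ value are assumptions:

```python
import numpy as np

def staging_loss(probs, y_true, params, lam=1e-4):
    """Cross-entropy of the predicted stage probabilities against the
    ground-truth labels, plus lambda times the l2 norm of the parameters."""
    ce = -np.mean(np.log(probs[np.arange(len(y_true)), y_true] + 1e-12))
    l2 = sum(np.sum(w ** 2) for w in params)            # ||theta||_2^2
    return ce + lam * l2

probs = np.array([[0.6, 0.1, 0.1, 0.1, 0.1],     # epoch 1: Wake most likely
                  [0.1, 0.7, 0.1, 0.05, 0.05]])  # epoch 2: N1 most likely
y = np.array([0, 1])                             # ground-truth stages
loss = staging_loss(probs, y, params=[np.ones((2, 2))], lam=0.0)
print(round(loss, 4))  # -(ln 0.6 + ln 0.7) / 2, about 0.4338
```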
Example three
As shown in fig. 3, the present invention further provides a sleep staging system for extracting sleep spectrogram features based on GCN, including: the device comprises a signal acquisition unit, a coding processing unit, a modeling unit and a classification unit;
the signal acquisition unit is connected with the coding processing unit and is used for acquiring electroencephalogram signals and processing the electroencephalogram signals to obtain electroencephalogram time-frequency images;
the method for obtaining the electroencephalogram time-frequency image comprises the following steps:
carrying out short-time Fourier transform on the electroencephalogram signal to obtain a power spectrum;
transforming the power spectrum by adopting a logarithmic scaling method to obtain a logarithmic power spectrum; and obtaining the electroencephalogram time-frequency image based on the logarithmic power spectrum.
The encoding processing unit is also connected with the modeling unit and is used for encoding the electroencephalogram time-frequency image to obtain a characteristic vector sequence;
the method for obtaining the characteristic vector sequence comprises the following steps:
The time-frequency image S is processed as a sequence of T spectral columns and encoded with sine and cosine functions; taking row i, column 2j and column (2j+1) as examples:

P(i, 2j) = sin(i / 10000^(2j/F))
P(i, 2j+1) = cos(i / 10000^(2j/F))

where P denotes the encoding matrix and F denotes the total number of frequency bins over all samples in the frequency domain.

Position information is then encoded into the time-frequency image S:

X^(0) = S^T + P_ep

where X^(0) ∈ R^(T×F) denotes the time-frequency image after position-information encoding and P_ep ∈ R^(T×F) denotes the position-encoding matrix.
N_E EpochTransformer encoders are then used for coding:

X^(i) = EpochTransformer(X^(i-1)), X^(i) ∈ R^(T×F), 1 ≤ i ≤ N_E

where X^(0) is the position-encoded image and X^(i) denotes the encoded sequence of spectral columns of X^(i-1).
To reduce the output of all transformers to a compact feature vector representing the epoch, the output sequence (X_1^(N_E), …, X_T^(N_E)) is combined by a weighted attention mechanism to obtain the epoch feature vectors (X_1, …, X_L); the calculation is:

X = Σ_{t=1}^{T} α_t X_t^(N_E)

where X ∈ R^F is the derived feature vector of the input epoch, X_t^(N_E) is the t-th element of the EpochTransformer output sequence, and α_t is the attention weight learned by the softmax layer.
Wherein, the attention weight calculation formula is as follows:
Figure BDA0003700334840000115
a e =tanh(W a X t +b a ),W a ∈R A×F ,b a ∈R A
wherein, W a ∈R A×F And b a ∈R A Respectively learnable weight matrix and bias,a e ∈R A Is the trained epoch context vector. A represents the attention size.
The modeling unit is also connected with the classification unit and is used for modeling the characteristic vector sequence to obtain an output vector sequence;
the method for obtaining the output vector sequence comprises the following steps:
A multi-layer graph convolution network is adopted as the main branch to deeply mine the latent features of the EEG signal, and a bidirectional recurrent neural network is adopted as the auxiliary branch to extract features. The specific method is as follows:
In the multi-layer graph convolution network part, the attention feature vector sequence (X_1, …, X_L) is encoded as an output vector sequence O_G = (O_G1, O_G2, …, O_GL); the steps are as follows:
constructing a multilayer graph convolution network;
The construction method is:

H^(l+1) = σ(D̃^(-1/2) Ã D̃^(-1/2) H^(l) W^(l))

where Ã = A_adj + I_N denotes the adjacency matrix with added self-connections, A_adj is the adjacency matrix of the undirected graph G, I_N is the identity matrix, and W^(l) is a trainable matrix, the superscript (l) indicating that the matrix belongs to layer l. σ(·) is the activation function of the network. H^(l) ∈ R^(N×D) is the activation matrix of layer l; the activation matrix of layer 0 is H^(0) = X.
A two-layer graph convolution network is used to model the attention feature sequence, with the adjacency matrix A_adj given by:

a_ij = 1 when i = j+1 or i = j-1, and a_ij = 0 otherwise,

i.e. adjacent epochs in the sequence are connected.
The adjacency matrix with added self-connections and normalization, D̃^(-1/2) Ã D̃^(-1/2), is computed according to:

Ã = A_adj + I_N, D̃_ii = Σ_j Ã_ij

where D̃ is the diagonal degree matrix of Ã.
The sequence of output vectors O_Gl is calculated according to:

O_G = D̃^(-1/2) Ã D̃^(-1/2) σ(D̃^(-1/2) Ã D̃^(-1/2) X W^(0)) W^(1)

where D̃^(-1/2) Ã D̃^(-1/2) denotes the normalized adjacency matrix with added self-connections, W^(0) ∈ R^(M×T) is the learnable weight matrix from the input layer to the hidden layer with T feature maps, and W^(1) ∈ R^(T×F) is the learnable weight matrix from the hidden layer to the output layer.
All graph convolution layers share network parameters; to obtain a suitable residual combination, a fully-connected layer is used to convert X_l into a vector with the same structure as O_Rl.
In the bidirectional recurrent neural network part, the attention feature vector sequence (X_1, …, X_L) is encoded as an output vector sequence (O_R1, O_R2, …, O_RL), where 1 ≤ l ≤ L; the output sequence O_R is computed from the forward hidden state vectors and backward hidden state vectors.

The method specifically comprises the following steps:
Calculate the forward hidden state vector:

h_l^(f) = RNN(X_l, h_(l-1)^(f))

Calculate the backward hidden state vector:

h_l^(b) = RNN(X_l, h_(l+1)^(b))

Calculate the output sequence:

O_Rl = W_o [h_l^(f) ; h_l^(b)] + b_o

where W_o is a learnable weight matrix, h_l^(f) and h_l^(b) denote the forward and backward hidden state vectors respectively, and b_o is a learnable bias.
The classification unit is used for classifying the output vector sequence to obtain the stage result of the output vector sequence.
The method for obtaining the stage result of the output vector sequence comprises the following steps:
The output vectors of the two branches, the bidirectional recurrent neural network and the multi-layer graph convolution network, are added to obtain the output vector sequence: O_l = O_Rl + O_Gl, 1 ≤ l ≤ L, where L denotes the number of EEG segments.

The output vector sequence O = (O_1, O_2, …, O_L) is input into a shared softmax layer, which effectively reduces the number of parameters in the network, and the sleep-stage probability values ŷ_l corresponding to each sequence element are computed.
Wherein, the signal acquisition unit can adopt a head-wearing electrode electroencephalogram signal acquisition device.
The above-described embodiments are merely illustrative of the preferred embodiments of the present invention, and do not limit the scope of the present invention, and various modifications and improvements of the technical solutions of the present invention can be made by those skilled in the art without departing from the spirit of the present invention, and the technical solutions of the present invention are within the scope of the present invention defined by the claims.

Claims (8)

1. The sleep staging method for extracting the sleep spectrogram characteristics based on the GCN is characterized by comprising the following steps of:
s1, acquiring an electroencephalogram signal, and processing the electroencephalogram signal to obtain an electroencephalogram time-frequency image;
s2, carrying out position coding processing based on the electroencephalogram time-frequency image to obtain a characteristic vector sequence;
s3, modeling based on the characteristic vector sequence to obtain an output vector sequence;
and S4, classifying the output vector sequence to obtain the staging result of the output vector sequence.
2. The sleep staging method for extracting sleep spectrogram features based on a GCN according to claim 1, wherein processing the electroencephalogram signal in S1 comprises:
S11, performing a short-time Fourier transform on the electroencephalogram signal to obtain a power spectrum;
S12, transforming the power spectrum by logarithmic scaling to obtain a logarithmic power spectrum, and obtaining the electroencephalogram time-frequency image from the logarithmic power spectrum.
3. The sleep staging method for extracting sleep spectrogram features based on a GCN according to claim 1, wherein the position-encoding method in S2 comprises: encoding the electroencephalogram signal using a Transformer as the backbone structure in combination with an attention mechanism.
4. The sleep staging method for extracting sleep spectrogram features based on a GCN according to claim 1, wherein the modeling method in S3 comprises: modeling the feature vector sequence using a graph convolution network and a bidirectional recurrent neural network.
5. The sleep staging method for extracting sleep spectrogram features based on a GCN according to claim 4, wherein the classification of the output vector sequence in S4 uses a softmax function.
6. The sleep staging method for extracting sleep spectrogram features based on a GCN according to claim 1, wherein the staging results in S4 correspond to the five stages of sleep: the awake stage, stage N1, stage N2, stage N3, and the rapid-eye-movement stage.
7. A sleep staging system for extracting sleep spectrogram features based on a GCN, characterized by comprising: a signal acquisition unit, an encoding processing unit, a modeling unit, and a classification unit;
the signal acquisition unit is connected with the encoding processing unit and is used for acquiring an electroencephalogram signal and processing it to obtain an electroencephalogram time-frequency image;
the encoding processing unit is further connected with the modeling unit and is used for encoding the electroencephalogram time-frequency image to obtain a feature vector sequence;
the modeling unit is further connected with the classification unit and is used for modeling the feature vector sequence to obtain an output vector sequence;
the classification unit is used for classifying the output vector sequence to obtain its staging result.
8. The sleep staging system for extracting sleep spectrogram features based on a GCN according to claim 7, wherein the signal acquisition unit is a head-mounted electrode electroencephalogram signal acquisition device.
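The front end of the claimed pipeline (claim 2: short-time Fourier transform in S11, logarithmic scaling in S12) can be sketched as below. The sampling rate, window length, and hop size are assumed values for illustration; the patent does not specify them, and the STFT is written out with NumPy rather than a signal-processing library to keep the sketch self-contained.

```python
import numpy as np

def eeg_to_log_spectrogram(x, win=256, hop=128):
    """S11: windowed short-time Fourier transform -> power spectrum;
    S12: logarithmic scaling -> electroencephalogram time-frequency image.
    Window and hop sizes are assumptions, not taken from the patent."""
    window = np.hanning(win)
    n_frames = 1 + (len(x) - win) // hop
    frames = np.stack([x[i * hop : i * hop + win] * window
                       for i in range(n_frames)])
    spectrum = np.fft.rfft(frames, axis=1)   # one-sided spectrum per frame
    power = np.abs(spectrum) ** 2            # S11: power spectrum
    return np.log(power + 1e-10)             # S12: log power spectrum

# Toy 30-second epoch of synthetic "EEG" at an assumed 100 Hz sampling rate
rng = np.random.default_rng(0)
x = rng.standard_normal(3000)
img = eeg_to_log_spectrogram(x)
# img is the time-frequency image that S2 would feed to the
# Transformer-based position encoder to produce the feature vector sequence.
```

The resulting `img` has one row per analysis frame and one column per frequency bin; the small constant added before the logarithm avoids log(0) on empty bins.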
CN202210687782.5A 2022-06-17 2022-06-17 Sleep staging method and system for extracting sleep spectrogram features based on GCN Pending CN115054270A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210687782.5A CN115054270A (en) 2022-06-17 2022-06-17 Sleep staging method and system for extracting sleep spectrogram features based on GCN

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210687782.5A CN115054270A (en) 2022-06-17 2022-06-17 Sleep staging method and system for extracting sleep spectrogram features based on GCN

Publications (1)

Publication Number Publication Date
CN115054270A true CN115054270A (en) 2022-09-16

Family

ID=83201860

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210687782.5A Pending CN115054270A (en) 2022-06-17 2022-06-17 Sleep staging method and system for extracting sleep spectrogram features based on GCN

Country Status (1)

Country Link
CN (1) CN115054270A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115969329A (en) * 2023-02-08 2023-04-18 长春理工大学 Sleep staging method, system, device and medium
CN116898455A (en) * 2023-07-06 2023-10-20 湖北大学 Sleep electroencephalogram signal detection method and system based on deep learning model
CN116898455B (en) * 2023-07-06 2024-04-16 湖北大学 Sleep electroencephalogram signal detection method and system based on deep learning model

Similar Documents

Publication Publication Date Title
Parmar et al. Learning to score olympic events
CN115054270A (en) Sleep staging method and system for extracting sleep spectrogram features based on GCN
CN112766172B (en) Facial continuous expression recognition method based on time sequence attention mechanism
CN114564991B (en) Electroencephalogram signal classification method based on transducer guided convolutional neural network
CN111080032A (en) Load prediction method based on Transformer structure
CN110070895B (en) Mixed sound event detection method based on factor decomposition of supervised variational encoder
CN113423005B (en) Intelligent music generation method and system based on improved neural network
CN116486308A (en) Teaching management system and method based on intelligent education
CN113707331B (en) Traditional Chinese medicine syndrome differentiation data generation method and system
CN111626296A (en) Medical image segmentation system, method and terminal based on deep neural network
Luo et al. Croup and pertussis cough sound classification algorithm based on channel attention and multiscale Mel-spectrogram
CN110853764B (en) Diabetes syndrome prediction system
CN117333497A (en) Mask supervision strategy-based three-dimensional medical image segmentation method for efficient modeling
CN114626424B (en) Data enhancement-based silent speech recognition method and device
CN116570284A (en) Depression recognition method and system based on voice characterization
CN113361505B (en) Non-specific human sign language translation method and system based on contrast decoupling element learning
CN115456025A (en) Electroencephalogram emotion recognition method based on layered attention time domain convolution network
CN113553917A (en) Office equipment identification method based on pulse transfer learning
CN113269702A (en) Low-exposure vein image enhancement method based on cross-scale feature fusion
CN113096070A (en) Image segmentation method based on MA-Unet
CN114847968A (en) Electroencephalogram sleep staging method based on long-term and short-term memory network
Ting et al. Performance analysis of single and combined bit-planes feature extraction for recognition in face expression database
CN116738359B (en) Mongolian multi-mode emotion analysis method based on pre-training model and high-resolution network
CN118094139A (en) Current load decomposition method based on time sequence deep analysis
CN116756571A (en) Behavior recognition model training method based on time-frequency fusion enhancement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination