CN111882036B - Convolutional neural network training method, electroencephalogram signal identification method, device and medium - Google Patents


Info

Publication number
CN111882036B
Authority
CN
China
Prior art keywords
neural network
convolutional neural
electroencephalogram
layer
electroencephalogram signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010710647.9A
Other languages
Chinese (zh)
Other versions
CN111882036A (en)
Inventor
王力
黄伟键
刘彦俊
颜振雄
王友康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou University
Original Assignee
Guangzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou University filed Critical Guangzhou University
Priority to CN202010710647.9A priority Critical patent/CN111882036B/en
Publication of CN111882036A publication Critical patent/CN111882036A/en
Application granted granted Critical
Publication of CN111882036B publication Critical patent/CN111882036B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12 Classification; Matching


Abstract

The invention discloses a convolutional neural network training method, an electroencephalogram signal identification method, a device and a medium. The convolutional neural network trained by the invention is a hybrid network with multiple inputs, multiple convolution scales and multiple convolution types; the sizes of its input convolutional layers and convolution kernels are reasonably designed, giving it high recognition accuracy. The training set used to train the network is obtained by expanding the acquired electroencephalogram signals through time domain and frequency domain data enhancement, which increases the amount of training data, reduces overfitting, effectively suppresses noise interference in the electroencephalogram signals, and improves the recognition effect. The invention is widely applicable in the technical field of signal processing.

Description

Convolutional neural network training method, electroencephalogram signal identification method, device and medium
Technical Field
The invention relates to the technical field of signal processing, in particular to a convolutional neural network training method, an electroencephalogram signal identification method, a device and a medium.
Background
A brain-computer interface converts brain activity into computer control instructions in order to control external equipment, and can be widely applied in fields such as medicine and industrial control. Electroencephalogram (EEG) signals are non-invasive and have high temporal resolution, among other advantages, so they are commonly used as the signal source of a brain-computer interface. Applying an electroencephalogram signal to a brain-computer interface involves identifying which type or characteristic the signal belongs to, so that it can be converted into a computer control instruction. However, electroencephalogram signals are also non-stationary, nonlinear and random; that is, their characteristics change over time, so they are easily disturbed by noise, which hinders both their identification and the application of brain-computer interfaces.
Disclosure of Invention
In view of at least one of the above technical problems, the invention aims to provide a convolutional neural network training method, an electroencephalogram signal identification method, a device and a medium.
In one aspect, an embodiment of the present invention includes a convolutional neural network training method, including:
executing a plurality of acquisition processes; each acquisition process acquires one electroencephalogram signal;
performing time domain data enhancement and frequency domain data enhancement on the electroencephalogram signal;
the convolutional neural network is trained using the electroencephalogram signal that has undergone the time domain data enhancement and frequency domain data enhancement.
Further, the acquiring process includes:
acquiring, through the C3, Cz and C4 channels, electroencephalogram signals generated by a subject while performing motor imagery;
classifying the electroencephalogram signals into left-hand motor imagery electroencephalogram signals or right-hand motor imagery electroencephalogram signals;
classifying and marking the electroencephalogram signals;
classifying the electroencephalogram signals into a training set or a testing set; the training set is used for training the convolutional neural network, and the test set is used for testing the convolutional neural network.
Further, the convolutional neural network training method further comprises the following steps:
screening out abnormal values of the electroencephalogram signals;
and performing first band-pass filtering on the electroencephalogram signals.
Further, the time domain data enhancement includes:
decomposing the electroencephalogram signal into data fragments in a time domain;
performing the exchange of the data segments between the electroencephalogram signals acquired in each acquisition process; the data segments that are swapped have the same time domain position;
and performing second band-pass filtering on the electroencephalogram signals.
Further, the frequency domain data enhancement includes:
performing third band-pass filtering on the electroencephalogram signal subjected to the time domain data enhancement; the third band-pass filtering has a plurality of pass bands, and its result is the frequency components of the electroencephalogram signals;
performing a swap of the frequency components between the electroencephalogram signals acquired in each acquisition process; the frequency components that are swapped have the same frequency domain position.
Further, the time domain data enhancement is stopped when the intensity of the electroencephalogram signal reaches a preset threshold value;
and the frequency domain data enhancement is stopped when all the electroencephalogram signals have been processed at least once.
Further, the convolutional neural network comprises an input layer, a time convolutional layer, a depth convolutional layer, a first pooling layer, a separable convolutional layer, a second pooling layer, a full-connection layer and an output layer which are connected in sequence;
the input layer is used for receiving the electroencephalogram signals;
the time convolution layer is used for extracting time characteristic information from the electroencephalogram signals;
the depth convolution layer is used for extracting space characteristic information from the electroencephalogram signals;
the separable convolution layer is used for extracting frequency characteristic information from the electroencephalogram signals;
the first pooling layer and the second pooling layer are used for compressing and simplifying the time characteristic information, the space characteristic information and the frequency characteristic information;
the full connection layer is used for fusing the output result of the second pooling layer;
and the output layer is used for carrying out classified output according to the fusion result of the full-connection layer.
In another aspect, an embodiment of the present invention also includes an electroencephalogram signal identification method, comprising the following steps:
acquiring an electroencephalogram signal to be processed;
inputting the electroencephalogram signal to be processed into a convolutional neural network; the convolutional neural network is trained by the training method in the embodiment;
obtaining an output result of the convolutional neural network; the output result of the convolutional neural network comprises the type of the electroencephalogram signal.
In another aspect, embodiments of the present invention also include a computer apparatus comprising a memory for storing at least one program and a processor for loading the at least one program to perform the method of the embodiments.
In another aspect, embodiments of the present invention also include a storage medium having stored therein processor-executable instructions which, when executed by a processor, are adapted to carry out the method of the embodiments.
The beneficial effects of the invention are as follows: the convolutional neural network trained in the embodiments is a hybrid network with multiple inputs, multiple convolution scales and multiple convolution types; the sizes of its input convolutional layers and convolution kernels are reasonably designed, giving it high recognition accuracy. The training set used to train the network is obtained by expanding the acquired electroencephalogram signals through time domain and frequency domain data enhancement, which increases the amount of training data, reduces overfitting, effectively suppresses noise interference in the electroencephalogram signals, and improves the recognition effect.
Drawings
FIG. 1 is a flow chart of a convolutional neural network training method in an embodiment;
FIG. 2 is a schematic diagram of stimulating a subject to produce an EEG signal in an embodiment;
FIG. 3 is a schematic diagram showing the distribution of electrodes of an electroencephalogram signal acquisition apparatus used in the example;
FIG. 4 is a schematic diagram of a working sequence for acquiring an EEG signal in an embodiment;
FIG. 5 is a schematic diagram of time domain data enhancement in an embodiment;
fig. 6 is a schematic diagram of frequency domain data enhancement in an embodiment.
Detailed Description
In this embodiment, referring to fig. 1, the convolutional neural network training method includes the following steps:
p1. executing a plurality of acquisition processes; each acquisition process is used for acquiring an electroencephalogram signal respectively;
p2. performing time domain data enhancement and frequency domain data enhancement on the electroencephalogram signals;
p3. training the convolutional neural network using the electroencephalogram signals that have undergone the time domain data enhancement and frequency domain data enhancement.
In this embodiment, each time the acquisition process in step P1 is executed, the subject is asked to imagine a certain type of action so that the subject's brain generates an electroencephalogram signal, which is acquired by an electroencephalogram acquisition apparatus with C3, Cz and C4 channels. The type of action imagined during each acquisition may be required to be the same, which reduces interference caused by different types of imagined actions. Each execution of the acquisition process thus corresponds to one act of motor imagery and one signal acquisition, and executing the process multiple times yields multiple electroencephalogram signals.
Specifically, in this embodiment, the step P1 specifically includes the following substeps:
p101. collecting, through the C3, Cz and C4 channels, electroencephalogram signals generated by a subject while performing motor imagery;
p102. classifying the brain electrical signal into a left-hand motor imagery brain electrical signal or a right-hand motor imagery brain electrical signal;
p103. classifying and marking the electroencephalogram signals;
p104. classifying the electroencephalogram signals into a training set or a test set; the training set is used for training the convolutional neural network, and the test set is used for testing the convolutional neural network.
In this embodiment, steps P101-P104 may be performed during each acquisition process. When step P101 is performed, the display device shown in fig. 2 presents image and sound prompts, including a left-hand motor imagery prompt and a right-hand motor imagery prompt, to stimulate the subject to perform motor imagery and generate an electroencephalogram signal. The electrode distribution of the electroencephalogram signal acquisition apparatus is shown in fig. 3; the C3, Cz and C4 channels in fig. 3 are used to acquire the electroencephalogram signal when step P101 is performed.
When step P101 is performed, referring to fig. 4, timing starts after the subject is instructed to perform motor imagery: seconds 0-3 are a preparation phase in which the subject gets ready; seconds 3-7 are an imagery phase in which the subject performs motor imagery; seconds 7-8 are an idle phase in which the system can execute steps P102-P104. The total time for each acquisition process in step P1 is therefore 8 seconds.
In this embodiment, by executing steps P102 and P103, the acquired brain electrical signal is marked as a left-hand motor imagery brain electrical signal or a right-hand motor imagery brain electrical signal. In performing step P104, the electroencephalogram signals are categorized by:
1. if n acquisition processes are performed, n electroencephalogram signals numbered 1, 2, 3, …, n are obtained; take the signal numbered 1 as the test set and the other signals as the training set;
2. take the signal numbered 2 as the test set and the other signals as the training set;
……
n. take the signal numbered n as the test set and the other signals as the training set.
Through the above process, n pairs of training and test sets are obtained; the training sets are used to train the convolutional neural network and the test sets to test it. In this embodiment, the electroencephalogram signals in the training set additionally undergo the time domain data enhancement and frequency domain data enhancement of step P2 before being used for training.
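The leave-one-out grouping above can be sketched as follows; the function name `leave_one_out_splits` and the trial-id representation are illustrative, not names from the patent.

```python
def leave_one_out_splits(trial_ids):
    """Yield one (train_ids, test_ids) pair per recorded trial: each trial
    serves once as the test set while the remaining n-1 trials train."""
    splits = []
    for held_out in trial_ids:
        train = [t for t in trial_ids if t != held_out]
        splits.append((train, [held_out]))
    return splits

# With three acquisitions, three train/test groupings are produced.
print(leave_one_out_splits([1, 2, 3]))
```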
In this embodiment, the electroencephalogram signal obtained in step P1 may also be preprocessed. The pretreatment process comprises the following steps:
A1. screening out abnormal values of the electroencephalogram signals;
A2. and performing first band-pass filtering on the electroencephalogram signals.
In this embodiment, where each acquisition process lasts 8 seconds in total, the electroencephalogram signal from each acquisition may be cropped: only the portion from second 3.5 to second 7 of each signal is retained, and the other portions are deleted. In terms of noise distribution, the 3.5-7 s portion contains less noise, while the other portions contain more noise and therefore more abnormal values, so cropping accomplishes the abnormal-value screening of step A1.
When step A2 is executed, first band-pass filtering with a pass band of 2-35 Hz may be applied to the electroencephalogram signal from each acquisition process. After the first band-pass filtering, the signal from each acquisition is arranged in a 3×875 format (3 channels × 875 sampling points).
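A minimal sketch of this preprocessing, assuming a 250 Hz sampling rate (implied by 875 samples over the retained 3.5 s) and using a simple FFT mask as a stand-in for the patent's unspecified band-pass filter design:

```python
import numpy as np

FS = 250  # sampling rate in Hz; an assumption implied by 875 samples / 3.5 s

def preprocess(trial, fs=FS):
    """Keep the 3.5-7 s window of a (channels, samples) trial and band-pass
    it to 2-35 Hz with an FFT mask (illustrative stand-in for the first
    band-pass filtering of step A2)."""
    start, stop = int(3.5 * fs), int(7.0 * fs)
    kept = trial[:, start:stop]                 # noisy edge portions removed
    spec = np.fft.rfft(kept, axis=1)
    freqs = np.fft.rfftfreq(kept.shape[1], d=1.0 / fs)
    spec[:, (freqs < 2) | (freqs > 35)] = 0     # 2-35 Hz pass band
    return np.fft.irfft(spec, n=kept.shape[1], axis=1)

eeg = np.random.randn(3, 8 * FS)   # one 8-second, 3-channel trial
x = preprocess(eeg)
print(x.shape)  # (3, 875)
```

The resulting 3×875 array matches the format described above.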
In this embodiment, the time domain data enhancement in the step P2 includes:
p201. decomposing the electroencephalogram signal into data fragments in the time domain;
p202. performing a swap of the data segments between the electroencephalogram signals acquired in each of the acquisition processes; the data segments that are swapped have the same time domain position;
p203. second bandpass filtering the electroencephalogram signal.
The principle of steps P201-P203 is illustrated in fig. 5. In this embodiment, electroencephalogram signals 1, 2 and 3 are obtained from three acquisition processes, and each is divided into three data segments in the time domain. With signals in the 3×875 format, the segment sizes are 3×291, 3×292 and 3×292. As fig. 5 shows, only segments with the same time domain position are swapped: for example, the last segment of signal 1 is swapped with the last segment of signal 2, and the middle segment of signal 2 is swapped with the middle segment of signal 3. After the swapping, signals 1, 2 and 3 undergo second band-pass filtering with a pass band of 2-35 Hz.
Each execution of steps P201-P203 enhances the electroencephalogram data. In this embodiment, steps P201-P203 are repeated until the data have been enhanced to 3 times the original amount, after which they are no longer executed.
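The segment swap of steps P201-P202 (before the second band-pass filtering) might look like the sketch below. `swap_segments` is a hypothetical helper; note that `np.array_split` yields segment widths 292/292/291 rather than the 291/292/292 of the example above, an inessential boundary difference.

```python
import numpy as np

def swap_segments(a, b, seg_idx, n_segs=3):
    """Return copies of trials a and b (channels x samples) with the
    seg_idx-th time segment exchanged; swapped segments occupy the same
    time domain position in both trials, as steps P201-P202 require."""
    widths = [s.shape[1] for s in np.array_split(a, n_segs, axis=1)]
    bounds = np.cumsum(widths)
    starts = np.concatenate(([0], bounds[:-1]))
    s, e = starts[seg_idx], bounds[seg_idx]
    a2, b2 = a.copy(), b.copy()
    a2[:, s:e], b2[:, s:e] = b[:, s:e].copy(), a[:, s:e].copy()
    return a2, b2
```

For a 3×875 trial, segment index 2 covers samples 584-874, so only that span changes hands.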
In this embodiment, after performing one or more steps P201 to P203, frequency domain data enhancement is performed on the electroencephalogram data subjected to the time domain data enhancement.
In this embodiment, the frequency domain data enhancement in the step P2 includes:
p204. performing third band-pass filtering on the electroencephalogram signal subjected to the time domain data enhancement; the third band-pass filtering has a plurality of pass bands, and its result is the frequency components of the electroencephalogram signals;
p205. performing a swap of the frequency components between the electroencephalogram signals acquired in each of the acquisition processes; the frequency components that are swapped have the same frequency domain position.
The principle of steps P204-P205 is illustrated in fig. 6. In this embodiment, electroencephalogram signals 1, 2 and 3 are obtained from three acquisition processes. Third band-pass filtering is applied to two of the signals at a time; as shown in fig. 6, signals 1 and 2 are each filtered with three pass bands: 4-7 Hz (theta rhythm), 8-13 Hz (mu rhythm) and 13-32 Hz (beta rhythm). Filtering signal 1 yields its frequency components at 4-7 Hz, 8-13 Hz and 13-32 Hz, and filtering signal 2 yields the corresponding components of signal 2.
When step P205 is performed, only frequency components with the same frequency domain position are swapped; for example, referring to fig. 6, the 13-32 Hz component of signal 1 is swapped with the 13-32 Hz component of signal 2, completing the frequency domain enhancement between signals 1 and 2.
In this embodiment, steps P204-P205 may be repeated, pairing different electroencephalogram signals each time and swapping their frequency components, until every pair of signals has been processed, after which steps P204-P205 are no longer executed. For example, with signals 1, 2 and 3, frequency components are swapped between signals 1 and 2, between signals 1 and 3, and between signals 2 and 3.
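The frequency-component exchange of steps P204-P205 can be sketched with an FFT mask standing in for the third band-pass filter; the function name `swap_band` and the 250 Hz sampling rate are assumptions.

```python
import numpy as np

BANDS = [(4, 7), (8, 13), (13, 32)]   # theta, mu, beta pass bands from the text

def swap_band(a, b, band, fs=250):
    """Exchange one frequency band between trials a and b (channels x samples).
    The swapped components occupy the same frequency domain position, as
    step P205 requires."""
    lo, hi = band
    n = a.shape[1]
    fa, fb = np.fft.rfft(a, axis=1), np.fft.rfft(b, axis=1)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    m = (freqs >= lo) & (freqs <= hi)
    fa[:, m], fb[:, m] = fb[:, m].copy(), fa[:, m].copy()
    return np.fft.irfft(fa, n=n, axis=1), np.fft.irfft(fb, n=n, axis=1)
```

Swapping the 8-13 Hz (mu) band between two trials, for example, moves any mu-rhythm content from one trial into the other while leaving the theta and beta bands untouched.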
The time domain and frequency domain data enhancement realized by steps P201-P205 expands the data volume beyond the originally acquired electroencephalogram signals, which benefits the training of the convolutional neural network.
In this embodiment, the electroencephalogram data obtained from each acquisition process has a size of 3×3×875, and the convolutional neural network used has the structure shown in Table 1.
TABLE 1
In this embodiment, the convolutional neural network includes an input layer, a time convolutional layer, a depth convolutional layer, a first pooling layer, a separable convolutional layer, a second pooling layer, a full-connection layer, and an output layer that are sequentially connected.
In this embodiment, when the convolutional neural network is trained using an electroencephalogram signal, the convolutional neural network receives an input electroencephalogram signal and transmits the electroencephalogram signal to the time convolutional layer.
The temporal convolution layer has 3 different convolution kernels, of sizes 1×85, 1×65 and 1×45, with a convolution stride of 1×3 and an output-space dimension of 10. It outputs 3 parallel feature maps, with dimensions 3×264×10, 3×271×10 and 3×277×10 respectively, which contain the temporal feature information of the electroencephalogram signal.
The feature maps output by the temporal convolution layer are input to the depth (depthwise) convolution layer, which has 3 identical convolution kernels of size 3×1, a stride of 1×1, and an output-space dimension (depth multiplier) of 3. It outputs 3 parallel feature maps, with dimensions 1×264×30, 1×271×30 and 1×277×30 respectively, which contain the spatial feature information of the electroencephalogram signal.
The feature maps output by the depthwise convolution layer are input to the first pooling layer, which uses 3 identical 1×6 windows with a stride of 1×3 and outputs 3 parallel feature maps with dimensions 1×87×30, 1×89×30 and 1×91×30 respectively. The first pooling layer compresses and simplifies the feature maps output by the depthwise convolution layer.
The feature maps output by the first pooling layer are input to the separable convolution layer, which has 3 identical convolution kernels of size 1×8, a stride of 1×1, and an output-space dimension of 30; its 3 parallel output feature maps keep the dimensions 1×87×30, 1×89×30 and 1×91×30 (the unchanged widths imply 'same' padding). These feature maps contain the frequency feature information of the electroencephalogram signal.
The feature maps output by the separable convolution layer are input to the second pooling layer, which uses 3 identical 1×6 windows with a stride of 1×3 and outputs 3 parallel feature maps with dimensions 1×28×30, 1×28×30 and 1×29×30 respectively. The second pooling layer compresses and simplifies the separable convolution output.
The outputs of the second pooling layer are fused by the fully connected layer and passed through a two-class softmax layer to obtain the identification result.
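The feature-map widths quoted above can be checked with the standard convolution output-size formula; the arithmetic also shows why the second pooling layer's three maps come out 28, 28 and 29 samples wide, and that the separable layer's unchanged widths require 'same' padding.

```python
def conv_out(width, kernel, stride, padding=0):
    """Output width of a convolution or pooling step: (W + 2P - K) // S + 1."""
    return (width + 2 * padding - kernel) // stride + 1

# One branch per temporal-convolution kernel (85, 65, 45), stride 3, on 875 samples.
widths = [conv_out(875, k, 3) for k in (85, 65, 45)]
print(widths)    # [264, 271, 277]  -> the 3x264, 3x271, 3x277 maps above

pooled = [conv_out(w, 6, 3) for w in widths]     # first pooling, 1x6 window, 1x3 stride
print(pooled)    # [87, 89, 91]

pooled2 = [conv_out(w, 6, 3) for w in pooled]    # second pooling, same window/stride
print(pooled2)   # [28, 28, 29]
```

The depthwise (3×1) and separable (1×8, 'same'-padded) layers leave these widths unchanged, so only the strided layers shrink the maps.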
In this embodiment, labels may be set for the electroencephalogram data, and the end-of-training condition is determined from the distance between the recognition result output by the convolutional neural network and the label of the input data: for example, training ends when that distance falls below a preset threshold or when the number of training iterations reaches a predetermined value.
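The end-of-training condition described above reduces to a small predicate; the threshold and iteration cap below are illustrative assumptions, not values from the patent.

```python
def training_finished(distance, iteration, threshold=0.05, max_iterations=500):
    """True when the prediction-label distance drops below a preset
    threshold or the training count reaches a predetermined value."""
    return distance < threshold or iteration >= max_iterations
```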
In this embodiment, the convolutional neural network used is a multi-input, multi-convolution-scale, multi-convolution type hybrid convolutional neural network, and the sizes of the multi-input convolutional layer and the convolutional kernel are reasonably designed, so that the convolutional neural network has high recognition accuracy. The training set used for training the convolutional neural network is obtained by carrying out time domain data enhancement and frequency domain data enhancement expansion based on the acquired brain electrical signals, so that the training data volume of the convolutional neural network can be increased, the overfitting phenomenon is reduced, the noise interference in the brain electrical signals can be effectively reduced, and the recognition effect is improved.
In this embodiment, based on the convolutional neural network obtained by training by the training method, an electroencephalogram signal identification method may be executed, including the following steps:
s1, acquiring an electroencephalogram signal to be processed;
s2, inputting the brain electrical signals to be processed into a convolutional neural network;
s3, obtaining an output result of the convolutional neural network.
If the electroencephalogram signals in the training set were labeled to distinguish their types when the convolutional neural network was trained, then after steps S1-S3 are executed, the type of the electroencephalogram signal to be processed can be determined from the output result of the convolutional neural network.
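Steps S1-S3 end in a label lookup on the network's two-class softmax output; the label ordering below is an assumption, since the patent only states that signals are marked as left-hand or right-hand motor imagery.

```python
import numpy as np

# Hypothetical label order matching the two softmax outputs.
LABELS = ["left-hand motor imagery", "right-hand motor imagery"]

def identify(softmax_output):
    """Map the two-class softmax output of the trained network to the
    label set during training (step S3)."""
    return LABELS[int(np.argmax(softmax_output))]

print(identify([0.2, 0.8]))  # right-hand motor imagery
```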
The convolutional neural network trained through steps P1-P3 has high recognition accuracy, exhibits little overfitting, effectively suppresses noise interference in the electroencephalogram signals to be processed, and improves the recognition effect.
In this embodiment, a computer apparatus includes a memory and a processor, where the memory is configured to store at least one program, and the processor is configured to load the at least one program to execute the convolutional neural network training method or the electroencephalogram signal recognition method in the embodiment, so as to achieve the same technical effects as described in the embodiment.
In this embodiment, a storage medium has stored therein processor-executable instructions that, when executed by a processor, are used to perform the convolutional neural network training method or the electroencephalogram signal recognition method in the embodiment, achieving the same technical effects as described in the embodiment.
It should be noted that, unless otherwise specified, when a feature is referred to as being "fixed" or "connected" to another feature, it may be directly or indirectly fixed or connected to the other feature. Further, the descriptions of the upper, lower, left, right, etc. used in this disclosure are merely with respect to the mutual positional relationship of the various components of this disclosure in the drawings. As used in this disclosure, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In addition, unless defined otherwise, all technical and scientific terms used in this example have the same meaning as commonly understood by one of ordinary skill in the art. The terminology used in the description of the embodiments is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The term "and/or" as used in this embodiment includes any combination of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this disclosure to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element of the same type from another. For example, a first element could also be termed a second element, and, similarly, a second element could also be termed a first element, without departing from the scope of the present disclosure. The use of any and all examples, or exemplary language (e.g., "such as") provided herein, is intended merely to better illuminate embodiments of the invention and does not pose a limitation on the scope of the invention unless otherwise claimed.
It should be appreciated that embodiments of the invention may be implemented or realized by computer hardware, a combination of hardware and software, or by computer instructions stored in a non-transitory computer readable memory. The methods may be implemented in a computer program using standard programming techniques, including a non-transitory computer readable storage medium configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner, in accordance with the methods and drawings described in the specific embodiments. Each program may be implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language. Furthermore, the program can be run on a programmed application specific integrated circuit for this purpose.
Furthermore, the operations of the processes described in the present embodiments may be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The processes (or variations and/or combinations thereof) described in this embodiment may be performed under control of one or more computer systems configured with executable instructions, and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications), by hardware, or combinations thereof, that collectively execute on one or more processors. The computer program includes a plurality of instructions executable by one or more processors.
Further, the method may be implemented in any type of computing platform, including but not limited to a personal computer, mini-computer, mainframe, workstation, networked or distributed computing environment, or a separate or integrated computer platform, or a platform in communication with a charged particle tool or other imaging device, and so forth. Aspects of the invention may be implemented in machine-readable code stored on a non-transitory storage medium or device, whether removable or integrated into a computing platform, such as a hard disk, an optically read and/or written storage medium, RAM, ROM, etc., such that the code is readable by a programmable computer and, when read, configures and operates the computer to perform the processes described herein. Further, the machine-readable code, or portions thereof, may be transmitted over a wired or wireless network. The invention described in this embodiment includes these and other different types of non-transitory computer-readable storage media when such media include instructions or programs that, in conjunction with a microprocessor or other data processor, implement the steps described above. The invention also includes the computer itself when programmed according to the methods and techniques described herein.
The computer program can be applied to the input data to perform the functions described in this embodiment, thereby converting the input data to generate output data that is stored to the non-volatile memory. The output information may also be applied to one or more output devices such as a display. In a preferred embodiment of the invention, the transformed data represents physical and tangible objects, including specific visual depictions of physical and tangible objects produced on a display.
The present invention is not limited to the above embodiments; modifications, equivalent substitutions and improvements made by the same means that achieve the technical effects of the present invention all fall within its spirit and principle. Various modifications and variations of the technical solution and/or the embodiments are possible within the scope of the invention.

Claims (7)

1. A convolutional neural network training method, comprising:
executing a plurality of acquisition processes, each acquisition process being used to acquire a respective electroencephalogram signal;
performing time domain data enhancement and frequency domain data enhancement on the electroencephalogram signal;
training the convolutional neural network using the electroencephalogram signal subjected to the time domain data enhancement and frequency domain data enhancement;
the time domain data enhancement includes:
decomposing the electroencephalogram signal into data fragments in a time domain;
exchanging the data segments between the electroencephalogram signals acquired in the respective acquisition processes, the exchanged data segments having the same time-domain position;
performing second band-pass filtering on the electroencephalogram signals;
the frequency domain data enhancement includes:
performing third band-pass filtering on the electroencephalogram signal subjected to the time domain data enhancement, the third band-pass filter having a plurality of pass bands, the result of the third band-pass filtering being the frequency components of the electroencephalogram signal;
exchanging the frequency components between the electroencephalogram signals acquired in the respective acquisition processes, the exchanged frequency components having the same frequency-domain position;
the convolutional neural network comprises an input layer, a time convolutional layer, a depth convolutional layer, a first pooling layer, a separable convolutional layer, a second pooling layer, a full-connection layer and an output layer which are connected in sequence;
the input layer is used for receiving the electroencephalogram signals;
the time convolution layer is used for extracting time characteristic information from the electroencephalogram signals;
the depth convolution layer is used for extracting space characteristic information from the electroencephalogram signals;
the separable convolution layer is used for extracting frequency characteristic information from the electroencephalogram signals;
the first pooling layer and the second pooling layer are used for compressing and simplifying the time characteristic information, the space characteristic information and the frequency characteristic information;
the full connection layer is used for fusing the output result of the second pooling layer;
and the output layer is used for carrying out classified output according to the fusion result of the full-connection layer.
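The time-domain segment swap and frequency-domain component swap recited in claim 1 can be sketched in numpy. The function names, segment count, and band edges below are illustrative assumptions; the claim does not fix them numerically:

```python
import numpy as np

def swap_time_segments(trial_a, trial_b, n_segments=4):
    """Exchange same-position time-domain segments between two trials (time-domain enhancement)."""
    segs_a = np.array_split(trial_a, n_segments, axis=-1)
    segs_b = np.array_split(trial_b, n_segments, axis=-1)
    for i in range(0, n_segments, 2):  # swap every other segment; positions stay aligned
        segs_a[i], segs_b[i] = segs_b[i], segs_a[i]
    return np.concatenate(segs_a, axis=-1), np.concatenate(segs_b, axis=-1)

def swap_frequency_bands(trial_a, trial_b, fs, bands=((8.0, 12.0), (12.0, 30.0))):
    """Exchange same-position frequency components between two trials (frequency-domain enhancement)."""
    spec_a, spec_b = np.fft.rfft(trial_a), np.fft.rfft(trial_b)
    freqs = np.fft.rfftfreq(trial_a.shape[-1], d=1.0 / fs)
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        # boolean indexing returns copies, so the tuple swap is safe
        spec_a[..., mask], spec_b[..., mask] = spec_b[..., mask], spec_a[..., mask]
    return (np.fft.irfft(spec_a, trial_a.shape[-1]),
            np.fft.irfft(spec_b, trial_b.shape[-1]))
```

Applying either swap twice with the same partition restores the original trials, which is a convenient sanity check that the exchanged pieces really occupy the same time- or frequency-domain positions.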
2. The convolutional neural network training method of claim 1, wherein the acquiring process comprises:
acquiring, through the C3, Cz and C4 channels, electroencephalogram signals generated by a subject performing motor imagery;
classifying the electroencephalogram signals into left-hand motor imagery electroencephalogram signals or right-hand motor imagery electroencephalogram signals;
classifying and marking the electroencephalogram signals;
classifying the electroencephalogram signals into a training set or a testing set; the training set is used for training the convolutional neural network, and the test set is used for testing the convolutional neural network.
3. The convolutional neural network training method of claim 1, further comprising:
screening out abnormal values of the electroencephalogram signals;
and performing first band-pass filtering on the electroencephalogram signals.
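Claim 3 does not name a filter design or an outlier criterion. As an illustrative stand-in, the preprocessing can be sketched with a peak-amplitude screen and a plain FFT-mask band-pass in numpy (a practical implementation would more likely use an IIR or FIR design such as a Butterworth filter):

```python
import numpy as np

def screen_outliers(trials, max_abs=100.0):
    """Drop trials whose peak amplitude exceeds a threshold (simple outlier screen; threshold is an assumption)."""
    return [trial for trial in trials if np.max(np.abs(trial)) <= max_abs]

def bandpass_fft(signal, fs, low_hz, high_hz):
    """Zero spectral bins outside [low_hz, high_hz]: a crude FFT-mask band-pass."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.shape[-1], d=1.0 / fs)
    spectrum[..., (freqs < low_hz) | (freqs > high_hz)] = 0.0
    return np.fft.irfft(spectrum, signal.shape[-1])
```

For a test tone whose frequencies fall exactly on FFT bins, the mask removes the out-of-band component and leaves the in-band component intact.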
4. The convolutional neural network training method of claim 1, wherein:
stopping the time domain data enhancement when the intensity of the electroencephalogram signal reaches a preset threshold value;
and stopping the time domain data enhancement when all the electroencephalogram signals have each been processed at least once.
5. An electroencephalogram signal identification method is characterized by comprising the following steps:
acquiring an electroencephalogram signal to be processed;
inputting the electroencephalogram signal to be processed into a convolutional neural network, the convolutional neural network being trained by the training method according to any one of claims 1-4;
obtaining an output result of the convolutional neural network; the output result of the convolutional neural network comprises the type of the electroencephalogram signal.
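The layer stack recited in claim 1 (temporal, depthwise and separable convolutions with two pooling stages) mirrors EEGNet-style compact CNNs for motor-imagery EEG. The shape flow below is a sketch under assumed hyperparameters; the filter counts (F1, D, F2), kernel sizes and pooling sizes are not fixed by the claims:

```python
# Forward shape flow through the claimed layer stack (all numbers illustrative).
C, T = 3, 256                    # electrodes (C3, Cz, C4) x time samples per trial
F1, D, F2 = 8, 2, 16             # temporal filters, depth multiplier, separable filters
P1, P2 = 4, 8                    # pooling kernel sizes
n_classes = 2                    # left- vs right-hand motor imagery (claim 2)

shape = (1, C, T)                # input layer receives the EEG trial
shape = (F1, C, T)               # temporal convolution ('same' padding): time features
shape = (F1 * D, 1, T)           # depthwise convolution spanning all C electrodes: spatial features
shape = (F1 * D, 1, T // P1)     # first pooling layer compresses the time axis
shape = (F2, 1, T // P1)         # separable convolution ('same' padding): frequency features
shape = (F2, 1, T // (P1 * P2))  # second pooling layer compresses further
flat = shape[0] * shape[1] * shape[2]  # flattened input to the fully connected layer
logits = n_classes               # output layer: one score per electroencephalogram class
```

With these assumed sizes, the fully connected layer fuses a 128-dimensional feature vector (16 x 1 x 8) into two class scores.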
6. A computer device comprising a memory for storing at least one program and a processor for loading the at least one program to perform the method of any of claims 1-5.
7. A storage medium having stored therein processor executable instructions which, when executed by a processor, are for performing the method of any of claims 1-5.
CN202010710647.9A 2020-07-22 2020-07-22 Convolutional neural network training method, electroencephalogram signal identification method, device and medium Active CN111882036B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010710647.9A CN111882036B (en) 2020-07-22 2020-07-22 Convolutional neural network training method, electroencephalogram signal identification method, device and medium

Publications (2)

Publication Number Publication Date
CN111882036A CN111882036A (en) 2020-11-03
CN111882036B true CN111882036B (en) 2023-10-31

Family

ID=73155193

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010710647.9A Active CN111882036B (en) 2020-07-22 2020-07-22 Convolutional neural network training method, electroencephalogram signal identification method, device and medium

Country Status (1)

Country Link
CN (1) CN111882036B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112370017B (en) * 2020-11-09 2022-03-18 腾讯科技(深圳)有限公司 Training method and device of electroencephalogram classification model and electronic equipment
CN114942410B (en) * 2022-05-31 2022-12-20 哈尔滨工业大学 Interference signal identification method based on data amplification

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109299751A (en) * 2018-11-26 2019-02-01 南开大学 The SSVEP brain electricity classification method of convolutional Neural model based on the enhancing of EMD data
CN109711383A (en) * 2019-01-07 2019-05-03 重庆邮电大学 Convolutional neural networks Mental imagery EEG signal identification method based on time-frequency domain
CN109784242A (en) * 2018-12-31 2019-05-21 陕西师范大学 EEG Noise Cancellation based on one-dimensional residual error convolutional neural networks
CN110059565A (en) * 2019-03-20 2019-07-26 杭州电子科技大学 A kind of P300 EEG signal identification method based on improvement convolutional neural networks
CN110069958A (en) * 2018-01-22 2019-07-30 北京航空航天大学 A kind of EEG signals method for quickly identifying of dense depth convolutional neural networks
CN110163180A (en) * 2019-05-29 2019-08-23 长春思帕德科技有限公司 Mental imagery eeg data classification method and system
CN110263606A (en) * 2018-08-30 2019-09-20 周军 Scalp brain electrical feature based on end-to-end convolutional neural networks extracts classification method
CN110353702A (en) * 2019-07-02 2019-10-22 华南理工大学 A kind of emotion identification method and system based on shallow-layer convolutional neural networks
CN110765920A (en) * 2019-10-18 2020-02-07 西安电子科技大学 Motor imagery classification method based on convolutional neural network
CN110929581A (en) * 2019-10-25 2020-03-27 重庆邮电大学 Electroencephalogram signal identification method based on space-time feature weighted convolutional neural network
CN111012336A (en) * 2019-12-06 2020-04-17 重庆邮电大学 Parallel convolutional network motor imagery electroencephalogram classification method based on spatio-temporal feature fusion


Similar Documents

Publication Publication Date Title
CN111317468B (en) Electroencephalogram signal classification method, electroencephalogram signal classification device, computer equipment and storage medium
CN110353675B (en) Electroencephalogram signal emotion recognition method and device based on picture generation
CN111882036B (en) Convolutional neural network training method, electroencephalogram signal identification method, device and medium
CN111329474B (en) Electroencephalogram identity recognition method and system based on deep learning and information updating method
CN107656612B (en) Large instruction set brain-computer interface method based on P300-SSVEP
CN105266804B (en) A kind of brain-electrical signal processing method based on low-rank and sparse matrix decomposition
CN109965871B (en) Method, system, medium, and apparatus for analyzing brain-computer interface signal
CN112022153B (en) Electroencephalogram signal detection method based on convolutional neural network
CN111783942A (en) Brain cognition process simulation method based on convolution cyclic neural network
Caramia et al. Optimizing spatial filter pairs for EEG classification based on phase-synchronization
CN108334766A (en) Electronic device, unlocking method and related product
CN111671420A (en) Method for extracting features from resting electroencephalogram data and terminal equipment
CN115414041A (en) Autism assessment device, method, terminal device and medium based on electroencephalogram data
EP3955177B1 (en) Search method and information processing system
CN109492602B (en) Process timing method and system based on human body language
CN114366101B (en) Motor imagery electroencephalogram signal classification method, device, equipment and storage medium
CN115392287A (en) Electroencephalogram signal online self-adaptive classification method based on self-supervision learning
CN114795247A (en) Electroencephalogram signal analysis method and device, electronic equipment and storage medium
CN113360876A (en) SSVEP-based identity recognition method and device, electronic device and storage medium
Gatti et al. Prediction of hand movement speed and force from single-trial eeg with convolutional neural networks
CN108542383B (en) Electroencephalogram signal identification method, system, medium and equipment based on motor imagery
CN112450946A (en) Electroencephalogram artifact restoration method based on loop generation countermeasure network
Wang et al. Residual learning attention cnn for motion intention recognition based on eeg data
CN112998724B (en) Electro-oculogram artifact removing method and device and electronic equipment
CN118013352B (en) EEG-fNIRS motor imagery identification method and device based on heterogram network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant