CN106821681A - Upper-limb exoskeleton control method and system based on motor imagery - Google Patents
- Publication number
- CN106821681A CN106821681A CN201710107353.5A CN201710107353A CN106821681A CN 106821681 A CN106821681 A CN 106821681A CN 201710107353 A CN201710107353 A CN 201710107353A CN 106821681 A CN106821681 A CN 106821681A
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H1/00—Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
- A61H1/02—Stretching or bending or torsioning apparatus for exercising
- A61H1/0274—Stretching or bending or torsioning apparatus for exercising for the upper limbs
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7225—Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/725—Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/12—Driving means
- A61H2201/1207—Driving means with electric or magnetic drive
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/14—Special force transmission means, i.e. between the driving means and the interface with the user
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/16—Physical interface with patient
- A61H2201/1602—Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
- A61H2201/1635—Hand or arm, e.g. handle
- A61H2201/1638—Holding means therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/16—Physical interface with patient
- A61H2201/1657—Movement of interface, i.e. force application means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2205/00—Devices for specific parts of the body
- A61H2205/06—Arms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2230/00—Measuring physical parameters of the user
- A61H2230/08—Other bio-electrical signals
- A61H2230/10—Electroencephalographic signals
- A61H2230/105—Electroencephalographic signals used as a control parameter for the apparatus
Abstract
A motor-imagery-based upper-limb exoskeleton control method comprises the following steps: 1) prompted by a left or downward arrow on a display screen, the subject performs the corresponding motor imagery, i.e. left-hand motor imagery or foot motor imagery, and the raw EEG signals produced are preprocessed; 2) the preprocessed raw EEG signals are fed into a CNN structure, which performs feature extraction and classification on the EEG signals and outputs a recognition result; 3) a driver control signal is obtained from the recognition result and drives the upper-limb exoskeleton, which moves the forearm accordingly: if the classification result is left-hand motor imagery, the exoskeleton performs an extension movement; if the classification result is foot motor imagery, the exoskeleton performs a flexion movement. When the CNN structure is designed, the convolution kernels are set to vectors in keeping with the combined temporal and spatial character of EEG signals. A motor-imagery-based upper-limb exoskeleton control system is also provided. The invention achieves a higher classification recognition rate and a good user experience.
Description
Technical field
The present invention relates to rehabilitation medicine and motor-imagery EEG classification techniques, and more particularly to a motor-imagery-based upper-limb exoskeleton control method and system.
Background technology
In recent years, rehabilitation-type upper-limb exoskeletons have served as auxiliary therapeutic equipment that, through an intelligent human-machine interface, can provide stroke patients at different stages of recovery with training of varying intensity and in different modes. Brain-computer interface (BCI) technology uses EEG (electroencephalogram) signals to enable communication and control between the human brain and computers or other electronic devices, and has been widely applied in rehabilitation training. As a means of human-machine interaction, BCI technology can not only recognize a patient's motion intention, as other traditional control methods (surface EMG signals and force feedback) do, but can also help patients whose neuromuscular system is paralyzed but whose thinking is normal to interact with the outside world.
Traditional motor-imagery classification methods first extract EEG time-frequency features manually, then use machine-learning methods to establish a mapping between the EEG features and the motor imagery. However, because EEG signals have a low signal-to-noise ratio, one of the main problems facing brain-computer interfaces is a low classification rate: some studies of left-hand/right-hand motor-imagery classification based on conventional methods (with separate feature-extraction and classification modules) have failed to reach a recognition rate above 80%. How to perform effective feature extraction and classification on EEG signals has therefore become an important research topic in motor-imagery recognition. For example, the patent application of Shandong University, "A P300 EEG signal detection method based on convolutional neural networks" (201510445894.X), proposes a P300 EEG detection method that uses a convolutional neural network to extract features from P300 EEG signals and then performs detection. But it has significant shortcomings: first, its convolution kernels are the matrices used in general image recognition, which mixes spatial and temporal information in the features after convolution; second, P300 signals are evoked potentials that require an external stimulus, and so are less practical to operate than motor-imagery EEG (spontaneous EEG); third, P300 detection can serve only as a single switch command, whereas motor imagery allows multi-class recognition and the output of multiple control commands.
The convolutional neural network (CNN) of deep learning is a variant of the multilayer perceptron that has been widely applied in speech recognition and image recognition. Based on the concepts of local receptive fields and weight sharing, a CNN can greatly reduce the complexity of the network structure and the number of weights. Because a CNN operates directly on the raw signal, it can extract broader, deeper, and more discriminative feature information. It can therefore effectively avoid the information loss that occurs in conventional methods when feature extraction is separated from the decoding module. Existing related patents apply CNNs to evoked potentials (e.g. 201510445894.X), but none has yet applied them to the classification of motor-imagery EEG. Evoked and spontaneous EEG are generated by different mechanisms and have different fundamental characteristics, so the classification process and performance differ. Moreover, the EEG signal combines temporal and spatial characteristics, to which existing CNN-based two-dimensional image recognition methods do not directly apply; the convolution kernel size and network structure must be set specifically.
Summary of the invention
To overcome the shortcomings of existing EEG-based approaches, whose low signal-to-noise ratio leads to a low classification recognition rate and a poor user experience, the present invention provides a motor-imagery-based upper-limb exoskeleton control method and system with a higher classification recognition rate and a good user experience.
The technical solution adopted by the present invention to solve its technical problem is:
A motor-imagery-based upper-limb exoskeleton control method, comprising the following steps:
1) prompted by a left or downward arrow on a display screen, the subject performs the corresponding motor imagery, i.e. left-hand motor imagery or foot motor imagery, and the raw EEG signals produced are preprocessed;
2) for the spatiotemporal character of EEG signals, a 5-layer CNN structure is designed: layer 1 is the input layer, layer 2 is a convolutional layer, layer 3 is a downsampling layer, and layers 4 and 5 are fully connected layers; the output of layer 3 together with layers 4 and 5 forms the classification part. In keeping with the combined temporal and spatial character of EEG signals, the convolution kernels are set to vectors rather than the matrices of general image recognition, so that the two kinds of information are not mixed in the features after convolution. The preprocessed raw EEG signals are fed into the CNN structure, which performs feature extraction and classification on the EEG signals and outputs a recognition result;
3) the upper-limb exoskeleton controller obtains a driver control signal from the recognition result, drives the upper-limb exoskeleton, and moves the forearm accordingly: if the classification result is left-hand motor imagery, the exoskeleton performs an extension movement; if the classification result is foot motor imagery, the exoskeleton performs a flexion movement.
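The decision rule in step 3) is a direct two-way mapping from the classifier's output to an exoskeleton movement. A minimal sketch, with illustrative command strings and an assumed neuron ordering (the patent does not specify the format of the driver control signal):

```python
def exoskeleton_command(output_activations):
    """Map the CNN output layer (2 neurons) to a drive command.

    Assumption: index 0 = left-hand motor imagery -> extension,
    index 1 = foot motor imagery -> flexion. The returned strings
    stand in for the actual driver control signal.
    """
    predicted = max(range(len(output_activations)),
                    key=lambda i: output_activations[i])
    return "extend" if predicted == 0 else "flex"
```

For example, `exoskeleton_command([0.9, 0.1])` yields `"extend"`, i.e. the exoskeleton extends the forearm when left-hand motor imagery is recognized.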
Further, in step 2): in layer 1, each input sample is a 28 × 60 input matrix, where 28 is the number of channels and 60 is the number of time sampling points per channel.
In layer 2, spatial filtering is performed on the raw input sample, so the connection between this layer and the input layer is local. Eight filters are used in this layer; each filter convolved with the input matrix yields a different feature mapping, giving 8 feature maps. The convolution kernel size is [28 × 1] and the size of each feature map is (1 × 60); the convolution kernel is a vector.
In layer 3, features are extracted from the EEG signals in time. Five filters are used for each feature map of layer 2, so after this mapping layer 3 has 40 feature maps. The convolution kernel size is [1 × 10] and the size of each feature map is (1 × 6); the convolution stride is set equal to the kernel length.
Layer 4, together with layer 3 and the output layer, forms the classification part and is therefore fully connected on both sides; its number of neurons is set to 100.
Layer 5 is the output layer and contains 2 neurons, representing the two-class problem, i.e. left-hand motor imagery or foot motor imagery.
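The layer sizes above are mutually consistent, which can be verified by tracing valid-convolution output shapes through the network. A shape-level sketch only (parameters taken from the text; no weights or training are implied):

```python
def conv_out(length, kernel, stride=1):
    # output length of a valid (no-padding) convolution along one axis
    return (length - kernel) // stride + 1

def cnn_shapes(channels=28, samples=60):
    """Trace feature-map shapes through the 5-layer CNN described above."""
    # Layer 2: 8 spatial filters, kernel [28 x 1] -> 8 maps of (1 x 60)
    h2 = conv_out(channels, 28)          # 28 -> 1
    w2 = conv_out(samples, 1)            # 60 -> 60
    # Layer 3: 5 temporal filters per map, kernel [1 x 10], stride = kernel
    maps3 = 8 * 5                        # 40 feature maps
    w3 = conv_out(w2, 10, stride=10)     # 60 -> 6
    return {
        "input": (channels, samples),
        "conv_spatial": (8, h2, w2),     # (8, 1, 60)
        "conv_temporal": (maps3, h2, w3),  # (40, 1, 6)
        "fc": 100,
        "output": 2,
    }
```

Running `cnn_shapes()` confirms the (1 × 60) maps of layer 2 and the 40 maps of size (1 × 6) in layer 3 stated in the text.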
Further, in step 1), the subject, prompted by the cue of a left or downward arrow, performs left-hand motor imagery or foot motor imagery, which respectively control the extension and flexion movements of the upper-limb exoskeleton.
Further, in step 1), the preprocessing includes amplification, A/D conversion, and filtering steps; the filtering frequency band is 8-30 Hz.
A motor-imagery-based upper-limb exoskeleton control system comprises a raw EEG signal acquisition unit, a preprocessing module, a CNN classification module, and an upper-limb exoskeleton controller. The raw EEG signals collected by the acquisition unit are processed by the preprocessing module and then fed into the feature-extraction module and classification module of the CNN classification module, which perform feature extraction and classification on the EEG signals and output a recognition result. The upper-limb exoskeleton controller obtains a driver control signal from the output result, drives the upper-limb exoskeleton, and moves the forearm accordingly: if the classification result is left-hand motor imagery, the exoskeleton performs an extension movement; if the classification result is foot motor imagery, the exoskeleton performs a flexion movement.
The upper-limb exoskeleton comprises two pneumatic muscles, a pneumatic muscle fixture, a rear-arm metal link rod, a forearm metal link rod, a potentiometer, a potentiometer fixture, connecting ropes, a nylon joint, and four size-adjustable carbon-fiber wristbands. The two pneumatic muscles serve as the driver; the fixed ends of the driver are screwed to the pneumatic muscle fixture, and the active ends of the driver are connected by the connecting ropes. The pneumatic muscle fixture is fixed to the rear-arm metal link rod. The connecting ropes are fixed to one section of the nylon joint, and the two pneumatic muscles rotate the nylon joint by stretching the connecting ropes. The nylon joint is connected with the forearm metal link rod, so that when the nylon joint moves, the forearm metal link rod follows it synchronously. The potentiometer feeds back the angle signal in real time, driving an analog feedback bar on the display screen; it also serves as the output value of the classification model, which is compared with the target value to assess classification performance. The potentiometer housing is connected with the potentiometer fixture, and the rotating handle of the potentiometer is connected with the nylon joint.
The technical concept of the invention is as follows: different kinds of motor imagery (e.g. imagining movement of the left hand, right hand, foot, or tongue) change the EEG signals of the corresponding region of the cerebral cortex. For example, when unilateral hand movement is imagined, the mu-rhythm and beta-rhythm energy of the contralateral sensorimotor area decreases in specific frequency bands, while the mu-rhythm and beta-rhythm energy of the ipsilateral sensorimotor area increases; these phenomena are called event-related desynchronization (ERD) and event-related synchronization (ERS). Foot and tongue movements produce similar phenomena in the corresponding regions of the cerebral cortex. A motor-imagery-based brain-computer interface system can classify these different ERD/ERS patterns and thereby obtain the control signals for the upper-limb exoskeleton.
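The ERD/ERS phenomenon described above is conventionally quantified as the relative band-power change between a task period and a rest baseline. A minimal sketch (the patent does not prescribe this formula; the percentage convention, with negative values for ERD, is an assumption from common BCI practice):

```python
def band_power(segment):
    # mean squared amplitude of a (band-filtered) signal segment
    return sum(v * v for v in segment) / len(segment)

def erd_percent(task_segment, rest_segment):
    """Relative band-power change in percent.

    Negative -> event-related desynchronization (ERD, power decrease);
    positive -> event-related synchronization (ERS, power increase).
    """
    p_rest = band_power(rest_segment)
    p_task = band_power(task_segment)
    return 100.0 * (p_task - p_rest) / p_rest
```

A mu-band segment whose amplitude halves during imagery gives an ERD of -75%, since power scales with the square of amplitude.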
In keeping with the combined temporal and spatial character of EEG signals, a convolutional neural network based on deep-learning theory is innovatively used to perform feature extraction and classification on single-trial motor-imagery EEG signals, and the output commands control the upper-limb exoskeleton in real time. The method effectively solves the problem of the low classification rate caused by the low signal-to-noise ratio of EEG signals, and the classification recognition rate can reach more than 90%. Applied to the upper-limb exoskeleton, the control mode is more flexible and natural and the operation simpler, achieving a good user experience.
The motor-imagery-based upper-limb exoskeleton control method and system of the invention are applicable to rehabilitation training for people with hand or limb disabilities, and to BCI-controlled smart homes, entertainment games, military training, and similar applications.
The beneficial effects of the present invention are mainly:
(1) a convolutional neural network based on deep-learning theory performs feature extraction and classification on single-trial motor-imagery EEG signals, effectively solving the problem of the low classification rate caused by the low signal-to-noise ratio of EEG signals;
(2) in keeping with the combined temporal and spatial character of EEG signals, the convolution kernels are set to vectors rather than the matrices of general image recognition, so that the two kinds of information are not mixed in the features after convolution;
(3) applied to the upper-limb exoskeleton, the control mode is more flexible and natural than evoked-potential control and the operation simpler, achieving a good user experience.
Brief description of the drawings
Fig. 1 is the overall method flow chart of the present invention;
Fig. 2 is the timing diagram of a single motor-imagery trial;
Fig. 3 is the frame diagram of the CNN classification module;
Fig. 4 is the structure diagram of the upper-limb exoskeleton.
Specific embodiment
The invention will be further described below in conjunction with the accompanying drawings.
Referring to Figs. 1-4, a motor-imagery-based upper-limb exoskeleton control method comprises the following steps: the subject (2), prompted by an arrow on the display screen, performs the corresponding motor imagery; the raw EEG signals produced are processed by the preprocessing module (3) and fed into the CNN classification module (4); the CNN classification module (4) performs feature extraction and classification of the EEG signals and outputs a recognition result; the upper-limb exoskeleton controller (5) obtains a driver control signal from the output result, drives the upper-limb exoskeleton, and moves the forearm accordingly.
A motor-imagery-based upper-limb exoskeleton control system comprises a raw EEG signal acquisition unit, a preprocessing module (3), a CNN classification module (4), and an upper-limb exoskeleton controller (5). The subject (2), prompted by a left or downward arrow on the display screen (1), performs the corresponding motor imagery (left-hand motor imagery or foot motor imagery); the raw EEG signals produced are processed by the preprocessing module (3), including amplification, A/D conversion, and filtering steps, and then fed into the feature-extraction module (41) and classification module (42) of the CNN classification module (4), which perform feature extraction and classification on the EEG signals and output a recognition result. The upper-limb exoskeleton controller (5) obtains a driver control signal from the output result, drives the upper-limb exoskeleton (6), and moves the forearm accordingly: if the classification result is left-hand motor imagery, the upper-limb exoskeleton (6) performs an extension movement; if the classification result is foot motor imagery, the upper-limb exoskeleton (6) performs a flexion movement.
The motor-imagery EEG acquisition equipment is the ActiveTwo 64-channel EEG acquisition system of BioSemi (Netherlands). EEG data from 28 channels are collected according to the 10/20 system, namely FC5, FC3, FC1, FCz, FC2, FC4, FC6, C5, C3, C1, Cz, C2, C4, C6, CP5, CP3, CP1, CPz, CP2, CP4, CP6, P5, P3, P1, Pz, P2, P4, and P6. The reference electrode is placed at the left mastoid; the ground electrode is replaced by the two independent CMS and DRL electrodes. The sampling frequency is set to 1000 Hz, with 1 Hz high-pass filtering, 100 Hz low-pass filtering, and 50 Hz notch filtering. Before the electrodes are placed, the skin is wiped with alcohol, and conductive paste is used to reduce the impedance between the electrodes and the scalp.
The display screen (1) may be a monitor, a tablet, or another display device.
As shown in Fig. 2, a single arrow-prompted motor-imagery trial proceeds as follows: each motor-imagery trial lasts 8 seconds. For the first two seconds the display screen (1) is blank; then a "+" cross appears at the center of the display screen (1) together with an auditory cue, reminding the subject that motor imagery is about to begin. From second 4 to second 8, the "+" on the display screen (1) changes into a randomly generated left or downward arrow, and the subject imagines left-hand movement or foot movement according to the arrow. There is a random interval of 2-5 seconds between trials, and a 3-minute rest after every 35 trials to prevent subject fatigue.
The subjects (2) are normal adults without brain disease.
The preprocessing module (3) performs amplification, A/D conversion and filtering; the filtering band is 8-30 Hz. To capture the strongest ERD/ERS patterns, the EEG signal from second 4 to second 7 of each motor-imagery trial is intercepted for processing. The window length for data segmentation is defined as 50 milliseconds, so each input sample consists of a 28-channel × 60-time-point matrix (3 s segment × 1000 Hz sampling rate ÷ 50 ms window length).
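The segmentation described above can be sketched in NumPy. This is a non-authoritative illustration: the 4-7 s interception window, 1000 Hz rate and 50 ms segmentation window come from the description, while averaging within each 50 ms window (to reduce 3000 samples to 60 time points) is an assumption, since the patent specifies only the window length.

```python
import numpy as np

def segment_trial(eeg, fs=1000, t_start=4.0, t_stop=7.0, win_ms=50):
    """Reduce one trial's raw EEG (channels x samples) to a 28 x 60 sample."""
    a, b = int(t_start * fs), int(t_stop * fs)
    seg = eeg[:, a:b]                      # 28 x 3000 samples (4-7 s)
    win = int(win_ms * fs / 1000)          # 50 samples per 50 ms window
    n_ch, n_s = seg.shape
    # average within each disjoint 50 ms window -> 60 time points
    return seg.reshape(n_ch, n_s // win, win).mean(axis=2)
```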
As shown in Fig. 3, a 5-layer CNN structure is designed for the spatio-temporal characteristics of EEG signals: layer 1 is the input layer; layer 2 (convolutional layer) and layer 3 (downsampling layer) constitute the feature-extraction module; the output of layer 3 (the feature values) together with layers 4 and 5 (fully connected layers) constitutes the classification part. Each layer is described as follows:
L1: This layer is the input layer; each input sample is a 28 × 60 matrix, where 28 is the number of channels and 60 is the number of time points per channel.
L2: This layer is the convolutional layer (1st hidden layer). Its main function is spatial filtering of the raw input sample, so the connections between this layer and the input layer are local. Eight filters are used in this layer; each filter convolves the input matrix to yield a different feature mapping, giving 8 feature maps. The convolution kernel size is set to [28 × 1], and each feature map has size (1 × 60). The kernel is a vector rather than the matrix used in general pattern recognition, so that the convolution does not mix the two kinds of information in the features and the result contains only spatial features.
L3: This layer is the downsampling layer (2nd hidden layer). Its main function is temporal feature extraction from the EEG signals, so local connectivity and weight sharing are again applied. Five filters are used for each feature map of layer L2, so after this mapping L3 has 40 feature maps. The convolution kernel size is set to [1 × 10], and each feature map has size (1 × 6). Setting the convolution stride equal to the kernel length reduces the number of parameters to prevent overfitting, and performs downsampling at the same time as the convolution.
L4: This layer is a fully connected layer (3rd hidden layer). Its role is to link the previous layer to the output layer and form the classification part, so it is fully connected on both sides. The number of neurons is set to 100.
L5: This layer is the output layer and contains 2 neurons, representing the two-class problem (left-hand motor imagery or foot motor imagery).
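A forward pass through the five layers above can be sketched in NumPy. Shapes and activations follow the description (8 spatial kernels of [28 × 1]; 5 temporal kernels of [1 × 10] with stride 10 per map; 100 and 2 fully connected units; scaled tanh then sigmoid); the random initialization is purely illustrative and is not the patent's initialization scheme.

```python
import numpy as np

def tanh_act(x, a=1.7159, b=2.0 / 3.0):
    return a * np.tanh(b * x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class EEGNet5:
    """Forward pass of the 5-layer CNN described above (NumPy sketch).

    28x60 input -> L2: 8 spatial filters [28x1] -> 8 maps of (1x60)
    -> L3: 5 temporal filters [1x10], stride 10, per map -> 40 maps of (1x6)
    -> L4: 100 units -> L5: 2 units.
    """
    def __init__(self, seed=0):
        r = np.random.default_rng(seed)
        self.w2 = r.normal(0, 0.1, (8, 28))     # spatial kernels [28x1]
        self.b2 = np.zeros(8)
        self.w3 = r.normal(0, 0.1, (8, 5, 10))  # temporal kernels [1x10]
        self.b3 = np.zeros((8, 5))
        self.w4 = r.normal(0, 0.1, (40 * 6, 100))
        self.b4 = np.zeros(100)
        self.w5 = r.normal(0, 0.1, (100, 2))
        self.b5 = np.zeros(2)

    def forward(self, x):                        # x: 28 x 60
        h2 = tanh_act(self.w2 @ x + self.b2[:, None])        # 8 x 60
        # stride-10 convolution == dot products over disjoint windows
        win = h2.reshape(8, 6, 10)                           # 8 maps, 6 windows
        h3 = tanh_act(np.einsum('mwt,mft->mfw', win, self.w3)
                      + self.b3[:, :, None])                 # 8 x 5 x 6
        h4 = sigmoid(h3.reshape(-1) @ self.w4 + self.b4)     # 100 units
        return sigmoid(h4 @ self.w5 + self.b5)               # 2 outputs
```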
The CNN is trained mainly by the back-propagation algorithm: training data are input, the activation value of each neuron is computed in the forward pass, the error is then computed backward, the gradients of the error with respect to each weight and bias are obtained, and each weight and bias is adjusted accordingly. A neuron in the network is denoted n(l, m, j), where l is the layer index, m denotes the m-th feature map in that layer, and j denotes the j-th neuron in that feature map. The input and output of each neuron in each layer are denoted x_in(n) and x_out(n) = f(x_in(n)), where f(·) is the activation function. The first two hidden layers (L2 and L3) use the hyperbolic tangent function as activation function:
f(x) = a·tanh(bx)    (2)
where a = 1.7159 and b = 2/3. The last two fully connected layers use the sigmoid function as activation function:
f(x) = 1/(1 + e^(−x))    (3)
The transfer relations between the layers of the network are as follows:
L1: N channels × T time points, which can be expressed as I_{N,T}, where N is the number of channels and T is the number of time points.
L2: In the convolutional layer, each feature map of the previous layer is convolved with a learnable kernel and passed through the activation function to obtain the output feature map:
x_2(m) = f(w_2(m) ∗ x_1 + b_2(m))    (4)
where w_2(m) is the [28 × 1] convolution kernel of the m-th feature map and b_2(m) is the bias.
L3: This layer is similar to the second layer:
x_3(m, k) = f(w_3(m, k) ∗ x_2(m) + b_3(m, k))    (5)
where w_3(m, k) is the [1 × 10] convolution kernel, applied with stride 10, and b_3(m, k) is the bias.
L4: All neurons of layer L3 are fully connected to all neurons of this layer:
x_4(j) = f(Σ_i w_4(i, j)·x_3(i) + b_4(j))    (6)
where w_4(i, j) is the connection weight from neuron i of layer L3 to neuron j of layer L4 and b_4(j) is the bias.
L5: All neurons of layer L4 are fully connected to all neurons of this layer:
x_5(j) = f(Σ_i w_5(i, j)·x_4(i) + b_5(j))    (7)
where w_5(i, j) is the connection weight from neuron i of layer L4 to neuron j of layer L5 and b_5(j) is the bias.
To ensure that the network can be trained effectively and converges, the network weights and biases must be initialized. Here the connection weights and biases are initialized with a uniform distribution over the interval [±1/n(l, m, i)_{N_input}], where n(l, m, i)_{N_input} is the number of neurons in the previous layer connected to the i-th neuron of the m-th feature map of layer l. The learning rate γ of layers L2 and L3 is defined in terms of n(l, m)_{N_shared}, the number of weight-sharing neurons in the m-th feature map of layer l (formula (8)); the learning rate γ of layers L4 and L5 is defined analogously (formula (9)).
Gradient descent is used to adjust the connection weights and biases until the final error reaches a minimum. The maximum number of iterations is set to 10000.
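The weight-update rule described above (each weight and bias moves against its error gradient, scaled by the learning rate γ) can be illustrated with a toy gradient-descent loop; the quadratic error function and learning-rate value here are illustrative only.

```python
def sgd_step(w, b, grad_w, grad_b, gamma):
    """One gradient-descent update: parameters move against the gradient."""
    return w - gamma * grad_w, b - gamma * grad_b

# Toy usage: minimize E(w, b) = (w - 3)^2 + (b + 1)^2.
w, b = 0.0, 0.0
for _ in range(10000):          # the patent caps training at 10000 iterations
    grad_w, grad_b = 2 * (w - 3), 2 * (b + 1)
    w, b = sgd_step(w, b, grad_w, grad_b, gamma=0.01)
```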
As shown in Fig. 4, the upper-limb exoskeleton (6) consists of two pneumatic muscles (61), a pneumatic-muscle fixture (62), an upper-arm metal link (63), a forearm metal link (64), a potentiometer (65), a potentiometer fixture (66), a connecting rope (67), a nylon joint (68) and four size-adjustable carbon-fiber straps (69). The two pneumatic muscles (61) serve as the actuators: their fixed ends (611) are screwed to the pneumatic-muscle fixture (62), and their active ends (612) are connected by the connecting rope (67). The pneumatic-muscle fixture (62) is fixed to the upper-arm metal link (63). The connecting rope (67) is fixed to one section of the nylon joint (68); by stretching the connecting rope (67), the two pneumatic muscles (61) drive the nylon joint (68) to rotate. The nylon joint (68) is connected to the forearm metal link (64); when the nylon joint (68) moves, the forearm metal link (64) follows synchronously. The potentiometer (65) feeds back the angle signal in real time, which drives the simulated feedback bar shown on the display screen (1); at the same time, it serves as the output value of the classification model, so that classification performance can be assessed against the target value. The housing of the potentiometer (65) is connected to the potentiometer fixture (66), and the rotating shaft of the potentiometer (65) is connected to the nylon joint (68). The two metal links can be adjusted to the subject's arm length: the adjustable range of the forearm metal link (64) is 25 to 30 centimetres, and that of the upper-arm metal link (63) is 20 to 25 centimetres. The four size-adjustable carbon-fiber straps (69) fix the upper-limb exoskeleton (6) to the arm. As a single-degree-of-freedom mechanism, the user's shoulder is held at a fixed angle (0° to 180°), and the elbow-joint angle ranges from 0° to 90° (average anthropometric values). The whole upper-limb exoskeleton system (6) weighs 2.1 kilograms.
The upper-limb exoskeleton controller (5) derives the actuator control signal from the output of the CNN classification module (4), as follows. The two pneumatic muscles (61) are connected to the nylon joint (68) and drive its rotation. The initial pressure of the two pneumatic muscles (61) is P0, their contraction ratio is ε0, their contraction force is F0, and their length is L0. When a pressure signal ΔP is input, the pressures of the two pneumatic muscles (61) become P0 + ΔP and P0 − ΔP respectively, their contraction ratios become εa and εb, their contraction forces become Fa and Fb, and their lengths become L0 − ΔL and L0 + ΔL. At this moment, the nylon joint (68) of the upper-limb exoskeleton (6) rotates because of the torque imbalance, until a new torque balance is established. The contraction forces Fa and Fb of the two pneumatic muscles (61) are obtained from the following formulas:
Fa = (P0 + ΔP)[a(1 − εa)² − b]    (10)
Fb = (P0 − ΔP)[a(1 − εb)² − b]    (11)
where a = 3πD0²/(4tan²θ0) and b = πD0²/(4sin²θ0), D0 being the initial outer diameter of the pneumatic muscle and θ0 the initial braid angle of the pneumatic-muscle fibers. The new balancing torque can then be expressed as
M = (Fa − Fb) × R    (12)
where R is the moment arm of the joint. According to formulas (10)-(12), the required pressure signal ΔP can be obtained by solving these equations (formula (13)).
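Formulas (10)-(12) can be evaluated numerically as below. The expressions used for the constants a and b follow the standard McKibben pneumatic-muscle model and are an assumption here, since the patent's formulas for a and b did not survive extraction; the parameter values in the usage comment are likewise illustrative.

```python
import math

def muscle_forces(dP, P0, eps_a, eps_b, D0, theta0):
    """Contraction forces of the antagonistic pneumatic-muscle pair,
    following formulas (10)-(11). a and b are taken from the standard
    McKibben muscle model (an assumption, not confirmed by the patent)."""
    a = 3 * math.pi * D0**2 / (4 * math.tan(theta0) ** 2)
    b = math.pi * D0**2 / (4 * math.sin(theta0) ** 2)
    Fa = (P0 + dP) * (a * (1 - eps_a) ** 2 - b)
    Fb = (P0 - dP) * (a * (1 - eps_b) ** 2 - b)
    return Fa, Fb

def joint_torque(Fa, Fb, R):
    """Net torque on the nylon joint, formula (12)."""
    return (Fa - Fb) * R

# Usage sketch: with equal contraction ratios, dP = 0 gives Fa = Fb and
# zero net torque; a positive dP unbalances the pair and rotates the joint.
```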
The above is only a preferred embodiment of the present invention. It should be understood that the present invention is not limited to the form described herein, which is not to be taken as excluding other embodiments; the invention may be used in various other combinations, modifications and environments, and may be modified within the scope contemplated herein through the above teachings or through the technology or knowledge of the related art. Changes and modifications made by those skilled in the art that do not depart from the spirit and scope of the present invention shall all fall within the protection scope of the appended claims.
Claims (6)
1. An upper-limb exoskeleton control method based on motor imagery, characterized in that the control method comprises the following steps:
1) the subject performs the corresponding motor imagery, i.e. left-hand motor imagery or foot motor imagery, under the prompt of a left arrow or a down arrow on a display screen, and the raw EEG signals generated are preprocessed;
2) a 5-layer CNN structure is designed for the spatio-temporal characteristics of the EEG signals: layer 1 is the input layer, layer 2 is a convolutional layer, layer 3 is a downsampling layer, and layers 4 and 5 are fully connected layers; the output of layer 3 together with layers 4 and 5 constitutes the classification part;
in accordance with the combined temporal and spatial characteristics of EEG signals, the convolution kernel is set to a vector;
the preprocessed raw EEG signals are input into the CNN structure, feature extraction and classification are performed on the EEG signals, and the recognition result is output;
3) the upper-limb exoskeleton controller obtains the actuator control signal from the output recognition result, driving the upper-limb exoskeleton and thereby the forearm to perform the corresponding motion; if the classification result is left-hand motor imagery, the upper-limb exoskeleton performs an extension action, and if the classification result is foot motor imagery, the upper-limb exoskeleton performs a flexion action.
2. The upper-limb exoskeleton control method based on motor imagery according to claim 1, characterized in that, in step 2):
in said layer 1, each input sample is a 28 × 60 input matrix, where 28 is the number of channels and 60 is the number of time points per channel;
in said layer 2, spatial filtering is performed on the raw input sample, so the connections between this layer and the input layer are local; 8 filters are used in this layer, each filter convolving the input matrix to yield a different feature mapping, giving 8 feature maps; the convolution kernel size is set to [28 × 1], and the size of each feature map is (1 × 60); the convolution kernel is set to a vector;
in said layer 3, temporal feature extraction is performed on the EEG signals; 5 filters are used for each feature map of layer 2, so after this mapping layer 3 has 40 feature maps; the convolution kernel size is set to [1 × 10], and the size of each feature map is (1 × 6); the convolution stride is set equal to the kernel length;
said layer 4 links layer 3 and the output layer to form the classification part, so it is fully connected on both sides; the number of neurons is set to 100;
said layer 5 is the output layer and contains 2 neurons, representing the two-class problem, i.e. left-hand motor imagery or foot motor imagery.
3. The upper-limb exoskeleton control method based on motor imagery according to claim 1 or 2, characterized in that, in step 1), the subject performs left-hand motor imagery or foot motor imagery under the cue of a left arrow or a down arrow, respectively controlling the extension action and the flexion action of the upper-limb exoskeleton.
4. The upper-limb exoskeleton control method based on motor imagery according to claim 1 or 2, characterized in that, in step 1), the preprocessing comprises amplification, A/D conversion and filtering, and the filtering band is 8-30 Hz.
5. A control system for implementing the upper-limb exoskeleton control method based on motor imagery according to claim 1, characterized in that the control system comprises a raw EEG signal acquisition unit, a preprocessing module, a CNN classification module and an upper-limb exoskeleton controller; the raw EEG signals collected by the raw EEG signal acquisition unit are processed by the preprocessing module and then input into the feature-extraction module and classification module within the CNN classification module, which perform feature extraction and classification on the EEG signals and output the recognition result; the upper-limb exoskeleton controller obtains the actuator control signal from the output result, driving the upper-limb exoskeleton and thereby the forearm to perform the corresponding motion; if the classification result is left-hand motor imagery, the upper-limb exoskeleton performs an extension action, and if the classification result is foot motor imagery, the upper-limb exoskeleton performs a flexion action.
6. The control system according to claim 5, characterized in that the upper-limb exoskeleton comprises two pneumatic muscles, a pneumatic-muscle fixture, an upper-arm metal link, a forearm metal link, a potentiometer, a potentiometer fixture, a connecting rope, a nylon joint and four size-adjustable carbon-fiber straps; the two pneumatic muscles serve as actuators, the fixed ends of the actuators are screwed to the pneumatic-muscle fixture, and the active ends of the actuators are connected by the connecting rope; the pneumatic-muscle fixture is fixed to the upper-arm metal link; the connecting rope is fixed to one section of the nylon joint, and by stretching the connecting rope the two pneumatic muscles drive the nylon joint to rotate; the nylon joint is connected to the forearm metal link, and when the nylon joint moves the forearm metal link follows synchronously; the potentiometer feeds back the angle signal in real time, driving the simulated feedback bar shown on the display screen, and at the same time serves as the output value of the classification model, so that classification performance can be assessed against the target value; the housing of the potentiometer is connected to the potentiometer fixture, and the rotating shaft of the potentiometer is connected to the nylon joint.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710107353.5A CN106821681A (en) | 2017-02-27 | 2017-02-27 | A kind of upper limbs ectoskeleton control method and system based on Mental imagery |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106821681A true CN106821681A (en) | 2017-06-13 |
Family
ID=59134777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710107353.5A Pending CN106821681A (en) | 2017-02-27 | 2017-02-27 | A kind of upper limbs ectoskeleton control method and system based on Mental imagery |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106821681A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020038294A1 (en) * | 2000-06-16 | 2002-03-28 | Masakazu Matsugu | Apparatus and method for detecting or recognizing pattern by employing a plurality of feature detecting elements |
US20150088024A1 (en) * | 2012-03-19 | 2015-03-26 | University Of Florida Research Foundation, Inc. | Methods and systems for brain function analysis |
CN105068644A (en) * | 2015-07-24 | 2015-11-18 | 山东大学 | Method for detecting P300 electroencephalogram based on convolutional neural network |
US20160027423A1 (en) * | 2014-04-11 | 2016-01-28 | Thomas Andrew Deuel | Encephalophone |
CN105708587A (en) * | 2016-01-25 | 2016-06-29 | 电子科技大学 | Lower-limb exoskeleton training method and system triggered by brain-computer interface under motion imagination pattern |
JP2016129661A (en) * | 2015-01-09 | 2016-07-21 | パナソニックIpマネジメント株式会社 | Determination system, control signal output system, rehabilitation system, determination method, control signal output method, computer program, and brain wave signal acquisition system |
CN106020472A (en) * | 2016-05-13 | 2016-10-12 | 天津理工大学 | Brain computer interface system on basis of motor imageries of different uplifting amplitudes of lower limbs |
Non-Patent Citations (2)
Title |
---|
Tang Zhichuan, Zhang Kejun, Li Chao, et al.: "Motor imagery classification based on deep convolutional neural networks and its application in brain-controlled exoskeletons" (基于深度卷积神经网络的运动想象分类及其在脑控外骨骼中的应用), Chinese Journal of Computers (计算机学报), 17 November 2016 * |
Zheng Qi: "Design of a two-degree-of-freedom humanoid upper limb and research on pneumatic actuators" (双自由度类人上肢的设计及气动执行器的研究), China Master's Theses Full-text Database, Information Science and Technology, p. 2 * |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108210246A (en) * | 2018-01-10 | 2018-06-29 | 北京工业大学 | A kind of four-degree-of-freedom rehabilitation mechanical arm assembly |
CN110069958A (en) * | 2018-01-22 | 2019-07-30 | 北京航空航天大学 | A kind of EEG signals method for quickly identifying of dense depth convolutional neural networks |
CN110069958B (en) * | 2018-01-22 | 2022-02-01 | 北京航空航天大学 | Electroencephalogram signal rapid identification method of dense deep convolutional neural network |
CN108446020A (en) * | 2018-02-28 | 2018-08-24 | 天津大学 | Merge Mental imagery idea control method and the application of Visual Graph and deep learning |
CN108433722A (en) * | 2018-02-28 | 2018-08-24 | 天津大学 | Portable brain electric collecting device and its application in SSVEP and Mental imagery |
CN108446020B (en) * | 2018-02-28 | 2021-01-08 | 天津大学 | Motor imagery idea control method fusing visual effect and deep learning and application |
CN108319928B (en) * | 2018-02-28 | 2022-04-19 | 天津大学 | Deep learning method and system based on multi-target particle swarm optimization algorithm |
CN108364062A (en) * | 2018-02-28 | 2018-08-03 | 天津大学 | Deep learning model building method based on MEMD and the application in Mental imagery |
CN108364062B (en) * | 2018-02-28 | 2021-10-22 | 天津大学 | Deep learning model construction method based on MEMD and application of deep learning model in motor imagery |
CN108319928A (en) * | 2018-02-28 | 2018-07-24 | 天津大学 | A kind of deep learning model and application based on Multi-objective PSO optimization |
CN110303471A (en) * | 2018-03-27 | 2019-10-08 | 清华大学 | Assistance exoskeleton control system and control method |
CN108852349A (en) * | 2018-05-17 | 2018-11-23 | 浙江大学 | A kind of moving decoding method using Cortical ECoG signal |
CN108852349B (en) * | 2018-05-17 | 2020-06-30 | 浙江大学 | Motion decoding method using cortical electroencephalogram signal |
CN109276244A (en) * | 2018-09-03 | 2019-01-29 | 南京理工大学 | The recognition methods that age-care based on brain wave information is intended to |
CN108828960A (en) * | 2018-09-11 | 2018-11-16 | 武汉理工大学 | A kind of pneumatic muscles model-free High-order Iterative Learning control method |
CN109730818A (en) * | 2018-12-20 | 2019-05-10 | 东南大学 | A kind of prosthetic hand control method based on deep learning |
CN109711383A (en) * | 2019-01-07 | 2019-05-03 | 重庆邮电大学 | Convolutional neural networks Mental imagery EEG signal identification method based on time-frequency domain |
CN109711383B (en) * | 2019-01-07 | 2023-03-31 | 重庆邮电大学 | Convolutional neural network motor imagery electroencephalogram signal identification method based on time-frequency domain |
CN109528450A (en) * | 2019-01-24 | 2019-03-29 | 郑州大学 | A kind of exoskeleton rehabilitation robot of motion intention identification |
CN111557828A (en) * | 2020-04-29 | 2020-08-21 | 天津科技大学 | Active stroke lower limb rehabilitation robot control method based on healthy side coupling |
CN112315744A (en) * | 2020-11-24 | 2021-02-05 | 中国医学科学院生物医学工程研究所 | Multi-degree-of-freedom cooperative movement upper limb exoskeleton instruction method based on motor imagery |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106821681A (en) | A kind of upper limbs ectoskeleton control method and system based on Mental imagery | |
CN108304068B (en) | Upper limb rehabilitation training robot control system and method based on brain-computer interface | |
CN104173124B (en) | A kind of upper limb healing system based on bio signal | |
CN113398422B (en) | Rehabilitation training system and method based on motor imagery-brain-computer interface and virtual reality | |
CN104360730B (en) | Man-machine interaction method supported by multi-modal non-implanted brain-computer interface technology | |
CN111544854B (en) | Cerebral apoplexy motor rehabilitation method based on brain myoelectric signal deep learning fusion | |
CN107397649A (en) | A kind of upper limbs exoskeleton rehabilitation robot control method based on radial base neural net | |
CN110534180B (en) | Deep learning human-computer interaction motor imagery brain-computer interface system and training method | |
CN110179643A (en) | A kind of neck rehabilitation training system and training method based on annulus sensor | |
CN111584031B (en) | Brain-controlled intelligent limb rehabilitation system based on portable electroencephalogram acquisition equipment and application | |
CN111110982A (en) | Hand rehabilitation training method based on motor imagery | |
CN104951082B (en) | A kind of brain-machine interface method for strengthening EEG signals using accidental resonance | |
CN103349595A (en) | Intelligent brain-computer interface wheelchair based on multi-mode hierarchical control | |
CN203043423U (en) | Rehabilitation training device based on brain-computer interface | |
CN1803122A (en) | Method for producing rehabilitation exerciser controlling order using imagination movement brain wave | |
CN113274032A (en) | Cerebral apoplexy rehabilitation training system and method based on SSVEP + MI brain-computer interface | |
CN107562191A (en) | The online brain-machine interface method of fine Imaginary Movement based on composite character | |
Duvinage et al. | A five-state P300-based foot lifter orthosis: Proof of concept | |
CN108520239A (en) | A kind of Method of EEG signals classification and system | |
CN112043473A (en) | Parallel nested and autonomous preferred classifier for brain-myoelectricity fusion perception of intelligent artificial limb | |
CN106693178A (en) | Voluntary will-based functional electrical stimulation closed-loop control method for upper limb rehabilitation | |
CN107126303A (en) | A kind of upper and lower extremities exercising support method based on mobile phone A PP | |
CN207101480U (en) | Upper limbs ectoskeleton control system based on Mental imagery | |
CN107808166A (en) | The myoelectricity feature extracting method that a kind of MEMD tensors linear Laplace differentiates | |
CN114021604A (en) | Motion imagery training system based on real-time feedback of 3D virtual reality technology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20170613 |
|
RJ01 | Rejection of invention patent application after publication |