WO2019016811A1 - Brain-computer interface rehabilitation system and method - Google Patents


Info

Publication number
WO2019016811A1
WO2019016811A1 (PCT/IL2018/050796)
Authority
WO
WIPO (PCT)
Prior art keywords
user
computing unit
motor
physical actuator
ErrP
Prior art date
Application number
PCT/IL2018/050796
Other languages
French (fr)
Inventor
Miriam Zacksenhouse
Reuven Katz
Original Assignee
Technion Research & Development Foundation Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Technion Research & Development Foundation Limited filed Critical Technion Research & Development Foundation Limited
Publication of WO2019016811A1 publication Critical patent/WO2019016811A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/05: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/0522: Magnetic induction tomography
    • A61B5/055: involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121: Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1124: Determining motor skills
    • A61B5/1126: using a particular sensing technique
    • A61B5/1128: using image analysis
    • A61B5/22: Ergometry; Measuring muscular strength or the force of a muscular blow
    • A61B5/224: Measuring muscular strength
    • A61B5/24: Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316: Modalities, i.e. specific diagnostic methods
    • A61B5/369: Electroencephalography [EEG]
    • A61B5/375: Electroencephalography [EEG] using biofeedback
    • A61B5/40: Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4058: for evaluating the central nervous system
    • A61B5/4064: Evaluating the brain
    • A61B5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887: mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/74: Details of notification to user or communication with user or patient; user input means
    • A61B5/7475: User input or interface means, e.g. keyboard, pointing device, joystick
    • A61B2505/00: Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B2505/09: Rehabilitation or training
    • A61B2576/00: Medical imaging apparatus involving image processing or analysis
    • A61B2576/02: specially adapted for a particular organ or body part
    • A61B2576/026: for the brain
    • A61H: PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H1/00: Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
    • A61H1/02: Stretching or bending or torsioning apparatus for exercising
    • A61H1/0237: for the lower limbs
    • A61H1/0266: Foot
    • A61H1/0274: for the upper limbs
    • A61H1/0285: Hand
    • A61H2201/00: Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16: Physical interface with patient
    • A61H2201/1602: kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/1635: Hand or arm, e.g. handle
    • A61H2201/164: Feet or leg, e.g. pedal
    • A61H2201/1642: Holding means therefor
    • A61H2201/165: Wearable interfaces
    • A61H2201/1657: Movement of interface, i.e. force application means
    • A61H2201/1659: Free spatial automatic movement of interface within a working area, e.g. robot
    • A61H2201/50: Control means thereof
    • A61H2201/5007: computer controlled
    • A61H2230/00: Measuring physical parameters of the user
    • A61H2230/08: Other bio-electrical signals
    • A61H2230/10: Electroencephalographic signals
    • A61H2230/105: used as a control parameter for the apparatus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015: Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346: with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00: ICT specially adapted for the handling or processing of medical images
    • G16H30/40: for processing medical images, e.g. editing

Definitions

  • the invention relates to the field of motor training and motor rehabilitation after neurological trauma.
  • a person suffering neural damage may lose motor control of one or more body parts. Although some people may recover from strokes, a majority of patients suffer from residual neurological deficits that persistently impair function. Restoration of motor function involves instigating regenerative responses that promote the growth of new neural connections in the brain, a process known to those skilled in the art as brain plasticity. This process can be aided by physical therapy, which typically involves one-on-one attention from a therapist who assists the patient through repetitive physical exercises of the affected body part. However, physical therapy is usually intensive, time-consuming and costly. The repetitive nature of physical therapy makes it conducive to at least partial automation through the use of electromechanical devices, such as robotic systems.
  • robotic rehabilitation systems deliver movement therapy (referred to as robotic therapy, RT) that involves performing goal-directed motor tasks in one or more degrees of freedom.
  • An interactive control system, which takes into account any counter forces applied by the user, can adjust the power output of the robot to provide neutral, assistive, or resistive forces.
  • RT needs to actively engage the patients in attempting to move and to challenge them by adapting to their performance.
  • three general approaches have been developed: (1) assist-as-needed approaches, (2) RT triggering based on kinematic and neurophysiological indices, especially those indicating patient intent to move, and (3) virtual reality games for a more immersive experience.
  • EEG-based triggers may include movement-related cortical potentials (MRCP), which reflect movement intent, and motor imagery.
  • BCIs brain-computer interfaces
  • a system which includes an acquisition element configured to acquire brain activity data from a user, a physical actuator configured to interact with movement of a body part of the user, and a computing unit operatively coupled to the acquisition element and the physical actuator, the computing unit being configured to instruct the user to perform a motor exercise, continually monitor user movement parameters, receive and continually process the brain activity data from the user, decode the brain activity data from the user in real time to extract an error-related potential (ErrP) signal, and adjust an operational parameter of the physical actuator based on the ErrP signal.
  • ErrP error-related potential
  • a method for motor training and rehabilitation incorporating brain-machine interface including the steps of acquiring brain activity data from a user, providing a physical actuator capable of interacting with movement of a body part of the user, providing an instruction to perform a motor exercise, the motor exercise including manipulation of the physical actuator by the user, wherein the instruction is provided by a computing unit, continually monitoring user movement parameters, continually processing the brain activity data from the user, decoding the brain activity data from the user in real time to extract an error-related potential (ErrP) signal, and adjusting an operational parameter of the physical actuator based on the ErrP signal.
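The acquire-decode-adjust loop described above can be sketched in code. This is a minimal illustration only, not the patent's implementation: the detector, the gain values, and the names (`decode_errp`, `adjust_gain`, `GAIN_STEP`) are all assumptions introduced for this sketch.

```python
# Hypothetical sketch of the claimed closed loop: decode an ErrP from a
# brain-activity epoch in real time and adjust an actuator parameter.

GAIN_STEP = 0.1  # assumed increment for the assistive force gain


def decode_errp(epoch, threshold=2.0):
    """Toy ErrP detector: flags an epoch whose peak deviation from the
    epoch mean, in standard-deviation units, exceeds a threshold."""
    mean = sum(epoch) / len(epoch)
    var = sum((x - mean) ** 2 for x in epoch) / len(epoch)
    std = var ** 0.5 or 1.0  # guard against a flat epoch
    peak = max(abs(x - mean) for x in epoch)
    return peak / std > threshold


def adjust_gain(gain, errp_detected):
    """Raise the assistive gain when an error potential is seen;
    let it decay slowly otherwise. Bounds are illustrative."""
    if errp_detected:
        return min(1.0, gain + GAIN_STEP)
    return max(0.0, gain - GAIN_STEP / 2)
```

A real system would replace the toy amplitude detector with a classifier trained on the individual user's EEG epochs.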
  • ErrP error-related potential
  • the acquisition element is configured to acquire the brain activity data from the user using a technique including one of electroencephalography (EEG), magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), functional near-infrared spectroscopy (fNIRS), single-photon emission computed tomography (SPECT), and electrocorticography (ECoG).
  • EEG electroencephalography
  • MEG magnetoencephalography
  • fMRI functional magnetic resonance imaging
  • fNIRS functional near-infrared spectroscopy
  • SPECT single-photon emission computed tomography
  • ECoG electrocorticography
  • the physical actuator includes one of: a movable element attachable to the user's body part, an articulated robotic arm having one or more movable joints, a grip or handle element, a wearable element, means for immovably securing at least one of the user's body parts to limit the movement thereof during the performance of said motor exercise, and means for partially or fully bearing the weight of the user during said motor exercise.
  • the articulated robotic arm has more than one degree of freedom.
  • the grip or handle element is configured for at least one of linear, rotational and spherical movement.
  • the wearable element is a glove including one or more actuators attachable to corresponding digits of a hand of the user.
  • the wearable element is a boot comprising one or more actuators attachable to corresponding digits of a foot of the user.
  • the means for partially or fully bearing the weight of the user is a harness by which the user is suspended.
  • the computing unit is configured to control the application of at least one of passive, pushing, assisting, reminding, responding, and resisting forces by the physical actuator.
  • the computing unit is configured to continually monitor at least one of position, force, torque and velocity of the user's movement.
  • the computing unit is further configured to decode at least one of the P3a, the P3b, and error-related negativity (ERN) components of the ErrP signal.
  • ERN error-related negativity
  • the computing unit is further configured to determine whether said ErrP signal corresponds to an execution error or an outcome error.
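As a rough illustration of decoding ErrP components and distinguishing execution from outcome errors, the sketch below measures mean amplitudes in fixed latency windows and applies a toy dominance rule. The window bounds and the classification heuristic are assumptions drawn from the general ERP literature, not values specified in this document.

```python
# Illustrative component measurement over typical latency windows
# (bounds in ms post-event are assumed, not from the patent).

WINDOWS_MS = {"ERN": (0, 100), "P3a": (250, 350), "P3b": (350, 500)}


def component_amplitudes(epoch, fs):
    """Mean amplitude in each named component window.

    epoch: list of samples starting at the event; fs: sampling rate in Hz."""
    out = {}
    for name, (t0, t1) in WINDOWS_MS.items():
        i0, i1 = int(t0 * fs / 1000), int(t1 * fs / 1000)
        seg = epoch[i0:i1] or [0.0]  # empty window reads as zero
        out[name] = sum(seg) / len(seg)
    return out


def classify_error(amps):
    """Toy rule: an ERN-dominant response is treated as an execution
    error, a P3-dominant response as an outcome error."""
    return "execution" if abs(amps["ERN"]) > abs(amps["P3b"]) else "outcome"
```

In practice the distinction would come from a trained classifier over full epoch features rather than a two-window amplitude comparison.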
  • the computing unit further comprises a computer readable storage medium having stored thereon computer readable program instructions for executing a motor exercise program.
  • the computing unit is configured to provide to the user said motor exercise instruction by at least one of visual, aural, and tactile means. In some embodiments, the computing unit is configured to provide to the user a feedback relating to the performance of said motor exercise. In some embodiments, the computing unit further comprises a display device. In some embodiments, the computing unit is configured to present said instruction and feedback as part of a video game. In some embodiments, the computing unit is configured to present said instruction and feedback in virtual reality. In some embodiments, the computing unit comprises a camera-type motion sensor.
  • FIG. 1 is a schematic illustration of an embodiment of a system and method for brain-computer interface (BCI) rehabilitation
  • Fig. 2 illustrates one exemplary embodiment of a BCI rehabilitation system.

DETAILED DESCRIPTION
  • the present system and method are configured to augment human movement behavior in order to accelerate complex movement skill acquisition and improve outcome.
  • the present system and method relate to the application of specific brain signals, evoked in response to perceived errors in the performance of a motor task, to provide various forms of feedback, including real-time feedback and/or post-performance feedback, for training and rehabilitation.
  • a newly emerging use of BCI technology is in the area of motor training and rehabilitation.
  • BCI rehabilitation devices may incorporate real time closed-loop feedback to enhance the recruitment of selected brain areas by guiding a more focused activation of specific brain signals. This in turn may help to accelerate brain plasticity, and thus reduce the length, difficulty and cost of the recovery process.
  • ERPs event-related potentials
  • ErrP error-related potentials
  • the system 100 of the present disclosure includes a physical actuator 102 and a computing unit 104.
  • the physical actuator 102 comprises a robotic-type movable element configured to interact with a body part of a user, such that said body part can independently and controllably move in concert with physical actuator 102 in an environment having one or more degrees of freedom (DOF), such as one, two, three, four, or more degrees of freedom.
  • DOF degrees-of-freedom
  • the interaction of the physical actuator 102 with the body part of the user may involve providing haptic feedback to the user, i.e., by applying torque and/or force as feedback, or by cancelling-out the dynamics of the physical actuator 102 such that it becomes free-moving.
  • Physical actuator 102 may further comprise a grip or handle element; a wearable element; means, such as straps, for immovably securing at least one of the user's body parts to limit the movement thereof during the performance of motor exercises; and means, such as a body harness, for partially or fully bearing the weight of the user during motor exercise.
  • the computing unit 104 is operatively coupled to the physical actuator 102 and is programmed to control operation thereof.
  • the computing unit 104 can be programmed to execute one or more desired exercise routines at the physical actuator 102, selected to improve motor function of an affected body part of a user.
  • the computing unit 104 is configured to (i) command the physical actuator 102 to apply at least one of passive (none), pushing (against the user), assisting (toward the goal), reminding (applied for a short duration), responding (applied for a short duration toward the goal), and resisting (against the movement of the user) forces; and (ii) continually monitor the user's movement parameters, such as, but not limited to, position, orientation, force, torque and velocity of the physical actuator 102.
  • the computing unit is further configured to recognize the application of counter forces by the user to the physical actuator 102 and to respond by adjusting the power output of the physical actuator 102, to permit the user to override the desired exercise path of the physical actuator 102.
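A minimal sketch of the described force modes and the counter-force override might look as follows. The mode gains and the override threshold are illustrative assumptions, and the short-duration "reminding" and "responding" modes, which would add timing logic, are omitted.

```python
# Hypothetical single-tick force command combining the mode table with
# the counter-force override described in the text.

def actuator_force(mode, error_to_goal, user_force, override_threshold=5.0):
    """Return a commanded force for one control tick.

    error_to_goal: signed distance from current position to the goal.
    user_force: force the user applies to the actuator (same sign
    convention). If the user pushes back harder than the threshold,
    the command is zeroed so the user can override the exercise path."""
    if abs(user_force) > override_threshold:
        return 0.0                   # user override: let the user lead
    if mode == "passive":
        return 0.0                   # no force applied
    if mode == "assisting":
        return 0.5 * error_to_goal   # push toward the goal
    if mode == "pushing":
        return -0.5 * error_to_goal  # push against the user
    if mode == "resisting":
        return -0.8 * user_force     # oppose the user's movement
    raise ValueError(f"unknown mode: {mode}")
```

An impedance controller on real hardware would add velocity damping and saturation limits; the point here is only the mode/override structure.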
  • the operation of the computing unit 104 may be overseen by a physical therapist, for example, via an appropriate computer interface.
  • the therapist will be able to select a particular sequence or mix of exercise routines for the user and adjust parameters of the operation of the system 100 in response to user interaction therewith.
  • computing unit 104 may activate the physical actuator 102 to provide assistive or resistive force to the user when an ErrP signal is detected.
  • Acquisition element 108 is configured to enable detection of a signal stream from the user in the course of the performance by the user of each exercise in the set of exercises.
  • Acquisition element 108 comprises, e.g., an electroencephalogram (EEG) electrode array comprising a desired number of electrodes disposed in contact with the user's scalp.
  • EEG electroencephalogram
  • Acquisition element 108 is operatively coupled to computing unit 104, which is further configured to (i) receive and continually process the brain activity data from the user, and (ii) decode the said brain activity data from the user in real time to extract ErrP signals associated with various types of task errors.
  • ErrP signals may include, for example, the P3a and P3b subcomponents of P300, as well as ERN.
  • Computing unit 104 is further configured to adjust one or more parameters of the operation of the physical actuator 102 in response to said ErrP signals.
  • computing unit 104 may activate the physical actuator 102 to provide assistive or resistive force to the user when one or more subcomponents of the ErrP signal are detected.
  • computing unit 104 may modify the sequence or mix of exercises based upon at least one identified ErrP signal. Those of skill in the art will appreciate that either one or both of the aforementioned subcomponents of P300 may be used by the computing unit 104 in evaluating the performance by the user of an exercise in the set of exercises.
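One hypothetical way the exercise mix could be modified from identified ErrP signals is to reweight exercises by their observed error-potential rate, repeating error-prone exercises more often. The blending rule and the `alpha` factor below are illustrative assumptions, not the patent's method.

```python
# Sketch: blend prior exercise weights with per-exercise ErrP rates
# and renormalize to a probability distribution.

def reweight_exercises(weights, errp_counts, trials, alpha=0.5):
    """weights: {exercise: prior probability};
    errp_counts / trials give the observed ErrP rate per exercise;
    alpha (assumed) controls how strongly observations move the mix."""
    raw = {}
    for ex, w in weights.items():
        rate = errp_counts.get(ex, 0) / max(trials.get(ex, 1), 1)
        raw[ex] = (1 - alpha) * w + alpha * rate
    total = sum(raw.values()) or 1.0
    return {ex: v / total for ex, v in raw.items()}
```

With this rule, an exercise that evoked ErrPs on 8 of 10 trials gains weight relative to one that evoked none.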
  • the system 100 can include one or more audiovisual displays 106 that are operated by the computing unit 104 to provide visual and audio exercise instructions and performance feedback to the user.
  • the system 100 provides a visual instruction, e.g., by displaying the target orientation or position on the display 106.
  • the display 106 may show a video clip or illustration of an arm rotating to the right or left, as appropriate.
  • such instructions may be provided in an audio fashion via a speaker (not depicted).
  • said instructions and feedback may be provided in a video gaming or virtual reality environment as part of the rehabilitation training.
  • the system 100 may further provide haptic feedback to the user through tactile means (e.g., vibration applied at the physical actuator 102).
  • the system 100 may also employ a camera-type motion sensor, such as Kinect® by Microsoft Corp., to recognize user movements.
  • Fig. 2 illustrates one exemplary embodiment of a BCI rehabilitation system 200.
  • the exemplary embodiment of the BCI rehabilitation system 200 as depicted is particularly suited for use with a user's arm. Persons of skill in the art will appreciate that alternative embodiments may be adapted for use with other limbs, such as a user's leg, or with other body parts, such as a user's head.
  • the BCI rehabilitation system 200 comprises a physical actuator 202 useful with the systems of the present disclosure, such as, for example, the system 100 of Fig. 1.
  • the physical actuator 202 generally comprises a base 210 to which is coupled motor assembly 212.
  • Motor assembly 212 comprises motor 212a, an encoder (not depicted) that measures the orientation of the motor, and a handle 212b, which is securely coupled to the output shaft of motor 212a.
  • motor assembly 212 further comprises torque sensor 212c.
  • the torque sensor 212c is a transducer that converts a torsional mechanical input into an electrical output signal.
  • Base 210 is generally sized and shaped to ergonomically receive a user's forearm while the user's hand or palm is grasping the handle 212b.
  • other embodiments of the BCI rehabilitation system 200 may be adapted for treating other movements of the arm, e.g., reaching movements, or treatment of other limbs, e.g., the leg.
  • the movement of the output shaft of motor 212a establishes a one-DOF environment at the handle 212b in which the user's wrist can rotate the handle 212b about the pronation/supination (PS) axis of wrist joint rotation.
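For the one-DOF wrist setup, the encoder orientation mentioned above can be converted into a signed pronation/supination angle. The counts-per-revolution value below is an assumed example, not a figure from the patent.

```python
# Sketch: raw encoder counts -> signed handle angle in degrees.

COUNTS_PER_REV = 4096  # assumed quadrature encoder resolution


def encoder_to_angle_deg(counts):
    """Wrap the handle angle to (-180, 180] so pronation and supination
    read symmetrically around the neutral grip position."""
    angle = (counts % COUNTS_PER_REV) * 360.0 / COUNTS_PER_REV
    return angle - 360.0 if angle > 180.0 else angle
```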
  • the handle 212b can assume various forms and is generally configured to promote ergonomic gripping thereof by a user's hand or palm.
  • the computing unit 204 is programmed to control operation of the physical actuator 202. In some embodiments, the computing unit 204 is further programmed to effectuate performance of one or more rehabilitation exercise routines at the physical actuator 202 selected to improve motor function of the user's wrist. In some embodiments, the system 200 can include one or more displays 206 that are operated by the computing unit 204 to display a graphical user interface related to the desired exercise routine.
  • Brain signals including, but not limited to P300, P3a, and P3b, as discussed above, are acquired using EEG electrode array 208 arranged in a fitted cap, corresponding to the acquisition element 108.
  • MEG magnetoencephalography
  • fMRI functional magnetic resonance imaging
  • fNIRS functional near-infrared spectroscopy
  • SPECT single-photon emission computed tomography
  • EoG electrocorticography
  • a particular intention of some embodiments of the invention is to interact with the cerebral aspects of rehabilitation, as they relate, for example, to the plasticity and/or training of a human brain.
  • a user's brain which is damaged due to traumatic brain injury or a stroke may be subjected to beneficial therapies as described herein.
  • beneficial therapies as described herein.
  • this invention may also be beneficial in connection when used with patients having other types of physical disabilities and limitations.
  • aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • the system disclosed in the present specification may further be specially constructed for the required purposes or may comprise a general-purpose computer or other device selectively activated or reconfigured by a computer program stored in the computer.
  • the algorithms presented herein are not inherently related to any particular computer or other apparatus.
  • Various general-purpose machines may be used with programs in accordance with the teachings herein.
  • the construction of more specialized system to perform the required method steps may be appropriate.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Python, Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • LAN local area network
  • WAN wide area network
  • Internet Service Provider for example, AT&T, MCI, Sprint, EarthLink, MSN, GTE, etc.
  • the present invention may be a system, a method, and/or a computer program product.
  • the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.


Abstract

A system and method are described for motor training and rehabilitation incorporating a brain-machine interface. The system includes an acquisition element configured to acquire brain activity data from a user, a physical actuator configured to interact with movement of a body part of the user, and a computing unit operatively coupled to the acquisition element and the physical actuator, the computing unit being configured to instruct the user to perform a motor exercise, continually monitor user movement parameters, receive and continually process the brain activity data from the user, decode the brain activity data from the user in real time to extract an error-related potential (ErrP) signal, and adjust an operational parameter of the physical actuator based on the ErrP signal. Related systems, apparatus, and methods are also described.

Description

BRAIN-COMPUTER INTERFACE REHABILITATION SYSTEM AND METHOD
RELATED APPLICATION INFORMATION
[0001] This application claims the benefit of priority from U.S. provisional patent application number 62/533,721, filed on July 18, 2017, which is incorporated herein by reference in its entirety.
FIELD OF THE INVENTION
[0002] The invention relates to the field of motor training and motor rehabilitation after neurological trauma.
BACKGROUND
[0003] A person suffering neural damage (for example, as a result of stroke or traumatic brain injury) may lose motor control of one or more body parts. Although some people may recover from strokes, a majority of patients suffer from residual neurological deficits that persistently impair function. Restoration of motor function involves instigating regenerative responses that promote the growth of new neural connections in the brain, a process known to those skilled in the art as brain plasticity. This process can be aided by physical therapy, which typically involves one-on-one attention from a therapist who assists the patient through repetitive physical exercises of the affected body part. However, physical therapy is usually intensive, time-consuming, and costly. The repetitive nature of physical therapy makes it conducive to at least partial automation through the use of electromechanical devices, such as robotic systems.
[0004] Known robotic rehabilitation systems deliver movement therapy (referred to as robotic therapy, RT) that involves performing goal-directed motor tasks in one or more degrees of freedom. An interactive control system, which takes into account any counter forces applied by the user, can adjust the power output of the robot to provide neutral, assistive, or resistive forces.
[0005] To provide successful motor recovery, RT needs to actively engage the patients in attempting to move and to challenge them by adapting to their performance. In order to enhance patient engagement during RT, three general approaches have been developed: (1) assist-as-needed approaches, (2) RT triggering based on kinematic and neurophysiological indices, especially those indicating patient intent to move, and (3) virtual reality games for a more immersive experience.
[0006] Kinematic real-time triggering methods, which are based on force, velocity, and time thresholds, have already been incorporated in commercial rehabilitation robots. Neurophysiological triggers, especially those based on electroencephalography (EEG), reflect brain activity and thus can assure patient engagement and may enhance presynaptic activity to the cell population or network responsible for moving the impaired limb. EEG-based triggers may include movement-related cortical potentials (MRCP), which reflect movement intent, and motor imagery.
[0007] The use of brain-computer interfaces (BCIs) is well established as a way to enable users to interact with their environment through brain signals, for example, to control a communication device or a prosthetic limb.
[0008] The foregoing examples of the related art and limitations related therewith are intended to be illustrative and not exclusive. Other limitations of the related art will become apparent to those of skill in the art upon a reading of the specification and a study of the figures.
SUMMARY
[0009] The following embodiments and aspects thereof are described and illustrated in conjunction with systems, tools and methods which are meant to be exemplary and illustrative, not limiting in scope.
[0010] According to a first aspect, there is provided a system which includes an acquisition element configured to acquire brain activity data from a user, a physical actuator configured to interact with movement of a body part of the user, and a computing unit operatively coupled to the acquisition element and the physical actuator, the computing unit being configured to instruct the user to perform a motor exercise, continually monitor user movement parameters, receive and continually process the brain activity data from the user, decode the brain activity data from the user in real time to extract an error-related potential (ErrP) signal, and adjust an operational parameter of the physical actuator based on the ErrP signal.
[0011] According to another aspect, there is provided a method for motor training and rehabilitation incorporating a brain-machine interface, the method including the steps of acquiring brain activity data from a user, providing a physical actuator capable of interacting with movement of a body part of the user, providing an instruction to perform a motor exercise, the motor exercise including manipulation of the physical actuator by the user, wherein the instruction is provided by a computing unit, continually monitoring user movement parameters, continually processing the brain activity data from the user, decoding the brain activity data from the user in real time to extract an error-related potential (ErrP) signal, and adjusting an operational parameter of the physical actuator based on the ErrP signal.
[0012] In some embodiments, the acquisition element is configured to acquire the brain activity data from the user using a technique including one of electroencephalography (EEG), magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), functional near-infrared spectroscopy (fNIRS), single-photon emission computed tomography (SPECT), and electrocorticography (ECoG). In some embodiments, the physical actuator includes one of: a movable element attachable to the user's body part, an articulated robotic arm having one or more movable joints, a grip or handle element, a wearable element, means for immovably securing at least one of the user's body parts to limit the movement thereof during the performance of said motor exercise, and means for partially or fully bearing the weight of the user during said motor exercise. In some embodiments, the articulated robotic arm has more than one degree of freedom. In some embodiments, the grip or handle element is configured for at least one of linear, rotational and spherical movement. In some embodiments, the wearable element is a glove including one or more actuators attachable to corresponding digits of a hand of the user. In some embodiments, the wearable element is a boot comprising one or more actuators attachable to corresponding digits of a foot of the user. In some embodiments, the means for partially or fully bearing the weight of the user is a harness by which the user is suspended.
[0013] In some embodiments, the computing unit is configured to control the application of at least one of passive, pushing, assisting, reminding, responding, and resisting forces by the physical actuator. In some embodiments, the computing unit is configured to continually monitor at least one of position, force, torque and velocity of the user's movement.
In some embodiments, the computing unit is further configured to decode at least one of the P3a, P3b, and error-related negativity (ERN) components of the ErrP signal. In some embodiments, the computing unit is further configured to determine whether said ErrP signal corresponds to an execution error or an outcome error. In some embodiments, the computing unit further comprises a computer readable storage medium having stored thereon computer readable program instructions for executing a motor exercise program.
[0014] In some embodiments, the computing unit is configured to provide to the user said motor exercise instruction by at least one of visual, aural, and tactile means. In some embodiments, the computing unit is configured to provide to the user a feedback relating to the performance of said motor exercise. In some embodiments, the computing unit further comprises a display device. In some embodiments, the computing unit is configured to present said instruction and feedback as part of a video game. In some embodiments, the computing unit is configured to present said instruction and feedback in virtual reality. In some embodiments, the computing unit comprises a camera-type motion sensor.
[0015] In addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the figures and by study of the following detailed description.
BRIEF DESCRIPTION OF THE FIGURES
[0016] Exemplary embodiments are illustrated in referenced figures. Dimensions of components and features shown in the figures are generally chosen for convenience and clarity of presentation and are not necessarily shown to scale. The figures are listed below.
[0017] Fig. 1 is a schematic illustration of an embodiment of a system and method for brain-computer interface (BCI) rehabilitation; and
[0018] Fig. 2 illustrates one exemplary embodiment of a BCI rehabilitation system.
DETAILED DESCRIPTION
[0019] There is disclosed a system and method that employ BCI in an advantageous way in the context of a motor training and rehabilitation system. The present system and method are configured to augment human movement behavior in order to accelerate complex movement skill acquisition and improve outcome. Specifically, the present system and method relate to the application of specific brain signals, evoked in response to perceived errors in the performance of a motor task, to provide various forms of feedback, including real-time feedback and/or post-performance feedback, for training and rehabilitation. A newly emerging use of BCI technology is in the area of motor training and rehabilitation. BCI rehabilitation devices may incorporate real time closed-loop feedback to enhance the recruitment of selected brain areas by guiding a more focused activation of specific brain signals. This in turn may help to accelerate brain plasticity, and thus reduce the length, difficulty and cost of the recovery process.
[0020] Certain sensory, cognitive, or motor events, which are known to evoke specific signals in the brain, may be referred to as event-related potentials (ERP). ERPs that are evoked in the brain in response to a perceived error in the performance of a task (referred to as error-related potentials, or ErrP) are of particular interest in the case of the present invention.
[0021] One embodiment of a BCI rehabilitation system in accordance with principles of the present disclosure is shown in schematic form in Fig. 1. The system 100 of the present disclosure includes a physical actuator 102 and a computing unit 104. In general terms, the physical actuator 102 comprises a robotic-type movable element configured to interact with a body part of a user, such that said body part can independently and controllably move in concert with physical actuator 102 in an environment having one or more degrees of freedom (DOF), such as one, two, three, four, or more degrees of freedom. The interaction of the physical actuator 102 with the body part of the user may involve providing haptic feedback to the user, i.e., by applying torque and/or force as feedback, or by cancelling out the dynamics of the physical actuator 102 such that it becomes free-moving. Physical actuator 102 may further comprise a grip or handle element; a wearable element; means, such as straps, for immovably securing at least one of the user's body parts to limit the movement thereof during the performance of motor exercises; and means, such as a body harness, for partially or fully bearing the weight of the user during motor exercise.
[0022] The computing unit 104 is operatively coupled to the physical actuator 102 and is programmed to control operation thereof. In this regard, the computing unit 104 can be programmed to execute one or more desired exercise routines at the physical actuator 102, selected to improve motor function of an affected body part of a user. In the course of executing said exercise routines, the computing unit 104 is configured to (i) command the physical actuator 102 to apply at least one of passive (none), pushing (against the user), assisting (toward the goal), reminding (applied for short duration), responding (applied for short duration toward the goal), and resisting (against the movement of the user) forces; and (ii) to continually monitor the user's movement parameters, such as, but not limited to, position, orientation, force, torque and velocity of the physical actuator 102. The computing unit is further configured to recognize the application of counter forces by the user to the physical actuator 102 and to respond by adjusting the power output of the physical actuator 102, to permit the user to override the desired exercise path of the physical actuator 102. In some configurations, the operation of the computing unit 104 may be overseen by a physical therapist, for example, via an appropriate computer interface. In such configurations, the therapist will be able to select a particular sequence or mix of exercise routines for the user and adjust parameters of the operation of the system 100 in response to user interaction therewith.
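The interactive control behavior described above, in which the actuator assists toward the exercise target while yielding to user counter forces so the user can override the planned path, can be sketched as a single control-loop tick. The stiffness gain, override threshold, and yielding factor below are hypothetical values chosen for illustration only and are not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ControllerState:
    target_angle: float           # rad, current setpoint of the exercise path
    gain: float = 2.0             # Nm/rad, assistive stiffness (hypothetical)
    override_torque: float = 0.5  # Nm, counter-torque at which the robot yields

def command_torque(state, measured_angle, user_torque):
    """One tick of the interactive loop: apply a spring-like assistive torque
    toward the target, but scale the output down when the user pushes against
    the exercise path, permitting the user to override it."""
    assist = state.gain * (state.target_angle - measured_angle)
    # The user is opposing the assistive direction harder than the threshold:
    if user_torque * assist < 0 and abs(user_torque) > state.override_torque:
        assist *= 0.2  # hypothetical yielding factor
    return assist

state = ControllerState(target_angle=0.5)
compliant = command_torque(state, measured_angle=0.0, user_torque=0.0)   # full assist
yielded = command_torque(state, measured_angle=0.0, user_torque=-1.0)    # user opposes
```

A real controller would of course run at a fixed rate with filtered sensor inputs and safety limits; the sketch only shows how monitored torque can modulate the commanded power output.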
[0023] In one embodiment, computing unit 104 may activate the physical actuator 102 to provide assistive or resistive force to the user when an ErrP signal is detected.
[0024] Acquisition element 108 is configured to enable detection of a signal stream from the user in the course of the performance by the user of each exercise in the set of exercises. Acquisition element 108 comprises, e.g., an electroencephalogram (EEG) electrode array comprising a desired number of electrodes disposed in contact with the user's scalp. Acquisition element 108 is operatively coupled to computing unit 104, which is further configured to (i) receive and continually process the brain activity data from the user, and (ii) decode said brain activity data from the user in real time to extract ErrP signals associated with various types of task errors. Such ErrP signals may include, for example, the P3a and P3b subcomponents of P300, as well as ERN. In experiments conducted by the inventors, it was shown that task-relevant disturbances generated by cursor and target jumps elicited both frontal-central positivity around 200-275 ms (after the onset of task-relevant disturbances) that is consistent with P3a and parietal positivity around 370-400 ms (after the onset of task-relevant disturbances) that is consistent with P3b.
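The latency windows reported above suggest one simple single-trial feature: the mean amplitude in a P3a window (200-275 ms, frontal-central channel) and a P3b window (370-400 ms, parietal channel), each time-locked to the disturbance onset. The sketch below uses a crude fixed-threshold rule; the sampling rate, channel assignments, and thresholds are assumptions, and a deployed decoder would instead use a classifier trained on labeled trials.

```python
import numpy as np

FS = 250  # Hz, assumed EEG sampling rate (not specified in the text)

def window_mean(epoch, t0_ms, t1_ms, fs=FS):
    """Mean amplitude of one event-locked epoch within [t0_ms, t1_ms]."""
    return float(epoch[int(t0_ms / 1000 * fs):int(t1_ms / 1000 * fs)].mean())

def errp_features(frontal_central, parietal):
    """P3a: frontal-central positivity at 200-275 ms; P3b: parietal
    positivity at 370-400 ms, matching the latency windows reported above."""
    return {
        "p3a": window_mean(frontal_central, 200, 275),
        "p3b": window_mean(parietal, 370, 400),
    }

def detect_errp(features, p3a_thresh=2.0, p3b_thresh=2.0):
    """Crude fixed-threshold rule; a deployed system would train a classifier."""
    return features["p3a"] > p3a_thresh or features["p3b"] > p3b_thresh

# Synthetic single trial: positivity only in the P3a window.
fc = np.zeros(int(0.6 * FS))
fc[int(0.200 * FS):int(0.275 * FS)] = 5.0
par = np.zeros(int(0.6 * FS))
features = errp_features(fc, par)
```

Computing the two subcomponents separately is what would let the computing unit act on either one or both of them, as discussed in the following paragraph.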
[0025] Computing unit 104 is further configured to adjust one or more parameters of the operation of the physical actuator 102 in response to said ErrP signals. In one variation, computing unit 104 may activate the physical actuator 102 to provide assistive or resistive force to the user when one or more subcomponents of the ErrP signal are detected. In another variation, computing unit 104 may modify the sequence or mix of exercises based upon at least one identified ErrP signal. Those of skill in the art will appreciate that either one or both of the aforementioned subcomponents of P300 may be used by the computing unit 104 in evaluating the performance by the user of an exercise in the set of exercises.
[0026] The system 100 can include one or more audiovisual displays 106 that are operated by the computing unit 104 to provide visual and audio exercise instructions and performance feedback to the user. By way of example, and without limiting the generality of the foregoing, the system 100 provides a visual instruction, e.g., by displaying the target orientation or position on the display 106. Alternatively, the display 106 may show a video clip or illustration of an arm rotating to the right or left, as appropriate. In some embodiments, such instructions may be provided in an audio fashion via a speaker (not depicted). Optionally, said instructions and feedback may be provided in a video gaming or virtual reality environment as part of the rehabilitation training. The system 100 may further provide haptic feedback to the user through tactile means (e.g., vibration applied at the physical actuator 102). The system 100 may also employ a camera-type motion sensor, such as Kinect® by Microsoft Corp., to recognize user movements.
[0027] Fig. 2 illustrates one exemplary embodiment of a BCI rehabilitation system 200. The exemplary embodiment of the BCI rehabilitation system 200 as depicted is particularly suited for use with a user's arm. Persons of skill in the art will appreciate that alternative embodiments may be adapted for use with other limbs, such as a user's leg, or other body parts, such as a user's head. The BCI rehabilitation system 200 comprises a physical actuator 202 useful with the systems of the present disclosure, such as, for example, the system 100 of Fig. 1. The physical actuator 202 generally comprises a base 210 to which is coupled motor assembly 212. Motor assembly 212 comprises motor 212a, an encoder (not depicted) that measures the orientation of the motor, and a handle 212b, which is securely coupled to the output shaft of motor 212a. In one variation, motor assembly 212 further comprises torque sensor 212c. As is known in the art, the torque sensor 212c is a transducer that converts a torsional mechanical input into an electrical output signal. Base 210 is generally sized and shaped to ergonomically receive a user's forearm while the user's hand or palm is grasping the handle 212b. As noted above, other embodiments of the BCI rehabilitation system 200 may be adapted for treating other movements of the arm, e.g., reaching movements, or treatment of other limbs, e.g., the leg. The movement of the output shaft of motor 212a establishes a one-DOF environment at the handle 212b in which the user's wrist can rotate the handle 212b about the pronation/supination (PS) axis of a wrist joint rotation. The handle 212b can assume various forms and is generally configured to promote ergonomic gripping thereof by a user's hand or palm.
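As a minimal sketch of how the encoder and torque sensor outputs described above might be converted to engineering units for the one-DOF pronation/supination axis, consider the following; the encoder resolution and the torque-sensor calibration factor are illustrative assumptions, not values from the disclosure.

```python
import math

COUNTS_PER_REV = 4096   # illustrative quadrature encoder resolution
NM_PER_VOLT = 2.5       # illustrative torque-sensor calibration factor

def encoder_to_angle_rad(counts):
    """Handle angle about the pronation/supination axis, computed from
    encoder counts on the output shaft of the motor."""
    return counts / COUNTS_PER_REV * 2.0 * math.pi

def torque_from_voltage(volts, offset_volts=0.0):
    """Scale the torque transducer's electrical output signal to Nm."""
    return (volts - offset_volts) * NM_PER_VOLT

# A quarter turn of supination and a 0.8 V sensor reading:
angle = encoder_to_angle_rad(1024)
torque = torque_from_voltage(0.8)
```

These two quantities are exactly the position and torque signals the computing unit would monitor continually during an exercise routine.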
[0028] The computing unit 204 is programmed to control operation of the physical actuator 202. In some embodiments, the computing unit 204 is further programmed to effectuate performance of one or more rehabilitation exercise routines at the physical actuator 202 selected to improve motor function of the user's wrist. In some embodiments, the system 200 can include one or more displays 206 that are operated by the computing unit 204 to display a graphical user interface related to the desired exercise routine.
[0029] Brain signals, including, but not limited to, P300, P3a, and P3b, as discussed above, are acquired using EEG electrode array 208 arranged in a fitted cap, corresponding to the acquisition element 108. However, it will be appreciated that various EEG acquisition devices comprising fewer electrodes may also be used with the present system. In some embodiments, the acquisition element 108 of Fig. 1 might be adapted to receive signals from magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), functional near-infrared spectroscopy (fNIRS), single-photon emission computed tomography (SPECT), and electrocorticography (ECoG). In the discussion of Fig. 2, the acquisition element 108 of Fig. 1 is associated with the EEG electrode array 208 by way of a non-limiting example.
[0030] A particular intention of some embodiments of the invention is to interact with the cerebral aspects of rehabilitation, as they relate, for example, to the plasticity and/or training of a human brain. By way of example, in some embodiments of the present invention, a user's brain which is damaged due to traumatic brain injury or a stroke may be subjected to beneficial therapies as described herein. However, this invention may also be beneficial when used with patients having other types of physical disabilities and limitations.
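Before ErrP components can be decoded from a multichannel EEG cap such as array 208, the raw signals are commonly re-referenced. The disclosure does not specify a preprocessing chain; the sketch below shows one standard spatial filter, the common average reference, purely as an illustrative example.

```python
import numpy as np

def common_average_reference(eeg):
    """Re-reference multichannel EEG (channels x samples) by subtracting the
    instantaneous mean across channels, suppressing activity that is common
    to all electrodes (e.g., reference drift)."""
    return eeg - eeg.mean(axis=0, keepdims=True)

# Demo: after CAR, the mean across channels is zero at every sample.
rng = np.random.default_rng(1)
raw = rng.normal(size=(8, 500))          # 8 channels x 500 samples
referenced = common_average_reference(raw)
```

Band-pass filtering and artifact rejection would typically follow, but those steps are likewise implementation choices rather than details of the claimed system.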
[0031] As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
[0032] The system disclosed in the present specification may further be specially constructed for the required purposes or may comprise a general-purpose computer or other device selectively activated or reconfigured by a computer program stored in the computer. The algorithms presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose machines may be used with programs in accordance with the teachings herein. Alternatively, the construction of a more specialized system to perform the required method steps may be appropriate.
[0033] Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
[0034] A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
[0035] Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
[0036] Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Python, Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
[0037] The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
[0038] In the description and claims of the application, each of the words "comprise" "include" and "have", and forms thereof, are not necessarily limited to members in a list with which the words may be associated. In addition, where there are inconsistencies between this application and any document incorporated by reference, it is hereby intended that the present application controls.
[0039] The descriptions of the various embodiments of the present invention have been presented for purposes of illustration but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims

CLAIMS
What is claimed is:
1. A system comprising:
an acquisition element configured to acquire brain activity data from a user;
a physical actuator configured to interact with movement of a body part of the user; and
a computing unit operatively coupled to the acquisition element and the physical actuator, the computing unit being configured to:
instruct the user to perform a motor exercise,
continually monitor user movement parameters,
receive and continually process the brain activity data from the user,
decode the brain activity data from the user in real time to extract an error-related potential (ErrP) signal, and
adjust an operational parameter of the physical actuator based on said ErrP signal.
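The closed loop recited in claim 1 — acquire brain activity, decode an ErrP in real time, and adjust an actuator parameter — can be pictured with a minimal Python sketch. Everything here is hypothetical and not drawn from the patent disclosure: `detect_errp` stands in for a trained ErrP classifier, and the threshold rule and assistance-gain update are illustrative choices only.

```python
import numpy as np

def detect_errp(epoch, threshold=4.0):
    """Toy ErrP detector: flag an epoch whose peak z-scored amplitude
    exceeds a threshold. A real system would use a classifier trained
    on time-locked EEG features."""
    z = (epoch - epoch.mean()) / (epoch.std() + 1e-9)
    return bool(np.max(np.abs(z)) > threshold)

def adjust_assistance(gain, errp_detected, step=0.1):
    """Raise actuator assistance when an ErrP is detected (the user
    perceived an error); otherwise decay it slowly. Gain is clamped
    to [0, 1]."""
    if errp_detected:
        return min(1.0, gain + step)
    return max(0.0, gain - step / 2)

# Simulated session: 20 one-second epochs, with an artificial
# error-related deflection injected every fifth trial.
rng = np.random.default_rng(0)
gain = 0.5
for trial in range(20):
    epoch = rng.normal(size=256)
    if trial % 5 == 0:
        epoch[100:110] += 8.0
    gain = adjust_assistance(gain, detect_errp(epoch))
```

In an actual rehabilitation session, each epoch would come from the acquisition element and the resulting gain would parameterize the physical actuator's assistive force.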
2. The system of claim 1, wherein the acquisition element is configured to acquire the brain activity data from the user using a technique comprising one of electroencephalography (EEG), magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), functional near-infrared spectroscopy (fNIRS), single-photon emission computed tomography (SPECT), and electrocorticography (ECoG).
3. The system of claim 1, wherein the physical actuator comprises at least one of:
a movable element attachable to the body part of the user;
an articulated robotic arm having one or more movable joints;
a grip or handle element;
a wearable element;
means for immovably securing at least one body part of the user to limit the movement thereof during the performance of said motor exercise; and
means for partially or fully bearing weight of the user during said motor exercise.
4. The system of claim 3, wherein the articulated robotic arm has more than one degree of freedom.
5. The system of claim 3, wherein the grip or handle element is configured for at least one of linear, rotational and spherical movement.
6. The system of claim 3, wherein the wearable element comprises a glove comprising one or more actuators attachable to corresponding digits of a hand of the user.
7. The system of claim 3, wherein the wearable element comprises a boot comprising one or more actuators attachable to corresponding digits of a foot of the user.
8. The system of claim 3, wherein the means for partially or fully bearing the weight of the user is a harness by which the user is suspended.
9. The system of claim 1, wherein the computing unit is configured to control application of at least one of passive, pushing, assisting, reminding, responding, and resisting forces by the physical actuator.
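Claim 9's repertoire of force modes can be illustrated with a toy dispatch function. The `ForceMode` enum, the spring-like force laws, and the gain value are all invented for illustration — the patent does not specify these control laws — and only three of the six claimed modes are sketched.

```python
from enum import Enum

class ForceMode(Enum):
    PASSIVE = "passive"   # actuator follows the limb, applies no force
    ASSIST = "assist"     # actuator pushes the limb toward the target
    RESIST = "resist"     # actuator opposes the user's movement

def actuator_force(mode, position_error, gain=10.0):
    """Toy spring-like force command per mode. `position_error` is
    current position minus target; a real controller would also
    handle the pushing, reminding, and responding modes and would
    enforce safety limits on the commanded force."""
    if mode is ForceMode.PASSIVE:
        return 0.0
    if mode is ForceMode.ASSIST:
        return -gain * position_error   # force toward the target
    if mode is ForceMode.RESIST:
        return gain * position_error    # force away from the target
    raise ValueError(f"unhandled mode: {mode}")
```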
10. The system of claim 1, wherein the computing unit is configured to continually monitor at least one of position, force, torque and velocity of movement of the user.
11. The system of claim 1, wherein the computing unit is further configured to decode at least one of the P3a and the P3b error-related negativity (ERN) components of the ErrP signal.
12. The system of claim 1, wherein the computing unit is further configured to determine whether said ErrP signal corresponds to an execution error or an outcome error.
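The distinction in claim 12 between execution and outcome errors can be pictured by when, relative to the movement, the error-related deflection occurs. The window boundaries and the peak-comparison rule below are illustrative assumptions, not values taken from the patent; a deployed decoder would be trained on labeled execution- and outcome-error trials.

```python
import numpy as np

# Hypothetical epoch windows (samples at 256 Hz, relative to movement
# onset): execution errors are assumed to evoke an earlier deflection
# than outcome errors, which follow the observed result.
EXECUTION_WINDOW = (64, 128)   # roughly 250-500 ms
OUTCOME_WINDOW = (160, 240)    # roughly 625-940 ms

def classify_error(epoch):
    """Toy rule: label the epoch by whichever window holds the larger
    absolute deflection (ties go to "execution")."""
    exec_peak = np.max(np.abs(epoch[slice(*EXECUTION_WINDOW)]))
    outcome_peak = np.max(np.abs(epoch[slice(*OUTCOME_WINDOW)]))
    return "execution" if exec_peak >= outcome_peak else "outcome"
```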
13. The system of claim 1, wherein the computing unit further comprises a computer readable storage medium having stored thereon computer readable program instructions for executing a motor exercise program.
14. The system of claim 13, wherein the computing unit is configured to provide said motor exercise program to the user by at least one of visual, aural, and tactile means.
15. The system of claim 13, wherein the computing unit is configured to present said motor exercise program as part of a video game.
16. The system of claim 13, wherein the computing unit is configured to present said motor exercise program in virtual reality.
17. The system of claim 13, wherein the computing unit is configured to provide feedback relating to the performance of said motor exercise to the user.
18. The system of claim 17, wherein the computing unit is configured to present said feedback as part of a video game.
19. The system of claim 17, wherein the computing unit is configured to present said feedback in virtual reality.
20. The system of claim 1, further comprising a display device.
21. The system of claim 1, wherein the computing unit comprises a camera-type motion sensor.
22. A method for motor training and rehabilitation incorporating a brain-machine interface, comprising the steps of:
acquiring brain activity data from a user;
providing a physical actuator capable of interacting with movement of a body part of the user;
providing an instruction to perform a motor exercise, the motor exercise comprising manipulation of the physical actuator by the user, wherein the instruction is provided by a computing unit;
continually monitoring user movement parameters;
continually processing the brain activity data from the user;
decoding the brain activity data from the user in real time to extract an error-related potential (ErrP) signal; and
adjusting an operational parameter of the physical actuator based on said ErrP signal.
23. The method of claim 22 wherein said brain activity data from the user is acquired using a brain imaging technique comprising one of electroencephalography (EEG), magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), functional near-infrared spectroscopy (fNIRS), single-photon emission computed tomography (SPECT), and electrocorticography (ECoG).
24. The method of claim 22, wherein the physical actuator comprises at least one of:
a movable element attachable to the body part of the user;
an articulated robotic arm having one or more movable joints;
a grip or handle element;
a wearable element;
means for immovably securing at least one body part of the user to limit the movement thereof during the performance of said motor exercise; and
means for partially or fully bearing weight of the user during said motor exercise.
25. The method of claim 24, wherein the articulated robotic arm has more than one degree of freedom.
26. The method of claim 24, wherein the grip or handle element is configured for at least one of linear, rotational and spherical movement.
27. The method of claim 24, wherein the wearable element is a glove comprising one or more actuators attachable to corresponding digits of a hand of the user.
28. The method of claim 24, wherein the wearable element comprises a boot comprising one or more actuators attachable to corresponding digits of a foot of the user.
29. The method of claim 24, wherein the means for partially or fully bearing the weight of the user is a harness by which the user is suspended.
30. The method of claim 22, wherein the computing unit is configured to control the application of at least one of passive, pushing, assisting, reminding, responding, and resisting forces.
31. The method of claim 22, wherein the computing unit is configured to continually monitor at least one of position, force, torque and velocity of movement of the user.
32. The method of claim 22, wherein the computing unit is further configured to decode at least one of the P3a and the P3b error-related negativity (ERN) components of the ErrP signal.
33. The method of claim 22, and further comprising determining whether said ErrP signal corresponds to an execution error or an outcome error.
34. The method of claim 22, wherein the computing unit further comprises a computer readable storage medium having stored thereon computer readable program instructions for executing a motor exercise program.
35. The method of claim 34, wherein the computing unit is configured to provide said motor exercise program to the user by at least one of visual, aural, and tactile means.
36. The method of claim 34, wherein the computing unit is configured to present said motor exercise program as part of a video game.
37. The method of claim 34, wherein the computing unit is configured to present said motor exercise program in virtual reality.
38. The method of claim 34, wherein the computing unit is configured to provide feedback relating to the performance of said motor exercise to the user.
39. The method of claim 34, wherein the computing unit is configured to present said feedback as part of a video game.
40. The method of claim 34, wherein the computing unit is configured to present said feedback in virtual reality.
41. The method of claim 22, further comprising providing a display device.
42. The method of claim 22, wherein the computing unit comprises a camera-type motion sensor.
PCT/IL2018/050796 2017-07-18 2018-07-18 Brain-computer interface rehabilitation system and method WO2019016811A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762533721P 2017-07-18 2017-07-18
US62/533,721 2017-07-18

Publications (1)

Publication Number Publication Date
WO2019016811A1 true WO2019016811A1 (en) 2019-01-24

Family

ID=65015713

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2018/050796 WO2019016811A1 (en) 2017-07-18 2018-07-18 Brain-computer interface rehabilitation system and method

Country Status (1)

Country Link
WO (1) WO2019016811A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111523519A (en) * 2020-06-09 2020-08-11 福州大学 ErrP self-adaptive common space mode identification method
CN112085052A (en) * 2020-07-28 2020-12-15 中国科学院深圳先进技术研究院 Training method of motor imagery classification model, motor imagery method and related equipment
WO2021008087A1 (en) * 2019-07-17 2021-01-21 西安交通大学 Contrast sensitivity test method based on motion visual evoked potential
WO2021062016A1 (en) * 2019-09-26 2021-04-01 The Regents Of The University Of California Peripheral brain-machine interface system via volitional control of individual motor units
WO2022047377A1 (en) * 2020-08-31 2022-03-03 Vincent John Macri Digital virtual limb and body interaction
CN115349857A (en) * 2022-07-18 2022-11-18 国家康复辅具研究中心 Dynamic rehabilitation assessment method and system based on fNIRS brain function map
US11673042B2 (en) 2012-06-27 2023-06-13 Vincent John Macri Digital anatomical virtual extremities for pre-training physical movement
US11682480B2 (en) 2013-05-17 2023-06-20 Vincent J. Macri System and method for pre-action training and control
US11804148B2 (en) 2012-06-27 2023-10-31 Vincent John Macri Methods and apparatuses for pre-action gaming
US11904101B2 (en) 2012-06-27 2024-02-20 Vincent John Macri Digital virtual limb and body interaction
US11944446B2 (en) 2014-01-13 2024-04-02 Vincent John Macri Apparatus, method, and system for pre-action therapy

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050228515A1 (en) * 2004-03-22 2005-10-13 California Institute Of Technology Cognitive control signals for neural prosthetics
US7120486B2 (en) * 2003-12-12 2006-10-10 Washington University Brain computer interface
US20090221928A1 (en) * 2004-08-25 2009-09-03 Motorika Limited Motor training with brain plasticity
US20100137734A1 (en) * 2007-05-02 2010-06-03 Digiovanna John F System and method for brain machine interface (bmi) control using reinforcement learning
WO2014025765A2 (en) * 2012-08-06 2014-02-13 University Of Miami Systems and methods for adaptive neural decoding
US20150012111A1 (en) * 2013-07-03 2015-01-08 University Of Houston Methods for closed-loop neural-machine interface systems for the control of wearable exoskeletons and prosthetic devices
WO2016094862A2 (en) * 2014-12-12 2016-06-16 Francis Joseph T Autonomous brain-machine interface

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
CHAVARRIAGA, RICARDO ET AL.: "Errare machinale est: the use of error-related potentials in brain-machine interfaces", FRONTIERS IN NEUROSCIENCE, vol. 8, 22 July 2014 (2014-07-22), pages 208, XP055565990, Retrieved from the Internet <URL:https://doi.org/10.3389/fnins.2014.00208> *
DEMCHENKO, IGOR ET AL.: "Distinct electroencephalographic responses to disturbances and distractors during continuous reaching movements", BRAIN RESEARCH, vol. 1652, 28 September 2016 (2016-09-28), pages 178 - 187, XP029793181 *
OMEDES, JASON ET AL.: "Factors that affect error potentials during a grasping task: toward a hybrid natural movement decoding BCI", JOURNAL OF NEURAL ENGINEERING, 6 June 2018 (2018-06-06), XP020329287 *
POLICH, JOHN.: "Updating P300: an integrative theory of P3a and P3b", CLINICAL NEUROPHYSIOLOGY, vol. 118.10, 18 June 2007 (2007-06-18), pages 2128 - 2148, XP022248097 *
SPÜLER, MARTIN ET AL.: "Error-related potentials during continuous feedback: using EEG to detect errors of different type and severity", FRONTIERS IN HUMAN NEUROSCIENCE, vol. 9, 26 March 2015 (2015-03-26), pages 155, XP055565994, Retrieved from the Internet <URL:https://doi.org/10.3389/fnhum.2015.00155> *


Similar Documents

Publication Publication Date Title
WO2019016811A1 (en) Brain-computer interface rehabilitation system and method
Guo et al. Human–robot interaction for rehabilitation robotics
US20220338761A1 (en) Remote Training and Practicing Apparatus and System for Upper-Limb Rehabilitation
Vogel et al. An assistive decision-and-control architecture for force-sensitive hand–arm systems driven by human–machine interfaces
Kaufmann et al. Toward brain-computer interface based wheelchair control utilizing tactually-evoked event-related potentials
Ferreira et al. Human-machine interfaces based on EMG and EEG applied to robotic systems
Vourvopoulos et al. Robot navigation using brain-computer interfaces
Ktena et al. A virtual reality platform for safe evaluation and training of natural gaze-based wheelchair driving
Rao et al. Evaluation of an isometric and a position joystick in a target acquisition task for individuals with cerebral palsy
Achic et al. Hybrid BCI system to operate an electric wheelchair and a robotic arm for navigation and manipulation tasks
Araujo et al. Development of a low-cost EEG-controlled hand exoskeleton 3D printed on textiles
Lupu et al. Virtual reality based stroke recovery for upper limbs using leap motion
Noronha et al. “Wink to grasp”—comparing eye, voice & EMG gesture control of grasp with soft-robotic gloves
Zhang et al. Combining mental training and physical training with goal-oriented protocols in stroke rehabilitation: A feasibility case study
Rechy-Ramirez et al. Impact of commercial sensors in human computer interaction: a review
D'Auria et al. Human-computer interaction in healthcare: How to support patients during their wrist rehabilitation
JP2021529368A (en) Virtual environment for physiotherapy
Allison The I of BCIs: next generation interfaces for brain–computer interface systems that adapt to individual users
Zhu et al. Face-computer interface (FCI): Intent recognition based on facial electromyography (fEMG) and online human-computer interface with audiovisual feedback
Mathew et al. A systematic review of technological advancements in signal sensing, actuation, control and training methods in robotic exoskeletons for rehabilitation
Ma et al. Sensing and force-feedback exoskeleton robotic (SAFER) glove mechanism for hand rehabilitation
Batula et al. Developing an optical brain-computer interface for humanoid robot control
Feng et al. An interactive framework for personalized computer-assisted neurorehabilitation
Kakkos et al. Human–machine interfaces for motor rehabilitation
Al Nuaimi et al. Real-time Control of UGV Robot in Gazebo Simulator using P300-based Brain-Computer Interface

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18834555

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18834555

Country of ref document: EP

Kind code of ref document: A1