CN109172067B - Intelligent artificial limb system based on joint control of electroencephalogram signals and voice signals - Google Patents

Intelligent artificial limb system based on joint control of electroencephalogram signals and voice signals

Info

Publication number
CN109172067B
Authority
CN
China
Prior art keywords
artificial limb
signal
control unit
main control
movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810956123.0A
Other languages
Chinese (zh)
Other versions
CN109172067A (en)
Inventor
盖龄杰
郭琳炜
郭红想
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Geosciences
Original Assignee
China University of Geosciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Geosciences
Priority to CN201810956123.0A
Publication of CN109172067A
Application granted
Publication of CN109172067B
Legal status: Active

Links

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F - FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 - Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/50 - Prostheses not implantable in the body
    • A61F2/68 - Operating or control means
    • A61F2/70 - Operating or control means electrical
    • A61F2/72 - Bioelectric control, e.g. myoelectric
    • A61F2002/6809 - Operating or control means acoustic
    • A61F2002/704 - Operating or control means electrical computer-controlled, e.g. robotic control
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 - Reducing energy consumption in communication networks
    • Y02D30/70 - Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Health & Medical Sciences (AREA)
  • Cardiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Transplantation (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Vascular Medicine (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Rehabilitation Tools (AREA)
  • Prostheses (AREA)

Abstract

The application provides an intelligent artificial limb system based on joint control of brain wave signals and voice signals, which comprises an artificial limb support, a main control unit, a Bluetooth communication module and a power supply, wherein the artificial limb support is provided with a steering engine (servo) mechanism and an artificial limb control unit. The main control MCU analyzes the acquired brain wave signals to obtain a concentration signal and a blink signal, and determines a movement speed and a rotation direction from them to form a first movement signal; it also determines a preset movement speed or rotation direction corresponding to a recognized voice signal to form a second movement signal. The main control MCU transmits the first or second movement signal to the artificial limb control unit, and the artificial limb control unit controls the steering engine mechanism to drive the artificial limb support to move according to the first or second movement signal. The application has the beneficial effect that brain wave signals and voice signals are combined to realize intelligent control of the artificial limb.

Description

Intelligent artificial limb system based on joint control of electroencephalogram signals and voice signals
Technical Field
The application relates to the field of artificial limbs, in particular to an intelligent artificial limb system based on joint control of brain electrical signals and voice signals.
Background
Most artificial limbs currently on the market mainly adopt a connecting-rod (linkage) structure. This design keeps the patient stable when the foot is on the ground, lets the joint bend under only a small force from the residual limb when the foot leaves the ground, and preserves a natural appearance even when the wearer sits down.
In fact, such a prosthesis can be used by most physically disabled persons and is most suitable when only a small part of the body is disabled. However, it must be controlled through the hands, legs and other body parts that provide support for the body during walking, so it is not suitable for severely disabled patients, such as those with high paraplegia, who cannot control the artificial limb through body movements. The artificial limb in the common sense on the market therefore has great limitations.
Disclosure of Invention
In view of the foregoing, embodiments of the present application provide an intelligent artificial limb system based on joint control of electroencephalogram signals and voice signals.
The embodiment of the application provides an intelligent artificial limb system based on joint control of brain wave signals and voice signals. The system comprises an artificial limb support, a main control unit, a Bluetooth communication module and a power supply for supplying power to the artificial limb system. The artificial limb support is provided with a steering engine mechanism and an artificial limb control unit for controlling the steering engine mechanism, and the main control unit and the artificial limb control unit are connected wirelessly through the Bluetooth communication module. The main control unit comprises a brain wave acquisition device for acquiring the brain waves of a user, a voice recognition module for recognizing user sentences, and a main control MCU. The main control MCU analyzes the acquired brain wave signals to obtain a concentration signal and a blink signal, determines a movement speed according to the concentration signal and a rotation direction according to the blink signal, and the movement speed and the rotation direction form a first movement signal. The main control MCU also determines a preset movement speed or rotation direction corresponding to a recognized voice signal, and the preset movement speed and rotation direction form a second movement signal. When the main control MCU receives the first movement signal for the first time, it transmits the first movement signal to the artificial limb control unit, and the artificial limb control unit controls the steering engine mechanism to drive the artificial limb support to start moving according to the first movement signal. Thereafter, when the main control MCU judges that the first movement signal has changed compared with the previous one, it transmits the first movement signal to the artificial limb control unit; when it judges that the first movement signal has not changed compared with the previous one and the voice recognition module recognizes a voice signal, it transmits the second movement signal corresponding to that voice signal to the artificial limb control unit. The artificial limb control unit controls the steering engine mechanism to drive the artificial limb support to continue moving according to the first movement signal or the second movement signal received from the main control MCU.
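By way of illustration only (this sketch is not part of the disclosed embodiments), the first and second movement signals described above could be represented on the main control MCU roughly as follows; the type names, field widths and value ranges are hypothetical:

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical representation of a movement signal: a movement speed plus a
 * rotation command. Field widths and value ranges are illustrative only. */
typedef enum {
    ROT_NONE  = 0,   /* keep the current heading          */
    ROT_LEFT  = 1,   /* rotate left by a preset angle     */
    ROT_RIGHT = 2    /* rotate right by the same angle    */
} rotation_t;

typedef struct {
    uint8_t    speed;     /* movement speed, e.g. 0..100            */
    rotation_t rotation;  /* rotation direction                     */
    bool       valid;     /* set once the signal has been formed    */
} motion_signal_t;

/* First movement signal: derived from the brain wave (EEG) stream.   */
/* Second movement signal: derived from the recognized voice keyword. */
```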
Further, the concentration signal is a concentration level, and the movement speed in the first movement signal is a linear, positively correlated function of the concentration level.
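By way of illustration only, such a linear, positively correlated mapping could take the following form, assuming a concentration (attention) level A reported on a 0 to 100 scale and hypothetical minimum and maximum movement speeds v_min and v_max (none of these constants are specified in the disclosure):

```latex
v \;=\; v_{\min} + \left(v_{\max} - v_{\min}\right)\cdot\frac{A}{100}, \qquad A \in [0,\,100]
```

where A is the concentration level obtained from the brain wave acquisition device.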
Further, the blink signal is a left-eye blink or a right-eye blink: when the left eye blinks, the rotation direction in the first movement signal is a rotation by a preset angle in one direction, and when the right eye blinks, the rotation direction in the first movement signal is a rotation by the preset angle in the opposite direction.
Further, the voice recognition module is an LD3320 voice recognition chip in which keywords (characters) representing the movement speed or the rotation direction are entered in advance; when external speech is received, it is compared with the entered keywords, and the voice signal is recognized when it matches one of them.
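By way of illustration only, the keyword registration step described above could look roughly as follows in C; the functions ld3320_init(), ld3320_add_keyword() and ld3320_poll_result() are hypothetical stand-ins for a real driver, not an actual LD3320 vendor API, and the pinyin strings are illustrative registrations of keywords such as "stop", "accelerate", "decelerate", "turn left" and "turn right" mentioned later in the description:

```c
#include <stdint.h>

/* Hypothetical LD3320 driver interface (not a real vendor API). */
void    ld3320_init(void);
void    ld3320_add_keyword(uint8_t index, const char *pinyin);
int16_t ld3320_poll_result(void);   /* returns keyword index, or -1 if none */

/* Keyword indices; the pinyin strings below are illustrative only. */
enum { KW_STOP, KW_ACCEL, KW_DECEL, KW_LEFT, KW_RIGHT };

void voice_module_setup(void)
{
    ld3320_init();
    ld3320_add_keyword(KW_STOP,  "ting zhi");    /* stop        */
    ld3320_add_keyword(KW_ACCEL, "jia su");      /* accelerate  */
    ld3320_add_keyword(KW_DECEL, "jian su");     /* decelerate  */
    ld3320_add_keyword(KW_LEFT,  "zuo zhuan");   /* turn left   */
    ld3320_add_keyword(KW_RIGHT, "you zhuan");   /* turn right  */
}
```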
Further, the brain wave acquisition device is a TGAM brain wave detection chip.
Further, the Bluetooth communication module is an HC-05 Bluetooth module and comprises a Bluetooth transmitter and a Bluetooth receiver, wherein the Bluetooth transmitter is electrically connected with the main control MCU, and the Bluetooth receiver is electrically connected with the artificial limb control unit.
The technical scheme provided by the embodiment of the application has the following beneficial effects: the intelligent artificial limb system based on joint control of brain wave signals and voice signals realizes intelligent control of the artificial limb; by combining brain wave signals and voice signals, the artificial limb support can be controlled in two modes, which ensures the reliability and stability of the control while keeping the product simple and practical. Compared with existing mechanically controlled artificial limbs, the artificial limb is convenient to operate through brain waves or spoken sentences, readily follows the user's movement intention, and has a wider range of application; it can provide intelligent assistance outside the body even for patients with high paraplegia, and therefore serves disabled persons more universally and efficiently.
Drawings
FIG. 1 is a schematic diagram of the intelligent artificial limb system based on joint control of an electroencephalogram signal and a voice signal according to the present application;
FIG. 2 is a schematic diagram of the intelligent artificial limb system based on joint control of an electroencephalogram signal and a voice signal according to the present application.
In the figures: 1 - artificial limb support, 2 - steering engine mechanism, 3 - artificial limb control MCU, 4 - Bluetooth receiver, 5 - second power adapter, 6 - main control MCU, 7 - Bluetooth transmitter, 8 - voice recognition module, 9 - brain wave acquisition device, 10 - main control unit.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, embodiments of the present application will be further described with reference to the accompanying drawings.
Referring to fig. 1, an embodiment of the present application provides an intelligent artificial limb system based on joint control of brain electrical signals and voice signals, which includes an artificial limb support 1, a main control unit 10, a bluetooth communication module and a power supply for supplying power to the artificial limb system.
The artificial limb support 1 is provided with a steering engine mechanism 2 and an artificial limb control unit for controlling the steering engine mechanism 2; the artificial limb control unit is an artificial limb control MCU 3. The Bluetooth communication module is an HC-05 Bluetooth module and comprises a Bluetooth transmitter 7 and a Bluetooth receiver 4, and the power supply comprises a first power adapter and a second power adapter 5. The Bluetooth receiver 4 and the first power adapter are respectively electrically connected with the artificial limb control MCU 3, and the main control unit and the artificial limb control MCU 3 exchange data through the wireless connection of the Bluetooth communication module.
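By way of illustration only: once paired, an HC-05 module behaves as a transparent serial (UART) bridge, so the main control MCU can write bytes to its UART and the artificial limb control MCU 3 receives the same bytes on its own UART. The small command frame below (header byte, speed, rotation, checksum) and the function uart_write_byte() are hypothetical; the patent does not specify a frame format:

```c
#include <stdint.h>

/* Hypothetical UART driver call provided by the MCU's HAL. */
void uart_write_byte(uint8_t b);

#define FRAME_HEADER 0xA5u   /* arbitrary start-of-frame marker */

/* Send one movement command over the HC-05 transparent serial link. */
void send_motion_frame(uint8_t speed, uint8_t rotation)
{
    uint8_t checksum = (uint8_t)(FRAME_HEADER + speed + rotation);

    uart_write_byte(FRAME_HEADER);
    uart_write_byte(speed);
    uart_write_byte(rotation);
    uart_write_byte(checksum);   /* simple additive checksum */
}
```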
The main control unit 10 comprises a brain wave acquisition device 9 for acquiring the brain waves of a user, a voice recognition module 8 for recognizing user sentences, and a main control MCU 6. The brain wave acquisition device 9 is worn on the head of the user; in this embodiment it is a TGAM brain wave detection chip, although other brain wave sensors can be selected. The voice recognition module 8 is an LD3320 voice recognition chip. The TGAM brain wave detection chip, the LD3320 voice recognition chip and the Bluetooth transmitter 7 are respectively electrically connected with the main control MCU 6. The TGAM brain wave detection chip directly acquires the brain wave signals of the user to form a brain wave data stream, and the main control MCU 6 analyzes the acquired data stream to obtain a concentration signal and a blink signal. The main control MCU 6 determines the movement speed according to the concentration signal; the concentration signal is a concentration level, and the movement speed in the first movement signal is linearly and positively correlated with the concentration level. The main control MCU 6 also determines the rotation direction according to the blink signal, which is either a left-eye blink or a right-eye blink: when the user blinks the left eye, the rotation direction in the first movement signal is a left rotation by a preset angle, and when the user blinks the right eye, the rotation direction is a right rotation by the preset angle. The movement speed and the rotation direction derived from the brain wave signals form the first movement signal.
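By way of illustration only, the step of turning the parsed brain wave data into the first movement signal could be sketched as follows, assuming the attention value has already been extracted from the TGAM data stream and that left/right blink detection is provided by a separate routine; the helpers tgam_get_attention() and get_blink_event(), the speed scaling and the preset angle of 15 degrees are all hypothetical:

```c
#include <stdint.h>
#include <stdbool.h>

typedef enum { BLINK_NONE, BLINK_LEFT, BLINK_RIGHT } blink_t;

typedef struct {
    uint8_t speed;      /* 0..100, from the concentration (attention) level */
    int16_t turn_deg;   /* signed preset rotation angle, e.g. +/-15 degrees */
    bool    valid;
} first_motion_t;

/* Hypothetical helpers: attention 0..100 from the TGAM data stream, and a
 * blink classifier that reports which eye blinked. */
uint8_t tgam_get_attention(void);
blink_t get_blink_event(void);

#define PRESET_TURN_DEG 15

first_motion_t build_first_motion_signal(void)
{
    first_motion_t m = {0};

    /* Speed is a linear, positively correlated function of attention. */
    m.speed = tgam_get_attention();   /* already scaled to 0..100 here */

    switch (get_blink_event()) {
    case BLINK_LEFT:  m.turn_deg = -PRESET_TURN_DEG; break;  /* turn left   */
    case BLINK_RIGHT: m.turn_deg = +PRESET_TURN_DEG; break;  /* turn right  */
    default:          m.turn_deg = 0;                break;  /* go straight */
    }

    m.valid = true;
    return m;
}
```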
The main control MCU 6 determines the preset movement speed or rotation direction corresponding to a voice signal recognized by the voice recognition module 8. Keywords representing the movement speed, such as "stop", "accelerate" and "decelerate", and keywords representing the rotation direction, such as "turn left", "turn right" and "reverse", are entered into the voice recognition module 8 in advance. When external speech is received, the voice recognition module 8 compares it with the entered keywords, recognizes the voice signal as one of them, and transmits the result to the main control MCU 6. The main control MCU 6 then determines the movement speed or rotation direction represented by the voice signal, and the movement speed and rotation direction determined by voice recognition constitute the second movement signal.
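Continuing the illustrative sketch, a recognized keyword can be mapped to a second movement signal with a small lookup table; the keyword indices follow the hypothetical registration order shown earlier, and the concrete speed adjustments and angles are assumed values, not taken from the disclosure:

```c
#include <stdint.h>
#include <stdbool.h>

typedef struct {
    int8_t  speed_delta;  /* change of movement speed, illustrative units */
    int16_t turn_deg;     /* signed rotation angle, 0 = keep heading      */
    bool    stop;         /* true for the "stop" keyword                  */
} second_motion_t;

/* Index order matches the hypothetical keyword registration:
 * 0 "stop", 1 "accelerate", 2 "decelerate", 3 "turn left", 4 "turn right". */
static const second_motion_t kw_table[] = {
    {   0,   0, true  },   /* stop        */
    {  10,   0, false },   /* accelerate  */
    { -10,   0, false },   /* decelerate  */
    {   0, -15, false },   /* turn left   */
    {   0, +15, false },   /* turn right  */
};

/* Returns the second movement signal for a recognized keyword index,
 * or a do-nothing signal when the index is out of range. */
second_motion_t build_second_motion_signal(int16_t kw_index)
{
    second_motion_t none = {0, 0, false};
    if (kw_index < 0 ||
        kw_index >= (int16_t)(sizeof kw_table / sizeof kw_table[0]))
        return none;
    return kw_table[kw_index];
}
```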
The main control MCU 6 transmits the first movement signal or the second movement signal to the artificial limb control MCU 3 through the Bluetooth communication module, and the priority of the first movement signal is higher than that of the second movement signal. When the main control MCU 6 receives the first movement signal for the first time, it transmits the first movement signal to the artificial limb control MCU 3, and the artificial limb control MCU 3 controls the steering engine mechanism to drive the artificial limb support to start moving according to the first movement signal; in other words, the user uses brain waves to set the artificial limb system in motion.
Thereafter, when the main control MCU 6 judges that the first movement signal has changed compared with the previous first movement signal, it transmits the first movement signal to the artificial limb control MCU 3; when the main control MCU 6 judges that the first movement signal is unchanged compared with the previous first movement signal and the voice recognition module 8 recognizes a voice signal, it transmits the second movement signal corresponding to that voice signal to the artificial limb control MCU 3. The artificial limb control MCU 3 controls the steering engine mechanism 2 to drive the artificial limb support 1 to change the movement speed or adjust the movement direction according to the received first or second movement signal, and the artificial limb system continues to move.
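By way of illustration only, the decision rule described in the two paragraphs above (send the first movement signal when it is received for the first time or has changed, otherwise forward a recognized voice command as the second movement signal) could be sketched as follows; the change detection by field comparison and the helper functions are hypothetical:

```c
#include <stdint.h>
#include <stdbool.h>

/* Types and helpers from the earlier sketches (hypothetical). */
typedef struct { uint8_t speed; int16_t turn_deg; bool valid; } first_motion_t;
typedef struct { int8_t speed_delta; int16_t turn_deg; bool stop; } second_motion_t;

first_motion_t  build_first_motion_signal(void);
second_motion_t build_second_motion_signal(int16_t kw_index);
int16_t         ld3320_poll_result(void);                 /* -1 if no keyword */
void            bt_send_first(const first_motion_t *m);   /* over HC-05 link  */
void            bt_send_second(const second_motion_t *m);

void control_loop_step(void)
{
    static bool           started = false;
    static first_motion_t last;

    first_motion_t cur = build_first_motion_signal();

    bool changed = !started ||
                   cur.speed    != last.speed ||
                   cur.turn_deg != last.turn_deg;

    if (cur.valid && changed) {
        /* First movement signal is new or has changed: it takes priority. */
        bt_send_first(&cur);
        last = cur;
        started = true;
    } else if (started) {
        /* Otherwise forward a voice command, if one was recognized. */
        int16_t kw = ld3320_poll_result();
        if (kw >= 0) {
            second_motion_t v = build_second_motion_signal(kw);
            bt_send_second(&v);
        }
    }
}
```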
In this artificial limb system, the user's brain waves set the system in motion. During movement, the user can choose to keep controlling the system through brain waves or through voice. When there is noise around the user, the voice recognition module 8 may pick it up and generate a second movement signal that does not match the user's intention; in that case the user only needs to blink or adjust his or her attention, and because the first movement signal has a higher priority than the second movement signal, the artificial limb system continues to move according to the user's intention.
In this document, terms such as front, rear, upper, lower, etc. are defined with respect to the positions of the components in the drawings and with respect to each other, for clarity and convenience in expressing the technical solution. It should be understood that the use of such orientation terms should not limit the scope of the claimed application.
The embodiments described above and features of the embodiments herein may be combined with each other without conflict.
The foregoing description covers only preferred embodiments of the application and is not intended to limit the application to the precise form disclosed; any modification, equivalent replacement or improvement made within the spirit and scope of the application shall be included within the protection scope of the application.

Claims (3)

1. An intelligent artificial limb system based on brain electrical signal and voice signal common control, which is characterized in that: the artificial limb system comprises an artificial limb support, a main control unit, a Bluetooth communication module and a power supply for supplying power to the artificial limb system, wherein the artificial limb support is provided with a steering engine mechanism and an artificial limb control unit for controlling the steering engine mechanism, the main control unit and the artificial limb control unit are in wireless connection through the Bluetooth communication module, the main control unit comprises a brain wave acquisition device for acquiring brain waves of a user, a voice recognition module for recognizing user sentences and a main control MCU, the main control MCU analyzes the acquired brain wave signals to obtain concentration signals and blink signals, determines movement speed according to the concentration signals, determines rotation direction according to the blink signals, and forms a first movement signal with the movement speed; the main control MCU determines a preset movement speed or a rotation direction corresponding to the voice signal according to the recognized voice signal, the preset movement speed and the rotation direction form a second movement signal, the main control MCU receives a first movement signal for the first time, the first movement signal is transmitted to the artificial limb control unit, the artificial limb control unit controls the steering engine mechanism to drive the artificial limb support to start to move according to the first movement signal, then the main control MCU transmits the first movement signal to the artificial limb control unit when judging that the first movement signal is changed compared with the previous time, and the main control MCU transmits a second movement signal corresponding to the voice signal to the artificial limb control unit when judging that the first movement signal is not changed compared with the previous time and the voice recognition module recognizes the voice signal, and the artificial limb control unit controls the steering engine mechanism to drive the artificial limb support to continue to move according to the first movement signal or the second movement signal received by the main control MCU;
the concentration degree signal is concentration degree, and the movement speed and the concentration degree in the first movement signal are in a linear relation and are in positive correlation;
the blink signal is that the left eye blinks or the right eye blinks, when the left eye blinks, the rotation direction in the first motion signal rotates to a preset angle in one direction, and when the right eye blinks, the rotation direction in the first motion signal rotates to a preset angle in the other opposite direction;
the brain wave acquisition device is a TGAM brain wave detection chip.
2. The intelligent prosthetic system based on the joint control of an electroencephalogram signal and a voice signal as recited in claim 1, wherein: the voice recognition module is an LD3320 voice recognition chip, characters representing the movement speed or the rotation direction are recorded in advance by the LD3320 voice recognition chip, and when external voice is received, the characters are compared, and the voice signal is recognized as the same as the recorded characters.
3. The intelligent prosthetic system based on the joint control of an electroencephalogram signal and a voice signal as recited in claim 1, wherein: the Bluetooth communication module is an HC-05 Bluetooth module and comprises a Bluetooth transmitter and a Bluetooth receiver, wherein the Bluetooth transmitter is electrically connected with the main control MCU, and the Bluetooth receiver is electrically connected with the artificial limb control unit.
CN201810956123.0A 2018-08-21 2018-08-21 Intelligent artificial limb system based on joint control of electroencephalogram signals and voice signals Active CN109172067B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810956123.0A CN109172067B (en) 2018-08-21 2018-08-21 Intelligent artificial limb system based on joint control of electroencephalogram signals and voice signals

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810956123.0A CN109172067B (en) 2018-08-21 2018-08-21 Intelligent artificial limb system based on joint control of electroencephalogram signals and voice signals

Publications (2)

Publication Number Publication Date
CN109172067A CN109172067A (en) 2019-01-11
CN109172067B (en) 2023-08-29

Family

ID=64919394

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810956123.0A Active CN109172067B (en) 2018-08-21 2018-08-21 Intelligent artificial limb system based on joint control of electroencephalogram signals and voice signals

Country Status (1)

Country Link
CN (1) CN109172067B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113031766B (en) * 2021-03-15 2022-09-23 哈尔滨工业大学 Method for decoding Chinese pronunciation through electroencephalogram

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101947152A (en) * 2010-09-11 2011-01-19 山东科技大学 Electroencephalogram-voice control system and working method of humanoid artificial limb
CN101987047A (en) * 2009-08-03 2011-03-23 深圳先进技术研究院 Artificial limb control system and method based on voice and myoelectricity information identification
CN104825256A (en) * 2015-04-30 2015-08-12 南京信息工程大学 Artificial limb system with perception feedback function
CN105943207A (en) * 2016-06-24 2016-09-21 吉林大学 Intelligent artificial limb movement system based on idiodynamics and control methods thereof
CN108279620A (en) * 2018-04-10 2018-07-13 贵州大学 Bionic arm control device based on brain wave combination limb action and control method


Also Published As

Publication number Publication date
CN109172067A (en) 2019-01-11

Similar Documents

Publication Publication Date Title
Lund et al. Inductive tongue control of powered wheelchairs
US20130090931A1 (en) Multimodal communication system
EP1210927A3 (en) Massage chair
CN202096374U (en) Intelligent wheelchair based on eye electric signals and head movement signals
Ghorbel et al. A survey on different human-machine interactions used for controlling an electric wheelchair
CN106842623A (en) Electronics ophthalmic lens with alarm clock
CN204426918U (en) A kind of Intelligent bracelet for alleviating Parkinsonian's freezing of gait
EP1230904A3 (en) Massage machine
WO2001012108A8 (en) Medical implant apparatus with wireless energy transmission
CN110251372A (en) Walk-aiding exoskeleton gait adjusting method based on intelligent crutch
CN109172067B (en) Intelligent artificial limb system based on joint control of electroencephalogram signals and voice signals
US20190350482A1 (en) Electromyographic Controlled Vehicles and Chairs
US20160120664A1 (en) Breath and head tilt controlled prosthetic limb
CN106214163B (en) Recovered artifical psychological counseling device of low limbs deformity correction postoperative
CN112107397A (en) Myoelectric signal driven lower limb artificial limb continuous control system
CN106984028A (en) A kind of mechanism of upper extremity exercise control assessment and rehabilitation training
Lin et al. An FPGA-based brain-computer interface for wireless electric wheelchairs
Kumar et al. Design and development of head motion controlled wheelchair
Choi et al. Robust semi-synchronous bci controller for brain-actuated exoskeleton system
Huo et al. Wireless control of powered wheelchairs with tongue motion using tongue drive assistive technology
CN105769206B (en) A kind of gait phase method of discrimination based on upper and lower extremities movable information
CN211300970U (en) Exoskeleton rehabilitation robot control system
CN211610406U (en) Eye-moving type intelligent rehabilitation wheelchair
CN103110469A (en) Electronic knee-joint orthopedic device
Patil Design and making of head motion controlled wheel chair

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant