CN108098736A - Exoskeleton robot auxiliary device and method based on novel perception - Google Patents

Exoskeleton robot auxiliary device and method based on novel perception

Info

Publication number
CN108098736A
CN108098736A (application CN201611044772.0A)
Authority
CN
China
Prior art keywords
data
sensor assembly
right leg
auxiliary device
exoskeleton robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611044772.0A
Other languages
Chinese (zh)
Inventor
陈墩金
覃争鸣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Yingbo Intelligent Technology Co., Ltd.
Original Assignee
Guangzhou Yingbo Intelligent Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Yingbo Intelligent Technology Co., Ltd.
Priority to CN201611044772.0A priority Critical patent/CN108098736A/en
Publication of CN108098736A publication Critical patent/CN108098736A/en
Pending legal-status Critical Current

Links

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121: Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801: Arrangements of detecting, measuring or recording means specially adapted to be attached to or worn on the body surface
    • A61B5/6802: Sensor mounted on worn items
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271: Specific aspects of physiological measurement analysis
    • A61B5/7275: Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02: Sensing devices
    • B25J9/00: Programme-controlled manipulators
    • B25J9/0006: Exoskeletons, i.e. resembling a human figure

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Physiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Geometry (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The invention discloses an exoskeleton robot auxiliary device and method based on novel perception. The device includes: a left-leg sensor module, a right-leg sensor module, a data processing mainboard, a host computer and an exoskeleton controller. The method includes: S1, the left-leg and right-leg sensor modules acquire data simultaneously; S2, the knee and ankle joint angles of the left and right legs are calculated from the acquired data; S3, the joint rotation angles are predicted from the calculation results and the prediction data are transmitted; S4, the data sent from the left and right legs are integrated and time-synchronized; S5, the result is sent to the exoskeleton controller or the host computer. The scheme of the present invention implements a human gait perception system with attitude sensors, microcontrollers and a wireless communication module, predicts the motion of the lower-limb joints over the coming period of time, and thereby solves the problem of the exoskeleton being uncoordinated with the wearer's motion.

Description

Exoskeleton robot auxiliary device and method based on novel perception
Technical field
The invention belongs to the field of medical assistive robots and relates to an exoskeleton robot auxiliary device and method based on novel perception.
Background technology
Since the beginning of the 21st century, robot technology has matured continuously and found ever broader applications. Robots have developed from industrial robots to service robots and have gradually entered daily life, bringing us much convenience. As material living standards rise and cultural life grows richer, future robots will be linked ever more closely with humans.
A load-bearing lower-limb exoskeleton is a wearable robot that supports the human body with external force, reducing the load on the body and improving its sustained locomotion capability. It enhances individual mobility and has broad application prospects in individual equipment support and in medical assistance for the disabled.
The working mechanism of a load-bearing lower-limb exoskeleton is that a perception system captures the human motion state in real time, and a controller generates control signals to drive the mechanical skeleton to follow the human motion. However, a certain amount of time is required from the perception system capturing the human gait, through the output of the control signal, to the drive mechanism (usually a motor or hydraulics) moving the exoskeleton joint to the target trajectory, and during this time the human body has already moved to another state. The gait of the mechanical exoskeleton therefore lags behind the wearer's gait and disturbs the wearer's walking.
To solve this problem, the reference signal of the control system should lead the motion state of the human body, which requires capturing and predicting the human walking gait accurately and in real time.
The patent application with publication number CN103431929A discloses a "walking gait perception method and device for a strength-enhancing power exoskeleton". Its concrete steps are: use plantar pressure sensors, knee-joint encoders and gyroscopes to measure ground-contact information, joint angle changes and thigh and shank angular velocities; according to the rule of human-machine walking, divide walking into five gait sub-phases (both legs standing; left leg supporting, right leg swinging; both legs supporting with the right leg in front; right leg supporting, left leg moving; both legs supporting with the left leg in front); and use a machine learning algorithm to classify the measured information into the five gait sub-phases, providing a data-fusion method for recognizing walking gait sub-phases. That invention lets the hydraulic control system contract in advance and increases the human-machine walking speed, but because its reference information is limited it cannot rapidly predict the wearer's behavior, and during fast motion the problem of the mechanical exoskeleton gait lagging remains unsolved.
The patent application with publication number CN105150211A discloses a "control system for a load-bearing lower-limb exoskeleton robot". The system consists of two functionally identical subsystems; each subsystem is composed of a signal acquisition device, a microcontroller, a motion actuator and a control algorithm. The lower-limb exoskeleton robot provides power assistance for the wearer's lower limbs. The signal acquisition device consists of diaphragm pressure sensors, magnetic angle sensors, a multi-channel amplifier and multi-channel low-pass filters, and acquires the wearer's plantar pressure signals and the ankle joint angle signals of the lower-limb exoskeleton robot. The motion actuator consists of a servo driver group and a servo electric cylinder group, and the control algorithm is an end-point following control algorithm. That invention improves the robot's adaptability to the wearer's height, but it has no prediction algorithm, cannot predict the wearer's next action, and cannot solve the problem of the mechanical exoskeleton gait lagging.
The patent application with publication number CN105078708A discloses an "exoskeleton robot servo control device". The device includes an upper arm and a lower arm whose ends are connected by a rotatable joint. The upper or lower arm carries an active pressing block that can slide perpendicular to the axis and is strapped to the human body; two microswitches are arranged on the arm opposite the active block, a power unit driving the rotation is mounted between the two arms, and the two microswitches control the forward and reverse motion of the power unit respectively. That invention achieves real-time following of the human body by the exoskeleton with good coordination and synchronism, but it has no prediction algorithm and cannot predict the wearer's next action, so it cannot solve the problem of the mechanical exoskeleton disturbing the wearer's walking.
Content of the invention
The present invention aims to provide an exoskeleton robot auxiliary device and method based on novel perception that acquires and predicts the motion state of the key positions of the human lower limbs and provides reliable reference information for the exoskeleton controller, effectively solving the problem of the exoskeleton being uncoordinated with the wearer's motion and, at the same time, the problem of the exoskeleton gait lagging behind the wearer's gait.
In order to solve the above technical problems, the present invention adopts the following technical scheme: an exoskeleton robot auxiliary device and method based on novel perception, wherein the device includes: a left-leg sensor module, a right-leg sensor module, a data processing mainboard, a host computer and an exoskeleton controller; the left-leg sensor module and the right-leg sensor module are each connected to the data processing mainboard, and the data processing mainboard is connected to the host computer and to the exoskeleton controller.
Further, each of the left-leg and right-leg sensor modules consists of a thigh attitude sensor, a shank attitude sensor and a sole attitude sensor.
Further, the data processing mainboard consists of level-shifting chips, microcontrollers and a wireless module.
Further, the host computer and the exoskeleton controller are used to further process the received data after referring to and analyzing it.
The method includes: S1, the left-leg sensor module and the right-leg sensor module acquire data simultaneously; S2, the knee and ankle angle data of the left and right legs are calculated from the acquired data; S3, the joint rotation angles are predicted from the calculation results and the prediction data are transmitted; S4, the data sent from the left and right legs are integrated and time-synchronized; S5, the result is sent to the exoskeleton controller or the host computer.
Further, in step S3 the rotation angle of each joint is predicted using the Takens algorithm from nonlinear time-series analysis (Takens F. Detecting strange attractors in turbulence[J]. Dynamical Systems and Turbulence, 1981, 898: 366-381).
Compared with the prior art, the present invention has the following advantageous effects:
The scheme of the present invention uses attitude sensors mounted on the lower limbs of the exoskeleton wearer to acquire the rotational motion of the lower-limb joints, and predicts the motion of the lower-limb joints over the coming period of time with a specific algorithm. It can provide reliable reference information for the exoskeleton controller, solving the problem of the exoskeleton being uncoordinated with the wearer's motion and, at the same time, the problem of the exoskeleton gait lagging behind the wearer's gait.
Description of the drawings
Fig. 1 is a structural diagram of the novel perception system.
Fig. 2 shows the way the attitude sensors are worn on the human lower limbs.
Fig. 3 is a hardware block diagram of the interface of the sensors worn on the lower limbs.
Fig. 4 is a hardware block diagram of the data processing mainboard.
Fig. 5 is the processing flowchart for acquiring the sensor data.
Fig. 6 is the parsing flowchart for preprocessing the sensor data frames.
Fig. 7 is the flowchart for synchronizing and forwarding the left-leg and right-leg data.
Specific embodiment
The present invention is explained in further detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described here only explain the present invention and do not limit it.
Referring to Fig. 1, the exoskeleton robot auxiliary device and method based on novel perception of the present invention: the device includes a left-leg sensor module, a right-leg sensor module, a data processing mainboard, a host computer and an exoskeleton controller; the left-leg sensor module and the right-leg sensor module are each connected to the data processing mainboard, and the data processing mainboard is connected to the host computer and to the exoskeleton controller.
Each of the left-leg and right-leg sensor modules consists of a thigh attitude sensor, a shank attitude sensor and a sole attitude sensor. Referring to Fig. 2, the dots on the human lower limbs represent attitude sensors: one attitude sensor is worn in the same way on the thigh, the shank and the sole, so that the angle information of the knee and ankle joints can be obtained by a simple algebraic operation on the data of two adjacent sensors. Referring to Fig. 3, each attitude sensor outputs its signal using the RS-232 protocol; to communicate with the ARM microcontrollers on the data processing mainboard, the RS-232 levels must be converted to TTL levels. The chip MAX3387 is used for this, which provides three level-conversion channels. In the present invention each sensor sends one frame of sampled data to the data processing mainboard every 10 ms.
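The "simple algebraic operation" on two adjacent sensors can be sketched as follows. This is a minimal illustration assuming each attitude sensor reports a sagittal-plane pitch angle in degrees with a shared sign convention; the actual MTi-30 outputs (quaternions or Euler angles) and the exact formula used on the mainboard may differ.

```python
def joint_angles(thigh_pitch, shank_pitch, foot_pitch):
    """Estimate knee and ankle angles (degrees) from the pitch angles of
    the three attitude sensors worn on the thigh, shank and sole.

    With a shared sagittal-plane convention, each joint angle is simply
    the difference between the pitches of the two adjacent segments.
    """
    knee = thigh_pitch - shank_pitch
    ankle = shank_pitch - foot_pitch
    return knee, ankle

# Thigh flexed 30 degrees, shank 10 degrees, foot level:
knee, ankle = joint_angles(30.0, 10.0, 0.0)
print(knee, ankle)  # 20.0 10.0
```

Because only a subtraction per joint is needed, the calculation fits comfortably in the 10 ms sampling period of the sensors.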
The data processing mainboard consists of two level-shifting chips, three microcontrollers and one wireless module. Referring to Fig. 4, the MAX3387 is chosen as the level-shifting chip; it serves as the terminal of the communication between the microcontrollers and the attitude sensors and handles the conversion between TTL and RS-232 levels. The microcontrollers are STM32F407 units with ARM cores: microcontrollers 2# and 3# process the sensor data of the left foot and the right foot respectively and send the results over their SPI modules to microcontroller 1#, which integrates the data from the other two microcontrollers, achieves time synchronization, and sends the result through the wireless module to the exoskeleton controller or the host computer. The wireless module is an XBee-PRO 900HP.
The host computer and the exoskeleton controller further process the received data after referring to and analyzing it.
The perception method includes the following steps, referring to Fig. 5:
S1: the left-leg sensor module and the right-leg sensor module acquire data simultaneously; the program only proceeds to the next step once it detects that the sampled data of all three sensors have been received successfully.
S2: the knee and ankle angle data of the left and right legs are calculated from the acquired data. The MTi-30 attitude sensors encapsulate their data in frames of a specific format, so the software must first parse the data frame before the knee and ankle angles can be calculated. This method performs the data parsing in interrupt handlers; the parsing flow is shown in Fig. 6. The UART module of the STM32 microcontroller triggers one interrupt for each byte received, so one sensor data frame requires many interrupts before it is fully parsed.
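The byte-at-a-time parsing done in the UART interrupt can be pictured as a small state machine. The frame layout below (preamble, message id, payload length, payload, 8-bit checksum) is an illustrative assumption, not the exact MTi-30 format:

```python
class FrameParser:
    """Byte-at-a-time parser mimicking a UART RX interrupt handler.

    Assumed frame layout (illustrative, not the exact MTi-30 format):
    0xFA preamble | msg id | payload length | payload ... | checksum,
    where the checksum makes the byte sum from msg id onward 0 mod 256.
    """
    PREAMBLE = 0xFA

    def __init__(self):
        self.buf = []

    def feed(self, byte):
        """Called once per received byte; returns the payload when a
        complete, checksum-valid frame has been assembled, else None."""
        if not self.buf:
            if byte != self.PREAMBLE:
                return None                      # resynchronise on preamble
            self.buf.append(byte)
            return None
        self.buf.append(byte)
        if len(self.buf) >= 3:
            length = self.buf[2]
            if len(self.buf) == 3 + length + 1:  # full frame buffered
                frame, self.buf = self.buf, []
                if sum(frame[1:]) % 256 == 0:    # checksum passes
                    return bytes(frame[3:3 + length])
        return None

# Build a valid 2-byte frame and push it through one byte at a time.
payload = [0x12, 0x34]
body = [0x20, len(payload)] + payload            # msg id 0x20 (hypothetical)
frame = [0xFA] + body + [(-sum(body)) % 256]
parser = FrameParser()
results = [parser.feed(b) for b in frame]
print(results[-1])  # b'\x124' (the two payload bytes 0x12 0x34)
```

In the firmware, each call to `feed` corresponds to one UART receive interrupt, so a frame needs as many interrupts as it has bytes, exactly as the text describes.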
S3: the joint rotation angles are predicted from the calculation results and the prediction data are transmitted. The rotation angles of the joints can be predicted with the Takens algorithm from nonlinear time-series analysis.
According to the Takens embedding theorem, for a given time series y(t) ∈ R, 0 < t < n, choosing a suitable delay time h and embedding dimension p yields the delay vector
D(t) = [y(t), y(t-h), ..., y(t-h(p-1))]^T    (1)
Taking phase-space reconstruction and the Takens embedding theorem as the theoretical foundation and mathematical tool, the data prediction algorithm implemented here proceeds as follows:
S31: at each sampling instant t with t ≥ hp, form the delay vector D(t);
S32: compute the Euclidean distance δ(i) = ||D(t) - D(i)|| between the current D(t) and every previously observed D(i), hp < i < t;
S33: find the M delay vectors that best match D(t), i.e. the M delay vectors D(i_1), ..., D(i_M) with the smallest Euclidean distance to the current D(t);
S34: compute the normalizing parameter N from the distances δ(i_1), ..., δ(i_M) of the M matches;
S35: compute the weight factors w_j, each a decreasing function of δ(i_j) normalized by N so that the w_j sum to 1;
S36: compute the k-step prediction ŷ(t+k) as the weighted sum of the continuations of the matched vectors, i.e. ŷ(t+k) = Σ_{j=1..M} w_j · y(i_j + k).
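Steps S31-S36 amount to a local nearest-neighbour predictor in the reconstructed phase space and can be sketched as follows. The exponential distance weighting used for N and w_j is a common choice assumed here for illustration; the patent's exact weighting formulas are not reproduced in this text.

```python
import math

def takens_predict(y, h, p, M, k):
    """Predict y[t+k]: embed the series with delay h and dimension p,
    find the M historical delay vectors closest to the current one,
    weight them by exp(-distance), and average their k-step-ahead
    continuations (steps S31-S36)."""
    t = len(y) - 1
    current = [y[t - h * j] for j in range(p)]            # S31: D(t)
    candidates = []
    for i in range(h * p, t - k + 1):                     # S32: distances delta(i)
        d_i = [y[i - h * j] for j in range(p)]
        candidates.append((math.dist(current, d_i), i))
    candidates.sort()                                     # S33: M best matches
    neighbours = candidates[:M]
    n = sum(math.exp(-d) for d, _ in neighbours)          # S34: parameter N
    weights = [math.exp(-d) / n for d, _ in neighbours]   # S35: weights w_j
    return sum(w * y[i + k]                               # S36: k-step prediction
               for w, (_, i) in zip(weights, neighbours))

# On a noiseless periodic series the prediction tracks the true signal:
y = [math.sin(0.2 * n) for n in range(200)]
pred = takens_predict(y, h=2, p=4, M=5, k=3)  # approximates sin(0.2 * 202)
```

On the mainboard this runs once per 10 ms sample, with y holding the recent joint-angle history computed in S2.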
As the algorithm flow shows, the parameters k, h, p and M must be determined. Since the Takens embedding theorem offers no theoretical guidance on the choice of the phase-space reconstruction parameters, they are selected as follows: the prediction order k is first fixed; then p ∈ [1, 20], h ∈ [1, 20] and M ∈ [1, 50] are swept in a MATLAB simulation on measured data, and the parameter set with the highest prediction accuracy is used in the implementation. The accuracy rate PR is defined from the prediction error as in formula (3): when the prediction error is 0 the accuracy rate is 100%, and when the prediction fails PR becomes negative.
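The parameter search described above (fix k, sweep p, h and M, keep the most accurate combination) can be sketched in Python instead of MATLAB. The ranges come from the text; the mean-absolute-error score below is only an illustrative stand-in for the accuracy rate PR of formula (3), and `predict` is any k-step predictor with the signature shown:

```python
def grid_search(y, k, p_range, h_range, m_range, predict):
    """Sweep (p, h, M) on a recorded series y and return
    (score, p, h, M) for the combination with the smallest mean
    absolute k-step prediction error over the tail of the series."""
    best = None
    for p in p_range:
        for h in h_range:
            for m in m_range:
                warmup = h * p + m + k           # history needed before predicting
                if warmup + 20 > len(y):
                    continue                     # series too short for these values
                errors = []
                for t in range(warmup, len(y) - k):
                    pred = predict(y[:t + 1], h, p, m, k)
                    errors.append(abs(pred - y[t + k]))
                score = sum(errors) / len(errors)
                if best is None or score < best[0]:
                    best = (score, p, h, m)
    return best

# e.g. best = grid_search(measured_angles, k=3, p_range=range(1, 21),
#                         h_range=range(1, 21), m_range=range(1, 51),
#                         predict=some_takens_predictor)
```

The exhaustive sweep is done offline on recorded data; only the winning (p, h, M) values are programmed into the microcontroller.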
The prediction order k is determined mainly by the response speed of the actuators of the exoskeleton system; the prediction order k in the microcontroller can be set by the program so as to suit different drive mechanisms.
S4: the data sent from the left and right legs are integrated and time-synchronized. Referring to Fig. 7, after microcontrollers 2# and 3# send the predicted values and other data to microcontroller 1# over the SPI modules, microcontroller 1# likewise first parses the data frames and then integrates the data, achieving time synchronization.
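The integration in S4 can be pictured as pairing each left-leg sample with the right-leg sample nearest in time. A minimal sketch, assuming each microcontroller stamps its samples with a millisecond counter (the real 1# firmware operates on SPI data frames rather than Python lists):

```python
def synchronize(left, right, max_skew_ms=5):
    """Pair (timestamp_ms, data) samples from the left and right legs,
    keeping only pairs whose timestamps differ by at most max_skew_ms.
    Both lists are assumed sorted by timestamp (10 ms sampling)."""
    merged, j = [], 0
    for t_l, d_l in left:
        # advance the right-leg cursor to the sample nearest t_l
        while j + 1 < len(right) and abs(right[j + 1][0] - t_l) <= abs(right[j][0] - t_l):
            j += 1
        t_r, d_r = right[j]
        if abs(t_r - t_l) <= max_skew_ms:
            merged.append((t_l, d_l, d_r))
    return merged

left = [(0, 'L0'), (10, 'L1'), (20, 'L2')]
right = [(2, 'R0'), (11, 'R1'), (35, 'R2')]
print(synchronize(left, right))  # [(0, 'L0', 'R0'), (10, 'L1', 'R1')]
```

Discarding pairs beyond the skew threshold keeps the merged stream aligned at the 10 ms sampling rate even if one leg's frame is lost.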
S5: microcontroller 1# sends the final processing result through the wireless module to the exoskeleton controller or the host computer.
The above are merely preferred embodiments of the present invention and are not intended to limit it; those skilled in the art may make various modifications and variations. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.

Claims (7)

1. An exoskeleton robot auxiliary device based on novel perception, characterized in that the device includes: a left-leg sensor module, a right-leg sensor module, a data processing mainboard, a host computer and an exoskeleton controller; wherein the left-leg sensor module and the right-leg sensor module are each connected to the data processing mainboard, and the data processing mainboard is connected to the host computer and to the exoskeleton controller.
2. The exoskeleton robot auxiliary device based on novel perception according to claim 1, characterized in that the left-leg sensor module consists of a thigh attitude sensor, a shank attitude sensor and a sole attitude sensor.
3. The exoskeleton robot auxiliary device based on novel perception according to claim 1, characterized in that the right-leg sensor module consists of a thigh attitude sensor, a shank attitude sensor and a sole attitude sensor.
4. The exoskeleton robot auxiliary device based on novel perception according to claim 1, characterized in that the data processing mainboard consists of level-shifting chips, microcontrollers and a wireless module.
5. The exoskeleton robot auxiliary device based on novel perception according to claim 1, characterized in that the host computer further processes the received data after referring to and analyzing it.
6. The exoskeleton robot auxiliary device based on novel perception according to claim 1, characterized in that the exoskeleton controller further processes the received data after referring to and analyzing it.
7. A perception method implemented with the exoskeleton robot auxiliary device based on novel perception according to claim 1, characterized in that the method includes: S1, the left-leg sensor module and the right-leg sensor module acquire data simultaneously; S2, the knee and ankle angle data of the left and right legs are calculated from the acquired data; S3, the joint rotation angles are predicted from the calculation results and the prediction data are transmitted; S4, the data sent from the left and right legs are integrated and time-synchronized; S5, the result is sent to the exoskeleton controller or the host computer.
CN201611044772.0A 2016-11-24 2016-11-24 Exoskeleton robot auxiliary device and method based on novel perception Pending CN108098736A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611044772.0A CN108098736A (en) 2016-11-24 2016-11-24 Exoskeleton robot auxiliary device and method based on novel perception

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611044772.0A CN108098736A (en) 2016-11-24 2016-11-24 Exoskeleton robot auxiliary device and method based on novel perception

Publications (1)

Publication Number Publication Date
CN108098736A true CN108098736A (en) 2018-06-01

Family

ID=62204883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611044772.0A Pending CN108098736A (en) Exoskeleton robot auxiliary device and method based on novel perception

Country Status (1)

Country Link
CN (1) CN108098736A (en)


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109814432A (en) * 2018-12-18 2019-05-28 航天时代电子技术股份有限公司 A kind of the communication frame generating method and communication means of human body servo antrol
CN109814432B (en) * 2018-12-18 2020-08-21 航天时代电子技术股份有限公司 Human body follow-up control communication frame generation method and communication method
CN109793645A (en) * 2019-01-21 2019-05-24 徐州医科大学附属医院 A kind of auxiliary patient Parkinson gait rehabilitation training device
CN109793645B (en) * 2019-01-21 2021-07-13 徐州医科大学附属医院 Supplementary recovered trainer of parkinsonism people gait
CN110103215A (en) * 2019-04-12 2019-08-09 上海中研久弋科技有限公司 Multi-node collaborative perceives ectoskeleton Neural control system, method, equipment and medium
CN110193830A (en) * 2019-05-24 2019-09-03 上海大学 Ankle-joint gait prediction technique based on RBF neural
CN110193830B (en) * 2019-05-24 2022-10-11 上海大学 Ankle joint gait prediction method based on RBF neural network
CN110561391A (en) * 2019-09-24 2019-12-13 中国船舶重工集团公司第七0七研究所 Inertia information feedforward control device and method for lower limb exoskeleton system
CN111714129A (en) * 2020-05-07 2020-09-29 广西科技大学 Human gait information acquisition system
CN112704491A (en) * 2020-12-28 2021-04-27 华南理工大学 Lower limb gait prediction method based on attitude sensor and dynamic capture template data

Similar Documents

Publication Publication Date Title
CN108098736A (en) A kind of exoskeleton robot auxiliary device and method based on new perception
CN109953761B (en) Lower limb rehabilitation robot movement intention reasoning method
CN111557828B (en) Active stroke lower limb rehabilitation robot control method based on healthy side coupling
CN110916679B (en) Human body lower limb pose gait detection device and method
CN103722550B (en) The embedded system of exoskeleton robot
CN111659006B (en) Gait acquisition and neuromuscular electrical stimulation system based on multi-sensing fusion
CN111506189B (en) Motion mode prediction and switching control method for complex motion of human body
CN108245164B (en) Human body gait information acquisition and calculation method for wearable inertial device
CN108836757A (en) A kind of assisted walk exoskeleton robot system with self-regulation
CN110537921A (en) Portable gait multi-sensing data acquisition system
CN111568700A (en) Gait control method, device and equipment for lower limb wearable robot
Cimolato et al. Hybrid machine learning-neuromusculoskeletal modeling for control of lower limb prosthetics
KR20150066185A (en) Method for transmitting and reconstructing data, system for transmitting and reconstructing data, method for converting an original signal, system for converting an original signal and method for reconstructing the original signal
CN112560594B (en) Human gait recognition method of flexible exoskeleton system
CN111728827A (en) Power lower limb exoskeleton control method, device and system
CN105232053B (en) A kind of model of human ankle plantar flexion phase detection and method
CN108175413A (en) A kind of body gait sensory perceptual system of exoskeleton robot
CN109901433A (en) A kind of auxiliary device and method based on robot perception
WO2021213214A1 (en) Motion instruction triggering method and apparatus, and exoskeleton device
KR102427048B1 (en) Apparatus and method for predicting motion intention of a user wearing a shoulder-worn exoskeletion device
CN113910206A (en) Exoskeleton assistance system combined with multiple sensors and assistance detection method thereof
CN111481197B (en) A living-machine multimode information acquisition fuses device for man-machine natural interaction
Bae et al. Real-time estimation of lower extremity joint torques in normal gait
CN113041092A (en) Remote rehabilitation training system and method based on multi-sensor information fusion
CN105287063A (en) Posture measuring system for external-skeleton follow-up control and use method thereof

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180601