CN110125909A - Multi-information fusion human body exoskeleton robot control protection system - Google Patents

Multi-information fusion human body exoskeleton robot control protection system Download PDF

Info

Publication number
CN110125909A
Authority
CN
China
Prior art keywords
cosθ
human body
sinθ
signal
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910427971.7A
Other languages
Chinese (zh)
Other versions
CN110125909B (en)
Inventor
钱伟行
吴文宣
蒲文浩
张彤彤
赵泽宇
刘志林
顾雅婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhenjiang Institute For Innovation And Development Of Nanjing Normal University
Original Assignee
Zhenjiang Institute For Innovation And Development Of Nanjing Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhenjiang Institute For Innovation And Development Of Nanjing Normal University filed Critical Zhenjiang Institute For Innovation And Development Of Nanjing Normal University
Priority to CN201910427971.7A priority Critical patent/CN110125909B/en
Publication of CN110125909A publication Critical patent/CN110125909A/en
Application granted granted Critical
Publication of CN110125909B publication Critical patent/CN110125909B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816 Measuring devices for examining respiratory frequency
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/389 Electromyography [EMG]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0006 Exoskeletons, i.e. resembling a human figure
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion

Abstract

The invention discloses a multi-information fusion control and protection system for a human body exoskeleton robot. The system comprises several inertial measurement units and electromyography (EMG) sensors mounted on the limbs of the human body, a respiration sensor mounted on the chest, a depth camera worn on the head, and a machine-learning processing computer mounted at any convenient position on the body. The valid data acquired by each sensor at a given moment serve as the input of an LSTM neural network trained with the BPTT algorithm. The sum of the time consumed by the computer to run the machine-learning algorithm and the maximum delay with which the servos at each position of the robot receive a control signal is taken as the time interval, and the control signal characterizing the motion intention of the human body at the moment after this interval serves as the output. Through neural-network training, a nonlinear function mapping that is difficult to express analytically in engineering is established, so that the motion intention of the human body at the next moment is judged from the motion at the current moment and the corresponding control signal is output.

Description

Multi-information fusion human body exoskeleton robot control protection system
Technical field
The invention belongs to the field of robot technology and relates to a control and protection system for a human body exoskeleton robot.
Background technique
Movement disorders such as hemiplegia caused by brain injury place a heavy burden on patients' families and on society. Correct, scientific rehabilitation training plays a very important role in the recovery and improvement of limb motor function. Timely and effective rehabilitation training has become an urgent need of hemiplegic and paraplegic patients, yet backward rehabilitation equipment remains the main obstacle to rehabilitation. Combining rehabilitation medicine with robot technology improves the effectiveness of rehabilitation training, guarantees the intensity of motion training, and opens a new approach for the study of new rehabilitation techniques. However, how to recognize the different motion intentions of the human body so that different measures can be taken, and how to prevent the human body from injuring itself or surrounding personnel through aggressive actions arising from dangerous motion intentions, remain problems that medical rehabilitation robot technology urgently needs to solve.
Summary of the invention
To solve the technical problems raised in the background above, the present invention proposes a multi-information fusion control and protection system for a human body exoskeleton robot. The system flexibly employs a variety of sensing means: it acquires data from inertial measurement units and EMG sensors mounted on the limbs and from a depth camera worn on the head to build a machine-learning model, combines these with a respiration sensor to recognize the different motion intentions of the human body, and then controls the motion of the exoskeleton robot, improving the reliability and safety of the exoskeleton robot system.
In order to achieve the above technical purposes, the invention provides the following technical scheme:
The present invention proposes a multi-information fusion human body exoskeleton robot control protection system. The system comprises several inertial measurement units and EMG sensors mounted on the limbs of the human body, several respiration sensors mounted on the chest, a depth camera worn on the head whose field of view coincides with the human field of view, and a machine-learning processing computer mounted at any position on the body. The inertial measurement units, EMG sensors, respiration sensors and depth camera are each connected to the machine-learning processing computer by wired or wireless means. After the machine-learning processing computer recognizes the different types of motion intention, the exoskeleton robot carries out the corresponding control and protection according to the actual situation.
Further, in the multi-information fusion human body exoskeleton robot control protection system proposed by the invention, the joint attitude angles and angular rates acquired by the inertial measurement units at each body position at a given moment, the surface EMG signals acquired by the EMG sensors, the respiratory rate and respiration amplitude acquired by the respiration sensors, and the object position and depth information acquired by the depth camera are transmitted to the machine-learning processing computer as the input of a long short-term memory neural network LSTM based on back-propagation through time BPTT. The sum of the time consumed by the machine-learning algorithm and the maximum delay with which the servos at each position of the robot receive a control signal is taken as the time interval, and the control signal characterizing the motion intention of the human body at the moment after this interval is taken as the output of the LSTM network. The nonlinear function mapping between input and output is obtained by training, thereby realizing predictive control of the exoskeleton robot.
Further, in the multi-information fusion human body exoskeleton robot control protection system proposed by the invention, the LSTM network based on back-propagation through time BPTT is trained to obtain the nonlinear function mapping between input and output, specifically as follows:

Let x be the input, s the hidden-layer state and o the output; x_t is the input of the recurrent network at time t, s_t the hidden-layer state at time t, y_t the label at time t and z_t the summed input of the output layer. U is the weight from the input layer to the hidden layer, W the weight from the hidden state of the previous moment to the current one, and V the weight from the hidden layer to the output layer. After unrolling in time, the predicted output ŷ_t replaces o, and

s_t = tanh(U·x_t + W·s_{t-1})

ŷ_t = softmax(V·s_t)

The cross entropy E is used as the loss function:

E_t(y_t, ŷ_t) = −y_t · log ŷ_t,  E = Σ_t E_t(y_t, ŷ_t)

The gradients in back-propagation are computed with the chain rule. For the output E_t of the network,

z_t = V·s_t

s_t = tanh(U·x_t + W·s_{t-1})

so the gradient of V can be obtained:

∂E_t/∂V = (∂E_t/∂ŷ_t)(∂ŷ_t/∂z_t)(∂z_t/∂V) = (ŷ_t − y_t) ⊗ s_t

Since s_t depends on W both directly and through s_{t-1}, the derivative of s_t with respect to W is a branched derivative:

∂E_t/∂W = Σ_{k=0}^{t} (∂E_t/∂ŷ_t)(∂ŷ_t/∂s_t)(∂s_t/∂s_k)(∂s_k/∂W)

and the derivative with respect to U is likewise a branched derivative:

∂E_t/∂U = Σ_{k=0}^{t} (∂E_t/∂ŷ_t)(∂ŷ_t/∂s_t)(∂s_t/∂s_k)(∂s_k/∂U)
Further, in the multi-information fusion human body exoskeleton robot control protection system proposed by the invention, when differentiating s_t with respect to W, if no restriction is imposed, all the states from time t back to time 0 must be traced; in practice the back-propagation can be truncated according to the scenario and the required accuracy.
Further, in the multi-information fusion human body exoskeleton robot control protection system proposed by the invention, the stage before building the machine-learning algorithm model is the information acquisition and processing stage, whose specific steps are as follows:

Step 1: under different human motion states, synchronously acquire the inertial measurement unit, EMG sensor, respiration sensor and depth camera information at identical or different frequencies;

Step 2: pre-process the surface EMG signals acquired by the EMG sensors, including signal validity detection, signal de-noising and active-segment enhancement, with a reasonable activation threshold;

Step 3: perform signal validity detection on the inertial measurement units arranged on the limbs, then perform real-time random-error modeling and correction on the valid signals to guarantee the accuracy of signal acquisition;

Step 4: perform signal validity detection on the respiration sensors arranged at suitable positions such as the chest, then perform real-time random-error modeling and correction on the valid signals to guarantee the accuracy of signal acquisition;

Step 5: on the premise that the information from the inertial measurement units, EMG sensors and respiration sensors on the limbs is valid, initialize the exoskeleton robot system; otherwise restart the components and return to Step 2;

Step 6: perform attitude, velocity and position solutions for the exoskeleton robot system.
Further, in the multi-information fusion human body exoskeleton robot control protection system proposed by the invention, after different types of motion intention are recognized, the exoskeleton robot carries out the corresponding control and protection according to the actual situation. Specifically, when the motion intention recognized by the sensors is a normal motion intention and the human body has started but not yet completed a certain movement, the exoskeleton motion-control system assists the body to complete the movement on the premise of keeping the body's center of gravity stable, so that the person achieves the intended motion without being injured. If the motion intention is judged to be an aggressive action that would injure the wearer or surrounding personnel, the exoskeleton motion-control system applies feedback control to the motion velocity and acceleration of the exoskeleton robot and establishes the parametric equations of each robot component, including displacement and acceleration parameters.
Further, in the multi-information fusion human body exoskeleton robot control protection system proposed by the invention, the parametric equations of each robot component are established as follows. Let the distance from the shoulder servo to the elbow servo of the mechanical arm be L1, the elbow-to-wrist distance L2 and the wrist-to-palm distance L3, and let the actuation ranges of L1, L2, L3 be Φ1, Φ2, Φ3. With x, y, z forming the robot base coordinate system, the coordinates of the arm end effector are expressed by the following equations:

X = L1·cosθ11·cosθ12 + L2·cosθ21·cosθ22 + L3·cosθ31·cosθ32

Y = L1·cosθ11·sinθ12 + L2·cosθ21·sinθ22 + L3·cosθ31·sinθ32

Z = L1·sinθ11 + L2·sinθ21 + L3·sinθ31

where θ11 is the angle between L1 and the plane x1y1z1; θ12 is the angle between the projection of L1 onto the plane x1y1z1 and x1; θ21 is the angle between L2 and the plane x2y2z2; θ22 is the angle between the projection of L2 onto the plane x2y2z2 and x2; θ31 is the angle between L3 and the plane x3y3z3; θ32 is the angle between the projection of L3 onto the plane x3y3z3 and x3;
The velocity is obtained by differentiating the end-effector position with respect to time; the equations are as follows:

Ẋ = −L1·ω11·sinθ11·cosθ12 − L1·ω12·cosθ11·sinθ12 − L2·ω21·sinθ21·cosθ22 − L2·ω22·cosθ21·sinθ22 − L3·ω31·sinθ31·cosθ32 − L3·ω32·cosθ31·sinθ32

Ẏ = −L1·ω11·sinθ11·sinθ12 + L1·ω12·cosθ11·cosθ12 − L2·ω21·sinθ21·sinθ22 + L2·ω22·cosθ21·cosθ22 − L3·ω31·sinθ31·sinθ32 + L3·ω32·cosθ31·cosθ32

Ż = L1·ω11·cosθ11 + L2·ω21·cosθ21 + L3·ω31·cosθ31
Through the above equations, the relationship between the velocity in the robot base coordinate system and the angular velocity of each joint, and the relationship between the hand's contact force with the environment and the corresponding joints, are established, which helps the system resolve the concrete motion of the mechanical arm and allows the exoskeleton robot to carry out the corresponding control and protection according to the actual situation.
Compared with the prior art, the above technical solution of the invention has the following technical effects:

The invention introduces surface EMG signals into the system and combines them with the patient's motion intention. Human EMG signals contain a large amount of physiological information related to the state of human motion; they reflect the combination and decomposition of motion patterns and express the motion intention of the limbs, while the visual sensor assists control and guarantees the accuracy of motion control. Meanwhile, differences in the respiratory-rate signal acquired by the respiration sensor make it possible to recognize different human motion intentions, so that different counter-measures are taken for different intentions, effectively avoiding aggressive actions of the human body and enhancing the safety and reliability of the exoskeleton robot. The invention fuses multiple sensors to recognize human motion intention and, using machine-learning methods, judges the motion of the next moment from the movement of the limbs at the current moment, controls the robot to carry out that motion, and at the same time protects the human body from injury caused by aggressive actions, effectively improving the safety of human-robot interaction.
Detailed description of the invention
Fig. 1 is a structural block diagram of the system of the present invention.
Fig. 2 shows the installation positions of the sensors of the system of the present invention.
Fig. 3 is a block diagram of the machine-learning structure of the system of the present invention.
Fig. 4 is a diagram of the neural-network structure of the system of the present invention.
Fig. 5 is a diagram of the neural network of the system of the present invention after unrolling in time.
Fig. 6 shows the machine-learning algorithm model of the system of the present invention.
Fig. 7 is a block diagram of the decision logic for starting the protection mechanism of the system of the present invention.
Specific embodiment
The technical solution provided by the invention is described in detail below with reference to specific embodiments. It should be understood that the following specific embodiments only illustrate the invention and do not limit its scope.
The present invention combines inertial components, physiological-signal sensors and computer vision to recognize human motion intention and then control the motion of the exoskeleton robot. Inertial measurement units, physiological-signal sensors and a depth camera are installed at the corresponding positions of the human body; based on the signals of the different human motion states, the output model of the exoskeleton robot system is built through analysis of human kinematics and a machine-learning algorithm. Exploiting the bidirectional nature of the exoskeleton robot system, the decision-making ability of the operator is combined with the task-executing ability of the robot, the different motion intentions of the human body are recognized, coordinated motion between the human body and the exoskeleton robot system is achieved, and the human body is prevented from injuring itself or surrounding personnel through aggressive actions arising from dangerous motion intentions.
As shown in Fig. 1, the invention proposes a multi-information fusion human body exoskeleton robot control protection system. As shown in Fig. 2, the EMG sensors and inertial measurement units are mounted at various positions of the human body. Specifically, the EMG sensors are attached to the skin surface at each body position; low-precision sensors can be used in practical applications, and the installation positions may be chosen over muscle groups such as the biceps and triceps of the upper arm. The inertial measurement units can be mounted on the four limbs; low-precision inertial measurement units, such as NV-IMU200 devices, can be used in practice. The respiration sensor can be a contact sensor such as an HXB-1 respiration sensor placed at a suitable position such as the chest, or a contactless scheme such as a capacitive sensor or a continuous-wave radar. The depth camera can be worn at a position, such as the head, where the machine field of view coincides with the human field of view. The machine-learning processing computer can be mounted at any position on the body and is connected to the sensor components by cable, or the data transmission is completed by wireless communication.
The specific steps of the multi-information fusion human body exoskeleton robot control protection method based on the above system are as follows.
1. Acquisition and pre-processing of EMG signals:

The surface EMG signal is the temporal and spatial summation, at the skin surface, of the electrical activity of the muscles. Here the EMG signals are acquired in a wired manner: the signals are collected through electrode patches attached to the skin surface, and the electrodes are connected to the acquisition equipment and to control equipment such as the computer. Pre-processing includes signal validity detection, signal de-noising, active-segment enhancement and the reasonable setting of an activation threshold.

Because the contact impedance between the surface electrodes and the skin is affected by many factors such as contact tightness, skin cleanliness, humidity and seasonal variation, great attention must be paid to the common-mode rejection ratio when designing the circuit. In practice, an EMG acquisition circuit with high gain, high input impedance, high common-mode rejection ratio and low power consumption can be used, for example a pre-amplification circuit built around a parallel differential three-op-amp instrumentation amplifier such as the INA128PA. Since electrode movement, 50 Hz mains interference and similar disturbances all introduce noise, filters can be used, for example a voltage-controlled voltage-source second-order low-pass filter.
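As an illustrative sketch only (the 1 kHz sampling rate, 20-450 Hz passband, envelope window and threshold factor are assumptions, not values stated in the patent), the following Python code shows one way the described pre-processing chain of band-pass filtering, 50 Hz notch filtering and active-segment detection could be realized:

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

def preprocess_emg(raw, fs=1000.0):
    """Band-pass, 50 Hz notch, rectify and envelope a raw surface EMG channel."""
    b, a = butter(4, [20.0 / (fs / 2), 450.0 / (fs / 2)], btype="band")
    emg = filtfilt(b, a, raw)                      # keep the 20-450 Hz EMG band
    bn, an = iirnotch(50.0, Q=30.0, fs=fs)
    emg = filtfilt(bn, an, emg)                    # suppress 50 Hz mains interference
    envelope = np.convolve(np.abs(emg), np.ones(100) / 100, mode="same")  # ~100 ms envelope
    return emg, envelope

def active_segments(envelope, rest_window=500, k=3.0):
    """Mark samples whose envelope exceeds mean + k*std of an initial rest window."""
    threshold = envelope[:rest_window].mean() + k * envelope[:rest_window].std()
    return envelope > threshold
```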
2. Signal acquisition from the inertial measurement units on the limbs:

The output signals of the gyroscopes and piezoelectric accelerometers in the inertial sensing assemblies on the limbs are acquired to obtain the angular velocity and acceleration in the three-dimensional coordinate system.
3. Signal acquisition from the respiration sensor:

By installing a respiration sensor at a suitable position such as the chest, the respiratory rate and respiration amplitude of the human body are obtained.
4. Image acquisition and processing by the depth camera:

The position and depth of objects are obtained from the images of the camera whose field of view coincides with the human field of view; through image processing and interaction with the other sensors, a reference is provided for the next movement.
5. Validity detection for the inertial sensors and the respiration sensor, and real-time modeling and correction of random errors:

A self-tuning regulator is used to estimate the system or controller parameters on line; by identifying the system parameters in real time, the regulator adapts automatically to changes in the environment so that the closed-loop control system reaches the desired performance indices. Specifically, a linear difference equation expressing the input-output relationship (possibly including a disturbance term), known as a controlled auto-regressive moving-average model, can be used as the mathematical prediction model of the system, with the model parameters estimated on line by recursive least squares to obtain directly a minimum-output-variance self-tuning regulator. Taking real-time control of the exoskeleton robot as an example, the current value of the position described by the model can be regarded as a weighted sum of a finite number of its past values superimposed on a weighted sum of a finite number of current and past disturbances. If a model can be constructed whose AIC criterion value is minimal, then that model is the best model. The orders of the ARMA model of a position sequence are usually fairly low: when building the mathematical model, an upper limit on the model order is set first, the orders n and m of the two sub-models are chosen starting from some small values and then increased step by step, the parameters and residual variance of each constructed ARMA model are estimated and the corresponding AIC value is computed, and finally the model order and parameters that minimize the AIC criterion are chosen as the best-fitting exoskeleton robot model.
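For illustration only, the following sketch selects the ARMA order of a position sequence by minimizing the AIC, in the spirit of the order-selection procedure described above; the order upper bounds and the use of statsmodels are assumptions, since the patent does not prescribe a particular toolkit:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def best_arma_order(position_sequence, max_p=4, max_q=4):
    """Fit ARMA(p, q) models of increasing order and keep the one with minimal AIC."""
    best = (None, np.inf, None)
    for p in range(1, max_p + 1):
        for q in range(0, max_q + 1):
            try:
                fit = ARIMA(position_sequence, order=(p, 0, q)).fit()
            except Exception:
                continue                      # skip orders that fail to converge
            if fit.aic < best[1]:
                best = ((p, q), fit.aic, fit)
    return best  # ((p, q), minimal AIC, fitted model)
```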
6. Initialization of the exoskeleton robot system:

On the premise that the information from the inertial measurement units, EMG sensors and respiration sensors on the limbs is valid, the exoskeleton robot system is initialized: after the system starts, in the static state, the roll, pitch and heading angles of each component are obtained in the computer by horizontal self-alignment using the accelerometer data and are made to coincide with the angles of the specified reference frame; otherwise the components are restarted and the procedure returns to step 2.
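A minimal sketch of the static leveling step, assuming the accelerometer axes follow the usual right-handed body-frame convention (this convention and the function name are illustrative, not taken from the patent):

```python
import numpy as np

def static_level_angles(acc_samples):
    """Estimate roll and pitch from averaged static accelerometer readings.

    acc_samples: (N, 3) array of specific-force measurements while the wearer is still.
    """
    ax, ay, az = acc_samples.mean(axis=0)
    roll = np.arctan2(ay, az)                        # rotation about the forward axis
    pitch = np.arctan2(-ax, np.sqrt(ay**2 + az**2))  # rotation about the lateral axis
    return roll, pitch  # heading additionally needs a magnetometer or gyro-based alignment
```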
7. Pose and velocity solution of the exoskeleton robot system:

Because the output is the pose and velocity of the robot relative to inertial space, the data obtained by the gyroscopes and accelerometers must be transformed into the specified reference frame by the solution.

(1) Pose solution:

The pose solution involves transformations from one coordinate system to another; the mappings between coordinate systems can be divided into translational, rotational and general mappings. The solution for the general mapping is given here. Let the inertial space reference frame be coordinate system {B} and the specified reference frame be {A}. In the most general case the origins of {B} and {A} do not coincide and there is a vector offset between them. The vector locating the origin of {B} is denoted ^A P_BORG, and the rotation of {B} relative to {A} is described by ^A_B R. Given ^B P, ^B P is first transformed into an intermediate coordinate system whose orientation is the same as that of {A} and whose origin coincides with the origin of {B}, which gives:

^A P = ^A_B R · ^B P + ^A P_BORG
(2) Velocity solution:

Let the inertial space reference frame be coordinate system {B} and the specified reference frame be {A}. Considering only linear velocity, ^A v = ^A v_BORG + ^A_B R · ^B v, which applies only when the relative orientation of {B} and {A} remains unchanged. Considering only angular velocity, ^A v = ^A Ω × (^A_B R · ^B p). In summary, when linear velocity and angular velocity exist simultaneously, ^A v = ^A v_BORG + ^A_B R · ^B v + ^A Ω × (^A_B R · ^B p).
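A small sketch of the general frame mapping and velocity composition written out above; the rotation matrix, offset and velocity arguments are placeholders and the function names are illustrative:

```python
import numpy as np

def map_point(R_AB, p_BORG_A, p_B):
    """General mapping ^A p = ^A_B R · ^B p + ^A p_BORG."""
    return R_AB @ p_B + p_BORG_A

def map_velocity(R_AB, v_BORG_A, omega_A, p_B, v_B):
    """Linear velocity of a point of {B} seen in {A} when {B} both translates and rotates:
    ^A v = ^A v_BORG + ^A_B R · ^B v + ^A Omega x (^A_B R · ^B p)."""
    return v_BORG_A + R_AB @ v_B + np.cross(omega_A, R_AB @ p_B)
```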
8. Construction of the machine-learning algorithm model:

Under different human motion states such as walking and jumping and emotional states such as excitement and calm, the EMG sensors and inertial measurement units on the limbs and the respiration sensor on the chest are acquired synchronously at identical or different frequencies and the data are pre-processed. As shown in Fig. 3, the processed information, together with the images acquired by the depth camera, is used as the input of the LSTM neural network based on the BPTT algorithm. The sum of the time consumed by the machine-learning algorithm and the maximum delay with which the servos at each position of the robot receive a control signal is taken as the time interval, and the control signal characterizing the human motion intention at the next moment is transmitted to the machine-learning processing computer as the output of the LSTM network.
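Purely as an illustrative sketch of how such time-offset input/output pairs could be assembled (the sampling rate, delay budget, channel counts and helper names are assumptions, not values given in the patent):

```python
import numpy as np

def build_training_pairs(sensor_features, control_signals, compute_time_s,
                         max_servo_delay_s, sample_rate_hz):
    """Pair each fused sensor sample x_t with the control signal at t + delta,
    where delta is the processing time plus the worst-case servo delay."""
    delta_steps = int(round((compute_time_s + max_servo_delay_s) * sample_rate_hz))
    inputs = sensor_features[:-delta_steps]   # x_t
    targets = control_signals[delta_steps:]   # control signal at t + delta
    return inputs, targets

# Example: 100 Hz fused features, 30 ms inference time, 20 ms worst-case servo delay
x = np.random.randn(1000, 24)   # 24 fused sensor channels (illustrative)
u = np.random.randn(1000, 8)    # 8 servo control channels (illustrative)
X, Y = build_training_pairs(x, u, 0.030, 0.020, 100)
```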
Fig. 4 shows the LSTM neural-network structure. Here x is the input, s the hidden-layer state and o the output; x_t is the input of the recurrent network at time t, s_t the hidden-layer state at time t, y_t the label at time t and z_t the summed input of the output layer. U is the weight from the input layer to the hidden layer, W the weight from the hidden state of the previous moment to the current one, and V the weight from the hidden layer to the output layer. After unrolling in time, as shown in Fig. 5, the predicted output ŷ_t replaces o, and

s_t = tanh(U·x_t + W·s_{t-1})

ŷ_t = softmax(V·s_t)

The cross entropy E is used as the loss function:

E_t(y_t, ŷ_t) = −y_t · log ŷ_t,  E = Σ_t E_t(y_t, ŷ_t)

The gradients in back-propagation are computed with the chain rule. Taking the output E_3 of the network as an example,

z_3 = V·s_3

s_3 = tanh(U·x_3 + W·s_2)

so the gradient of V can be obtained:

∂E_3/∂V = (∂E_3/∂ŷ_3)(∂ŷ_3/∂z_3)(∂z_3/∂V) = (ŷ_3 − y_3) ⊗ s_3

Unlike the gradient of V, because s_t is a function of W and U and the s_{t-1} it contains cannot simply be regarded as a constant during differentiation, if no restriction is imposed all the states from time t back to time 0 must be traced; in practice the back-propagation is generally truncated according to the scenario and the required accuracy.

The derivative of s_3 with respect to W is therefore a branched derivative:

∂E_3/∂W = Σ_{k=0}^{3} (∂E_3/∂ŷ_3)(∂ŷ_3/∂s_3)(∂s_3/∂s_k)(∂s_k/∂W)

and the gradient of U is similar:

∂E_3/∂U = Σ_{k=0}^{3} (∂E_3/∂ŷ_3)(∂ŷ_3/∂s_3)(∂s_3/∂s_k)(∂s_k/∂U)

The BPTT algorithm is in fact a simple variant of the BP algorithm: the forward propagation is largely the same as that of a BP network, except that the information of the hidden layer at the previous moment is added, and the back-propagation passes the accumulated residual back from the last time step.

Although a simple RNN can in theory retain dependencies between states separated by long time intervals, in practice it can only learn short-term dependencies. This gives rise to the "long-term dependency" problem; an RNN with LSTM units is needed to alleviate the vanishing-gradient problem, and nowadays an RNN with LSTM units is usually referred to directly as an LSTM. The LSTM unit introduces a gating mechanism, controlling the information flowing through the unit with a forget gate, an input gate and an output gate. The reason a simple RNN suffers from vanishing gradients is the multiplicative relationship between the error terms; if the derivation is carried out with LSTM units, this multiplicative relationship becomes an additive one, so the vanishing gradient can be alleviated.
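To make the chain-rule expressions above concrete, the following minimal numpy sketch (toy dimensions and illustrative names, not part of the patent) accumulates the gradients of a vanilla RNN with softmax output by back-propagation through time; it mirrors the branched derivatives with respect to W and U:

```python
import numpy as np

def bptt_gradients(x, y, U, W, V):
    """x: (T, n_in) inputs, y: (T,) integer labels; returns dU, dW, dV."""
    T, h = x.shape[0], W.shape[0]
    s = np.zeros((T + 1, h))                   # s[-1] serves as the zero initial state
    y_hat = np.zeros((T, V.shape[0]))
    for t in range(T):                         # forward pass
        s[t] = np.tanh(U @ x[t] + W @ s[t - 1])
        z = V @ s[t]
        y_hat[t] = np.exp(z) / np.exp(z).sum() # softmax output
    dU, dW, dV = np.zeros_like(U), np.zeros_like(W), np.zeros_like(V)
    for t in range(T):                         # backward pass
        dz = y_hat[t].copy(); dz[y[t]] -= 1.0  # dE_t/dz_t = y_hat - y for cross entropy
        dV += np.outer(dz, s[t])
        ds = V.T @ dz
        for k in range(t, -1, -1):             # branched derivatives back to time 0
            dpre = ds * (1.0 - s[k] ** 2)      # through tanh
            dU += np.outer(dpre, x[k])
            dW += np.outer(dpre, s[k - 1])
            ds = W.T @ dpre                    # propagate to s_{k-1}
    return dU, dW, dV
```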
The present invention summarizes the regularities between the sensor signals and the relevant human kinematic and psychological information. As shown in Fig. 6, by constructing an LSTM neural network, the robot takes the input data at the current moment and, after the maximum delay with which the servos at each position of the robot receive a control signal, outputs the control signal of the probable motion intention of the human body as the output for the current moment, which is transmitted to the machine-learning processing computer. The network is trained to obtain the nonlinear function mapping between input and output that is difficult to express analytically in engineering, thereby realizing predictive control of the exoskeleton robot.
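As a hedged illustration of the overall mapping (the feature dimension, hidden size, number of control channels, regression loss and the choice of PyTorch are all assumptions, not specified in the patent), an LSTM that maps a window of fused sensor features to the delayed control signal could look like this:

```python
import torch
import torch.nn as nn

class IntentToControlLSTM(nn.Module):
    """Maps a sequence of fused sensor features to the control signal one delay interval ahead."""
    def __init__(self, n_features=24, hidden_size=64, n_controls=8):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden_size, n_controls)

    def forward(self, x):                  # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # control signal predicted for t + delta

# Training sketch: X (batch, T, 24) fused sensor windows, Y (batch, 8) delayed control targets
model = IntentToControlLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
X, Y = torch.randn(32, 50, 24), torch.randn(32, 8)
opt.zero_grad()
loss = loss_fn(model(X), Y)
loss.backward()                            # gradients flow by back-propagation through time
opt.step()
```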
9. Feedback system of the protection mode:

As shown in Fig. 7, according to the constructed machine-learning algorithm model, if the motion intention output for the next moment is: excited, and at the present velocity and acceleration the expected motion target would produce a strike or other danger, the safety-protection mode is started. The feedback control of the exoskeleton robot is established using the kinematics formulas:
Let the distance from the shoulder servo to the elbow servo of the mechanical arm be L1, the elbow-to-wrist distance L2 and the wrist-to-palm distance L3, and let the actuation ranges of L1, L2, L3 be Φ1, Φ2, Φ3. With x, y, z forming the robot base coordinate system, the coordinates of the arm end effector are expressed by the following equations:

X = L1·cosθ11·cosθ12 + L2·cosθ21·cosθ22 + L3·cosθ31·cosθ32

Y = L1·cosθ11·sinθ12 + L2·cosθ21·sinθ22 + L3·cosθ31·sinθ32

Z = L1·sinθ11 + L2·sinθ21 + L3·sinθ31

where θ11 is the angle between L1 and the plane x1y1z1; θ12 is the angle between the projection of L1 onto the plane x1y1z1 and x1; θ21 is the angle between L2 and the plane x2y2z2; θ22 is the angle between the projection of L2 onto the plane x2y2z2 and x2; θ31 is the angle between L3 and the plane x3y3z3; θ32 is the angle between the projection of L3 onto the plane x3y3z3 and x3.
The velocity is obtained by differentiating the end-effector position with respect to time; the equations are as follows:

Ẋ = −L1·ω11·sinθ11·cosθ12 − L1·ω12·cosθ11·sinθ12 − L2·ω21·sinθ21·cosθ22 − L2·ω22·cosθ21·sinθ22 − L3·ω31·sinθ31·cosθ32 − L3·ω32·cosθ31·sinθ32

Ẏ = −L1·ω11·sinθ11·sinθ12 + L1·ω12·cosθ11·cosθ12 − L2·ω21·sinθ21·sinθ22 + L2·ω22·cosθ21·cosθ22 − L3·ω31·sinθ31·sinθ32 + L3·ω32·cosθ31·cosθ32

Ż = L1·ω11·cosθ11 + L2·ω21·cosθ21 + L3·ω31·cosθ31
Through the above equations, the relationship between the velocity in the robot base coordinate system and the angular velocity of each joint, and the relationship between the hand's contact force with the environment and the corresponding joints, are established, which helps the system resolve the concrete motion of the mechanical arm and facilitates the subsequent control and protection.
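A direct transcription of the position and velocity equations above into Python, given purely as an illustrative aid; the function names are not from the patent and the angles and rates are placeholders indexed as θ_i1, θ_i2 and ω_i1, ω_i2:

```python
import numpy as np

def arm_end_position(L, theta):
    """L = (L1, L2, L3); theta[i] = (theta_i1, theta_i2) for each segment."""
    x = sum(L[i] * np.cos(theta[i][0]) * np.cos(theta[i][1]) for i in range(3))
    y = sum(L[i] * np.cos(theta[i][0]) * np.sin(theta[i][1]) for i in range(3))
    z = sum(L[i] * np.sin(theta[i][0]) for i in range(3))
    return np.array([x, y, z])

def arm_end_velocity(L, theta, omega):
    """Time derivative of the position equations; omega[i] = (omega_i1, omega_i2)."""
    vx = sum(-L[i] * omega[i][0] * np.sin(theta[i][0]) * np.cos(theta[i][1])
             - L[i] * omega[i][1] * np.cos(theta[i][0]) * np.sin(theta[i][1]) for i in range(3))
    vy = sum(-L[i] * omega[i][0] * np.sin(theta[i][0]) * np.sin(theta[i][1])
             + L[i] * omega[i][1] * np.cos(theta[i][0]) * np.cos(theta[i][1]) for i in range(3))
    vz = sum(L[i] * omega[i][0] * np.cos(theta[i][0]) for i in range(3))
    return np.array([vx, vy, vz])
```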
From the above equations, the physical parameters needed to resist the motion, such as damping force and motor speed, are obtained. During this process, displacement, velocity and acceleration information is obtained continuously, and through the relevant algorithms of human kinematics and motor drag the drag acceleration is adjusted to realize timely and effective feedback control: the drag acceleration of the exoskeleton robot is chosen so that the limb stops at half of the displacement toward the moving target, preventing excessive motion and better protecting the wearer and surrounding personnel on the premise of keeping the body's center of gravity stable.
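The stopping rule described above (bring the limb to rest at half of the predicted target displacement) admits a simple constant-deceleration sketch; the function name, threshold handling and numbers are illustrative assumptions, and the patent itself does not prescribe this particular computation:

```python
def protective_deceleration(current_speed, predicted_displacement):
    """Constant deceleration that stops the limb after half of the predicted displacement.

    From v^2 = 2 * a * d with d = predicted_displacement / 2, the required
    braking deceleration is a = v^2 / predicted_displacement.
    """
    if predicted_displacement <= 0:
        return 0.0
    return current_speed ** 2 / predicted_displacement

# Example: limb moving at 0.8 m/s toward a predicted 0.4 m strike
a_brake = protective_deceleration(0.8, 0.4)   # = 1.6 m/s^2 opposing the motion
```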
The specific embodiments merely illustrate the technical idea of the invention and do not limit its scope of protection; any change made on the basis of the technical solution according to the technical idea provided by the invention falls within the scope of protection of the invention.

Claims (7)

1. A multi-information fusion human body exoskeleton robot control protection system, characterized in that the system comprises several inertial measurement units and EMG sensors mounted on the limbs of the human body, several respiration sensors mounted on the chest, a depth camera worn on the head whose field of view coincides with the human field of view, and a machine-learning processing computer mounted at any position on the body; wherein the inertial measurement units, EMG sensors, respiration sensors and depth camera are each connected to the machine-learning processing computer by wired or wireless means, and after the machine-learning processing computer recognizes the different types of motion intention, the exoskeleton robot carries out the corresponding control and protection according to the actual situation.
2. The multi-information fusion human body exoskeleton robot control protection system according to claim 1, characterized in that the joint attitude angles and angular rates acquired by the inertial measurement units at each body position at a given moment, the surface EMG signals acquired by the EMG sensors, the respiratory rate and respiration amplitude acquired by the respiration sensors, and the object position and depth information acquired by the depth camera are transmitted to the machine-learning processing computer as the input of a long short-term memory neural network LSTM based on back-propagation through time BPTT; the sum of the time consumed by the machine-learning algorithm and the maximum delay with which the servos at each position of the robot receive a control signal is taken as the time interval, the control signal characterizing the motion intention of the human body at the moment after this interval is taken as the output of the LSTM network, and the nonlinear function mapping between input and output is obtained by training, thereby realizing predictive control of the exoskeleton robot.
3. The multi-information fusion human body exoskeleton robot control protection system according to claim 2, characterized in that the LSTM network based on back-propagation through time BPTT is trained to obtain the nonlinear function mapping between input and output, specifically as follows:

let x be the input, s the hidden-layer state and o the output; x_t is the input of the recurrent network at time t, s_t the hidden-layer state at time t, y_t the label at time t and z_t the summed input of the output layer; U is the weight from the input layer to the hidden layer, W the weight from the hidden state of the previous moment to the current one, and V the weight from the hidden layer to the output layer; after unrolling in time the predicted output ŷ_t replaces o, and

s_t = tanh(U·x_t + W·s_{t-1})

ŷ_t = softmax(V·s_t)

the cross entropy E is used as the loss function

E_t(y_t, ŷ_t) = −y_t · log ŷ_t,  E = Σ_t E_t(y_t, ŷ_t)

the gradients in back-propagation are computed with the chain rule; for the output E_t of the network,

z_t = V·s_t

s_t = tanh(U·x_t + W·s_{t-1})

so the gradient of V can be obtained:

∂E_t/∂V = (∂E_t/∂ŷ_t)(∂ŷ_t/∂z_t)(∂z_t/∂V) = (ŷ_t − y_t) ⊗ s_t

where the derivative of s_t with respect to W is a branched derivative:

∂E_t/∂W = Σ_{k=0}^{t} (∂E_t/∂ŷ_t)(∂ŷ_t/∂s_t)(∂s_t/∂s_k)(∂s_k/∂W)

and the derivative with respect to U is likewise a branched derivative:

∂E_t/∂U = Σ_{k=0}^{t} (∂E_t/∂ŷ_t)(∂ŷ_t/∂s_t)(∂s_t/∂s_k)(∂s_k/∂U)
4. The multi-information fusion human body exoskeleton robot control protection system according to claim 3, characterized in that when differentiating s_t with respect to W, if no restriction is imposed, all the states from time t back to time 0 must be traced; the back-propagation can be truncated according to the scenario and the required accuracy.
5. The multi-information fusion human body exoskeleton robot control protection system according to claim 3, characterized in that the stage before building the machine-learning algorithm model is the information acquisition and processing stage, whose specific steps are as follows:

step 1: under different human motion states, synchronously acquire the inertial measurement unit, EMG sensor, respiration sensor and depth camera information at identical or different frequencies;

step 2: pre-process the surface EMG signals acquired by the EMG sensors, including signal validity detection, signal de-noising and active-segment enhancement, with a reasonable activation threshold;

step 3: perform signal validity detection on the inertial measurement units arranged on the limbs, then perform real-time random-error modeling and correction on the valid signals to guarantee the accuracy of signal acquisition;

step 4: perform signal validity detection on the respiration sensors arranged at suitable positions such as the chest, then perform real-time random-error modeling and correction on the valid signals to guarantee the accuracy of signal acquisition;

step 5: on the premise that the information from the inertial measurement units, EMG sensors and respiration sensors on the limbs is valid, initialize the exoskeleton robot system; otherwise restart the components and return to step 2;

step 6: perform attitude, velocity and position solutions for the exoskeleton robot system.
6. The multi-information fusion human body exoskeleton robot control protection system according to claim 3, characterized in that after different types of motion intention are recognized, the exoskeleton robot carries out the corresponding control and protection according to the actual situation, specifically: when the motion intention recognized by the sensors is a normal motion intention and the human body has started but not yet completed a certain movement, the exoskeleton motion-control system assists the body to complete the movement on the premise of keeping the body's center of gravity stable, so that the person achieves the intended motion without being injured; if the motion intention is judged to be an aggressive action that would injure the wearer or surrounding personnel, the exoskeleton motion-control system applies feedback control to the motion velocity and acceleration of the exoskeleton robot and establishes the parametric equations of each robot component, including displacement and acceleration parameters.
7. The multi-information fusion human body exoskeleton robot control protection system according to claim 6, characterized in that, for the parametric equations of each robot component, let the distance from the shoulder servo to the elbow servo of the mechanical arm be L1, the elbow-to-wrist distance L2 and the wrist-to-palm distance L3, and let the actuation ranges of L1, L2, L3 be Φ1, Φ2, Φ3; with x, y, z forming the robot base coordinate system, the coordinates of the arm end effector are expressed by the following equations:

X = L1·cosθ11·cosθ12 + L2·cosθ21·cosθ22 + L3·cosθ31·cosθ32

Y = L1·cosθ11·sinθ12 + L2·cosθ21·sinθ22 + L3·cosθ31·sinθ32

Z = L1·sinθ11 + L2·sinθ21 + L3·sinθ31

where θ11 is the angle between L1 and the plane x1y1z1; θ12 is the angle between the projection of L1 onto the plane x1y1z1 and x1; θ21 is the angle between L2 and the plane x2y2z2; θ22 is the angle between the projection of L2 onto the plane x2y2z2 and x2; θ31 is the angle between L3 and the plane x3y3z3; θ32 is the angle between the projection of L3 onto the plane x3y3z3 and x3;

the velocity is obtained by differentiating the end-effector position with respect to time, with equations as follows:

Ẋ = −L1·ω11·sinθ11·cosθ12 − L1·ω12·cosθ11·sinθ12 − L2·ω21·sinθ21·cosθ22 − L2·ω22·cosθ21·sinθ22 − L3·ω31·sinθ31·cosθ32 − L3·ω32·cosθ31·sinθ32

Ẏ = −L1·ω11·sinθ11·sinθ12 + L1·ω12·cosθ11·cosθ12 − L2·ω21·sinθ21·sinθ22 + L2·ω22·cosθ21·cosθ22 − L3·ω31·sinθ31·sinθ32 + L3·ω32·cosθ31·cosθ32

Ż = L1·ω11·cosθ11 + L2·ω21·cosθ21 + L3·ω31·cosθ31

through the above equations, the relationship between the velocity in the robot base coordinate system and the angular velocity of each joint, and the relationship between the hand's contact force with the environment and the corresponding joints, are established, which helps the system resolve the concrete motion of the mechanical arm and enables the exoskeleton robot to carry out the corresponding control and protection according to the actual situation.
CN201910427971.7A 2019-05-22 2019-05-22 Multi-information fusion human body exoskeleton robot control protection system Active CN110125909B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910427971.7A CN110125909B (en) 2019-05-22 2019-05-22 Multi-information fusion human body exoskeleton robot control protection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910427971.7A CN110125909B (en) 2019-05-22 2019-05-22 Multi-information fusion human body exoskeleton robot control protection system

Publications (2)

Publication Number Publication Date
CN110125909A true CN110125909A (en) 2019-08-16
CN110125909B CN110125909B (en) 2022-04-22

Family

ID=67572186

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910427971.7A Active CN110125909B (en) 2019-05-22 2019-05-22 Multi-information fusion human body exoskeleton robot control protection system

Country Status (1)

Country Link
CN (1) CN110125909B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110916970A (en) * 2019-11-18 2020-03-27 南京伟思医疗科技股份有限公司 Device and method for realizing cooperative motion of weight-reducing vehicle and lower limb robot through communication
CN111652155A (en) * 2020-06-04 2020-09-11 北京航空航天大学 Human body movement intention identification method and system
CN112621714A (en) * 2020-12-02 2021-04-09 上海微电机研究所(中国电子科技集团公司第二十一研究所) Upper limb exoskeleton robot control method and device based on LSTM neural network
CN112706158A (en) * 2019-10-25 2021-04-27 中国科学院沈阳自动化研究所 Industrial man-machine interaction system and method based on vision and inertial navigation positioning
CN113459102A (en) * 2021-07-09 2021-10-01 郑州大学 Human upper limb intention identification method based on projection reconstruction
CN114366559A (en) * 2021-12-31 2022-04-19 华南理工大学 Multi-mode sensing system for lower limb rehabilitation robot

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160310731A1 (en) * 2013-12-18 2016-10-27 University Of Florida Research Foundation, Inc. Closed-loop hybrid orthotic system for rehabilitation and functional mobility assistance
CN106539573A (en) * 2016-11-25 2017-03-29 惠州市德赛工业研究院有限公司 A kind of Intelligent bracelet and the bracelet based reminding method based on user preference
CN108670244A (en) * 2018-05-29 2018-10-19 浙江大学 A kind of wearable physiology of flexible combination formula and psychological condition monitoring device
CN108875601A (en) * 2018-05-31 2018-11-23 郑州云海信息技术有限公司 Action identification method and LSTM neural network training method and relevant apparatus
CN109394476A (en) * 2018-12-06 2019-03-01 上海神添实业有限公司 The automatic intention assessment of brain flesh information and upper limb intelligent control method and system
CN109528450A (en) * 2019-01-24 2019-03-29 郑州大学 A kind of exoskeleton rehabilitation robot of motion intention identification
CN109549821A (en) * 2018-12-30 2019-04-02 南京航空航天大学 The exoskeleton robot assisted control system and method merged based on electromyography signal and inertial navigation signal

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160310731A1 (en) * 2013-12-18 2016-10-27 University Of Florida Research Foundation, Inc. Closed-loop hybrid orthotic system for rehabilitation and functional mobility assistance
CN106539573A (en) * 2016-11-25 2017-03-29 惠州市德赛工业研究院有限公司 A kind of Intelligent bracelet and the bracelet based reminding method based on user preference
CN108670244A (en) * 2018-05-29 2018-10-19 浙江大学 A kind of wearable physiology of flexible combination formula and psychological condition monitoring device
CN108875601A (en) * 2018-05-31 2018-11-23 郑州云海信息技术有限公司 Action identification method and LSTM neural network training method and relevant apparatus
CN109394476A (en) * 2018-12-06 2019-03-01 上海神添实业有限公司 The automatic intention assessment of brain flesh information and upper limb intelligent control method and system
CN109549821A (en) * 2018-12-30 2019-04-02 南京航空航天大学 The exoskeleton robot assisted control system and method merged based on electromyography signal and inertial navigation signal
CN109528450A (en) * 2019-01-24 2019-03-29 郑州大学 A kind of exoskeleton rehabilitation robot of motion intention identification

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Teng Qianli et al.: "A deep model for human motion recognition using motion sensors", Journal of Xi'an Jiaotong University *
Fan Min: "Research on the cloud-brain architecture of exoskeleton robots and its learning algorithms", China Master's Theses Full-text Database, Information Science and Technology *
Xin Yang et al.: "Principles and Practice of Big Data Technology", 31 January 2018 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112706158A (en) * 2019-10-25 2021-04-27 中国科学院沈阳自动化研究所 Industrial man-machine interaction system and method based on vision and inertial navigation positioning
CN110916970A (en) * 2019-11-18 2020-03-27 南京伟思医疗科技股份有限公司 Device and method for realizing cooperative motion of weight-reducing vehicle and lower limb robot through communication
CN111652155A (en) * 2020-06-04 2020-09-11 北京航空航天大学 Human body movement intention identification method and system
CN112621714A (en) * 2020-12-02 2021-04-09 上海微电机研究所(中国电子科技集团公司第二十一研究所) Upper limb exoskeleton robot control method and device based on LSTM neural network
CN113459102A (en) * 2021-07-09 2021-10-01 郑州大学 Human upper limb intention identification method based on projection reconstruction
CN113459102B (en) * 2021-07-09 2022-07-05 郑州大学 Human upper limb intention identification method based on projection reconstruction
CN114366559A (en) * 2021-12-31 2022-04-19 华南理工大学 Multi-mode sensing system for lower limb rehabilitation robot

Also Published As

Publication number Publication date
CN110125909B (en) 2022-04-22

Similar Documents

Publication Publication Date Title
CN110125909A (en) A kind of multi-information fusion human body exoskeleton robot Control protection system
Lu et al. Development of a sEMG-based torque estimation control strategy for a soft elbow exoskeleton
Chen et al. A continuous estimation model of upper limb joint angles by using surface electromyography and deep learning method
WO2018050191A1 (en) A human intention detection system for motion assistance
Zhou et al. Applications of wearable inertial sensors in estimation of upper limb movements
Chinmilli et al. A review on wearable inertial tracking based human gait analysis and control strategies of lower-limb exoskeletons
Meng et al. A practical gait feedback method based on wearable inertial sensors for a drop foot assistance device
Ma et al. Design on intelligent perception system for lower limb rehabilitation exoskeleton robot
Xi et al. Simultaneous and continuous estimation of joint angles based on surface electromyography state-space model
CN109498375B (en) Human motion intention recognition control device and control method
Li et al. Active human-following control of an exoskeleton robot with body weight support
CN111531537A (en) Mechanical arm control method based on multiple sensors
Tang et al. Continuous estimation of human upper limb joint angles by using PSO-LSTM model
Wang et al. Prediction of contralateral lower-limb joint angles using vibroarthrography and surface electromyography signals in time-series network
El-Gohary et al. Joint angle tracking with inertial sensors
Wang et al. Lower limb motion recognition based on surface electromyography signals and its experimental verification on a novel multi-posture lower limb rehabilitation robots☆
Gupta MAC-MAN
Lalitharatne et al. Evaluation of fuzzy-neuro modifiers for compensation of the effects of muscle fatigue on EMG-based control to be used in upper-limb power-assist exoskeletons
Li et al. Joint torque closed-loop estimation using NARX neural network based on sEMG signals
Morón et al. EMG-based hand gesture control system for robotics
Vigliotta et al. EMG controlled electric wheelchair
Suplino et al. Elbow movement estimation based on EMG with NARX Neural Networks
Peng A novel motion detecting strategy for rehabilitation in smart home
Raghavendra et al. Triggering a functional electrical stimulator based on gesture for stroke-induced movement disorder
Xie et al. Sensor-Based Exercise Rehabilitation Robot Training Method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant