CN106256336A - Gait phase switching method for an exoskeleton assistive robot based on logistic regression - Google Patents

Gait phase switching method for an exoskeleton assistive robot based on logistic regression

Info

Publication number
CN106256336A
CN106256336A (application No. CN201610701250.7A)
Authority
CN
China
Prior art keywords
exoskeleton
gait phase
disabled-aiding robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610701250.7A
Other languages
English (en)
Other versions
CN106256336B (zh)
Inventor
赵江海
叶晓东
王容川
孔令成
何锋
赵子毅
陈淑艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Zhongke winsis Intelligent Robot Technology Co. Ltd.
Original Assignee
Hefei Institutes of Physical Science of CAS
Institute of Advanced Manufacturing Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei Institutes of Physical Science of CAS, Institute of Advanced Manufacturing Technology filed Critical Hefei Institutes of Physical Science of CAS
Priority to CN201610701250.7A priority Critical patent/CN106256336B/zh
Publication of CN106256336A publication Critical patent/CN106256336A/zh
Application granted granted Critical
Publication of CN106256336B publication Critical patent/CN106256336B/zh
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H2003/007 Appliances for aiding patients or disabled persons to walk about secured to the patient, e.g. with belts
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/0157 Constructive details, portable
    • A61H2201/1602 Physical interface with patient, kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/165 Wearable interfaces
    • A61H2201/50 Control means thereof
    • A61H2201/5058 Sensors or detectors
    • A61H2201/5069 Angle sensors
    • A61H2201/5071 Pressure sensors
    • A61H2205/10 Devices for specific parts of the body: leg
    • A61H2205/12 Devices for specific parts of the body: feet
    • A61H2230/62 Posture
    • A61H2230/625 Posture used as a control parameter for the apparatus

Landscapes

  • Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Pain & Pain Management (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rehabilitation Therapy (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The invention discloses a gait phase switching method for an exoskeleton assistive robot based on logistic regression. Three types of sensors are mounted on the exoskeleton assistive robot. After the per-sensor thresholds for gait phase switching are set, walking experiments are performed while wearing the exoskeleton, and the tilt, pressure, and angle sensor data corresponding to the successful-switch and failed-switch states are recorded and used as training data to obtain a gait-phase-switching estimation function, which correctly classifies the success and failure states and identifies the timing of switching between the stance leg and the swing leg during walking. The invention effectively improves the accuracy of gait phase switching for exoskeleton assistive robots.

Description

Gait phase switching method for an exoskeleton assistive robot based on logistic regression
Technical Field
The present invention relates to intention-recognition methods for exoskeleton assistive robots, and more particularly to a method for recognizing the timing of switching between the stance leg and the swing leg of an exoskeleton assistive robot.
Background Art
Lower-limb wearable exoskeleton robots can work in military, medical, industrial, and other fields. Recognizing the wearer's behavior and intention is a problem that exoskeleton robots urgently need to solve. The wearer's walking state and motion intention are usually acquired through three kinds of signals: brain-computer interfaces, electromyographic (EMG) signals, and physical sensing. Brain-computer interfaces are mainly used in rehabilitation robotics, but the resolution of brain-derived intention is low. EMG signals have the advantage of ***, but are easily affected by the wearer's physical state and other factors. Physical sensors such as pressure, tilt, and angle sensors are simple to implement and strongly resistant to interference, and are therefore widely used on current exoskeleton robots.
Chinese patent application CN201410063084.3 discloses "a foot device with ankle-joint parameter measurement suitable for an exoskeleton support-assist robot". Pressure sensors are mounted in the soles of the disclosed robot, and the wearer's walking intention is judged by detecting changes in plantar pressure during walking. The sensing information of this robot is rather limited: plantar pressure is affected by the ground and by the foot-strike posture, so the recognition accuracy is not high.
Summary of the Invention
Addressing the motion-intention recognition problem of exoskeleton assistive robots, the present invention proposes a logistic-regression-based gait phase switching method for an exoskeleton assistive robot, so as to further improve the accuracy of switching between the stance leg and the swing leg during walking.
To achieve the above objective, the present invention adopts the following technical solution:
The gait phase switching method for an exoskeleton assistive robot based on logistic regression according to the present invention is characterized by the following steps:
Step 1: Mount tilt sensors on the waist and thighs of the exoskeleton assistive robot, mount angle sensors on the output shafts of the hip and knee joints of its lower limbs, and attach pressure sensors to the soles of its feet;
Step 2: With the exoskeleton assistive robot not started, use the sensors to collect the tilt, angle, and pressure data of the stance leg and the swing leg at the moment of gait phase switching while the robot walks, and take these data as the per-sensor thresholds for gait phase switching;
Step 3: Start the exoskeleton assistive robot and let it walk according to the preset stance-leg and swing-leg motions, while the sensors collect real-time tilt, angle, and pressure data during walking; compare the real-time data with the per-sensor thresholds; when the real-time data exceed the thresholds, the robot performs a stance/swing gait phase switch and the corresponding real-time data are recorded as one training sample, so that n training samples are obtained, denoted X = {X^(1), X^(2), ..., X^(i), ..., X^(n)}, where X^(i) = (X_1^(i), X_2^(i), ..., X_m^(i)) denotes the i-th training sample and X_j^(i) denotes the j-th sensor datum in X^(i); 1 ≤ i ≤ n; 1 ≤ j ≤ m;
Step 4: While the exoskeleton assistive robot performs the stance/swing gait phase switch in step 3, simultaneously judge whether the robot's switching action is consistent with the switching action corresponding to the sensor thresholds; if so, the switch is correct and the result is recorded as a success, denoted by the output variable Y^(i) = 1; otherwise the switch is wrong and the result is recorded as a failure, denoted by Y^(i) = 0;
Step 5: Feed the n training samples and the corresponding output variables into the logistic regression model built by the logistic regression algorithm to obtain the gait-phase-switching estimation function, which serves as the criterion for gait phase switching;
Step 6: Use the sensors to collect real-time tilt, angle, and pressure test data while the robot walks, and substitute them into the estimation function to obtain the discrimination result;
Step 7: The exoskeleton assistive robot performs the stance/swing gait phase switch according to the discrimination result.
The gait phase switching method for an exoskeleton assistive robot based on logistic regression according to the present invention is also characterized in that, in step 5, the gait-phase-switching estimation function is obtained as follows:
Step 5.1: Define the current iteration index k and initialize k = 1; set every one of the m components of the 0-th iterate θ^(0) to 0;
Step 5.2: Use the update formula (1) for the logistic regression parameter θ to compute the j-th component θ_j^(k) of the k-th iterate, thereby obtaining the vector θ^(k) formed by the m components of the k-th iteration:

$$\theta_j^{(k)} = \theta_j^{(k-1)} - \alpha \sum_{i=1}^{n} \left( h_{\theta^{(k-1)}}\left(X^{(i)}\right) - Y^{(i)} \right) X_j^{(i)} \qquad (1)$$

In formula (1), α is the iteration step size and h_{θ^(k-1)} is the logistic regression hypothesis function:

$$h_{\theta^{(k-1)}}\left(X^{(i)}\right) = \frac{1}{1 + e^{-\theta^{(k-1)} X^{(i)}}} \qquad (2)$$

In formula (2), θ^(k-1) X^(i) is the sum of the products of the components of the vector θ^(k-1) obtained at iteration k-1 with the components of the i-th training sample X^(i), i.e. θ^(k-1) X^(i) = Σ_{j=1}^{m} θ_j^(k-1) X_j^(i);
Step 5.3: Assign k+1 to k and return to step 5.2 until the iteration error ε = |θ^(k-1) - θ^(k)| is smaller than the set threshold, whereupon the iteration stops and the corresponding iteration count is recorded as p; the converged vector θ^(p) formed by the m components of the p-th iteration is thus obtained and taken as the final value of the logistic regression parameter θ;
Step 5.4: Obtain the gait-phase-switching estimation function h_θ(X) from formula (3):

$$h_{\theta}(X) = \frac{1}{1 + e^{-\theta^{(p)} X}} \qquad (3)$$

In formula (3), e^{-θ^(p) X} is the exponential function with argument -θ^(p) X, and X denotes the n training samples.
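The iterative procedure of steps 5.1 to 5.4 is batch gradient descent on the logistic loss. A minimal Python sketch (illustrative only, not part of the patent disclosure; function names, toy data, and parameter values are invented for the example):

```python
import numpy as np

def train_switch_estimator(X, Y, alpha=0.1, tol=1e-6, max_iter=10000):
    """Batch gradient descent following formulas (1)-(3).

    X: (n, m) array of training samples, Y: (n,) labels in {0, 1}.
    Returns the converged parameter vector theta^(p)."""
    n, m = X.shape
    theta = np.zeros(m)                              # step 5.1: theta^(0) = 0
    for _ in range(max_iter):
        h = 1.0 / (1.0 + np.exp(-(X @ theta)))       # formula (2): hypothesis on all samples
        theta_new = theta - alpha * (X.T @ (h - Y))  # formula (1): one parameter update
        if np.linalg.norm(theta_new - theta) < tol:  # step 5.3: stopping rule on |theta^(k-1) - theta^(k)|
            return theta_new
        theta = theta_new
    return theta

def h_switch(theta, x):
    """Formula (3): gait-phase-switching estimate for a sensor vector x."""
    return 1.0 / (1.0 + np.exp(-(x @ theta)))
```

In use, the estimate would be thresholded (for instance at 0.5) to classify a candidate switch as success or failure; the patent itself does not fix a decision threshold.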
Compared with the prior art, the beneficial effects of the present invention are:
1. The proposed gait phase switching method for an exoskeleton assistive robot collects and fuses tilt, angle, and pressure multi-sensor information to infer the wearer's walking state. The high degree of motion-information fusion effectively improves the reliability of the robot's gait phase switching and provides an effective technical means for applying exoskeleton robots in the power-assist and disability-assist fields.
2. The proposed discrimination method for gait phase switching does not depend on the robot's walking-control model; it learns from a large number of training samples of the wearer's walking and effectively improves the discrimination accuracy of gait phase switching.
Brief Description of the Drawings
Fig. 1 is a schematic flow chart of the method of the present invention;
Fig. 2 is a schematic diagram of walking control based on the gait-phase-switching discrimination algorithm of the present invention;
Fig. 3 is a schematic diagram of the sensor layout on the exoskeleton assistive robot of the present invention;
Reference numerals: 1 tilt sensor; 2 angle sensor; 3 pressure sensor.
Detailed Description of the Embodiments
The present invention is described in more detail below with reference to the drawings and an example.
In this embodiment, as shown in Fig. 1, the logistic-regression-based discrimination method for gait phase switching of an exoskeleton assistive robot is used for switching between the stance leg and the swing leg while the robot walks. The discrimination method employs a logistic regression algorithm; the training samples required by the algorithm are collected by the sensors mounted on the assistive robot, and the sensor thresholds are obtained by having a tester drive the exoskeleton while walking. After the per-sensor thresholds for gait phase switching are set, a subject wearing the exoskeleton performs walking experiments; the sensor data corresponding to the successful-switch and failed-switch states are recorded and used as training samples for the logistic regression algorithm, and the gait-phase-switching estimation function is computed iteratively, so that the timing of stance-leg/swing-leg switching during walking is correctly recognized. Specifically, the method proceeds as follows:
Step 1: As shown in Fig. 3, the sensors of the exoskeleton assistive robot consist of three types (tilt, pressure, and angle), nine sensors in total. Tilt sensors 1 are mounted on the waist and on the thighs of both lower limbs, angle sensors 2 are mounted on the rotating output shafts of the hip and knee joints of the lower limbs, and thin-film pressure sensors 3 are attached beneath the insoles of the feet. The tilt sensors 1 measure the forward and backward inclination angles of the robot's waist and thighs.
Step 2: To obtain the stance/swing switching thresholds, with the exoskeleton assistive robot not started, a tester wears the robot and walks while driving it; the sensors collect and record the tilt, angle, and pressure data of the stance leg and the swing leg at the moment of gait phase switching, and these data are taken as the per-sensor thresholds for gait phase switching;
Step 3: After the tester puts on the exoskeleton assistive robot, the robot is started and walks according to the preset stance-leg and swing-leg motions, while the sensors collect real-time tilt, angle, and pressure data during walking; the real-time data are compared with the per-sensor thresholds, and when they exceed the thresholds the robot performs a stance/swing gait phase switch, the corresponding real-time data being recorded as one training sample; n training samples are thus obtained, denoted X = {X^(1), X^(2), ..., X^(i), ..., X^(n)}, where X^(i) = (X_1^(i), ..., X_m^(i)) denotes the i-th training sample and X_j^(i) denotes its j-th sensor datum; 1 ≤ i ≤ n; 1 ≤ j ≤ m. In this embodiment, since there are nine sensors in total, each training sample is a 9-dimensional feature vector, i.e. m = 9, where X_1^(i) to X_3^(i) are the three tilt-sensor data, X_4^(i) to X_7^(i) are the four angle-sensor data, and X_8^(i) to X_9^(i) are the two plantar-pressure data.
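As a concrete illustration of the 9-dimensional sample described in this step (all sensor readings and threshold values below are invented for the example, not taken from the patent):

```python
import numpy as np

# Hypothetical readings from the nine sensors of the embodiment:
tilt     = np.array([2.1, 8.4, -3.0])        # waist + two thigh tilt sensors (deg)
angles   = np.array([21.0, 5.2, 14.8, 2.9])  # two hip + two knee angle sensors (deg)
pressure = np.array([310.0, 12.0])           # two plantar pressure sensors (N)

# One training sample X^(i): concatenation into an m = 9 feature vector.
x = np.concatenate([tilt, angles, pressure])

# Step-3-style comparison: the sample is recorded as a switching event when
# every reading exceeds its per-sensor threshold (thresholds illustrative).
thresholds = np.array([2.0, 8.0, -3.5, 20.0, 5.0, 14.0, 2.5, 300.0, 10.0])
record_sample = bool(np.all(x > thresholds))
```

For these made-up values every component exceeds its threshold, so the sample would be recorded.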
Step 4: While the exoskeleton assistive robot performs the stance/swing gait phase switch in step 3, it is simultaneously judged whether the robot's switching action is consistent with the wearer's switching action corresponding to the sensor thresholds; if so, the switch is correct and the result is recorded as a success, denoted by the output variable Y^(i) = 1; otherwise the switch is wrong and the result is recorded as a failure, denoted by Y^(i) = 0;
Step 5: The n training samples X and the corresponding output variables Y are fed into the logistic regression model built by the logistic regression algorithm to obtain the gait-phase-switching estimation function, which serves as the criterion for gait phase switching. Specifically:
Step 5.1: Define the current iteration index k and initialize k = 1; set every one of the m components of the 0-th iterate θ^(0) to 0;
Step 5.2: Use the update formula (1) for the logistic regression parameter θ to compute the j-th component θ_j^(k) of the k-th iterate, thereby obtaining the vector θ^(k) formed by the m components of the k-th iteration; in this embodiment, θ^(k) is likewise a 9-dimensional vector:

$$\theta_j^{(k)} = \theta_j^{(k-1)} - \alpha \sum_{i=1}^{n} \left( h_{\theta^{(k-1)}}\left(X^{(i)}\right) - Y^{(i)} \right) X_j^{(i)} \qquad (1)$$

In formula (1), α is the iteration step size and h_{θ^(k-1)} is the logistic regression hypothesis function:

$$h_{\theta^{(k-1)}}\left(X^{(i)}\right) = \frac{1}{1 + e^{-\theta^{(k-1)} X^{(i)}}} \qquad (2)$$

In formula (2), θ^(k-1) X^(i) is the sum of the products of the components of the vector θ^(k-1) obtained at iteration k-1 with the components of the i-th training sample X^(i), i.e. θ^(k-1) X^(i) = Σ_{j=1}^{m} θ_j^(k-1) X_j^(i);
Step 5.3: Assign k+1 to k and return to step 5.2 until the iteration error ε = |θ^(k-1) - θ^(k)| is smaller than the set threshold, whereupon the iteration stops and the corresponding iteration count is recorded as p; the converged vector θ^(p) formed by the m components of the p-th iteration is thus obtained and taken as the final value of the logistic regression parameter θ;
Step 5.4: Obtain the gait-phase-switching estimation function h_θ(X) from formula (3):

$$h_{\theta}(X) = \frac{1}{1 + e^{-\theta^{(p)} X}} \qquad (3)$$

In formula (3), e^{-θ^(p) X} is the exponential function with argument -θ^(p) X, and X denotes the n training samples.
Step 6: A walking-control method based on the gait-phase-switching discrimination algorithm is constructed as shown in Fig. 2; when the tester walks wearing the exoskeleton assistive robot, the sensors collect real-time tilt, angle, and pressure test data, which are substituted into the gait-phase-switching estimation function to obtain the discrimination result;
Step 7: According to the discrimination result, the exoskeleton assistive robot drives the motors of its joints to perform the stance/swing gait phase switch.
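The runtime decision of steps 6 and 7 can be sketched as follows (an illustrative sketch only; the decision threshold of 0.5 and the parameter values are assumptions, not fixed by the patent):

```python
import math

def decide_switch(theta, x, threshold=0.5):
    """Evaluate the estimation function h_theta(x) on real-time sensor data
    and signal a stance/swing switch when the estimate crosses the threshold."""
    z = sum(t * xi for t, xi in zip(theta, x))  # inner product theta^(p) . x
    h = 1.0 / (1.0 + math.exp(-z))              # formula (3)
    return h >= threshold                       # True -> drive joint motors to switch

# Example with made-up parameters:
theta = [2.0, -1.0]
print(decide_switch(theta, [1.0, 0.0]))  # sigmoid(2.0) ~ 0.88 -> True
print(decide_switch(theta, [0.0, 1.0]))  # sigmoid(-1.0) ~ 0.27 -> False
```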
Obviously, the above specific embodiments are merely examples given to clearly illustrate the present invention and are not a limitation of its embodiments. On the basis of the above description, those of ordinary skill in the art can easily make other changes or substitutions in form, and such changes or substitutions also fall within the protection scope determined by the present invention.

Claims (2)

1. A gait phase switching method for an exoskeleton assistive robot based on logistic regression, characterized by the following steps:
Step 1: Mounting tilt sensors (1) on the waist and thighs of the exoskeleton assistive robot, mounting angle sensors (2) on the output shafts of the hip and knee joints of the lower limbs of the exoskeleton assistive robot, and attaching pressure sensors (3) to the soles of the exoskeleton assistive robot;
Step 2: With the exoskeleton assistive robot not started, using the sensors to collect the tilt, angle, and pressure data of the stance leg and the swing leg at the moment of gait phase switching while the exoskeleton assistive robot walks, and taking these data as the per-sensor thresholds for gait phase switching;
Step 3: Starting the exoskeleton assistive robot, letting it walk according to the preset stance-leg and swing-leg motions, using the sensors to collect real-time tilt, angle, and pressure data during walking, and comparing the real-time data with the per-sensor thresholds; when the real-time data exceed the thresholds, the exoskeleton assistive robot performs a stance/swing gait phase switch and the corresponding real-time data are recorded as one training sample, so that n training samples are obtained, denoted X = {X^(1), X^(2), ..., X^(i), ..., X^(n)}, where X^(i) = (X_1^(i), ..., X_m^(i)) denotes the i-th training sample and X_j^(i) denotes the j-th sensor datum in X^(i); 1 ≤ i ≤ n; 1 ≤ j ≤ m;
Step 4: While the exoskeleton assistive robot performs the stance/swing gait phase switch in step 3, simultaneously judging whether the switching action of the exoskeleton assistive robot is consistent with the switching action corresponding to the sensor thresholds; if so, the switch is correct and the result is recorded as a success, denoted by the output variable Y^(i) = 1; otherwise the switch is wrong and the result is recorded as a failure, denoted by Y^(i) = 0;
Step 5: Feeding the n training samples and the corresponding output variables into the logistic regression model built by the logistic regression algorithm to obtain the gait-phase-switching estimation function, which serves as the criterion for gait phase switching;
Step 6: Using the sensors to collect real-time tilt, angle, and pressure test data while the exoskeleton assistive robot walks, and substituting them into the estimation function to obtain the discrimination result;
Step 7: The exoskeleton assistive robot performs the stance/swing gait phase switch according to the discrimination result.
2. The gait phase switching method for an exoskeleton assistive robot based on logistic regression according to claim 1, characterized in that, in step 5, the gait-phase-switching estimation function is obtained as follows:
Step 5.1: Defining the current iteration index k and initializing k = 1; setting every one of the m components of the 0-th iterate θ^(0) to 0;
Step 5.2: Using the update formula (1) for the logistic regression parameter θ to compute the j-th component θ_j^(k) of the k-th iterate, thereby obtaining the vector θ^(k) formed by the m components of the k-th iteration:

$$\theta_j^{(k)} = \theta_j^{(k-1)} - \alpha \sum_{i=1}^{n} \left( h_{\theta^{(k-1)}}\left(X^{(i)}\right) - Y^{(i)} \right) X_j^{(i)} \qquad (1)$$

In formula (1), α is the iteration step size and h_{θ^(k-1)} is the logistic regression hypothesis function:

$$h_{\theta^{(k-1)}}\left(X^{(i)}\right) = \frac{1}{1 + e^{-\theta^{(k-1)} X^{(i)}}} \qquad (2)$$

In formula (2), θ^(k-1) X^(i) is the sum of the products of the components of the vector θ^(k-1) obtained at iteration k-1 with the components of the i-th training sample X^(i), i.e. θ^(k-1) X^(i) = Σ_{j=1}^{m} θ_j^(k-1) X_j^(i);
Step 5.3: Assigning k+1 to k and returning to step 5.2 until the iteration error ε = |θ^(k-1) - θ^(k)| is smaller than the set threshold, whereupon the iteration stops and the corresponding iteration count is recorded as p; the converged vector θ^(p) formed by the m components of the p-th iteration is thus obtained and taken as the final value of the logistic regression parameter θ;
Step 5.4: Obtaining the gait-phase-switching estimation function h_θ(X) from formula (3):

$$h_{\theta}(X) = \frac{1}{1 + e^{-\theta^{(p)} X}} \qquad (3)$$

In formula (3), e^{-θ^(p) X} is the exponential function with argument -θ^(p) X, and X denotes the n training samples.
CN201610701250.7A 2016-08-22 2016-08-22 Gait phase switching method for an exoskeleton assistive robot based on logistic regression Active CN106256336B (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610701250.7A CN106256336B (zh) 2016-08-22 2016-08-22 Gait phase switching method for an exoskeleton assistive robot based on logistic regression

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610701250.7A CN106256336B (zh) 2016-08-22 2016-08-22 Gait phase switching method for an exoskeleton assistive robot based on logistic regression

Publications (2)

Publication Number Publication Date
CN106256336A true CN106256336A (zh) 2016-12-28
CN106256336B CN106256336B (zh) 2018-07-31

Family

ID=57713855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610701250.7A Active 2016-08-22 2016-08-22 Gait phase switching method for an exoskeleton assistive robot based on logistic regression CN106256336B (zh)

Country Status (1)

Country Link
CN (1) CN106256336B (zh)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102440854A (zh) * 2011-09-05 2012-05-09 中国人民解放军总后勤部军需装备研究所 Human-machine coupled heavy-load carrying *** device and control method thereof
CN103083027A (zh) * 2013-01-10 2013-05-08 苏州大学 Gait phase discrimination method based on lower-limb joint motion information
JP2013208293A (ja) * 2012-03-30 2013-10-10 Equos Research Co Ltd Walking support device and walking support program
CN103431929A (zh) * 2013-08-29 2013-12-11 电子科技大学 Strength-enhanced powered exoskeleton walking gait sensing method and device
CN104582668A (zh) * 2012-06-15 2015-04-29 范德比尔特大学 Mobility assistance device
CN104729507A (zh) * 2015-04-13 2015-06-24 大连理工大学 Gait recognition method based on inertial sensors
KR20160089791A (ko) * 2015-01-20 2016-07-28 한국생산기술연구원 Gait phase recognition system and method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113520375A (zh) * 2021-07-21 2021-10-22 深圳大学 Gait phase division method, device, storage medium and ***
CN113520375B (zh) * 2021-07-21 2023-08-01 深圳大学 Gait phase division method, device, storage medium and ***

Also Published As

Publication number Publication date
CN106256336B (zh) 2018-07-31

Similar Documents

Publication Publication Date Title
CN109953761B (zh) Motion intention inference method for a lower-limb rehabilitation robot
Huo et al. Fast gait mode detection and assistive torque control of an exoskeletal robotic orthosis for walking assistance
CN110755070B (zh) System and method for fast prediction of lower-limb motion pose based on multi-sensor fusion
Chen et al. Locomotion mode classification using a wearable capacitive sensing system
Huang et al. Posture estimation and human support using wearable sensors and walking-aid robot
CN107273611A (zh) Gait planning method for a lower-limb rehabilitation robot based on lower-limb walking characteristics
CN103984962A (zh) Exoskeleton walking-mode recognition method based on electromyographic signals
CN110420029A (zh) Wireless walking-gait detection system based on multi-sensor fusion
CN106074073B (zh) Control system of a lower-limb rehabilitation robot and rehabilitation training strategy
Li et al. Real-time gait event detection for a lower extremity exoskeleton robot by infrared distance sensors
CN107536613A (zh) Robot and human lower-limb gait recognition device and method thereof
Nazmi et al. Generalization of ann model in classifying stance and swing phases of gait using EMG signals
Yi et al. Continuous prediction of lower-limb kinematics from multi-modal biomedical signals
Song et al. Adaptive neural fuzzy reasoning method for recognizing human movement gait phase
Liu et al. sEMG-based continuous estimation of knee joint angle using deep learning with convolutional neural network
Rabe et al. Performance of sonomyographic and electromyographic sensing for continuous estimation of joint torque during ambulation on multiple terrains
Rabe et al. Use of sonomyographic sensing to estimate knee angular velocity during varying modes of ambulation
CN106256336A (zh) Gait phase switching method for an exoskeleton assistive robot based on logistic regression
Kumar et al. Vision-based human joint angular velocity estimation during squat and walking on a treadmill actions
López-Delis et al. Continuous estimation prediction of knee joint angles using fusion of electromyographic and inertial sensors for active transfemoral leg prostheses
Khan et al. Pathological gait abnormality detection and segmentation by processing the hip joints motion data to support mobile gait rehabilitation
Zhang et al. A real-time gait phase recognition method based on multi-information fusion
Chalvatzaki et al. Estimating double support in pathological gaits using an hmm-based analyzer for an intelligent robotic walker
Chen et al. An adaptive gait learning strategy for lower limb exoskeleton robot
Guo et al. A real-time stable-control gait switching strategy for lower-limb rehabilitation exoskeleton

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20170505

Address after: No. 350 Shushan Lake Road, Anhui 230031, China

Applicant after: Hefei Inst. of Matter Sciences, Chinese Academy of Sciences

Address before: Dongpu Island, Shushan District, Anhui 230031, China

Applicant before: Hefei Inst. of Matter Sciences, Chinese Academy of Sciences

Applicant before: Institute of Advanced Manufacturing Technology

TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20171214

Address after: Room 307, Building 3, Changzhou Science and Education City, No. 18 Changwu Middle Road, Wujin District, Changzhou, Jiangsu 213164

Applicant after: Jiangsu Zhongke winsis Intelligent Robot Technology Co. Ltd.

Address before: No. 350 Shushan Lake Road, Anhui 230031, China

Applicant before: Hefei Inst. of Matter Sciences, Chinese Academy of Sciences

GR01 Patent grant
GR01 Patent grant