WO2017092225A1 - EMG-based wearable text input system and method - Google Patents

EMG-based wearable text input system and method

Info

Publication number
WO2017092225A1
WO2017092225A1 PCT/CN2016/080732 CN2016080732W WO2017092225A1 WO 2017092225 A1 WO2017092225 A1 WO 2017092225A1 CN 2016080732 W CN2016080732 W CN 2016080732W WO 2017092225 A1 WO2017092225 A1 WO 2017092225A1
Authority
WO
WIPO (PCT)
Prior art keywords
emg
sensor
text input
processor
based wearable
Prior art date
Application number
PCT/CN2016/080732
Other languages
English (en)
French (fr)
Inventor
伍楷舜
邹永攀
叶树锋
Original Assignee
深圳大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳大学
Publication of WO2017092225A1 publication Critical patent/WO2017092225A1/zh

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015: Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01: Indexing scheme relating to G06F3/01
    • G06F2203/011: Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Definitions

  • the present invention relates to the field of intelligent identification, and in particular, to an EMG-based wearable text input system and method.
  • wearable technology has long attracted attention in international computing academia and industry, but because of high manufacturing cost and technical complexity, many related devices have remained at the conceptual stage.
  • with the development of the mobile Internet, advances in technology and the launch of high-performance, low-power processing chips, some wearable devices have moved from concept to commercialization; new wearable devices keep appearing, and Google, Apple, Microsoft, Sony, Olympus, Motorola and many other technology companies have begun to explore this new field.
  • "wearable smart device" is a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, wristbands, watches, clothing and shoes; such devices greatly improve quality of life and work efficiency, and are gradually developing toward miniaturization, multi-functionality and long battery life. According to research, the global market for wearable smart devices will reach 6 billion USD by 2016.
  • the present invention provides an EMG-based wearable text input system and method, addressing the lack in the prior art of any wearable device that realizes text recognition.
  • an EMG-based wearable text input system is designed and manufactured, including a wristband body and a sensor module; the wristband body embeds an inertial measurement unit (IMU), a processor, a memory, a communication module and an expansion slot; the sensor module is embedded in the expansion slot; the sensor transmits EMG bioelectrical change information to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; the communication module transmits the analyzed character information.
  • the sensor module is a myoelectric sensor that senses the myoelectric bioelectrical changes produced by the tapping action, converts them into discrete digital signals and sends them to the processor.
  • the wristband body has a built-in power module, the power module is a battery or an external charger; and the interface on the wristband body is used for charging or data transmission.
  • the processor performs feature extraction, self-learning, key-action recognition and matching, and text combination on the signals transmitted by the sensor module and the IMU, and then sends the matched characters out through the communication module.
  • the myoelectric sensor comprises five electrodes, two of which are on the sides and connect to the expansion slot, while the other three are on the bottom and sense myoelectric changes.
  • the invention also provides an EMG-based wearable text input method, comprising the steps of: (S101) receiving data information from the sensor module; (S102) performing feature extraction; (S103) establishing a feature set; (S105) character recognition; and (S106) generating text characters.
  • in step (S101), the signals sensed by the sensor module and the IMU in each time window are collected, and a data model is established and stored in the memory; the sensor module captures the bioelectrical change signals generated by the muscles when a finger taps the paper keyboard; the IMU captures the accelerometer and gyroscope readings as the arm moves, from which the movement trajectory of the arm is analyzed.
  • in step (S102), the noise affecting the EMG signal is removed by a filtering algorithm and the useful feature signals are then extracted; in step (S103), the latest feature data is processed according to the training model to establish a structured feature model; in step (S105), the data model is analyzed and, according to the matching algorithm, the key corresponding to each action is identified, determining the character the user intends to input; in step (S106), missing characters are analyzed in context according to grammar rules, fuzzy repair is applied, and the text information is output.
  • the EMG-based wearable text input method further includes a self-learning step: according to the latest feature set, combined with the historical feature model, the data is further corrected to provide accurate data for the recognition model through a flexible interface.
  • the sensor module is a myoelectric sensor that senses the EMG bioelectrical changes generated by a tapping action, converts them into discrete digital signals and sends them to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; the communication module transmits the analyzed character information.
  • the smart wristband is more portable and more fashionable; the detachable design makes the device more engaging, and a reasonable choice of sensors improves accuracy while reducing energy consumption.
  • FIG. 1 is a schematic diagram of a demonstration of an EMG-based wearable text input system of the present invention.
  • FIG. 2 is a side elevational view of the wristband body of the EMG-based wearable text input system of the present invention.
  • FIG. 3 is a developed view of a wristband body of an EMG-based wearable text input system of the present invention.
  • Figure 4 is a schematic view showing the connection of the myoelectric sensor of the present invention.
  • FIG. 5 is a schematic structural diagram of an EMG-based wearable text input system according to the present invention.
  • FIG. 6 is a schematic flow chart of an EMG-based wearable text input method according to the present invention.
  • An EMG-based wearable text input system includes a wristband body and a sensor module; the wristband body embeds an inertial measurement unit (IMU), a processor, a memory, a communication module and an expansion slot; the sensor module is embedded in the expansion slot; the sensor transmits EMG bioelectrical change information to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; the communication module transmits the analyzed character information.
  • the sensor module is a myoelectric sensor that senses the myoelectric bioelectrical changes produced by the tapping action, converts them into discrete digital signals, and sends them to the processor.
  • the wristband body has a built-in power module, and the power module is a battery or an external charger; the interface on the wristband body is used for charging or data transmission.
  • the processor performs feature extraction, self-learning, key motion recognition and matching, and text combination on the signals transmitted by the sensor module and the inertial test unit IMU, and then sends the matched characters through the communication module.
  • the myoelectric sensor contains five electrodes, two of which are on the sides and connect to the expansion slot, while the other three are on the bottom and sense myoelectric changes.
  • the invention also provides an EMG-based wearable text input method, comprising the steps of: (S101) receiving data information from the sensor module; (S102) performing feature extraction; (S103) establishing a feature set; (S105) character recognition; and (S106) generating text characters.
  • the signals sensed by the sensor module and the IMU in each time window are collected, and a data model is established and stored in the memory; the sensor module captures the bioelectrical change signal generated by the muscles when a finger taps the paper keyboard; the IMU captures the accelerometer and gyroscope readings as the arm moves, and the movement trajectory of the arm is analyzed.
  • in step (S102), the noise affecting the EMG signal is removed by a filtering algorithm and the useful feature signals are then extracted; in step (S103), the latest feature data is processed according to the training model to establish a structured feature model; in step (S105), the data model is analyzed and, according to the matching algorithm, the key corresponding to each action is identified, determining the character the user intends to input; in step (S106), missing characters are analyzed in context according to grammar rules, fuzzy repair is applied, and the text information is output.
  • the EMG-based wearable text input method further includes a self-learning step: according to the latest feature set, combined with the historical feature model, the data is further corrected to provide accurate data for the recognition model through a flexible interface.
  • the sensor module is a myoelectric sensor that senses the EMG bioelectrical changes generated by a tapping action, converts them into discrete digital signals and sends them to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; the communication module transmits the analyzed character information.
  • compared with the prior art, the present invention can be combined with other functions such as sleep monitoring and step counting on the same device, providing multiple functions at the same time while looking more fashionable; according to successful applications of myoelectric sensors, such as the MYO gesture-control armband, an EMG sensor can provide accurate monitoring of the EMG changes at its electrodes, even for a simple tapping action.
  • the present invention can use the myoelectric sensor to perceive tapping actions and, through comprehensive analysis, associate each tapping action with a character input, as shown in FIG. 1.
  • FIG. 2 is a side view of the smart wristband, where P101 is the processing module and P102 is a myoelectric sensor; FIG. 3 is an expanded view of the smart wristband, where P201 are the expansion slots of the main wristband (eight in total), P202 are the internal connection points of the expansion slots, P203 is the processing module, and P204 is the data transmission bus and power cable.
  • FIG. 4 shows a detachable myoelectric sensor, where P301 are the three required sensing electrodes and P302 are the two electrodes used to connect to the expansion-slot connection points.
  • FIG. 5 shows the processing module, where P401 is the processor, P402 is the memory, P403 is the six-axis inertial measurement unit (IMU), P404 is the communication module, and P405 is the USB interface.
  • An EMG-based wearable text input system is a wearable intelligent system that mainly includes a wristband body and a myoelectric sensor module.
  • the myoelectric sensor is detachable and can be selectively embedded in the expansion slots of the wristband body; the embedding position and number affect recognition accuracy and device energy consumption; the myoelectric sensor senses EMG bioelectrical changes and transmits them to the processor for processing; the processing module of the wristband body embeds a six-axis inertial measurement unit (IMU) that records the user's arm movement data, from which the arm movement trajectory can be analyzed; the processing module also embeds a processor and memory containing a variety of processing algorithms to process the information acquired by the sensors; the wristband body transmits the analyzed character information to other electronic devices through the embedded communication module.
  • A myoelectric sensor with the same function can be selectively embedded in any expansion slot; the sensor senses the EMG bioelectrical changes produced by the tapping action, converts them into discrete digital signals and sends them to the processor.
  • the embedding position and number of myoelectric sensors affect accuracy: the more accurately a sensor covers the muscle, the higher the data accuracy; the more sensors are used, the higher the accuracy, but the more energy is consumed.
  • the wristband body has a built-in power module; power is supplied by a battery, which can be charged through the USB interface.
  • besides charging, the USB interface can also be connected to other electronic devices to transmit data.
  • the body is designed with expansion slots into which sensors can be embedded; once embedded, a sensor can transmit data and be charged; the myoelectric sensor has a built-in battery and can also be charged independently.
  • the processor contains a variety of algorithms (for example filtering, correction and matching algorithms) and performs "feature extraction - self-learning - key-action recognition and matching - text combination" on the signals sent by the myoelectric sensors and the IMU, finally sending the matched characters out through the communication module.
  • the present invention also provides an EMG-based wearable text input method; in one embodiment, text input is completed through the following steps.
  • S101 Data collection: according to the training model, the signals sensed by the EMG and IMU sensors in each time window are collected, and a data model is established and stored in the memory; the EMG sensor captures the bioelectrical change signals generated by the muscles when a finger taps the paper keyboard, and the IMU captures the accelerometer and gyroscope readings as the arm moves, from which the arm's movement trajectory is analyzed.
  • S102 Feature extraction: the data model acquired in S101 is combined with the data acquired by the IMU; the noise affecting the EMG signal is removed by a filtering algorithm, and the useful feature signals are then extracted.
  • S103 Feature set: the useful feature signals obtained in S102 are recorded in the feature set, which stores historical feature data as well as the latest feature data; the latest feature data is processed according to the training model to establish a structured feature model, and the existing feature model is gradually refined using the historical feature data.
  • S104 Self-learning interface: according to the latest feature set, combined with the historical feature model, the data is further corrected to provide accurate data for the recognition model through a flexible interface.
  • S105 Recognition model and key identification: the data model provided by the self-learning interface is analyzed and, according to the matching algorithm, the key corresponding to each action is identified, determining the character the user intends to input.
  • S106 Grammar-based text input interface: according to grammar rules, missing characters are analyzed in context and repaired by fuzzy matching, and the results are finally combined into words or even sentences.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention relates to the field of intelligent identification and discloses an EMG-based wearable text input system, which includes a wristband body and a sensor module. The wristband body embeds an inertial measurement unit (IMU), a processor, a memory, a communication module and an expansion slot; the sensor module is embedded in the expansion slot; the sensor transmits EMG bioelectrical change information to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; the communication module transmits the analyzed character information. The beneficial effects of the invention are that the smart wristband is more portable and more fashionable, the detachable design makes the device more engaging, and a reasonable choice of sensors improves accuracy while reducing energy consumption.

Description

EMG-based wearable text input system and method
Technical Field
The present invention relates to the field of intelligent identification, and in particular to an EMG-based wearable text input system and method.
Background Art
Today, wearable technology attracts continuous attention in international computing academia and industry, but because of high manufacturing cost and technical complexity, many related devices have remained at the conceptual stage. With the development of the mobile Internet, advances in technology and the launch of high-performance, low-power processing chips, some wearable devices have moved from concept to commercialization. New wearable devices keep appearing, and Google, Apple, Microsoft, Sony, Olympus, Motorola and many other technology companies have begun to explore this new field in depth. "Wearable smart device" is a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, wristbands, watches, clothing and shoes; such devices greatly improve quality of life and work efficiency, and are gradually developing toward miniaturization, multi-functionality and long battery life. According to research, the global market for wearable smart devices will reach 6 billion USD by 2016.
Under this trend, many researchers have been exploring the potential of wearable technology and its combination with other tools and technologies, for example Google's Google glass and Apple's Apple watch. By comparison, the traditional keyboard appears extremely bulky and serves a rather single purpose, and there is as yet no effective wearable virtual-keyboard application.
Summary of the Invention
In order to solve the problems in the prior art, the present invention provides an EMG-based wearable text input system and method, addressing the lack in the prior art of any wearable device that realizes text recognition.
The present invention is realized through the following technical solution: an EMG-based wearable text input system is designed and manufactured, including a wristband body and a sensor module; the wristband body embeds an inertial measurement unit (IMU), a processor, a memory, a communication module and an expansion slot; the sensor module is embedded in the expansion slot; the sensor transmits EMG bioelectrical change information to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; the communication module transmits the analyzed character information.
As a further improvement of the present invention: the sensor module is a myoelectric sensor that senses the EMG bioelectrical changes produced by a tapping action, converts them into discrete digital signals and sends them to the processor.
As a further improvement of the present invention: the wristband body has a built-in power module, which is a battery or is charged externally; the interface on the wristband body is used for charging or data transmission.
As a further improvement of the present invention: the processor performs feature extraction, self-learning, key-action recognition and matching, and text combination on the signals transmitted by the sensor module and the IMU, and then sends the matched characters out through the communication module.
As a further improvement of the present invention: the myoelectric sensor contains five electrodes, two of which are on the sides and connect to the expansion slot, while the other three are on the bottom and sense myoelectric changes.
The present invention also provides an EMG-based wearable text input method, comprising the following steps: (S101) receiving data information from the sensor module; (S102) performing feature extraction; (S103) establishing a feature set; (S105) character recognition; and (S106) generating text characters.
As a further improvement of the present invention: in step (S101), according to the training model, the signals sensed by the sensor module and the IMU in each time window are collected, and a data model is established and stored in the memory; the sensor module captures the bioelectrical change signals generated by the muscles when a finger taps the paper keyboard, and the IMU captures the accelerometer and gyroscope readings as the arm moves, from which the movement trajectory of the arm can be analyzed.
As a further improvement of the present invention: in step (S102), the noise affecting the EMG signal is removed by a filtering algorithm and the useful feature signals are then extracted; in step (S103), the latest feature data is processed according to the training model to establish a structured feature model; in step (S105), the data model is analyzed and, according to the matching algorithm, the key corresponding to each action is identified, determining the character the user intends to input; in step (S106), missing characters are analyzed in context according to grammar rules, fuzzy repair is applied, and the text information is output.
As a further improvement of the present invention: the EMG-based wearable text input method further includes a self-learning step: according to the latest feature set, combined with the historical feature model, the data is further corrected to provide accurate data for the recognition model through a flexible interface.
As a further improvement of the present invention: the sensor module is a myoelectric sensor that senses the EMG bioelectrical changes produced by a tapping action, converts them into discrete digital signals and sends them to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; the communication module transmits the analyzed character information.
The beneficial effects of the present invention are: the smart wristband is more portable and more fashionable; the detachable design makes the device more engaging, and a reasonable choice of sensors improves accuracy while reducing energy consumption.
Brief Description of the Drawings
FIG. 1 is a schematic diagram demonstrating the EMG-based wearable text input system of the present invention.
FIG. 2 is a side view of the wristband body of the EMG-based wearable text input system of the present invention.
FIG. 3 is an expanded view of the wristband body of the EMG-based wearable text input system of the present invention.
FIG. 4 is a schematic diagram of the connections of the myoelectric sensor of the present invention.
FIG. 5 is a schematic structural diagram of the EMG-based wearable text input system of the present invention.
FIG. 6 is a schematic flow chart of the EMG-based wearable text input method of the present invention.
Detailed Description of the Embodiments
The present invention is further described below with reference to the accompanying drawings and specific embodiments.
An EMG-based wearable text input system includes a wristband body and a sensor module. The wristband body embeds an inertial measurement unit (IMU), a processor, a memory, a communication module and an expansion slot; the sensor module is embedded in the expansion slot; the sensor transmits EMG bioelectrical change information to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; the communication module transmits the analyzed character information.
The sensor module is a myoelectric sensor that senses the EMG bioelectrical changes produced by a tapping action, converts them into discrete digital signals and sends them to the processor.
The wristband body has a built-in power module, which is a battery or is charged externally; the interface on the wristband body is used for charging or data transmission.
The processor performs feature extraction, self-learning, key-action recognition and matching, and text combination on the signals transmitted by the sensor module and the IMU, and then sends the matched characters out through the communication module.
The myoelectric sensor contains five electrodes, two of which are on the sides and connect to the expansion slot, while the other three are on the bottom and sense myoelectric changes.
The present invention also provides an EMG-based wearable text input method, comprising the following steps: (S101) receiving data information from the sensor module; (S102) performing feature extraction; (S103) establishing a feature set; (S105) character recognition; and (S106) generating text characters.
In step (S101), according to the training model, the signals sensed by the sensor module and the IMU in each time window are collected, and a data model is established and stored in the memory; the sensor module captures the bioelectrical change signals generated by the muscles when a finger taps the paper keyboard, and the IMU captures the accelerometer and gyroscope readings as the arm moves, from which the movement trajectory of the arm can be analyzed.
In step (S102), the noise affecting the EMG signal is removed by a filtering algorithm and the useful feature signals are then extracted; in step (S103), the latest feature data is processed according to the training model to establish a structured feature model; in step (S105), the data model is analyzed and, according to the matching algorithm, the key corresponding to each action is identified, determining the character the user intends to input; in step (S106), missing characters are analyzed in context according to grammar rules, fuzzy repair is applied, and the text information is output.
The EMG-based wearable text input method further includes a self-learning step: according to the latest feature set, combined with the historical feature model, the data is further corrected to provide accurate data for the recognition model through a flexible interface.
The sensor module is a myoelectric sensor that senses the EMG bioelectrical changes produced by a tapping action, converts them into discrete digital signals and sends them to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; the communication module transmits the analyzed character information.
By comparison, the present invention can combine other functions such as sleep monitoring and step counting on the same device, providing multiple functions at the same time while looking more fashionable. According to successful applications of myoelectric sensors, such as the MYO gesture-control armband, an EMG sensor can provide accurate monitoring of the EMG changes at its electrodes, even for a simple tapping action. The present invention can use the myoelectric sensor to perceive tapping actions and, through comprehensive analysis, associate each tapping action with a character input, as shown in FIG. 1.
In one embodiment, as shown in FIG. 2, FIG. 3, FIG. 4 and FIG. 5: FIG. 2 is a side view of the smart wristband, where P101 is the processing module and P102 is a myoelectric sensor; FIG. 3 is an expanded view of the smart wristband, where P201 are the expansion slots of the main wristband (eight in total), P202 are the internal connection points of the expansion slots, P203 is the processing module, and P204 is the data transmission bus and power cable; FIG. 4 shows a detachable myoelectric sensor, where P301 are the three required sensing electrodes and P302 are the two electrodes used to connect to the expansion-slot connection points; FIG. 5 shows the processing module, where P401 is the processor, P402 is the memory, P403 is the six-axis inertial measurement unit (IMU), P404 is the communication module, and P405 is the USB interface. The EMG-based wearable text input system is a wearable intelligent system mainly comprising the wristband body and the myoelectric sensor module. The myoelectric sensor is detachable and can be selectively embedded in the expansion slots of the wristband body; the embedding position and number affect recognition accuracy and device energy consumption. The myoelectric sensor senses EMG bioelectrical changes and transmits them to the processor for processing. The processing module of the wristband body embeds a six-axis inertial measurement unit (IMU) that records the user's arm movement data, from which the arm movement trajectory can be analyzed. The processing module also embeds a processor and memory containing a variety of processing algorithms to process the information acquired by the sensors. The wristband body transmits the analyzed character information to other electronic devices through the embedded communication module. A myoelectric sensor with the same function can be selectively embedded in any expansion slot; the sensor senses the EMG bioelectrical changes produced by the tapping action, converts them into discrete digital signals and sends them to the processor.
The embedding position and number of myoelectric sensors affect accuracy: the more accurately a sensor covers the muscle, the higher the data accuracy; and the more sensors are used, the higher the accuracy, but the more energy is consumed.
The wristband body has a built-in power module; power is supplied by a battery, which can be charged through an interface such as USB. Besides charging, the USB interface can also be connected to other electronic devices to transmit data. The body is designed with expansion slots into which sensors can be embedded; once embedded, a sensor can transmit data and be charged. The myoelectric sensor has a built-in battery and can also be charged independently. The processor contains a variety of algorithms (for example filtering, correction and matching algorithms) and performs "feature extraction - self-learning - key-action recognition and matching - text combination" on the signals sent by the myoelectric sensors and the IMU, finally sending the matched characters out through the communication module.
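Purely as an illustration of the data flow just described, and not the patent's actual implementation, the following Python sketch wires a myoelectric sensor module and an IMU to a processor that emits matched characters through a communication module; all class names, the energy "feature" and the key "matching" rule here are hypothetical placeholders.

```python
import random
from dataclasses import dataclass, field

@dataclass
class EMGSensorModule:
    """Detachable myoelectric sensor in an expansion slot: emits discrete digital samples."""
    def read_window(self, n=64):
        return [random.gauss(0.0, 1.0) for _ in range(n)]  # stand-in for one tap window

@dataclass
class IMUModule:
    """Six-axis IMU: accelerometer and gyroscope readings for the same window."""
    def read_window(self, n=64):
        return [[random.gauss(0.0, 1.0) for _ in range(6)] for _ in range(n)]

@dataclass
class Processor:
    """Placeholder for feature extraction, matching and text combination."""
    keys: str = "asdfjkl;"
    def recognize(self, emg, imu):
        energy = sum(x * x for x in emg) / len(emg)          # crude stand-in "feature"
        return self.keys[int(energy * 10) % len(self.keys)]  # crude stand-in "matching"

@dataclass
class CommunicationModule:
    sent: list = field(default_factory=list)
    def transmit(self, char):
        self.sent.append(char)  # a real device would send this over Bluetooth or USB

emg, imu, cpu, comm = EMGSensorModule(), IMUModule(), Processor(), CommunicationModule()
for _ in range(5):  # five tap windows
    comm.transmit(cpu.recognize(emg.read_window(), imu.read_window()))
print("".join(comm.sent))
```

In a real device the placeholder recognize() would be replaced by the feature extraction, self-learning and matching steps S102 to S105 described below.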
The present invention also provides an EMG-based wearable text input method; in one embodiment, text input is completed through the following steps.
S101: Data collection stage. According to the training model, the signals sensed by the EMG and IMU sensors in each time window are collected, and a data model is established and stored in the memory; the EMG sensor captures the bioelectrical change signals generated by the muscles when a finger taps the paper keyboard, and the IMU sensor captures the accelerometer and gyroscope readings as the arm moves, from which the movement trajectory of the arm can be analyzed.
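As a hedged sketch of the time-window collection in S101 (the window length, sampling rate and use of NumPy are assumptions, not details given in the patent), synthetic EMG and IMU streams are segmented into fixed windows and stored as a simple data model:

```python
import numpy as np

FS = 1000   # assumed EMG/IMU sampling rate (Hz)
WIN = 200   # assumed window length in samples (200 ms per tap window)

rng = np.random.default_rng(0)
emg_stream = rng.normal(0.0, 1.0, FS * 2)        # 2 s of synthetic EMG
imu_stream = rng.normal(0.0, 1.0, (FS * 2, 6))   # 6-axis accelerometer + gyroscope

def collect_windows(emg, imu, win=WIN):
    """S101: gather the signals sensed in each time window into a data model."""
    model = []
    for start in range(0, len(emg) - win + 1, win):
        model.append({"emg": emg[start:start + win],
                      "imu": imu[start:start + win]})
    return model

data_model = collect_windows(emg_stream, imu_stream)
print(len(data_model), "windows collected")      # on-device these would be written to memory
```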
S102: Feature extraction. The data model obtained in S101 is combined with the data acquired by the IMU; the noise affecting the EMG signal is removed by a filtering algorithm, and the useful feature signals are then extracted.
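The patent does not name the filtering algorithm or the specific features; one common choice for surface EMG is a Butterworth band-pass filter plus time-domain features such as RMS and mean absolute value, shown below purely as an assumed example (the 20-450 Hz band, the SciPy usage and the feature list are assumptions of this sketch).

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # assumed sampling rate (Hz)

def denoise(emg, low=20.0, high=450.0, fs=FS, order=4):
    """Remove noise outside the typical surface-EMG band (an assumed choice)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, emg)

def extract_features(emg_window, imu_window):
    """S102: a few common EMG features plus IMU summary statistics."""
    clean = denoise(emg_window)
    return np.array([
        np.sqrt(np.mean(clean ** 2)),        # RMS
        np.mean(np.abs(clean)),              # mean absolute value
        np.sum(np.abs(np.diff(clean))),      # waveform length
        *np.mean(imu_window, axis=0),        # mean accel/gyro per axis
    ])

rng = np.random.default_rng(1)
feat = extract_features(rng.normal(0, 1, 200), rng.normal(0, 1, (200, 6)))
print(feat.shape)  # feature vector for one tap window
```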
S103: Feature set. The useful feature signals obtained in S102 are recorded in the feature set, which stores historical feature data as well as the latest feature data. The latest feature data is processed according to the training model to establish a structured feature model, and the existing feature model is gradually refined statistically using the historical feature data.
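The internal structure of the feature set is not specified; one hedged reading is a labelled store of historical and latest feature vectors from which per-key templates (a "structured feature model") are computed, as in this illustrative sketch (the per-key labels and mean templates are assumptions):

```python
import numpy as np
from collections import defaultdict

class FeatureSet:
    """S103: stores historical and latest feature data per key label."""
    def __init__(self):
        self.history = defaultdict(list)  # key label -> list of feature vectors

    def add(self, label, features):
        self.history[label].append(np.asarray(features, dtype=float))

    def structured_model(self):
        """Per-key mean template, gradually refined as history grows."""
        return {label: np.mean(vectors, axis=0)
                for label, vectors in self.history.items()}

fs = FeatureSet()
fs.add("a", [0.9, 0.4]); fs.add("a", [1.1, 0.6]); fs.add("s", [0.2, 1.0])
print(fs.structured_model())
```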
S104: Self-learning interface. According to the latest feature set, combined with the historical feature model, the data is further corrected to provide accurate data for the recognition model through a flexible interface.
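The self-learning step is described only as correcting the data by combining the latest feature set with the historical feature model; one simple, assumed realization is an exponentially weighted update of each key template (the blending factor alpha and the template representation are hypothetical):

```python
import numpy as np

def self_learn(historical_model, latest_features, alpha=0.2):
    """S104 (assumed form): blend the newest per-key features into the
    historical templates so the recognition model tracks the current user."""
    updated = dict(historical_model)
    for label, vec in latest_features.items():
        vec = np.asarray(vec, dtype=float)
        old = updated.get(label)
        updated[label] = vec if old is None else (1 - alpha) * old + alpha * vec
    return updated

model = {"a": np.array([1.0, 0.5])}
model = self_learn(model, {"a": [1.2, 0.7], "s": [0.3, 1.1]})
print(model)
```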
S105: Recognition model and key identification. The data model provided by the self-learning interface is analyzed and, according to the matching algorithm, the key corresponding to each action is identified, determining the character the user intends to input.
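The matching algorithm itself is not disclosed; as a minimal stand-in, the sketch below matches a feature vector to the nearest per-key template by Euclidean distance (this choice of distance and template model is an assumption, not the patent's method):

```python
import numpy as np

def match_key(feature_vector, key_model):
    """S105 (assumed form): return the key whose template is closest
    to the observed feature vector (Euclidean distance)."""
    feature_vector = np.asarray(feature_vector, dtype=float)
    return min(key_model,
               key=lambda k: np.linalg.norm(feature_vector - key_model[k]))

key_model = {"a": np.array([1.0, 0.5]), "s": np.array([0.3, 1.1])}
print(match_key([0.95, 0.55], key_model))  # -> "a"
```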
S106: Grammar-based text input interface. According to grammar rules, missing characters are analyzed in context and repaired by fuzzy matching, finally being combined into words or even sentences.
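"Fuzzy repair" according to grammar rules is described only at a high level; the sketch below approximates it with a small dictionary and a similarity-based lookup via difflib, which is an assumption rather than the patent's method (the word list is hypothetical):

```python
from difflib import get_close_matches

DICTIONARY = ["hello", "help", "world", "wearable", "input"]  # assumed lexicon

def repair(recognized_tokens, dictionary=DICTIONARY):
    """S106 (assumed form): replace each possibly garbled character sequence
    with the closest dictionary word, then join the words into output text."""
    words = []
    for token in recognized_tokens:
        match = get_close_matches(token, dictionary, n=1, cutoff=0.6)
        words.append(match[0] if match else token)
    return " ".join(words)

print(repair(["hel?o", "wor1d"]))  # -> "hello world"
```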
The above is a further detailed description of the present invention in combination with specific preferred embodiments, and the specific implementation of the present invention should not be regarded as limited to these descriptions. For a person of ordinary skill in the art to which the present invention belongs, several simple deductions or substitutions may be made without departing from the concept of the present invention, all of which should be regarded as falling within the protection scope of the present invention.

Claims (10)

  1. An EMG-based wearable text input system, characterized in that it comprises a wristband body and a sensor module, the wristband body embedding an inertial measurement unit (IMU), a processor, a memory, a communication module and an expansion slot; the sensor module is embedded in the expansion slot; the sensor transmits EMG bioelectrical change information to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; and the communication module transmits the analyzed character information.
  2. The EMG-based wearable text input system according to claim 1, characterized in that the sensor module is a myoelectric sensor that senses the EMG bioelectrical changes produced by a tapping action, converts them into discrete digital signals and sends them to the processor.
  3. The EMG-based wearable text input system according to claim 1, characterized in that the wristband body has a built-in power module, which is a battery or is charged externally; the interface on the wristband body is used for charging or data transmission.
  4. The EMG-based wearable text input system according to claim 1, characterized in that the processor performs feature extraction, self-learning, key-action recognition and matching, and text combination on the signals transmitted by the sensor module and the IMU, and then sends the matched characters out through the communication module.
  5. The EMG-based wearable text input system according to claim 2, characterized in that the myoelectric sensor contains five electrodes, two of which are on the sides and connect to the expansion slot, while the other three are on the bottom and sense myoelectric changes.
  6. An EMG-based wearable text input method, characterized in that it comprises the following steps: (S101) receiving data information from the sensor module; (S102) performing feature extraction; (S103) establishing a feature set; (S105) character recognition; and (S106) generating text characters.
  7. The EMG-based wearable text input method according to claim 6, characterized in that, in step (S101), according to the training model, the signals sensed by the sensor module and the IMU in each time window are collected, and a data model is established and stored in the memory; the sensor module captures the bioelectrical change signals generated by the muscles when a finger taps the paper keyboard, and the IMU captures the accelerometer and gyroscope readings as the arm moves, from which the movement trajectory of the arm can be analyzed.
  8. The EMG-based wearable text input method according to claim 6, characterized in that, in step (S102), the noise affecting the EMG signal is removed by a filtering algorithm and the useful feature signals are then extracted; in step (S103), the latest feature data is processed according to the training model to establish a structured feature model; in step (S105), the data model is analyzed and, according to the matching algorithm, the key corresponding to each action is identified, determining the character the user intends to input; in step (S106), missing characters are analyzed in context according to grammar rules, fuzzy repair is applied, and the text information is output.
  9. The EMG-based wearable text input method according to claim 6, characterized in that the method further includes a self-learning step: according to the latest feature set, combined with the historical feature model, the data is further corrected to provide accurate data for the recognition model through a flexible interface.
  10. The EMG-based wearable text input method according to claim 6, characterized in that the sensor module is a myoelectric sensor that senses the EMG bioelectrical changes produced by a tapping action, converts them into discrete digital signals and sends them to the processor; the IMU records motion information used to analyze arm movement and transmits it to the processor; and the communication module transmits the analyzed character information.
PCT/CN2016/080732 2015-12-04 2016-04-29 EMG-based wearable text input system and method WO2017092225A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510884533.5A CN105511615B (zh) 2015-12-04 2015-12-04 EMG-based wearable text input system and method
CN201510884533.5 2015-12-04

Publications (1)

Publication Number Publication Date
WO2017092225A1 true WO2017092225A1 (zh) 2017-06-08

Family

ID=55719665

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/080732 WO2017092225A1 (zh) 2015-12-04 2016-04-29 EMG-based wearable text input system and method

Country Status (2)

Country Link
CN (1) CN105511615B (zh)
WO (1) WO2017092225A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105511615B (zh) * 2015-12-04 2019-03-05 深圳大学 EMG-based wearable text input system and method
CN105975091A (zh) * 2016-07-05 2016-09-28 南京理工大学 Virtual-keyboard human-computer interaction technique based on inertial sensors
CN110362190B (zh) * 2018-04-09 2021-10-29 中国科学院沈阳自动化研究所 MYO-based text input system and method
CN109634439B (zh) * 2018-12-20 2021-04-23 中国科学技术大学 Intelligent text input method
CN110811598A (zh) * 2019-11-14 2020-02-21 深圳先进技术研究院 Wristband-type biosignal acquisition device and manufacturing method thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101477636A (zh) * 2009-01-15 2009-07-08 电子科技大学 Arm-worn wearable computer terminal device
CN105022471A (zh) * 2014-04-23 2015-11-04 王建勤 Device and method for gesture recognition based on a pressure sensor array
US20150346833A1 (en) * 2014-06-03 2015-12-03 Beijing TransBorder Information Technology Co., Ltd. Gesture recognition system and gesture recognition method
CN105511615A (zh) * 2015-12-04 2016-04-20 深圳大学 EMG-based wearable text input system and method

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US11079846B2 (en) 2013-11-12 2021-08-03 Facebook Technologies, Llc Systems, articles, and methods for capacitive electromyography sensors
US11666264B1 (en) 2013-11-27 2023-06-06 Meta Platforms Technologies, Llc Systems, articles, and methods for electromyography sensors
US10684692B2 (en) 2014-06-19 2020-06-16 Facebook Technologies, Llc Systems, devices, and methods for gesture identification
US10409371B2 (en) 2016-07-25 2019-09-10 Ctrl-Labs Corporation Methods and apparatus for inferring user intent based on neuromuscular signals
US10656711B2 (en) 2016-07-25 2020-05-19 Facebook Technologies, Llc Methods and apparatus for inferring user intent based on neuromuscular signals
US11000211B2 (en) 2016-07-25 2021-05-11 Facebook Technologies, Llc Adaptive system for deriving control signals from measurements of neuromuscular activity
US11337652B2 (en) 2016-07-25 2022-05-24 Facebook Technologies, Llc System and method for measuring the movements of articulated rigid bodies
US10990174B2 (en) 2016-07-25 2021-04-27 Facebook Technologies, Llc Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
US10496168B2 (en) 2018-01-25 2019-12-03 Ctrl-Labs Corporation Calibration techniques for handstate representation modeling using neuromuscular signals
US11127143B2 (en) 2018-01-25 2021-09-21 Facebook Technologies, Llc Real-time processing of handstate representation model estimates
US10489986B2 (en) 2018-01-25 2019-11-26 Ctrl-Labs Corporation User-controlled tuning of handstate representation model parameters
US10460455B2 (en) 2018-01-25 2019-10-29 Ctrl-Labs Corporation Real-time processing of handstate representation model estimates
US11587242B1 (en) 2018-01-25 2023-02-21 Meta Platforms Technologies, Llc Real-time processing of handstate representation model estimates
US11069148B2 (en) 2018-01-25 2021-07-20 Facebook Technologies, Llc Visualization of reconstructed handstate information
US11361522B2 (en) 2018-01-25 2022-06-14 Facebook Technologies, Llc User-controlled tuning of handstate representation model parameters
US10817795B2 (en) 2018-01-25 2020-10-27 Facebook Technologies, Llc Handstate reconstruction based on multiple inputs
US11163361B2 (en) 2018-01-25 2021-11-02 Facebook Technologies, Llc Calibration techniques for handstate representation modeling using neuromuscular signals
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US11036302B1 (en) 2018-05-08 2021-06-15 Facebook Technologies, Llc Wearable devices and methods for improved speech recognition
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10772519B2 (en) 2018-05-25 2020-09-15 Facebook Technologies, Llc Methods and apparatus for providing sub-muscular control
US10687759B2 (en) 2018-05-29 2020-06-23 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
US11129569B1 (en) 2018-05-29 2021-09-28 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
US10970374B2 (en) 2018-06-14 2021-04-06 Facebook Technologies, Llc User identification and authentication with neuromuscular signatures
US11045137B2 (en) 2018-07-19 2021-06-29 Facebook Technologies, Llc Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US10842407B2 (en) 2018-08-31 2020-11-24 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US10905350B2 (en) 2018-08-31 2021-02-02 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US11567573B2 (en) 2018-09-20 2023-01-31 Meta Platforms Technologies, Llc Neuromuscular text entry, writing and drawing in augmented reality systems
US10921764B2 (en) 2018-09-26 2021-02-16 Facebook Technologies, Llc Neuromuscular control of physical objects in an environment
US10970936B2 (en) 2018-10-05 2021-04-06 Facebook Technologies, Llc Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
US11941176B1 (en) 2018-11-27 2024-03-26 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11797087B2 (en) 2018-11-27 2023-10-24 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
US11481031B1 (en) 2019-04-30 2022-10-25 Meta Platforms Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
CN111700718A (zh) * 2020-07-13 2020-09-25 北京海益同展信息科技有限公司 一种识别握姿的方法、装置、假肢及可读存储介质
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof

Also Published As

Publication number Publication date
CN105511615A (zh) 2016-04-20
CN105511615B (zh) 2019-03-05

Similar Documents

Publication Publication Date Title
WO2017092225A1 (zh) EMG-based wearable text input system and method
US10649549B2 Device, method, and system to recognize motion using gripped object
JP2021072136A (ja) Method and apparatus for combining muscle activity sensor signals and inertial sensor signals for gesture-based control
CN104134060B (zh) Sign language translation and display-vocalization system based on EMG signals and motion sensors
Ganti et al. Satire: a software architecture for smart attire
EP3315914B1 Step counting method, device and terminal
US11222729B2 Electronic device and method for providing stress index corresponding to activity of user
CN103513770A (zh) Human-machine interface device and human-computer interaction method based on a three-axis gyroscope
CN109634439B (zh) Intelligent text input method
CN107798322A (zh) Smart pen
CN110442233A (zh) Augmented-reality keyboard-and-mouse system based on gesture interaction
CN105404390A (zh) Wireless data glove modeling and gesture recognition method
CN105530581A (zh) Smart wearable device and control method based on voice recognition
CN110807471B (zh) Behavior recognition system and method based on multimodal sensors
CN116909390A (zh) Glove-based multimodal data acquisition system
CN103685519A (zh) Brush calligraphy practice assistance method and system
Roggen et al. An educational and research kit for activity and context recognition from on-body sensors
CN106843482A (zh) Gesture and posture detection device based on a wireless ad hoc network mode
CN107450672B (zh) Wrist-worn smart device with a high recognition rate
CN111831122B (zh) Gesture recognition system and method based on multi-joint data fusion
KR20080053022A (ko) Wearable computer using multi-objective optimization in a ubiquitous computing environment, and method therefor
CN109567814B (zh) Method, computing device, system and storage medium for classifying and recognizing toothbrushing actions
CN207301977U (zh) Virtual reality glove
Vishal et al. Sign language to speech conversion
CN112883935A (zh) Real-time sign language translation system device based on a neural network

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16869539

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 25/09/2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16869539

Country of ref document: EP

Kind code of ref document: A1