CN104133554A - Method and device for identifying leading limb - Google Patents

Method and device for identifying leading limb

Info

Publication number
CN104133554A
CN104133554A (application CN201410386783.1A)
Authority
CN
China
Prior art keywords
limbs
information
skin conductivity
leading
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201410386783.1A
Other languages
Chinese (zh)
Inventor
刘浩 (Liu Hao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhigu Ruituo Technology Services Co Ltd
Original Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhigu Ruituo Technology Services Co Ltd filed Critical Beijing Zhigu Ruituo Technology Services Co Ltd
Priority to CN201410386783.1A
Publication of CN104133554A
Priority to CN201410705579.1A
Priority to US15/501,766
Priority to PCT/CN2015/086310
Legal status: Withdrawn (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/053 Measuring electrical impedance or conductance of a portion of the body
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Neurology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Dermatology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Neurosurgery (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention provides a method and a device for identifying a leading (dominant) limb, and relates to the field of wearable devices. The method comprises: acquiring skin conductance information of a limb of a user; and identifying whether the limb is the leading limb according to the skin conductance information and reference information. With the method and device, a device worn by the user can conveniently configure itself according to the identification result, and the user experience is improved.

Description

Dominant limb identification method and device
Technical Field
The present application relates to the field of wearable devices, and in particular to a dominant limb identification method and device.
Background
In recent years, with the development of wearable devices, smart wristbands, smart bands, smart glasses and the like have entered people's lives and have greatly enriched and facilitated them. Because wearable devices are small, their interaction capabilities are generally limited. Users therefore expect such devices to configure themselves well, so as to reduce the number of manual setting operations.
For roughly 10-13% of people the left hand is the dominant hand; for the rest it is the right hand. If a wearable device can identify the user's dominant hand, the result can serve as an input to the device itself or to other devices, reducing the user's setting operations and improving the user experience.
Summary
An object of the present application is to provide a dominant limb identification method and device.
According to one aspect of at least one embodiment of the present application, a dominant limb identification method is provided, the method comprising:
acquiring skin conductance information of a limb of a user; and
identifying, according to the skin conductance information and reference information, whether the limb is the dominant limb.
According to one aspect of at least one embodiment of the present application, a dominant limb identification device is provided, the device comprising:
an acquisition module, configured to acquire skin conductance information of a limb of a user; and
an identification module, configured to identify, according to the skin conductance information and reference information, whether the limb is the dominant limb.
With the dominant limb identification method and device of the embodiments of the present application, skin conductance information of a user's limb is acquired, and whether the limb is the dominant limb is then identified according to the skin conductance information and reference information. A way of identifying the dominant limb is thus provided, which makes it possible for a device worn by the user to configure itself automatically according to the identification result, improving the user experience.
Brief Description of the Drawings
Fig. 1 is a flowchart of a dominant limb identification method according to an embodiment of the present application;
Fig. 2 is a schematic comparison of the skin conductance information of a dominant hand and of a non-dominant hand according to an embodiment of the present application;
Fig. 3 is a flowchart of a dominant limb identification method according to an embodiment of the present application;
Fig. 4 is a flowchart of a dominant limb identification method according to another embodiment of the present application;
Fig. 5 is a flowchart of a dominant limb identification method according to another embodiment of the present application;
Fig. 6 is a flowchart of a dominant limb identification method according to another embodiment of the present application;
Fig. 7 is a schematic block diagram of a dominant limb identification device according to an embodiment of the present application;
Fig. 8 is a schematic block diagram of a dominant limb identification device according to an embodiment of the present application;
Fig. 9 is a schematic block diagram of a dominant limb identification device according to another embodiment of the present application;
Fig. 10 is a schematic block diagram of a dominant limb identification device according to another embodiment of the present application;
Fig. 11 is a schematic block diagram of a dominant limb identification device according to another embodiment of the present application;
Fig. 12 is a schematic block diagram of a dominant limb identification device according to another embodiment of the present application;
Fig. 13 is a schematic diagram of the hardware structure of a dominant limb identification device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application are described in further detail below with reference to the drawings and examples. The following embodiments are intended to illustrate the present application, not to limit its scope.
Those skilled in the art will understand that, in the embodiments of the present application, the numbering of the steps below does not imply any order of execution; the execution order of the steps should be determined by their functions and internal logic, and should not limit the implementation of the embodiments of the present application in any way.
Fig. 1 is a flowchart of a dominant limb identification method according to an embodiment of the present application. The method may be implemented, for example, on a dominant limb identification device. As shown in Fig. 1, the method comprises:
S120: acquiring skin conductance information of a limb of a user;
S140: identifying, according to the skin conductance information and reference information, whether the limb is the dominant limb.
With the method of the embodiments of the present application, skin conductance information of a user's limb is acquired, and whether the limb is the dominant limb is then identified according to the skin conductance information and reference information. A way of identifying the dominant limb is thus provided, which makes it possible for a device worn by the user to configure itself automatically according to the identification result, improving the user experience.
The functions of steps S120 and S140 are described in detail below with reference to specific embodiments.
S120: acquiring skin conductance information of a limb of a user.
The limb may be one of the user's two hands, or one of the user's two feet. For simplicity, the description below mainly takes the case in which the limb is one of the user's two hands as an example. The skin conductance information may be a skin conductance level (SCL) feature value, which can be obtained by a skin conductance sensor in contact with the user's skin. Current smart wristbands, smart watches and the like generally already include such a sensor, so implementing the method does not add to the hardware cost of existing wearable devices.
The inventor found in the course of research that, in accordance with the principle of skin conductance, the skin conductance of a user's dominant hand and that of the non-dominant hand show a statistically significant difference, and the variance of the dominant hand's skin conductance signal and of the non-dominant hand's skin conductance signal is less than or equal to 0.05. As shown in Fig. 2, the upper curve is the skin conductance information obtained by sampling the skin conductance signal of a user's non-dominant hand, and the lower curve is the skin conductance information obtained by sampling the skin conductance signal of the user's dominant hand. The horizontal axis represents time in seconds, and the vertical axis represents conductance in microsiemens. The two curves differ markedly, and analysis shows that the mean of the dominant hand's skin conductance information is significantly lower than the mean of the non-dominant hand's skin conductance information. This is the basic principle by which the method of the present application identifies the dominant hand.
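Purely as an illustration of this measurement step, and not as part of the original disclosure, the following Python sketch computes the mean skin conductance of a window of sensor samples; the function name and the sample values are hypothetical.

```python
# Illustration of step S120 (not part of the original disclosure): computing the
# mean skin conductance level (SCL) of a window of sensor samples. The sample
# values and the function name are hypothetical.

from statistics import mean
from typing import Sequence


def mean_scl(samples: Sequence[float]) -> float:
    """Return the mean skin conductance of a sampled window, in microsiemens."""
    if not samples:
        raise ValueError("no skin conductance samples in the window")
    return mean(samples)


# Hypothetical samples taken at one-second intervals from each wrist (microsiemens).
dominant_hand = [2.1, 2.0, 2.3, 2.2, 2.1]       # lower conductance, as in Fig. 2
non_dominant_hand = [4.8, 5.1, 5.0, 4.9, 5.2]
print(mean_scl(dominant_hand), mean_scl(non_dominant_hand))
```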
S140: identifying, according to the skin conductance information and reference information, whether the limb is the dominant limb.
In one embodiment, the reference information is a threshold determined according to skin conductance information of the user's left limb and skin conductance information of the user's right limb. In step S140, whether the limb is the dominant limb can be identified according to the skin conductance information and the threshold, and the step may further comprise:
S141: in response to the mean of the skin conductance information being less than the threshold, identifying the limb as the dominant limb;
S142: in response to the mean of the skin conductance information being not less than the threshold, identifying the limb as not the dominant limb.
For example, the reference information may be a threshold determined according to the skin conductance information of the user's left hand and that of the user's right hand. Suppose the mean of the skin conductance information of the user's left hand falls into a first interval (L_min, L_max), the mean of the skin conductance information of the user's right hand falls into a second interval (R_min, R_max), and the user's left hand is the dominant hand, so that L_max < R_min. The threshold can then be determined as a value M such that L_max < M < R_min; that is, the threshold M is a value lying between the first interval and the second interval.
Accordingly, if the mean of the skin conductance information is less than the threshold M, it is considered to fall into the first interval and the limb is the dominant hand; if the mean of the skin conductance information is not less than the threshold M, it is considered to fall into the second interval and the limb is not the dominant hand.
It should be noted that in this embodiment the reference information has to be determined according to the skin conductance information of the user's left limb and that of the right limb, so the skin conductance information of both limbs needs to be obtained in advance. For example, before formally using the dominant limb identification device, the user may wear it on the left hand for a period of time and then on the right hand for a period of time, so as to complete a training process.
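The threshold-based embodiment can be sketched as follows. This is an illustrative sketch only: the disclosure does not prescribe how the interval bounds or the value M are computed, so the simple min/max statistics, the midpoint choice of M, and all names and numbers below are assumptions.

```python
# Illustrative sketch of the threshold-based embodiment (steps S141/S142).
# Assumptions not taken from the disclosure: per-session mean SCL values are
# already available, the intervals are estimated by min/max, and M is their midpoint.

from typing import Sequence


def train_threshold(left_means: Sequence[float], right_means: Sequence[float]) -> float:
    """Place a threshold M strictly between the two hands' intervals of mean SCL
    (in the patent's example the left hand is dominant, so L_max < M < R_min)."""
    lower, higher = sorted([list(left_means), list(right_means)], key=max)
    if max(lower) >= min(higher):
        raise ValueError("the two intervals overlap; no separating threshold exists")
    return (max(lower) + min(higher)) / 2.0  # the midpoint is one valid choice of M


def identify(scl_mean: float, threshold: float) -> bool:
    """S141/S142: the limb is identified as dominant iff its mean SCL is below M."""
    return scl_mean < threshold


# Hypothetical per-session mean SCL values (microsiemens) from a training phase.
M = train_threshold(left_means=[2.0, 2.2, 2.1], right_means=[4.9, 5.1, 5.0])
print(identify(2.05, M))   # True  -> identified as the dominant limb
print(identify(5.00, M))   # False -> identified as not the dominant limb
```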
In another embodiment, the reference information is skin conductance information of the user's left limb or skin conductance information of the user's right limb. In step S140, the skin conductance information can be compared with the reference information to identify whether the limb is the dominant limb, and the step may further comprise:
S141': in response to the skin conductance information and the reference information showing a statistically significant difference, and the mean of the skin conductance information being less than the mean of the reference information, identifying the limb as the dominant limb.
It may further comprise:
S142': in response to the skin conductance information and the reference information showing a statistically significant difference, and the mean of the skin conductance information being greater than the mean of the reference information, identifying the limb as not the dominant limb.
In this embodiment, the user does not need to carry out any training in advance. For example, in response to the user wearing the dominant limb identification device on the left hand or the right hand for the first time, first skin conductance information is obtained and used as the reference information; in response to the user wearing the device again after a period of time (for example, on the following day), second skin conductance information is obtained. The first skin conductance information and the second skin conductance information may come from the same hand, or may come from the two different hands. According to the identification principle described above, the user's dominant hand can be identified only when the first skin conductance information and the second skin conductance information come from different hands.
Therefore, in steps S141' and S142', the condition that the skin conductance information and the reference information show a statistically significant difference must first be satisfied, which indicates that they come from different hands; the identification can then be completed by comparing the mean of the skin conductance information with the mean of the reference information.
The advantage of this embodiment is that the user does not need to deliberately complete a training process; the reference information is collected in the course of natural use. However, the method still needs to obtain the skin conductance information of both hands over time, so it is mainly suitable for users who frequently change the hand on which the wearable device is worn.
In addition, those skilled in the art will understand that once the user has worn the dominant limb identification device on both hands, the method can reasonably select the skin conductance information of one hand as the reference information, thereby ensuring that the skin conductance information and the reference information show a statistically significant difference and completing the identification.
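A sketch of this comparison-based embodiment is given below. The disclosure does not name a particular test for the "statistically significant difference"; Welch's t-test from scipy is used here only as one plausible choice, and the significance level, function names and sample values are assumptions.

```python
# Illustrative sketch of the comparison-based embodiment (steps S141'/S142').
# The disclosure does not name a statistical test; Welch's t-test (scipy) and the
# 0.05 significance level are assumptions, as are the function name and data.

from statistics import mean
from typing import Optional, Sequence

from scipy.stats import ttest_ind


def identify_against_reference(samples: Sequence[float],
                               reference: Sequence[float],
                               alpha: float = 0.05) -> Optional[bool]:
    """Return True (dominant) or False (not dominant), or None when the two
    recordings do not differ significantly (likely the same hand)."""
    _, p_value = ttest_ind(samples, reference, equal_var=False)
    if p_value >= alpha:
        return None                          # precondition of S141'/S142' not met
    return mean(samples) < mean(reference)   # lower mean SCL -> dominant limb


# Hypothetical recordings: reference from the first wearing session, samples later.
reference = [4.9, 5.1, 5.0, 5.2, 4.8]   # e.g. first worn on the non-dominant hand
samples = [2.1, 2.0, 2.2, 2.3, 2.1]     # later worn on the other hand
print(identify_against_reference(samples, reference))   # True -> dominant limb
```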
Referring to Fig. 3, in one embodiment the method further comprises:
S150: performing a first operation according to the identification result.
For example, if it is known from the identification result that the user's dominant hand is wearing a smart wristband, then, because the dominant hand usually takes the leading part in work and its arm may move frequently, the user can be reminded to pay attention to the safety of the wristband so that it does not break.
In addition, the user's left and right hands can be further distinguished according to the identification result and dominant limb information input by the user, for example when the user indicates that he or she is left-handed.
Referring to Fig. 4, in one embodiment the method further comprises:
S160: receiving dominant limb information input by the user.
The dominant limb information is information indicating which of the user's own limbs is the dominant limb.
Referring to Fig. 5, in one embodiment the method further comprises:
S170: determining, according to the dominant limb information and the identification result, whether the limb is the left limb or the right limb.
For example, if the dominant limb information indicates that the user is left-handed and the identification result indicates that the limb is the dominant hand, it can be determined that the limb is the left hand.
Referring to Fig. 6, in one embodiment the method may further comprise:
S180: performing a second operation according to the result of determining whether the limb is the left limb or the right limb.
After it has been determined whether the limb is the left limb or the right limb, settings such as the display interface of a wearable device worn on that limb can be adjusted adaptively, or settings such as the display interface of a device gripped by that limb, for example a smartphone, can be adjusted.
For example, if it is determined that the limb is the user's left hand, the method can set the unlock gesture of a smart watch worn on that limb to a left-to-right swipe, and can arrange the consonants on the left half of a smartphone screen and the vowels on the right half.
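Steps S170 and S180 amount to a simple mapping from the user-declared handedness and the identification result to a limb side, and then to a setting. The following sketch illustrates that logic under hypothetical names and settings; it is not taken from the disclosure.

```python
# Illustration of steps S170/S180 with hypothetical names and settings
# (not taken from the disclosure).

def limb_side(user_is_left_handed: bool, limb_is_dominant: bool) -> str:
    """S170: combine the user-declared handedness with the identification result."""
    if user_is_left_handed:
        return "left" if limb_is_dominant else "right"
    return "right" if limb_is_dominant else "left"


def unlock_swipe_direction(side: str) -> str:
    """S180 (hypothetical second operation): choose an unlock swipe direction."""
    return "left_to_right" if side == "left" else "right_to_left"


side = limb_side(user_is_left_handed=True, limb_is_dominant=True)
print(side, unlock_swipe_direction(side))   # left left_to_right
```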
In addition, an embodiment of the present application further provides a computer-readable medium comprising computer-readable instructions that, when executed, perform the operations of steps S120 and S140 of the method of the embodiment shown in Fig. 1.
In summary, the method of the embodiments of the present application can identify whether a limb is the dominant limb according to skin conductance information of a user's limb and reference information, can determine whether the limb is the left limb or the right limb in combination with dominant limb information input by the user, and can perform corresponding operations according to the identification result or the determination result, thereby reducing the user's setting operations and improving the user experience.
Fig. 7 is a schematic block diagram of a dominant limb identification device according to an embodiment of the present invention. The dominant limb identification device may be provided as a functional module in a wearable device such as a smart wristband or a smart watch, or may itself serve as an independent wearable device for the user. As shown in Fig. 7, the device 700 may comprise:
an acquisition module 710, configured to acquire skin conductance information of a limb of a user;
an identification module 720, configured to identify, according to the skin conductance information and reference information, whether the limb is the dominant limb.
The device of the embodiments of the present application acquires skin conductance information of a user's limb and then identifies whether the limb is the dominant limb according to the skin conductance information and reference information, thereby providing a dominant limb identification device. This makes it possible for a wearable device or other equipment worn by the user to configure itself automatically according to the identification result, improving the user experience.
The functions of the acquisition module 710 and the identification module 720 are described in detail below with reference to specific embodiments.
The acquisition module 710 is configured to acquire skin conductance information of a limb of a user.
The limb may be one of the user's two hands, or one of the user's two feet. For simplicity, the description below mainly takes the case in which the limb is one of the user's two hands as an example. The skin conductance information may be a skin conductance level (SCL) feature value, which can be obtained by a skin conductance sensor in contact with the user's skin. Existing smart bands, smart wristbands and the like generally already include such a sensor, so the device of the present application does not increase the cost of existing wearable devices.
The identification module 720 is configured to identify, according to the skin conductance information and reference information, whether the limb is the dominant limb.
The reference information may be a threshold determined according to skin conductance information of the user's left limb and skin conductance information of the user's right limb, or may be skin conductance information of the user's left limb or right limb. These two cases are described separately below.
Referring to Fig. 8, in one embodiment the device 700 further comprises:
a first determination module 730, configured to determine a threshold as the reference information according to skin conductance information of the user's left limb and skin conductance information of the user's right limb.
In this embodiment, the identification module 720 is configured to identify the limb as the dominant limb in response to the mean of the skin conductance information being less than the threshold; and
to identify the limb as not the dominant limb in response to the mean of the skin conductance information being not less than the threshold.
For example, the reference information may be a threshold determined according to the skin conductance information of the user's left hand and that of the user's right hand. Suppose the mean of the skin conductance information of the user's left hand falls into a first interval (L_min, L_max), the mean of the skin conductance information of the user's right hand falls into a second interval (R_min, R_max), and the user's left hand is the dominant hand, so that L_max < R_min. The first determination module 730 can then determine the threshold as a value M such that L_max < M < R_min; that is, the threshold M is a value lying between the first interval and the second interval.
Accordingly, for the identification module 720, if the mean of the skin conductance information is less than the threshold M, it is considered to fall into the first interval and the limb is the dominant hand; if the mean of the skin conductance information is not less than the threshold M, it is considered to fall into the second interval and the limb is not the dominant hand.
It should be noted that in this embodiment the reference information has to be determined according to the skin conductance information of the user's left limb and that of the right limb, so the skin conductance information of both limbs needs to be obtained in advance. For example, before formally using the dominant limb identification device, the user may wear it on the left hand for a period of time and then on the right hand for a period of time, so as to complete a training process.
In another embodiment, the reference information is skin conductance information of the user's left limb or skin conductance information of the user's right limb;
the identification module 720 is configured to identify the limb as the dominant limb in response to the skin conductance information and the reference information showing a statistically significant difference and the mean of the skin conductance information being less than the mean of the reference information;
and to identify the limb as not the dominant limb in response to the skin conductance information and the reference information showing a statistically significant difference and the mean of the skin conductance information being greater than the mean of the reference information.
In this embodiment, the user does not need to carry out any training in advance. For example, in response to the user wearing the dominant limb identification device on the left hand or the right hand for the first time, first skin conductance information is obtained and used as the reference information; in response to the user wearing the device again after a period of time (for example, on the following day), second skin conductance information is obtained. The first skin conductance information and the second skin conductance information may come from the same hand, or may come from the two different hands. According to the identification principle of the above method embodiment, the user's dominant hand can be identified only when the first skin conductance information and the second skin conductance information come from different hands.
Therefore, for the identification module 720, the condition that the skin conductance information and the reference information show a statistically significant difference must first be satisfied, which indicates that they come from different hands; the identification can then be completed by comparing the mean of the skin conductance information with the mean of the reference information.
The advantage of this embodiment is that the user does not need to deliberately complete a training process; the reference information is collected in the course of natural use. However, the device still needs to obtain the skin conductance information of both hands over time, so it is mainly suitable for users who frequently change the hand on which the wearable device is worn.
In addition, those skilled in the art will understand that once the user has worn the dominant limb identification device on both hands, the device 700 can reasonably select the skin conductance information of one hand as the reference information, thereby ensuring that the skin conductance information and the reference information show a statistically significant difference and completing the identification.
Referring to Fig. 9, in one embodiment the device 700 further comprises:
a first execution module 740, configured to perform a first operation according to the identification result.
For example, if it is known from the identification result that the user's dominant hand is wearing a smart wristband, then, because the dominant hand usually takes the leading part in work and its arm may move frequently, the user can be reminded to pay attention to the safety of the wristband so that it does not break.
In addition, the user's left and right hands can be further distinguished according to the identification result and dominant limb information input by the user, for example when the user indicates that he or she is left-handed.
Referring to Fig. 10, in one embodiment the device 700 further comprises:
an input module 750, configured to receive dominant limb information input by the user.
The dominant limb information is information indicating which of the user's own limbs is the dominant limb.
Referring to Fig. 11, in one embodiment the device 700 further comprises:
a second determination module 760, configured to determine, according to the dominant limb information and the identification result, whether the limb is the left limb or the right limb.
For example, if the dominant limb information indicates that the user is left-handed and the identification result indicates that the limb is the dominant hand, it can be determined that the limb is the left hand.
Referring to Fig. 12, in one embodiment the device may further comprise:
a second execution module 770, configured to perform a second operation according to the result of determining whether the limb is the left limb or the right limb.
After it has been determined whether the limb is the left limb or the right limb, settings such as the display interface of a wearable device worn on that limb can be adjusted adaptively, or settings such as the display interface of a device gripped by that limb, for example a smartphone, can be adjusted. The wearable device, smartphone or other equipment can communicate with the dominant limb identification device 700 to determine whether it is worn or gripped by the same limb of the user.
For example, if it is determined that the limb is the user's left hand, the device can set the unlock gesture of a smart watch worn on that limb to a left-to-right swipe, and can arrange the consonants on the left half of a smartphone screen and the vowels on the right half.
An application scenario of the dominant limb identification method and device of the embodiments of the present application may be as follows: after getting up in the morning, a user puts a smart bracelet on the left wrist; the smart bracelet collects the user's skin conductance information, compares and analyzes it against pre-stored reference information, and determines that it is currently worn on the non-dominant hand; it then determines, from the information the user entered earlier indicating that the user is right-handed, that it is currently worn on the left hand. Later, when the user operates a smartphone with the left hand, the smart bracelet communicates with the smartphone, informs it that the bracelet is worn on the left hand, and sends its acceleration information to the smartphone; by comparing its own acceleration information with the received acceleration information of the smart bracelet, the smartphone finds that it, too, is being held by the left hand, and can therefore set the unlock keyboard of the screen to a left-to-right swipe.
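The scenario only states that the smartphone "compares" its own acceleration information with that received from the bracelet. One plausible way to realize that comparison, sketched below under assumed names and data, is to correlate the two acceleration streams over the same time window.

```python
# Sketch of the acceleration comparison in the scenario above. Using the Pearson
# correlation of acceleration magnitudes is an assumption made for illustration;
# the disclosure does not specify the comparison method.

from math import sqrt
from typing import Sequence


def pearson(a: Sequence[float], b: Sequence[float]) -> float:
    """Pearson correlation of two equally sampled signals (truncated to same length)."""
    n = min(len(a), len(b))
    a, b = a[:n], b[:n]
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / sqrt(var_a * var_b)


def held_by_same_hand(phone_accel: Sequence[float],
                      bracelet_accel: Sequence[float],
                      threshold: float = 0.8) -> bool:
    """Two devices moved by the same hand should show strongly correlated motion."""
    return pearson(phone_accel, bracelet_accel) > threshold


# Hypothetical acceleration magnitudes sampled over the same short time window.
phone = [0.1, 0.4, 0.9, 0.5, 0.2, 0.1]
bracelet = [0.12, 0.38, 0.95, 0.48, 0.22, 0.08]
print(held_by_same_hand(phone, bracelet))   # True -> phone configured for left-hand use
```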
The hardware structure of a dominant limb identification device according to another embodiment of the present application is shown in Fig. 13. The specific embodiments of the present application do not limit the specific implementation of the dominant limb identification device. Referring to Fig. 13, the device 1300 may comprise:
a processor 1310, a communications interface 1320, a memory 1330 and a communication bus 1340, wherein:
the processor 1310, the communications interface 1320 and the memory 1330 communicate with one another via the communication bus 1340;
the communications interface 1320 is used for communicating with other network elements;
the processor 1310 is configured to execute a program 1332, and may specifically perform the relevant steps of the method embodiment shown in Fig. 1 above.
Specifically, the program 1332 may include program code, and the program code includes computer operation instructions.
The processor 1310 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 1330 is used for storing the program 1332. The memory 1330 may include a high-speed RAM memory, and may further include a non-volatile memory, for example at least one disk memory. The program 1332 may specifically perform the following steps:
acquiring skin conductance information of a limb of a user;
identifying, according to the skin conductance information and reference information, whether the limb is the dominant limb.
For the specific implementation of each step in the program 1332, reference may be made to the corresponding steps or modules in the above embodiments, which are not repeated here. It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the devices and modules described above may be understood with reference to the corresponding processes in the foregoing method embodiments and are not repeated here.
Those of ordinary skill in the art will recognize that the units and method steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered to go beyond the scope of the present application.
If the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on such an understanding, the part of the technical solution of the present application that in essence contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a controller, a network device or the like) to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above embodiments are intended only to illustrate the present application and not to limit it. Those of ordinary skill in the relevant technical field can make various changes and modifications without departing from the spirit and scope of the present application, so all equivalent technical solutions also fall within the scope of the present application, and the scope of patent protection of the present application shall be defined by the claims.

Claims (20)

1. A dominant limb identification method, characterized in that the method comprises:
acquiring skin conductance information of a limb of a user; and
identifying, according to the skin conductance information and reference information, whether the limb is the dominant limb.
2. The method of claim 1, characterized in that the reference information is a threshold determined according to skin conductance information of the user's left limb and skin conductance information of the user's right limb.
3. The method of claim 2, characterized in that the identifying, according to the skin conductance information and reference information, whether the limb is the dominant limb comprises:
in response to the mean of the skin conductance information being less than the threshold, identifying the limb as the dominant limb;
in response to the mean of the skin conductance information being not less than the threshold, identifying the limb as not the dominant limb.
4. The method of claim 1, characterized in that the reference information is skin conductance information of the user's left limb or skin conductance information of the user's right limb.
5. The method of claim 4, characterized in that the identifying, according to the skin conductance information and reference information, whether the limb is the dominant limb comprises:
in response to the skin conductance information and the reference information showing a statistically significant difference, and the mean of the skin conductance information being less than the mean of the reference information, identifying the limb as the dominant limb.
6. The method of claim 4, characterized in that the identifying, according to the skin conductance information and reference information, whether the limb is the dominant limb comprises:
in response to the skin conductance information and the reference information showing a statistically significant difference, and the mean of the skin conductance information being greater than the mean of the reference information, identifying the limb as not the dominant limb.
7. The method of any one of claims 1 to 6, characterized in that the method further comprises: performing a first operation according to the identification result.
8. The method of any one of claims 1 to 7, characterized in that the method further comprises:
receiving dominant limb information input by the user.
9. The method of claim 8, characterized in that the method further comprises:
determining, according to the dominant limb information and the identification result, whether the limb is the left limb or the right limb.
10. The method of claim 9, characterized in that the method further comprises:
performing a second operation according to the result of determining whether the limb is the left limb or the right limb.
11. A dominant limb identification device, characterized in that the device comprises:
an acquisition module, configured to acquire skin conductance information of a limb of a user; and
an identification module, configured to identify, according to the skin conductance information and reference information, whether the limb is the dominant limb.
12. The device of claim 11, characterized in that the device further comprises:
a first determination module, configured to determine a threshold as the reference information according to skin conductance information of the user's left limb and skin conductance information of the user's right limb.
13. The device of claim 12, characterized in that the identification module is configured to identify the limb as the dominant limb in response to the mean of the skin conductance information being less than the threshold; and
to identify the limb as not the dominant limb in response to the mean of the skin conductance information being not less than the threshold.
14. The device of claim 11, characterized in that the reference information is skin conductance information of the user's left limb or skin conductance information of the user's right limb;
the identification module is configured to identify the limb as the dominant limb in response to the skin conductance information and the reference information showing a statistically significant difference and the mean of the skin conductance information being less than the mean of the reference information.
15. The device of claim 11, characterized in that the reference information is skin conductance information of the user's left limb or skin conductance information of the user's right limb;
the identification module is configured to identify the limb as not the dominant limb in response to the skin conductance information and the reference information showing a statistically significant difference and the mean of the skin conductance information being greater than the mean of the reference information.
16. The device of any one of claims 11 to 15, characterized in that the device further comprises:
a first execution module, configured to perform a first operation according to the identification result.
17. The device of any one of claims 11 to 16, characterized in that the device further comprises:
an input module, configured to receive dominant limb information input by the user.
18. The device of claim 17, characterized in that the device further comprises:
a second determination module, configured to determine, according to the dominant limb information and the identification result, whether the limb is the left limb or the right limb.
19. The device of claim 18, characterized in that the device further comprises:
a second execution module, configured to perform a second operation according to the result of determining whether the limb is the left limb or the right limb.
20. The device of any one of claims 11 to 19, characterized in that the device is a wearable device.
CN201410386783.1A 2014-08-07 2014-08-07 Method and device for identifying leading limb Withdrawn CN104133554A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201410386783.1A CN104133554A (en) 2014-08-07 2014-08-07 Method and device for identifying leading limb
CN201410705579.1A CN104360745A (en) 2014-08-07 2014-11-27 Dominant limb determination method and dominant limb determination device
US15/501,766 US20170235366A1 (en) 2014-08-07 2015-08-07 Dominant limb identification method and device
PCT/CN2015/086310 WO2016019894A1 (en) 2014-08-07 2015-08-07 Dominant limb identification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410386783.1A CN104133554A (en) 2014-08-07 2014-08-07 Method and device for identifying leading limb

Publications (1)

Publication Number Publication Date
CN104133554A true CN104133554A (en) 2014-11-05

Family

ID=51806266

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201410386783.1A Withdrawn CN104133554A (en) 2014-08-07 2014-08-07 Method and device for identifying leading limb
CN201410705579.1A Pending CN104360745A (en) 2014-08-07 2014-11-27 Dominant limb determination method and dominant limb determination device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201410705579.1A Pending CN104360745A (en) 2014-08-07 2014-11-27 Dominant limb determination method and dominant limb determination device

Country Status (1)

Country Link
CN (2) CN104133554A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104331163A (en) * 2014-11-27 2015-02-04 北京智谷睿拓技术服务有限公司 Method and equipment for determining dominant limb
CN104331165A (en) * 2014-11-27 2015-02-04 北京智谷睿拓技术服务有限公司 Method and equipment for determining dominant limb
CN104360747A (en) * 2014-11-27 2015-02-18 北京智谷睿拓技术服务有限公司 Dominant limb determination method and dominant limb determination device
CN104360750A (en) * 2014-11-27 2015-02-18 北京智谷睿拓技术服务有限公司 Dominant limb determination method and dominant limb determination device
CN104360746A (en) * 2014-11-27 2015-02-18 北京智谷睿拓技术服务有限公司 Dominant limb determination method and dominant limb determination device
CN104360748A (en) * 2014-11-27 2015-02-18 北京智谷睿拓技术服务有限公司 Dominant limb determination method and dominant limb determination device
CN104360739A (en) * 2014-11-07 2015-02-18 北京智谷睿拓技术服务有限公司 Method and equipment for determining dominant eye
CN104375644A (en) * 2014-11-07 2015-02-25 北京智谷睿拓技术服务有限公司 Method and device for determining dominant eye
CN104375649A (en) * 2014-11-27 2015-02-25 北京智谷睿拓技术服务有限公司 Leading limb determination method and device
CN104391580A (en) * 2014-12-09 2015-03-04 北京银河润泰科技有限公司 Wearing state processing method and device for wearable equipment
CN104407703A (en) * 2014-11-27 2015-03-11 北京智谷睿拓技术服务有限公司 Dominant limb determination method and apparatus
CN104615239A (en) * 2014-12-29 2015-05-13 北京智谷睿拓技术服务有限公司 Interaction method and device based on wearable device and wearable device
WO2016019894A1 (en) * 2014-08-07 2016-02-11 Beijing Zhigu Tech Co., Ltd. Dominant limb identification method and device
WO2016070653A1 (en) * 2014-11-07 2016-05-12 Beijing Zhigu Rui Tuo Tech Co., Ltd. Dominant eye determining method and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100451924C * 2005-12-30 2009-01-14 财团法人工业技术研究院 (Industrial Technology Research Institute) Emotion perception interactive recreational apparatus
CN102890558B * 2012-10-26 2015-08-19 北京金和软件股份有限公司 (Beijing Jinher Software Co., Ltd.) Method for detecting the handheld motion state of a mobile handheld device based on a sensor

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016019894A1 (en) * 2014-08-07 2016-02-11 Beijing Zhigu Tech Co., Ltd. Dominant limb identification method and device
CN104360739A (en) * 2014-11-07 2015-02-18 北京智谷睿拓技术服务有限公司 Method and equipment for determining dominant eye
US10646133B2 (en) 2014-11-07 2020-05-12 Beijing Zhigu Rui Tuo Tech Co., Ltd Dominant eye determining method and device
WO2016070653A1 (en) * 2014-11-07 2016-05-12 Beijing Zhigu Rui Tuo Tech Co., Ltd. Dominant eye determining method and device
CN104375644A (en) * 2014-11-07 2015-02-25 北京智谷睿拓技术服务有限公司 Method and device for determining dominant eye
CN104360746A (en) * 2014-11-27 2015-02-18 北京智谷睿拓技术服务有限公司 Dominant limb determination method and dominant limb determination device
CN104360748A (en) * 2014-11-27 2015-02-18 北京智谷睿拓技术服务有限公司 Dominant limb determination method and dominant limb determination device
CN104331163A (en) * 2014-11-27 2015-02-04 北京智谷睿拓技术服务有限公司 Method and equipment for determining dominant limb
CN104375649A (en) * 2014-11-27 2015-02-25 北京智谷睿拓技术服务有限公司 Leading limb determination method and device
CN104407703A (en) * 2014-11-27 2015-03-11 北京智谷睿拓技术服务有限公司 Dominant limb determination method and apparatus
CN104360750A (en) * 2014-11-27 2015-02-18 北京智谷睿拓技术服务有限公司 Dominant limb determination method and dominant limb determination device
CN104360747A (en) * 2014-11-27 2015-02-18 北京智谷睿拓技术服务有限公司 Dominant limb determination method and dominant limb determination device
CN104331165A (en) * 2014-11-27 2015-02-04 北京智谷睿拓技术服务有限公司 Method and equipment for determining dominant limb
CN104391580A (en) * 2014-12-09 2015-03-04 北京银河润泰科技有限公司 Wearing state processing method and device for wearable equipment
CN104391580B (en) * 2014-12-09 2017-05-17 北京银河润泰科技有限公司 Wearing state processing method and device for wearable equipment
CN104615239A (en) * 2014-12-29 2015-05-13 北京智谷睿拓技术服务有限公司 Interaction method and device based on wearable device and wearable device

Also Published As

Publication number Publication date
CN104360745A (en) 2015-02-18

Similar Documents

Publication Publication Date Title
CN104133554A (en) Method and device for identifying leading limb
CN105511615B (en) Wearable text input system and method based on EMG
CN105116995A (en) Intelligent wearable device and working method therefor
CN108062180B (en) Touch screen sensitivity control method and device, storage medium and mobile terminal
US10642820B2 (en) Method for data processing and related products
RU2662410C2 (en) Client intent in integrated search environment
CN104199543A (en) Leading limb identification method and system
CN103294586A (en) Automatic detection of user preferences for alternate user interface model
CN203943071U (en) Intelligent wireless electronics cigarette holder based on bluetooth 4.0 specifications
CN114707562B (en) Electromyographic signal sampling frequency control method and device and storage medium
CN104966011A (en) Method for non-collaborative judgment and operating authorization restriction for mobile terminal child user
CN108108117B (en) Screen capturing method and device and terminal
CN107004073A (en) The method and electronic equipment of a kind of face verification
CN104636321A (en) Text display method and text display device
WO2015134908A1 (en) Learn-by-example systems and methods
GB2582726A (en) Communication model for cognitive systems
CN102156574A (en) System and method for touch-control display by simulating mouse key through fingerprint identification
CN104573459B (en) Exchange method, interactive device and user equipment
CN104407703A (en) Dominant limb determination method and apparatus
CN106055404B (en) Method and device for cleaning background application program
CN108770046B (en) Method for saving electric quantity of smart watch
CN108304135A (en) A kind of method of adjustment and terminal of virtual modifier key
CN104216519A (en) Dominant limb cognition method and equipment
JP2018508081A (en) Input serial processing method, apparatus, device, and non-executable computer storage medium
CN114223139A (en) Interface switching method and device, wearable electronic equipment and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C04 Withdrawal of patent application after publication (patent law 2001)
WW01 Invention patent application withdrawn after publication

Application publication date: 20141105