CN110348275A - Gesture recognition method and apparatus, smart device, and computer-readable storage medium - Google Patents

Gesture recognition method and apparatus, smart device, and computer-readable storage medium

Info

Publication number
CN110348275A
CN110348275A (application CN201810307857.6A)
Authority
CN
China
Prior art keywords
time series
gesture
acceleration
resultant force
angular velocity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810307857.6A
Other languages
Chinese (zh)
Inventor
王海鹏
蔡亚菲
龚岩
刘武
李泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Northwestern Polytechnical University
Original Assignee
ZTE Corp
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp and Northwestern Polytechnical University
Priority to CN201810307857.6A
Publication of CN110348275A
Legal status: Pending


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 — Movements or behaviour, e.g. gesture recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present invention disclose a gesture recognition method, a smart device, and a storage medium, belonging to the field of human-computer interaction. The method comprises: acquiring acceleration and angular velocity data of a gesture to be recognized, and filtering the data; computing a resultant acceleration time series from the acceleration data; applying a unified-dimension (normalization) treatment to the resultant acceleration time series and the angular velocity time series; binning the normalized resultant acceleration and angular velocity time series according to preset rules; and matching the binned resultant acceleration and angular velocity time series against a pre-stored gesture template library to determine the class of the gesture to be recognized. Without adding sensor cost, the embodiments effectively normalize the spatial differences in gesture data, reducing the differences caused by variations in speed or strength, thereby achieving efficient and accurate gesture recognition. The method can be widely applied to smart devices that demand a good user-interaction experience.

Description

Gesture recognition method and apparatus, smart device, and computer-readable storage medium
Technical field
The present invention relates to the field of human-computer interaction, and in particular to an IMU-based (Inertial Measurement Unit) gesture recognition method and apparatus, a smart device, and a computer-readable storage medium.
Background
Gesture interaction is a natural and convenient means of human-computer interaction. It is simple, intuitive, and easy to learn and use, and can directly express a user's intent. The growing prevalence of smart devices has brought new uses and directions for gesture recognition, and the high-precision, low-power sensors with which smart devices are equipped have diversified the available recognition approaches. At present, common gesture recognition approaches include photoelectric sensing devices, infrared detection devices, IMUs, and acceleration sensors. An IMU is a device that measures an object's three-axis attitude angles (or angular velocities) and accelerations. The accelerometer, gyroscope, and other sensors of a smart device can capture the data produced during a gesture, after which gesture recognition can be performed, allowing users to interact by gesture easily at any time and place.
The DTW (Dynamic Time Warping) algorithm is a common gesture recognition algorithm. When the traditional Euclidean distance is used to measure the gap between two templates, the templates must have the same length, i.e., their data points must correspond one to one. Since gesture data samples are not necessarily of equal length, the data must be processed so that they can be aligned. By stretching or compressing the data along the time axis, DTW achieves this warping effect and can therefore compare sequences that are similar in shape but do not correspond well in time.
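The warping idea described above can be sketched in a few lines. The following is an illustrative implementation of classic dynamic-programming DTW on one-dimensional sequences; the function name and the absolute-difference local cost are assumptions for illustration, not taken from the patent:

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW between two 1-D sequences.

    Each cell holds the minimal cumulative cost of aligning a[:i] with
    b[:j]; a step may advance either sequence or both, which stretches
    or compresses the series along the time axis.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch b
                                 cost[i][j - 1],      # stretch a
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# Two similar shapes of different lengths align with zero cost:
print(dtw_distance([0, 1, 2, 1, 0], [0, 1, 1, 2, 1, 0, 0]))  # → 0.0
```

Note how the two sequences above have different lengths yet are judged identical, which is exactly the alignment property the paragraph describes — and, as the next paragraph points out, the same algorithm offers no such compensation for differences in amplitude.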
However, accelerometer and gyroscope readings vary with the user's gesture habits (for example, the strength and speed of the movement), so the same gesture can produce different data: the acceleration and angular velocity may differ considerably in amplitude. This spatial difference is not a real difference between distinct gestures; on the contrary, if it is taken as a feature, the same gesture may fail to be recognized. Although the DTW algorithm can adaptively adjust the data along the time axis, it cannot effectively handle differences in amplitude (spatial differences); that is, it lacks adaptability to changes of spatial scale.
Summary of the invention
In view of this, embodiments of the present invention aim to provide a gesture recognition method and apparatus, a smart device, and a computer-readable storage medium, so as to solve the technical problem that the DTW algorithm lacks adaptability to changes of spatial scale, thereby reducing the differences caused by variations in speed or strength.
The technical solutions adopted by the present invention to solve the above technical problem are as follows:
According to one aspect of the embodiments of the present invention, a gesture recognition method is provided, comprising:
acquiring acceleration and angular velocity data of a gesture to be recognized, and filtering the data;
computing a resultant acceleration time series from the acceleration data;
applying a unified-dimension treatment to the resultant acceleration time series and the angular velocity time series;
binning the normalized resultant acceleration time series and angular velocity time series according to preset rules;
matching the binned resultant acceleration and angular velocity time series against a pre-stored gesture template library to determine the class of the gesture to be recognized.
According to another aspect of the embodiments of the present invention, a gesture recognition apparatus is provided, comprising:
a gesture acquisition module, configured to acquire acceleration and angular velocity data of a gesture to be recognized;
a filtering module, configured to filter the acceleration and angular velocity data;
a resultant-acceleration computing module, configured to compute a resultant acceleration time series from the acceleration data;
a dimension unification module, configured to apply a unified-dimension treatment to the resultant acceleration time series and the angular velocity time series;
a binning module, configured to bin the normalized resultant acceleration time series and angular velocity time series according to preset rules;
a matching module, configured to match the binned resultant acceleration and angular velocity time series against a pre-stored gesture template library and determine the class of the gesture to be recognized.
According to yet another aspect of the embodiments of the present invention, a smart device is provided, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the computer program is executed by the processor, the steps of the above gesture recognition method are implemented.
According to still another aspect of the embodiments of the present invention, a computer-readable storage medium is provided, storing a computer program; when the computer program is executed by a processor, the steps of the above gesture recognition method are implemented.
The gesture recognition method and apparatus, smart device, and computer-readable storage medium provided by the embodiments of the present invention can, without adding sensor cost, effectively normalize the spatial differences in gesture data by applying a unified-dimension treatment and binning to the resultant acceleration and angular velocity time series, reducing the differences caused by variations in speed or strength, thereby achieving efficient and accurate gesture recognition. They can be widely applied to smart devices that demand a good user-interaction experience.
Description of the drawings
Fig. 1 is a flowchart of a gesture recognition method provided by Embodiment 1 of the present invention;
Fig. 2 is a flowchart of a gesture template matching method provided by Embodiment 2 of the present invention;
Fig. 3 is a functional structure diagram of the gesture recognition apparatus of Embodiment 3 of the present invention;
Fig. 4 is a functional structure diagram of the smart terminal provided by Embodiment 4 of the present invention.
The realization of the object, the functions, and the advantages of the present invention will be further described with reference to the accompanying drawings and the embodiments.
Detailed description of the embodiments
In order to make the technical problems to be solved, the technical solutions, and the advantages clearer, the present invention is described in further detail below in conjunction with the drawings and embodiments. It should be understood that the specific embodiments described herein serve only to explain the present invention and are not intended to limit it.
Embodiment 1
As shown in Fig. 1, a gesture recognition method provided by an embodiment of the present invention comprises:
S101: acquiring acceleration and angular velocity data of a gesture to be recognized, and filtering the data.
Specifically, the acceleration and angular velocity data are generally collected by the accelerometer and gyroscope built into the smart device; these data describe the gesture behavior. However, non-target gesture signals are mixed in during collection and must be filtered out to avoid their influence. Many filtering methods exist; experiments and investigation show that the frequency of gesture behavior is generally below 3.5 Hz, so filtering the acceleration and angular velocity data with a low-pass filter whose cutoff frequency is 3.5 Hz works well.
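The filtering step above can be sketched as follows. The single-pole IIR design and the 100 Hz sampling rate are assumptions for illustration — the text specifies only the 3.5 Hz cutoff, not a filter topology:

```python
import math

def lowpass(samples, cutoff_hz=3.5, fs_hz=100.0):
    """Single-pole IIR low-pass filter.

    y[n] = y[n-1] + alpha * (x[n] - y[n-1]), with alpha derived from
    the cutoff frequency; components well above cutoff_hz (such as
    sensor noise) are attenuated, while slow gesture motion passes.
    """
    dt = 1.0 / fs_hz
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    out, y = [], samples[0]
    for x in samples:
        y = y + alpha * (x - y)
        out.append(y)
    return out

# A constant (0 Hz) signal passes through unchanged:
print(lowpass([1.0] * 5))  # → [1.0, 1.0, 1.0, 1.0, 1.0]
```

In practice each of the six channels (three acceleration axes, three angular velocity axes) would be filtered independently with the same cutoff.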
S102: computing a resultant acceleration time series from the acceleration data.
Specifically, a common three-axis accelerometer collects the acceleration components along the three orthogonal axes X, Y, and Z. The components on the three axes can differ greatly with changes in device attitude, and such attitude changes do not directly express how the force changes during the gesture; if DTW were used to compute the gap on each of the three axes separately and the gaps were summed, the total could not measure the real difference between gestures well. The three axis components are therefore combined into their vector sum, the original resultant; since the measured resultant includes gravity, gravity is subtracted. That is, at any moment, the magnitude of the three-axis vector sum minus gravity is the resultant acceleration produced at that moment. The quantity obtained this way is a scalar with no direction, and the angular velocity data provided by the gyroscope compensates for this loss of directional information. The time series of the resultant acceleration can be computed by the following formula:
s(t) = sqrt(a_x(t)^2 + a_y(t)^2 + a_z(t)^2) − g
where s denotes the resultant acceleration, a_x(t), a_y(t), and a_z(t) denote the accelerations along the x-, y-, and z-axes at time t, and g denotes the gravitational acceleration.
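The formula above translates directly to code. This is a minimal sketch; the value of g and the function name are illustrative choices:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2 (assumed value)

def resultant_acceleration(ax, ay, az):
    """s(t) = sqrt(ax^2 + ay^2 + az^2) - g for each sample t."""
    return [math.sqrt(x * x + y * y + z * z) - G
            for x, y, z in zip(ax, ay, az)]

# A device at rest measures only gravity, so the resultant is ~0:
print(resultant_acceleration([0.0], [0.0], [9.81]))  # ≈ [0.0]
```

The three per-axis sequences collapse into one scalar sequence, which is what makes the later per-sequence normalization and binning steps well defined.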
S103: applying a unified-dimension treatment to the resultant acceleration time series and the angular velocity time series.
Specifically, different strengths produce gestures of different speeds, and the speed of a gesture movement largely affects the magnitude of the gesture data, causing considerable amplitude differences in the acceleration and angular velocity, i.e., spatial differences. Such spatial differences are not real differences between distinct gestures; on the contrary, if they are taken as features, the same gesture may fail to be recognized. If the DTW algorithm directly compares the Euclidean distance between samples, differences produced merely by different gesture speeds will either be amplified or will mask the common features, yielding unreasonable results. DTW achieves its warping effect by stretching or compressing data along the time axis and thus handles sequences that are similar in shape but misaligned in time, but it cannot effectively handle spatial differences. The data must therefore be normalized to a unified dimension, so as to reduce the differences caused by variations in speed or strength.
When unifying the dimensions of the resultant acceleration and angular velocity time series, directly compressing or stretching the whole sequence into a fixed interval — i.e., linear (min-max) normalization — easily loses useful information because of a few very large or very small singular values. Zero-mean normalization is therefore more reasonable.
Specifically, the dimension of the resultant acceleration time series can be unified by the following formula:
S(t) = (s(t) − mean_S) / std_S
where s(t) denotes the resultant acceleration at time t, mean_S denotes the mean of all resultant accelerations over the whole time series, and std_S denotes the standard deviation of all resultant accelerations over the whole time series.
The dimension of the angular velocity time series can be unified by the following formula:
W(t) = (w(t) − mean_w) / std_w
where w(t) denotes the angular velocity at time t, mean_w denotes the mean of all angular velocity magnitudes over the whole time series, and std_w denotes the standard deviation of all angular velocity magnitudes over the whole time series.
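The zero-mean normalization above can be sketched as follows (an illustrative helper, not the patent's code). Note that the same gesture shape performed at twice the amplitude normalizes to an identical series, which is exactly the speed/strength invariance this step is after:

```python
import math

def zscore(series):
    """Zero-mean normalization: (x - mean) / std over the whole series."""
    n = len(series)
    mean = sum(series) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return [(x - mean) / std for x in series]

# The same shape at twice the amplitude normalizes identically:
a = zscore([0.0, 1.0, 2.0, 1.0, 0.0])
b = zscore([0.0, 2.0, 4.0, 2.0, 0.0])
print(all(abs(x - y) < 1e-12 for x, y in zip(a, b)))  # → True
```

Unlike min-max normalization, a single outlier shifts the mean and standard deviation only slightly, so it does not compress the rest of the series into a narrow band.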
S104: binning the normalized resultant acceleration time series and angular velocity time series according to preset rules.
Specifically, small-scale variations occur within a gesture movement, so the data cannot be used directly and needs further processing. That is, a first preset piecewise function maps the resultant acceleration magnitude at each moment of the resultant acceleration time series to a discrete resultant acceleration value, and the resultant acceleration time series is then expressed with these discrete values; a second preset piecewise function maps the angular velocity magnitude at each moment of the angular velocity time series to a discrete angular velocity value, and the angular velocity time series is expressed with these discrete values.
This binning operation reduces, to a certain extent, the differences introduced by such small-scale variations. Experiments verify that the following binning tables can be used:
Table 1: resultant acceleration binning table

Resultant acceleration magnitude    Bin
0 ~ 0.25·std_S                      0
0.25·std_S ~ 0.5·std_S              0.5
0.5·std_S ~ std_S                   1
std_S ~ 1.5·std_S                   1.5
1.5·std_S ~ 2·std_S                 2
> 2·std_S                           3

Note: std_S denotes the standard deviation of all resultant accelerations over the whole resultant acceleration time series.
According to Table 1, a resultant acceleration falling within a given range is represented by the number in the corresponding bin. For example, if the magnitude of the resultant acceleration S(t) at time t is 0.2·std_S, which lies between 0 and 0.25·std_S, then S(t) is set to 0.
Table 2: angular velocity binning table

Angular velocity magnitude          Bin
0 ~ 0.25·std_w                      0
0.25·std_w ~ 0.5·std_w              0.5
0.5·std_w ~ std_w                   1
std_w ~ 1.5·std_w                   1.5
1.5·std_w ~ 2·std_w                 2
> 2·std_w                           3

Note: std_w denotes the standard deviation of all angular velocity magnitudes over the whole angular velocity time series.
According to Table 2, an angular velocity falling within a given range is represented by the number in the corresponding bin. For example, if the magnitude of the angular velocity w(t) at time t is 0.2·std_w, then w(t) is set to 0.
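Tables 1 and 2 above share one piecewise mapping, which can be sketched as follows. The handling of values that fall exactly on a boundary (inclusive upper edges here) is an assumption, since the tables do not specify it:

```python
def bin_value(x, std):
    """Map a magnitude to its discrete bin value per Tables 1 and 2.

    Thresholds are multiples of the series' standard deviation; each
    range maps to the label 0, 0.5, 1, 1.5, 2, or 3.
    """
    edges = [(0.25, 0), (0.5, 0.5), (1.0, 1), (1.5, 1.5), (2.0, 2)]
    for mult, label in edges:
        if x <= mult * std:
            return label
    return 3

# With std = 1.0, a magnitude of 0.2 falls in 0 ~ 0.25·std -> bin 0:
print(bin_value(0.2, 1.0))  # → 0
print(bin_value(2.5, 1.0))  # → 3
```

Applying `bin_value` element-wise, with std_S for the resultant acceleration series and std_w for the angular velocity series, yields the discrete sequences that are passed to template matching in S105.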
S105: matching the binned resultant acceleration and angular velocity time series against a pre-stored gesture template library to determine the class of the gesture to be recognized.
The gesture recognition method provided by this embodiment can, without adding sensor cost, effectively normalize the spatial differences in gesture data by applying a unified-dimension treatment and binning to the resultant acceleration and angular velocity time series, reducing the differences caused by variations in speed or strength, thereby achieving efficient and accurate gesture recognition. It can be widely applied to smart devices that demand a good user-interaction experience.
Embodiment 2
As shown in Fig. 2, the gesture template matching method provided by an embodiment of the present invention comprises:
S1051: comparing the binned resultant acceleration and angular velocity time series with the target gesture template of each major class in the preset gesture template library, and selecting the major class to which the target gesture template with the smallest difference belongs as the target set.
Here, the preset gesture template library is a two-level classification model library pre-stored on the smart device. To guarantee recognition precision, numerous samples are usually generated in advance for each gesture and processed through steps S101–S104 of Embodiment 1 above; using the DTW algorithm, the sample with the smallest average Euclidean distance to the other samples of the same gesture is then stored in the template library as the template. Repeating these steps produces multiple gesture templates and thus builds the complete template library.
Although this effectively improves recognition accuracy, the computation required for template matching grows significantly as the number of gesture types increases and consumes more resources — and mobile smart devices in particular are highly sensitive to storage, computation, and power consumption — so an efficient template matching algorithm is needed to reduce the complexity. Since the Euclidean distance obtained by DTW is the criterion that distinguishes different gestures, and the distances computed between similar gestures are relatively small, comparisons can be made hierarchically to improve efficiency: gestures are divided into major classes, i.e., similar gestures are grouped into one major class. Concretely, taking all samples of each specific gesture class as a unit, the average DTW warping distance between all gesture classes is computed, a classification threshold is set, and gesture classes whose inter-class distance is below the threshold are grouped into one major class. Next, for each sample in a major class, the gap to all other samples in the major class — the DTW warping distance — is computed, and the sample with the smallest average distance to the other templates in the major class is chosen as the target template of that major class. This method concentrates the computation in the early template-training phase, while the computation in the gesture recognition phase is greatly reduced.
S1052: comparing the binned resultant acceleration and angular velocity time series with all gesture data in the target set, and determining the class to which the gesture with the smallest difference belongs as the class of the gesture to be recognized.
Further, since the input gesture data may not be a recognition target, a threshold needs to be set as a bound on the Euclidean distance: when the Euclidean distance exceeds the threshold, the match is judged to have failed, thereby providing fault tolerance.
The gesture template matching method provided by this embodiment first compares against the target template of each major class, selects the class with the smallest gap for further comparison, and no longer considers the other major classes. This turns a one-by-one comparison against all templates into a comparison against a subset of templates, lowering the number of template matches and thereby improving matching efficiency.
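The two-level matching of S1051–S1052, including the rejection threshold, can be sketched as follows. The library layout and all names are illustrative assumptions, and a toy distance function stands in for DTW:

```python
def classify(sample, library, dtw, reject_threshold):
    """Two-level template matching.

    `library` maps each major class to {"target": template,
    "members": {gesture_class: [templates, ...]}}. Stage 1 picks the
    major class whose target template is closest to the sample;
    stage 2 searches only inside that major class; a best distance
    above reject_threshold means the match failed.
    """
    # Stage 1: one comparison per major class, against its target template.
    best_major = min(library, key=lambda m: dtw(sample, library[m]["target"]))
    # Stage 2: compare against all templates inside the chosen major class.
    best_class, best_dist = None, float("inf")
    for gcls, templates in library[best_major]["members"].items():
        for tpl in templates:
            d = dtw(sample, tpl)
            if d < best_dist:
                best_class, best_dist = gcls, d
    return best_class if best_dist <= reject_threshold else None

# Toy distance stand-in for DTW and a tiny two-major-class library:
dist = lambda a, b: sum(abs(x - y) for x, y in zip(a, b))
lib = {
    "A": {"target": [0, 1, 0], "members": {"circle": [[0, 1, 0]]}},
    "B": {"target": [3, 3, 3], "members": {"shake": [[3, 3, 3]]}},
}
print(classify([0, 1, 0], lib, dist, reject_threshold=1.0))  # → circle
print(classify([9, 9, 9], lib, dist, reject_threshold=1.0))  # → None
```

Only one target template per major class is examined in stage 1, so the number of full comparisons scales with the size of a single major class rather than with the whole library.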
Embodiment 3
As shown in Fig. 3, a gesture recognition apparatus provided by an embodiment of the present invention comprises a gesture acquisition module 10, a filtering module 20, a resultant-acceleration sequence computing module 30, a dimension unification module 40, a binning module 50, and a template matching module 60, in which:
the gesture acquisition module 10 is configured to acquire acceleration and angular velocity data of a gesture to be recognized;
the filtering module 20 is configured to filter the acceleration and angular velocity data;
the resultant-acceleration sequence computing module 30 is configured to compute a resultant acceleration time series from the acceleration data;
the dimension unification module 40 is configured to apply a unified-dimension treatment to the resultant acceleration time series and the angular velocity time series;
the binning module 50 is configured to bin the normalized resultant acceleration time series and angular velocity time series according to preset rules;
the template matching module 60 is configured to match the binned resultant acceleration and angular velocity time series against a pre-stored gesture template library and determine the class of the gesture to be recognized.
In some embodiments, the template matching module 60 further comprises:
a first determination unit 601, configured to compare the binned resultant acceleration and angular velocity time series with the target gesture template of each major class in the preset gesture template library, and select the major class to which the target gesture template with the smallest difference belongs as the target set;
a second determination unit 602, configured to compare the binned resultant acceleration and angular velocity time series with all gesture data in the target set, and determine the class to which the gesture with the smallest difference belongs as the class of the gesture to be recognized.
The gesture recognition apparatus provided by this embodiment can, without adding sensor cost, effectively normalize the spatial differences in gesture data by applying a unified-dimension treatment and binning to the resultant acceleration and angular velocity time series, reducing the differences caused by variations in speed or strength, thereby achieving efficient and accurate gesture recognition. It can be widely applied to smart devices that demand a good user-interaction experience.
Embodiment 4
As shown in Fig. 4, a smart device provided by an embodiment of the present invention comprises a memory, a processor, and a computer program stored in the memory and executable on the processor; when the computer program is executed by the processor, the steps of the gesture recognition method are implemented.
A smart device can be implemented in a variety of forms. For example, the smart devices described in the present invention may include mobile terminals such as smart phones, tablet computers, laptops, palmtop computers, personal digital assistants (PDA), portable media players (PMP), navigation devices, wearable devices, smart bracelets, and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
The following description takes a mobile terminal as an example; those skilled in the art will appreciate that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Referring to Fig. 4, a hardware structural diagram of a mobile terminal implementing various embodiments of the present invention, the mobile terminal 100 may include components such as an RF (radio frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 4 does not constitute a limitation on the mobile terminal; a mobile terminal may include more or fewer components than illustrated, combine certain components, or arrange the components differently.
The components of the mobile terminal are described below in detail with reference to Fig. 4:
The radio frequency unit 101 may be used for sending and receiving signals during messaging or a call; specifically, after receiving downlink information from the base station, it passes the information to the processor 110 for handling, and it sends uplink data to the base station. In general, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with the network and other devices by wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), and TDD-LTE (Time Division Duplexing-Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mail, browse webpages, access streaming video, and so on; it provides the user with wireless broadband Internet access. Although Fig. 4 shows the WiFi module 102, it will be understood that it is not an essential component of the mobile terminal and may be omitted as needed within the scope that does not change the essence of the invention.
When the mobile terminal 100 is in a mode such as call-signal reception mode, call mode, recording mode, speech recognition mode, or broadcast reception mode, the audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call-signal reception sound or a message reception sound). The audio output unit 103 may include a loudspeaker, a buzzer, and the like.
The A/V input unit 104 is used for receiving audio or video signals. The A/V input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes the image data of static pictures or video obtained by an image capture apparatus (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or sent via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in operational modes such as telephone call mode, recording mode, and speech recognition mode, and can process such sound into audio data. In the case of telephone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101. The microphone 1042 can implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated while sending and receiving audio signals.
Mobile terminal 100 further includes at least one sensor 105, such as optical sensor, motion sensor and other biographies Sensor.Specifically, optical sensor includes ambient light sensor and proximity sensor, wherein ambient light sensor can be according to environment The light and shade of light adjusts the brightness of display panel 1061, and proximity sensor can close when mobile terminal 100 is moved in one's ear Display panel 1061 and/or backlight.As a kind of motion sensor, accelerometer sensor can detect in all directions (general For three axis) size of acceleration, it can detect that size and the direction of gravity when static, can be used to identify the application of mobile phone posture (such as horizontal/vertical screen switching, dependent game, magnetometer pose calibrating), Vibration identification correlation function (such as pedometer, percussion) etc.; The fingerprint sensor that can also configure as mobile phone, pressure sensor, iris sensor, molecule sensor, gyroscope, barometer, The other sensors such as hygrometer, thermometer, infrared sensor, details are not described herein.
In the embodiments of the present invention, the sensor 105 includes an acceleration sensor and an angular velocity sensor. The acceleration sensor may be a three-axis acceleration sensor, and the angular velocity sensor may be a gyroscope, so as to collect the acceleration data and angular velocity data of a user gesture.
The display unit 106 is configured to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display, or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, collects touch operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory) and drives the corresponding connection apparatus according to a preset program. The touch panel 1071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave types. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like, which are not specifically limited here.
Further, the touch panel 1071 may cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 are shown as two independent components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, which is not specifically limited here.
The interface unit 108 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the mobile terminal 100, or may be used to transmit data between the mobile terminal 100 and an external device.
The memory 109 may be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.). In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage components.
In the embodiments of the present invention, the memory 109 stores a program implementing the gesture recognition method described above.
The processor 110 is the control center of the mobile terminal. It connects all parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
In the embodiments of the present invention, when executing the program of the gesture recognition method stored in the memory 109, the processor 110 implements the steps of the gesture recognition method described in Fig. 1:
S101: acquiring acceleration data and angular velocity data of a gesture to be recognized, and filtering the data;
S102: calculating an acceleration resultant-force time series from the acceleration data;
S103: performing dimension unification on the acceleration resultant-force time series and the angular velocity time series;
S104: binning the dimension-unified acceleration resultant-force time series and angular velocity time series according to a preset rule;
S105: matching the binned acceleration resultant-force time series and angular velocity time series against a pre-stored gesture template library to determine the category of the gesture to be recognized.
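For the acceleration channel, steps S102-S104 can be sketched as follows. This is only an illustrative sketch: the gravity constant, the z-score form of the dimension unification, and the bin edges are assumptions, since the patent leaves the concrete equations and the preset binning rule unspecified.

```python
import math
import statistics
import bisect

G = 9.81  # assumed gravitational acceleration, m/s^2

def preprocess(accel_xyz):
    """Sketch of S102-S104 for the acceleration channel: resultant force,
    dimension unification (assumed z-score), then binning with
    illustrative edges."""
    # S102: magnitude of the acceleration vector minus gravity (assumed form)
    s = [math.sqrt(x * x + y * y + z * z) - G for x, y, z in accel_xyz]
    # S103: dimension unification over the whole series (assumed z-score)
    mean, std = statistics.mean(s), statistics.pstdev(s)
    s = [(v - mean) / std for v in s]
    # S104: map each value to a discrete bin index (edges are illustrative)
    edges = [-1.5, -0.5, 0.5, 1.5]
    return [bisect.bisect_right(edges, v) for v in s]
```

The angular velocity channel would follow the same unification and binning steps on the angular velocity vector magnitudes, and S105 would compare the resulting discrete sequences against the template library.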
The mobile terminal 100 may also include a power supply 111 (such as a battery) that supplies power to each component. Preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
Although not shown in Fig. 4, the mobile terminal 100 may also include a Bluetooth module and the like, which are not described in detail here.
It should be noted that the above intelligent terminal embodiment and the method embodiment belong to the same concept; for the specific implementation process, refer to the method embodiment. The technical features in the method embodiment are correspondingly applicable in the terminal embodiment and are not repeated here.
The intelligent terminal provided by the embodiments of the present invention can, without increasing sensor cost, perform dimension unification and binning on the acceleration resultant-force time series and the angular velocity time series, effectively normalizing the spatial differences of the gesture data and reducing the variability caused by changes in speed or strength, thereby achieving efficient and accurate gesture recognition.
In addition, an embodiment of the present invention further provides a computer-readable storage medium. A gesture recognition program is stored on the computer-readable storage medium, and when the gesture recognition program is executed by a processor, the steps of the gesture recognition method provided by the embodiments of the present invention are implemented.
It should be noted that the embodiment of the gesture recognition program on the above computer-readable storage medium and the method embodiment belong to the same concept; for the specific implementation process, refer to the method embodiment. The technical features in the method embodiment are correspondingly applicable in the above computer-readable storage medium embodiment and are not repeated here.
The preferred embodiments of the present invention have been described above with reference to the accompanying drawings, which do not thereby limit the scope of the present invention. Those skilled in the art can implement the present invention in many variations without departing from the scope and spirit of the invention; for example, a feature of one embodiment can be used in another embodiment to obtain yet another embodiment. Any modifications, equivalent replacements, and improvements made within the technical concept of the present invention shall fall within the scope of the present invention.

Claims (10)

1. A gesture recognition method, characterized in that the method comprises:
acquiring acceleration data and angular velocity data of a gesture to be recognized, and filtering the data;
calculating an acceleration resultant-force time series from the acceleration data;
performing dimension unification on the acceleration resultant-force time series and the angular velocity time series;
binning the dimension-unified acceleration resultant-force time series and angular velocity time series according to a preset rule;
matching the binned acceleration resultant-force time series and angular velocity time series against a pre-stored gesture template library to determine the category of the gesture to be recognized.
2. The gesture recognition method according to claim 1, characterized in that the filtering comprises:
filtering the acceleration data and the angular velocity data using a Butterworth low-pass filter with a cutoff frequency of 3.5 Hz.
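A minimal sketch of such a low-pass stage is shown below. The 3.5 Hz cutoff comes from the claim; the second filter order, the 100 Hz sampling rate, and the direct-form biquad realization are assumptions the patent does not state.

```python
import math

def butter2_lowpass(cutoff_hz, sample_hz):
    """Coefficients of a 2nd-order Butterworth low-pass filter obtained
    via the bilinear transform (order and sample rate are assumed here)."""
    k = math.tan(math.pi * cutoff_hz / sample_hz)
    a0 = k * k + math.sqrt(2.0) * k + 1.0
    b = [k * k / a0, 2.0 * k * k / a0, k * k / a0]
    a = [2.0 * (k * k - 1.0) / a0, (k * k - math.sqrt(2.0) * k + 1.0) / a0]
    return b, a

def lowpass(b, a, x):
    """Apply the biquad as a direct-form difference equation."""
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for xn in x:
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[0] * y1 - a[1] * y2
        y.append(yn)
        x2, x1, y2, y1 = x1, xn, y1, yn
    return y

# Example: smooth raw accelerometer samples at an assumed 100 Hz rate.
b, a = butter2_lowpass(3.5, 100.0)
```

In practice a library routine such as `scipy.signal.butter` with `filtfilt` would typically be used instead of a hand-rolled biquad; the sketch only illustrates the claimed filtering step.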
3. The gesture recognition method according to claim 1, characterized in that calculating the acceleration resultant-force time series from the acceleration data comprises calculating using the following equation:
wherein s denotes the acceleration resultant force, a(x_t), a(y_t), and a(z_t) respectively denote the acceleration along the x-axis, y-axis, and z-axis at time t, and g denotes the gravitational acceleration.
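The equation itself appears in the original as an image and is not reproduced in the text. A minimal sketch consistent with the variable definitions, assuming the standard form s_t = sqrt(a(x_t)^2 + a(y_t)^2 + a(z_t)^2) - g, is:

```python
import math

G = 9.81  # assumed value of the gravitational acceleration g, m/s^2

def resultant_force_series(samples):
    """Map each (ax, ay, az) sample to the assumed resultant value
    s_t = sqrt(ax^2 + ay^2 + az^2) - g, removing the static gravity
    component from the acceleration magnitude."""
    return [math.sqrt(ax * ax + ay * ay + az * az) - G
            for ax, ay, az in samples]
```

With the device at rest (readings of roughly (0, 0, 9.81)), the resultant force is close to zero, which matches the purpose of subtracting g.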
4. The gesture recognition method according to claim 1, characterized in that performing dimension unification on the acceleration resultant-force time series comprises the following calculation:
wherein S(t) denotes the acceleration resultant force at time t, mean_S denotes the mean of all acceleration resultant forces over the whole time series, and std_S denotes the standard deviation of all acceleration resultant forces over the whole time series;
performing dimension unification on the angular velocity time series comprises the following calculation:
wherein w(t) denotes the angular velocity at time t, mean_w denotes the mean of all angular velocity vector magnitudes over the whole time series, and std_w denotes the standard deviation of all angular velocity vector magnitudes over the whole time series.
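The two equations are likewise images that are not reproduced in the text. The mean/standard-deviation variables strongly suggest a z-score normalization, (value - mean) / std; a sketch under that assumption, applicable to either channel, is:

```python
import statistics

def dimension_unify(series):
    """Dimension unification as an assumed z-score: subtract the mean of
    the whole series and divide by its standard deviation, so sequences
    recorded with different speeds/strengths become comparable."""
    mean = statistics.mean(series)
    std = statistics.pstdev(series)
    return [(x - mean) / std for x in series]
```

After this step both the acceleration resultant-force series and the angular velocity series have zero mean and unit standard deviation, which is what makes the subsequent binning thresholds meaningful across users.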
5. The gesture recognition method according to claim 1, characterized in that binning the dimension-unified acceleration resultant-force time series and angular velocity time series according to the preset rule comprises:
mapping the magnitude of the acceleration resultant force at each moment in the acceleration resultant-force time series to a discrete acceleration resultant-force value using a first preset piecewise function, and expressing the acceleration resultant-force time series using the discrete acceleration resultant-force values;
mapping the magnitude of the angular velocity vector at each moment in the angular velocity time series to a discrete angular velocity value using a second preset piecewise function, and expressing the angular velocity time series using the discrete angular velocity values.
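The claim leaves the two preset piecewise functions unspecified. One simple way to realize such a step function is threshold-based binning; the edges below (in standard deviations, fitting a z-scored series) are purely illustrative:

```python
import bisect

def make_binner(edges):
    """Return a piecewise step function mapping a continuous value to a
    discrete bin index 0..len(edges). The edges are hypothetical; the
    patent's first and second piecewise functions are not disclosed."""
    def binner(x):
        return bisect.bisect_right(edges, x)
    return binner

# Hypothetical first piecewise function for the acceleration channel.
bin_accel = make_binner([-1.5, -0.5, 0.5, 1.5])

binned = [bin_accel(v) for v in [-2.0, 0.0, 0.4, 2.1]]
```

A second binner with its own edges would play the role of the second piecewise function for the angular velocity channel.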
6. The gesture recognition method according to claim 1, characterized in that matching the binned acceleration resultant-force time series and angular velocity time series with the pre-stored gesture template library to determine the category of the gesture to be recognized comprises:
comparing the binned acceleration resultant-force time series and angular velocity time series with the target gesture data of each major class in the preset gesture template library, and selecting the major class to which the target gesture template with the smallest difference belongs as the target set;
comparing the binned acceleration resultant-force time series and angular velocity time series with all gesture data in the target set, and determining the category to which the gesture with the smallest difference belongs as the category of the gesture to be recognized.
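The two-stage coarse-to-fine matching can be sketched as follows. The difference measure and the library layout are assumptions: the claim does not fix a metric (the cited non-patent literature uses DTW, so an equal-length L1 distance here is only a placeholder), and the `{"target": ..., "gestures": ...}` dictionary structure is invented for illustration.

```python
def seq_distance(a, b):
    """Illustrative difference between two equal-length binned sequences
    (an L1 distance; a DTW distance could be substituted for sequences
    of unequal length)."""
    return sum(abs(x - y) for x, y in zip(a, b))

def classify(query, template_library):
    """Two-stage matching: first pick the major class whose representative
    target template is closest to the query, then pick the closest gesture
    inside that class. template_library maps class name ->
    {"target": sequence, "gestures": {label: sequence}} (assumed layout)."""
    best_class = min(
        template_library,
        key=lambda c: seq_distance(query, template_library[c]["target"]))
    gestures = template_library[best_class]["gestures"]
    return min(gestures, key=lambda g: seq_distance(query, gestures[g]))
```

The coarse stage prunes the search to one major class, so the fine comparison against every stored gesture only runs over that class's templates.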
7. A gesture recognition apparatus, characterized in that the apparatus comprises:
a gesture acquisition module, configured to acquire acceleration data and angular velocity data of a gesture to be recognized;
a filtering module, configured to filter the acceleration data and angular velocity data;
a resultant-force calculation module, configured to calculate an acceleration resultant-force time series from the acceleration data;
a dimension unification module, configured to perform dimension unification on the acceleration resultant-force time series and the angular velocity time series;
a binning module, configured to bin the dimension-unified acceleration resultant-force time series and angular velocity time series according to a preset rule;
a template matching module, configured to match the binned acceleration resultant-force time series and angular velocity time series against a pre-stored gesture template library to determine the category of the gesture to be recognized.
8. The gesture recognition apparatus according to claim 7, characterized in that the template matching module comprises:
a first determination unit, configured to compare the binned acceleration resultant-force time series and angular velocity time series with the target gesture data of each major class in a preset gesture template library, and to select the major class to which the target gesture template with the smallest difference belongs as the target set;
a second determination unit, configured to compare the binned acceleration resultant-force time series and angular velocity time series with all gesture data in the target set, and to determine the category to which the gesture with the smallest difference belongs as the category of the gesture to be recognized.
9. A smart device, characterized in that the smart device comprises a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein when the computer program is executed by the processor, the steps of the gesture recognition method according to any one of claims 1 to 7 are implemented.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the gesture recognition method according to any one of claims 1 to 6 are implemented.
CN201810307857.6A 2018-04-08 2018-04-08 Gesture identification method, device, smart machine and computer readable storage medium Pending CN110348275A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810307857.6A CN110348275A (en) 2018-04-08 2018-04-08 Gesture identification method, device, smart machine and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810307857.6A CN110348275A (en) 2018-04-08 2018-04-08 Gesture identification method, device, smart machine and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN110348275A true CN110348275A (en) 2019-10-18

Family

ID=68173397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810307857.6A Pending CN110348275A (en) 2018-04-08 2018-04-08 Gesture identification method, device, smart machine and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110348275A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103345627A (en) * 2013-07-23 2013-10-09 清华大学 Action recognition method and device
CN104866099A (en) * 2015-05-27 2015-08-26 东南大学 Error compensation method for improving gesture identification precision of intelligent device based on motion sensor
US20150346834A1 (en) * 2014-06-02 2015-12-03 Samsung Electronics Co., Ltd. Wearable device and control method using gestures
CN105184325A (en) * 2015-09-23 2015-12-23 歌尔声学股份有限公司 Human body action recognition method and mobile intelligent terminal
CN107026928A (en) * 2017-05-24 2017-08-08 武汉大学 A kind of behavioural characteristic identification authentication method and device based on mobile phone sensor

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YAO Guilin, YAO Hongxun, JIANG Feng: "A Multi-layer Classifier Sign Language Recognition Method Based on the DTW/ISODATA Algorithm", Computer Engineering and Applications, 31 August 2005 (2005-08-31), pages 45-47 *
WANG Haipeng; GONG Yan; LIU Wu; LI Ze; ZHANG Simei: "Research on a Spatio-temporally Multi-scale Adaptive Gesture Recognition Method", Computer Science, vol. 44, no. 12, 31 December 2017 (2017-12-31), pages 287-291 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111580666A (en) * 2020-05-11 2020-08-25 清华大学 Equipment control method, electronic equipment, equipment control system and storage medium
CN111580666B (en) * 2020-05-11 2022-04-29 清华大学 Equipment control method, electronic equipment, equipment control system and storage medium
CN112114665A (en) * 2020-08-23 2020-12-22 西北工业大学 Hand tracking method based on multi-mode fusion
CN112363659A (en) * 2020-11-09 2021-02-12 平安普惠企业管理有限公司 APP interface operation method and device, electronic equipment and storage medium
CN112527118A (en) * 2020-12-16 2021-03-19 郑州轻工业大学 Head posture recognition method based on dynamic time warping
CN112527118B (en) * 2020-12-16 2022-11-25 郑州轻工业大学 Head posture recognition method based on dynamic time warping
CN116360603A (en) * 2023-05-29 2023-06-30 中数元宇数字科技(上海)有限公司 Interaction method, device, medium and program product based on time sequence signal matching

Similar Documents

Publication Publication Date Title
CN110348275A (en) Gesture identification method, device, smart machine and computer readable storage medium
CN108459797B (en) Control method of folding screen and mobile terminal
CN110740259A (en) Video processing method and electronic equipment
CN107835321A (en) A kind of incoming call processing method and mobile terminal
CN107817939A (en) A kind of image processing method and mobile terminal
CN107809526A (en) End application sorting technique, mobile terminal and computer-readable recording medium
CN111127509B (en) Target tracking method, apparatus and computer readable storage medium
CN107707729A (en) A kind of terminal go out screen or bright screen method, terminal and computer-readable recording medium
CN110365853B (en) Prompting method and electronic equipment
CN108521658A (en) Reduce interference method, mobile terminal and computer readable storage medium
CN111554321A (en) Noise reduction model training method and device, electronic equipment and storage medium
CN107632757A (en) A kind of terminal control method, terminal and computer-readable recording medium
CN109901809A (en) A kind of display control method, equipment and computer readable storage medium
CN108415641A (en) A kind of processing method and mobile terminal of icon
CN108196776A (en) A kind of terminal split screen method, terminal and computer readable storage medium
CN110139018A (en) Camera controls the control method for movement and terminal of mould group, camera
CN107463324A (en) A kind of image display method, mobile terminal and computer-readable recording medium
CN108257104A (en) A kind of image processing method and mobile terminal
CN109710130A (en) A kind of display methods and terminal
CN109754823A (en) A kind of voice activity detection method, mobile terminal
CN109819102A (en) A kind of navigation bar control method and mobile terminal, computer readable storage medium
CN109193975A (en) A kind of wireless charging device and terminal
CN109388326A (en) A kind of electronic equipment, the control method and device of dual-screen electronic device
CN107656727A (en) A kind of font object library generating method and mobile terminal
CN109871176A (en) A kind of object displaying method and terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination