CN108255297A - A kind of wearable device application control method and apparatus - Google Patents

A kind of wearable device application control method and apparatus Download PDF

Info

Publication number
CN108255297A
CN108255297A
Authority
CN
China
Prior art keywords
gesture
wearable device
upper layer
template
gesture motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711474468.4A
Other languages
Chinese (zh)
Inventor
苏鹏程
张凡
张一凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Technology Co Ltd
Original Assignee
Qingdao Real Time Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Real Time Technology Co Ltd
Priority to CN201711474468.4A priority Critical patent/CN108255297A/en
Publication of CN108255297A publication Critical patent/CN108255297A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an application control method and apparatus for a wearable device, in which a gesture-interaction software development kit (SDK) is integrated into the upper-layer application of the wearable device. The method includes: calling a dynamic link library that encapsulates gesture operation functions in advance, and obtaining the motion data of the user of the wearable device collected by a sensor on the wearable device; detecting and recognizing the motion data; and, when a gesture in the motion data is recognized as matching a preset gesture template, sending the identification information of the matched gesture, through the application programming interface provided by the dynamic link library, to the upper-layer application of the wearable device, so that the upper-layer application looks up the correspondence between gestures and responses according to the received gesture identification information, then determines and performs the corresponding operation. In this way, the user can issue different instructions at any time by performing certain gestures, causing the upper-layer application to make different responses, which realizes convenient and intuitive interaction and enhances the user experience.

Description

Wearable device application control method and apparatus
Technical field
The present invention relates to the field of human-computer interaction, and in particular to a wearable device application control method and apparatus.
Background art
A wearable device is a portable device that can be worn directly or integrated into the user's clothing or accessories. Through software support, data exchange and cloud interaction, wearable devices realize powerful functions and greatly facilitate people's lives.
At present, wearable devices are typically equipped with applications and generally have a touch screen, through which human-computer interactions such as application control are realized. Since wearable devices are often small, the touch screen is also relatively small, so touch operation is very inconvenient and the user experience is poor.
Summary of the invention
The present invention provides a wearable device application control method and apparatus, to solve the technical problem that existing wearable device application control schemes lead to a poor user experience.
To achieve the above technical purpose, the technical solution of the present invention is achieved as follows:
According to one aspect of the invention, a wearable device application control method is provided, in which a gesture-interaction software development kit (SDK) is integrated into the wearable device application. The method includes:
calling a dynamic link library that encapsulates gesture operation functions in advance, and obtaining the motion data of the user of the wearable device collected by the sensor on the wearable device;
detecting and recognizing the motion data;
when a gesture in the motion data is recognized as matching a preset gesture template, sending the identification information of the matched gesture, through the application programming interface provided by the dynamic link library, to the upper-layer application of the wearable device, so that the upper-layer application looks up the correspondence between gestures and responses according to the received gesture identification information, then determines and performs the corresponding operation.
According to another aspect of the invention, a wearable device application control apparatus is provided, in which a gesture-interaction software development kit (SDK) is integrated into the wearable device application. The apparatus includes:
a data acquisition unit, configured to call a dynamic link library that encapsulates gesture operation functions in advance, and to obtain the motion data of the user of the wearable device collected by the sensor on the wearable device;
a detection and recognition unit, configured to detect and recognize the motion data;
a control unit, configured to, when a gesture in the motion data is recognized as matching a preset gesture template, send the identification information of the matched gesture through the application programming interface to the upper-layer application of the wearable device, so that the upper-layer application looks up the correspondence between gestures and responses according to the received gesture identification information, then determines and performs the corresponding operation.
According to a further aspect of the invention, an electronic device is provided. The electronic device includes a memory and a processor connected for communication by an internal bus; the memory stores program instructions executable by the processor, and the program instructions, when executed by the processor, implement the wearable device application control method of the first aspect of the invention.
The beneficial effects of the invention are as follows. In the wearable device application control method and apparatus of the embodiments of the present invention, the gesture operation functions are encapsulated in a dynamic link library in advance. When gesture control is needed, the dynamic link library is called to obtain the motion data of the user collected by the sensor on the wearable device, and the motion data is detected and recognized. When a gesture in the motion data is recognized as matching a preset gesture template, the identification information of the matched gesture is sent, through the application programming interface of the dynamic link library, to the upper-layer application, so that the upper-layer application looks up the correspondence between gestures and responses, then determines and performs the corresponding operation, thereby realizing application control. As a result, whenever the user wears the wearable device and performs a certain gesture, the gesture can be detected automatically and a corresponding message sent to the upper-layer application, which is controlled to respond and execute accordingly, realizing convenient, intuitive and intelligent interactive operation.
Description of the drawings
Fig. 1 is a flowchart of a wearable device application control method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a wearable device application control method according to an embodiment of the present invention;
Fig. 3 is a flowchart of creating a gesture template and recognizing a gesture according to an embodiment of the present invention;
Fig. 4 is a flowchart of gesture detection and recognition according to an embodiment of the present invention;
Fig. 5 is a block diagram of a wearable device application control apparatus according to an embodiment of the present invention;
Fig. 6 is a structural diagram of an electronic device according to an embodiment of the present invention.
Detailed description of embodiments
The design concept of the present invention is as follows. With the development of information technology, the demand for convenient and intuitive human-computer interaction is becoming increasingly urgent. Gesture interaction control is a more convenient and intuitive mode of interaction than a touch screen and can well meet user needs. Current methods for gesture recognition and interaction can be divided into two main classes: vision-based and inertial-sensor-based. Vision-based gesture recognition research started earlier and its recognition methods are relatively mature, but it suffers from drawbacks such as sensitivity to the environment, system complexity and a heavy computational load. In contrast, gesture recognition based on inertial sensors started later, but it is flexible and reliable, unaffected by the environment or lighting, and simple to implement, making it a method with great potential.
On the other hand, wearable devices such as smartwatches are developing rapidly. They have their own computing capability and resources, and are generally embedded with a variety of MEMS (Micro-Electro-Mechanical System) sensors (such as accelerometers and gyroscopes), providing software and hardware support for data computation and sensor-based gesture recognition. In addition, wearable devices such as smartwatches are generally worn by the user for long periods, so the user can operate them at any time.
Based on this, the inventors conceived the following: while wearing a wearable device, the user can issue different instructions by performing certain gestures, and the upper-layer application makes different responses according to the received instructions, realizing convenient and intuitive interaction between the user and the wearable device and enhancing the user experience.
Fig. 1 is a flowchart of a wearable device application control method according to an embodiment of the present invention. Referring to Fig. 1, a gesture-interaction software development kit (SDK) is integrated into the wearable device application of this embodiment, and the wearable device application control method includes the following steps:
Step S101: calling a dynamic link library that encapsulates gesture operation functions in advance, and obtaining the motion data of the user of the wearable device collected by the sensor on the wearable device;
Step S102: detecting and recognizing the motion data;
Step S103: when a gesture in the motion data is recognized as matching a preset gesture template, sending the identification information of the matched gesture, through the application programming interface provided by the dynamic link library, to the upper-layer application of the wearable device, so that the upper-layer application looks up the correspondence between gestures and responses according to the received gesture identification information, then determines and performs the corresponding operation.
As can be seen from Fig. 1, a gesture-interaction software development kit (SDK, including the gesture-related application programming interfaces) is integrated into the wearable device application of this embodiment, which makes it convenient for the application to integrate and call the gesture interaction control functions. The application control method calls a dynamic link library that encapsulates gesture operation functions in advance, and exposes them through an API (Application Programming Interface) for the upper-layer application. When an application needs gesture control, the underlying dynamic link library is executed to acquire user data; after detection and recognition yield the relevant gesture, the gesture identification information is sent through the API to the upper-layer application, which is controlled to perform the corresponding operation according to the gesture, for example opening the contact list, closing the contact list, making a phone call, or hanging up. This avoids the poor experience caused by interactive operation on a small touch screen, improves the user experience, enhances the competitiveness of the wearable device, and expands its application range.
Fig. 2 is a schematic diagram of a wearable device application control method according to an embodiment of the present invention. Referring to Fig. 2, the principle of the method of this embodiment is as follows. First, the underlying gesture operation functions such as sensor data acquisition, detection and recognition are encapsulated into a dynamic link library 202 for gesture recognition. A dynamic link library can be regarded as a kind of warehouse that provides variables, functions or classes that can be used directly. Then, the dynamic link library provides an application programming interface to the upper-layer application 201, which facilitates calls from the upper-layer application and controls it.
In addition, through the API the dynamic link library can receive requests from the upper-layer application to access, create and modify different gesture templates. That is, the API realizes the interaction between the upper-layer application and the dynamic link library. The dynamic link library here is, for example, a file with the suffix .so (shared object).
As shown in Fig. 2, the embodiment of the present invention addresses the problem that the screens of wearable devices such as smartwatches and smart bracelets are small and touch operation is inconvenient. By encapsulating a dynamic link library and opening its API, a platform for performing gesture-related operations is provided to the upper-layer application, realizing convenient and intuitive human-computer interaction. Moreover, this sensor-based gesture control mode is flexible and reliable, unaffected by the environment or lighting, and simple to implement. The screen of a wearable device is generally small, and operating directly on the touch screen gives a poor user experience, but this can be well avoided by using gestures as the interaction mode. Compared with traditional interaction modes, the user experience is greatly enhanced.
The key part of the wearable device application control method, namely the flow of creating a gesture template and recognizing a gesture, is described in detail below with reference to Fig. 3.
Referring to Fig. 3, the flow starts with step S301: collecting sensor data;
It should be noted that the wearable device of this embodiment carries a sensor, such as a MEMS inertial sensor; such a sensor may be an acceleration sensor or a gyroscope. This embodiment is illustrated with a three-axis acceleration sensor, which collects the variation of acceleration data in the x, y and z directions when the user moves. A data acquisition function is encapsulated in the underlying dynamic link library of this embodiment, which can continuously collect data from the sensor for subsequent gesture detection and recognition.
Step S302: preprocessing;
After the three-axis acceleration data are collected, time-domain windowing is applied to the data of each axis, and the acceleration data within the sliding window are preprocessed. Preprocessing here mainly filters the data with a filtering algorithm to remove interference and noise; usable filtering algorithms include mean filtering, Butterworth filtering, and the like. Time-domain windowing means taking the data of each axis in the three-axis acceleration data with a sliding window of preset length, that is, moving a window function along the time axis. The preset length here is, for example, N, with N = 50; in one embodiment the window function may be a rectangular window, a triangular window, a Hamming window, or the like.
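As a concrete illustration of this preprocessing step, the sketch below applies a moving-average (mean) filter per axis and then slices the stream into sliding windows of length N = 50. The filter width and the window step size are assumptions for illustration; the patent only names the filter families and the window length.

```python
import numpy as np

WINDOW_N = 50  # sliding-window length N mentioned in the text


def mean_filter(samples: np.ndarray, k: int = 5) -> np.ndarray:
    """Moving-average filter applied independently to each axis of an
    (n_samples, 3) acceleration stream. Mean filtering is one of the
    options the text names; Butterworth filtering is an alternative."""
    kernel = np.ones(k) / k
    return np.column_stack(
        [np.convolve(samples[:, axis], kernel, mode="same")
         for axis in range(samples.shape[1])]
    )


def sliding_windows(samples: np.ndarray, n: int = WINDOW_N, step: int = 25):
    """Yield length-n windows moved along the time axis (equivalent to a
    rectangular window function; the 50% overlap is an assumed choice)."""
    for start in range(0, len(samples) - n + 1, step):
        yield samples[start:start + n]
```

Each yielded window is then handed to the gesture-detection step described next.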
Step S303: gesture detection;
Gesture detection performs detection and calculation on the data preprocessed in step S302, to determine whether the currently collected data contain gesture data.
It should be noted that gesture detection is a technical means adopted in this embodiment to reduce power consumption. The preprocessed sensor data are examined within the current time window to judge whether a possible gesture is present. If a gesture may be present, the gesture recognition or template creation operation is further performed as the case requires; otherwise, the flow returns directly and continues obtaining the sensor data of the user collected by the sensor on the wearable device. In this way, gesture recognition need not be performed on every batch of collected sensor data, which reduces the execution steps and the power consumption.
From the above, this embodiment detects, based on the three-axis acceleration (or gyroscope) data obtained from the MEMS sensor, whether a possible gesture occurs within the current time window. If a gesture may be present, the recognition operation is further performed (or a template is created); if there is no gesture, the flow returns directly, greatly reducing the amount of calculation.
Step S304: judging whether a gesture is present; if yes, performing step S306 or step S305; otherwise, performing step S301;
Specifically, the presence of a gesture is judged in this step as follows: in each sliding window of length N, the standard deviations σx, σy and σz of the preprocessed three-axis acceleration data on the x, y and z axes are calculated, and the average standard deviation σ is computed:
σ = (σx + σy + σz) / 3
If σ is less than a given threshold STD_TH, it is considered that no gesture is present, and the flow returns directly without further processing. In this way, the amount of calculation can be greatly reduced while no gesture occurs.
Here, σx denotes the standard deviation of the data in the sliding window on the x-axis, σy that on the y-axis, and σz that on the z-axis.
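Under the assumption of a concrete threshold value (the patent leaves STD_TH unspecified), the detection criterion above can be sketched as:

```python
import numpy as np

STD_TH = 0.2  # assumed threshold value; the text does not fix STD_TH


def has_gesture(window_xyz: np.ndarray, std_th: float = STD_TH) -> bool:
    """Compute the per-axis standard deviations (sigma_x, sigma_y, sigma_z)
    over one sliding window and compare their average sigma with the
    threshold: below the threshold the window is treated as motion-free
    and skipped, so no recognition work is done while the device is idle."""
    sigma = window_xyz.std(axis=0)  # (sigma_x, sigma_y, sigma_z)
    return bool(sigma.mean() >= std_th)
```

This cheap per-window check is what lets the embodiment skip the far more expensive recognition stage most of the time.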
Step S305: creating a gesture template;
Creating a gesture template means saving the detected user gesture as a gesture template in the gesture template library. In this way, when the user subsequently makes the same gesture again, it can be recognized by searching the gesture template library, and the application can conveniently be controlled based on that gesture.
It can be seen that this embodiment supports user-defined gesture templates. For example, when an application first uses gesture control and calls the dynamic link library, a gesture template is created or modified, through the basic API, according to the user-defined instruction entered by the user on the display interface of the upper-layer application. In a specific implementation, the upper-layer application may first call the basic API to obtain the current gesture template list, then create a new gesture template or modify a current one according to the received user instruction, and save it into the underlying gesture template library.
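A minimal sketch of the template-library bookkeeping this step implies — listing, creating, modifying and deleting templates — assuming a simple in-memory store (the real library would persist to the underlying gesture template database, and the upper layer would reach these operations through the basic API; all names here are illustrative assumptions):

```python
class GestureTemplateLibrary:
    """In-memory stand-in for the underlying gesture template library."""

    def __init__(self):
        self._templates = {}

    def list_templates(self):
        """Return the current gesture template list."""
        return sorted(self._templates)

    def save(self, gesture_id, feature_sequence):
        """Create a new template, or modify an existing one."""
        self._templates[gesture_id] = list(feature_sequence)

    def delete(self, gesture_id):
        """Remove a template; ignore unknown IDs."""
        self._templates.pop(gesture_id, None)

    def get(self, gesture_id):
        """Fetch one template's feature sequence (None if absent)."""
        return self._templates.get(gesture_id)
```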
Step S306: gesture recognition.
Gesture recognition is the premise and basis of gesture-controlled applications; accurate gesture recognition is therefore crucial to gesture control. The process of gesture recognition is really a process of template matching. As explained in the previous step, a gesture template library is maintained in this embodiment, in which user-defined or system-default gesture templates are saved. After it is determined in step S304 that a gesture is present, the gesture is recognized in this step to determine which specific gesture it is, i.e., to obtain the gesture recognition result.
In a specific implementation, a test feature sequence of the gesture to be recognized and a template feature sequence of each gesture template may be calculated, and the test feature sequence is matched against each template feature sequence one by one, using template matching (such as the DTW algorithm) or a machine learning method. The DTW (Dynamic Time Warping) algorithm is a method of comparing the similarity of two time sequences; the gesture recognition result can be obtained by calculating the similarity. Machine learning methods usually predict, based on statistics and probability, the likelihood that a sample belongs to a certain gesture.
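The DTW comparison named above can be sketched as follows; this is the textbook O(nm) dynamic-programming formulation with an absolute-difference local cost, not necessarily the exact variant the embodiment uses:

```python
import numpy as np


def dtw_distance(a, b) -> float:
    """Dynamic Time Warping distance between two 1-D feature sequences:
    the minimum cumulative cost over all monotonic alignments, so the
    same gesture performed at a different speed still matches closely."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])
```

The template with the smallest DTW distance to the test sequence is taken as the recognition result.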
After the matched gesture is identified among the gesture templates, the unique identification information of the matched gesture (such as its ID) is sent to the upper-layer application through the API, so that the upper-layer application looks up the gesture and the gesture-response correspondence according to the received ID, then performs the corresponding operation.
The application here is, for example, the alarm clock application provided by the system. According to this embodiment, the user wearing the wearable device can control the alarm clock application by performing a predetermined gesture, for example drawing a circle clockwise to open the alarm clock, or drawing a rectangle clockwise to set the alarm time.
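The lookup the upper-layer application performs — gesture identification information in, operation out — can be sketched as a plain mapping; the gesture IDs and action names below are hypothetical stand-ins for the alarm-clock example:

```python
# Hypothetical gesture-ID -> response table, as the upper-layer
# application might hold it after configuration.
GESTURE_ACTIONS = {
    "circle_cw": "open_alarm_clock",   # draw a circle clockwise
    "rect_cw": "set_alarm_time",       # draw a rectangle clockwise
}


def on_gesture(gesture_id: str, actions=GESTURE_ACTIONS):
    """Look up the correspondence between the received gesture ID and
    the response, returning the operation to perform (None if the
    gesture is not mapped to anything)."""
    return actions.get(gesture_id)
```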
The flow ends.
In an embodiment of the present invention, the aforementioned gesture recognition mainly includes processes such as principal component analysis (PCA), feature extraction and gesture matching. To reduce the amount of calculation, the PCA method is used here to reduce the three-dimensional acceleration data to one dimension before further processing. Fig. 4 is a flowchart of gesture detection and recognition according to another embodiment of the present invention. Referring to Fig. 4, a gesture detection and recognition flow is as follows:
It should be noted that steps S401 to S403 are implemented in the same way as steps S301 to S303 above, respectively; reference may therefore be made to the foregoing explanation of steps S301 to S303, which is not repeated here.
A template sequence (or a test sequence) can be obtained through the above steps S401 to S403;
Step S404: judging whether a gesture is present; if yes, performing step S405 or step S406; otherwise, performing step S401.
Referring to Fig. 4, this embodiment processes template creation and test sequences differently. When establishing a template, the three-axis acceleration data sequence of the template is subjected to PCA (Principal Component Analysis) processing to obtain the one-dimensional template data after dimensionality reduction, and the feature vector space of the principal component is obtained. When testing (recognizing the gesture to be identified), the three-axis acceleration test sequence is projected into the principal-component feature vector space of the template sequence to obtain the one-dimensional test data after dimensionality reduction. Specifically, step S405 includes steps S4051 and S4052;
Step S4051: principal component analysis;
Taking the acceleration data as an example, after the three-axis acceleration data are collected, principal component analysis is used to reduce the dimensionality of the three-axis acceleration data down to one dimension, and the feature vector space of the principal component is obtained. Principal component analysis determines the importance of each independent component, and the most important component is selected according to the magnitudes of the eigenvalues obtained in the calculation, reducing the original acceleration signal to one dimension. While the computational complexity is reduced, part of the noise is eliminated and sensitivity to the user's posture when performing the gesture is lowered.
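Assuming NumPy, the PCA reduction described here — keep only the eigenvector of the largest eigenvalue of the covariance of the three-axis data — can be sketched as:

```python
import numpy as np


def pca_reduce(acc_xyz: np.ndarray):
    """Reduce an (n_samples, 3) acceleration sequence to one dimension by
    projecting onto its first principal component. Returns the 1-D data
    and the principal eigenvector, which is kept so that test sequences
    can later be projected into the same space (step S4061)."""
    centered = acc_xyz - acc_xyz.mean(axis=0)
    cov = np.cov(centered, rowvar=False)        # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
    principal = eigvecs[:, np.argmax(eigvals)]  # most important component
    return centered @ principal, principal
```

A test sequence is then projected with the saved eigenvector (`centered_test @ principal`), so template and test data live in the same one-dimensional space.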
Step S4052: feature extraction;
Features are extracted from the obtained one-dimensional data (e.g., mean features of adjacent data points, variance features, fast Fourier transform (FFT) coefficient features, or directly extracted waveform variation features), yielding a template feature sequence composed of these features. The template feature sequence is saved in the gesture template library for gesture recognition and matching.
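One possible concrete reading of this feature set — per-segment means and variances plus the leading FFT magnitude coefficients — is sketched below; the segment count and number of FFT coefficients are illustrative assumptions, since the text only names the feature families:

```python
import numpy as np


def extract_features(seq, n_segments: int = 4, n_fft: int = 8) -> np.ndarray:
    """Build a feature vector from a 1-D (dimension-reduced) sequence:
    means and variances over segments of adjacent data points, plus the
    magnitudes of the first n_fft FFT coefficients."""
    seq = np.asarray(seq, dtype=float)
    segments = np.array_split(seq, n_segments)
    means = [s.mean() for s in segments]
    variances = [s.var() for s in segments]
    fft_mags = np.abs(np.fft.rfft(seq))[:n_fft]
    return np.concatenate([means, variances, fft_mags])
```

The same function serves both sides of the matching: applied to template data it yields the template feature sequence, applied to projected test data (step S4062) it yields the test feature sequence.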
Similar to step S405 above, step S406 includes steps S4061 and S4062;
Step S4061: projecting into the template principal component space;
For a test sequence, the three-axis acceleration test data need to be projected into the principal-component feature vector space of the template sequence, to obtain the one-dimensional test data after dimensionality reduction.
Step S4062: feature extraction;
Corresponding to step S4052 above, in this step features are extracted from the obtained one-dimensional test data (e.g., mean features of adjacent data points, variance features, fast Fourier transform (FFT) coefficient features, or directly extracted waveform variation features), yielding a test feature sequence composed of these features.
Step S407: gesture recognition.
In this step, the obtained test feature sequence is matched against each template feature sequence; template matching, machine learning or other algorithms may be used, realizing accurate gesture recognition while reducing computational complexity.
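Putting the pieces together, matching the test feature sequence against every stored template reduces to a nearest-template search. Euclidean distance is used below for brevity; DTW (shown earlier in the text as an option) would be a drop-in alternative distance:

```python
import numpy as np


def classify_gesture(test_features, templates: dict):
    """Return the ID of the template whose feature sequence is closest
    to the test feature sequence (None if the library is empty)."""
    test = np.asarray(test_features, dtype=float)
    best_id, best_dist = None, float("inf")
    for gesture_id, template in templates.items():
        dist = float(np.linalg.norm(test - np.asarray(template, dtype=float)))
        if dist < best_dist:
            best_id, best_dist = gesture_id, dist
    return best_id
```

The returned gesture ID is exactly what gets sent through the API to the upper-layer application.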
At this point, the flow ends.
In addition, to support gesture control of the upper-layer application of the wearable device, this embodiment may provide a gesture configuration module that interacts with the bottom layer. The underlying dynamic link library receives a template information acquisition request sent by the upper-layer application by calling the application programming interface, and sends the current gesture template list to the upper-layer application through the application programming interface; it also receives gesture template configuration instructions sent by the upper-layer application by calling the application programming interface, performs the creation, modification or deletion of gesture templates according to the corresponding configuration instruction, and saves the execution result to the gesture template database. In use, this makes it convenient for the user to modify the relevant control commands and gestures at any time through the gesture configuration module.
For example, when the gesture control function is used for the first time, the upper-layer application creates or modifies gesture templates by calling the basic API. Specifically, the gesture configuration module first calls the basic API to obtain the current gesture template list, and may then define a new gesture template or modify a current one as the user requires and save it into the underlying gesture template library.
On the other hand, the gesture configuration module of the upper-layer application can, according to the user's instructions, associate gestures with the responses and operations to be performed, and save this association information into a gesture command database. After configuration succeeds, when the user performs different gestures, the upper-layer application, upon receiving the gesture ID message sent by the underlying recognition module, queries the database for the control command and parameters corresponding to the gesture and then performs the corresponding operation or response, realizing a convenient and intuitive human-computer interaction function.
A wearable device is a resource-constrained device, and during gesture recognition the user's motion must be continuously sensed and recognized, which consumes a great deal of energy. Therefore, the complexity of the algorithm should be reduced as far as possible, ensuring reliable gesture recognition while reducing the amount of calculation.
This embodiment can start and stop the underlying gesture recognition operation to reduce power consumption, and provides an API to the upper-layer application for convenient control of the bottom layer. After the underlying gesture recognition is started, the user's gestures are continuously detected and recognized. Specifically, a start or stop gesture control command sent by the upper-layer application by calling the application programming interface is received; according to the start gesture control command, the dynamic link library is called to obtain the motion data of the user of the wearable device; according to the stop gesture control command, acquisition of the user's motion data is stopped.
An embodiment of the present invention further provides a wearable device application control apparatus. Fig. 5 is a block diagram of a wearable device application control apparatus according to an embodiment of the present invention; a gesture-interaction software development kit (SDK) is integrated into the wearable device application. Referring to Fig. 5, the wearable device application control apparatus 500 includes:
a data acquisition unit 501, configured to call a dynamic link library that encapsulates gesture operation functions in advance, and to obtain the motion data of the user of the wearable device collected by the sensor on the wearable device;
a detection and recognition unit 502, configured to detect and recognize the motion data;
a control unit 503, configured to, when a gesture in the motion data is recognized as matching a preset gesture template, send the identification information of the matched gesture, through the application programming interface provided by the dynamic link library, to the upper-layer application of the wearable device, so that the upper-layer application looks up the correspondence between gestures and responses according to the received gesture identification information, then determines and performs the corresponding operation.
In an embodiment of the present invention, the wearable device application control apparatus 500 further includes:
a switch unit, configured to receive a start or stop gesture control command sent by the upper layer application through calling the application programming interface; according to the start gesture control command, call the dynamic link library to obtain the action data of the wearable device user; and according to the stop gesture control command, stop obtaining the action data of the wearable device user.
In an embodiment of the present invention, the wearable device application control apparatus 500 further includes:
a template operation unit, configured to receive a template information acquisition request sent by the upper layer application through calling the application programming interface, and send the current gesture template list to the upper layer application through the application programming interface; and to receive a gesture template configuration instruction sent by the upper layer application through calling the application programming interface, perform a create, modify, or delete gesture template operation according to the corresponding gesture template configuration instruction, and save the execution result to the gesture template database.
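The create/modify/delete handling performed by the template operation unit could be sketched as follows. The in-memory dict stands in for the gesture template database, and all names are assumptions.

```python
# Hypothetical sketch of the template operation unit: a configuration
# instruction (create / modify / delete) is applied to the gesture
# template store and the result is persisted. The dict stands in for
# the gesture template database; names are illustrative.

def apply_template_config(db, op, template_id, data=None):
    """Apply a create/modify/delete instruction and return the updated store."""
    if op == "create":
        db[template_id] = data
    elif op == "modify":
        if template_id in db:  # only existing templates can be modified
            db[template_id] = data
    elif op == "delete":
        db.pop(template_id, None)  # deleting a missing template is a no-op
    return db
```

Listing the current templates for the upper layer application would then simply be a matter of returning the store's keys.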
In an embodiment of the present invention, the data acquisition unit 501 is specifically configured to obtain the current three-axis acceleration data of the wearable device user collected by an acceleration sensor on the wearable device;
the detection and recognition unit 502 is configured to perform time-domain windowing on the three-axis acceleration data, preprocess the acceleration data within the sliding window, and detect the preprocessed acceleration data to determine whether a gesture motion exists in the current sliding window; if a gesture motion exists, match it against the gesture motions in the gesture templates to identify it; if no gesture motion exists, return to continue obtaining the action data of the wearable device user collected by the sensor on the wearable device.
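The template-matching step mentioned above could be sketched as a nearest-template search. The distance measure and rejection threshold are assumptions, since the embodiment does not specify the matching algorithm at this point.

```python
# Minimal sketch of matching a detected window of acceleration features
# against stored gesture templates. Euclidean distance and the rejection
# threshold are assumptions; the patent leaves the matcher unspecified.

import math

def match_gesture(window, templates, max_dist=1.0):
    """Return the ID of the closest template, or None if none is close enough."""
    best_id, best_dist = None, float("inf")
    for gesture_id, template in templates.items():
        # distance between equal-length feature sequences
        dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(window, template)))
        if dist < best_dist:
            best_id, best_dist = gesture_id, dist
    return best_id if best_dist <= max_dist else None
```

Returning `None` for an unmatched window corresponds to the branch above where no gesture motion is recognized and acquisition simply continues.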
In an embodiment of the present invention, the detection and recognition unit 502 is configured to, for the acceleration data in a sliding window of preset length, calculate the standard deviations σx, σy, σz of the three axes respectively, and then calculate the average deviation by the following formula:
σ = (σx + σy + σz)/3
The value of σ is compared with a predetermined threshold STD_TH; if σ is less than STD_TH, it is considered that no gesture motion exists.
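The motion-detection test above can be illustrated numerically. The window values and the threshold `STD_TH` below are made-up illustration data, not values from the patent.

```python
# Numeric sketch of the no-motion test: per-axis standard deviations over
# the sliding window are averaged and compared against STD_TH. A near-still
# wrist gives sigma close to zero; a moving wrist gives a large sigma.
# Window contents and the threshold value are illustrative assumptions.

import statistics

STD_TH = 0.5  # assumed threshold

def has_gesture_motion(ax, ay, az, threshold=STD_TH):
    """True if the averaged per-axis standard deviation reaches the threshold."""
    sigma = (statistics.pstdev(ax) + statistics.pstdev(ay) + statistics.pstdev(az)) / 3
    return sigma >= threshold

# A still wrist: near-constant readings (z holds gravity), sigma ~ 0
still = has_gesture_motion([0.0, 0.01, 0.0], [0.0, 0.0, 0.01], [9.8, 9.81, 9.8])
# A moving wrist: large swings on all axes, sigma well above the threshold
moving = has_gesture_motion([0.0, 3.0, -3.0], [1.0, -2.0, 2.0], [9.8, 12.0, 7.0])
```

This cheap screening step is what lets the more expensive template matching run only on windows that plausibly contain a gesture, which matters on a resource-constrained device.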
It should be noted that the working process of the wearable device application control apparatus of this embodiment corresponds to the implementation steps of the aforementioned wearable device application control method. Therefore, for the parts not described in detail in this embodiment, reference may be made to the explanations in the foregoing embodiments, and details are not repeated here.
Fig. 6 is a structural diagram of an electronic device according to an embodiment of the present invention. As shown in Fig. 6, the electronic device includes a memory 601 and a processor 602, which communicate with each other through an internal bus 603. The memory 601 stores program instructions executable by the processor 602, and when the program instructions are executed by the processor 602, the above wearable device application control method can be realized.
In addition, the logic instructions in the above memory 601 can be realized in the form of a software functional unit and, when sold or used as an independent product, can be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Another embodiment of the present invention provides a computer-readable storage medium storing computer instructions, the computer instructions causing the computer to perform the above method.
Those skilled in the art should understand that the embodiments of the present invention can be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to a magnetic disk memory, a CD-ROM, an optical memory, and the like) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be realized by computer program instructions. These computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device generate an apparatus for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
It should be noted that the terms "comprising", "including", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. In the absence of further restrictions, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device including that element.
In the specification of the present invention, numerous specific details are set forth. It is understood, however, that embodiments of the present invention may be practiced without these specific details. In some instances, well-known methods, structures, and techniques have not been shown in detail so as not to obscure the understanding of this description. Similarly, it should be understood that, in order to simplify the present disclosure and to aid in the understanding of one or more of the various inventive aspects, in the above description of exemplary embodiments of the present invention the features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof. However, the disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single embodiment disclosed above. Therefore, the claims following the specific embodiments are hereby expressly incorporated into the specific embodiments, with each claim standing on its own as a separate embodiment of the present invention.
The above description is merely a specific embodiment of the present invention. Under the above teaching of the present invention, those skilled in the art can make other improvements or modifications on the basis of the above embodiments. Those skilled in the art should understand that the above specific description is only intended to better explain the purpose of the present invention, and the protection scope of the present invention is subject to the protection scope of the claims.

Claims (10)

  1. A wearable device application control method, characterized in that a gesture-interaction-associated software development kit (SDK) is integrated in an upper layer application of a wearable device, the method comprising:
    calling a dynamic link library that pre-encapsulates gesture operation functions, to obtain action data of a wearable device user collected by a sensor on the wearable device;
    detecting and identifying the action data;
    when it is identified that the action data matches a gesture motion in a preset gesture template, sending identification information of the matched gesture motion, through an application programming interface provided by the dynamic link library, to the upper layer application of the wearable device, so that the upper layer application, according to the received gesture motion identification information, looks up a correspondence between gesture motions and responses, and then determines and performs a corresponding operation.
  2. The method according to claim 1, characterized in that the method further comprises:
    receiving a template information acquisition request sent by the upper layer application through calling the application programming interface, and sending a current gesture template list to the upper layer application through the application programming interface;
    and receiving a gesture template configuration instruction sent by the upper layer application through calling the application programming interface;
    performing a create, modify, or delete gesture template operation according to the corresponding gesture template configuration instruction, and saving an execution result to a gesture template database.
  3. The method according to claim 1, characterized in that obtaining the action data of the wearable device user collected by the sensor on the wearable device comprises:
    obtaining three-axis acceleration data of the wearable device user collected by an acceleration sensor on the wearable device;
    and detecting and identifying the action data comprises:
    performing time-domain windowing on the three-axis acceleration data, and preprocessing the acceleration data within a sliding window,
    detecting the preprocessed acceleration data to determine whether a gesture motion exists in the current sliding window;
    if a gesture motion exists, matching the gesture motion with gesture motions in a gesture template to identify the gesture motion;
    if no gesture motion exists, returning to continue obtaining the action data of the wearable device user collected by the sensor on the wearable device.
  4. The method according to claim 3, characterized in that detecting the preprocessed acceleration data to determine whether a gesture motion exists in the current sliding window comprises:
    for the acceleration data in a sliding window of preset length, calculating standard deviations σx, σy, σz respectively, and then calculating an average deviation by the following formula:
    σ = (σx + σy + σz)/3
    comparing σ with a predetermined threshold STD_TH, and considering that no gesture motion exists if σ is less than STD_TH.
  5. The method according to claim 1, characterized in that the method further comprises:
    receiving a start or stop gesture control command sent by the upper layer application through calling the application programming interface,
    according to the start gesture control command, calling the dynamic link library to obtain the action data of the wearable device user;
    according to the stop gesture control command, stopping obtaining the action data of the wearable device user.
  6. A wearable device application control apparatus, characterized in that a gesture-interaction-associated software development kit (SDK) is integrated in an application of a wearable device, the apparatus comprising:
    a data acquisition unit, configured to call a dynamic link library that pre-encapsulates gesture operation functions, to obtain action data of a wearable device user collected by a sensor on the wearable device;
    a detection and recognition unit, configured to detect and identify the action data;
    a control unit, configured to, when it is identified that the action data matches a gesture motion in a preset gesture template, send identification information of the matched gesture motion, through an application programming interface provided by the dynamic link library, to an upper layer application of the wearable device, so that the upper layer application, according to the received gesture motion identification information, looks up a correspondence between gesture motions and responses and then determines and performs a corresponding operation.
  7. The apparatus according to claim 6, characterized by further comprising:
    a switch unit, configured to receive a start or stop gesture control command sent by the upper layer application through calling the application programming interface; according to the start gesture control command, call the dynamic link library to obtain the action data of the wearable device user; and according to the stop gesture control command, stop obtaining the action data of the wearable device user;
    a template operation unit, configured to receive a template information acquisition request sent by the upper layer application through calling the application programming interface, and send a current gesture template list to the upper layer application through the application programming interface; and to receive a gesture template configuration instruction sent by the upper layer application through calling the application programming interface, perform a create, modify, or delete gesture template operation according to the corresponding gesture template configuration instruction, and save an execution result to a gesture template database.
  8. The apparatus according to claim 6, characterized in that
    the data acquisition unit is specifically configured to obtain three-axis acceleration data of the wearable device user collected by an acceleration sensor on the wearable device;
    the detection and recognition unit is configured to perform time-domain windowing on the three-axis acceleration data, preprocess the acceleration data within a sliding window, and detect the preprocessed acceleration data to determine whether a gesture motion exists in the current sliding window; if a gesture motion exists, match the gesture motion with gesture motions in a gesture template to identify it; if no gesture motion exists, return to continue obtaining the action data of the wearable device user collected by the sensor on the wearable device.
  9. The apparatus according to claim 8, characterized in that the detection and recognition unit is configured to, for the acceleration data in a sliding window of preset length, calculate standard deviations σx, σy, σz respectively, and then calculate an average deviation by the following formula:
    σ = (σx + σy + σz)/3
    and compare σ with a predetermined threshold STD_TH, considering that no gesture motion exists if σ is less than STD_TH.
  10. An electronic device, characterized in that the electronic device comprises a memory and a processor, the memory and the processor communicating with each other through an internal bus, the memory storing program instructions executable by the processor, and the program instructions, when executed by the processor, being capable of realizing the wearable device application control method according to any one of claims 1 to 5.
CN201711474468.4A 2017-12-29 2017-12-29 A kind of wearable device application control method and apparatus Pending CN108255297A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711474468.4A CN108255297A (en) 2017-12-29 2017-12-29 A kind of wearable device application control method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711474468.4A CN108255297A (en) 2017-12-29 2017-12-29 A kind of wearable device application control method and apparatus

Publications (1)

Publication Number Publication Date
CN108255297A true CN108255297A (en) 2018-07-06

Family

ID=62725297

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711474468.4A Pending CN108255297A (en) 2017-12-29 2017-12-29 A kind of wearable device application control method and apparatus

Country Status (1)

Country Link
CN (1) CN108255297A (en)


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050289182A1 (en) * 2004-06-15 2005-12-29 Sand Hill Systems Inc. Document management system with enhanced intelligent document recognition capabilities
CN101620531A (en) * 2008-07-02 2010-01-06 英业达股份有限公司 Method for customizing GUI of applications and computer readable storage medium
CN104216646A (en) * 2013-05-30 2014-12-17 华为软件技术有限公司 Method and device for creating application program based on gesture
CN104317647A (en) * 2014-10-31 2015-01-28 小米科技有限责任公司 Application function realizing method, device and terminal
CN104898942A (en) * 2015-05-18 2015-09-09 百度在线网络技术(北京)有限公司 Control method and device of wearable equipment
CN105184325A (en) * 2015-09-23 2015-12-23 歌尔声学股份有限公司 Human body action recognition method and mobile intelligent terminal
CN105242779A (en) * 2015-09-23 2016-01-13 歌尔声学股份有限公司 Method for identifying user action and intelligent mobile terminal
CN105549408A (en) * 2015-12-31 2016-05-04 歌尔声学股份有限公司 Wearable device and control method thereof, intelligent household server and control method thereof, and system
CN105676860A (en) * 2016-03-17 2016-06-15 歌尔声学股份有限公司 Wearable equipment, unmanned plane control device and control realization method
CN107111675A (en) * 2014-12-04 2017-08-29 皇家飞利浦有限公司 For the dynamical feedback of wearable device


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109683982A (en) * 2018-12-06 2019-04-26 深圳市广和通无线股份有限公司 MES system control method, device, computer equipment and storage medium
CN113196797A (en) * 2018-12-17 2021-07-30 高通股份有限公司 Acoustic gesture detection for control of audible devices
CN113196797B (en) * 2018-12-17 2022-09-20 高通股份有限公司 Acoustic gesture detection for control of audible devices
CN110515462A (en) * 2019-08-27 2019-11-29 安徽华米信息科技有限公司 It is a kind of intelligence wearable device in apply control method, device
CN112085052A (en) * 2020-07-28 2020-12-15 中国科学院深圳先进技术研究院 Training method of motor imagery classification model, motor imagery method and related equipment
CN117572963A (en) * 2023-11-06 2024-02-20 深圳市腾进达信息技术有限公司 Method for controlling operation of intelligent wearable device based on motion capture technology
CN117348737A (en) * 2023-12-06 2024-01-05 之江实验室 Data processing system and method based on multi-channel interaction

Similar Documents

Publication Publication Date Title
CN108255297A (en) A kind of wearable device application control method and apparatus
KR102267482B1 (en) Systems and Methods for Simultaneous Localization and Mapping
US11580700B2 (en) Augmented reality object manipulation
US20200310601A1 (en) Dynamic media selection menu
EP3566111B1 (en) Augmented reality object manipulation
US11886681B2 (en) Standardizing user interface elements
KR102228448B1 (en) Method and system for improved performance of a video game engine
US20160203362A1 (en) Air Writing And Gesture System With Interactive Wearable Device
EP3844725A1 (en) Augmented reality anthropomorphization system
US20150205963A1 (en) Method and device for extracting message format
WO2012168886A2 (en) Method and apparatus for contextual gesture recognition
CN107544670A (en) The computing device of non-vision response with power triggering
US20180357321A1 (en) Sequentialized behavior based user guidance
WO2018212995A1 (en) Methods and systems for query segmentation
CN104571508A (en) Method for operating data displayed by mobile terminal
US20240037847A1 (en) Three-dimensional modeling toolkit
US11816506B2 (en) Core targeting in heterogeneous multiprocessor systems
US11403081B2 (en) Automatic software performance optimization
KR20210149120A (en) Location-Based Augmented-Reality System
US20220239616A1 (en) Messaging system
US20170090590A1 (en) Determining Digit Movement from Frequency Data
CN110909190B (en) Data searching method and device, electronic equipment and storage medium
US11309877B1 (en) Comparator with floating capacitive supply
US11681596B1 (en) Redundant segment for efficient in-service testing
US20240223490A1 (en) Device clustering

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20191115

Address after: 266104 Laoshan Qingdao District North House Street investment service center room, Room 308, Shandong

Applicant after: GEER TECHNOLOGY CO., LTD.

Address before: 266061, No. 3, building 18, Qinling Mountains Road, Laoshan District, Shandong, Qingdao 401

Applicant before: Qingdao real time Technology Co., Ltd.

TA01 Transfer of patent application right
RJ01 Rejection of invention patent application after publication

Application publication date: 20180706

RJ01 Rejection of invention patent application after publication