CN110196635B - Gesture input method based on wearable equipment - Google Patents

Gesture input method based on wearable equipment

Info

Publication number
CN110196635B
CN110196635B (application CN201910351496.XA)
Authority
CN
China
Prior art keywords
input
character string
character
user
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910351496.XA
Other languages
Chinese (zh)
Other versions
CN110196635A (en)
Inventor
董玮
高艺
曾思钰
刘汶鑫
张文照
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201910351496.XA priority Critical patent/CN110196635B/en
Publication of CN110196635A publication Critical patent/CN110196635A/en
Application granted
Publication of CN110196635B publication Critical patent/CN110196635B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Biophysics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Machine Translation (AREA)
  • Character Discrimination (AREA)

Abstract

A gesture input method based on a wearable device comprises the following steps: a smart watch collects a wrist data set while the user handwrites different letters, an optical sensor collects the corresponding hand data set, and the letters are used as labels for both data sets; the two groups of motion data corresponding to the wrist and the hand are clipped, compressed, and time-aligned, and the processed wrist motion data serve as the handwriting data set; motion data of the fingertips, finger roots, wrists, and other parts are extracted from the hand data set as a reference data set, and the lengths of the metacarpal and the index finger are calculated; neural networks are built and trained to obtain a fitting model from wrist data to reference data and a classification model for recognizing letters; and a word search library is established to complete construction of the input method. When the user inputs, the input method collects the user's wrist data, first fits and then classifies the data to give the corresponding input letters, combines the letters in sequence into an input string, computes strings close to the input string, searches the word library for strings meeting the conditions, and finally returns the user's input string together with the words close to it, so that a prediction close to the user's intent can be given even when the user inputs incorrectly.

Description

Gesture input method based on wearable equipment
Technical Field
The invention relates to a gesture input method based on wearable equipment.
Background
In recent years, intelligent input methods have become a research hot spot at home and abroad, and convenient input technology is essential in the information era. Smartphones, smart watches, smart televisions, and similar devices all support installing input methods, and a good input method brings the user an efficient interactive experience. An input method must not only give the correct output when the user inputs correctly, but also correct the user's input and give the correct output when the user makes mistakes. On the other hand, associative input is increasingly popular: as the user types a word, the input method predicts what the user will input next based on the current input, and continuously adjusts its prediction algorithm according to the user's selections, so that the predictions better match the user's expectations. In conclusion, intelligent input methods play an increasingly important role in daily life, and research on intelligent input methods suitable for a mass audience has potential application value.
Existing input methods on smart devices require the user to tap keys on a keyboard to input letters, after which the corresponding output is given. Text input generally requires two hands: one hand fixes the input device while the other performs the input operation. In some scenarios the user cannot free both hands, for example holding an umbrella on a rainy day or carrying a package in one hand. In such cases the user cannot type with two hands, and an intelligent input method that can be operated with a single hand meets the corresponding user need.
In general, in such specific scenarios a single-handed input method has good application prospects.
Disclosure of Invention
The invention overcomes the defects of the prior art and provides a gesture input method based on a wearable device.
According to the invention, a smart watch collects a wrist data set while the user handwrites different letters, and an optical sensor collects the corresponding hand data set. Features of the data are extracted, neural networks are built, and training yields a fitting model from wrist data to reference data and a classification model for recognizing letters; a word search library is then established. When the user inputs, the wrist data are collected, first fitted and then classified to give the corresponding input letters; the letters are combined in sequence into an input string; finally, words matching the user's input string are found by searching the word library, and the user completes the corresponding sentence input by selecting the correct words.
In order to realize the purpose, the technical scheme adopted by the invention is as follows: a gesture input method based on a wearable device comprises the following steps:
step 1, acquiring wrist motion data and hand motion data of a user, comprising:
collecting wrist movement data by using an intelligent wearable watch, and collecting hand movement data by using an optical somatosensory controller;
step 2, a training stage, namely preprocessing the wrist motion data and the hand motion data in the step 1, training a neural network and obtaining a fitting model and a classifier, wherein the training stage comprises the following steps:
(2.1) recording corresponding letters as labels C for the wrist and hand movement data of the user in the step 1 when the user hand writes different letters;
(2.2) clipping compression and time alignment are carried out on the two groups of motion data corresponding to the wrist and the hand acquired in the step 1, and the processed wrist motion data is used as a handwritten data set A;
(2.3) extracting motion data of parts such as fingertips, finger roots and wrists from the hand motion data acquired in the step 1 to obtain a reference data set B; respectively calculating the lengths of the metacarpal bones and the index finger;
(2.4) building a neural network, and taking the trained neural network model as the fitting model M_fitting; M_fitting fits the handwriting data set to the reference data, yielding the fitted data set D;
(2.5) constructing another neural network, and taking the trained neural network model as the classifier M_classify; M_classify outputs the probability that the current input is each letter;
(2.6) freezing M_fitting and M_classify and converting them into the mobile models M_fitting-lite and M_classify-lite;
Step 3, a prediction stage, namely recognizing the input of the user and performing input prediction, wherein the prediction stage comprises the following steps:
(3.1) establishing a corresponding word searching library;
(3.2) collecting a handwriting input data set of the user;
(3.3) compressing the handwriting data set in the step (3.2) to obtain a compressed handwriting data set;
(3.4) taking the compressed handwriting data set obtained in the step (3.3) as input, firstly fitting through a data fitting model, and then classifying through a word recognition classifier to obtain the letters input by the user;
(3.5) according to the user's current input string, i.e., the sequential combination of the letters input by the user, calculating the strings at edit distance 1 to form a candidate string set;
(3.6) using the candidate string set obtained in step (3.5), searching the word library established in step (3.1) for words meeting the conditions;
(3.7) if the result of the query in step (3.6) is not empty, returning the query result; if the query result is empty, performing steps (3.8), (3.9), and (3.10);
(3.8) computing close strings, i.e., strings at edit distance 1, for the candidate string set of step (3.5) again, to obtain a candidate string set at edit distance 2 from the user's input string;
(3.9) using the edit-distance-2 candidate string set obtained in step (3.8), searching the word library established in step (3.1) for words meeting the conditions;
(3.10) if the result of the query in step (3.9) is not empty, returning the query result together with the user's input string; if the query result is empty, returning only the user's input string;
(3.11) the user swipes the hand up, down, left, or right to select the word displayed in the corresponding direction on the watch dial.
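Steps (3.5) to (3.10) above rely on the edit distance between strings; because step (3.5) counts an adjacent transposition as a single edit, the measure behaves like the restricted Damerau-Levenshtein distance. Below is a minimal illustrative implementation, not the patent's own code (which generates candidates directly instead of scoring string pairs):

```python
def edit_distance(a: str, b: str) -> int:
    """Restricted Damerau-Levenshtein distance: minimum number of
    single-character insertions, deletions, substitutions, or adjacent
    transpositions needed to turn a into b."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                      # delete all of a[:i]
    for j in range(n + 1):
        d[0][j] = j                      # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[m][n]

print(edit_distance("helo", "hello"))   # 1: one insertion
print(edit_distance("hlelo", "hello"))  # 1: one adjacent transposition
```

The fallback in steps (3.7) and (3.8) widens the search from distance-1 to distance-2 candidates only when the first query returns nothing, which keeps the common correct-input case fast.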
The invention has the following beneficial effects. Input mode: input is completed by writing letters in the air with a finger, and once a word has been input the user selects the desired word by rotating the wrist; the input mode is novel and requires only one hand. Letter recognition: letter classification accuracy reaches 80%; the fitting error is small, with an accumulated error of 3.3 cm between the fitted fingertip trajectory and the actual trajectory; classification is fast, with preprocessing plus the two-stage model run on the watch taking only about 0.28 s in total; and the models are small, about 30 KB and 90 KB respectively after freezing and compression. Word prediction: similarity between strings is measured by edit distance, and correct words at edit distance at most 2 from the user's input string can be returned. Input speed: the average time to input a seven-letter word is 12 s.
Drawings
FIG. 1 is a workflow diagram of data collection, letter classification and word selection for the method of the present invention.
FIG. 2 is a flow chart of model training for the method of the present invention.
FIG. 3 is a flow chart of word prediction for the method of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings. The invention discloses a gesture input method based on wearable equipment, which comprises the following specific implementation modes:
step 1, acquiring wrist motion data and hand motion data, comprising:
collecting m groups of wrist motion data while the user handwrites different letters, using the smart wearable watches LG Watch Urbane and Ticwatch, and simultaneously collecting m groups of hand motion data for the same letters using the Leap Motion optical motion controller;
step 2, in the training stage, the wrist motion data and the hand motion data in the step 1 are preprocessed, data features are extracted, and a feature database and a classifier are established, wherein the training stage comprises the following steps:
(2.1) recording the corresponding letters as the labels C = {C_1, C_2, …, C_m} for the m groups of wrist and hand motion data collected in step 1 while the user handwrites different letters;
(2.2) clipping and time-aligning the two groups of motion data corresponding to the wrist and the hand acquired in step 1, compressing them to the same length l by linear interpolation, and taking the processed wrist motion data as the handwriting data set A = {A_1, A_2, …, A_m}, where A_i has size l × 6 × 1, i ∈ {1, 2, …, m};
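The compression to a common length l in step (2.2) amounts to linear interpolation along the time axis. Below is a sketch under the assumption of six-channel wrist samples (e.g., 3-axis accelerometer plus 3-axis gyroscope); the helper name `resample` is illustrative, not the patent's:

```python
def resample(seq, l):
    """Linearly interpolate a list of multi-channel samples to exactly
    l samples, preserving the first and last sample."""
    n = len(seq)
    if n == 1 or l == 1:
        return [list(seq[0]) for _ in range(l)]
    out = []
    for k in range(l):
        t = k * (n - 1) / (l - 1)   # fractional position in the source
        i = min(int(t), n - 2)      # left neighbour index
        frac = t - i
        out.append([(1 - frac) * a + frac * b
                    for a, b in zip(seq[i], seq[i + 1])])
    return out

# three 6-channel wrist samples stretched to length 5
wrist = [[0.0] * 6, [2.0] * 6, [4.0] * 6]
print(resample(wrist, 5))  # every channel ramps 0, 1, 2, 3, 4
```

The same routine applied to both the wrist and hand streams gives the common length l required before the fitting model is trained.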
(2.3) extracting the motion data of the fingertips, finger roots, wrists, and other parts from the hand motion data acquired in step 1, and translating the origin of the spatial coordinate system to the center of the metacarpophalangeal joints, to obtain the reference data set B = {B_1, B_2, …, B_m}, where B_i has size l × 3 × 3, i ∈ {1, 2, …, m}; calculating the metacarpal length l_metacarpal and the index-finger length l_finger;
(2.4) building a convolutional neural network M_1 whose layers are linked layer by layer from top to bottom and are convolutional layers 1 to 6; the optimizer is the Adam optimizer, the activation function is the ReLU function, and the loss function of the model is designed as L_1, as shown in formula (1),

[formula (1): image not extracted from the source]

where T_1 is the output result of the neural network, and K is a compensation coefficient for reducing the error in the phalange lengths collected by the sensor;
Taking A obtained in step (2.2) as input and B obtained in step (2.3) as reference data for training, the trained neural network model is the fitting model M_fitting; M_fitting fits the handwriting data set close to the reference data set, and the fitted data set is D = {D_1, D_2, …, D_m}, where D_i has size l × 3 × 3, i ∈ {1, 2, …, m};
(2.5) building a convolutional neural network M_2 whose layers are linked layer by layer from top to bottom: convolutional layers 1 to 6, flatten layer 7, fully connected layer 8, and softmax classifier layer 9; the optimizer is the Adam optimizer, the activation function is the ReLU function, and the loss function of the model is designed as L_2, as shown in formula (2),

[formula (2): image not extracted from the source]

where T_2 is the output result of the neural network; taking D obtained in step (2.4) as input and C obtained in step (2.1) as reference data, the trained neural network model is the classifier M_classify; M_classify outputs the probability that the current input is each letter;
(2.6) reading the meta graph and checkpoint files of M_fitting and M_classify saved after training, designating the output nodes, and saving all necessary nodes to obtain the frozen models M'_fitting and M'_classify; converting M'_fitting and M'_classify with the TensorFlow Lite converter into the mobile models M_fitting-lite and M_classify-lite;
Step 3, an identification stage, data preprocessing and classifier result fusion, comprising:
(3.1) establishing the corresponding word search library as a dictionary tree (trie), inserting the letters of each word into the tree one by one; before inserting, checking whether the prefix already exists, and if so sharing the prefix string, otherwise creating the corresponding nodes and edges; after the dictionary tree is built, each edge between adjacent nodes represents one character, the characters along the path from the root to a node, concatenated, form the string corresponding to that node, and the string corresponding to each node is distinct;
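The dictionary tree of step (3.1) is a standard trie. A minimal sketch follows; the class and method names are illustrative assumptions, not identifiers from the patent:

```python
class TrieNode:
    def __init__(self):
        self.children = {}    # edge character -> child node
        self.is_word = False  # path from the root to here spells a full word

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        # Walk existing prefix edges, creating nodes only where missing,
        # so words sharing a prefix share the corresponding path.
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def contains(self, word):
        node = self.root
        for ch in word:
            node = node.children.get(ch)
            if node is None:
                return False
        return node.is_word

trie = Trie()
for w in ("hello", "help", "hero"):
    trie.insert(w)
print(trie.contains("help"))  # True
print(trie.contains("hel"))   # False: a shared prefix, not an inserted word
```

Candidate strings from step (3.5) can then be checked against the library with `contains`, which runs in time proportional to the string length regardless of library size.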
(3.2) acquiring a handwriting input data set W of the user;
(3.3) compressing the handwriting data set from step (3.2) to obtain the compressed handwriting data set W_compressed;
(3.4) taking the compressed handwriting data set W_compressed obtained in step (3.3) as input, first fitting it with the data fitting model M_fitting-lite and then classifying it with the word recognition classifier M_classify-lite, to obtain the letter λ input by the user;
(3.5) according to the user's current input string S_input, i.e., the sequential combination of the letters λ input by the user, calculating the strings S_edit at edit distance 1, i.e., Edit[S_input][S_edit] = 1, to form the candidate string set Set_edit. Four kinds of S_edit satisfy the condition:
The first kind can become the input string S_input by adding one character; it is obtained by traversing S_input and deleting one character, as shown in formula (3),

Set_edit^(1) = { DeleteChar(S_input, i) | 1 ≤ i ≤ n }    (3)

where DeleteChar() is a deletion function that deletes the character at a given position in a string, S_input is the string to delete from, i is the position in S_input of the character to be deleted, and n is the length of the string;
The second kind can become the input string S_input by deleting one character; it is obtained by traversing S_input and inserting one character, as shown in formula (4),

Set_edit^(2) = { InsertChar(S_input, j, c) | 1 ≤ j ≤ n + 1, c ∈ [a, z] }    (4)

where InsertChar() is an insertion function that inserts the given character at a given position in a string, S_input is the string to insert into, j is the insertion position in S_input, c is the inserted character with value range [a, z], and n is the length of the string;
The third kind can become the input string S_input by replacing one character; it is obtained by traversing S_input and replacing one character, as shown in formula (5),

Set_edit^(3) = { AlterChar(S_input, k, α) | 1 ≤ k ≤ n, α ∈ [a, z] }    (5)

where AlterChar() is a replacement function that replaces the character at a given position in a string, S_input is the string to modify, k is the position in S_input of the character to be replaced, α is the character that replaces the one at position k, with value range [a, z], and n is the length of the string;
The fourth kind can become S_input by transposing two adjacent characters; it is obtained by traversing S_input and swapping two adjacent characters, as shown in formula (6),

Set_edit^(4) = { TransposeChars(S_input, p, p + 1) | 1 ≤ p ≤ n − 1 }    (6)

where TransposeChars() is a transposition function that swaps the characters at two adjacent positions in a string, S_input is the string to transpose, p and p + 1 are the positions in S_input of the characters to be transposed, and n is the length of the string; the candidate string set Set_edit is the union of the four sets above;
(3.6) using the candidate string set Set_edit obtained in step (3.5), querying the word library established in step (3.1) for words meeting the conditions; if a string in the candidate set is not in the word library, it is discarded directly; otherwise, according to the occurrence counts of the corresponding words in the word library, the top N words with the most occurrences are returned as the prediction result, as shown in formulas (7) and (8),

Set_dict = InDictionary(Set_edit)    (7)
Result = MaxFrequence(Set_dict, N)    (8)

where the InDictionary() function finds the subset Set_dict of candidate strings present in the word library, and the MaxFrequence() function finds the top N words in Set_dict by occurrence count;
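The lookup and ranking of formulas (7) and (8) can be sketched as follows; the word library and its frequency counts are made-up placeholders, and `in_dictionary`/`max_frequency` are plain-Python stand-ins for the patent's InDictionary and MaxFrequence functions, whose internals the patent does not give:

```python
import heapq

# hypothetical word library: word -> occurrence count in a corpus
WORD_FREQ = {"the": 5000, "then": 1200, "ten": 800, "tea": 600, "he": 3000}

def in_dictionary(candidates):
    """Formula (7): keep only candidate strings present in the library."""
    return {w for w in candidates if w in WORD_FREQ}

def max_frequency(words, n):
    """Formula (8): the top-n words by occurrence count."""
    return heapq.nlargest(n, words, key=WORD_FREQ.get)

candidates = {"teh", "the", "ten", "tex"}
found = in_dictionary(candidates)
print(max_frequency(found, 2))  # ['the', 'ten']
```

Ranking by corpus frequency is what makes the correction useful in practice: of all candidates that happen to be real words, the one the user most plausibly meant is usually the most common one.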
(3.7) if the result obtained by the query in step (3.6) is not empty, returning the query result; if the query result is empty, performing steps (3.8), (3.9), and (3.10);
(3.8) for the candidate string set Set_edit of step (3.5), computing close strings again, i.e., strings at edit distance 1 from Set_edit so that Edit[S_input][S_edit] = 2, to obtain the candidate string set Set_edit2 at edit distance 2 from the user's input string S_input;
(3.9) using the candidate string set Set_edit2 at edit distance 2 from the user's input string, obtained in step (3.8), querying the word library established in step (3.1) for words meeting the conditions; if a string in the candidate set is not in the word library, it is discarded directly; otherwise, according to the occurrence counts of the corresponding words in the word library, the top N words with the most occurrences are returned as the prediction result, as shown in formulas (9) and (10),

Set_dict2 = InDictionary(Set_edit2)    (9)
Result = MaxFrequence(Set_dict2, N)    (10)

where the InDictionary() function finds the subset Set_dict2 of candidate strings present in the word library, and the MaxFrequence() function finds the top N words in Set_dict2 by occurrence count;
(3.10) if the result obtained by the query in step (3.9) is not empty, returning the query result together with the user's input string S_input; if the query result is empty, returning only the user's input string;
(3.11) the user swipes the hand up, down, left, or right to select the word displayed in the corresponding direction on the watch dial.
The embodiments described in this specification merely illustrate the inventive concept; the scope of the invention should not be considered limited to the specific forms set forth in the embodiments, but also covers the equivalents that may occur to those skilled in the art on the basis of the inventive concept.

Claims (1)

1. A gesture input method based on a wearable device comprises the following steps:
step 1, acquiring wrist motion data and hand motion data, comprising:
collecting m groups of wrist motion data while the user handwrites different letters, using the smart wearable watches LG Watch Urbane and Ticwatch, and simultaneously collecting m groups of hand motion data for the same letters using the Leap Motion optical motion controller;
step 2, in the training stage, the wrist motion data and the hand motion data in the step 1 are preprocessed, data features are extracted, and a feature database and a classifier are established, wherein the training stage comprises the following steps:
(2.1) recording the corresponding letters as the labels C = {C_1, C_2, …, C_m} for the m groups of wrist and hand motion data collected in step 1 while the user handwrites different letters;
(2.2) clipping and time-aligning the two groups of motion data corresponding to the wrist and the hand acquired in step 1, compressing them to the same length l by linear interpolation, and taking the processed wrist motion data as the handwriting data set A = {A_1, A_2, …, A_m}, where A_i has size l × 6 × 1, i ∈ {1, 2, …, m};
(2.3) extracting the motion data of the fingertip, finger root, and wrist parts from the hand motion data acquired in step 1, and translating the origin of the spatial coordinate system to the center of the metacarpophalangeal joints, to obtain the reference data set B = {B_1, B_2, …, B_m}, where B_i has size l × 3 × 3, i ∈ {1, 2, …, m}; calculating the metacarpal length l_metacarpal and the index-finger length l_finger;
(2.4) building a convolutional neural network M_1 whose layers are linked layer by layer from top to bottom and are convolutional layers 1 to 6; the optimizer is the Adam optimizer, the activation function is the ReLU function, and the loss function of the model is designed as L_1, as shown in formula (1),

[formula (1): image not extracted from the source]

where T_1 is the output result of the neural network, and K is a compensation coefficient for reducing the error in the phalange lengths collected by the sensor;
Taking A obtained in step (2.2) as input and B obtained in step (2.3) as reference data for training, the trained neural network model is the fitting model M_fitting; M_fitting fits the handwriting data set close to the reference data set, and the fitted data set is D = {D_1, D_2, …, D_m}, where D_i has size l × 3 × 3, i ∈ {1, 2, …, m};
(2.5) building a convolutional neural network M_2 whose layers are linked layer by layer from top to bottom: convolutional layers 1 to 6, flatten layer 7, fully connected layer 8, and softmax classifier layer 9; the optimizer is the Adam optimizer, the activation function is the ReLU function, and the loss function of the model is designed as L_2, as shown in formula (2),

[formula (2): image not extracted from the source]

where T_2 is the output result of the neural network; taking D obtained in step (2.4) as input and C obtained in step (2.1) as reference data, the trained neural network model is the classifier M_classify; M_classify outputs the probability that the current input is each letter;
(2.6) reading the meta graph and checkpoint files of M_fitting and M_classify saved after training, designating the output nodes, and saving all necessary nodes to obtain the frozen models M'_fitting and M'_classify; converting M'_fitting and M'_classify with the TensorFlow Lite converter into the mobile models M_fitting-lite and M_classify-lite;
Step 3, an identification stage, data preprocessing and classifier result fusion, comprising:
(3.1) establishing the corresponding word search library as a dictionary tree (trie), inserting the letters of each word into the tree one by one; before inserting, checking whether the prefix already exists, and if so sharing the prefix string, otherwise creating the corresponding nodes and edges; after the dictionary tree is built, each edge between adjacent nodes represents one character, the characters along the path from the root to a node, concatenated, form the string corresponding to that node, and the string corresponding to each node is distinct;
(3.2) acquiring a handwriting input data set W of the user;
(3.3) compressing the handwriting data set from step (3.2) to obtain the compressed handwriting data set W_compressed;
(3.4) taking the compressed handwriting data set W_compressed obtained in step (3.3) as input, first fitting it with the data fitting model M_fitting-lite and then classifying it with the word recognition classifier M_classify-lite, to obtain the letter λ input by the user;
(3.5) according to the user' S current input string SinputI.e. user inputThe sequential combination of the input letter lambda calculates the character string S with the edit distance of 1editI.e. Edit [ S ]input][Sedit]1, forming a candidate character string Setedit
Figure FDA0002520335360000031
In (1), there are four kinds of S satisfying the conditioneditAnd (3) gathering:
Figure FDA0002520335360000032
the first one can be changed into an input string S by adding one characterinputIs/are as follows
Figure FDA0002520335360000033
In particular by traversing SinputAnd deleting one character, as shown in formula (3),
Figure FDA0002520335360000034
wherein DeleteChar () is a delete function for deleting a character at a position in a character string, SinputIs a character string to be deleted, i is SinputThe position of the character to be deleted, and n is the length of the character string to be deleted;
second, the input string S can be changed to a character by deleting a characterinputIs/are as follows
Figure FDA0002520335360000035
In particular by traversing SinputAnd inserts a character, as shown in formula (4),
Figure FDA0002520335360000041
wherein InsertChar () is an insertion function for inserting a corresponding character, S, at a certain position in a stringinputIs a character string to be inserted, jIs SinputC is the inserted character, and the value range is [ a, z]N is the length of the insertion string;
the third kind, Set_alter, contains the strings that can be changed into the input string S_input by replacing one character; it is obtained by traversing S_input and replacing one character at a time, as shown in formula (5),

Set_alter = { AlterChar(S_input, k, α) | 1 ≤ k ≤ n, α ∈ [a, z] }    (5)

wherein AlterChar() is a replacement function for replacing the character at a given position in a character string, S_input is the character string to be replaced in, k is the position of the character in S_input, α is the character that replaces the character at position k of S_input, with value range [a, z], and n is the length of the character string;
the fourth kind, Set_transpose, contains the strings that can be changed into S_input by transposing two adjacent characters; it is obtained by traversing S_input and transposing two adjacent characters at a time, as shown in formula (6),

Set_transpose = { TransposeChars(S_input, p, p+1) | 1 ≤ p ≤ n-1 }    (6)

wherein TransposeChars() is a transposition function for transposing the characters at two adjacent positions in a character string, S_input is the character string to be transposed, p and p+1 are the positions in S_input of the characters to be transposed, and n is the length of the character string;
(3.6) using the candidate character string set Set_edit^1 obtained in the step (3.5), querying the word bank established in the step (3.1) for the words satisfying the condition; if a candidate character string in Set_edit^1 is not in the word bank, it is directly discarded; otherwise, according to the occurrence frequency of the corresponding words in the word bank, the top N words with the highest occurrence frequency are returned as the prediction result, as shown in formulas (7) and (8),

Set_dict^1 = InDictionary(Set_edit^1)    (7)
Result^1 = MaxFrequence(Set_dict^1, N)    (8)

wherein the InDictionary() function is used to find the set Set_dict^1 of words from the candidate character string set Set_edit^1 that are present in the word bank, and the MaxFrequence() function is used to find the top N words in Set_dict^1 by occurrence frequency;
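Assuming the word bank is a plain mapping from word to occurrence frequency, formulas (7) and (8) reduce to a membership filter followed by a frequency sort; the names below are illustrative:

```python
def in_dictionary(candidates: set, word_bank: dict) -> set:
    """InDictionary(): keep only the candidate strings present in the word
    bank (assumed here to map word -> occurrence frequency); the rest are
    discarded."""
    return {w for w in candidates if w in word_bank}

def max_frequence(words: set, word_bank: dict, n: int) -> list:
    """MaxFrequence(): return the top-n words by occurrence frequency."""
    return sorted(words, key=lambda w: word_bank[w], reverse=True)[:n]
```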
(3.7) if the result Result^1 obtained by the query in the step (3.6) is not empty, returning the query result; if Result^1 is empty, performing the steps (3.8), (3.9) and (3.10);
(3.8) for the candidate character string set Set_edit^1 obtained in the step (3.5), finding close strings again, i.e. strings whose edit distance from the members of Set_edit^1 is 1 and therefore Edit[S_input][S_edit] = 2, to obtain the candidate character string set Set_edit^2 whose edit distance from the character string S_input input by the user is 2;
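The expansion to edit distance 2 in step (3.8) can be sketched by composing a distance-1 generator with itself (a compact generator combining the four operations of formulas (3)-(6); names illustrative):

```python
import string

def edits1(s: str) -> set:
    """Every string at edit distance 1 from s: one deletion, insertion,
    replacement, or adjacent transposition (formulas (3)-(6))."""
    letters = string.ascii_lowercase
    splits = [(s[:i], s[i:]) for i in range(len(s) + 1)]
    deletes = {a + b[1:] for a, b in splits if b}
    inserts = {a + c + b for a, b in splits for c in letters}
    replaces = {a + c + b[1:] for a, b in splits if b for c in letters if c != b[0]}
    transposes = {a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1}
    return deletes | inserts | replaces | transposes

def edits2(s: str) -> set:
    """Step (3.8): apply the distance-1 expansion to every distance-1
    candidate, yielding strings with Edit[s][e] = 2 (plus some at distance
    0 or 1, which the later dictionary query tolerates)."""
    return {e2 for e1 in edits1(s) for e2 in edits1(e1)}
```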
(3.9) using the candidate character string set Set_edit^2, obtained in the step (3.8), whose edit distance from the character string input by the user is 2, querying the word bank established in the step (3.1) for the words satisfying the condition; if a candidate character string in Set_edit^2 is not in the word bank, it is directly discarded; otherwise, according to the occurrence frequency of the corresponding words in the word bank, the top N words with the highest occurrence frequency are returned as the prediction result, as shown in formulas (9) and (10),

Set_dict^2 = InDictionary(Set_edit^2)    (9)
Result^2 = MaxFrequence(Set_dict^2, N)    (10)

wherein the InDictionary() function is used to find the set Set_dict^2 of words from the candidate character string set Set_edit^2 that are present in the word bank, and the MaxFrequence() function is used to find the top N words in Set_dict^2 by occurrence frequency;
(3.10) if the result Result^2 obtained by the query in the step (3.9) is not empty, returning the query result Result^2 together with the character string S_input input by the user; if Result^2 is empty, returning only the character string input by the user;
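The query-and-fallback logic of steps (3.6)-(3.10) can be condensed into one function; the word-bank format (word → frequency) and all names are assumptions of this sketch:

```python
import string

def edits1(s: str) -> set:
    """Every string at edit distance 1 (delete, insert, replace, transpose)."""
    letters = string.ascii_lowercase
    splits = [(s[:i], s[i:]) for i in range(len(s) + 1)]
    return ({a + b[1:] for a, b in splits if b}
            | {a + c + b for a, b in splits for c in letters}
            | {a + c + b[1:] for a, b in splits if b for c in letters}
            | {a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1})

def predict(s_input: str, word_bank: dict, n: int = 4) -> list:
    """Steps (3.6)-(3.10): query the word bank with the distance-1 candidates;
    if nothing matches, retry with the distance-2 candidates; if that also
    fails, fall back to the raw input string."""
    distance1 = edits1(s_input)
    distance2 = {e2 for e1 in distance1 for e2 in edits1(e1)}
    for candidates in (distance1, distance2):
        known = {w for w in candidates if w in word_bank}      # InDictionary()
        if known:                                              # MaxFrequence()
            return sorted(known, key=word_bank.get, reverse=True)[:n]
    return [s_input]
```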
and (3.11) when the user waves a hand upwards, downwards, leftwards or rightwards, selecting the word displayed in the corresponding direction (up, down, left or right) on the dial.
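A minimal sketch of the selection in step (3.11), assuming the up/down/left/right slots map to candidate indices in a fixed order (the actual dial layout is not specified in this claim):

```python
# Hypothetical layout: the four prediction slots shown on the dial, in the
# order up, down, left, right (this on-screen ordering is an assumption).
DIRECTIONS = ("up", "down", "left", "right")

def select_word(direction: str, predictions: list) -> str:
    """Step (3.11): return the candidate word displayed in the direction
    the user waved a hand."""
    slot = DIRECTIONS.index(direction)
    if slot >= len(predictions):
        raise ValueError("no candidate displayed in that direction")
    return predictions[slot]
```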
CN201910351496.XA 2019-04-28 2019-04-28 Gesture input method based on wearable equipment Active CN110196635B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910351496.XA CN110196635B (en) 2019-04-28 2019-04-28 Gesture input method based on wearable equipment

Publications (2)

Publication Number Publication Date
CN110196635A CN110196635A (en) 2019-09-03
CN110196635B true CN110196635B (en) 2020-07-31

Family

ID=67752241

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910351496.XA Active CN110196635B (en) 2019-04-28 2019-04-28 Gesture input method based on wearable equipment

Country Status (1)

Country Link
CN (1) CN110196635B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110569800B (en) * 2019-09-10 2022-04-12 武汉大学 Detection method of handwriting signal
CN115118688B (en) * 2022-08-29 2022-11-04 中航信移动科技有限公司 Method for sending message based on message template, storage medium and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106598227A (en) * 2016-11-15 2017-04-26 电子科技大学 Hand gesture identification method based on Leap Motion and Kinect
CN106874874A (en) * 2017-02-16 2017-06-20 南方科技大学 Motion state identification method and device
CN108664877A (en) * 2018-03-09 2018-10-16 北京理工大学 A kind of dynamic gesture identification method based on range data
CN109145793A (en) * 2018-08-09 2019-01-04 东软集团股份有限公司 Establish method, apparatus, storage medium and the electronic equipment of gesture identification model

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101930545A (en) * 2009-06-24 2010-12-29 夏普株式会社 Handwriting recognition method and device
CN103226388B (en) * 2013-04-07 2016-05-04 华南理工大学 A kind of handwriting sckeme based on Kinect
CN103577843B (en) * 2013-11-22 2016-06-22 中国科学院自动化研究所 A kind of aerial hand-written character string recognition methods
CN104156491A (en) * 2014-09-01 2014-11-19 携程计算机技术(上海)有限公司 Mobile terminal and list information retrieval method thereof
CN107330480B (en) * 2017-07-03 2020-10-13 贵州大学 Computer recognition method for hand-written character
KR20190008168A (en) * 2018-12-28 2019-01-23 정우곤 An artificial neural network application technology for converting hand-written Korean sample text into its own font.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant