CN102866852B - Handwriting character input method based on body-sensing technology - Google Patents
- Publication number
- CN102866852B (granted from application CN201210267577.XA)
- Authority
- CN
- China
- Prior art keywords
- operator
- hand
- word
- virtual paper
- track
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a handwritten character input method based on body-sensing technology, which comprises the following steps: providing a character library containing the stroke trajectory of each character; providing a body-sensing device and defining a virtual paper sheet in its sensing space; identifying the input state of the operator; the operator writing with pen-down and pen-up hand movements; the body-sensing device judging from the operator's pause that a single character is finished; after one character is written, sensing the trajectory traced on the virtual paper sheet; the computer performing feature matching between that trajectory and the trajectories in the character library; listing the matching candidate characters; and the operator selecting one. With this method, handwriting does not depend on traditional fixed input devices such as a keyboard, microphone, or writing pad, no sensing tool needs to be held, and handwriting can be completed anywhere in the sensing space of the body-sensing device, so writing is free, operation is comfortable, and text input speed is increased; moreover, the thickness of the virtual paper sheet and the pen-down and pen-up distances can be set by the operator according to individual handwriting habits to achieve the best input experience.
Description
Technical field
The present invention relates to a handwritten character input method based on body-sensing technology.
Background technology
With the development and maturation of body-sensing technology, applying it to computer input has become practical, offering a more comfortable and natural input mode. Microsoft's Kinect and Leap Motion's Leap 3D are currently the two leading brands in the body-sensing input field: without the operator holding any sensing prop, cameras and sensors capture and recognize the operator's movements. At present, however, body-sensing technology is mainly applied to games, where it senses and recognizes a player's actual movements to control the corresponding actions of an in-game character, giving the player a good gaming experience. Space-based body-sensing technology has not yet been applied to text input: existing computer text input usually relies on keyboard, voice, or writing-pad input, all of which depend on fixed input devices, are not very free to use, and offer unsatisfactory writing comfort.
Summary of the invention
The object of the invention is to overcome the deficiencies of the prior art by providing a handwritten character input method based on body-sensing technology that does not rely on fixed input devices such as a conventional keyboard, microphone, or writing pad, requires no hand-held sensing prop, and allows handwriting to be completed anywhere in the sensing space of a body-sensing device, so that writing is free and use is comfortable.
The object of the invention is achieved through the following technical solution: a handwritten character input method based on body-sensing technology, comprising the following steps:
(1) providing a character library containing the stroke trajectory of each character;
(2) providing a body-sensing device and defining a virtual paper sheet in the sensing space of the device;
(3) identifying the input state of the operator, which comprises the following steps:
S11: the operator's hand extending toward the virtual paper indicates pen-down, i.e., touching the virtual paper;
S12: the operator's hand withdrawing away from the virtual paper indicates pen-up, i.e., leaving the virtual paper;
(4) the operator putting pen down and writing; meanwhile, the position of the operator's hand and its pen-down or pen-up state are shown on a display;
(5) the body-sensing device judging, from the operator's pause, that input of a single character has ended;
(6) after one character has been written, the computer sensing, through the body-sensing device, the trajectory traced by the operator on the virtual paper;
(7) the computer performing feature matching between the trajectory traced by the operator and the trajectory of each character in the pre-stored character library;
(8) listing the candidate characters in a candidate bar in descending order of matching degree;
(9) the operator selecting the correctly recognized candidate character.
In the present invention, the thickness of the virtual paper, the pen-down and pen-up distances, and the pause duration indicating the end of a single character can all be customized by the operator.
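The customizable quantities just listed might be grouped into a single configuration object. The field names and default values below are assumptions for illustration; the patent only specifies that these quantities are operator-settable.

```python
# Hypothetical grouping of the operator-configurable settings named in the
# method: virtual paper thickness, pen-down/pen-up distances, and the pause
# that ends a single character. Names and defaults are illustrative.
from dataclasses import dataclass

@dataclass
class HandwritingConfig:
    paper_thickness_m: float = 0.02    # thickness of the virtual paper sheet
    pen_down_distance_m: float = 0.02  # travel toward the paper counted as pen-down
    pen_up_distance_m: float = 0.05    # withdrawal counted as pen-up
    char_end_pause_s: float = 0.8      # pause that ends a single character

# An operator who writes slowly might, for example, choose:
cfg = HandwritingConfig(paper_thickness_m=0.03, char_end_pause_s=1.2)
print(cfg.char_end_pause_s)  # → 1.2
```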
The beneficial effects of the invention are as follows: the method does not rely on fixed input devices such as a conventional keyboard, microphone, or writing pad, requires no hand-held sensing prop, and allows handwriting to be completed anywhere in the sensing space of the body-sensing device, so writing is free and operation is comfortable. Applied to text input for web browsing, social conversation, and even office work, it improves text input speed while improving operating comfort. The operator can set the thickness of the virtual paper and the pen-down and pen-up distances according to personal handwriting habits, and the handwriting trajectory can be displayed on the display in real time during input, achieving the best input experience.
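Step (5) of the method judges end-of-character from the operator's pause. A minimal Python sketch of such pause-based segmentation, assuming hypothetical stroke timestamps and a configurable threshold (the 0.8 s value is an assumption, not taken from the patent):

```python
# Sketch of step (5): a single character is judged finished when the gap
# between one stroke's end and the next stroke's start exceeds a
# configurable pause. Timestamps are in seconds; the threshold is illustrative.

CHAR_END_PAUSE_S = 0.8  # operator-configurable pause threshold, assumed

def split_characters(strokes):
    """Group pen strokes into characters.

    `strokes` is a list of (start_time, end_time, points) tuples in
    writing order; a gap longer than CHAR_END_PAUSE_S ends the
    current character and starts a new one."""
    chars, current = [], []
    for stroke in strokes:
        if current and stroke[0] - current[-1][1] > CHAR_END_PAUSE_S:
            chars.append(current)   # long pause: close the current character
            current = []
        current.append(stroke)
    if current:
        chars.append(current)
    return chars

strokes = [
    (0.0, 0.4, ["stroke-1"]),   # first character, two strokes
    (0.6, 0.9, ["stroke-2"]),   # only 0.2 s gap: same character
    (2.1, 2.5, ["stroke-3"]),   # 1.2 s pause: new character
]
print(len(split_characters(strokes)))  # → 2
```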
Brief description of the drawings
Fig. 1 is an operational flowchart of the present invention.
Embodiment
The technical solution of the present invention is described in further detail below with reference to the accompanying drawing, but the protection scope of the present invention is not limited to the following description.
As shown in Fig. 1, the handwritten character input method based on body-sensing technology comprises the following steps:
(1) providing a character library containing the stroke trajectory of each character;
(2) providing a body-sensing device and defining a virtual paper sheet in the sensing space of the device;
(3) identifying the input state of the operator, which comprises the following steps:
S11: the operator's hand extending toward the virtual paper indicates pen-down, i.e., touching the virtual paper;
S12: the operator's hand withdrawing away from the virtual paper indicates pen-up, i.e., leaving the virtual paper;
The thickness of the virtual paper, the pen-down and pen-up distances, and the pause duration indicating the end of a single character can all be customized by the operator.
(4) the operator putting pen down and writing; meanwhile, the position of the operator's hand and its pen-down or pen-up state are shown on a display;
(5) the body-sensing device judging, from the operator's pause, that input of a single character has ended;
(6) after one character has been written, the computer sensing, through the body-sensing device, the trajectory traced by the operator on the virtual paper;
(7) the computer performing feature matching between the trajectory traced by the operator and the trajectory of each character in the pre-stored character library;
(8) listing the candidate characters in a candidate bar in descending order of matching degree;
(9) the operator selecting the correctly recognized candidate character.
Claims (2)
1. A handwritten character input method based on body-sensing technology, comprising the following steps:
(1) providing a character library containing the stroke trajectory of each character;
characterized in that it further comprises the following steps:
(2) providing a body-sensing device and defining a virtual paper sheet in the sensing space of the device;
(3) identifying the input state of the operator, which comprises the following steps:
S11: the operator's hand extending toward the virtual paper indicates pen-down, i.e., touching the virtual paper;
S12: the operator's hand withdrawing away from the virtual paper indicates pen-up, i.e., leaving the virtual paper;
(4) the operator putting pen down and writing; meanwhile, the position of the operator's hand and its pen-down or pen-up state are shown on a display;
(5) the body-sensing device judging, from the operator's pause, that input of a single character has ended;
(6) after one character has been written, the computer sensing, through the body-sensing device, the trajectory traced by the operator on the virtual paper;
(7) the computer performing feature matching between the trajectory traced by the operator and the trajectory of each character in the pre-stored character library;
(8) listing the candidate characters in a candidate bar in descending order of matching degree;
(9) the operator selecting the correctly recognized candidate character;
whereby the method relies on no fixed input device and no hand-held sensing prop; handwriting can be completed anywhere in the sensing space of the body-sensing device, so writing is free and operation is comfortable.
2. The handwritten character input method based on body-sensing technology according to claim 1, characterized in that the thickness of the virtual paper, the pen-down and pen-up distances, and the pause duration indicating the end of a single character are customized by the operator.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210267577.XA CN102866852B (en) | 2012-07-30 | 2012-07-30 | Handwriting character input method based on body-sensing technology |
PCT/CN2013/071590 WO2014019355A1 (en) | 2012-07-30 | 2013-02-09 | Motion-sensing technology-based handwritten text input method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210267577.XA CN102866852B (en) | 2012-07-30 | 2012-07-30 | Handwriting character input method based on body-sensing technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102866852A CN102866852A (en) | 2013-01-09 |
CN102866852B true CN102866852B (en) | 2014-12-24 |
Family
ID=47445740
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210267577.XA Expired - Fee Related CN102866852B (en) | 2012-07-30 | 2012-07-30 | Handwriting character input method based on body-sensing technology |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN102866852B (en) |
WO (1) | WO2014019355A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102866852B (en) * | 2012-07-30 | 2014-12-24 | Chengdu Xike Technology Co., Ltd. | Handwriting character input method based on body-sensing technology |
CN103226388B (en) * | 2013-04-07 | 2016-05-04 | South China University of Technology | A handwriting scheme based on Kinect |
CN104571475A (en) * | 2013-10-18 | 2015-04-29 | 宁夏先锋软件有限公司 | Virtual character input device based on motion sensing technology |
CN106293368B (en) * | 2015-05-26 | 2020-05-26 | 联想(北京)有限公司 | Data processing method and electronic equipment |
CN108492377A (en) * | 2018-02-13 | 2018-09-04 | 网易(杭州)网络有限公司 | Writing control method and device, mobile terminal in a kind of virtual scene |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101685341A (en) * | 2008-09-28 | 2010-03-31 | TCL Corporation | Remote control with character input function, and character input method and system thereof |
CN102163119A (en) * | 2010-02-23 | 2011-08-24 | ZTE Corporation | Single-hand input method and device |
CN102395941A (en) * | 2011-09-02 | 2012-03-28 | Huawei Device Co., Ltd. | Motion sensing input method, motion sensing device, wireless handheld device and motion sensing system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102866852B (en) * | 2012-07-30 | 2014-12-24 | Chengdu Xike Technology Co., Ltd. | Handwriting character input method based on body-sensing technology |
- 2012-07-30: CN patent CN201210267577.XA granted as CN102866852B, not active (Expired - Fee Related)
- 2013-02-09: WO application PCT/CN2013/071590 published as WO2014019355A1, active (Application Filing)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101685341A (en) * | 2008-09-28 | 2010-03-31 | TCL Corporation | Remote control with character input function, and character input method and system thereof |
CN102163119A (en) * | 2010-02-23 | 2011-08-24 | ZTE Corporation | Single-hand input method and device |
CN102395941A (en) * | 2011-09-02 | 2012-03-28 | Huawei Device Co., Ltd. | Motion sensing input method, motion sensing device, wireless handheld device and motion sensing system |
Also Published As
Publication number | Publication date |
---|---|
WO2014019355A1 (en) | 2014-02-06 |
CN102866852A (en) | 2013-01-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102866852B (en) | Handwriting character input method based on body-sensing technology | |
US10185440B2 (en) | Electronic device operating according to pressure state of touch input and method thereof | |
CN106716317B (en) | Method and apparatus for resolving touch discontinuities | |
CN101551724B (en) | Method and device for writing characters on touch screen | |
EP2650766A1 (en) | Multi-character continuous handwriting input method | |
CN202548800U (en) | Electronic signature device | |
CN103995600B (en) | A kind of braille Chinese character converter and its method | |
KR20200115670A (en) | Input display device, input display method, and program | |
CN104516649A (en) | Intelligent cell phone operating technology based on motion-sensing technology | |
CN108762557A (en) | A kind of touch detecting method and computer readable storage medium | |
CN102063620A (en) | Handwriting identification method, system and terminal | |
CN102646023A (en) | Method for generating original user handwriting fonts | |
CN102903276A (en) | Electronic handwriting trainer | |
CN101339703A (en) | Character calligraph exercising method based on computer | |
CN101859177B (en) | Method and device for calling and operating application program on intelligent electronic device | |
US20200168121A1 (en) | Device for Interpretation of Digital Content for the Visually Impaired | |
CN103268195B | Virtual keyboard realization system based on touch-sensing equipment |
CN104898855B | Text input system and method based on joystick equipment |
CN102854981A | Virtual keyboard character input method based on body-sensing technology |
CN103809909B (en) | A kind of information processing method and electronic equipment | |
CN201548945U (en) | Computer handwriting board capable of displaying handwritings | |
JP6492545B2 (en) | Information processing apparatus, information processing system, and program | |
JP2013084223A (en) | Touch detection device, recording display device, and program | |
Singh et al. | Data capturing process for online Gurmukhi script recognition system | |
CN104516566A (en) | Handwriting input method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C56 | Change in the name or address of the patentee | ||
CP02 | Change in the address of a patent holder |
Address after: 6th Floor, Building 12, Zone C, Tianfu Software Park, No. 219 Tianhua 2nd Road, High-tech Zone, Chengdu, Sichuan 610041. Patentee after: Chengdu Xike Technology Co., Ltd. Address before: Room 214, 2nd Floor, Building 1, Zone A, Tianfu Software Park, No. 765 Tianfu Avenue Middle Section, Chengdu, Sichuan 610041. Patentee before: Chengdu Xike Technology Co., Ltd. |
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20141224 Termination date: 20200730 |