WO2002031753A1 - Character entry apparatus based on character recognition in the video signal - Google Patents

Character entry apparatus based on character recognition in the video signal

Info

Publication number
WO2002031753A1
WO2002031753A1 PCT/TR2001/000054 TR0100054W WO0231753A1 WO 2002031753 A1 WO2002031753 A1 WO 2002031753A1 TR 0100054 W TR0100054 W TR 0100054W WO 0231753 A1 WO0231753 A1 WO 0231753A1
Authority
WO
WIPO (PCT)
Prior art keywords
characters
computer
character
camera
video signal
Prior art date
Application number
PCT/TR2001/000054
Other languages
French (fr)
Inventor
Yasemin Cetin
Volkan Atalay
Original Assignee
Cetin, Ahmet, Enis
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cetin, Ahmet, Enis filed Critical Cetin, Ahmet, Enis
Priority to AU2002214528A priority Critical patent/AU2002214528A1/en
Publication of WO2002031753A1 publication Critical patent/WO2002031753A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/14Image acquisition
    • G06V30/142Image acquisition using hand-held instruments; Constructional details of the instruments
    • G06V30/1423Image acquisition using hand-held instruments; Constructional details of the instruments the instrument generating sequences of position coordinates corresponding to handwriting

Definitions

  • the computer vision based apparatus is invented to enter text, numbers, and characters into computers, personal digital assistants, pagers, computerized systems, and other devices having digital signal processors or microprocessors.
  • the user draws a character using his or her finger, a tip pointer, or a laser pointer.
  • the drawing actions of the user are captured by a camera connected to the computer.
  • the video signal generated by the camera is analyzed using computer vision techniques and the hand-drawn characters are recognized in real time.
  • the computer vision based character entry apparatus needs neither an antenna nor a cable to enter characters, as the communication between the user and the computer is made possible by the camera.
  • the camera connected to the computer or computerized system may generate analog or digital video.
  • a red laser pointer is used for drawing a character on a flat non-reflective surface which is in the viewing area of the camera.
  • the beam of the laser pointer produces a maximum in the red component of the image I0 (if a white light source is used to draw a character then a maximum occurs in the gray-level image component).
  • the detection of the beam of the laser pointer is equivalent to finding the maximum valued pixel(s) of the image.
  • Characters can simply be drawn by a finger or an ordinary pointer on a paper, on a fabric, on a board, or in the air as well.
  • Detection of the tip of the finger can be carried out by estimating the edges of the finger using one of the algorithms described in the article "Finger tracking as an input device for augmented reality," by J. Crowley, F. Berard and J. Coutaz, in Proceedings of the International Workshop on Automatic Face and Gesture Recognition, pp. 195-200, Zurich, Switzerland, 1995.
  • Characters that can be entered into a computer or a computerized system using our apparatus are members of the characters described by 8-bit ASCII or 16-bit Unicode, or the single-stroke alphabets used in personal digital assistants (PDAs).
  • the user can also define other special characters which can be used as bookmarks.
  • the characters that our text entry device recognizes are not the characters of a sign language used by deaf people.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Character Discrimination (AREA)
  • Telephone Function (AREA)

Abstract

The computer vision based apparatus is invented to enter text, numbers, and characters into computers, personal digital assistants, pagers, computerized systems, and other devices having digital signal processors or microprocessors. The user draws a character using his or her finger, a tip pointer, or a laser pointer. The drawing actions of the user are captured by a camera connected to the computer. The video signal generated by the camera is analyzed using computer vision techniques and the hand-drawn characters are recognized in real time. The computer vision based character entry apparatus needs neither an antenna nor a cable to enter characters, as the communication between the user and the computer is made possible by the camera.

Description

DESCRIPTION
Character Entry Apparatus based on Character Recognition in the Video Signal
The computer vision based apparatus is invented to enter text, numbers, and characters into computers, personal digital assistants, pagers, computerized systems, and other devices having digital signal processors or microprocessors. The user draws a character using his or her finger, a tip pointer, or a laser pointer. The drawing actions of the user are captured by a camera connected to the computer. The video signal generated by the camera is analyzed using computer vision techniques and the hand-drawn characters are recognized in real time. The computer vision based character entry apparatus needs neither an antenna nor a cable to enter characters, as the communication between the user and the computer is made possible by the camera.
The camera connected to the computer or computerized system may generate analog or digital video. The computer produces an image sequence from this signal. Let us assume that the user starts writing at t=0, and let the corresponding image be I0. Whenever the user starts drawing a character, the tip of the finger or the pointer, or the beam of the laser pointer, is detected in the image I0. After the detection step, the tip or the beam of the laser pointer is tracked in the consecutive images. Let the location of the tip be (n0, m0) in the image I0, and (n1, m1) in the next image I1. By connecting the pixel locations (ni, mi), an image of the character can be constructed in the computer, and a pattern recognition algorithm can be used to recognize the character drawn by the user.
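The trajectory-to-image step described above can be sketched as follows, assuming the tracked tip locations are already available as a list of (n, m) pixel coordinates. This is a minimal sketch in Python/NumPy; the function name trajectory_to_image, the fixed 64x64 canvas, and the margin are illustrative assumptions and not part of the original disclosure.

import numpy as np

def trajectory_to_image(points, size=64, margin=4):
    """Rasterize a drawn character from tracked tip locations (n_i, m_i).

    points: list of (row, col) pixel coordinates in drawing order.
    Returns a size x size binary image containing the connected trajectory.
    """
    pts = np.asarray(points, dtype=float)
    # Normalize the trajectory into the canvas, keeping a small margin.
    mins, maxs = pts.min(axis=0), pts.max(axis=0)
    span = np.maximum(maxs - mins, 1e-6)
    scaled = (pts - mins) / span * (size - 1 - 2 * margin) + margin
    img = np.zeros((size, size), dtype=np.uint8)
    # Connect consecutive locations with densely sampled straight segments.
    for p, q in zip(scaled[:-1], scaled[1:]):
        steps = int(np.ceil(np.hypot(*(q - p)))) + 1
        for t in np.linspace(0.0, 1.0, steps):
            r, c = np.rint(p + t * (q - p)).astype(int)
            img[r, c] = 1
    return img

The resulting bitmap can then be handed to any standard character recognition method.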
Let us assume that a red laser pointer is used for drawing a character on a flat non-reflective surface which is in the viewing area of the camera. In this case the beam of the laser pointer produces a maximum in the red component of the image I0 (if a white light source is used to draw a character then a maximum occurs in the gray-level image component). The detection of the beam of the laser pointer is equivalent to finding the maximum-valued pixel(s) of the image. Usually the beam smears on the surface and there may be several pixels with the same maximum value. In this case the center of mass of these pixels is assumed to be the location (n0, m0) of the beam. This maximum detection method is repeated for all images in the image sequence generated by the camera. In this way a sequence of pixel coordinates (ni, mi), i=1,2,...,L, where L is the image number when the user finishes drawing, is obtained for each character drawn by the user. As mentioned in the previous paragraph, the above pixel coordinates are connected to form an image of the character. This image can be recognized by any character recognition method described in the book by S. Mori, H. Nishida, H. Yamada, Optical Character Recognition, John Wiley and Sons, NY, 1999.
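The beam-detection step of this paragraph can be sketched as below; this is a minimal sketch assuming the video frame is available as an RGB NumPy array, with the function name detect_beam chosen for illustration only.

import numpy as np

def detect_beam(frame_rgb):
    """Locate the laser spot in one video frame.

    frame_rgb: H x W x 3 array. For a red laser pointer the spot shows up
    as the maximum of the red channel; for a white light source the same
    search would be run on the gray-level frame instead.
    Returns the (row, col) center of mass of the maximum-valued pixels.
    """
    red = frame_rgb[:, :, 0].astype(float)
    # The beam usually smears over several pixels sharing the maximum value.
    rows, cols = np.nonzero(red == red.max())
    # Take the center of mass of those pixels as the beam location (n, m).
    return rows.mean(), cols.mean()

Running detect_beam on every frame of the sequence yields the pixel coordinates (ni, mi), i=1,2,...,L, that are then connected into a character image as described above.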
Characters can simply be drawn by a finger or an ordinary pointer on a paper, on a fabric, on a board, or in the air as well. The above algorithm can still be used to recognize the characters drawn by the user as long as the tip of the finger or the pointer is detected and tracked to generate the pixel coordinates (ni, mi), i=1,2,...,L, forming the character. Detection of the tip of the finger can be carried out by estimating the edges of the finger using one of the algorithms described in the article "Finger tracking as an input device for augmented reality," by J. Crowley, F. Berard and J. Coutaz, in Proceedings of the International Workshop on Automatic Face and Gesture Recognition, pp. 195-200, Zurich, Switzerland, 1995. Characters that can be entered into a computer or a computerized system using our apparatus are members of the characters described by 8-bit ASCII or 16-bit Unicode, or the single-stroke alphabets used in personal digital assistants (PDAs). The user can also define other special characters which can be used as bookmarks. The characters that our text entry device recognizes are not the characters of a sign language used by deaf people.
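For completeness, a much-simplified fingertip locator is sketched below: it computes an edge map of the frame and takes the topmost edge pixel as the tip. This stands in for, but does not reproduce, the edge-based tracking algorithm of Crowley et al. cited above; the function name, the Canny thresholds, and the "topmost edge pixel" heuristic are all assumptions made for illustration.

import numpy as np
import cv2

def detect_fingertip(frame_bgr, low=50, high=150):
    """Crude fingertip locator: topmost edge pixel in the frame.

    frame_bgr: H x W x 3 BGR frame as delivered by an OpenCV capture.
    Returns the (row, col) of the candidate tip, or None if no edges are found.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, low, high)   # binary edge map of the hand and pointer
    rows, cols = np.nonzero(edges)
    if rows.size == 0:
        return None
    top = rows.argmin()                  # assume the finger points upward in the frame
    return int(rows[top]), int(cols[top])

In practice the search would be restricted to a region of interest around the previously tracked tip location, but the per-frame output is again a coordinate sequence that feeds the same trajectory-to-image and recognition steps.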

Claims

CLAIMS
Character Entry Apparatus based on Character Recognition in the Video Signal
1. This apparatus, which recognizes characters drawn by hand on a flat surface or in the air, is a character entry device for a computer, a computerized system, a personal digital assistant, or a mobile communications device. The communication with the computer, computerized system, personal digital assistant, or mobile communications device is carried out via a camera connected to the computer. The video signal containing the drawing actions of the user is analyzed in real time and the characters are recognized.
2. The apparatus of claim 1 wherein the characters can be drawn within the viewing range of the camera, on a flat surface or in the air, by a pointer.
3. The apparatus of claim 1 wherein the pointer used in drawing the characters can be a finger, a pen, a pencil, or a light source such as a torch or a laser pointer.
4. The apparatus of claim 1 wherein the characters are the members of any alphabet or the single stroke alphabets used in personal digital assistants.
PCT/TR2001/000054 2000-10-10 2001-10-05 Character entry apparatus based on character recognition in the video signal WO2002031753A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002214528A AU2002214528A1 (en) 2000-10-10 2001-10-05 Character entry apparatus based on character recognition in the video signal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR2000/02943 2000-10-10
TR2000/02943A TR200002943A2 (en) 2000-10-10 2000-10-10 Data input device based on character recognition from video signal

Publications (1)

Publication Number Publication Date
WO2002031753A1 true WO2002031753A1 (en) 2002-04-18

Family

ID=21622727

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2001/000054 WO2002031753A1 (en) 2000-10-10 2001-10-05 Character entry apparatus based on character recognition in the video signal

Country Status (3)

Country Link
AU (1) AU2002214528A1 (en)
TR (1) TR200002943A2 (en)
WO (1) WO2002031753A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5945981A (en) * 1993-11-17 1999-08-31 Microsoft Corporation Wireless input device, for use with a computer, employing a movable light-emitting element and a stationary light-receiving element
US6044165A (en) * 1995-06-15 2000-03-28 California Institute Of Technology Apparatus and method for tracking handwriting from visual input
WO1999039302A1 (en) * 1998-01-28 1999-08-05 California Institute Of Technology Camera-based handwriting tracking
EP1087327A2 (en) * 1999-09-21 2001-03-28 Seiko Epson Corporation Interactive display presentation system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102012021220A1 (en) 2012-10-27 2014-04-30 Volkswagen Aktiengesellschaft Operating arrangement for detection of gestures in motor vehicle, has gesture detection sensor for detecting gestures and for passing on gesture signals, and processing unit for processing gesture signals and for outputting result signals

Also Published As

Publication number Publication date
AU2002214528A1 (en) 2002-04-22
TR200002943A2 (en) 2002-05-21

Similar Documents

Publication Publication Date Title
US20170032493A1 (en) Image display apparatus and image display method
EP1617351A3 (en) Character recognition method, method of processing correction history of character data, and character recognition system
US5737443A (en) Method of joining handwritten input
EP1374148B1 (en) Method and device for recognition of a handwritten pattern
CN106774850B (en) Mobile terminal and interaction control method thereof
US20120287070A1 (en) Method and apparatus for notification of input environment
KR20020052217A (en) Electronics device applying an image sensor
US20010026262A1 (en) Apparatus and system for reproduction of handwritten input
US20210158031A1 (en) Gesture Recognition Method, and Electronic Device and Storage Medium
CN110378318B (en) Character recognition method and device, computer equipment and storage medium
KR20050049338A (en) A video based handwriting recognition system and method
US20210182546A1 (en) Display device, display method, and computer-readable recording medium
JP2017090998A (en) Character recognizing program, and character recognizing device
CN113610809A (en) Fracture detection method, fracture detection device, electronic device, and storage medium
CN114140839B (en) Image transmission method, device, equipment and storage medium for face recognition
KR20060096208A (en) Method for displaying script in character recognition system
Liu et al. Mobile Retriever: access to digital documents from their physical source
WO2002031753A1 (en) Character entry apparatus based on character recognition in the video signal
CN111213157A (en) Express information input method and system based on intelligent terminal
Puranik et al. AirNote–Pen it Down!
KR102396885B1 (en) Method for alligning the image include the text
Patil et al. Indian sign language recognition
EP0377129A3 (en) Fast spatial segmenter for handwritten characters
US20210294965A1 (en) Display device, display method, and computer-readable recording medium
JPH0883319A (en) Pattern recognizing device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP