SG184583A1 - A device for facilitating efficient learning and a processing method in association thereto - Google Patents

A device for facilitating efficient learning and a processing method in association thereto

Info

Publication number
SG184583A1
Authority
SG
Singapore
Prior art keywords
data
graphic
input
audio
electronic device
Prior art date
Application number
SG2011016664A
Inventor
Wong Hoo Sim
Kin Fui Chong
Xin Yi Wong
Original Assignee
Creative Tech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Creative Tech Ltd filed Critical Creative Tech Ltd
Priority to SG2011016664A priority Critical patent/SG184583A1/en
Priority to SG2011018017A priority patent/SG184589A1/en
Priority to PCT/SG2012/000047 priority patent/WO2012121666A1/en
Priority to CN2012800123937A priority patent/CN103415826A/en
Priority to CN2012800119293A priority patent/CN103403780A/en
Priority to PCT/SG2012/000072 priority patent/WO2012121671A1/en
Priority to TW101107625A priority patent/TWI601101B/en
Publication of SG184583A1 publication Critical patent/SG184583A1/en

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/26 Speech to text systems
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B11/00 Teaching hand-writing, shorthand, drawing, or painting
    • G09B11/04 Guide sheets or plates; Tracing charts
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/06 Foreign languages

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Electrically Operated Instructional Devices (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

A DEVICE FOR FACILITATING EFFICIENT LEARNING AND A PROCESSING METHOD IN ASSOCIATION THERETO: An electronic device which can include an input portion, a processing portion and a display portion. Input signals can be communicated from the input portion. The processing portion can be coupled to the input portion such that input signals are receivable therefrom and processed in a manner so as to produce output signals. The display portion can be coupled to the processing portion such that output signals are receivable therefrom and displayable at the display portion as display data. The input signals can correspond to at least one of graphic data, control data and audio data. The output signals can correspond to at least one of indicator data and character data. The display data can correspond to at least one of character data and indicator data. Figure 1a

Description

A DEVICE FOR FACILITATING EFFICIENT LEARNING AND A PROCESSING METHOD IN ASSOCIATION THERETO
Field Of Invention
The present disclosure generally relates to data processing. More particularly, various embodiments of the disclosure relate to a device and a processing method for processing data in a manner so as to facilitate efficient learning.
Background
Learners of a language, such as the English language or the Chinese language, can learn the language by, for example, learning to write or pronounce a letter of the English alphabet or a character of the Chinese language. A letter or a character of a language can generally be formed by one or more basic strokes. A letter or a character of a language can also be generally associated with a pronunciation.
Conventional techniques to facilitate a learner’s learning a language include requiring the learner to emulate a letter or a character of the language by reproducing strokes associated with the letter or character on writing materials such as paper.
Appreciably, a learner who is beginning to learn a language may not be able to accurately emulate a letter or character of the language at the first instance. Hence, the learner may be required to emulate the letter or character on a writing material in a repetitive manner. As such, depending on the learner’s learning progress, more writing materials may be required.
Furthermore, a person familiar with the language which the learner is learning may be required to provide feedback based on the letter or character emulated, on the writing material, by the learner. Thus a person familiar with the language may be required to monitor the learner’s learning progress.
Conventional techniques to facilitate a learner's learning a language further include requiring the learner to pronounce a letter or a character of the language. Appreciably, a person familiar with the language may similarly be required to provide feedback based on the letter or character pronounced.
As such, conventional techniques may not be capable of facilitating a learner’s learning a language in a suitably efficient manner.
It is therefore desirable to provide a solution to address at least one of the foregoing problems of conventional techniques.
Summary of the Invention
In accordance with a first aspect of the disclosure, an electronic device is provided. The electronic device includes an input portion, a processing portion and a display portion. The processing portion can be coupled to the input portion. The display portion can be coupled to the processing portion.
Input signals can be communicated from the input portion. The input portion can have at least one of a graphic input segment for communicating graphic data, a control input segment for communicating control data and an audio input segment for communicating audio data. The input signals can correspond to at least one of graphic data, control data and audio data.
The processing portion can be coupled to the input portion such that input signals are receivable therefrom and processed in a manner so as to produce output signals. The output signals can correspond to at least one of indicator data and character data. The indicator data can be based on at least one of graphic data and audio data. The character data can be based on control data.
The display portion can be coupled to the processing portion such that output signals are receivable therefrom and displayable at the display portion as display data. The display data can correspond to at least one of character data and indicator data.
In accordance with a second aspect of the disclosure, a processing method is provided. The processing method includes providing control input for communicating control data. The processing method further includes displaying graphic symbol based on the control data communicated. The processing method yet further includes receiving user input based on the displayed graphic symbol such that input signals are communicable. The input signals can correspond to at least one of graphic data and audio data.
Moreover, the processing method includes determining performance characteristics based on the input signals communicated, wherein indicator data indicative of various performance parameters is produced. Furthermore, the processing method includes displaying performance result, wherein indicator data is displayable as display data.
In accordance with a third aspect of the disclosure, an electronic device is provided. The electronic device includes an audio module, a screen and a processor. The audio module can be operated to detect and receive audio signals upon activation.
The screen can be configured to display a graphics user interface (GUI). The GUI can include a first interface portion associable with a graphic input segment for communicating graphic data. The GUI can further include a second interface portion associable with a control input segment for communicating control data. The GUI can yet further include a third interface portion associable with an audio input segment for communicating an activation signal to the audio module in a manner so as to activate the audio module. The audio module, when activated, can be operated to detect and receive audio signals in a manner so as to produce audio data.
The processor can be configured to receive and process communicated control data in a manner so as to produce character data corresponding to a graphic symbol.
Based on the graphic symbol, at least one of graphic data and audio data can be communicated to the processor for processing in a manner so as to produce indicator data which is indicative of various performance parameters. The indicator data and graphic symbol can be displayed as display data on the screen. :
Brief Description of the Drawings
Embodiments of the disclosure are described hereinafter with reference to the following drawings, in which:
Fig. 1a shows a system which includes an input portion, a processing portion and a display portion, according to an embodiment of the disclosure;
Fig. 1b shows the system of Fig. 1a in further detail;
Fig. 2a to Fig. 2d show an exemplary application of the system of Fig. 1a which can be in the form of an electronic device 200;
Fig. 2e shows an exemplary implementation of the electronic device of Fig. 2a;
Fig. 3a to Fig. 3c illustrate, in accordance with an embodiment of the disclosure, numerical representation and graphical representation of indicator data which can be displayed at the display portion of the system of Fig. 1a;
Fig. 4a shows a first play mode of the electronic device of Fig. 2a;
Fig. 4b and Fig. 4c show a second play mode of the electronic device of Fig. 2a; and
Fig. 5 is a flow diagram illustrating a processing method which can be implemented in association with the system of Fig. 1a, in accordance with another embodiment of the disclosure.
Detailed Description
Representative embodiments of the disclosure, for addressing one or more of the foregoing problems associated with conventional techniques, are described hereinafter with reference to Fig. 1 to Fig. 5.
A system 100, in accordance with an embodiment of the disclosure, is shown in Fig. 1a. The system 100 includes an input portion 110, a processing portion 120 and a display portion 130. The system 100 can further include a communication portion 140. The input portion 110 is coupled to the processing portion 120. The processing portion 120 is coupled to the display portion 130. The communication portion 140 can be coupled to the processing portion 120.
The input portion 110 can be configured to data communicate with the processing portion 120 in a manner such that input signals can be communicated from the input portion 110 to the processing portion 120.
The processing portion 120 can be configured to receive and process the input signals in a manner so as to produce output signals.
The display portion 130 can be a screen. More specifically, the display portion 130 can either be a touch sensitive type screen or a non-touch sensitive type screen. Based on the output signals, display data can be displayed at the display portion 130.
The communication portion 140 can, for example, be a transceiver configured for one or both of wired communication and wireless communication over a communication medium or communication network.
Referring to Fig. 1b, the input portion 110 includes one or both of a graphic input segment 110a and a control input segment 110b. The input portion 110 can further include an audio input segment 110c.
The graphic input segment 110a can be configured to communicate graphic data to the processing portion 120. The control input segment 110b can be configured to communicate control data to the processing portion 120. The audio input segment 110c can be configured to communicate audio data to the processing portion 120. In this regard, the input signals can include any one of graphic data, audio data and control data, or any combination thereof.
The input portion 110 can be one or both of a software based implementation and a hardware based implementation, as will be further discussed with reference to Fig. 2.
The processing portion 120 can include one or both of a processor 120a and an audio processing module 120b.
In one embodiment, the processor 120a and the audio processing module 120b can be configured to receive and process the graphic data and the audio data respectively to produce indicator data.
In another embodiment, the processor 120a can be configured to receive and process the graphic data and the audio data to produce indicator data. In yet another embodiment, the processor 120a can be configured to receive and process the graphic data to produce replication data. The replication data can be based on the graphic data.
The processor 120a can be further configured to receive and process control data such that character data can be displayed at the display portion 130. Additionally, based on indicator data, the processor 120a can be configured to communicate consolidated data. Therefore the output signals can include one or both of indicator data and character data. Moreover, the output signals can further include consolidated data.
As mentioned earlier, based on the output signals, display data can be displayed at the display portion 130. In this regard, the display data can correspond to any one of the indicator data, character data, consolidated data and replication data, or any combination thereof.
The system 100 will be described in further detail hereinafter with reference to an exemplary application which can be in the form of a device such as an electronic device 200 as shown in Fig. 2a to Fig. 2e. The electronic device 200 can, for example, be an electronic tablet which can be used by a user (not shown). Additionally, the electronic device 200 can be powered up and powered down via conventional power on/off arrangements which will not be further discussed for the purposes of brevity.
Fig. 2a provides a general overview of the electronic device 200. Fig. 2b provides an isometric view of the electronic device 200 of Fig. 2a. Additionally, the input portion 110 can be implemented and displayed by the electronic device 200 in one or more exemplary manners which will be further discussed with reference to Fig. 2c and Fig. 2d. Furthermore, the electronic device 200 will be discussed in further detail with reference to an exemplary implementation 202 with respect to Fig. 2e.
Referring to Fig. 2a and Fig. 2b, the electronic device 200 includes a casing 205, a screen 210 and a microprocessor 220. The electronic device 200 can optionally include an audio module 230, a transceiver 240 and an input module 245. The casing 205 can be dimensioned to house the screen 210 such that the screen 210 is visible to the user. Additionally, the casing 205 can be dimensioned to house the microprocessor 220, the audio module 230, the transceiver 240 and the input module 245. For example, as shown in Fig. 2b, the microprocessor 220 can be housed within the casing 205 and is not accessible to the user whereas the audio module 230, the transceiver 240 and the input module 245 can be accessible to the user.
"The screen 210 can be coupled to the microprocessor 220 for signal communication therebetween. ’ Additionally, the microprocessor 220 can be coupled to each of the audio module 230, the transceiver 240 and the input module 245.
The screen 210 can be either a touch sensitive type screen such as a touch screen or a non-touch sensitive type screen. The audio module 230 can, for example, be a microphone which can be configured to detect audio signals from a user. Based on the audio signals detected, the audio module 230 can be configured to communicate audio data to the microprocessor 220. The input module 245 can include, for example, a touchpad, buttons or switches for one or both of communicating graphic data and control data.
The screen 210 can correspond to the aforementioned display portion 130. The screen 210 can further correspond to the input portion 110 or a part of the input portion 110, depending on whether the input portion 110 is software implemented, hardware implemented or both hardware and software implemented.
Additionally, the microprocessor 220 can correspond to the aforementioned processing portion 120. Particularly, the microprocessor 220 can correspond to either one or both of the processor 120a and the audio processing module 120b.
Furthermore, where included, the audio module 230 and the input module 245, as with the screen 210, can correspond to the input portion 110 or a part of the input portion 110, depending on whether the input portion 110 is software implemented, hardware implemented or both hardware and software implemented.
As mentioned earlier, the input portion 110 can be one or both of software implemented and hardware implemented.
In one embodiment, as shown in Fig. 2c, the input portion 110 can be software implemented and the screen 210 can be a touch sensitive type screen. The input portion 110 can, for example, be software implemented in the form of a graphics user interface (GUI) 250 which is displayable on the screen 210. The GUI 250 can have a first interface portion 250a corresponding to the graphic input segment 110a and a second interface portion 250b corresponding to the control input segment 110b. In this regard, the screen 210 can include a first touch screen area and a second touch screen area corresponding to the first and second interface portions 250a/250b respectively. The first and second interface portions 250a/250b can thus be used to communicate graphic data and control data respectively.
In another embodiment, further shown in Fig. 2c, the input portion 110 can be software and hardware implemented, and the screen 210 can be a touch sensitive type screen. The input portion 110 can, for example, be software implemented in the form of the aforementioned GUI 250.
The input portion 110 can, for example, be hardware implemented using the audio module 230.
In this regard, the GUI 250 can, in addition to the first and second interface portions 250a/250b, further include a third interface portion 250c corresponding to the audio input segment 110c.
Thus, the screen 210 can further include a third touch screen area corresponding to the third interface portion 250c.
In this regard, the first and second interface portions 250a/250b can be used to communicate graphic data and control data respectively. Additionally, the third interface portion 250c can be used to communicate an activation signal to the audio module 230. Upon detection of the activation signal, the audio module 230 can be operated to detect and receive audio signals from the user.
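By way of illustration, touches on a touch sensitive type screen might be routed to the three interface portions as sketched below. This is a minimal sketch in Python: the region coordinates, names and dispatch logic are assumptions made for illustration only, not details taken from the disclosure.

```python
# Hypothetical sketch: routing touches on the screen 210 to the three
# interface portions of the GUI 250. Region layout is an assumption.

from dataclasses import dataclass


@dataclass
class Region:
    name: str   # "graphic" (250a), "control" (250b) or "audio" (250c)
    x: int      # top-left corner of the touch screen area
    y: int
    w: int      # width of the touch screen area
    h: int      # height of the touch screen area

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h


REGIONS = [
    Region("graphic", 0, 0, 480, 600),    # first interface portion 250a
    Region("control", 480, 0, 160, 300),  # second interface portion 250b
    Region("audio", 480, 300, 160, 300),  # third interface portion 250c
]


def dispatch_touch(px: int, py: int) -> str:
    """Name the input segment a touch at (px, py) belongs to."""
    for region in REGIONS:
        if region.contains(px, py):
            return region.name
    return "none"


print(dispatch_touch(100, 100))  # -> graphic
print(dispatch_touch(500, 400))  # -> audio
```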
In yet another embodiment, as shown in Fig. 2d, the input portion 110 can be hardware implemented. The input portion 110 can, for example, be hardware implemented using one or both of the audio module 230 and the input module 245. In this regard, the aforementioned GUI 250 having the first, second and third interface portions 250a/250b/250c can be displayed on the screen 210. However, it may not be necessary for the screen 210 to be a touch sensitive type screen.
In this regard, the input module 245, which can, for example, be a touchpad, can be used to communicate one or both of graphic data and control data. For example, a user can contact the touchpad, which can be capable of translating the motion and position of the user's contact to a relative position on the screen 210 corresponding to any of the first, second and third interface portions 250a/250b/250c. Thus, any of the graphic data, control data and audio data can similarly be communicated as discussed above.
Furthermore, the above mentioned input portion 110 can be configured such that graphic data can be communicated in an ergonomic manner. Specifically, the input portion 110 can be configured such that graphic data can be ergonomically communicated by either a user who is a left hander or a user who is a right hander. In one example, where the user is a left hander, the input portion 110 can be configured such that the first interface portion 250a can be located at the left side of the screen 210 relative to the user facing the screen 210. In another example, where the user is a right hander, the input portion 110 can be configured such that the first interface portion 250a can be located at the right side of the screen 210 relative to the user facing the screen 210.
Referring to Fig. 2e, the aforementioned exemplary implementation 202 can be such that the input portion 110 is software and hardware implemented, and the screen 210 can be a touch sensitive type screen, as discussed earlier. In the exemplary implementation 202, the electronic device 200 can be configured to operate in one or more of a plurality of operation modes. The plurality of operation modes can include a graphic recognition mode and a voice recognition mode.
Thus, the electronic device 200 can be configured to operate in one or both of the graphic recognition mode and the voice recognition mode.
As mentioned earlier, the first, second and third interface portions 250a/250b/250c can correspond to the graphic input segment 110a, the control input segment 110b and the audio input segment 110c respectively.
Thus control data can be communicated via the second interface portion 250b to the microprocessor 220 so as to control the operation mode of the electronic device 200. In one example, control data can be communicated via the second interface portion 250b such that the electronic device 200 operates in the graphic recognition mode. In another example, control data can be communicated via the second interface portion 250b such that the electronic device 200 operates in the voice recognition mode. In yet another example, control data can be communicated via the second interface portion 250b such that the electronic device 200 operates in both the graphic and voice recognition modes.
In either one or both of the graphic recognition mode and the voice recognition mode, the microprocessor 220 can, upon receipt of control data, be configured to communicate output signals such that display data corresponding to character data can be displayed at the screen 210.
The character data can correspond to a graphic symbol 260 which can be displayed on the screen 210 as display data. The graphic symbol 260 can, in one example, be a character such as a letter of the English alphabet. For example, the graphic symbol 260 can correspond to the letter “A”.
The graphic symbol 260 can, in another example, correspond to a character in another language such as Mandarin. As shown, the graphic symbol 260 can correspond to the character “X” in
Mandarin. In yet another example, the graphic symbol 260 can correspond to more than one character. More specifically, the graphic symbol 260 can correspond to a string of characters.
When in the graphic recognition mode, a user can emulate the displayed graphic symbol 260 by providing appropriate basic strokes 270 via the first interface portion 250a. In this regard, the basic strokes 270 provided are communicated as graphic data from the graphic input segment 110a to the microprocessor 220, which can be configured to determine whether or not the appropriate strokes have been provided.
For example, where the graphic symbol 260 displayed on the screen 210 is the character “X”, a user is required to provide a first stroke 270a, a second stroke 270b, a third stroke 270c and a fourth stroke 270d in order to emulate the displayed graphic symbol 260. The first to fourth strokes 270a/270b/270c/270d can be communicated as graphic data from the graphic input segment 110a to the microprocessor 220.
Based on the graphic data communicated from the graphic input segment 110a, the microprocessor 220 can be configured to produce indicator data. More specifically, the microprocessor 220 can be configured to process the graphic data in a manner so as to produce indicator data. The indicator data can be indicative of various performance parameters. The performance parameters can include parameters such as accuracy in emulation of the graphic symbol 260, speed at which the graphic symbol 260 is accurately emulated and correctness in the order of strokes provided. Other parameters such as penmanship can also be included.
In one example, the microprocessor 220 can be configured to determine whether or not the appropriate strokes have been provided based on whether or not the graphic symbol 260 displayed on the screen is emulated accurately. Where the microprocessor 220 makes a determination that the graphic symbol 260 has been emulated accurately, a positive indication signal can be communicated from the microprocessor 220. Otherwise, a negative indication signal can be communicated. Thus the indicator data can correspond to either the positive indication signal or the negative indication signal.
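One way such an accuracy determination might be made is sketched below (Python, for illustration only; representing strokes as point sequences, resampling both strokes to an equal number of points, and the numeric tolerance are all assumptions, as the disclosure does not fix a particular algorithm).

```python
# Hypothetical sketch: judging accuracy of an emulated stroke against a
# stored template. The distance measure and tolerance are assumptions.

import math


def stroke_error(drawn, template):
    """Mean distance between corresponding sample points, assuming both
    strokes were resampled to the same number of points."""
    return sum(math.dist(a, b) for a, b in zip(drawn, template)) / len(template)


def indication(drawn, template, tolerance=10.0):
    """Positive indication signal if the emulation is close enough."""
    return "positive" if stroke_error(drawn, template) <= tolerance else "negative"


template = [(0.0, 0.0), (10.0, 10.0), (20.0, 20.0)]
print(indication([(1.0, 0.0), (11.0, 9.0), (20.0, 22.0)], template))   # positive
print(indication([(30.0, 0.0), (41.0, 9.0), (50.0, 22.0)], template))  # negative
```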
In another example, the microprocessor 220 can be configured to determine whether or not the appropriate basic strokes 270 have been provided based on whether or not the order of strokes is provided correctly. For example, in the case of the character “X”, for the order of strokes to be provided correctly, the first to fourth strokes 270a/270b/270c/270d should be provided in a sequential order starting with the first stroke 270a and ending with the fourth stroke 270d. If the first to fourth strokes 270a/270b/270c/270d are provided in the sequential order as described above, the microprocessor 220 can be configured to communicate a positive indication signal. Otherwise, a negative indication signal can be communicated. In this manner, the microprocessor 220 can be capable of determining whether or not the appropriate strokes have been provided, based on the order of strokes provided.
Moreover, each stroke of the basic strokes 270 can be associated with a stroke direction. As shown in Fig. 2e, each of the first to fourth strokes 270a/270b/270c/270d is illustrated via an arrow which is indicative of the correct stroke direction. For example, with regard to the second stroke 270b, in order to provide a correct stroke direction, the second stroke 270b should be provided starting with a start point 270e and ending with an end point 270f. Thus, in addition to determining whether or not the appropriate basic strokes 270 have been provided based on whether or not the order of strokes is provided correctly, it can also be useful to determine whether or not the stroke direction of each stroke of the basic strokes 270 is provided correctly.
In yet another example, the microprocessor 220 can be configured to determine penmanship based on the graphic data. The microprocessor 220 can, for example, be configured to determine penmanship based on any of the aforementioned accuracy in emulation, correctness in the order of strokes provided and correctness of stroke direction, or any combination thereof. In one exemplary scenario, a positive indication signal based on each of accuracy in emulation, correctness in the order of strokes provided and correctness of stroke direction can be indicative of good penmanship. In another exemplary scenario, a positive indication signal and a negative indication signal based, respectively, on accuracy in emulation and correctness in the order of strokes provided can be indicative of poor penmanship.
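The penmanship determination can thus be viewed as an aggregation over the individual indications. A minimal sketch follows, assuming a simple all-or-nothing combination rule (the rule and the field names are assumptions; the disclosure leaves the exact combination open).

```python
# Hypothetical sketch: combining the individual indications (accuracy,
# stroke order, stroke direction) into indicator data with a simple
# penmanship judgment. The all-or-nothing rule is an assumption.

from dataclasses import dataclass


@dataclass
class IndicatorData:
    accurate: bool           # accuracy in emulation
    order_correct: bool      # correctness in order of strokes provided
    direction_correct: bool  # correctness of stroke direction

    @property
    def penmanship(self) -> str:
        checks = (self.accurate, self.order_correct, self.direction_correct)
        return "good" if all(checks) else "poor"


print(IndicatorData(True, True, True).penmanship)   # good
print(IndicatorData(True, False, True).penmanship)  # poor
```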
In yet a further example, where the graphic symbol 260 displayed on the screen is not emulated in an exact manner (i.e., emulated partially accurately), a user can communicate control data via the second interface portion 250b such that the microprocessor 220 makes a determination that the graphic symbol 260 has been emulated partially accurately. In this situation, a partial positive indication signal can be communicated from the microprocessor 220. Thus, in addition to the aforementioned positive and negative indication signals, the indicator data can further correspond to a partial positive indication signal.
Moreover, as mentioned earlier, when in the graphic recognition mode, a user can emulate the displayed graphic symbol 260 by providing appropriate basic strokes 270 via the first interface portion 250a. The basic strokes 270 provided are communicated as graphic data from the graphic input segment 110a to the microprocessor 220. In this regard, it is appreciable that graphic data communicated from the graphic input segment 110a can be associated with a writing style unique to a user. Thus graphic data can be associated with a font unique to a user.
As also mentioned earlier, the processor 120a can be configured to receive and process the graphic data to produce replication data. The replication data can be based on the graphic data. Thus replication data can, substantially, be a replication of the font of the graphic data, the font being unique to a user.
As the display data can correspond to replication data, graphic data, and the font in association thereto, can be displayed at a portion of the screen 210.
Moreover, the electronic device 200 can be configured to operate with another electronic device such as a computer having a screen (not shown). Specifically, the electronic device 200 can be configured to communicate with the computer via one or both of wired communication and wireless communication. In this regard, the display data can, optionally, be communicated from the electronic device 200 to the computer such that replication data can be displayed at the screen of the computer.
When in the voice recognition mode, a user can provide audio signals corresponding to the pronunciation of the displayed graphic symbol 260. The audio signals are detected and processed by the audio module 230 in a manner so as to communicate audio data to the microprocessor 220.
Based on the audio data communicated from the audio module 230, the microprocessor 220 can be configured to produce indicator data. As mentioned earlier in relation to graphic data, the indicator data can be indicative of various performance parameters. In the case of audio data, the performance parameters can be based on parameters such as accuracy in pronunciation of the displayed graphic symbol 260 and speed at which the displayed graphic symbol 260 is accurately pronounced. In this regard, the foregoing discussion of indicator data based on graphic data analogously applies.
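A minimal sketch of this voice recognition path is given below (Python, for illustration only; recognize() is a stand-in for a real speech recognizer rather than any actual API, and the exact-match comparison is a simplifying assumption).

```python
# Hypothetical sketch of producing indicator data in the voice
# recognition mode. recognize() is a stand-in, not a real API, and
# exact string matching is a simplifying assumption.

def recognize(audio_data: bytes) -> str:
    """Stand-in: a real implementation would run speech recognition."""
    return "ay"


def pronunciation_indicator(audio_data: bytes, expected: str,
                            seconds: float) -> dict:
    heard = recognize(audio_data)
    return {
        "accurate": heard == expected,  # accuracy in pronunciation
        "seconds": seconds,             # speed of the attempt
    }


print(pronunciation_indicator(b"\x00\x01", "ay", 1.2))
# {'accurate': True, 'seconds': 1.2}
```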
As described above, the indicator data produced can be based on a displayed graphic symbol 260 at a first instance. Appreciably, a plurality of sets of indicator data can be produced corresponding to a plurality of instances.
For example, when a user emulates a displayed graphic symbol 260 at a first instance, a first set of indicator data can be produced. When the user emulates another displayed graphic symbol 260 at a second instance, a second set of indicator data can be produced.
Thus based on the plurality of sets of indicator data, the microprocessor 220 can be configured to produce the earlier mentioned consolidated data. In this regard, the microprocessor 220 can be configured to tabulate the plurality of sets of indicator data to produce consolidated data.
Appreciably, by tabulating the plurality of sets of indicator data, a user of the electronic device 200 can conveniently keep track of learning progress. The plurality of sets of indicator data tabulated by the microprocessor 220 can be displayed as display data at the screen 210 in the form of, for example, pie charts, bar charts or numerical statistics.
In this regard, the indicator data and the consolidated data can be one or both of numerically represented and graphically represented. For illustrative purposes, numerical representation and graphical representation of the indicator data will be discussed in further detail with reference to Fig. 3.
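Such tabulation might look like the following minimal sketch (Python, for illustration only; the per-instance fields and the consolidated quantities are assumptions).

```python
# Hypothetical sketch: tabulating a plurality of sets of indicator data
# into consolidated data for tracking learning progress.

def consolidate(sessions: list[dict]) -> dict:
    """Each dict holds the indicator data produced at one instance."""
    total = len(sessions)
    accurate = sum(1 for s in sessions if s["accurate"])
    return {
        "instances": total,
        "accuracy_rate": accurate / total if total else 0.0,
        "good_penmanship": sum(1 for s in sessions if s["penmanship"] == "good"),
    }


history = [
    {"accurate": True, "penmanship": "good"},
    {"accurate": False, "penmanship": "poor"},
    {"accurate": True, "penmanship": "good"},
]
print(consolidate(history))
# {'instances': 3, 'accuracy_rate': 0.666..., 'good_penmanship': 2}
```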
Furthermore, in either of the graphic recognition mode and the voice recognition mode, the electronic device 200 is operable in one of a plurality of user preference modes. The plurality of user preference modes can include a beginner mode, an intermediate mode and an expert mode. The plurality of user preference modes can further include a training mode, a first play mode and a second play mode. Control data can be communicated via the second interface portion 250b to the microprocessor 220 such that the electronic device 200 can be operated in one of the plurality of user preference modes.
In one example, in the beginner mode, a user may only be required to emulate a simple character associated with four or fewer strokes such as the aforementioned character “X”. In the intermediate mode, the user may be required to emulate a relatively more complicated character associated with more than four strokes. In the expert mode, the user may be required to emulate a yet more complicated character associated with yet more strokes as compared with a character in the intermediate mode.
In another example, a user may be required to emulate a single character in the beginner mode.
In the intermediate mode, the user may be required to emulate a string of characters. In the expert mode, the user may be required to emulate a longer string of characters as compared to the string of characters in the intermediate mode.
In yet another example, in the training mode, the electronic device 200 can be configured to display indicator data at the screen 210, in a manner as will be discussed in further detail with reference to Fig. 3, so as to provide the user with an indication indicative of learning progress.
In yet a further example, the electronic device 200 can be operated in one of the first and second play modes, as will be discussed in further detail with reference to Fig. 4.
Referring to Fig. 3a, the electronic device 200 can be configured to display a score segment 310 at the screen 210. As mentioned earlier, the indicator data can be one or both of numerically represented and graphically represented. In particular, the indicator data can be one or both of numerically and graphically represented via the score segment 310, as will be further illustrated in Fig. 3b and Fig. 3c respectively. Furthermore, indicator data can be one or both of numerically and graphically represented via a first representation scheme and a second representation scheme.
In the first representation scheme, in one example, indicator data can be one or both of numerically and graphically represented in a manner such that for every accurately emulated or pronounced displayed graphic symbol 260, a score can be awarded. In another example, indicator data can be one or both of numerically and graphically represented in a manner such that for every partially accurately emulated or pronounced displayed graphic symbol 260, another score can be awarded. Thus, the first representation scheme can be a scoring scheme in which a score can be awarded for every accurately emulated or pronounced displayed graphic symbol 260. The first representation scheme can also be a scoring scheme in which another score can be awarded for every partially accurately emulated or pronounced displayed graphic symbol 260.
In the second representation scheme, indicator data can be one or both of numerically and graphically represented in a progressive manner. Specifically, the indicator data can be one or both of numerically and graphically represented such that user learning progress can be indicated.
For example, in the training mode where a user is required to emulate a displayed graphic symbol 260 corresponding to the character “X”, for every stroke of the basic strokes 270 accurately provided, one or both of a numeric and graphic indication can be provided. In this manner, a user can be conveniently kept abreast of the user learning progress in real-time.
Referring to Fig. 3b, the numerical representation can, in one example, be in the form of scored points 320. In one scenario, for each character emulated accurately, a score of, for example, fifty points can be awarded. In another scenario, for each stroke accurately provided, a score of, for example, fifty points can be awarded. In yet another scenario, for each character emulated partially accurately, a score of, for example, ten points can be awarded. The numerical representation can, in another example, be in the form of percentages. In one scenario, for each character emulated accurately, a score of, for example, ten percent can be awarded. In another scenario, for each stroke accurately provided, a score of, for example, ten percent can be awarded. In yet another scenario, for each character emulated partially accurately, a score of, for example, five percent can be awarded.
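Applying the exemplary figures above, the first representation scheme might be sketched as follows (Python, for illustration only; the per-character result labels are assumptions, while the point and percentage values are the exemplary ones from the text).

```python
# Hypothetical sketch of the first representation scheme. The award
# values are the exemplary figures from the text; the result labels
# ("accurate", "partial", "inaccurate") are assumptions.

POINTS = {"accurate": 50, "partial": 10, "inaccurate": 0}
PERCENT = {"accurate": 10, "partial": 5, "inaccurate": 0}


def score(results: list[str], as_percent: bool = False) -> int:
    """Sum the award for each emulated or pronounced character."""
    table = PERCENT if as_percent else POINTS
    return sum(table[result] for result in results)


print(score(["accurate", "partial", "accurate"]))       # 110 points
print(score(["accurate", "partial"], as_percent=True))  # 15 percent
```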
Referring to Fig. 3c, the graphical representation can, in one example, be in the form of animated illustrations such as a plurality of treasure chests 330. Each of the plurality of treasure chests 330 can be initially presented in the form of a secured treasure chest 330a. In one scenario, for each character emulated accurately, the plurality of treasure chests 330 can be animated such that an originally secured treasure chest 330a becomes an unsecured treasure chest 330b. In another scenario, for each stroke accurately provided, the plurality of treasure chests 330 can be animated such that an originally secured treasure chest 330a becomes an unsecured treasure chest 330b. In yet a further scenario, where a character is emulated partially accurately, the plurality of treasure chests 330 can be animated such that an originally secured treasure chest becomes a partially unsecured treasure chest (not shown).
As mentioned earlier, the electronic device 200 can be operated in one of a first play mode and a second play mode. The first play mode is illustrated with reference to Fig. 4a. The second play mode is illustrated with reference to Fig. 4b and Fig. 4c.
Referring to Fig. 4a, the first play mode can, for example, be an online competition mode 400 where a user of the electronic device 200 can compete with a user of another electronic device 410 which is analogous to the electronic device 200. In this regard, the other electronic device 410 can include a communication portion (not shown) analogous to the communication portion 140 of the electronic device 200. Appreciably, the first play mode can be useful for applications such as online gaming.
Specifically, when in the online competition mode 400, the electronic device 200 and the other electronic device 410 can be configured to communicate with one another. The electronic devices 200/410 can communicate with each other via a communication network 450. Thus the communication portions of the respective electronic devices 200/410 can be configured for one or both of wired communication and wireless communication over the communication network 450 such that the electronic device 200 and the other electronic device 410 can communicate with one another for applications such as online gaming.
Additionally, when in the online competition mode 400, one or both of the electronic device 200 and the other electronic device 410 can also be configured to communicate with a plurality of other electronic devices 460 via the communication network 450. In this regard, the above description pertaining to communication between the electronic device 200 and the other electronic device 410 analogously applies.
Referring to Fig. 4b, the second play mode can, for example, be an offline competition mode 470 where a plurality of users 480 can compete with one another via the electronic device 200. For example, each user from the plurality of users 480 can take turns using the electronic device 200 and the score associated with each user can be compared thereafter.
Referring to Fig. 4c, the second play mode can, for example, be a standalone mode 490 where the electronic device 200 is configured for use by a single user 495. When in the standalone mode 490, the single user 495 can operate the electronic device 200 in one of the aforementioned plurality of user preference modes such as the beginner mode, the intermediate mode, the expert mode and the training mode.
In accordance with another embodiment of the disclosure, as shown in Fig. 5, a processing method 500 can be implemented in association with the system 100.
The processing method 500 includes providing control input 510 wherein control data can be communicated from the input portion 110 to the processing portion 120. As mentioned earlier, control data can be communicated from the control input segment 110b.
The processing method 500 further includes displaying graphic symbol 520 wherein, based on the control data, the processing portion 120 can be configured to produce output signals. The output signals can include character data corresponding to the graphic symbol 260.
Additionally, the processing method 500 can also include receiving user input 530 wherein, based on the graphic symbol 260, input signals corresponding to one or both of graphic data and audio data can be communicated to the processing portion 120. The graphic data and the audio data can be communicated from the graphic input segment 110a and the audio input segment 110c respectively.
The processing method 500 further includes determining performance characteristics 540 wherein the processing portion 120 can be configured to process one or both of the graphic data and the audio data to produce indicator data.
The processing method 500 yet further includes displaying performance result 550 wherein indicator data can be displayed as display data at the display portion 130.
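Taken together, the five steps of the processing method 500 can be sketched as a simple pipeline (Python, for illustration only; every function body is a stand-in for the behaviour described above, not an actual implementation).

```python
# Hypothetical sketch of the processing method 500 of Fig. 5. Each
# function mirrors one step; all bodies are illustrative stand-ins.

def provide_control_input() -> dict:                      # step 510
    """Control data, e.g. selecting a mode and a symbol to practise."""
    return {"mode": "graphic recognition", "symbol": "A"}


def display_graphic_symbol(control: dict) -> str:         # step 520
    """Character data corresponding to the graphic symbol 260."""
    return control["symbol"]


def receive_user_input(symbol: str) -> dict:              # step 530
    """Graphic data and/or audio data based on the displayed symbol."""
    return {"strokes": ["stroke 1", "stroke 2", "stroke 3"], "audio": None}


def determine_performance(user_input: dict) -> dict:      # step 540
    """Indicator data; a simple stroke-count check as a stand-in."""
    return {"accurate": len(user_input["strokes"]) == 3}


def display_performance_result(indicator: dict) -> None:  # step 550
    print("positive" if indicator["accurate"] else "negative")


control = provide_control_input()
symbol = display_graphic_symbol(control)
user_input = receive_user_input(symbol)
display_performance_result(determine_performance(user_input))  # positive
```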
In the foregoing manner, various embodiments of the disclosure are described for addressing at least one of the foregoing disadvantages.
Such embodiments are intended to be encompassed by the following claims, and are not to be limited to the specific forms or arrangements of parts so described. It will be apparent to one skilled in the art in view of this disclosure that numerous changes and/or modifications can be made, which are also intended to be encompassed by the following claims.

Claims (20)

Claims
1. An electronic device comprising:
    an input portion from which input signals are communicable, the input portion having at least one of a graphic input segment for communicating graphic data, a control input segment for communicating control data and an audio input segment for communicating audio data, the input signals corresponding to at least one of graphic data, control data and audio data;
    a processing portion coupled to the input portion such that input signals are receivable therefrom and processed in a manner so as to produce output signals, the output signals corresponding to at least one of indicator data and character data, the indicator data being based on at least one of graphic data and audio data, the character data being based on control data; and
    a display portion coupled to the processing portion such that output signals are receivable therefrom and displayable at the display portion as display data, the display data corresponding to at least one of character data and indicator data.
2. The electronic device as in claim 1, wherein control data communicated from the control input segment is receivable by the processing portion and processed thereby such that character data is produced, and wherein the character data corresponds to a graphic symbol which is displayable at the display portion as display data.
3. The electronic device as in claim 2, wherein at least one of graphic data and audio data is communicable from the input portion based on the graphic symbol displayed at the display portion.
4. The electronic device as in claim 3, wherein at least one of graphic data and audio data is receivable by the processing portion for processing in a manner so as to produce indicator data.
5. The electronic device as in claim 4, wherein the indicator data is communicable from the processing portion and displayable as display data at the display portion.
6. The electronic device as in claim 5, wherein the indicator data is at least one of graphically represented and numerically represented at the display portion.
7. The electronic device as in claim 6, wherein the graphic symbol corresponds to a character associable with a plurality of basic strokes, and wherein the graphic data is based on emulation of the graphic symbol by providing appropriate strokes corresponding to the plurality of basic strokes associable with the character.
8. The electronic device as in claim 7, wherein the processing portion is capable of processing the graphic data to determine whether the appropriate strokes have been provided.
9. The electronic device as in claim 8, the processing portion being capable of determining whether the appropriate strokes have been provided based on the order of strokes provided.
10. The electronic device as in claim 9, wherein the indicator data is indicative of various performance parameters including at least one of: accuracy in emulation of the character; speed at which the character is accurately emulated; correctness in order of strokes provided; and penmanship.
11. The electronic device as in claim 6, wherein the graphic symbol corresponds to a character associable with a pronunciation.
12. The electronic device as in claim 11, wherein the indicator data is indicative of various performance parameters including at least one of: accuracy in pronunciation of the character; and speed at which the character is accurately pronounced.
13. The electronic device as in claim 1, wherein the device is operable in at least one of a graphic recognition mode and a voice recognition mode.
14. A processing method comprising:
    providing control input for communicating control data;
    displaying graphic symbol based on the control data communicated;
    receiving user input based on the displayed graphic symbol such that input signals are communicable, the input signals corresponding to at least one of graphic data and audio data;
    determining performance characteristics based on the input signals communicated, wherein indicator data indicative of various performance parameters is produced; and
    displaying performance result, wherein indicator data is displayable as display data.
15. The processing method as in claim 14 wherein an input portion is provided for communicating the control data, the input portion having a control input segment for communicating the control data.
16. The processing method as in claim 15 wherein a processing portion is provided for processing the control data in a manner so as to produce character data corresponding to a graphic symbol.
17. The processing method as in claim 16 wherein the input portion is further provided for communicating at least one of graphic data and audio data, the input portion further having a graphic input segment and an audio input segment for communicating the graphic data and the audio data respectively.
18. The processing method as in claim 17, wherein the processing portion is further provided for processing at least one of the graphic data and the audio data in a manner so as to produce the indicator data.
19. The processing method as in claim 18 wherein a display portion is provided for displaying the display data.
20. An electronic device comprising:
    an audio module operable to detect and receive audio signals upon activation;
    a screen on which a graphics user interface (GUI) is displayable, the GUI comprising:
        a first interface portion associable with a graphic input segment for communicating graphic data;
        a second interface portion associable with a control input segment for communicating control data; and
        a third interface portion associable with an audio input segment for communicating an activation signal to the audio module in a manner so as to activate the audio module, the audio module when activated being operable to detect and receive audio signals in a manner so as to produce audio data; and
    a processor which is configurable to receive and process communicated control data in a manner so as to produce character data corresponding to a graphic symbol,
    wherein, based on the graphic symbol, at least one of graphic data and audio data is communicable to the processor for processing in a manner so as to produce indicator data which is indicative of various performance parameters, and
    wherein the indicator data and graphic symbol are displayable as display data on the screen.
SG2011016664A 2011-03-07 2011-03-07 A device for facilitating efficient learning and a processing method in association thereto SG184583A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
SG2011016664A SG184583A1 (en) 2011-03-07 2011-03-07 A device for facilitating efficient learning and a processing method in association thereto
SG2011018017A SG184589A1 (en) 2011-03-07 2011-03-11 A device suitable for use by a user for communicating graphic data
PCT/SG2012/000047 WO2012121666A1 (en) 2011-03-07 2012-02-15 A device suitable for use by a user for communicating graphic data
CN2012800123937A CN103415826A (en) 2011-03-07 2012-02-15 A device suitable for use by a user for communicating graphic data
CN2012800119293A CN103403780A (en) 2011-03-07 2012-03-05 A device for facilitating efficient learning and a processing method in association thereto
PCT/SG2012/000072 WO2012121671A1 (en) 2011-03-07 2012-03-05 A device for facilitating efficient learning and a processing method in association thereto
TW101107625A TWI601101B (en) 2011-03-07 2012-03-07 A device suitable for use by a user for communicating graphic data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SG2011016664A SG184583A1 (en) 2011-03-07 2011-03-07 A device for facilitating efficient learning and a processing method in association thereto

Publications (1)

Publication Number Publication Date
SG184583A1 true SG184583A1 (en) 2012-10-30

Family

ID=46798462

Family Applications (1)

Application Number Title Priority Date Filing Date
SG2011016664A SG184583A1 (en) 2011-03-07 2011-03-07 A device for facilitating efficient learning and a processing method in association thereto

Country Status (3)

Country Link
CN (1) CN103403780A (en)
SG (1) SG184583A1 (en)
WO (1) WO2012121671A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9514748B2 (en) * 2014-01-15 2016-12-06 Microsoft Technology Licensing, Llc Digital personal assistant interaction with impersonations and rich multimedia in responses

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7720682B2 (en) * 1998-12-04 2010-05-18 Tegic Communications, Inc. Method and apparatus utilizing voice input to resolve ambiguous manually entered text input
US20050027534A1 (en) * 2003-07-30 2005-02-03 Meurs Pim Van Phonetic and stroke input methods of Chinese characters and phrases
CN101017617A (en) * 2006-02-10 2007-08-15 淡江大学 Handwriting practice system
CN101079191A (en) * 2007-06-07 2007-11-28 深圳市和而泰电子科技有限公司 Touching electronic copybook
CN101325011B (en) * 2008-07-30 2010-10-13 上海超之璐文化传播有限公司 Radical cartoon Chinese character hand-written teaching system
US8175389B2 (en) * 2009-03-30 2012-05-08 Synaptics Incorporated Recognizing handwritten words

Also Published As

Publication number Publication date
WO2012121671A8 (en) 2012-10-18
CN103403780A (en) 2013-11-20
WO2012121671A1 (en) 2012-09-13

Similar Documents

Publication Publication Date Title
US11046106B2 (en) Digital pen with enhanced educational feedback
Anson et al. The effects of word completion and word prediction on typing rates using on-screen keyboards
Alvina et al. Expressive keyboards: Enriching gesture-typing on mobile devices
US20040234938A1 (en) System and method for providing instructional feedback to a user
CN107798322A (en) A kind of smart pen
CN1092371C (en) Device for practising Chinese character calligraphy
Kristensson Discrete and continuous shape writing for text entry and control
SG184583A1 (en) A device for facilitating efficient learning and a processing method in association thereto
SG184589A1 (en) A device suitable for use by a user for communicating graphic data
CN202352115U (en) Early education machine
CN201993941U (en) Teaching machine using Chinese phonetic alphabets to conduct English learning
WO2020036011A1 (en) Information processing device, information processing method, and program
US20230282130A1 (en) Reading level determination and feedback
CN201255931Y (en) Learning machine for the blind children
KR100459030B1 (en) Method and Apparatus for English study using touch screen
CN107390884A (en) Entering method keyboard and physical keyboard
Anu Bharath et al. Performance of accessible gesture-based indic keyboard
Ashe et al. An empirical study of icon recognition in a virtual gallery interface
JP6100749B2 (en) Soft keyboard generation system for test score input
CN207992955U (en) It imparts knowledge to students special typing exercising Multi-Function Keyboard
Hagiya et al. Assistive typing application for older adults based on input stumble detection
CN114092943B (en) Method and device for displaying and training text writing
US20230088532A1 (en) Processing apparatus, processing method, and non-transitory storage medium
CN2388666Y (en) Multi-functional English learning machine
Mandyartha et al. One Line Plus Soft Keyboard Layout for Smartwatches Text Entry