US20150058785A1 - Character Input Device And Computer Readable Recording Medium - Google Patents
- Publication number
- US20150058785A1 (application US 14/465,461)
- Authority
- US
- United States
- Prior art keywords
- input
- character
- unit
- target character
- correction
- Prior art date
- Legal status
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/018—Input/output arrangements for oriental characters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
- G06F3/0236—Character input methods using selection techniques to select from displayed items
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
Definitions
- the present invention relates to a character input device and a computer readable recording medium.
- a software keyboard is displayed on a display screen, and a character can be input by touching or flicking a character button on the software keyboard.
- Such a software keyboard is displayed as an image simulating an actual JIS (Japanese Industrial Standards) keyboard or a keypad used in a mobile phone.
- An actual keyboard is provided with projections on the “F” key and the “J” key, which enables a user to recognize what is called a home position only by the sense of touch and facilitates touch typing.
- a projection is provided on the “5” key in the keypad, which facilitates tactile operations.
- a conventional character input device has a configuration in which, when a key input by a user is detected on the software keyboard including a plurality of keys, keys adjacent to the detected key are extracted, the detected key and the extracted keys are stored, conversion candidates are generated from the stored keys, and the generated conversion candidates are displayed on the display screen (for example, Japanese Patent Application Laid-Open Publication No. 2013-47872).
- An object of the present invention is to enable incorrect input to be recognized rapidly.
- a character input device including: a touch panel which integrally includes a display unit for displaying a screen and an input unit for receiving a touch input at a position in the screen displayed on the display unit; a first control unit which displays a character input screen having a character display region on the display unit, associates a keyboard including a plurality of characters with the touch panel and displays a character in the keyboard corresponding to the position where the touch input is performed via the input unit in the character display region as an input target character; an evaluation unit which obtains an evaluation value for the input target character on the basis of an input manner in which the touch input is performed via the input unit; a determination unit which determines whether the input target character is a correction target character on the basis of the evaluation value for the input target character obtained by the evaluation unit; and a second control unit which displays the correction target character determined by the determination unit on the display unit so as to be distinguishable.
- a non-transitory computer readable medium that stores a program for making a computer execute a procedure
- the computer being a character input device with a touch panel which integrally includes a display unit for displaying a screen and an input unit for receiving a touch input at a position in the screen displayed on the display unit
- the procedure including: controlling so as to display a character input screen having a character display region on the display unit, associate a keyboard including a plurality of characters with the touch panel and display a character in the keyboard corresponding to the position where the touch input is performed via the input unit in the character display region as an input target character; obtaining an evaluation value for the input target character on the basis of an input manner in which the touch input is performed via the input unit; determining whether the input target character is a correction target character on the basis of the obtained evaluation value for the input target character; and displaying the determined correction target character on the display unit so as to be distinguishable.
- FIG. 1 is a front view showing an outer appearance of an information terminal device according to a first embodiment
- FIG. 2 is a block diagram showing a schematic configuration of the information terminal device according to the first embodiment
- FIG. 3 is a view for explaining an example of a character input screen
- FIG. 4A is a view for explaining an example of the character input screen
- FIG. 4B is a view for explaining an example of the character input screen
- FIG. 5 is a view for explaining a detection region
- FIG. 6 is a flowchart explaining input processing
- FIG. 7 is a view showing a configuration of a data table
- FIG. 8A is a view for explaining a calculation procedure of an evaluation value
- FIG. 8B is a view for explaining the calculation procedure of the evaluation value
- FIG. 9A is a view for explaining a calculation procedure of an evaluation value
- FIG. 9B is a view for explaining the calculation procedure of the evaluation value
- FIG. 10A is a view for explaining a calculation procedure of an evaluation value
- FIG. 10B is a view for explaining the calculation procedure of the evaluation value
- FIG. 11 is a view for explaining an example of the character input screen
- FIG. 12 is a view for explaining an example of the character input screen
- FIG. 13 is a view for explaining an example of the character input screen
- FIG. 14 is a view for explaining the detection region
- FIG. 15 is a front view showing an outer appearance of an information terminal device according to a second embodiment
- FIG. 16 is a block diagram showing a schematic configuration of the information terminal device according to the second embodiment.
- FIG. 17A is a view for explaining an example of a character input screen
- FIG. 17B is a view for explaining an example of the character input screen
- FIG. 18 is a flowchart explaining input processing
- FIG. 19 is a view for explaining a configuration of a data table
- FIG. 20A is a view explaining a calculation procedure of an evaluation value
- FIG. 20B is a view explaining the calculation procedure of the evaluation value
- FIG. 21A is a view explaining a calculation procedure of an evaluation value
- FIG. 21B is a view explaining the calculation procedure of the evaluation value
- FIG. 22 is a view for explaining an example of a character input screen
- FIG. 23 is a view for explaining an example of the character input screen
- FIG. 24A is a view for explaining a detection region
- FIG. 24B is a view for explaining the detection region.
- the information terminal device 1 is a smartphone having a telephone function, for example.
- the information terminal device 1 includes a thin plate-like main body 2 and a touch panel 3 arranged on a surface of the main body 2 .
- the touch panel 3 has, in an integrated manner, a display unit 3 a as a display unit for displaying an image and an input unit 3 b as an input unit which is arranged on the entire surface of the display screen of the display unit 3 a and touched by a finger, a stylus pen or the like to directly perform input (see FIG. 2 ).
- a speaker 4 for listening is provided above the display unit 3 a and a microphone 5 for speaking is provided below the display unit 3 a.
- a power button 6 for turning on and off the information terminal device 1 is arranged on the upper end surface of the main body 2 , and volume buttons 7 a and 7 b for controlling the receiver volume and such like are arranged on a lateral surface.
- the information terminal device 1 has a function as a character input device for inputting characters in addition to communication and telephone functions and such like.
- the information terminal device 1 is configured by including a CPU (Central Processing Unit) 11 , a RAM (Random Access Memory) 12 , a ROM (Read Only Memory) 13 , a flash memory 14 and a communication unit 15 in addition to the above-mentioned touch panel 3 , which are connected to each other via a bus 16 .
- the CPU 11 reads out a system program stored in the ROM 13 , loads the system program into a work area in the RAM 12 and controls the units in accordance with the system program.
- the CPU 11 reads out a processing program stored in the ROM 13 , loads it into the work area and executes various types of processing.
- the display unit 3 a in the touch panel 3 includes an LCD (Liquid Crystal Display) or the like and displays a display screen such as a character input screen in accordance with an instruction by a display signal which is input from the CPU 11 . That is, the CPU 11 functions as a display control unit which controls display in the display unit 3 a.
- the input unit 3 b receives a position input on the display screen of the display unit 3 a by a finger or a stylus pen, and outputs the position (coordinates) information to the CPU 11 .
- the RAM 12 is a volatile memory and forms a work area for temporarily storing various programs to be executed and data according to the various programs.
- the ROM 13 is a read only memory and stores the programs for executing various types of processing and data to be read out.
- the programs are stored in the ROM 13 in a form of computer readable program code.
- the flash memory 14 is a non-volatile memory storing information so as to be readable and writable.
- the communication unit 15 sends and receives data for telephone and communication with outside.
- the information terminal device 1 is configured to be connectable to a communication network including the Internet via the communication unit 15 .
- characters can be input by using a software keyboard KB displayed on a character input screen DP of the display unit 3 a , for example.
- a character display region CR is formed at the upper part of the character input screen DP, the character can be input by touching and flicking (sliding) the software keyboard KB displayed at the lower part of the character input screen DP, and the input character is displayed in the character display region CR.
- the software keyboard KB includes kana keys 21 a to 21 j for inputting kana characters (here, kana are Japanese syllables), an option key 22 for adding a dull sound/p-sound symbol to the input kana character and converting the kana character into a small letter, a mark key 23 for inputting a mark, a space key 24 and an enter key 25 .
- the kana key 21 a is a key for inputting characters (a), (i), (u), (e) and (o).
- the kana key 21 b is a key for inputting characters (ka), (ki), (ku), (ke) and (ko).
- the kana key 21 c is a key for inputting characters (sa), (si), (su), (se) and (so).
- the kana key 21 d is a key for inputting characters (ta), (ti), (tu), (te) and (to).
- the kana key 21 e is a key for inputting characters (na), (ni), (nu), (ne) and (no).
- the kana key 21 f is a key for inputting characters (ha), (hi), (hu), (he) and (ho).
- the kana key 21 g is a key for inputting characters (ma), (mi), (mu), (me) and (mo).
- the kana key 21 h is a key for inputting characters (ya), (yu) and (yo).
- the kana key 21 i is a key for inputting characters (ra), (ri), (ru), (re) and (ro).
- the kana key 21 j is a key for inputting characters (wa), (wo) and (n).
- the kana key 21 e is touched as shown in FIG. 4A . Then, the characters (ni), (nu), (ne) and (no) composing the (na) column are highlighted in the respective four sides of the region displaying the character (na), and the other keys are grayed out to indicate they are invalid.
- the (ni) detection region 31 b , (nu) detection region 31 c , (ne) detection region 31 d and (no) detection region 31 e which are the respective touch detection ranges of (ni), (nu), (ne) and (no) are set to the left, upper, right and lower sides of the (na) detection region 31 a , respectively.
- Each of the (ni) detection region 31 b , (nu) detection region 31 c , (ne) detection region 31 d and (no) detection region 31 e is formed in a nearly trapezoidal shape whose area increases outward.
- the character (ne) can be input by touching the (na) detection region 31 a and thereafter flicking toward any position in the (ne) detection region 31 d.
- the characters (ni), (nu) and (no) can also be input by the same operation as the character (ne).
- the character (na) can be input by touching the (na) detection region 31 a and thereafter lifting the finger off the (na) detection region 31 a without the flick operation.
- the characters in the other columns can also be input similarly.
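The touch-and-flick resolution described above can be sketched as follows. This is a minimal illustration of the idea, not the patent's actual implementation; the function name and the dominant-axis rule are assumptions.

```python
def resolve_flick(start, end, half_side):
    """Map a touch from `start` to `end` to a flick direction.

    half_side: half of the side length of the base detection region,
    e.g. the (na) detection region 31a of FIG. 5.
    """
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    # Lifting the finger inside the base region (no flick) inputs the
    # base character, e.g. (na).
    if abs(dx) <= half_side and abs(dy) <= half_side:
        return "center"
    # Otherwise the dominant axis of the movement decides the region:
    # left (ni), up (nu), right (ne), down (no), as laid out in FIG. 5
    # (screen coordinates: y grows downward).
    if abs(dx) >= abs(dy):
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"
```

A per-column mapping such as `{"center": "na", "left": "ni", "up": "nu", "right": "ne", "down": "no"}` would then yield the input target character.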
- the input processing is executed when a user inputs characters, for example.
- the CPU 11 stores input data regarding an operation via the touch panel 3 (step S 101 ). Specifically, the CPU 11 stores data regarding input start coordinates which are the coordinates of the position where the user starts the input by touching the touch panel 3 , input end coordinates which are the coordinates of the position where the user ends the input by the touch operation of the touch panel 3 and the trace of the flick operation from the input start coordinates to the input end coordinates in a predetermined region of the RAM 12 in a form of data table as shown in FIG. 7 .
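One row of the data table of FIG. 7 could be modeled as below; the field names are illustrative assumptions based on the data the text says is stored (start/end coordinates, trace, input target character, evaluation value).

```python
from dataclasses import dataclass, field

@dataclass
class InputRecord:
    start: tuple                               # input start coordinates (x, y)
    end: tuple                                 # input end coordinates (x, y)
    trace: list = field(default_factory=list)  # sampled flick-trace positions
    character: str = ""                        # determined input target character
    evaluation: float = 1.0                    # evaluation value, range 0..1
```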
- the CPU 11 functions as a position detecting unit which detects the input starting position where the touch input is started by an input unit and the input end position where the touch input ends after a slide operation from the input starting position for each input target character.
- the CPU 11 determines the input target character from input data stored in the RAM 12 (step S 102 ).
- the CPU 11 specifies the detection region on the basis of the input start coordinates and the input end coordinates in the data table stored in the RAM 12 and thereby determines the input target character.
- the input target character is (ne) when the input start coordinates belong to the (na) detection region 31 a and the input end coordinates belong to the (ne) detection region 31 d in FIG. 5 .
- the determined input target character is stored in the data table shown in FIG. 7 and displayed in the character display region CR on the character input screen DP.
- the CPU 11 calculates an evaluation value on the basis of the input data stored in the RAM 12 and stores the evaluation value in a predetermined field of the data table shown in FIG. 7 (step S 103 ).
- the evaluation value is obtained for each input target character.
- the CPU 11 obtains the distance from the center of the detection region of character, to which the input start coordinates S(x, y) belong, to the input start coordinates S(x, y) as shown in FIG. 8A , and calculates an evaluation value on the basis of the obtained distance.
- the evaluation value is calculated with reference to a conversion table such as an LUT (Look Up Table) as shown in FIG. 8B , for example.
- the evaluation value ranges from 0 to 1 and is weighted so as to be large up to a certain distance from the center and significantly smaller at larger distances.
- the distance ranges from 0 to half the side length (t) of the detection region.
- the conversion table is not limited to that shown in FIG. 8B , and various conversion tables can be adopted.
- the evaluation value may be decreased linearly with the distance.
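The distance-from-center evaluation of FIG. 8A/8B can be sketched as below. The breakpoints are assumptions: the text only states that the value stays large up to a certain distance from the center and becomes significantly smaller beyond it.

```python
import math

def eval_center_distance(start, center, half_side):
    """Evaluate how close the touch start point is to the region center.

    half_side: half the side length (t/2) of the detection region, the
    maximum possible distance from the center.
    """
    d = math.dist(start, center)        # distance from the region center
    ratio = min(d / half_side, 1.0)     # normalize to 0..1
    if ratio <= 0.5:
        return 1.0                      # near the center: full score
    # falloff toward 0 for larger distances (piecewise linear here;
    # the linear variant mentioned in the text would omit the plateau)
    return max(0.0, 1.0 - 2.0 * (ratio - 0.5))
```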
- the CPU 11 functions as a distance calculation unit which calculates the distance to the touched position from the central position in the touch detection range of the input target character corresponding to the position where the touch input is performed via the input unit.
- the CPU 11 obtains the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y) as shown in FIG. 9A and calculates an evaluation value on the basis of the obtained distance.
- the evaluation value is calculated with reference to the conversion table as shown in FIG. 9B , for example.
- the evaluation value ranges from 0 to 1 and varies with the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y).
- the evaluation value is larger with a smaller distance since the smaller distance from the input start coordinates S(x, y) to the input end coordinates E(x, y) is more preferable.
- the evaluation value is larger as approaching the side length (t) of the detection region corresponding to the character (na) since it is more preferable that the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y) is closer to the side length (t).
- the conversion table is not limited to that of FIG. 9B and various types can be adopted.
- the evaluation value may linearly increase and decrease with the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y).
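The flick-length evaluation of FIG. 9A/9B could look like the sketch below, under the assumption of a triangular profile: the value is high near 0 (a clean tap) and near the side length t (a clean flick), and low in between.

```python
import math

def eval_flick_distance(start, end, side):
    """Evaluate the distance from the input start to the input end.

    side: the side length (t) of the detection region; a flick of about
    this length, or no movement at all, is considered preferable.
    """
    d = math.dist(start, end)
    near_tap = max(0.0, 1.0 - d / (side / 2.0))            # close to 0
    near_flick = max(0.0, 1.0 - abs(d - side) / (side / 2.0))  # close to t
    return max(near_tap, near_flick)
```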
- the CPU 11 functions as a position detecting unit which detects the input starting position where the touch input via the input unit is started and the input end position where the touch input ends after a slide operation from the input starting position for each input target character.
- the CPU 11 obtains an angle θ between the straight line passing through the input start coordinates S(x, y) and the input end coordinates E(x, y) and the horizontal line as shown in FIG. 10A and calculates an evaluation value on the basis of the obtained angle θ.
- the evaluation value is calculated with reference to a conversion table as shown in FIG. 10B , for example.
- the evaluation value ranges from 0 to 1 and is converted according to the angle θ.
- the evaluation value is larger as the straight line passing through the input start coordinates S(x, y) and the input end coordinates E(x, y) is closer to the horizontal or vertical line, and the evaluation value is smaller as the straight line is inclined more.
- the conversion table is not limited to that of FIG. 10B , and various conversion tables can be adopted.
- the evaluation value may be linearly decreased and increased in accordance with the angle ⁇ .
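The angle evaluation of FIG. 10A/10B can be sketched as follows: the closer the flick is to horizontal or vertical, the larger the value. The linear mapping from axis deviation to the 0..1 range is an assumption.

```python
import math

def eval_flick_angle(start, end):
    """Evaluate how closely the flick follows a horizontal/vertical axis."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if dx == 0 and dy == 0:
        return 1.0                      # a tap: no angle to penalize
    theta = math.degrees(math.atan2(abs(dy), abs(dx)))  # 0..90 degrees
    deviation = min(theta, 90.0 - theta)   # deviation from nearest axis
    return 1.0 - deviation / 45.0          # 45-degree flick scores 0
```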
- the CPU 11 functions as an angle detecting unit which detects the angle of straight line connecting the input starting position and the input end position detected by the detecting unit.
- the average value of the evaluation values is obtained and the obtained average evaluation value is stored in the data table.
- the evaluation value to be stored in the data table may be obtained from one or two of the above three evaluation values.
- the method for obtaining the evaluation value is not limited to the above mentioned methods, and various methods can be adopted as long as it can evaluate the input manner in which the touch input is performed.
- the CPU 11 functions as an evaluation unit which obtains the evaluation value for each of the input target characters on the basis of the input manner of touch input via the input unit.
- the CPU 11 determines whether any evaluation value of the input target characters is less than a threshold value with reference to the data table shown in FIG. 7 (step S 104 ).
- Although the threshold value is set to “0.5” in the embodiment, for example, it may be set to any appropriate value.
- If it is determined that there is an input target character having an evaluation value less than the threshold value (step S 104 : YES), the CPU 11 extracts the input target character which has the smallest evaluation value as a correction target character (step S 105 ). Since the input target character having the smallest evaluation value is most likely to be incorrect, this input target character is extracted in the embodiment.
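Steps S103 through S105 (average the per-character evaluation values, then extract the character with the smallest average below the threshold) can be sketched as below. The 0.5 threshold is the embodiment's example value; the table layout is an assumption.

```python
THRESHOLD = 0.5  # example value from the embodiment

def pick_correction_target(table):
    """table: list of (character, [evaluation values]) pairs, one pair
    per input target character, as in the data table of FIG. 7."""
    averaged = [(ch, sum(vals) / len(vals)) for ch, vals in table]
    below = [row for row in averaged if row[1] < THRESHOLD]
    if not below:
        return None  # no correction target (step S104: NO)
    # the smallest evaluation value is most likely to be incorrect
    return min(below, key=lambda row: row[1])[0]
```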
- the CPU 11 functions as a correction determination unit which determines the correction target character from among the input target characters on the basis of the evaluation values for the input target characters obtained by the evaluation unit.
- the CPU 11 displays a correction candidate button on the character input screen (step S 106 ). Specifically, when the input target character having the smallest evaluation value is (no) as shown in FIG. 7 , for example, the CPU 11 highlights the character (no) in the character string displayed in the character display region CR so as to be recognized as the character to be corrected as shown in FIG. 11 .
- a correction candidate button TS corresponding to the character (no) is displayed near the character string.
- the correction candidate button TS includes the characters (na), (ni), (nu), (ne) and (no) as replacement character candidates corresponding to the character (no).
- the CPU 11 determines whether any of the character buttons forming the correction candidate button TS is touched (step S 107 ). If it is determined that any of the character buttons forming the correction candidate button TS is touched (step S 107 : YES), the CPU 11 replaces the input target character to be corrected with the character corresponding to the touched character button (step S 108 ). For example, when the character button (ne) in the correction candidate button TS is touched in FIG. 11 , the input target character (no) is corrected to (ne) as shown in FIG. 12 . The input target character stored in the data table is also corrected from (no) to (ne).
- the CPU 11 functions as a correction input receiving unit which receives a correction input of an input target character made by the touch input via the input unit to correct the input target character determined as the correction target character.
- the CPU 11 rewrites the evaluation value of the input target character to be corrected to the largest value (step S 109 ), and thereafter executes the processing of step S 104 . Specifically, the CPU 11 rewrites the evaluation value stored in the data table and corresponding to the corrected input target character to “1” that is the largest value.
- If it is not determined that any of the character buttons forming the correction candidate button TS is touched (step S 107 : NO), the CPU 11 determines whether the correction button P is touched (step S 110 ).
- the CPU 11 can perform the determination according to whether a touch operation is performed with respect to the correction button P which is displayed on the right of the correction candidate button TS displayed in the character display region CR on the character input screen DP.
- If it is determined that the correction button P is touched (step S 110 : YES), the CPU 11 ends display of the correction candidate button TS (step S 111 ), thereafter executes a correction mode for correcting a character other than the correction target character (step S 112 ), and ends the processing.
- the character can be corrected by moving the cursor key forward or backward and operating the “DEL” key or the like in the software keyboard KB in FIG. 3 .
- If it is not determined that the correction button P is touched in step S 110 (step S 110 : NO), the CPU 11 determines whether the confirmation button Q is touched (step S 113 ).
- the determination can be made according to whether a touch operation is performed with respect to the confirmation button Q which is displayed on the right of the correction candidate button TS displayed in the character display region CR on the character input screen DP.
- If it is determined that the confirmation button Q is touched (step S 113 : YES), the CPU 11 ends display of the correction candidate button TS, confirms the input (step S 114 ), and thereafter ends the processing.
- If it is not determined that the confirmation button Q is touched (step S 113 : NO), the CPU 11 executes the processing in step S 107 .
- If it is not determined in step S 104 that there is an input target character having an evaluation value less than the threshold value (step S 104 : NO), the CPU 11 executes the processing of step S 112 .
- After executing the above-mentioned input processing, the CPU 11 converts the character string displayed in the character display region CR on the character input screen DP as shown in FIG. 13 by a predetermined conversion operation.
- the number of correction target characters to be extracted is not limited to one and can be set appropriately. All the input target characters having evaluation values less than the threshold value may be extracted as correction target characters.
- the CPU 11 may function as a detection range enlargement unit which controls so as to change the ranges of detection regions corresponding to the characters before and after the correction.
- the CPU 11 changes the ranges of detection regions so as to enlarge the range of (ne) detection region 31 d and reduce the range of (no) detection region 31 e as shown in FIG. 14 .
- the CPU 11 functions as the detection range enlargement unit which enlarges the touch detection range of the input target character for which the correction input is received by the correction input receiving unit.
- the ranges of detection regions may be changed when a character is corrected once or when a character is corrected a plurality of times.
- the change amount of the ranges of detection regions may be variable according to the number of times the correction is performed.
- the change amount of the ranges of detection regions may be variable according to the input start coordinates, input end coordinates and the trace.
- the change amount of the ranges of detection regions may also be variable according to the evaluation value.
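The detection-range change described above can be sketched as follows: enlarge the region of the character the user actually intended and shrink the one that was wrongly detected. The fixed `step` is an assumption; as noted, the change amount may instead vary with the correction count, the coordinates and trace, or the evaluation value.

```python
def adjust_detection_regions(half_sides, wrong, intended, step=2):
    """half_sides: dict mapping character -> half side length of its
    detection region.  `wrong` is the character that was detected,
    `intended` is the character the user corrected it to."""
    half_sides[intended] = half_sides[intended] + step   # enlarge, e.g. (ne)
    half_sides[wrong] = max(1, half_sides[wrong] - step) # shrink, e.g. (no)
    return half_sides
```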
- the information terminal device 100 is a tablet terminal, for example.
- the information terminal device 100 includes a thin plate-like main body 102 and a touch panel 103 provided on a surface of the main body 102 .
- the touch panel 103 integrally includes a display unit 103 a as a display unit for displaying an image and an input unit 103 b as an input unit which is provided on the entire surface of the display screen of the display unit 103 a and touched by a finger, a stylus pen or the like to directly perform input (see FIG. 16 ).
- the information terminal device 100 has a function as a character input device for inputting characters in addition to the communication function and the like.
- the information terminal device 100 is configured by including the CPU 111 , RAM 112 , ROM 113 and flash memory 114 in addition to the above mentioned touch panel 103 , and the units are connected to each other via a bus 116 . Since the functions of the touch panel 103 , CPU 111 , RAM 112 , ROM 113 and flash memory 114 are similar to those of the information terminal device 1 in the first embodiment, the detailed description thereof is omitted.
- characters can be input by using a software keyboard KB displayed on the character input screen DP of the display unit 103 a as shown in FIG. 17A , for example.
- the character display region CR is formed in the upper part of the character input screen DP, the character can be input by touching the software keyboard KB displayed in the lower part of the character input screen DP, and the input character is displayed in the character display region CR.
- the software keyboard KB is a software keyboard having a QWERTY arrangement with Roman character keys 121 a to 121 z for inputting Roman characters, shift keys 122 and 122 , mark keys 123 a and 123 b for inputting marks, a space key 124 and an enter key 125 .
- the detection region for each character is set so as to superpose the character key.
- Characters can be input in such a way in the embodiment.
- the input processing is executed when a user inputs characters, for example.
- the CPU 111 stores input data regarding the operation with respect to the touch panel 103 (step S 201 ).
- each set of data regarding input start coordinates and input end coordinates is stored in a predetermined region of the RAM 112 in the form of a data table as shown in FIG. 19 .
- the input start coordinates and the input end coordinates are obtained in the same method as the above-mentioned first embodiment.
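The data table of FIG. 19 can be sketched as one record per touch operation; the field names below are illustrative assumptions, not the exact layout of the figure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InputRecord:
    """One row of the data table (FIG. 19); field names are illustrative."""
    start: tuple                          # input start coordinates S(x, y)
    end: tuple                            # input end coordinates E(x, y)
    target_char: Optional[str] = None     # determined in step S202
    evaluation: Optional[float] = None    # calculated in step S203

# the CPU stores one record per touch operation in a region of the RAM
data_table = []
data_table.append(InputRecord(start=(12.0, 30.5), end=(13.2, 31.0)))
```

The target character and evaluation fields stay empty until steps S202 and S203 fill them in.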
- the CPU 111 determines an input target character from the input data stored in the RAM 112 (step S 202 ).
- the CPU 111 specifies a character detection region to which the input end coordinates in the data table stored in the RAM 112 belong, and thereby determines the input target character.
- the determined input target character is stored in the data table shown in FIG. 19 and the character is displayed in the character display region CR on the character input screen DP.
- the CPU 111 calculates the evaluation value on the basis of the input data stored in the RAM 112 , and stores the evaluation value in a predetermined field of the data table shown in FIG. 19 (step S 203 ).
- the evaluation value is obtained for each input target character.
- the CPU 111 calculates the distance from the center of the character detection region, to which the input start coordinates S(x, y) belong, to the input start coordinates S(x, y), and calculates an evaluation value on the basis of the obtained distance.
- the evaluation value is calculated with reference to the conversion table shown in FIG. 20B , for example. In the conversion table, the evaluation value ranges from 0 to 1 and varies according to the distance from the center of the detection region to the input start coordinates S(x, y).
- the distance ranges from 0 to half of one side length (t) of the detection region.
- the conversion table is not limited to that of FIG. 20B and various conversion tables can be adopted.
- the evaluation value may be decreased linearly with the distance, for example.
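As one concrete sketch of this calculation, the linearly decreasing variant mentioned above can be written as follows; the linear mapping is an assumption in place of the exact curve of FIG. 20B.

```python
import math

def center_distance_evaluation(start, center, side_t):
    """Evaluation value in [0, 1]: 1.0 when the input start coordinates
    S(x, y) hit the center of the detection region, decreasing linearly
    to 0.0 at a distance of half the side length t (a linear conversion
    is assumed here instead of the table of FIG. 20B)."""
    d = math.hypot(start[0] - center[0], start[1] - center[1])
    d = min(d, side_t / 2)          # the distance ranges from 0 to t/2
    return 1.0 - d / (side_t / 2)

# a touch exactly at the center of a 40-unit detection region scores 1.0
assert center_distance_evaluation((20, 20), (20, 20), 40) == 1.0
```

A stepwise table lookup could be substituted for the linear formula without changing the callers.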
- the CPU 111 obtains the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y) as shown in FIG. 21A , and calculates an evaluation value on the basis of the obtained distance.
- the CPU 111 obtains the evaluation value from the distance of slide operation from the input start coordinates S(x, y) to the input end coordinates E(x, y).
- the evaluation value is calculated with reference to the conversion table shown in FIG. 21B , for example.
- the evaluation value ranges from 0 to 1 and varies according to the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y).
- the evaluation value is larger as the distance is smaller.
- the conversion table is not limited to that of FIG. 21B , and various types can be adopted.
- the evaluation value may be decreased linearly with the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y), for example.
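A minimal sketch of this slide-length evaluation, again assuming the linear decrease that the text permits (the maximum length used for normalization is also an assumption):

```python
import math

def slide_length_evaluation(start, end, max_len=40.0):
    """Evaluation value in [0, 1]: largest when the slide operation from
    the input start coordinates to the input end coordinates is shortest.
    The linear decrease and the max_len normalization are assumptions."""
    d = math.hypot(end[0] - start[0], end[1] - start[1])
    return max(0.0, 1.0 - d / max_len)

assert slide_length_evaluation((0, 0), (0, 0)) == 1.0    # no slide at all
assert slide_length_evaluation((0, 0), (40, 0)) == 0.0   # maximum slide
```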
- the CPU 111 functions as a length detecting unit for detecting the length of slide operation from the input starting position to the input end position detected by the position detecting unit.
- the average value of the evaluation values is calculated and the calculated average evaluation value is stored in the data table.
- the evaluation value to be stored in the data table may be calculated from one of the above two evaluation values.
- the method for obtaining the evaluation value is not limited to the above mentioned method and various methods can be adopted as long as the input manner of touch input can be evaluated.
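The averaging of the two evaluation values described above might look like this (using either value alone, as the text also allows, is a one-line change):

```python
def combined_evaluation(center_eval, slide_eval):
    """Average of the two per-character evaluation values (step S203);
    the result is stored in the data table for each input target
    character."""
    return (center_eval + slide_eval) / 2
```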
- the CPU 111 next determines whether there is an input target character having an evaluation value less than a threshold value with reference to the data table in FIG. 19 (step S 204 ).
- the threshold value is set to “0.5” in the embodiment, for example; however, the threshold value can be set to any appropriate value.
- If it is determined that there is an input target character having an evaluation value less than the threshold value (step S 204 : YES), the CPU 111 extracts the three input target characters having the smallest evaluation values (step S 205 ).
- the three input target characters having the smallest evaluation values are X, P and R, and the characters are extracted as correction target characters. These input target characters are extracted in the embodiment since they are highly likely to be incorrect.
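Steps S204 and S205 amount to a threshold test followed by selecting the lowest-scoring characters; below is a sketch with an illustrative table (the scores are invented for the “XOMPUTER” example):

```python
def extract_correction_targets(table, threshold=0.5, count=3):
    """If any input target character scores below the threshold (step
    S204), return the `count` entries with the smallest evaluation
    values (step S205); otherwise return an empty list."""
    if not any(row["eval"] < threshold for row in table):
        return []
    return sorted(table, key=lambda row: row["eval"])[:count]

scores = [0.1, 0.9, 0.8, 0.3, 0.9, 0.7, 0.9, 0.2]   # invented values
table = [{"char": c, "eval": v} for c, v in zip("XOMPUTER", scores)]
assert [t["char"] for t in extract_correction_targets(table)] == ["X", "R", "P"]
```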
- the CPU 111 displays a correction candidate button on the character input screen (step S 206 ). Specifically, as shown in FIG. 22 , the CPU 111 highlights the three input target characters X, P and R having the smallest evaluation values in the character string “XOMPUTER” displayed in the character display region CR so that they can be recognized as characters to be corrected, for example.
- the CPU 111 displays correction candidate buttons TS1, TS2 and TS3 respectively corresponding to the characters X, P and R near the character string “XOMPUTER”.
- the correction candidate button TS1 consists of character buttons X and C as replacement character candidates corresponding to the character X.
- the correction candidate button TS2 consists of character buttons O and P as replacement character candidates corresponding to the character P.
- the correction candidate button TS3 consists of character buttons D, F and R as replacement character candidates corresponding to the character R.
- the touched positions are respectively displayed so that the user can recognize them. Alternatively, the touched positions may not be displayed.
- the CPU 111 determines whether any of the character buttons constituting the correction candidate buttons TS1 to TS3 is touched (step S 207 ).
- If it is determined that any of the character buttons constituting the correction candidate buttons TS1 to TS3 is touched (step S 207 : YES), the CPU 111 replaces the input target character corresponding to the touched character button with the character corresponding to the touched character button (step S 208 ).
- the input target character X is corrected to C as shown in FIG. 23 .
- the input target character stored in the data table is also corrected from X to C.
- the CPU 111 rewrites the evaluation value of the input target character which is the correction target to the largest value (step S 209 ), and thereafter executes the processing of step S 204 .
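Steps S208 and S209 can be sketched as a replacement plus rewriting the evaluation value to the largest value, so the corrected character is not extracted again on the next pass through step S204:

```python
def apply_correction(table, index, new_char, max_eval=1.0):
    """Replace the correction target character (step S208) and rewrite
    its evaluation value to the largest value (step S209)."""
    table[index]["char"] = new_char
    table[index]["eval"] = max_eval

table = [{"char": "X", "eval": 0.1}, {"char": "O", "eval": 0.9}]
apply_correction(table, 0, "C")        # X is corrected to C (FIG. 23)
assert table[0] == {"char": "C", "eval": 1.0}
```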
- If it is determined that none of the character buttons is touched (step S 207 : NO), the CPU 111 determines whether the correction button P is touched (step S 210 ).
- the determination can be made according to whether a touch operation is performed with respect to the correction button P displayed on the right of the correction candidate button TS3 displayed in the character display region CR on the character input screen DP.
- If it is determined that the correction button P is touched (step S 210 : YES), the CPU 111 ends the display of the correction candidate buttons TS1 to TS3 (step S 211 ), thereafter executes the correction mode for correcting a character other than the character which is the correction target (step S 212 ), and ends the processing.
- In the correction mode, correction can be performed by moving the cursor forward or backward with the cursor keys and operating keys such as the BS key in the software keyboard KB in FIG. 17A , for example.
- If it is determined that the correction button P is not touched (step S 210 : NO), the CPU 111 determines whether the confirmation button Q is touched (step S 213 ). Specifically, as shown in FIG. 22 , the determination can be made according to whether a touch operation is performed with respect to the confirmation button Q displayed on the right of the correction candidate button TS3 displayed in the character display region CR on the character input screen DP.
- If it is determined that the confirmation button Q is touched (step S 213 : YES), the CPU 111 ends the display of the correction candidate buttons TS1 to TS3 to confirm the input (step S 214 ), and ends the processing.
- If it is determined that the confirmation button Q is not touched (step S 213 : NO), the CPU 111 executes the processing of step S 207 .
- If it is determined that there is no input target character having an evaluation value less than the threshold value (step S 204 : NO), the CPU 111 executes the processing in step S 212 .
- the number of characters to be extracted as the correction target characters can be appropriately set. All the input target characters having the evaluation values less than the threshold value may be extracted as the correction target characters.
- When correction is performed, control may be performed so as to change the ranges of the detection regions corresponding to the characters before and after the correction.
- the CPU 111 changes the ranges of the X detection region 131 x corresponding to the X key 121 x and the C detection region 131 c corresponding to the C key 121 c from those shown in FIG. 24A to the ranges shown in FIG. 24B , so as to reduce the range of the X detection region 131 x and enlarge the range of the C detection region 131 c.
- the ranges of detection regions may be changed either when the correction is performed once or when the correction is performed a plurality of times.
- the change amount of the ranges of detection regions may be variable according to the number of times of correction.
- the change amount of the ranges of detection regions may be variable according to the evaluation value.
- the change amount of the ranges of detection regions may be variable according to the touched position.
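The detection-region adjustment of FIG. 24 can be sketched one-dimensionally as moving the shared boundary between the two keys; the rectangle model, step size, and scaling by correction count are illustrative assumptions:

```python
def rebalance_regions(wrong, right, corrections=1, step=2.0):
    """Shrink the wrongly-hit region and enlarge the corrected-to region
    (FIG. 24A -> FIG. 24B). `wrong` is assumed to lie directly left of
    `right`, sharing a boundary; the shift may grow with the number of
    corrections, as the text allows."""
    shift = step * corrections
    wrong["x_max"] -= shift      # e.g. reduce the X detection region 131x
    right["x_min"] -= shift      # e.g. enlarge the C detection region 131c

x_region = {"x_min": 0.0, "x_max": 40.0}    # X key
c_region = {"x_min": 40.0, "x_max": 80.0}   # C key
rebalance_regions(x_region, c_region)
assert x_region["x_max"] == c_region["x_min"] == 38.0
```

Making the shift depend on the evaluation value or the touched position, as the text also suggests, only changes how `shift` is computed.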
- the touch panel 3 ( 103 ) integrally includes the display unit 3 a ( 103 a ) for displaying a screen and the input unit 3 b ( 103 b ) which receives the touch input of a position on the screen displayed on the display unit 3 a ( 103 a ).
- the CPU 11 ( 111 ) displays the character input screen DP having the character display region CR on the display unit 3 a ( 103 a ), associates the software keyboard KB including a plurality of characters with the touch panel 3 ( 103 ), and displays the character in the software keyboard KB corresponding to the position for which the touch input is performed via the input unit 3 b in the character display region CR as the input target character.
- the CPU 11 ( 111 ) obtains an evaluation value for each of the input target characters on the basis of the input manner of the touch input via the input unit 3 b ( 103 b ).
- the CPU 11 ( 111 ) determines the correction target character from among the input target characters on the basis of the evaluation value obtained for each of the input target characters.
- the CPU 11 ( 111 ) displays the determined correction target character on the display unit 3 a ( 103 a ) so as to be distinguishable. As a result, the user can recognize the incorrect input rapidly.
- the CPU 11 ( 111 ) calculates the distance from the central position of the touch detection region of the input target character corresponding to the position where the touch input is performed via the input unit 3 b ( 103 b ) to the touched position.
- the CPU 11 ( 111 ) obtains the evaluation value for each of the input target characters on the basis of the calculated distance. As a result, incorrect input can be appropriately detected.
- the CPU 11 ( 111 ) detects the input starting position where the touch input via the input unit 3 b ( 103 b ) starts and the input end position where the touch input ends after the slide operation from the input starting position for each of the input target characters.
- the CPU 11 ( 111 ) obtains the evaluation value for each of the input target characters on the basis of the detection result. As a result, incorrect input can be appropriately detected.
- the CPU 11 ( 111 ) detects the length of slide operation from the input starting position to the input end position which were detected.
- the CPU 11 ( 111 ) obtains the evaluation value for each of the input target characters on the basis of the detected length. As a result, incorrect input can be appropriately detected.
- the CPU 11 detects the angle of the straight line connecting the input starting position to the input end position.
- the CPU 11 obtains the evaluation value for each of the characters on the basis of the detected angle. As a result, incorrect input can be appropriately detected.
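The angle-based evaluation can be sketched by comparing the angle of the line from the input starting position to the input end position against the ideal flick direction of the selected key; the 45-degree tolerance and the linear falloff are assumptions:

```python
import math

def angle_evaluation(start, end, ideal_deg, tol_deg=45.0):
    """Evaluation value in [0, 1] from the angle of the straight line
    connecting the input starting position to the input end position:
    1.0 when the flick follows the key's ideal direction, falling
    linearly to 0.0 at tol_deg of deviation (an assumed tolerance)."""
    angle = math.degrees(math.atan2(end[1] - start[1], end[0] - start[0]))
    deviation = abs((angle - ideal_deg + 180.0) % 360.0 - 180.0)
    return max(0.0, 1.0 - deviation / tol_deg)

# a horizontal rightward flick matches an ideal direction of 0 degrees
assert angle_evaluation((0, 0), (10, 0), 0.0) == 1.0
```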
- the CPU 11 determines at least the input target character having the smallest evaluation value as the correction target character. As a result, the user can recognize the input target character which is highly likely to be incorrect more precisely.
- the CPU 11 ( 111 ) receives correction input of an input target character by the touch input via the input unit 3 b ( 103 b ) to correct the input target character which was determined to be the correction target character.
- the CPU 11 ( 111 ) replaces the input target character which was determined to be the correction target character and is displayed on the display unit 3 a ( 103 a ) with the input target character for which correction input is received.
- the CPU 11 ( 111 ) displays replacement character candidates corresponding to the input target character determined to be the correction target character near the correction target character displayed in the character display region CR.
- the CPU 11 ( 111 ) receives the character of the replacement character candidate as the input target character to replace the correction target character.
- the CPU 11 ( 111 ) enlarges the touch detection range of the input target character for which correction input was received. As a result, incorrect input thereafter can be suppressed.
- When correction input is received for a character a predetermined number of times, the CPU 11 ( 111 ) enlarges the touch detection range of the input target character for which correction input was received. As a result, the touch detection range can be enlarged appropriately according to the manner of the user's touch operation.
- the input target character having the smallest evaluation value is extracted as the correction target character and displayed when there is an evaluation value less than the threshold value.
- the input target character having the smallest evaluation value may be extracted as the correction target character and displayed regardless of whether the evaluation value is less than the threshold value.
- the correction candidate button is displayed on the character input screen to perform correction input of a character.
- the correction input may be performed via the software keyboard KB without displaying the correction candidate button.
- a non-volatile memory such as a flash memory and a portable recording medium such as a CD-ROM can also be applied in addition to the ROM, hard disk and such like.
- a carrier wave is also applicable as a medium for providing the program data via a predetermined communication line.
Abstract
A character input device, including: a touch panel which integrally includes a display unit and an input unit; a first control unit which displays a character input screen having a character display region on the display unit, associates a keyboard including a plurality of characters with the touch panel and displays a character in the keyboard corresponding to a position where a touch input is performed via the input unit in the character display region as an input target character; an evaluation unit which obtains an evaluation value for the input target character on the basis of an input manner; a determination unit which determines whether the input target character is a correction target character on the basis of the evaluation value; and a second control unit which displays the correction target character on the display unit so as to be distinguishable.
Description
- The present invention relates to a character input device and a computer readable recording medium.
- In recent years, an increasing number of mobile terminals such as smartphones and tablet terminals can be operated via touch panels. In such a device, a software keyboard is displayed on the display screen, and a character can be input by touching or flicking a character button on the software keyboard.
- Such a software keyboard is displayed as an image simulating an actual JIS (Japanese Industrial Standards) keyboard or a keypad used in a mobile phone.
- An actual keyboard is provided with projections on the “F” key and the “J” key, which enables a user to recognize what is called a home position only by the sense of touch and facilitates touch typing. In a mobile phone, a projection is provided on the “5” key in the keypad, which facilitates tactile operations.
- On the other hand, since the software keyboard is not provided with such projection, a character is input by a user touching the position of the character in the software keyboard displayed on the screen while visually recognizing the position. Accordingly, since the user cannot have the tactile sense as in the actual keyboard, there is a problem of incorrect input due to the user touching a position deviated from a desired position.
- In view of such problem, a conventional character input device has a configuration that when a key input by a user is detected at the software keyboard including a plurality of keys, keys adjacent to the detected key are extracted, the detected key and the extracted keys are stored, conversion candidates are generated from the stored keys and the generated conversion candidates are displayed on the display screen (for example, Japanese Patent Application Laid Open Publication No. 2013-47872).
- However, in the invention described in Japanese Patent Application Laid Open Publication No. 2013-47872, all the conversion candidates which meet the conditions are displayed on the display screen regardless of whether the key input is correct, and one of the displayed conversion candidates is to be selected. Thus, such a device requires the troublesome task of checking the displayed conversion candidates one by one to find the desired candidate. In addition, whether the input is wrong cannot be recognized rapidly, leading to poor operability.
- An object of the present invention is to make incorrect input recognized rapidly.
- According to one aspect of the present invention, there is provided a character input device, including: a touch panel which integrally includes a display unit for displaying a screen and an input unit for receiving a touch input at a position in the screen displayed on the display unit; a first control unit which displays a character input screen having a character display region on the display unit, associates a keyboard including a plurality of characters with the touch panel and displays a character in the keyboard corresponding to the position where the touch input is performed via the input unit in the character display region as an input target character; an evaluation unit which obtains an evaluation value for the input target character on the basis of an input manner in which the touch input is performed via the input unit; a determination unit which determines whether the input target character is a correction target character on the basis of the evaluation value for the input target character obtained by the evaluation unit; and a second control unit which displays the correction target character determined by the determination unit on the display unit so as to be distinguishable.
- According to another aspect of the present invention, there is provided a non-transitory computer readable medium that stores a program for making a computer execute a procedure, the computer being a character input device with a touch panel which integrally includes a display unit for displaying a screen and an input unit for receiving a touch input at a position in the screen displayed on the display unit, the procedure including: controlling so as to display a character input screen having a character display region on the display unit, associate a keyboard including a plurality of characters with the touch panel and display a character in the keyboard corresponding to the position where the touch input is performed via the input unit in the character display region as an input target character; obtaining an evaluation value for the input target character on the basis of an input manner in which the touch input is performed via the input unit; determining whether the input target character is a correction target character on the basis of the obtained evaluation value for the input target character; and displaying the determined correction target character on the display unit so as to be distinguishable.
- According to the present invention, it is possible to make incorrect input recognized rapidly.
- The above and other objects, advantages and features of the present invention will become more fully understood from the detailed description given hereinafter and the appended drawings which are given by way of illustration only, and thus are not intended as a definition of the limits of the present invention, and wherein:
-
FIG. 1 is a front view showing an outer appearance of an information terminal device according to a first embodiment; -
FIG. 2 is a block diagram showing a schematic configuration of the information terminal device according to the first embodiment; -
FIG. 3 is a view for explaining an example of a character input screen; -
FIG. 4A is a view for explaining an example of the character input screen; -
FIG. 4B is a view for explaining an example of the character input screen; -
FIG. 5 is a view for explaining a detection region; -
FIG. 6 is a flowchart explaining input processing; -
FIG. 7 is a view showing a configuration of a data table; -
FIG. 8A is a view for explaining a calculation procedure of an evaluation value; -
FIG. 8B is a view for explaining the calculation procedure of the evaluation value; -
FIG. 9A is a view for explaining a calculation procedure of an evaluation value; -
FIG. 9B is a view for explaining the calculation procedure of the evaluation value; -
FIG. 10A is a view for explaining a calculation procedure of an evaluation value; -
FIG. 10B is a view for explaining the calculation procedure of the evaluation value; -
FIG. 11 is a view for explaining an example of the character input screen; -
FIG. 12 is a view for explaining an example of the character input screen; -
FIG. 13 is a view for explaining an example of the character input screen; -
FIG. 14 is a view for explaining the detection region; -
FIG. 15 is a front view showing an outer appearance of an information terminal device according to a second embodiment; -
FIG. 16 is a block diagram showing a schematic configuration of the information terminal device according to the second embodiment; -
FIG. 17A is a view for explaining an example of a character input screen; -
FIG. 17B is a view for explaining an example of the character input screen; -
FIG. 18 is a flowchart explaining input processing; -
FIG. 19 is a view for explaining a configuration of a data table; -
FIG. 20A is a view explaining a calculation procedure of an evaluation value; -
FIG. 20B is a view explaining the calculation procedure of the evaluation value; -
FIG. 21A is a view explaining a calculation procedure of an evaluation value; -
FIG. 21B is a view explaining the calculation procedure of the evaluation value; -
FIG. 22 is a view for explaining an example of a character input screen; -
FIG. 23 is a view for explaining an example of the character input screen; -
FIG. 24A is a view for explaining a detection region; and -
FIG. 24B is a view for explaining the detection region. - Hereinafter, preferred embodiments according to the present invention will be described with reference to the drawings. Though the after-mentioned embodiments are provided with various technically preferred limitations to perform the present invention, the scope of the present invention is not limited to the following embodiments and illustrated examples.
- First, a configuration of an information terminal device according to the first embodiment of the present invention will be described.
- As shown in
FIG. 1 , theinformation terminal device 1 is a smartphone having a telephone function, for example. Theinformation terminal device 1 includes a thin plate-likemain body 2 and atouch panel 3 arranged on a surface of themain body 2. - The
touch panel 3 has, in an integrated manner, adisplay unit 3 a as a display unit for displaying an image and aninput unit 3 b as an input unit which is arranged on the entire surface of the display screen of thedisplay unit 3 a and touched by a finger, a stylus pen or such like to directly perform input (seeFIG. 2 ). - A
speaker 4 for listening is provided above thedisplay unit 3 a and amicrophone 5 for speaking is provided below thedisplay unit 3 a. - A
power button 6 for turning on and off theinformation terminal device 1 is arranged on the upper end surface of themain body 2, andvolume buttons - The
information terminal device 1 has a function as a character input device for inputting characters in addition to communication and telephone functions and such like. - As shown in
FIG. 2 , theinformation terminal device 1 is configured by including a CPU (Central Processing Unit) 11, a RAM (Random Access Memory) 12, a ROM (Read Only Memory) 13, aflash memory 14 and acommunication unit 15 in addition to the above-mentionedtouch panel 3, which are connected to each other via abus 16. - The
CPU 11 reads out a system program stored in theROM 13, opens the system program into a work area in theRAM 12 and controls the units in accordance with the system program. TheCPU 11 reads out a processing program stored in theROM 13 to open it into the work area and executes various types of processing. - The
display unit 3 a in the touch panel 3 includes an LCD (Liquid Crystal Display) or such like and displays a display screen such as a character input screen in accordance with an instruction by a display signal which is input from the CPU 11 . That is, the CPU 11 functions as a display control unit which controls display in the display unit 3 a . - The
input unit 3 b receives a position input on the display screen of thedisplay unit 3 a by a finger or a stylus pen, and outputs the position (coordinates) information to theCPU 11. - The
RAM 12 is a volatile memory and forms a work area for temporarily storing various programs to be executed and data according to the various programs. - The
ROM 13 is a read only memory and stores the programs for executing various types of processing and data to be read out. The programs are stored in theROM 13 in a form of computer readable program code. - The
flash memory 14 is a non-volatile memory storing information so as to be readable and writable. - The
communication unit 15 sends and receives data for telephone and communication with outside. Theinformation terminal device 1 is configured to be connectable to a communication network including the Internet via thecommunication unit 15. - In the
information terminal device 1 in the embodiment, as shown inFIG. 3 , characters can be input by using a software keyboard KB displayed on a character input screen DP of thedisplay unit 3 a, for example. - Specifically, when a character is to be input, a character display region CR is formed at the upper part of the character input screen DP, the character can be input by touching and flicking (sliding) the software keyboard KB displayed at the lower part of the character input screen DP, and the input character is displayed in the character display region CR.
- The software keyboard KB includes kana keys 21 a to 21 j for inputting kana characters (here, kana are Japanese syllables), an
option key 22 for adding a dull sound/p-sound symbol to the input kana character and converting the kana character into a small letter, amark key 23 for inputting a mark, aspace key 24 and anenter key 25. ¥ -
FIG. 4A . Then, the characters (ni), (nu), (ne) and (no) composing the (na) column are highlighted in the respective four sides of the region displaying the character (na), and the other keys are grayed out to indicate they are invalid. - When a flick operation is performed in the right direction where (ne) is displayed with the finger touching the
touch panel 3 and the finger is lifted off thetouch panel 3 at the position of (ne), the character (ne) is input and displayed in the character display region CR. In the embodiment, characters can be input in such way. The detection region of each character will be described with reference toFIG. 5 . When the kana key 21 e is touched, (na)detection region 31 a which is the touch detection range of the character (na) is set at the region superposing the region displaying (na). - The (ni) detection region 31 b, (nu) detection region 31 c, (ne)
detection region 31 d and (no)detection region 31 e which are the respective touch detection ranges of (ni), (nu), (ne) and (no) are set to the left, upper, right and lower sides of the (na)detection region 31 a, respectively. Each of the (ni) detection region 31 b, (nu) detection region 31 c, (ne)detection region 31 d and (no)detection region 31 e is formed in a nearly trapezoidal shape with an increasing area outwardly. -
- The characters (ni), (nu) and (no) can also be input by the same operation as the character (ne). The character (na) can be input by touching the (na)
detection region 31 a and thereafter lifting the finger off the (na)detection region 31 a without the flick operation. The characters in the other columns can also be input similarly. - There are some cases in which a user flicks wrongly and inputs an unintended character. For example, as shown in
FIG. 4B , when a user intends to flick the (ne)detection region 31 d to input the character (ne), the user may accidentally flick the (no)detection region 31 e and inputs the character (no). - In the embodiment, since the manner of user's touch operation is evaluated and the input character which is highly likely to be wrong is displayed so as to be distinguishable as described later, the user can recognize wrong input rapidly.
- Next, input processing executed by the
CPU 11 in theinformation terminal device 1 configured as described above will be described with reference toFIG. 6 . The input processing is executed when a user inputs characters, for example. - First, the
CPU 11 stores input data regarding an operation via the touch panel 3 (step S101). Specifically, theCPU 11 stores data regarding input start coordinates which are the coordinates of the position where the user starts the input by touching thetouch panel 3, input end coordinates which are the coordinates of the position where the user ends the input by the touch operation of thetouch panel 3 and the trace of the flick operation from the input start coordinates to the input end coordinates in a predetermined region of theRAM 12 in a form of data table as shown inFIG. 7 . - That is, the
CPU 11 functions as a position detecting unit which detects the input starting position where the touch input is started by an input unit and the input end position where the touch input ends after a slide operation from the input starting position for each input target character. - Next, the
CPU 11 determines the input target character from input data stored in the RAM 12 (step S102). - Specifically, the
CPU 11 specifies the detection region on the basis of the input start coordinates and the input end coordinates in the data table stored in theRAM 12 and thereby determines the input target character. - For example, the input target character is (ne) when the input start coordinates belong to the (na)
detection region 31 a and the input end coordinates belong to the (ne)detection region 31 d inFIG. 5 . The determined input target character is stored in the data table shown inFIG. 7 and displayed in the character display region CR on the character input screen DP. - Next, the
CPU 11 calculates an evaluation value on the basis of the input data stored in theRAM 12 and stores the evaluation value in a predetermined field of the data table shown inFIG. 7 (step S103). The evaluation value is obtained for each input target character. - Here, the calculation procedure of the evaluation value in the embodiment will be described specifically.
- First, the
CPU 11 obtains the distance from the center of the detection region of the character to which the input start coordinates S(x, y) belong, to the input start coordinates S(x, y), as shown in FIG. 8A, and calculates an evaluation value on the basis of the obtained distance. - The evaluation value is calculated with reference to a conversion table such as an LUT (Look Up Table) as shown in
FIG. 8B, for example. - In the conversion table, the evaluation value ranges from 0 to 1 and is weighted so as to be large up to a certain distance from the center and significantly smaller beyond that distance.
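A minimal sketch of this first evaluation follows; the conversion-table breakpoints, the region size and the names are assumptions chosen only to mirror the shape described above (large up to a certain distance, then sharply smaller):

```python
import math

SIDE = 50.0  # assumed side length (t) of a square detection region

# Assumed conversion table: (fraction of t/2, evaluation value) breakpoints.
CONVERSION = [(0.0, 1.0), (0.5, 0.9), (0.75, 0.4), (1.0, 0.1)]

def lookup(table, x):
    """Piecewise-linear interpolation over (x, value) breakpoints."""
    for (x0, v0), (x1, v1) in zip(table, table[1:]):
        if x0 <= x <= x1:
            return v0 + (v1 - v0) * (x - x0) / (x1 - x0)
    return table[-1][1]

def center_distance_score(center, start):
    """Evaluation from the distance between the region center and the
    input start coordinates S(x, y); the distance runs from 0 to t/2."""
    d = math.dist(center, start)
    frac = min(d / (SIDE / 2), 1.0)
    return lookup(CONVERSION, frac)
```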
- The distance ranges from 0 to half the side length (t) of the detection region. The conversion table is not limited to that shown in
FIG. 8B, and various conversion tables can be adopted. For example, the evaluation value may decrease linearly with the distance. - Thus, the
CPU 11 functions as a distance calculation unit which calculates the distance to the touched position from the central position in the touch detection range of the input target character corresponding to the position where the touch input is performed via the input unit. - Second, the
CPU 11 obtains the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y) as shown in FIG. 9A and calculates an evaluation value on the basis of the obtained distance. - The evaluation value is calculated with reference to the conversion table as shown in
FIG. 9B, for example. In the conversion table, the evaluation value ranges from 0 to 1 and varies with the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y). - According to the conversion table, when the input target character is な (na), for example, the evaluation value is larger for a smaller distance, since a smaller distance from the input start coordinates S(x, y) to the input end coordinates E(x, y) is more preferable. When the input target character is に (ni), ぬ (nu), ね (ne) or の (no), the evaluation value is larger as the distance approaches the side length (t) of the detection region corresponding to the character な (na), since it is more preferable that the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y) be close to the side length (t).
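This second evaluation can be sketched as follows; the linear shapes are one of the "various tables" permitted below, and the side length and function names are assumptions:

```python
import math

T = 50.0  # assumed side length (t) of the detection region

def slide_distance_score(start, end, is_center_character):
    """Evaluation from the slide distance between S(x, y) and E(x, y).
    For the center character (na) a short slide scores high; for the
    flick targets (ni, nu, ne, no) a slide close to t scores high."""
    d = math.dist(start, end)
    if is_center_character:
        return max(0.0, 1.0 - d / T)
    return max(0.0, 1.0 - abs(d - T) / T)
```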
- The conversion table is not limited to that of
FIG. 9B and various types can be adopted. For example, the evaluation value may increase or decrease linearly with the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y). - Thus, the
CPU 11 functions as a position detecting unit which detects the input starting position where the touch input via the input unit is started and the input end position where the touch input ends after a slide operation from the input starting position for each input target character. - Third, the
CPU 11 obtains an angle θ between the straight line passing through the input start coordinates S(x, y) and the input end coordinates E(x, y) and the horizontal line, as shown in FIG. 10A, and calculates an evaluation value on the basis of the obtained angle θ. - The evaluation value is calculated with reference to a conversion table as shown in
FIG. 10B, for example. In the conversion table, the evaluation value ranges from 0 to 1 and is converted according to the angle θ. - That is, the evaluation value is larger as the straight line passing through the input start coordinates S(x, y) and the input end coordinates E(x, y) is closer to the horizontal or vertical line, and smaller as the straight line is inclined more.
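The third evaluation can be sketched in the same style; the linear falloff toward 45 degrees is an assumption consistent with the description (largest when the slide is horizontal or vertical, smallest when fully diagonal):

```python
import math

def angle_score(start, end):
    """Evaluation from the angle θ of the line through S(x, y) and E(x, y)."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    theta = math.degrees(math.atan2(abs(dy), abs(dx)))  # 0..90 degrees
    off_axis = min(theta, 90.0 - theta)                 # 0 on-axis .. 45 diagonal
    return 1.0 - off_axis / 45.0
```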
- The conversion table is not limited to that of
FIG. 10B, and various conversion tables can be adopted. For example, the evaluation value may decrease or increase linearly in accordance with the angle θ. - In such way, the
CPU 11 functions as an angle detecting unit which detects the angle of the straight line connecting the input starting position and the input end position detected by the position detecting unit. - After the three evaluation values are calculated as mentioned above, the average value of the evaluation values is obtained and the obtained average evaluation value is stored in the data table.
- The evaluation value to be stored in the data table may be obtained from one or two of the above three evaluation values.
- The method for obtaining the evaluation value is not limited to the above mentioned methods, and various methods can be adopted as long as it can evaluate the input manner in which the touch input is performed.
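Combining the values as described (the average of up to three scores) reduces to a small helper; this is a sketch, and the name is an assumption:

```python
def combined_evaluation(scores):
    """Average 1-3 per-character evaluation values, each in [0, 1]."""
    return sum(scores) / len(scores)
```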
- In such way, the
CPU 11 functions as an evaluation unit which obtains the evaluation value for each of the input target characters on the basis of the input manner of touch input via the input unit. - Returning to
FIG. 6, the CPU 11 determines whether any evaluation value of the input target characters is less than a threshold value with reference to the data table shown in FIG. 7 (step S104). Though the threshold value is set to "0.5" in the embodiment, for example, the threshold value may be set to any appropriate value. - If it is determined that there is an input target character having an evaluation value less than the threshold value (step S104: YES), the
CPU 11 extracts the input target character which has the smallest evaluation value as a correction target character (step S105). Since the input target character having the smallest evaluation value is most likely to be incorrect, this input target character is extracted in the embodiment. - In such way, the
CPU 11 functions as a correction determination unit which determines the correction target character from among the input target characters on the basis of the evaluation values for the input target characters obtained by the evaluation unit. - Then, the
CPU 11 displays a correction candidate button on the character input screen (step S106). Specifically, when the input target character having the smallest evaluation value is の (no) as shown in FIG. 7, for example, the CPU 11 highlights the character の (no) in the character string of "" displayed in the character display region CR so as to be recognized as the character to be corrected, as shown in FIG. 11. A correction candidate button TS corresponding to the character の (no) is displayed near the character string of "". The correction candidate button TS includes character buttons for な (na), に (ni), ぬ (nu), ね (ne) and の (no) as replacement character candidates corresponding to the character の (no).
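The threshold check and extraction of steps S104 and S105 can be sketched as follows; the data-table representation (a list of character/value pairs), the names and the sample values are assumptions:

```python
THRESHOLD = 0.5  # value used in the embodiment

def extract_correction_target(data_table):
    """data_table: list of (input_target_character, evaluation_value) rows.
    Return the character with the smallest evaluation value if any value
    is below the threshold (steps S104-S105), else None."""
    below = [row for row in data_table if row[1] < THRESHOLD]
    if not below:
        return None
    return min(below, key=lambda row: row[1])[0]
```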
CPU 11 determines whether any of the character buttons forming the correction candidate button TS is touched (step S107). If it is determined that any of the character buttons forming the correction candidate button TS is touched (step S107: YES), the CPU 11 replaces the input target character to be corrected with the character corresponding to the touched character button (step S108). For example, when the character button ね (ne) in the correction candidate button TS is touched in FIG. 11, the input target character の (no) is corrected to ね (ne) as shown in FIG. 12. The input target character stored in the data table is also corrected from の (no) to ね (ne). - In such way, the
CPU 11 functions as a correction input receiving unit which receives a correction input of an input target character made by the touch input via the input unit to correct the input target character determined as the correction target character. - The
CPU 11 rewrites the evaluation value of the input target character to be corrected to the largest value (step S109), and thereafter executes the processing of step S104. Specifically, the CPU 11 rewrites the evaluation value stored in the data table and corresponding to the corrected input target character to "1", which is the largest value. - On the other hand, if it is not determined that any of the character buttons forming the correction candidate button TS is touched (step S107: NO), the
CPU 11 determines whether the correction button P is touched (step S110). - Specifically, as shown in
FIG. 11, the CPU 11 can perform the determination according to whether a touch operation is performed with respect to the correction button P which is displayed on the right of the correction candidate button TS displayed in the character display region CR on the character input screen DP. - If it is determined that the correction button P is touched (step S110: YES), the
CPU 11 ends display of the correction candidate button TS (step S111), thereafter executes a correction mode for correcting a character other than the correction target character (step S112), and ends the processing. - In the correction mode, for example, the character can be corrected by moving the cursor key forward or backward and operating the “DEL” key or the like in the software keyboard KB in
FIG. 3. - On the other hand, if it is not determined that the correction button P is touched in step S110 (step S110: NO), the
CPU 11 determines whether the confirmation button Q is touched (step S113). - Specifically, as shown in
FIG. 11, the determination can be made according to whether a touch operation is performed with respect to the confirmation button Q which is displayed on the right of the correction candidate button TS displayed in the character display region CR on the character input screen DP. - If it is determined that the confirmation button Q is touched (step S113: YES), the
CPU 11 ends display of the correction candidate button TS, confirms the input (step S114), and thereafter ends the processing. - On the other hand, if it is not determined that the confirmation button Q is touched (step S113: NO), the
CPU 11 executes the processing in step S107. - In step S104, if it is not determined that there is an input target character having an evaluation value less than the threshold value (step S104: NO), the
CPU 11 executes the processing of step S112. - When characters are input continuously, the input processing is activated again.
- Though the input target character having the smallest evaluation value is extracted as the correction target character in the embodiment, the number of correction target characters to be extracted is not limited to one and can be appropriately set. All the input target characters having evaluation values less than the threshold value may be extracted as correction target characters.
- In the embodiment, when correction is performed, the
CPU 11 may function as a detection range enlargement unit which controls so as to change the ranges of the detection regions corresponding to the characters before and after the correction. - Specifically, when the input target character is corrected, in a case where the input target character before correction is の (no) and the input target character after the correction is ね (ne), the
CPU 11 changes the ranges of the detection regions so as to enlarge the range of the ね (ne) detection region 31 d and reduce the range of the の (no) detection region 31 e, as shown in FIG. 14. - Thus, incorrect input by a user can be further prevented.
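A sketch of this detection-range adjustment, reduced to one dimension for clarity; the interval representation, the step size and the function name are assumptions:

```python
STEP = 4  # assumed amount by which the shared boundary is moved

def adjust_regions(regions, before, after):
    """Enlarge the detection region of the corrected-to character ('after')
    and reduce that of the corrected-from character ('before') by moving
    their shared boundary. regions maps a name to a (lo, hi) interval."""
    b_lo, b_hi = regions[before]
    a_lo, a_hi = regions[after]
    if a_hi == b_lo:      # 'after' lies just before 'before'
        regions[after] = (a_lo, a_hi + STEP)
        regions[before] = (b_lo + STEP, b_hi)
    elif b_hi == a_lo:    # 'after' lies just after 'before'
        regions[after] = (a_lo - STEP, a_hi)
        regions[before] = (b_lo, b_hi - STEP)
    return regions

# Correcting "no" to "ne" enlarges the "ne" region at the expense of "no".
regions = {"ne": (100, 150), "no": (150, 200)}
adjust_regions(regions, before="no", after="ne")
```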
- In such way, the
CPU 11 functions as the detection range enlargement unit which enlarges the touch detection range of the input target character for which the correction input is received by the correction input receiving unit. - The ranges of detection regions may be changed when a character is corrected once or when a character is corrected a plurality of times. The change amount of the ranges of detection regions may be variable according to the number of times the correction is performed.
- The change amount of the ranges of detection regions may be variable according to the input start coordinates, input end coordinates and the trace. The change amount of the ranges of detection regions may also be variable according to the evaluation value.
- Next, the configuration of an information terminal device according to the second embodiment of the present invention will be described.
- As shown in
FIG. 15, the information terminal device 100 is a tablet terminal, for example. The information terminal device 100 includes a thin plate-like main body 102 and a touch panel 103 provided on a surface of the main body 102. The touch panel 103 integrally includes a display unit 103 a for displaying an image and an input unit 103 b which is provided on the entire surface of the display screen of the display unit 103 a and is touched by a finger, a stylus pen or the like to directly perform input (see FIG. 16). - The
information terminal device 100 has a function as a character input device for inputting characters in addition to the communication function and the like. - As shown in
FIG. 16, the information terminal device 100 is configured by including the CPU 111, RAM 112, ROM 113 and flash memory 114 in addition to the above-mentioned touch panel 103, and these units are connected to each other via a bus 116. Since the functions of the touch panel 103, CPU 111, RAM 112, ROM 113 and flash memory 114 are similar to those of the information terminal device 1 in the first embodiment, the detailed description thereof is omitted. - In the
information terminal device 100 in the embodiment, characters can be input by using a software keyboard KB displayed on the character input screen DP of the display unit 103 a, as shown in FIG. 17A, for example. - Specifically, when a character is to be input, the character display region CR is formed in the upper part of the character input screen DP, the character can be input by touching the software keyboard KB displayed in the lower part of the character input screen DP, and the input character is displayed in the character display region CR.
- The software keyboard KB is a software keyboard having QWERTY arrangement with
Roman character keys 121 a to 121 z for inputting Roman characters, shift keys, mark keys, a space key 124 and an enter key 125. In the embodiment, the detection region for each character is set so as to superpose the character key. - Thus, as shown in
FIG. 17A, by touching the C key 121 c, O key 121 o, M key 121 m, P key 121 p, U key 121 u, T key 121 t, E key 121 e and R key 121 r in the software keyboard KB in this order, for example, the touch operation of the detection region of each of the characters is detected, each of the characters C, O, M, P, U, T, E and R is input, and the character string "COMPUTER" is displayed in the character display region CR.
- There is a case where a user performs a wrong touch operation and inputs a character different from the intended character. For example, as shown in
FIG. 17B, when the user performs touch operations with respect to the touch positions R1 to R8 in order so as to input "COMPUTER", "XOMPUTER" is input, since the touch position R1 is on the X key 121 x, which is not the C key 121 c. - That is, though the user intends to touch the
C key 121 c, the X key 121 x is touched, which is an incorrect input. This is due to the difficulty of finding the correct key position by tactile sense, since the software keyboard has no physical irregularities as an actual keyboard does.
- Next, input processing executed by the
CPU 111 in the information terminal device 100 configured as described above will be described with reference to FIG. 18.
- First, the
CPU 111 stores input data regarding the operation with respect to the touch panel 103 (step S201). - Specifically, each set of data regarding input start coordinates and input end coordinates is stored in a predetermined region of the
RAM 112 in the form of a data table as shown in FIG. 19. The input start coordinates and the input end coordinates are obtained by the same method as in the above-mentioned first embodiment. - Next, the
CPU 111 determines an input target character from the input data stored in the RAM 112 (step S202). - Specifically, the
CPU 111 specifies the character detection region to which the input end coordinates in the data table stored in the RAM 112 belong, and thereby determines the input target character. The determined input target character is stored in the data table shown in FIG. 19 and the character is displayed in the character display region CR on the character input screen DP. - Next, the
CPU 111 calculates the evaluation value on the basis of the input data stored in the RAM 112, and stores the evaluation value in a predetermined field of the data table shown in FIG. 19 (step S203). The evaluation value is obtained for each input target character. - Here, the calculation procedure of the evaluation value in the embodiment will be described in detail.
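Before the evaluation details, the determination of step S202 (the character whose detection region contains the input end coordinates) can be sketched as a grid hit test; a real QWERTY layout staggers its rows, so the uniform grid, key size and names here are simplifying assumptions:

```python
KEY_W, KEY_H = 40, 40  # assumed size of each key's detection region
ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]

def character_at(x, y):
    """Return the character whose detection region contains (x, y), or None."""
    row = int(y // KEY_H)
    if not 0 <= row < len(ROWS):
        return None
    col = int(x // KEY_W)
    if not 0 <= col < len(ROWS[row]):
        return None
    return ROWS[row][col]
```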
- First, as shown in
FIG. 20A, the CPU 111 calculates the distance from the center of the character detection region to which the input start coordinates S(x, y) belong, to the input start coordinates S(x, y), and calculates an evaluation value on the basis of the obtained distance. The evaluation value is calculated with reference to the conversion table shown in FIG. 20B, for example. In the conversion table, the evaluation value ranges from 0 to 1 and varies according to the distance from the center of the detection region to the input start coordinates S(x, y).
FIG. 20B and various conversion tables can be adopted. For example, the evaluation value may be linearly decreased according to the distance. - Second, the
CPU 111 obtains the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y) as shown inFIG. 21A , and calculates an evaluation value on the basis of the obtained distance. - That is, the
CPU 111 obtains the evaluation value from the distance of the slide operation from the input start coordinates S(x, y) to the input end coordinates E(x, y). The evaluation value is calculated with reference to the conversion table shown in FIG. 21B, for example. In the conversion table, the evaluation value ranges from 0 to 1 and varies according to the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y).
- The conversion table is not limited to that of
FIG. 21B, and various types can be adopted. For example, the evaluation value may be decreased linearly according to the distance from the input start coordinates S(x, y) to the input end coordinates E(x, y). - The
CPU 111 functions as a length detecting unit for detecting the length of slide operation from the input starting position to the input end position detected by the position detecting unit. - After the two evaluation values are calculated as mentioned above, the average value of the evaluation values is calculated and the calculated average evaluation value is stored in the data table.
- The evaluation value to be stored in the data table may be calculated from one of the above two evaluation values.
- The method for obtaining the evaluation value is not limited to the above mentioned method and various methods can be adopted as long as the input manner of touch input can be evaluated.
- Returning to
FIG. 18, the CPU 111 next determines whether there is an input target character having an evaluation value less than a threshold value with reference to the data table in FIG. 19 (step S204). Though the threshold value is set to "0.5" in the embodiment, for example, the threshold value can be set to any appropriate value. - If it is determined that there is an input target character having an evaluation value less than the threshold value (step S204: YES), the
CPU 111 extracts three input target characters having the smallest evaluation values (step S205). - For example, as shown in
FIG. 19, the three input target characters having the smallest evaluation values are X, P and R, and these characters are extracted as correction target characters. These input target characters are extracted in the embodiment since they are highly likely to be incorrect. - The
CPU 111 displays a correction candidate button on the character input screen (step S206). Specifically, as shown in FIG. 22, the CPU 111 highlights the three input target characters X, P and R having the smallest evaluation values in the character string "XOMPUTER" displayed in the character display region CR so that they can be recognized as characters to be corrected, for example. - The
CPU 111 displays correction candidate buttons TS1, TS2 and TS3 respectively corresponding to the characters X, P and R near the character string “XOMPUTER”. The correction candidate button TS1 consists of character buttons X and C as replacement character candidates corresponding to the character X. - The correction candidate button TS2 consists of character buttons O and P as replacement character candidates corresponding to the character P. The correction candidate button TS3 consists of character buttons D, F and R as replacement character candidates corresponding to the character R. In the correction candidate buttons TS1 to TS3, the touched positions are respectively displayed so as to be recognized. Thus, the user can recognize the touched positions. The touched positions may not be displayed.
- Next, the
CPU 111 determines whether any of the character buttons constituting the correction candidate buttons TS1 to TS3 is touched (step S207). - If it is determined that any of the character buttons constituting the correction candidate buttons TS1 to TS3 is touched (step S207: YES), the
CPU 111 replaces the corresponding input target character with the character of the touched character button (step S208). - For example, in
FIG. 22, when the character button C in the correction candidate button TS1 is touched, the input target character X is corrected to C as shown in FIG. 23. The input target character stored in the data table is also corrected from X to C. - The
CPU 111 rewrites the evaluation value of the input target character which is the correction target to the largest value (step S209), and thereafter executes the processing of step S204. - On the other hand, if it is not determined that any of the character buttons constituting the correction candidate buttons TS1 to TS3 is touched (step S207: NO), the
CPU 111 determines whether the correction button P is touched (step S210). - Specifically, as shown in
FIG. 22, the determination can be made according to whether a touch operation is performed with respect to the correction button P displayed on the right of the correction candidate button TS3 displayed in the character display region CR on the character input screen DP. - If it is determined that the correction button P is touched (step S210: YES), the
CPU 111 ends the display of the correction candidate buttons TS1 to TS3 (step S211), thereafter executes the correction mode for correcting a character other than the character which is the correction target (step S212), and ends the processing. In the correction mode, correction can be performed by moving the cursor key forward or backward and operating keys such as the BS key in the software keyboard KB in FIG. 17A, for example. - On the other hand, if it is not determined that the correction button P is touched (step S210: NO), the
CPU 111 determines whether the confirmation button Q is touched (step S213). Specifically, as shown in FIG. 22, the determination can be made according to whether a touch operation is performed with respect to the confirmation button Q displayed on the right of the correction candidate button TS3 displayed in the character display region CR on the character input screen DP. - If it is determined that the confirmation button Q is touched (step S213: YES), the
CPU 111 ends the display of correction candidate buttons TS1 to TS3 to confirm the input (step S214), and ends the processing. - On the other hand, if it is not determined that the confirmation button Q is touched (step S213: NO), the
CPU 111 executes the processing of step S207. - In step S204, if it is not determined that there is an input target character having an evaluation value less than the threshold value (step S204: NO), the
CPU 111 executes the processing in step S212. - When characters are input continuously, the input processing is activated again.
- Though the three input target characters having the smallest evaluation values are extracted as the correction target characters in the embodiment, the number of characters to be extracted as the correction target characters can be appropriately set. All the input target characters having the evaluation values less than the threshold value may be extracted as the correction target characters.
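The extraction of step S205 can be sketched with a standard selection of the smallest values; the data-table values below are invented for illustration (only the fact that X, P and R score lowest follows FIG. 19), and the names are assumptions:

```python
import heapq

def extract_three_targets(data_table):
    """data_table: list of (input_target_character, evaluation_value).
    Return the three characters with the smallest evaluation values,
    in ascending order of evaluation value."""
    return [c for c, _ in heapq.nsmallest(3, data_table, key=lambda r: r[1])]

# Invented values for the "XOMPUTER" example; X, P and R are the lowest.
table = [("X", 0.2), ("O", 0.9), ("M", 0.8), ("P", 0.3),
         ("U", 0.7), ("T", 0.9), ("E", 0.6), ("R", 0.4)]
```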
- In the embodiment, when correction is performed, control may be performed so as to change the ranges of the detection regions corresponding to the characters before and after the correction. Specifically, when the input target character is corrected, in a case where the input target character before correction is X and the input target character after the correction is C, the
CPU 111 changes the ranges of the X detection region 131 x corresponding to the X key 121 x and the C detection region 131 c corresponding to the C key 121 c from those shown in FIG. 24A to the ranges shown in FIG. 24B, so as to reduce the range of the X detection region 131 x and enlarge the range of the C detection region 131 c.
- The ranges of detection regions may be changed either when the correction is performed once or when the correction is performed a plurality of times.
- The change amount of the ranges of detection regions may be variable according to the number of times of correction.
- The change amount of the ranges of detection regions may be variable according to the evaluation value.
- The change amount of the ranges of detection regions may be variable according to the touched position.
- As described above, according to the embodiment, the touch panel 3 (103) integrally includes the
display unit 3 a (103 a) for displaying a screen and theinput unit 3 b (103 b) which receives the touch input of a position on the screen displayed on thedisplay unit 3 a (103 a). - The CPU 11 (111) displays the character input screen DP having the character display region CR on the
display unit 3 a (103 a), associates the software keyboard KB including a plurality of characters with the touch panel 3 (103), and displays the character in the software keyboard KB corresponding to the position for which the touch input is performed via theinput unit 3 b in the character display region CR as the input target character. - The CPU 11 (111) obtains an evaluation value for each of the input target characters on the basis of the input manner of the touch input via the
input unit 3 b (103 b). The CPU 11 (111) determines the correction target character from among the input target characters on the basis of the evaluation value obtained for each of the input target characters. The CPU 11 (111) displays the determined correction target character on thedisplay unit 3 a (103 a) so as to be distinguishable. As a result, the user can recognize the incorrect input rapidly. - According to the embodiment, the CPU 11 (111) calculates the distance from the central position of the touch detection region of the input target character corresponding to the position where the touch input is performed via the
input unit 3 b (103 b) to the touched position. The CPU 11 (111) obtains the evaluation value for each of the input target characters on the basis of the calculated distance. As a result, incorrect input can be appropriately detected. - According to the embodiment, the CPU 11 (111) detects the input starting position where the touch input via the
input unit 3 b (103 b) starts and the input end position where the touch input ends after the slide operation from the input starting position for each of the input target characters. The CPU 11 (111) obtains the evaluation value for each of the input target characters on the basis of the detection result. As a result, incorrect input can be appropriately detected. - According to the embodiment, the CPU 11 (111) detects the length of slide operation from the input starting position to the input end position which were detected. The CPU 11 (111) obtains the evaluation value for each of the input target characters on the basis of the detected length. As a result, incorrect input can be appropriately detected.
- According to the embodiment, the
CPU 11 detects the angle of the straight line connecting the input starting position to the input end position. TheCPU 11 obtains the evaluation value for each of the characters on the basis of the detected angle. As a result, incorrect input can be appropriately detected. - According to the embodiment, the CPU 11 (111) determines at least the input target character having the smallest evaluation value as the correction target character. As a result, the user can recognize the input target character which is highly likely to be incorrect more precisely.
- According to the embodiment, the CPU 11 (111) receives correction input of an input target character by the touch input via the
input unit 3 b (103 b) to correct the input target character which was determined to be the correction target character. The CPU 11 (111) replaces the input target character which was determined to be the correction target character and is displayed on the display unit 3 a (103 a) with the input target character for which the correction input is received.
- According to the embodiment, the CPU 11 (111) displays replacement character candidates corresponding to the input target character determined to be the correction target character near the correction target character displayed in the character display region CR. When a position corresponding to the display of a replacement character candidate is touched, the CPU 11 (111) receives the character of the replacement character candidate as the input target character to replace the correction target character. As a result, correction can be performed more efficiently with an easy operation.
- According to the embodiment, the CPU 11 (111) enlarges the touch detection range of the input target character for which correction input was received. As a result, incorrect input thereafter can be suppressed.
- According to the embodiment, when correction input is received for a character a predetermined number of times, the CPU 11 (111) enlarges the touch detection range of the input target character for which correction input was received. As a result, the touch detection range can be enlarged appropriately according to the manner of user's touch operation.
- The description in the above embodiments are preferred examples of the information terminal device according to the present invention, and the present invention is not limited to the examples. In the embodiment, the input target character having the smallest evaluation value is extracted as the correction target character and displayed when there is an evaluation value less than the threshold value. However, the input target character having the smallest evaluation value may be extracted as the correction target character and displayed regardless of whether the evaluation value is less than the threshold value.
- In the embodiment, the correction candidate button is displayed on the character input screen to perform correction input of a character. However, the correction input may be performed via the software keyboard KB without displaying the correction candidate button.
- As the computer readable medium storing the programs for executing the above processing, a non-volatile memory such as a flash memory or a portable recording medium such as a CD-ROM can also be used, in addition to the ROM, a hard disk, and the like.
- A carrier wave can also be applied as a medium for providing the program data via a predetermined communication line.
- Various changes can also be made as appropriate, within the scope of the present invention, to the detailed configuration and operation of the other components forming the information terminal device.
- Though several embodiments and variations of the present invention have been described above, the scope of the present invention is not limited to them; it includes the scope of the invention described in the claims and the scope of its equivalents.
Claims (11)
1. A character input device, comprising:
a touch panel which integrally includes a display unit for displaying a screen and an input unit for receiving a touch input at a position in the screen displayed on the display unit;
a first control unit which displays a character input screen having a character display region on the display unit, associates a keyboard including a plurality of characters with the touch panel, and displays, in the character display region as an input target character, the character in the keyboard corresponding to the position where the touch input is performed via the input unit;
an evaluation unit which obtains an evaluation value for the input target character on the basis of an input manner in which the touch input is performed via the input unit;
a determination unit which determines whether the input target character is a correction target character on the basis of the evaluation value for the input target character obtained by the evaluation unit; and
a second control unit which displays the correction target character determined by the determination unit on the display unit so as to be distinguishable.
2. The character input device according to claim 1, further comprising a distance calculation unit which calculates, as the input manner, a distance from a central position of a touch detection range of the input target character corresponding to the position of the touch input performed via the input unit to the position of the touch input, wherein
the evaluation unit obtains the evaluation value for the input target character on the basis of the distance calculated by the distance calculation unit.
3. The character input device according to claim 1, further comprising a position detecting unit which detects an input starting position and an input end position as the input manner for the input target character, the input starting position being a position where the touch input via the input unit starts and the input end position being a position where the touch input ends after a slide operation from the input starting position, wherein
the evaluation unit obtains the evaluation value for the input target character on the basis of a detection result by the position detecting unit.
4. The character input device according to claim 3, further comprising a length detecting unit which detects, as the input manner, a length of the slide operation from the input starting position to the input end position detected by the position detecting unit, wherein
the evaluation unit obtains the evaluation value for the input target character on the basis of the length detected by the length detecting unit.
5. The character input device according to claim 3, further comprising an angle detecting unit which detects, as the input manner, an angle between a straight line and a horizontal line, the straight line connecting the input starting position and the input end position detected by the position detecting unit, wherein
the evaluation unit obtains the evaluation value for the input target character on the basis of the angle detected by the angle detecting unit.
6. The character input device according to claim 1, wherein the determination unit determines at least an input target character having a smallest evaluation value as the correction target character.
7. The character input device according to claim 1, further comprising a receiving unit which receives a correction input of an input target character by a touch input via the input unit with respect to the input target character that is determined to be the correction target character, wherein
the second control unit replaces the input target character that is displayed on the display unit and determined to be the correction target character with the input target character for which the receiving unit receives the correction input.
8. The character input device according to claim 7, wherein
the second control unit displays a replacement character candidate corresponding to the input target character determined to be the correction target character by the determination unit near the correction target character displayed in the character display region, and
when the touch input is performed at a position corresponding to display of the replacement character candidate, the receiving unit receives the replacement character candidate as the input target character to replace the correction target character.
9. The character input device according to claim 7, further comprising an enlargement unit which enlarges a touch detection range of the input target character for which the correction input is received by the receiving unit.
10. The character input device according to claim 9, wherein the enlargement unit enlarges the touch detection range of the input target character for which the correction input is received when the correction input is performed for the input target character a predetermined number of times.
11. A non-transitory computer readable medium that stores a program for making a computer execute a procedure, the computer being a character input device with a touch panel which integrally includes a display unit for displaying a screen and an input unit for receiving a touch input at a position in the screen displayed on the display unit, the procedure comprising:
controlling so as to display a character input screen having a character display region on the display unit, associate a keyboard including a plurality of characters with the touch panel, and display, in the character display region as an input target character, the character in the keyboard corresponding to the position where the touch input is performed via the input unit;
obtaining an evaluation value for the input target character on the basis of an input manner in which the touch input is performed via the input unit;
determining whether the input target character is a correction target character on the basis of the obtained evaluation value for the input target character; and
displaying the determined correction target character on the display unit so as to be distinguishable.
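Claims 2 through 5 derive the evaluation value from three aspects of the input manner: the distance from the key's central position to the touch point, the length of any slide operation, and the angle between the slide line and the horizontal. One possible combination is sketched below; the weights and normalization are assumptions of this sketch, since the claims do not prescribe a formula.

```python
import math

# Illustrative evaluation of input manner per claims 2-5. The score is
# high for a confident touch (near the key center, little or no slide)
# and low for a sloppy one. Weights (0.5 / 0.3 / 0.2) are assumed.

def evaluate_input(center, start, end, key_half_width):
    cx, cy = center
    sx, sy = start
    ex, ey = end
    # (a) distance from the key's central position to the input starting position
    distance = math.hypot(sx - cx, sy - cy)
    # (b) length of the slide operation from input start to input end
    slide_len = math.hypot(ex - sx, ey - sy)
    # (c) angle (degrees) between the slide line and the horizontal
    angle = math.degrees(math.atan2(abs(ey - sy), abs(ex - sx))) if slide_len else 0.0
    # Each term penalizes one factor; the result is clamped to [0, 1].
    score = 1.0
    score -= min(distance / key_half_width, 1.0) * 0.5
    score -= min(slide_len / key_half_width, 1.0) * 0.3
    score -= (angle / 90.0) * 0.2
    return max(score, 0.0)

# A touch dead-center with no slide scores higher than a long, slanted slide.
clean = evaluate_input(center=(20, 30), start=(20, 30), end=(20, 30), key_half_width=5)
sloppy = evaluate_input(center=(20, 30), start=(24, 33), end=(28, 40), key_half_width=5)
assert clean == 1.0 and clean > sloppy
```

The low-scoring input would then be the one flagged as the correction target character by the determination step of claim 11.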
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-171372 | 2013-08-21 | ||
JP2013171372A JP2015041845A (en) | 2013-08-21 | 2013-08-21 | Character input device and program |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150058785A1 true US20150058785A1 (en) | 2015-02-26 |
Family
ID=52481563
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/465,461 Abandoned US20150058785A1 (en) | 2013-08-21 | 2014-08-21 | Character Input Device And Computer Readable Recording Medium |
Country Status (3)
Country | Link |
---|---|
US (1) | US20150058785A1 (en) |
JP (1) | JP2015041845A (en) |
CN (1) | CN104423625A (en) |
Cited By (124)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017037495A (en) * | 2015-08-10 | 2017-02-16 | 富士通株式会社 | Electronic device and input control program |
US10043516B2 (en) | 2016-09-23 | 2018-08-07 | Apple Inc. | Intelligent automated assistant |
US10079014B2 (en) | 2012-06-08 | 2018-09-18 | Apple Inc. | Name recognition system |
US10083690B2 (en) | 2014-05-30 | 2018-09-25 | Apple Inc. | Better resolution when referencing to concepts |
US10101822B2 (en) | 2015-06-05 | 2018-10-16 | Apple Inc. | Language input correction |
US10108612B2 (en) | 2008-07-31 | 2018-10-23 | Apple Inc. | Mobile device having human language translation capability with positional feedback |
US20190026018A1 (en) * | 2017-07-19 | 2019-01-24 | Kyocera Document Solutions Inc. | Display control device, display control method, and computer-readable storage medium non-transitorily storing display control program |
US10303715B2 (en) | 2017-05-16 | 2019-05-28 | Apple Inc. | Intelligent automated assistant for media exploration |
US10311871B2 (en) | 2015-03-08 | 2019-06-04 | Apple Inc. | Competing devices responding to voice triggers |
US10311144B2 (en) | 2017-05-16 | 2019-06-04 | Apple Inc. | Emoji word sense disambiguation |
US10332518B2 (en) | 2017-05-09 | 2019-06-25 | Apple Inc. | User interface for correcting recognition errors |
US10356243B2 (en) | 2015-06-05 | 2019-07-16 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US10354652B2 (en) | 2015-12-02 | 2019-07-16 | Apple Inc. | Applying neural network language models to weighted finite state transducers for automatic speech recognition |
US10381016B2 (en) | 2008-01-03 | 2019-08-13 | Apple Inc. | Methods and apparatus for altering audio output signals |
US10390213B2 (en) | 2014-09-30 | 2019-08-20 | Apple Inc. | Social reminders |
US10395654B2 (en) | 2017-05-11 | 2019-08-27 | Apple Inc. | Text normalization based on a data-driven learning network |
US10403278B2 (en) | 2017-05-16 | 2019-09-03 | Apple Inc. | Methods and systems for phonetic matching in digital assistant services |
US10403283B1 (en) | 2018-06-01 | 2019-09-03 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US10410637B2 (en) | 2017-05-12 | 2019-09-10 | Apple Inc. | User-specific acoustic models |
US10417266B2 (en) | 2017-05-09 | 2019-09-17 | Apple Inc. | Context-aware ranking of intelligent response suggestions |
US10417344B2 (en) | 2014-05-30 | 2019-09-17 | Apple Inc. | Exemplar-based natural language processing |
US10417405B2 (en) | 2011-03-21 | 2019-09-17 | Apple Inc. | Device access using voice authentication |
US10431204B2 (en) | 2014-09-11 | 2019-10-01 | Apple Inc. | Method and apparatus for discovering trending terms in speech requests |
CN110297777A (en) * | 2019-07-10 | 2019-10-01 | 北京百度网讯科技有限公司 | The assessment method and device of input method |
US10438595B2 (en) | 2014-09-30 | 2019-10-08 | Apple Inc. | Speaker identification and unsupervised speaker adaptation techniques |
US10445429B2 (en) | 2017-09-21 | 2019-10-15 | Apple Inc. | Natural language understanding using vocabularies with compressed serialized tries |
US10453443B2 (en) | 2014-09-30 | 2019-10-22 | Apple Inc. | Providing an indication of the suitability of speech recognition |
US10474753B2 (en) | 2016-09-07 | 2019-11-12 | Apple Inc. | Language identification using recurrent neural networks |
US10482874B2 (en) | 2017-05-15 | 2019-11-19 | Apple Inc. | Hierarchical belief states for digital assistants |
US10497365B2 (en) | 2014-05-30 | 2019-12-03 | Apple Inc. | Multi-command single utterance input method |
US10496705B1 (en) | 2018-06-03 | 2019-12-03 | Apple Inc. | Accelerated task performance |
US10529332B2 (en) | 2015-03-08 | 2020-01-07 | Apple Inc. | Virtual assistant activation |
US10567477B2 (en) | 2015-03-08 | 2020-02-18 | Apple Inc. | Virtual assistant continuity |
US10580409B2 (en) | 2016-06-11 | 2020-03-03 | Apple Inc. | Application integration with a digital assistant |
US10592604B2 (en) | 2018-03-12 | 2020-03-17 | Apple Inc. | Inverse text normalization for automatic speech recognition |
US10636424B2 (en) | 2017-11-30 | 2020-04-28 | Apple Inc. | Multi-turn canned dialog |
US10643611B2 (en) | 2008-10-02 | 2020-05-05 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US10657328B2 (en) | 2017-06-02 | 2020-05-19 | Apple Inc. | Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling |
US10657961B2 (en) | 2013-06-08 | 2020-05-19 | Apple Inc. | Interpreting and acting upon commands that involve sharing information with remote devices |
US10684703B2 (en) | 2018-06-01 | 2020-06-16 | Apple Inc. | Attention aware virtual assistant dismissal |
US10692504B2 (en) | 2010-02-25 | 2020-06-23 | Apple Inc. | User profiling for voice input processing |
US10699717B2 (en) | 2014-05-30 | 2020-06-30 | Apple Inc. | Intelligent assistant for home automation |
US10714117B2 (en) | 2013-02-07 | 2020-07-14 | Apple Inc. | Voice trigger for a digital assistant |
US10726832B2 (en) | 2017-05-11 | 2020-07-28 | Apple Inc. | Maintaining privacy of personal information |
US10733993B2 (en) | 2016-06-10 | 2020-08-04 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US10733375B2 (en) | 2018-01-31 | 2020-08-04 | Apple Inc. | Knowledge-based framework for improving natural language understanding |
US10733982B2 (en) | 2018-01-08 | 2020-08-04 | Apple Inc. | Multi-directional dialog |
US10741185B2 (en) | 2010-01-18 | 2020-08-11 | Apple Inc. | Intelligent automated assistant |
US10748546B2 (en) | 2017-05-16 | 2020-08-18 | Apple Inc. | Digital assistant services based on device capabilities |
US10755703B2 (en) | 2017-05-11 | 2020-08-25 | Apple Inc. | Offline personal assistant |
US10755051B2 (en) | 2017-09-29 | 2020-08-25 | Apple Inc. | Rule-based natural language processing |
US10769385B2 (en) | 2013-06-09 | 2020-09-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
US10791176B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10789959B2 (en) | 2018-03-02 | 2020-09-29 | Apple Inc. | Training speaker recognition models for digital assistants |
US10789945B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Low-latency intelligent automated assistant |
US10810274B2 (en) | 2017-05-15 | 2020-10-20 | Apple Inc. | Optimizing dialogue policy decisions for digital assistants using implicit feedback |
US10818288B2 (en) | 2018-03-26 | 2020-10-27 | Apple Inc. | Natural assistant interaction |
US10839159B2 (en) | 2018-09-28 | 2020-11-17 | Apple Inc. | Named entity normalization in a spoken dialog system |
US10892996B2 (en) | 2018-06-01 | 2021-01-12 | Apple Inc. | Variable latency device coordination |
US10909331B2 (en) | 2018-03-30 | 2021-02-02 | Apple Inc. | Implicit identification of translation payload with neural machine translation |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
US10942703B2 (en) | 2015-12-23 | 2021-03-09 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US10942702B2 (en) | 2016-06-11 | 2021-03-09 | Apple Inc. | Intelligent device arbitration and control |
US10956666B2 (en) | 2015-11-09 | 2021-03-23 | Apple Inc. | Unconventional virtual assistant interactions |
US10984780B2 (en) | 2018-05-21 | 2021-04-20 | Apple Inc. | Global semantic word embeddings using bi-directional recurrent neural networks |
US11010561B2 (en) | 2018-09-27 | 2021-05-18 | Apple Inc. | Sentiment prediction from textual data |
US11010127B2 (en) | 2015-06-29 | 2021-05-18 | Apple Inc. | Virtual assistant for media playback |
US11025565B2 (en) | 2015-06-07 | 2021-06-01 | Apple Inc. | Personalized prediction of responses for instant messaging |
US11023513B2 (en) | 2007-12-20 | 2021-06-01 | Apple Inc. | Method and apparatus for searching using an active ontology |
US11048473B2 (en) | 2013-06-09 | 2021-06-29 | Apple Inc. | Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant |
US11070949B2 (en) | 2015-05-27 | 2021-07-20 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on an electronic device with a touch-sensitive display |
US11069347B2 (en) | 2016-06-08 | 2021-07-20 | Apple Inc. | Intelligent automated assistant for media exploration |
US11069336B2 (en) | 2012-03-02 | 2021-07-20 | Apple Inc. | Systems and methods for name pronunciation |
US11120372B2 (en) | 2011-06-03 | 2021-09-14 | Apple Inc. | Performing actions associated with task items that represent tasks to perform |
US11127397B2 (en) | 2015-05-27 | 2021-09-21 | Apple Inc. | Device voice control |
US11126400B2 (en) | 2015-09-08 | 2021-09-21 | Apple Inc. | Zero latency digital assistant |
US11133008B2 (en) | 2014-05-30 | 2021-09-28 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
US11145294B2 (en) | 2018-05-07 | 2021-10-12 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US11170166B2 (en) | 2018-09-28 | 2021-11-09 | Apple Inc. | Neural typographical error modeling via generative adversarial networks |
US11204787B2 (en) | 2017-01-09 | 2021-12-21 | Apple Inc. | Application integration with a digital assistant |
US11217251B2 (en) | 2019-05-06 | 2022-01-04 | Apple Inc. | Spoken notifications |
US11227589B2 (en) | 2016-06-06 | 2022-01-18 | Apple Inc. | Intelligent list reading |
US11231904B2 (en) | 2015-03-06 | 2022-01-25 | Apple Inc. | Reducing response latency of intelligent automated assistants |
US11237797B2 (en) | 2019-05-31 | 2022-02-01 | Apple Inc. | User activity shortcut suggestions |
US11269678B2 (en) | 2012-05-15 | 2022-03-08 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US11281993B2 (en) | 2016-12-05 | 2022-03-22 | Apple Inc. | Model and ensemble compression for metric learning |
US11289073B2 (en) | 2019-05-31 | 2022-03-29 | Apple Inc. | Device text to speech |
US11295088B2 (en) | 2019-11-20 | 2022-04-05 | Apple Inc. | Sanitizing word predictions |
US11301477B2 (en) | 2017-05-12 | 2022-04-12 | Apple Inc. | Feedback analysis of a digital assistant |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
US11314370B2 (en) | 2013-12-06 | 2022-04-26 | Apple Inc. | Method for extracting salient dialog usage from live data |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11350253B2 (en) | 2011-06-03 | 2022-05-31 | Apple Inc. | Active transport based notifications |
US11360641B2 (en) | 2019-06-01 | 2022-06-14 | Apple Inc. | Increasing the relevance of new available information |
US11386266B2 (en) | 2018-06-01 | 2022-07-12 | Apple Inc. | Text correction |
US11388291B2 (en) | 2013-03-14 | 2022-07-12 | Apple Inc. | System and method for processing voicemail |
US11423886B2 (en) | 2010-01-18 | 2022-08-23 | Apple Inc. | Task flow identification based on user intent |
US11423908B2 (en) | 2019-05-06 | 2022-08-23 | Apple Inc. | Interpreting spoken requests |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US11467802B2 (en) | 2017-05-11 | 2022-10-11 | Apple Inc. | Maintaining privacy of personal information |
US11468282B2 (en) | 2015-05-15 | 2022-10-11 | Apple Inc. | Virtual assistant in a communication session |
US11475884B2 (en) | 2019-05-06 | 2022-10-18 | Apple Inc. | Reducing digital assistant latency when a language is incorrectly determined |
US11475898B2 (en) | 2018-10-26 | 2022-10-18 | Apple Inc. | Low-latency multi-speaker speech recognition |
US11488406B2 (en) | 2019-09-25 | 2022-11-01 | Apple Inc. | Text detection using global geometry estimators |
US11495218B2 (en) | 2018-06-01 | 2022-11-08 | Apple Inc. | Virtual assistant operation in multi-device environments |
US11496600B2 (en) | 2019-05-31 | 2022-11-08 | Apple Inc. | Remote execution of machine-learned models |
US11500672B2 (en) | 2015-09-08 | 2022-11-15 | Apple Inc. | Distributed personal assistant |
US11516537B2 (en) | 2014-06-30 | 2022-11-29 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US11526368B2 (en) | 2015-11-06 | 2022-12-13 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US11532306B2 (en) | 2017-05-16 | 2022-12-20 | Apple Inc. | Detecting a trigger of a digital assistant |
US11556244B2 (en) * | 2017-12-28 | 2023-01-17 | Maxell, Ltd. | Input information correction method and information terminal |
US11638059B2 (en) | 2019-01-04 | 2023-04-25 | Apple Inc. | Content playback on multiple devices |
US11657813B2 (en) | 2019-05-31 | 2023-05-23 | Apple Inc. | Voice identification in digital assistant systems |
US11671920B2 (en) | 2007-04-03 | 2023-06-06 | Apple Inc. | Method and system for operating a multifunction portable electronic device using voice-activation |
US11696060B2 (en) | 2020-07-21 | 2023-07-04 | Apple Inc. | User identification using headphones |
US11765209B2 (en) | 2020-05-11 | 2023-09-19 | Apple Inc. | Digital assistant hardware abstraction |
US11790914B2 (en) | 2019-06-01 | 2023-10-17 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
US11798547B2 (en) | 2013-03-15 | 2023-10-24 | Apple Inc. | Voice activated device for use with a voice-based digital assistant |
US11809483B2 (en) | 2015-09-08 | 2023-11-07 | Apple Inc. | Intelligent automated assistant for media search and playback |
US11838734B2 (en) | 2020-07-20 | 2023-12-05 | Apple Inc. | Multi-device audio adjustment coordination |
US11853536B2 (en) | 2015-09-08 | 2023-12-26 | Apple Inc. | Intelligent automated assistant in a media environment |
US11914848B2 (en) | 2020-05-11 | 2024-02-27 | Apple Inc. | Providing relevant data items based on context |
US11928604B2 (en) | 2005-09-08 | 2024-03-12 | Apple Inc. | Method and apparatus for building an intelligent automated assistant |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6319236B2 (en) * | 2015-09-02 | 2018-05-09 | 京セラドキュメントソリューションズ株式会社 | Display input device and image forming apparatus |
US10926415B2 (en) * | 2015-11-16 | 2021-02-23 | Kawasaki Jukogyo Kabushiki Kaisha | Robot system and control method of robot system |
JP6524903B2 (en) * | 2015-12-21 | 2019-06-05 | 富士通株式会社 | Input program, input device, and input method |
KR101858999B1 (en) * | 2016-11-28 | 2018-05-17 | (주)헤르메시스 | Apparatus for correcting input of virtual keyboard, and method thereof |
JP6859711B2 (en) * | 2017-01-13 | 2021-04-14 | オムロン株式会社 | String input device, input string estimation method, and input string estimation program |
JP6911361B2 (en) * | 2017-01-19 | 2021-07-28 | カシオ計算機株式会社 | Calculator, calculation method and program |
JP7143792B2 (en) * | 2019-03-14 | 2022-09-29 | オムロン株式会社 | Character input device, character input method, and character input program |
CN112214154B (en) * | 2019-07-12 | 2022-10-28 | 北京搜狗科技发展有限公司 | Interface processing method and device and interface processing device |
Citations (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6292179B1 (en) * | 1998-05-12 | 2001-09-18 | Samsung Electronics Co., Ltd. | Software keyboard system using trace of stylus on a touch screen and method for recognizing key code using the same |
US20020099542A1 (en) * | 1996-09-24 | 2002-07-25 | Allvoice Computing Plc. | Method and apparatus for processing the output of a speech recognition engine |
US20030014239A1 (en) * | 2001-06-08 | 2003-01-16 | Ichbiah Jean D. | Method and system for entering accented and other extended characters |
US20040155869A1 (en) * | 1999-05-27 | 2004-08-12 | Robinson B. Alex | Keyboard system with automatic correction |
US6822585B1 (en) * | 1999-09-17 | 2004-11-23 | Nokia Mobile Phones, Ltd. | Input of symbols |
US20060085757A1 (en) * | 2004-07-30 | 2006-04-20 | Apple Computer, Inc. | Activating virtual keys of a touch-screen virtual keyboard |
US20070061753A1 (en) * | 2003-07-17 | 2007-03-15 | Xrgomics Pte Ltd | Letter and word choice text input method for keyboards and reduced keyboard systems |
US7256769B2 (en) * | 2003-02-24 | 2007-08-14 | Zi Corporation Of Canada, Inc. | System and method for text entry on a reduced keyboard |
US20070216658A1 (en) * | 2006-03-17 | 2007-09-20 | Nokia Corporation | Mobile communication terminal |
US20080189605A1 (en) * | 2007-02-01 | 2008-08-07 | David Kay | Spell-check for a keyboard system with automatic correction |
US7453439B1 (en) * | 2003-01-16 | 2008-11-18 | Forward Input Inc. | System and method for continuous stroke word-based text input |
US20090182552A1 (en) * | 2008-01-14 | 2009-07-16 | Fyke Steven H | Method and handheld electronic device employing a touch screen for ambiguous word review or correction |
US20090281787A1 (en) * | 2008-05-11 | 2009-11-12 | Xin Wang | Mobile electronic device and associated method enabling transliteration of a text input |
US7707515B2 (en) * | 2006-01-23 | 2010-04-27 | Microsoft Corporation | Digital user interface for inputting Indic scripts |
US7750891B2 (en) * | 2003-04-09 | 2010-07-06 | Tegic Communications, Inc. | Selective input system based on tracking of motion parameters of an input device |
US7996781B2 (en) * | 2007-04-04 | 2011-08-09 | Vadim Zaliva | List entry selection for electronic devices |
US20120113008A1 (en) * | 2010-11-08 | 2012-05-10 | Ville Makinen | On-screen keyboard with haptic effects |
US8327287B2 (en) * | 2006-04-12 | 2012-12-04 | Nintendo Co., Ltd. | Character input program and character input device |
US20130063361A1 (en) * | 2011-09-08 | 2013-03-14 | Research In Motion Limited | Method of facilitating input at an electronic device |
US20130076669A1 (en) * | 2011-09-27 | 2013-03-28 | Kyocera Corporation | Portable terminal and reception control method |
US20130268879A1 (en) * | 2012-04-06 | 2013-10-10 | Google Inc. | Smart user-customized graphical keyboard |
US8564541B2 (en) * | 2009-03-16 | 2013-10-22 | Apple Inc. | Zhuyin input interface on a device |
US20140115522A1 (en) * | 2012-10-19 | 2014-04-24 | Google Inc. | Gesture-keyboard decoding using gesture path deviation |
US20140164973A1 (en) * | 2012-12-07 | 2014-06-12 | Apple Inc. | Techniques for preventing typographical errors on software keyboards |
US8782550B1 (en) * | 2013-02-28 | 2014-07-15 | Google Inc. | Character string replacement |
US20140351760A1 (en) * | 2013-05-24 | 2014-11-27 | Google Inc. | Order-independent text input |
US9092134B2 (en) * | 2008-02-04 | 2015-07-28 | Nokia Technologies Oy | User touch display interface providing an expanded selection area for a user selectable object |
US9089776B2 (en) * | 2010-12-10 | 2015-07-28 | Konami Digital Entertainment Co., Ltd. | Game device which recognizes and replaces a substitution object participating in a competition game in a virtual space |
US9411425B2 (en) * | 2011-01-25 | 2016-08-09 | Sony Corporation | Input device, input method, and computer program for inputting characters, numbers, or symbols by using an on-screen keyboard |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3927412B2 (en) * | 2001-12-28 | 2007-06-06 | シャープ株式会社 | Touch panel input device, program, and recording medium recording program |
JP2006005655A (en) * | 2004-06-17 | 2006-01-05 | Sharp Corp | Input device and input program provided with item processing function, and computer readable recording medium |
US7843427B2 (en) * | 2006-09-06 | 2010-11-30 | Apple Inc. | Methods for determining a cursor position from a finger contact with a touch screen display |
JP2008292731A (en) * | 2007-05-24 | 2008-12-04 | Kyocera Mita Corp | Operation device and image formation device |
JP5623053B2 (en) * | 2009-10-08 | 2014-11-12 | 京セラ株式会社 | Input device |
JP5623054B2 (en) * | 2009-10-08 | 2014-11-12 | 京セラ株式会社 | Input device |
JP2011150489A (en) * | 2010-01-20 | 2011-08-04 | Sony Corp | Information processing apparatus and program |
JP5752150B2 (en) * | 2010-02-01 | 2015-07-22 | ジンジャー ソフトウェア、インコーポレイティッド | Context-sensitive automatic language correction using an Internet corpus specifically for small keyboard devices |
US20130271379A1 (en) * | 2011-01-27 | 2013-10-17 | Sharp Kabushiki Kaisha | Character input device and character input method |
CN102915224A (en) * | 2011-08-01 | 2013-02-06 | 环达电脑(上海)有限公司 | Digitally assisted input and correction speech input system, digitally assisted input method, and digitally assisted correction method |
US9176666B2 (en) * | 2011-12-23 | 2015-11-03 | Symbol Technologies, Llc | Method and device for a multi-touch based correction of a handwriting sentence system |
2013
- 2013-08-21: JP JP2013171372A patent/JP2015041845A/en active Pending
2014
- 2014-08-20: CN CN201410411648.8A patent/CN104423625A/en active Pending
- 2014-08-21: US US14/465,461 patent/US20150058785A1/en not_active Abandoned
Cited By (193)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11928604B2 (en) | 2005-09-08 | 2024-03-12 | Apple Inc. | Method and apparatus for building an intelligent automated assistant |
US11671920B2 (en) | 2007-04-03 | 2023-06-06 | Apple Inc. | Method and system for operating a multifunction portable electronic device using voice-activation |
US11979836B2 (en) | 2007-04-03 | 2024-05-07 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
US11023513B2 (en) | 2007-12-20 | 2021-06-01 | Apple Inc. | Method and apparatus for searching using an active ontology |
US10381016B2 (en) | 2008-01-03 | 2019-08-13 | Apple Inc. | Methods and apparatus for altering audio output signals |
US10108612B2 (en) | 2008-07-31 | 2018-10-23 | Apple Inc. | Mobile device having human language translation capability with positional feedback |
US11900936B2 (en) | 2008-10-02 | 2024-02-13 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US11348582B2 (en) | 2008-10-02 | 2022-05-31 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US10643611B2 (en) | 2008-10-02 | 2020-05-05 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US11423886B2 (en) | 2010-01-18 | 2022-08-23 | Apple Inc. | Task flow identification based on user intent |
US10741185B2 (en) | 2010-01-18 | 2020-08-11 | Apple Inc. | Intelligent automated assistant |
US10692504B2 (en) | 2010-02-25 | 2020-06-23 | Apple Inc. | User profiling for voice input processing |
US10417405B2 (en) | 2011-03-21 | 2019-09-17 | Apple Inc. | Device access using voice authentication |
US11120372B2 (en) | 2011-06-03 | 2021-09-14 | Apple Inc. | Performing actions associated with task items that represent tasks to perform |
US11350253B2 (en) | 2011-06-03 | 2022-05-31 | Apple Inc. | Active transport based notifications |
US11069336B2 (en) | 2012-03-02 | 2021-07-20 | Apple Inc. | Systems and methods for name pronunciation |
US11269678B2 (en) | 2012-05-15 | 2022-03-08 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US11321116B2 (en) | 2012-05-15 | 2022-05-03 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US10079014B2 (en) | 2012-06-08 | 2018-09-18 | Apple Inc. | Name recognition system |
US11636869B2 (en) | 2013-02-07 | 2023-04-25 | Apple Inc. | Voice trigger for a digital assistant |
US11557310B2 (en) | 2013-02-07 | 2023-01-17 | Apple Inc. | Voice trigger for a digital assistant |
US10978090B2 (en) | 2013-02-07 | 2021-04-13 | Apple Inc. | Voice trigger for a digital assistant |
US11862186B2 (en) | 2013-02-07 | 2024-01-02 | Apple Inc. | Voice trigger for a digital assistant |
US10714117B2 (en) | 2013-02-07 | 2020-07-14 | Apple Inc. | Voice trigger for a digital assistant |
US11388291B2 (en) | 2013-03-14 | 2022-07-12 | Apple Inc. | System and method for processing voicemail |
US11798547B2 (en) | 2013-03-15 | 2023-10-24 | Apple Inc. | Voice activated device for use with a voice-based digital assistant |
US10657961B2 (en) | 2013-06-08 | 2020-05-19 | Apple Inc. | Interpreting and acting upon commands that involve sharing information with remote devices |
US11727219B2 (en) | 2013-06-09 | 2023-08-15 | Apple Inc. | System and method for inferring user intent from speech inputs |
US10769385B2 (en) | 2013-06-09 | 2020-09-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
US11048473B2 (en) | 2013-06-09 | 2021-06-29 | Apple Inc. | Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant |
US11314370B2 (en) | 2013-12-06 | 2022-04-26 | Apple Inc. | Method for extracting salient dialog usage from live data |
US10417344B2 (en) | 2014-05-30 | 2019-09-17 | Apple Inc. | Exemplar-based natural language processing |
US10878809B2 (en) | 2014-05-30 | 2020-12-29 | Apple Inc. | Multi-command single utterance input method |
US11257504B2 (en) | 2014-05-30 | 2022-02-22 | Apple Inc. | Intelligent assistant for home automation |
US11670289B2 (en) | 2014-05-30 | 2023-06-06 | Apple Inc. | Multi-command single utterance input method |
US10714095B2 (en) | 2014-05-30 | 2020-07-14 | Apple Inc. | Intelligent assistant for home automation |
US10083690B2 (en) | 2014-05-30 | 2018-09-25 | Apple Inc. | Better resolution when referencing to concepts |
US11810562B2 (en) | 2014-05-30 | 2023-11-07 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US10699717B2 (en) | 2014-05-30 | 2020-06-30 | Apple Inc. | Intelligent assistant for home automation |
US11133008B2 (en) | 2014-05-30 | 2021-09-28 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US10497365B2 (en) | 2014-05-30 | 2019-12-03 | Apple Inc. | Multi-command single utterance input method |
US10657966B2 (en) | 2014-05-30 | 2020-05-19 | Apple Inc. | Better resolution when referencing to concepts |
US11699448B2 (en) | 2014-05-30 | 2023-07-11 | Apple Inc. | Intelligent assistant for home automation |
US11838579B2 (en) | 2014-06-30 | 2023-12-05 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US11516537B2 (en) | 2014-06-30 | 2022-11-29 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US10431204B2 (en) | 2014-09-11 | 2019-10-01 | Apple Inc. | Method and apparatus for discovering trending terms in speech requests |
US10453443B2 (en) | 2014-09-30 | 2019-10-22 | Apple Inc. | Providing an indication of the suitability of speech recognition |
US10390213B2 (en) | 2014-09-30 | 2019-08-20 | Apple Inc. | Social reminders |
US10438595B2 (en) | 2014-09-30 | 2019-10-08 | Apple Inc. | Speaker identification and unsupervised speaker adaptation techniques |
US11231904B2 (en) | 2015-03-06 | 2022-01-25 | Apple Inc. | Reducing response latency of intelligent automated assistants |
US10567477B2 (en) | 2015-03-08 | 2020-02-18 | Apple Inc. | Virtual assistant continuity |
US10311871B2 (en) | 2015-03-08 | 2019-06-04 | Apple Inc. | Competing devices responding to voice triggers |
US11087759B2 (en) | 2015-03-08 | 2021-08-10 | Apple Inc. | Virtual assistant activation |
US11842734B2 (en) | 2015-03-08 | 2023-12-12 | Apple Inc. | Virtual assistant activation |
US10930282B2 (en) | 2015-03-08 | 2021-02-23 | Apple Inc. | Competing devices responding to voice triggers |
US10529332B2 (en) | 2015-03-08 | 2020-01-07 | Apple Inc. | Virtual assistant activation |
US11468282B2 (en) | 2015-05-15 | 2022-10-11 | Apple Inc. | Virtual assistant in a communication session |
US11127397B2 (en) | 2015-05-27 | 2021-09-21 | Apple Inc. | Device voice control |
US11070949B2 (en) | 2015-05-27 | 2021-07-20 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on an electronic device with a touch-sensitive display |
US10101822B2 (en) | 2015-06-05 | 2018-10-16 | Apple Inc. | Language input correction |
US10356243B2 (en) | 2015-06-05 | 2019-07-16 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US10681212B2 (en) | 2015-06-05 | 2020-06-09 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US11025565B2 (en) | 2015-06-07 | 2021-06-01 | Apple Inc. | Personalized prediction of responses for instant messaging |
US11947873B2 (en) | 2015-06-29 | 2024-04-02 | Apple Inc. | Virtual assistant for media playback |
US11010127B2 (en) | 2015-06-29 | 2021-05-18 | Apple Inc. | Virtual assistant for media playback |
JP2017037495A (en) * | 2015-08-10 | 2017-02-16 | 富士通株式会社 | Electronic device and input control program |
US11500672B2 (en) | 2015-09-08 | 2022-11-15 | Apple Inc. | Distributed personal assistant |
US11954405B2 (en) | 2015-09-08 | 2024-04-09 | Apple Inc. | Zero latency digital assistant |
US11126400B2 (en) | 2015-09-08 | 2021-09-21 | Apple Inc. | Zero latency digital assistant |
US11550542B2 (en) | 2015-09-08 | 2023-01-10 | Apple Inc. | Zero latency digital assistant |
US11853536B2 (en) | 2015-09-08 | 2023-12-26 | Apple Inc. | Intelligent automated assistant in a media environment |
US11809483B2 (en) | 2015-09-08 | 2023-11-07 | Apple Inc. | Intelligent automated assistant for media search and playback |
US11526368B2 (en) | 2015-11-06 | 2022-12-13 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US11809886B2 (en) | 2015-11-06 | 2023-11-07 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US11886805B2 (en) | 2015-11-09 | 2024-01-30 | Apple Inc. | Unconventional virtual assistant interactions |
US10956666B2 (en) | 2015-11-09 | 2021-03-23 | Apple Inc. | Unconventional virtual assistant interactions |
US10354652B2 (en) | 2015-12-02 | 2019-07-16 | Apple Inc. | Applying neural network language models to weighted finite state transducers for automatic speech recognition |
US10942703B2 (en) | 2015-12-23 | 2021-03-09 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US11853647B2 (en) | 2015-12-23 | 2023-12-26 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US11227589B2 (en) | 2016-06-06 | 2022-01-18 | Apple Inc. | Intelligent list reading |
US11069347B2 (en) | 2016-06-08 | 2021-07-20 | Apple Inc. | Intelligent automated assistant for media exploration |
US11657820B2 (en) | 2016-06-10 | 2023-05-23 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US10733993B2 (en) | 2016-06-10 | 2020-08-04 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US11037565B2 (en) | 2016-06-10 | 2021-06-15 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US10942702B2 (en) | 2016-06-11 | 2021-03-09 | Apple Inc. | Intelligent device arbitration and control |
US11809783B2 (en) | 2016-06-11 | 2023-11-07 | Apple Inc. | Intelligent device arbitration and control |
US11749275B2 (en) | 2016-06-11 | 2023-09-05 | Apple Inc. | Application integration with a digital assistant |
US11152002B2 (en) | 2016-06-11 | 2021-10-19 | Apple Inc. | Application integration with a digital assistant |
US10580409B2 (en) | 2016-06-11 | 2020-03-03 | Apple Inc. | Application integration with a digital assistant |
US10474753B2 (en) | 2016-09-07 | 2019-11-12 | Apple Inc. | Language identification using recurrent neural networks |
US10553215B2 (en) | 2016-09-23 | 2020-02-04 | Apple Inc. | Intelligent automated assistant |
US10043516B2 (en) | 2016-09-23 | 2018-08-07 | Apple Inc. | Intelligent automated assistant |
US11281993B2 (en) | 2016-12-05 | 2022-03-22 | Apple Inc. | Model and ensemble compression for metric learning |
US11656884B2 (en) | 2017-01-09 | 2023-05-23 | Apple Inc. | Application integration with a digital assistant |
US11204787B2 (en) | 2017-01-09 | 2021-12-21 | Apple Inc. | Application integration with a digital assistant |
US10417266B2 (en) | 2017-05-09 | 2019-09-17 | Apple Inc. | Context-aware ranking of intelligent response suggestions |
US10741181B2 (en) | 2017-05-09 | 2020-08-11 | Apple Inc. | User interface for correcting recognition errors |
US10332518B2 (en) | 2017-05-09 | 2019-06-25 | Apple Inc. | User interface for correcting recognition errors |
US10726832B2 (en) | 2017-05-11 | 2020-07-28 | Apple Inc. | Maintaining privacy of personal information |
US10847142B2 (en) | 2017-05-11 | 2020-11-24 | Apple Inc. | Maintaining privacy of personal information |
US11599331B2 (en) | 2017-05-11 | 2023-03-07 | Apple Inc. | Maintaining privacy of personal information |
US11467802B2 (en) | 2017-05-11 | 2022-10-11 | Apple Inc. | Maintaining privacy of personal information |
US10755703B2 (en) | 2017-05-11 | 2020-08-25 | Apple Inc. | Offline personal assistant |
US10395654B2 (en) | 2017-05-11 | 2019-08-27 | Apple Inc. | Text normalization based on a data-driven learning network |
US11580990B2 (en) | 2017-05-12 | 2023-02-14 | Apple Inc. | User-specific acoustic models |
US11538469B2 (en) | 2017-05-12 | 2022-12-27 | Apple Inc. | Low-latency intelligent automated assistant |
US11405466B2 (en) | 2017-05-12 | 2022-08-02 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US11380310B2 (en) | 2017-05-12 | 2022-07-05 | Apple Inc. | Low-latency intelligent automated assistant |
US10791176B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10789945B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Low-latency intelligent automated assistant |
US11837237B2 (en) | 2017-05-12 | 2023-12-05 | Apple Inc. | User-specific acoustic models |
US11301477B2 (en) | 2017-05-12 | 2022-04-12 | Apple Inc. | Feedback analysis of a digital assistant |
US10410637B2 (en) | 2017-05-12 | 2019-09-10 | Apple Inc. | User-specific acoustic models |
US11862151B2 (en) | 2017-05-12 | 2024-01-02 | Apple Inc. | Low-latency intelligent automated assistant |
US10482874B2 (en) | 2017-05-15 | 2019-11-19 | Apple Inc. | Hierarchical belief states for digital assistants |
US10810274B2 (en) | 2017-05-15 | 2020-10-20 | Apple Inc. | Optimizing dialogue policy decisions for digital assistants using implicit feedback |
US10909171B2 (en) | 2017-05-16 | 2021-02-02 | Apple Inc. | Intelligent automated assistant for media exploration |
US10403278B2 (en) | 2017-05-16 | 2019-09-03 | Apple Inc. | Methods and systems for phonetic matching in digital assistant services |
US10311144B2 (en) | 2017-05-16 | 2019-06-04 | Apple Inc. | Emoji word sense disambiguation |
US10303715B2 (en) | 2017-05-16 | 2019-05-28 | Apple Inc. | Intelligent automated assistant for media exploration |
US11217255B2 (en) | 2017-05-16 | 2022-01-04 | Apple Inc. | Far-field extension for digital assistant services |
US10748546B2 (en) | 2017-05-16 | 2020-08-18 | Apple Inc. | Digital assistant services based on device capabilities |
US11675829B2 (en) | 2017-05-16 | 2023-06-13 | Apple Inc. | Intelligent automated assistant for media exploration |
US11532306B2 (en) | 2017-05-16 | 2022-12-20 | Apple Inc. | Detecting a trigger of a digital assistant |
US10657328B2 (en) | 2017-06-02 | 2020-05-19 | Apple Inc. | Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling |
US10761720B2 (en) * | 2017-07-19 | 2020-09-01 | Kyocera Document Solutions Inc. | Display control device, display control method, and computer-readable storage medium non-transitorily storing display control program |
US20190026018A1 (en) * | 2017-07-19 | 2019-01-24 | Kyocera Document Solutions Inc. | Display control device, display control method, and computer-readable storage medium non-transitorily storing display control program |
US10445429B2 (en) | 2017-09-21 | 2019-10-15 | Apple Inc. | Natural language understanding using vocabularies with compressed serialized tries |
US10755051B2 (en) | 2017-09-29 | 2020-08-25 | Apple Inc. | Rule-based natural language processing |
US10636424B2 (en) | 2017-11-30 | 2020-04-28 | Apple Inc. | Multi-turn canned dialog |
US11556244B2 (en) * | 2017-12-28 | 2023-01-17 | Maxell, Ltd. | Input information correction method and information terminal |
US10733982B2 (en) | 2018-01-08 | 2020-08-04 | Apple Inc. | Multi-directional dialog |
US10733375B2 (en) | 2018-01-31 | 2020-08-04 | Apple Inc. | Knowledge-based framework for improving natural language understanding |
US10789959B2 (en) | 2018-03-02 | 2020-09-29 | Apple Inc. | Training speaker recognition models for digital assistants |
US10592604B2 (en) | 2018-03-12 | 2020-03-17 | Apple Inc. | Inverse text normalization for automatic speech recognition |
US11710482B2 (en) | 2018-03-26 | 2023-07-25 | Apple Inc. | Natural assistant interaction |
US10818288B2 (en) | 2018-03-26 | 2020-10-27 | Apple Inc. | Natural assistant interaction |
US10909331B2 (en) | 2018-03-30 | 2021-02-02 | Apple Inc. | Implicit identification of translation payload with neural machine translation |
US11854539B2 (en) | 2018-05-07 | 2023-12-26 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US11907436B2 (en) | 2018-05-07 | 2024-02-20 | Apple Inc. | Raise to speak |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
US11487364B2 (en) | 2018-05-07 | 2022-11-01 | Apple Inc. | Raise to speak |
US11900923B2 (en) | 2018-05-07 | 2024-02-13 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US11169616B2 (en) | 2018-05-07 | 2021-11-09 | Apple Inc. | Raise to speak |
US11145294B2 (en) | 2018-05-07 | 2021-10-12 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US10984780B2 (en) | 2018-05-21 | 2021-04-20 | Apple Inc. | Global semantic word embeddings using bi-directional recurrent neural networks |
US11630525B2 (en) | 2018-06-01 | 2023-04-18 | Apple Inc. | Attention aware virtual assistant dismissal |
US11009970B2 (en) | 2018-06-01 | 2021-05-18 | Apple Inc. | Attention aware virtual assistant dismissal |
US10403283B1 (en) | 2018-06-01 | 2019-09-03 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US10984798B2 (en) | 2018-06-01 | 2021-04-20 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US11495218B2 (en) | 2018-06-01 | 2022-11-08 | Apple Inc. | Virtual assistant operation in multi-device environments |
US10684703B2 (en) | 2018-06-01 | 2020-06-16 | Apple Inc. | Attention aware virtual assistant dismissal |
US11431642B2 (en) | 2018-06-01 | 2022-08-30 | Apple Inc. | Variable latency device coordination |
US11386266B2 (en) | 2018-06-01 | 2022-07-12 | Apple Inc. | Text correction |
US10892996B2 (en) | 2018-06-01 | 2021-01-12 | Apple Inc. | Variable latency device coordination |
US11360577B2 (en) | 2018-06-01 | 2022-06-14 | Apple Inc. | Attention aware virtual assistant dismissal |
US10720160B2 (en) | 2018-06-01 | 2020-07-21 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US10504518B1 (en) | 2018-06-03 | 2019-12-10 | Apple Inc. | Accelerated task performance |
US10496705B1 (en) | 2018-06-03 | 2019-12-03 | Apple Inc. | Accelerated task performance |
US10944859B2 (en) | 2018-06-03 | 2021-03-09 | Apple Inc. | Accelerated task performance |
US11010561B2 (en) | 2018-09-27 | 2021-05-18 | Apple Inc. | Sentiment prediction from textual data |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US10839159B2 (en) | 2018-09-28 | 2020-11-17 | Apple Inc. | Named entity normalization in a spoken dialog system |
US11170166B2 (en) | 2018-09-28 | 2021-11-09 | Apple Inc. | Neural typographical error modeling via generative adversarial networks |
US11893992B2 (en) | 2018-09-28 | 2024-02-06 | Apple Inc. | Multi-modal inputs for voice commands |
US11475898B2 (en) | 2018-10-26 | 2022-10-18 | Apple Inc. | Low-latency multi-speaker speech recognition |
US11638059B2 (en) | 2019-01-04 | 2023-04-25 | Apple Inc. | Content playback on multiple devices |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11783815B2 (en) | 2019-03-18 | 2023-10-10 | Apple Inc. | Multimodality in digital assistant systems |
US11705130B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | Spoken notifications |
US11217251B2 (en) | 2019-05-06 | 2022-01-04 | Apple Inc. | Spoken notifications |
US11675491B2 (en) | 2019-05-06 | 2023-06-13 | Apple Inc. | User configurable task triggers |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
US11475884B2 (en) | 2019-05-06 | 2022-10-18 | Apple Inc. | Reducing digital assistant latency when a language is incorrectly determined |
US11423908B2 (en) | 2019-05-06 | 2022-08-23 | Apple Inc. | Interpreting spoken requests |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
US11888791B2 (en) | 2019-05-21 | 2024-01-30 | Apple Inc. | Providing message response suggestions |
US11360739B2 (en) | 2019-05-31 | 2022-06-14 | Apple Inc. | User activity shortcut suggestions |
US11657813B2 (en) | 2019-05-31 | 2023-05-23 | Apple Inc. | Voice identification in digital assistant systems |
US11496600B2 (en) | 2019-05-31 | 2022-11-08 | Apple Inc. | Remote execution of machine-learned models |
US11289073B2 (en) | 2019-05-31 | 2022-03-29 | Apple Inc. | Device text to speech |
US11237797B2 (en) | 2019-05-31 | 2022-02-01 | Apple Inc. | User activity shortcut suggestions |
US11790914B2 (en) | 2019-06-01 | 2023-10-17 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
US11360641B2 (en) | 2019-06-01 | 2022-06-14 | Apple Inc. | Increasing the relevance of new available information |
CN110297777A (en) * | 2019-07-10 | 2019-10-01 | 北京百度网讯科技有限公司 | The assessment method and device of input method |
US11488406B2 (en) | 2019-09-25 | 2022-11-01 | Apple Inc. | Text detection using global geometry estimators |
US11295088B2 (en) | 2019-11-20 | 2022-04-05 | Apple Inc. | Sanitizing word predictions |
US11765209B2 (en) | 2020-05-11 | 2023-09-19 | Apple Inc. | Digital assistant hardware abstraction |
US11914848B2 (en) | 2020-05-11 | 2024-02-27 | Apple Inc. | Providing relevant data items based on context |
US11924254B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Digital assistant hardware abstraction |
US11838734B2 (en) | 2020-07-20 | 2023-12-05 | Apple Inc. | Multi-device audio adjustment coordination |
US11750962B2 (en) | 2020-07-21 | 2023-09-05 | Apple Inc. | User identification using headphones |
US11696060B2 (en) | 2020-07-21 | 2023-07-04 | Apple Inc. | User identification using headphones |
Also Published As
Publication number | Publication date |
---|---|
CN104423625A (en) | 2015-03-18 |
JP2015041845A (en) | 2015-03-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150058785A1 (en) | Character Input Device And Computer Readable Recording Medium | |
US20110248945A1 (en) | Mobile terminal | |
JP5910345B2 (en) | Character input program, information processing apparatus, and character input method | |
US20140333549A1 (en) | Input device, input method, and program | |
JP2009205303A (en) | Input method and input device | |
TW201305925A (en) | Handwritten character input device and handwritten character input method | |
CN103309596A (en) | Adjustment method of input method keyboard and mobile terminal thereof | |
KR20110088410A (en) | Portable electric apparatus | |
JP2014023080A (en) | Portable terminal device, program and input correction method | |
US8949731B1 (en) | Input from a soft keyboard on a touchscreen display | |
JP6085529B2 (en) | Character input device | |
JP6226057B2 (en) | Character input device and program | |
JP2014142681A (en) | Display control device, display control method and display control program | |
JP2006293987A (en) | Apparatus, method and program for character input, document creation apparatus, and computer readable recording medium stored with the program | |
KR20160082030A (en) | Method and apparatus for compensation of virtual keyboard | |
US11137902B2 (en) | Character input device, character input method, and character input program | |
JP2014167712A (en) | Information processing device, information processing method, and program | |
JP2014048783A (en) | Input device | |
US9804777B1 (en) | Gesture-based text selection | |
JP7124345B2 (en) | Character input device, character input method, and character input program | |
JP5712232B2 (en) | Input device | |
KR101255801B1 (en) | Mobile terminal capable of inputting hangul and method for displaying keypad thereof | |
KR101141728B1 (en) | Apparatus and method for inputing characters in small eletronic device | |
JP5521643B2 (en) | Mobile terminal device | |
KR101542862B1 (en) | Method for ergonomic recognition in touch screen keyboard |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: CASIO COMPUTER CO., LTD, JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OOKAWARA, HIROKAZU;REEL/FRAME:033655/0065; Effective date: 20140725 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |