CN104737107A - Touch panel-type input device, and control method and program thereof - Google Patents
Touch panel-type input device, and control method and program thereof
- Publication number
- CN104737107A CN104737107A CN201380053695.3A CN201380053695A CN104737107A CN 104737107 A CN104737107 A CN 104737107A CN 201380053695 A CN201380053695 A CN 201380053695A CN 104737107 A CN104737107 A CN 104737107A
- Authority
- CN
- China
- Prior art keywords
- touch sensor
- mentioned
- detection
- information
- mentioned touch
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 69
- 238000001514 detection method Methods 0.000 claims abstract description 74
- 230000008569 process Effects 0.000 claims abstract description 58
- 238000009826 distribution Methods 0.000 claims abstract description 33
- 230000008859 change Effects 0.000 claims abstract description 18
- 230000010365 information processing Effects 0.000 claims abstract description 13
- 230000035945 sensitivity Effects 0.000 claims description 26
- 239000004020 conductor Substances 0.000 description 25
- 210000003811 finger Anatomy 0.000 description 24
- 238000012545 processing Methods 0.000 description 15
- 238000012360 testing method Methods 0.000 description 13
- 238000005516 engineering process Methods 0.000 description 10
- 230000015654 memory Effects 0.000 description 10
- 238000003860 storage Methods 0.000 description 8
- 230000009471 action Effects 0.000 description 7
- 238000004891 communication Methods 0.000 description 7
- 238000010586 diagram Methods 0.000 description 7
- 238000012937 correction Methods 0.000 description 6
- 238000006073 displacement reaction Methods 0.000 description 4
- 230000006870 function Effects 0.000 description 4
- 239000011521 glass Substances 0.000 description 4
- 230000007246 mechanism Effects 0.000 description 4
- 239000000758 substrate Substances 0.000 description 4
- 230000005540 biological transmission Effects 0.000 description 3
- 230000000694 effects Effects 0.000 description 3
- 238000012546 transfer Methods 0.000 description 3
- 230000000007 visual effect Effects 0.000 description 3
- 230000007547 defect Effects 0.000 description 2
- 230000005674 electromagnetic induction Effects 0.000 description 2
- 230000033001 locomotion Effects 0.000 description 2
- 210000003813 thumb Anatomy 0.000 description 2
- 230000033228 biological regulation Effects 0.000 description 1
- 230000015572 biosynthetic process Effects 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000001678 irradiating effect Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 230000010355 oscillation Effects 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 238000003825 pressing Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 230000000630 rising effect Effects 0.000 description 1
- 239000000126 substance Substances 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F13/00—Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
- G06F13/14—Handling requests for interconnection or transfer
- G06F13/36—Handling requests for interconnection or transfer for access to common bus or bus system
- G06F13/368—Handling requests for interconnection or transfer for access to common bus or bus system with decentralised access control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/04166—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
- G06F3/041661—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving using detection at multiple resolutions, e.g. coarse and fine scanning; using detection within a limited area, e.g. object tracking window
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
- G06F3/0443—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a single layer of sensing electrodes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Provided is an input device comprising a touch panel, the touch panel comprising a touch sensor that detects a manipulation by an operator, and a display (35). The input device executes information processing on the basis of information input to the touch sensor. The touch sensor can change its detection output to the information processing means according to the position of an object located away from the touch sensor. On the basis of the distribution of the detection output of the touch sensor, the input device further determines whether a manipulation of the touch sensor was carried out with the operator's right hand or with the left hand.
Description
Technical field
The present invention relates to a touch-panel input device, a control method therefor, and a program, and in particular to a touch-panel input device that detects an operator's input characteristics, a control method therefor, and a program.
Background art
Various technologies have been proposed in the past for touch-panel input devices. Of particular interest, from the viewpoint of improving convenience, are technologies that determine characteristics of the user's operation of such an input device, such as the user's dominant hand.
For example, patent document 1 (JP 2012-27581 A) discloses a portable terminal provided with sensors on the back and sides of the surface on which the keyboard is arranged. These sensors output coordinate information on perceived contact. The portable terminal detects the holding state from the output coordinate information, estimates the movable range of the thumb from that information, and displays the keyboard within the estimated movable range.
Patent document 2 (JP 2008-242958 A) discloses an input device in which input is performed by pressing one or more buttons displayed on a touch panel. In this input device, a contact detection region on the touch panel is defined for each button. The input device further comprises: a history recording unit that records past input information; a first judging unit that judges whether the user's contact position is included in any of the contact detection regions of the buttons; a second judging unit that, when the first judging unit judges that the contact position is included in none of the contact detection regions, uses the recorded history information to judge whether any contact detection region can be considered to include the contact position; and a position adding unit that, when the second judging unit judges that it can, adds the contact position to the judged contact detection region. By learning the user's history of contact positions on the touch panel, the device thus automatically corrects the deviation between the contact position and the standard key position.
Patent document 3 (JP 2011-164746 A) discloses a terminal device that accepts input from a stylus. The terminal device comprises an electromagnetic-induction detection unit and a capacitive finger detection unit. The electromagnetic-induction detection unit acquires the pen-tip coordinates (Xp, Yp) of the stylus; the capacitive finger detection unit acquires the palm coordinates (Xh, Yh). When the pen-tip X coordinate Xp is smaller than the palm X coordinate Xh, the terminal device sets a GUI (Graphical User Interface) for right-handed users; conversely, when Xp is larger than Xh, it sets a GUI for left-handed users.
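The decision rule described for patent document 3 reduces to a single coordinate comparison. The following sketch illustrates it; the function name and the string labels are hypothetical, not taken from the patent:

```python
def classify_handedness(pen_x: float, palm_x: float) -> str:
    """Guess handedness from the pen-tip X coordinate (Xp) and the
    palm X coordinate (Xh), as described for patent document 3:
    a pen tip to the left of the palm suggests a right-handed user."""
    if pen_x < palm_x:
        return "right-handed"
    if pen_x > palm_x:
        return "left-handed"
    return "unknown"  # coordinates coincide; no decision possible
```

Note that this approach presupposes that both the pen tip and the palm land inside the sensing area, which is the limitation discussed later in the summary section.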
Patent document 4 (JP 2011-81646 A) discloses a display terminal that accepts input from a stylus. When the display terminal is touched with the stylus, the tilt direction is detected from the detection output of a sensor built into the stylus. The display terminal judges the user's dominant hand from the detected tilt direction, and controls the settings of the UI (User Interface) according to the result. The user can thus operate the display terminal through a UI suited to the dominant hand, without extra operations.
Patent document 5 (JP H08-212005 A) discloses a three-dimensional position-recognizing touch panel device. This touch panel device has: a plurality of sensors, arranged in the direction perpendicular to the display surface, which detect the position of an object inserted into the space above it; a computing unit that calculates, from the detection results of the sensors, the position on the display surface indicated by the object; and a display unit that displays an indication point representing the position calculated by the computing unit. The touch panel device confirms the input when the sensor nearest the display surface detects the inserted object, or when the indication point is judged to have remained for a set time within a prescribed coordinate region representing the input area. In addition, the touch panel device maps the detected position of the tip of the object to a display magnification. The user can thereby perform input operations without the finger or the like directly touching the display surface, and can complete, in a single action, input that would otherwise require repeated magnification operations.
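The mapping from object height to display magnification described for patent document 5 can be sketched as a simple interpolation. All constants here (maximum height, maximum zoom) are illustrative assumptions, not values from the patent:

```python
def display_magnification(height_mm: float,
                          max_height_mm: float = 30.0,
                          max_zoom: float = 3.0) -> float:
    """Map the height of a fingertip above the panel to a display
    magnification: the closer the finger, the stronger the zoom,
    so zooming and selecting can happen in one approach gesture."""
    h = min(max(height_mm, 0.0), max_height_mm)   # clamp to sensed range
    # Linear interpolation: max_zoom at the surface, 1.0 at max height.
    return max_zoom - (max_zoom - 1.0) * (h / max_height_mm)
```

A real device would presumably quantize the height to the discrete sensor layers stacked above the display rather than measure it continuously.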
Patent document 6 (JP 2012-073658 A) discloses a computer system that performs multi-window operation. In this computer system, the window system assigns each of a plurality of concurrently running application programs its own window. A motion sensor intermittently irradiates light onto the hand of the user moving in three-dimensional space, which serves as the pointing device; it captures images during irradiation and during non-irradiation, analyzes the difference image of the two, and thereby detects the user's hand. The window system controls the windows based on the hand information detected by the motion sensor.
Patent document 7 (JP 2011-180712 A) discloses a projection-type video display apparatus. The projection unit of the apparatus projects an image onto a screen. A video camera captures at least the region containing the image projected onto the screen, and an infrared camera captures the space directly above the screen. A contact judging unit judges, from the image captured by the infrared camera, whether the user's finger is in contact with the screen. When the contact judging unit judges that the finger is in contact with the screen, a coordinate determination unit outputs the coordinates of the user's fingertip, obtained from the image captured by the video camera, as the indicated position on the projected image. Touch operation of the projected image by the user is thereby realized.
Patent document 8 (JP 2001-312369 A) discloses an input device that uses an optical sensor to detect, within a measurement space, the approach of an operating point toward the detection plate, and determines the selected item based on this detection output and the contact position on the screen. This prevents erroneous input in situations such as: devices in which there is a discrepancy between the position detected just before screen contact and the position detected at contact; cases where the operating position is displaced toward an adjacent target because the operator views and operates the screen from an angle rather than from directly in front; and cases where the operator's hand shakes and the operating point jitters. Even if the indication point touches a position slightly different from the item that has until then been highlighted in another color, the item the operator intends can still be selected.
Prior art documents
Patent documents
Patent document 1: JP 2012-27581 A
Patent document 2: JP 2008-242958 A
Patent document 3: JP 2011-164746 A
Patent document 4: JP 2011-81646 A
Patent document 5: JP H08-212005 A
Patent document 6: JP 2012-073658 A
Patent document 7: JP 2011-180712 A
Patent document 8: JP 2001-312369 A
Summary of the invention
Problems to be solved by the invention
Detecting a user's input characteristics, such as the dominant hand, in the input device, and controlling the processing of input information based on those characteristics, can be expected to contribute to reducing erroneous input, and is therefore important. In particular, when a user inputs information to a touch panel, which of the user's right hand and left hand performs the operation matters, for the following reason. A user inputs information to a touch panel directly with a finger of the dominant hand, or with a stylus held in the dominant hand. Depending on which hand is used, even when the user intends to touch the same point, the detected touch position may deviate from the actual one. Moreover, the extent of this deviation may change with the degree of tilt of the finger or stylus during input.
However, when the prior art is used to detect the user's input characteristics, the following problems can be anticipated. The technologies described in patent document 1 and patent documents 4 to 8 require new sensors, raising the manufacturing cost of the device. The technology described in patent document 2 relies on past erroneous-input information, and optimizing the judgement of the contact position takes time; a considerable period therefore elapses, after the user starts using the device, before the user's input characteristics can be grasped. The technology described in patent document 3 requires a reasonably wide input area in order to judge the dominant hand from the input positions of the pen and the finger; as the miniaturization of communication devices progresses, the number of devices to which this technology cannot easily be applied can be expected to increase.
The present invention was made in view of these circumstances, and its object is to detect a user's input characteristics in a touch-panel input device without requiring a dedicated sensor, as soon as possible after the user starts using the device, and even when the device is small.
Means for solving the problems
According to one aspect, an input device comprising a touch panel is provided, the touch panel including a touch sensor that detects an operation using an operating body. The input device further comprises an information processing unit that executes information processing based on the information input to the touch sensor. The touch sensor can change its detection output to the information processing unit according to the position of an object located away from the touch sensor. Based on the distribution of the detection output of the touch sensor, the information processing unit judges whether an operation on the touch sensor was performed with the operator's right hand or with the left hand. In addition, based on the distribution of the detection output of the touch sensor, the information processing unit obtains information for determining the degree of tilt of the operating body relative to the touch sensor.
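The core idea — that the hovering part of the finger or hand leaves a faint, asymmetric tail in the capacitance distribution around the contact peak — can be sketched as follows. This is a minimal illustration under assumed conventions (the tail extends horizontally toward the operator's hand); the function name, grid encoding, and sign convention are all hypothetical:

```python
def analyze_touch(grid):
    """Given a 2-D list of per-electrode detection outputs, locate the
    contact peak and the weighted centroid of the whole response.  The
    signed horizontal offset between them hints at which hand is used
    (the response trails toward the hovering hand), and its magnitude
    serves as an indicator of the operating body's tilt."""
    # Peak cell = assumed contact point.
    _, px, _ = max((v, x, y) for y, row in enumerate(grid)
                   for x, v in enumerate(row))
    total = sum(v for row in grid for v in row)
    cx = sum(x * v for row in grid for x, v in enumerate(row)) / total
    skew = cx - px                      # signed horizontal skew
    hand = "right" if skew > 0 else "left" if skew < 0 else "unknown"
    return hand, abs(skew)             # hand guess, tilt indicator
```

A symmetric distribution yields no decision, which matches the intuition that a finger held perpendicular to the panel reveals little about the hand holding it.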
Preferably, the touch sensor raises its detection sensitivity on condition that a touch operation has been performed on it, and returns the sensitivity to the pre-raise level on condition that the judgement by the information processing unit has finished.
Also preferably, the touch sensor raises the detection sensitivity only for the portion that includes the part where the touch operation was detected.
Preferably, the touch sensor raises its detection frequency on condition that a touch operation has been performed on it, and returns the frequency to the pre-raise level on condition that the judgement by the information processing unit has finished. Likewise, the information processing unit raises the frequency at which it obtains the detection output from the touch sensor on condition that a touch operation has been performed on the touch sensor, and returns this frequency to the pre-raise level on condition that the judgement has finished.
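The raise-and-restore policy for sensitivity and scan frequency can be sketched as a small state holder. The class name and the numeric settings are illustrative assumptions, not values from the patent:

```python
class TouchScanController:
    """Sketch of the scan-control policy: a touch raises the sensor's
    detection sensitivity and scan frequency so the faint response of
    the hovering hand can be captured, and the original settings are
    restored once the right/left-hand judgement has finished."""
    NORMAL = {"sensitivity": 1.0, "scan_hz": 60}
    BOOSTED = {"sensitivity": 4.0, "scan_hz": 240}

    def __init__(self):
        self.settings = dict(self.NORMAL)

    def on_touch_down(self):
        self.settings = dict(self.BOOSTED)   # condition: touch occurred

    def on_judgement_done(self):
        self.settings = dict(self.NORMAL)    # condition: judgement finished
```

Restricting the boost to the electrodes around the detected contact, as the preceding paragraph suggests, would keep the power cost of the higher sensitivity and scan rate bounded.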
Preferably, the information processing unit corrects the positional information that is the object of information input to the touch sensor, based on the result of the judgement and on the degree of tilt.
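Such a correction might look like the following sketch, which shifts the reported coordinate against the systematic offset introduced by hand and tilt. The direction of the offset and the calibration constant are assumptions for illustration only:

```python
def correct_touch_point(x, y, hand, tilt):
    """Shift a reported touch coordinate to compensate a systematic
    offset: assuming a tilted operating body held in the right hand
    registers to the right of the intended point, and vice versa."""
    OFFSET_PER_TILT = 0.5                  # hypothetical calibration: px per tilt unit
    shift = OFFSET_PER_TILT * tilt
    if hand == "right":
        return (x - shift, y)              # pull the point back leftward
    if hand == "left":
        return (x + shift, y)
    return (x, y)                          # unknown hand: leave unchanged
```

In practice the sign and magnitude of the offset would have to be calibrated per device, and possibly per axis.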
According to another aspect, a control method executed by a computer of an input device comprising a touch panel is provided, the touch panel including a touch sensor that detects an operation using an operating body. The control method comprises a step of executing information processing based on the information input to the touch sensor. The touch sensor can change its detection output to the information processing unit according to the position of an object located away from the touch sensor. The step of executing information processing includes: judging, based on the distribution of the detection output of the touch sensor, whether an operation on the touch sensor was performed with the operator's right hand or with the left hand; and obtaining, based on the distribution of the detection output of the touch sensor, information for determining the degree of tilt of the operating body relative to the touch sensor.
According to another aspect, a program executed by a computer of an input device comprising a touch panel is provided, the touch panel including a touch sensor that detects an operation using an operating body. The program causes the computer to execute a step of performing information processing based on the information input to the touch sensor. The touch sensor can change its detection output to the information processing unit according to the position of an object located away from the touch sensor. The step of performing information processing includes: judging, based on the distribution of the detection output of the touch sensor, whether an operation on the touch sensor was performed with the operator's right hand or with the left hand; and obtaining, based on the distribution of the detection output of the touch sensor, information for determining the degree of tilt of the operating body relative to the touch sensor.
Effects of the invention
According to one aspect, the input device judges, based on the distribution of the detection output of the touch sensor, whether the operator is operating the touch sensor with the right hand or with the left hand. In addition, based on the distribution of the detection output of the touch sensor, the input device obtains information for determining the degree of tilt of the operating body relative to the touch sensor.
As a result, the touch-panel input device can detect the user's input characteristics without a dedicated sensor, as soon as possible after the user starts using it, and even when the device is small.
Brief description of the drawings
Fig. 1 is a diagram showing the appearance of an input terminal according to an embodiment of the touch-panel input device.
Fig. 2 is a diagram for schematically explaining an example of a form of operation on the display of the input terminal.
Fig. 3 is a diagram for explaining a defect caused by displacement of the touch position detected by the touch sensor relative to the actual touch position.
Fig. 4 is a diagram schematically showing an example of a form of operation on the display when the operator uses a stylus to input information to the display.
Fig. 5 is a diagram for explaining the mechanism by which the input terminal detects, with the touch sensor, an operating position on the display.
Fig. 6 is a diagram showing an example of the arrangement of the touch sensor in the input terminal.
Fig. 7 is a diagram for explaining the influence that a portion of a conductor not in contact with the display has on the distribution of the electrostatic capacitance of the electrodes.
Fig. 8 is a diagram for explaining the influence that a portion of a conductor not in contact with the display has on the distribution of the electrostatic capacitance of the electrodes.
Fig. 9 is a diagram schematically showing the distribution of the detection outputs of the electrode pairs arranged two-dimensionally over the entire area of the touch sensor.
Fig. 10 is a block diagram showing an example of the hardware configuration of the input terminal.
Fig. 11 is a flowchart of processing executed in the input terminal to detect a touch operation.
Fig. 12 is a diagram showing changes in the operating mode of the input terminal that accompany changes in the sensitivity of the touch sensor.
Embodiment
Hereinafter, embodiments of the touch-panel input device will be described with reference to the drawings. In the following description, components having the same functions and effects are given the same reference numerals throughout the figures, and their description is not repeated.
[Appearance of the input device]
Fig. 1 is a diagram showing the appearance of an input terminal 1 according to an embodiment of the touch-panel input device. Referring to Fig. 1, the input terminal 1 includes a display 35 and an input key 25A on its outer surface. The display 35 is a touch panel formed integrally with a touch sensor 40 described later. In the present embodiment, the input terminal 1 is realized as a smartphone (a high-function mobile phone). As long as the input terminal 1 can provide the information processing functions described in this specification, it may also be realized as another kind of device, such as a tablet terminal or a mobile phone. In the present embodiment, the touch sensor 40 and the display 35 are formed integrally, so a touch operation on the touch sensor 40 may also be appropriately referred to as a "touch operation on the touch panel" or a "touch operation on the display 35". The touch sensor 40 and the display 35 need not be formed integrally and may be formed separately.
[Overview of the processing]
Fig. 2 is a diagram for schematically explaining an example of a form of operation on the display 35 of the input terminal 1. Referring to Fig. 2, on the input terminal 1 the operator can input information to the display 35 with either the right hand or the left hand. In Fig. 2, a hand 202 represents the position of the operator's hand and finger relative to the display 35 when the operator inputs information with the left hand. A hand 204 represents the position of the operator's hand and finger relative to the display 35 when the operator inputs information with the right hand. A right-handed operator mainly inputs information to the display 35 with the right hand, and a left-handed operator mainly does so with the left hand.
As can be understood from Fig. 2, when the operator's right hand (hand 204) inputs information to the display 35, the finger used for input extends toward the display 35 from its right side. One tendency in this case is that the finger contacts the display 35 at a position slightly to the right of the point the operator intends. Another tendency is that, when the detection output of the touch sensor of the display 35 can be influenced by a finger or stylus at a position away from the display 35, the touch position detected by the touch sensor is shifted to the right of the position the user actually touched. A further tendency is that the distribution of the detection output of the touch sensor is then skewed toward the right of the touched portion. The reason is that while the fingertip of the user's finger contacts the display 35, the part of the finger away from the tip is also close to the display 35, on the right side of the contact position. Examples of touch sensors whose detection output can be influenced in this way include sensors that adopt an electrostatic-capacitance method or an infrared method as the detection method.
Likewise, as can be understood from Fig. 2, when the operator's left hand (hand 202) inputs information to the display 35, the finger used for input extends toward the display 35 from its left side. One tendency in this case is that the finger contacts the display 35 at a position slightly to the left of the point the operator intends. Another tendency is that, when the detection output of the touch sensor of the display 35 can be influenced by a finger or stylus at a position away from the display 35, the touch position detected by the touch sensor is shifted to the left of the position the user actually touched. A further tendency is that the distribution of the detection output of the touch sensor is then skewed toward the left of the touched portion.
Fig. 3 is a diagram for explaining a defect caused by displacement of the touch position detected by the touch sensor relative to the actual touch position. Fig. 3(A) shows an example of ideal handwriting input to the display 35, where the user's actual touch position is detected and processed as the touch position in the input terminal 1. Fig. 3(B) shows an example of handwriting input to the display 35 when no such correction is performed.
The degree of the above displacement is also influenced by the direction in which the operator moves the finger while continuing the touch input. For example, there are tendencies such as the shift amount being smaller when the finger moves from right to left and larger when the finger moves from left to right. That is, even while drawing a single line, if the moving direction of the finger changes in the course of drawing, such as when drawing a curve, the amount of displacement changes as well. Therefore, as shown in Fig. 3(B), even if the user touches the display 35 along the original trajectory of a character, the trajectory detected by the display 35 may differ from the trajectory the user actually traced. In contrast, according to the present embodiment, if the user touches the display 35 along the original trajectory of a character, the display 35 detects that trajectory as it is, as shown in Fig. 3(A).
Fig. 4 is a diagram schematically showing an example of a form of operation on the display 35 when the operator uses a stylus to input information to the display 35. Referring to Fig. 4, on the input terminal 1 the operator can input information to the display 35 with a stylus 210 held in the right hand 208, and can also input information with the stylus 210 held in the left hand 206. A right-handed operator mainly holds the stylus 210 in the right hand to input information to the display 35, and a left-handed operator mainly holds it in the left hand.
In the present embodiment, it is judged whether the operator inputs information to the display 35 with the right hand or with the left hand. The input terminal 1 then corrects the detection output of the touch sensor of the display 35 and/or adjusts the display contents of the display 35 according to the result of this judgment.
An example of correcting the detection output is shifting the touch position determined from the detection output of the touch sensor to the left when the operator is judged to be right-handed. The degree of this shift may be changed according to the form of the distribution of the detection output of the touch sensor. For example, the larger the judged inclination of the user's finger or hand (stylus 210), the larger the shift amount may be made.
An example of adjusting the display contents of the display 35 is adjusting the arrangement of icons shown on the display 35. More specifically, the icon arrangement is adjusted so that frequently used icons are placed more toward the right side when the operator is judged to be right-handed, and more toward the left side when the operator is judged to be left-handed.
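As one illustrative reading of this icon adjustment, the sketch below reorders icons by usage frequency toward the side of the judged dominant hand. The function name, the `columns` layout, and the usage-count input are assumptions for illustration and are not specified in the patent.

```python
def arrange_icons(icons, usage_count, handedness, columns=4):
    """Lay out icons row by row so that more frequently used icons
    end up on the side of the judged dominant hand (hypothetical)."""
    ranked = sorted(icons, key=lambda name: usage_count.get(name, 0),
                    reverse=True)
    rows = [ranked[i:i + columns] for i in range(0, len(ranked), columns)]
    if handedness == "right":
        # most-used icons toward the right edge of each row
        rows = [list(reversed(row)) for row in rows]
    return rows
```

Under this sketch, a right-handed judgment puts the most-used icon at the right end of each row, and a left-handed judgment at the left end.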
[Example of the detection mechanism]
The following describes the mechanism by which the input terminal 1 detects whether the hand with which the operator operates is the right hand or the left hand. In this specification, detecting that the operating hand is the right hand is sometimes called detecting that the operator is right-handed, because a right-handed operator usually operates with the right hand. Likewise, detecting that the operating hand is the left hand is sometimes called detecting that the operator is left-handed, because a left-handed operator usually operates with the left hand.
Fig. 5 is a diagram for explaining the mechanism by which the input terminal 1 detects, with the touch sensor, an operating position on the display 35.
Fig. 5 schematically shows a cross-section of the touch sensor 40. The touch sensor 40 includes: a glass substrate 40C; electrode pairs 40X arranged on the glass substrate 40C; and a protective cover 40D arranged on the electrode pairs 40X. The electrode pairs 40X may be arranged not on the glass substrate 40C but on the protective cover 40D. The touch sensor 40 is arranged on the front side of the display 35, which displays the control state and the like of the input terminal 1. The operator therefore views the content of the display 35 through the touch sensor 40. The present embodiment describes the case where the display 35 and the touch sensor 40 together constitute the touch panel.
The touch sensor 40 may instead be arranged on the back side of the display 35. In this case, the operator views the content of the display 35 from the front surface of the input terminal 1 and performs touch operations on the back surface of the input terminal 1.
Each electrode pair 40X comprises electrode 40A and electrode 40B.
The electrostatic capacitance of the electrode 40A and of the electrode 40B of each electrode pair 40X changes when a conductor approaches the electrodes 40A and 40B. More specifically, as shown in Fig. 5, when a finger F (of the operator), as an example of a conductor, approaches an electrode pair 40X, the electrostatic capacitance of each of the electrodes 40A and 40B changes according to the distance to the finger F. In Fig. 5, the distances of the electrodes 40A and 40B to the finger F are denoted RA and RB, respectively. In the input terminal 1, as shown in Fig. 6, the electrode pairs 40X are arranged over the entire area of the touch sensor 40 (overlapping the display 35 in Fig. 6), for example in a matrix. The input terminal 1 detects the electrostatic capacitance of the electrodes 40A and 40B of each electrode pair 40X independently. The input terminal 1 can thus obtain the distribution of the amount of change in electrostatic capacitance of the electrodes 40A and 40B over the entire area of the touch sensor 40 (the entire area of the display 35). The input terminal 1 then determines the touch position on the display 35 based on this distribution of the amount of change.
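The step of determining a touch position from the distribution can be sketched as follows. The grid values stand in for per-electrode-pair capacitance changes, and a simple argmax over the grid plays the role of the (known, unspecified) coordinate determination; this is a minimal illustration, not the actual method.

```python
def touch_position(delta_cap):
    """Return the (x, y) grid cell with the largest capacitance change,
    a minimal stand-in for determining the touch position from the
    distribution of the electrode pairs' detection outputs."""
    best_xy, best_val = None, float("-inf")
    for y, row in enumerate(delta_cap):
        for x, value in enumerate(row):
            if value > best_val:
                best_val, best_xy = value, (x, y)
    return best_xy
```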
Even in a state where the conductor is not in contact with the display 35 (touch panel), the electrostatic capacitance of the electrodes 40A and 40B can be influenced by the position of the conductor (its distance to each of the electrodes 40A and 40B). Thus, as explained with reference to Fig. 2, the distribution of the electrostatic capacitance of the electrodes 40A and 40B is influenced by whether the operator operates with the right hand or with the left hand. Similarly, as explained with reference to Fig. 4, the distribution is influenced by whether the operator holds the stylus 210 in the right hand or in the left hand. Figs. 7 and 8 are diagrams for explaining the influence that a portion of the conductor not in contact with the display 35 (touch panel) has on the distribution of the electrostatic capacitance of the electrodes 40A and 40B.
Fig. 7(A) shows a state in which, on the input terminal 1, the stylus 210 is in contact with the display 35 without tilting to either the left or the right relative to the display 35. In Fig. 7(A), the left-right direction is indicated by a line L1.
Fig. 7(B) shows an example of the detection output corresponding to Fig. 7(A): the detection output of the electrostatic capacitance of the electrode pairs 40X arranged on the line L1. The vertical axis of the graph in Fig. 7(B) corresponds to the electrostatic capacitance. The horizontal axis corresponds to information (sensor IDs) identifying the respective electrode pairs 40X arranged on the line L1. An output E11 shown in Fig. 7(B) corresponds to the electrostatic capacitance of the electrodes 40B.
Fig. 8(A) shows a state in which, on the input terminal 1, the stylus 210 is in contact with the display 35 while tilting to the right relative to the display 35. In Fig. 8(A), the left-right direction is indicated by a line L2.
Fig. 8(B) shows an example of the detection output corresponding to Fig. 8(A): the detection output of the electrostatic capacitance of the electrode pairs 40X arranged on the line L2. The vertical axis of the graph in Fig. 8(B) corresponds to the electrostatic capacitance. The horizontal axis corresponds to information (sensor IDs) identifying the respective electrode pairs 40X arranged on the line L2. An output E21 shown in Fig. 8(B) corresponds to the electrostatic capacitance of the electrodes 40B.
In Fig. 8(B), the output E21 falls from its peak with a gentler gradient on the right side, indicated by the hollow arrow A21, than on the left side. Thus, in the input terminal 1, when a conductor (the stylus 210) is tilted over the display 35, the distribution of the detection outputs of the electrodes 40A and 40B of the touch sensor 40 is also skewed toward the same side as the inclination.
Fig. 9 is a diagram schematically showing the distribution of the detection outputs of the electrode pairs 40X arranged two-dimensionally (for example, in a matrix) over the entire area of the touch sensor 40. Fig. 9 shows the detection output corresponding to the state of Fig. 8(A). As shown in Fig. 9, the input terminal 1 obtains the detection outputs of the electrodes 40A and 40B of each electrode pair 40X arranged two-dimensionally over the entire area of the touch sensor 40. The input terminal 1 determines the position, in the vertical direction, at which the peak of the detection output is located. Then, as shown in Fig. 8(B), the degree of inclination of the conductor is estimated from the distribution of the detection output in the left-right direction at the determined position.
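A minimal sketch of this left/right skew test follows, under the assumption (consistent with Fig. 8(B)) that a conductor tilted to one side leaves more of the detection output on that side of the peak. The comparison by summed "mass", and the profile values in the usage below, are illustrative only.

```python
def estimate_tilt(profile):
    """Given detection outputs along the left-right line through the
    peak, compare the mass on each side of the peak.  A heavier right
    tail (gentler right slope) is read as a rightward tilt."""
    peak = max(range(len(profile)), key=profile.__getitem__)
    left_mass = sum(profile[:peak])
    right_mass = sum(profile[peak + 1:])
    if right_mass > left_mass:
        return "right"
    if left_mass > right_mass:
        return "left"
    return "none"   # cannot discriminate
```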
As described above, the input terminal 1 uses the relationship between the inclination of the conductor and the distribution of the detection outputs of the electrodes 40A and 40B, and estimates the inclination of the conductor from the distribution of the detection output. Then, based on the result of estimating the inclination of the conductor, the input terminal 1 judges whether the operator inputs information to the display 35 with the right hand or with the left hand.
Here, the tendency explained with reference to Fig. 2, in which the touch position detected by the touch sensor 40 is displaced to the right or to the left of the actual touch position depending on the inclination of the conductor, is described.
As explained with reference to Fig. 5, the electrostatic capacitance of each of the electrodes 40A and 40B of the touch sensor 40 can be influenced by the distance to a conductor. Thus, even when a conductor is not in contact with the touch sensor 40, the electrostatic capacitance of the electrodes 40A and 40B may be influenced if the conductor is near the touch sensor 40. When the operator inputs information to the display 35 with the right hand, a conductor (the operator's right hand) is expected to be present near the surface of the display 35, to the right of the point contacted by the operator's finger or the stylus 210. Originally, in the graphs shown in Fig. 7(B) and Fig. 8(B), the peak of the electrostatic capacitance coincides with the point contacted by the operator's finger or the stylus 210. However, because a conductor is present near the surface of the display 35, a situation is conceivable in which the position of this peak is shifted to the right of the position the user originally intended to touch. For example, when a right-handed user operates with a finger, the place the user intends to touch is often not directly below the finger but slightly to its left; yet the center of the area where the finger pad contacts the touch sensor 40, and hence the peak, is often shifted slightly to the right. The same can be expected when the operator inputs information to the display 35 with the left hand.
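The handedness-dependent correction of the detected coordinate might look like the following sketch. The shift amounts, the `tilt_ratio` scaling, and the default `max_shift` are invented for illustration; the patent only states that the shift direction depends on the judged hand and that its magnitude can grow with the estimated inclination.

```python
def correct_touch_x(detected_x, handedness, tilt_ratio, max_shift=6):
    """Shift the detected x-coordinate back toward the intended point:
    a right-hand touch tends to register to the right of where the user
    aimed, so shift left (and vice versa); the shift grows with the
    estimated inclination (tilt_ratio in [0, 1])."""
    shift = round(max_shift * min(max(tilt_ratio, 0.0), 1.0))
    if handedness == "right":
        return detected_x - shift
    if handedness == "left":
        return detected_x + shift
    return detected_x
```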
[Hardware configuration]
The hardware configuration of the input terminal 1 is described with reference to Fig. 10. Fig. 10 is a block diagram showing an example of the hardware configuration of the input terminal 1.
The input terminal 1 includes: a CPU 20, an antenna 23, a communication device 24, hardware buttons 25, a camera 26, a flash memory 27, a RAM (Random Access Memory) 28, a ROM 29, a memory card drive 30, a microphone 32, a speaker 33, an audio signal processing circuit 34, the display 35, an LED (Light Emitting Diode) 36, a data communication interface 37, a vibrator 38, a gyro sensor 39, and the touch sensor 40. A memory card 31 can be attached to the memory card drive 30.
The antenna 23 receives signals transmitted from base stations, and transmits signals for communicating with other communication devices via a base station. A signal received by the antenna 23 is subjected to front-end processing by the communication device 24, and the processed signal is passed to the CPU 20.
The touch sensor 40 accepts touch operations on the input terminal 1 and passes the coordinate values of the point at which a touch operation is detected to the CPU 20. The CPU 20 executes predetermined processing according to these coordinate values and the operating mode of the input terminal 1.
As described above, the CPU 20 can judge, from the detection output of the touch sensor 40, whether the operator used the right hand or the left hand for the touch operation. The CPU 20 can also correct the detected coordinate values of the point of the touch operation based on the result of this judgment. In Fig. 10, these functions of the CPU 20 are shown as a judging unit 20A and a correcting unit 20B.
The hardware buttons 25 include the input key 25A. When a button included in the hardware buttons 25 is operated externally, a signal corresponding to that button is input to the CPU 20.
The CPU 20 executes processing for controlling the operation of the input terminal 1 based on instructions given to the input terminal 1. When the input terminal 1 receives a signal, the CPU 20 executes predetermined processing based on the signal passed from the communication device 24 and passes the processed signal to the audio signal processing circuit 34. The audio signal processing circuit 34 performs predetermined signal processing on this signal and passes the processed signal to the speaker 33. The speaker 33 outputs sound based on this signal.
The microphone 32 accepts speech directed at the input terminal 1 and passes a signal corresponding to the uttered speech to the audio signal processing circuit 34. The audio signal processing circuit 34 executes predetermined processing for a call based on this signal and passes the processed signal to the CPU 20. The CPU 20 converts this signal into transmission data and passes the converted data to the communication device 24. The communication device 24 generates a transmission signal from this data and sends the signal to the antenna 23.
The flash memory 27 stores data passed from the CPU 20. The CPU 20 also reads data stored in the flash memory 27 and executes predetermined processing using the data.
The RAM 28 temporarily holds data generated by the CPU 20 based on operations on the touch sensor 40 or other operations on the input terminal. The ROM 29 stores programs and data for causing the input terminal 1 to execute predetermined operations. The CPU 20 reads these programs and data from the ROM 29 and controls the operation of the input terminal 1.
The memory card drive 30 reads data stored on the memory card 31 and passes it to the CPU 20. The memory card drive 30 writes data output by the CPU 20 into free space on the memory card 31. The memory card drive 30 deletes data stored on the memory card 31 based on commands received by the CPU 20.
The memory card drive 30 may be replaced with a media drive that reads and writes information on a recording medium of a form other than the memory card 31. Examples of such recording media include media that store programs in a nonvolatile manner, such as a CD-ROM (Compact Disk - Read Only Memory), a DVD-ROM (Digital Versatile Disk - Read Only Memory), a Blu-ray disc, a USB (Universal Serial Bus) memory, a memory card, an FD (Flexible Disk), a hard disk, a magnetic tape, a cassette tape, an MO (Magneto-Optical Disk), an MD (Mini Disc), an IC (Integrated Circuit) card (excluding memory cards), an optical card, a mask ROM, an EPROM, and an EEPROM (Electronically Erasable Programmable Read-Only Memory).
The audio signal processing circuit 34 executes the signal processing for the calls described above. In the example shown in Fig. 10, the CPU 20 and the audio signal processing circuit 34 are shown as separate components, but in another aspect the CPU 20 and the audio signal processing circuit 34 may be integrated.
The display 35 displays, based on data obtained from the CPU 20, the image specified by the data. For example, it displays still images, moving images, and the attributes of music files (the title, artist, playing time, and the like of each file) stored in the flash memory 27.
The LED 36 performs a predetermined light-emitting operation based on signals from the CPU 20.
A data communication cable is attached to the data communication interface 37. The data communication interface 37 transfers signals output from the CPU 20 to this cable, and passes data received via this cable to the CPU 20.
The vibrator 38 performs a vibrating operation at a predetermined frequency based on signals output from the CPU 20.
The gyro sensor 39 detects the orientation of the input terminal 1 and sends the detection result to the CPU 20. The CPU 20 detects the attitude of the input terminal 1 based on this detection result. More specifically, the housing of the input terminal 1 has a rectangular shape, as shown in Fig. 1 and elsewhere. Based on the detection result, the CPU 20 detects the attitude of the housing of the input terminal 1, such as whether the longitudinal direction of the rectangle is aligned with the up-down direction or the left-right direction of the user viewing the display 35. Known techniques can be adopted for detecting the attitude of the housing of the input terminal 1 from the detection result of the gyro sensor 39, so a detailed description is not repeated here. The gyro sensor 39 may also be replaced with any component that obtains data for detecting the attitude of the housing of the input terminal 1.
[Touch operation detection processing]
The content of the processing for detecting a touch operation on the display 35 is described below with reference to Fig. 11. Fig. 11 is a flowchart of the processing that the CPU 20 executes in the input terminal 1 to detect a touch operation. The input terminal 1 continues to execute the processing of Fig. 11 while operating in a mode that accepts touch operations on the touch sensor 40.
Referring to Fig. 11, in step S10 the CPU 20 judges whether there is a touch operation on the touch sensor 40. When it judges that there is no touch operation, the CPU 20 waits until such an operation is detected; when it judges that there is a touch operation, it advances the processing to step S20. As explained with reference to Fig. 7(B) and elsewhere, the CPU 20 judges that there is a touch operation when the absolute value of the electrostatic capacitance of at least one of all the electrode pairs 40X becomes equal to or greater than a predetermined value.
In step S20, the CPU 20 changes the operating mode to raise the sensitivity of the touch sensor 40, and advances the processing to step S30. "Raising the sensitivity of the touch sensor 40" is realized, for example, by increasing the number of integrated sensing samples or by increasing the amount of information. As an example of increasing the number of integrated samples, when determining one detection output of each of the electrodes 40A and 40B of each electrode pair 40X of the touch sensor 40, the CPU 20 uses the integrated value of 32 outputs from each electrode, four times the usual integrated value of 8 outputs. As an example of increasing the amount of information, the CPU 20 may raise the gain of the detection output from each of the electrodes 40A and 40B.
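The two sensitivity levels could be modelled as different integration counts, as in this sketch. The 8-sample and 32-sample figures come from the text; the `read_electrode` callback is a hypothetical stand-in for reading one raw electrode value.

```python
NORMAL_SAMPLES = 8    # normal coordinate detection mode
HIGH_SAMPLES = 32     # overhead discrimination mode (after step S20)

def integrated_output(read_electrode, n_samples):
    """One detection output = the sum (integral) of n successive raw
    readings from an electrode; quadrupling n from 8 to 32 is the kind
    of 'sensitivity raise' described for step S20."""
    return sum(read_electrode() for _ in range(n_samples))
```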
In step S30, the CPU 20 discriminates the state of the space above the display 35 (positions slightly away from the surface of the display 35), and then advances the processing to step S40. This discrimination means judging whether the conductor above the display 35 is tilted to the right, as shown in Fig. 8(A), or tilted to the left. More specifically, the CPU 20 produces the distribution of the detection outputs of the electrodes 40A and 40B of the touch sensor 40 as explained with reference to Figs. 7 to 9, and, taking the peak of the detection output in this distribution as the center, discriminates whether the conductor above the display is tilted to the right or to the left according to whether the distribution is skewed to the right (see Fig. 8(B)) or to the left. The input terminal 1 may also use the detection result of the housing attitude, based on the detection output of the gyro sensor 39, to determine the up, down, left, and right directions.
In step S40, the CPU 20 decides, based on the discrimination result of step S30, whether to advance the processing to step S50 or to step S60. More specifically, if the discrimination result of step S30 is that the conductor above the display is tilted to the right, the CPU 20 advances the processing to step S50. On the other hand, if the discrimination result is that the conductor above the display is tilted to the left, the CPU 20 advances the processing to step S60.
In step S50, the CPU 20 judges that the operator is right-handed, and advances the processing to step S70.
On the other hand, in step S60, the CPU 20 judges that the operator is left-handed, and advances the processing to step S70.
In step S70, the CPU 20 returns the sensitivity of the touch sensor 40, which was raised in step S20, to the normal sensitivity, and advances the processing to step S80.
In step S80, the CPU 20 executes the processing for deriving the coordinate values of the operation target on the touch panel (touch-panel coordinate processing), and returns the processing to step S10.
In step S80, the CPU 20 can correct the coordinate values of the operation target on the touch sensor 40 determined from the touch operation detected in step S10 (the coordinate values of the operation target based on the detection outputs of the electrodes 40A and 40B of the touch sensor 40), based on the distribution of the detection results obtained in step S30. Determining the coordinate values of the operation target from the detection outputs of the electrodes 40A and 40B of the touch sensor 40 is realized using known techniques, so a detailed description is not repeated here. A concrete example of the content of the correction is the "example of correcting the detection output" described above.
The CPU 20 delivers the coordinate values derived in step S80 to an application program running on the input terminal 1. At this time, the CPU 20 can also deliver the judgment result of step S50 or step S60 to the application program together with the coordinate values. The application program can thereby change its processing content according to the judgment result, for example the display contents on the display 35, including adjusting the arrangement of icons as described above. This application program may itself be executed by the CPU 20.
In the present embodiment described above, when it cannot be discriminated in step S40 to which side, left or right, the conductor above the display 35 is tilted, the CPU 20 advances the processing to a predetermined one of step S50 and step S60, for example step S50.
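The flow of steps S10 through S80 can be condensed into the following sketch. The `sensor` object and all of its methods are hypothetical stand-ins for the hardware interactions, and the default to "right" mirrors the predetermined branch taken when the tilt cannot be discriminated.

```python
def handle_touch(sensor):
    """One pass of the Fig. 11 flow, returning (handedness, coordinate)
    or None when no touch is present (illustrative sketch)."""
    if not sensor.touch_detected():              # S10
        return None
    sensor.set_sensitivity("high")               # S20
    tilt = sensor.discriminate_tilt()            # S30: 'right'/'left'/None
    if tilt == "left":                           # S40
        handedness = "left"                      # S60
    else:
        handedness = "right"                     # S50 (also the default
                                                 # when tilt is unclear)
    sensor.set_sensitivity("normal")             # S70
    coordinate = sensor.touch_coordinate(handedness)   # S80
    return handedness, coordinate
```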
[Form of drive mode change]
Fig. 12 is a diagram showing the change of the operating mode of the input terminal 1 of the present embodiment that accompanies the change of the sensitivity of the touch sensor 40. Fig. 12 shows the presence or absence of a touch operation (touch operation), the drive mode of the touch sensor 40 (sensor drive), and the sensitivity of the touch sensor 40 in each drive mode (sensor sensitivity).
Referring to Fig. 12, the drive mode is the standby mode until a touch operation on the touch sensor 40 is detected. When a touch operation starts (corresponding to the transition of the processing from step S10 to step S20 in Fig. 11), the drive mode shifts to the overhead discrimination mode, and the sensitivity of the sensor (touch sensor 40) is raised. In Fig. 12, the sensitivity before the raise is indicated as "normal" and the sensitivity after the raise as "high".
Thereafter, the judgment of the dominant hand ends at step S50 or step S60 of Fig. 11, whereby the overhead discrimination mode ends. Correspondingly, the raising of the sensor sensitivity is cancelled. Then, while the touch operation on the touch sensor 40 continues, the usual detection of the touch position and the like continues (normal coordinate detection mode). When the touch operation is released, the mode shifts again to the standby mode.
The hover discrimination mode and the normal coordinate detection mode described above may also be performed alternately during a touch operation. In the hover discrimination mode, the raised sensor sensitivity makes the sensor more susceptible to noise, so the positional accuracy of the touch operation may decrease; the touch position information obtained in the hover discrimination mode may therefore be left unused. On the other hand, if the processing belonging to the hover discrimination mode (steps S30 to S60) is performed only once per touch operation, the determination of the touch position may fail to follow changes in the degree of tilt and the like. When such changes must be followed, the hover discrimination mode and the normal coordinate detection mode need only be performed alternately while the touch operation continues.
[Effects of the embodiment, and modifications]
In the present embodiment described above, when a touch operation on touch sensor 40 by a conductor (a stylus, a finger, or the like) is detected, it is determined whether the operator is right-handed or left-handed. The display content on display 35 and the processing content of application programs can thus be matched to the hand (right or left) with which the user operates. Moreover, this determination is made based on the detection output of touch sensor 40, which is mounted in order to detect the touch position on display 35, so no dedicated sensor is required. In the present embodiment, the operating body includes the above-mentioned conductor that inputs information to the touch sensor.
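One conceivable way to realize this determination from the detection-output distribution is to compare the output summed on either side of the contact peak: the part of the hand hovering above the sensor raises the output on the side the hand extends toward. The sketch below is a hedged illustration only; the one-dimensional profile representation, the simple sum comparison, and the tie-break rule are assumptions, not the patent's algorithm.

```python
def classify_hand(profile, peak_index):
    """Guess the operating hand from a 1-D detection-output profile.

    `profile` lists per-column detection outputs along one electrode
    row; the hand hovering above the sensor raises the outputs on the
    side it leans toward. On a tie, fall back to a predetermined side
    (cf. the step S40 fallback in the description). Illustrative
    heuristic only.
    """
    left = sum(profile[:peak_index])
    right = sum(profile[peak_index + 1:])
    if right == left:
        return "right"            # predetermined fallback side
    return "right" if right > left else "left"

# A right-hand touch: outputs are elevated to the right of the peak.
hand = classify_hand([1, 2, 9, 6, 5, 4], peak_index=2)
```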
Furthermore, in the present embodiment, the touch position on display 35 (the coordinate value of the touch target) can be corrected based on the result of the above determination. As explained with reference to Figure 3, the difference between the coordinate value of the touch operation obtained in input terminal 1 and the position intended by the user can thereby be further reduced.
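A minimal sketch of such a correction: subtract a tilt-dependent offset from the raw coordinate, in the direction implied by the operating hand. The linear offset model and the `gain` constant are assumptions for illustration, not values from the patent.

```python
def correct_touch(raw_x, raw_y, handedness, tilt_deg, gain=0.15):
    """Shift a raw touch coordinate against the systematic offset.

    Assumes the offset grows linearly with the tilt angle and points
    toward the operating hand; `gain` (pixels per degree) is a made-up
    calibration constant. Illustration only.
    """
    offset = gain * tilt_deg
    if handedness == "right":
        return raw_x - offset, raw_y   # a right-hand touch lands rightward
    return raw_x + offset, raw_y       # a left-hand touch lands leftward

x, y = correct_touch(100.0, 200.0, "right", tilt_deg=40.0)
# x is pulled left by 0.15 * 40.0 = 6.0 pixels
```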
In the present embodiment, CPU 20 raises the sensitivity of touch sensor 40 during the period in which the operator's dominant hand (the hand used for the operation) is determined (steps S20 to S70). CPU 20 can thereby determine the dominant hand more accurately and detect its degree of tilt.
Raising the sensitivity of touch sensor 40 can increase the power consumption of input terminal 1. In the present embodiment, however, the sensitivity is raised only during the above period, so the increase in power consumption is suppressed as much as possible.
In addition, because the sensitivity is raised, the output from touch sensor 40 to CPU 20 becomes more likely to contain noise, and the error in the position determined based on the output of touch sensor 40 may accordingly become larger. In this respect, during the period of steps S20 to S70, CPU 20 preferably performs its processing based on the detection outputs of, among the plurality of electrode pairs 40X included in touch sensor 40, the electrode pair 40X located at the position where the touch operation was detected in step S10 and the electrode pairs 40X in its vicinity. Further, since the above error sometimes becomes large, its influence can be kept to a minimum by not using the touch position information obtained during the sensitivity-increase period, and using that period only to determine the dominant hand and to acquire the tilt information.
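Restricting the processing to the electrode pair at the detected touch position and its neighbors can be sketched as a simple window over the array of detection outputs. The window `radius` below is an assumption for illustration:

```python
def neighborhood_outputs(outputs, touch_index, radius=2):
    """Return the detection outputs of the electrode pair at the touch
    position plus its neighbors within `radius`; outputs of far-away
    pairs, which are noisier while sensitivity is raised, are dropped.
    """
    lo = max(0, touch_index - radius)
    hi = min(len(outputs), touch_index + radius + 1)
    return outputs[lo:hi]

outputs = [3, 4, 20, 35, 22, 5, 2, 1]        # touch detected at index 3
window = neighborhood_outputs(outputs, 3)    # keeps indices 1..5
```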
One example of raising the sensitivity is to increase the number of times the detection output of touch sensor 40 is integrated. In this case, CPU 20 cannot fix the detected value of each electrode 40A, 40B in touch sensor 40 until it has acquired the detection output from touch sensor 40 more times than usual, so the processing may become slower. In that case, during the period of steps S20 to S70, the operating frequencies of CPU 20 and touch sensor 40 need only be raised. Outside this period, on the other hand, the operating frequency is lowered relative to that period, so that the increase in power consumption is suppressed as much as possible.
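Increasing the integration count amounts to averaging more raw readings into each detected value: zero-mean noise is suppressed (higher effective sensitivity), but proportionally more reads are needed, which is why the text suggests raising the operating frequency during steps S20 to S70. A sketch under an assumed noisy-reading model:

```python
import random

def integrated_value(read_once, n_times):
    """Average `n_times` raw sensor readings into one detected value.

    More integrations suppress zero-mean noise but take proportionally
    longer unless the operating frequency is raised in step.
    """
    return sum(read_once() for _ in range(n_times)) / n_times

random.seed(0)
true_value = 10.0
noisy_read = lambda: true_value + random.uniform(-1.0, 1.0)

v_normal = integrated_value(noisy_read, 4)    # "normal" sensitivity
v_high = integrated_value(noisy_read, 64)     # hover discrimination mode
# Each reading lies within +/-1 of the true value, so both averages do
# too; the 64-sample average clusters much closer to 10.0 on average.
```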
The embodiment disclosed herein and its modifications should be considered in all respects illustrative and not restrictive. The scope of the present invention is indicated not by the above description but by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims. The technologies disclosed in the embodiment and its modifications are intended to be practiced alone or in combination wherever possible.
Description of Reference Numerals
1 input terminal; 20 CPU; 20A determination unit; 20B correction unit; 23 antenna; 24 communication device; 25 hardware buttons; 26 camera; 27 flash memory; 28 RAM; 29 ROM; 30 memory card drive; 31 memory card; 32 microphone; 33 speaker; 34 audio signal processing circuit; 35 display; 36 LED; 37 data communication interface; 38 vibrator; 39 gyro sensor; 40 touch sensor; 40A, 40B electrodes; 40C glass substrate; 40D protective plate; 40X electrode pair.
Claims (7)
1. A touch panel-type input device comprising a touch panel, the touch panel including a touch sensor that detects an operation using an operating body, the touch panel-type input device being characterized in that:
it further comprises an information processing unit that executes information processing based on information input to the touch sensor;
the touch sensor is capable of changing its detection output to the information processing unit according to the position of an object in a space apart from the touch sensor;
the information processing unit determines, based on a distribution of the detection output of the touch sensor, whether an operation on the touch sensor is performed with the right hand of the operator or with the left hand; and
the information processing unit acquires, based on the distribution of the detection output of the touch sensor, information for specifying a degree of tilt of the operating body relative to the touch sensor.
2. The touch panel-type input device according to claim 1, wherein
the touch sensor raises its detection sensitivity for the operation on condition that a touch operation on the touch sensor has been performed, and returns the sensitivity to the sensitivity before the increase on condition that the determination by the information processing unit has ended.
3. The touch panel-type input device according to claim 2, wherein
the touch sensor raises the detection sensitivity for the operation only in a region including the portion where the touch operation was detected.
4. The touch panel-type input device according to any one of claims 1 to 3, wherein
the touch sensor raises its detection frequency for the operation on condition that a touch operation on the touch sensor has been performed, and returns the frequency to the frequency before the increase on condition that the determination by the information processing unit has ended, and
the information processing unit raises the frequency at which it acquires the detection output from the touch sensor on condition that a touch operation on the touch sensor has been performed, and returns the frequency to the frequency before the increase on condition that the determination has ended.
5. The touch panel-type input device according to any one of claims 1 to 4, wherein
the information processing unit corrects position information of the touch sensor that is a target of information input, based on the result of the determination and the degree of tilt.
6. A control method of a touch panel-type input device, executed by a computer of an input device comprising a touch panel, the touch panel including a touch sensor that detects an operation using an operating body, the control method being characterized in that:
it comprises a step of executing information processing based on information input to the touch sensor;
the touch sensor is capable of changing its detection output according to the position of an object in a space apart from the touch sensor; and
the step of executing information processing includes:
determining, based on a distribution of the detection output of the touch sensor, whether an operation on the touch sensor is performed with the right hand of the operator or with the left hand; and
acquiring, based on the distribution of the detection output of the touch sensor, information for specifying a degree of tilt of the operating body relative to the touch sensor.
7. A program executed by a computer of an input device comprising a touch panel, the touch panel including a touch sensor that detects an operation using an operating body, the program being characterized in that:
the program causes the computer to execute a step of executing information processing based on information input to the touch sensor;
the touch sensor is capable of changing its detection output according to the position of an object in a space apart from the touch sensor; and
the step of executing information processing includes:
determining, based on a distribution of the detection output of the touch sensor, whether an operation on the touch sensor is performed with the right hand of the operator or with the left hand; and
acquiring, based on the distribution of the detection output of the touch sensor, information for specifying a degree of tilt of the operating body relative to the touch sensor.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-229693 | 2012-10-17 | ||
JP2012229693A JP6000797B2 (en) | 2012-10-17 | 2012-10-17 | Touch panel type input device, control method thereof, and program |
PCT/JP2013/077894 WO2014061626A1 (en) | 2012-10-17 | 2013-10-15 | Touch panel-type input device, and control method and program thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104737107A true CN104737107A (en) | 2015-06-24 |
CN104737107B CN104737107B (en) | 2017-05-31 |
Family
ID=50488189
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201380053695.3A Expired - Fee Related CN104737107B (en) Touch panel-type input device and control method thereof
Country Status (4)
Country | Link |
---|---|
US (1) | US20150301647A1 (en) |
JP (1) | JP6000797B2 (en) |
CN (1) | CN104737107B (en) |
WO (1) | WO2014061626A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109416600A (en) * | 2016-07-06 | 2019-03-01 | 夏普株式会社 | Touch panel control device and electronic equipment |
CN110618781A (en) * | 2018-06-20 | 2019-12-27 | 卡西欧计算机株式会社 | Electronic device, control method, and storage medium |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015053034A (en) * | 2013-08-07 | 2015-03-19 | 船井電機株式会社 | Input device |
KR20150019352A (en) * | 2013-08-13 | 2015-02-25 | 삼성전자주식회사 | Method and apparatus for grip recognition in electronic device |
US9665206B1 (en) * | 2013-09-18 | 2017-05-30 | Apple Inc. | Dynamic user interface adaptable to multiple input tools |
USD755244S1 (en) * | 2013-12-30 | 2016-05-03 | Samsung Electronics Co., Ltd. | Display screen with animated icon |
US10416801B2 (en) * | 2014-04-08 | 2019-09-17 | Atmel Corporation | Apparatus, controller, and device for touch sensor hand-configuration analysis based at least on a distribution of capacitance values |
USD776200S1 (en) * | 2014-05-27 | 2017-01-10 | Amazon Technologies, Inc. | Label with a touch graphic |
US20150355762A1 (en) * | 2014-06-04 | 2015-12-10 | Apple Inc. | Mid-frame blanking |
US10175741B2 (en) | 2016-03-03 | 2019-01-08 | Atmel Corporation | Touch sensor mode transitioning |
US20180181245A1 (en) * | 2016-09-23 | 2018-06-28 | Microsoft Technology Licensing, Llc | Capacitive touch mapping |
KR101815889B1 (en) | 2017-02-08 | 2018-01-08 | 계명대학교 산학협력단 | Method for estimating user's key input method using virtual keypad learning user key input feature and system thereof |
US11775120B2 (en) * | 2021-01-28 | 2023-10-03 | Texas Instruments Incorporated | Combined capacitive and piezoelectric sensing in a human machine interface |
US11537239B1 (en) * | 2022-01-14 | 2022-12-27 | Microsoft Technology Licensing, Llc | Diffusion-based handedness classification for touch-based input |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101866227A (en) * | 2009-04-16 | 2010-10-20 | 索尼公司 | Messaging device, inclination checking method and inclination detection program |
CN101968695A (en) * | 2009-07-27 | 2011-02-09 | 索尼公司 | Information processing apparatus, display method, and display program |
CN102609130A (en) * | 2010-12-29 | 2012-07-25 | 微软公司 | Touch event anticipation in a computing device |
JP2012146026A (en) * | 2011-01-07 | 2012-08-02 | Canon Inc | Touch panel device and touch panel detection position correction method |
CN102713822A (en) * | 2010-06-16 | 2012-10-03 | 松下电器产业株式会社 | Information input device, information input method and programme |
JP2012194692A (en) * | 2011-03-15 | 2012-10-11 | Ntt Docomo Inc | Display device, control method of display device, and program |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9244545B2 (en) * | 2010-12-17 | 2016-01-26 | Microsoft Technology Licensing, Llc | Touch and stylus discrimination and rejection for contact sensitive computing devices |
2012
- 2012-10-17: JP application JP2012229693A filed; granted as JP6000797B2 (not active: expired, fee related)
2013
- 2013-10-15: WO application PCT/JP2013/077894 filed (active application filing), published as WO2014061626A1
- 2013-10-15: CN application CN201380053695.3A filed; granted as CN104737107B (not active: expired, fee related)
- 2013-10-15: US application US14/435,499 filed, published as US20150301647A1 (not active: abandoned)
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109416600A (en) * | 2016-07-06 | 2019-03-01 | 夏普株式会社 | Touch panel control device and electronic equipment |
CN110618781A (en) * | 2018-06-20 | 2019-12-27 | 卡西欧计算机株式会社 | Electronic device, control method, and storage medium |
CN110618781B (en) * | 2018-06-20 | 2023-07-28 | 卡西欧计算机株式会社 | Electronic apparatus, control method, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP2014081807A (en) | 2014-05-08 |
WO2014061626A1 (en) | 2014-04-24 |
JP6000797B2 (en) | 2016-10-05 |
US20150301647A1 (en) | 2015-10-22 |
CN104737107B (en) | 2017-05-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104737107B (en) | Touch panel-type input device and control method thereof | |
JP5423686B2 (en) | Computer program, input device and input method | |
US9274619B2 (en) | Input apparatus, input method, and input program | |
US10268302B2 (en) | Method and apparatus for recognizing grip state in electronic device | |
KR102440965B1 (en) | Sylus pen, electriconic apparatus for receiving signal from the stylus pen and controlling method thereof | |
EP3196752B1 (en) | Capacitive touch panel device, corresponding touch input detection method and computer program product | |
JP5599741B2 (en) | Electronic device, content display method, and content display program | |
US9201521B2 (en) | Storing trace information | |
US20150035781A1 (en) | Electronic device | |
EP2354893A1 (en) | Reducing inertial-based motion estimation drift of a game input controller with an image-based motion estimation | |
US20110122080A1 (en) | Electronic device, display control method, and recording medium | |
US20070126711A1 (en) | Input device | |
CN101833391A (en) | Messaging device, information processing method and program | |
US20190091562A1 (en) | Information processing system, extended input device, and information processing method | |
US20150022467A1 (en) | Electronic device, control method of electronic device, and control program of electronic device | |
CN104620196A (en) | Systems and methods for switching sensing regimes for gloved and ungloved user input | |
CN105786275A (en) | Electronic device and control method for the same | |
US20140300558A1 (en) | Electronic apparatus, method of controlling electronic apparatus, and program for controlling electronic apparatus | |
CN104423838A (en) | Document dividing and merging | |
WO2018159414A1 (en) | Terminal device and operation control program | |
JP2014071507A (en) | Electronic apparatus, control method of electronic apparatus, and computer program | |
CN104281301A (en) | Input method and electronic equipment | |
KR101901234B1 (en) | Method and device for inputting of mobile terminal using a pen | |
JP2022082074A (en) | Display device and display system | |
JP2012220933A (en) | Learning device, control program therefor, and learning system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | |
CF01 | Termination of patent right due to non-payment of annual fee | |
Granted publication date: 20170531 |