CN102763062A - Three-state touch input system - Google Patents
Three-state touch input system
- Publication number
- CN102763062A CN102763062A CN2010800546364A CN201080054636A CN102763062A CN 102763062 A CN102763062 A CN 102763062A CN 2010800546364 A CN2010800546364 A CN 2010800546364A CN 201080054636 A CN201080054636 A CN 201080054636A CN 102763062 A CN102763062 A CN 102763062A
- Authority
- CN
- China
- Prior art keywords
- touch
- screen
- state
- graphical user
- interface element
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/006—Teaching or communicating with blind persons using audible presentation of the information
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
A touch screen input device is provided which simulates a 3-state input device such as a mouse. One of these states is used to preview the effect of activating a graphical user interface element when the screen is touched. In this preview state, touching a graphical user interface element on the screen with a finger or stylus does not cause the action associated with that element to be performed. Rather, when the screen is touched while in the preview state, audio cues are provided to the user indicating what action would arise if the action associated with the touched element were to be performed.
Description
Background
Touch-sensitive display screens have become increasingly common as an alternative to traditional keyboards and other human-machine interfaces ("HMIs") for receiving data entry or other input from users. Touch screens are used in a variety of devices, including both portable and fixed-location devices. Portable devices with touch screens commonly include, for example, mobile phones, personal digital assistants ("PDAs"), and personal media players that play music and video. Fixed-location devices that employ touch screens commonly include those used in vehicles, point-of-sale ("POS") terminals, and equipment used in medical and industrial applications.
The ability to directly touch and manipulate data on a touch screen has strong appeal to users. In many respects, a touch screen can be used as a more convenient input mechanism than the conventional mouse. When using a touch screen, a user can simply tap the screen directly over the graphical user interface element (e.g., an icon) they wish to select, rather than having to position a cursor over the user interface with a mouse.
Touch screens can serve both to display output from a computing device to the user and to receive input from the user. The user's input options may be displayed, for example, as control, navigation, or object icons on the screen. When the user selects an input option by touching the icon associated with it on the screen with a stylus or finger, the computing device senses the location of the touch and sends a message to the application or utility that presented the icon.
Summary of the invention
Conventional touch screen input devices are problematic for visually impaired users because, before contacting the desired graphical user interface element appearing on the screen, such users cannot visually judge the alignment of their finger or stylus with that element. In addition, they have no means of verifying the effect of touching the screen prior to making contact with it, by which time the underlying application will have already acted in response to that contact.
To overcome this limitation, in one implementation a touch screen input device is provided which simulates a 3-state input device such as a mouse. One of these states is used to preview the effect of activating a graphical user interface element when the screen is touched. In this preview state, touching a graphical user interface element on the screen with a finger or stylus does not cause the action associated with that element to be performed. Rather, when the screen is touched while in the preview state, audio cues are provided to the user indicating what action would arise if the action associated with the touched element were to be performed.
In some implementations, once the user has located a graphical user interface element that he or she desires to select, the user can place a second finger or stylus on the touch screen while the first finger or stylus maintains contact with the element. In this way the desired graphical user interface element can be activated. That is, bringing the touch screen into the second state by making contact with the second finger or stylus causes the underlying application to respond just as it would when that element is selected using a conventional input device.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Brief description of the drawings
Fig. 1 shows an illustrative portable computing environment in which a user interacts with a device that employs a touch screen for receiving user input.
Fig. 2 shows various illustrative form factors (form factors) of computing devices in which a touch screen may be employed.
Fig. 3 shows the state diagram of a conventional mouse input device.
Fig. 4 shows the state diagram of a conventional touch screen input device.
Fig. 5 shows one example of a state diagram for a 3-state touch screen input device.
Fig. 6 shows a user's finger touching a touch screen that presents a menu of options.
Fig. 7 shows the user's finger of Fig. 6 touching the option labeled "ScatterView".
Fig. 8 shows a finger touching the touch screen shown in Figs. 6-7, which causes a circle centered on the location where the finger contacts the screen to be presented on the touch screen.
Fig. 9 shows a second finger touching the touch screen shown in Fig. 8 in order to activate the selected graphical user interface element.
Fig. 10 shows an illustrative architecture of functional components that may be installed on a computing device that employs a touch screen for receiving user input.
Detailed description
Fig. 1 shows an illustrative portable computing environment 100 in which a user 102 interacts with a device 105 that employs a touch screen 110 for receiving user input. The device 105, as shown in Fig. 1, is commonly configured as a portable computing platform or information appliance such as a mobile phone, smartphone, PDA, ultra-mobile PC (personal computer), handheld game device, personal media player, and the like. Typically, the touch screen 110 is made up of a touch-sensor component constructed over a display component. The display component displays images in a manner similar to that of a typical monitor on a PC or laptop computer. In many applications the device 105 will use a liquid crystal display ("LCD") due to its light weight, thinness, and low cost. However, in alternative applications, other conventional display technologies may be utilized including, for example, cathode ray tubes ("CRTs"), plasma screens, and electroluminescent screens.
The touch-sensor component sits on top of the display component. The touch sensor is transparent so that the display may be seen through it. Many different types of touch-sensor technologies are known and may be applied as appropriate to meet the needs of a particular implementation. These include, among others, resistive, capacitive, near-field, optical imaging, strain gauge, dispersive signal, acoustic pulse recognition, infrared, and surface acoustic wave technologies. Some current touch screens can discriminate among multiple, simultaneous touch points and/or are pressure-sensitive. Interaction with the touch screen 110 is typically accomplished using fingers or thumbs or, for non-capacitive touch sensors, a stylus may also be used.
Fig. 2 shows other illustrative form factors in which the computing device may be employed, including desktop computers 1301, notebook computers 1302, tablet computers 1303, handheld computers 1304, personal digital assistants 1305, media players 1306, mobile phones 1307, and the like. In addition, the computer may be a combination of these types, for example, a device that is a combination of a personal digital assistant, media player, and mobile phone.
While many of the form factors shown in Figs. 1 and 2 are portable, the present arrangement may also be employed in any fixed computing device in which touch screens are utilized. These devices include, for example, automatic teller machines ("ATMs"), point-of-sale ("POS") terminals, and self-service kiosks and the like, such as those used by airlines, banks, restaurants, and retail establishments to enable users to make inquiries, perform self-service check-outs, or complete other types of transactions. Industrial, medical, and other applications are also contemplated in which touch screens are used, for example, to control machines or equipment, place orders, manage inventory, and so forth. Touch screens are also becoming more common in automobiles to control subsystems such as heating, ventilation and air conditioning ("HVAC"), entertainment, and navigation. The new surface computer products, particularly the Microsoft Surface™ by Microsoft Corporation, may also be adaptable for use with the present input device.
To facilitate an understanding of the methods, techniques, and systems described herein, it may be helpful to compare the operation of a conventional mouse and a conventional touch screen input device by modeling their functionality using state diagrams.
First, when a mouse is out of its tracking range (such as occurs when a mechanical mouse is lifted off a surface), the mouse is in state 0, which may be referred to as out-of-range. Next, consider a mouse that is within its tracking range but without any of its buttons being depressed. This state may be referred to as tracking, which describes a state in which a cursor or pointer appearing on the screen follows the motion of the mouse. The tracking state may be referred to as state 1. In the tracking state, the cursor or pointer can be positioned over any desired graphical user interface element by moving the mouse. The mouse can also operate in a second state (referred to as state 2) when a button is depressed. In this state, which may be referred to as dragging, graphical user interface elements or objects are moved ("dragged") on the display so that they follow the motion of the mouse. It should be noted that the act of selecting an icon may be considered a sub-state of the dragging state, since selecting involves depressing and releasing a button.
Fig. 3 shows the state diagram of the mouse described above. In state 0 the mouse is out of range, and in state 1 it is in the tracking state. The mouse can enter state 1 from state 0 by bringing it back into range. In the case of a mechanical mouse, this involves returning the mouse to a surface such as a mousepad. The mouse can enter state 2 from state 1 by depressing ("clicking") a button. The mouse can also return from state 2 to state 1 by releasing the button.
Fig. 4 shows the state diagram of a conventional touch screen input device, which is assumed to be capable of sensing only a single bit of pressure, namely touch or no-touch. While a mouse has three states, the touch screen input device has only two, which correspond to state 0 (out-of-range) and state 2 (dragging). That is, the conventional touch screen input device does not have a tracking state.
The lack of a tracking state in a conventional touch screen input device can be overcome by sighted users because they are able to visually judge the alignment of their finger or stylus with the desired graphical user interface element appearing on the screen prior to contacting it. Visually impaired users, however, have no means of verifying the effect of touching the screen before contact is made, by which time the underlying application will have already acted in response to that contact.
To overcome this limitation, a touch screen input device is provided which simulates a 3-state input device such as a mouse. The additional state is used to preview the effect of entering state 2 when the screen is touched. In this preview state, touching a graphical user interface element on the screen does not cause the action associated with that element to be performed. Rather, when the screen is touched while in the preview state, audio cues are provided to the user indicating what action would arise if the touch screen input device were to enter state 2.
Fig. 5 shows one example of a state diagram for a 3-state touch screen input device. States 0 and 2 correspond to states 0 and 2 shown in Fig. 4. It should be noted, however, that for the sake of generality state 2 in Fig. 5 is referred to as the touch state, which may include actions such as dragging and selecting the graphical user interface element that is being touched. For instance, the second state may allow a graphical user interface element to be dragged on the touch screen in response to movement of the first touch along the touch screen. In addition to these two states, a new state, state 1, is also provided, which in some implementations may be referred to as the audio preview state. The audio preview state may be entered from the out-of-range state (state 0) by touching the screen with a single finger or stylus. As various graphical user interface elements are contacted while in this state, an audio cue is provided describing the function of the element being contacted. For example, as shown in Fig. 6, a touch screen used with the Microsoft Surface™ computer product receives a user's finger. The finger is touching the screen, which presents a menu 205 of options. As a result of the receipt of the finger on the touch screen, a circle 210 is generated on the touch screen. In Fig. 7, the finger touches the option labeled "ScatterView". In response to the touch, an audio cue such as "scatter view" is generated.
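The three-state model just described can be sketched as a small state machine. The following is an illustrative reconstruction under stated assumptions, not code from the patent: the state names mirror Figs. 4-5, and the `speak` callback stands in for whatever audio-cue mechanism (e.g., a text-to-speech engine) an implementation would use.

```python
from enum import Enum

class TouchState(Enum):
    OUT_OF_RANGE = 0   # no finger or stylus on the screen (state 0)
    AUDIO_PREVIEW = 1  # one touch: announce elements, perform no action (state 1)
    TOUCH = 2          # two touches: activate/select/drag the element (state 2)

class ThreeStateTouch:
    """Minimal sketch of the 3-state touch input model of Fig. 5."""

    def __init__(self, speak):
        self.state = TouchState.OUT_OF_RANGE
        self.speak = speak  # audio-cue callback (assumed interface)

    def touch_down(self, element_label=None):
        if self.state is TouchState.OUT_OF_RANGE:
            # A first contact enters the audio preview state; the element
            # under the touch is announced but its action is not performed.
            self.state = TouchState.AUDIO_PREVIEW
            if element_label:
                self.speak(element_label)
        elif self.state is TouchState.AUDIO_PREVIEW:
            # A second, simultaneous contact enters the touch state.
            self.state = TouchState.TOUCH

    def touch_up(self, remaining_touches):
        if self.state is TouchState.TOUCH and remaining_touches == 1:
            # Lifting only the second finger returns to the preview state.
            self.state = TouchState.AUDIO_PREVIEW
        elif remaining_touches == 0:
            self.state = TouchState.OUT_OF_RANGE
```

Note how the transitions differ from the two-state diagram of Fig. 4: a single touch never reaches the activation state directly, which is the property that makes the preview safe for visually impaired users.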
Once the user has located a graphical user interface element that he or she desires to select, the user can enter state 2 by placing a second finger or stylus on the touch screen while the first finger or stylus maintains contact with the element. In this way the desired graphical user interface element can be activated. That is, bringing the touch screen into the second state by making contact with the second finger or stylus causes the underlying application to respond just as it would when that element is selected using a conventional input device.
As indicated in Fig. 5, the user may exit the second state by lifting the second finger or stylus from the touch screen, which returns the screen to the audio preview state. That is, detecting the absence of the second finger or stylus returns the screen to the audio preview state.
In some implementations, the touch state can be entered from the audio preview state by placing the second finger or stylus anywhere on the screen or, alternatively, on a predefined portion of the screen. In other implementations, the user makes contact with the screen in close proximity to the first finger or stylus. For instance, in some cases the second finger or stylus makes contact within a predefined distance of the first finger or stylus. One such example is shown in Fig. 8. In this example, a circle 210 centered on the location where the first finger or stylus contacts the screen is presented on the touch screen. The finger is contacting a rectangle 220 labeled "Large Item". Upon touching the rectangle 220, the audio cue "large item" is presented to the user. In order to enter the touch state, the user uses a second finger or stylus to make contact with the screen within the displayed circle 210. Fig. 9 shows the input device in this touch state. The second finger gives rise to circle 230, which is shown overlapping circle 210.
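The proximity requirement above — the second touch must land within a predefined distance of the first, i.e. inside the displayed circle — reduces to a distance check. This is a sketch under stated assumptions: the patent does not specify a radius value or coordinate units, so the 75-pixel radius and (x, y) tuples here are illustrative only.

```python
import math

# Radius of the circle drawn around the first touch point.
# Assumed value; the patent only requires "a predefined distance".
ACTIVATION_RADIUS = 75.0

def second_touch_activates(first_touch, second_touch, radius=ACTIVATION_RADIUS):
    """Return True if the second touch falls inside the circle centered
    on the first touch, i.e. the gesture should enter the touch state."""
    dx = second_touch[0] - first_touch[0]
    dy = second_touch[1] - first_touch[1]
    return math.hypot(dx, dy) <= radius
```

A touch screen controller would call such a predicate when a second contact appears while the first is still held; contacts outside the circle would be ignored or treated as a new first touch, depending on the implementation.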
Fig. 10 shows an illustrative architecture 400 of the functional components that may be installed on a computing device that employs a touch screen for receiving user input. The functional components may alternatively be implemented using software, hardware, firmware, or various combinations of software, hardware, and firmware. For example, the functional components in the illustrative architecture 400 may be created during runtime through execution of instructions stored in a memory by a processor.
As shown, an audio preview component 420 is arranged to receive input events, such as physical coordinates, from a touch screen controller 425. The nature of the input event determines the state of the touch screen. That is, the manner in which the user contacts the screen with one or two fingers or styluses determines whether the screen is in the out-of-range state, the audio preview state, or the touch state. In the preview state, the audio preview component 420 then formulates the appropriate calls to the host application in order to obtain information concerning the function performed by the graphical user interface element being touched or contacted. For example, if the host application 407 allows programmatic access, the audio preview component 420 can extract data in the host application 407 that identifies the graphical user interface element that the user has selected in either the audio preview state or the touch state. If the content of the host application 407 cannot be programmatically accessed by the audio preview component 420, the host program may need to be written so as to incorporate appropriate APIs that can expose the necessary information to the audio preview component 420. The extracted data, typically in the form of text, undergoes text-to-speech conversion using a text-to-speech converter or module accessed by the audio preview component 420. Alternatively, the extracted data may be used to generate audio data that is indicative of the function performed by activating the graphical user interface element that is being touched or contacted. For instance, in some cases different tones may be used to represent commonly used graphical user interface elements such as "save", "close", and the like. The audio preview component 420 can then expose the audio data to an audio controller 434, which may send a drive signal to an audio generator in the hardware layer 440 so that the audio can be rendered.
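The dispatch path just described — touch event in, label extracted from the host application, audio cue out — might look like the following. Everything here is an assumed interface for illustration: the `host_app.label_at` accessor, the tone table and its frequencies, and the `speak`/`play_tone` callbacks are not specified by the patent, which leaves the concrete API to the implementer.

```python
# Tones reserved for commonly used elements ("save", "close", etc.),
# as the description suggests; the frequencies are assumed values in Hz.
COMMON_ELEMENT_TONES = {"save": 440, "close": 660}

class AudioPreviewComponent:
    """Sketch of the audio preview component (420 in Fig. 10)."""

    def __init__(self, host_app, speak, play_tone):
        self.host_app = host_app    # assumed to expose label_at(x, y) -> str | None
        self.speak = speak          # text-to-speech callback
        self.play_tone = play_tone  # tone-generator callback

    def on_preview_touch(self, x, y):
        # Query the host application for the element under the touch.
        label = self.host_app.label_at(x, y)
        if label is None:
            return  # no element under the touch: stay silent
        tone = COMMON_ELEMENT_TONES.get(label.lower())
        if tone is not None:
            self.play_tone(tone)  # distinct tone for a common element
        else:
            self.speak(label)     # otherwise fall back to text-to-speech
```

Splitting the cue path this way mirrors the two alternatives in the description: text-to-speech for arbitrary labels, and pre-assigned tones for frequently encountered elements, which are faster to recognize by ear.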
As used in this application, the terms "component" and "system" and the like are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an instance, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a computer and the computer itself can be a component. One or more components may reside within a process and/or thread of execution, and a component may be localized on one computer and/or distributed between two or more computers.
Furthermore, the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter. The term "article of manufacture" as used herein is intended to encompass a computer program accessible from any computer-readable device or media. For example, computer-readable storage media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, magnetic tape...), optical disks (e.g., compact disk (CD), digital versatile disk (DVD)...), smart cards, and flash memory devices (e.g., card, stick, key drive...). Of course, those skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Claims (15)
1. A method of providing a user interface for a mobile device, comprising:
displaying one or more graphical user interface elements on a touch screen;
receiving a first touch on the touch screen at a location of a graphical user interface element; and
responding to receipt of the first touch by entering a preview state in which an audio cue is rendered indicating a function performed by the graphical user interface element.
2. the method for claim 1 also comprises:
When continuing to receive first touch, receive second of touch-screen is touched; With
Be different from second state preview state, that be associated with graphical user-interface element through entering, and in response to second reception that touch.
3. the method for claim 2, wherein second state allows graphical user-interface element on touch-screen, to be dragged, with in response to first touching moving along touch-screen when second state.
4. the method for claim 2 wherein has only second state that when the predefine at touch-screen partly receives second touch, just gets into.
5. the method for claim 2, wherein have only when touch-screen, receive second and just get into second state when touching from receiving the part of first position that touch less than the predefine distance.
6. The method of claim 1 wherein the graphical user interface element represents a portion of a user interface of an application executing on an electronic device, and the function performed by the graphical user interface element causes the application to respond in a predefined manner.
7. the method for claim 6 wherein gets into preview state and does not impel said application to respond according to the function of being carried out by graphical user-interface element.
8. the method for claim 2 also comprises:
Detect lacking of second touch; With
In response to lacking of second touch, turn back to preview state.
9. A touch screen display system for an electronic device, comprising:
a touch screen configured to receive user input and display one or more graphical user interface elements; and
an audio preview component configured to respond to receipt of a first touch on the touch screen at a location of a graphical user interface element by entering a preview state in which an audio cue is rendered indicating a function performed by the graphical user interface element.
10. the touch screen display system of claim 9; Also comprise the application that resides on the electronic equipment; This application has the user interface that comprises graphical user-interface element; And wherein the audio preview assembly comprises the Text To Speech converter assembly, is used for the text-converted that is associated with graphical user-interface element is become audio prompt, and said text is disclosed to the audio preview assembly by said application.
11. the touch screen display system of claim 9; Also comprise touch screen controller, it is configured to be different from second state preview state, that be associated with graphical user-interface element and in response to receiving first touch in continuation when, receive second touch to touch-screen through entering.
12. the touch screen display system of claim 11, wherein second state allows graphical user-interface element on touch-screen, to be dragged, to touch moving along touch-screen in response to first.
13. the touch screen display system of claim 11 wherein has only second state that when the predefine at touch-screen partly receives second touch, just gets into.
14. the touch screen display system of claim 11, wherein have only when touch-screen, receive second and just get into second state when touching from receiving the part of first position that touch less than the predefine distance.
15. the touch screen display system of claim 9; Also comprise the application that resides on the electronic equipment; This application has the user interface that comprises graphical user-interface element, and the function of wherein being carried out by graphical user-interface element impels this application to respond with predefined mode.
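The claims above describe a small state machine: a first touch on an element enters a preview state that speaks an audio cue without invoking the element (claims 1 and 7); a second touch while the first is held enters a second, "live" state (claim 2); and lifting the second touch returns to the preview state (claim 8). A minimal sketch of that state machine follows — the class and method names (`ThreeStateInput`, `on_first_touch`, etc.) are illustrative and do not come from the patent text:

```python
from enum import Enum, auto

class TouchState(Enum):
    OUT = auto()      # no touch on a graphical user interface element
    PREVIEW = auto()  # first touch held: audio cue only, element not invoked
    SECOND = auto()   # second touch while first is held: element is live

class ThreeStateInput:
    """Toy model of the three-state touch input described in the claims."""

    def __init__(self, speak):
        # `speak` stands in for the audio preview component
        # (e.g. a text-to-speech callback, per claim 10).
        self.state = TouchState.OUT
        self.speak = speak

    def on_first_touch(self, element_label):
        # Claims 1 and 7: entering the preview state renders an audio cue
        # describing the element's function, without invoking that function.
        self.state = TouchState.PREVIEW
        self.speak(element_label)

    def on_second_touch(self):
        # Claim 2: a second touch, while the first touch is still held,
        # enters the second state associated with the element.
        if self.state is TouchState.PREVIEW:
            self.state = TouchState.SECOND

    def on_second_touch_lifted(self):
        # Claim 8: absence of the second touch returns to the preview state.
        if self.state is TouchState.SECOND:
            self.state = TouchState.PREVIEW

    def on_first_touch_lifted(self):
        self.state = TouchState.OUT

cues = []
ui = ThreeStateInput(cues.append)
ui.on_first_touch("Delete message")  # preview: cue spoken, nothing deleted
ui.on_second_touch()                 # second state: element can now act
ui.on_second_touch_lifted()          # back to preview
print(ui.state, cues)
```

A real implementation would also gate `on_second_touch` on where the second touch lands (claims 4 and 5 restrict it to a predefined portion of the screen, or to within a predefined distance of the first touch).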
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/630,381 US20110138284A1 (en) | 2009-12-03 | 2009-12-03 | Three-state touch input system |
US12/630,381 | 2009-12-03 | ||
US12/630381 | 2009-12-03 | ||
PCT/US2010/057701 WO2011068713A2 (en) | 2009-12-03 | 2010-11-23 | Three-state touch input system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102763062A true CN102763062A (en) | 2012-10-31 |
CN102763062B CN102763062B (en) | 2015-09-16 |
Family
ID=44083226
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201080054636.4A Active CN102763062B (en) | 2010-11-23 | Three-state touch input system |
Country Status (9)
Country | Link |
---|---|
US (1) | US20110138284A1 (en) |
EP (1) | EP2507698B1 (en) |
JP (1) | JP5775526B2 (en) |
KR (1) | KR101872533B1 (en) |
CN (1) | CN102763062B (en) |
AU (1) | AU2010326223B2 (en) |
CA (1) | CA2779706C (en) |
RU (1) | RU2559749C2 (en) |
WO (1) | WO2011068713A2 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103942000A (en) * | 2014-04-23 | 2014-07-23 | 宁波保税区攀峒信息科技有限公司 | Touch event identification method |
CN104516559A (en) * | 2013-09-27 | 2015-04-15 | 华硕电脑股份有限公司 | Multi-point touch method of touch input device |
CN106062855A (en) * | 2014-11-03 | 2016-10-26 | 天才工厂 | Electronic device and method for providing learning information using the same |
CN113220193A (en) * | 2015-05-11 | 2021-08-06 | 碧倬乐科技有限公司 | System and method for previewing digital content |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9953392B2 (en) | 2007-09-19 | 2018-04-24 | T1V, Inc. | Multimedia system and associated methods |
US8600816B2 (en) * | 2007-09-19 | 2013-12-03 | T1visions, Inc. | Multimedia, multiuser system and associated methods |
US9965067B2 (en) | 2007-09-19 | 2018-05-08 | T1V, Inc. | Multimedia, multiuser system and associated methods |
EP2490113B1 (en) * | 2011-02-15 | 2016-11-23 | Lg Electronics Inc. | Display device and method of controlling operation thereof |
TWI441052B (en) * | 2011-02-24 | 2014-06-11 | Avermedia Tech Inc | Gesture manipulation method and mutlimedia display apparatus |
US20130021269A1 (en) * | 2011-07-20 | 2013-01-24 | Google Inc. | Dynamic Control of an Active Input Region of a User Interface |
WO2013068793A1 (en) * | 2011-11-11 | 2013-05-16 | Nokia Corporation | A method, apparatus, computer program and user interface |
KR101849720B1 (en) * | 2012-03-21 | 2018-05-31 | 김시환 | System and method for providing information in phases |
JP5798532B2 (en) * | 2012-08-23 | 2015-10-21 | 株式会社Nttドコモ | User interface device, user interface method and program |
CN102902477A (en) * | 2012-08-24 | 2013-01-30 | 中国电力科学研究院 | Touch screen based power system simulation control method |
KR101940220B1 (en) | 2012-10-23 | 2019-01-18 | 엘지디스플레이 주식회사 | Display Device Including Power Control Unit And Method Of Driving The Same |
JP5806270B2 (en) * | 2013-09-21 | 2015-11-10 | 株式会社豊田自動織機 | Touch switch module |
US9542037B2 (en) | 2015-03-08 | 2017-01-10 | Apple Inc. | Device, method, and user interface for processing intensity of touch contacts |
US10671343B1 (en) * | 2016-06-30 | 2020-06-02 | Amazon Technologies, Inc. | Graphical interface to preview functionality available for speech-enabled processing |
DE102016216318A1 (en) | 2016-08-30 | 2018-03-01 | Continental Automotive Gmbh | Method and device for operating an electronic device |
EP3736677A1 (en) * | 2019-05-10 | 2020-11-11 | MyScript | A method and corresponding device for selecting and editing handwriting input elements |
US20200379716A1 (en) * | 2019-05-31 | 2020-12-03 | Apple Inc. | Audio media user interface |
CN110908580B (en) * | 2019-11-11 | 2021-11-02 | 广州视源电子科技股份有限公司 | Method and device for controlling application |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6958749B1 (en) * | 1999-11-04 | 2005-10-25 | Sony Corporation | Apparatus and method for manipulating a touch-sensitive display panel |
CN1977234A (en) * | 2003-09-25 | 2007-06-06 | 诺基亚有限公司 | User interface on a portable electronic device |
CN101098533A (en) * | 2006-06-26 | 2008-01-02 | 三星电子株式会社 | Keypad touch user interface method and mobile terminal using the same |
TW200805132A (en) * | 2006-05-24 | 2008-01-16 | Lg Electronics Inc | Touch screen device and operating method thereof |
CN101384977A (en) * | 2005-09-16 | 2009-03-11 | 苹果公司 | Operation of a computer with touch screen interface |
Family Cites Families (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4307266A (en) * | 1978-08-14 | 1981-12-22 | Messina John D | Communication apparatus for the handicapped |
US5483261A (en) * | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
US5825352A (en) * | 1996-01-04 | 1998-10-20 | Logitech, Inc. | Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad |
US6009355A (en) * | 1997-01-28 | 1999-12-28 | American Calcar Inc. | Multimedia information and control system for automobiles |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US9292111B2 (en) * | 1998-01-26 | 2016-03-22 | Apple Inc. | Gesturing with a multipoint sensing device |
US7358956B2 (en) * | 1998-09-14 | 2008-04-15 | Microsoft Corporation | Method for providing feedback responsive to sensing a physical presence proximate to a control of an electronic device |
US6532005B1 (en) * | 1999-06-17 | 2003-03-11 | Denso Corporation | Audio positioning mechanism for a display |
US6999066B2 (en) * | 2002-06-24 | 2006-02-14 | Xerox Corporation | System for audible feedback for touch screen displays |
US7023427B2 (en) * | 2002-06-28 | 2006-04-04 | Microsoft Corporation | Method and system for detecting multiple touches on a touch-sensitive screen |
US7242387B2 (en) * | 2002-10-18 | 2007-07-10 | Autodesk, Inc. | Pen-mouse system |
JP4387242B2 (en) * | 2004-05-10 | 2009-12-16 | 株式会社バンダイナムコゲームス | PROGRAM, INFORMATION STORAGE MEDIUM, AND GAME DEVICE |
CA2573002A1 (en) * | 2004-06-04 | 2005-12-22 | Benjamin Firooz Ghassabian | Systems to enhance data entry in mobile and fixed environment |
KR101128572B1 (en) * | 2004-07-30 | 2012-04-23 | 애플 인크. | Gestures for touch sensitive input devices |
US20060077182A1 (en) * | 2004-10-08 | 2006-04-13 | Studt Peter C | Methods and systems for providing user selectable touch screen functionality |
US7735012B2 (en) * | 2004-11-04 | 2010-06-08 | Apple Inc. | Audio user interface for computing devices |
JP2006139615A (en) * | 2004-11-12 | 2006-06-01 | Access Co Ltd | Display device, menu display program, and tab display program |
FR2878344B1 (en) * | 2004-11-22 | 2012-12-21 | Sionnest Laurent Guyot | DATA CONTROLLER AND INPUT DEVICE |
US7728818B2 (en) * | 2005-09-30 | 2010-06-01 | Nokia Corporation | Method, device computer program and graphical user interface for user input of an electronic device |
US20090213086A1 (en) * | 2006-04-19 | 2009-08-27 | Ji Suk Chae | Touch screen device and operating method thereof |
US20070257891A1 (en) * | 2006-05-03 | 2007-11-08 | Esenther Alan W | Method and system for emulating a mouse on a multi-touch sensitive surface |
US7880728B2 (en) * | 2006-06-29 | 2011-02-01 | Microsoft Corporation | Application switching via a touch screen interface |
US7843427B2 (en) * | 2006-09-06 | 2010-11-30 | Apple Inc. | Methods for determining a cursor position from a finger contact with a touch screen display |
US7581186B2 (en) * | 2006-09-11 | 2009-08-25 | Apple Inc. | Media manager with integrated browsers |
JP2008097172A (en) * | 2006-10-10 | 2008-04-24 | Sony Corp | Display and display method |
US20080129520A1 (en) * | 2006-12-01 | 2008-06-05 | Apple Computer, Inc. | Electronic device with enhanced audio feedback |
US7777732B2 (en) * | 2007-01-03 | 2010-08-17 | Apple Inc. | Multi-event input system |
US8970503B2 (en) * | 2007-01-05 | 2015-03-03 | Apple Inc. | Gestures for devices having one or more touch sensitive surfaces |
US7924271B2 (en) * | 2007-01-05 | 2011-04-12 | Apple Inc. | Detecting gestures on multi-event sensitive devices |
US8144129B2 (en) * | 2007-01-05 | 2012-03-27 | Apple Inc. | Flexible touch sensing circuits |
US7877707B2 (en) * | 2007-01-06 | 2011-01-25 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
US8519964B2 (en) * | 2007-01-07 | 2013-08-27 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
JP2008204275A (en) * | 2007-02-21 | 2008-09-04 | Konica Minolta Business Technologies Inc | Input operation device and input operation method |
US8115753B2 (en) * | 2007-04-11 | 2012-02-14 | Next Holdings Limited | Touch screen system with hover and click input methods |
KR100894966B1 (en) * | 2007-06-07 | 2009-04-24 | 에스케이 텔레콤주식회사 | Method for simultaneously recognizing a plurality of touches in mobile terminal and the mobile terminal of the same |
US9830804B2 (en) * | 2007-06-19 | 2017-11-28 | At & T Intellectual Property, I, L.P. | Methods, apparatuses, and computer program products for implementing situational control processes |
WO2009044770A1 (en) * | 2007-10-02 | 2009-04-09 | Access Co., Ltd. | Terminal device, link selection method, and display program |
KR101398134B1 (en) * | 2007-10-04 | 2014-05-20 | 엘지전자 주식회사 | Apparatus and method for playing moving-picture in mobile terminal |
US8063905B2 (en) * | 2007-10-11 | 2011-11-22 | International Business Machines Corporation | Animating speech of an avatar representing a participant in a mobile communication |
US20090102805A1 (en) * | 2007-10-18 | 2009-04-23 | Microsoft Corporation | Three-dimensional object simulation using audio, visual, and tactile feedback |
US20090122018A1 (en) * | 2007-11-12 | 2009-05-14 | Leonid Vymenets | User Interface for Touchscreen Device |
US20090166098A1 (en) * | 2007-12-31 | 2009-07-02 | Apple Inc. | Non-visual control of multi-touch device |
KR101499546B1 (en) * | 2008-01-17 | 2015-03-09 | 삼성전자주식회사 | Method and apparatus for controlling display area in touch screen device, and computer readable medium thereof |
US8237665B2 (en) * | 2008-03-11 | 2012-08-07 | Microsoft Corporation | Interpreting ambiguous inputs on a touch-screen |
JP4853507B2 (en) * | 2008-10-30 | 2012-01-11 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
US8456420B2 (en) * | 2008-12-31 | 2013-06-04 | Intel Corporation | Audible list traversal |
US9489131B2 (en) * | 2009-02-05 | 2016-11-08 | Apple Inc. | Method of presenting a web page for accessibility browsing |
KR101597553B1 (en) * | 2009-05-25 | 2016-02-25 | 엘지전자 주식회사 | Function execution method and apparatus thereof |
US8681106B2 (en) * | 2009-06-07 | 2014-03-25 | Apple Inc. | Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface |
US9262063B2 (en) * | 2009-09-02 | 2016-02-16 | Amazon Technologies, Inc. | Touch-screen user interface |
US10357714B2 (en) * | 2009-10-27 | 2019-07-23 | Harmonix Music Systems, Inc. | Gesture-based user interface for navigating a menu |
US8446392B2 (en) * | 2009-11-16 | 2013-05-21 | Smart Technologies Ulc | Method for determining the location of a pointer in a pointer input region, and interactive input system executing the method |
- 2009
- 2009-12-03 US US12/630,381 patent/US20110138284A1/en not_active Abandoned
- 2010
- 2010-11-23 RU RU2012127679/08A patent/RU2559749C2/en active
- 2010-11-23 EP EP10834961.4A patent/EP2507698B1/en active Active
- 2010-11-23 JP JP2012542087A patent/JP5775526B2/en active Active
- 2010-11-23 CN CN201080054636.4A patent/CN102763062B/en active Active
- 2010-11-23 WO PCT/US2010/057701 patent/WO2011068713A2/en active Application Filing
- 2010-11-23 AU AU2010326223A patent/AU2010326223B2/en active Active
- 2010-11-23 KR KR1020127017151A patent/KR101872533B1/en active IP Right Grant
- 2010-11-23 CA CA2779706A patent/CA2779706C/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6958749B1 (en) * | 1999-11-04 | 2005-10-25 | Sony Corporation | Apparatus and method for manipulating a touch-sensitive display panel |
CN1977234A (en) * | 2003-09-25 | 2007-06-06 | 诺基亚有限公司 | User interface on a portable electronic device |
CN101384977A (en) * | 2005-09-16 | 2009-03-11 | 苹果公司 | Operation of a computer with touch screen interface |
TW200805132A (en) * | 2006-05-24 | 2008-01-16 | Lg Electronics Inc | Touch screen device and operating method thereof |
CN101098533A (en) * | 2006-06-26 | 2008-01-02 | 三星电子株式会社 | Keypad touch user interface method and mobile terminal using the same |
Also Published As
Publication number | Publication date |
---|---|
KR20120117809A (en) | 2012-10-24 |
KR101872533B1 (en) | 2018-08-02 |
AU2010326223B2 (en) | 2014-05-01 |
WO2011068713A3 (en) | 2011-09-29 |
CA2779706C (en) | 2019-06-04 |
WO2011068713A2 (en) | 2011-06-09 |
CN102763062B (en) | 2015-09-16 |
CA2779706A1 (en) | 2011-06-09 |
EP2507698A2 (en) | 2012-10-10 |
US20110138284A1 (en) | 2011-06-09 |
AU2010326223A1 (en) | 2012-05-24 |
EP2507698A4 (en) | 2016-05-18 |
RU2012127679A (en) | 2014-01-10 |
EP2507698B1 (en) | 2020-09-02 |
JP5775526B2 (en) | 2015-09-09 |
RU2559749C2 (en) | 2015-08-10 |
JP2013513164A (en) | 2013-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102763062B (en) | Three-state touch input system | |
US20230289023A1 (en) | Method and apparatus for displaying application | |
JP5129140B2 (en) | Computer operation using a touch screen interface | |
US9146672B2 (en) | Multidirectional swipe key for virtual keyboard | |
US20120169776A1 (en) | Method and apparatus for controlling a zoom function | |
JP2009509235A (en) | Arrangement of virtual input device on touch screen type user interface | |
JP2011248888A (en) | Method and dual screen device for user gesture on dual screen | |
JPH1040014A (en) | Method for instructing generation of virtual pointing device and device therefor | |
JPH1063425A (en) | Method for instracting generation of virtual pointing device, and computer system | |
JPH1063423A (en) | Method for instracting generation of virtual pointing device, and computer system | |
JPH1063426A (en) | Method for instruating generation of virtual pointing device, and computer system and its device | |
JP2017535898A (en) | System and method for linking applications | |
JPH1063422A (en) | Method for generating at least two virtual pointing device, and computer system | |
AU2018278777B2 (en) | Touch input device and method | |
EP4080347B1 (en) | Method and apparatus for displaying application | |
KR101919515B1 (en) | Method for inputting data in terminal having touchscreen and apparatus thereof | |
KR20160027063A (en) | Method of selection of a portion of a graphical user interface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
ASS | Succession or assignment of patent right |
Owner name: MICROSOFT TECHNOLOGY LICENSING LLC; Free format text: FORMER OWNER: MICROSOFT CORP.; Effective date: 20150629 |
C41 | Transfer of patent application or patent right or utility model | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20150629; Address after: Washington State; Applicant after: Microsoft Technology Licensing, LLC; Address before: Washington State; Applicant before: Microsoft Corp. |
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |