US20140092032A1 - Synchronized audio feedback for non-visual touch interface system and method - Google Patents
- Publication number
- US20140092032A1 (U.S. application Ser. No. 13/633,806)
- Authority
- US
- United States
- Prior art keywords
- input
- touchscreen
- relative offset
- gesture
- subsequent
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
Definitions
- the present disclosure relates to non-visual interfaces, and more particularly, to an audio feedback system associated with a touch interface.
- a keyboard, mouse, rollerball, joystick, touchpad and touchscreen may be used to interact with and command elements of a computer system.
- instruments that associate audio feedback with touch input such as a keyboard and drum machine.
- the preferred method of interaction with present smart phones and tablets is through touch and gestures.
- This system presents advantages over prior mouse and external keyboard applications.
- this popular interaction technique presents a problem for blind, visually impaired and distracted users, who cannot see graphics presented on a display. For instance, initially knowing the orientation and location of items on a screen, learning “shapes”, and general touchscreen interaction are not intuitive. What is needed is a natural, intuitive interaction system and method. Also needed is an interaction method providing feedback, such as self-reaffirming audio feedback.
- the present disclosure is generally directed to a touch interface, and more particularly, to an audio feedback system associated with a touch interface.
- This touch interface allows a user to interact with a touch screen without requiring a priori knowledge of where items, such as icons (if any), are located on a screen or even the relative orientation of that screen.
- the present system and method provide audio feedback for tapping gestures on a touchscreen to allow users to input commands without having to look at the screen.
- the audio sounds played have a relationship to the relative touch locations of the tapping and/or touches.
- users such as blind or visually impaired users, may interact with a touchscreen interface with limited knowledge of traditional gestures, shapes and motions.
- feedback such as audio and/or haptic feedback, to a touch input and/or a series of touch inputs all having relative location, users can interact with a touch interface without having to see the orientation of the screen or visually identify items on the screen for interaction.
- this relative location interaction does not involve haptic or tactile feedback aside from the natural touch of the user's finger on the screen.
- the systems and methods disclosed herein are not limited to blind and/or visually impaired users. For instance, distracted users, such as drivers of a vehicle may avail themselves of various aspects of the present disclosure.
- the present system and method is an improvement over the current technology in that it allows a user to interact with a touch surface interface without requiring visual attention for confirmation of actions. It allows complex command actions beyond a simple tap, double tap or dragging of an icon.
- the present system and method also pairs discrete tapping gestures to audio feedback which can give confirmation and reinforcement learning to input actions.
- FIG. 1A depicts an initial reference input to a touch based input system according to various embodiments
- FIG. 1B depicts a second input to a touch based input system according to various embodiments
- FIG. 1C depicts a third input to a touch based input system according to various embodiments
- FIG. 2 illustrates various vehicular embodiments of the system according to various embodiments
- FIG. 3 illustrates an overview of a computer based system according to various embodiments.
- FIG. 4 illustrates a process associated with a touch based input and feedback system according to various embodiments.
- the present disclosure is generally directed to interaction, and more particularly, to an interactive computer system 100 having a touch based input system 155 responsive to inputs having relative locations.
- relative location comprises a location of a place in relation to another place.
- this touch based input system 155 may comprise a touchscreen 110 capable of communicating information (and/or a touchpad 105 ).
- Touchscreen 110 may comprise an interface circuit which includes those components, whether embodied in software, firmware, or hardware, that translate the position information obtained from touchscreen 110 into industry-standard signals understandable by the coupled computer based system 100.
- the computer based system 100 may include a controller 190 , component or driver, to interpret the signals received from touchscreen 110 .
- those skilled in the art can arrive at many other techniques for touchscreen 110 to communicate with computer based system 100 .
- Touchscreen 110 may be operated directly, such as with a finger, stylus or portions of the hand, without the need for an intermediate device (e.g. microphone, keyboard, mouse). Thus, a mouse or keyboard is not required to interact with touchscreen 110 .
- Touchscreen 110 may be any suitable touchscreen.
- touchscreen 110 may utilize resistive, surface acoustic wave (SAW), capacitive, surface capacitance, projected capacitance, mutual capacitance, self-capacitance and/or infrared optical imaging technologies for registering inputs/commands.
- Interactive computer system 100 may be integral to and/or coupled to a hand held device, similar to a tablet device, a fixed device and/or semi-permanently located device. According to various embodiments, rich menu navigation without visual attention is achieved via system 100 .
- input system 155 comprises touchscreen 110 configured to be responsive to touch based gestures. Any pointing device, such as a user's finger, can easily and conveniently be used to touch a surface 160 of touchscreen 110.
- touchscreen 110 is configured to display graphics and/or text. Also, as various aspects of the present disclosure are directed to sightless and visually impaired users, touchscreen 110 may not display graphics and/or text. In this way, touchscreen 110 and/or touchpad 105 may be made from more robust materials.
- the present system 100 may also comprise an audio feedback system 170 . This audio feedback system 170 may comprise any suitable audio feedback. For instance, audio feedback system 170 may comprise speakers 171 , headphones and/or speakers coupled to an audio output port and/or the like.
- the present system 100 may also comprise a memory 120 for storing input, timing and commands associated with touch based gestures having relative locations.
- This memory may be a RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
- an algorithm such as an algorithm stored to memory 120 , may be configured to interact with touchscreen 110 hardware where initial touch, relative offset, direction of offset, timing between touches, progression of touches and/or audio feedback are all used to aid interaction with a computer based system, such as to aid in computer based menu navigation and action selection.
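The algorithm just described can be sketched as a per-touch record. This is a minimal, hypothetical sketch (the disclosure does not specify data structures): each touch is summarized by its relative offset from the reference touch, the magnitude and direction of that offset, and the timing since the prior touch.

```python
import math

# Hypothetical per-touch record for the algorithm described above: initial
# touch, relative offset, direction of offset, and timing between touches.
# Names and fields are illustrative assumptions, not from the disclosure.

def touch_record(x, y, ref, prev_time, now):
    """Summarize one touch relative to the reference touch ref = (x0, y0)."""
    dx, dy = x - ref[0], y - ref[1]
    return {
        "offset": (dx, dy),                                 # relative offset
        "distance": math.hypot(dx, dy),                     # magnitude of offset
        "direction_deg": math.degrees(math.atan2(dy, dx)),  # direction of offset
        "dt": now - prev_time,                              # timing between touches
    }
```

A progression of such records, one per touch, would carry everything the algorithm is said to use: initial touch, offsets, directions of offset, and inter-touch timing.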
- non-audio feedback may be presented in addition to audio feedback and/or in lieu of audio feedback.
- system 100 may comprise a haptic response system 175 .
- haptic response system 175 may cause device and/or system 100 and/or elements of device and/or system 100 to vibrate when touchscreen 110 is tapped.
- the feedback of haptic response system 175 may be constant and/or vary in intensity and/or duration based on different inputs/commands and/or relative offset gestures received, as disclosed herein.
- This haptic feedback may be applied on a case-by-case basis and/or responsive to a user selection.
- the shape of the interactive computer system 100 may aid a user in determining the initial orientation of touchscreen 110 .
- a distinctive shape of the perimeter of interactive computer system 100 may aid a user in determining initial orientation.
- device 100 may be shaped with a narrow top and a wide base, similar to a trapezoid so that a user may instantly be able to determine the orientation of the device.
- a motion sensor integral to system 100 may aid orienting device 100 for a user. For instance, the user need only hold system 100 upright and the device will reorient touchscreen 110 accordingly.
- interactive computer system 100 may comprise a tactile indicator to indicate a starting orientation of device 100 similar to the raised elements on the “F” and “J” keys on the home row of a traditional QWERTY keyboard.
- This tactile indicator may be located on any location on device 100, such as the top, side, edge, back, on touchscreen 110 and/or the like.
- other tactile surface characteristics may be used to assist the user to determine system 100 orientation.
- device 100 may have a soft surface feature near its top and/or a rough surface feature near its bottom, and/or the like.
- system 100 may be coupled to and/or integral to at least one of a mobile device (e.g. mobile phone, tablet), computer, robot, home appliance, tool and/or toy.
- the feedback presented to a user in response to interaction with touchscreen 110 may comprise any audio and/or haptic feedback.
- unique feedback is presented to the user based on the unique commands input to system 100 , such as relative location based commands.
- inputs to touchscreen 110 comprising variations in time between touches, force of a touch, number of sequential touches in a relative location, type of slides/drags in combination with relative location, relative distance from a reference touch, time for a sequence of touches and/or slides may all result in a distinct feedback provided.
- Limitless combinations of inputs are available based on combinations of the above inputs to touchscreen 110 .
- This unique feedback may be established in any suitable way, such as configured by a user and/or set according to a scheme.
- One example of such a scheme is a musical note scheme.
- piano notes/tones may be a first scheme
- acoustic guitar notes/tones may be a second scheme.
- These notes and/or tones may have any frequency, such as traditional music theory pitches, normally denoted by the letters (A, B, C, D, E, F and G) including accidentals and various applicable octaves.
- two commands may be represented by the same note of varying duration.
- two commands may be represented by the same note represented by different instrument sounds.
- a received command may be represented by one or more combinations of tones/notes, such as a chord.
- Each received input to touchscreen 110 may result in feedback and/or a combination of inputs may result in feedback once all inputs in a sequence/series are registered.
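For the musical note scheme above, standard equal-temperament tuning gives each note a definite frequency (A4 = 440 Hz, one semitone = a factor of 2^(1/12)). A minimal sketch follows; mapping particular commands to particular notes is left configurable by the disclosure.

```python
# Equal-temperament pitch, as in traditional music theory: A4 = 440 Hz and
# each semitone multiplies the frequency by 2**(1/12).

def note_freq(semitones_from_a4):
    """Frequency in Hz of a note offset from A4 by a signed semitone count."""
    return 440.0 * 2 ** (semitones_from_a4 / 12)

# F2 (the disclosure's example reference note) lies 28 semitones below A4;
# F3, one octave higher, lies 16 semitones below A4.
f2 = note_freq(-28)  # about 87.31 Hz
f3 = note_freq(-16)  # about 174.61 Hz, exactly double F2
```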
- system 100 is not limited to menu actions; it may direct any command to a computer system, such as application initiation and closing, controlling settings, and/or the like. This is particularly useful for people with limited vision or people that do not wish to be visually distracted.
- the audio feedback can be a useful tool for reinforcement learning on menu navigation and action selection.
- a user may touch touchscreen 110 in any location (within the touchscreen tactile sensing area) and feedback such as an audio note is played in response to the first touch to touchscreen 110 .
- this initial touch location may be arbitrary. Stated another way, this initial touch may not be, and preferably, is not related to an icon displayed on touchscreen 110 .
- This note could be set to any sound, such as the F2 note on a piano or a dog bark at a particular pitch.
- this same note would be played.
- a user may use the feedback as a way of confirming the actions are desired actions.
- system 100 may accomplish reinforcement learning through feedback.
- subsequent touches and/or inputs may result in the playing of an offset note or varying sound based on the relative offset of the subsequent touch from the original touch and/or the relative offset of the subsequent gesture from the original touch.
- These notes may be organized in a predictive pattern. For instance, as one travels farther away from a reference touch, notes of a higher frequency are associated with the subsequent touch. Thus, the feedback may have a predictable pattern.
- subsequent touches above or below the reference touch may be set to a higher or lower octave and/or change in instrument or tone.
- this audio feedback is similar to playing different keys on a piano, although setting of the sounds can be arbitrary, relative relation between the touches could be set based on preference, and direction of offset could cause different behaviors such as shifts in key.
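One hypothetical way to realize the predictable pattern above: each unit of x offset raises the note by a semitone, and each unit of y offset shifts a full octave (a key change). The step sizes here are illustrative assumptions; the disclosure leaves the mapping to preference.

```python
# Map a relative offset to a feedback note, measured in semitones from A4.
# Base note F2 is 28 semitones below A4. One semitone per unit of x offset
# and one octave (12 semitones) per unit of y offset are assumed step sizes.

def feedback_semitones(dx_units, dy_units, base=-28):
    return base + dx_units + 12 * dy_units

# A touch at the reference location plays F2; one unit above it plays F3.
```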
- timing between touches such as relative offset touches could also be used to indicate a different desired response and/or toggle between action selections. For instance, a second touch with a relative offset of 1 inch directly horizontal to the right of the first touch within a first time period between touches may result in a different response than a second touch with a relative offset of 1 inch directly horizontal to the right of the first touch with a second longer relative time period.
- timing between the start of the first touch and the final touch in a series of taps may be used to toggle between action selections with regard to relative offset touches.
- relative distance between touches may also be used to toggle between action selection and/or commands to computer based system 100 with regard to relative offset touches. For instance, a second touch with a relative offset of 1 inch directly horizontal to the right of the first touch may result in a different response than a second touch with a relative offset of 2 inches directly horizontal to the right of the first touch.
- the relative direction between taps could also be used to toggle between action selections. For instance, a second touch with a relative offset of 1 inch directly horizontal to the right of the first touch may result in a different response than a second touch with a relative offset of 1 inch directly vertically above the first touch.
- system 100 may account for unintentional gesture drift. For instance, a second touch directly to the right of a first touch need not be precisely to the right of the first touch. For instance a user, such as a blind and/or visually impaired user, may slightly drift farther away from, nearer to, above and/or below the intended second touch location without system 100 interpreting this as a “right and above” command rather than a “to the right of the first touch” command. System 100 is intended to have some flexibility to account for this anticipated drift. Moreover, system 100 may use a feedback loop to learn the user habits and account for these types of behaviors and adjust system 100 accordingly.
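Drift tolerance of this kind might be achieved by snapping the angle between two touches to the nearest cardinal direction, so a slightly high "to the right" touch still reads as rightward. The four-direction set and the implied ±45-degree tolerance are assumptions for illustration.

```python
import math

# Snap the offset between two touches to the nearest cardinal direction.
# Angles within 45 degrees of an axis collapse onto it, absorbing the
# unintentional gesture drift discussed above. (Conventional math axes:
# positive y is "up"; a real touch stack may invert y.)

def classify_direction(dx, dy):
    angle = math.degrees(math.atan2(dy, dx)) % 360
    return ["right", "up", "left", "down"][round(angle / 90) % 4]
```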
- a series of touches may result in a different action as compared with a different series of touches.
- a combination of touches and/or slides may result in a different action by the computer based system 100 than the same series of touches and/or slides in a different order. For instance, a second touch 1 inch directly below a first touch followed by a third touch and slide/drag in a circular motion to the right of the second touch may result in a different command to the computer based system 100 as compared with a second touch directly to the right of a first touch with a slide/drag in a circular motion followed by a third touch 1 inch directly below the initiation of the second touch.
- a user may deliver an input to touchscreen 110, such as by touching touchscreen 110 at a random location (x1, y1), for instance location 1, with the x axis being horizontal and the y axis being vertical.
- horizontal and vertical directions could be determined through accelerometers or set based on hardware, and/or user defined.
- a note such as note F2 is played.
- the user may then touch the screen at location (x2, y1) with some relative offset in the x and/or y direction (this offset can be zero, as shown in FIG. 1B).
- a new note is played based on the relative offset from location 1 (distance X).
- the note A2 could be played. Timing between the touches can also be recorded and used as information in the action and response selection.
- the user may build a sequence of notes to perform a certain gesture.
- the user may build a sequence of touches and/or slides/drags to perform a certain gesture. The end of the series/sequence may be determined by a sufficient amount of delay with no further input.
- Another form of relative offset is a move in the y direction.
- a complete shift of key may designate the relative offset in the y direction, for example from note F2 to note F3 (see FIG. 1C, location 3).
- the relative offset of location 3 may be calculated from location 1 (distance Z) and/or calculated from the previous touch, such as location 2 (distance Y).
- touchscreen 110 may toggle between an interface that uses visual cues, such as icons that are tapped and/or dragged to a system that uses relative offset based touch navigation. For instance, in response to a received signal, such as a user tapping the screen three times in rapid succession in the same location, icon based interaction may be temporarily disabled and relative offset navigation may be enabled.
- Other signals may also be used to toggle touchscreen 110 to receive relative offset commands.
- the generally selectable icons may remain visible on the touchscreen but not be responsive to a user's touch. Instead, the relative offset of a user's subsequent touches and/or cumulative gesture may be used to command the system, such as to navigate through a menu or call upon a desired application directly.
- the generally selectable icons may disappear on the touchscreen.
- the relative offset of a user's subsequent touches and/or cumulative gesture may be used to command the system, such as to navigate through a menu or call upon a desired application directly.
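The triple-tap toggle described above might be detected as three touches at approximately the same location within a short time window. The distance and time thresholds here are illustrative assumptions.

```python
# Detect the hypothetical mode-toggle gesture: three taps, same place,
# rapid succession. Each tap is (x, y, t); thresholds are assumptions.

def is_toggle(taps, max_dist=0.2, max_span=0.9):
    if len(taps) != 3:
        return False
    x0, y0, t0 = taps[0]
    same_place = all(abs(x - x0) <= max_dist and abs(y - y0) <= max_dist
                     for x, y, _ in taps)
    return same_place and taps[-1][2] - t0 <= max_span
```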
- system 100 may be activated by the user ( 405 ).
- This activation signal may be an external switch or button and/or a distinct input.
- the distinct input may be any suitable input; for example the distinct input to activate the system may be an initial touch and a hold for a period of time, a series of touches in the same location, or a distinct swipe or slide command.
- system 100 may be set in a ready mode and not involve an activation selection prior to a reference input being received.
- system 100 may receive a reference input via touchscreen 110 ( 410 ). As disclosed herein, this reference input may be at any location within the boundaries of touchscreen 110 . Stated another way, this reference input may be at a random location. For example, a reference input may be at a first location during one use and at a second, different location during a second use.
- a timer may begin ( 420 ). This timer may record the time it takes to receive a total series of inputs, the time between one or more inputs, and/or the time after an input without a subsequent input ( 425 ).
- system 100 may deliver a feedback response ( 415 ).
- This feedback response may be an audio feedback response.
- This audio feedback response may be any audio feedback, but is preferably a musical note or tone. Though it could vary, preferably the same note is played for all initial reference inputs.
- system 100 may receive a subsequent input ( 430 ). Substantially concurrently with receiving the subsequent input, system 100 may deliver a feedback response ( 450 ). In response to receiving the subsequent input, a timer may begin ( 460 ). System 100 may calculate the relative x and relative y offset of the subsequent input as compared with a prior input, such as the reference input ( 440 ). If enough time passes after a subsequent input, such as a threshold of time after a subsequent touch expiring, system 100 will perceive the series of touches as complete and associate and/or identify a gesture with the received inputs ( 425 , 480 ). In response to identifying a gesture, a command may be initiated by system 100 ( 490 ).
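The flow of FIG. 4 (reference input at step 410, timers at 420/460, offset calculation at 440, timeout-based gesture completion at 425/480) can be sketched as a small session object. The class and method names, and the one-second timeout, are assumptions.

```python
class GestureSession:
    """Hypothetical sketch of the FIG. 4 input flow."""

    def __init__(self, timeout=1.0):
        self.timeout = timeout
        self.ref = None        # reference touch location (step 410)
        self.touches = []      # offsets relative to the reference (step 440)
        self.last_time = None  # restart of the inactivity timer (420/460)

    def input(self, x, y, t):
        if self.ref is None:
            self.ref = (x, y)  # reference input may arrive at any location
            self.touches.append((0, 0))
        else:
            self.touches.append((x - self.ref[0], y - self.ref[1]))
        self.last_time = t

    def gesture(self, now):
        """Return the offset sequence once the series times out (425/480)."""
        if self.last_time is not None and now - self.last_time > self.timeout:
            return tuple(self.touches)
        return None
```

A caller would feed each touch event to `input` and poll `gesture`; a non-None result is the completed series of relative offsets, ready to match against known gestures.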
- a subsequent input two units away from a reference input may register as a different gesture or portion of a gesture as compared with a subsequent input one unit or three units away from a reference input.
- a subsequent input two units away from a reference input in a positive direction substantially along the x axis may register as a different gesture or portion of a gesture as compared with a subsequent input two units away from a reference input in a negative direction substantially along the x axis.
- a gesture may be made up of a series of touches, slides or a combination thereof.
- a gesture may comprise a first tap at location 1; a second touch at location 2 and a third touch at location 3.
- a gesture may comprise a first tap at location 1, a second touch at location 2; an additional (repeated) touch at location 2 and a fourth touch at location 3.
- another gesture may comprise a first tap at location 1; a second touch at location 2 and a third touch at location 1.
- a gesture may comprise a first tap at location 1 and a second touch at location 2 with a slide to location 3.
- This slide from location 2 to location 3 may be by any path and various paths may comprise distinct portions of gestures.
- an “S” sliding motion from location 2 to location 3 may be registered as a distinct command by system 100 as compared with a straight drag/slide in a direct path from location 2 to location 3.
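One hypothetical way to tell an "S" slide from a straight drag between the same two points is to compare the traced path length against the straight-line distance between the endpoints; the 1.2 ratio threshold is an assumption.

```python
import math

# Classify a slide path by how much longer it is than the direct line
# between its endpoints. A straight drag has a ratio near 1; an "S" path
# is noticeably longer. The threshold is an illustrative assumption.

def slide_shape(points):
    path = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    direct = math.dist(points[0], points[-1])
    return "straight" if direct == 0 or path / direct < 1.2 else "curved"
```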
- any sliding motion may be input to the system.
- a mechanism for starting over may be available. For instance, a certain input, such as a double tap or circular slide, may be designated to erase a previous input and/or series of inputs.
- Muscle memory and/or motor learning, a form of procedural memory that involves consolidating a specific motor task into memory through repetition, may be involved in interaction with system 100. For instance, when a movement is repeated over time, such as inputting a gesture, a long-term muscle memory may be created for that task, eventually allowing it to be performed without conscious effort. This process decreases the need for attention and creates maximum efficiency within the motor and memory systems.
- system 100 may be integrated into an interface of a vehicle such as a control panel (e.g. a front seat control panel), touchscreen 110 and/or driver accessible touchpad 105 .
- either touchpad 105 or touchscreen 110 may be used interchangeably and/or together to deliver input to system 100 .
- aspects of the present system 100 may work in concert with voice commands.
- the system 100 may receive and respond to an audio command, such as a voice command.
- an audio command may comprise a portion of a gesture.
- a software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
- An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor.
- the computational steps disclosed herein may be comprised in an article of manufacture including a non-transitory, tangible computer readable storage medium having instructions stored thereon.
- references to “various embodiments”, “some embodiments”, “one embodiment”, “an embodiment”, “an example embodiment”, etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described. After reading the description, it will be apparent to one skilled in the relevant art(s) how to implement the disclosure in alternative embodiments.
- the processor and the storage medium may reside in an Application Specific Integrated Circuit (ASIC).
Abstract
Description
- The present disclosure relates to non-visual interfaces, and more particularly, to an audio feedback system associated with a touch interface.
- In general, at present there are many methods for a computer system to receive an input. For example, a keyboard, mouse, rollerball, joystick, touchpad and touchscreen may be used to interact with and command elements of a computer system. Likewise, there are instruments that associate audio feedback with touch input such as a keyboard and drum machine. Currently the preferred method of interaction with present smart phones and tablets is through touch and gestures. This system presents advantages over prior mouse and external keyboard applications. However, this present popular interaction technique presents a problem for blind, visually impaired and distracted users, whom cannot see graphics presented on a display. For instance, initially knowing the orientation and location of items on a screen, learning “shapes”, and general touchscreen interaction are not intuitive. What is needed is a natural intuitive interaction system and method. Also, an interaction method providing feedback, such as self-reaffirming audio feedback is needed.
- The above needs are successfully met via the disclosed system and method. The present disclosure is generally directed to a touch interface, and more particularly, to an audio feedback system associated with a touch interface. This touch interface allows a user to interact with a touchscreen without requiring a priori knowledge of where (if any) items, such as icons, are located on the screen, or even of the relative orientation of that screen.
- In accordance with various embodiments, the present system and method provide audio feedback for tapping gestures on a touchscreen to allow users to input commands without having to look at the screen. The audio sounds played have a relationship to the relative touch locations of the tapping and/or touches. Thus, users, such as blind or visually impaired users, may interact with a touchscreen interface with limited knowledge of traditional gestures, shapes and motions. By assigning feedback, such as audio and/or haptic feedback, to a touch input and/or a series of touch inputs all having relative location, users can interact with a touch interface without having to see the orientation of the screen or visually identify items on the screen for interaction.
- In various embodiments, this relative location interaction does not involve haptic or tactile feedback aside from the natural touch of the user's finger on the screen. Likewise, while the present disclosure presents various advantages for blind and visually impaired users, the systems and methods disclosed herein are not limited to blind and/or visually impaired users. For instance, distracted users, such as drivers of a vehicle may avail themselves of various aspects of the present disclosure.
- In various embodiments, the present system and method is an improvement over current technology in that it allows a user to interact with a touch surface interface without requiring visual attention for confirmation of actions. It allows complex command actions beyond a simple tap, double tap or dragging of an icon. The present system and method also pairs discrete tapping gestures with audio feedback, which can provide confirmation and reinforcement learning for input actions.
- The features and advantages of the embodiments of the present disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings. Naturally, the drawings and their associated descriptions illustrate example arrangements within the scope of the claims and do not limit the scope of the claims. Reference numbers are reused throughout the drawings to indicate correspondence between referenced elements.
- FIG. 1A depicts an initial reference input to a touch based input system according to various embodiments;
- FIG. 1B depicts a second input to a touch based input system according to various embodiments;
- FIG. 1C depicts a third input to a touch based input system according to various embodiments;
- FIG. 2 illustrates various vehicular embodiments of the system according to various embodiments;
- FIG. 3 illustrates an overview of a computer based system according to various embodiments; and
- FIG. 4 illustrates a process associated with a touch based input and feedback system according to various embodiments.
- In the following detailed description, numerous specific details are set forth to provide an understanding of the present disclosure. It will be apparent, however, to one of ordinary skill in the art that elements of the present disclosure may be practiced without some of these specific details. In other instances, well-known structures and techniques have not been shown in detail to avoid unnecessarily obscuring the present disclosure.
- The present disclosure is generally directed to interaction, and more particularly, to an
interactive computer system 100 having a touch based input system 155 responsive to inputs having relative locations. As used herein, relative location comprises a location of a place in relation to another place. In various embodiments, this touch based input system 155 may comprise a touchscreen 110 capable of communicating information (and/or a touchpad 105). -
Touchscreen 110 may comprise an interface circuit which includes those components, whether embodied in software, firmware, or hardware, which are used to convert the position information obtained from touchscreen 110 to industry standard signals understandable by the coupled computer based system 100. The computer based system 100 may include a controller 190, component or driver, to interpret the signals received from touchscreen 110. Alternatively, those skilled in the art can arrive at many other techniques for touchscreen 110 to communicate with computer based system 100. -
Touchscreen 110 may be operated directly, such as with a finger, stylus or portions of the hand, without the need for an intermediate device (e.g. microphone, keyboard, mouse). Thus, a mouse or keyboard is not required to interact with touchscreen 110. Touchscreen 110 may be any suitable touchscreen. For instance, touchscreen 110 may utilize resistive, surface acoustic wave (SAW), capacitive, surface capacitance, projected capacitance, mutual capacitance, self-capacitance and/or infrared optical imaging technologies for registering inputs/commands. -
Interactive computer system 100 may be integral to and/or coupled to a hand held device, similar to a tablet device, a fixed device and/or a semi-permanently located device. According to various embodiments, rich menu navigation without visual attention is achieved via system 100. - As disclosed herein,
input system 155 comprises touchscreen 110 configured to be responsive to touch based gestures. Any pointing device, such as a user's finger, can easily and conveniently be used to touch a surface 160 of the touchscreen 110. In various embodiments, touchscreen 110 is configured to display graphics and/or text. Also, as various aspects of the present disclosure are directed to sightless and visually impaired users, touchscreen 110 may not display graphics and/or text. In this way, touchscreen 110 and/or touchpad 105 may be made from more robust materials. The present system 100 may also comprise an audio feedback system 170. This audio feedback system 170 may comprise any suitable audio feedback device. For instance, audio feedback system 170 may comprise speakers 171, headphones and/or speakers coupled to an audio output port, and/or the like. The present system 100 may also comprise a memory 120 for storing input, timing and commands associated with touch based gestures having relative locations. This memory may be a RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. - In various embodiments, an algorithm, such as an algorithm stored to
memory 120, may be configured to interact with touchscreen 110 hardware, where initial touch, relative offset, direction of offset, timing between touches, progression of touches and/or audio feedback are all used to aid interaction with a computer based system, such as to aid in computer based menu navigation and action selection. - In various embodiments, non-audio feedback may be presented in addition to audio feedback and/or in lieu of audio feedback. Thus,
system 100 may comprise a haptic response system 175. For example, haptic response system 175 may cause device and/or system 100, and/or elements thereof, to vibrate when touchscreen 110 is tapped. The feedback of haptic response system 175 may be constant and/or vary in intensity and/or duration based on the different inputs/commands and/or relative offset gestures received, as disclosed herein. This haptic feedback may be applied on a case-by-case basis and/or responsive to a user selection. - In various embodiments, the shape of the
interactive computer system 100 may aid a user in determining the initial orientation of touchscreen 110. For instance, a distinctive shape of the perimeter of interactive computer system 100 may aid a user in determining initial orientation. For example, device 100 may be shaped with a narrow top and a wide base, similar to a trapezoid, so that a user may instantly determine the orientation of the device. - In various embodiments, such as with a hand held
interactive computer system 100, a motion sensor integral to system 100, such as an accelerometer, may aid in orienting device 100 for a user. For instance, the user need only hold system 100 upright and the device will reorient the orientation of touchscreen 110 accordingly. - In various embodiments,
interactive computer system 100 may comprise a tactile indicator to indicate a starting orientation of device 100, similar to the raised elements on the "F" and "J" keys on the home row of a traditional QWERTY keyboard. This tactile indicator may be located at any location on device 100, such as the top, side, edge, back, on touchscreen 110 and/or the like. According to various embodiments, other tactile surface characteristics may be used to assist the user in determining system 100 orientation. For example, device 100 may have a soft surface feature near its top and/or a rough surface feature near its bottom, and/or the like. - In various embodiments, such as when
system 100 is located in a fixed position, such as the center console of a vehicle, the initial orientation of touchscreen 110 may be known based on prior use. According to various embodiments, system 100 may be coupled to and/or integral to at least one of a mobile device (e.g. mobile phone, tablet), computer, robot, home appliance, tool and/or toy. - Though the feedback presented to a user in response to interaction with
touchscreen 110 may comprise any audio and/or haptic feedback, in accordance with aspects of the present disclosure, unique feedback is presented to the user based on the unique commands input to system 100, such as relative location based commands. For instance, inputs to touchscreen 110 comprising variations in time between touches, force of a touch, number of sequential touches in a relative location, type of slides/drags in combination with relative location, relative distance from a reference touch, and time for a sequence of touches and/or slides may all result in distinct feedback. Limitless combinations of inputs are available based on combinations of the above inputs to touchscreen 110. - This unique feedback may be established in any suitable way, such as configured by a user and/or set according to a scheme. One example of such a scheme is a musical note scheme. For instance, piano notes/tones may be a first scheme, whereas acoustic guitar notes/tones may be a second scheme. These notes and/or tones may have any frequency, such as traditional music theory pitches, normally denoted by the letters (A, B, C, D, E, F and G), including accidentals and various applicable octaves. In various embodiments, two commands may be represented by the same note of varying duration. In various embodiments, two commands may be represented by the same note rendered with different instrument sounds. In various embodiments, a received command may be represented by one or more combinations of tones/notes, such as a chord. Each received input to
touchscreen 110 may result in feedback, and/or a combination of inputs may result in feedback once all inputs in a sequence/series are registered. - Combinations of notes and timing could allow the user to build a complete set of menu actions and behaviors without requiring any visual attention to
device 100. Of course, system 100 is not limited to menu actions; it may direct any command to a computer system, such as application initiation and closing, controlling settings, and/or the like. This is particularly useful for people with limited vision or people who do not wish to be visually distracted. Likewise, the audio feedback can be a useful tool for reinforcement learning of menu navigation and action selection. - In various embodiments, according to aspects of the present disclosure, a user may touch
touchscreen 110 in any location (within the touchscreen tactile sensing area) and feedback, such as an audio note, is played in response to the first touch to touchscreen 110. According to various embodiments, this initial touch location may be arbitrary. Stated another way, this initial touch may not be, and preferably is not, related to an icon displayed on touchscreen 110. This note could be set to any sound, such as the F2 note on a piano or a dog bark at a particular pitch. According to various embodiments, in response to a user initially touching the screen, this same note is played. Thus, a user may use the feedback as a way of confirming that the actions are the desired actions. In this way, system 100 may accomplish reinforcement learning through feedback. - Though they may be represented by any note/tone, according to various embodiments, subsequent touches and/or inputs may result in the playing of an offset note or varying sound based on the relative offset of the subsequent touch from the original touch and/or the relative offset of the subsequent gesture from the original touch. These notes may be organized in a predictive pattern. For instance, as one travels farther away from a reference touch, notes of a higher frequency are associated with the subsequent touch. Thus, the feedback may have a predictable pattern. Also, subsequent touches above or below the reference touch may be set to a higher or lower octave and/or a change in instrument or tone.
- In various embodiments, this audio feedback is similar to playing different keys on a piano, although setting of the sounds can be arbitrary, relative relation between the touches could be set based on preference, and direction of offset could cause different behaviors such as shifts in key.
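As a concrete sketch of this piano-like mapping, the snippet below assigns each touch a note from its offset relative to the first touch. The F2 reference note appears in the disclosure's example; the two-semitones-per-inch step and the one-inch octave threshold are illustrative assumptions, not values fixed by the disclosure.

```python
# Hypothetical offset-to-note mapping: the first touch plays a fixed
# reference note; later touches play notes offset by their x displacement,
# with a sufficient y displacement shifting the key up a full octave.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(midi: int) -> str:
    """Spell a MIDI note number as a name, e.g. 41 -> 'F2'."""
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1)

def notes_for_touches(touches, base_midi=41):  # MIDI 41 = F2 (assumed default)
    """touches: list of (x, y) positions in inches; returns note names played."""
    x0, y0 = touches[0]
    names = [note_name(base_midi)]  # same reference note for every first touch
    for x, y in touches[1:]:
        semitones = round((x - x0) * 2)   # 2 semitones per inch (assumed)
        if y - y0 >= 1:                   # large y offset -> octave key shift
            semitones += 12
        names.append(note_name(base_midi + semitones))
    return names
```

With touches at location 1 (0, 0), location 2 (2, 0) and location 3 (0, 1), this sketch yields F2, A2 and F3, consistent with the example discussed in connection with FIGS. 1A-1C.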
- According to various embodiments, timing between touches, such as relative offset touches, could also be used to indicate a different desired response and/or to toggle between action selections. For instance, a second touch with a relative offset of 1 inch directly horizontal to the right of the first touch within a first time period between touches may result in a different response than a second touch with the same relative offset but a second, longer relative time period between touches.
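One way to realize such timing-dependent responses is to bucket the inter-touch interval, as in the following sketch; the 0.5-second boundary and the command names are assumptions made only for illustration.

```python
FAST_TAP = 0.5  # seconds; assumed boundary between "quick" and "slow" follow-ups

def response_for(offset_inches: float, dt_seconds: float) -> str:
    """Select a response from both the relative offset and the time between
    touches: the same 1-inch rightward offset maps to different commands
    depending on how quickly the second touch follows the first."""
    if offset_inches == 1:
        return "select next item" if dt_seconds <= FAST_TAP else "open submenu"
    return "unrecognized"
```

Here a quick 1-inch follow-up selects the next item, while the identical offset after a longer pause opens a submenu instead.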
- According to various embodiments, timing between the start of the first touch and the final touch in a series of taps may be used to toggle between action selections with regard to relative offset touches.
- According to various embodiments, relative distance between touches may also be used to toggle between action selection and/or commands to computer based
system 100 with regard to relative offset touches. For instance, a second touch with a relative offset of 1 inch directly horizontal to the right of the first touch may result in a different response than a second touch with a relative offset of 2 inches directly horizontal to the right of the first touch. - The relative direction between taps could also be used to toggle between action selections. For instance, a second touch with a relative offset of 1 inch directly horizontal to the right of the first touch may result in a different response than a second touch with a relative offset of 1 inch directly vertically above the first touch.
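A minimal sketch of how relative offsets might be bucketed by whole-inch distance and by direction to toggle between commands; the angular tolerance, which also absorbs slight unintentional drift in a touch, the buckets and the command table are all assumptions.

```python
import math

def classify_offset(dx: float, dy: float, tolerance_deg: float = 30.0):
    """Bucket a relative offset (inches) into (direction, rounded distance).

    An offset within tolerance_deg of a cardinal axis snaps to that axis,
    so a slightly drifting touch still counts as e.g. "right".
    """
    distance = round(math.hypot(dx, dy))
    angle = math.degrees(math.atan2(dy, dx)) % 360
    for name, center in (("right", 0), ("up", 90), ("left", 180), ("down", 270)):
        if min(abs(angle - center), 360 - abs(angle - center)) <= tolerance_deg:
            return name, distance
    return "diagonal", distance

# Hypothetical command table keyed on the (direction, distance) buckets.
COMMANDS = {("right", 1): "next item", ("right", 2): "next page", ("up", 1): "back"}
```

Under this sketch, a 1-inch and a 2-inch rightward offset select different commands, and a 1-inch upward offset selects a third, matching the distance- and direction-based toggling described here.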
- In general, it should be appreciated that
system 100 may account for unintentional gesture drift. For instance, a second touch directly to the right of a first touch need not be precisely to the right of the first touch. A user, such as a blind and/or visually impaired user, may drift slightly farther from, nearer to, above and/or below the intended second touch location without system 100 interpreting the input as, for example, a "right and above" command rather than a "to the right of the first touch" command. System 100 is intended to have some flexibility to account for this anticipated drift. Moreover, system 100 may use a feedback loop to learn user habits, account for these types of behaviors and adjust system 100 accordingly. - In various embodiments, a series of touches may result in a different action as compared with a different series of touches. According to various embodiments, a combination of touches and/or slides may result in a different action by the computer based
system 100 than the same series of touches and/or slides in a different order. For instance, a second touch 1 inch directly below a first touch, followed by a third touch and slide/drag in a circular motion to the right of the second touch, may result in a different command to the computer based system 100 as compared with a second touch directly to the right of a first touch with a slide/drag in a circular motion, followed by a third touch 1 inch directly below the initiation of the second touch. - In operation, according to various embodiments and with reference to
FIGS. 1A-1C, a user may deliver an input to touchscreen 110, such as by touching touchscreen 110 at a random location (x1, y1), for instance location 1, with the x axis being horizontal and the y axis being vertical. As disclosed herein, horizontal and vertical directions could be determined through accelerometers, set based on hardware, and/or user defined. Responsive to this touch, a note, such as note F2, is played. The user may then touch the screen at location (x2, y1) with some relative offset in the x and/or y direction (this offset can be zero, as shown in FIG. 1B with respect to the y axis) and a new note is played based on the relative offset from location 1 (distance X). For example, in the case of a positive x displacement, in this case to the right of the first touch, the note A2 could be played. Timing between the touches can also be recorded and used as information in the action and response selection. Likewise, according to various embodiments, the user may build a sequence of notes to perform a certain gesture. According to various embodiments, the user may build a sequence of touches and/or slides/drags to perform a certain gesture. The end of the series/sequence may be determined by a sufficient amount of delay with no further input. Another form of relative offset is a move in the y direction. In this case, a complete shift of key may designate the relative offset in the y direction, for example from note F2 to note F3 (see FIG. 1C, location 3). The relative offset of location 3 may be calculated from location 1 (distance Z) and/or from the previous touch, such as location 2 (distance Y). - According to various embodiments,
touchscreen 110 may toggle between an interface that uses visual cues, such as icons that are tapped and/or dragged, and a system that uses relative offset based touch navigation. For instance, in response to a received signal, such as a user tapping the screen three times in rapid succession in the same location, icon based interaction may be temporarily disabled and relative offset navigation may be enabled. Other sources of transition (e.g. to toggle touchscreen 110 to receive relative offset commands) may include receiving speech input, or selectable virtual buttons or physical buttons, such as those placed on a steering wheel or on and/or coupled to device 100. For instance, after a user toggles touchscreen 110 to receive relative offset commands (e.g. taps touchscreen 110 three times and/or performs a single tap and hold for a predetermined length of time), the generally selectable icons may remain visible on the touchscreen but not be responsive to a user's touch. Instead, the relative offset of a user's subsequent touches and/or cumulative gesture may be used to command the system, such as to navigate through a menu or call upon a desired application directly. - According to various embodiments, after a user toggles
touchscreen 110 to receive relative offset commands (e.g. taps touchscreen 110 three times), the generally selectable icons may disappear from the touchscreen. Then, the relative offset of a user's subsequent touches and/or cumulative gesture may be used to command the system, such as to navigate through a menu or call upon a desired application directly. - According to various embodiments, and with reference to the flow diagram of
FIG. 4, system 100 may be activated by the user (405). This activation signal may be an external switch or button and/or a distinct input. The distinct input may be any suitable input; for example, the distinct input to activate the system may be an initial touch and hold for a period of time, a series of touches in the same location, or a distinct swipe or slide command. Also, system 100 may be set in a ready mode and not involve an activation selection prior to a reference input being received. - Once activated,
system 100 may receive a reference input via touchscreen 110 (410). As disclosed herein, this reference input may be at any location within the boundaries of touchscreen 110. Stated another way, this reference input may be at a random location. For example, a reference input may be at a first location during one use and at a second, different location during a second use. In response to receiving the reference input, a timer may begin (420). This timer may record the time it takes to receive a total series of inputs, the time between one or more inputs, and/or the time after an input without a subsequent input (425). Substantially concurrently with receiving the reference input, system 100 may deliver a feedback response (415). This feedback response may be an audio feedback response. The audio feedback response may be any audio feedback, but is preferably a musical note or tone. Though it could be different, preferably the note is the same note for all initial reference inputs. - Subsequent to receiving the reference input,
system 100 may receive a subsequent input (430). Substantially concurrently with receiving the subsequent input, system 100 may deliver a feedback response (450). In response to receiving the subsequent input, a timer may begin (460). System 100 may calculate the relative x and relative y offset of the subsequent input as compared with a prior input, such as the reference input (440). If enough time passes after a subsequent input, such as a threshold of time after a subsequent touch expiring, system 100 will deem the series of touches complete and associate and/or identify a gesture with the received inputs (425, 480). In response to identifying a gesture, a command may be initiated by system 100 (490). - For instance, a subsequent input two units away from the reference input may register as a different gesture or portion of a gesture as compared with a subsequent input one unit away from or three units away from the reference input. Similarly, a subsequent input two units away from the reference input in a positive direction substantially along the x axis may register as a different gesture or portion of a gesture as compared with a subsequent input two units away from the reference input in a negative direction substantially along the x axis.
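The flow described in connection with FIG. 4 can be sketched as follows: collect a reference input and subsequent inputs, quantize each subsequent input's offset from the reference, and close the series after a quiet period. The one-second quiet period, the whole-unit quantization and the gesture table are assumptions used only to make the sketch runnable.

```python
QUIET_PERIOD = 1.0  # seconds of silence that ends a series (assumed value)

def identify_gesture(series, gesture_table):
    """series: list of (x, y, t) touches; the first is the reference input.
    Offsets from the reference are quantized to whole units and looked up."""
    x0, y0, _ = series[0]
    offsets = tuple((round(x - x0), round(y - y0)) for x, y, _ in series[1:])
    return gesture_table.get(offsets)

def process(inputs, gesture_table):
    """Split a time-ordered input stream into series separated by
    QUIET_PERIOD, identifying a gesture for each completed series."""
    series, commands = [], []
    for touch in inputs:
        if series and touch[2] - series[-1][2] > QUIET_PERIOD:
            commands.append(identify_gesture(series, gesture_table))
            series = []
        series.append(touch)
    if series:
        commands.append(identify_gesture(series, gesture_table))
    return commands
```

With a hypothetical table mapping the offset sequence ((1, 0),) to one command and ((1, 0), (1, 1)) to another, the same reference-then-right tap pair yields the first command regardless of where on the screen the series begins.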
- As disclosed herein, a gesture may be made up of a series of touches, slides, or a combination thereof. For instance, and with renewed reference to
FIG. 1, by way of non-limiting example, a gesture may comprise a first tap at location 1, a second touch at location 2 and a third touch at location 3. A gesture may comprise a first tap at location 1, a second touch at location 2, an additional (repeated) touch at location 2 and a fourth touch at location 3. Also, for example, another gesture may comprise a first tap at location 1, a second touch at location 2 and a third touch at location 1. Additionally, a gesture may comprise a first tap at location 1 and a second touch at location 2 with a slide to location 3. This slide from location 2 to location 3 may be by any path, and various paths may comprise distinct portions of gestures. For example, an "S" sliding motion from location 2 to location 3 may be registered as a distinct command by system 100 as compared with a straight drag/slide in a direct path from location 2 to location 3. As will be appreciated, any sliding motion may be input to the system. Also, it is conceived that varying force of a touch or portions of a slide may result in a different portion of a gesture.
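To illustrate how an "S" slide and a straight drag between the same two locations could register as distinct gesture portions, one might classify a slide by its deviation from the straight chord between its endpoints; the 0.2-inch tolerance and the sample paths below are assumptions.

```python
import math

def slide_shape(path, tolerance: float = 0.2):
    """Classify a slide's path as "straight" or "curved" from the maximum
    perpendicular deviation of its sample points off the start-to-end chord.
    path: list of (x, y) samples in inches; tolerance is an assumed value."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    chord = math.hypot(x1 - x0, y1 - y0) or 1.0
    deviation = max(
        abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / chord
        for x, y in path
    )
    return "straight" if deviation <= tolerance else "curved"
```

A direct drag from location 2 to location 3 stays within tolerance of the chord and registers as "straight", while an "S" motion between the same endpoints deviates well beyond it and registers as "curved", allowing the two to map to different commands.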
- Muscle memory and/or motor learning, which is a form of procedural memory that involves consolidating a specific motor task into memory through repetition may be involved in interaction with
system 100. For instance, when a movement is repeated over time, such as inputting a gesture, a long-term muscle memory may be created for that task, eventually allowing it to be performed without conscious effort. This process decreases the need for attention and creates maximum efficiency within the motor and memory systems. - As disclosed herein, those who do not wish to be visually distracted include vehicle drivers. Thus, according to various embodiments and with reference to
FIG. 2, system 100 may be integrated into an interface of a vehicle such as a control panel (e.g. a front seat control panel), touchscreen 110 and/or a driver accessible touchpad 105. As shown, either touchpad 105 or touchscreen 110 may be used interchangeably and/or together to deliver input to system 100. - Aspects of the
present system 100 may work in concert with voice commands. For instance, the system 100 may receive and respond to an audio command, such as a voice command. Moreover, an audio command may comprise a portion of a gesture. - Those of ordinary skill will appreciate that the various illustrative logical blocks and process steps described in connection with the examples disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Ordinarily skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed apparatus and methods.
- The steps of a method or algorithm described in connection with the examples disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The computational steps disclosed herein may be comprised in an article of manufacture including a non-transitory, tangible computer readable storage medium having instructions stored thereon.
- Systems, methods and computer program products are provided. References to "various embodiments", "some embodiments", "one embodiment", "an embodiment", "an example embodiment", etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described. After reading the description, it will be apparent to one skilled in the relevant art(s) how to implement the disclosure in alternative embodiments.
- As noted above, the storage medium may alternatively be integral to the processor, and the processor and the storage medium may reside in an Application Specific Integrated Circuit (ASIC).
- The foregoing description of the disclosed example embodiments is provided to enable any person of ordinary skill in the art to make or use the present invention. Various modifications to these examples will be readily apparent to those of ordinary skill in the art, and the principles disclosed herein may be applied to other examples without departing from the spirit or scope of the present invention. It should be appreciated that, as used herein, the terms taps, slides, drags, and touches may be used interchangeably. Further, the terms touchpad and touchscreen may be used interchangeably. The described embodiments are to be considered in all respects only as illustrative and not restrictive and the scope of the invention is, therefore, indicated by the following claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
Claims (20)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/633,806 US9411507B2 (en) | 2012-10-02 | 2012-10-02 | Synchronized audio feedback for non-visual touch interface system and method |
EP13186658.4A EP2717150A3 (en) | 2012-10-02 | 2013-09-30 | Synchronized audio feedback for non-visual touch interface system and method |
JP2013207390A JP6188522B2 (en) | 2012-10-02 | 2013-10-02 | Method, user interface and computer readable medium for managing user input in a user interface |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/633,806 US9411507B2 (en) | 2012-10-02 | 2012-10-02 | Synchronized audio feedback for non-visual touch interface system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140092032A1 true US20140092032A1 (en) | 2014-04-03 |
US9411507B2 US9411507B2 (en) | 2016-08-09 |
Family
ID=49518651
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/633,806 Active 2033-09-17 US9411507B2 (en) | 2012-10-02 | 2012-10-02 | Synchronized audio feedback for non-visual touch interface system and method |
Country Status (3)
Country | Link |
---|---|
US (1) | US9411507B2 (en) |
EP (1) | EP2717150A3 (en) |
JP (1) | JP6188522B2 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160031742A (en) * | 2014-09-15 | 2016-03-23 | 현대자동차주식회사 | Vehicle and controlling method thereof, and navigation |
DE102014224674A1 (en) * | 2014-12-02 | 2016-06-02 | Siemens Aktiengesellschaft | User interface and method for operating a system |
FR3039670B1 (en) * | 2015-07-29 | 2017-12-08 | Dav | METHOD AND INTERFACE OF HAPTICALLY RETURN CONTROL FOR MOTOR VEHICLE |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090273571A1 (en) * | 2008-05-01 | 2009-11-05 | Alan Bowens | Gesture Recognition |
US20110191675A1 (en) * | 2010-02-01 | 2011-08-04 | Nokia Corporation | Sliding input user interface |
US20110291954A1 (en) * | 2010-06-01 | 2011-12-01 | Apple Inc. | Providing non-visual feedback for non-physical controls |
US20120032896A1 (en) * | 2010-08-06 | 2012-02-09 | Jan Vesely | Self-service terminal and configurable screen therefor |
US20120188285A1 (en) * | 2009-11-15 | 2012-07-26 | Ram Friedlander | Enhanced pointing interface |
US20120229410A1 (en) * | 2009-12-02 | 2012-09-13 | Sony Corporation | Remote control apparatus, remote control system, remote control method, and program |
US20130120282A1 (en) * | 2010-05-28 | 2013-05-16 | Tim Kukulski | System and Method for Evaluating Gesture Usability |
US20150097786A1 (en) * | 2012-05-31 | 2015-04-09 | Nokia Corporation | Display apparatus |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5767457A (en) | 1995-11-13 | 1998-06-16 | Cirque Corporation | Apparatus and method for audible feedback from input device |
JP2003099171A (en) * | 2001-09-21 | 2003-04-04 | Sony Corp | Information processor, information processing method, recording medium, and its program |
US6703550B2 (en) | 2001-10-10 | 2004-03-09 | Immersion Corporation | Sound data output and manipulation using haptic feedback |
US6882337B2 (en) | 2002-04-18 | 2005-04-19 | Microsoft Corporation | Virtual keyboard for touch-typing using audio feedback |
FI20022282A0 (en) | 2002-12-30 | 2002-12-30 | Nokia Corp | Method for enabling interaction in an electronic device and an electronic device |
US7269484B2 (en) | 2004-09-09 | 2007-09-11 | Lear Corporation | Vehicular touch switches with adaptive tactile and audible feedback |
US7750893B2 (en) * | 2005-04-06 | 2010-07-06 | Nintendo Co., Ltd. | Storage medium storing input position processing program, and input position processing device |
US8587526B2 (en) | 2006-04-12 | 2013-11-19 | N-Trig Ltd. | Gesture recognition feedback for a dual mode digitizer |
JP2008009668A (en) * | 2006-06-29 | 2008-01-17 | Syn Sophia Inc | Driving method and input method for touch panel |
US7924271B2 (en) | 2007-01-05 | 2011-04-12 | Apple Inc. | Detecting gestures on multi-event sensitive devices |
US8769427B2 (en) * | 2008-09-19 | 2014-07-01 | Google Inc. | Quick gesture input |
US8661362B2 (en) * | 2009-03-16 | 2014-02-25 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
KR101553842B1 (en) * | 2009-04-21 | 2015-09-17 | 엘지전자 주식회사 | Mobile terminal providing multi haptic effect and control method thereof |
US8407623B2 (en) | 2009-06-25 | 2013-03-26 | Apple Inc. | Playback control using a touch interface |
US20110082618A1 (en) | 2009-10-05 | 2011-04-07 | Tesla Motors, Inc. | Adaptive Audible Feedback Cues for a Vehicle User Interface |
US8593576B2 (en) | 2009-10-15 | 2013-11-26 | At&T Intellectual Property I, L.P. | Gesture-based remote control |
JP5433375B2 (en) * | 2009-10-23 | 2014-03-05 | 楽天株式会社 | Terminal device, function execution method, function execution program, and information processing system |
US20110107216A1 (en) | 2009-11-03 | 2011-05-05 | Qualcomm Incorporated | Gesture-based user interface |
US20110163944A1 (en) | 2010-01-05 | 2011-07-07 | Apple Inc. | Intuitive, gesture-based communications with physics metaphors |
GB2477959A (en) | 2010-02-19 | 2011-08-24 | Sony Europ | Navigation and display of an array of selectable items |
US20110273379A1 (en) * | 2010-05-05 | 2011-11-10 | Google Inc. | Directional pad on touchscreen |
US20120110517A1 (en) | 2010-10-29 | 2012-05-03 | Honeywell International Inc. | Method and apparatus for gesture recognition |
JP2012123689A (en) * | 2010-12-09 | 2012-06-28 | Stanley Electric Co Ltd | Information processor having touch panel and information processing method |
US9430128B2 (en) * | 2011-01-06 | 2016-08-30 | Tivo, Inc. | Method and apparatus for controls based on concurrent gestures |
- 2012-10-02: US application US13/633,806, granted as US9411507B2 (Active)
- 2013-09-30: EP application EP13186658.4A, published as EP2717150A3 (Withdrawn)
- 2013-10-02: JP application JP2013207390A, granted as JP6188522B2 (Active)
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140108934A1 (en) * | 2012-10-15 | 2014-04-17 | Lg Electronics Inc. | Image display apparatus and method for operating the same |
US20140240262A1 (en) * | 2013-02-27 | 2014-08-28 | Samsung Electronics Co., Ltd. | Apparatus and method for supporting voice service in a portable terminal for visually disabled people |
US20150066245A1 (en) * | 2013-09-02 | 2015-03-05 | Hyundai Motor Company | Vehicle controlling apparatus installed on steering wheel |
US10394375B2 (en) | 2013-12-31 | 2019-08-27 | Immersion Corporation | Systems and methods for controlling multiple displays of a motor vehicle |
US20160216830A1 (en) * | 2013-12-31 | 2016-07-28 | Immersion Corporation | Systems and Methods for Controlling Multiple Displays With Single Controller and Haptic Enabled User Interface |
US9851838B2 (en) * | 2013-12-31 | 2017-12-26 | Immersion Corporation | Systems and methods for controlling multiple displays with single controller and haptic enabled user interface |
US20170277263A1 (en) * | 2014-07-21 | 2017-09-28 | Immersion Corporation | Systems And Methods For Determining Haptic Effects For Multi-Touch Input |
US20160018891A1 (en) * | 2014-07-21 | 2016-01-21 | Immersion Corporation | Systems And Methods For Determining Haptic Effects For Multi-Touch Input |
US9710063B2 (en) * | 2014-07-21 | 2017-07-18 | Immersion Corporation | Systems and methods for determining haptic effects for multi-touch input |
US10013063B2 (en) * | 2014-07-21 | 2018-07-03 | Immersion Corporation | Systems and methods for determining haptic effects for multi-touch input |
JP2016110422A (en) * | 2014-12-08 | 2016-06-20 | 富士通テン株式会社 | Operation device |
US20160193502A1 (en) * | 2015-01-06 | 2016-07-07 | Samsung Electronics Co., Ltd. | Method and apparatus for physical exercise assistance |
US10241755B2 (en) * | 2015-01-06 | 2019-03-26 | Samsung Electronics Co., Ltd. | Method and apparatus for physical exercise assistance |
CN105117147A (en) * | 2015-07-24 | 2015-12-02 | 上海修源网络科技有限公司 | Method and apparatus for manipulating vehicle-mounted operating system based on gesture and vehicle-mounted device |
US20170249012A1 (en) * | 2016-02-25 | 2017-08-31 | Cirque Corporation | Method of providing audible feedback on a touch sensor using haptics |
WO2017147559A1 (en) * | 2016-02-25 | 2017-08-31 | Cirque Corporation | Method of providing audible feedback on a touch sensor using haptics |
US10572031B2 (en) | 2016-09-28 | 2020-02-25 | Salesforce.Com, Inc. | Processing keyboard input to cause re-sizing of items in a user interface of a web browser-based application |
US10642474B2 (en) * | 2016-09-28 | 2020-05-05 | Salesforce.Com, Inc. | Processing keyboard input to cause movement of items in a user interface of a web browser-based application |
US11301128B2 (en) * | 2019-05-01 | 2022-04-12 | Google Llc | Intended input to a user interface from detected gesture positions |
CN111773657A (en) * | 2020-08-11 | 2020-10-16 | 网易(杭州)网络有限公司 | Method and device for switching visual angles in game, electronic equipment and storage medium |
CN112148408A (en) * | 2020-09-27 | 2020-12-29 | 深圳壹账通智能科技有限公司 | Barrier-free mode implementation method and device based on image processing and storage medium |
US20230004344A1 (en) * | 2021-06-30 | 2023-01-05 | Google Llc | Activity-Dependent Audio Feedback Themes For Touch Gesture Inputs |
US11573762B2 (en) * | 2021-06-30 | 2023-02-07 | Google Llc | Activity-dependent audio feedback themes for touch gesture inputs |
US20230297330A1 (en) * | 2021-06-30 | 2023-09-21 | Google Llc | Activity-Dependent Audio Feedback Themes for Touch Gesture Inputs |
CN114397996A (en) * | 2021-12-29 | 2022-04-26 | 杭州灵伴科技有限公司 | Interactive prompting method, head-mounted display device and computer readable medium |
Also Published As
Publication number | Publication date |
---|---|
JP6188522B2 (en) | 2017-08-30 |
US9411507B2 (en) | 2016-08-09 |
EP2717150A3 (en) | 2017-10-18 |
JP2014075130A (en) | 2014-04-24 |
EP2717150A2 (en) | 2014-04-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9411507B2 (en) | Synchronized audio feedback for non-visual touch interface system and method | |
Poupyrev et al. | Tactile interfaces for small touch screens | |
US9195321B2 (en) | Input device user interface enhancements | |
TWI352306B (en) | Touch-sensitive screen electronic apparatus and co | |
US20140098038A1 (en) | Multi-function configurable haptic device | |
US20110148774A1 (en) | Handling Tactile Inputs | |
JP2012064075A (en) | Character input device | |
KR20150042301A (en) | Multi-touch device having dynamichaptic effects | |
US10725543B2 (en) | Input device, display device, and method for controlling input device | |
JP6144947B2 (en) | Vehicle control device | |
US11194415B2 (en) | Method and apparatus for indirect force aware touch control with variable impedance touch sensor arrays | |
JP2008269456A (en) | Character input device and program for inputting character | |
JP2018128741A (en) | Control device, input system, and control method | |
JP6747941B2 (en) | Touch input device and operation detection method | |
KR20170029180A (en) | Vehicle, and control method for the same | |
CN104714684A (en) | Display apparatus for vehicle | |
TW201005605A (en) | Touch control electronic device and operating method thereof | |
JP6393604B2 (en) | Operating device | |
KR101155805B1 (en) | A device and method for inputting korean character, and mobile device using the same | |
JP7430166B2 (en) | Information processing program, information processing device, and information processing method | |
JP2018156533A (en) | Haptic presentation device | |
EP4273668A1 (en) | Tactile feedback | |
TW201109996A (en) | Method for operating a touch screen, method for defining a touch gesture on the touch screen, and electronic device thereof | |
JP2016179027A (en) | Interface program to progress game by touch input, and terminal | |
CN115344837A (en) | Password input device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MOORE, DOUGLAS A.;REEL/FRAME:029134/0988 Effective date: 20121001 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.;REEL/FRAME:039334/0296 Effective date: 20160729 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |