US20180024718A1 - Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor - Google Patents

Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor Download PDF

Info

Publication number
US20180024718A1
US20180024718A1 (Application US 15/724,285)
Authority
US
United States
Prior art keywords
finger
movement
sensor
display
input device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/724,285
Inventor
Wayne Yang
Rohini KRISHNAPURA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc
Priority to US 15/724,285
Publication of US20180024718A1
Legal status: Abandoned

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06K9/00026
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1335 Combining adjacent partial images (e.g. slices) to create a composite input or reference pattern; Tracking a sweeping finger movement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033 Indexing scheme relating to G06F3/033
    • G06F2203/0338 Fingerprint track pad, i.e. fingerprint sensor used as pointing device tracking the fingertip image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033 Indexing scheme relating to G06F3/033
    • G06F2203/0339 Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • This invention relates to input devices. More specifically, this invention relates to systems for and methods of scrolling and navigating using fingerprint sensors.
  • Fingerprint sensors find many uses, including verifying a user's identity and emulating different input devices, such as computer mice, pressure-sensitive buttons, and scroll wheels. Many sensors read finger swipes to scroll through pages, menu items, slides of images, and other displayed information. Generally, when the finger swipe stops, the scrolling stops, especially if the finger is removed from the surface of the fingerprint sensor. Using conventional scrolling techniques, a user must perform multiple swipes, with several starts and stops, to scroll through a large area. Navigating in this way is both inefficient and time consuming.
  • In a first aspect of the invention, a method includes generating motion of a computer display in response to swiping an object along a finger sensor. After the swiping is completed, the motion gradually changes. In one embodiment, the motion decelerates, such as with an inertial decay. A dampening factor of the inertial decay is related to a speed of the swiping or a duration of the swiping. Preferably, the inertial decay is calculated using a model of a joystick return-to-home motion.
  • In one embodiment, the method also includes stopping the motion in response to tapping the finger sensor after the swiping is completed.
  • The method also includes performing an action on a computer system in response to changing a pressure on the finger sensor (e.g., by tapping the sensor) after the swiping is completed. If the computer display shows an image, the action includes either zooming in on the image or zooming out from the image.
  • In one embodiment, the motion corresponds to scrolling through a list of items, rotating an image, or moving over an image.
  • The computer display shows a list of items, a region of an image, a grid menu, slides of images, a game image, or an element of a computer simulation.
  • Changing the motion includes changing a speed of the computer display in response to one or more subsequent swipes after the swiping is completed.
  • In one embodiment, the speed is increased if the one or more subsequent swipes are in the same direction as the swiping.
  • The speed is decreased if the one or more subsequent swipes are in a different direction from the swiping.
  • In one embodiment, the method also includes accelerating the motion by holding the object stationary on the finger sensor before the swiping is completed.
  • Preferably, the finger sensor is a finger swipe sensor.
  • Alternatively, the finger sensor is a finger placement sensor.
  • In a second aspect of the invention, a navigation system includes a finger sensor and a translator module.
  • The translator module is programmed for gradually changing a motion of a computer display in response to completing swiping an object across the finger sensor.
  • In one embodiment, the motion is changed by decelerating it, such as uniformly.
  • Preferably, the uniform deceleration has an inertial decay, such as one modeled on a joystick return-to-home motion.
  • In one embodiment, the motion is changed by accelerating it in response to receiving one or more swipes across the finger sensor in the same direction as the swiping. In another embodiment, the motion is changed by decelerating it in response to receiving one or more swipes across the finger sensor in the opposite direction from the swiping.
  • In one embodiment, the translator module is also programmed to single-step scroll through the computer display and to control the computer display in response to determining a change in pressure on a surface of the finger sensor.
  • In one embodiment, the motion is changed by both accelerating it and decelerating it, at different times.
  • Preferably, the translator module is also programmed to suddenly stop the motion in response to a predetermined stop motion performed across the finger sensor.
  • The predetermined stop motion is a tap or a press-and-hold motion, to name only a few possible motions.
  • In one embodiment, the translator module includes a computer-readable medium containing computer instructions that, when executed by a processor, result in gradually changing the motion, suddenly stopping the motion, or both.
  • In a third aspect of the invention, a navigation system includes a finger sensor, a movement correlator coupled to the finger sensor, an acceleration calculator coupled to the movement correlator, and multiple electronic input device emulators, each coupled to the acceleration calculator and to a computer display device.
  • The acceleration calculator is programmed to gradually accelerate, decelerate, or both, a motion of a display on a computer display device in response to completing a swipe across the finger sensor.
  • In one embodiment, the acceleration calculator is programmed to determine an inertial decay of the deceleration.
  • The multiple input device emulators include any two or more of a mouse emulator, a scroll wheel emulator, a push-button emulator, and a wheel emulator.
  • FIGS. 1A-E illustrate inertial scrolling through menu items by swiping a finger across a finger sensor in accordance with one embodiment of the present invention.
  • FIG. 2 is a graph of finger motion along a finger sensor versus time in accordance with one embodiment of the present invention.
  • FIGS. 3A-E illustrate inertial return-to-home motion of a joystick, used to control scrolling in accordance with the present invention.
  • FIG. 4 is a flow chart showing the steps for scrolling through a screen display in accordance with one embodiment of the present invention.
  • FIG. 5 shows a window translated over an image of a map in accordance with one embodiment of the present invention.
  • FIGS. 6A-E illustrate emulation of a wheel on a gaming device using inertial deceleration in accordance with one embodiment of the present invention.
  • FIG. 7 is a flow chart of the steps for using inertial decay to emulate different types of input devices in accordance with the present invention.
  • FIG. 8 is a flow chart showing the steps for scrolling through a screen display using a sudden-stop feature in accordance with one embodiment of the present invention.
  • FIGS. 9A-H each shows finger movement along a finger sensor and a corresponding graph of display speed versus time in accordance with one embodiment of the present invention.
  • FIG. 10 is a flow chart of the steps for determining additive motion in accordance with one embodiment of the present invention.
  • FIG. 11 is a block diagram of the components of a system for scrolling through a display by emulating a scroll wheel in accordance with one embodiment of the present invention.
  • FIG. 12 is a block diagram of the components of a system for navigating through displays by emulating a scroll wheel, a mouse, and a wheel in accordance with one embodiment of the present invention.
  • Embodiments of the present invention use a fingerprint sensor to control the movement of elements on a computer or other display.
  • The elements, such as items in a menu or a window overlying a map, are set in motion and then gradually come to a stop.
  • The display is visually pleasing and, more importantly, gives the user greater control when navigating through the display.
  • In one example, the elements are a list of items in a menu.
  • Rather than single-stepping through the items using conventional scrolling means, a single finger swipe sets the menu in motion before it gradually decelerates and comes to a stop. This fluid scrolling through a menu is generally more intuitive and preferred than scrolling with several starts and stops, as when using conventional scrolling implementations.
  • Preferably, the gradual deceleration simulates an inertial decay, much as the speed of a pinwheel decreases after it has been launched: Once the wheel is spinning, external interaction is no longer required to keep it going. Eventually, the wheel slows down and comes to a stop due to friction.
  • For example, a movable window enclosing a portion of an image map is navigated to overlie different regions of the map.
  • The window can be set in motion in any direction (e.g., in a northwesterly direction), toward a region or area of interest, before gradually decelerating.
  • Preferably, the inertial deceleration is modeled after a joystick with a dampened return to its home position, or origin.
  • In this implementation, finger movement, as computed in the traditional manner, translates to movement of the joystick head, which is then translated into a motion, such as a scrolling motion.
  • The position of the joystick head, as well as the current acceleration state of the motion model, dictates the speed with which the scrolling or other motion occurs.
  • When a finger is lifted from the fingerprint sensor, the joystick head returns to its origin or home position.
  • In accordance with other embodiments of the present invention, an "additive" attribute of scrolling and other motion is implemented. For example, consecutive finger swipes across a surface of a finger sensor in the same direction will, with each swipe, increase the speed of a computer display. Swiping in the opposite direction will slow the motion or even bring it to a stop. Swiping in the opposite direction thus functions as a drag on the motion.
  • FIGS. 1A-E show a finger swipe sensor 105 and a display device 125 at a sequence of times T1-T5, respectively.
  • To better describe embodiments of the present invention, the sequence is at regular intervals; that is, the differences T2−T1, T3−T2, T4−T3, and T5−T4 are the same.
  • The display device 125 shows a menu 120 of names; the swipe sensor 105 is used to scroll through the menu 120.
  • In this embodiment, the swipe sensor 105 is used to emulate a scroll wheel.
  • As shown in FIG. 1A, at time T1 a finger 101 is swiped along a surface of the swipe sensor 105.
  • FIG. 1A shows a horizontal line above the finger 101, having an arrow indicating the direction of the swipe, and a vertical line next to the menu 120, having an arrow indicating the direction of the scrolling.
  • The vertical line has a thickness corresponding to the speed with which the menu 120 scrolls: A thicker vertical line indicates that the menu 120 is scrolling faster than when the menu 120 is adjacent to a thinner vertical line.
  • FIG. 1B shows the finger 101 at time T2.
  • As shown in FIG. 1B, the thickness of the vertical line indicates that the menu 120 scrolls faster than it did at time T1.
  • At time T3, shown in FIG. 1C, the finger 101 has left the finger sensor 105, but the menu 120 continues to scroll, though slower than at time T2.
  • At time T4, the menu 120 continues to scroll, but slower than it did at time T3.
  • At time T5, the menu 120 has stopped scrolling.
  • In this example, after the swiping is completed (at time T2), the menu 120 gradually slows to a stop, preferably simulating a spring's critically damped or over-damped motion or a motion corresponding to a joystick's return-to-home motion.
  • When simulating this joystick motion, the rate at which the joystick head returns to the home position is influenced by a dampening factor, such as the weight of the joystick.
  • The dampening factor acts as an inertial decay factor. By using different decay factors, different inertial scrolling behavior can be achieved.
  • The inertial scrolling can be customized by mapping several factors (including, but not limited to, the dampening factor and the acceleration factor) to the axial displacement of the joystick.
  • FIG. 2 is a graph 130 plotting the speed at which the menu 120 of FIGS. 1A-E scrolls. Throughout this Specification, identical labels refer to identical elements.
  • In the graph 130, speed is plotted on the vertical axis 135, and time is plotted on the horizontal axis 140.
  • For each time T0-T5, the graph 130 shows an "x" to indicate when the finger 101 is on the sensor 105 and a "◯" to indicate when the finger 101 is off the sensor 105.
  • As shown in FIG. 2, the finger 101 first touches the sensor 105 at time T0.
  • The finger 101 then moves along the sensor 105 at a speed that increases from time T1 to time T2, when the finger 101 is removed from the sensor 105.
  • From time T2 to time T5, the speed with which the menu 120 scrolls gradually decreases until it stops at time T5.
  • Preferably, a joystick return-to-home motion is modeled to generate signals used to gradually decelerate the display movement.
  • As one example, FIGS. 3A-E show a joystick 200 whose return-to-home motion is emulated.
  • FIGS. 3A-E show the joystick 200 at the sequential times T1-T5, respectively.
  • Referring to FIGS. 3A-E, the speed with which the joystick 200 moves a displayed object is directly proportional to the angle θ that the joystick 200 makes with an axis 220 perpendicular to the joystick base, much like a throttle.
  • FIG. 3A shows the joystick 200 making an angle θ1 with the axis 220, corresponding to the movement of the menu display 120 shown in FIG. 1A.
  • Similarly, FIGS. 3B-E show the joystick making angles θ2-θ5, respectively, corresponding to the movement of the menu display 120 shown in FIGS. 1B-E, respectively.
  • In this example, θ5 = 0, indicating that the menu display 120 is not moving.
  • The return-to-home motion of the joystick 200 after it is released (FIGS. 3B-E) depends on several parameters, such as the angle θ2 at which the joystick 200 is released and the mass of a head 210 of the joystick 200, to name only a few.
  • the angle ⁇ i is given by Equation 1:
  • the angle ⁇ i is directly mapped to a distance a menu item is from a point of reference.
  • the distance is the distance of the head 211 from the axis 220 .
  • x length of the joystick (L)*sin( ⁇ i ).
  • the linear speed of the distance x (dx/dt) is given by Equation 2:
  • Equations 1 and 2 together are used to map a joystick (angular) damped deceleration to a scrolling (linear) damped deceleration.
  • a uniform return-to-home deceleration motion is mapped to a scrolling deceleration motion.
  • Equations 1 and 2 are only one example of a function used to calculate the angle ⁇ i at time t (e.g., each of the times T 1 -T 5 ) and thus a rate of uniform deceleration.
  • FIG. 4 is a flow chart showing the steps of a process 300 for determining inertial deceleration corresponding to a joystick return-to-home motion and using that deceleration to control the scrolling of a menu in accordance with the present invention.
  • parameters such as ⁇ (Equation 1)
  • Equation 1
  • Next, in the step 303, the movement of the finger 101 along a surface of the finger swipe sensor 105 is computed.
  • Preferably, this movement is determined by correlating patterns on the surface of the finger 101 captured at sequential times (e.g., T1 and T2) to determine the speed and direction of the finger swipe.
  • The patterns are formed by the locations of bifurcations, pores, ridge endings, swirls, whorls, and other fingerprint minutiae. Correlating fingerprint images is taught in U.S. Pat. No. 7,197,168, filed Jul. 12, 2002, and titled "Method and System for Biometric Image Assembly from Multiple Partial Biometric Frame Scans," which is incorporated by reference in its entirety. It will be appreciated that other objects with patterned images, such as patterned styluses, can also be swiped across a finger sensor to scroll through menus in accordance with the present invention.
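  • As an illustration of this correlation step, the following is a minimal sketch that estimates the displacement between two successive sensor frames from the peak of their 2-D cross-correlation. It is an assumption about one reasonable implementation, not the specific method of the incorporated patent; the function name and frame handling are illustrative.

```python
import numpy as np

def estimate_displacement(prev_frame: np.ndarray, curr_frame: np.ndarray):
    """Estimate the (dy, dx) finger displacement between two sensor frames
    by locating the peak of their circular cross-correlation."""
    # Zero-mean the frames so the peak reflects pattern overlap, not brightness.
    a = prev_frame - prev_frame.mean()
    b = curr_frame - curr_frame.mean()
    # Cross-correlate via the FFT for speed; the result is circular.
    corr = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
    peak_y, peak_x = np.unravel_index(np.argmax(corr), corr.shape)
    # Map the peak location into signed offsets.
    dy = peak_y if peak_y < a.shape[0] // 2 else peak_y - a.shape[0]
    dx = peak_x if peak_x < a.shape[1] // 2 else peak_x - a.shape[1]
    return int(dy), int(dx)
```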
  • Next, in the step 305, the finger movement is translated into joystick movement, and in the step 307, the new joystick movement is calculated.
  • Next, in the step 309, inertial/acceleration factors based on the joystick position are updated.
  • In the step 311, the joystick position is translated into a scrolling motion by applying the acceleration factors, and in the step 313, the scrolling motion is used to scroll the menu 120.
  • In the step 315, the process determines whether the finger 101 is still on the sensor 105. If the finger 101 is still on the sensor 105, the process loops back to the step 303; otherwise, the process continues to the step 317, in which it applies the inertial factors to determine the deceleration.
  • These inertial factors can be based on the speed of the swipe when it is completed, the duration of the swipe, the length of the swipe, or some combination of these. For example, if the speed of the swipe is fast or the duration of the swipe is long, the inertial factors result in a slower deceleration. This result corresponds to a large momentum being imparted to a body.
  • From the step 317, the process continues to the step 319, in which it determines whether the finger 101 is back on the sensor 105. If the finger 101 is back on the sensor 105, the process loops back to the step 303; otherwise, the process loops back to the step 307.
  • FIG. 5 illustrates the finger sensor 105 used to control a display 450 on the display device 125 .
  • The display 450 is an image map overlaid by a movable window 410, which encloses portions of the image map 450. Swiping the finger 101 over the finger sensor 105 in the direction indicated by the arrow 475 causes the window 410 to move or translate in a corresponding direction over the image map 450.
  • In this embodiment, the finger sensor 105 emulates a mouse or a track ball.
  • The window 410 moves from the location 425A at time T1, to the location 425B at time T2, to the location 425C at time T3, to the location 425D at time T4, and finally to the location 425E at time T5, where it stops.
  • The thicknesses of the arrows joining adjacent locations (e.g., the arrow connecting the window 410 at locations 425A and 425B) indicate the speed with which the window 410 moves.
  • The window 410 decelerates from time T2 to T5, preferably in an inertial manner.
  • It will be appreciated that deceleration can be determined in other ways.
  • For example, deceleration can be determined from a look-up table that maps the current speed to a display speed for each sequential time.
  • In one embodiment, the table stores scaling factors used for mapping the current speed to subsequent speeds.
  • As one example, the table stores 10 scaling factors for 10 corresponding time cycles. If the current scrolling speed is 9 frames per second (fps) and the next scaling factor is 0.7, the speed during the next second of scrolling is the current speed times the next scaling factor (9 fps × 0.7), or 6.3 fps. This table lookup continues until the last scaling factor (0.0) stops the scrolling. A short sketch of this table-driven decay appears below.
  • Linear, non-linear, step-wise (e.g., the speed is decreased, maintained over a time segment, and decreased again, with the sequence continuing until scrolling stops), and other types of decay can be used to control scrolling.
  • Deceleration in accordance with the present invention can be uniform or non-uniform. Different types of deceleration can be used to fit the application at hand. Indeed, uniform deceleration can be used over one time interval and non-uniform deceleration over another time interval.
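  • The following is a minimal sketch of such table-driven deceleration, assuming ten per-cycle scaling factors as in the example above; the particular factor values are illustrative, not taken from the patent.

```python
# Illustrative scaling factors for ten time cycles; the final 0.0 stops scrolling.
SCALE_FACTORS = [0.9, 0.85, 0.8, 0.75, 0.7, 0.6, 0.5, 0.35, 0.2, 0.0]

def decayed_speeds(initial_speed_fps: float) -> list:
    """Return the display speed after each time cycle once the finger lifts."""
    speeds = []
    speed = initial_speed_fps
    for factor in SCALE_FACTORS:
        # Each cycle multiplies the current speed by the next factor
        # (e.g. 9 fps * 0.7 = 6.3 fps for that cycle).
        speed *= factor
        speeds.append(speed)
    return speeds

# Example: starting from 9 fps, print the speed for each subsequent cycle.
print(decayed_speeds(9.0))
```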
  • FIGS. 6A-E show another example, in which modeling inertial decay is used to decelerate a different display, a computer-simulated gaming wheel 500, such as a roulette wheel.
  • FIGS. 6A-E show the gaming wheel 500 at the times T1-T5, respectively, controlled using a finger sensor (not shown).
  • When a finger is swiped along the finger sensor, the gaming wheel 500 is turned in the same direction.
  • After the finger is removed from the finger sensor (at the time shown in FIG. 6B), the gaming wheel 500 continues to rotate, but at a rate that has an inertial decay in accordance with the present invention.
  • The widths and arrows of the curved lines next to the gaming wheel 500 indicate the speed and direction, respectively, with which the gaming wheel 500 is rotating.
  • FIG. 7 shows the steps of a process 600 for generally moving a computer display by using a finger sensor to emulate any number of electronic input devices in accordance with the present invention.
  • The electronic input devices include, but are not limited to, a scroll wheel, a mouse, a wheel, a track ball, a push button, and a joystick.
  • In the start step, parameters, such as a dampening factor, are initialized.
  • Next, in the step 603, a finger movement along the finger sensor is determined.
  • Next, in the step 605, the movement is translated into a motion of the emulated device. This motion can be the movement of a joystick, a scrolling motion, a translation motion (such as of a window over a map), or a button press, to name only a few.
  • Next, in the step 607, the position and movement parameters of the emulated device are updated.
  • Movement parameters include acceleration and direction. These parameters are used to determine the direction that is to be taken (e.g., continued) when the finger no longer touches the finger sensor.
  • The acceleration can include gradual (uniform or non-uniform) deceleration.
  • Next, the display (e.g., a list of menu items) is moved in accordance with the motion of the emulated electronic input device.
  • Next, the process determines whether the finger is still on the finger sensor. If the finger is still on the finger sensor, the process loops back to the step 603. Otherwise, the process continues to the step 613, in which it determines whether the display has stopped moving. If the display has not stopped moving, the process loops back to the step 603. Otherwise, the process continues to the step 615, in which it ends.
  • In some embodiments, a sudden-stop feature instantly stops the inertial movement (e.g., scrolling) with a fresh touch of the finger sensor 105.
  • With this feature, a user can quickly change the scrolling direction without having to wait for the scrolling to stop.
  • This not only allows for greater ease of use but also allows quick turnaround of fresh movements in other directions. With this feature, there is no need to generate extra movement to overcome the current inertia before shifting the direction of motion.
  • FIG. 8 shows the steps of a process 700 incorporating the sudden-stop feature when scrolling through the menu 120 in FIGS. 1A-E .
  • In the start step, parameters, such as a dampening factor, are initialized.
  • Next, in the step 703, the process reads the movement of the finger 101 along a surface of the sensor 105.
  • Next, the menu 120 is scrolled in a manner corresponding to the finger movement.
  • Next, the process determines whether the finger 101 is still contacting the sensor 105. If the finger 101 is still contacting the sensor 105, the process loops back to the step 703; otherwise, the process continues to the step 709.
  • In the step 709, the process decelerates the scrolling based on the inertial factors, such as described above.
  • Next, the process determines whether the scrolling has stopped. If the scrolling has stopped, the process loops back to the step 703; otherwise, the process continues to the step 713, in which it determines whether the finger 101 is again contacting the sensor 105. If the finger 101 is not again contacting the sensor 105, the process loops back to the step 703; otherwise, the process continues to the step 715.
  • In the step 715, the process determines whether the sensor 105 was tapped quickly, thereby triggering a sudden stop. As one example, the process determines that the sensor 105 was tapped quickly if the finger 101 next contacts the sensor 105 at a time TA and is removed at a time TB, where TB − TA < 5 ms. Those skilled in the art will recognize other ways of defining and later recognizing a tap as "quick." If, in the step 715, the process determines that the tap is quick, the process continues to the step 717, in which the scrolling is suddenly stopped; otherwise, the process loops back to the step 703.
  • Although FIG. 8 describes scrolling based on inertial factors, the sudden-stop feature can also be used with scrolling and other motions that decelerate in other ways, including non-uniformly.
  • In an alternative embodiment, the sudden-stop feature is triggered by contacting the sensor 105 and maintaining the contact for a predetermined time, such as one or two seconds.
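  • The following is a minimal sketch of the sudden-stop behavior, assuming the quick-tap criterion described above; the thresholds, class name, and event handlers are illustrative assumptions rather than elements of the patent.

```python
import time

TAP_THRESHOLD_S = 0.005    # "quick" tap, per the 5 ms example above
HOLD_THRESHOLD_S = 1.0     # press-and-hold alternative (one to two seconds)

class InertialScroller:
    """Tracks a coasting scroll speed and zeroes it on a qualifying stop gesture."""
    def __init__(self):
        self.speed = 0.0
        self._touch_down_at = None

    def on_finger_down(self):
        self._touch_down_at = time.monotonic()

    def on_finger_up(self):
        if self._touch_down_at is None:
            return
        contact_s = time.monotonic() - self._touch_down_at
        self._touch_down_at = None
        # A fresh quick tap (or a sufficiently long press) while the display
        # is still coasting triggers the sudden stop.
        if self.speed > 0.0 and (contact_s < TAP_THRESHOLD_S or contact_s >= HOLD_THRESHOLD_S):
            self.speed = 0.0
```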
  • Embodiments of the present invention are also able to accelerate or decelerate motion of a computer display.
  • For example, consecutive finger swipes in the same direction accelerate the motion.
  • Swiping in one direction followed by a swipe in the opposite direction decelerates the motion.
  • FIGS. 9A-H illustrate how the motion of the display 120 (FIGS. 1A-E) is accelerated by swiping the finger 101 multiple times along the finger sensor 105 over a sequence of increasing times T0-T7, respectively.
  • Each of FIGS. 9A-H depicts a graph 150 plotting the speed of the display 120 (on the vertical axis, labeled "v1" to "v7") versus time. Each occurrence of the graph 150 identifies the current speed by the label 155.
  • FIGS. 9D-H also label the speed at the immediately preceding time with an "x," tracing changes in velocity with dotted lines. As shown in FIGS. 9A-H, swiping the finger sensor 105 multiple times increases the speed of the display 120. Increasing speed in this way is referred to as "additive" motion.
  • Moving the finger 101 across the finger sensor 105 from time T0 to T2 causes the speed of the display 120 to increase from 0 to v4.
  • After the finger 101 is removed, the speed decreases.
  • The finger 101 is then swiped a second time (from time T5 to T7, FIGS. 9F-H). Because this second swipe begins when the display 120 is already in motion, the swipe results in a greater speed than the initial swipe (from time T0 to T2).
  • In other words, the speed of the display 120 increases with the number of swipes and also with the total distance traveled by the swipes.
  • For example, swiping the finger 101 along the finger sensor 105 five times will move the display 120 faster than if the finger 101 were swiped four times.
  • Likewise, swiping the finger 101 five times over a total distance of five inches will result in faster motion than swiping the finger 101 five times over a total distance of four inches.
  • Although FIGS. 9A-C and 9G-H show a constant acceleration (e.g., the graph 150 has a constant slope during the corresponding time periods), other types of acceleration can be attained in accordance with the present invention. Some examples include exponential acceleration, with or without a maximum value, and step-wise acceleration, to name only two types. Furthermore, acceleration can be determined using a look-up table, such as one having scaling factors with values larger than one. Using the table entries of one such example, the speed is multiplied by the scaling factors 1.1, 1.5, and 2.0 in sequential time intervals.
  • As illustrated, the speed decreases from T2 to T5 with an inertial decay, in accordance with one embodiment of the present invention. It will be appreciated that, in accordance with other embodiments, the speed can decrease from T2 to T5 in other ways, both uniform and non-uniform.
  • In another embodiment, motion is accelerated by swiping and then holding a finger or other object on a finger sensor.
  • An initial swipe will start accelerating a display (e.g., the display 120 in FIGS. 1A-E).
  • Next, the finger is held stationary, or nearly stationary, on the finger sensor.
  • The display will continue to accelerate while the finger is held in place. The longer the finger is held in place, the faster the display moves, until a maximum speed (peak threshold) is reached. After the display reaches the desired speed, the finger is either removed or moved farther to complete the swipe.
  • It will be appreciated that the sudden-stop feature can be implemented to suddenly stop the additive motion.
  • Indeed, the sudden-stop feature, the additive motion, and the deceleration in accordance with the present invention can be combined in any combination.
  • FIG. 10 shows the steps of a process 800 for determining additive motion, such as additive scrolling, in accordance with one embodiment of the present invention.
  • Many of the steps in the process 800 are similar to the steps in the process 600 , shown in FIG. 7 , and are similarly labeled. To simplify the discussion, the common steps will not be discussed here.
  • In the step 613, the process determines whether the display has stopped moving. If it has not, the process continues to the step 617, in which it determines whether a new (e.g., consecutive or sequential) swipe has occurred. If a new swipe has not occurred, the process loops back to the step 607; otherwise, the process loops back to the step 603.
  • If, in the step 603, the process determines that the finger was swiped in the same direction as during the immediately preceding swipe, the process later updates the position and movement parameters in the step 607 to accelerate the display motion. On the other hand, if, in the step 603, the process determines that the finger was swiped in a direction opposite to that of the immediately preceding swipe, the process later updates the position and movement parameters in the step 607 to decelerate the display motion.
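  • The following is a minimal sketch of this additive behavior, in which each completed swipe adds to (or drags against) the current display speed and the speed otherwise decays each cycle; the gain and decay constants are illustrative assumptions, not values from the patent.

```python
SWIPE_GAIN = 1.0        # how strongly a swipe contributes to the display speed
DECAY_PER_CYCLE = 0.9   # inertial decay applied between swipes

class AdditiveMotion:
    def __init__(self):
        self.speed = 0.0   # display speed in the current scrolling direction

    def on_swipe(self, swipe_speed: float, same_direction: bool):
        delta = SWIPE_GAIN * swipe_speed
        # Swipes in the same direction add speed; an opposing swipe acts as a
        # drag and can at most bring the motion to a stop.
        self.speed = max(0.0, self.speed + (delta if same_direction else -delta))

    def on_cycle(self) -> float:
        # Between swipes the speed decays toward zero, emulating inertia.
        self.speed *= DECAY_PER_CYCLE
        if self.speed < 1e-3:
            self.speed = 0.0
        return self.speed
```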
  • FIG. 11 is a block diagram of a system 900 used to emulate a scroll wheel using decay in accordance with the present invention.
  • The system 900 includes the finger sensor 105 coupled to a translation module 925, which translates finger movements into scroll-wheel signals for scrolling the menu 120 on the display device 125, as shown in FIGS. 1A-E.
  • The translation module 925 includes a movement correlator 910 and a motion translator 915.
  • The movement correlator 910 correlates images sequentially captured by the finger sensor 105 and determines finger movement, such as in the step 303 of FIG. 4.
  • The motion translator 915 receives the finger movement, translates the finger movement into joystick movement (step 305, FIG. 4), calculates the new joystick movement (step 307, FIG. 4), updates the inertial/acceleration factors based on the joystick position (step 309, FIG. 4), translates the joystick position into a scrolling motion by applying the acceleration factors (step 311, FIG. 4), and scrolls the menu accordingly (step 313, FIG. 4).
  • In one embodiment, both of the elements 910 and 915 include a computer-readable medium containing instructions that cause a processor to perform the steps of FIG. 4.
  • In other embodiments, the elements include software, hardware, firmware, or any combination of these.
  • It will be appreciated that the steps shown in FIG. 4 can be distributed among the components 910 and 915 in different ways, or among other components, such as those described in relation to FIG. 12, below.
  • In some embodiments, the components 910 and 915 are also configured to implement the sudden-stop feature described in FIG. 8, the additive-motion feature described in FIG. 10, or both.
  • In some embodiments, a motion translator is used to provide inertial deceleration when emulating multiple different input devices.
  • For example, inertial deceleration is used to gradually decelerate movement corresponding to a scroll wheel, a wheel (e.g., a roulette wheel), and a mouse.
  • In this way, a single acceleration/deceleration module (such as one simulating deceleration, acceleration, and a sudden-stop feature) is shared among several device emulators.
  • FIG. 12 illustrates a system 950 that emulates several electronic input devices, all of which use acceleration/deceleration in accordance with the present invention.
  • The system 950 includes the elements 105 and 125, described above.
  • The system 950 also includes a translation module 980, which includes the movement correlator 910 and an acceleration/deceleration calculator 975.
  • The acceleration/deceleration calculator 975 is coupled to a scroll wheel emulator 915, a wheel emulator 917, and a mouse emulator 919, all of which are coupled to a switch 985, which in turn is coupled to the display device 125.
  • The switch 985 routes to the display device 125 the output of the emulator (i.e., 915, 917, or 919) corresponding to the device currently being emulated.
  • Device emulation using fingerprint sensors is discussed in U.S. Pat. No. 7,474,772, filed Jun. 21, 2004, and titled “System and Method for a Miniature User Input Device,” which is incorporated by reference in its entirety.
  • Referring to FIGS. 7 and 12, the acceleration/deceleration calculator 975 performs the step 607, determining the inertial decay from the position and movement parameters.
  • When the finger sensor 105 is used to emulate a scroll wheel, the step 605 is performed by the scroll wheel emulator 915.
  • Similarly, when the finger sensor 105 is used to emulate a wheel, device movement is determined by the wheel emulator 917.
  • When the finger sensor 105 is used to emulate a mouse, device movement is determined by the mouse emulator 919.
  • The elements 915, 917, and 919 can all be implemented using any combination of hardware, software, firmware, or computer-readable media for controlling the operation of a processor.
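  • The following is a minimal sketch of the arrangement of FIG. 12, in which one movement correlator and one shared acceleration/deceleration calculator feed whichever device emulator is currently selected, and a switch routes that emulator's output to the display. The class and method names are illustrative assumptions, not taken from the patent.

```python
class TranslationModule:
    """Shares a single acceleration/deceleration calculator among several emulators."""
    def __init__(self, correlator, accel_calc, emulators, display):
        self.correlator = correlator    # movement correlator (element 910)
        self.accel_calc = accel_calc    # acceleration/deceleration calculator (element 975)
        self.emulators = emulators      # e.g. {"scroll_wheel": ..., "wheel": ..., "mouse": ...}
        self.display = display          # display device (element 125)
        self.active = "scroll_wheel"    # device currently being emulated

    def select_device(self, name: str):
        # Acts like the switch 985: chooses which emulator drives the display.
        self.active = name

    def on_frames(self, prev_frame, curr_frame):
        movement = self.correlator.correlate(prev_frame, curr_frame)
        motion = self.accel_calc.update(movement)           # applies inertial decay, etc.
        events = self.emulators[self.active].translate(motion)
        self.display.apply(events)
```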
  • Embodiments of the invention are also able to control a display in response to pressing a surface of a finger sensor, such as by a finger tap.
  • In various embodiments, movement having an inertial decay in accordance with the present invention is used to move through a list of items, to zoom in on or zoom out from an image of a city map, to select from a large grid menu, and to control an arcade game, such as billiards, that provides virtual realism in terms of movement.
  • As one example, the user presses the surface of the finger sensor 105 to select a highlighted name, such as the topmost name in the menu display 120.
  • Contact information for the highlighted name is then immediately presented on the display device 125.
  • As another example, tapping the finger sensor 105 in a predetermined manner (e.g., one quick tap) zooms in on the portion of the image within the window 410; tapping the finger sensor 105 in another predetermined manner (e.g., two quick taps in succession or a tap-and-hold motion) zooms out from the same portion of the image. A sketch of one way to classify these tap patterns follows.
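  • The following is a minimal sketch of classifying tap patterns into zoom actions, assuming the gestures described above; the time thresholds and action names are illustrative assumptions.

```python
DOUBLE_TAP_WINDOW_S = 0.3   # maximum gap between two taps of a double tap
HOLD_S = 0.8                # minimum contact time for a tap-and-hold

def classify_tap(contact_s: float, since_last_tap_s: float) -> str:
    """Map a tap's contact time and spacing to a zoom action."""
    if contact_s >= HOLD_S:
        return "zoom_out"        # tap-and-hold
    if since_last_tap_s <= DOUBLE_TAP_WINDOW_S:
        return "zoom_out"        # second of two quick taps in succession
    return "zoom_in"             # single quick tap
```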
  • In one embodiment, a system is configured to perform both single-step (non-inertial) scrolling and inertial scrolling through adjustment of the dampening factor based upon the context of recent movement. If the recent movement is indicative of slow or single-step scrolling, then the dampening factor is adjusted substantially, resulting in what is effectively non-inertial scrolling (e.g., the joystick reverts to the zero or home position nearly instantly). As one example, the system determines that recent movement indicates a preference for slow or single-step scrolling when a user implements the sudden-stop feature several consecutive times. In response, the system adjusts the damping factor (e.g., in the step 309) to ensure that the return-to-home motion is fast, approaching a single-step mode.
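  • The following is a minimal sketch of this context-sensitive adjustment, assuming the Equation 1 model in which a larger damping term drives the joystick back to home faster; the streak length and factor values are illustrative assumptions.

```python
SUDDEN_STOP_STREAK = 3   # consecutive sudden stops read as a single-step preference

def choose_damping(recent_sudden_stops: int,
                   inertial_damping: float = 1.0,
                   single_step_damping: float = 20.0) -> float:
    """Pick the damping term used in the return-to-home model (Equation 1).

    With theta_i = theta_1 * exp(-(omega + K) * t), a larger omega returns the
    joystick to home almost instantly, approaching single-step scrolling.
    """
    if recent_sudden_stops >= SUDDEN_STOP_STREAK:
        return single_step_damping
    return inertial_damping
```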
  • In use, a user swipes or traces a finger on a finger sensor to set a display in motion.
  • The system determines the direction of the swipe and other parameters, such as the duration of the swipe or the acceleration of the swipe.
  • The user can accelerate the motion by again swiping in the same direction as before, or she can decelerate the motion by swiping in a different direction.
  • After the finger is lifted, the display continues in the same direction before slowing down. This motion provides the user with a more pleasurable viewing experience as the display rolls to a smooth stop.
  • The user also has greater control over moving the display. Later, the user can tap the finger sensor to trigger an action, such as selecting a highlighted object.
  • In some embodiments, a finger sensor, computing elements, and a display device are integrated onto a single device, such as a mobile phone, personal digital assistant, or portable gaming device.
  • In other embodiments, a system in accordance with the present invention includes separate components, such as a swipe sensor, a display screen, and a host computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An emulation system receives a swipe along a finger sensor to set a computer display in motion. After the swipe is completed, the display continues along its previous path. Depending on their direction, subsequent swipes can be used to accelerate or decelerate the motion. Gradually, the display decelerates. In one embodiment, this deceleration simulates an inertial decay, providing the user with a pleasing display that gradually rolls to a stop. The deceleration is modeled on a joystick return-to-home inertial decay, allowing the user greater control when navigating over the display. The finger sensor is used to emulate different electronic devices, such as a mouse, a scroll wheel, and a rotating wheel.

Description

    RELATED APPLICATION
  • This application claims priority under 35 U.S.C. §119(e) of the co-pending U.S. provisional patent application Ser. No. 61/065,751, filed Feb. 13, 2008, and titled “System for Providing Inertial Scrolling/Navigation Using a Fingerprint Sensor,” which is hereby incorporated by reference.
  • FIELD OF THE INVENTION
  • This invention relates to input devices. More specifically, this invention relates to systems for and methods of scrolling and navigating using fingerprint sensors.
  • BACKGROUND OF THE INVENTION
  • Fingerprint sensors find many uses, including verifying a user's identity and emulating different input devices, such as computer mice, pressure-sensitive buttons, and scroll wheels. Many sensors read finger swipes to scroll through pages, menu items, slides of images, and other displayed information. Generally, when the finger swipe stops, the scrolling stops, especially if the finger is removed from the surface of the fingerprint sensor. Using conventional scrolling techniques, a user must perform multiple swipes, with several starts and stops, to scroll through a large area. Navigating in this way is both inefficient and time consuming.
  • SUMMARY OF THE INVENTION
  • In a first aspect of the invention, a method includes generating motion of a computer display in response to swiping an object along a finger sensor. After the swiping is completed, the motion gradually changes. In one embodiment, the motion decelerates, such as with an inertial decay. A dampening factor of the inertial decay is related to a speed of the swiping or a duration of the swiping. Preferably, the inertial decay is calculated using a model of a joystick return-to-home motion.
  • In one embodiment, the method also includes stopping the motion in response to tapping the finger sensor after the swiping is completed. The method also includes performing an action on a computer system in response to changing a pressure on the finger sensor (e.g., by tapping the sensor) after the swiping is completed. If the computer display shows an image, the action includes either zooming in on the image or zooming out from the image.
  • In one embodiment, the motion corresponds to scrolling through a list of items, rotating an image, or moving over an image. The computer display shows a list of items, a region of an image, a grid menu, slides of images, a game image, or an element of a computer simulation.
  • Changing the motion includes changing a speed of the computer display in response to one or more subsequent swipes after the swiping is completed. In one embodiment, the speed is increased if the one or more subsequent swipes are in the same direction as the swiping. The speed is decreased if the one or more subsequent swipes are in a different direction from the swiping.
  • In one embodiment, the method also includes accelerating the motion by holding the object stationary on the finger sensor before the swiping is completed.
  • Preferably, the finger sensor is a finger swipe sensor. Alternatively, the finger sensor is a finger placement sensor.
  • In a second aspect of the invention, a navigation system includes a finger sensor and a translator module. The translator module is programmed for gradually changing a motion of a computer display in response to completing swiping an object across the finger sensor. In one embodiment, the motion is changed by decelerating it, such as uniformly. Preferably, the uniform deceleration has an inertial decay, such as one modeled on a joystick return-to-home motion.
  • In one embodiment, the motion is changed by accelerating it in response to receiving one or more swipes across the finger sensor in the same direction as the swiping. In another embodiment, the motion is changed by decelerating it in response to receiving one or more swipes across the finger sensor in the opposite direction from the swiping.
  • In one embodiment, the translator module is also programmed to single-step scroll through the computer display and to control the computer display in response to determining a change in pressure on a surface of the finger sensor.
  • In one embodiment, the motion is changed by both accelerating it and decelerating it, at different times.
  • Preferably, the translator module is also programmed to suddenly stop the motion in response to a predetermined stop motion performed across the finger sensor. The predetermined stop motion is a tap or a press-and-hold motion, to name only a few possible motions.
  • In one embodiment, the translator module includes a computer-readable medium containing computer instructions that, when executed by a processor, result in gradually changing the motion, suddenly stopping the motion, or both.
  • In a third aspect of the invention, a navigation system includes a finger sensor, a movement correlator coupled to the finger sensor, an acceleration calculator coupled to the movement correlator, and multiple electronic input device emulators, each coupled to the acceleration calculator and to a computer display device. The acceleration calculator is programmed to gradually accelerate, decelerate, or both, a motion of a display on a computer display device in response to completing a swipe across the finger sensor. In one embodiment, the acceleration calculator is programmed to determine an inertial decay of the deceleration. The multiple input device emulators include any two or more of a mouse emulator, a scroll wheel emulator, a push-button emulator, and a wheel emulator.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1A-E illustrate inertial scrolling through menu items by swiping a finger across a finger sensor in accordance with one embodiment of the present invention.
  • FIG. 2 is a graph of finger motion along a finger sensor versus time in accordance with one embodiment of the present invention.
  • FIGS. 3A-E illustrate inertial return-to-home motion of a joystick, used to control scrolling in accordance with the present invention.
  • FIG. 4 is a flow chart showing the steps for scrolling through a screen display in accordance with one embodiment of the present invention.
  • FIG. 5 shows a window translated over an image of a map in accordance with one embodiment of the present invention.
  • FIGS. 6A-E illustrate emulation of a wheel on a gaming device using inertial deceleration in accordance with one embodiment of the present invention.
  • FIG. 7 is a flow chart of the steps for using inertial decay to emulate different types of input devices in accordance with the present invention.
  • FIG. 8 is a flow chart showing the steps for scrolling through a screen display using a sudden-stop feature in accordance with one embodiment of the present invention.
  • FIGS. 9A-H each shows finger movement along a finger sensor and a corresponding graph of display speed versus time in accordance with one embodiment of the present invention.
  • FIG. 10 is a flow chart of the steps for determining additive motion in accordance with one embodiment of the present invention.
  • FIG. 11 is a block diagram of the components of a system for scrolling through a display by emulating a scroll wheel in accordance with one embodiment of the present invention.
  • FIG. 12 is a block diagram of the components of a system for navigating through displays by emulating a scroll wheel, a mouse, and a wheel in accordance with one embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • Embodiments of the present invention use a fingerprint sensor to control the movement of elements on a computer or other display. The elements, such as items in a menu or a window overlying a map, are set in motion and then gradually come to a stop. The display is visually pleasing and, more importantly, gives the user greater control when navigating through the display.
  • In one example, the elements are a list of items in a menu. Rather than single-stepping through the items, using conventional scrolling means, a single finger swipe sets the menu in motion before it gradually decelerates and comes to a stop. This fluid scrolling through a menu is generally more intuitive and preferred than scrolling with several starts and stops as when using conventional scrolling implementations.
  • Preferably, the gradual deceleration simulates an inertial decay, much as the speed of a pinwheel decreases after it has been launched: Once the wheel is spinning, external interaction is no longer required to keep it going. Eventually, the wheel slows down and comes to a stop due to friction.
  • Many displayed applications benefit from this simulation of inertial decay. For example, a movable window enclosing a portion of an image map is navigated to overlie different regions of the map. The window can be set in motion in any direction (e.g., in a northwesterly direction), toward a region or area of interest, before gradually decelerating.
  • Preferably, the inertial deceleration is modeled after a joystick with a dampened return to its home or origin. In this implementation, finger movement, as computed in the traditional manner, translates to movement of the joystick head, which is then translated to a motion, such as a scrolling motion. The position of the joystick head, as well as the current acceleration state of the motion model, dictates the speed with which the scrolling or other motion occurs. When a finger is lifted from the fingerprint sensor, the joystick head will return to its origin or home position.
  • In accordance with other embodiments of the present invention, an “additive” attribute of scrolling and other motion is implemented. For example, consecutive finger swipes across a surface of a finger sensor in a same direction will, with each swipe, increase the speed of a computer display. Swiping in an opposite direction will slow the motion or even bring it to a stop. Swiping in an opposite direction thus functions as a drag on the motion.
  • The discussion that follows first explains one implementation of the invention, used to emulate a scroll wheel. The general terms discussed in that implementation are then used to explain how the invention can be extended, used to apply this gradual deceleration to other input devices. Some of this discussion is also applicable to the use of additive motion, such as scrolling.
  • FIGS. 1A-E show a finger swipe sensor 105 and a display device 125 at a sequence of times T1-T5, respectively. To better describe embodiments of the present invention, the sequence is at regular intervals, that is, the differences T2−T1, T3−T2, T4−T3, and T5−T4 are the same. The display device 125 shows a menu 120 of names; the swipe sensor 105 is used to scroll through the menu 120. In this embodiment, the swipe sensor 105 is used to emulate a scroll wheel. As shown in FIG. 1A, at time T1 a finger 101 is swiped along a surface of the swipe sensor 105. FIG. 1A shows a horizontal line above the finger 101, having an arrow indicating the direction of the swipe, and a vertical line next to the menu 120, having an arrow indicating the direction of the scrolling. The vertical line has a thickness corresponding to a speed with which the menu 120 scrolls: A thicker vertical line indicates that the menu 120 is scrolling faster than when the menu 120 is adjacent to a thinner vertical line.
  • FIG. 1B shows the finger 101 at time T2. As shown in FIG. 1B, the thickness of the vertical line in FIG. 1B indicates that the menu 120 scrolls faster than it did at time T1. At time T3, shown in FIG. 1C, the finger 101 has left the finger sensor 105, but the menu 120 continues to scroll, but slower than at time T2. At time T4, the menu 120 continues to scroll, but slower than it did at time T3. At time T5, the menu 120 has stopped scrolling. In this example, after the swiping is completed (at time T2), the menu 120 gradually slows to a stop, preferably simulating a spring's critically damped or over-damped motion or a motion corresponding to a joystick's return-to-home motion. When simulating this joystick motion, modeled in embodiments of the invention, the rate at which the joystick head returns to the home position is influenced by a dampening factor, such as the weight of the joystick.
  • The dampening factor acts as an inertial decay factor. By using different decay factors, different inertial scrolling behavior can be achieved. The inertial scrolling can be customized by mapping several factors (including, but not limited to, the dampening factor and the acceleration factor) to the axial displacement of the joystick.
  • FIG. 2 is a graph 130 plotting the speed at which the menu 120 of FIGS. 1A-E scrolls. Throughout this Specification, identical labels refer to identical elements. In the graph 130, speed is plotted on the vertical axis 135, and time is plotted on the horizontal axis 140. For each time T0-T5, the graph 130 shows an “x” to indicate when the finger 101 is on the sensor 105 and a “◯” to indicate when the finger 101 is off the sensor 105.
  • As shown in FIG. 2, the finger 101 first touches the sensor 105 at time T0. The finger 101 then moves along the sensor 105 at a speed that increases from time T1 to time T2, when the finger 101 is removed from the sensor 105. From time T2 to time T5, the speed with which the menu 120 scrolls gradually decreases until it stops at time T5.
  • Preferably, a joystick return-to-home motion is modeled to generate signals used to gradually decelerate the display movement. As one example, FIGS. 3A-E show a joystick 200 whose return-to-home motion is emulated. FIGS. 3A-E show the joystick 200 at the sequential times T1-T5, respectively. Referring to FIGS. 3A-E, the speed with which the joystick 200 moves a displayed object is directly proportional to the angle θ that the joystick 200 makes with an axis 220 perpendicular to the joystick base, much like a throttle. FIG. 3A shows the joystick 200 making an angle θ1 with the axis 220, corresponding to the movement of the menu display 120 shown in FIG. 1A. Similarly, FIGS. 3B-E show the joystick making angles θ2-θ5, respectively, corresponding to the movement of the menu display 120 shown in FIGS. 1B-E, respectively. In this example, θ5=0, indicating that the menu display 120 is not moving.
  • Those skilled in the art will recognize that the return-to-home motion of the joystick 200 after it is released (FIGS. 3B-E) is dependent on several parameters, such as the angle θ2 at which the joystick 200 is released and the mass of a head 210 of the joystick 200, to name only a few parameters.
  • In one embodiment, the angle θi is given by Equation 1:

  • θi = θ1 · e^(−(Ω+K)t)  [Equation 1]
  • where Ω is a damping factor, K is a constant, and t is time. The damping factor Ω is related to the mass of the head 210. In one embodiment, the damping factor Ω is directly proportional to the mass of the head 210.
  • In one embodiment, the angle θi is directly mapped to the distance a menu item is from a point of reference. In one embodiment, the distance is the distance of the head 210 from the axis 220. For example, x = L · sin(θi), where L is the length of the joystick. The linear speed of the distance x (dx/dt) is given by Equation 2:

  • dx/dt = L · (dθ/dt) · cos(θ)  [Equation 2]
  • Equations 1 and 2 together are used to map a joystick (angular) damped deceleration to a scrolling (linear) damped deceleration. Thus, a uniform return-to-home deceleration motion is mapped to a scrolling deceleration motion.
  • It will be appreciated that, together, Equations 1 and 2 are only one example of a function used to calculate the angle θi at time t (e.g., each of the times T1-T5) and thus a rate of uniform deceleration.
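  • The following is a numerical sketch of Equations 1 and 2: the joystick angle decays exponentially after release, and the linear scrolling speed follows from the chain rule. The function name and the parameter values in the example are illustrative assumptions, not values from the patent.

```python
import math

def joystick_decay(theta_1: float, omega: float, k: float, length: float, t: float):
    """Return (theta_i, x, dx_dt) at time t after the joystick is released."""
    theta_i = theta_1 * math.exp(-(omega + k) * t)      # Equation 1
    x = length * math.sin(theta_i)                      # displacement of the head from the axis
    dtheta_dt = -(omega + k) * theta_i                  # time derivative of Equation 1
    dx_dt = length * dtheta_dt * math.cos(theta_i)      # Equation 2
    return theta_i, x, dx_dt

# Example: a 30-degree release angle decaying with omega = 2.0, K = 0.5, L = 1.0,
# sampled at regular intervals (roughly the times T1-T5).
for step in range(5):
    print(joystick_decay(math.radians(30), 2.0, 0.5, 1.0, step * 0.25))
```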
  • FIG. 4 is a flow chart showing the steps of a process 300 for determining inertial deceleration corresponding to a joystick return-to-home motion and using that deceleration to control the scrolling of a menu in accordance with the present invention. Referring to FIGS. 1A-E, 3A-E, and 4, in the start step 301, parameters, such as Ω (Equation 1), are initialized and others are computed. Next, in the step 303, the movement of the finger 101 along a surface of the finger swipe sensor 105 is computed. Preferably, this movement is determined by correlating a pattern on a surface of the finger 101 captured at sequential times (e.g., T1 and T2) to determine the speed and direction of the finger swipe. The patterns are formed by the location of bifurcations, pores, ridge endings, swirls, whorls, and other fingerprint minutiae. Correlating fingerprint images is taught in U.S. Pat. No. 7,197,168, filed Jul. 12, 2002, and titled “Method and System for Biometric Image Assembly from Multiple Partial Biometric Frame Scans,” which is incorporated by reference in its entirety. It will be appreciated that other objects with patterned images, such as patterned styluses, can also be swiped across a finger sensor to scroll through menus in accordance with the present invention.
  • Next, in the step 305, the finger movement is translated into joystick movement, and in the step 307, the new joystick movement is calculated. Next, in the step 309, inertial/acceleration factors based on the joystick position are updated. In the step 311, the joystick position is translated into a scrolling motion by applying the acceleration factors, and in the step 313, the scrolling motion is used to scroll the menu 120.
  • In the step 315, the process determines whether the finger 101 is still on the sensor 105. If the finger 101 is still on the sensor 105, the process loops back to the step 303; otherwise, the process continues to the step 317, in which it applies the inertial factors to determine the deceleration. These inertial factors can be based on the speed of the swipe when it is completed, the duration of the swipe, the length of the swipe, or some combination of these. For example, if the speed of the swipe is fast or the duration of the swipe is long, the inertial factors result in a slower deceleration. This result corresponds to a large momentum being imparted to a body.
  • From the step 317, the process continues to the step 319, in which it determines whether the finger 101 is back on the sensor 105. If the finger 101 is back on the sensor 105, the process loops back to the step 303; otherwise, the process loops back to the step 307.
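  • The following Python skeleton sketches the loop of FIG. 4 (steps 303 through 319) under simplifying assumptions. The sensor and menu classes are hypothetical stand-ins, not the patent's interfaces, and the finger-to-joystick and joystick-to-scroll gains are arbitrary illustrative constants.

```python
class FakeSensor:
    """Hypothetical stand-in for the finger sensor 105: reports a short
    swipe (a few correlated frame-to-frame displacements), then no contact."""
    def __init__(self, swipe_frames=10):
        self.frames_left = swipe_frames

    def finger_on_sensor(self):
        return self.frames_left > 0

    def read_finger_movement(self):
        self.frames_left -= 1
        return 5.0                                   # displacement between frames

class FakeMenu:
    """Hypothetical stand-in for the scrolled menu 120."""
    def __init__(self):
        self.position = 0.0

    def scroll_by(self, delta):
        self.position += delta

def inertial_scroll_loop(sensor, menu, damping=4.0, dt=0.02):
    """Sketch of the loop of FIG. 4; gains and the decay model are illustrative."""
    angle = 0.0                                      # emulated joystick angle
    while True:
        if sensor.finger_on_sensor():                # steps 315/319
            dx = sensor.read_finger_movement()       # step 303: correlate frames
            angle += 0.01 * dx                       # steps 305-307: finger -> joystick
        else:
            angle *= max(0.0, 1.0 - damping * dt)    # step 317: apply inertial decay
            if abs(angle) < 1e-3:                    # joystick back at home: stop
                break
        menu.scroll_by(50.0 * angle * dt)            # steps 309-313: scroll the menu

menu = FakeMenu()
inertial_scroll_loop(FakeSensor(), menu)
print(f"final menu position: {menu.position:.2f}")
```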
  • As explained above, embodiments of the invention are able to uniformly decelerate motion generated by many electronic input devices, controlling different displays. FIG. 5 illustrates the finger sensor 105 used to control a display 450 on the display device 125. The display 450 is an image map overlaid by a movable window 410, which encloses portions of the image map 450. Swiping the finger 101 over the finger sensor 105 in the direction indicated by the arrow 475 causes the window 410 to move or translate in a corresponding direction over the image map 450. In this embodiment, the finger sensor 105 is emulating a mouse or a track ball. The window 410 moves from the location 425A at time T1, to location 425B at time T2, to location 425C at time T3, to location 425D at time T4, and finally to location 425E at time T5, where it stops. Again, the thicknesses of the arrows joining adjacent locations (e.g., the arrow connecting the window 410 at locations 425A and 425B) indicate the speed with which the window 410 moves. The speed of the window 410 decreases from time T2 to T5, preferably in an inertial manner.
  • It will be appreciated that deceleration can be determined in other ways. For example, deceleration can be determined from a look-up table that maps the current speed to a display speed for each sequential time. In one example, a table stores scaling factors used for mapping the current speed to subsequent speeds. As one example, the table stores 10 scaling factors for 10 corresponding time cycles. Thus, if the current display speed is 10 frames-per-second (fps), the speed after one second is found by multiplying by the first entry in the table, the scaling factor 0.9: 10 fps*0.9=9 fps. If the second entry in the table is 0.7, the speed during the next second of scrolling is the current speed times that next scaling factor (9 fps*0.7), or 6.3 fps. This table lookup continues until the last scaling factor (0.0) stops the scrolling. Using table look-ups in this way, linear, non-linear, step-wise (e.g., the speed is decreased, maintained over a time segment, and decreased again, the sequence continuing until scrolling stops), and other types of decay can be used to control scrolling.
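  • A minimal Python sketch of the look-up-table decay described above follows. The first two table entries (0.9 and 0.7) come from the example in the text; the remaining entries are assumptions, with a final 0.0 entry stopping the scrolling.

```python
def table_decay(initial_fps, scale_factors):
    """Decelerate a display speed by multiplying it by successive scaling
    factors from a look-up table, one factor per time cycle."""
    speed = initial_fps
    history = [speed]
    for factor in scale_factors:
        speed = round(speed * factor, 3)
        history.append(speed)
        if speed == 0.0:          # the last factor (0.0) stops the scrolling
            break
    return history

# 0.9 and 0.7 reproduce the 10 -> 9 -> 6.3 fps example; later entries are assumed.
print(table_decay(10.0, [0.9, 0.7, 0.5, 0.3, 0.1, 0.0]))
```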
  • It will be appreciated that deceleration in accordance with the present invention is able to be uniform or non-uniform. Different types of deceleration can be used to fit the application at hand. Indeed, uniform deceleration can be used over one time interval and non-uniform deceleration over another time interval.
  • FIGS. 6A-E show another example, in which modeling inertial decay is used to decelerate a different display, a computer simulated gaming wheel 500, such as a roulette wheel. FIGS. 6A-E show the gaming wheel 500 at the times T1-T5, respectively, controlled using a finger sensor (not shown). When a user traces a circular or semi-circular path along the finger sensor, the gaming wheel 500 is turned in the same direction. When the finger is removed from the finger sensor (at the time shown in FIG. 6B), the gaming wheel 500 continues to rotate, but at a rate that has an inertial decay in accordance with the present invention. As in FIGS. 1A-E, the widths and arrows of the curved lines next to the gaming wheel 500 indicate the speed and direction, respectively, that the gaming wheel 500 is rotating.
  • FIG. 7 shows the steps of a process 600 for generally moving a computer display by using a finger sensor to emulate any number of electronic input devices in accordance with the present invention. The electronic input devices include, but are not limited to, a scroll wheel, a mouse, a wheel, a track ball, a push button, and a joy stick. First, in the start step 601, parameters such as a dampening factor are initialized. Next, in the step 603, a finger movement along the finger sensor is determined, and in the step 605, the movement is translated to a motion of the emulated device. This motion can be the movement of a joystick, a scrolling motion, a translation motion (such as of a window over a map), or a button press, to name only a few.
  • Next, in the step 607, the position and movement parameters of the emulated device are updated. Examples of movement parameters include acceleration and direction. These parameters are used to determine the direction in which movement continues when the finger no longer touches the finger sensor. The acceleration can include gradual (uniform or non-uniform) deceleration. In the step 609, the display (e.g., a list of menu items) is moved in a manner corresponding to the emulated electronic input device.
  • In the step 611, the process determines whether the finger is still on the finger sensor. If the finger is still on the finger sensor, the process loops back to the step 603. Otherwise, the process continues to the step 613, in which it determines whether the display has stopped moving. If the display has not stopped moving, the process loops back to the step 603. Otherwise, the process continues to the step 615, in which it ends.
  • In accordance with the present invention, a sudden-stop feature instantly stops the inertial movement (e.g., scrolling) with a fresh touch of the finger sensor 105. In this way, a user can quickly change a scrolling direction without having to wait for the scrolling to stop. This not only allows for greater ease of use but also lets the user quickly begin a fresh movement in another direction. With this feature, there is no need to generate extra movement to overcome the current inertia before shifting the direction of motion.
  • FIG. 8 shows the steps of a process 700 incorporating the sudden-stop feature when scrolling through the menu 120 in FIGS. 1A-E. Referring to FIGS. 1A-E and 8, in the start step 701, parameters, such as a dampening factor, are initialized. In the step 703, the process reads the movement of the finger 101 along a surface of the sensor 105. In the step 705, the menu 120 is scrolled in a manner corresponding to the finger movement. Next, in the step 707, the process determines whether the finger 101 is still contacting the sensor 105. If the finger 101 is still contacting the sensor 105, the process loops back to the step 703; otherwise, the process continues to the step 709.
  • In the step 709, the process decelerates the scrolling based on the inertial factors, such as described above. In the step 711, the process determines whether the scrolling has stopped. If the scrolling has stopped, the process loops back to the step 703; otherwise, the process continues to the step 713, in which it determines whether the finger 101 is again contacting the sensor 105. If the finger 101 is not again contacting the sensor 105, the process loops back to the step 703; otherwise, the process continues to the step 715.
  • In the step 715, the process determines whether the sensor 105 was tapped quickly, thereby triggering a sudden stop. As one example, the process determines that the sensor 105 was tapped quickly if the finger 101 next contacts the sensor 105 at a time TA and is removed at a time TB, where TB−TA≦5 ms. Those skilled in the art will recognize other ways of defining and later recognizing a tap as “quick.” If, in the step 715, the process determines that the tap is quick, the process continues to the step 717, in which the scrolling is suddenly stopped; otherwise, the process loops back to the step 703.
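  • A minimal Python sketch of the quick-tap test in the step 715 follows. The 5 ms threshold comes from the example above; the function name and the use of millisecond timestamps are assumptions for illustration.

```python
QUICK_TAP_MS = 5.0   # threshold from the example above (TB - TA <= 5 ms)

def is_quick_tap(touch_down_ms, touch_up_ms, threshold_ms=QUICK_TAP_MS):
    """True when a touch-down/touch-up pair is short enough to be treated
    as a sudden-stop tap (the test of the step 715)."""
    return (touch_up_ms - touch_down_ms) <= threshold_ms

print(is_quick_tap(100.0, 103.5))   # True  -> step 717: stop scrolling at once
print(is_quick_tap(100.0, 180.0))   # False -> loop back to the step 703
```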
  • While FIG. 8 describes scrolling based on inertial factors, it will be appreciated that the sudden-stop feature is able to be used to decelerate scrolling and other motions using other kinds of deceleration, including non-uniform ones. Those skilled in the art will recognize other ways of triggering a sudden stop in accordance with the present invention. In an alternative embodiment, the sudden-stop feature is triggered by contacting the sensor 105 and maintaining the contact for a predetermined time, such as one or two seconds.
  • Embodiments of the present invention are also able to accelerate or decelerate motion of a computer display. As one example, consecutive finger swipes in a same direction result in accelerating the motion. Swiping in one direction followed by a swipe in the opposite direction results in decelerating the motion. FIGS. 9A-H illustrate how the motion of the display 120 (FIGS. 1A-E) is accelerated by swiping the finger 101 multiple times along the finger sensor 105 over a sequence of increasing times T0-T7, respectively.
  • Each of the FIGS. 9A-H depicts a graph 150 plotting a speed of the display 120 (on the vertical axis labeled “v1” to “v7”) versus time. Each occurrence of the graph 150 identifies the current speed by the label 155. FIGS. 9D-H also label the speed at the immediately preceding time with an “x,” tracing changes in velocity with dotted lines. As shown in FIGS. 9A-H, swiping the finger sensor 105 multiple times increases the speed of the display 120. Increasing speed in this way is referred to as “additive” motion.
  • As shown in FIGS. 9A-C, moving the finger 101 across the finger sensor 105 from time T0 to T2 causes the speed of the display 120 to increase from 0 to v4. The finger 101 is removed from the sensor 105 immediately after the time T2 (FIGS. 9C-D), and from then until the time T5 (FIGS. 9C to 9F) the speed decreases.
  • After the finger 101 is returned to the sensor 105 at the time T6 (FIG. 9G), the finger 101 is swiped a second time (time T5 to T7, FIGS. 9F-H). Because this second swipe begins when the display 120 is already in motion, the swipe results in a greater speed than the initial swipe (from time T0 to T2).
  • Preferably, the speed of the display 120 increases with the number of swipes and also with the total distance traveled by the swipes. Thus, swiping the finger 101 along the finger sensor 105 five times will move the display 120 faster than if the finger 101 was swiped four times. And swiping the finger 101 five times over a total distance of five inches will result in a faster motion than swiping the finger 101 five times over a total distance of four inches.
  • While FIGS. 9A-C and 9G-H show a constant acceleration (e.g., the graph 150 during the corresponding time periods has a constant slope), other types of acceleration are able to be attained in accordance with the present invention. Some examples include exponential acceleration, with or without a maximum value, and step-wise acceleration, to name only two types. Furthermore, acceleration can be determined using a look-up table, such as one having scaling factors with values larger than one. Using the table entries of one such example, the speed is multiplied by the scaling factors 1.1, 1.5, and 2.0 in sequential time intervals.
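  • The following Python sketch shows acceleration driven by a look-up table of scaling factors greater than one, using the 1.1, 1.5, and 2.0 entries from the example above; the starting speed and the helper's name are illustrative assumptions.

```python
def table_accelerate(speed_fps, factors=(1.1, 1.5, 2.0)):
    """Accelerate a display speed with a look-up table of scaling factors
    larger than one, one factor per time interval."""
    history = [speed_fps]
    for f in factors:
        speed_fps = round(speed_fps * f, 3)
        history.append(speed_fps)
    return history

print(table_accelerate(10.0))   # [10.0, 11.0, 16.5, 33.0] fps over three intervals
```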
  • In this example, the speed decreases from T2 to T5 with an inertial decay, in accordance with one embodiment of the present invention. It will be appreciated that in accordance with other embodiments, the speed can decrease from T2 to T5 in other ways, both uniform and non-uniform.
  • In still another embodiment, motion is accelerated by swiping and holding a finger or other object on a finger sensor. An initial swipe will start accelerating a display (e.g., display 120 in FIGS. 1A-E). At the end of the swipe, the finger is held stationary, or nearly stationary. The display will continue to accelerate while the finger is held in place. The longer the finger is held in place, the faster the display moves, until a maximum speed (peak threshold) is reached. After the display reaches the desired speed, the finger is either removed or moved farther to complete the swipe.
  • It will be appreciated that embodiments of the present invention can be combined in many ways. For example, the sudden-stop feature can be implemented to suddenly stop the additive motion. Similarly, the sudden-stop feature, the additive motion, and the deceleration, all in accordance with the present invention, can be combined in any combination.
  • FIG. 10 shows the steps of a process 800 for determining additive motion, such as additive scrolling, in accordance with one embodiment of the present invention. Many of the steps in the process 800 are similar to the steps in the process 600, shown in FIG. 7, and are similarly labeled. To simplify the discussion, the common steps will not be discussed here. Referring to FIG. 10, in the step 613, the process determines whether the display has stopped moving. If it has not, the process continues to the step 617, in which it determines whether a new (e.g., consecutive or sequential) swipe has occurred. If a new swipe has not occurred, the process loops back to the step 607. Otherwise, the process loops back to the step 603. If, in the step 603, the process determines that the finger was swiped in the same direction as during the immediately preceding swipe, the process later updates the position and movement parameters in the step 607 to accelerate the display motion. On the other hand, if, in the step 603, the process determines that the finger was swiped in a direction opposite to that of the immediately preceding swipe, the process later updates the position and movement parameters in the step 607 to decelerate the display motion.
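  • A small Python sketch of the branching described for FIG. 10 follows. It treats directions as +1/−1 and uses arbitrary illustrative numbers, since the specification does not give an exact formula for combining a new swipe with the current motion.

```python
def update_motion(display_speed, prev_direction, new_swipe):
    """Sketch of the FIG. 10 branch: a consecutive swipe in the same
    direction accelerates the display, an opposite swipe decelerates it,
    and with no new swipe the current motion simply keeps decaying."""
    if new_swipe is None:                              # step 617: no new swipe
        return display_speed * 0.9, prev_direction     # continue decay (step 607)
    swipe_speed, direction = new_swipe
    if direction == prev_direction:                    # same direction: accelerate
        return display_speed + swipe_speed, direction
    return max(0.0, display_speed - swipe_speed), direction   # opposite: decelerate

print(update_motion(5.0, +1, (3.0, +1)))   # (8.0, 1)
print(update_motion(5.0, +1, (3.0, -1)))   # (2.0, -1)
print(update_motion(5.0, +1, None))        # (4.5, 1)
```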
  • FIG. 11 is a block diagram of a system 900 used to emulate a scroll wheel using decay in accordance with the present invention. The system 900 includes the finger sensor 105 coupled to a translation module 925, which translates finger movements into scroll wheel signals for scrolling the menu 120 on the display 125, as shown in FIGS. 1A-E. The translation module 925 includes a movement correlator 910 and a motion translator 915. The movement correlator 910 correlates images sequentially captured by the finger sensor 105 and determines finger movement, such as in the step 303 of FIG. 4. The motion translator 915 receives the finger movement, translates the finger movement into joystick movement (step 305, FIG. 4), calculates new joystick movement (step 307, FIG. 4), updates inertial/acceleration factors based on joystick position (step 309, FIG. 4), translates the joystick position into a scrolling motion by applying the acceleration factors (step 311, FIG. 4), and scrolls the menu accordingly (step 313, FIG. 4).
  • In one embodiment, both of the elements 910 and 915 include a computer-readable medium containing instructions that cause a processor to perform the steps of FIG. 4. In other embodiments, the elements include software, hardware, firmware, or any combination of these.
  • It will be appreciated that the steps shown in FIG. 4 can be distributed among the components 910 and 915 in different ways, or among other components, such as described in FIG. 12, below. Preferably, the components 910 and 915 are also configured to implement the sudden-stop feature described in FIG. 8, the additive motion feature described in FIG. 10, or both.
  • In one embodiment, a motion translator is used to provide inertial deceleration when emulating multiple different input devices. For example, inertial deceleration is used to gradually decelerate movement corresponding to a scroll wheel, a wheel (e.g., a roulette wheel), and a mouse. Preferably, a single acceleration/deceleration module (such as one simulating deceleration, acceleration, and a sudden-stop feature) is shared among several device emulators. FIG. 12 illustrates a system 950 that emulates several electronic input devices, all of which use acceleration/deceleration in accordance with the present invention.
  • The system 950 includes the elements 105 and 125, described above. The system 950 also includes a translation module 980, which includes the movement correlator 910 and an acceleration/deceleration calculator 975. The acceleration/deceleration calculator 975 is coupled to a scroll wheel emulator 915, a wheel emulator 917, and a mouse emulator 919, all of which are coupled to a switch 985, which in turn is coupled to the display device 125. The switch 985 routes to the display device 125 the output of whichever emulator (i.e., 915, 917, or 919) corresponds to the device currently being emulated. Device emulation using fingerprint sensors is discussed in U.S. Pat. No. 7,474,772, filed Jun. 21, 2004, and titled "System and Method for a Miniature User Input Device," which is incorporated by reference in its entirety.
  • In one example, referring to FIGS. 7 and 10, the acceleration/deceleration calculator 975 performs the step 607, determining the inertial decay from position and movement parameters. When the finger sensor 105 is used to emulate a scroll wheel, the step 605 is performed by the scroll wheel emulator 915. When the finger sensor 105 is used to emulate a wheel, device movement is determined by the wheel emulator 917. When the finger sensor 105 is used to emulate a mouse, device movement is determined by the mouse emulator 919. Again, the elements 915, 917, and 919 can all be implemented using any combination of hardware, software, firmware, or computer-readable media for controlling the operation of a processor.
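  • As a rough structural sketch of the routing in FIG. 12, the following Python example uses a dictionary in the role of the switch 985 to forward the output of whichever emulator corresponds to the device currently being emulated. The emulator classes, method names, and conversion factors are assumptions for illustration only, not the patent's interfaces.

```python
class ScrollWheelEmulator:
    def device_movement(self, finger_dx):
        return ("scroll", finger_dx)            # scroll steps

class WheelEmulator:
    def device_movement(self, finger_dx):
        return ("rotate", 0.5 * finger_dx)      # degrees of wheel rotation

class MouseEmulator:
    def device_movement(self, finger_dx):
        return ("translate", finger_dx)         # window/cursor translation

def route_to_display(mode, finger_dx):
    """Plays the role of the switch 985: forward the output of whichever
    emulator corresponds to the device currently being emulated."""
    emulators = {
        "scroll_wheel": ScrollWheelEmulator(),
        "wheel": WheelEmulator(),
        "mouse": MouseEmulator(),
    }
    return emulators[mode].device_movement(finger_dx)

print(route_to_display("wheel", 8.0))   # ('rotate', 4.0)
```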
  • Embodiments of the invention are also able to control a display in response to pressing a surface of a finger sensor, such as by a finger tap. As such, these embodiments are useful across all variants of fingerprint sensor navigation. As one example, movement having an inertial decay in accordance with the present invention is used to move through a list of items, to zoom in on or zoom out from an image of a city map, to make selections from a large grid menu, and to control an arcade game, such as billiards, that provides virtual realism in terms of movement.
  • As one example, referring to FIGS. 1A-E, after the scrolling gradually comes to a stop, the user presses the surface of the finger sensor 105 to select a highlighted name, such as the topmost name in the menu display 120. Contact information for the highlighted name is then immediately presented on the display device 125.
  • As another example, referring to FIGS. 1A-E and 5, after the window 410 has come to a stop (position 425E), tapping the finger sensor 105 in a predetermined manner (e.g., one quick tap) zooms in on that portion of the image within the window 410; tapping the finger sensor 105 in another predetermined manner (e.g., two quick taps in succession or a tap-and-hold motion) zooms out from the same portion of the image. Those skilled in the art will recognize other actions that can be taken by tapping or otherwise increasing a pressure on a surface of the finger sensor 105.
  • One embodiment of the invention allows for dual-mode scrolling. In this mode, a system is configured to perform both single-step (non-inertial) scrolling and inertial scrolling by adjusting the dampening factor based upon the context of recent movement. If the recent movement is indicative of slow or single-step scrolling, then the dampening factor is adjusted so that the return-to-home motion becomes nearly instantaneous (with the decay of Equation 1, this corresponds to a substantially larger damping factor Ω), resulting in what is effectively non-inertial scrolling (e.g., the joystick reverts to the zero or home position nearly instantly). As one example, the system determines that recent movement indicates a preference for slow or single-step scrolling when a user invokes the sudden-stop feature several consecutive times. In response, the system adjusts the damping factor (e.g., in the step 309) to ensure that the return-to-home motion is fast, approaching a single-step mode.
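  • A minimal Python sketch of the dual-mode adjustment follows, assuming (consistent with Equation 1) that a larger damping factor produces a faster return-to-home and thus a more single-step feel. The threshold of three consecutive sudden stops and the numeric values are illustrative assumptions.

```python
def choose_damping(consecutive_sudden_stops,
                   inertial_damping=4.0, single_step_damping=40.0):
    """Dual-mode sketch: after several consecutive sudden stops, use a much
    larger damping factor so the return-to-home is nearly instantaneous and
    scrolling feels single-step; otherwise keep normal inertial damping."""
    if consecutive_sudden_stops >= 3:
        return single_step_damping   # effectively non-inertial scrolling
    return inertial_damping          # normal inertial scrolling

print(choose_damping(0))   # 4.0  -> inertial scrolling
print(choose_damping(3))   # 40.0 -> effectively single-step scrolling
```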
  • In the operation of one embodiment of the invention, a user swipes or traces a finger on a finger sensor to set a display in motion. During the swipe, the system determines the direction of the swipe and other parameters, such as the duration of the swipe or the acceleration of the swipe. While the display is moving, the user can accelerate the motion by swiping again in the same direction as before, or decelerate the motion by swiping in a different direction. Once a swipe is completed, the display continues in the same direction before gradually slowing down. This motion provides the user with a more pleasurable viewing experience as the display rolls to a smooth stop. The user also has greater control over moving the display. Later, the user can tap the finger sensor to trigger an action, such as selecting a highlighted object.
  • In a preferred embodiment, a finger sensor, computing elements, and display device are integrated onto a single device, such as a mobile phone, personal digital assistant, or portable gaming device. Alternatively, a system in accordance with the present invention includes separate components, such as a swipe sensor, display screen, and host computer.
  • Those skilled in the art will recognize that modifications can be made to embodiments of the invention. For example, while most of the embodiments disclose a finger swipe sensor, other embodiments use a finger placement sensor. Furthermore, in the flow charts given, some steps can be skipped, others added, and all can be performed in different sequences. It will be readily apparent to one skilled in the art that other modifications may be made to the embodiments without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (30)

1-33. (canceled)
34. A method for moving at least one display element on a display using a finger sensor as an emulated input device, the method comprising:
(a) determining finger movement along the finger sensor based upon sequentially captured finger surface patterns;
(b) translating the determined finger movement to updated position and inertial acceleration parameters of the emulated input device;
(c) moving the at least one display element on the display based upon the updated position and inertial acceleration parameters of the emulated input device; and
(d) determining if the finger is still touching the finger sensor, and when the finger is still touching the finger sensor then repeating steps (a) through (c), and when the finger is no longer touching the finger sensor then applying updated position and inertial acceleration parameters to continue movement of the at least one display element on the display.
35. The method of claim 34 comprising stopping continued movement based upon a subsequent tapping of the finger sensor.
36. The method of claim 34 wherein the determining finger movement along the finger sensor comprises determining finger movement as at least one of a speed, a direction, and a time of finger swiping movement.
37. The method of claim 34 wherein the determining finger movement along the finger sensor comprises determining speed of finger swiping movement; and wherein translating comprises translating a faster speed of finger swiping movement into a slower deceleration.
38. The method of claim 34 wherein the determining finger movement along the finger sensor comprises determining speed of finger swiping movement; and wherein translating comprises translating a greater time of finger swiping movement into a slower deceleration.
39. The method of claim 34 wherein the emulated input device comprises at least one of a joystick, mouse, and scroll wheel.
40. The method of claim 34 wherein the captured finger patterns comprise at least one of bifurcations, pores, ridge endings, swirls, and whorls.
41. The method of claim 34 wherein translating comprises translating the determined finger movement to updated position and inertial acceleration parameters of the emulated input device based upon values stored in a look-up table.
42. The method of claim 34 wherein translating comprises translating the determined finger movement to updated position and inertial acceleration parameters of the emulated input device based upon an inertial decay function.
43. The method of claim 34 wherein the computer display shows one of a list of items, a region of an image, a grid menu, slides of images, a game image, and an element of a computer simulation.
44. The method of claim 34 wherein the at least one display element comprises a movable window.
45. A method for moving at least one display element on a display using a finger sensor as an emulated input device, the method comprising:
(a) determining finger movement along the finger sensor as at least one of a speed, a direction, and a time of finger swiping movement based upon sequentially captured finger surface patterns;
(b) translating the determined finger movement to updated position and inertial acceleration parameters of the emulated input device;
(c) moving the at least one display element on the display based upon the updated position and inertial acceleration parameters of the emulated input device;
(d) determining if the finger is still touching the finger sensor, and when the finger is still touching the finger sensor then repeating steps (a) through (c), and when the finger is no longer touching the finger sensor then applying updated position and inertial acceleration parameters to continue movement of the at least one display element on the display and stopping continued movement based upon a subsequent tapping of the finger sensor.
46. The method of claim 45 wherein the determining finger movement along the finger sensor comprises determining speed of finger swiping movement; and wherein translating comprises translating a faster speed of finger swiping movement into a slower deceleration.
47. The method of claim 45 wherein the determining finger movement along the finger sensor comprises determining speed of finger swiping movement; and wherein translating comprises translating a greater time of finger swiping movement into a slower deceleration.
48. The method of claim 45 wherein the emulated input device comprises at least one of a joystick, mouse, and scroll wheel.
49. The method of claim 45 wherein the captured finger patterns comprise at least one of bifurcations, pores, ridge endings, swirls, and whorls.
50. The method of claim 45 wherein translating comprises translating the determined finger movement to updated position and inertial acceleration parameters of the emulated input device based upon values stored in a look-up table.
51. The method of claim 45 wherein translating comprises translating the determined finger movement to updated position and inertial acceleration parameters of the emulated input device based upon an inertial decay function.
52. The method of claim 45 wherein the at least one display element comprises a movable window.
53. An electronic device comprising:
a display;
a finger sensor; and
a motion translator configured to
(a) determine finger movement along the finger sensor based upon sequentially captured finger surface patterns,
(b) translate the determined finger movement to updated position and inertial acceleration parameters of an emulated input device,
(c) move the at least one display element on the display based upon the updated position and inertial acceleration parameters of the emulated input device, and
(d) determine if the finger is still touching the finger sensor, and when the finger is still touching the finger sensor then repeat steps (a) through (c), and when the finger is no longer touching the finger sensor then apply updated position and inertial acceleration parameters to continue movement of the at least one display element on the display.
54. The electronic device of claim 53 wherein the motion translator is configured to stop continued movement based upon a subsequent tapping of the finger sensor.
55. The electronic device of claim 53 wherein the motion translator is configured to determine finger movement as at least one of a speed, a direction, and a time of finger swiping movement.
56. The electronic device of claim 53 wherein the motion translator is configured to determine speed of finger swiping movement, and translate a faster speed of finger swiping movement into a slower deceleration.
57. The electronic device of claim 53 wherein the motion translator is configured to determine speed of finger swiping movement, and translate a greater time of finger swiping movement into a slower deceleration.
58. The electronic device of claim 53 wherein the emulated input device comprises at least one of a joystick, mouse, and scroll wheel.
59. The electronic device of claim 53 wherein the captured finger patterns comprise at least one of bifurcations, pores, ridge endings, swirls, and whorls.
60. The electronic device of claim 53 wherein the motion translator is configured to translate the determined finger movement to updated position and inertial acceleration parameters of the emulated input device based upon values stored in a look-up table.
61. The electronic device of claim 53 wherein the motion translator is configured to translate the determined finger movement to updated position and inertial acceleration parameters of the emulated input device based upon an inertial decay function.
62. The electronic device of claim 53 wherein the at least one display element comprises a movable window.
US15/724,285 2008-02-13 2017-10-04 Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor Abandoned US20180024718A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/724,285 US20180024718A1 (en) 2008-02-13 2017-10-04 Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US6575108P 2008-02-13 2008-02-13
US12/378,338 US9785330B1 (en) 2008-02-13 2009-02-13 Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor calculating swiping speed and length
US15/724,285 US20180024718A1 (en) 2008-02-13 2017-10-04 Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/378,338 Continuation US9785330B1 (en) 2008-02-13 2009-02-13 Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor calculating swiping speed and length

Publications (1)

Publication Number Publication Date
US20180024718A1 true US20180024718A1 (en) 2018-01-25

Family

ID=59981519

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/378,338 Active 2032-05-05 US9785330B1 (en) 2008-02-13 2009-02-13 Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor calculating swiping speed and length
US15/724,285 Abandoned US20180024718A1 (en) 2008-02-13 2017-10-04 Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/378,338 Active 2032-05-05 US9785330B1 (en) 2008-02-13 2009-02-13 Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor calculating swiping speed and length

Country Status (1)

Country Link
US (2) US9785330B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10852943B2 (en) * 2018-01-02 2020-12-01 Advanced New Technologies Co., Ltd. Mobile terminal click event recognition method and apparatus
US20230393726A1 (en) * 2022-06-02 2023-12-07 Shopify Inc. Methods and apparatuses for providing condensable user interface

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8782775B2 (en) 2007-09-24 2014-07-15 Apple Inc. Embedded authentication systems in an electronic device
TWI476706B (en) * 2012-04-30 2015-03-11 Pixart Imaging Inc Method for outputting command by detecting object movement and system thereof
CN106133748B (en) * 2012-05-18 2020-01-31 苹果公司 Device, method and graphical user interface for manipulating a user interface based on fingerprint sensor input
DE102014207637A1 (en) * 2014-04-23 2015-10-29 Bayerische Motoren Werke Aktiengesellschaft Gesture interaction with a driver information system of a vehicle
JP7377088B2 (en) * 2019-12-10 2023-11-09 キヤノン株式会社 Electronic devices and their control methods, programs, and storage media
US11409410B2 (en) 2020-09-14 2022-08-09 Apple Inc. User input interfaces

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050012714A1 (en) * 2003-06-25 2005-01-20 Russo Anthony P. System and method for a miniature user input device
US20060026535A1 (en) * 2004-07-30 2006-02-02 Apple Computer Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20060048073A1 (en) * 2004-08-30 2006-03-02 Microsoft Corp. Scrolling web pages using direct interaction
US20080082939A1 (en) * 2006-09-29 2008-04-03 Wildseed Llc Scrolling behavior-influenced algorithm selection to facilitate adaptive scrolling

Family Cites Families (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5327161A (en) * 1989-08-09 1994-07-05 Microtouch Systems, Inc. System and method for emulating a mouse input device with a touchpad input device
US5170364A (en) 1990-12-06 1992-12-08 Biomechanics Corporation Of America Feedback system for load bearing surface
WO1995008167A1 (en) 1993-09-13 1995-03-23 Asher David J Joystick with membrane sensor
US6546112B1 (en) 1993-11-18 2003-04-08 Digimarc Corporation Security document with steganographically-encoded authentication data
US5825907A (en) 1994-12-28 1998-10-20 Lucent Technologies Inc. Neural network system for classifying fingerprints
US5740276A (en) 1995-07-27 1998-04-14 Mytec Technologies Inc. Holographic method for encrypting and decrypting information using a fingerprint
JP3747520B2 (en) 1996-01-30 2006-02-22 富士ゼロックス株式会社 Information processing apparatus and information processing method
US5995630A (en) 1996-03-07 1999-11-30 Dew Engineering And Development Limited Biometric input with encryption
US6219793B1 (en) 1996-09-11 2001-04-17 Hush, Inc. Method of using fingerprints to authenticate wireless communications
CA2203212A1 (en) 1997-04-21 1998-10-21 Vijayakumar Bhagavatula Methodology for biometric encryption
US6011849A (en) 1997-08-28 2000-01-04 Syndata Technologies, Inc. Encryption-based selection system for steganography
US6035398A (en) 1997-11-14 2000-03-07 Digitalpersona, Inc. Cryptographic key generation using biometric data
US6330345B1 (en) 1997-11-17 2001-12-11 Veridicom, Inc. Automatic adjustment processing for sensor devices
US6408087B1 (en) 1998-01-13 2002-06-18 Stmicroelectronics, Inc. Capacitive semiconductor user input device
US6317508B1 (en) 1998-01-13 2001-11-13 Stmicroelectronics, Inc. Scanning capacitive semiconductor fingerprint detector
US6141753A (en) 1998-02-10 2000-10-31 Fraunhofer Gesellschaft Secure distribution of digital representations
US7469381B2 (en) * 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US6535622B1 (en) 1999-04-26 2003-03-18 Veridicom, Inc. Method for imaging fingerprints and concealing latent fingerprints
US6681034B1 (en) 1999-07-15 2004-01-20 Precise Biometrics Method and system for fingerprint template matching
US6546122B1 (en) 1999-07-29 2003-04-08 Veridicom, Inc. Method for combining fingerprint templates representing various sensed areas of a fingerprint to derive one fingerprint template representing the fingerprint
US7054470B2 (en) 1999-12-02 2006-05-30 International Business Machines Corporation System and method for distortion characterization in fingerprint and palm-print image sequences and using this distortion as a behavioral biometrics
US6920560B2 (en) 1999-12-30 2005-07-19 Clyde Riley Wallace, Jr. Secure network user states
US6518560B1 (en) 2000-04-27 2003-02-11 Veridicom, Inc. Automatic gain amplifier for biometric sensor device
JP2003534620A (en) 2000-05-24 2003-11-18 イマージョン コーポレイション Haptic device and method using electroactive polymer
US20030028811A1 (en) 2000-07-12 2003-02-06 Walker John David Method, apparatus and system for authenticating fingerprints, and communicating and processing commands and information based on the fingerprint authentication
US7003670B2 (en) 2001-06-08 2006-02-21 Musicrypt, Inc. Biometric rights management system
EP1573426A4 (en) 2001-07-12 2009-11-25 Atrua Technologies Inc Method and system for biometric image assembly from multiple partial biometric frame scans
US6690365B2 (en) * 2001-08-29 2004-02-10 Microsoft Corporation Automatic scrolling
US7131004B1 (en) 2001-08-31 2006-10-31 Silicon Image, Inc. Method and apparatus for encrypting data transmitted over a serial link
US20030123714A1 (en) 2001-11-06 2003-07-03 O'gorman Lawrence Method and system for capturing fingerprints from multiple swipe images
US7116805B2 (en) 2003-01-07 2006-10-03 Avagotechnologies Ecbu Ip (Singapore) Pte. Ltd. Fingerprint verification device
US7274808B2 (en) 2003-04-18 2007-09-25 Avago Technologies Ecbu Ip (Singapore)Pte Ltd Imaging system and apparatus for combining finger recognition and finger navigation
US20070038867A1 (en) 2003-06-02 2007-02-15 Verbauwhede Ingrid M System for biometric signal processing with hardware and software acceleration
US7587072B2 (en) 2003-08-22 2009-09-08 Authentec, Inc. System for and method of generating rotational inputs
US7697729B2 (en) 2004-01-29 2010-04-13 Authentec, Inc. System for and method of finger initiated actions
WO2005079413A2 (en) 2004-02-12 2005-09-01 Atrua Technologies, Inc. System and method of emulating mouse operations using finger image sensors
US8131026B2 (en) * 2004-04-16 2012-03-06 Validity Sensors, Inc. Method and apparatus for fingerprint image reconstruction
US7574022B2 (en) 2004-05-20 2009-08-11 Atrua Technologies Secure system and method of creating and processing partial finger images
US7113179B2 (en) 2004-06-23 2006-09-26 Interlink Electronics, Inc. Force sensing resistor with calibration element and method of manufacturing same
JP3734819B1 (en) * 2004-07-26 2006-01-11 任天堂株式会社 GAME PROGRAM, GAME DEVICE, AND INPUT DEVICE
US20080094367A1 (en) * 2004-08-02 2008-04-24 Koninklijke Philips Electronics, N.V. Pressure-Controlled Navigating in a Touch Screen
US7280679B2 (en) 2004-10-08 2007-10-09 Atrua Technologies, Inc. System for and method of determining pressure on a finger sensor
US20060103633A1 (en) 2004-11-17 2006-05-18 Atrua Technologies, Inc. Customizable touch input module for an electronic device
US7505613B2 (en) 2005-07-12 2009-03-17 Atrua Technologies, Inc. System for and method of securing fingerprint biometric systems against fake-finger spoofing
US20070061126A1 (en) 2005-09-01 2007-03-15 Anthony Russo System for and method of emulating electronic input devices
US8090945B2 (en) 2005-09-16 2012-01-03 Tara Chand Singhal Systems and methods for multi-factor remote user authentication
US7791596B2 (en) 2005-12-27 2010-09-07 Interlink Electronics, Inc. Touch input device having interleaved scroll sensors
US7885436B2 (en) 2006-07-13 2011-02-08 Authentec, Inc. System for and method of assigning confidence values to fingerprint minutiae points



Also Published As

Publication number Publication date
US9785330B1 (en) 2017-10-10

Similar Documents

Publication Publication Date Title
US20180024718A1 (en) Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor
KR102151136B1 (en) Intelligent wearable device and control method thereof
US9134797B2 (en) Systems and methods for providing haptic feedback to touch-sensitive input devices
US7692637B2 (en) User input device for electronic device
JP4723799B2 (en) Control system and control method
TWI290690B (en) Selective input system based on tracking of motion parameters of an input device
EP2038730B1 (en) Techniques for interactive input to portable electronic devices
Hürst et al. Gesture-based interaction via finger tracking for mobile augmented reality
KR100801089B1 (en) Mobile device and operation method control available for using touch and drag
EP2717120B1 (en) Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications
US20070061126A1 (en) System for and method of emulating electronic input devices
Hürst et al. Multimodal interaction concepts for mobile augmented reality applications
US20090100383A1 (en) Predictive gesturing in graphical user interface
US20150029095A1 (en) Command of a device by gesture emulation of touch gestures
KR20150079421A (en) Haptic device incorporating stretch characteristics
Wilkinson et al. Expressy: Using a wrist-worn inertial measurement unit to add expressiveness to touch-based interactions
CN108553892B (en) Virtual object control method and device, storage medium and electronic equipment
WO2012112277A1 (en) Breath-sensitive digital interface
JP5184384B2 (en) Control system and control method
US9013398B2 (en) Control methods for a multi-function controller
US20120026196A1 (en) Apparatus including a sensor arrangement and methods of operating the same
US9072968B2 (en) Game device, game control method, and game control program for controlling game on the basis of a position input received via touch panel
CN110673810B (en) Display device, display method and device thereof, storage medium and processor
US20120306750A1 (en) Gesture based computer interface system and method
Atia et al. Interaction with tilting gestures in ubiquitous environments

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION