WO2016044968A1 - Moving an object on display - Google Patents

Moving an object on display

Info

Publication number
WO2016044968A1
WO2016044968A1 (PCT/CN2014/087047)
Authority
WO
WIPO (PCT)
Prior art keywords
display
fingers
rate
movement
touch
Application number
PCT/CN2014/087047
Other languages
French (fr)
Inventor
Qiang Li
Fan Chen
Original Assignee
Hewlett-Packard Development Company, L.P.
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to PCT/CN2014/087047
Publication of WO2016044968A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 Indexing scheme relating to G06F 3/048
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • Computing systems today often have a display and input devices such as a touch-sensitive surface (e.g., a touchpad or a touch mat).
  • The systems may use the touch-sensitive surface to detect a user's finger movements and to translate those movements into corresponding movements of an object (e.g., a mouse cursor) on the display.
  • FIG. 1 is a block diagram of an example computing system
  • FIG. 2A illustrates an example display and touch-sensitive surface of the computing system of FIG. 1;
  • FIG. 2B illustrates another example display and touch-sensitive surface of the computing system of FIG. 1;
  • FIG. 3 is a flowchart of an example method;
  • FIG. 4 is a block diagram of another example computing device.
  • a computing system may detect finger movements or movements by other objects (e. g. , a stylus) on a touch-sensitive surface, and translate or convert those movements into display actions.
  • the finger movements may include linear or non-linear swipes, slides, or other types of movements or gestures.
  • the user may make the movements with different degrees of force or pressure, and the touch-sensitive surface may be able to distinguish between the different degrees of force or pressure.
  • Display actions resulting from the detection of finger movements may include any types of action involving some change in the image displayed on the display.
  • a display action may include moving of one or more objects (e. g. , mouse cursors or pointers, icons, images, user-interface components such as windows, buttons, text, etc. ) displayed on the display.
  • the action may include moving (or scrolling, swiping, etc. ) the object or the object’s contents to another position on the display, or rotating, resizing, or otherwise manipulating the object and/or its contents.
  • a translation or conversion ratio or function between the finger movements and the display movements may be used.
  • a finger movement of a certain length may be translated into a display movement of a different (e.g., greater or smaller) length.
  • applying the same conversion ratio or function for all finger movements may be problematic because the user may desire to have some finger movements translated into shorter display movements to achieve better precision, and to have other finger movements translated into longer display movements, for example, in order to reach a certain point on the display faster and without requiring repetitive finger movements.
  • the computing system may include, for example, a display, a touch-sensitive surface, and a detection engine to detect a movement of at least one object moving at a first rate along a first path on the touch-sensitive surface, determine a number of objects involved in the movement, convert the first rate into a second rate based at least on the number of objects, and move a display object on the display at the second rate along a second path corresponding to the first path.
  • FIG. 1 is a block diagram of an example computing system 100.
  • Computing system 100 may include, among other things, a computing device 120, a touch-sensitive surface 140 (e. g. , coupled to or included in computing device 120) , and a display 130 (e. g. , coupled to or included in computing device 120) .
  • Computing device 120 may be any type of computing device that is capable of receiving touch data from touch-sensitive surface 140, and based on the touch data performing an action on display 130, e. g. , by providing display data to display 130 based on which display 130 may perform the action.
  • Computing device 120 may be, for example, a mobile phone (e. g. , smartphone) , tablet, laptop, desktop, workstation, server, smart television, wearable computing device (e. g. , smart watch or other smart computing apparel) , retail point of sale device, display, calculator, gaming device, application-specific computing device or any other type of computing device.
  • computing device 120 may include two or more communicatively coupled computing devices.
  • Display 130 may be any type of display, screen, or monitor, such as a liquid crystal display (LCD) , light emitting diode (LED) display, organic light-emitting diode, etc.
  • display 130 may be a touch-sensitive display or a touch screen.
  • Display 130 may be included and/or communicatively coupled to computing device 120, e. g. , via a wired or wireless connection, via a network, etc.
  • Display 130 may receive display data from computing device 120 and render (e. g. , visually represent) the display data to the user via any suitable technology.
  • Touch-sensitive surface 140 may include any suitable technology for detecting physical contact with surface 140 by an object such as hand(s), finger(s), or other objects whose placement on or close to surface 140 may cause a detectable change in capacitance or other parameters of surface 140 or may otherwise be detectable.
  • touch-sensitive surface 140 may be any suitable touch-sensitive planar (or substantially planar) object, such as a touch-pad, touch-sensitive mat, tabletop, sheet, etc.
  • Touch-sensitive surface 140 may be configured to detect one or multiple touches by a user (e. g. , touches by fingers, styli, or other objects) to enable the user to interact with software being executed by computing device 120 or another computing device.
  • touch-sensitive surface may generate, record, process, and send to computing device 120 touch data including, for example, the locations and movements (e. g. , linear or non-linear movements, sliding, swiping, etc. ) of any objects touching surface 140.
  • touch data may describe the movement of each object touching and moving on surface 140 by including movement vectors, each vector including, for example, an initial point, a length and a direction corresponding to the movement of the object.
  • touch data may include single-touch information about a single object touching and moving on surface 140 and/or multi-touch movement information about multiple (e. g. , two or more) objects touching and moving on surface 140 simultaneously.
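The touch data described above can be pictured concretely. Below is a minimal sketch of a per-object movement vector with an initial point, a length, and a direction; the field names and the degree-based angle are illustrative assumptions, not details from the patent.

```python
from dataclasses import dataclass
import math

# Hypothetical representation of one touch object's movement vector,
# as the touch-sensitive surface might report it: an initial point,
# a length, and a direction of movement.
@dataclass
class MovementVector:
    start_x: float
    start_y: float
    length: float
    direction_deg: float  # direction of movement, in degrees

    def end_point(self):
        # Project the vector from its initial point to find where the
        # object ended up on the surface.
        rad = math.radians(self.direction_deg)
        return (self.start_x + self.length * math.cos(rad),
                self.start_y + self.length * math.sin(rad))

# Multi-touch data is then simply one vector per moving object, e.g. a
# two-finger swipe to the right:
swipe = [MovementVector(10.0, 20.0, 5.0, 0.0),
         MovementVector(14.0, 20.0, 5.0, 0.0)]
```

Single-touch data would be the same structure with a single vector in the list.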
  • Computing device 120 may include a detection engine 122.
  • Detection engine 122 may be implemented in the form of instructions (e.g., stored on a machine-readable storage medium) that, when executed (e.g., by a processor of computing device 120), may implement the functionality of detection engine 122.
  • the instructions may be part of an operating system (OS) , part of one or more software drivers (e. g. , a driver for touch-sensitive surface 140 and/or a driver for display 130) or part of one or more software applications running on computing device 120.
  • detection engine 122 may include electronic circuitry (i. e. , hardware) that implements the functionality described below.
  • detection engine 122 may detect a movement on touch-sensitive surface 140, for example, based on touch data received from surface 140. For example, detection engine 122 may be continuously monitoring touch data received from surface 140 to determine whether at least one finger is moving on surface 140. In some examples, engine 122 may do nothing when no such movement is detected and only perform the functionality described below when at least one finger is moving.
  • the detected movement may be a one-finger movement corresponding to a movement of a single finger on surface 140.
  • the detected movement may also be a multi-finger movement corresponding to a movement by a plurality of (e. g. , two, three, four, five, or more) fingers on surface 140. Without limitation, any such movements may be collectively referred to as “finger movements. ”
  • detection engine 122 may determine the finger movement’s path, which may or may not be linear. Alternatively or in addition, detection engine 122 may also determine the movement’s speed, length, starting point, ending point, distance between the starting point and the ending point, average vector, or any other relevant parameters. In some examples, if the movement is a multi-finger movement, to determine some of the movement’s parameters, detection engine 122 may determine an average or median value of individual fingers’ parameters. For example, detection engine 122 may determine an average vector representing the path of a multi-finger movement based on average vectors of all the individual fingers involved in the multi-finger movement.
  • detection engine 122 may consider a movement to comprise any fingers touching surface 140. In other examples, detection engine 122 may consider a movement to comprise only fingers that are moving, and to exclude any fingers that are not moving or remain substantially still. In yet other examples, detection engine 122 may consider the movement to comprise only fingers that are moving simultaneously and substantially in parallel or substantially in the same direction (e.g., within a predefined degree, such as 20°) and to exclude any other fingers.
  • detection engine 122 may perform a display action on display 130 (or cause display 130 to perform an action) .
  • the display action may include moving of one or more objects or their contents to another position on the display, or rotating, resizing, or otherwise manipulating the objects and/or their contents.
  • the display action may be performed along a particular path (hereinafter, “display path” ) .
  • engine 122 may move a display object (e. g. , a mouse cursor) along a particular display path.
  • the display path may correspond to (e. g. , follow, mimic, or copy) the path of the detected finger movement.
  • the display movement (the movement of a display object) may follow or be performed in the same direction as the finger movement.
  • any linear or non-linear finger path may be considered as consisting of a series of small linear movements.
  • engine 122 may move the display object in a direction corresponding to the direction of the movement. That is, if a finger movement is to the left, right, up, or down on surface 140, engine 122 may move the display object to the left, right, up, or down on display 130, respectively.
  • the display path may follow the finger path at the same rate or at a different rate, that is, the display movements may be performed at the same rate as the finger movements or at a different rate.
  • Rate may refer to the movement’s speed or the distance by which the finger (s) and the object are moved in a given period of time, also referred to as a step size.
  • if the fingers move by a first distance, engine 122 may move the display object in a corresponding direction by a second distance, where the second distance may be the same as, shorter than, or longer than the first distance.
  • the display rate may be a function of the finger rate, where the function may be referred to as the conversion function.
  • the conversion function may be a non-linear function or a linear function. Because the display movements may follow the finger movements, whether at the same rate or at a different rate, the resulting display path may be a replica of the finger path, the replica being either of the same size or of a different size (smaller or larger) .
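The conversion function relating a finger rate to a display rate may, as noted above, be linear or non-linear. Two illustrative possibilities are sketched below; neither function nor its constants are prescribed by the patent.

```python
# Two illustrative conversion functions from a finger rate to a display
# rate; the gain and the quadratic term are assumptions, shown only to
# make the linear/non-linear distinction concrete.
def linear_conversion(finger_rate, gain=2.0):
    # A fixed gain: the display object always moves twice as fast.
    return gain * finger_rate

def nonlinear_conversion(finger_rate):
    # Slow finger movements map almost 1:1 (precision); fast movements
    # are amplified progressively (speed).
    return finger_rate + 0.1 * finger_rate ** 2
```

With either function, the display path remains a scaled replica of the finger path, since only the rate changes, not the direction.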
  • In the example of FIG. 2A, the user moves a finger 205a along a certain path 210 across touch-sensitive surface 140, and in response to this movement detection engine 122 moves cursor 215 along a corresponding path 220a on display 130.
  • detection engine 122 may have a plurality of conversion functions, and may select the conversion function from the plurality of functions based at least on the number of fingers involved in the finger movement. For example, detection engine may use a first conversion function if the finger movement comprises only one finger, a second conversion function if the finger movement comprises two fingers, a third conversion function if the finger movement comprises three fingers, and so forth.
  • the different functions may be of different types (e. g. , linear, quadratic, exponential, etc. ) or be of the same type but have different parameters.
  • engine 122 may obtain a predefined base function f(x) and determine the conversion functions based on that base function.
  • each conversion function may be the base function multiplied by a conversion factor.
  • conversion functions associated with one finger, two fingers, and three fingers may be 1*f(x), 1.5*f(x), and 2*f(x), respectively.
  • engine 122 may determine the conversion function corresponding to a given number of fingers by selecting, from a plurality of conversion factors (e. g. , 1, 1.5, 2, etc. ) , a conversion factor associated with or corresponding to that number of fingers, and then multiplying the base function by the conversion factor.
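Selecting a conversion function by finger count, using the base-function-times-factor scheme and the factors 1, 1.5, and 2 given above, might look like the sketch below. The base function itself and the fallback for four or more fingers are assumptions.

```python
# Sketch of per-finger-count conversion: a single base function f(x) is
# scaled by a factor looked up by the number of fingers. The factors
# 1, 1.5, and 2 mirror the example above.
CONVERSION_FACTORS = {1: 1.0, 2: 1.5, 3: 2.0}

def base_function(x):
    # Hypothetical linear base function f(x).
    return 2.0 * x

def convert(finger_rate, num_fingers):
    # Fall back to the largest factor for four or more fingers.
    factor = CONVERSION_FACTORS.get(num_fingers, max(CONVERSION_FACTORS.values()))
    return factor * base_function(finger_rate)
```

The same finger rate then yields a faster display rate the more fingers are involved, e.g. `convert(4.0, 1)` gives 8.0 while `convert(4.0, 2)` gives 12.0.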
  • engine 122 may also determine the conversion function by obtaining a base function having one or more parameters (e. g. , a slope, an intercept, etc. ) and determining the parameter (s) that correspond to the number of fingers involved in the movement.
  • conversion functions may be configured such that a conversion function corresponding to more fingers (e. g. , two) will convert a given finger distance to a longer (or not shorter) display distance than a conversion function corresponding to fewer fingers (e. g. , one) .
  • the conversion factor associated with more fingers may always be greater (or not smaller) than the conversion factor associated with fewer fingers.
  • display distance may be not only a function of the finger distance, but also an increasing (or strictly increasing) function of the number of fingers.
  • the user may control the display rate not only by controlling the finger rate, but also the number of fingers the user is moving. In the example described above, the user may move one finger to achieve slowest action rates, two fingers to achieve faster rates, three fingers to achieve even faster rates, and so forth.
  • the user moves two fingers 205a and 205b along substantially the same path 210 as in the one-finger example of FIG. 2A.
  • detection engine 122 uses a different conversion function (e. g. , a greater conversion factor) to determine the display rate.
  • engine 122 moves cursor 215 along a path 220b.
  • path 220b is longer than path 220a. Therefore, assuming that the finger rate (e.g., the speed of the one-finger movement and the two-finger movement) in both examples is substantially the same, the display rate at which cursor 215 is moved along path 220b is higher than the display rate at which cursor 215 is moved along path 220a.
  • conversion functions may be configured such that a conversion function corresponding to more fingers will convert a given finger distance to a shorter (or not longer) display distance than a conversion function corresponding to fewer fingers (e. g. , one) .
  • the conversion factor associated with more fingers may always be smaller (or not greater) than the conversion factor associated with fewer fingers.
  • display distance may also be a decreasing (or strictly decreasing) function of the number of fingers. In these examples, the user may move one finger to achieve fastest action rates, two fingers to achieve slower rates, three fingers to achieve even slower rates, and so forth.
  • display distance may be neither a decreasing (nor strictly decreasing) nor an increasing (nor strictly increasing) function of the number of fingers.
  • the conversion function may always convert a finger distance or rate of zero into a display rate of zero. In other words, if no finger movement is detected or if engine 122 detects that the movement has stopped or substantially stopped, engine 122 may not move the display object or immediately stop moving the display object, irrespective of the selected conversion function or factor.
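The zero-rate invariant above can be kept with a simple guard, regardless of which conversion function is currently selected. The function names below are illustrative.

```python
# Sketch of the zero-rate invariant: whatever conversion function is
# selected, a zero finger rate always maps to a zero display rate, so
# the display object stops as soon as the fingers stop.
def display_rate(finger_rate, conversion):
    if finger_rate == 0:
        return 0.0  # stopped fingers always stop the display object
    return conversion(finger_rate)
```

Even a conversion function with a non-zero intercept, such as `lambda r: 5 + 2 * r`, is forced to zero by the guard.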
  • engine 122 may obtain predefined default conversion functions corresponding to various numbers of fingers.
  • the user may be able to define the conversion functions and/or modify the default conversion functions by modifying their parameters, for example, via a programmable user interface.
  • the conversion functions are expressed in terms of base functions multiplied by different conversion factors, the user may be able to modify the default base function and/or the conversion factors associated with each number of fingers.
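User-configurable conversion settings, as described above, might be modeled as defaults merged with per-user overrides (e.g., entered through a settings interface). The settings structure below is an assumption made for illustration; the default factors mirror the 1/1.5/2 example.

```python
# Sketch of user-configurable conversion settings: a default base-
# function slope and per-finger-count factors, merged with overrides.
DEFAULT_SETTINGS = {"base_slope": 2.0, "factors": {1: 1.0, 2: 1.5, 3: 2.0}}

def apply_user_settings(defaults, overrides):
    # Overrides win; anything not overridden keeps its default value.
    return {
        "base_slope": overrides.get("base_slope", defaults["base_slope"]),
        "factors": {**defaults["factors"], **overrides.get("factors", {})},
    }

# A user who wants three-finger movements to be even faster:
custom = apply_user_settings(DEFAULT_SETTINGS, {"factors": {3: 3.0}})
```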
  • FIG. 3 is a flowchart of a method 300, in accordance with some examples described herein.
  • Method 300 may be described below as being executed or performed by a system, for example, computing device 120 of FIG. 1. Other suitable systems and/or computing devices may be used as well.
  • Method 300 may be implemented in the form of executable instructions stored on at least one non-transitory machine-readable storage medium of the system and executed by at least one processor of the system.
  • method 300 may be implemented in the form of electronic circuitry (e. g. , hardware) .
  • one or more blocks of method 300 may be executed substantially concurrently or in a different order than shown in FIG. 3.
  • method 300 may include more or fewer blocks than are shown in FIG. 3.
  • one or more of the blocks of method 300 may, at certain times, be ongoing and/or may repeat.
  • method 300 may obtain from a touch-sensitive surface (e. g. , 140) touch data representing a movement of at least one finger by a first distance (e. g. , finger distance) in a first direction, as described above.
  • method 300 may determine the number of fingers involved in the movement, that is, how many fingers the movement comprises, as discussed above.
  • the method may determine a second distance (e. g. , display distance) based at least on the first distance and based on the determined number of fingers.
  • the second distance may be, in some examples, a strictly increasing function or a strictly decreasing function of the number of fingers.
  • the second distance may be determined based on a conversion factor (alternatively or additionally, a conversion parameter) that may be associated with the number of fingers and may, in some examples, be configurable by the user.
  • the conversion factor may be selected by the method from two or more different conversion factors (or parameters) associated with different numbers of fingers.
  • method 300 may move a cursor (or another object) on a display (e. g. , 130) by the determined second distance in a direction corresponding to the first direction (e. g. , the same direction) .
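The blocks of method 300 can be sketched end to end: obtain the first (finger) distance and direction, determine the number of fingers, derive the second (display) distance from both, and move the cursor. The factors and the unit-direction-vector representation are illustrative assumptions.

```python
# End-to-end sketch of method 300. The second distance is here a
# strictly increasing function of the finger count, one of the options
# the patent describes.
STEP_FACTORS = {1: 1.0, 2: 1.5, 3: 2.0}

def move_cursor(cursor, first_distance, direction, num_fingers):
    # Second (display) distance from the first (finger) distance and
    # the determined number of fingers.
    second_distance = STEP_FACTORS.get(num_fingers, 2.0) * first_distance
    dx, dy = direction  # unit vector of the first direction
    return (cursor[0] + second_distance * dx,
            cursor[1] + second_distance * dy)

# Two fingers moved 10 units to the right:
print(move_cursor((100.0, 50.0), 10.0, (1.0, 0.0), 2))  # (115.0, 50.0)
```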
  • FIG. 4 is a block diagram of an example computing device 400.
  • computing device 400 may be similar to computing device 120 of FIG. 1.
  • computing device 400 includes a processor 410 and a non-transitory machine-readable storage medium 420.
  • In some examples, the instructions may be distributed (e.g., stored) across multiple machine-readable storage media, and execution of the instructions may be distributed across multiple processors.
  • Processor 410 may be one or more central processing units (CPUs) , microprocessors, and/or other hardware devices suitable for retrieval and execution of instructions stored in non-transitory machine-readable storage medium 420.
  • processor 410 may fetch, decode, and execute instructions 422, 424, 426, or any other instructions (not shown for brevity) .
  • processor 410 may include one or more electronic circuits comprising a number of electronic components for performing the functionality of one or more of the instructions in machine-readable storage medium 420.
  • executable instructions and/or electronic circuits included within one box may, in alternate examples, be included in a different box shown in the figures or in a different box not shown.
  • Non-transitory machine-readable storage medium 420 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions.
  • medium 420 may be, for example, Random Access Memory (RAM) , an Electrically-Erasable Programmable Read-Only Memory (EEPROM) , a storage drive, an optical disc, and the like.
  • Medium 420 may be disposed within computing device 400, as shown in FIG. 4. In this situation, the executable instructions may be “installed” on computing device 400.
  • medium 420 may be a portable, external or remote storage medium, for example, that allows computing device 400 to download the instructions from the portable/external/remote storage medium. In this situation, the executable instructions may be part of an “installation package” .
  • medium 420 may be encoded with executable instructions for detecting finger movements and performing display actions.
  • Instructions 422 when executed by a processor (e. g. , 410) , may cause a computing device (e. g. , 400) to determine whether at least one finger is moving on a touch-sensitive surface coupled to computing device 400. If at least one finger is moving, instructions 424 may determine a number of fingers moving on the touch-sensitive surface in a first direction at a first rate, as described above.
  • instructions 426 may move (or cause computing device 400 to move) an object (e. g. , a cursor) on a display coupled to computing device 400 at a second rate and in a second direction corresponding to (e. g. , similar to) the first direction.
  • the second rate may be determined based at least on the first rate and also based on the determined number of fingers.
  • the second rate may also be determined based on a user-configurable parameter or factor, such as a parameter of a conversion function or a factor multiplying the conversion function.
  • instructions 426 may select the user-configurable parameter or factor from a plurality of different parameters associated with different numbers of fingers, as discussed above. In some examples, instructions 426 may also not move or stop moving the object on the display based on a determination that no fingers are moving on the touch-sensitive surface.
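One way instructions 422 (detect whether any finger is moving), 424 (determine the number of moving fingers and the first rate), and 426 (move the object at the second rate, or not at all) could chain together in a single update step is sketched below. The touch snapshot format and all names are assumptions made for illustration.

```python
# Sketch of instructions 422/424/426 as one update step over a list of
# touch snapshots, each with a "rate" and a unit "direction" vector.
RATE_FACTORS = {1: 1.0, 2: 1.5, 3: 2.0}

def update(cursor, touches):
    # Instructions 422: is at least one finger moving?
    moving = [t for t in touches if t["rate"] > 0]
    if not moving:
        return cursor  # no moving fingers: do not move the object
    # Instructions 424: number of fingers and (averaged) first rate.
    num_fingers = len(moving)
    first_rate = sum(t["rate"] for t in moving) / num_fingers
    # Instructions 426: second rate from the first rate and the count.
    second_rate = RATE_FACTORS.get(num_fingers, 2.0) * first_rate
    dx, dy = moving[0]["direction"]  # second direction follows the first
    return (cursor[0] + second_rate * dx, cursor[1] + second_rate * dy)
```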

Abstract

Examples disclosed herein relate, among other things, to a computing system. The computing system may include, for example, a display, a touch-sensitive surface, and a detection engine to detect a movement of at least one object moving at a first rate along a first path on the touch-sensitive surface, determine a number of objects involved in the movement, convert the first rate into a second rate based at least on the number of objects, and move a display object on the display at the second rate along a second path corresponding to the first path.

Description

MOVING AN OBJECT ON DISPLAY

BACKGROUND
Computing systems today often have a display and input devices such as a touch-sensitive surface (e.g., a touchpad or a touch mat). The systems may use the touch-sensitive surface to detect a user's finger movements and to translate those movements into corresponding movements of an object (e.g., a mouse cursor) on the display.
BRIEF DESCRIPTION OF THE DRAWINGS
The following detailed description references the drawings, wherein:
FIG. 1 is a block diagram of an example computing system;
FIG. 2A illustrates an example display and touch-sensitive surface of the computing system of FIG. 1;
FIG. 2B illustrates another example display and touch-sensitive surface of the computing system of FIG. 1;
FIG. 3 is a flowchart of an example method; and
FIG. 4 is a block diagram of another example computing device.
DETAILED DESCRIPTION
As mentioned above, a computing system may detect finger movements or movements by other objects (e. g. , a stylus) on a touch-sensitive surface, and translate or convert those movements into display actions. The finger movements may include linear or non-linear swipes, slides, or other types of movements or gestures. In some examples, the user may make the movements with different degrees of force or pressure, and the touch-sensitive surface may be able to  distinguish between the different degrees of force or pressure. Display actions resulting from the detection of finger movements may include any types of action involving some change in the image displayed on the display. For example, a display action may include moving of one or more objects (e. g. , mouse cursors or pointers, icons, images, user-interface components such as windows, buttons, text, etc. ) displayed on the display. The action may include moving (or scrolling, swiping, etc. ) the object or the object’s contents to another position on the display, or rotating, resizing, or otherwise manipulating the object and/or its contents.
Because the touch-sensitive surface may sometimes have a limited size and a significantly lower resolution than the size and resolution of the display, a translation or conversion ratio or function between the finger movements and the display movements may be used. Thus, for example, a finger movement of a certain length may be translated into a display movement of a different (e.g., greater or smaller) length. However, applying the same conversion ratio or function for all finger movements may be problematic because the user may desire to have some finger movements translated into shorter display movements to achieve better precision, and to have other finger movements translated into longer display movements, for example, in order to reach a certain point on the display faster and without requiring repetitive finger movements. Accordingly, it may be desirable to allow the user to change the conversion ratio or function dynamically and in real time, allowing the user, for example, to quickly switch between a precision mode and a speed mode.
Examples disclosed herein relate, among other things, to a computing system. The computing system may include, for example, a display, a touch-sensitive surface, and a detection engine to detect a movement of at least one object moving at a first rate along a first path on the touch-sensitive surface, determine a number of objects involved in the movement, convert the first rate into a second rate based at least on the number of objects, and move a display object on the display at the second rate along a second path corresponding to the first path.
FIG. 1 is a block diagram of an example computing system 100. Computing system 100 may include, among other things, a computing device 120, a touch-sensitive surface 140 (e. g. , coupled to or included in computing device 120) , and a display 130 (e. g. , coupled to or included in computing device 120) .
Computing device 120 may be any type of computing device that is capable of receiving touch data from touch-sensitive surface 140, and based on the touch data performing an action on display 130, e. g. , by providing display data to display 130 based on which display 130 may perform the action. Computing device 120 may be, for example, a mobile phone (e. g. , smartphone) , tablet, laptop, desktop, workstation, server, smart television, wearable computing device (e. g. , smart watch or other smart computing apparel) , retail point of sale device, display, calculator, gaming device, application-specific computing device or any other type of computing device. In some examples, computing device 120 may include two or more communicatively coupled computing devices.
Display 130 may be any type of display, screen, or monitor, such as a liquid crystal display (LCD) , light emitting diode (LED) display, organic light-emitting diode, etc. In some examples, display 130 may be a touch-sensitive display or a touch screen. Display 130 may be included and/or communicatively coupled to computing device 120, e. g. , via a wired or wireless connection, via a network, etc. Display 130 may receive display data from computing device 120 and render (e. g. , visually represent) the display data to the user via any suitable technology.
Touch-sensitive surface 140 may include any suitable technology for detecting physical contact with surface 140 by an object such as hand (s) , finger (s) , or other objects whose placement on or close to surface 140 may cause a detectible change in capacitance or other parameters of surface 140 or may otherwise be detectible. For example, touch-sensitive surface 140 may be any suitable touch-sensitive planar (or substantially planar) object, such as a touch-pad, touch-sensitive mat, tabletop, sheet, etc. Touch-sensitive surface 140 may be configured to detect one or multiple touches by a user (e. g. , touches by fingers, styli, or other objects) to  enable the user to interact with software being executed by computing device 120 or another computing device. In some examples, touch-sensitive surface may generate, record, process, and send to computing device 120 touch data including, for example, the locations and movements (e. g. , linear or non-linear movements, sliding, swiping, etc. ) of any objects touching surface 140. For example, touch data may describe the movement of each object touching and moving on surface 140 by including movement vectors, each vector including, for example, an initial point, a length and a direction corresponding to the movement of the object. As described above, touch data may include single-touch information about a single object touching and moving on surface 140 and/or multi-touch movement information about multiple (e. g. , two or more) objects touching and moving on surface 140 simultaneously. While for reasons of brevity, some examples described herein refer specifically to user’s finger (s) , it is appreciated that the functionality described herein may be applied to any other objects (e. g. , pen, stylus, etc. ) whose touch and movement on surface 140 may be detected.
Computing device 120 may include a detection engine 122. Detection engine 122 may be implemented in the form of instructions (e.g., stored on a machine-readable storage medium) that, when executed (e.g., by a processor of computing device 120), may implement the functionality of detection engine 122. In some examples, the instructions may be part of an operating system (OS), part of one or more software drivers (e.g., a driver for touch-sensitive surface 140 and/or a driver for display 130), or part of one or more software applications running on computing device 120. Alternatively or in addition, detection engine 122 may include electronic circuitry (i.e., hardware) that implements the functionality described below.
In some examples, detection engine 122 may detect a movement on touch-sensitive surface 140, for example, based on touch data received from surface 140. For example, detection engine 122 may continuously monitor touch data received from surface 140 to determine whether at least one finger is moving on surface 140. In some examples, engine 122 may do nothing when no such movement is detected and only perform the functionality described below when at least one finger is moving.
If any movement is detected, the detected movement may be a one-finger movement corresponding to a movement of a single finger on surface 140. The detected movement may also be a multi-finger movement corresponding to a movement by a plurality of (e.g., two, three, four, five, or more) fingers on surface 140. Without limitation, any such movements may be collectively referred to as "finger movements."
In addition to detecting the finger movement, detection engine 122 may determine the finger movement’s path, which may or may not be linear. Alternatively or in addition, detection engine 122 may also determine the movement’s speed, length, starting point, ending point, distance between the starting point and the ending point, average vector, or any other relevant parameters. In some examples, if the movement is a multi-finger movement, to determine some of the movement’s parameters, detection engine 122 may determine an average or median value of individual fingers’ parameters. For example, detection engine 122 may determine an average vector representing the path of a multi-finger movement based on average vectors of all the individual fingers involved in the multi-finger movement.
As described above, the type of movement and the movement parameters may depend on the number of fingers involved in the movement, that is, the number of fingers the movement comprises. In some examples, detection engine 122 may consider a movement to comprise any fingers touching surface 140. In other examples, detection engine 122 may consider a movement to comprise only fingers that are moving and to exclude any fingers that are not moving or remain substantially still. In yet other examples, detection engine 122 may consider the movement to comprise only fingers that are moving simultaneously and substantially in parallel or substantially in the same direction (e.g., within a predefined degree, such as 20°) and to exclude any other fingers.
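The third grouping rule above can be sketched as follows: count only fingers whose movement directions lie within a predefined tolerance (20° in the example) of a reference direction. This is an illustrative sketch; the function name, the choice of the first finger as reference, and the tolerance default are assumptions, not from the disclosure.

```python
def fingers_in_movement(directions_deg, tolerance_deg=20.0):
    """Count finger directions within tolerance of the first finger's direction."""
    if not directions_deg:
        return 0
    ref = directions_deg[0]

    def angular_diff(a, b):
        # Smallest angle between two directions, accounting for wrap-around.
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    return sum(1 for d in directions_deg if angular_diff(d, ref) <= tolerance_deg)

# Three fingers drag right-ish (0°, 10°, 355°); a fourth drags down (270°)
# and is excluded from the movement.
print(fingers_in_movement([0.0, 10.0, 355.0, 270.0]))  # → 3
```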
In response to the detection of the one-finger movement or the multi-finger movement, detection engine 122 may perform a display action on display 130 (or cause display 130 to perform an action). As discussed above, the display action may include moving one or more objects or their contents to another position on the display, or rotating, resizing, or otherwise manipulating the objects and/or their contents.
In some examples, the display action may be performed along a particular path (hereinafter, "display path"). For example, engine 122 may move a display object (e.g., a mouse cursor) along a particular display path. In some examples, the display path may correspond to (e.g., follow, mimic, or copy) the path of the detected finger movement. Put differently, the display movement (the movement of a display object) may follow or be performed in the same direction as the finger movement. For example, assuming that any linear or non-linear finger path may be considered as consisting of a series of small linear movements, for each such linear finger movement, engine 122 may move the display object in a direction corresponding to the direction of the movement. That is, if a finger movement is to the left, right, up, or down on surface 140, engine 122 may move the display object to the left, right, up, or down on display 130, respectively.
In some examples, the display path may follow the finger path at the same rate or at a different rate, that is, the display movements may be performed at the same rate as the finger movements or at a different rate. Rate may refer to the movement's speed or the distance by which the finger(s) and the object are moved in a given period of time, also referred to as a step size. For example, for every movement of a finger by a first distance in a particular direction, engine 122 may move the display object in a corresponding direction by a second distance, where the second distance may be the same as the first distance, shorter than the first distance, or longer than the first distance.
In some examples, the display rate may be a function of the finger rate, where the function may be referred to as the conversion function. Thus, the first distance in the above examples may be converted into the second distance using the conversion function. The conversion function may be a non-linear function or a linear function. Because the display movements may follow the finger movements, whether at the same rate or at a different rate, the resulting display path may be a replica of the finger path, the replica being either of the same size or of a different size (smaller or larger). To illustrate, in the example of FIG. 2A, the user moves a finger 205a along a certain path 210 across touch-sensitive surface 140, and in response to this movement, detection engine 122 moves cursor 215 along a corresponding path 220a on display 130.
In some examples, detection engine 122 may have a plurality of conversion functions, and may select the conversion function from the plurality of functions based at least on the number of fingers involved in the finger movement. For example, detection engine 122 may use a first conversion function if the finger movement comprises only one finger, a second conversion function if the finger movement comprises two fingers, a third conversion function if the finger movement comprises three fingers, and so forth. In some examples, the different functions may be of different types (e.g., linear, quadratic, exponential, etc.) or be of the same type but have different parameters. For example, a conversion function associated with one finger may be f(x) = 3*x + 10, and a conversion function associated with two fingers may be f(x) = 5*x + 15, where x may be the finger rate, and f(x) may be the display rate.
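Selecting a conversion function by finger count can be sketched as follows, using the two example functions given above (f(x) = 3*x + 10 for one finger, f(x) = 5*x + 15 for two). The lookup-table structure and names are illustrative assumptions.

```python
# Map from number of fingers to the example conversion functions above,
# where x is the finger rate and the result is the display rate.
CONVERSION_FUNCTIONS = {
    1: lambda x: 3 * x + 10,
    2: lambda x: 5 * x + 15,
}

def display_rate(finger_rate, num_fingers):
    """Select the conversion function for this finger count and apply it."""
    f = CONVERSION_FUNCTIONS[num_fingers]
    return f(finger_rate)

print(display_rate(2, 1))  # → 16  (3*2 + 10)
print(display_rate(2, 2))  # → 25  (5*2 + 15)
```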
In some examples, engine 122 may obtain a predefined base function f(x) and determine the conversion functions based on the base function. For example, each conversion function may be the base function multiplied by a conversion factor. For example, conversion functions associated with one finger, two fingers, and three fingers may be 1*f(x), 1.5*f(x), 2*f(x), respectively. Accordingly, in some examples, engine 122 may determine the conversion function corresponding to a given number of fingers by selecting, from a plurality of conversion factors (e.g., 1, 1.5, 2, etc.), a conversion factor associated with or corresponding to that number of fingers, and then multiplying the base function by the conversion factor.
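The base-function approach above can be sketched as follows, using the example factors 1, 1.5, and 2 for one, two, and three fingers. The identity base function and the names are illustrative assumptions.

```python
def base_function(x):
    """A simple linear base function, chosen only for illustration."""
    return x

# Per-finger-count conversion factors from the example above.
CONVERSION_FACTORS = {1: 1.0, 2: 1.5, 3: 2.0}

def convert(finger_distance, num_fingers):
    """Conversion function = conversion factor * base function."""
    factor = CONVERSION_FACTORS[num_fingers]
    return factor * base_function(finger_distance)

print(convert(10, 1))  # → 10.0
print(convert(10, 3))  # → 20.0
```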
In some examples, engine 122 may also determine the conversion function by obtaining a base function having one or more parameters (e.g., a slope, an intercept, etc.) and determining the parameter(s) that correspond to the number of fingers involved in the movement. For example, a base function may be defined as f(x) = a*x^2 + b, where parameters a and b may be determined based on the number of fingers.
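The parameterized variant above, f(x) = a*x^2 + b with a and b looked up by finger count, can be sketched as follows. The specific parameter values are made up for illustration.

```python
# Hypothetical (a, b) parameter pairs per number of fingers for
# the base function f(x) = a*x**2 + b.
PARAMS = {1: (0.5, 0.0), 2: (1.0, 2.0)}

def display_rate(finger_rate, num_fingers):
    """Apply the parameterized base function for this finger count."""
    a, b = PARAMS[num_fingers]
    return a * finger_rate ** 2 + b

print(display_rate(4, 1))  # → 8.0   (0.5*16 + 0)
print(display_rate(4, 2))  # → 18.0  (1.0*16 + 2)
```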
In some examples, conversion functions may be configured such that a conversion function corresponding to more fingers (e.g., two) will convert a given finger distance to a longer (or not shorter) display distance than a conversion function corresponding to fewer fingers (e.g., one). For example, if conversion functions are determined based on a base function and conversion factors, as described above, the conversion factor associated with more fingers may always be greater (or not smaller) than the conversion factor associated with fewer fingers. Accordingly, display distance may be not only a function of the finger distance, but also an increasing (or strictly increasing) function of the number of fingers. Thus, the user may control the display rate not only by controlling the finger rate, but also by controlling the number of fingers the user is moving. In the example described above, the user may move one finger to achieve the slowest action rates, two fingers to achieve faster rates, three fingers to achieve even faster rates, and so forth.
For instance, in the example of FIG. 2B, the user moves two fingers 205a and 205b along substantially the same path 210 as in the one-finger example of FIG. 2A. In this example, however, detection engine 122 uses a different conversion function (e.g., a greater conversion factor) to determine the display rate. Accordingly, in this example, engine 122 moves cursor 215 along a path 220b. As shown in FIGS. 2A and 2B, path 220b is longer than path 220a. Therefore, assuming that the finger rate (e.g., the speed of the one-finger movement and the two-finger movement) in both examples is substantially the same, the display rate at which cursor 215 is moved along path 220b is higher than the display rate at which cursor 215 is moved along path 220a.
In some examples, conversion functions may be configured such that a conversion function corresponding to more fingers will convert a given finger distance to a shorter (or not longer) display distance than a conversion function corresponding to fewer fingers (e.g., one). For example, if conversion functions are determined based on a base function and conversion factors, as described above, the conversion factor associated with more fingers may always be smaller (or not greater) than the conversion factor associated with fewer fingers. Accordingly, display distance may also be a decreasing (or strictly decreasing) function of the number of fingers. In these examples, the user may move one finger to achieve the fastest action rates, two fingers to achieve slower rates, three fingers to achieve even slower rates, and so forth.
In some examples, however, display distance may be neither a decreasing (or strictly decreasing) nor an increasing (or strictly increasing) function of the number of fingers. In some examples, irrespective of the selected conversion function (or factor), the conversion function may always convert a finger distance or rate of zero into a display distance or rate of zero. In other words, if no finger movement is detected or if engine 122 detects that the movement has stopped or substantially stopped, engine 122 may not move the display object or may immediately stop moving the display object, irrespective of the selected conversion function or factor.
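The zero-rate rule above can be sketched as a guard applied before any conversion function: a finger rate of zero always yields a display rate of zero, even for a function such as the earlier example f(x) = 3*x + 10, whose intercept would otherwise produce nonzero output. This is an illustrative sketch with assumed names.

```python
def display_rate(finger_rate, conversion):
    """Apply a conversion function, but force zero output for zero input."""
    if finger_rate == 0:
        return 0  # stopped fingers always mean a stopped display object
    return conversion(finger_rate)

one_finger = lambda x: 3 * x + 10
print(display_rate(0, one_finger))  # → 0, despite the +10 intercept
print(display_rate(2, one_finger))  # → 16
```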
In some examples, engine 122 may obtain predefined default conversion functions corresponding to various numbers of fingers. In some examples, the user may be able to define the conversion functions and/or modify the default conversion functions by modifying their parameters, for example, via a programmable user interface. In the examples where the conversion functions are expressed in terms of base functions multiplied by different conversion factors, the user may be able to modify the default base function and/or the conversion factors associated with each number of fingers.
FIG. 3 is a flowchart of a method 300, in accordance with some examples described herein. Method 300 may be described below as being executed or performed by a system, for example, computing device 120 of FIG. 1. Other suitable systems and/or computing devices may be used as well. Method 300 may be implemented in the form of executable instructions stored on at least one non-transitory machine-readable storage medium of the system and executed by at least one processor of the system. Alternatively or in addition, method 300 may be implemented in the form of electronic circuitry (e.g., hardware). In alternate examples of the present disclosure, one or more blocks of method 300 may be executed substantially concurrently or in a different order than shown in FIG. 3. In alternate examples of the present disclosure, method 300 may include more or fewer blocks than are shown in FIG. 3. In some examples, one or more of the blocks of method 300 may, at certain times, be ongoing and/or may repeat.
At block 305, method 300 may obtain from a touch-sensitive surface (e.g., 140) touch data representing a movement of at least one finger by a first distance (e.g., finger distance) in a first direction, as described above. At block 310, method 300 may determine the number of fingers involved in the movement, that is, how many fingers the movement comprises, as discussed above. At block 315, the method may determine a second distance (e.g., display distance) based at least on the first distance and based on the determined number of fingers. As described above, the second distance may be, in some examples, a strictly increasing function or a strictly decreasing function of the number of fingers. As further described above, the second distance may be determined based on a conversion factor (alternatively or additionally, a conversion parameter) that may be associated with the number of fingers and may, in some examples, be configurable by the user. As discussed above, the conversion factor (or parameter) may be selected by the method from two or more different conversion factors (or parameters) associated with different numbers of fingers.
At block 320, method 300 may move a cursor (or another object) on a display (e.g., 130) by the determined second distance in a direction corresponding to the first direction (e.g., the same direction).
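The flow of blocks 305 through 320 can be sketched end to end as follows. This is a minimal illustration under the assumptions above; the event shape, the factor table, and the function name are hypothetical, and the result stands in for the cursor move performed at block 320.

```python
# Hypothetical per-finger-count conversion factors (block 315).
CONVERSION_FACTORS = {1: 1.0, 2: 1.5, 3: 2.0}

def method_300(finger_distance, direction, num_fingers):
    """Blocks 305-320: convert a finger move into a display move."""
    # Block 305/310: touch data (distance, direction, finger count) is given.
    # Block 315: determine the second (display) distance from the first
    # distance and the number of fingers.
    factor = CONVERSION_FACTORS.get(num_fingers, 1.0)
    display_distance = factor * finger_distance
    # Block 320: the cursor move, in a direction corresponding to the first.
    return display_distance, direction

print(method_300(10.0, "left", 2))  # → (15.0, 'left')
```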
FIG. 4 is a block diagram of an example computing device 400. In some examples, computing device 400 may be similar to computing device 120 of FIG. 1. In the example of FIG. 4, computing device 400 includes a processor 410 and a non-transitory machine-readable storage medium 420. Although the following descriptions refer to a single processor and a single machine-readable storage medium, it is appreciated that multiple processors and multiple machine-readable storage media may be used in other examples. In such other examples, the instructions may be distributed (e.g., stored) across multiple machine-readable storage media and may be distributed across (e.g., executed by) multiple processors.
Processor 410 may be one or more central processing units (CPUs), microprocessors, and/or other hardware devices suitable for retrieval and execution of instructions stored in non-transitory machine-readable storage medium 420. In the particular example shown in FIG. 4, processor 410 may fetch, decode, and execute instructions 422, 424, 426, or any other instructions (not shown for brevity). As an alternative or in addition to retrieving and executing instructions, processor 410 may include one or more electronic circuits comprising a number of electronic components for performing the functionality of one or more of the instructions in machine-readable storage medium 420. With respect to the executable instruction representations (e.g., boxes) described and shown herein, it should be understood that part or all of the executable instructions and/or electronic circuits included within one box may, in alternate examples, be included in a different box shown in the figures or in a different box not shown.
Non-transitory machine-readable storage medium 420 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions. Thus, medium 420 may be, for example, Random Access Memory (RAM), an Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like. Medium 420 may be disposed within computing device 400, as shown in FIG. 4. In this situation, the executable instructions may be "installed" on computing device 400. Alternatively, medium 420 may be a portable, external, or remote storage medium, for example, that allows computing device 400 to download the instructions from the portable/external/remote storage medium. In this situation, the executable instructions may be part of an "installation package." As described herein, medium 420 may be encoded with executable instructions for detecting finger movements and performing display actions.
Instructions 422, when executed by a processor (e.g., 410), may cause a computing device (e.g., 400) to determine whether at least one finger is moving on a touch-sensitive surface coupled to computing device 400. If at least one finger is moving, instructions 424 may determine a number of fingers moving on the touch-sensitive surface in a first direction at a first rate, as described above.
Based on the determination, instructions 426 may move (or cause computing device 400 to move) an object (e.g., a cursor) on a display coupled to computing device 400 at a second rate and in a second direction corresponding to (e.g., similar to) the first direction. As discussed above, the second rate may be determined based at least on the first rate and also based on the determined number of fingers. The second rate may also be determined based on a user-configurable parameter or factor, such as a parameter of a conversion function or a factor multiplying the conversion function.
In some examples, instructions 426 may select the user-configurable parameter or factor from a plurality of different parameters associated with different numbers of fingers, as discussed above. In some examples, instructions 426 may also refrain from moving, or stop moving, the object on the display based on a determination that no fingers are moving on the touch-sensitive surface.

Claims (15)

  1. A method comprising:
    obtaining from a touch-sensitive surface touch data representing a movement of at least one finger by a first distance in a first direction;
    determining a number of fingers involved in the movement;
    determining a second distance based at least on the first distance and the number of fingers;
    moving an object on a display by the second distance in a direction corresponding to the first direction, wherein the object comprises at least one of a cursor and a user-interface component.
  2. The method of claim 1, wherein the second distance is at least one of i) a strictly increasing function of the number of fingers and ii) a strictly decreasing function of the number of fingers.
  3. The method of claim 1, further comprising obtaining a conversion factor associated with the number of fingers, wherein the determination of the second distance is further based on the conversion factor.
  4. The method of claim 3, wherein the conversion factor is configurable by a user.
  5. The method of claim 3, wherein obtaining the conversion factor comprises selecting one of two or more different conversion factors, each of the conversion factors being associated with a different number of fingers.
  6. A computing system comprising:
    a display;
    a touch-sensitive surface; and
    a detection engine to:
    detect a movement of at least one object moving at a first rate along a first path on the touch-sensitive surface;
    determine a number of objects involved in the movement;
    convert the first rate into a second rate based at least on the number of objects; and
    move a display object on the display at the second rate along a second path corresponding to the first path.
  7. The computing system of claim 6, wherein the first rate is converted into the second rate based on a first conversion function if the number of objects is one and based on a second conversion function if the number of objects is two.
  8. The computing system of claim 7, wherein at least one of the first conversion function and the second conversion function is configurable by a user.
  9. The computing system of claim 6, wherein the detection engine is further to:
    detect that the movement has stopped; and
    if the movement has stopped, stop moving the display object on the display.
  10. The computing system of claim 6, wherein the determination of the number of objects involved in the movement comprises a determination of a number of objects moving on the touch-sensitive surface simultaneously and in the same direction.
  11. The computing system of claim 6, wherein the first path and the second path are non-linear paths.
  12. A non-transitory machine-readable storage medium encoded with instructions executable by at least one processor of a computing device to cause the computing device to:
    determine whether at least one finger is moving on a touch-sensitive surface; and
    if at least one finger is moving on the touch-sensitive surface:
    determine a number of fingers moving in a first direction at a first rate, and
    move an object on a display at a second rate in a second direction corresponding to the first direction, wherein the second rate is determined based at least on the first rate and the number of fingers.
  13. The non-transitory machine-readable storage medium of claim 12, wherein the instructions are further to cause the computing device to:
    if no fingers are moving on the touch-sensitive surface, stop moving the object on the display.
  14. The non-transitory machine-readable storage medium of claim 12, wherein the determination of the second rate is further based on a user-configurable parameter.
  15. The non-transitory machine-readable storage medium of claim 14, wherein the instructions are further to cause the computing device to:
    select the user-configurable parameter from a plurality of different parameters, each of the different parameters being associated with a different number of fingers.
PCT/CN2014/087047 2014-09-22 2014-09-22 Moving an object on display WO2016044968A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/087047 WO2016044968A1 (en) 2014-09-22 2014-09-22 Moving an object on display


Publications (1)

Publication Number Publication Date
WO2016044968A1 true WO2016044968A1 (en) 2016-03-31

Family

ID=55580018

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/087047 WO2016044968A1 (en) 2014-09-22 2014-09-22 Moving an object on display

Country Status (1)

Country Link
WO (1) WO2016044968A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727286A (en) * 2008-10-28 2010-06-09 索尼株式会社 Information processing apparatus, information processing method and program
US20110285649A1 (en) * 2010-05-24 2011-11-24 Aisin Aw Co., Ltd. Information display device, method, and program



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14902803

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14902803

Country of ref document: EP

Kind code of ref document: A1