GB2466077A - Emulator for multiple computing device inputs - Google Patents

Emulator for multiple computing device inputs

Info

Publication number
GB2466077A
Authority
GB
United Kingdom
Prior art keywords
pointing
signal
emulated
pointing signal
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0822845A
Other versions
GB0822845D0 (en)
Inventor
Krzyzstof Choma
Anand Reddy
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Oyj
Original Assignee
Nokia Oyj
Symbian Software Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Oyj, Symbian Software Ltd filed Critical Nokia Oyj
Priority to GB0822845A
Publication of GB0822845D0
Priority to PCT/IB2009/055573
Publication of GB2466077A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038 Indexing scheme relating to G06F3/038
    • G06F2203/0381 Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Abstract

The computing device is capable of receiving at least one pointing signal from each of one or more pointing devices, such as mice, joysticks, touch-screens or touch-pads. Each pointing signal of the or each pointing device contains data relating to at least part of an instruction from a user of the or each pointing device to the computing device. The emulator comprises an input arranged to receive concurrently a plurality of pointing signals relating to at least one instruction. The emulator generates a single pointing command to imitate the action of a single input device, combining the received pointing signals according to a predefined set of emulation rules. The emulated pointing signal is arranged to imitate a single cursor from a single pointing device, and the computing device may be a PDA, smartphone, computer or laptop.

Description

SINGLE POINTER EMULATION
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a computing device and in particular, a method of operating a computing device to receive multiple pointing signals from one or more pointing devices and generate a corresponding single emulated pointing signal.
BACKGROUND TO THE INVENTION
Computing devices are frequently operated by a human user through an input device, such as, for example, a mouse, a keyboard or a joystick. A pointing device is a type of input device.
Generally, a pointing device is a human interface device that allows a user to input continuous and multi-dimensional spatial data to a computing device. Accordingly, the pointing device category includes the following input devices: a mouse, a joystick, a touch-screen, a touch-pad, and any further data input devices that may be developed in the future. A pointing device is typically used with a computing device having a display screen or monitor.
In the example of a mouse, a position indicator, such as a cursor, is typically presented on the screen by the computing device and the cursor's position on the screen is controlled by a pointing signal received from the mouse. A user of the computing device can move the mouse in order to move the position indicator around the screen. It is not necessary for a visible cursor to be present on a display screen in order for the computing device to be able to detect movement of a pointing device. Many uses of computing devices do not involve displaying a cursor to a user, such as where a stylus is used to select a part of a touchscreen, or where a rollable ball or other joystick-type navigation button is present.
It is also known for pointing devices to further comprise selection means, such as, for example, one or more buttons in the case of a mouse. A pointing device provides a digital interpretation of a user's gestures to the computing device wherein such gestures can include movements and selections made by the user. Accordingly, a pointing device provides the user with a means of controlling the computing device by identifying a position on the screen and selecting various options presented on different parts of the screen, as is well known in the art.
For the purposes of this specification the pointing device as seen by a user, i.e. the physical mouse or touchpad connected to the computing device that is held or touched by a user, is referred to as the 'physical pointer'. On the other hand, the pointing device as seen by the computing device, i.e. the pointing device as defined by the pointing signal sent to the computing device, is referred to as the 'logical pointer'.
Historically, computing devices have typically been designed to operate with a single pointing signal from a single pointing device. Also, software for such computing devices has been designed to operate in response to a single pointing signal from a single pointing device.
For example, known on-screen user interface frameworks, such as, for example, Macromedia® Flash® and Symbian®, assume the existence of a single pointing device providing a single pointing signal. Additionally, software applications that use such user interface frameworks, such as, for example, Java® applications, Flash® applications and Microsoft® Windows® applications, also assume the existence of a single pointing device providing a single pointing signal. Such software applications or frameworks are hereinafter referred to as 'single pointer applications or frameworks'.
Many known computing devices are capable of recognising different types of pointing device and such computing devices are often provided with multiple input ports for connecting to, and receiving data from, multiple pointing devices. Therefore, multiple physical pointers may be arranged to provide multiple pointing signals to the computing device; however, the single pointer applications and frameworks of the computing device expect to see only a single logical pointer.
Advances in technology have seen the development and growth of pointing devices which are capable of recognising more than one input source and are further capable of providing a pointing signal containing one pointer signal for each input source. For example, the multi-touch® touch-screen of the Apple® iPhone® is capable of recognising more than one finger touching the screen and is further capable of providing a pointing signal comprising one pointer signal for each finger. Such pointing devices are hereinafter referred to as 'multipointer devices', and a pointing signal from a multipointer device is hereinafter referred to as a 'multipointer signal'. A pointing device capable of only providing one pointing signal is hereinafter referred to as a 'single-pointer device'.
In addition to the above, a known computing device may be arranged to receive multiple individual pointing signals if the computing device is connected to multiple single-pointer devices. For example, the computing device may be connected to multiple mice, joysticks, touch-screens or touch-pads or, any combination thereof.
It is known for problems to occur when a single pointer application or framework is operated by a multipointer device (or multiple single-pointer devices). More specifically, the arbitrary delivery of a multipointer signal from a multipointer device (or the pointing signals from multiple single-pointer devices) to a single pointer application or framework can cause the logic of the single pointer application or framework to malfunction or not function at all.
These problems can manifest themselves in a number of different ways. For example, if a touch screen having a multi-touch capability is connected to a computing device, a single pointer application or framework running on the computing device may only recognise the pointing signal relating to the first input source to send data, i.e. the first finger to touch the screen. This operation can prove particularly problematic when the user's focus has changed to a second finger and the second finger now provides the primary instruction instead of the first finger.
In a specific prior art example, if multiple mice are connected to a computing device running Microsoft® Windows XP®, the multiple physical pointers will be combined into a single logical pointer that controls the position of a single cursor. According to this operation, the single cursor can be moved around the screen with any of the physical pointers. This operation can be acceptable in some instances; however, it is often the case that having multiple pointing devices 'fighting' for control of a cursor is undesirable. In particular, this method of control can be frustrating to a user who is operating one mouse and accidentally moves another, which causes the computing device to perform an unintended operation. This problem leads to a worse user experience, since the device might behave unpredictably.
It is also known that some individual single-pointer software applications for the Apple® iPhone® are programmed to handle a multipointer signal. However, single-pointer software applications for the Apple® iPhone® are not consistent in their approach and therefore, a user cannot be certain how their gestures will be interpreted across different single-pointer software applications or frameworks. This problem also leads to a worse user experience.
Additionally, if it is left for individual single-pointer software applications and frameworks to choose how they will handle multipointer signals, then software developers of each single-pointer application or framework must consider how that application or framework will handle a multipointer signal. The development of single pointer applications and frameworks is therefore made more complicated.
Furthermore, the problems associated with running single pointer software on multipointer devices can be significant, for the following reasons. In many instances, multipointer devices are developed to replace older single pointer devices. It is conceivable that a single pointer device could become a multipointer device on receipt of a firmware update. In view of this, new multipointer devices are required to operate effectively with legacy single pointer software applications and frameworks. Moreover, it can be a significant barrier to entry into the market for a new multipointer device if it does not operate with legacy single pointer software. Likewise, it would be onerous for application developers to have to re-write all applications to enable them to perform on both multipointer and single pointer devices.
Additionally, it is possible that a user of a device having a multipointer capability may wish to operate the device with a single pointer software application or framework, even though a multipointer version is available, perhaps because he is not aware that another version of the application exists.
In view of the above, it is important that multipointer devices are capable of operating effectively with single pointer applications and frameworks. Moreover, it is a significant problem if multipointer devices do not operate with single pointer software applications or frameworks.
SUMMARY OF THE INVENTION
It is an object of the present invention to address the above-identified problems with the prior art by providing a method of generating an emulated pointing signal within a computing device, the computing device being capable of receiving at least one pointing signal from each of one or more pointing devices, each pointing signal containing data relating to at least part of an instruction from a user of the or each pointing device to the computing device, the method comprising: a. receiving concurrently a plurality of pointing signals; and, b. generating an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules, the emulated pointing signal imitating a single pointing signal from a single pointing device.
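The two claimed steps, receiving concurrent pointing signals and generating one emulated signal according to predefined rules, can be sketched as follows. This is an illustrative sketch only: the names `PointingSignal` and `emulate_single_pointer`, and the rule shown (prefer a pressed pointer, breaking ties by lowest pointer id), are assumptions standing in for the unspecified set of emulation rules.

```python
from dataclasses import dataclass

@dataclass
class PointingSignal:
    pointer_id: int   # which finger or physical device produced the signal
    x: int            # screen coordinates reported by the pointer
    y: int
    pressed: bool     # True while the pointer is "down" (contacting)

def emulate_single_pointer(signals):
    """Collapse a plurality of concurrently received pointing signals
    into one emulated signal. Rule (illustrative): prefer pointers that
    are pressed; among candidates, take the lowest pointer id."""
    if not signals:
        return None
    pressed = [s for s in signals if s.pressed]
    candidates = pressed or signals
    return min(candidates, key=lambda s: s.pointer_id)
```

The emulated signal is itself a `PointingSignal`, so downstream single pointer software sees exactly the shape of data it expects from a single pointing device.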
It is an advantage of the present invention that single pointer software running on the computing device can ignore multipointer signals which may cause the computing device to malfunction and instead choose to receive the emulated pointer signal. Further, it is an advantage that the emulated pointer signal corresponds to the received pointing signals such that all essential information included in the received pointing signals is also included in the emulated pointing signal. Accordingly, it is an advantage that the emulated pointer signal may be used as an approximation of, and an equivalent to, the plurality of received pointing signals.
Preferably, the received pointing signals relate to at least one instruction and the emulated pointing signal relates to the same at least one instruction. It is an advantage of this embodiment that the emulated pointing signal provides the same essential information as the plurality of pointing instructions so that the emulated pointing signal can be used to replace the plurality of pointing instructions effectively.
Preferably, one of the received pointing signals is selected to provide the emulated pointing signal and selection is performed in dependence on the received pointing signals and according to the predefined set of emulation rules. It is an advantage of this embodiment that the received pointing signal which is most suitable for providing the emulated pointing signal is chosen to provide the emulated pointing signal. It is also an advantage that selection is performed actively and is based on the content of the received pointing signals and the emulation rules. This is preferable to making an incidental selection based on the physical or logical construction of either the computing device or the pointing device.
Preferably, the received pointing signal selected to provide the emulated pointing signal is changed in dependence on the received pointing signals and according to the predefined set of emulation rules while concurrent receipt of pointing signals continues. It is an advantage of this embodiment that the received pointer signal which provides the emulated pointer signal is switched when it becomes less suitable for providing the emulated pointing signal than a different received pointing signal. This dynamic operation ensures that the emulated pointer signal always provides the most relevant information from the plurality of received pointing signals. This operation ensures that the emulated pointing signal provides a good approximation of, and equivalent to, the plurality of received pointing signals.
Preferably, the emulated pointing signal relates to the data of at least two of the received pointing signals. It is additionally preferable that the data of each pointing signal comprises data relating to gestures of the user and the emulated pointing signal relates to gestures of at least two of the received pointing signals. It is an advantage of these embodiments that when a number of received pointing signals concurrently contain information which is relevant, all the relevant information is combined into the emulated pointing signal. This operation ensures that the emulated pointing signal provides a good approximation of, and equivalent to, the plurality of received pointing signals.
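A minimal sketch of combining the data of two received pointing signals into one emulated position, assuming a centroid rule; a real rule set might instead track pinch distance or favour the first finger down:

```python
def combine_gestures(p1, p2):
    """Merge two concurrent pointer positions into a single emulated
    position by taking their centroid (illustrative rule only)."""
    (x1, y1), (x2, y2) = p1, p2
    return ((x1 + x2) / 2, (y1 + y2) / 2)
```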
Preferably, the emulated pointing signal is generated for provision to software of the computing device, said software being operable only with a single pointing signal from a single pointing device. It is an advantage of this embodiment that single pointer software which would not be able to operate effectively using multiple pointing signals is enabled to operate effectively by using the emulated pointer signal.
Preferably, the computing device receives the plurality of pointing signals from a single touch-screen or touch-pad having a multi-touch capability. Alternatively, the computing device may receive a single pointing signal from each of a plurality of pointing devices including at least one of the following pointing devices: a mouse, a joystick and a touch-screen or touch-pad. It is an advantage of these embodiments that the present invention is capable of operating with a variety of different types of pointing device and combinations of computing device and pointing device(s).
Preferably an operating system of the computing device generates the emulated pointing signal. It is an advantage of this embodiment that the same emulated pointing signal can be accessed easily by all software components of the computing device to ensure that the same method of emulation is used by all software components. The arrangement of this embodiment enables the emulated pointing signal to act as a global variable which is accessible to all software components of the computing device.
A second aspect of the present invention provides an emulator for a computing device, the computing device being capable of receiving at least one pointing signal from each of one or more pointing devices, each pointing signal containing data relating to at least part of an instruction from a user of the or each pointing device to the computing device, wherein the emulator comprises an input arranged to receive concurrently a plurality of pointing signals relating to at least one instruction, and the emulator further comprises an emulated signal generator arranged to generate an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules, the emulated pointing signal imitating a single pointing signal from a single pointing device.
A third aspect of the present invention provides a computing device having an emulator according to the second aspect.
A fourth aspect of the present invention provides a computer program or suite of computer programs so arranged such that when executed by a computer it/they cause the computer to operate in accordance with the first aspect.
A fifth aspect of the present invention provides a computer readable storage medium storing a computer program or at least one of the suite of computer programs according to the fourth aspect.
Within the second, third, fourth and fifth aspects of the present invention the further features and advantages as described in relation to the first aspect may also be obtained.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the present invention will now be described by way of example only and by reference to the accompanying drawings, in which:
Figure 1 is a representation of a smartphone computing device;
Figure 2 is a schematic of some of the internal elements of the smartphone of Figure 1;
Figure 3 is another schematic of some of the internal elements of the smartphone of Figure 1;
Figure 4 is a state diagram of a pointing device of the smartphone of Figure 1;
Figure 5 is a schematic of some of the internal elements of the smartphone of Figure 1 when arranged according to a preferred embodiment of the present invention;
Figure 6 is a flow diagram defining a set of emulation rules for providing an emulated pointing signal according to a preferred embodiment of the present invention; and
Figures 7 to 12 are flow diagrams defining sets of emulation rules for providing an emulated pointing signal according to alternative embodiments of the present invention.
DESCRIPTION OF THE EMBODIMENTS
Before a preferred embodiment of the present invention is described in detail, a known smartphone and its operation is described with reference to Figures 1 to 3. The smartphone may provide the operating environment for the embodiments of the invention.
Figure 1 represents a known smartphone 2 which comprises a keypad 4, a touch-screen 6, a microphone 8, a speaker 10 and an antenna 12. The touch-screen 6 provides a pointing device of the smartphone 2. The smartphone 2 is capable of being operated by a user to perform a variety of different functions, such as, for example, hosting a telephone call, browsing the internet or sending an email.
Figure 2 shows a schematic view of some of the internal hardware elements of the known smartphone 2. With reference to Figure 2, the smartphone 2 comprises hardware to perform telephony functions, together with an application processor and corresponding support hardware to enable the phone to have other functions which are desired by a smartphone, such as messaging, internet browsing, email functions and the like. In Figure 2, the telephony hardware is represented by the RF processor 102 which provides an RF signal to the antenna 12 for the transmission of telephony signals, and the receipt therefrom.
Additionally provided is baseband processor 104, which provides signals to and receives signals from the RF Processor 102. The baseband processor 104 also interacts with a subscriber identity module 106, as is well known in the art. The telephony subsystem of the smartphone 2 is beyond the scope of the present invention.
The keypad 4 and the touch-screen 6 are controlled by an application processor 108. A power and audio controller 109 is provided to supply power from a battery (not shown) to the telephony subsystem, the application processor 108, and the other hardware. Additionally, the power and audio controller 109 also controls input from the microphone 8, and audio output via the speaker 10.
In order for the application processor 108 to operate, various different types of memory are provided. Firstly, the smartphone 2 includes Random Access Memory (RAM) 112 connected to the application processor 108 into which data and program code can be written and read from at will. Code placed anywhere in RAM 112 can be executed by the application processor 108 from the RAM 112. RAM 112 represents a volatile memory of the smartphone 2.
Secondly, the smartphone 2 is provided with a long-term storage 114 connected to the application processor 108. The long-term storage 114 comprises three partitions, an operating system (OS) partition 116, a system partition 118 and a user partition 120. The long-term storage 114 represents a non-volatile memory of the smartphone 2.
In the present example, the OS partition 116 contains the firmware of the computing device which includes an operating system. An operating system is necessary in order for the application processor 108 to operate and therefore, the operating system must be started as soon as the smartphone 2 is first switched on. Generally speaking it is the role of the operating system to manage the hardware and software resources of the computing device. These resources include such things as the application processor 108, the RAM 112, and the long-term storage 114. As such, the operating system provides a stable, consistent way for software applications running on the smartphone 2 to deal with the hardware resources of the smartphone 2 without the application needing to know all the details of the physical resources available to the hardware. In the case of smartphones, a well known operating system is that produced by the present applicant, known as Symbian® OS.
Other computer programs may also be stored on the long-term storage 114, such as application programs, and the like. In particular, application programs which are mandatory to the device, such as, in the case of a smartphone, communications applications and the like are typically stored in the system partition 118. The application programs stored on the system partition 118 would typically be those which are bundled with the smartphone by the device manufacturer when the phone is first sold. Application programs which are added to the smartphone by the user would usually be stored in the user partition 120.
As stated, the representation of Figure 2 is schematic. In practice, the various functional components illustrated may be combined into one and the same component. For example, the long-term storage 114 may comprise NAND flash, NOR flash, a hard disk drive or a combination of these.
Figure 3 shows another schematic diagram of the smartphone 2 which includes some of the hardware components mentioned above with respect to Figure 2. More particularly, Figure 3 shows an operating system (OS) 122 in communication with the application processor 108.
Figure 3 also shows the touch-screen 6 and the various memory elements of the smartphone 2, as described above. As mentioned previously, the OS 122 is stored on the OS partition 116 and controls the operation of the application processor 108. In particular, the OS 122 controls the application processor 108 to provision access of software stored on the memory elements 112 and 114 to the input devices, such as the touch-screen 6.
During operation, it is often necessary for the software of the smartphone 2 to receive commands or instructions from a user (not shown) of the smartphone 2. In some circumstances such instructions can be specifically requested by the software from the user, other times, such instructions are issued by the user without being specifically requested. In any given situation, the user may provide instructions to the software using one or more of the input devices of the smartphone 2, such as, the keypad 4, the touch-screen 6, or the microphone 8. However, if a pointing device input is required by the software then the user must use the touch-screen 6 as this is the only pointing device of the smartphone 2.
When the user controls the touch-screen 6 to provide an instruction, the application processor 108 (controlled by the OS 122) receives the pointing signal relating to the user's instruction from the touch-screen 6 for provision to the relevant software at the appropriate place in memory. For example, if the pointing signal has been provided for a software application stored on the user partition 120, the application processor 108 receives the pointing signal from the touch-screen 6 for provision to the user partition 120. It is also the case that the software of the smartphone 2 can issue instructions to the touch-screen 6 via the application processor 108. For example, such instructions may be to show a particular window or dialogue box on the touch-screen 6, or to highlight a selection in response to a previously received gesture from the user.
The touch-screen 6 has a multi-touch capability and therefore can recognise more than one finger (or other pointing instrument) concurrently contacting the touch-screen 6. Accordingly, the touch-screen 6 is defined as a multipointer device. More specifically, the touch-screen 6 provides a single physical pointer (the physical touch-screen); however, it is capable of providing multiple logical pointers (i.e. one for each finger concurrently contacting the touch-screen). In view of this functionality, the touch-screen 6 is preferably capable of providing a multipointer signal comprising one pointer signal for each finger concurrently contacting the screen. In addition to the above, the touch-screen 6 is provided with a proximity capability which enables the touch-screen 6 to recognise not only a finger contacting its screen but also a finger hovering within a predefined distance of the screen. Accordingly, the touch-screen 6 is capable of providing a multipointer signal comprising multiple pointer signals wherein each pointer signal corresponds to a different finger concurrently contacting, or in close proximity to, the screen.
The touch-screen 6 can also preferably identify the pressure with which a finger (or other pointing instrument) contacts the screen. Therefore, for each finger contacting the screen, the touch-screen 6 preferably provides a pressure rating in the pointing signal which corresponds to the pressure that the finger applies to the screen.
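The per-finger data described above (position, contact versus proximity, and a pressure rating) can be sketched as a simple record type. The names `PointerSample` and `MultipointerSignal` are hypothetical; the patent does not prescribe a wire format:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PointerSample:
    finger_id: int    # one entry per finger detected concurrently
    x: int
    y: int
    touching: bool    # False when the finger is merely hovering in range
    pressure: float   # 0.0 when hovering; contact pressure otherwise

# A multipointer signal is then simply a list of per-finger samples.
MultipointerSignal = List[PointerSample]
```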
Figure 4 provides a state model of the touch-screen 6 which defines its operation in relation to gestures of a single finger (or other pointing instrument). Therefore, as the touch-screen 6 has multi-touch capabilities, the operation of the touch-screen 6 in relation to each different controlling finger is defined by its own version of the state model. In Figure 4, the following four states are defined: an initial state 200, an OutOfRange (OOR) state 202, an UP state 204 and a DOWN state 206. Connecting each state to other states are labelled arrows, each of which represents an event. In summary, the touch-screen 6 can be in any one of the defined states and changes state when an event defined by the arrow labels is performed.
More specifically, the operation of the touch-screen 6 with reference to Figure 4 is defined as follows.
The touch-screen starts in the initial state 200 and changes to the OOR state 202 when the touch-screen 6 is first operated, for example, when the smartphone 2 is turned on. The OOR state 202 defines that a finger is not detected by the touch-screen 6. If a finger moves into detectable range of the touch-screen 6, this represents a MOVE event and causes a state change from the OOR state 202 to the UP state 204. Each MOVE event is included in the pointing signal corresponding to the detected finger together with locations corresponding to the physical locations of the detected finger in relation to the screen. Importantly, the UP state 204 defines that a finger is in detectable range but is not contacting the touch-screen 6.
If the detected finger contacts the screen, this represents a DOWN event and causes a state change from the UP state 204 to the DOWN state 206. Each DOWN event is included in the pointing signal corresponding to the detected finger together with a location corresponding to the initial contact point between the finger and the screen. If the finger changes position while it is in the down state (i.e. while it is still touching the screen) this represents a DRAG event.
Each DRAG event is included in the pointing signal corresponding to the detected finger together with locations corresponding to the physical locations of the detected finger on the screen. If the finger moves out of range while still in the DOWN state 206, this represents an OOR event and causes a state change from the DOWN state 206 to the OOR state 202. Each OOR event is included in the pointing signal corresponding to the finger together with the physical location where the finger was last detected in the DOWN state 206. Alternatively, if the finger is moved so that it no longer contacts the screen but is still within detectable range, this represents an UP event and the state changes from the DOWN state 206 back to the UP state 204. Each UP event is included in the pointing signal corresponding to the detected finger together with the physical location on the screen where the finger lost contact. While in the UP state 204, if the finger changes location with respect to the touch-screen 6, MOVE events are issued as appropriate. From the UP state 204, if the finger moves out of detectable range of the touch-screen 6, this represents an OOR event and causes a state change from the UP state 204 to the OOR state 202. Each OOR event is included in the pointing signal corresponding to the finger together with the physical location where the finger was last detected.
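The per-finger state model described above can be expressed as a transition table. This is a sketch of the Figure 4 states and events as named in the text; the `"START"` event for leaving the initial state is an assumed label, since the description only says the change occurs when the touch-screen is first operated:

```python
# States as named in the description of Figure 4.
INITIAL, OOR, UP, DOWN = "INITIAL", "OOR", "UP", "DOWN"

TRANSITIONS = {
    (INITIAL, "START"): OOR,  # touch-screen first operated
    (OOR, "MOVE"): UP,        # finger enters detectable range
    (UP, "MOVE"): UP,         # finger moves while hovering
    (UP, "DOWN"): DOWN,       # finger contacts the screen
    (DOWN, "DRAG"): DOWN,     # finger moves while still touching
    (DOWN, "UP"): UP,         # finger lifts but stays in range
    (DOWN, "OOR"): OOR,       # finger leaves range from DOWN
    (UP, "OOR"): OOR,         # finger leaves range from UP
}

def next_state(state, event):
    """Return the successor state, or raise if the event is not
    defined for the current state."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"event {event!r} not valid in state {state!r}")
```

One instance of this table would be maintained per detected finger, matching the per-finger state models the OS keeps up to date.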
According to the above described operation of the smartphone 2 and the touch-screen 6, the OS 122 (via the application processor 108) maintains an up-to-date state model for each finger detected by the touch-screen 6. Each time a multipointer signal is received from the touch-screen 6 by the smartphone 2, the state model(s) relating to the multipointer signal are updated appropriately.
As mentioned previously, if a multipointer signal from the touch-screen 6 is provided to software of the smartphone 2 which is only designed to operate with a single pointing signal (i.e. a signal from a single-pointer device) then the logic of the software may cause the smartphone 2 to malfunction or fail to function at all. For example, such software may assume that each pointer signal of the multipointer signal originates from the same logical pointer, or alternatively, such software may only detect instructions from the logical pointer corresponding to the first pointing signal received. Software (including applications and frameworks) which is only designed to operate with a single pointing signal, and not a multipointer signal, is hereinafter referred to as 'single pointer software'.
Thus far, the above description of the smartphone 2 and its operation is known. Next is described the additions provided by the preferred embodiment of the present invention to address the problems noted earlier. Figure 5 corresponds to previous Figure 3 and shows the additions provided by the present invention.
The difference between Figure 5 and Figure 3 is the addition of an emulator 124 which is connected in-between the OS 122 and the various memory elements 112 to 120. In operation, the emulator 124 receives data from the OS 122 and provides data to the various memory elements 112 to 120. More specifically, an input of the emulator 124 receives data from the OS 122 in the form of a multipointer signal which the OS 122 receives via the application processor 108 from the touch-screen 6. An emulated signal generator of the emulator 124 is then capable of generating a corresponding single pointer signal (hereinafter called an 'emulated pointing signal') in dependence on the received multipointer signal. The emulated pointing signal aims to provide instructions which correspond to the instructions provided by the received multipointer signal. The emulated pointing signal can then be accessed from the emulator 124 by software running on any memory element of the smartphone 2. It is envisioned that the emulated pointing signal would be used by single pointer software of the smartphone 2. In such cases, the single pointer software would disregard multipointer signals provided by the application processor 108 and instead only concern itself with the emulated pointing signal provided by the emulator 124.
In view of the fact that a multipointer signal can comprise multiple pointing signals and the emulated pointing signal must comprise only a single pointing signal, the multipointer signal is a richer information bearer than the emulated pointing signal. Accordingly, the information carried by the multipointer signal must be interpreted by the emulator 124 so that the corresponding emulated pointing signal contains at least the essential information of the multipointer signal. Some of the information carried by the multipointer signal will not be represented in the emulated pointing signal.
Assessing which information of the multipointer signal is essential and should be incorporated into the emulated pointing signal is one of the functions of the emulator 124. To perform this function the emulator 124 interprets the multipointer signal according to a set of predefined rules called 'gesture interpretations'. More specifically, the gesture interpretations define how to interpret the user's gestures as defined by the multipointer signal in order to establish which pointing signal or pointing signals of the multipointer signal represent the user's intended instruction. Accordingly, the gesture interpretations define how to interpret the pointing signals of the multipointer signal to help identify which one to use as the emulated pointing signal.
According to the preferred embodiment, the emulator 124 applies the following gesture interpretations. The first gesture interpretation relates to 'touching' events, i.e. when a user moves a finger from the UP state 204 to the DOWN state 206, and the second gesture interpretation relates to 'lifting' events, i.e. when a user moves a finger from the DOWN state 206 to the UP state 204.
Touching: Whenever the user touches the screen with a new finger, the user's focus is moved to this finger. The action performed with the previous finger should be cancelled. The action performed with the new finger should be performed.
Lifting: If the user lifts the focused finger and there are other fingers touching the screen, the current action should be moved to one of the other touching fingers.
According to the above gesture interpretations the emulator 124 is able to receive a multipointer signal comprising multiple pointing signals and interpret that multipointer signal to identify which pointing signal to use as the emulated pointing signal.
However, the gesture interpretations are not in a state in which they can be directly applied to a multipointer signal in order to generate the emulated pointing signal. Therefore, a second set of predefined rules, called 'emulation rules', is created based on the gesture interpretations and it is the emulation rules which are applied directly to a multipointer signal in order to generate the emulated pointing signal.
Figure 6 represents the emulation rules of the preferred embodiment in a flow diagram.
Before describing the emulation rules with reference to Figure 6 it is important to note the following operation of the emulator 124. When a multipointer signal is provided by the touch-screen 6, one of the pointing signals of the multipointer signal is designated as the 'primary' pointing signal. The primary pointing signal provides the emulated pointing signal.
All events from pointing signals of the multipointer signal, other than the primary pointing signal, are not included in the emulated pointing signal. The only exception to this is for some transition events from non-primary pointing signals which are included in the emulated pointing signal when the primary pointing signal is changed. The flow diagram of Figure 6 defines this operation in more detail.
The flow diagram of Figure 6 begins at step 302 wherein the emulator 124 identifies if a primary pointing signal of the multipointer signal is assigned. If a primary pointing signal is not assigned then processing flows to step 304; alternatively, if a primary pointing signal is assigned then processing flows to step 308, which will be discussed later. At step 304, the emulator 124 determines if a multipointer signal comprising at least one pointing signal is present. Practically, in this step the emulator 124 determines if one or more fingers are detected by the touch-screen 6. If a multipointer signal comprising at least one pointing signal is detected (i.e. at least one finger is detected) then processing flows to step 306 wherein the first pointing signal of the multipointer signal detected is designated the primary pointing signal. Alternatively, if a multipointer signal comprising at least one pointing signal is not detected at step 304 (i.e. no fingers are detected by the touch-screen 6) then processing waits at step 304 until one is detected, after which processing flows to step 306. In either case, once a pointing signal of the multipointer signal has been designated primary at step 306, processing flows to step 308.
According to the above, if processing reaches step 308 then a pointing signal of the multipointer signal is designated as the primary pointing signal. Accordingly, the emulated pointing signal will be set equal to the primary pointing signal. Whether or not the pointing signal designated as primary is maintained as the primary pointing signal will depend on the behaviour of the finger relating to the primary pointing signal and the behaviour of any other fingers which are detected by the touch-screen 6. More specifically, at step 308 the emulator 124 detects if any finger moves into the DOWN state, other than the finger corresponding to the primary pointing signal. In order to perform this operation, the emulator 124 detects if any pointing signal issues a DOWN event, other than the primary pointing signal. If the emulator 124 does not detect a DOWN event from a non-primary pointing signal at step 308, processing flows to step 314, which will be discussed later. Alternatively, if the emulator 124 does detect a DOWN event from a non-primary pointing signal at step 308 then processing flows to step 310. At step 310 the emulator detects the current state of the finger represented by the primary pointing signal. If the finger represented by the primary pointing signal is in the DOWN state then processing flows from step 310 back to step 308. Alternatively, if the finger represented by the primary pointing signal is in any state other than the DOWN state, processing flows from step 310 to step 312. At step 312, the emulator 124 demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at step 308 to primary status. In addition to changing which pointing signal is designated primary, the emulator 124 also includes in the emulated signal the DOWN event that was issued by the new primary pointing signal just before it became the primary pointing signal.
Thereafter, the emulated pointing signal matches the new primary pointing signal and processing flows from step 312 back to step 308.
As mentioned briefly above, if the emulator 124 does not detect a DOWN event from a non-primary pointing signal at step 308, processing flows from step 308 to step 314. At step 314, the emulator 124 detects if the finger represented by the primary pointing signal moves to an OOR state based on whether an OOR event is issued by the primary pointing signal. If the emulator 124 does not detect an OOR event in the primary pointing signal, processing flows back to step 308. Alternatively, if the emulator 124 detects an OOR event in the primary pointing signal, processing flows to step 316. At step 316, the emulator 124 includes the OOR event issued at step 314 in the emulated pointing signal, then demotes the primary pointing signal from its primary status and leaves the primary status unassigned. Processing then flows from step 316 back to step 302, which was discussed above.
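The flow of Figure 6 (steps 302 to 316) can be condensed into a short event-driven sketch. This is an illustrative approximation under assumed names, not the claimed implementation: each call feeds one event from one pointing signal, the primary signal's events are forwarded as the emulated pointing signal, and promotion and demotion follow the rules described above.

```python
# Illustrative sketch of the Figure 6 emulation rules. Each call feeds one
# (pointer id, event) pair; the return value lists the events included in
# the emulated pointing signal. Names are assumptions for illustration.

class Emulator:
    # in this state model the event alone determines a finger's next state
    NEXT_STATE = {"MOVE": "UP", "DOWN": "DOWN", "DRAG": "DOWN",
                  "UP": "UP", "OOR": "OOR"}

    def __init__(self):
        self.primary = None  # id of the primary pointing signal, if assigned
        self.state = {}      # pointer id -> "OOR" | "UP" | "DOWN"

    def feed(self, pid, event):
        self.state[pid] = self.NEXT_STATE[event]
        if self.primary is None:
            # steps 302-306: first detected pointing signal becomes primary
            self.primary = pid
        if pid == self.primary:
            if event == "OOR":
                # steps 314-316: include the OOR event, unassign primary
                self.primary = None
            return [event]  # the primary signal provides the emulated signal
        if event == "DOWN" and self.state.get(self.primary) != "DOWN":
            # steps 308-312: promote the new finger, include its DOWN event
            self.primary = pid
            return ["DOWN"]
        return []  # all other non-primary events are dropped
```

As in step 312, the DOWN event issued by a newly promoted signal is the only non-primary event that reaches the emulated pointing signal in this sketch.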
The operation of the preferred embodiment, as described above with respect to Figure 6, can be modified as follows to create a variant of the preferred embodiment which is not shown by Figure 6. Operation of the variant is identical to the preferred embodiment with the following exceptions. After processing flows from step 316 to step 302 and then to step 304, no pointing signal which was detected by the emulator 124 when the primary status was unassigned at step 316 can be promoted to primary status. Instead, at steps 304 and 306, only a new pointing signal which starts to be detected after processing at step 316 has finished can be promoted to primary status. For example, suppose a pointing signal which is detected by the emulator 124 when the primary status is unassigned at step 316 moves out of range and therefore ceases to be detected. Very soon after processing at step 316 has finished, the same pointing signal starts to be detected again by the emulator 124 as it moves back into range. In this case, the pointing signal qualifies as a new pointing signal when it moves back into range, and after that time it is eligible for promotion to primary status.
Based on the above description of the flow diagram of Figure 6, it can be seen that some steps can be grouped together based on the operation they are associated with. More specifically, those steps in a rectangular box, i.e. steps 306, 312 and 316, indicate instructions which must be performed by the emulator 124. Alternatively, steps 302, 304 and 310 indicate tests wherein the emulator must examine the current state of an element, for example, the current state of the finger represented by the primary pointing signal in step 310. Alternatively, steps 308 and 314 indicate tests wherein the emulator must react to an event, for example, the receipt of a DOWN event from a non-primary pointing signal in step 308. Accordingly, while the primary pointing signal is assigned and unchanged, processing of the emulator 124 with respect to Figure 6 flows in a loop consisting of steps 308 and 314. Processing only breaks from this loop when an event that is tested for by either step 308 or 314 occurs.
According to the above described operation of the emulator 124, the emulated pointing signal is always defined as one of the pointing signals of the multipointer signal received by the emulator 124 from the touch-screen 6. Accordingly, the software of the smartphone 2, in particular the single pointer software of the smartphone 2, can receive the emulated pointing signal in preference to the multipointer signal, which may otherwise cause the software and smartphone 2 to malfunction.
It is an advantage of the preferred embodiment that legacy and new single pointer software can operate on the smartphone 2. Additionally, it is an advantage that the same emulated pointing signal can be used by each software component of the smartphone 2, irrespective of where each software component is stored in the memory of the smartphone 2. This is particularly beneficial because it means that the same approach to emulation will be adopted for all software applications which use the emulated pointing signal. Another benefit is that the development of new single pointer software is simplified because software developers do not need to consider how to handle multipointer signals. Instead, they can simply build new single pointer software to use the emulated pointer signal.
Various modifications can be made to the preferred embodiment to provide alternative embodiments which are also covered by the scope of the appended claims. In particular, the emulation rules applied by the emulator of the present invention can be modified. Alternative embodiments of the present invention will now be described with reference to Figures 7 to 12.
Figure 7 shows a first set of alternative emulation rules which have been created based on the same gesture interpretations as described above with reference to the preferred embodiment.
The single difference between the emulation rules of Figure 7 and those of the preferred embodiment is the addition of a step 400 in-between the step 310 and the step 308. The following describes how this new step is integrated into the flow diagram of Figure 7. At step 310, the emulator 124 detects the current state of the finger represented by the primary pointing signal. If the finger is in a DOWN state, processing flows from step 310 to step 400 and not back to step 308. At step 400, the emulator 124 demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at step 308. The emulator 124 also includes in the emulated pointing signal the first UP event sent by either the old primary signal or the new primary signal; however, in either case the emulator includes this UP event as issued by the new primary pointing signal.
The emulator then includes in the emulated pointing signal the DOWN event that was issued by the new primary pointing signal at step 308. Thereafter, the emulated pointing signal matches the primary pointing signal and processing flows from step 400 back to step 308.
Figure 8 shows a second set of alternative emulation rules which have been created based on the same lifting gesture interpretation as the preferred embodiment and the following alternative touching gesture interpretation.
Touching: Whenever the user touches the screen with a new finger, the user's focus is moved to this finger. The action performed with the previous finger should be continued at the location of the new finger.
The single difference between the emulation rules of Figure 8 and those of the preferred embodiment is the addition of a step 402 in-between the step 310 and the step 308. The following describes how this new step is integrated into the flow diagram of Figure 8. At step 310, the emulator 124 detects the current state of the finger represented by the primary pointing signal. If the finger is in a DOWN state then processing flows from step 310 to step 402 and not back to step 308. At step 402, the emulator 124 demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at step 308. The emulator 124 also includes in the emulated pointing signal a MOVE event specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing flows from step 402 back to step 308.
Figure 9 shows a third set of alternative emulation rules which have been created based on the same lifting gesture interpretation as the preferred embodiment and the following alternative touching gesture interpretation.
Touching: The user's focus is always on the first finger that touched the screen and does not change until this finger is lifted.
The differences between the emulation rules of Figure 9 and those of the preferred embodiment are the substitution of steps 308 to 312 with new steps 404 and 406. The following describes how these new steps are integrated into the flow diagram of Figure 9. At step 302, if a primary pointing signal is assigned then processing flows to step 404 and not step 308. At step 404 the emulator 124 detects if each of the fingers represented by all of the detected pointing signals are in the UP state and one of the non-primary pointing signals issues a DOWN event. If the conditions of step 404 are not true then processing flows to step 314, which was discussed above with reference to the preferred embodiment (the only exception being that processing flows to step 404 rather than step 308 if the test is false).
Alternatively, if the conditions of step 404 are true then processing flows from step 404 to step 406. At step 406, the emulator 124 demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at step 404. The emulator 124 also includes in the emulated pointing signal the DOWN event that was issued by the new primary pointing signal at step 404. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing flows from step 406 back to step 404.
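The touching-interpretation variants of Figures 7 to 9 differ chiefly in which events, if any, are included in the emulated pointing signal when a non-primary finger issues a DOWN event while the primary finger is itself in the DOWN state. The following condensed comparison is illustrative only: the variant names and tuple format are assumptions, and the Figure 7 sequence is simplified in that the UP event is in fact awaited from the old or new primary signal rather than generated immediately.

```python
def touching_promotion(variant, old_pos, new_pos):
    # Events included in the emulated pointing signal on promotion, per
    # variant, when the old primary finger is still in the DOWN state.
    if variant == "cancel":      # Figure 7, step 400: old action cancelled
        return [("UP", new_pos), ("DOWN", new_pos)]
    if variant == "continue":    # Figure 8, step 402: action continues
        return [("MOVE", old_pos, new_pos)]
    if variant == "keep-focus":  # Figure 9: no promotion while finger down
        return []
    raise ValueError(f"unknown variant: {variant}")
```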
Figure 10 shows a fourth set of alternative emulation rules which have been created based on the same gesture interpretations as the preferred embodiment. The differences between the emulation rules of Figure 10 and those of the preferred embodiment are the addition of new steps 408 to 424.
The following describes how new steps 408 to 412 are integrated into the flow diagram of Figure 10. At step 308, if the emulator 124 does not detect a DOWN event from a non-primary pointing signal then processing flows to step 408 and not step 314. At step 408, the emulator 124 detects if the primary pointing signal sends an UP event. If the primary pointing signal does not issue an UP event, processing flows to step 314 which is discussed above with reference to the preferred embodiment. Alternatively, if the primary pointing signal does issue an UP event then processing flows to step 410. At step 410, the emulator 124 detects if each finger relating to each non-primary pointing signal is in the DOWN state. If any of such fingers are in a DOWN state then processing flows from step 410 to step 412, alternatively, processing flows from step 410 back to step 308, which is discussed above with reference to the preferred embodiment. At step 412, the emulator 124 demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in a DOWN state. More specifically, the closest finger is taken to be the finger in closest proximity to the touch-screen 6 or, if more than one finger is contacting the touch-screen 6, the finger contacting with the highest pressure. The emulator 124 then includes in the emulated pointing signal a MOVE event specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing flows from step 412 back to step 308.
The following describes how the new steps 414 to 424 are integrated into the flow diagram of Figure 10. At step 314, if the primary pointing signal issues an OOR event, processing flows to step 414 and not step 316. At step 414, the emulator 124 detects if any non-primary pointing signals are present. If no non-primary pointing signals are present then processing flows to step 316, which is discussed above with reference to the preferred embodiment.
Alternatively, if at least one non-primary pointing signal is present, processing flows from step 414 to step 416. Step 416 is a test step wherein the emulator 124 examines if the state of the finger represented by the primary pointing signal was in a DOWN state before the OOR event was sent at step 314. Also at step 416, the emulator examines if the state of the closest finger is in the DOWN state. If both conditions are not true, processing flows from step 416 to step 420, which will be discussed later. Alternatively, if both conditions are true, processing flows from step 416 to step 418. At step 418, the emulator 124 demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger. The emulator 124 also includes in the emulated pointing signal a DRAG event specifying a drag from the last position of the old primary pointing signal to the current position of the new primary pointing signal. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing flows from step 418 back to step 308.
As mentioned briefly above, processing flows from step 416 to step 420 if both conditions specified in step 416 are not true. Step 420 provides another test step wherein the emulator 124 examines if the state of the finger represented by the primary pointing signal was in a DOWN state before the OOR event was sent at step 314. Also at step 420, the emulator 124 examines if the state of the closest finger is in the UP state. If both conditions are true then processing flows to step 422, otherwise processing flows to step 424. At step 422, an UP event is included in the emulated pointing signal from the current primary pointing signal.
The emulator 124 then demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger. The emulator 124 then includes a MOVE event in the emulated pointing signal specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing flows from step 422 back to step 308. At step 424, the emulator 124 demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger.
The emulator 124 includes a MOVE event in the emulated pointing signal specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing flows from step 424 back to step 308.
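The 'closest finger' selection used throughout the Figure 10 rules can be sketched as follows. This is an illustrative interpretation of the description above; the field names ('distance', 'pressure') are assumptions, and a distance of 0.0 is taken to mean the finger is contacting the screen.

```python
def closest_finger(fingers):
    """Return the id of the closest finger, or None if no finger is given.

    Each finger is a dict with 'id', 'distance' (proximity to the screen,
    0.0 when contacting it) and 'pressure' (contact pressure).
    """
    if not fingers:
        return None
    contacting = [f for f in fingers if f["distance"] == 0.0]
    if len(contacting) > 1:
        # more than one finger contacts the screen: highest pressure wins
        return max(contacting, key=lambda f: f["pressure"])["id"]
    # otherwise the finger in closest proximity to the screen wins
    return min(fingers, key=lambda f: f["distance"])["id"]
```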
Figure 11 shows a fifth set of alternative emulation rules which have been created based on the same touching gesture interpretation as the preferred embodiment and the following alternative lifting gesture interpretation.
Lifting: If the user lifts the focused finger and there are other fingers touching the screen, the current action should be cancelled and a new action should be started in the location of one of the other touching fingers.
The differences between the emulation rules of Figure 11 and those of Figure 10 are that steps 412 and 418 are replaced by steps 426 and 428, respectively. At step 426, the emulator 124 demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in the DOWN state. The emulator 124 then includes in the emulated pointing signal the first UP event sent by either the old primary signal or the new primary signal; however, in either case the emulator includes this UP event as issued by the new primary pointing signal. The emulator 124 then generates and includes in the emulated pointing signal a DOWN event from the new primary pointing signal. Thereafter, the emulated pointing signal matches the primary pointing signal and processing flows from step 426 back to step 308.
At step 428, the emulator 124 demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger. The emulator 124 also includes in the emulated pointing signal an UP event from the new primary pointing signal. The emulator 124 then generates and includes in the emulated signal a DOWN event from the new primary pointing signal. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing flows from step 428 back to step 308.
Figure 12 shows a sixth set of alternative emulation rules which have been created based on the same touching gesture interpretation as the preferred embodiment and the following alternative lifting gesture interpretation.
Lifting: If the user lifts the focused finger and there are other fingers touching the screen, the current action should be finished/executed and then a new action should be started in the location of one of the other touching fingers.
The differences between the emulation rules of Figure 12 and those of Figure 10 are that steps 412 and 418 are replaced by steps 430 and 432, respectively. At step 430, the emulator 124 includes in the emulated pointing signal the first UP event sent by either the old primary signal or the new primary signal; however, in either case the emulator includes this UP event as issued by the new primary signal. The emulator 124 then demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in the DOWN state. The emulator 124 then generates and includes in the emulated pointing signal a DOWN event from the new primary pointing signal. Thereafter, the emulated pointing signal matches the primary pointing signal and processing flows from step 430 back to step 308.
At step 432, the emulator 124 includes in the emulated pointing signal an UP event from the current primary pointing signal. The emulator 124 then demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in the DOWN state. The emulator 124 then includes in the emulated pointing signal a MOVE event specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal. The emulator 124 then generates and includes in the emulated signal a DOWN event from the new primary pointing signal. Thereafter, the emulated pointing signal matches the primary pointing signal and processing flows from step 432 back to step 308.
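In the same condensed form, the lifting-interpretation variants of Figures 10 to 12 differ in the events included when a lifted primary finger is replaced by the closest remaining finger. This comparison is illustrative only: variant names and locations are assumptions, and the sequences are simplified from the flow diagrams (for example, step 426 of Figure 11 in fact awaits the first UP event rather than generating it).

```python
def lifting_replacement(variant, old_pos, new_pos):
    # Events included in the emulated pointing signal when primary status
    # passes to the closest remaining finger after a lift.
    if variant == "move-action":     # Figure 10, step 412: action moves
        return [("MOVE", old_pos, new_pos)]
    if variant == "cancel-restart":  # Figure 11, step 426: cancel, restart
        return [("UP", new_pos), ("DOWN", new_pos)]
    if variant == "finish-restart":  # Figure 12, step 430: finish, restart
        return [("UP", old_pos), ("DOWN", new_pos)]
    raise ValueError(f"unknown variant: {variant}")
```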
In the preferred embodiment, the emulator 124 is shown as being distinct from the OS 122; however, it is within the scope of the invention that the emulator 124 and the OS 122 are part of the same element of the smartphone 2. Additionally, it is within the scope of the invention that the OS 122 provides the emulator 124. Regardless of the precise arrangement, it is key to the present invention that the emulated pointer signal is formed such that it is available to all software components of the computing device. In other words, it is important that the emulated pointing signal acts as a 'global' variable within the context of the computing device. This will minimise the chances that separate software components of the computing device will need to form their own individual emulated pointing signal. This is advantageous since it is likely that the presence of more than one emulated pointer signal will degrade the user experience. More specifically, it is likely that different emulated signals will use different approaches to emulation and therefore, the user will be uncertain of how their gestures will be interpreted into instructions.
The preferred and alternative embodiments of the present invention have been described with reference to a computing device and a single pointing device being a touch-screen having both proximity and multi-touch capabilities. However, the present invention is suitable for use with a wide range of different computing devices together with a wide range of pointing devices. For example, instead of a smartphone, a PDA, a desktop computer or a laptop could be used. Further, instead of a touch-screen, a touch-pad could be used. Further-still, instead of a single pointing device, multiple pointing devices could be used, including at least one mouse, joystick, touch-screen or touch-pad.
The preferred and alternative embodiments of the present invention have been described with reference to a pointing device whose operation can be described with reference to the state diagram of Figure 4. However, the present invention is intended to operate with a pointing device whose operation is defined by a different state model to that of Figure 4. For example, the state model may comprise only a subset of the states defined with respect to Figure 4.
More specifically, an alternative state model comprises only the UP and DOWN states and not the OOR state. Such a state model can be used to describe a mouse which in operation cannot reach an OOR state. Another alternative state model comprises only the DOWN and OOR states and not the UP state. Such a state model can be used to describe a touch-screen without a proximity sensing capability. Such touch-screens are only able to detect pointing instruments when they actually contact the screen and therefore, these devices cannot detect an instrument which is in close proximity but is not contacting the screen.
Finally, various additions and modifications may be made to the above described embodiments to provide further embodiments, apparent to the intended reader being a person skilled in the art, any and all of which are intended to fall within the scope of the appended claims. For example, features from two or more of the above described embodiments may be combined together to generate further embodiments of the present invention.

Claims (18)

CLAIMS
1. A method of generating an emulated pointing signal within a computing device, the computing device being capable of receiving at least one pointing signal from each of one or more pointing devices, each pointing signal containing data relating to at least part of an instruction from a user of the or each pointing device to the computing device, the method comprising: a. receiving concurrently a plurality of pointing signals; and, b. generating an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules, the emulated pointing signal imitating a single pointing signal from a single pointing device.
  2. The method as claimed in claim 1 further comprising controlling the computing device using the emulated pointing signal.
  3. The method as claimed in claim 1 or 2 wherein the received pointing signals relate to at least one instruction and the emulated pointing signal relates to the same at least one instruction.
  4. The method as claimed in any preceding claim wherein one of the received pointing signals is selected to provide the emulated pointing signal and selection is performed in dependence on the received pointing signals and according to the predefined set of emulation rules.
  5. The method as claimed in claim 4 wherein the received pointing signal selected to provide the emulated pointing signal is changed in dependence on the received pointing signals and according to the predefined set of emulation rules while concurrent receipt of pointing signals continues.
  6. The method as claimed in any preceding claim wherein the emulated pointing signal relates to the data of at least two of the received pointing signals.
  7. The method as claimed in claim 6 wherein the data of each pointing signal comprises data relating to gestures of the user and the emulated pointing signal relates to gestures of at least two of the received pointing signals.
  8. The method as claimed in any preceding claim wherein the data of each pointing signal comprises data relating to gestures of the user, and wherein the set of predefined emulation rules are created according to a predefined set of interpretation rules which define how gestures are interpreted into instructions.
  9. The method as claimed in any preceding claim wherein the emulated pointing signal is generated for provision to software of the computing device, said software being operable only with a single pointing signal from a single pointing device.
  10. The method as claimed in any preceding claim wherein the computing device receives the plurality of pointing signals from a single touch-screen or touch-pad having a multi-touch capability.
  11. The method as claimed in any of claims 1 to 9 wherein the computing device receives a single pointing signal from each of a plurality of pointing devices including at least one of the following pointing devices: a mouse, a joystick and a touch-screen or touch-pad.
  12. The method as claimed in any of the preceding claims wherein an operating system of the computing device generates the emulated pointing signal.
  13. An emulator for a computing device, the computing device being capable of receiving at least one pointing signal from each of one or more pointing devices, each pointing signal containing data relating to at least part of an instruction from a user of the or each pointing device to the computing device, wherein the emulator comprises an input arranged to receive concurrently a plurality of pointing signals relating to at least one instruction, and the emulator further comprises an emulated signal generator arranged to generate an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules, the emulated pointing signal imitating a single pointing signal from a single pointing device.
  14. A computing device comprising an emulator according to claim 13.
  15. A computer program or suite of computer programs so arranged such that when executed by a computer it/they cause the computer to operate in accordance with the method of any of claims 1 to 12.
  16. A computer readable storage medium storing a computer program or at least one of the suite of computer programs according to claim 15.
  17. A method of generating an emulated pointing signal substantially as hereinbefore described with reference to accompanying Figures 5 to 12.
  18. An emulator substantially as hereinbefore described with reference to accompanying Figures 5 to 12.
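The method of claims 1 and 4 can be sketched as follows: a plurality of concurrently received pointing signals is collapsed into one emulated signal that imitates a single pointing device. This is a minimal sketch under assumptions; the data class, function names, and in particular the emulation rule chosen here (prefer the first signal whose instrument is down) are illustrative inventions, not the predefined rules of the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PointingSignal:
    device_id: str   # which pointing device produced the signal
    x: int           # pointer position
    y: int
    pressed: bool    # whether the instrument is in the DOWN state

def emulate_single_signal(signals: List[PointingSignal]) -> Optional[PointingSignal]:
    """Generate one emulated pointing signal from concurrently received
    signals, imitating a single pointing device (cf. claims 1 and 4).

    Emulation rule used here (an illustrative choice, not the patent's):
    select the first signal whose instrument is DOWN; otherwise fall back
    to the first signal received.
    """
    if not signals:
        return None
    for sig in signals:
        if sig.pressed:
            return PointingSignal("emulated", sig.x, sig.y, True)
    first = signals[0]
    return PointingSignal("emulated", first.x, first.y, first.pressed)
```

Per claim 9, the single emulated signal produced this way could then be handed to legacy software that only understands one pointing device.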
GB0822845A 2008-12-15 2008-12-15 Emulator for multiple computing device inputs Withdrawn GB2466077A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0822845A GB2466077A (en) 2008-12-15 2008-12-15 Emulator for multiple computing device inputs
PCT/IB2009/055573 WO2010070528A1 (en) 2008-12-15 2009-12-08 Method of and apparatus for emulating input

Publications (2)

Publication Number Publication Date
GB0822845D0 GB0822845D0 (en) 2009-01-21
GB2466077A true GB2466077A (en) 2010-06-16

Family

ID=40326139

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0822845A Withdrawn GB2466077A (en) 2008-12-15 2008-12-15 Emulator for multiple computing device inputs

Country Status (2)

Country Link
GB (1) GB2466077A (en)
WO (1) WO2010070528A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9665205B1 (en) * 2014-01-22 2017-05-30 Evernote Corporation Programmable touch emulating device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6100875A (en) * 1992-09-03 2000-08-08 Ast Research, Inc. Keyboard pointing device
US20050172045A1 (en) * 2000-06-12 2005-08-04 Gerardo Bermudez Manager component for managing input from both legacy and non-legacy input devices in a similar manner
WO2007030310A2 (en) * 2005-09-01 2007-03-15 Atrua Technologies, Inc. System for and method of emulating electronic input devices
EP1852774A2 (en) * 2006-05-03 2007-11-07 Mitsubishi Electric Corporation Method and system for emulating a mouse on a multi-touch sensitive surface

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070177804A1 (en) * 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary
WO2005029460A1 (en) * 2003-08-21 2005-03-31 Microsoft Corporation Focus management using in-air points
TWI291161B (en) * 2004-07-15 2007-12-11 N trig ltd Automatic switching for a dual mode digitizer
KR101257964B1 (en) * 2005-03-04 2013-04-30 애플 인크. Multi-functional hand-held device
DE202007018940U1 (en) * 2006-08-15 2009-12-10 N-Trig Ltd. Motion detection for a digitizer
US8970501B2 (en) * 2007-01-03 2015-03-03 Apple Inc. Proximity and multi-touch sensor detection and demodulation
US8645827B2 (en) * 2008-03-04 2014-02-04 Apple Inc. Touch event model


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013103927A1 (en) 2012-01-06 2013-07-11 Microsoft Corporation Supporting different event models using a single input source
US20130179598A1 (en) * 2012-01-06 2013-07-11 Microsoft Corporation Supporting Different Event Models using a Single Input Source
CN104024991A (en) * 2012-01-06 2014-09-03 微软公司 Supporting different event models using single input source
EP2801012A4 (en) * 2012-01-06 2015-12-09 Microsoft Technology Licensing Llc Supporting different event models using a single input source
US9274700B2 (en) 2012-01-06 2016-03-01 Microsoft Technology Licensing, Llc Supporting different event models using a single input source
CN104024991B (en) * 2012-01-06 2017-06-27 微软技术许可有限责任公司 Different event models are supported using single input source
US10168898B2 (en) 2012-01-06 2019-01-01 Microsoft Technology Licensing, Llc Supporting different event models using a single input source
US20160170779A1 (en) * 2014-12-11 2016-06-16 Marek Piotr Zielinski Device emulator
US10255101B2 (en) * 2014-12-11 2019-04-09 Sap Se Device emulator

Also Published As

Publication number Publication date
GB0822845D0 (en) 2009-01-21
WO2010070528A1 (en) 2010-06-24

Similar Documents

Publication Publication Date Title
US20220244844A1 (en) Single contact scaling gesture
US10156980B2 (en) Toggle gesture during drag gesture
US10203815B2 (en) Application-based touch sensitivity
CN110362414B (en) Proxy gesture recognizer
US9146672B2 (en) Multidirectional swipe key for virtual keyboard
US8881047B2 (en) Systems and methods for dynamic background user interface(s)
EP2413237B1 (en) Event recognition
EP2752749B1 (en) Processing method of touch screen device user interface and touch screen device
RU2675153C2 (en) Method for providing feedback in response to user input and terminal implementing same
KR101756579B1 (en) Method, electronic device, and computer readable storage medium for detecting touch at bezel edge
EP3258366A1 (en) Event recognition
CN103729065A (en) System and method for mapping touch operations to entity keys
US8842088B2 (en) Touch gesture with visible point of interaction on a touch screen
WO2018080940A1 (en) Using pressure to direct user input
GB2466077A (en) Emulator for multiple computing device inputs
US9026691B2 (en) Semi-autonomous touch I/O device controller operation under control of host
KR20150111651A (en) Control method of favorites mode and device including touch screen performing the same
AU2018211275B2 (en) Event recognition
KR20200031598A (en) Control method of favorites mode and device including touch screen performing the same
US11635874B2 (en) Pen-specific user interface controls
KR20210029175A (en) Control method of favorites mode and device including touch screen performing the same
KR20170071460A (en) Control method of favorites mode and device including touch screen performing the same
Demski et al. Context Sensitive Gestures

Legal Events

Date Code Title Description
COOA Change in applicant's name or ownership of the application

Owner name: NOKIA CORPORATION

Free format text: FORMER OWNER: SYMBIAN SOFTWARE LTD

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)