CN115562502B - Method and apparatus for integrating swipe and touch on an input device - Google Patents


Info

Publication number
CN115562502B
CN115562502B (application CN202211081423.1A)
Authority
CN
China
Prior art keywords
input
type
touch
touch motion
ultrasonic sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211081423.1A
Other languages
Chinese (zh)
Other versions
CN115562502A (en)
Inventor
艾江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202211081423.1A
Publication of CN115562502A
Application granted
Publication of CN115562502B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/043 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/0227 Cooperation and interconnection of the input arrangement with other functional units of a computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03547 Touch pads, in which fingers can move on a surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/033 Indexing scheme relating to G06F 3/033
    • G06F 2203/0339 Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys

Abstract

A device may include a physical input that activates a first function of the device when the physical input detects a first type of touch motion on the physical input. The device may also include an ultrasonic sensor comprising an ultrasonic transmitter and a first ultrasonic receiver. The ultrasonic sensor is disposed beneath the physical input and is configured to detect a second type of touch motion on the physical input and to activate a second function of the device when the ultrasonic sensor detects the second type of touch motion on the physical input. Thus, by enabling different functions in response to different touch forces and/or touch motions, different functions may be accommodated on a single physical input.

Description

Method and apparatus for integrating swipe and touch on an input device
The application claims priority from U.S. provisional application No. 62/687,796, entitled "Method of Integrating Sliding and Touching on Buttons," filed on June 20, 2018.
Technical Field
The present disclosure relates generally to a system and method for using a functional input device on a handheld device, and in particular embodiments, to a system and method for integrating slide and touch functionality on an input on a handheld device.
Background
Handheld devices such as mobile phones, tablet computers, iPads, and the like greatly facilitate and enrich people's daily lives. Various software and hardware technologies have been developed over time to ease the use of handheld devices, and new technologies continue to be developed so that handheld devices can be operated more conveniently.
Disclosure of Invention
According to one aspect of the present disclosure, there is provided an apparatus comprising: a physical input for activating a first function of the device when a first type of touch motion is detected on the physical input; and an ultrasonic sensor including an ultrasonic transmitter and a first ultrasonic receiver. The physical input may take the form of, for example, a button or knob that is depressible or laterally displaceable. The ultrasonic sensor may be disposed beneath the physical input and configured to detect a second type of touch motion on the physical input and activate a second function of the device when the second type of touch motion is detected on the physical input. The second type of touch motion may have a different force or motion than the first type of touch motion.
The above aspects enable the integration of two different types of touch motions on the same physical input, each activating a different function of the device. This allows the user to operate the same physical input to activate different functions, which greatly simplifies the operations the user performs to activate functions and operate the device, especially when the user is already familiar with the location of the physical input on the device. Thus, the above aspects improve the user experience.
Optionally, in any preceding aspect, the second function comprises launching an application on the device.
Optionally, in any preceding aspect, the second function comprises activating hardware on the device.
Optionally, in any preceding aspect, the second function comprises performing an operation on a touch screen of the device.
Optionally, in any of the foregoing aspects, the first type of touch motion and the second type of touch motion have different gestures on the physical input.
Optionally, in any of the preceding aspects, the first type of touch motion comprises pressing the physical input and the second type of touch motion comprises a sliding touch on the physical input.
Optionally, in any of the preceding aspects, the first type of touch motion and the second type of touch motion have different touch forces on the physical input.
Optionally, in any of the preceding aspects, the physical input is a volume input of the device.
Optionally, in any of the preceding aspects, the physical input is a power input of the device.
Optionally, in any of the preceding aspects, the physical input is a key for inputting information into the device.
Optionally, in any one of the preceding aspects, the apparatus further comprises: a touch screen for displaying information.
Optionally, in any one of the preceding aspects, the apparatus further comprises: a second ultrasonic receiver disposed beneath the physical input.
According to another aspect of the present disclosure, there is provided a method comprising: the device detects a first type of touch motion applied to a physical component of the device using an ultrasonic sensor disposed beneath the physical component, the ultrasonic sensor comprising an ultrasonic transmitter and an ultrasonic receiver and being configured to detect the first type of touch motion on the physical component; and when the first type of touch motion applied to the physical component is detected, the device activates a first function on the device, the first function being associated with the detection of the first type of touch motion on the physical component.
The above aspects enable the use of existing physical components of a device to activate functions on the device by imparting a touch motion on the physical components. This greatly simplifies the user's operation of activating the function, especially when the user is already familiar with the location of the physical component. Thus, the above aspects improve the user experience.
Optionally, in any one of the preceding aspects, the method further comprises: the device detects, using the physical component, a second type of touch motion that is different from the first type of touch motion applied to the physical component; and when the second type of touch motion applied to the physical component is detected, the device activates a second function on the device, the second function being associated with the detection of the second type of touch motion on the physical component and being different from the first function.
Optionally, in any of the foregoing aspects, the first type of touch motion and the second type of touch motion have different gestures on the physical component.
Optionally, in any of the preceding aspects, the first type of touch motion and the second type of touch motion have different touch forces on the physical component.
Optionally, in any of the preceding aspects, the first type of touch motion comprises a touch to the physical component and the second type of touch motion comprises a press of the physical component.
Optionally, in any of the preceding aspects, the physical component is a physical button or key.
Optionally, in any of the preceding aspects, the physical component is a volume button of the device.
Optionally, in any of the preceding aspects, the physical component is a power button of the device.
Optionally, in any of the preceding aspects, the physical component is a key for inputting information.
Optionally, in any preceding aspect, the first function comprises launching an application on the device.
Optionally, in any preceding aspect, the first function comprises activating hardware on the device.
Optionally, in any preceding aspect, the first function comprises performing an operation on a touch screen of the device.
According to another aspect of the present disclosure, there is provided an apparatus comprising a physical component usable to operate the apparatus, and an ultrasonic sensor comprising an ultrasonic transmitter and an ultrasonic receiver, the ultrasonic sensor being disposed beneath the physical component and configured to detect a first type of touch motion applied to the physical component and to activate a first function of the apparatus when the first type of touch motion on the physical component is detected.
The above aspects enable the use of existing physical components of a device to activate a function on the device when touch motion is detected on the physical component. This greatly simplifies the user's operation of activating the functions and operating the device, especially when the user is already familiar with the location of the physical component. Thus, the above aspects improve the user experience.
Optionally, in any of the preceding aspects, the physical component is configured to activate a second function of the device upon detection by the physical component of a second type of touch motion applied to the physical component different from the first type of touch motion.
Optionally, in any of the foregoing aspects, the first type of touch motion and the second type of touch motion have different gestures on the physical component.
Optionally, in any of the preceding aspects, the first type of touch motion and the second type of touch motion have different touch forces on the physical component.
Optionally, in any of the preceding aspects, the first type of touch motion comprises a touch to the physical component and the second type of touch motion comprises a press of the physical component.
Optionally, in any of the preceding aspects, the physical component is a physical button of the device.
Optionally, in any of the preceding aspects, the physical component is a volume button of the device.
Optionally, in any of the preceding aspects, the physical component is a power button of the device.
Optionally, in any of the preceding aspects, the physical component is a key for inputting information.
Optionally, in any preceding aspect, the first function comprises launching an application on the device.
Optionally, in any preceding aspect, the first function comprises activating hardware on the device.
Optionally, in any preceding aspect, the first function comprises performing an operation on a touch screen of the device.
Drawings
For a more complete understanding of the present invention, and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
FIG. 1 illustrates a diagram of a handset of one embodiment;
FIG. 2 illustrates a diagram of a volume input of a cell phone assembled in a housing of the cell phone, according to one embodiment;
FIG. 3 shows a diagram of volume input with an ultrasonic sensor attached underneath in accordance with another embodiment;
FIG. 4 illustrates a flow chart of an embodiment method of operating a handheld device;
FIG. 5 shows a block diagram of a processing system;
fig. 6 shows a block diagram of a processing system.
Detailed Description
The structure, making, and using of the presently preferred embodiments are discussed in detail below. It should be appreciated that the novel concepts provided by the present invention may be embodied in a wide variety of specific contexts. The specific embodiments discussed are merely illustrative of specific ways to make and use the invention and do not limit its scope.
Various techniques have been developed and utilized on handheld devices to facilitate their operation, for example, to enable one-handed manipulation. Embodiments of the present disclosure combine existing physical components of the handheld device with an ultrasonic sensing process to provide one or more functions for operating the handheld device.
In some embodiments, an ultrasonic sensor comprising an ultrasonic transmitter and an ultrasonic receiver is placed under a physical input such as a volume button or a power button. The physical input is used to activate a first function of the handheld device when a first type of touch motion, e.g., a press (or "click") motion, is applied to the physical input. The ultrasonic sensor is configured to detect a second type of touch motion on the physical input, e.g., a sliding touch or a non-sliding touch, and to activate a second function of the device when it detects that motion. The touch force on the physical input detectable by the ultrasonic sensor may be less than 20 grams and is adjustable, while the pressing motion on the physical input may require a touch force of greater than 200 grams. Thus, touch motions can be performed on the physical input without disturbing the existing press (or rocking) motions configured for the physical input.
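The force-based separation between the two motion types described above can be sketched as follows. This is an illustrative sketch only: the thresholds mirror the figures quoted in the text (under 20 grams for a touch, over 200 grams for a press), and the label names are hypothetical, not part of the disclosed design.

```python
# Hypothetical classification of an input event by touch force.
# Forces between the two thresholds activate neither function.

TOUCH_FORCE_MAX_G = 20.0    # adjustable ultrasonic detection limit (per the text)
PRESS_FORCE_MIN_G = 200.0   # typical mechanical activation force (per the text)

def classify_motion(force_g: float) -> str:
    if force_g >= PRESS_FORCE_MIN_G:
        return "press"   # mechanical switch fires: first function
    if 0.0 < force_g <= TOUCH_FORCE_MAX_G:
        return "touch"   # ultrasonic sensor fires: second function
    return "none"        # dead band or no contact: nothing activates
```

Because the two force ranges do not overlap, the two functions never trigger simultaneously, which is why the touch gesture does not disturb the existing press behavior.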
By using existing physical inputs and ultrasonic sensors, different touch motions on the same physical input may activate different functions, which greatly simplifies the user's operation of activating different functions, especially when the user is already familiar with the location of the physical input on the handheld device. Thus, these embodiments improve the user experience without placing much additional burden on the user. Details of the embodiments are provided below.
The following embodiments of the present disclosure will be described with respect to a mobile phone. However, these embodiments may also be applied to other handheld devices, such as handheld tablet computers or iPads, without departing from the spirit of the present disclosure.
Cell phones are becoming increasingly popular due to their powerful functions and high portability. As technology has developed, handsets have evolved from having many physical inputs, keys, and/or switches and a small screen to having few physical inputs and a large screen. Among the few remaining physical inputs are the volume and power buttons, which are still provided on many current handsets. Many users are already familiar with the location of the physical volume and power buttons on the handset, so these buttons can be easily found and operated. The user can feel a tactile response and hear a clear "click" when operating these buttons, which gives them confidence when using the handset.
Fig. 1 shows a diagram of a cell phone 100 of one embodiment. The handset may be a smart phone. As shown, the handset 100 includes a screen 102 for displaying information, a volume input 104 for adjusting (increasing or decreasing) the volume, and a power input 106 for turning the handset 100 on or off. The screen 102 may be a touch screen or a multi-touch screen. Typically, the volume input and the power input are located on the periphery (or sides) of the handset. Fig. 1 shows the volume input 104 on the left side of the handset 100 and the power input 106 on the right side. The volume input 104 or the power input 106 may be located elsewhere and may be implemented with a number of different types of input devices, such as switches, depressible buttons, etc. For example, the volume input 104 may be located on the right side and the power input 106 on the left side; in another example, the volume input 104 may be located on the top of the cell phone 100. Fig. 1 also shows a user's hand 110 holding the handset 100, in a grip representative of how many users hold a handset. The user's fingers grip the periphery of the handset 100: the thumb 112 holds one side (the right side as shown) and the other four fingers hold the other side (the left side as shown). Holding the handset 100 in this manner, the user can operate the power input 106 with the thumb 112 and the volume input 104 with another finger, such as the index finger 114 or the middle finger 116.
While holding the handset 100 with the hand 110, the user can use the other hand to interact with the screen 102, i.e., with the content displayed by the handset 100. Interaction with the screen may include performing operations on it, for example making selections on the screen (e.g., of options or notifications), confirming options or notifications, starting or closing an application, browsing web pages, moving items displayed on the screen, and so on. The user may also interact with the screen 102 in a limited way using the thumb 112 of the holding hand, such as sliding or touching within the on-screen area the thumb 112 can reach. However, because the hand 110 is gripping the periphery of the handset 100, it is often inconvenient for the user to interact with the screen 102 using that hand alone. For example, the user may have to stretch the thumb 112 across the touch screen while holding the periphery of the cell phone 100, or the thumb 112 may need to reach the top half of the screen 102, which may be difficult or impossible. An intuitive and convenient way for a user to interact with on-screen information using a single hand is therefore desirable.
There are currently handsets on the market that use virtual inputs for squeeze-based selection, but the squeezing force is difficult to adjust. Further, using a virtual input instead of a physical input may degrade the user experience: the location of the virtual input is hard to find, and the virtual input provides neither the tactile sensation nor the clear "click" sound that users expect when operating a physical input such as a button.
Embodiments of the present disclosure provide methods for operating one or more functions of a handheld device utilizing existing physical inputs on the handheld device and ultrasonic sensing technology. In particular, one embodiment may integrate sliding and/or tapping motions on top of physical inputs of the handheld device (e.g., volume or power inputs), which users can intuitively find and feel, by utilizing ultrasonic sensing techniques. Examples of functions that may be provided include interacting with the screen of the handheld device (e.g., making selections on the screen, confirming on the screen, launching or closing an application, browsing web pages, moving items displayed on the screen, or other suitable interactions), launching and closing an application or hardware (e.g., turning a flashlight on and off, starting and closing a camera program or voice assistant, taking screenshots, enabling or disabling WiFi, etc.), receiving mail, messages, or notifications, and other suitable functions.
As used herein, a "physical input" of a handheld device refers to a tangible component of the handheld device that is visible to and usable by a user. Examples of physical inputs may include volume buttons or keys, power buttons or keys, lock keys, keyboard keys (e.g., for inputting information, such as typing letters or numbers), or any other buttons or keys. An advantage of embodiments that utilize existing physical inputs is that these inputs are already familiar to users and are intuitive and convenient to find and operate. The terms "physical input" and "input" will be used interchangeably throughout this disclosure.
Ultrasonic sensing is a technique that measures the distance to an object using sound waves. This is typically done with an ultrasonic sensor, which may also be referred to as an ultrasonic transducer (e.g., a piezoceramic transducer) or transceiver. The ultrasonic sensor may be configured as an ultrasonic transmitter and/or an ultrasonic receiver, for example, under the control of a driver IC. The ultrasonic transmitter generates and emits a high-frequency sound wave, and the ultrasonic receiver receives the wave reflected from the object. The time interval between emitting the sound wave and receiving the reflected wave is measured to determine the distance to the object. With an ultrasonic transceiver attached to the back side of a surface, the technique can detect, with high precision, contact of any object with that surface, and can determine the touch location, touch area, and touch force of touch points on the surface. The touch force on the surface detectable by the ultrasonic sensor may be less than 50 grams, or even 10 grams, and is adjustable.
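The time-of-flight calculation described above can be sketched as follows. This is a minimal illustrative sketch, not part of the disclosure: the speed of sound is assumed to be that of air at room temperature, and a real device would perform the timing in the driver IC rather than in software.

```python
# Hypothetical time-of-flight distance calculation for an ultrasonic sensor.
# The wave travels to the object and back, so the one-way distance is half
# of (round-trip time x speed of sound).

SPEED_OF_SOUND_M_PER_S = 343.0  # assumed: air at roughly 20 degrees C

def distance_from_echo(transmit_time_s: float, receive_time_s: float) -> float:
    """Distance to the reflecting object, in meters."""
    round_trip_s = receive_time_s - transmit_time_s
    if round_trip_s < 0:
        raise ValueError("echo received before transmission")
    return SPEED_OF_SOUND_M_PER_S * round_trip_s / 2.0
```

For example, an echo arriving 1 millisecond after transmission corresponds to an object about 0.17 m away; touch sensing through a button works with far shorter round trips, but the arithmetic is the same.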
In the present disclosure, the terms "ultrasonic transmitter" and "transmitter" are used interchangeably, the terms "ultrasonic receiver" and "receiver" are used interchangeably, and the terms "ultrasonic transceiver" and "transceiver" are used interchangeably. As used in this disclosure, an ultrasonic sensor (or ultrasonic transceiver or transducer) may include one or more transmitters and one or more receivers.
In some embodiments, one or more ultrasonic sensors may be placed beneath a physical input of the handheld device and used to detect touch motions performed on the physical input. For example, the ultrasonic sensor may be attached to the rear (or back) side of the input and connected to a main printed circuit board (PCB) through a flexible PCB. Because the ultrasonic sensor is placed under the input, a touch on the input can be detected by the ultrasonic sensor. This arrangement also facilitates manufacturing, since the ultrasonic sensor can simply be attached to the inward-facing surface of the input.
The input, e.g., an existing input of the handheld device, may already be used to detect a first type of operational motion on the input, and detection of that motion may activate a first function of the handheld device. The first type of operational motion may be a pressing motion conventionally configured for the input, such as a push-down (or squeeze) motion, a rotating motion, or a rocking motion. Typically, a pressing motion requires an activation force of greater than 200 grams.
The ultrasonic sensor may be used to detect a second type of operational motion on the input, and detection of that motion may activate a second function of the handheld device that is different from the first function activated by the pressing motion. The second type of operational motion does not move the input, e.g., does not depress the physical input. It may be a touch motion detectable by the ultrasonic sensor, such as a sliding touch or a non-sliding touch on the input. In this way, the touch motion and the pressing motion are integrated on the same input, and various functions may be activated through the combination of the input and the ultrasonic sensor. The user can adjust the sensitivity with which the ultrasonic sensor detects touch motion on the input, and can also configure which function is activated when a touch motion is detected. Both the sensitivity and the function mapping may be adjusted through the settings of the handheld device.
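The user-configurable mapping from detected motion types to functions could be sketched as a dispatch table. Everything here is a hypothetical illustration: the motion-type names and the bound functions ("launch camera", "confirm selection") are placeholders, not features disclosed by the patent.

```python
# Illustrative dispatch table mapping detected touch-motion types to
# user-chosen device functions; rebinding models the settings screen.
from typing import Callable, Dict

class TouchDispatcher:
    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[], str]] = {}

    def bind(self, motion_type: str, action: Callable[[], str]) -> None:
        # The user configures, via device settings, which function a
        # given motion type activates.
        self._bindings[motion_type] = action

    def on_motion(self, motion_type: str) -> str:
        action = self._bindings.get(motion_type)
        return action() if action else "ignored"

dispatcher = TouchDispatcher()
dispatcher.bind("sliding_touch", lambda: "launch camera")
dispatcher.bind("non_sliding_touch", lambda: "confirm selection")
```

Rebinding a motion type simply overwrites its table entry, which matches the text's point that the activated function is adjustable through settings.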
In one example, the user may perform a gentle sliding touch (i.e., gentle compared to a pressing motion) on the input surface. The ultrasonic sensor detects the position of one or more fingers (e.g., index finger 114 or thumb 112, depending on which finger performs the touch and where the input is located on the handheld device), and the detection of the sliding touch is converted into a selection among various options, e.g., selecting which of a number of most commonly used applications, such as messages, camera, video call, or email, to launch. In response to the sliding touch, the handset may display the selectable options on the screen. The user may then touch the surface of the input (with a non-sliding touch detectable by the ultrasonic sensor, rather than a press) to select an option (e.g., selecting an application to launch). Thus, touch motions on the input may be used to perform operations that the user once performed with additional touches on the screen.
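The slide-then-tap interaction in this example can be sketched as a small state machine: sliding moves a highlight through a list of shortcuts, and a light tap activates the highlighted one. The app list and return strings are purely illustrative assumptions.

```python
# Hypothetical sketch of the slide-to-highlight, tap-to-launch interaction.

class QuickLauncher:
    def __init__(self, apps):
        self.apps = list(apps)
        self.index = 0  # currently highlighted option

    def on_slide(self, steps: int) -> str:
        """A sliding touch moves the highlight; returns the highlighted app."""
        self.index = (self.index + steps) % len(self.apps)
        return self.apps[self.index]

    def on_tap(self) -> str:
        """A non-sliding touch launches the highlighted app."""
        return f"launching {self.apps[self.index]}"

launcher = QuickLauncher(["messages", "camera", "video call", "email"])
```

One slide step followed by a tap would highlight and then launch the second option, mirroring the select-then-confirm flow the paragraph describes.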
The configuration of the second function, activated by the second type of operational motion that the ultrasonic sensor detects on the input, has little or no effect on the existing functions of the physical input, because the ultrasonic sensor can detect light-force motions. The touch force on the surface detectable by the ultrasonic sensor may be less than 20 grams and is adjustable, while a pressing motion may require a touch force of greater than 200 grams. A touch motion may therefore be performed on the input without disturbing the existing press (or rocking) motion configured for the input. The second function enabled by the ultrasonic sensor makes it possible for the user to operate the handheld device with one hand, without frequently stretching a finger (e.g., thumb 112) back and forth across the screen to perform multiple interactions with it.
Taking the volume input of the handset as an illustrative example, an ultrasonic sensor may be embedded under the volume input so that different operational motions performed on the volume input can be detected and used to activate different functions on the handset. Fig. 2 shows a diagram of the volume input 200 of a cell phone of one embodiment, assembled in the housing 220 of the cell phone. The volume input 200 has a top (or front) surface 210, which is exposed to the outside and is touched by the user to adjust the volume, and a bottom (or rear) surface 212, which is inside the handset and not touchable by the user. The volume input 200 is used to adjust the volume of the handset. Specifically, pressing ("clicking") the top 214 of the top surface 210 increases the volume, while pressing ("clicking") the bottom 216 of the top surface 210 decreases the volume. The volume input 200 also includes two legs 202 and 204 protruding from the bottom surface 212 away from the top surface 210, connected to tactile switches 206 and 208, respectively. A push-down motion on the top 214 of the volume input 200 actuates tactile switch 206 and activates an increase in volume; a push-down motion on the bottom 216 actuates tactile switch 208 and activates a decrease in volume.
The ultrasonic sensor may be embedded underneath the volume input 200. For example, the ultrasonic sensor may be attached (e.g., glued) to the bottom surface 212, as shown in fig. 2. The ultrasonic sensor in this example comprises an ultrasonic transmitter 232 and two ultrasonic receivers 234 and 236. The ultrasonic transmitter 232 and the ultrasonic receivers 234 and 236 are connected to a PCB in the housing 220. Based on the transmitted and received ultrasonic signals, it may be determined whether a touch motion is performed on the volume input 200 and, if so, which type of touch motion it is, for example a sliding touch or a non-sliding touch.
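One way such a determination could work, sketched purely for illustration (the patent does not specify its signal-processing algorithm): a finger resting over a receiver damps the ultrasonic signal that receiver picks up, so if the most-damped receiver changes from frame to frame, the touch is sliding along the input.

```python
# Illustrative sketch (not the patent's algorithm): distinguish a sliding
# touch from a stationary touch using per-receiver signal amplitude over time.
# Each frame lists the amplitudes seen by receivers 234 and 236.

def touch_type(frames):
    """frames: list of per-frame receiver amplitudes, e.g. [[a234, a236], ...].
    Returns 'slide' if the most-damped receiver changes across frames,
    otherwise 'touch' (stationary contact)."""
    damped = [min(range(len(f)), key=lambda i: f[i]) for f in frames]
    return "slide" if len(set(damped)) > 1 else "touch"

# Finger moves from over receiver 234 toward receiver 236:
print(touch_type([[0.2, 0.9], [0.5, 0.6], [0.9, 0.2]]))  # slide
# Finger stays put over receiver 234:
print(touch_type([[0.2, 0.9], [0.2, 0.9], [0.2, 0.9]]))  # touch
```

A real implementation would also need debouncing and amplitude calibration; the point here is only that two receivers suffice to tell a moving contact from a stationary one.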
A pressing action on the volume input 200 actuates the contact tactile switch 206 or 208 to activate the volume-increase or volume-decrease function, as in conventional configurations. Touch movements on the volume input 200 that can be detected by the ultrasonic sensor can be used to activate different functions, such as launching an application or making a selection on a screen, or any of the functions discussed above. Because the activation force required to actuate the contact tactile switch 206 or 208 (e.g., typically greater than 200 grams) is much greater than the touch (or contact) force required by the ultrasonic sensor (e.g., less than 50 grams, or even 10 grams), the conventional functionality of the volume input may be configured so that it is not affected by touch movements on the volume input detectable by the ultrasonic sensor. For example, the ultrasonic sensor may be configured not to activate any function when a force greater than a threshold (e.g., 150 grams) is applied to the volume input. Thus, when the user presses the volume input 200 in the conventional manner, actuating a contact tactile switch, the volume is adjusted. When the user makes a sliding touch or a light touch on the volume input 200, detectable by the ultrasonic sensor, a different, user-configurable function is activated. The circuitry may be designed to route the detection result of the ultrasonic sensor to the activation of functions on the handset. Activating a function based on a sensor detection result is a technique well known to those of ordinary skill in the art and is therefore not described herein.
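The gating just described can be summarized in a few lines. This is a sketch under stated assumptions: the 150-gram threshold comes from the text, while the function names and dispatch structure are illustrative.

```python
# Sketch of the force gating described above: the ultrasonic path activates
# its configured function only when the contact force stays below a threshold,
# so a conventional press (which trips the tactile switch) never also triggers
# the touch-configured function.

FORCE_GATE_GRAMS = 150  # example threshold from the text

def dispatch(force_grams, touch_detected, switch_closed):
    if switch_closed:                        # mechanical press: adjust volume
        return "adjust_volume"
    if touch_detected and force_grams < FORCE_GATE_GRAMS:
        return "user_configured_function"    # e.g. launch an application
    return None                              # ambiguous input: do nothing

print(dispatch(300, True, True))   # adjust_volume
print(dispatch(10, True, False))   # user_configured_function
```

Checking the tactile switch first mirrors the design intent: the conventional function always wins when the user presses hard enough to close it.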
For illustrative purposes, fig. 2 shows one ultrasonic transmitter 232 and two ultrasonic receivers 234 and 236 positioned below the volume input 200. In general, the volume input 200 may have one or more ultrasonic transmitters and one or more ultrasonic receivers attached to its bottom surface. For example, the volume input 200 may have embedded beneath it one transmitter and one receiver, one transmitter and three receivers, or two transmitters and two receivers. The number of ultrasonic transmitters and receivers that may be embedded below the volume input may be determined based on the size of the volume input and the sizes of the transmitters and receivers. The locations of the ultrasonic transmitters and receivers on the bottom surface 212 may also vary, for example, based on the sizes of the volume input and of the transmitters and receivers, and/or on structural features of the volume input, such as the legs 202 and 204. For example, the ultrasonic transmitters and receivers may be evenly spaced on the bottom surface 212. In another example, as shown in fig. 2, the transmitter 232 is placed between the two legs 202 and 204, one receiver 234 is positioned between the leg 202 and one end of the volume input 200, and the other receiver 236 is positioned between the leg 204 and the other end of the volume input 200. The number and locations of the ultrasonic transmitters and receivers may be determined during the design phase of the handheld device. In some embodiments, when multiple inputs are available on the handheld device, a longer or larger input may be selected to have an ultrasonic sensor embedded underneath it, to integrate the sliding/touching and pressing motions on that input. The connection of the ultrasonic sensor to the PCB may also be considered when selecting the input.
Fig. 3 shows a diagram of a volume input 300 with an ultrasonic sensor attached underneath in another embodiment. The volume input 300 includes a top surface 312, a bottom surface 314, and two legs 316 and 318 for actuating contact tactile switches. An ultrasonic transmitter 322 and an ultrasonic receiver 324 are attached to the bottom surface 314 between the two legs 316 and 318. The length of the volume input is L1, the ultrasonic transmitter 322 and the ultrasonic receiver 324 have the same length L2, and the distance between the legs 316 and 318 is L3. Based on L1, L2, and L3, the number and positions of the ultrasonic transmitters and receivers can be determined. For example, where L1 is 20 mm, L2 is 2.5 mm, and L3 is 12 mm, one transmitter and one receiver may be placed between the two legs 316 and 318 (as shown in fig. 3), or one transmitter and two receivers may be placed, for example, evenly between the two legs 316 and 318 (not shown).
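The dimensional reasoning above is a simple packing calculation. The sketch below uses the stated values of L2 and L3; the minimum inter-element gap is an assumed parameter, not something the text specifies.

```python
# Rough layout check for the stated dimensions: how many transducer elements
# of length elem_mm fit, with a minimum gap between them, in the span span_mm
# between the two legs. Purely geometric; the 1 mm gap is an assumption.

def max_elements(span_mm, elem_mm, min_gap_mm=1.0):
    n = 0
    # Keep adding elements while one more (plus its gap) still fits.
    while (n + 1) * elem_mm + n * min_gap_mm <= span_mm:
        n += 1
    return n

# L3 = 12 mm between legs, L2 = 2.5 mm per element:
print(max_elements(12, 2.5))  # 3 elements fit, e.g. one transmitter + two receivers
```

With these numbers, up to three elements fit between the legs, which is consistent with the text's option of one transmitter and two receivers in that span.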
In manufacture, a predetermined number of ultrasonic transmitters and receivers may be attached (e.g., glued) to the bottom surface (e.g., 212 or 314) of a predetermined input at predetermined locations, and the input may then be assembled on a bracket. To accurately detect touch motion on an input, it is desirable that the distance from each ultrasonic transmitter and receiver to the top surface of the input (e.g., 210 or 312) be the same or about the same, with any difference within a threshold.
The above-described embodiments may be applied to any applicable input on the periphery of a handheld device. Embodiments may also be applied to other physical components of the handheld device that are accessible for operating or using it. Such a physical component may be removable from the handset, accessible to a user of the handset, or separate from the housing of the handset, so that one or more ultrasonic sensors may be attached to the component and connected to the PCB in the conventional manner before the physical component is assembled into the housing during manufacture. The ultrasonic sensors may then be used to detect touch motion on the physical component. Examples of such physical components include a SIM card door, rear camera trim, or other suitable components. For example, an ultrasonic sensor may be attached to the back of the phone's glass cover (for aesthetic purposes). A touch on the glass cover surface can then be detected by the ultrasonic sensor and used to activate a function.
Embodiments utilize existing physical components of the handheld device, together with one or more ultrasonic sensors or ultrasonic arrays, to provide more functionality for using or operating the handheld device. These functions may be used to enable one-handed operation of the handheld device or to otherwise facilitate its operation. Because existing physical components are used, the user can access new functions without having to learn and remember new locations on the handheld device, and the appearance of the device is not compromised. These embodiments thus improve the user experience without increasing the burden on the user.
Fig. 4 shows a flow chart of an embodiment method 400 for operating a device. As illustrated, in step 402 the device detects a first type of touch motion applied to a physical component of the device using an ultrasonic sensor disposed beneath the physical component, where the ultrasonic sensor includes an ultrasonic transmitter and an ultrasonic receiver and is configured to detect the first type of touch motion on the physical component. In step 404, when the first type of touch motion applied to the physical component is detected, the device activates a first function on the device, where the first function is associated with the detection of the first type of touch motion on the physical component.
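The two steps of method 400 can be sketched as follows. The motion-to-function mapping and all names are illustrative placeholders; the patent leaves the associated function user-configurable.

```python
# Minimal sketch of method 400 (names are hypothetical): step 402 detects a
# touch motion type via the ultrasonic sensor; step 404 activates the function
# associated with that detection.

FUNCTION_MAP = {"slide": "launch_app", "touch": "take_screenshot"}  # user-configurable

def method_400(detected_motion):
    # Step 402: the ultrasonic sensor reports the touch motion type (or None).
    if detected_motion is None:
        return None
    # Step 404: activate the function associated with that motion type.
    return FUNCTION_MAP.get(detected_motion)

print(method_400("slide"))  # launch_app
print(method_400(None))     # None
```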
Fig. 5 is a block diagram of a processing system 500 that may be used to implement the devices and methods disclosed herein. A particular device may utilize all of the components shown or only a subset of them, and the level of integration may vary from device to device. Furthermore, a device may contain multiple instances of a component, such as multiple processing units, processors, memories, transmitters, receivers, and the like. The processing system may comprise a processing unit equipped with one or more input/output devices, such as a speaker, microphone, mouse, touchscreen, keys, keyboard, printer, display, and the like. The processing unit may include a central processing unit (CPU), memory, a mass storage device, a video adapter, and an I/O interface connected to a bus.
The bus may be one or more of any of several bus architectures, including a memory bus or memory controller, a peripheral bus, a video bus, or the like. The CPU may comprise any type of electronic data processor. The memory may include any type of system memory, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous DRAM (SDRAM), read-only memory (ROM), a combination thereof, or the like. In an embodiment, the memory may include ROM for use at boot-up, and DRAM for program and data storage for use while executing programs.
The mass storage device may comprise any type of non-transitory storage device configured to store data, programs, and other information and to make them accessible via the bus. The mass storage device may comprise, for example, one or more of a solid-state drive, hard disk drive, magnetic disk drive, or optical disk drive.
The video adapter and the I/O interface provide interfaces to couple external input and output devices to the processing unit. As illustrated, examples of input and output devices include a display coupled to the video adapter and a mouse/keyboard/printer/keys coupled to the I/O interface. Other devices may be coupled to the processing unit, and additional or fewer interface cards may be utilized. For example, a serial interface such as Universal Serial Bus (USB) (not shown) may be used to provide an interface for a printer.
The processing unit also contains one or more network interfaces, which may include wired links, such as ethernet cables or the like, and/or wireless links to access nodes or different networks. The network interface allows the processing unit to communicate with the remote unit via a network. For example, the network interface may provide wireless communication via one or more transmitters/transmit antennas and one or more receivers/receive antennas. In one embodiment, the processing unit is coupled with a local area network or wide area network for data processing and communication with remote devices such as other processing units, the Internet, remote storage devices, and the like.
FIG. 6 illustrates a block diagram of a processing system 600 that may be used to implement another embodiment of the devices and methods disclosed herein, which may be installed in a host device. As shown, the processing system 600 includes a processor 604, a memory 606, and interfaces 610-614, which may (or may not) be arranged as shown. Processor 604 may be any component or collection of components adapted to perform computing and/or other processing-related tasks, and memory 606 may be any component or collection of components adapted to store programs and/or instructions for execution by processor 604. In one embodiment, memory 606 includes a non-transitory computer-readable medium. Interfaces 610, 612, 614 may be any component or collection of components that allow processing system 600 to communicate with other devices/components and/or users. For example, one or more of the interfaces 610, 612, 614 may be adapted to communicate data, control, or management messages from the processor 604 to applications installed on the host device and/or remote device. As another example, one or more of the interfaces 610, 612, 614 may be adapted to allow a user or user device (e.g., a Personal Computer (PC), etc.) to interact/communicate with the processing system 600. The processing system 600 may include additional components not depicted in the figures, such as long-term storage (e.g., non-volatile memory, etc.).
In some embodiments, the processing system 600 is in a user-side device accessing a wireless or wired telecommunications network, such as a mobile station, a User Equipment (UE), a Personal Computer (PC), a tablet, a wearable communication device (e.g., a smart watch, etc.), or any other handheld device. The user-side device may be adapted to access a telecommunications network. In some embodiments, one or more of the interfaces 610, 612, 614 connect the processing system 600 to a transceiver adapted to send and receive signaling over a telecommunications network.
It should be understood that one or more steps of the methods of the embodiments provided herein may be performed by corresponding units or modules. For example, the signal may be transmitted by a transmitting unit or a transmitting module. The signal may be received by a receiving unit or a receiving module. The signals may be processed by a processing unit or processing module. Other steps may be performed by the detection unit/module, the activation unit/module and/or the input unit/module. The respective units/modules may be hardware, software or a combination thereof. For example, one or more of the units/modules may be an integrated circuit, such as a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC).
While this invention has been described with reference to illustrative embodiments, this description is not intended to limit the invention. Various modifications and combinations of the illustrative embodiments, as well as other embodiments of the invention, will be apparent to persons skilled in the art upon reference to the description. Accordingly, the appended claims are intended to cover any such modifications or embodiments.

Claims (18)

1. An electronic device, comprising:
a physical input for providing a first function when a first type of touch motion is detected on the physical input; and
an ultrasonic sensor, wherein the ultrasonic sensor comprises an ultrasonic transmitter and a first ultrasonic receiver, the ultrasonic sensor disposed beneath the physical input and configured to detect a second type of touch motion on the physical input; providing a second function after the ultrasonic sensor detects the second type of touch motion on the physical input; wherein the first function is different from the second function, the first type of touch motion is different from the second type of touch motion, the second type of touch motion does not move the physical input;
wherein the ultrasonic sensor provides a second function upon detecting the second type of touch motion on the physical input, comprising:
the ultrasonic sensor detecting the second type of touch motion at a first location on the physical input, selecting a first option;
the ultrasonic sensor detecting the second type of touch motion at a second location on the physical input, selecting a second option;
wherein the first location is different from the second location and the first option is different from the second option.
2. The electronic device of claim 1, wherein the first function comprises displaying selectable options.
3. The electronic device of claim 1, wherein the first type of touch motion comprises a sliding input and the second type of touch motion comprises a non-sliding input.
4. The electronic device of claim 1, wherein the first type of touch motion comprises a press of the physical input and the second type of touch motion comprises a sliding touch of the physical input.
5. The electronic device of claim 1, wherein the second function comprises launching an application on the electronic device.
6. The electronic device of claim 1, wherein the second function comprises activating hardware on the electronic device.
7. The electronic device of claim 1, wherein the ultrasonic sensor is further configured to detect positional information of a plurality of fingers, activating a third function of the electronic device.
8. The electronic device of claim 1, wherein the physical input is a volume input of the electronic device.
9. The electronic device of claim 1, wherein the physical input is a power input of the electronic device.
10. An input method applied to an electronic device, the electronic device comprising a physical input and an ultrasonic sensor, wherein the ultrasonic sensor comprises an ultrasonic transmitter and a first ultrasonic receiver, the ultrasonic sensor being disposed under the physical input, the method comprising:
providing a first function upon detecting a first type of touch motion on the physical input;
the ultrasonic sensor detecting a second type of touch motion on the physical input providing a second function; wherein the first function is different from the second function, the first type of touch motion is different from the second type of touch motion, the second type of touch motion does not move the physical input;
wherein the ultrasonic sensor provides a second function upon detecting the second type of touch motion on the physical input, comprising:
the ultrasonic sensor detecting the second type of touch motion at a first location on the physical input, selecting a first option;
the ultrasonic sensor detecting the second type of touch motion at a second location on the physical input, selecting a second option;
wherein the first location is different from the second location and the first option is different from the second option.
11. The input method of claim 10, wherein the first function comprises displaying selectable options.
12. The input method of claim 10, wherein the first type of touch motion comprises a sliding input and the second type of touch motion comprises a non-sliding input.
13. The input method of claim 10, wherein the first type of touch motion comprises a press of the physical input and the second type of touch motion comprises a sliding touch of the physical input.
14. The method of claim 10, wherein the second function comprises launching an application on the electronic device.
15. The input method of claim 10, wherein the second function comprises activating hardware on the electronic device.
16. The input method of claim 10, wherein the ultrasonic sensor is further configured to detect positional information of a plurality of fingers, activating a third function of the electronic device.
17. The input method of claim 10, wherein the physical input is a volume input of the electronic device.
18. The input method of claim 10, wherein the physical input is a power input of the electronic device.
CN202211081423.1A 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on an input device Active CN115562502B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211081423.1A CN115562502B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on an input device

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201862687796P 2018-06-20 2018-06-20
US62/687,796 2018-06-20
PCT/US2019/038032 WO2019246295A1 (en) 2018-06-20 2019-06-19 Method and apparatus of integrating slide and touch on an input device
CN202211081423.1A CN115562502B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on an input device
CN201980040527.8A CN112313609B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on input device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201980040527.8A Division CN112313609B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on input device

Publications (2)

Publication Number Publication Date
CN115562502A CN115562502A (en) 2023-01-03
CN115562502B true CN115562502B (en) 2023-10-20

Family

ID=68984362

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201980040527.8A Active CN112313609B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on input device
CN202211081423.1A Active CN115562502B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on an input device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201980040527.8A Active CN112313609B (en) 2018-06-20 2019-06-19 Method and apparatus for integrating swipe and touch on input device

Country Status (2)

Country Link
CN (2) CN112313609B (en)
WO (1) WO2019246295A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101952796A (en) * 2008-02-19 2011-01-19 索尼爱立信移动通讯有限公司 Identifying and responding to multiple time-overlapping touches on a touch panel
AU2012202122A1 (en) * 2005-03-04 2012-05-03 Apple Inc. Multi-functional hand-held device
CN102629184A (en) * 2012-02-29 2012-08-08 北京创造力教育科技有限公司 Handheld terminal and operation method thereof
CN104915030A (en) * 2015-05-28 2015-09-16 努比亚技术有限公司 Operation method and device based on mobile terminal rolling wheel key
US9389718B1 (en) * 2013-04-04 2016-07-12 Amazon Technologies, Inc. Thumb touch interface
CN106293221A (en) * 2016-08-05 2017-01-04 歌尔股份有限公司 Touch pressure control method and equipment
CN108073310A (en) * 2016-11-10 2018-05-25 森顿斯公司 It is detected along the touch input of equipment side wall

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3725747B2 (en) * 2000-01-07 2005-12-14 三菱電線工業株式会社 Ultrasonic degradation diagnostic equipment for low-voltage cable for railway equipment
US20070176903A1 (en) * 2006-01-31 2007-08-02 Dahlin Jeffrey J Capacitive touch sensor button activation
TWM317616U (en) * 2007-02-06 2007-08-21 Inventec Appliances Corp Touch input device
US8421483B2 (en) * 2008-06-13 2013-04-16 Sony Ericsson Mobile Communications Ab Touch and force sensing for input devices
CN101907959B (en) * 2010-08-20 2012-10-10 鸿富锦精密工业(深圳)有限公司 Slidably controlled touch type electronic device
WO2014018115A1 (en) * 2012-07-26 2014-01-30 Changello Enterprise Llc Ultrasound-based force sensing of inputs
US20150286341A1 (en) * 2012-11-21 2015-10-08 The Board Of Trustees Of The Leland Stanford Junior University multi-touch ultrasonic touch screen
CN104932815A (en) * 2015-05-06 2015-09-23 努比亚技术有限公司 Mobile terminal and operation method thereof
KR102383790B1 (en) * 2015-05-22 2022-04-08 삼성전자주식회사 Environment recognition method and electronic device thereof
CN107943267A (en) * 2017-11-21 2018-04-20 北京小米移动软件有限公司 A kind of method and apparatus for controlling screen


Also Published As

Publication number Publication date
CN115562502A (en) 2023-01-03
WO2019246295A1 (en) 2019-12-26
CN112313609A (en) 2021-02-02
CN112313609B (en) 2022-09-16

Similar Documents

Publication Publication Date Title
US9024877B2 (en) Method for automatically switching user interface of handheld terminal device, and handheld terminal device
CN112527431B (en) Widget processing method and related device
CN109428969B (en) Edge touch method and device of double-screen terminal and computer readable storage medium
WO2017096944A1 (en) Virtual keyboard based content input method and apparatus, and touch device
CN102119376B (en) Multidimensional navigation for touch-sensitive display
EP3525075B1 (en) Method for lighting up screen of double-screen terminal, and terminal
US20110050575A1 (en) Method and apparatus for an adaptive touch screen display
KR20190100339A (en) Application switching method, device and graphical user interface
WO2017161826A1 (en) Functional control method and terminal
RU2689430C1 (en) System and method of touch screen control by means of two knuckles of fingers
JP7331245B2 (en) Target position adjustment method and electronic device
WO2023016372A1 (en) Control method and apparatus, and electronic device and storage medium
CN108920052B (en) Page display control method and related product
WO2018112803A1 (en) Touch screen-based gesture recognition method and device
WO2018039914A1 (en) Method for copying data, and user terminal
EP3350677B1 (en) Coordinate measuring apparatus and method of controlling the same
KR20220123036A (en) Touch keys, control methods and electronics
CN115562502B (en) Method and apparatus for integrating swipe and touch on an input device
CN109683721A (en) A kind of input information display method and terminal
JPWO2019208323A1 (en) Remote control device for mobile terminals
US20110001716A1 (en) Key module and portable electronic device
KR20120134399A (en) Method for providing schedule information using movement sensing device and apparatus therefof
US20170277427A1 (en) Control method and device for terminal
KR20120134485A (en) Method for searching index list using movement sensing device and apparatus therefof
CN109643216A (en) A kind of icon display method and terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant