CN107229408B - Terminal, input control method thereof, and computer-readable storage medium - Google Patents


Info

Publication number
CN107229408B
Authority
CN
China
Prior art keywords
preset
event
sliding gesture
terminal
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710376318.3A
Other languages
Chinese (zh)
Other versions
CN107229408A (en)
Inventor
刘永华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd
Priority to CN201710376318.3A
Publication of CN107229408A
Application granted
Publication of CN107229408B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The invention discloses an input control method of a terminal, which comprises the following steps: responding to a preset sliding gesture event detected on a touch input unit of the terminal, and calling an application interface corresponding to the preset sliding gesture event; when the sliding distance of the preset sliding gesture reaches a preset threshold value, injecting a cancellation event and canceling the continuous dispatch of the preset sliding gesture event; switching the application interface corresponding to the preset sliding gesture event to be the foreground application window; injecting a sliding press event and continuing to dispatch the preset sliding gesture events detected on the touch input unit; and displaying the application interface corresponding to the preset sliding gesture event according to the continuously dispatched preset sliding gesture events. The input control method shields the user interface of the current foreground application window from responding to the preset sliding gesture, improving the user experience. The invention also provides a terminal using the input control method and a computer-readable storage medium.

Description

Terminal, input control method thereof, and computer-readable storage medium
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to a terminal, an input control method of the terminal, and a computer-readable storage medium.
Background
With the wide application of touch control technology to various terminals, more and more touch gestures can be recognized and executed on terminals. Such terminals include, but are not limited to, computers, portable terminals, smart phones, tablet computers, Personal Digital Assistants (PDAs), and the like. A pull-up gesture sliding up from the bottom of the terminal's touch screen is commonly used to slide out, from the bottom of the display interface, a list of background running programs, a shortcut button bar, and the like.
However, in the operating system of a current terminal, for example an intelligent terminal running the Android system, when the input device detects an event, the detected event is generally dispatched to a distribution filter layer (inputflag.so) for processing, the distribution filter layer then distributes the event to a framework service layer (Framework service.jar), and finally the framework service layer distributes the event to the application program (APP) corresponding to the current interface for the APP to process. As a result, when the user performs a pull-up gesture from the bottom of the touch screen, before the corresponding pull-up application has slid out from the bottom of the display interface, the event dispatched by the system is delivered to both the pull-up application and the current window application, so the user interface corresponding to the current window application also slides along with the pull-up gesture.
This sliding of the user interface corresponding to the current window application is not what the user expects, and it degrades the user experience.
Disclosure of Invention
The main purpose of the present invention is to provide a terminal, an input control method of the terminal, and a computer-readable storage medium, with the aim of shielding the user interface corresponding to the current foreground application window from responding to a preset sliding gesture, so that only the application corresponding to the preset sliding gesture event responds to and executes it.
In order to achieve the above object, the present invention provides an input control method for controlling input of a terminal, comprising the steps of:
responding to a preset sliding gesture event detected on a touch input unit of the terminal, and calling an application interface corresponding to the preset sliding gesture event;
when the sliding distance of the preset sliding gesture reaches a preset threshold value, injecting a cancellation event, and canceling the continuous dispatch of the preset sliding gesture event;
switching an application interface corresponding to the preset sliding gesture event into a foreground application window;
injecting a sliding pressing event, and continuously dispatching a preset sliding gesture event detected on a touch input unit of the terminal;
and displaying an application interface corresponding to the preset sliding gesture event according to the preset sliding gesture event which is continuously dispatched.
Further, the step of responding to a preset sliding gesture event detected on a touch input unit of the terminal and calling an application interface corresponding to the preset sliding gesture event includes:
responding to a preset sliding gesture of a user at a preset position and in a preset direction on a touch input unit of the terminal, and dispatching a preset sliding gesture event;
and responding to the preset sliding gesture event, and calling an application interface corresponding to the preset sliding gesture event.
Further, the step of responding to a preset sliding gesture event detected on a touch input unit of the terminal and calling an application interface corresponding to the preset sliding gesture event includes:
responding to a preset sliding gesture triggered by a user on a touch input unit of the terminal, and dispatching the preset sliding gesture event through a pipeline registration event;
and responding to the preset sliding gesture event, and calling an application interface corresponding to the preset sliding gesture event.
Further, the step of injecting a cancellation event and canceling the continuous dispatch of the preset sliding gesture event when the sliding distance of the preset sliding gesture reaches a preset threshold value includes:
judging whether the sliding distance of the preset sliding gesture on a touch input unit of the terminal from a preset position in a preset direction reaches a preset threshold value or not;
when the sliding distance of the preset sliding gesture from the preset position in the preset direction reaches the preset threshold value, injecting a cancellation event with a preset flag bit;
and responding to the cancellation event, and canceling the continuous dispatch of the preset sliding gesture event.
Further, in the step of injecting a cancellation event with a preset flag bit, the preset flag bit of the cancellation event is "True";
and the step of injecting a sliding press event and continuing to dispatch the preset sliding gesture event detected on the touch input unit of the terminal includes the following steps:
injecting a sliding press event, and recording that the preset flag bit of the cancellation event is "False";
and continuously dispatching the preset sliding gesture event detected on the touch input unit of the terminal.
Further, the step of continuously dispatching the preset slide gesture event detected on the touch input unit of the terminal includes:
and continuously dispatching the preset sliding gesture event detected on the touch input unit of the terminal to a foreground application window through the pipeline registration event.
Further, the preset sliding gesture is a touch gesture pulled up from the bottom of the terminal.
Further, the preset threshold value is 4-10 dp.
A terminal comprises a touch screen unit, a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the above input control method when executing the computer program.
A computer-readable storage medium has a computer program stored thereon, and the computer program, when executed by a processor, implements the steps of the above input control method.
By adopting the above input control method and a terminal using it, a cancellation event can be injected when the sliding distance of the preset sliding gesture reaches the preset threshold value, thereby canceling the continuous dispatch of the preset sliding gesture event; the application interface corresponding to the preset sliding gesture event is then switched to be the foreground application window; a sliding press event is then injected so that the subsequent touch actions of the preset sliding gesture that the user continues to trigger on the touch input unit of the terminal can continue to be dispatched. In this way, the user interface corresponding to the current foreground application window is shielded from responding to the preset sliding gesture, and only the application interface corresponding to the preset sliding gesture event is called and executed. At the same time, the processing speed is optimized, the processing response time is shortened, and the user experience is enhanced.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of a mobile terminal according to various embodiments of the present invention;
fig. 2 is a schematic block diagram of a terminal according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a preset touch gesture performed on the terminal shown in FIG. 2;
fig. 4 is a flowchart of a method of an input control method for the terminal of fig. 2 according to a first embodiment of the present invention;
fig. 5 is a schematic view of a user interface display corresponding to a pull-up gesture executed on a terminal according to an embodiment of the present invention.
Fig. 6 is a schematic view of a user interface display of a current foreground application window on a terminal in an embodiment of the present invention.
Fig. 7 is a schematic diagram illustrating a user interface switching state display corresponding to a pull-up gesture executed on a terminal according to an embodiment of the present invention.
Fig. 8 is a flowchart of a method of an input control method for the terminal of fig. 2 according to a second embodiment of the present invention;
fig. 9 is a flowchart of a method of an input control method for the terminal of fig. 2 according to a third embodiment of the present invention;
fig. 10 is a flowchart of a method of an input control method for the terminal of fig. 2 according to a fourth embodiment of the present invention;
fig. 11 is a flowchart of a method of an input control method for the terminal in fig. 2 according to a fifth embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning in themselves. Thus, "module", "component", and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and a mobile or stationary terminal such as a Digital TV, a desktop computer, and the like.
The following description will be given by way of example of a mobile terminal, and it will be understood by those skilled in the art that the construction according to the embodiment of the present invention can be applied to a fixed type terminal, in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal 100 for implementing various embodiments of the present invention, the mobile terminal 100 may include: a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of the mobile terminal 100, and that the mobile terminal 100 may include more or fewer components than shown, or some components may be combined, or a different arrangement of components; for example, in the example shown in fig. 1, the mobile terminal 100 further includes an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an a/V (audio/video) input unit 104, a sensor 105, and the like.
The various components of the mobile terminal 100 are described in detail below with reference to fig. 1:
the display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position and the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, screen keys, home keys, etc.), a trackball, a mouse, a joystick, and the like.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, application programs required by at least one function (such as a sound playing function, an image playing function, etc.), and the like, and the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
Based on the above terminal hardware structure, various embodiments of the method of the present invention are provided.
Referring to fig. 2 and fig. 3, fig. 2 is a schematic block structure diagram of the terminal 10 according to an embodiment of the present invention, and fig. 3 is a schematic action diagram of executing a predetermined touch gesture on the terminal 10 in fig. 2.
The terminal 10 includes a display unit 11, a touch input unit 12, a storage unit 13, a processing unit 14, a power supply 15, and the like. Those skilled in the art will appreciate that the terminal 10 shown in fig. 2 may also include more or fewer components than shown, or combine certain components, or a different arrangement of components.
The touch input unit 12 is overlaid on the display unit 11, and is configured to detect a touch operation of a user on or near the touch input unit 12, and transmit the touch operation to the processing unit 14 to determine a type of the touch event, and then the processing unit 14 provides a corresponding visual output on the display unit 11 according to the type of the touch event.
The user may trigger a preset slide gesture on the touch input unit 12 to perform a corresponding operation. For example, the user interface displayed on the display unit 11 is moved or switched accordingly by triggering a horizontal or up-down sliding on the touch input unit 12. In an embodiment, a correspondence between a specific sliding gesture and a control instruction may be preset in the system, and the specific sliding gesture is triggered by sliding in a preset direction from a preset position on the touch input unit 12, so as to call the corresponding control instruction and implement the corresponding function. For example, a pull-up gesture sliding up from the bottom of the terminal's touch screen is generally used to slide out, from the bottom of the display interface, a list of background running programs, a shortcut button bar, and the like.
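A minimal sketch of such a gesture-to-instruction correspondence, assuming a simple in-memory lookup table; the gesture key and the action shown are hypothetical examples for illustration only, not structures defined by the patent.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: a possible registry mapping a preset sliding gesture
// (identified by its start region and direction) to a control action.
// The gesture key and the action below are hypothetical examples.
final class GestureRegistry {
    private final Map<String, Runnable> actions = new HashMap<>();

    void register(String gestureKey, Runnable action) {
        actions.put(gestureKey, action);
    }

    void dispatch(String gestureKey) {
        Runnable action = actions.get(gestureKey);
        if (action != null) {
            action.run();
        }
    }
}

// Usage: map a bottom-edge pull-up gesture to opening a shortcut button bar.
// GestureRegistry registry = new GestureRegistry();
// registry.register("BOTTOM_PULL_UP", () -> System.out.println("open shortcut button bar"));
// registry.dispatch("BOTTOM_PULL_UP");
```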
Referring to fig. 4, a flowchart of an input control method 200 for the terminal 10 in fig. 2 according to a first embodiment of the present invention is shown.
Wherein, the input control method 200 comprises the following steps:
Step S10: in response to a preset sliding gesture event detected on the touch input unit of the terminal, call an application interface corresponding to the preset sliding gesture event.
When a user triggers a touch operation on the touch input unit 12 of the terminal 10, the system generates a corresponding touch event according to the touch operation detected by the touch input unit 12. Such touch events include, but are not limited to, a touch (TOUCH) event, a touch-down (TOUCH DOWN) event, a touch-up (TOUCH UP) event, a slide gesture (MOTION) event, a slide-down (MOTION DOWN) event, a slide-up (MOTION UP) event, and the like.
Specifically, in the operating system of a current terminal, for example an intelligent terminal running the Android system, when the input device detects an event, the detected event is generally dispatched to a distribution filter layer (inputflag.so) for processing, the distribution filter layer then distributes the event to a framework service layer (Framework service.jar), and finally the framework service layer distributes the event to the application program (APP) corresponding to the current interface for the APP to process. It is understood that, in other operating systems, when the input device detects an event, a corresponding input event is generated, the detected input event is recognized by the input device or by the processor, and the recognized input event is dispatched by the input device or the processor to the corresponding functional layer of the operating system.
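The layered dispatch path described above can be pictured with a short, purely illustrative sketch; the class names here stand in for the distribution filter layer, the framework service layer, and the application, and are assumptions for illustration rather than real Android components.

```java
// Illustrative sketch of the layered dispatch path described above
// (distribution filter layer -> framework service layer -> application).
// The class names are assumptions, not real Android APIs.
interface EventLayer {
    void dispatch(String event);
}

final class AppLayer implements EventLayer {
    public void dispatch(String event) {
        System.out.println("APP handles " + event);
    }
}

final class FrameworkLayer implements EventLayer {
    private final EventLayer next;
    FrameworkLayer(EventLayer next) { this.next = next; }
    public void dispatch(String event) { next.dispatch(event); } // pass down to the APP
}

final class FilterLayer implements EventLayer {
    private final EventLayer next;
    FilterLayer(EventLayer next) { this.next = next; }
    public void dispatch(String event) { next.dispatch(event); } // pass to the framework layer
}

public class DispatchChainDemo {
    public static void main(String[] args) {
        EventLayer chain = new FilterLayer(new FrameworkLayer(new AppLayer()));
        chain.dispatch("MOTION_DOWN at (x, y)"); // the event ends up at the foreground APP
    }
}
```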
In step S10 of this embodiment, when the user triggers a touch operation of the preset slide gesture on the touch input unit 12 of the terminal 10, the touch input unit 12 detects the touch operation and generates a corresponding touch signal; the processing unit 14 determines, according to the touch signal, whether the touch operation is the preset sliding gesture, and when it is determined to be the preset sliding gesture, the touch input unit 12 or the processing unit 14 dispatches the preset sliding gesture event to the corresponding functional layer of the operating system; the system then responds to the preset sliding gesture event and calls the application interface corresponding to the preset sliding gesture event. For example, when the preset slide gesture is a pull-up gesture sliding up from the bottom of the touch screen of the terminal, the application interface corresponding to the preset slide gesture event may generally be a list interface of background running programs, a shortcut button interface, or the like, which slides out from the bottom of the display interface.
Step S20: when the sliding distance of the preset sliding gesture reaches a preset threshold value, inject a cancellation event and cancel the continuous dispatch of the preset sliding gesture event.
When a user triggers a touch operation of a preset slide gesture on the touch input unit 12 of the terminal 10, the touch input unit 12 detects the touch operation of the user on the touch input unit 12 and generates a corresponding touch signal; the processing unit 14 determines the touch operation as a preset sliding gesture according to the touch signal, determines a sliding distance value of the preset sliding gesture according to the touch signal, and determines whether the sliding distance value reaches a preset threshold value; and when the sliding distance of the preset sliding gesture reaches a preset threshold value, injecting a cancellation event, and canceling the continuous dispatch of the preset sliding gesture event.
Specifically, in step S20 of this embodiment, when the processing unit 14 determines, according to the touch signal generated by the touch input unit 12 detecting the user's touch operation, that the sliding distance of the preset sliding gesture reaches the preset threshold value, a cancel (CANCEL) event is injected; after receiving the cancel event, the system no longer dispatches the touch events generated in response to the subsequent actions of the user's preset sliding gesture detected by the touch input unit 12, that is, it no longer dispatches the preset sliding gesture event.
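The patent does not name a specific API for injecting the cancel event; as a hedged sketch assuming a standard Android environment, the public MotionEvent and Instrumentation APIs can synthesize an ACTION_CANCEL event, although Instrumentation.sendPointerSync requires an instrumentation or system context, so this is a sketch of the idea rather than code for an ordinary application.

```java
import android.app.Instrumentation;
import android.os.SystemClock;
import android.view.MotionEvent;

// Sketch only: synthesize and inject an ACTION_CANCEL event once the preset
// sliding gesture has covered the preset threshold distance.
final class CancelInjector {
    private final Instrumentation instrumentation = new Instrumentation();

    // x, y: the current finger position; injection requires instrumentation/system privileges.
    void injectCancel(float x, float y) {
        long now = SystemClock.uptimeMillis();
        MotionEvent cancel = MotionEvent.obtain(now, now, MotionEvent.ACTION_CANCEL, x, y, 0);
        instrumentation.sendPointerSync(cancel);
        cancel.recycle();
    }
}
```

An analogous construction with MotionEvent.ACTION_DOWN would serve for the sliding press event injected later in step S40.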
It can be understood that, in this embodiment, after the cancel event is injected, the user continues to trigger the subsequent touch operation of the preset slide gesture on the touch input unit 12 of the terminal 10, and the touch input unit 12 detects that the user continues to trigger the subsequent touch operation of the preset slide gesture on the touch input unit 12 to generate a corresponding touch signal; after the touch input unit 12 or the processing unit 14 responds to the touch signal to generate a corresponding preset slide gesture event, the preset slide gesture event is not dispatched any more.
After the cancel event is injected, the system does not dispatch the preset slide gesture event any more, and neither the foreground application window before the trigger of the preset slide gesture nor the application corresponding to the preset slide gesture event receives the corresponding touch event dispatched by the subsequent action of the preset slide gesture.
Step S30: switch the application interface corresponding to the preset sliding gesture event to be the foreground application window.
Specifically, in this embodiment, the application interface corresponding to the preset sliding gesture event has already been called in step S10; after the cancel event is injected, the system switches the current foreground application window to the background and switches the application interface corresponding to the preset sliding gesture event to be the new foreground application window.
Step S40: inject a sliding press event and continue to dispatch the preset sliding gesture events detected on the touch input unit of the terminal.
A slide DOWN event is injected on the system to allow the user to continue a subsequent touch action on the touch input unit 12 of the terminal 10 that triggers the preset slide gesture. Specifically, a sliding press event is re-injected into the system, the user continues to trigger a subsequent touch operation of the preset sliding gesture on the touch input unit 12 of the terminal 10, and the touch input unit 12 detects that the user continues to trigger the subsequent touch operation of the preset sliding gesture on the touch input unit 12 to generate a corresponding touch signal; when the touch input unit 12 or the processing unit 14 generates a corresponding preset slide gesture event in response to the touch signal, the system continues to dispatch the preset slide gesture event generated by a subsequent touch operation of the preset slide gesture to a corresponding functional layer of the operating system.
At this time, since the original foreground application window is already switched to the application interface corresponding to the preset sliding gesture event, the original foreground application window cannot receive the preset sliding gesture event which is distributed and generated due to the subsequent touch operation of the preset sliding gesture any more, and only the application corresponding to the preset sliding gesture event receives the distributed preset sliding gesture event which is generated due to the subsequent touch operation of the preset sliding gesture.
Step S50: display the application interface corresponding to the preset sliding gesture event according to the continuously dispatched preset sliding gesture events.
The application corresponding to the preset sliding gesture event receives the dispatched preset sliding gesture event generated due to the subsequent touch operation of the preset sliding gesture, and displays an application interface corresponding to the preset sliding gesture event on the display unit 11 according to the continuously dispatched preset sliding gesture event.
Specifically, the original foreground application window is switched to the background, and the preset sliding gesture event which is distributed and generated due to the subsequent touch operation of the preset sliding gesture cannot be received any more, so that no response can be generated on the distributed preset sliding gesture event which is generated due to the subsequent touch operation of the preset sliding gesture; and the application corresponding to the preset sliding gesture event controls the display unit 11 to display the application interface corresponding to the preset sliding gesture event according to the received preset sliding gesture event generated by the subsequent touch operation of the preset sliding gesture.
For example, when the preset slide gesture is a pull-up gesture sliding up from the bottom of the touch screen of the terminal, the application interface called for the preset slide gesture event may generally be a list interface of background running programs, a shortcut button interface, or the like, which slides out from the bottom of the display interface; the application corresponding to the preset slide gesture event then controls the display unit 11 to display this list interface of background running programs, shortcut button interface, or the like, in a manner of sliding out from the bottom, according to the preset slide gesture events it receives that are generated by the subsequent touch operations of the preset slide gesture.
In the input control method 200 according to the first embodiment of the present invention, when the sliding distance of the preset sliding gesture reaches the preset threshold value, a cancel event is injected to cancel the continuous dispatch of the preset sliding gesture event; the application interface corresponding to the preset sliding gesture event is then switched to be the foreground application window; a sliding press event is then injected so that the subsequent touch actions of the preset sliding gesture that the user continues to trigger on the touch input unit 12 of the terminal 10 can continue to be dispatched. In this way, the user interface corresponding to the current foreground application window is shielded from responding to the preset sliding gesture, and only the application interface corresponding to the preset sliding gesture event is called and executed. At the same time, the processing speed is optimized, the processing response time is shortened, and the user experience is enhanced.
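Taken together, the first embodiment can be summarized in a short control sketch. This is an assumption-level outline only: the helper methods (injectCancel, injectDown, bringGestureAppToFront) and the threshold handling are hypothetical stand-ins for the operations described in steps S10-S50 above, not APIs defined by the patent.

```java
// Assumption-level outline of the first embodiment's control flow.
// All helper methods and fields are hypothetical stand-ins for the operations
// described in steps S10-S50 above.
final class SlideGestureController {
    private final float thresholdPx;     // preset threshold, e.g. 8 dp converted to pixels
    private boolean switched = false;    // whether the foreground window has been switched

    SlideGestureController(float thresholdPx) {
        this.thresholdPx = thresholdPx;
    }

    // Called for each touch update of the preset sliding gesture.
    void onGestureMove(float startY, float currentY, float x) {
        float distance = startY - currentY;          // upward pull-up distance
        if (!switched && distance >= thresholdPx) {
            injectCancel(x, currentY);                // S20: stop dispatch to the old window
            bringGestureAppToFront();                 // S30: gesture app becomes foreground
            injectDown(x, currentY);                  // S40: re-anchor the ongoing gesture
            switched = true;
        }
        // S50: once switched, further MOVE events reach only the gesture application,
        // which updates its own interface (e.g. slides the shortcut bar up).
    }

    private void injectCancel(float x, float y) { /* see the cancel-injection sketch above */ }
    private void injectDown(float x, float y)   { /* analogous ACTION_DOWN injection */ }
    private void bringGestureAppToFront()       { /* window-management call, platform-specific */ }
}
```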
The following will further describe the control process and principle of the input control method 200 of the terminal 10 according to the first embodiment of the present invention with reference to fig. 5 to 8.
The terminal 10 is a portable intelligent terminal, the preset sliding gesture is a pull-up gesture of a user at the bottom of the touch input unit 12, an application corresponding to the pull-up gesture of the user at the bottom of the touch input unit 12 is a shortcut button bar, and the current foreground application window is an application interface of a system setting application window.
Referring to fig. 5, the application corresponding to the pull-up gesture starts a shortcut button bar, and the application interface after the shortcut button bar is started is as shown in fig. 5: an application interface 11A having a plurality of shortcut start icons is pulled out from the bottom of the display unit 11 for the user to select a corresponding shortcut start icon to start the corresponding function; in this embodiment, the shortcut start icon user interface 11A may be laid out on the lower half of the display interface of the display unit 11.
Referring to fig. 6, the current foreground application window is an application interface of the system setting application window, and is a user interface 11B of a plurality of setting menu options longitudinally arranged and displayed on the display unit 11, so that the user may click different setting menu options to enter a corresponding sub-setting menu or select a corresponding setting parameter. In this embodiment, the user interface 11B of the plurality of setting menu options displayed in the vertical arrangement may be laid out on the entire display interface of the display unit 11.
It should be understood that fig. 5 and fig. 6 are only examples of the application interface triggered by the pull-up gesture and the current foreground application window interface, and do not specifically limit the present invention.
Referring to fig. 7, when a user performs a pull-up gesture, the touch input unit 12 detects a touch operation of the user on the touch input unit 12 and generates a corresponding touch signal, and transmits the touch signal to the processing unit 14, and when the processing unit 14 determines that the touch operation forms a preset sliding gesture (i.e., a pull-up gesture), the pull-up gesture event is dispatched to a corresponding functional layer of the operating system; and then calling a corresponding shortcut startup bar application program, wherein the shortcut startup bar application program generates a corresponding shortcut startup bar application interface.
The touch input unit 12 continuously detects a touch operation of a user on the touch input unit 12 and generates a corresponding touch signal, the processing unit 14 injects a cancel event when determining that the sliding distance of the preset sliding gesture reaches a preset threshold value according to the touch signal, and cancels continuous dispatch of the preset sliding gesture event, and neither the system setting application nor the shortcut start bar application program receives a corresponding preset sliding gesture event dispatched by a subsequent action of the preset sliding gesture.
The application interface of the shortcut start bar is switched to be the foreground application window, and a sliding press event is injected so that the touch events of the user's subsequent touch actions of the preset sliding gesture on the touch input unit 12 of the terminal 10 can continue to be dispatched; the original system setting application window is switched to the background and can no longer receive the preset sliding gesture events dispatched for the subsequent touch operations of the preset sliding gesture, so only the shortcut start bar application interface receives them.
The shortcut start bar application program then controls the display unit 11 to display the shortcut start bar application interface, sliding it out from the bottom, according to the continuously dispatched preset sliding gesture events.
In this embodiment, the foreground application window (system setting application window) before the pull-up gesture is triggered on the touch input unit 12, and will not slide along with the pull-up gesture, and the system will only pull out the shortcut start bar application interface from the bottom of the display unit 11 in response to the pull-up gesture.
Referring to fig. 8, a flowchart of an input control method 202 for the terminal 10 in fig. 2 according to a second embodiment of the present invention is shown. Steps S20, S30, S40 and S50 of the input control method 202 in the second embodiment are the same as those in the first embodiment and are not repeated here; the difference is that step S10 includes:
Step S121, in response to a preset sliding gesture performed by the user at a preset position and in a preset direction on the touch input unit of the terminal, dispatching a preset sliding gesture event;
Step S122, in response to the preset sliding gesture event, calling an application interface corresponding to the preset sliding gesture event.
Specifically, in step S121, when a touch operation at the preset position and in the preset direction on the touch input unit 12 of the terminal 10 triggers the preset slide gesture, the touch input unit 12 detects the user's touch operation and generates a corresponding touch signal; when the processing unit 14 determines, according to the touch signal, that the touch operation is the preset slide gesture, the touch input unit 12 or the processing unit 14 dispatches the preset slide gesture event to the corresponding functional layer of the operating system.
Further, in an embodiment, the processing unit 14 may determine, according to the touch signal, the user's first touch-down (TOUCH DOWN) action in the touch operation and check whether the touch position of the first touch-down action matches the preset position; when the touch position of the first touch-down action matches the preset position, the processing unit 14 further determines, according to the touch signal, whether the moving direction of the touch operation matches the preset direction, and when the moving direction matches the preset direction, it determines that the touch operation is the preset sliding gesture.
Specifically, in step S122, the system responds to the preset slide gesture event and invokes an application interface corresponding to the preset slide gesture event. For example, when the preset slide gesture is a pull-up gesture sliding up from the bottom of the touch screen of the terminal, the application interface corresponding to the preset slide gesture event may be generally a list interface sliding out from the bottom of the display interface to start a background running program, a shortcut button interface, or the like.
In the input control method 202 in the second embodiment of the present invention, when a user executes a preset sliding gesture at a preset position and in a preset direction on the touch input unit 12 of the terminal 10, a preset sliding gesture event may be injected to invoke an application interface corresponding to the preset sliding gesture event. The preset sliding gesture adopts a sliding gesture from a preset position to a preset direction, sufficient reaction time can be provided, and when the sliding is started from the preset position to the preset direction, the system responds to the preset sliding gesture event and calls an application interface corresponding to the preset sliding gesture event.
Referring to fig. 9, a flowchart of an input control method 203 for the terminal 10 in fig. 2 according to a third embodiment of the present invention is shown. Steps S20, S30, S40 and S50 of the input control method 203 in the third embodiment are the same as those in the first embodiment and are not repeated here; the difference is that step S10 includes:
s131, responding to a preset sliding gesture triggered by a user on a touch input unit of the terminal, and dispatching a preset sliding gesture event through a pipeline registration event;
and S132, responding to the preset sliding gesture event, and calling an application interface corresponding to the preset sliding gesture event.
In steps S131 and S132, the preset sliding gesture event is reported in a manner of directly dispatching a pipeline registration event, and the system bottom layer directly receives the preset sliding gesture event through the pipeline registration event and calls an application interface corresponding to the preset sliding gesture event. At this time, the foreground application window may also receive a preset swipe gesture event dispatched via the distribution filter layer and the framework service layer.
In the input control method 203 of the third embodiment of the present invention, the preset slide gesture event is dispatched directly through the pipeline registration event and received by the system bottom layer directly through the pipeline registration event, and the application interface corresponding to the preset slide gesture event is called, further improving the speed and accuracy of the system's response to the preset slide gesture.
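The patent does not spell out how the pipeline registration event is implemented; purely as an illustration of the idea of registering a pipe so the gesture handler receives events directly, bypassing the filter-layer and framework-layer path, the following sketch uses plain Java pipes. The names and the textual wire format are assumptions and do not represent the actual system-level event channel.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.io.PrintWriter;

// Purely illustrative sketch of "register a pipe, receive gesture events directly".
// The real mechanism in the patent is a system-level event channel; everything here
// (names, wire format) is an assumption for illustration.
public class PipeChannelDemo {
    public static void main(String[] args) throws IOException {
        PipedOutputStream writeEnd = new PipedOutputStream();
        PipedInputStream readEnd = new PipedInputStream(writeEnd);

        // Consumer side: the gesture handler reads events straight off the pipe,
        // bypassing the normal filter-layer / framework-layer dispatch path.
        Thread gestureHandler = new Thread(() -> {
            try (BufferedReader in = new BufferedReader(new InputStreamReader(readEnd))) {
                String event;
                while ((event = in.readLine()) != null) {
                    System.out.println("gesture handler received: " + event);
                }
            } catch (IOException ignored) {
            }
        });
        gestureHandler.start();

        // Producer side: the input side writes the preset sliding gesture events.
        try (PrintWriter out = new PrintWriter(writeEnd, true)) {
            out.println("MOTION_DOWN 540 1900");
            out.println("MOTION_MOVE 540 1700");
            out.println("MOTION_UP 540 1500");
        }
    }
}
```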
Referring to fig. 10, a flowchart of an input control method 204 for the terminal 10 in fig. 2 according to a fourth embodiment of the present invention is shown. Steps S10, S30, S40 and S50 of the input control method 204 in the fourth embodiment are the same as those in the first embodiment and are not repeated here; the difference is that step S20 includes:
step S241, judging whether the sliding distance of the preset sliding gesture on the touch input unit of the terminal in a preset direction from a preset position reaches a preset threshold value;
step S242, when the sliding distance of the preset sliding gesture from the preset position in the preset direction reaches the preset threshold value, injecting a cancellation event having a preset flag bit;
and step S243, responding to the cancellation event, and canceling the continuous dispatch of the preset sliding gesture event.
Specifically, in step S241, when the user triggers a touch operation of a preset slide gesture on the touch input unit 12 of the terminal 10, the touch input unit 12 detects the touch operation of the user on the touch input unit 12 and generates a corresponding touch signal; when the processing unit 14 determines that the touch operation is a preset sliding gesture according to the touch signal, the processing unit 14 further determines whether a sliding distance of the preset sliding gesture on the touch input unit of the terminal from a preset position in a preset direction reaches a preset threshold value according to the touch signal.
Further, the preset position may be the position corresponding to the user's first touch-down (TOUCH DOWN) action in the preset slide gesture, or another position in the preset slide gesture.
In an embodiment, the processing unit 14 may determine, according to the touch signal, the user's first touch-down (TOUCH DOWN) action in the touch operation and check whether the touch position of the first touch-down action matches the preset position; when the touch position of the first touch-down action matches the preset position, the processing unit 14 further determines, according to the touch signal, whether the moving direction of the touch operation matches the preset direction; and when the moving direction matches the preset direction, it further determines whether the sliding distance of the preset sliding gesture from the preset position in the preset direction on the touch input unit of the terminal reaches the preset threshold value.
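As a hedged sketch of this recognition logic, assuming a standard Android MotionEvent stream: the detector below checks that the first touch-down lands in a band along the bottom edge (the "preset position"; the 16 dp band width is an assumption), that the movement is upward (the "preset direction"), and that the upward distance reaches a threshold converted from dp to pixels (8 dp, as mentioned later in the text). Everything beyond those stated values is illustrative.

```java
import android.util.DisplayMetrics;
import android.util.TypedValue;
import android.view.MotionEvent;

// Sketch only: recognizing a bottom-edge pull-up gesture and deciding when its
// sliding distance from the first touch-down point reaches the preset threshold.
final class PullUpDetector {
    private final float bottomEdgePx;   // "preset position": a band along the screen bottom (assumed 16 dp)
    private final float thresholdPx;    // preset threshold, here 8 dp converted to pixels
    private float downX, downY;
    private boolean tracking = false;

    PullUpDetector(DisplayMetrics metrics, int screenHeightPx) {
        this.bottomEdgePx = screenHeightPx
                - TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, 16, metrics);
        this.thresholdPx = TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, 8, metrics);
    }

    // Returns true the moment the pull-up gesture has moved upward by the threshold distance.
    boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getX();
                downY = event.getY();
                tracking = downY >= bottomEdgePx;        // preset position: bottom edge
                return false;
            case MotionEvent.ACTION_MOVE:
                if (!tracking) return false;
                float dy = downY - event.getY();         // preset direction: upward
                return dy > 0 && dy >= thresholdPx;      // preset threshold reached
            default:
                tracking = false;
                return false;
        }
    }
}
```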
In step S242, when the sliding distance of the preset sliding gesture from the preset position in the preset direction reaches the preset threshold value, a cancellation event having a preset flag bit is injected. Specifically, the cancellation event has a preset flag bit, and this preset flag bit can serve as a basis for executing subsequent operations.
In step S243, in response to the cancel event, the continuous dispatch of the preset slide gesture event is cancelled. After the cancel event is injected, the user continues to trigger a subsequent touch action of the preset slide gesture on the touch input unit 12 of the terminal 10, and the touch input unit 12 detects that the user continues to trigger the subsequent touch action of the preset slide gesture on the touch input unit 12 to generate a corresponding touch signal; after the touch input unit 12 or the processing unit 14 responds to the touch signal to generate a corresponding preset slide gesture event, the preset slide gesture event is not dispatched any more.
After the cancel event is injected, the system no longer dispatches the preset slide gesture event, and neither the foreground application window that was active before the preset slide gesture was triggered nor the application corresponding to the preset slide gesture event receives the touch events dispatched for the subsequent actions of the preset slide gesture.
In this embodiment, when the preset sliding gesture has been executed to a certain stage, that is, when the sliding distance of the preset sliding gesture from the preset position in the preset direction on the touch input unit of the terminal reaches the preset threshold value, a cancel event with a preset flag bit is injected and the continuous dispatch of the preset sliding gesture event is cancelled, so that the current foreground application window no longer receives the touch events dispatched for the subsequent actions of the preset sliding gesture and therefore does not move in response to the preset sliding gesture.
In this embodiment, the preset threshold is 8 dp.
In one embodiment, the predetermined threshold may be 4 to 10 dp. Further, the preset threshold value can be 6-8 dp.
Further, referring to fig. 11, a flowchart of an input control method 205 for the terminal 10 in fig. 2 according to a fifth embodiment of the present invention is shown. Step S10, step S30 and step S50 of the input control method 205 in the fifth embodiment are the same as those in the fourth embodiment and are not repeated here; the difference lies in that, in step S242, when the sliding distance of the preset sliding gesture from the preset position in the preset direction reaches the preset threshold value, a cancellation event with a preset flag bit is injected, wherein the preset flag bit of the cancellation event is "True".
In the fifth embodiment, step S40 of the input control method 205 includes:
step S451, injecting a slide-down event, and recording that a preset flag bit of the cancel event is "False";
and step S452, continuously dispatching the preset sliding gesture event detected on the touch input unit of the terminal.
Specifically, in step S242, when the sliding distance of the preset sliding gesture from the preset position in the preset direction reaches the preset threshold value, a cancel event whose preset flag bit is "True" is injected. In step S243, in response to the cancellation event whose preset flag bit is "True", the system cancels the continuous dispatch of the preset slide gesture event.
In step S451 and step S452, a slide-down event is injected and the preset flag bit of the cancel event is recorded as "False". At this time, the system no longer responds to the cancellation event because its preset flag bit is "False"; meanwhile, because of the injected slide-down event, the touch events of the user's subsequent touch operations of the preset slide gesture on the touch input unit 12 of the terminal 10 continue to be dispatched. Since the original foreground application window has already been replaced by the application interface corresponding to the preset sliding gesture event, the original foreground application window can no longer receive the preset sliding gesture events dispatched for the subsequent touch operations of the preset sliding gesture, and only the application corresponding to the preset sliding gesture event receives them.
In the fifth embodiment, the input control method 205 revokes the cancellation event by assigning the preset flag bit two different values, "True" and "False": when the sliding press event is injected, the preset flag bit of the cancellation event is recorded as "False", so that the touch events of the user's subsequent touch operations of the preset sliding gesture on the touch input unit 12 of the terminal 10 continue to be dispatched.
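A minimal sketch of this "True"/"False" bookkeeping, assuming hypothetical names: dispatch of the preset sliding gesture event is suspended while the cancel flag is set and resumes once the sliding press event is injected and the flag is cleared.

```java
// Sketch only: the "True"/"False" flag bookkeeping described above.
// Names are hypothetical; only the flag logic is illustrated.
final class CancelFlagDispatcher {
    private boolean cancelFlag = false;  // preset flag bit of the cancellation event

    void onCancelInjected() {
        cancelFlag = true;               // flag = "True": stop dispatching gesture events
    }

    void onSlideDownInjected() {
        cancelFlag = false;              // flag = "False": the cancellation event is revoked
    }

    // Called for every subsequent preset-sliding-gesture event.
    void dispatch(String gestureEvent) {
        if (cancelFlag) {
            return;                      // dispatch of the gesture event is cancelled
        }
        System.out.println("dispatching " + gestureEvent + " to the foreground window");
    }
}
```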
Preferably, in an embodiment, step S452 further includes: continuing to dispatch the preset sliding gesture event detected on the touch input unit of the terminal to the foreground application window through the pipeline registration event. In this embodiment, the preset slide gesture events generated by the subsequent touch operations of the preset slide gesture are dispatched directly through the pipeline registration event, and the application corresponding to the preset slide gesture event receives these subsequent slide gesture events through the pipeline registration event, further improving the speed and accuracy of the system's response to the preset slide gesture.
Referring again to fig. 2, the storage unit 13 of the terminal 10 is further configured to store a computer program operable on the processing unit 14, and the processing unit 14 is configured to execute the computer program; the processing unit 14 implements, when executing the computer program, the steps of the input control method in any of the above embodiments.
Specifically, the processing unit 14 is configured to implement the following steps when executing the computer program:
step S10, responding to a preset sliding gesture event detected on a touch input unit of the terminal, and calling an application interface corresponding to the preset sliding gesture event;
step S20, when the sliding distance of the preset sliding gesture reaches a preset threshold value, injecting a cancellation event, and canceling the continuous dispatch of the preset sliding gesture event;
step S30, switching the application interface corresponding to the preset sliding gesture event into a foreground application window;
step S40, injecting a sliding pressing event, and continuing to dispatch a preset sliding gesture event detected on a touch input unit of the terminal;
and step S50, displaying an application interface corresponding to the preset sliding gesture event according to the preset sliding gesture event which is continuously dispatched.
The terminal 10 injects a cancellation event when the sliding distance of the preset sliding gesture reaches the preset threshold value, thereby canceling the continuous dispatch of the preset sliding gesture event; it then switches the application interface corresponding to the preset sliding gesture event to be the foreground application window; a sliding press event is then injected so that the subsequent touch actions of the preset sliding gesture that the user continues to trigger on the touch input unit 12 of the terminal 10 can continue to be dispatched. In this way, the user interface corresponding to the current foreground application window is shielded from responding to the preset sliding gesture, and only the application interface corresponding to the preset sliding gesture event is called and executed. At the same time, the processing speed is optimized, the processing response time is shortened, and the user experience is enhanced.
Further, when the processing unit 14 executes the computer program, step S10 includes:
Step S121, in response to a preset sliding gesture performed by the user at a preset position and in a preset direction on the touch input unit of the terminal, dispatching a preset sliding gesture event;
Step S122, in response to the preset sliding gesture event, calling an application interface corresponding to the preset sliding gesture event.
The terminal 10 may inject a preset slide gesture event to invoke an application interface corresponding to the preset slide gesture event when a user performs a preset slide gesture at a preset position and in a preset direction on the touch input unit 12 of the terminal 10. The preset sliding gesture adopts a sliding gesture from a preset position to a preset direction, sufficient reaction time can be provided, and when the sliding is started from the preset position to the preset direction, the system responds to the preset sliding gesture event and calls an application interface corresponding to the preset sliding gesture event.
Further, when the processing unit 14 executes the computer program, step S10 includes:
s131, responding to a preset sliding gesture triggered by a user on a touch input unit of the terminal, and dispatching a preset sliding gesture event through a pipeline registration event;
and S132, responding to the preset sliding gesture event, and calling an application interface corresponding to the preset sliding gesture event.
The terminal 10 dispatches the preset sliding gesture event directly through the pipeline registration event; the system bottom layer receives the preset sliding gesture event directly through the pipeline registration event and calls the application interface corresponding to the preset sliding gesture event, further improving the speed and accuracy of the system's response to the preset sliding gesture.
Further, when the processing unit 14 executes the computer program, step S20 includes:
step S241, judging whether the sliding distance of the preset sliding gesture on the touch input unit of the terminal from the preset position in the preset direction reaches the preset threshold value;
step S242, when the sliding distance of the preset sliding gesture from the preset position in the preset direction reaches the preset threshold value, injecting a cancellation event having a preset flag bit;
and step S243, responding to the cancellation event, and canceling the continued dispatch of the preset sliding gesture event.
In other words, when the preset sliding gesture has been carried out to a certain stage on the touch input unit of the terminal, that is, when its sliding distance from the preset position in the preset direction reaches the preset threshold value, the terminal 10 injects a cancellation event having a preset flag bit and cancels the continued dispatch of the preset sliding gesture event. The current foreground application window therefore no longer receives the touch events dispatched by the remaining part of the preset sliding gesture and, as a result, does not act in response to the preset sliding gesture.
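A minimal sketch of the threshold test in step S241, assuming the dp threshold is converted to pixels with the display density and that the preset direction is upward; both assumptions go beyond what the embodiment states.

```java
import android.util.DisplayMetrics;
import android.util.TypedValue;

public final class SlideThreshold {
    private SlideThreshold() {}

    /** Converts the preset threshold (e.g. 8 dp) into pixels for the current display. */
    public static float thresholdPx(DisplayMetrics metrics, float thresholdDp) {
        return TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, thresholdDp, metrics);
    }

    /**
     * Step S241: has the gesture slid far enough from the preset position in the
     * preset (upward) direction? Only upward displacement counts here.
     */
    public static boolean reachedThreshold(float downY, float currentY, float thresholdPx) {
        float upwardDistance = downY - currentY; // positive when the finger moves up
        return upwardDistance >= thresholdPx;
    }
}
```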
Further, when the processing unit 14 executes the computer program, in step S242 a cancellation event having a preset flag bit is injected when the sliding distance of the preset sliding gesture from the preset position in the preset direction reaches the preset threshold value, wherein the preset flag bit of the cancellation event is "True"; and step S40 includes:
step S451, injecting a slide press-down event, and recording the preset flag bit of the cancellation event as "False";
and step S452, continuing to dispatch the preset sliding gesture event detected on the touch input unit of the terminal.
By assigning the two values "True" and "False" to the preset flag bit of the cancellation event, the terminal 10 in effect logs the cancellation event out again: when the slide press-down event is injected, the preset flag bit of the cancellation event is recorded as "False", so that the touch events generated by the user's subsequent touch operations of the preset sliding gesture on the touch input unit 12 of the terminal 10 continue to be dispatched.
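The flag-bit bookkeeping can be isolated as below; the class and method names are hypothetical, since the embodiment only specifies that the bit is "True" when the cancellation event is injected and "False" when the slide press-down event is injected.

```java
public class CancelFlag {
    // The "preset flag bit" of the cancellation event.
    private volatile boolean cancelled; // true between injecting ACTION_CANCEL and ACTION_DOWN

    /** Step S242: set the flag when the cancellation event is injected. */
    public void onCancelInjected() { cancelled = true; }

    /** Step S451: clear the flag when the slide press-down event is injected. */
    public void onDownInjected() { cancelled = false; }

    /** While the flag is set, further samples of the gesture are not dispatched. */
    public boolean shouldDispatch() { return !cancelled; }
}
```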
Further, when the processing unit 14 executes the computer program, step S452 further includes: continuing to dispatch the preset sliding gesture event detected on the touch input unit of the terminal to the foreground application window through the pipeline registration event. The terminal 10 dispatches the preset sliding gesture events generated by the subsequent touch operations of the preset sliding gesture directly through the pipeline registration event, and the application corresponding to the preset sliding gesture event receives these subsequent events through the pipeline registration event, which further improves the speed and accuracy with which the system responds to the preset sliding gesture.
In this embodiment, the preset threshold value is 8 dp.
In other embodiments, the preset threshold value may be 4 to 10 dp; further, it may be 6 to 8 dp.
In this embodiment, the preset sliding gesture is a touch gesture pulled up from the bottom of the terminal.
The present invention also provides a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the input control method described above.
In the description herein, reference to the terms "one embodiment", "another embodiment", or "the first to Xth embodiments", etc., means that a particular feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the invention. In this specification, such terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, method steps, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. An input control method for touch input control of a terminal, comprising the steps of:
responding to a preset sliding gesture event detected on a touch input unit of the terminal, and calling an application interface corresponding to the preset sliding gesture event;
when the sliding distance of the preset sliding gesture reaches a preset threshold value, injecting a cancellation event, switching the current foreground application window to the background, and canceling the continued dispatch of the preset sliding gesture event;
switching an application interface corresponding to the preset sliding gesture event into a foreground application window;
injecting a slide press-down event, and continuing to dispatch the preset sliding gesture event detected on the touch input unit of the terminal;
and displaying an application interface corresponding to the preset sliding gesture event according to the preset sliding gesture event which is continuously dispatched.
2. The input control method according to claim 1, wherein the step of calling an application interface corresponding to a preset sliding gesture event in response to the preset sliding gesture event detected on the touch input unit of the terminal comprises:
responding to a preset sliding gesture of a user at a preset position and in a preset direction on a touch input unit of the terminal, and dispatching a preset sliding gesture event;
and responding to the preset sliding gesture event, and calling an application interface corresponding to the preset sliding gesture event.
3. The input control method according to claim 1, wherein the step of calling an application interface corresponding to a preset sliding gesture event in response to the preset sliding gesture event detected on the touch input unit of the terminal comprises:
responding to a preset sliding gesture triggered by a user on a touch input unit of the terminal, and dispatching the preset sliding gesture event through a pipeline registration event;
and responding to the preset sliding gesture event, and calling an application interface corresponding to the preset sliding gesture event.
4. The input control method according to claim 1, wherein the step of injecting a cancellation event when the sliding distance of the preset sliding gesture reaches a preset threshold value and canceling the continued dispatch of the preset sliding gesture event comprises:
judging whether the sliding distance of the preset sliding gesture on a touch input unit of the terminal from a preset position in a preset direction reaches a preset threshold value or not;
when the sliding distance of the preset sliding gesture from the preset position in the preset direction reaches the preset threshold value, injecting a cancellation event having a preset flag bit;
and responding to the cancellation event, and canceling the continuous dispatch of the preset sliding gesture event.
5. The input control method according to claim 4, wherein, in the step of injecting the cancellation event having the preset flag bit, the preset flag bit of the cancellation event is "True";
and the step of injecting a slide press-down event and continuing to dispatch the preset sliding gesture event detected on the touch input unit of the terminal comprises:
injecting a slide press-down event, and recording the preset flag bit of the cancellation event as "False";
and continuously dispatching the preset sliding gesture event detected on the touch input unit of the terminal.
6. The input control method of claim 5, wherein the step of continuing to dispatch the preset sliding gesture event detected on the touch input unit of the terminal comprises:
and continuously dispatching the preset sliding gesture event detected on the touch input unit of the terminal to a foreground application window through the pipeline registration event.
7. The input control method of claim 1, wherein the preset sliding gesture is a touch gesture pulled up from the bottom of the terminal.
8. The input control method according to claim 1, wherein the preset threshold value is 4 to 10 dp.
9. A terminal comprising a touch input unit, a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the input control method according to any one of claims 1 to 8 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the input control method according to any one of claims 1 to 8.
CN201710376318.3A 2017-05-24 2017-05-24 Terminal, input control method thereof, and computer-readable storage medium Active CN107229408B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710376318.3A CN107229408B (en) 2017-05-24 2017-05-24 Terminal, input control method thereof, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710376318.3A CN107229408B (en) 2017-05-24 2017-05-24 Terminal, input control method thereof, and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN107229408A CN107229408A (en) 2017-10-03
CN107229408B true CN107229408B (en) 2021-07-23

Family

ID=59934373

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710376318.3A Active CN107229408B (en) 2017-05-24 2017-05-24 Terminal, input control method thereof, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN107229408B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DK180316B1 (en) * 2018-06-03 2020-11-06 Apple Inc Devices and methods for interacting with an application switching user interface
CN109308160B (en) * 2018-09-18 2019-08-06 北京达佳互联信息技术有限公司 Operational order triggering method, device, electronic equipment and storage medium
CN110456978B (en) * 2019-08-13 2021-06-01 青度互娱(重庆)科技有限公司 Touch control method, system, terminal and medium for touch terminal
CN111497613A (en) * 2020-04-03 2020-08-07 广州小鹏汽车科技有限公司 Vehicle interaction method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150121281A1 (en) * 2013-10-30 2015-04-30 Samsung Electronics Co., Ltd. Apparatus and method for displaying notification in electronic device
CN105094908A (en) * 2015-07-24 2015-11-25 北京金山安全软件有限公司 Application program starting method and device and mobile device
CN105183364A (en) * 2015-10-30 2015-12-23 小米科技有限责任公司 Application switching method, application switching device and application switching equipment
CN106648333A (en) * 2016-12-15 2017-05-10 天脉聚源(北京)传媒科技有限公司 Interface treatment method and device


Also Published As

Publication number Publication date
CN107229408A (en) 2017-10-03

Similar Documents

Publication Publication Date Title
US10338789B2 (en) Operation of a computer with touch screen interface
EP2508972B1 (en) Portable electronic device and method of controlling same
KR102090750B1 (en) Electronic device and method for recognizing fingerprint
CN107229408B (en) Terminal, input control method thereof, and computer-readable storage medium
US9459704B2 (en) Method and apparatus for providing one-handed user interface in mobile device having touch screen
US11934648B2 (en) Permission setting method and apparatus and electronic device
US20140035853A1 (en) Method and apparatus for providing user interaction based on multi touch finger gesture
CN109800045B (en) Display method and terminal
CN109032447B (en) Icon processing method and mobile terminal
CN111078076A (en) Application program switching method and electronic equipment
CN110049486B (en) SIM card selection method and terminal equipment
CN108446156B (en) Application program control method and terminal
CN111163224B (en) Voice message playing method and electronic equipment
CN111399716A (en) Icon position adjusting method and electronic equipment
CN109165033B (en) Application updating method and mobile terminal
CN111338524A (en) Application program control method and electronic equipment
CN110825295A (en) Application program control method and electronic equipment
CN107179849B (en) Terminal, input control method thereof, and computer-readable storage medium
CN107728898B (en) Information processing method and mobile terminal
CN110990032A (en) Application program installation method and electronic equipment
CN108491125B (en) Operation control method of application store and mobile terminal
EP3528103B1 (en) Screen locking method, terminal and screen locking device
CN111026302B (en) Display method and electronic equipment
WO2023093661A1 (en) Interface control method and apparatus, and electronic device and storage medium
CN109901760B (en) Object control method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant