CN110248023B - Intelligent terminal control method, device, equipment and medium - Google Patents


Info

Publication number: CN110248023B (application number CN201910497895.7A)
Authority: CN (China)
Prior art keywords: touch, target touch, gesture, target, boundary
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN110248023A
Inventor: 张超
Current and original assignee: Wingtech Communication Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Priority date: (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Application filed by Wingtech Communication Co Ltd; priority to CN201910497895.7A
Publication of CN110248023A, followed by grant and publication of CN110248023B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques for inputting data by handwriting, e.g. gesture or text
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448: User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454: User interfaces adapting the functionality of the device according to context-related or environment-related conditions
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention discloses an intelligent terminal control method, device, equipment, and medium, relating to the technical field of intelligent terminals. The method comprises the following steps: in response to a target touch operation acting on a gesture response area, determining a touch parameter of the target touch operation, where the position of the gesture response area on the screen is determined according to a gesture holding behavior of the user; and generating and executing a target touch instruction according to the touch parameter of the target touch operation. By responding to the user's target touch operation in the gesture response area, determining the touch parameter, and then generating and executing the target touch instruction according to that parameter, the embodiment enables the target touch instruction to be triggered quickly while the user keeps the same holding posture.

Description

Intelligent terminal control method, device, equipment and medium
Technical Field
Embodiments of the present invention relate to the technical field of intelligent terminals, and in particular to an intelligent terminal control method, device, equipment, and medium.
Background
With advances in technology, the screens of intelligent terminals have grown steadily larger, but at roughly 6.0 inches a screen reaches the limit of the palm's comfortable operating range. Increasing the screen size further hinders the user's operation and makes the device inconvenient to carry. Intelligent terminal manufacturers have therefore been shrinking the distance between the screen and the edge of the phone, removing the physical keys below the screen and replacing them with virtual keys.
However, in some scenarios the user does not want the virtual keys displayed at all, and instead wants the phone's content shown full screen. The problem that full-screen gesture control needs to solve is then how to provide the functions of the virtual keys without interfering with the user's one-handed operation.
Disclosure of Invention
Embodiments of the present invention provide an intelligent terminal control method, device, equipment, and medium, aiming to solve the problem that a user operating an intelligent terminal with one hand cannot control it conveniently.
In a first aspect, an embodiment of the present invention provides an intelligent terminal control method, where the method includes:
in response to a target touch operation acting on a gesture response area, determining a touch parameter of the target touch operation, where the position of the gesture response area on the screen is determined according to a gesture holding behavior of the user;
and generating and executing a target touch instruction according to the touch parameter of the target touch operation.
In a second aspect, an embodiment of the present invention provides an intelligent terminal control device, where the device includes:
the touch parameter determination module is used for responding to target touch operation acting on a gesture response area, and determining touch parameters of the target touch operation, wherein the position of the gesture response area in a screen is determined according to gesture holding behaviors of a user;
and the target touch instruction execution module is used for generating and executing a target touch instruction according to the touch parameters of the target touch operation.
In a third aspect, an embodiment of the present invention provides an apparatus, where the apparatus includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the intelligent terminal control method according to any one of the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the intelligent terminal control method according to any one of the embodiments of the present invention.
By responding to the user's target touch operation in the gesture response area, determining the touch parameter, and then generating and executing the target touch instruction according to that parameter, the embodiments of the present invention enable the target touch instruction to be triggered quickly while the user keeps the same holding posture.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered limiting of its scope. Those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1A is a flowchart of an intelligent terminal control method according to an embodiment of the present invention;
fig. 1B is a schematic diagram of a gesture response area according to an embodiment of the present invention;
fig. 2 is a flowchart of an intelligent terminal control method according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of an intelligent terminal control device according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an apparatus according to a fourth embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein merely illustrate the invention and do not limit it. It should further be noted that, for convenience of description, the drawings show only the structures related to the embodiments of the present invention, not all structures.
Example one
Fig. 1A is a flowchart of an intelligent terminal control method according to the first embodiment of the present invention, and fig. 1B is a schematic diagram of a gesture response area according to an embodiment of the present invention. This embodiment is applicable to the situation in which a user operates an intelligent terminal with one hand. The method can be executed by the intelligent terminal control device provided by the embodiments of the present invention. Referring to fig. 1A, the method specifically includes the following steps:
step 101, responding to a target touch operation acting on a gesture response area, and determining a touch parameter of the target touch operation, wherein the position of the gesture response area in a screen is determined according to a gesture holding behavior of a user.
The gesture response area is located on the touch display interface of the intelligent terminal, for example the touch display screen of a smartphone. Its specific position is determined by the gesture holding behavior, which includes, but is not limited to, right-hand holding and left-hand holding; that is, different gesture holding behaviors yield different positions of the gesture response area. The target touch operation includes, but is not limited to, sliding, tapping, and long-pressing, and the touch parameters include, but are not limited to, the initial touch position and the touch duration.
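As a minimal illustration (not part of the patent), the two touch parameters named above can be extracted from a chronological list of touch samples; the `(timestamp_ms, x, y)` tuple format is a hypothetical convention chosen for this sketch:

```python
def touch_parameters(events):
    """Extract the touch parameters used by the method from one gesture:
    the initial touch position and the touch duration.
    `events` is a chronological list of (timestamp_ms, x, y) samples."""
    t_start, x0, y0 = events[0]       # first sample: where the touch began
    t_end, _, _ = events[-1]          # last sample: when the touch ended
    return {"start_pos": (x0, y0), "duration_ms": t_end - t_start}
```

For example, a slide sampled at 0 ms, 60 ms, and 150 ms yields a start position of the first sample and a duration of 150 ms.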
Optionally, this embodiment provides a specific implementation manner of determining a position of a gesture response area in a screen according to a gesture holding behavior of a user, including:
if the gesture holding behavior of the user belongs to right-hand holding, the gesture response area comprises a first boundary located at the bottom of the terminal part and a second boundary located at the right side of the terminal part; if the gesture holding behavior of the user belongs to left-handed holding, the gesture response area comprises a first boundary located at the bottom of the terminal part and a second boundary located at the left side of the terminal part; wherein the first boundary is connected to the second boundary.
Specifically, when the user holds the touch screen with the left hand or the right hand, the areas touched by the fingers of the user are different, so that different gesture response areas are generated based on the left hand or the right hand holding of the user to facilitate the user to implement the target touch operation.
Optionally, pressure sensors are loaded on the left side and the right side of the intelligent terminal to measure pressure values on the left side and the right side of the intelligent terminal, so that the gesture holding behavior of the user is determined. When held by a right hand of a user, the gesture responsive area includes a first boundary at a bottom of the terminal portion and a second boundary at a right side of the terminal portion; when held by the left hand of a user, the gesture responsive area comprises a first boundary at the bottom of the terminal portion and a second boundary at the left side of the terminal portion; the lengths of the first boundary and the second boundary can be set according to the size of the intelligent terminal display interface. Optionally, the shape of the gesture response area includes, but is not limited to, a sector, a rectangle, a triangle, or the like. Fig. 1B is a schematic diagram of a fan-shaped gesture response area, where 11 represents a gesture response area corresponding to a left-hand holding of a user, and 12 represents a gesture response area corresponding to a right-hand holding of a user.
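The pressure-sensor hold detection and the sector-shaped response area anchored at a bottom corner can be sketched as follows. This is an illustrative sketch under stated assumptions, not the patent's implementation: the rule "larger side pressure means gripping hand," the top-left screen origin with y growing downward, and the default radius are all hypothetical choices:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ResponseArea:
    """Sector-shaped gesture response area anchored at a bottom screen corner."""
    corner: Tuple[int, int]   # (x, y) of the anchoring corner
    radius: float             # sector radius, tunable per display size

def detect_holding_hand(left_pressure: float, right_pressure: float) -> str:
    """Infer the holding hand from the side pressure sensors.
    Assumption: the gripping side registers the larger pressure."""
    return "right" if right_pressure >= left_pressure else "left"

def build_response_area(hand: str, screen_w: int, screen_h: int,
                        radius: float = 300.0) -> ResponseArea:
    """Anchor the sector at the bottom-right corner for right-hand holding
    (boundaries: bottom edge + right edge) and bottom-left for left-hand."""
    corner = (screen_w, screen_h) if hand == "right" else (0, screen_h)
    return ResponseArea(corner=corner, radius=radius)

def in_response_area(area: ResponseArea, x: int, y: int) -> bool:
    """A touch point falls in the sector when it lies within the radius
    of the anchoring corner (the quarter-circle around that corner)."""
    dx, dy = x - area.corner[0], y - area.corner[1]
    return dx * dx + dy * dy <= area.radius ** 2
```

With a 1080 x 2160 screen held in the right hand, the area is anchored at (1080, 2160) and a touch near that corner falls inside it.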
Determining the touch parameters in response to the user's target touch operation in the gesture response area provides the data basis for generating and executing a target touch instruction from those parameters.
And 102, generating and executing a target touch instruction according to the touch parameter of the target touch operation.
The target touch instruction controls the intelligent terminal to execute a corresponding operation; that is, different touch parameters correspond to different target touch instructions, which in turn cause the intelligent terminal to execute different operations.
Generating and executing the target touch instruction according to the touch parameters of the target touch operation thus achieves the technical effect of controlling the intelligent terminal to execute different operations.
In the technical solution of this embodiment, the user's target touch operation in the gesture response area is responded to, the touch parameter is determined, and the target touch instruction is then generated and executed according to that parameter, so that the target touch instruction can be triggered quickly while the user keeps the same holding posture.
Example two
Fig. 2 is a flowchart of an intelligent terminal control method according to the second embodiment of the present invention, which provides a specific implementation of "generating and executing a target touch instruction according to a touch parameter of a target touch operation" from the first embodiment. The method specifically includes the following steps:
Step 201: if the initial touch position of the target touch operation lies on the first boundary of the gesture response area and the touch duration is less than a first duration threshold, generate and execute a first target touch instruction.
The first boundary of the gesture response area is located at the bottom of the terminal, the touch duration is how long the user's touch lasts within the gesture response area, and the first target touch instruction comprises a home-screen display touch instruction.
For example, suppose the first duration threshold is 100 ms, the user holds the terminal in the right hand, and the target touch operation is a slide. When the user slides within the gesture response area starting from a point at the bottom of the terminal, and the slide takes less than 100 ms, a home-screen display touch instruction is generated and executed. The sliding trajectory includes, but is not limited to, a straight or curved trajectory.
Step 202: if the initial touch position of the target touch operation lies on the first boundary of the gesture response area and the touch duration is greater than the first duration threshold and less than a second duration threshold, generate and execute a second target touch instruction, where the second duration threshold is greater than the first.
The second target touch instruction comprises a background management touch instruction.
For example, suppose the first duration threshold is 100 ms, the second duration threshold is 500 ms, the user holds the terminal in the right hand, and the target touch operation is a slide. When the user slides within the gesture response area starting from a point at the bottom of the terminal, and the slide takes more than 100 ms but less than 500 ms, a background management touch instruction is generated and executed. The sliding trajectory includes, but is not limited to, a straight or curved trajectory.
Step 203: if the initial touch position of the target touch operation lies on the second boundary of the gesture response area and the touch duration is less than the first duration threshold, generate and execute a third target touch instruction.
The second boundary of the gesture response area is located at the left or right side of the terminal, and the third target touch instruction comprises a return-to-previous-level touch instruction.
For example, suppose the first duration threshold is 100 ms, the user holds the terminal in the right hand, and the target touch operation is a slide. When the user slides within the gesture response area starting from a point on the right side of the terminal, and the slide takes less than 100 ms, a return-to-previous-level touch instruction is generated and executed. The sliding trajectory includes, but is not limited to, a straight or curved trajectory.
Step 204: if the initial touch position of the target touch operation lies on the second boundary of the gesture response area and the touch duration is greater than the first duration threshold and less than the second duration threshold, generate and execute a fourth target touch instruction, where the second duration threshold is greater than the first.
The fourth target touch instruction comprises a screen capture touch instruction.
For example, suppose the first duration threshold is 100 ms, the second duration threshold is 500 ms, the user holds the terminal in the right hand, and the target touch operation is a slide. When the user slides within the gesture response area starting from a point on the right side of the terminal, and the slide takes more than 100 ms but less than 500 ms, a screen capture touch instruction is generated and executed. The sliding trajectory includes, but is not limited to, a straight or curved trajectory.
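The four branches of steps 201 to 204 amount to a lookup on (starting boundary, duration bucket). A minimal sketch follows, using the exemplary 100 ms and 500 ms thresholds; the instruction names are hypothetical labels, and since the embodiment defines no behavior for durations at or above the second threshold, the sketch returns None there:

```python
from typing import Optional

FIRST_THRESHOLD_MS = 100    # exemplary value from the embodiment
SECOND_THRESHOLD_MS = 500   # exemplary value from the embodiment

def dispatch_instruction(start_boundary: str, duration_ms: float) -> Optional[str]:
    """Map the touch parameters of steps 201-204 to a target touch instruction.
    `start_boundary` is "first" (bottom edge) or "second" (left/right edge).
    Returns None for durations the embodiment does not cover."""
    if duration_ms < FIRST_THRESHOLD_MS:
        # Steps 201 and 203: short touch
        return {"first": "show_home_screen",
                "second": "return_previous_level"}[start_boundary]
    if duration_ms < SECOND_THRESHOLD_MS:
        # Steps 202 and 204: longer touch below the second threshold
        return {"first": "open_background_manager",
                "second": "take_screenshot"}[start_boundary]
    return None
```

A 50 ms slide from the bottom edge maps to the home-screen instruction, while a 300 ms slide from the side edge maps to the screenshot instruction.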
In the technical solution of this embodiment, four different target touch instructions are generated and executed according to the initial touch position and touch duration of the target touch operation, so that the target touch instructions can be triggered quickly.
EXAMPLE III
Fig. 3 is a schematic structural diagram of an intelligent terminal control device according to the third embodiment of the present invention. The device can execute the intelligent terminal control method of any embodiment of the present invention and has the functional modules and beneficial effects corresponding to that method. As shown in fig. 3, the device may include:
a touch parameter determination module 31, configured to determine, in response to a target touch operation applied to a gesture response area, a touch parameter of the target touch operation, where a position of the gesture response area in the screen is determined according to a gesture holding behavior of a user;
and a target touch instruction execution module 32, configured to generate and execute a target touch instruction according to the touch parameter of the target touch operation.
On the basis of the above embodiment, the apparatus further includes a gesture response area determining module, specifically configured to:
if the user's gesture holding behavior is right-hand holding, the gesture response area comprises a first boundary located at the bottom of the terminal and a second boundary located at the right side of the terminal;
if the user's gesture holding behavior is left-hand holding, the gesture response area comprises a first boundary located at the bottom of the terminal and a second boundary located at the left side of the terminal;
wherein the first boundary is connected to the second boundary.
On the basis of the foregoing embodiment, the target touch instruction execution module 32 is specifically configured to:
if the initial touch position of the target touch operation is located on a first boundary of the gesture response area and the touch duration is smaller than a first duration threshold, generating and executing a first target touch instruction;
if the initial touch position of the target touch operation is located on a first boundary of the gesture response area, and the touch duration is greater than a first duration threshold and less than a second duration threshold, generating and executing a second target touch instruction, wherein the second duration threshold is greater than the first duration threshold;
if the initial touch position of the target touch operation is located on a second boundary of the gesture response area, and the touch duration is smaller than a first duration threshold, generating and executing a third target touch instruction;
and if the initial touch position of the target touch operation is located on a second boundary of the gesture response area, and the touch duration is greater than the first duration threshold and less than a second duration threshold, generating and executing a fourth target touch instruction, wherein the second duration threshold is greater than the first duration threshold.
On the basis of the above embodiment, the first target touch instruction comprises a home-screen display touch instruction; the second target touch instruction comprises a background management touch instruction; the third target touch instruction comprises a return-to-previous-level touch instruction; and the fourth target touch instruction comprises a screen capture touch instruction.
The intelligent terminal control device provided by the embodiment of the invention can execute the intelligent terminal control method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method. For details of the technology that are not described in detail in this embodiment, reference may be made to the intelligent terminal control method provided in any embodiment of the present invention.
Example four
Fig. 4 is a schematic structural diagram of an apparatus according to the fourth embodiment of the present invention. It illustrates a block diagram of an exemplary device 400 suitable for implementing embodiments of the present invention. The device 400 shown in fig. 4 is only an example and should not impose any limitation on the functionality or scope of use of the embodiments of the present invention.
As shown in FIG. 4, device 400 is in the form of a general purpose computing device. The components of device 400 may include, but are not limited to: one or more processors or processing units 401, a system memory 402, and a bus 403 that couples the various system components (including the system memory 402 and the processing unit 401).
Bus 403 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Device 400 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by device 400 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 402 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 404 and/or cache memory 405. The device 400 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, storage system 406 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in fig. 4 and commonly referred to as a "hard drive"). Although not shown in fig. 4, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to the bus 403 by one or more data media interfaces. Memory 402 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the invention.
A program/utility 408 having a set (at least one) of program modules 407 may be stored, for example, in memory 402, such program modules 407 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 407 generally perform the functions and/or methods of the described embodiments of the invention.
Device 400 may also communicate with one or more external devices 409 (e.g., keyboard, pointing device, display 410, etc.), with one or more devices that enable a user to interact with device 400, and/or with any devices (e.g., network card, modem, etc.) that enable device 400 to communicate with one or more other computing devices. Such communication may be through input/output (I/O) interface 411. Also, device 400 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) through network adapter 412. As shown, the network adapter 412 communicates with the other modules of the device 400 over the bus 403. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with device 400, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 401 executes various functional applications and data processing by running the program stored in the system memory 402, for example, to implement the intelligent terminal control method provided by the embodiment of the present invention, including:
responding to a target touch operation acting on a gesture response area, and determining a touch parameter of the target touch operation, wherein the position of the gesture response area in a screen is determined according to a gesture holding behavior of a user;
and generating and executing a target touch instruction according to the touch parameter of the target touch operation.
Of course, it can be understood by those skilled in the art that the processing unit 401 may also implement the technical solution of the intelligent terminal control method provided by any embodiment of the present invention by running the program stored in the system memory 402.
EXAMPLE five
An embodiment of the present invention further provides a computer-readable storage medium storing computer-executable instructions which, when executed by a computer processor, perform an intelligent terminal control method comprising:
responding to a target touch operation acting on a gesture response area, and determining a touch parameter of the target touch operation, wherein the position of the gesture response area in a screen is determined according to a gesture holding behavior of a user;
and generating and executing a target touch instruction according to the touch parameter of the target touch operation.
Of course, the storage medium containing the computer-executable instructions provided by the embodiments of the present invention is not limited to the method operations described above, and may also perform related operations in an intelligent terminal control method provided by any embodiment of the present invention. The computer-readable storage media of embodiments of the invention may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer-readable signal medium may also be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++, or the like, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that the foregoing describes only preferred embodiments of the present invention and the technical principles employed. Those skilled in the art will understand that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions may be made without departing from the scope of the invention. Therefore, although the present invention has been described in some detail through the above embodiments, it is not limited to those embodiments and may include other equivalent embodiments without departing from its spirit; the scope of the present invention is determined by the appended claims.

Claims (6)

1. An intelligent terminal control method, wherein pressure sensors are respectively arranged on a left side and a right side of an intelligent terminal, the method comprising:
in response to a target touch operation acting on a gesture response area, determining touch parameters of the target touch operation, wherein the position of the gesture response area in a screen is determined according to a gesture holding behavior of a user, and pressure values of the left side and the right side of the intelligent terminal are measured by the pressure sensors to determine the gesture holding behavior of the user;
if the gesture holding behavior is right-hand holding, the gesture response area comprises a first boundary located at part of a bottom of the intelligent terminal and a second boundary located at part of the right side of the intelligent terminal;
if the gesture holding behavior is left-hand holding, the gesture response area comprises a first boundary located at part of the bottom of the intelligent terminal and a second boundary located at part of the left side of the intelligent terminal; wherein the first boundary is connected to the second boundary;
generating and executing a target touch instruction according to the touch parameters of the target touch operation;
wherein generating and executing the target touch instruction according to the touch parameters of the target touch operation comprises:
if an initial touch position of the target touch operation is located on the first boundary of the gesture response area and a touch duration is less than a first duration threshold, generating and executing a first target touch instruction;
if the initial touch position of the target touch operation is located on the first boundary of the gesture response area and the touch duration is greater than the first duration threshold and less than a second duration threshold, generating and executing a second target touch instruction, wherein the second duration threshold is greater than the first duration threshold;
if the initial touch position of the target touch operation is located on the second boundary of the gesture response area and the touch duration is less than the first duration threshold, generating and executing a third target touch instruction; and
if the initial touch position of the target touch operation is located on the second boundary of the gesture response area and the touch duration is greater than the first duration threshold and less than the second duration threshold, generating and executing a fourth target touch instruction.
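A compact reading of claim 1 is two lookups: the side pressure sensors pick the holding hand, the holding hand picks the two boundaries of the gesture response area, and the (boundary, duration) pair of a touch picks one of four instructions. The sketch below only illustrates that reading; the threshold values, the pressure heuristic, and all function names are assumptions, not values from the patent.

```python
# Illustrative constants: the claim only requires that the second threshold
# exceed the first; the concrete values here are invented for the sketch.
FIRST_DURATION_THRESHOLD = 0.3   # seconds; assumed short-press cutoff
SECOND_DURATION_THRESHOLD = 1.0  # seconds; must exceed the first threshold

def detect_grip(left_pressure, right_pressure):
    """Infer the gesture holding behavior from the two side pressure sensors.
    Heuristic only: the palm of the holding hand presses its side harder."""
    return "right" if right_pressure >= left_pressure else "left"

def gesture_boundaries(grip):
    """The first boundary lies on part of the bottom edge; the second lies on
    the same side as the holding hand and connects to the first."""
    return ("bottom", "right") if grip == "right" else ("bottom", "left")

def select_instruction(initial_boundary, duration):
    """Map the boundary of the initial touch position and the touch duration
    to one of the four target touch instructions. Durations at or beyond the
    second threshold are not covered by the claim, so None is returned."""
    on_first_boundary = initial_boundary == "bottom"
    if duration < FIRST_DURATION_THRESHOLD:
        return "first" if on_first_boundary else "third"
    if duration < SECOND_DURATION_THRESHOLD:
        return "second" if on_first_boundary else "fourth"
    return None
```

For example, with a right-hand grip (higher right-side pressure) the second boundary is the right edge, and a 0.1-second touch starting there selects the third target touch instruction.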
2. The method of claim 1, wherein the first target touch instruction comprises a home screen display touch instruction; the second target touch instruction comprises a background management touch instruction; the third target touch instruction comprises a return-to-previous-level touch instruction; and the fourth target touch instruction comprises a screen capture touch instruction.
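Claim 2 fixes the four instructions to concrete commands, which reduces to a lookup table. The command strings below are placeholders for whatever the platform's actual home, recents, back, and screenshot actions are; they are not an API named by the patent.

```python
# Illustrative mapping of claim 2: each of the four target touch instructions
# carries one concrete command. The string values are placeholders.
COMMANDS = {
    "first": "home_screen_display",     # short touch on the first (bottom) boundary
    "second": "background_management",  # longer touch on the first boundary
    "third": "return_previous_level",   # short touch on the second (side) boundary
    "fourth": "screen_capture",         # longer touch on the second boundary
}

def command_for(instruction):
    """Resolve a target touch instruction to its concrete command, if any."""
    return COMMANDS.get(instruction)
```

Keeping the instruction-to-command mapping in one table means the dispatch logic of claim 1 never needs to change when the concrete commands do.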
3. An intelligent terminal control device, wherein pressure sensors are respectively arranged on a left side and a right side of an intelligent terminal, the device comprising:
a touch parameter determination module, configured to determine touch parameters of a target touch operation in response to the target touch operation acting on a gesture response area, wherein the position of the gesture response area in a screen is determined according to a gesture holding behavior of a user, and pressure values of the left side and the right side of the intelligent terminal are measured by the pressure sensors to determine the gesture holding behavior of the user;
wherein, if the gesture holding behavior is right-hand holding, the gesture response area comprises a first boundary located at part of a bottom of the intelligent terminal and a second boundary located at part of the right side of the intelligent terminal;
if the gesture holding behavior is left-hand holding, the gesture response area comprises a first boundary located at part of the bottom of the intelligent terminal and a second boundary located at part of the left side of the intelligent terminal; and the first boundary is connected to the second boundary; and
a target touch instruction execution module, configured to generate and execute a target touch instruction according to the touch parameters of the target touch operation;
wherein the target touch instruction execution module is specifically configured to:
if an initial touch position of the target touch operation is located on the first boundary of the gesture response area and a touch duration is less than a first duration threshold, generate and execute a first target touch instruction;
if the initial touch position of the target touch operation is located on the first boundary of the gesture response area and the touch duration is greater than the first duration threshold and less than a second duration threshold, generate and execute a second target touch instruction, wherein the second duration threshold is greater than the first duration threshold;
if the initial touch position of the target touch operation is located on the second boundary of the gesture response area and the touch duration is less than the first duration threshold, generate and execute a third target touch instruction; and
if the initial touch position of the target touch operation is located on the second boundary of the gesture response area and the touch duration is greater than the first duration threshold and less than the second duration threshold, generate and execute a fourth target touch instruction.
4. The device of claim 3, wherein the first target touch instruction comprises a home screen display touch instruction; the second target touch instruction comprises a background management touch instruction; the third target touch instruction comprises a return-to-previous-level touch instruction; and the fourth target touch instruction comprises a screen capture touch instruction.
5. An apparatus, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the intelligent terminal control method of any one of claims 1-2.
6. A computer-readable medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements the intelligent terminal control method according to any one of claims 1-2.
CN201910497895.7A 2019-06-10 2019-06-10 Intelligent terminal control method, device, equipment and medium Active CN110248023B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910497895.7A CN110248023B (en) 2019-06-10 2019-06-10 Intelligent terminal control method, device, equipment and medium


Publications (2)

Publication Number Publication Date
CN110248023A CN110248023A (en) 2019-09-17
CN110248023B true CN110248023B (en) 2021-09-10

Family

ID=67886444

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910497895.7A Active CN110248023B (en) 2019-06-10 2019-06-10 Intelligent terminal control method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN110248023B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112114736A (en) * 2020-09-09 2020-12-22 江苏紫米电子技术有限公司 Operation control method and device, electronic equipment and storage medium
WO2023077292A1 (en) * 2021-11-03 2023-05-11 北京奇点跳跃科技有限公司 Method and apparatus for controlling terminal screen by means of trackpad, control device, and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107562346A (en) * 2017-09-06 2018-01-09 广东欧珀移动通信有限公司 Terminal control method, device, terminal and computer-readable recording medium
CN109766043A (en) * 2018-12-29 2019-05-17 华为技术有限公司 The operating method and electronic equipment of electronic equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170199662A1 (en) * 2014-05-26 2017-07-13 Huawei Technologies Co., Ltd. Touch operation method and apparatus for terminal
CN107493389A (en) * 2017-08-29 2017-12-19 深圳市金立通信设备有限公司 Singlehanded mode implementation method, terminal and computer-readable medium


Also Published As

Publication number Publication date
CN110248023A (en) 2019-09-17

Similar Documents

Publication Publication Date Title
CN108196759B (en) Icon control method and terminal
US20210256077A1 (en) Methods, devices and computer-readable storage media for processing a hosted application
US9612675B2 (en) Emulating pressure sensitivity on multi-touch devices
CN107992251B (en) Skill control method, skill control device, electronic equipment and storage medium
US20190361593A1 (en) Screen Capturing Method and Apparatus
US9632693B2 (en) Translation of touch input into local input based on a translation profile for an application
US20220152476A1 (en) Method and device for processing information in game, storage medium and electronic device
CN105335099A (en) Memory cleaning method and terminal
US20210326151A1 (en) Methods, devices and computer-readable storage media for processing a hosted application
CN110248023B (en) Intelligent terminal control method, device, equipment and medium
EP2874063A2 (en) Method and apparatus for allocating computing resources in touch-based mobile device
CN110471610A (en) Terminal control method, device, terminal and storage medium
CN116483246A (en) Input control method and device, electronic equipment and storage medium
CN109857298B (en) Application starting method, device, equipment and storage medium
CN110850982A (en) AR-based human-computer interaction learning method, system, device and storage medium
US9026691B2 (en) Semi-autonomous touch I/O device controller operation under control of host
CN116107531A (en) Interface display method and device
US20230070059A1 (en) False touch rejection method, terminal device, and storage medium
CN110908568A (en) Control method and device for virtual object
US10620718B2 (en) Device selection in three-dimensional environments
US20180217747A1 (en) Adaptive user interface based on left or right handed mode
CN109358755B (en) Gesture detection method and device for mobile terminal and mobile terminal
CN110162251B (en) Image scaling method and device, storage medium and electronic equipment
CN111352357B (en) Robot control method and device and terminal equipment
CN110413153B (en) False touch prevention method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant