CN112925455A - Touch operation execution method and device - Google Patents

Touch operation execution method and device

Info

Publication number
CN112925455A
CN112925455A (application CN202110082236.4A)
Authority
CN
China
Prior art keywords
touch
icon
target
central point
range
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110082236.4A
Other languages
Chinese (zh)
Inventor
杨勇 (Yang Yong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Software Technology Co Ltd
Original Assignee
Vivo Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Software Technology Co Ltd filed Critical Vivo Software Technology Co Ltd
Priority to CN202110082236.4A
Publication of CN112925455A
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a touch operation execution method and device, belonging to the field of communication technology. The method comprises the following steps: after receiving a touch input of a user on a display screen, acquiring the touch icons within the target range where the touch input is located and the target center point of that range; acquiring an adaptation value for each touch icon according to the target center point; determining a target touch icon among the touch icons according to the adaptation values; and executing a touch operation on the target touch icon. With the method and device, the accuracy of touch-screen operation can be improved when a large area of the screen is touched.

Description

Touch operation execution method and device
Technical Field
The application belongs to the technical field of communication, and particularly relates to a touch operation execution method and device.
Background
With the continuous development of science and technology, electronic devices (such as mobile phones, tablet computers and the like) have gradually become an indispensable tool in the life and work of people.
While using an electronic device, a user's hands may be occupied, for example when cooking or carrying goods, and the user may naturally turn to other parts of the body, such as an elbow or a toe, to operate the phone. However, these parts are far less flexible and precise than fingers, which results in a large area of screen contact and ultimately affects the accuracy of touching the electronic device.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method and an apparatus for executing a touch operation, which can solve the problem that the accuracy of touching an electronic device is reduced when a large area of the screen is touched.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a touch operation execution method, including:
after receiving touch input of a user on a display screen, acquiring a touch icon in a target range where the touch input is located and a target central point of the target range;
acquiring an adaptive value corresponding to each touch icon according to the target central point;
determining a target touch icon in the touch icons according to the adaptive value;
and executing touch operation on the target touch icon.
In a second aspect, an embodiment of the present application provides a touch operation execution apparatus, including:
the touch icon acquisition module is used for acquiring a touch icon in a target range where a user touch input is located and a target central point of the target range after receiving the touch input of the user on a display screen;
the adaptive value acquisition module is used for acquiring adaptive values corresponding to the touch icons respectively according to the target central point;
the target icon determining module is used for determining a target touch icon in the touch icons according to the adaptive value;
and the first operation execution module is used for executing touch operation on the target touch icon.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instructions stored in the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the touch operation execution method according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a readable storage medium, on which a program or instructions are stored, and when executed by a processor, the program or instructions implement the steps of the touch operation execution method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the touch operation execution method according to the first aspect.
In the embodiment of the application, after the touch input of the user on the display screen is received, the touch icons within the target range where the touch input is located and the target center point of that range are obtained; an adaptation value for each touch icon is obtained according to the target center point; a target touch icon is determined among the touch icons according to the adaptation values; and a touch operation is executed on the target touch icon. When multiple touch icons fall within the touched range, the embodiment determines the best target touch icon by combining their adaptation values, so the accuracy of touch-screen operation can be improved when a large area of the screen is touched.
Drawings
Fig. 1 is a flowchart illustrating a method for executing a touch operation according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of obtaining a center point of a large-area touch screen according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating a calculation of an adaptation value according to a center point according to an embodiment of the present application;
fig. 4 is a schematic diagram of obtaining a force center point according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a touch operation execution device according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be appreciated that data so used may be interchanged where appropriate, so that embodiments of the application may be practiced in sequences other than those illustrated or described herein. The terms "first", "second", and the like are generally used generically and do not limit the number of objects; for example, a first object may be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
The following describes in detail a touch operation execution scheme provided by the embodiments of the present application through specific embodiments and application scenarios thereof with reference to the accompanying drawings.
Referring to fig. 1, a flowchart illustrating steps of a touch operation execution method provided in an embodiment of the present application is shown, and as shown in fig. 1, the touch operation execution method may specifically include the following steps:
step 101: after receiving touch input of a user on a display screen, acquiring a touch icon in a target range where the touch input is located and a target central point of the target range.
The method and device of this embodiment can be applied to the scenario of determining the target touch icon within the touched range when a large area of the touch screen is touched.
The touch input refers to an input performed by a user on a display screen, for example, an input formed by the user touching the display screen with an elbow, or the like.
The target range refers to a range covered by the touch input in the display screen, for example, when the user touches the display screen of the electronic device with an elbow, the touch input range is large, and a touch screen sensor preset in the electronic device can detect the range of the touch input on the display screen, that is, the target range.
The touch icon refers to a touch icon located in a target range on the display screen, and in this embodiment, the touch icon may be an icon of an application program, or may also be a button icon or the like that is touched on a display page.
The target center point is the center point of the target range. As shown in fig. 3, after the user performs the touch input on the display screen, the range where the touch input is located may be obtained first, and then the geometric center point of that touch range (geometric center point 2 in fig. 3) may be calculated; this geometric center point is the target center point of the target range.
It should be understood that the above examples are only examples for better understanding of the technical solutions of the embodiments of the present application, and are not to be taken as the only limitation to the embodiments.
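To make the geometric-center step concrete, the following is a minimal Python sketch, under the assumption that the touch-screen sensor reports the contact region as a list of (x, y) sample points; the function name and data representation are illustrative, not taken from the patent.

```python
def geometric_center(points):
    """Centroid of the sampled contact points: a simple stand-in for the
    geometric center point of the target range in step 101.

    `points` is assumed to be a non-empty list of (x, y) tuples reported
    by the touch-screen sensor for the contact region.
    """
    if not points:
        raise ValueError("touch region is empty")
    n = len(points)
    x = sum(p[0] for p in points) / n
    y = sum(p[1] for p in points) / n
    return (x, y)
```

For a square contact patch sampled at its four corners, the centroid falls at the middle of the square, matching the geometric center point the text describes.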
After the touch input of the user on the display screen is received, the touch icons within the target range where the touch input is located and the target center point of that range may be obtained, and step 102 is performed.
Step 102: and acquiring the adaptive value corresponding to each touch icon according to the target central point.
The adaptation value is an adaptation index between a touch icon in the target range and the target center point: the larger the adaptation value, the higher the probability that the user intended to touch that icon; conversely, the smaller the adaptation value, the lower that probability.
After the target central point of the target range is obtained, the adaptation value of the touch icon in the target range may be calculated by combining the target central point, and specifically, the following specific implementation manner may be described in detail.
In a specific implementation manner of the present application, the step 102 may include:
substep S1: and acquiring an icon center point and an active value of each touch icon.
In this embodiment, the icon center point refers to the center point of the touch icon. In general, a touch icon is a square icon, and the intersection obtained by connecting the opposite corners of the square is the icon center point.
The active value may indicate how often the touch icon is clicked and how important it is. For example, suppose the touch icons are an application A icon and an application B icon; if the user has touched the application A icon 10 times and the application B icon 5 times, the active value of the application A icon may be set to 10 and that of the application B icon to 5, and so on.
Of course, the calculation is not limited to this: when computing the active value of a touch icon, a preset importance index of the icon may also be combined with the acquired click count, using weights assigned in advance to the click count and to the importance index respectively.
It should be understood that the above examples are only examples for better understanding of the technical solutions of the embodiments of the present application, and are not to be taken as the only limitation of the embodiments of the present application.
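As a hedged sketch of the weighted variant described above, the active value can be modeled as a weighted sum of the click count and a preset importance index. The patent does not fix the weights or names; the values below are assumptions for illustration only.

```python
def active_value(click_count, importance, w_clicks=0.7, w_importance=0.3):
    """Active value of a touch icon as a weighted combination of its
    click count and a preset importance index.

    The weights (0.7 / 0.3) are illustrative; the embodiment only says
    the two factors are combined with weights assigned in advance.
    """
    return w_clicks * click_count + w_importance * importance
```

With `w_clicks=1.0, w_importance=0.0` this degenerates to the simpler click-count-only example in the text (application A clicked 10 times yields an active value of 10).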
After the touch icon within the target range is obtained, the icon center point and the active value of the touch icon may be obtained, and then sub-step S2 is performed.
Substep S2: and obtaining the center point distance between the center point of the icon and the target center point.
The center point distance is the distance between the center point of the icon and the center point of the target.
After obtaining the icon center point of a touch icon within the target range, the distance between the icon center point and the target center point may be calculated; specifically, the icon center point and the target center point may be connected by a line segment, and the length of that segment is the center point distance between the two.
After the center point distance between the center point of the icon and the target center point is acquired, sub-step S3 is performed.
Substep S3: and calculating to obtain an adaptive value of the touch icon according to the central point distance and the active value.
After the center point distance between the icon center point and the target center point is obtained, the adaptation value of the touch icon can be calculated from the center point distance and the active value. As shown in fig. 3, the touch icons included in the target range are a browser application icon, a WeChat icon, and an attendance check-in icon, and the following formula can be used to calculate the adaptation values of the three icons, namely Q2, Q3, and Q1 respectively.
The calculation of the adaptation value may be performed with the following formula (1):

Qi = k * ci / di    (1)

In formula (1), Qi is the adaptation value of the ith touch icon, k is a constant (i.e., the adaptation coefficient), ci is the active value of the ith touch icon, and di is the distance between the icon center point of the ith touch icon and the target center point.
As formula (1) shows, the hit value (i.e., the adaptation value) of a touch icon is proportional to its active value and inversely proportional to its center point distance: the larger the active value, the larger the adaptation value; the larger the center point distance, the smaller the adaptation value.
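Formula (1) transcribes directly into Python. The sketch below assumes positions are plain (x, y) coordinates and guards the degenerate case where an icon center coincides with the target center point; both choices are ours, not the patent's.

```python
import math

def center_distance(icon_center, target_center):
    """Euclidean center point distance d_i between an icon's center
    and the target center point (sub-step S2)."""
    return math.hypot(icon_center[0] - target_center[0],
                      icon_center[1] - target_center[1])

def adaptation_value(active, distance, k=1.0):
    """Formula (1): Q_i = k * c_i / d_i, with k the adaptation
    coefficient, c_i the active value, d_i the center point distance."""
    if distance == 0:
        return float("inf")  # icon center coincides with the target center
    return k * active / distance
```

As the text notes, the value grows with the active value and shrinks as the icon sits farther from the target center point.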
After the adaptive value corresponding to the touch icon is obtained according to the target center point, step 103 is executed.
Step 103: and determining a target touch icon in the touch icons according to the adaptive value.
After the adaptation values of the touch icons within the target range are obtained, the target touch icon may be screened out from the touch icons according to those values; specifically, the touch icon with the largest adaptation value may be selected and taken as the target touch icon. For example, if the target range contains icon 1, icon 2, and icon 3, with adaptation values 3, 5, and 9 respectively, then icon 3 may be taken as the target touch icon.
It should be understood that the above examples are only examples for better understanding of the technical solutions of the embodiments of the present application, and are not to be taken as the only limitation to the embodiments.
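Step 103 then reduces to an argmax over the adaptation values. A minimal sketch follows; the dict layout and the "adaptation" key are illustrative assumptions.

```python
def select_target_icon(icons):
    """Return the touch icon with the largest adaptation value
    (step 103). `icons` is assumed to be a non-empty list of dicts,
    each carrying a precomputed "adaptation" entry."""
    return max(icons, key=lambda icon: icon["adaptation"])
```

With the example above (icon 1, icon 2, icon 3 carrying adaptation values 3, 5, 9), icon 3 is selected.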
After the target touch icon in the touch icons is determined according to the adaptation value, step 104 is executed.
Step 104: and executing touch operation on the target touch icon.
After the target touch icon in the touch icons is determined, a touch operation, such as a click operation, may be performed on the target touch icon.
When multiple touch icons fall within the touched range, the embodiment determines the best target touch icon by combining their adaptation values, so the accuracy of touch-screen operation can be improved when a large area of the screen is touched.
In this embodiment, the area of the target range may also be gradually reduced about the geometric center point of the target range; when only one touch icon remains within the target range, that remaining icon is taken as the icon the user intended to touch. The details are described in the following specific implementation.
In a specific implementation manner of the present application, after the step 101, the method may further include:
step M1: and reducing the area of the target range according to a preset reduction proportion and the target central point to obtain the reduced target range.
In this embodiment, the preset reduction ratio refers to a ratio of a reduction target range preset by a service person.
After the target range is obtained, the area of the target range can be reduced according to a preset reduction proportion and the target center point, so that the reduced target range is obtained.
Of course, in a specific implementation, the area of the target range may be gradually reduced according to the preset reduction ratio until only one touch icon remains within it, at which point the reduced icon range is obtained.
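The gradual reduction in steps M1/M2 can be sketched as a loop that scales the range down about the target center point until one icon center remains. The circular shape of the range, the default ratio, and the stopping guards below are assumptions for illustration; the patent only specifies a preset reduction ratio applied about the target center point.

```python
import math

def shrink_until_one(icon_centers, center, radius, ratio=0.9, min_radius=1.0):
    """Repeatedly shrink a circular target range about `center` by the
    preset reduction ratio, keeping only icons whose centers still fall
    inside, until at most one icon remains (steps M1/M2 sketch)."""
    inside = list(icon_centers)
    while len(inside) > 1 and radius > min_radius:
        radius *= ratio
        kept = [(x, y) for (x, y) in inside
                if math.hypot(x - center[0], y - center[1]) <= radius]
        if not kept:
            break  # do not shrink past every icon; keep the previous set
        inside = kept
    return inside
```

When the loop exits with more than one icon still inside (all candidates equally close), the embodiment falls back to comparing adaptation values, as the following paragraph describes.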
Step M2: when only one of the touch icons, referred to as the first touch icon, exists within the icon range, perform the touch operation on that first touch icon.
The first touch icon is the single touch icon remaining within the reduced icon range.
When only one touch icon remains within the reduced icon range, the touch operation may be performed on it. As shown in fig. 2, the reduced icon range is icon range 1; icon range 1 contains only one touch icon, namely the WeChat icon, which may therefore be taken as the first touch icon.
When at least two touch icons remain within the reduced icon range, the adaptation values of those touch icons are determined according to the target center point, so that the touch icon on which the touch operation should be executed can be determined from the adaptation values.
By obtaining the touch center point, the embodiment can identify the touch icon the user most likely intended to touch; the method is simple and can improve the accuracy of the touch screen.
In this embodiment, a touch icon that a user may need to click within the touch range may also be determined by combining the force center point of the touch range, and specifically, the detailed description may be described by combining the following specific implementation manner.
In another specific implementation manner of the present application, the step 101 may include:
substep N1: and carrying out area division on the target range to obtain a plurality of target areas corresponding to the target range.
In the present embodiment, the target area refers to an area obtained after area division of the target range.
After the target range of the touch operation in the display screen is obtained, the target range may be subjected to area division, and a plurality of target areas corresponding to the target range may be obtained.
After obtaining a plurality of target regions corresponding to the target range, sub-step N2 is performed.
Substep N2: and acquiring a touch pressure value of each target area according to the touch input.
The touch pressure value refers to a pressure value of the touch target area.
A pressure sensor is preset inside the electronic device; after the target range is divided into multiple target areas, the pressure sensor can detect the touch pressure value within each target area.
After the touch pressure value of each target area is obtained, substep N3 is performed.
Substep N3: and determining a force central point corresponding to the target range according to each touch pressure value.
After the touch pressure value of each target area is obtained, the force center point of the target range can be determined from those values. Specifically, each touch pressure value may be converted into a force weight for its target area, and the center point is then computed from the combined force weights. Understandably, when a user touches a large area of the display screen, the pressure value at the force center point is normally the largest.
Substep N4: and taking the force central point as the target central point.
The second touch icon refers to an icon in the touch icon within the target range, which a user needs to touch.
After the force center point is obtained, the force center point may be determined as the target center point. As shown in fig. 4, the force center point is determined to be the center point 3, and at this time, the center point 3 may be used as the target center point.
It should be understood that the above examples are only examples for better understanding of the technical solutions of the embodiments of the present application, and are not to be taken as the only limitation to the embodiments.
The embodiment of the application takes the force of the screen touch into account in the algorithm. When the center point is calculated, positions touched with greater force receive higher weight, so the final center point is biased toward the position of greatest touch force. This scheme, which reflects the user's actual touch behavior, can effectively solve the problem of inaccurately calculated center points for large-area touches.
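The pressure-weighted center point of sub-steps N1 to N4 is a weighted centroid. The sketch below assumes each divided target area is summarized by its own center coordinates and measured pressure; the (cx, cy, pressure) tuple layout is our assumption, not the patent's.

```python
def force_center(areas):
    """Pressure-weighted center point of the target range: each target
    area's center (cx, cy) contributes in proportion to its touch
    pressure p, so heavier-pressed areas pull the center toward them."""
    total = sum(p for _, _, p in areas)
    if total <= 0:
        raise ValueError("no touch pressure recorded")
    x = sum(cx * p for cx, _, p in areas) / total
    y = sum(cy * p for _, cy, p in areas) / total
    return (x, y)
```

With two areas at x = 0 and x = 2 carrying pressures 1 and 3, the force center point lands at x = 1.5, closer to the harder-pressed area, which is exactly the bias the text describes.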
With the touch operation execution method provided by the embodiment of the application, after the touch input of the user on the display screen is received, the touch icons within the target range where the touch input is located and the target center point of that range are obtained; an adaptation value for each touch icon is obtained according to the target center point; a target touch icon is determined among the touch icons according to the adaptation values; and a touch operation is executed on the target touch icon. When multiple touch icons fall within the touched range, the embodiment determines the best target touch icon by combining their adaptation values, so the accuracy of touch-screen operation can be improved when a large area of the screen is touched.
It should be noted that, for the touch operation execution method provided in the embodiment of the present application, the execution subject may be a touch operation execution device, or a control module within that device for executing the method. In the embodiment of the present application, the touch operation execution device executing the method is taken as an example to describe the device provided herein.
Referring to fig. 5, a schematic structural diagram of a touch operation execution device provided in an embodiment of the present application is shown, and as shown in fig. 5, the touch operation execution device 500 may specifically include the following modules:
a touch icon acquiring module 510, configured to, after receiving a touch input of a user on a display screen, acquire a touch icon within a target range where the touch input is located and a target center point of the target range;
an adaptation value obtaining module 520, configured to obtain, according to the target central point, an adaptation value corresponding to each touch icon;
a target icon determining module 530, configured to determine a target touch icon in the touch icons according to the adaptation value;
a first operation executing module 540, configured to execute a touch operation on the target touch icon.
Optionally, the adaptation value obtaining module 520 includes:
the center point acquisition unit is used for acquiring an icon center point and an active value of each touch icon;
a central distance obtaining unit, configured to obtain a central point distance between the icon central point and the target central point;
and the adaptation value calculating unit is used for calculating the adaptation value of the touch icon according to the central point distance and the active value.
Optionally, the target icon determining module 530 includes:
and the target icon acquisition unit is used for screening out the touch icon with the maximum adaptation value from the touch icons and taking the touch icon with the maximum adaptation value as the target touch icon.
Optionally, the method further comprises:
the icon range acquisition module is used for reducing the area of the target range according to a preset reduction proportion and the target central point to obtain a reduced icon range;
the second operation execution module is used for executing the touch operation on the first touch icon when only one first touch icon among the touch icons exists within the icon range;
the adaptation value obtaining module 520 includes:
and the adaptive value acquiring unit is used for acquiring adaptive values corresponding to the touch icons respectively according to the target central point under the condition that at least two touch icons exist in the icon range.
Optionally, the touch icon obtaining module 510 includes:
a target area obtaining unit, configured to perform area division on the target range to obtain multiple target areas corresponding to the target range;
the touch pressure acquisition unit is used for acquiring a touch pressure value of each target area according to the touch input;
the force center determining unit is used for determining a force center point corresponding to the target range according to each touch pressure value;
and the target central point acquisition unit is used for taking the force central point as the target central point.
With the touch operation execution device provided by the embodiment of the application, after the touch input of the user on the display screen is received, the touch icons within the target range where the touch input is located and the target center point of that range are obtained; an adaptation value for each touch icon is obtained according to the target center point; a target touch icon is determined among the touch icons according to the adaptation values; and a touch operation is executed on the target touch icon. When multiple touch icons fall within the touched range, the device determines the best target touch icon by combining their adaptation values, so the accuracy of touch-screen operation can be improved when a large area of the screen is touched.
The touch operation execution device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA); the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, or a self-service machine. The embodiments of the present application are not particularly limited in this respect.
The touch operation execution device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application impose no specific limitation.
The touch operation execution device provided in the embodiment of the present application can implement each process implemented in the method embodiment of fig. 1, and is not described here again to avoid repetition.
Optionally, as shown in fig. 6, an embodiment of the present application further provides an electronic device 600, including a processor 601, a memory 602, and a program or instruction stored in the memory 602 and executable on the processor 601. When executed by the processor 601, the program or instruction implements each process of the touch operation execution method embodiment and achieves the same technical effect; details are not repeated here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 7 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, and a processor 710.
Those skilled in the art will appreciate that the electronic device 700 may also include a power supply (e.g., a battery) for powering the various components; the power supply may be logically coupled to the processor 710 via a power management system, so that charging, discharging, and power-consumption management are performed through the power management system. The electronic device structure shown in fig. 7 does not constitute a limitation of the electronic device, which may include more or fewer components than those shown, combine some components, or arrange components differently; the description is omitted here.
The processor 710 is configured to, after receiving a touch input of a user on a display screen, acquire a touch icon within a target range where the touch input is located and a target center point of the target range;
acquiring an adaptive value corresponding to each touch icon according to the target central point;
determining a target touch icon in the touch icons according to the adaptive value;
and executing touch operation on the target touch icon.
In the embodiments of the present application, when multiple touch icons exist within the touched range, the optimal target touch icon is determined using the icons' adaptation values, so the accuracy of touch operations made with a large contact area can be improved.
Optionally, the processor 710 is further configured to, for each touch icon, acquire the icon center point and the active value of the touch icon; obtain the center-point distance between the icon center point and the target center point; and calculate the adaptation value of the touch icon from the center-point distance and the active value.
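The adaptation-value step can be sketched as follows. The application says only that the value is computed from the center-point distance and the active value; the specific combining formula below (active value divided by distance) is an assumption, as are the function and parameter names:

```python
import math

def adaptation_value(icon_center, active_value, target_center):
    """Illustrative adaptation value: larger for closer, more active icons.

    The ratio used here is assumed; the application fixes only the inputs
    (center-point distance and active value), not the formula.
    """
    distance = math.hypot(icon_center[0] - target_center[0],
                          icon_center[1] - target_center[1])
    return active_value / (distance + 1.0)  # +1 avoids division by zero
```

Under this assumed formula, the target touch icon is simply the one maximizing the value, matching the "largest adaptation value" selection described for the processor 710.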
Optionally, the processor 710 is further configured to screen out a touch icon with a largest adaptation value from the touch icons, and use the touch icon with the largest adaptation value as the target touch icon.
Optionally, the processor 710 is further configured to reduce the area of the target range according to a preset reduction ratio and the target center point, obtaining a reduced icon range; to perform the touch operation on a first touch icon when it is the only one of the touch icons within the icon range; and, when at least two touch icons exist within the icon range, to acquire the adaptation value corresponding to each touch icon according to the target center point.
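As a rough illustration of the range-reduction step, the sketch below models the target range as a circle centered on the target center point; the circular shape, the 0.5 default ratio, and all names are assumptions, since the application fixes none of them:

```python
import math

def icons_in_shrunk_range(icons, target_center, radius, shrink_ratio=0.5):
    """Shrink the target range toward the target center point and keep
    only the icons whose centers still fall inside.

    `icons` is a list of dicts with a "center" (x, y) entry; modeling
    the range as a circle and the default ratio are assumptions.
    """
    new_radius = radius * shrink_ratio
    return [icon for icon in icons
            if math.hypot(icon["center"][0] - target_center[0],
                          icon["center"][1] - target_center[1]) <= new_radius]
```

If exactly one icon survives the shrink, it is tapped directly; if two or more survive, the adaptation-value comparison decides among them.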
Optionally, the processor 710 is further configured to perform area division on the target range to obtain a plurality of target areas corresponding to the target range; acquiring a touch pressure value of each target area according to the touch input; determining a force central point corresponding to the target range according to each touch pressure value; and taking the force central point as the target central point.
In the embodiments of the present application, positions pressed with greater force are given higher weight when the center point is calculated, so the computed center point is biased toward where the user pressed hardest. This scheme, which reflects how users actually press the screen, effectively addresses the problem of inaccurate center-point calculation for large-area touches.
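The pressure-weighted center described above is an ordinary weighted centroid. A minimal sketch, assuming each divided sub-region reports its own coordinates and pressure value (the tuple layout and function name are illustrative):

```python
def force_center(regions):
    """Pressure-weighted centroid of a touch area.

    `regions` is a list of (x, y, pressure) tuples, one per divided
    sub-region of the target range; harder-pressed regions pull the
    center point toward themselves.
    """
    total = sum(p for _, _, p in regions)
    if total == 0:
        raise ValueError("no pressure recorded")
    cx = sum(x * p for x, _, p in regions) / total
    cy = sum(y * p for _, y, p in regions) / total
    return (cx, cy)
```

For example, a region pressed three times harder than another pulls the center three-quarters of the way toward itself, which is exactly the bias toward the position of large force that the paragraph describes.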
It should be understood that in the embodiment of the present application, the input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042; the graphics processing unit 7041 processes image data of still pictures or videos obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 706 may include a display panel 7061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 707 includes a touch panel 7071, also referred to as a touch screen, and other input devices 7072. The touch panel 7071 may include two parts: a touch detection device and a touch controller. Other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 709 may be used to store software programs as well as various data, including but not limited to applications and an operating system. The processor 710 may integrate an application processor, which primarily handles the operating system, user interface, and applications, and a modem processor, which primarily handles wireless communications. It should be appreciated that the modem processor may not be integrated into the processor 710.
An embodiment of the present application further provides a readable storage medium storing a program or instruction which, when executed by a processor, implements each process of the touch operation execution method embodiment and achieves the same technical effect; details are not repeated here to avoid repetition.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the embodiment of the touch operation execution method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A touch operation execution method is characterized by comprising the following steps:
after receiving touch input of a user on a display screen, acquiring a touch icon in a target range where the touch input is located and a target central point of the target range;
acquiring an adaptive value corresponding to each touch icon according to the target central point;
determining a target touch icon in the touch icons according to the adaptive value;
and executing touch operation on the target touch icon.
2. The method according to claim 1, wherein the obtaining, according to the target central point, an adaptation value corresponding to each of the touch icons respectively comprises:
acquiring an icon center point and an active value of each touch icon;
obtaining the center point distance between the icon center point and the target center point;
and calculating to obtain an adaptive value of the touch icon according to the central point distance and the active value.
3. The method of claim 1, wherein determining a target one of the touch icons according to the adaptation value comprises:
and screening out the touch icon with the maximum adaptation value from the touch icons, and taking the touch icon with the maximum adaptation value as the target touch icon.
4. The method of claim 1, further comprising, after the obtaining the touch icon in the target range of the touch input and the target center point of the target range:
according to a preset reduction proportion and the target central point, reducing the area of the target range to obtain a reduced icon range;
under the condition that only one first touch icon in the touch icons exists in the icon range, performing touch operation on the first touch icon;
the obtaining, according to the target central point, an adaptation value corresponding to each of the touch icons specifically includes:
and under the condition that at least two touch icons exist in the icon range, acquiring the adaptation value corresponding to each touch icon according to the target central point.
5. The method according to claim 1, wherein the obtaining the target center point of the target range specifically includes:
performing area division on the target range to obtain a plurality of target areas corresponding to the target range;
acquiring a touch pressure value of each target area according to the touch input;
determining a force central point corresponding to the target range according to each touch pressure value;
and taking the force central point as the target central point.
6. A touch operation execution device, comprising:
the touch icon acquisition module is used for, after receiving a user's touch input on a display screen, acquiring the touch icons within the target range where the touch input is located and the target central point of the target range;
the adaptive value acquisition module is used for acquiring adaptive values corresponding to the touch icons respectively according to the target central point;
the target icon determining module is used for determining a target touch icon in the touch icons according to the adaptive value;
and the first operation execution module is used for executing touch operation on the target touch icon.
7. The apparatus of claim 6, wherein the adaptation value obtaining module comprises:
the center point acquisition unit is used for acquiring an icon center point and an active value of each touch icon;
a central distance obtaining unit, configured to obtain a central point distance between the icon central point and the target central point;
and the adaptation value calculating unit is used for calculating the adaptation value of the touch icon according to the central point distance and the active value.
8. The apparatus of claim 6, wherein the target icon determination module comprises:
and the target icon acquisition unit is used for screening out the touch icon with the maximum adaptation value from the touch icons and taking the touch icon with the maximum adaptation value as the target touch icon.
9. The apparatus of claim 6, further comprising:
the icon range acquisition module is used for reducing the area of the target range according to a preset reduction proportion and the target central point to obtain a reduced icon range;
the second operation execution module is used for executing a touch operation on the first touch icon under the condition that only one first touch icon among the touch icons exists in the icon range;
the adaptation value acquisition module comprises:
and the adaptive value acquiring unit is used for acquiring adaptive values corresponding to the touch icons respectively according to the target central point under the condition that at least two touch icons exist in the icon range.
10. The apparatus of claim 6, wherein the touch icon obtaining module comprises:
a target area obtaining unit, configured to perform area division on the target range to obtain multiple target areas corresponding to the target range;
the touch pressure acquisition unit is used for acquiring a touch pressure value of each target area according to the touch input;
the force center determining unit is used for determining a force center point corresponding to the target range according to each touch pressure value;
and the target central point acquisition unit is used for taking the force central point as the target central point.
CN202110082236.4A 2021-01-21 2021-01-21 Touch operation execution method and device Pending CN112925455A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110082236.4A CN112925455A (en) 2021-01-21 2021-01-21 Touch operation execution method and device


Publications (1)

Publication Number Publication Date
Publication Number Publication Date
CN112925455A (en) 2021-06-08

Family

ID=76165662

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110082236.4A Pending CN112925455A (en) 2021-01-21 2021-01-21 Touch operation execution method and device

Country Status (1)

Country Link
CN (1) CN112925455A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103135930A * 2013-02-05 2013-06-05 Shenzhen Gionee Communication Equipment Co., Ltd. Touch screen control method and device
CN103294236A * 2012-02-29 2013-09-11 Canon Inc. Method and device for determining target position, method and device for controlling operation, and electronic equipment
JP5712339B1 * 2014-05-30 2015-05-07 Rakuten, Inc. Input device, input method, and program
CN105278714A * 2014-07-18 2016-01-27 Ambit Microsystems (Shanghai) Ltd. Electronic equipment and touch operation identification method
CN109375866A * 2018-12-27 2019-02-22 Guangzhou Jiubang Digital Technology Co., Ltd. Method for responding to screen touch clicks and system implementing the same



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210608)